Wednesday, November 22, 2017

"Universal" GPU texture/image format examples

The DXT1 images were converted directly from the ETC1 data (really "ETC1S", a compatible subset of ETC1 with no subblocks) using a straightforward lookup table: the ETC1 base color maps to the DXT1 low/high colors, and the selectors are remapped using a byte from the same table. The ETC1->DXT1 lookup table is currently 3.75MB, and it can either be computed on the fly very quickly (using a variant of ryg_dxt) or precomputed offline for higher conversion quality.
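
To make that concrete, here's a rough sketch of what a table-driven ETC1S->DXT1 block converter could look like. The entry layout, names, and indexing here are just illustrative guesses, not the actual table format (the real ~3.75MB table clearly stores more per entry than this):

```cpp
#include <cstdint>

// Hypothetical table entry: for each (5:5:5 base color, 3-bit intensity table)
// pair, precomputed DXT1 endpoints plus one byte remapping the four 2-bit
// ETC1S selector values to DXT1 selector values.
struct Etc1sToDxt1Entry {
    uint16_t color0;         // DXT1 low color, RGB 5:6:5
    uint16_t color1;         // DXT1 high color, RGB 5:6:5
    uint8_t  selector_remap; // 4 packed 2-bit fields: ETC1S selector -> DXT1 selector
};

// 32768 base colors * 8 intensity tables = 262144 entries (assumed precomputed)
extern const Etc1sToDxt1Entry g_etc1s_to_dxt1[32768 * 8];

// Convert a single already-unpacked ETC1S block to an 8-byte DXT1 block.
// etc1s_selectors[] holds the 16 2-bit per-pixel selectors in raster order.
void convert_etc1s_block_to_dxt1(uint16_t base_color555, uint32_t inten_table,
                                 const uint8_t etc1s_selectors[16],
                                 uint8_t dxt1_block[8])
{
    const Etc1sToDxt1Entry &e = g_etc1s_to_dxt1[(inten_table << 15) | base_color555];

    // DXT1 stores its two 5:6:5 endpoint colors little-endian.
    dxt1_block[0] = (uint8_t)(e.color0 & 0xFF);
    dxt1_block[1] = (uint8_t)(e.color0 >> 8);
    dxt1_block[2] = (uint8_t)(e.color1 & 0xFF);
    dxt1_block[3] = (uint8_t)(e.color1 >> 8);

    // Remap each 2-bit ETC1S selector through the table byte and pack 4 per
    // byte, one byte per row of the 4x4 block.
    for (int y = 0; y < 4; y++) {
        uint8_t row = 0;
        for (int x = 0; x < 4; x++) {
            uint8_t s = (uint8_t)((e.selector_remap >> (etc1s_selectors[y * 4 + x] * 2)) & 3);
            row |= (uint8_t)(s << (x * 2));
        }
        dxt1_block[4 + y] = row;
    }
}
```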

The encoder in these examples is still my old prototype from 2016. I'm going to be replacing it with Basis's much better ETC1S encoder next. This format can also support alpha/grayscale data.

This format is a tradeoff: for slightly reduced quality, you can distribute GPU textures to most GPUs on the planet. Encode once, use anywhere is the goal. We're planning on distributing free encoders for Linux and Windows (and eventually OSX, but it's not my preferred dev platform).

The current intermediate format design supports no, partial, or full GPU transcoding. Full transcoding on the GPU will only work on GPUs that support LZ decompression in hardware (or possibly in a compute shader). Converting the ETC1S data to DXT1, and unpacking the blocks to either ETC1 or DXT1, can also be done in a shader or on the CPU. By comparison, crunch's .CRN design is 100% CPU oriented. We'll be releasing the transcoder and format as open source. It's an LZ RDO design, so it's compatible with any LZ (or other) lossless codec, including GPU hardware LZ codecs. It'll support bitrates of around 0.75-2.5 bpp for RGB data (using zlib).
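
Here's a rough sketch of the all-CPU transcode path, with zlib standing in for whatever lossless codec sits on top. The helper names are illustrative only, not a real published API (unpack_etc1s_block() is assumed, and convert_etc1s_block_to_dxt1() is the sketch from earlier):

```cpp
#include <zlib.h>
#include <vector>
#include <cstdint>

// Illustrative helpers assumed to exist elsewhere:
void unpack_etc1s_block(const uint8_t *data, uint32_t block_index,
                        uint16_t *base_color555, uint32_t *inten_table,
                        uint8_t selectors[16]);
void convert_etc1s_block_to_dxt1(uint16_t base_color555, uint32_t inten_table,
                                 const uint8_t selectors[16],
                                 uint8_t dxt1_block[8]);

// CPU path: LZ decompress the intermediate stream, unpack each ETC1S block,
// then optionally convert each block to DXT1 for the target GPU.
bool transcode_to_dxt1(const uint8_t *comp, size_t comp_size,
                       size_t uncomp_size, uint32_t num_blocks,
                       std::vector<uint8_t> &dxt1_out)
{
    // 1) Losslessly decompress the RDO'd ETC1S block data (zlib here).
    std::vector<uint8_t> etc1s_data(uncomp_size);
    uLongf dest_len = (uLongf)uncomp_size;
    if (uncompress((Bytef *)etc1s_data.data(), &dest_len,
                   (const Bytef *)comp, (uLong)comp_size) != Z_OK)
        return false;

    // 2) Convert each ETC1S block to a DXT1 block (8 bytes per 4x4 block).
    //    To output ETC1 instead, skip the conversion and emit the blocks as-is.
    dxt1_out.resize((size_t)num_blocks * 8);
    for (uint32_t i = 0; i < num_blocks; i++) {
        uint16_t base_color555;
        uint32_t inten_table;
        uint8_t  selectors[16];
        unpack_etc1s_block(etc1s_data.data(), i, &base_color555, &inten_table, selectors);
        convert_etc1s_block_to_dxt1(base_color555, inten_table, selectors,
                                    &dxt1_out[(size_t)i * 8]);
    }
    return true;
}
```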

All PSNR figures are luma PSNR. The ETC1 images were software decoded from the ETC1 block texture data (actually "ETC1S", because all 4x4 pixel blocks use 5:5:5 base colors with no subblocks, so the differential color is always 0,0,0).
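
For the curious, here's roughly how a luma PSNR like the figures below can be computed. Rec. 601 luma weights are shown; the exact weights used for these figures may differ:

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>

// Luma PSNR over two same-sized RGB888 images: convert each pixel to luma,
// accumulate squared error, then compute PSNR against a peak of 255.
double luma_psnr(const uint8_t *a_rgb, const uint8_t *b_rgb, size_t num_pixels)
{
    double sse = 0.0;
    for (size_t i = 0; i < num_pixels; i++) {
        const uint8_t *a = &a_rgb[i * 3], *b = &b_rgb[i * 3];
        const double ya = 0.299 * a[0] + 0.587 * a[1] + 0.114 * a[2];
        const double yb = 0.299 * b[0] + 0.587 * b[1] + 0.114 * b[2];
        const double d = ya - yb;
        sse += d * d;
    }
    const double mse = sse / (double)num_pixels;
    // Identical images have infinite PSNR; clamp to a sentinel value.
    return (mse > 0.0) ? 10.0 * std::log10(255.0 * 255.0 / mse) : 999.0;
}
```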

[Example images omitted; luma PSNR for each ETC1/DXT1 pair:]

Image 1: ETC1 41.233, DXT1 40.9
Image 2: ETC1 45.964, DXT1 45.322
Image 3: ETC1 46.461, DXT1 44.865
Image 4: ETC1 43.785, DXT1 43.406
Image 5: ETC1 33.516, DXT1 33.339


2 comments:

  1. Interesting to read:
    "Full transcoding on the GPU will only work on GPUs that support LZ decompression in hardware (or possibly in a compute shader)"
    Just out of curiosity:
    Can you name these GPUs? Do all desktop GPUs that support DXT formats also support LZ in hardware?
    On the mobile side, are there any GPUs with LZ support?
    Should your work push vendors to expose hardware LZ decoding in some way in the graphics APIs?

    Thanks.

    Replies
    1. I know that some modern consoles support LZ in hardware. I'm no expert on this, but I suspect some AMD chips have LZ in hardware. It may also be possible to do a reasonable LZ decompressor in a compute shader.
