So I watched a review of the mCable some weeks ago... and was immediately, extremely hooked. Not only was this the first "premium HDMI cable" that actually wasn't snake oil, but the gaming version aimed to solve a problem I had been looking for a solution to for a long time - patching antialiasing into the jaggy visuals of contemporary (or old) consoles.
Because of that I swallowed the steep price of $140 for a 2 m cable and actually bought one (both to see whether the thing really is as good as reviewers said, and to get more enjoyment out of my PS4 games)...
Now the cable has arrived, and I am pretty impressed. It's a plug-and-play affair, and the AA it offers is more than just decent. It's on par with high-quality MSAA, and it uses a sharpening filter of sorts. Almost all the jaggies are removed, beyond the capabilities of some of the less powerful AA algorithms I have tested on PC, and the only real downsides are the slightest hint of image softening and a slight contrast boost from the sharpening filter that might not be to everyone's taste.
I have only been able to test Horizon Zero Dawn so far, but boy, was I impressed by the difference it makes. No longer is the start screen a flickering mess of foliage, particle effects and specular aliasing. And the effect is just as strong in-game: you have to hunt for problematic geometry to find the faintest traces of aliasing.
As said, there are downsides. The image looks slightly softer. But really, that might mostly be down to how oversharp HZD's visuals normally look, and I am sure it's something I will get used to in no time. It is nowhere near as bad as with poor AA algorithms like FXAA, and on par with what I would expect from a quality AA algorithm like MSAA.
Then there is the contrast boost. I am not sure if it's a side effect of the AA algorithm's sharpening filter (which can unnaturally boost local contrast at times), or if it's an actual "feature" (some Unity post effects used such a contrast boost together with a sharpening filter to make the textures pop more and make the game look sharper)... again, it's very subtle, and you will probably get used to it. It is slightly sad that the brilliant AA doesn't come without it, as the contrast in HZD certainly didn't need a boost.
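To illustrate what I mean by a sharpening filter boosting contrast (just a toy Python/NumPy sketch, not the mCable's actual processing, which isn't public): a classic unsharp mask overshoots on both sides of an edge, making the dark side darker and the bright side brighter, which reads as extra contrast.

```python
import numpy as np

def box_blur(img, radius=1):
    """Simple 1D box blur via a sliding-window mean (edges clamped)."""
    padded = np.pad(img, radius, mode="edge")
    kernel = 2 * radius + 1
    out = np.zeros_like(img, dtype=float)
    for offset in range(kernel):
        out += padded[offset:offset + img.shape[0]]
    return out / kernel

def unsharp_mask(img, amount=0.8, radius=1):
    """Classic unsharp mask: add back the difference to the blurred image.
    The 'amount' term exaggerates edges, which reads as a contrast boost."""
    blurred = box_blur(img, radius)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# A soft edge in luma (0..1); after sharpening, the dark side gets darker
# and the bright side gets brighter -- local contrast goes up.
edge = np.array([0.2, 0.2, 0.2, 0.4, 0.6, 0.8, 0.8, 0.8])
print(np.round(unsharp_mask(edge), 2))
```

On that soft 0.2-to-0.8 ramp the output undershoots to roughly 0.15 and overshoots to roughly 0.85 - exactly the kind of "pop" I am describing.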
Now, was that all worth $140? Well, let's say if you are as sensitive to jaggies as I am and have money to spare (and play on consoles), the cable can be worth it (at 1080p and below, of course)... especially when playing PS3 or other low-resolution games.
But I would prefer this to be more versatile than just a plug-and-play cable with a built-in, pre-configured chip. You cannot configure anything (which is very console-like, but being able to tone down the contrast boost, for example, would make it better for me in HZD), you cannot bypass it without switching cables (which would make the cable more expensive, I guess), and of course, the cable will become outdated and need replacement once TVs come out that need a faster connection (I am not sure the cable handles 4K HDR signals at 60 Hz, for example).
So in reality it's a gadget to boost your console's visuals for a generation.
The PS4 Pro probably uses a similar approach when outputting its upscaled checkerboard image to a 4K TV (a hardware unit that works on the 2K output image to upscale it)... which got me thinking:
Why not build such a chip into the hardware pipeline of common GPUs, to relieve the actual GPU from running the AA algorithm and to make the AA agnostic to the renderer being used? Especially now that VR games can completely saturate even powerful GPUs WITHOUT AA, and are in DIRE need of quality AA, and even normal games might use renderers for which current GPU hardware has no AA solution, this sounds extremely convenient.
Basically, take an existing GPU (let's say a GTX 1060), let it render the game without AA (which could save 10+% of performance, so higher settings could be chosen or more FPS could be put on screen), and wire a chip similar to the one in the mCable between the GPU and the outputs to run AA on the final image (and potentially other post effects).
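To make it concrete, here is a rough sketch of the kind of pass such a unit could run on a finished frame (my own toy Python/NumPy approximation, not the mCable's actual algorithm, which isn't public; real post-process AA like FXAA/MLAA is considerably smarter): find high-contrast luma edges and blend only those pixels with their neighbours.

```python
import numpy as np

def luma(rgb):
    """Perceptual luma (Rec. 709 weights) from a float RGB image in [0, 1]."""
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def postprocess_aa(frame, threshold=0.1):
    """Toy post-process AA on a finished frame (H x W x 3, floats in [0, 1]):
    where local luma contrast is high (i.e. at a jaggy edge), blend the pixel
    with its 4 neighbours; everywhere else the image is left untouched."""
    l = luma(frame)
    # Luma of the 4-neighbourhood, with edge pixels clamped.
    padded = np.pad(l, 1, mode="edge")
    north, south = padded[:-2, 1:-1], padded[2:, 1:-1]
    west, east = padded[1:-1, :-2], padded[1:-1, 2:]
    contrast = np.maximum.reduce([north, south, west, east, l]) - \
               np.minimum.reduce([north, south, west, east, l])
    # Average of the pixel and its 4 neighbours (simple cross blur).
    p = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = (frame + p[:-2, 1:-1] + p[2:, 1:-1] +
               p[1:-1, :-2] + p[1:-1, 2:]) / 5.0
    edge_mask = (contrast > threshold)[..., None]
    return np.where(edge_mask, blurred, frame)

# Usage: fake a 1080p frame and run the pass over it.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
smoothed = postprocess_aa(frame)
```

The nice thing is that such a pass only needs the final image, so it is completely agnostic to the renderer (forward, deferred, whatever) that produced it.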
Of course there are downsides... it might make the card more expensive (though I doubt it would be anywhere near the $140 I paid for the HDMI cable on steroids), and it might introduce additional lag (the mCable's lag is currently <1 ms according to the manufacturer and reviews, and I haven't felt any additional lag myself, so I don't know how much of a problem this would be).
But making a card 10-20% faster whenever AA is in use, because the GPU no longer needs to run it on its general-purpose cores and a much smaller, faster ASIC could do the job instead, sounds like a big plus to me. Having quality AA built into the hardware, no longer depending on the engine or game, and not having to resort to expensive SSAA tricks via mods or the driver also sounds pretty good to me.
So what do you think? Is this the future of antialiasing? Will we see GPU manufacturers build additional units into the actual GPU to relieve the graphics cores of AA chores? Might card manufacturers like Zotac or Gigabyte add extra AA ASICs to their custom designs and sell them at a higher price as a premium product?
Or is there a problem with the idea I haven't thought about?
Clearly the first problem for an approach that integrates such a function unit into the GPU itself is that current GPUs are designed with HPC workloads in mind first, graphics second. Hence the double-precision baggage even gaming graphics cards are still burdened with today.
But it seems times are changing, with Nvidia apparently differentiating their pro-grade/HPC cards and their gaming cards even further with Volta... the gaming-class products will supposedly use a new architecture called "Ampere" (which might be Volta minus the double-precision and Tensor cores, and with a GDDR controller instead of the HBM one). Maybe some of the space saved by ripping out the Tensor cores could be given to post-effect function units (which I would guess could run the more advanced algorithms faster and in less die space than general-purpose cores)?