Digital Foundry compared Intel’s Xe Super Sampling (XeSS) against Nvidia’s Deep Learning Super Sampling (DLSS), and XeSS seemed to do quite well.
If we don’t seem to be bubbling with enthusiasm, it’s because we haven’t had our morning coffee yet, and because Digital Foundry only ran tests with Intel’s highest-end card, the Arc A770, in a single game.
XeSS and similar technologies render your game at a lower resolution, then use machine learning algorithms to upscale the image so it looks close to native quality. That means you can get higher frame rates or turn on fancy effects like ray tracing without giving up a huge amount of performance, because your GPU is rendering fewer pixels and then upscaling the resulting image, often using dedicated hardware.
According to Digital Foundry, if you have a 1080p display, XeSS will run the game at 960 x 540 in its highest-performance (AKA highest-FPS) mode and at 720p in its “Quality” mode, then upscale the result to your monitor’s native resolution.
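Those internal resolutions line up with simple scale factors applied to the 1080p output. As a rough sketch (the mode names and factors here are inferred from Digital Foundry’s numbers, not taken from Intel’s official preset documentation):

```python
# Illustrative sketch: internal render resolutions implied by
# common upscaler scale factors at a 1080p output target.
# Mode names and factors are assumptions based on the quoted numbers.
TARGET = (1920, 1080)

SCALE_FACTORS = {
    "Performance": 2.0,  # 1920/2.0 x 1080/2.0 -> 960 x 540
    "Quality": 1.5,      # 1920/1.5 x 1080/1.5 -> 1280 x 720
}

def internal_resolution(target, factor):
    """Divide each axis of the output resolution by the scale factor."""
    width, height = target
    return int(width / factor), int(height / factor)

for mode, factor in SCALE_FACTORS.items():
    print(mode, internal_resolution(TARGET, factor))
```

At a factor of 2.0 the GPU shades only a quarter of the output pixels, which is where most of the performance headroom comes from.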
Digital Foundry found that XeSS added two to four milliseconds to frame times, the amount of time a frame stays on screen before being replaced. That could make the game feel less responsive, but getting more FPS evens things out a bit.
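To get a feel for what a few milliseconds means, here is a back-of-the-envelope sketch converting frame time to FPS. It assumes nothing else changes, which isn’t how upscaling works in practice (the whole point is that rendering fewer pixels shaves off far more time than the upscaler adds):

```python
# Frame time (ms per frame) and FPS are reciprocals of each other.
def fps(frame_time_ms):
    """Frames per second for a given frame time in milliseconds."""
    return 1000.0 / frame_time_ms

# Hypothetical: a game running at 60 fps spends ~16.7 ms per frame.
base_frame_time = 1000.0 / 60

# Adding a fixed 3 ms of upscaler overhead, with no other savings,
# would drop it to roughly 51 fps.
print(round(fps(base_frame_time + 3), 1))  # ~50.8
```

The same fixed overhead costs proportionally more at higher frame rates, which is why a 2 to 4 ms tax matters more to a 144 Hz player than to someone capped at 60.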
XeSS didn’t have everything its own way, though. It struggled with thin details, sometimes showing flickering moiré patterns or bands. These kinds of artifacts could definitely be distracting depending on where they showed up, and they got worse as Digital Foundry pushed the system harder.
To be fair, Nvidia’s tech wasn’t totally immune to these issues, especially in its performance modes, but it had fewer of them. XeSS also added jittering effects to water and some mild ghosting to certain models in motion: Intel’s tech could not handle Lara Croft’s hair.