Can Intel's XeSS Kill Nvidia's DLSS?

LateToTheParty
2 years ago

Originally published on Publish0x.

On August 19, Intel held its Architecture Day 2021 presentation. The roughly two-hour event featured the Alder Lake architecture on the CPU side and the newly named Arc architecture on the GPU side. Overall, the presentation was very solid and I found the products worth getting excited about. It was definitely one of Intel's better showings, unlike past events where it would oddly mention its competition more often than its own products or speak in word salads.

What particularly caught my eye was Intel revealing its own upscaling solution to compete with Nvidia's DLSS and AMD's FSR: XeSS. Not only will we get a (much needed) third competitor in the GPU market, but we may get an upscaling solution that adopts the strengths of both DLSS and FSR. The XeSS segment of Intel's Architecture Day 2021 presentation lasted about six minutes and packed in a lot of detail about how the technology works.


The XeSS portion starts at 5:30.


Judging from Intel's diagram of how XeSS generally works, it is more similar to Nvidia's DLSS than to AMD's FSR. XeSS reconstructs an image from motion vectors and past frames; in other words, it uses a temporal method just like DLSS. Not to mention, Intel even hired an engineer who worked on DLSS to help build its own solution.


The resemblance is uncanny.
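Intel has not published XeSS internals, but the "motion vectors + previous frames" pipeline in its diagram is the classic temporal-accumulation idea. Here is a minimal toy sketch of that core step; the function name, array layout, and blend weight are my own illustrative choices, not Intel's:

```python
import numpy as np

def temporal_accumulate(history, current, motion, alpha=0.1):
    """Blend a newly rendered frame with history reprojected via motion vectors.

    history: accumulated result from previous frames, shape (H, W)
    current: the frame just rendered, shape (H, W)
    motion:  per-pixel motion in pixels, shape (H, W, 2) as (dy, dx)
    alpha:   weight of the new frame; the remainder comes from history
    """
    h, w = current.shape
    ys, xs = np.indices((h, w))
    # Reproject: fetch each pixel's value from where it was last frame.
    prev_y = np.clip(ys - motion[..., 0], 0, h - 1).astype(int)
    prev_x = np.clip(xs - motion[..., 1], 0, w - 1).astype(int)
    reprojected = history[prev_y, prev_x]
    # Exponential moving average: every output pixel effectively pools
    # samples from many past frames -- the "more data" a temporal method sees.
    return alpha * current + (1 - alpha) * reprojected
```

In a static scene the accumulated history converges toward the true image; real implementations layer sample jittering and history clamping on top of this core loop.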


AMD's FSR, on the other hand, is not temporal but strictly spatial. It runs an upscaling and sharpening pass after rasterization and lighting, but before post-processing and UI elements. The end result looks much better than traditional bilinear upscaling. Because it also does not require any machine learning, FSR is very easy for developers to implement. That said, because it is upscaling with less data than a temporal solution, some details will inevitably get lost. It really becomes noticeable at lower resolutions like 1080p. DLSS, however, can upscale to 1080p with quality comparable to the native image because it is working with more data from past frames.
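To make the spatial/temporal distinction concrete, this is what the "traditional bilinear upscaling" baseline looks like: a purely spatial filter that can only interpolate between the pixels of the single frame it is given, so detail finer than the source grid is unrecoverable. The code is my own illustration, not AMD's:

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Upscale a single-channel image by an integer factor with bilinear filtering."""
    h, w = img.shape
    out_h, out_w = h * scale, w * scale
    # Map each output pixel back to a fractional source coordinate.
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    # Weighted average of the four nearest source pixels.
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```

FSR's edge-adaptive upscale plus sharpening pass beats this baseline, but it still only ever sees the one frame, which is why fine detail suffers at low target resolutions.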



Based on the short demo running on Unreal Engine, XeSS showed image quality comparable to a native 4K image from just a 1080p internal render (i.e. the demo runs at 1080p and XeSS upscales it to 4K). Intel claimed that framerates can improve by up to 2x, though it did not disclose how close or how often XeSS can hit that theoretical maximum.


I can glean roughly how much of a boost XeSS offers relative to native 4K, but for goodness' sake, offer some sort of scale on the axis.
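Unlabeled axis aside, the pixel arithmetic behind the claim is straightforward. These are my own back-of-the-envelope numbers, not figures from Intel:

```python
# A 1080p internal render shades a quarter of the pixels of native 4K.
native_4k = 3840 * 2160       # 8,294,400 pixels per frame
internal_1080p = 1920 * 1080  # 2,073,600 pixels per frame

shading_ratio = native_4k / internal_1080p
print(shading_ratio)  # 4.0
```

Shading work drops roughly 4x, yet the claimed framerate gain tops out at 2x: the upscale pass itself takes GPU time, and parts of each frame (geometry, game logic) don't scale with resolution, so real-world gains land well below the raw pixel-count ratio.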


Another caveat worth mentioning is that the demo was rather calm, i.e. there wasn't much action or movement going on. One of DLSS's biggest weaknesses is ghosting artifacts. Ghosting is where traces of previous frames get superimposed on the current one, creating a smudgy, blurry look. Since DLSS relies on previous frames for data, it's no surprise that some ghosting occurs. Because XeSS uses the same method, it will share the same weaknesses. XeSS is extremely impressive with still images and inactive scenes, I'll give Intel that, but how will it fare with something more hectic?
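Ghosting falls straight out of the temporal math. In this toy 1-D simulation (my own illustration, not Intel's or Nvidia's code), a bright dot moves one pixel per frame while the accumulator ignores motion, so stale history leaves a fading trail behind the dot:

```python
import numpy as np

# Toy 1-D "scene": a bright dot moves one pixel per frame. Accumulating
# history naively (no motion reprojection) blends stale positions into
# the output, producing a fading trail behind the dot -- i.e. ghosting.
alpha = 0.5            # weight of the newly rendered frame
history = np.zeros(8)
for t in range(4):
    current = np.zeros(8)
    current[t] = 1.0   # the dot's position this frame
    history = alpha * current + (1 - alpha) * history

print(history.round(4))  # values: 0.0625, 0.125, 0.25, 0.5, then zeros
```

Real temporal pipelines reproject history along motion vectors and clamp it against the current frame's neighborhood precisely to suppress this trail; the failure cases show up where motion vectors are missing or wrong, such as particles and transparencies.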

While the jury is still out on how well XeSS will perform relative to FSR and DLSS under similar workloads, I'm really happy that Intel is following AMD's lead in making XeSS available to all GPUs. Considering that Intel will effectively start at 0% market share in the discrete GPU market, making XeSS available to everyone is a good move. However, I think Intel is also being clever with a hybrid, semi-proprietary approach: leveraging the Matrix Engines (XMX) in its Arc GPUs to reduce XeSS's overhead. Intel said that once XeSS matures, the technology will go open source. It would definitely be interesting to see an fshack-like implementation of XeSS on Linux, similar to what Georg 'DadSchoorse' Lehmann accomplished with FSR.


Not only can you use FSR on Steam, but also on Lutris (0.5.9 or newer) for your non-Steam games.


So can Intel's XeSS kill Nvidia's DLSS? Technically, anything is possible, especially since XeSS is doing the same things as DLSS. However, it greatly depends on how well Intel executes the technology. XeSS looks fantastic with inactive scenes. However, knowing the inherent weaknesses with temporal upscaling, will the image still look great when there's a lot of chaos? If Intel can get it to work well on action-packed games like Doom Eternal from the get-go, then Nvidia will definitely feel the pressure. Going with a vendor-agnostic approach was a smart idea as developers will be more willing to implement technology that will be accessible on all GPUs, even integrated ones.

Intel still has a long way to go, but so far, it is ticking the right boxes.


Comments

No one really cares about these technologies. What Intel needs is to keep up with the low-end dedicated GPU solutions from Nvidia and AMD, and try to close the gap in the midrange. That strategy would allow it to slowly push Nvidia out of the industry. Of course, Nvidia is aware of this; that's why it is so desperately trying to gain more market share by creating SoCs.

2 years ago

Uh... I can literally run FSR on any game right now on my Linux desktop whether it runs on Vulkan, DXVK, or VKD3D. That sort of utility is extremely useful for prolonging the life of older hardware.

Why you are presenting XeSS and Intel filling the low-end dGPU niche as mutually exclusive is baffling. If people can play games at 1080p 60fps on just an iGPU thanks to XeSS, then yes, they will care. It will save a whole lot of money. If XeSS (and let's add FSR 2.0 here, too) allows low-end dGPUs to run games at 1440p 60+fps, then yeah, people will care.

2 years ago

good luck running my games with it then

2 years ago

Wat... why would I care about games that I don't own???

2 years ago

so you can't, case closed

2 years ago

You can't close a case with a non sequitur. Spewing a bunch of random stuff that has no relation to your thesis is equivalent to conceding the point.

2 years ago

I don't think we agree on this. You introduced a technology proposed by Intel, but it is only relevant to a small segment of AAA gamers. Then I mentioned that they should focus more on closing the mid-range performance gap instead of creating technologies like this, which is true. Then you told me it can run "literally any game", which it actually can't, because it can't run my games. XeSS, DLSS, and FSR are trying to fix a problem that doesn't exist outside the heads of AAA gamers, and it seems irrelevant to the market as a whole.

2 years ago

This was your thesis:

No one really cares about these technologies.

"These" is plural. So other than XeSS, what other technologies were you referring to? FSR and DLSS.

My rebuttal:

Uh... I can literally run FSR on any game right now on my Linux desktop whether it runs on Vulkan, DXVK, or VKD3D.

This effectively makes this claim of yours a strawman fallacy:

then you told me it can run "literally any game", which it actually can't

(1) I mentioned FSR, specifically. The context of your claim only applies to XeSS. (2) My comment on FSR only applies to Linux. (3) I also qualified my rebuttal that as long as whatever game I run on Linux uses Vulkan, DXVK, or VKD3D, FSR would work.

because it can't run my games.

If I can run any game that I own on my Linux desktop with FSR, then I don't care about your games.

XeSS, DLSS, and FSR are trying to fix a problem that doesn't exist outside the heads of AAA gamers

It allows games on my Linux desktop to run at higher framerates on my RX 580, which is a fairly old GPU. And because the fshack FSR solution does not depend on developers implementing FSR in their games, I'm not at their mercy. All I need to do is add a command to Steam's launch options or click a slider in Lutris.

Also, there are two major flaws in this argument:

Then I mentioned that they should focus more on closing the mid-range performance gap instead of creating technologies like this

(1) Focusing on mid-range performance and focusing on XeSS are not mutually exclusive. Intel can do both, and the company has the money to do so. Your argument basically sums up to "How DARE Intel focus on something I don't care about even though they are already addressing the other thing I do care about!! Cater to me only! MEEE!!!!"

(2) Intel starts at 0% market share. Just being "good enough" or even "better" in the mid-range will not move the needle because there is no unique value proposition: https://en.wikipedia.org/wiki/Unique_selling_proposition . XeSS gives Intel's GPUs that UVP. While XeSS can run on all hardware, the XMX engines on the upcoming cards will reduce the overhead.

2 years ago

"How DARE Intel focus on something I don't care about" It's not just me who doesn't care; it's the market. There are maybe a few million people worldwide, out of the few tens of millions of active AAA PC gamers, who are really interested in this technology. You may think this is an important market because these people are loud, and maybe you have built a social echo chamber around them, amplifying their voices. In reality, the overall user base of PC and laptop hardware is about 3-5 billion people. A technology that targets a small minority, one tinier than a drop of water in an ocean, from a company that aims to compete for the overall chip market of 10-20 billion devices per year, whispers that it's a desperate move. Probably because they can't put down an actual achievement, so they come up with totally irrelevant technology for the sole purpose of getting into the headlines again.

2 years ago

It's not just me who doesn't care; it's the market.

What a definitive statement.

There are maybe a few million people worldwide, out of the few tens of millions of active AAA PC gamers, who are really interested in this technology.

And what a non-definitive statement with that qualifier. *Facepalm*

If the market didn't care, Nvidia would be ditching DLSS by now and AMD would no longer be supporting FSR. The fact that the number of games supporting DLSS keeps increasing is an easy rebuttal. As for FSR, because it does not require machine learning, it's super easy for developers to implement. Heck, for one studio, it only took one person a few hours.

As for XeSS, if it's good, then it'll offer the best of DLSS (better image quality than native) and FSR (platform agnostic and open source).

A technology that targets a small minority

Ah, the moving-the-goalposts fallacy. If Nvidia, AMD, and Intel all find upscaling tech worth investing in, then whether the tech targets a small minority doesn't matter. It only matters whether there are enough consumers who care. (Spoiler alert: there are.)

Oh, and be sure to add another few million from Steam Deck sales (it sold over 110,000 within the first 90 minutes). Oh whoops, I rebutted your argument again...

2 years ago

110 000 wow, you are more clueless than i thought :D

2 years ago

110 000 wow, you are more clueless than i thought :D

Yeah, I'm surprised that you're talking to yourself all of a sudden. Are you suffering from dementia, by any chance?

Oh, and it's 110,000 within the very first 90 minutes of preorders opening. The fact that if you try to preorder a Deck now, you can't get one until Q2 2022 is an indicator that it will sell more than just a few million. But I guess it's rather unfair of me to expect a geriatric person to remember that tidbit and do some basic linear regression. ¯\_(ツ)_/¯

2 years ago