PC & Video Games

AMD fighting the good fight for everyone!

  • Last Updated:
  • Jul 12th, 2021 11:47 pm
[OP]
Deal Addict
User avatar
Nov 2, 2012
3673 posts
5566 upvotes
vonblock wrote: Digital Foundry did this.

FSR vs UE4 temporal upscaling for the same game.

It was a massacre, and that was with old-school Unreal Engine 4 temporal upscaling. Unreal Engine 5's is supposed to be much better.

I guess for developers using an engine without temporal upscaling, it's better than nothing....
Have you looked at more than just DF's analysis? I've watched about 8 or 9 complete reviews now, as well as read several articles, and DF is the only one that "destroyed" FSR, in your words. The portion of the DF analysis you're referring to (the last 3 minutes) also uses the worst-case settings for FSR vs TAAU: 1080p at Performance settings, the lowest setting. Watch some of the other reviews I posted. There's also a KitGuru one I timestamped below that compares FSR to TAAU in Godfall at the same settings. TAAU is slightly sharper, but FSR has the overall better image with much less shimmering. 4K @ Ultra Quality (which is what Forspoken is using) is the highest quality setting. At that setting it performs very well with minimal loss to image quality, and is likely preferable to UE4 TAAU at those settings.

I think you're overreacting by calling FSR terrible and hoping it doesn't get used. Also, the developer talking about Forspoken said he implemented it extremely quickly, which shows how convenient and hassle-free it can be.
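For reference, AMD's published FSR 1.0 quality modes correspond to fixed per-axis scale factors: Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x. A quick Python sketch of the internal render resolutions those factors imply (mode names and factors are from AMD's public docs; the rounding is mine):

```python
# FSR 1.0 quality modes and their per-axis scale factors (per AMD's docs).
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the game renders at internally before FSR upscales it to out_w x out_h."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

for mode in FSR_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode}: renders at {w}x{h}, upscaled to 3840x2160")
```

So 4K @ Ultra Quality upscales from roughly 2954x1662, while 1080p @ Performance, the case DF focused on, upscales from just 960x540, which is why those two comparisons look so different.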

[OP]
Deal Addict
User avatar
Nov 2, 2012
3673 posts
5566 upvotes
BernardRyder wrote: It's better than a kick in the pants, no doubt. With the right hardware, having FSR versus not having it sounds like a no-brainer. I have a 2070 mobile and figure gen 1 DLSS would be a joke on it. If my games had FSR support, I'd probably give it a whirl; it doesn't sound like it will cost me much performance.

I think the biggest issue is who's running the show. With AMD in charge of gen 2, I can see that taking a while to come out. It took a gen and a half of DLSS for this to come to fruition. By the time gen 2 comes out, DLSS will have implemented much better features and performance that will make comparisons even weaker.
For AMD to champion this and pour resources into it, they'll need good reasons to continue. If it at least gets strong marketing support from hardware manufacturers and game devs, it has a chance to grow. It could just as likely fall by the wayside in 5-6 years once AMD gets tired of championing it.

EDIT: now that I think of it, this would have been great 20 years ago, when there were more video card manufacturers like ATI, Nvidia, 3Dfx, S3, Matrox... FSR would probably have thrived versus DLSS because it could have been implemented across multiple manufacturers to compete with Nvidia's proprietary technology. Nvidia sort of killed that landscape though, as sadly it's now a 2-pony race.
Yeah, I think (with no proof 😄) that when DLSS 1.0 released it was such a joke to the industry, being noticeably worse than everything else out at the time, that nobody took it seriously. AMD didn't feel the need to make a competitor for it, and I wouldn't blame them. Then DLSS 2.0 came out and was actually really good, and AMD was stuck. I think it took them this long to come out with their own solution because they may only have started on it when DLSS 2.0 hit.

We'll see where they take it from here. The fact that it's already starting to be used in future games is promising.
Deal Addict
Nov 18, 2017
2222 posts
1574 upvotes
Grande Prairie AB
Caerus wrote: Have you looked at more than just DF's analysis? I've watched about 8 or 9 complete reviews now, as well as read several articles, and DF is the only one that "destroyed" FSR, in your words. The portion of the DF analysis you're referring to (the last 3 minutes) also uses the worst-case settings for FSR vs TAAU: 1080p at Performance settings, the lowest setting. Watch some of the other reviews I posted. There's also a KitGuru one I timestamped below that compares FSR to TAAU in Godfall at the same settings. TAAU is slightly sharper, but FSR has the overall better image with much less shimmering. 4K @ Ultra Quality (which is what Forspoken is using) is the highest quality setting. At that setting it performs very well with minimal loss to image quality, and is likely preferable to UE4 TAAU at those settings.

I think you're overreacting by calling FSR terrible and hoping it doesn't get used. Also, the developer talking about Forspoken said he implemented it extremely quickly, which shows how convenient and hassle-free it can be.

For what AMD promised and what FSR actually (so far) ended up being, it is terrible. It mostly benefits low-end graphics cards, which won't be running at 4K, so the IQ is terrible, and even at 4K the IQ compared to non-FSR is bad.
Deal Addict
Dec 2, 2004
1409 posts
1558 upvotes
I think you guys have misinterpreted AMD's implementation of FSR.

It's not competing with Nvidia's DLSS.

It's giving users the choice of trading image quality for higher FPS, which helps even out the playing field in competitive gaming.

Rich guy: I have an Nvidia RTX 3090 - 4K 160fps
Poor guy: I have an AMD RX 5500 - 720p 60fps ---> FSR ON = 1080p 144fps with less image detail... a better chance at fraggin' the rich guy...
[OP]
Deal Addict
User avatar
Nov 2, 2012
3673 posts
5566 upvotes
iamsiege wrote: For what AMD promised and what FSR actually (so far) ended up being, it is terrible. It mostly benefits low-end graphics cards, which won't be running at 4K, so the IQ is terrible, and even at 4K the IQ compared to non-FSR is bad.
What did AMD promise FSR was going to be? Please provide an official statement from AMD. There's a difference between what they state, and what media and people assumed FSR was going to be.

You can feel however you like, sure. But there's a difference between feelings and facts. From all of the reviews I've watched and articles I've read, the overall consensus disagrees with your statements. I'll go by their analysis, based on facts and testing. I'm not here to argue with you, but you're generalizing and blanket-calling it terrible.

Literally in the video from the post you just quoted me on, there is very little difference between 4K Ultra Quality FSR and 4K TAAU 77% upsampling, which is a direct comparison at the same internal resolution. TAAU is slightly sharper, but FSR has the better overall image with much less shimmering. Same results for 4K Quality FSR vs TAAU 67%. So if, by your standard, FSR is terrible and has terrible IQ, then since its image is better than the TAAU implementation at equal settings, I guess TAAU is terrible too?

Let's look at performance. This is taken from the same video you quoted me from.

[image: FSR.png]

Comparing FSR@Ultra Quality, FSR@Quality, Temporal 77%, and Temporal 67%, the performance of FSR is better as well. But FSR is still terrible? And it's beneficial to low-end cards only? Performance increased from 64.9fps to 92.5fps using the highest quality setting on a 6800 XT. FSR@Ultra Quality gave a 43% jump in FPS compared to native 4K. A 6800 XT isn't a low-end card, last time I checked. Seems to benefit both high-end and low-end just fine.
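Those FPS figures are easy to sanity-check. A quick sketch using the 6800 XT numbers quoted above:

```python
def fps_uplift_pct(native_fps, upscaled_fps):
    """Percent FPS gain from upscaling versus native rendering."""
    return (upscaled_fps / native_fps - 1) * 100

# KitGuru's Godfall numbers on a 6800 XT, as quoted above.
gain = fps_uplift_pct(64.9, 92.5)
print(f"{gain:.1f}% faster than native 4K")  # 42.5%, i.e. the ~43% jump cited
```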

Now let's go back to some of the other posts where you quoted me.
RT is not separate from this; DLSS makes it 100% more playable, and FSR was supposed to be AMD's answer to DLSS, which now it's clearly not. RT on AMD cards is unplayable at 4K, and FSR was also supposed to be open source, which also seems to not be the case. I have both a 6900 XT and a 3090, as well as a 3080, and I can tell you the Nvidia cards crush the AMD in anything with RT/DLSS. I am holding out hope that this article is wrong, but AMD has backtracked on many claims they made when they first released the 6000 series.
My point was Ray Tracing is a separate implementation from FSR. They operate independently from one another. Yes, you can use them in conjunction, but one does not require the other to function. There's a difference between operating independently, and one being used to assist in performance of the other.

FSR is open source.
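Since the source is public (MIT-licensed on GPUOpen), anyone can see FSR 1.0 is a two-pass spatial pipeline: an edge-adaptive upscale (EASU) followed by a contrast-adaptive sharpen (RCAS). Here's a toy Python stand-in that mimics only the pass ordering; the nearest-neighbour upscale and naive sharpen are my simplifications, not the real shader math:

```python
def nearest_upscale(img, scale):
    """Toy spatial upscale (EASU stand-in; the real pass is edge-adaptive)."""
    h, w = len(img), len(img[0])
    H, W = round(h * scale), round(w * scale)
    return [[img[min(int(y / scale), h - 1)][min(int(x / scale), w - 1)]
             for x in range(W)] for y in range(H)]

def sharpen(img, amount=0.25):
    """Toy sharpen (RCAS stand-in): push each pixel away from the mean
    of its horizontal neighbours, clamped to [0, 1]."""
    out = []
    for row in img:
        new_row = []
        for x, v in enumerate(row):
            left = row[max(x - 1, 0)]
            right = row[min(x + 1, len(row) - 1)]
            new_row.append(min(1.0, max(0.0, v + amount * (v - (left + right) / 2))))
        out.append(new_row)
    return out

def fsr_like(img, scale=1.5):
    """Pass 1: spatial upscale. Pass 2: sharpen. Same ordering as FSR 1.0."""
    return sharpen(nearest_upscale(img, scale))

frame = [[0.0, 1.0], [1.0, 0.0]]  # tiny grayscale "frame"
print(fsr_like(frame, 2.0))       # a 4x4 upscaled, sharpened frame
```

The key contrast with TAAU/DLSS is that both passes operate on a single frame: no motion vectors, no history buffers. That's why FSR is so easy to integrate, and also why it can't reconstruct detail the way temporal techniques do.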

Won't dispute your RT/DLSS performance claims; Nvidia can't be beaten there.

Here is an example of FSR allowing a game to be playable with RT, versus changing in-game settings to reach the same fps FSR provides. It's timestamped. Using FSR let the game run at a higher framerate, keep more image-quality settings, and keep ray tracing enabled. Look at all of the settings they needed to turn off by comparison.



So why is FSR just terrible, like you keep saying? The difference between you and me is that I'm looking at what it brings to the table overall. I'm not outright looking for ways to hate it, like you seem to be. I see the benefits and I see the limitations. No, it's not on the same level as DLSS 2.0, but what it offers is still very good, especially for a first implementation. The image quality is very good. The performance gains are very good. The ease of implementation is very good.
Deal Addict
Nov 18, 2017
2222 posts
1574 upvotes
Grande Prairie AB
frugal69 wrote: I think you guys have misinterpreted AMD's implementation of FSR.

It's not competing with Nvidia's DLSS.

It's giving users the choice of trading image quality for higher FPS, which helps even out the playing field in competitive gaming.

Rich guy: I have an Nvidia RTX 3090 - 4K 160fps
Poor guy: I have an AMD RX 5500 - 720p 60fps ---> FSR ON = 1080p 144fps with less image detail... a better chance at fraggin' the rich guy...
This^
Deal Addict
Nov 18, 2017
2222 posts
1574 upvotes
Grande Prairie AB
Caerus wrote: What did AMD promise FSR was going to be? Please provide an official statement from AMD. There's a difference between what they state, and what media and people assumed FSR was going to be.

You can feel however you like, sure. But there's a difference between feelings and facts. From all of the reviews I've watched and articles I've read, the overall consensus disagrees with your statements. I'll go by their analysis, based on facts and testing. I'm not here to argue with you, but you're generalizing and blanket-calling it terrible.

Literally in the video from the post you just quoted me on, there is very little difference between 4K Ultra Quality FSR and 4K TAAU 77% upsampling, which is a direct comparison at the same internal resolution. TAAU is slightly sharper, but FSR has the better overall image with much less shimmering. Same results for 4K Quality FSR vs TAAU 67%. So if, by your standard, FSR is terrible and has terrible IQ, then since its image is better than the TAAU implementation at equal settings, I guess TAAU is terrible too?

Let's look at performance. This is taken from the same video you quoted me from.


[image: FSR.png]

Comparing FSR@Ultra Quality, FSR@Quality, Temporal 77%, and Temporal 67%, the performance of FSR is better as well. But FSR is still terrible? And it's beneficial to low-end cards only? Performance increased from 64.9fps to 92.5fps using the highest quality setting on a 6800 XT. FSR@Ultra Quality gave a 43% jump in FPS compared to native 4K. A 6800 XT isn't a low-end card, last time I checked. Seems to benefit both high-end and low-end just fine.

Now let's go back to some of the other posts where you quoted me.



My point was Ray Tracing is a separate implementation from FSR. They operate independently from one another. Yes, you can use them in conjunction, but one does not require the other to function. There's a difference between operating independently, and one being used to assist in performance of the other.

FSR is open source.

Won't dispute your RT/DLSS performance claims; Nvidia can't be beaten there.

Here is an example of FSR allowing a game to be playable with RT, versus changing in-game settings to reach the same fps FSR provides. It's timestamped. Using FSR let the game run at a higher framerate, keep more image-quality settings, and keep ray tracing enabled. Look at all of the settings they needed to turn off by comparison.



So why is FSR just terrible, like you keep saying? The difference between you and me is that I'm looking at what it brings to the table overall. I'm not outright looking for ways to hate it, like you seem to be. I see the benefits and I see the limitations. No, it's not on the same level as DLSS 2.0, but what it offers is still very good, especially for a first implementation. The image quality is very good. The performance gains are very good. The ease of implementation is very good.
See my previous post: FSR is fine for older-generation video cards, but does nothing for current gen. I will take DLSS every day and twice on Sundays unless AMD can improve the IQ significantly. In 4K on my OLED it looks terrible, plain and simple. That's my opinion from trying it first-hand, not from watching a bunch of YouTube vids.
[OP]
Deal Addict
User avatar
Nov 2, 2012
3673 posts
5566 upvotes
iamsiege wrote: See my previous post: FSR is fine for older-generation video cards, but does nothing for current gen. I will take DLSS every day and twice on Sundays unless AMD can improve the IQ significantly. In 4K on my OLED it looks terrible, plain and simple. That's my opinion from trying it first-hand, not from watching a bunch of YouTube vids.
Like I said, you can feel however you like. But you can stop quoting me to repeat the same generalisations now. You don't seem to provide anything new or substantial for your arguments, so let's end it there.
Deal Addict
Nov 18, 2017
2222 posts
1574 upvotes
Grande Prairie AB
Caerus wrote: Like I said, you can feel however you like. But you can stop quoting me to repeat the same generalisations now. You don't seem to provide anything new or substantial for your arguments, so let's end it there.
First-hand usage of said technology over the last couple of days at 4K, where you say "it's great", isn't new, I guess. Have you actually used it, or just watched YouTube videos? I think I already know the answer.
[OP]
Deal Addict
User avatar
Nov 2, 2012
3673 posts
5566 upvotes
BernardRyder wrote: Forspoken devs are excited to use AMD tech

https://www.ign.com/articles/forspoken- ... resolution

It launches on PC in 2022, and I didn't realize it would be a PS5 exclusive for two years. I guess FSR will be enabled on the PS5?
Yeah, to add to this, it looks like FSR may get a good amount of support on consoles, which is good to see. It's already been added to the GDK for Xbox/Windows. This might give a decent assist to the Series S.

They should get CDPR to use it for the console ports of Cyberpunk 2077 😄.

https://www.theverge.com/2021/6/24/2254 ... es-support
Microsoft is supporting AMD’s answer to DLSS on Xbox consoles

AMD’s FSR goes into preview on Xbox today

Microsoft is enabling Xbox developers to utilize AMD’s answer to Nvidia’s DLSS on console games. FidelityFX Super Resolution (FSR) appeared recently on Windows PCs with the aim of boosting frame rates and maintaining image quality. It’s AMD’s answer to Nvidia’s Deep Learning Super Sampling (DLSS) and early tests have shown it’s a promising technology that will work across multiple games and platforms.

FSR is now making its way to Xbox games, too. Microsoft has started previewing support for FSR in its Game Development Kit for both Windows and Xbox today. “It’s supported on Windows, Xbox Series X and S, and Xbox One consoles,” says Jason Ronald, Microsoft’s director of Xbox program management. “FSR was designed to enable developers to achieve higher frame rates and resolutions with minimal work for developers across both console and PC.”
Deal Fanatic
User avatar
Aug 29, 2011
5081 posts
3259 upvotes
Westmount (Montreal)
Caerus wrote: Have you looked at more than just DF's analysis? I've watched about 8 or 9 complete reviews now, as well as read several articles, and DF is the only one that "destroyed" FSR, in your words. The portion of the DF analysis you're referring to (the last 3 minutes) also uses the worst-case settings for FSR vs TAAU: 1080p at Performance settings, the lowest setting. Watch some of the other reviews I posted. There's also a KitGuru one I timestamped below that compares FSR to TAAU in Godfall at the same settings. TAAU is slightly sharper, but FSR has the overall better image with much less shimmering. 4K @ Ultra Quality (which is what Forspoken is using) is the highest quality setting. At that setting it performs very well with minimal loss to image quality, and is likely preferable to UE4 TAAU at those settings.

I think you're overreacting by calling FSR terrible and hoping it doesn't get used. Also, the developer talking about Forspoken said he implemented it extremely quickly, which shows how convenient and hassle-free it can be.

I maintain my point.

I prefer a bit more noise with a much better reconstruction technique that looks more 4K-ish than FSR does.

I won't argue with you; the Digital Foundry team did it better in their latest video (watch the first 30 minutes). They are the pros of image quality.

The point is that FSR is a cheap technique, not meant for true reconstruction, and based on an approach that everyone in the industry abandoned years ago.

Deal Addict
Nov 18, 2017
2222 posts
1574 upvotes
Grande Prairie AB
BernardRyder wrote: DLSS can benefit from FSR's goodwill

https://www.digitaltrends.com/computing ... coattails/
That still doesn't change the fact that the IQ is terrible compared to DLSS. I am not going to argue with people over whether it "works" or whether AMD claimed it was a DLSS equivalent (they did, and so did the media). It's still vastly inferior to DLSS, but it's great that it works on lower-end cards.
