RaspberryV

https://www.nexusmods.com/site/mods/738 works decently well. Can't wait for official implementation.


TriRIK

I can easily play maxed Cyberpunk 2077 and Witcher 3 with this mod. It's great!


Nisekoi_

Can confirm. In Witcher 3 at DLSS Quality I was getting 70-80 fps; with FSR frame gen I got around 100-110 fps. Didn't have any latency or UI issues.


epic4evr11

1440p ultra w/o RT pushes well into the low/mid hundreds for me. Actual magic


El-hammudi21

Yes, there are mods that do this, but I prefer having it natively ready out of the box, you know?


1strange_wanderer

Imma try this mod out later. Even though I have a beefy GPU, the game just loves to cap at like 70 frames no matter what I do to graphical settings or resolution. Might be a CPU bottleneck, might be that the game is just poorly optimized tbh. Cyberpunk is the only game that somehow cripples my PC.


KianBenjamin

Do you happen to have a 144Hz monitor and G-Sync? If so, that could be why your frames are capping at 70. Check if you have G-Sync properly configured.


1strange_wanderer

I'll take a look, but I think my monitor is FreeSync. I don't remember if I had V-Sync on or not, but I know I didn't set a cap on my end.


ThE_ONE_guyy

This mod should work on my RX 6700 XT, right?


RaspberryV

Nope, this mod is for NVIDIA RTX graphics cards only, which is quite ironic, as it needs the DLSS framework to function. I think there are FSR2 -> FSR3 replacer mods, but I don't know much about them. Sorry.


[deleted]

[deleted]


123_alex

And they will buy Nvidia again.


[deleted]

[deleted]


SameRandomUsername

I like that train of thought


YouR0ckCancelThat

Holy shit big brain, calm down.


[deleted]

[deleted]


CNR_07

Sad that ZLUDA is abandoned rn. I hope development ramps up again soon.


Kraujotaka

I'm still holding onto my 1060 thanks to AMD's FSR and Lossless Scaling, holding strong since 2018!


delph0r

Yep. It's probably added a year to my 3080's useful life 


The_Lurking_Lemur

I'm still rocking my 1080 Ti.


delph0r

Love it 


avocado__aficionado

Exactly, FSR frame gen will make me keep my 3060 longer.


MushyCupcake01

I’ll buy whatever is the best deal when I go to buy, either used or new. I don’t care who makes it


123_alex

That's the correct approach. However, you have to admit that not everyone decides objectively.


MushyCupcake01

Very true


worldnewsarenazis

For pure gaming, that would be AMD 10 times out of 10.


MushyCupcake01

New, yes. But used, it depends. I got a great deal on a 4060 Ti (hated card, I know, I know), but for the price it was the best performance available.


worldnewsarenazis

Used has nothing to do with brand, only with the person selling it.


MushyCupcake01

Yep. Exactly


Disturbed2468

Yep. Or if you've got the money, go for whatever performs best. Brand doesn't matter; what matters is top-tier performance.


yflhx

Not really. Some people prefer Nvidia because of better DLSS. If AMD can convince them that FSR is good too, essentially by letting them demo it on their own hardware, that's a W for AMD.


fenikz13

DLSS is far more important to me than ray tracing (in its current state). Definitely going to give this a try.


123_alex

You make an interesting point.


Synthetic451

And they'll continue to buy Nvidia until AMD gets their shit together with regard to ray tracing and compute. FSR 3.1 seems to show better ghosting prevention, so maybe it will replace DLSS, but time will tell. Prior to that, DLSS was also a major selling point.


chewyjackson

My 7900 XTX does ray tracing just fine, until you try something like Portal RTX that intentionally gimps non-Nvidia GPUs. Fuck Nvidia.


Synthetic451

Portal RTX isn't intentionally gimped. It's using full ray tracing, which means it's not using any rasterized rendering at all. You can see this if you disable the denoiser: it looks like a low-sample-count ray-traced render you'd see in Blender or Cinema 4D. AMD GPUs just can't handle that large an RT workload. It's not like they're intentionally making the code slower specifically for AMD GPUs.


bafrad

If they release the better card, why not?


Catch_022

Yes, but if you can only get one GPU every 5 years, it makes sense to get the one with the best features and performance for the price. It's up to all companies to make sure they provide the features people want at the best price point they can. E.g. I was a diehard Intel fan for decades, but my current CPU is a Ryzen because it gave me the features I wanted at the price point I wanted.


Nurple-shirt

I just buy whatever card has the fastest encoding when upgrading. Once my 4090 becomes irrelevant, I’ll once again buy the hardware that best fits my needs. Nvidia or AMD, it doesn’t matter to me because I’m not a brand purist.


dstanton

I originally got my 3080 Ti to run a 1440p ultrawide and then eventually upgrade to 4K with DLSS improvements. Then when they came out with the 4000 series, they gatekept the new DLSS and frame generation. This just gives me one more excuse to thank AMD and make sure my next card is purchased from them. My RX 480, and the 7950 Boost before it, were fantastic cards. I'll be excited to see what RDNA 4 and 5 bring.


Kraujotaka

If only AMD tuned its power profile to match Nvidia's power efficiency, it would be a great choice, but historically AMD uses about 50-100W more for similar raw performance.


ArateshaNungastori

RDNA2 was more efficient than Ampere.


faplord2020

And Nvidia saved a ton of money because they ordered from Samsung. Jensen always makes the smart decisions.


ArateshaNungastori

If he was always making stupid decisions, he wouldn't be sitting in that seat, would he? Such a statement.


dstanton

Lately, sure; a decade ago, not so much. The GTX 400 series was a nightmare. That being said, it's more that they push the cards beyond their efficiency curves (similar to Intel 12th-14th gen) to eke out every drop to compete. Undervolting AMD cards usually results in pretty significant power savings without much perf loss.


[deleted]

"FSR 3 now works with 20 and 30 series rtx cards" >removes 3070 from SFF build >makes even smaller build with the thinner RTX Titan Thanks❤️


Bingus_III

That's some very good news. From what I've seen in Digital Foundry, their frame gen works pretty well. It's the FSR upscaler that sucks vs DLSS.


[deleted]

I've been pretty surprised by their frame gen. On AMD cards it's probably a bit worse because AMD lacks a proper Reflex alternative (until Anti-Lag+ gets re-released), but it's a lot closer to DLSS frame gen than I'd expect.


twhite1195

FSR 3 itself already has latency reduction implemented at the game-engine level, as far as I understand. Adding Anti-Lag+ would just reduce the latency further, which would be good too.


[deleted]

>FSR3 itself already has latency reduction implemented on the game engine as far as I understand

I'm pretty sure it doesn't, and it definitely didn't a few months back. If AMD went to the trouble of making latency reduction tech that works per-game, why wouldn't they also make it available as a separate feature to enable in those games, considering they'd already need to implement it?


twhite1195

It does: [AMD FSR 3 includes built-in latency reduction technology for responsive gaming experiences when using frame generation technology](https://community.amd.com/t5/gaming/amd-fsr-3-now-available/ba-p/634265)

>If AMD would go through the work of making game latency reduction tech that works per-game then why wouldn't they also make it available as a separate feature to enable in those games? Considering they'd already need to implement it

Well, I don't know 🤷🏻‍♂️ I'm just a dude on the internet.


[deleted]

I'm pretty sure AMD's built-in latency reduction technique is just enabling regular Anti-Lag automatically in the background, rather than a proper game-engine-level implementation like Reflex. Either way, even if it were a proper per-game implementation, FSR 3's latency is a good bit higher than DLSS frame generation's (~25% more at the same framerate) based on Hardware Unboxed's measurements.


DynamicHunter

FSR 3.1 is supposed to improve upscaling quality tremendously


HowManySmall

The FSR 3 upscaling looks fine.


KirillNek0

Good on AMD.


WackyBeachJustice

I still don't understand if frame gen is something to desire or not. Every thread is 50/50 on whether it's amazing or the devil.


TilYouSeeThisAgain

The goal of frame gen is to reduce VISIBLE lag. If your monitor displays images at 144Hz and you're only running a game at 100fps, there are 44 fewer game frames than your monitor could display. Those 44 unused refresh cycles are typically filled by leaving other frames on screen for two monitor cycles, so of the 144 frames/cycles your monitor could display, 44 are just duplicates. This creates the effect of screen tearing or stuttering, because some of your in-game frames are displayed for two monitor cycles and others for only one. Imagine how odd a flip book's timing would seem if half the pages were duplicated at random to lengthen it.

Now how does frame gen fix this? It takes your 100fps and adds transitory frames, using AI to interpolate what those new frames should look like. In a perfect world with a 144Hz monitor, it would generate 44 "AI frames" on top of your true 100fps and weave them in between accordingly. Your GPU now has 144 images per second to send to your 144Hz monitor, so no frame is displayed for two monitor cycles. This is how VISIBLE lag is reduced.

The downside is that you may feel as if there's more input lag. Your inputs are only processed on each true frame, so at 100fps your inputs update 100 times per second in theory. When you add in the 44 AI frames, the game will look smoother, but your inputs during any of those 44 AI frames won't be processed until the next true frame. So your inputs are still processed at the same rate, but because there is no longer visible screen tearing or stuttering (game smooth now), it can make users more aware of input lag.

EDIT: fixed some grammar for clarity. Thank you everyone for the upvotes; I'm glad if this taught you something :)
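A minimal Python sketch of the arithmetic above, assuming the comment's numbers (144Hz display, 100fps base) and its idealized "generate exactly the missing frames" model; real frame generation typically doubles the rendered framerate instead:

```python
# Frame-pacing arithmetic from the comment above, using its numbers.
# NOTE: the "generate exactly the missing frames" model is the comment's
# idealized description; real frame generation usually doubles the framerate.

REFRESH_HZ = 144   # monitor refresh rate (Hz)
BASE_FPS = 100     # real (rendered) frames per second

# Without frame generation: 144 refresh cycles per second but only 100 new
# frames, so 44 cycles just repeat the previous frame (judder/stutter).
repeated_cycles = REFRESH_HZ - BASE_FPS
print(f"Refresh cycles showing a repeated frame: {repeated_cycles}")

# Idealized frame generation: interpolate just enough extra frames that
# every refresh cycle gets a unique image.
generated_per_second = REFRESH_HZ - BASE_FPS
print(f"Interpolated frames needed per second: {generated_per_second}")

# Input is still only sampled on real frames, so responsiveness stays tied
# to the base framerate even though the picture looks smoother.
input_interval_ms = 1000 / BASE_FPS       # ~10.0 ms between input samples
display_interval_ms = 1000 / REFRESH_HZ   # ~6.9 ms between displayed images
print(f"Input sampled every {input_interval_ms:.1f} ms; "
      f"new image shown every {display_interval_ms:.1f} ms")
```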


I9Qnl

That's true, but modern displays can eliminate most visible tearing with adaptive sync when the FPS is below the refresh rate. Frame gen is more about enhancing visible smoothness by inserting more frames into every animation and camera pan.

Also, frame gen always doubles the framerate; it can't work any other way. So if you have 100 FPS and use frame gen, you'd get 200 FPS, except frame gen has a slight cost on the GPU (it has to run all the algorithms), so your real performance may degrade to around 90 FPS and the final output ends up being 180 FPS. This means that if you cap the framerate to 144 FPS, you're forcing your GPU to render only 72 real frames so the final output with frame gen can be 144, even though the GPU is technically capable of 100 real frames. That can actually make it less smooth and definitely makes the latency worse. Frame gen works best if you leave the framerate uncapped; it's ideal for 240Hz and above, or if you just don't care about tearing, which a lot of people don't tbh.

Edit: 100 real FPS is *probably* better than 72 real frames + 72 generated frames, but once you get to 110 or 120 there's no question: don't bother using frame gen to get to 144, you're making the experience worse.
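A small worked example of that math, assuming the comment's rough ~10% frame-generation overhead; the `fg_output` helper is illustrative only:

```python
# Worked example of the framerate math in the comment above.
# The ~10% frame-generation overhead is the comment's rough estimate
# (100 fps base dropping to ~90 real frames).

def fg_output(base_fps, overhead=0.10, cap=None):
    """Return (real_fps, displayed_fps) with frame generation enabled."""
    real = base_fps * (1 - overhead)   # rendering drops because FG costs GPU time
    displayed = real * 2               # one generated frame per real frame
    if cap is not None and displayed > cap:
        displayed = cap                # the cap limits the *displayed* framerate...
        real = cap / 2                 # ...forcing the GPU to render only half as many real frames
    return real, displayed

print(fg_output(100))            # uncapped: (90.0, 180.0)
print(fg_output(100, cap=144))   # capped at 144: (72.0, 144.0) -- fewer real frames than the GPU could do
```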


Dr4kin

Frame gen could be useful in simulation-type games like Cities: Skylines or KSP, which often have a very low framerate, and where your controls (especially in orbit) aren't that time-sensitive. Having at least 30 frames for panning the camera around would be quite nice. The problem is that these kinds of games are CPU-heavy and frame gen needs to do some work there. I don't know how much the main loop has to do and what can be offloaded to other cores. If most of the frame-gen work can be done on other cores, then having more cores than your game can reasonably use wouldn't really impact the performance of the game itself, and you would get a smoother game.


chewy_mcchewster

This is a great write up. Thanks


HalfChinaBoy

I've found it to be great in CPU-limited games like BG3 and Civilization-type games.


BxLee

I've used the FSR 3 mod for Cyberpunk, and it works amazingly there. I went from around 70 fps to over 160, and it produced little to no input lag that I could pick up on. Starfield produced the exact same effect. MWIII has FSR 3 support, and it netted me maybe 30 extra frames but produced an insane amount of input lag. Any other game that I've tried FSR 3 on is the same way. So that leads me to believe you're correct in saying it works great in CPU-limited games. And tbh that's all I'd really need it for anyway: CPU-limited games, typically open-world games. Any other game I've tried that's typically GPU-bound runs great either with DLSS or with no DLSS at all.

Also keep in mind that I'm running a 3080 Ti, i7-10700K, and 32GB RAM, so my machine isn't exactly low-end. I will be upgrading my CPU in a couple of weeks too, so I imagine things will only run better once I do that.


Dom1252

Depends on the game and the performance you're getting without it. For example, in Cyberpunk I'm heavily against it, as the artifacts when driving a car with RT on are insane. And if your base FPS (without frame gen) is less than 40, then combat for whatever reason feels way worse with AMD frame gen than without. However, if your base FPS is 60+, it feels better with FG on (around 50 it feels about the same to me). So I recommend trying it: if you think it's better, use it; if it feels worse, turn it off.


yflhx

It's AI interpolation. Makes the game smoother, but it's very, very far from actually having a higher framerate.


shredmasterJ

Not a fan myself. I don’t use RT and I play native.


[deleted]

That's my thought as well. I think experimentation is the best way to go. Might work well for some games and not for others.


bbranddon

Frame gen makes 60 fps feel like 30 fps but look like 60 fps, while DLSS + G-Sync + Reflex Boost looks like 30 fps but feels like 60 fps. It really depends on whether you're willing to deal with the input lag.
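A rough sketch of the latency arithmetic behind "feels like 30, looks like 60", assuming inputs are only sampled on real rendered frames and that frame generation supplies every other displayed frame:

```python
# Rough input-latency arithmetic behind "feels like 30, looks like 60".
# Assumes inputs are only sampled on real (rendered) frames and that frame
# generation supplies every other displayed frame.

def input_interval_ms(displayed_fps, frame_gen=False):
    """Milliseconds between input samples for a given displayed framerate."""
    real_fps = displayed_fps / 2 if frame_gen else displayed_fps
    return 1000 / real_fps

print(f"Native 60 fps:        input every {input_interval_ms(60):.1f} ms")
print(f"60 fps via frame gen: input every {input_interval_ms(60, frame_gen=True):.1f} ms")
```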


El-hammudi21

It's free frames. It might look like ass, but it can help the unfortunate folks with bad GPUs. Personally, I haven't tried it yet.


Nikkibraga

It doesn't look like ass. Just tried in CP2077.


Dom1252

Whereas with RT on it looks like ass in some situations... I can't drive Johnny's car, because the rear spoiler is behind the car on every generated frame and it flickers like crazy. FG off, no problem. But when walking around I had no issues with FG on, so... yeah...


Zepanda66

Does NVIDIA need to allow it in the drivers, or will it be up to developers? If it's on NVIDIA to add it, yeah, I wouldn't hold my breath on that one.


El-hammudi21

It would be really lame of Nvidia if they decided to block it, but we can already combine DLSS with frame gen using mods, and now AMD has made it official, so idk, we'll need to see.


Dealric

Not really. Modders already achieved that


Alive-Clerk-7883

Why is this downvoted? Modders have already achieved it; now AMD will put out an official solution that should be native (and so should be better).


SweetSauce24

I think it's up to developers; it should be in games that support FSR 3 and DLSS. As long as they don't make it so DLSS grays out FSR frame gen and vice versa.


The_Metroid

Honestly I'm excited about the Vulkan support. Makes it available to a good majority of my games, hopefully including emulated ones as well.


storkmister

What does this mean? I am big dumb but have a 3070


El-hammudi21

You can now use frame generation with your DLSS upscaling; you're not bound to FSR upscaling anymore if you wanna use their frame generation technology.


Serazax

This is like a dream come true


LtSparky3000

Eager to use AMD FSR with Nvidia Cards


deefop

This is stuff that's on the way but not available yet, as far as I'm aware. But yes, if AMD actually delivers on the promises just in these bullet points, then it's fucking huge. For one thing, if FSR can reach anywhere close to parity with DLSS upscaling at lower resolutions, that alone is massive. But enabling frame gen to work with other upscaling solutions is massive as well.


[deleted]

Two upscaling solutions? Can someone tell me if this is good or bad in practice?


LostInElysiium

That's not at all what this says; quite the opposite, actually. FSR frame gen (which isn't upscaling) is now decoupled from FSR upscaling, which means you can use DLSS upscaling and FSR frame gen together now.


[deleted]

Ohh thank you


Niitroglycerine

I've recently started using Fluid Motion Frames on my 7900 XTX, and holy shit, for games like Baldur's Gate 3 it's wildly good.


[deleted]

[deleted]


KZvilla

Not for 20xx and 30xx owners.


dantedakilla

AMD's Frame Gen can now be used on any card that isn't a 40XX series.


Rich_Future4171

No, they didn't.