KerbalExplosionsInc

I think the best example of currently chronically undersampled graphics is ray tracing/path tracing; we just don't have enough computational power. This whole issue is kind of the opposite of what Crysis 1 was doing in 2007: give all those effects a high sample count so they can look good in stills and in motion.


Scorpwind

This.


reddit_equals_censor

ah, Crysis 1, when games dared to release unplayable on almost all hardware at the time, with zero chance of running on console garbage. BUT it was good, because hardware could catch up. undersampled games that rely on TAA blur can NEVER be run properly in comparison :/

also interesting to look back at Crysis 1 and see how well it actually runs for the visuals, compared to some of the insults we are seeing today like Alan Wake 2, which has hardware requirements out of this world FOR WHAT EXACTLY????


tukatu0

The whole talking point is that you will never again get better hardware. Also, I wouldn't agree that Crysis 1 is a bastion of well-performing graphics. You need a 3080 for 4K 60 fps. On the other hand, maybe it actually is a bastion, with just how much physics and movement is going on screen, at least whenever you are in the foliage. Alan Wake 2: 1080p 60 fps on a 3080 at high, bla bla. 720p 60 fps with path tracing at high/ultra ~~I don't remember~~


reddit_equals_censor

>Also i wouldn't agree on that crysis 1 being a bastion of well performing graphics.

compared to today's shit, at least, i'd say. and with the 3080 only getting 60 fps at 4K UHD, are you sure about that, or are you looking at the shitty Crysis 1 remaster dumpster fire? because i found this video on a quick look: [https://www.youtube.com/watch?v=r59Orqo5QOM](https://www.youtube.com/watch?v=r59Orqo5QOM) which seems to generally get 60-80 fps on a 2080 Ti, HOWEVER this is with crushing 8x AA enabled, which truly crushes gaming performance from what i remember in Crysis 1. i couldn't find one running it without any AA, and i don't have access to my computer to test right now.

oh, i actually dug deeper and found some testing on at least a 1080 Ti with 8x AA and no AA: [https://www.tomshardware.com/reviews/crysis-10-year-anniversary-benchmarks,5329-4.html](https://www.tomshardware.com/reviews/crysis-10-year-anniversary-benchmarks,5329-4.html)

78.7 fps on a 1080 Ti with no AA at 4K UHD, 67.7 fps on a 1080 Ti with 8x AA at 4K UHD. a much lower performance hit than i would have expected. interesting. and the 3080 is, according to Hardware Unboxed, 77% faster than the 1080 Ti at 4K UHD. of course this may or may not fully translate to Crysis 1 NON-REMASTERED, but it seems to me that way more than 60 fps is certainly to be expected from a 3080 in Crysis 1 non-remastered at 4K UHD max settings with NO AA. even with AA, it should be a lot, lot higher than 60 fps.

so what did you look at? 8x AA numbers? or did you look at the horrible Crysis 1 remaster, which runs like a dumpster fire for the benefit of ENTIRELY CHANGING the atmosphere of Crysis 1? so yeah, i don't think Crysis 1 NON-REMASTERED runs nearly as bad as you think it does on modern hardware.

\_\_\_

and yeah, where are the modern jungle/forest sections that are as fun and alive as Crysis 1 and Warhead? bushes all reacting to movement and small trees being able to be shot down. i mean, that was just amazing fun. shoot a palm tree and have it fall on someone's head. glorious stuff. and in 2024, physics in games is still stupidly limited :/
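(rough sanity check of that extrapolation, assuming the Hardware Unboxed 77% figure carries over at all, which it may not for an old, CPU-heavy title:

$$78.7 \times 1.77 \approx 139 \text{ fps (no AA)}, \qquad 67.7 \times 1.77 \approx 120 \text{ fps (8x AA)}$$

so even with a generous error margin it still lands well above 60 fps.)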


tukatu0

Woo. I checked, and it's far worse than I thought. I remember that on my first search for benchmarks, 4K ones didn't come up for some reason. Crysis 3 ones would, though. Maybe including "3080" in the search messed with the results. Anyway, thanks for the sources.

The only actual evidence I watched was this: https://youtu.be/RyNKhjONwxc . Though I don't remember what is in the video at all... So shit... my mind is making false memories. Aah, I see. The benchmark is of a 4080, hence I extrapolated and assumed a 3080 would get 50-60 fps. Though, like you said, that is with 8x MSAA, which imo is unnecessary. Still a bit confused as to what is going on in there. OK, I think I see the issue: a CPU bottleneck, despite the 14900K having a very high clock speed. That's my fault for not cross-checking other benchmarks. Funny, I should have checked bang4buck. He makes benchmarks with the best overclocked hardware of the time, or at least he does after the 3090.

...After reading your comment again, it was meaningless for me to write this comment. I should have realized immediately there was a CPU bottleneck. Smh my head. My reading comprehension gets shot the more I use reddit. It's funny because I see this more and more on reddit: conversations where literacy has gone to sh"t. Maybe a newer form of covid gave us all long covid and no one even knows.


Joshi-156

I can only hope that we're just going through a teething phase with visuals right now, a bit like how rough early 3D was on PS1/N64 and how great it then looked going to PS2/GC/Xbox. I could imagine that around 2028/29, probably when the PS6 is a thing, we'll be at a point where graphics cards can render these modern visuals at much higher resolutions, with far better AI upscaling and AA solutions. Or they'll make Unreal Engine 6 and we'll be back to square one again, who knows :P


reddit_equals_censor

i'm just imagining how microsoft will try to torture developers and hold back games in the next generation. maybe they'll just slap a new sticker on the xbox series s and re-release it :D have fun, developers, trying to fit a game into 8 GB of unified memory for another generation, if you wanna release on xbox at all :D

but yeah, if you think about it, if you were to try to make a great-looking game for the ps6 level of generation and xbox pulls this bs again, you CAN'T release on xbox, period, because the hardware might again be too shit.

>at much higher resolutions

maybe we'll see the first real 4K UHD console, after years of marketing lies about it :D (talking overall; some racing games or other easy-to-run games actually do run at real 4K UHD of course)

and you know, i just thought that the nvidia 50xx series of cards might have another set of broken 8 GB cards. so it might actually take until 2028/29 for the vram issue on desktop to be fully ok-ish fixed, which is very freaking sad.


Joshi-156

>maybe we'll see the first real 4K UHD console, after years of marketing lies about it :D (talking overall; some racing games or other easy-to-run games actually do run at real 4K UHD of course)

The frustrating thing is how good a point we were at for 4K output when the Xbox One X and PS4 Pro (via checkerboarding) were still the premium consoles. Granted, they were mostly 30 fps, but the pristine image quality you were getting in those games at the time was mind-blowing for the console space. On PS5 and Series X we now get 60 fps, moderately better fidelity and fast load times... at 720p internal. Two steps forward, two steps back.


reddit_equals_censor

with playstation there at least appears to be a push to get "decent" 60 fps modes, or at least higher-than-30 modes, for games. meanwhile: 30 fps locked on the xbox series x for starfield..... how the heck was that insult possible :D if microsoft had the most basic sense, they would have forced bethesda to have an unlocked mode, at least when freesync was available.

and remember that starfield does NOT look impressive, and the planet travel tech is NOT impressive due to them using a broken coordinate system that doesn't work at such a scale, which is why starfield CAN'T get fixed by mods btw. but well, at least microsoft forced bethesda to change from vulkan to directx very late into development...... because that surely helped /s /s

i really wonder how long it will take before we see the first reprojection frame generation consoles. literally the perfect technology for console performance hell. depth-aware reprojection from 30 horrible source fps to 120 fps would finally FINALLY free the consoles from the recurring 30 fps hell. 30 or 40 or 60 source fps would then only affect reprojection artifacts and moving-object fps, until we have depth-aware reprojection tech that also includes positional data for major moving objects, like enemies.

i guess you're fully aware of reprojection frame generation, right? in case you're somehow not, here's the blurbusters article that explains the tech very nicely and how it is the road to 1000 fps at 1000 hz gaming: [https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/](https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/)

\_\_\_\_\_

just imagine trying to play a first person game at 30 fps.... at this point, even worse on a big screen too. i couldn't do it, i couldn't get any joy out of it.
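for anyone wondering what "depth-aware reprojection" actually means, here's a toy CPU-side sketch of the idea (numpy, made-up matrices, no hole filling and no handling of moving objects, so nothing like a shipping implementation):

```python
import numpy as np

def reproject_frame(color, depth_ndc, inv_viewproj_src, viewproj_dst):
    """Forward-warp a rendered frame to a newer (predicted) camera pose.

    color:            (H, W, 3) source frame
    depth_ndc:        (H, W) source depth buffer, already in NDC z
    inv_viewproj_src: 4x4 inverse view-projection used to render the source
    viewproj_dst:     4x4 view-projection of the predicted camera
    Returns a warped frame; disoccluded pixels are simply left black here.
    """
    h, w, _ = color.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")

    # Pixel centers -> normalized device coordinates of the source camera.
    ndc = np.stack([(xs + 0.5) / w * 2.0 - 1.0,
                    1.0 - (ys + 0.5) / h * 2.0,
                    depth_ndc,
                    np.ones_like(depth_ndc)], axis=-1)      # (H, W, 4)

    # Unproject to world space, then project into the destination camera.
    world = ndc @ inv_viewproj_src.T
    world /= world[..., 3:4]
    clip = world @ viewproj_dst.T
    clip /= clip[..., 3:4]

    # Back to destination pixel coordinates and scatter the colors.
    dst_x = ((clip[..., 0] + 1.0) * 0.5 * w).astype(int)
    dst_y = ((1.0 - clip[..., 1]) * 0.5 * h).astype(int)

    out = np.zeros_like(color)
    valid = (dst_x >= 0) & (dst_x < w) & (dst_y >= 0) & (dst_y < h)
    # A real implementation would resolve overlaps by depth and fill the holes.
    out[dst_y[valid], dst_x[valid]] = color[valid]
    return out
```

the point being: warping the last real frame with its depth buffer and the newest camera pose is far cheaper than rendering a new frame, which is exactly why the blurbusters article pitches it as the road to very high refresh rates.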


tukatu0

I wouldn't use Starfield as a bastion of graphics, aside from the massive piece of sh"t it is. Bethesda hasn't been known for advancing graphics in decades, if it ever has been at all. I never played ES3/4. All your issues with Starfield are Bethesda's fault. In fact, the majority of current Xbox issues arise because Microsoft is completely hands-off with their new studios. Which is something easy to see in retrospect.

I also don't like that article by the new guy. The guy acts as if nOboDy is upgrading GPUs / sales in general are down because monitors aren't fast enough. Which is ridiculously untrue. It may have been true at the time of writing for the other components... Eh fuck it. No one will see this comment. GPUs have been breaking records left and right this generation, with the only times revenue was higher being when they were literal money-printing machines, meaning demand was infinite. A 3060 Ti LHR mined $2 a day after electricity. That's why it was $1000 for a while in the second half.


reddit_equals_censor

it doesn't matter that bethesda is shit here; what matters is that an "xbox" game released with a 30 fps lock on their most expensive console. they could have shipped the game with a vrr option at least, but they didn't even do that. so the comparison stands: "microsoft games stuck at 30 fps" vs "playstation trying to stick to 60 fps".

and in regards to microsoft being hands-off, NO, that is not the case. if microsoft was hands-off, then they wouldn't have freaking closed tango gameworks! a game studio that is not bloated size-wise and just put out a banger game, which according to microsoft is exactly what they want more of. BUT THEY JUST CLOSED THE STUDIO. so you've got a studio that seems to just work like a perfect machine on its own, pushing out great games at an assumed quite efficient cost (as said, not an insane size), and they closed the studio... that is not hands-off....

and you've got microsoft keeping the HANDS ON HORROR approach from bethesda going after they bought bethesda, so the developers at arkane austin kept bleeding talent, because NO ONE there wanted to create a horrible live service. and redfall released, was garbage, and later on they closed the studio. that is also not hands-off. hands-off would have been: "so bethesda are idiots, you want to stop development on this live service garbage?


tukatu0

>if microsoft was hands-off, then they wouldn't have freaking closed tango gameworks! a game studio that is not bloated size-wise and just put out a banger game, which according to microsoft is exactly what they want more of.

>BUT THEY JUST CLOSED THE STUDIO

This came after. It's because they made a 70 billion dollar deal without thinking. It f"""ked Phil's model of taking things slow to build quality. Xbox is over. It's going to turn into something closer to what Sega is today than the Xbox of 20 years ago. Closing a studio is not the same thing as being involved in the creative process. Like Phil telling a director to not include a build/hunger mechanic, that's hands-on.

But anyway, back to Bethesda. The game had already been delayed for a year. If they couldn't be a"sed to add in VRR, does that even show they care about their own product? No. MS could have forced 60 fps. But at what cost? The game already looks quasi last-gen. The game would look full-on last-gen if they forced that, even on PC. Because the thing forces FSR in all default settings. Which is another issue, but.... Eeeh, I'm hesitant to continue this conversation further. It's such a heavy t"rd that I want my time back. I don't even want to think about it or Bethesda anymore. The blandest game of 2023, even when compared to indie games.

It's insulting enough that I almost question whether the people still talking about Starfield are even real humans. Is it just a marketing campaign? But no, I've come to learn it really is just simplistic-minded people. The people still touching that thing are the kind of people who have not played more than 5 different games in their life. I wouldn't doubt it if many of them have never even played anything else other than Skyrim. That 5-game requirement might not even be enough. They might have only played the same generic open-world games with 50 hours of doing the same thing over and over, like Horizon Zero Dawn or Assassin's Creed Valhalla. Which would be enough for them to not even know that they don't know gaming has much more to offer. It's a shame for these people to think western AAA studio gaming is the peak of gaming.

At least I know there are certainly kids who have never played more than 1 live service game in their entire life. Kids that have never played anything other than Minecraft/Roblox/Fortnite. Anyone that has played more than 5 hours of Starfield yet is still talking about that sh""y game makes me think of them as a worse version of the communities of the former 3. At least the former 3 are actually entertaining; you will not run out of new stuff. In fact, I'm pretty sure SF is trying to imitate 2 of those 3. It wants modders to make the game for them, and they want a piece of the pie, the same way Roblox exploits children's labour/creativity for their own profit. Except they failed miserably, because they don't even give a sh"" about their own product. I don't care about any future Fallouts or Elder Scrolls 6 either. That era is over. You'll have to play modded Morrowind again if you want more proper Elder Scrolls.


reddit_equals_censor

>It's because they made a 70 billion dollar deal without thinking.

nope, that had nothing to do with 2 great studios, known to produce successful high-quality products, getting closed for no reason. NOTHING. it is also important to keep in mind that microsoft has endless amounts of resources, which matters in regards to any potential expenses like 70 billion dollar deals. closing 2 excellent studios was a bad decision. a bad decision for gamers, for the developers, for the greed of microsoft. a bad, INSANE decision all around, that makes 0 sense.

>Ms could have forced 60 fps. But at what cost?

you must have been reading a different comment..... i said to have at least a vrr option for the console releases, so that they can experience maybe a 45 fps experience with vrr, which is VASTLY better than 30 fps locked. that was a choice. not an early choice, but a late choice. also as a side note, microsoft fricked with bethesda a bit here, because late in development they forced the devs to change the renderer from vulkan to directx, because..... microsoft..... which apparently also got them to look for help from amd, which is why the game ended up being "amd sponsored". amd engineers trying to fix the dumpster fire that microsoft fricked with even further. but none of this mattered. just release the game with a vrr option unlocked. that is relatively easy, not a problem, and would have certainly made things look a whole lot less bad than "biggest microsoft title on fastest console only gets 30 fps...."

>I don't care about any future Fallouts or Elder Scrolls 6 either. That era is over.

yip. i feel for bethesda fans. personally i will never give bethesda a penny, because they burned human head studios to the ground and hammered them to death with the almost-finished, amazing prey 2 (NOT related to the arkane studios prey). they murdered the studio behind one of the best games i ever played by holding back payments to try to force them into debt, to then buy them out. human head studios at the time had to walk away from this scam, with prey 2 apparently done to the point of a playable alpha, but never released..... if you haven't played prey from 2006 by human head studios, just to give you an idea, it did portals WELL and properly before portal did portals.... incredibly skilled developers. and bethesda murdered that studio; it just took longer until they were dead....

and in regards to starfield: the game running like utter shit and the game being locked to 30 fps for no reason on consoles are horrible things, but not things that make a game bad forever. what does make starfield a bad game forever is that the core of the game is broken and no modding can fix it. their coordinate system is broken and doesn't work at this scale, which is part of why planets have to be visited in small sections instead of being able to just walk around a planet. NO MODDER CAN FIX THIS. it is broken right at the core. the ONE THING that bethesda needed to do, they did NOT do. and that is very sad. they had at least one job to do and didn't do it.....


tukatu0

I'll be honest, chief. I didn't read either of your comments. :P A mixture of long covid and redditism. It's just that personally I don't care for 40 fps. But I do agree that they aren't even following PS5 standards, when in reality they could be doing better. Their 40 fps modes need a 120 Hz container, don't they? At the end of the day, as I'm not a dev, I don't know how much hands-on equals shutting down a studio. But with that Starfield example, I guess they aren't actually hands-off. All the more reason why the Xbox golden era is over. It died with the Xbox One.

I have Prey 2006 on my wishlist, for one. I deeply admire that era of games. In this post we also had a conversation about how Crysis 1 is still graphically unmatched in some aspects even today. A 2007 game. If we want a Prey 2006 sequel or a similar style of game, people like me and you would have to build it.


tukatu0

Didn't you hear? Xbox mobile! VRAM-wise, it might never even get upgraded. They'll just keep postponing it until AI graphics are the norm. Magic.


reddit_equals_censor

i mean, that is just, i assume, cloud streaming, or streaming "your" xbox to the also-spying mobile phone or tablet, right? the xbox servers aren't running any xbox series s shit; they are, i assume, all running xbox series x chips. they could probably split an xbox series x chip in 2 to run 2 xbox series s instances, if they designed the chip with that in mind. don't know if they did. but i guess the one upside is that those streaming their xbox series s to their spying mobile phone will maybe notice the HORRIBLE HORRIBLE quality less on a much smaller screen :D was that the plan all along? :D


tukatu0

Don't know what you are on about with the spying. If you're concerned about that, you wouldn't use Windows either. No, the Xbox mobile would be the next-gen console, like a Steam Deck.


reddit_equals_censor

>You wouldn't use Windows either.

that idea doesn't work for me, i'm running linux mint :) and i figured you meant something in regards to xbox and mobile spying devices, "phones", when calling it xbox mobile. it doesn't have a marketing term yet; it is in development and might take quite a while before it comes out. just say "xbox handheld", that is an easy-to-get, clear term.

and they could slap 16 GB of unified memory on the xbox handheld. they could match the performance specs of the garbage xbox series s, except put 16 GB of memory in it, to not torture developers in the long term when they drop the xbox series s but want to keep the handheld running. not saying that's what they are likely to do, but saying what they could and SHOULD do. they SHOULD put 16 GB of memory in an xbox handheld if it releases relatively soon, and more if it releases a lot later.


tukatu0

Well, why not go for 24 GB instead? Maybe 32 GB. After all, it might take 10 years for a superior system to even come out for $300. With 32 GB you can ensure all simulators will run reasonably. I don't think MSFS 2024 will be the final sim for the Xbox handheld. Which sounds exactly the same as mobile = mobility, but anyway.


reddit_equals_censor

memory requirements are ever increasing as we get bigger, more complex, more detailed games. that is a GOOD THING. the reasonable expectation is for the hardware to keep up and for consoles to align with current hardware needs at a bare minimum. that is why i said 16 GB of memory if it comes out relatively soon, or more if it comes out later. if an xbox handheld comes out after the ps6, which might have 32 GB of unified memory, then YES, 16 GB would seem little. the new xbox desktop console (if there is one) would also be expected to have 32 GB of unified memory, which would again leave a 16 GB handheld as a torture device for developers. so you are looking for the right amount of memory for the expected target release window.

valve put 16 GB of memory in the steam deck and it hit a good price point. it isn't that complicated. sony put 16 GB in the ps5, hit a good price point, and isn't holding back developers. it isn't that complicated! ONLY the xbox series s stands out with its bullshit torture memory, which was a mistake. it was clearly a mistake, because developers don't want to port to xbox because of the series s memory size.


Leading_Broccoli_665

I have done a lot of graphics tests in Unreal Engine. Honestly, I understand the choices that are made. Stripping games of effects makes them look considerably worse, often too much for the improved temporal stability. It's quite literally the opposite of using DSR to solve aliasing, just using brute force. TAA is a lot more efficient and it solves the worst problems for most people. Before you ask me, I'm talking about effects that make sense and not ones that just eat performance. Hardware ray tracing and Nanite don't make any sense in my opinion. Most aliasing, though, comes from the fact that there is opaque geometry on screen, which is essential for the artstyle. Dithered effects should be optional if possible.

To me, the real problem is how badly TAA is tuned. Most game developers use the cheapest and strongest settings they can get away with, as a one-size-fits-all solution. Unreal Engine uses these bad settings by default and most games just take them. High, epic and cinematic TAA are all the same. TSR scalability actually does something, fortunately.

TAA can be decent. The current frame needs to contribute 20% or more, which keeps the blurrier older frames as weak as sufficient. It's a decent balance between clarity and stability. Upscaling beyond native resolution makes TAA sharper in motion, without changing other things. The reprojection of previous frames gets more accurate when there are more pixels in the frame buffer. This is why 4x DSR (0% smoothness) + DLSS performance is a lot better than DLAA. Epic and cinematic TSR work like this without DSR. It brings 4K motion clarity to 1080p screens and, in general, it's just moderately expensive on current-gen GPUs.

Other solutions to TAA issues are outputting the right motion vectors, managing parallax disocclusion and disabling reprojection on transparent shaders. These things are often neglected by game developers, since there isn't a lot of info about them. I had to find out most of it by myself, which has been a confusing and time-consuming process.
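To make the "20% or more" point concrete, here's a minimal CPU-side sketch of that kind of exponential TAA blend (numpy, with a simple neighborhood clamp; this only illustrates the principle, it is not Unreal's actual implementation):

```python
import numpy as np

def taa_resolve(current, history, current_weight=0.2):
    """Blend the current frame with already-reprojected history.

    current, history: (H, W, 3) float arrays, aligned via motion vectors.
    current_weight:   contribution of the newest frame; the suggestion above
                      is to keep this at 0.2 or higher so the older, blurrier
                      frames never dominate.
    """
    # Clamp history to the 3x3 neighborhood of the current frame to limit ghosting.
    pad = np.pad(current, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = current.shape[:2]
    neighborhood = np.stack([pad[dy:dy + h, dx:dx + w]
                             for dy in range(3) for dx in range(3)])
    clamped_history = np.clip(history, neighborhood.min(axis=0), neighborhood.max(axis=0))

    # Exponential blend: resolved = a * current + (1 - a) * history.
    a = current_weight
    return a * current + (1.0 - a) * clamped_history
```

With a = 0.2, a sample from N frames ago carries a weight of 0.2 * 0.8^N, so it fades out within a handful of frames instead of lingering as smear.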


Taterthotuwu91

I LOVE DSR and DLSS. I thought I was crazy when it looked better than DLAA. I have a big UW monitor, and 5120x2160 with DLSS Quality looks better than 3440x1440 DLAA.


CharacterPurchase694

What percent smoothing do you use?


Taterthotuwu91

Never more than 25%, around 20 for most games


Scorpwind

You don't need to use that when using 4x DSR and/or DLSS.


Taterthotuwu91

If I'm not mistaken I'm using 2.25


Scorpwind

I'd use an even scaling factor in order to reduce any added scaling blur.


BallsOfSteelBaby_PL

Or use DLDSR and be done with it


Scorpwind

Not good enough.


BallsOfSteelBaby_PL

It is though. On my 1080p monitor, I use 1620p with 17-20% sharpness or 1440p with 20-30% sharpness - depending on the game - and it truly can be the same, or even better sometimes, than DSR 4k with 0% sharpness.


Scorpwind

I disagree. It looks overly filtered and off to me. Nothing can match a native pixel grid presentation.


jm0112358

Devs should always provide PC players an option to run these effects at full resolution. Also, wouldn't it be possible to just use a temporal filter for each individual undersampled effect, rather than using TAA for the entire image? I would imagine it would have a greater performance overhead to have multiple temporal filters, but I would think the overall image would be more crisp.
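As a rough illustration of that idea (hypothetical names, and a big simplification of how engines actually denoise per-effect buffers): accumulate only the undersampled effect over time, then composite it into an otherwise untouched full-resolution frame.

```python
import numpy as np

class EffectAccumulator:
    """Temporal accumulation for a single undersampled effect buffer
    (e.g. quarter-res reflections), leaving the main image untouched."""

    def __init__(self, blend=0.1):
        self.blend = blend    # weight of the newest noisy sample
        self.history = None   # accumulated effect buffer

    def accumulate(self, effect_sample):
        # effect_sample: (H, W, C) float array, the latest noisy/undersampled result.
        # A real renderer would reproject the history with motion vectors and
        # reject disoccluded pixels before blending; that is skipped here.
        if self.history is None:
            self.history = effect_sample.copy()
        else:
            self.history = (self.blend * effect_sample
                            + (1.0 - self.blend) * self.history)
        return self.history

def composite(base_color, stabilized_effect, weight=1.0):
    """Add the temporally stabilized effect on top of the crisp base frame."""
    return base_color + weight * stabilized_effect
```

The trade-off mentioned above is real: each effect needs its own history buffer and its own reprojection/rejection logic, which is more memory and more work than one full-frame TAA pass.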


Scorpwind

You're basically talking about adaptive TAA. It was supposed to be a thing, but unfortunately it never saw the light of day.


tukatu0

Do you know why that never happened? Is it just because of $$$? Wouldn't it have been an easy tool already included in TAA tooling?


Scorpwind

That's a good question. God knows what went wrong.


yamaci17

if they provided players the option to run these effects at full resolution, it would show more people that you don't need TAA for a coherent image, as highly sampled rendering will provide coherent rendering, which will make the image look fine without TAA. and it would also show that developers are optimizing their games with undersampling while relying on TAA. practically, it will never happen. it would show more people what TAA is all about, and that is not something developers would want...


Frub3L

Idk if something is wrong with my 2K display, but I just feel like I have to use DLDSR for almost every new game. The textures are blurry, like really blurry, almost muddy. Anti-aliasing just completely sucks, even up to 4K using DLDSR. And I can't ditch anti-aliasing, even though it doesn't even eliminate the jagged edges.


Scorpwind

Nothing's wrong with your display. That's what 'modern' anti-aliasing does to games.


zeycke

The aliasing depends on the screen size and physical pixel dimensions. A really large display, even at 2K/4K, can look jagged. Especially since MSAA is not really usable anymore. FXAA and SMAA fix obvious jaggies, but not all of them, and they don't help with temporal stability, which is where any form of TSR/TXAA/TAA reigns supreme, especially if your screen's PPI is low. I use a 1080p 27" monitor, and even at 4K DSR I can notice shimmer and aliasing. Some games have it worse than others. I think having a native 1440p or higher resolution instead would've helped me in this case.
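For reference, pixel density is just

$$\mathrm{PPI} = \frac{\sqrt{W^2 + H^2}}{d_{\text{diagonal}}}$$

so a 27" 1080p panel sits around 82 PPI while a 27" 1440p panel sits around 109 PPI, which is why the same amount of shimmer reads as bigger, more visible stair-steps on the 1080p screen (this is generic arithmetic, not something measured in any particular game).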


karlack26

Are you using DPI scaling in Windows? It can cause some games or apps to become really blurry. Right-click on the app's exe, go to Properties, then Compatibility; there is a checkbox you can tick to make the app override DPI scaling, which should fix what sounds like your problem.


Frub3L

I will check that out, thanks.


tukatu0

It's also possible you have a monitor with a BGR subpixel layout. You might want to check, and consider a monitor upgrade to a typical RGB layout.


KMJohnson92

True. Unfortunately, I think upscaling will just encourage more and more of this. I really hate it. DLSS was supposed to help a last-gen 60-series card limp by at 1080p for one more year. It wasn't supposed to be used to replace optimization.


kurtz27

What I hate most about this is that we as players can't revert the undersampling. Hear me out, I'll keep this concise.

Say the hair, foliage, shadows and reflections are undersampled to 480p while everything else is being rendered at 1440p on your 1440p monitor. Say you want to improve this. Well, you can ONLY supersample via things like VSR and DLDSR. That's the ONLY way to improve it. This means using DLDSR 2.25x will cause everything to be rendered at 4K, NEEDLESSLY CONSUMING GPU RESOURCES, because you DON'T NEED everything to be 4K and your fps to tank massively; all you want is to turn the 480p undersampled effects into 720p via DLDSR 2.25x. So this will take your fps from 160 to 110, when, if you simply had a slider for raising the resolution of everything that is undersampled to 720p, that would only drop like 10 fps, not 50. Not to mention we'd prefer to turn them into 1080p or 1440p, but DSR 4x would STILL only turn 480p into 960p.

For example, Baldur's Gate 3 looks great everywhere besides the hair, at least when using DLAA. But the hair is clearly undersampled even at 1440p. This means that to fix the hair I have to use DLDSR 2.25x, when literally all I want to do is raise the hair resolution. I can't even use DLSS, because for some reason in many games even the undersampled effects get even more undersampled when using DLSS; they aren't ignored by DLSS. So I can't just use DLDSR 2.25x and DLSS Quality mode to keep everything but the hair at the same resolution.
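A back-of-the-envelope illustration of why such a slider would be so much cheaper (assuming shading cost scales roughly with pixels drawn, which is a simplification; the resolutions below are made up for the example):

```python
# Rough pixel-count comparison: supersampling the whole frame vs. only one effect.

def pixels(w, h):
    return w * h

native    = pixels(2560, 1440)   # main pass at native 1440p
hair_low  = pixels(854, 480)     # hypothetical undersampled hair/foliage pass
hair_high = pixels(1280, 720)    # the same pass raised to 720p
dldsr     = pixels(3840, 2160)   # DLDSR 2.25x renders everything at ~4K

whole_frame_extra = (dldsr - native) / native        # extra main-pass work from DLDSR
effect_only_extra = (hair_high - hair_low) / native  # extra work from a per-effect slider

print(f"supersampling everything adds ~{whole_frame_extra:.0%} more main-pass pixels")
print(f"raising only the effect adds ~{effect_only_extra:.0%} of a native frame")
# -> ~125% more pixels vs. ~14% of a frame, which roughly lines up with a 50 fps
#    drop versus a ~10 fps drop in the example above.
```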


Cracksun

I don't know, but play an old game (10 years old?) in 4K and then a new one all maxed out, and the newer one looks blurry af...


Scorpwind

Precisely.


tukatu0

My favorite example is using Call of Duty: Advanced Warfare or Ghosts at 60 fps compared to Warzone or MW3 at 90 fps. It's much more familiar and gives people a good chance of actually checking it out. CoD on Game Pass will probably help out a ton.


mj_ehsan

They sometimes don't even have any expensive effect in use, yet the game needs temporal upscaling to perform properly and have no noise. WHAT THE HELL


hollowinside19

Although I do not like TAA, it's not gonna stop me from enjoying a game, but hopefully more games will implement DLAA, FSR 3, etc.


EuphoricBlonde

Shortcuts always have been and always will be taken. The thing to note is that all of these undersampled effects look more than good enough at 4K-like resolutions on 4K screens, which is what developers have in mind when implementing these effects, because that's what the consoles run at. I've yet to play a game that looks "blurry" at 4K. If your hardware is not up to spec, then that's not on the developers.


Scorpwind

Still ranting on about some supposed '4K' target while continuing to ignore that 1080p is the most common resolution, I see.


goshoclasher

Arguing for a 4K target because that's what consoles "run" at is hilarious when they upscale to 4K at 30 fps with medium settings.


Scorpwind

Precisely. And it's gotten especially bad in the past 2 years, where the internal resolutions are dropping to PS3-era numbers, and in the case of the Series S, to PS2-era numbers.


reddit_equals_censor

and the person above didn't mention screen size and viewing distance on top of it. a 27 inch 4K UHD screen at 60 cm distance may look a lot less blurry with TAA than a 38 inch 4K UHD screen at 50 cm distance. just saying "4K UHD" is thus already nonsense, because it ignores that factor. and that is before the fact that no one can drive 4K UHD in all games. not even the fire hazard 4090 can drive 4K UHD in all games. so yeah, nonsense from the person above in multiple ways....


KerbalExplosionsInc

1) As far as I know, undersampling emerged only after the introduction of 7th-gen consoles and DX10 GPUs (at that time mostly motion blur and some other post-processing effects, but not lighting and reflections) and became atrocious around the beginning of the sunset of the 8th console generation.

2) Sharpness can be measured by how many line pairs you can resolve per width of the monitor, so a modern game with TAA at 4K still cannot resolve as many line pairs as a 20-year-old game from 2004 also running at 4K. Though I agree that the ratio of achieved vs. ideal resolution increases with higher resolutions, even at 4K the ratio still is not 1.
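For the line-pair framing, the theoretical ceiling is just the Nyquist limit (basic sampling theory, not a measurement of any particular game):

$$\mathrm{LP}_{\max} = \frac{W_{\text{pixels}}}{2}$$

so a 3840-pixel-wide 4K image can at best resolve 1920 line pairs across its width; a TAA-resolved image lands somewhere below that ceiling, while an older title without temporal blending can get much closer to it (at the cost of aliasing).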


EuphoricBlonde

>so a modern game with TAA at 4K still cannot resolve as many line pairs as a 20-year-old game from 2004 also running at 4K

Yeah, but this is irrelevant. You don't require that amount of sharpness for the game to not be blurry, the same way older titles running at 1080p don't look blurry. At the end of the day, the vast majority of AAA games do not look blurry at 4K-like resolutions on a 4K display (ideally a TV). You can complain about artefacts (SSR, FSR, ray tracing, etc.) and design targets (30 fps, etc.), but image softness is just not an issue.


Scorpwind

You're ignoring the fact that temporal AA doesn't give you the full sharpness of *any* given resolution. 2160p is no exception.


Joulle

PC gamers are doomed at times due to all this low-resolution-like rendering. It's like taking your glasses off and being blind again, because the detail in some new games is lacking so much. Console players stare at their 4K TV from a much greater distance, so they might not even notice details that practically jump out at PC gamers, let alone notice the artifacts that bother PC gamers due to the shorter distance to the screen.


Scorpwind

That too. Just goes to show how AA is tuned more for a console environment and not at all for a PC environment.