is this another flavor of the month gimmick like SLI, 3D Vision, and PhysX or is this fake shit here to stay?
Is this another flavor of the month gimmick like SLI, 3D Vision, and PhysX or is this fake shit here to stay?
It looks like shit but it's here to stay because ticking the framegen option is much cheaper than hiring non-pajeet dev teams to actually optimise your games.
It’s here to stay and practically required for newer games.
You are poor
Oiks can't tell the difference and it's cheaper than making your game correctly
Yes, it's here to stay
Its all fake and gay 15 years later and games still don't look better than mirror's edge.
Considering it's required to pretend to run titles like the new DOOM game. It's here to stay. That means that god awful artifacting is going to be this generation's equivalent of bloom, chromatic aberration and motion blur.
You are aware that devs aren’t going to keep doing rasterized lighting forever, aren’t you? Gay tracing WILL become a requirement, your future card WILL be a massive space heater, and you WILL need fake frames to play games.
your future card WILL be a massive space heater, and you WILL need fake frames to play games
Perhaps I simply won't purchase new games.
you can bake in 80% of the lighting with 0 quality loss
devs would rather 100% rely on gaytracing
it's not even incompetence at this point it's just pure laziness
The easiest person to fool is one’s own self.
It’s over. Pre baked lighting is absolutely going to die and the consumer doesn’t have a say in this. Eat up.
ayymad victim cant handle Fucking Shit Resolution anymore
Gay tracing WILL become a requirement
it certainly will not. What is AMD's answer? oh right nothing.
It's going to be around for as long as people keep buying badly made games.
It was bound to happen eventually once consoles started just using PC parts. Otherwise, AMD and Intel wouldn't have made their own versions. But it's technically AMD's fault. It's their tech being used, and I mean the actual hardware.
Misinformation for free
Rent-free.
I mean with DLSS 4 they’re already putting in two fake frames for every rendered frame. They’re going to go for 3 for 1 next. At some point it’s just going to make games look gorgeously lit at 4k while being pretty much unplayable, isn’t it?
FSR4 makes DLSS redundant and DLSS is no longer a selling point
it's another flavor of the month gimmick like Raytracing, yes, but it'll stay anyway like Physx even though it won't become industry standard unlike the dogshit that is TAA
gorgeously
Nah. Upscaling and fake frames make the game look like it's been deep-fried with video compression.
flavor of the month
Has been the standard for like 6 years now. This shit is staying, if only because Nintendo and co will be using it well into the late 2030s
name 3 games with fsr4
exclusive to AMD Radeon RX 9000 Series graphics cards
how about no
can't use it in most games
can't use it on linux
fake frames
lol
Almost every game with DLSS2+, XeSS or FSR2/3 support.
There's going to be FSR4 lite for unsupported hardware.
fotm
it's been here for 4 years
At some point they'll have to pull some other bullshit out of their ass. Considering they're working on AI generated physics engines, they'll probably develop a game engine and devs will use it to "offer better performance to a wider audience". The caveat being that all games will now require online access to function and some bullshit about locally cached AI simulations for each level.
DLSS 4 is okay but the problem is it looks like garbage unless you're using a 4k display.
It fucking sucks how little either brand does to help 1440p users, because it looks the worst at that resolution, which boggles my mind
Surely we can have GPUs be actually able to handle raytracing by 2040?
there's literally nothing wrong with DLSS, anon :))
(taken at DLSS 100% rendering resolution / DLAA, by the way. this is not "DLSS performance")
1440p dlss transformer looks fine
DLSS using the transformer model is crazy good. like multiple times better than CNN model.
I love what it SHOULD be - which is free performance boosts. I hate what it is - a crutch dogshit shitwater hotdogwater fucking shit paki devs use to try and buy 5 frame for their shitty unreal 5 slop.
So it's just DLAA, not DLSS.
Transformer offers 10 fps less than cnn
This. 4k is a good spot for upscaling from 1080p, since 1080p native is here to stay and 1440p native is pretty much a niche. You can upscale 1080p to 1440p with simple programs but 1080p to 4k is worth it.
Transformer offers 10 fps less than cnn
that's not consistent. i've tested in a few games, particularly darktide. it's MAYBE a 3-5 fps difference, negligible. but even if it was 10 frames it's probably worth it because it looks twice as good.
Heard a podcast not too long ago hosted by devs about the state of the industry and how studios are flipping over couch cushions to find savings. Hand-composing the lighting will be one of the casualties: the quality of rasterized lighting will inevitably nosedive while ray tracing becomes "100% necessary" to realize the artist's intent. Future seems dark, lads.
i hope FSRlite won't be like DLSS4 where it rapes my performance instead of making it better
savings or not, these studios gave up on hand-crafting anything long ago, at least the big shitty ones.
there are still a handful of devs out there who care about their art.
I use ray tracing in some games but never DLSShit. In the future, cards will actually be strong enough to run full path tracing at native resolution at acceptable framerates.
Yes, it is. Something better will come along but the foundational idea will linger.
No? True gaytracing is the non-stop work of thousands of connected GPUs to render one single frame over days.
1 frame per fifty hours, basically.
lol
lmao, even
It needs to stay until a new tech about optimizing shit appears.
my nigger this game renders DLAA at a lower resolution thanks to french incompetence
DLSS Q will look better unironically
All this doomposting because you fags don’t want to retire your crusty old 1080s REALLY?
No, this AI slop is locked in now. We will never have good GPUs again in this time line, in fact they will actually get worse. The end goal is to render at very low resolution, like 320x240, and upscale.
You've just made me realize that this debate is the same as pixel art vs hand drawn.
Pixel art would never have been invented if not for hardware limitations and so it fucking SUCKS. People that still use it are retards bound by nostalgia. Horrible, horrible "art" style.
With that said, current gaytracing is terrible.
Everyone bitching about DLSS, post your specs.
You faggots should not talk about gaytracing without, at least, reading this short article.
Positively hellish.
Seeing that it's existed since 2019 and its adoption is almost ubiquitous,
yes, I think it's here to stay.
SLI
fotm
hello zoomer
Transformer model is genuinely really good. Its release and compatibility with even Turing forced me to kneel to the dlss schizo
Even at 540p it'll ghost or boil any dynamic lighting but it's never blurry or anything
What do lighting and ray tracing have to do with DLSS?
it's nvidia shills and bots derailing the original topic, retard. they don't want you to see that games running like shit is entirely their fault.
Both of those are usually going to be effected by internal resolution.
Which part of that do you think is funny?
People have been saying "it's a temporary gimmick that'll get cancelled" since 2019, it's only gotten bigger and better over time. DLSS is here to stay, until Nvidia come up with something better.
It looks like shit
It looks amazing at 4k and acceptable at 1440p. It only looks like shit if you use it at resolutions from two decades ago because there's just not enough pixels to work with.
DLSS MFG feels great though. The only issue is the odd transparency artifacting that, in theory, will get ironed out over time.
I wish, devs now rely on this shit
DLSS is no longer a selling point
Holy cope
Wait REALLY? I WAS WONDERING WHY THE FUCK DLAA LOOKED LIKE SHIT. LOL. I LITERALLY UNINSTALLED BECAUSE THE GAME WAS A BLURRY PIECE OF DETRITUS WHEN MAXED OUT.
Still worse than dlss and is only available on cards that are expensive enough you might as well just spend a little more and boughted Nvidia
*Affected.
It's been a thing since the 20xx series, so it's not a gimmick at this point
it's AI so it's trash.
FSR4 makes DLSS redundant
FSR4 is good but it's only available for two recently released AMD cards, and its game support is very lacking. DLSS4 super resolution is available on everything from the RTX2000-RTX5090 range and exists in basically every graphically intensive game from the last half decade. And at the end of the day, DLSS4 is still slightly better. So FSR4 is not quite there yet.
i will affect you by blowing out your prostate
it's AI so it's trash.
Funny you should say that, because the rendering methods of current games are fully man-made yet they're worse than DLSS.
download DLSSTweaks and set GlobalHudOverride to Enabled (All DLLs)
This will show a tiny HUD in the bottom left of the screen that tells you what resolution it's upscaling from and what DLL version it's using
if you drop it into the game folder you can also tweak all the DLSS resolution scaling for that specific game
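If you'd rather not dig through the file by hand, here's a rough sketch of a checker for that setting. It assumes DLSSTweaks keeps its config in a dlsstweaks.ini next to the game exe and that the key is literally spelled GlobalHudOverride as in the post above; both are assumptions taken from this thread, so verify against the tool's own readme.

```python
# Hypothetical checker for the DLSSTweaks setting mentioned above.
# The ini filename, its location next to the game exe, and the exact key
# spelling are assumptions from this thread, not verified docs.
from pathlib import Path

def show_hud_setting(game_dir, ini_name="dlsstweaks.ini", key="GlobalHudOverride"):
    ini = Path(game_dir) / ini_name
    if not ini.is_file():
        print(f"no {ini_name} found in {game_dir}")
        return
    for line in ini.read_text(errors="ignore").splitlines():
        if line.strip().lower().startswith(key.lower()):
            print(f"{ini}: {line.strip()}")
            return
    print(f"{key} not present in {ini}")

show_hud_setting(r"C:\Games\SomeGame")   # example path, replace with your own
```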
"full path tracing"
There's no point in achieving real time path tracing for games. It's akin to running complex physics simulations.
It's virtually impossible to do it, even with futuristic quantum computers.
all this hate
I dont understand the adverse reaction. if you can't meet your v/g/free sync cap then this would immediately solve the issue. it's a win i'd say
it's only gotten bigger and better over time.
It's fucking killed modern gaming. every single modern release is such a badly made piece of shit that it can't even run without AI scaling and frame gen.
ye, i think it's 3k instead of 3840
they still didnt fix it btw
No it hasn't. Most modern releases run fine on intended settings as long as your hardware is not older than 24 months, which is how it has been in PC gaming for the last two decades.
It's nothing like any of those things, it is in every single game now and has been for 6 years now. It may evolve away from its current form but it's not going to disappear. And Physx isn't gone either, it has been implemented in 14 games this year already.
intended settings
intended settings is forced AI shit, forced TAA & heavy motion blur to hide the artifacts.
these games look like fucking vomit compared to games from 10 years ago and run 100x worse.
>you can bake in 80% of the lighting with 0 quality loss
Yeah as long as the entire game consists of static objects where nothing moves, not even the player character. That's the only feasible scenario where painted on lighting might be identical to ray tracing.
intended settings is forced AI shit, forced TAA & heavy motion blur to hide the artifacts.
And intended settings used to be 4x MSAA which murdered your framerate more than gaytracing does now, yet you're ok with it because that's old good and this is new bad.
Kino
Transformer offers 10 fps less than cnn
But the quality increase is worth more than 10fps, such that you can drop down a resolution level and enjoy better performance for the same quality. Transformer quality mode is almost indistinguishable from CNN DLAA.
used to be 4x MSAA which murdered your framerate more than gaytracing
LOL fucking retard. I can run 8x SSAA on older games and it still will run at a flawless 60 FPS without the need for AI dogshit.
It doesn't run anywhere nearly as bad as this modern dogshit.
In the future, cards will actually be strong enough to run full path tracing at native resolution at acceptable framerates
save for some breakthrough in physics, no, that won't happen. GPUs are going to repeat what happened to CPUs over the past decade. A 5090 running at 1080p native just barely breaks 60fps in Cyberpunk's "PT mode", which btw only traces certain things, uses a lot of tricks that degrade the quality of the trace, and relies on a temporal denoiser to actually resolve the image (which is arguably worse than what DLSS will do to your frame), and that's with ~1 ray per pixel and a low limit on bounces.
To actually path trace truly natively you need ~100 rays per pixel and a higher bounce depth (which can be selective based on material). That's several orders of magnitude more performance, and you'd still need a denoiser (though one that can cope with a single frame). If scaling kept up, maybe this would be achievable in ~40 years (still at 1080p, btw), but it won't.
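Rough numbers for the gap he's describing, counting primary rays only at 1080p/60 (bounce depth, shading and denoising all multiply this further); a toy sketch, nothing more:

```python
# Back-of-envelope ray counts: ~1 ray/pixel (roughly what current "PT modes"
# shoot) versus ~100 rays/pixel for a clean trace, both at 1080p/60.
def rays_per_second(width, height, rays_per_pixel, fps):
    return width * height * rays_per_pixel * fps

current = rays_per_second(1920, 1080, 1, 60)     # ~0.12 billion rays/s
clean   = rays_per_second(1920, 1080, 100, 60)   # ~12.4 billion rays/s

print(f"~1 rpp  : {current / 1e9:.2f} Grays/s")
print(f"~100 rpp: {clean / 1e9:.2f} Grays/s")
print(f"gap     : {clean / current:.0f}x, before extra bounces per ray")
```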
60FPS is good
Imagine being that poor.
DLSS was literally created to make ray tracing possible dumb fuck.
it looks like shit
It looks better than native while doubling framerates
Also because 1080p -> 4k res performance hit is fucking enormous
I can run 8x SSAA on older games
Because they're older games, dumb faggot. Go wait 15 years and play a 2025 game with 2040 hardware and it'll run flawlessly too.
And with the rtx 22000 you'll be able to play any game from 2025 totally maxed out at 16k 6 gorrillion fps, what are you even saying?
And good luck running a lot of older games with stability comparable to even the sloppiest UE slop. It wasn't that long ago that games were totally reliant on single-core performance, which has basically only doubled in 15 years or however long, an increase you likely won't see in another 15 years.
Its been around for like 8 years at this point and has industry wide support. Flavor of the month, he says
DLSS is the only tech i would actually say is worth every premium penny for paying for nvidia. it's made my 5 year old card age beautifully and also it fixes the TAA blur in games since every game comes with TAA now. it's genuinely brilliant tech, especially DLSS4 but even DLSS3 quality is awesome. i have it enabled in every game i play because it's free performance + free image quality boost.
I could do it at the time too retard.
no fake frames needed.
You're poor, so you were running at substandard resolutions and detail levels then.
Games haven't done fucking anything with dynamic lighting in the past 15 years. Might as well pre-bake raytracing like Mirror's Edge at that point.
what's substandard faggot?
1080p is the standard. let me guess, you bought a 4k meme screen and wondered why you couldn't run an 8x SSAA fragment shader over it (4096x2160 x8). it's because the shader is not going to process such an absurd number of sub-pixel samples well.
Games look better than ever, DLSS brings massive upsides and the few downsides are constantly being ironed out, input lag on frame generation is a myth, and you are poor and mad that new games need new hardware, which has always been the situation since the birth of video gaming.
yes I agree sar. DLSS very good future of gaming.
help cover up very badly made game, bad effect, bad framerate. means dev dont need to make good graphic or good running game.
very good for us yes. save time.
Pixel art would never have been invented if not for hardware limitations and so it fucking SUCKS
Wait till you find out that almost everything was invented because of limitations. What does this even mean
I'm not a fan of these temporal solutions to AA but DLSS manages to look better at lower resolutions than TAA does at native ones so at least its better than TAA.
However it is never the case where DLSS is better at literally every single thing. TAA usually handles something small better where DLSS does the general image quality best.
AMD cards perform better than NVIDIA in games where ray tracing is forced
It might be nice that the light moves and casts shadows but it looks like ass. A lamp as big as that with a bright cone would illuminate the whole room.
Literally just idtech
It's pretty incredible how bad TAA is, that it manages to look worse with significantly more pixels to work from. Astounding actually.
Might mostly be a problem with resolutions below 4k but still, you'd figure this many years into using TAA that they'd get better at it. But DLSS4 manages to look better upscaling from 960p to 1440p than TAA does at native.
lights can be dim you know
I don't need to supersample because my native resolution is 4x yours. "The standard." The standard is as high as you can get it. I don't even watch 1080p porn anymore because it feels like poverty.
I watch shitty 480p and lower porn still. Because the specific roleplay scenario I'm into is what gets me off, not image quality
I don't need to supersample because my native resolution is 4x yours
Good, because you can't supersample. You can't even run basic AA because you bought a meme screen with 8847360 pixels.
even if you scale that to just 2x, the number of times the fragment shader needs to run gets out of control.
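For reference, the raw sample counts being argued about, assuming one fragment shader invocation per sub-pixel sample (a simplification, but it shows the scaling):

```python
# 8,847,360 px is 4096x2160 (DCI 4K); watch how fast SSAA multiplies the work.
width, height = 4096, 2160
pixels = width * height

for ssaa in (1, 2, 4, 8):
    samples = pixels * ssaa
    print(f"{ssaa}x SSAA: {samples:,} samples/frame, "
          f"~{samples * 60 / 1e9:.2f} billion/s at 60 fps")
```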
actually my weakness is strength
how do 1080p fags do it
I can run basic AA at no performance cost.
Not only has DLSS been out for years, physx became a permanent fixture in gaming by forcing game engines to implement the same tech natively.
You are too stupid to share this hobby with me. Please kill yourself.
D:SS looks bad
I game at 1080p , what's the issue?
you forgot the part where your "4x resolution" is on a (likely) 32" screen destroying most of those resolution gains
Are you saying you can't tell the difference between a 1080p and 720p phone screen?
Most of this board subscribes to the Wimp Lo school of argument.
SLI was also huge, as was Crossfire for ATi.
55" monitor, and I sometimes stream to my 83" TV.
coping against ML upscaling is pointless
how often do they update dlss? It's been a few months since presets k and j dropped
I need that 2ms for Tomb Raider because I'm a pro gamer.
its actually good
less gpu power and better anti aliasing
Games like Shadow of the Tomb Raider and Last of Us 2 arguably look as good as current gen releases and can run at 60fps/native 1440p on modest hardware.
The fact that we're using upscaling even on top end hardware like the 5090 indicates to me that something went very wrong this gen.
It depends on how autistic they get tweaking it or fixing bugs. If you look at DLSS Swapper, they have a fuckton of different versions for 2 and quite a few for 3.
you can't even run basic AA
sure I can, why couldn't i? the more pixels you have, the better temporal rendering is
60fps/1440p
That's not a good target for someone with a 4090 or 5090. You want higher framerates and resolutions, especially with all the bells and whistles like RT.
This shit has ruined GPU reviews. It's impossible to find out the real performance of a GPU now as they're tested with DLGBT. No, I don't care about the GPUs performance at 640x480.
That's why I said modest hardware. (i.e. 3060)
A 4090 will run TLOU2 at 4k native.
Yeah, but I mean your comment about having to upscale on the top end cards. We only have to upscale when we crank up the RT. Even Cyberpunk 2077 runs quite well native with baked lighting. You don't NEED to upscale to run the games on modern cards, but you do if you want RT.
it's impossible
only if you're a retard. Reviews have DLSS on/off
vs
deferred rendering with MSAA
native res, no upscalers or framegen needed
2020 tech
I hate you Nvidia faggots for leading us off a cliff
With more and more games moving towards forced raytracing for their lighting solution (i.e. Indiana Jones, Doom, Stalker 2), you really don't have a choice anymore.
Hell, even Alan Wake 2 didn't run at 60fps/native 4k with RT turned off on a 4090 and that game used baked lighting.
DLSS was never going to give us free performance, it only gives developers free money in the form of saved optimization effort. If DLSS can double your frame rate, devs will spend half the time optimizing their game. DLSS, frame gen, and gay tracing are meme features that only benefit greedy developers.
The fuck? Is this a thread from 2020?
Reminder that most of Anon Babble is on 2008 smartphones.
I loik gamingz at 30fps
UE5 kinda sucks but TLOU2 looks last gen
that's my bad I meant to say forward rendering.
Fuck raytracing, you need good art and talented engineers
Baked lighting has been used for decades for a reason.
This push towards everything being realtime has been stupid.
There's no level of "good art" that can mimic what RT can do when implemented properly.
Eh, I play at 1440 and acceptable is being too generous. A lot of games still look like shit with it on
What GPU and monitor do you have?
Unlike your other examples, this garbage exists because the hardware has hit a brick wall and they needed to obfuscate that by rebranding a console-tier bandaid as a premium feature. But really I should say the terrible software of today has forced the hardware's hand.
I think it has some use for specific types of game design but full on dynamic everything as the default is fucking pointless.
A lot of games still look like shit with it on
relative to what? dlss4 at 1440p certainly doesn't look like shit if you compare it to TAA at 1440p.
Okay Ranjeet. When the company you work for inevitably goes broke and they deport your ass back to India, we are gonna have one big party.
quick , pull out the baked lighting game which has GI added to the textures
Everything being realtime is cool if games use it to make more dynamic gameplay and environments.
Sadly a lot don't and only really benefit from improved iteration times from not baking which isn't exciting.
7900XTX and some LG UltraGear 1440p/165Hz monitor
I guarantee you that DLSS 4k looks better than 1440p native
shows some shadows
Now show a moving light source with shadow opacity/size that's based on the distance from the light source.
Show a transparent surface with reflections that are distorted based on the shape of the surface.
Show a broken mirror that you can see your reflection from all the angles of the break.
Not that you actually care but the Star Wars Battlefront games actually have realtime GI and reflections. They use a third party lighting solution called Enlighten.
your image looks delayed, but what happens is that illuminated embers moved to the sides of the doorway, out of view, and still lit the room.
You can be sure they are going to make it obsolete in a few years.
I care. I didn't know. Thanks anon
Do you understand how the point of graphics is to make it look good using current tech and tricks and not just flick a RT toggle and hope for the best? Every UE5 game looks the same and runs like shit and needs upscalers and framegen to work properly. Why shouldn't we go back to the previous generation?
4k 32" = 137.68 PPI
1080p 27" = 81.59 PPI
1080p 24" = 91.79 PPI
even in your best case ppi meme scenario you are btfo
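For anyone checking the figures, this is just PPI = sqrt(w² + h²) / diagonal, assuming square pixels and the marketing diagonal:

```python
# Reproduces the PPI numbers quoted above.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

for label, w, h, d in [('4k 32"', 3840, 2160, 32),
                       ('1080p 27"', 1920, 1080, 27),
                       ('1080p 24"', 1920, 1080, 24)]:
    print(f'{label}: {ppi(w, h, d):.2f} PPI')
```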
why aren't those blades of grass properly illuminated by the lumen trash? did the devs forget to tick a checkbox?
Tell me about the stutters. Do you have some DF screencaps to illustrate this?
I know I've never seen a game which I preferred with RT off and the people who cry about RT can't run i
I have a folder full of these. Keep egging me on Patel.
give it a decade, we've peaked for now
That looks like complete vomit
I know the previous shot was harvested from some website. This video probably was as well. You likely post on a phone (and use Readchan or the like, since that's how it names the files).
Fallout 4 unironically looks better than that screenshot.
digital foundry defend force reporting for duty saars
Ravenholm in HL Alyx vs Ravenholm in HL2 RTX mod. Baked vs Raytracing. Notice while the shadow on the right may be more "accurate" the shadow on the left looks better. That's why accuracy doesn't matter.
MAKE
IT
LOOK
GOOD
You are literally paying Nvidia 3k+, rendering at a lower resolution and playing with framegen input latency just for the privilege of going back to blob shadows from the late 1990s LMAO.
I run alyx at 120fps at 6000x3000 VR resolution and it's mind-blowing.
Meanwhile dogshit unreal games in VR struggled to hit 80fps at 5000x2500 and the image quality is so fucking shit when they try to force TAA on you.
Right looks better here.
Pin crisp shadows only look good when you have a very strong light source.
did you miss the part where he said his monitor was 55" and his tv is 83"? or the part where we're talking about super sampling?
Half Life Alyx looks so good.
6000x3000 VR
lol but why ?
Right looks better here.
You're a retard.
We don't use ableist slurs on this subreddit, chud. :3
basing ray traced graphics from some 720p webm
lol
It does. The shadows in the Alyx screenshot look like something from Doom 3. Very unrealistic.
Notice how the shadows on the RTX version get crisper as they get closer to the ground. (i.e. the shadows coming from the grass) Which is how it should be.
you render one image per eye, so that's 2x 3000x3000 pixel images. why this resolution? it's to get the full pixel density of a basic bitch headset like the quest 3, which has something like a 2100x2200 panel resolution per eye. but the image gets distorted when it passes through the lens, so supersampling the image by ~40% kinda fixes it
some new headsets are coming out soon with like 4k PER EYE, so that means 8K rendering minimum and 12K rendering optimal, and while GPUs technically have the horsepower to render at such high resolutions, the engines can't keep up
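A toy version of that per-eye math, using the anon's approximate panel numbers and the ~40% supersampling factor he mentions (both are his figures, not official specs):

```python
def render_target(panel_w, panel_h, supersample=1.4):
    # supersample compensates for resolution lost to lens distortion correction
    return round(panel_w * supersample), round(panel_h * supersample)

eye_w, eye_h = render_target(2100, 2200)   # -> 2940x3080, close to 3000x3000
print(f"per eye : {eye_w}x{eye_h}")
print(f"combined: {eye_w * 2}x{eye_h}  (roughly the '6000x3000' from earlier)")
```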
based
Why didn't 3D Vision take off
Don't you guys like 3D?
Crisper? Nice ESL Ranjeet. There's a very noticeable lag when casting the shadow on top of the blurry picture and ghosting. Look at how fucking blurry the grass is in the previous webm. Vidrel, you can see the ghosting even on a tryhard 5090 with DLSS. This technology isn't ready. But I'll tell you what is ready - Pakistan's drones when Modi orders you to rush their borders after you've been deported back to India. SAAAARSSS!!!
Sir!!
I meant, it's a little low. You should be rendering higher, like 3760x3760.
120fps on a Quest is just pointless. The headset decoder just isn't made for doing 120hz. You're severely limiting the allocated bandwidth per frame and your image quality will be terrible.
We're talking about the quality of the lighting here which of course you ignore entirely and go on a rant about image quality instead..
Do you understand just how much larger, hotter, and more power hungry GPUs would become if everyone were forced to rely solely on raster performance/raw power? Not to mention how much more expensive GPUs would become?
Its clear most of you motherfuckers have no engineering backgrounds or understanding. Software solutions are always going to be cheaper to get more and more performance. "but muh fake frames" stfu retard, games are not "real" so all frames are fake.
Even over a 20Gbps USB link?
but it has ghosting
and it still looks better than without
the incoming video stream resolution caps out at around 6k so the only benefit you'll get for a quest to render higher is improving the already good anti aliasing the game has
as for 120fps the headset can handle it with no issue with h264 with no buffer or with h265 with a buffer
i prefer smoothness and good enough visuals compared to going wired and trying h264 with 800mbps since the colors are a bit too washed out on that codec
the cable doesn't mean anything when the encoder/decoder in the quest link software can only handle like 800-900mbps max before dying and even a shitty USB 3.0 cable from aliexpress will negotiate at 2.5gbps sync speed easily
Vidrel, you can see the ghosting even on a tryhard 5090 with DLSS.
There isn't really DLSS ghosting in the HL2RTX showcase, in this case it's RT ghosting. Yes there is a difference.
It looks about the same as DLSS but performs worse.
DLSS is good with a 4K monitor
And is only available on weak GPUs. So you can do FSR4 on a brand new 9070 that's weaker than a 2.5 year old 4090.
at 90fps your GPU has about 11ms to draw each frame
at 120fps your GPU has about 8ms to draw the same frame.
it's not good for a severely bandwidth limited device like the Quest.
On a monitor raising the frame rate to 120hz will make it smoother. On a bandwidth limited headset like the Quest there's a good chance you're making it more jittery
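Putting the two budgets he's juggling side by side, GPU frame time and encoder bits per streamed frame at the ~900 Mbps link cap quoted earlier (that cap is the thread's number, not an official spec):

```python
LINK_MBPS = 900   # rough Quest Link encoder ceiling mentioned upthread

for fps in (72, 90, 120):
    frame_ms = 1000 / fps             # time the GPU has to draw one frame
    mbit_per_frame = LINK_MBPS / fps  # bits the encoder can spend on that frame
    print(f"{fps:>3} fps: {frame_ms:5.1f} ms/frame, {mbit_per_frame:5.1f} Mbit/frame")
```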
Do you think you're the only person who has owned a Quest?
It is related retard. Both ray and pathtracing produce noise. You need denoisers to clean the picture up. Cleaning it up also removes detail. Which is why everything looks so soft, so plastic and even blurry.
and you WILL need fake frames to play games.
you already do
Except image quality will improve as hardware gets faster whereas your shitty looking Alyx shadows will always look shitty. You're not comparing the method. You're comparing the end result. That's why you're dumb.
post 5090
7 years of dev time with multiple delays. truly a marvel of slav game development
just 2 more generations bro
Raytracing has only been a thing for this console gen. You're just being silly if you don't think it'll be the norm 10 years from now.
GPUs are already larger, hotter, more power hungry and more expensive. Stop projecting. Raster doesn't need to get better because games peaked in the mid 2010s. We've been taking 2 steps forward but also 2 steps back this entire gen. I fucking hate it. Why don't you shitgineers try to improve AI or destruction or population density rather than put all your eggs into the lighting basket? At least Star Citizen is doing that and has an entire solar system to show off. You jeets can barely outdo 10 year old Battlefront.
Every video game released today looks worse than those released a decade ago, but hey, at least they also run so poorly you need this dogshit to hit a reasonable framerate!
Lights can't get as dim as you though
I love it and usually turn on DLSS quality even if I can natively hit my FPS target.
Indiana Jones is the best looking game I've played.
In this case the software created the problems it "solves" in the first place. A paradigm shift is needed, until then hardware is getting more power hungry, hotter, and more expensive no matter what.
Raytracing has been around since Quake Wars. Quake Wars is almost 20 years old but yeah only 2 more generations Ranjeet. You'll be deported before then. Now get ready to rush a minefield.
Your opinion on DLSS is basically just a way of checking what your resolution is.
The reason it gets a bad rep for being ugly is because the vast majority of users are on 720p-1080p, resolutions that are now old enough to drink, drive, and have children of their own. 4k users, meanwhile, swear by DLSS even at 50% input resolution, but they are realistically a tiny percentage of PC gamers.
So what I'm saying is there's two ways DLSS can escape its image quality criticisms: either PC users actually upgrade to modern resolutions and have at least 1600p output, or Nvidia spends enough electricity training their models until even resolutions as low as 1080p are enough for DLSS to work nicely.
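To make the resolution argument concrete, here are the internal render resolutions for the commonly cited per-axis scale factors (treat the exact ratios as approximations; the point is how much more signal a 4K output leaves the upscaler to work with):

```python
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
OUTPUTS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in OUTPUTS.items():
    row = ", ".join(f"{mode} {round(w * s)}x{round(h * s)}" for mode, s in MODES.items())
    print(f"{name}: {row}")
```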
That's not how it looked when I played it. You're up to something.
SAR SAR BEST GAME OF GENERATION SAAR!
Every video game released today looks worse than those released a decade ago
This is the kind of shitty exaggeration that just tells people you don't really play games. Not every game from the 2010s looked like Arkham Knight or RDR2, and not every game from the 2020s looks like Forspoken.
Playing Monster Hunter Wilds maxed out with DLSS Quality in 4K is maybe the ugliest game I've ever played. First time I've used the tech, everything else I've played I can run in native 4K, and it's not exactly been a glowing first impression.
you're mind broken lmfao
SLI, 3D Vision, and PhysX
Out of all of those DLSS is by far the worst.
I don't give a shit if the polycount is higher if the screen is smeared in vaseline or there's AI hallucinations whenever I turn the camera. Was running through FFXIII recently and even a completely fucked port like that is easier on the eyes than most recent AAA titles.
You're playing Monster Hunter Wilds. That's like testing out whether you like music by listening to Justin Beiber.
Not saying DLSS looks good in every game, it does not, but you really picked the worst example to start with. MH Wilds is a technological mess. Try it in a game like Doom Eternal or Darktide or KCD2.
Take a screenshot and show us then
The left also doesn't look blurry af
That's MH's fault. DLSS is pretty much the best type of TAA available even if it's still far from flawless especially if you make it upscale from too low resolutions.
FSR4 is also very good. Both are better than native with TAA/TSR unless you render past 100% resolution using those. Like 150%+
But that won't save games that looks like horse diarrhea like MH Wilds.
In the future, cards will actually be strong enough
Pure delusion, GPU companies are driven only by profits, the cards will actually get weaker and lean even more into AI up-scaling.
Alyx is based and artist-driven composition will always win. I think HL2 RTX has too low an RT sample count to truly look realistic and accurate. Both suck compared to Cyberpunk with max PT.
stfu retard, games are not "real" so all frames are fake.
what a retarded cope
Really? How does it run so fast then? I highly doubt it's realtime, and those MP games have no reason to run that in realtime since there's no realtime time of day or other kinds of lighting changes. I think Enlighten has a baking option.
Enlighten is heavily optimized because it works under the assumption that the majority of the geometry will never change.
So yes it has a baking step but the lights can move, turn on/off and change intensity at runtime.
Enlighten was the lighting engine for Unity 5.0 so you can go download it and play around with it if you want.
Yeah in 20 years
On paper, it would've actually been a great way to keep older rtx cards relevant since the performance increase only comes with minimal quality loss. In reality, however, it's a crutch used to push out bloated, unoptimized dogshit that even modern cards struggle to run.
On paper, it would've actually been a great way to keep older rtx cards relevant since the performance increase only comes with minimal quality loss.
Not really, that's an assumption people made because they'd like if it were true. But DLSS upscaling works on a "richer get richer" logic, it doesn't really work well for low pixel count (what users with budget PCs would have) but it works amazingly well for high pixel counts (which is something people with higher end PCs have)
I don't understand this kind of thinking. Modern games use TAA as a way to deal with aliasing anyway, you claim to run games natively but you must surely mean native with TAA since most of the time you cannot even turn it off.
DLSS is nothing more than a better tuned implementation of TAA. It's NOT actually ai, the ai part is only a very tiny part of the process for correction but the majority of DLSS/DLAA is just bog standard TAA/TSR that's using a jittering technique to extract more details out of the same sampled original pixel.
It's just higher quality TAA, why do you think it's shit when it looks better than the NATIVE WITH TAA garbage you seem to think is superior? Genuinely wtf? I think you just don't understand what you're talking about. And everything I've said applies to FSR4 btw, and it's just as good so don't go and call me an Nvidia shill when I fucking hate their guts and everything they currently represent.
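A toy 1D sketch of the jitter-and-accumulate idea being described here, generic temporal accumulation only, not Nvidia's actual pipeline (the learned part sits on top of this):

```python
import random

def scene(x):
    # "ground truth" signal with hard edges every 0.1 units
    return 1.0 if int(x * 10) % 2 == 0 else 0.0

def taa_accumulate(pixel_center, pixel_width=0.1, frames=64, alpha=0.1):
    history = scene(pixel_center)                            # first frame, no history yet
    for _ in range(frames):
        jitter = random.uniform(-0.5, 0.5) * pixel_width     # sub-pixel offset
        sample = scene(pixel_center + jitter)                # jittered resample
        history = history * (1 - alpha) + sample * alpha     # exponential blend
    return history

# A pixel sitting right on an edge converges toward the area average (~0.5)
# instead of snapping to 0 or 1 like a single centered sample would.
print(taa_accumulate(0.5))
```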
Yeah, this is 100% what's happening, but it would have happened without DLSS anyway, because Epic/Unreal were already pushing for the whole industry to rely on their TAA/TSR as a cheap denoiser/AA solution/res upscaler anyway before NvIDIOT introduced DLSS. So they would have pushed for their software RT lighting and super geometry meme garbage regardless and they needed a way to reduce the render resolution to save performance without the output image quality suffering too much.
It was inevitable and the only way to fight against this shit is to not buy these unoptimized games.
4k monitor here, it looks like shit and always will. it's just a tool for devs to make their games unoptimised as shit
4k monitor here
unoptimised
sed
s
I doubt it. You probably live in a mud hut.
games are not "real" so all frames are fake
This is what passes for professional 4chaners
Ok retard
looks like shit
fake, since it'd say "rezolution" like "optimise," because terrorists can't speak American
4k monitor here, it looks great. once it has enough input pixels and output pixels its visual loss of quality becomes extremely difficult to notice
it looks like shit
brother gets a 5080 and messages me