Why does Anon Babble hate ray tracing so much? It's an objective visual upgrade over raster and makes development significantly easier
Why does Anon Babble hate ray tracing so much...
Tiny visual upgrade that requires a massive increase in GPU power.
Graphical fidelity plateaued like 10 years ago and ever since then they just keep making up stupid gimmicks to coerce you into buying $1000 GPUs every other year
objective visual upgrade
I'm looking at doom eternal vs doom dark ages and I see zero difference in graphics.
Errm, your heckin' shadows and reflected lights simulations, lil' bro?
Inb4 250 yous just saying shit like
It helps le developers lighting easier
Games look the same it but runs worse
Anything above gtx 1060 is a meme
Poorfags holding back games
Saaar buy ray tracing gpu
And Dark Ages runs like shit
If they used the same power for raster, we would have unironic photorealism.
TRVKE
Because they bought novideo cards with no drivers.
Tiny visual upgrade that requires a massive increase in GPU power.
It really doesn't though. Doom DA just released and it runs at 60fps with perfect frame timing on anything above a 10XX series card, which is 9 years old at this point. This means the largest-scale game with RTGI so far runs at 60+ on 20 series cards. It's literally less demanding than ambient occlusion if the hardware supports it
Baked lighting is infinitely better than raytracing
the best looking movies use unrealistic lighting to make things look good, I thought games wanted to be movies?
oh zamn is that the "mario's a nigger" dude????
Yeah, it's totally normal that 1 to 2 gen old GPUs are suddenly absolutely worthless, whereas I managed to take my GTX 980 to fucking 2022 before upgrading.
It's totally normal that games released in the past 5 years look WAY worse than games released a decade ago, yet they miraculously have much higher requirements.
It's also totally normal that even the most modern GPU struggles with all the modern bullshit gimmicks and poo coding.
What a terrible example. There is no measurable difference between the graphics of doom eternal and doom da, except that DA runs worse.
That looks like a gamecube game
the best looking movies use unrealistic lighting to make things look good
RT doesn't mean realistic, you can adjust it to be stylized
shhh don't break the matrix
8 year old GPU can run games that look better than shit releasing today
But you need to spend $6900 on a gpu to play games at 144p upscaled because "more accurate lighting" that in 99% of the time could have been baked
buy novideo
get scammed
meanwhile on amd: tomshardware.com
Because nearly everyone on Anon Babble is a reactionary retard about everything.
well over 30 fps
30 fps
The environments in DA are about three orders of magnitude larger than those in Eternal
except that DA runs worse
Only if you're playing it on a card without hardware accelerated RT, at this point we're talking hardware from the first Trump term or handhelds
8 year old GPU can run games that look better than shit releasing today
like what?
Half Life Alyx
Guess the author is comparing it to consoles. It's a lot higher than 30 in reality.
I'm poor okay
Half Life Alyx
Yeah...
Half-Life Alyx is a perfect example of a game that could've used a little real-time raytracing. Because every time you pick anything up all of the shadows it casts disappear and it's very jarring.
makes development significantly easier
And that's the only reason it is being pushed. Because modern developers are incompetent and lazy, and they don't give a damn that the method they are choosing is for their benefit over the customer's benefit.
RT is not the problem. Forcing it as necessary when best practices and optimizations are still being developed is a problem. Particularly when that "visual upgrade" is minor in 99.9% of cases, and an actual downgrade when the developers don't actually understand how to set the lumen values of something, like, say fire, and make it much more bright than it should be.
I have a RTX4070TI and I dislike ray tracing because it's a 50% FPS decrease for a 5% fidelity increase
Kill yourself
22k updoots
this is what happens to a website when you aren't allowed to call other users retarded
I don't know what you guys are complaining about, raytracing looks amazing.
Graphical fidelity plateaued like 10 years ago and ever since then they just keep making up stupid gimmicks to coerce you into buying $1000 GPUs every other year
This. It is literally impossible to improve graphics in any meaningful way. Companies should try to improve interactivity instead. All new games should have 100% destructible environments with real world physics on every object, and NPCs should react intelligently when you punch every object and wall in their house into a million pieces.
It's an objective visual upgrade
Often times it looks the fucking same and just runs way worse
makes development significantly easier
a) that's a benefit for devs not consumers
b) game development takes longer than ever now
The previous generation consoles had the single core perf of a 1.6GHz Core 2 Duo. This ruined all cross platform games.
gay tracing
The best return in graphical improvements would be to create better tools for making the assets.
But nope, gotta use autodesk slop forever, or use penguin shit that IS better than the autodesk crap.
It is literally impossible to improve graphics in any meaningful way.
What a retarded thing to say
You're mistaken, but in a way that reinforces your argument.
Core 2 duos are faster than the jaguar cores.
But that's hard, it's easier to just not optimize and call it an advancement of graphics
Mirror's Edge is actually a really good example of why bitching like this is myopic. The game is so short and tiny because the early GI system they had to bake out took like days if they touched any of the level geometry. So you ended up with a game with really small segmented maps. That, and the baked lighting meant every dynamic object is lit completely differently. So for example none of the physx objects they added in the PC version can cast shadows, which looks weird as hell.
Part of the reason why Catalyst looks like shit compared to Mirror's Edge is because they wanted to have a much bigger and more open map, but because of that they literally could not do the same lighting tricks. That wouldn't be a problem today.
The fact my 3070 runs so many games like shit already when you look at what was 'top of the line' visually when it was a new card compared to now says it all
Why does Anon Babble hate ray tracing so much?
Poor people hate things they can't afford.
You've been fooled into thinking that a 0.1% increase in graphical fidelity is a huge revolutionary breakthrough because you don't know what real massive breakthroughs in graphics were. Similar to how phonefags go apeshit about how crazy a design of a phone is when it shaves off like half a mililiter of screen bezel.
That's why I specified 1.6 GHz Core 2 Duo. Look up the benchmarks for a SU8500? and ps4.
Obviously a q9550 @ 4.2 GHz is faster
Remember when things like Anti Aliasing had an option to turn it off? It isn’t your GPU. It’s the ray tracing.
/thread
Anon Babble can't afford a $30 card?
Anon Babble can't afford a $30 card?
The PC playerbase has completely gone upside down since Covid. Half the threads I see now have people trying to justify playing in 720p.
People not buying something =/= people not being able to afford something. It means they don't see the value in it. No, billionaires don't wipe their ass with $100 rolls of toilet paper, because that's terrible value and even if you can easily afford it you'd be retarded to buy it.
I can't afford a boat but I love boats
Don't drink beer for 3 days and you afford a raytracing card.
People not buying something =/= people not being able to afford something. It means they don't see the value in it.
That's great an all, but that's not what OP asked. OP asked why Anon Babble "hates" ray tracing, and the answer is simple, because people are poor and are running console tier rigs these days.
It's a middle-class mindset, you see it on Anon Babble all the time from retarded goys. They earn slightly above the median income or break into six digits and think they're loaded, and waste money on stupid, senseless shit.
this
Can’t you guys sue one of the GPU makers for promising compatibility with modern games? Every single one advertises that it can raytrace, which is a lie.
If you can't game in 1440 at 60 in new releases, you should either just be playing on console, not gaming altogether until you have more income, or should upgrade.
Simple as.
this
ding ding ding winrar
NOOOOOOO! IT DOESNT MATTER IF MY PC CAN'T MATCH THE SERIES X! I SHOULD BE CONSIDERED MASTER RACE!
Do you see the infinite blur here? Or maybe the upscaling AA shit all over the edges of the objects? Or the endless fucking half fog? Yeah, me neither
graphics have a barely noticable difference
one runs on 8 y/o gpu just fine
the other requires a gpu released just a picosecond ago
tech threads on Anon Babble are composed of a couple autists and a couple autistic shitposters that post those same talking points without end. Monitor and GPU threads might as well just be started by one guy getting ready to be an autistic fag the whole time.
It's one of those funny things with people making less and less money relative to inflation. It used to be that a status symbol for a young man was a house. Then houses got so expensive it wasn't viable to afford one until way later in your career if at all, which left younger men with more disposable income (not enough to buy a house though), so they bought expensive cars instead, basically as a cope. Now it's moved beyond that into buying graphics cards and gaming peripherals because young people can't afford expensive cars.
What if I only play games made before 2010?
All better graphics are used for is an excuse for devs to re-release the same game but prettier.
The only time this is broken is when the game actually does look like a step up above the industry like Cyberpunk 2077.
I'd rather just play games that are fun and immersive experiences that focus on sound to be honest since it's the cheaper option to immerse you and just works better for me.
I noticed that any game that has DLSS option runs like complete dogshit without it while any other recent games that don't need it works fine.
i can run Space Marine 2 on high/epic settings at a steady smooth 60 FPS, but Darktide runs like fucking shit; without DLSS i get 20-30fps, while with DLSS on it rises to 50fps at best.
I have to play the game with everything on low to run it smoothly.
DLSS feels like it's made for incompetent devs that can't optimize for shit.
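For context on what DLSS is actually doing: the game renders fewer pixels internally and the upscaler reconstructs the rest, so pixel cost drops with the square of the scale factor. A rough sketch; the scale factors below are the commonly cited preset values and should be treated as approximate:

```python
# Why DLSS "finds" frames: the game renders fewer pixels internally
# and the upscaler fills in the rest. Pixel cost drops with the
# square of the per-axis scale factor.

def internal_pixels(native_w: int, native_h: int, scale: float) -> int:
    """Internal render-target pixel count at a given per-axis scale."""
    return int(native_w * scale) * int(native_h * scale)

native = 3840 * 2160                                # 4k output
quality = internal_pixels(3840, 2160, 0.667)        # ~44% of native pixels
performance = internal_pixels(3840, 2160, 0.5)      # 25% of native pixels

print(round(quality / native, 2), round(performance / native, 2))
```

Which is why turning it on can recover so much framerate: half the resolution per axis is only a quarter of the shading work.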
What if I only play games made before 2010?
this is another cope excuse.
a decent car costs less than a good gaming pc in 2025
kinda fucked up if you think about it
GTX1080ti
Totally overpowered for its time, making every game look optimized
RTX2080
Unable to run games with RTX on.
RTX3080
Same as above.
RTX4080
Same as above
RTX 5080
Same as above, but now you can create fake frames to give the illusion that you can run a game with ray tracing on
My PC cost me under 3k and can hit ultra settings at high fps in new titles.
A decent car costs far more.
I have a 5080 and it runs RTX just fine with no frame gen.
The ease of development has absolutely NOTHING to do with me or my expectations as a customer. If it makes the game I PAY FOR run worse, then I am against it. The difficulties developers face ARE NOT MY PROBLEM.
But modern games are poop...
I drove a $450 car for 5 years
Graphical fidelity plateaued like 10 years ago
No one thinks this, 10 years ago was Bloodborne, that game absolutely is not even close to what is the peak of fidelity today
inb4 uhhhh but that was kneecapped on console
Fine then, 10 years ago was also The Phantom Pain. Big Boss's hair still looks like shit even on max settings.
You look like shit on max settings.
how would you know? you havent played any.
driving a used car
so you're poor then, cool.
I've played some and I don't want to play any more
Sure. I love being poor because I can do what I want. I don't get harassed because I wore the wrong type of hat for playing tennis.
I've yet to play a game where my 3060ti can't do that
Nooo goyim you don't understand you need to spend 50k going into debt buying a new car from shlomo
lol
At some point you have to realize a lot of Anon Babble is still playing at 1080p and thinks there's no difference, which technologically is now the equivalent of saying the eyes can't perceive past 30 fps
The fact that a 50k purchase, not even in one lump sum mind you, puts you into heavy debt says a lot about your financial situation. People like you are the ones bitching you can never afford a house because you have a shit job. I bought a house after 4 years at my job.
I like MPR but you should have used like, Counter Strike 2 or Half Life Alyx since those games look really good and use baked lighting.
having a job
calling people poor
Fuck off wagie
Don't bitch about things costing money when you want to act like a NEET, you might as well be a fucking communist at that point.
It's an upgrade, yes. But is it wise to spend 1000 bucks so your games look 1% better? This anon thinks not.
Lying on the internet on an anonymous image board
I hope it makes you feel better about your shit life!
spbp
doom runs well actually
lol
lmao
that game blatantly used raytracing as a means of cost cutting
No I just actually fucking applied myself and took loans to go to med school instead of leeching off my family. Now I have a job and can actually buy things I want, and go on vacations, and not have to worry about money problems. What stopped you from doing the same?
5% improvement in graphics
20% increased costs
for
-50 % improvement in gameplay
No thanks, not worth it. Buying expensive GPUs to tinker with local AI models fine, for games, dont be a retard, not worth it.
Went to med school
Got a job with time to do any of that
Nice LARP. I figured you were gonna hit me with 7 figure programmer lmao.
Super long hours is a hospital thing, I work in outpatient where we work regular shifts mon-fri.
Fuck the ray tracing. Any time a game comes out that looks halfway decent there's threads of people crying about optimization. They lie about the gpu they have DONT WORRY BRO I GOT THAT RTX 5 BILLION, never mention their shitty 12 year old cpu, don't install anything on an ssd, etc. It gets old. Surveys and data already showed most people aren't anywhere close to having a good PC.
Poor people should stick to working to feed their families, gaming is hours wasted
So you make less than 6 figures yet can easily afford to drop more than half your wages on a car while paying a mortgage.
Again nice LARP lol. Hope it was worth it.
Surveys and data already showed most people aren't anywhere close to having a good PC.
tbf you have to keep in mind all the thirdies in that survey when steam says most people have a 1060.
I'll fully admit that NVIDIA is overpriced but at the same time, you don't have to buy the absolute best thing on the market to play at max settings as long as the rest of your rig isn't shit.
raytracing in fortnite straight up ruins the snow areas
it's cool in buildings and castles and shit but my god does it look like shit in snowy areas (which are my favorites)
that's the only game where i cared enough to enable it, any more questions?
yet can easily afford to drop more than half your wages on a car
Yeah anon it's called paying in installments also the government paid for half of it on a rebate because it's a Tesla so in reality it was much cheaper than that
The Polaris RX 470-580 cards were the same price as 1060 and those can raytrace just fine. I think it's just a case of getting scammed by novideo and being too ashamed to admit it.
don't install anything on an ssd
Man I played TWW2 on an HDD for a while, literally a 5 minute load time on shit.
You can run Doom Dark Ages on highest preset with a 3050 at 1080p at around 60fps.
Granted, it's upscaled from 340p and looks like shit but it's playable.
So there's no excuse
This
muh hair
That's a testament to how little graphics have improved when you're nitpicking stuff like this. When was the last time we had a graphical leap as big as ps1 to PS2? Or the latter to PS3?
So... you went into more debt after mortgage and student loans for a shiny explosion machine that even China banned? Very cool goyim.
at 1080p
anon...what year is it?
muh hair
I brought up the hair specifically because the game makes you look at it so often with its helicopter closeups, it's REALLY noticeable.
makes development significantly easier
How does this affect me in any way?
Well for one games being easier to develop means they generally come out with less glaring bugs, less spaghetti coding, not being delayed because of development issues, etc. Things that do all negatively affect the consumer because they make a worse product.
cool
still not buying whatever jeet outsourced pos you're selling mr shekelbergbaumstein
less glaring bugs
less spaghetti coding
not being delayed because of development issues
All three of these things have been on the rise industry wide, you are living in a fucking bubble. If devs are going to cost cut on me with no return, I'm going to revenue cut on them.
Tiny visual upgrade
Graphical fidelity plateaued like 10 years ago
Tell me how I know you still play at 1080p and that's why you can't tell the difference.
Caring about anti aliasing and all that shit is inherently autistic. These people think that implementing technology is what makes something have "good graphics" rather than the game being visually pleasing.
If GPUs didn't continuously double in price, allowing for a steady upgrade cycle, then you could blame the cards for being old. We don't live in that fantasy land, so upgrades aren't a thing, and anything that kills performance is simply a non-starter, as there is no extra performance to spare.
tldr: rt and nvidia a shit.
bloodborne looks the same as all the AAA slop published today
And runs on cards from the fucking middle ages
No one who says this owns a 4k monitor and plays at that setting. Bloodborne is a fucking 1080p, 30fps (sometimes, when the planets align) game
It's slow. It takes seconds for light to fill up and leave rooms.
It relies on temporal antialiasing slop to denoise, which makes the graphics look blurry and vague.
It takes far more computing power to achieve similar results to modern rasterization rendering techniques. I prefer motion clarity and responsiveness (360hz monitor) to pinpoint accurate lighting and shadows.
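The "light takes seconds to fill up rooms" lag falls out of how temporal accumulation works: the denoiser blends each new noisy frame into a running history, so a lighting change only converges over many frames. A toy sketch, assuming a simple exponential moving average (not any real engine's code; the blend factor is a made-up but plausible value):

```python
# Toy model of temporal accumulation: the denoised value is an
# exponential moving average of per-frame results. With a small blend
# factor, a sudden lighting change takes many frames to settle, which
# reads as light "filling up" a room.

def frames_to_converge(alpha: float, threshold: float = 0.05) -> int:
    """Frames until the old value's remaining weight in an EMA drops
    below `threshold`. After n frames that weight is (1 - alpha) ** n."""
    frames = 0
    remaining = 1.0
    while remaining > threshold:
        remaining *= 1.0 - alpha
        frames += 1
    return frames

# With a hypothetical 5%-per-frame blend:
n = frames_to_converge(0.05)
print(n, n / 60)  # frame count, and how many seconds that is at 60 fps
```

A heavier blend converges faster but leaves more visible noise per frame, which is exactly the ghosting-vs-grain tradeoff these denoisers juggle.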
the ps4 emulator can't run it at 4k 60 fps?
nobody gives a shit dude, it all looks the same
It can but it's not a true 4k, it's AI upscaling. And even then it can only sorta do it because it turns out running it at 4k makes the game really fucking buggy and unstable because it was never made for that.
Please, get some prescription glasses, being half-blind isn't good for your life.
you are personally responsible for why games are so fucking shit now. stop encouraging devs to waste their time on pointless graphics garbage, you dumb slut
it doesn't make a difference
actually it does but that's a bad thing because it's made everything more expensive in order to do it
You're exactly the same as the retards who 10 years ago said 60fps is pointless because there's no difference between it and a solid stable 30fps. The talking points have changed but the mindset hasn't.
60fps is pointless
It only works on nvidia xx90 cards, to begin with. Not even a 5080 runs it properly.
I was playing deus ex human revolution earlier, and it's still a beautiful game all these years later. it released in 2011. like fear and loathing in Anon Babble says, graphical fidelity peaked like a decade ago.
throwing more polygons and lighting effects and etc ain't gonna make your game next gen.
Games in the fucking NES era could run at 60 fps (sometimes, in a blue moon, with a real struggle), and yes you definitely could tell the difference even back then.
10 years ago was Bloodborne
The Order 1886 came out in 2015. Bloodborne just didn’t care about graphics (and Fromsoft still doesn’t).
irrelevant, it doesn't make a game good. you waste all your time caring about lipstick on a pig
it doesn't make a game good
It literally is the difference between a fighting game being good or not because framerate is directly tied to things like recovery frames.
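The recovery-frames point is just arithmetic: fighting games count time in frames, so the same frame data means different real time at different framerates. A minimal sketch (the 12-frame window is a made-up example):

```python
# Fighting games measure moves in frames, so frame data only means
# what it says at the target framerate.

def frames_to_ms(frames: int, fps: int = 60) -> float:
    """Duration in milliseconds of `frames` frames at `fps`."""
    return frames * 1000.0 / fps

# A hypothetical 12-frame recovery window:
print(frames_to_ms(12, 60))  # 200.0 ms at 60 fps
print(frames_to_ms(12, 30))  # 400.0 ms at 30 fps, twice as sluggish
```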
at 60fps with perfect frame timing on anything above a 10XX series card
not only are their recommended specs a 3080, it doesn't even run well on that
i'm glad to know that every fighting game that runs at 60fps is good. don't backpedal now
I was playing deus ex human revolution earlier, and it's still a beautiful game all these years later.
Now play the Director's Cut which has the piss filter turned off and holy shit do you understand why that filter is there. There are so many shitty jagged edges on every model that the filter explicitly hides.
Funny you talk about framerate when most of the graphical gimmicks in use these days reduce framerate stability, to say nothing of the fact that no hardware configuration known to man seems to be able to run a UE5 game well.
Do you wake up a retard or does it take effort throughout the day to reach that position? I didn't say 60 fps automatically makes a fighting game good, what I said was a game being less than 60 stops it from being good, and only an idiot couldn't infer that from my statement.
It's even more baffling because 16 and Eternal were incredibly well optimised and could be run on cards generations behind recommended specs. What the fuck happened?
good to know there are zero good 30fps fighting games. don't backpedal now
Anon, crysis ran at 1080p 19 fucking years ago.
good to know there are zero good 30fps fighting games.
You're right.
i hope you pay for your sins
Virtua fighter 1 was alright, but VF2 doubled the frame rate and this immediately made it infinitely better.
And VF is the most forgiving fighting engine i know of.
Yeah, it's also a 1080p game, what's your point? It was also literally the exact kind of thing people bitch about modern gaming today, that you needed a monster giga rig to run it and that being able to run Crysis was the benchmark for flexing on poorfags of the day.
Ray tracing benefits lazy devs, not the consumer. Bespoke lighting takes more time to make but results in greater soul.
Crysis did look significantly better than anything else released back then.
Modern "heavy games" look about the same or worse than the "lightweight" ones.
Hell, modded crysis is still able to mog most of the modern games, and we're talking about a 17 year old engine.
Everyone knew Crysis needed top notch hardware to run, it wasn't a secret and they made no attempt to delude people into thinking you could run it with hardware two generations behind like ID has with TDA.
It could, but Crysis was insanely unoptimized at launch, which is exactly why the whole "my PC can run Crysis" meme was a meme: only someone with top of the line equipment could run it on max settings and not have their PC shit itself in the process. So basically what that anon said, it was a flex on their specs.
I can't believe I bought into rtx cards.
Actually, crysis keeps mogging shit, even modern crysis.
It runs raytraced realtime ambient light on the fucking switch.
What the fuck happened?
Updated engine which explicitly requires ray-tracing enabled hardware. That alone is a gigantic red flag.
You're missing the point that Crysis didn't need those specs because they asked that of you beforehand, it needed those specs because Crysis WAS SHITTILY CODED. I repeat myself, it is the exact same problem people have with modern gaming today: devs asking you to own the most top of the line shit for their graphical fidelity because they can't do their job and write well-optimized, non-shit code.
But it is the raytracing. In games without it or at least the option to turn it off then that old GPU can run the game just fine. Adding it causes a huge hit to overall performance, and for what?
Yes, it was a single core effort, but even with that, it was the best looking game from the time by a mile.
Which would be an argument against modern games today if anyone here was actually playing them at max settings like people who flexed with Crysis did back then. But no one does, everyone is playing at like Medium settings, 60fps if that.
you NEED hardware rayt-
1080ti still can game bro
he goes from 500 dollars to 50k out of nowhere
You don't have the money to play video games responsibly. Get a job. A real job.
Let me just make this clear for people who don't seem to get this. If you're saying you can't tell the difference and your monitor can't output in 4k, you are literally incapable of telling the difference, because your PC isn't capable of displaying it. You can set it to 4k in the tab, but it's not actually outputting in 4k because your monitor can't do 4k.
Is this a good thing? No not really, games needing the best specs just to bruteforce coding issues is a problem, I just take issue with the idea that it's a modern problem because as Crysis shows, it's been a problem on PC for at least 20 years (really longer, Doom 3 also had this problem, just less so).
Like this video, I guarantee maybe 5% of Anon Babble can actually watch this video at its 4k setting and their monitor is displaying it properly.
I find it ironic that NVIDIA keeps calling rendering without DLSS/framegen "brute force rendering" when that's quite literally what ray tracing is. Baked lighting was the optimized approximation and real-time RT is the dumb brute force solution
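The baked-vs-brute-force framing can be made concrete: baking pays a huge sampling cost once, offline, and each frame afterwards is a cheap lookup, while real-time RT re-estimates the lighting integral with a few noisy samples every single frame and then leans on denoisers. A toy Monte Carlo sketch (made-up integrand, not renderer code):

```python
# Toy illustration: baked lighting spends its samples once at bake
# time; real-time RT re-runs a much smaller, noisier estimate of the
# same integral every frame.

import random

def estimate_irradiance(samples: int, rng: random.Random) -> float:
    """Monte Carlo estimate of a made-up lighting integral
    (mean of a random 'incoming light' function)."""
    return sum(rng.random() for _ in range(samples)) / samples

rng = random.Random(0)

# Offline bake: tens of thousands of samples per texel, stored once.
baked = estimate_irradiance(100_000, rng)

# Real time: a handful of samples per pixel, redone every frame --
# hence the noise, hence the temporal denoisers.
realtime = estimate_irradiance(4, rng)

print(round(baked, 3), round(realtime, 3))
```

The baked estimate lands almost exactly on the true mean; the 4-sample one scatters widely from frame to frame, which is the noise the denoisers exist to hide.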
People don't want good looking games, they want to play on ultra.
Actual 4k is still a pipe dream in most cases with all the upscaling and upsampling and upfucking messinventing shit.
You can do it with older games, but there's no focus on doing it, or doing it and doing SSAA for people without 4k screens.
No, 8k is the pipe dream. 8k EXISTS but it's not going to be an actual real thing for like... 15 years, if that. It's asking so much more than 4k that it's utterly insane.
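For scale, the jump is easy to put in numbers: shading cost grows roughly with pixel count, and each resolution step quadruples it:

```python
# Pixel counts behind the 4k-vs-8k argument: each step up quadruples
# the number of pixels that have to be shaded every frame.

def pixels(width: int, height: int) -> int:
    return width * height

p1080 = pixels(1920, 1080)   # 2,073,600
p4k   = pixels(3840, 2160)   # 8,294,400
p8k   = pixels(7680, 4320)   # 33,177,600

print(p4k // p1080, p8k // p4k, p8k // p1080)  # 4 4 16
```

So 8k asks for roughly 4x the work of native 4k, and 16x that of 1080p, before any RT effects are even layered on top.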
If you can play your game on ultra, and it's actually running on ultra and is stable, and it still doesn't look good, it probably wasn't that graphically impressive to begin with. There is a world of difference between a AA title on ultra and a AAA one.
Be so retarded you don't know the difference between a new and used car.
They don't give a fuck. Delete the high graphic settings, rename medium to ultra and 75% of optimization complaints disappear.
Do you go on boating sites and say a boat you can’t afford is dogshit?
Sad thing is a lot of these people don't even keep up this consumerism because they get harassed irl, but because people on an anonymous cabbage farming chatroom talk them into it
Bro you sound like a total loser. Keep changing those bed pans and fronting like you’re something.
This is what they did for a long time and people were happy. It's 90% of why people liked pascal and to a lesser extent Maxwell so much, ultra was just console settings but with some placebo uplifts in things like shadow resolution.
They could also make it more clear that most games are comparable to pc minimum settings, or in the case of a lot of recent games like Indy a unique ultra potato mode.
I recently changed my 1060 not because I needed it, but because there was a good deal within my reach. Also it was 10 years old already, the wear would show soon. But certainly the difference was almost none and I was playing recent releases. The AAA games are the ones that look the same as before but run as shit on it though. Now that I got my new card I'm having no issues, but the games i play are still the same. It is absolutely an optimization issue
What’s it like driving around alone in your tesla and passing a guy in beater car with the hot nurse you pine for in the passenger seat?
Why would I need Ray Tracing to play Elona?
did someone make a mod that actually just fucking 2x or 3x's the pixel ratio yet
i want to play some elona but i need to fucking SQUINT or fuck with my screen resolution
Seething console faggot