Doom The Dark Ages on PC is a *MASTERFULLY* optimized game
youtube.com
but Anon Babble told me that it's not well optimized...
if only we could optimize artistic taste
The framegen is good in this game. No extra latency and I'm getting a solid 240 fps
~50fps with dlss on a 4060 while looking like a ps4 game
sounds like ass
framegen
I'm never in a million years turning that shit on.
Game ran like a dream for me and that's all that matters.
oh my science I love raytracing!
digital foundry
LMAO
If he was alive, he would be calling out forced raytracing as the scam it is. Now there's only shills left.
4060 with dlss @50fps
lol
so wait with a 5070 does it get to 100fps with dlss?
truly a sight to behold
3060 ti can't run a game at native 1080p medium settings without drops to sub 60 fps
optimized
Digital Foundry are fucking shills and water is wet
If only DF could be optimised out of life.
If he was alive he'd be blaming the president for getting cancer and demanding we all pay for it.
forced RT
optimized
Id has always been top-tier when it comes to tech. On the other hand, their penchant for tech demos nearly killed them when Rage flopped.
he'd be blaming the president for getting cancer
possibly true
90% of the contents of his fridge would be deemed unfit for human consumption in any other country
Were you crying about pixel shaders becoming mandatory too? Did you cry when Shader Model 2 became standard and replaced SM1.1?
He could have lived to see the wonders of ozempic.
dose it have lenovo yes or no
post GPU
framegen
digital snakeoil
Have you ever seen a salad in your life?
okay, let me run it on my 1060
Your card is 5 years old.
bros wtf I can't run chaos theory
CT actually supports 1.1 and 3.0 but not 2
/ESL/ general?
I'm on 9070 XT. 3060 ti should run every game at 1080p 60 fps medium settings minimum.
I have a 2070 super.
Id Software does it again, the best programmers in the industry.
if it can't run at native 1080p@60 on my baby then it's not worth playing
Yeah
RTX 4070
6 fps
lol
Anon, I hate to break it to you but your little 1060 isn't a gaming powerhouse and never was.
3060 ti should run every game at 1080p 60 fps medium settings minimum.
Forever and ever? What do you base this on?
rajesh upgrade already
still watching df after they have been outed as shameless shills
I get 400+ FPS with goy generation with everything on ultra nightmare at 1440p. Cope.
Forever and ever? What do you base this on?
Base hardware for game development for this gen was set with PS5.
PC GPUs don't magically become less powerful with time.
I'm guessing it's because my drivers are too old, but 535 is as new as you get with Debian 12 stable right now, and idk if it's worth trying to install slightly less out-of-date drivers from nvidia cuda and risking my package manager throwing a fit later. I'm thinking I'll just refund the game and try again later when it hopefully has better support.
What distro are you using?
5080 slower than AMD and many 40 series cards.
Yeah, I get it, it's more like Nvidia optimization, but still. You buy a 5080 and it runs like that, in your mind it's not well-optimized.
Well-optimized
When he gets to the optimized settings, every single setting shows almost no performance difference between medium and ultra nightmare
The 4060 can only pull off 60 FPS with most settings on low, and even then it still drops below
Fucking lmao. I can't take these guys seriously anymore.
The kicker here is that the game barely looks better than Eternal.
The irony is that it's not only ray tracing exclusive game, but also sponsored by Nvidia.
why are you using debian with nvidia bro thats just retarded, respectfully
Debian 12, which I'm willing to acknowledge is probably part of the problem.
Have you considered that console settings a) aren't exactly 1:1 with PC, b) the unified memory architecture might actually have some performance benefits not present on PC, c) it's easier to optimize a fixed target, and d) consoles do tricks like dynamic resolution scaling that PC doesn't?
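Point (d) is doing a lot of work here. The core of a dynamic resolution controller is tiny; this is a minimal sketch with made-up thresholds (real engines use GPU timing queries and fancier control loops):

```python
# Minimal dynamic resolution controller sketch. The budget, step size and
# thresholds below are illustrative assumptions, not any engine's real values.

TARGET_MS = 16.7            # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0

def adjust_scale(scale, last_frame_ms):
    """Nudge the render scale toward whatever hits the frame budget."""
    if last_frame_ms > TARGET_MS * 1.05:      # over budget: render fewer pixels
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.85:    # comfortably under: add pixels back
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [22.0, 21.0, 18.0, 16.0, 13.0, 13.0]:
    scale = adjust_scale(scale, frame_ms)
print(round(scale, 2))
```

The scale drops while frames run long, then creeps back up once there's headroom, which is exactly the trick PC builds usually don't get (or only recently started getting as an option).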
Without clicking on the Link I bet that's an Alex battaglia vid
If you won't optimize your PC version the way you do your console version, why should I buy it?
Yes but devs get better at optimizing for that hardware, my poor friend
isn't he the resident pc nerd besides the other guy with the small teeth
Simply because it's what I was used to before I got the card.
Everything consoles do is possible on PC and medium settings should be the console equivalent settings.
Sony games actually do that right on PC, where medium settings are called something like "standard", low is below consoles, high is above consoles and ultra is overkill.
Every time a "ray tracing exclusive" game runs well on AMD you can bet it's actually RT lite, meaning the level of RT utilization remains low. Some games utilize it more, and Nvidia always seem to do better the more it's being done.
Unlike eternal this has destructible environments with physics
switch to endeavorOS for you own sake bro
He's also the biggest faggot on DF
Damn, that explains the game running at 1/5th the framerate, totally worth it, I'm sure the destructible wooden shacks add tons to the gameplay!
optimizing for an incalculable number of different systems is exactly as easy as optimizing for a fixed target
only AMD chads a generation or 2 behind can game on debian
You can't expect a 9th gen game to make "full" use of RT, as retards like to call it, since it has to run on consoles.
And path tracing runs like shit on everything that isn't 4090 at 1080p, unless you're using some insane upscaling from extremely low resolutions, which is why ray tracing capabilities of stuff like 5080 aren't game changer compared to AMD anymore.
you gotta be the retard of the century if you still watch DF drivel
He's the one that spergs about ukraine on not twitter all day.
Not my problem.
I believe it is, for a game with forced raytracing due to retarded devs
That's why Crysis was so taxing back in the day you ignorant zoomer
I see. What's a good distro for gaming? Bazzite perhaps? I'm on w10 and was thinking about switching to linux after seeing how shit w11 is becoming
The engine heavily favors AMD GPUs. It's been the case for over a decade. This new breed of hardware warriors are actual troglodytes.
Path tracing also currently looks like shit no matter what GPU you have because the number of actual rays being cast is pathetic and all the tricks and temporal rendering shortcuts needed to get playable performance just make the whole game end up looking like shit.
Like with webm related.
And no, this is not just an UE5 issue, this game doesn't even run on UE5.
What exactly is your point here? That we should try and emulate the mistakes of the past?
This game's physics aren't even as good as Crysis's were you fucking retard.
poors calling the game shit because they can't run it
It's literally every new release these days. It's fucking funny
2025
using vsync
It's funny, because this basically simulates software lumen by using actual hardware raytracing and runs even worse lmao
You need it enabled for async to work. Are you mentally handicapped?
No-one said anything about path tracing. Plenty of add-on ray traced games run better on Nvidia. They just put more elements of it in a game which favors Nvidia. AMD has strong raster performance so the more it relies on raster the higher AMD climbs.
mistakes
Kys dumb cuck
no you don't lmao. you can just turn on gsync/freesync then cap frames below refresh rate. just works.
Path tracing is the main reason to care about raytracing these days.
RT reflections are nice, but mostly a gimmick since consoles can't use them.
RTAO and RT shadows are worthless, literally no reason to use either of those instead of currently-available raster techniques outside of cutting down on dev time, and if your game needs to sacrifice performance for dev time it's going to flop anyways.
RTGI is hit or miss, most implementations are mediocre while a few are decent like Metro, but it still has a lot of the same issues as path tracing.
Nvidia flavored ray tracing games introduce ray tracing as a bonus. Disable ray tracing, and you actually run the game the way it was intended to begin with.
The games that were actually made around ray tracing, have to be optimized enough to run well on consoles, which are weak when it comes to ray tracing.
That's why there's going to be less and less Nvidia flavored ray tracing games, until the next gen, which once again, will be using AMD hardware, but AMD already got decent enough with ray tracing.
Well, not Debian as we're seeing here. Seems like Fedora is good from what I hear. Lot of people also get mileage with Mint or Ubuntu, which also happen to be fairly easy first distros to get started with.
It still has random crashes and the music bugs out and refuses to play quite often. I wouldn't call that optimized, my dude.
framegen
no extra latency
Sorry bub, but that is 100% incorrect.
why did Anon Babble go from loving to hating DF? Some say because they're Sony shills, but they've been vocal that shit like the PS5 Pro is not worth it
Wrong. Most ray traced games don't use path tracing. They just apply varying degrees of RT, and the more they apply the more it favors Nvidia. That's just the simple truth. Not all games are going to have an equal amount of RT. And no, it's not a choice between "le AMD-appropriate RT" and path tracing, because in most RT games Nvidia already wins.
Bullshit. You are trying to sugarcoat the argument when the facts don't even support it. The game is still going to have plenty of optimization factors than just "keep it low enough so AMD becomes just as good."
Regards think that gay tracing is implemented as a visual upgrade, while it's main benefit is that you no longer need to actually light anything. everything is now a light source so just place a bunch of luminant objects in a scene and you're done.
RTAO and RT shadows are worthless
Nigger I hate seeing shadow maps slice in, and RT shadows are fucking cheap compared to reflections.
It's weird how DF jacks off to things like cloth, hair, water and other environmental physics as well as degradation systems when those things have been standard for over 20 years now
runs fine on my Intel Arc770
Nigger I hate seeing shadow maps slice in
Already solved without RT shadows, if a dev is so lazy that they can't implement smooth shadow cascade transitions into their game and need to rely on RT then I'm not paying for their game.
Lazy in, lazy out.
They're not solvable with any level of cascade, RT makes them work at any distance.
"full rt"
not even 5080 can handle 60 fps, while upscaling from sub-1080p resolution
You're literally proving my point, retard
DLSS
Medium
$400 8GB Graphic Cards
Radeon 6600 cards flopping for the ESL third worlders too
8GB GPU VRAM is not enough. Buyer's remorse thread for PC Eurotrash with a Valve cult personality.
Mint handles everything thrown at it.
And working at any distance is worthless when the actual draw distance is shit and the BVH structure for the RT implementation doesn't even extend 50 meters away from the camera anyways.
Having a fully pathtraced or raytraced game doesn't mean shit if it's only a small 10 m³ cube around the player that receives any of those traced rays and everything else just looks like shit or has to fall back on traditional raster techniques anyways.
Nope.
Runs fine. Pajeet mald thread.
post specs
I have a shitty Ryzen 5 3600 and a 3070 and I get 60FPS on 1440p.
Could be much worse.
what is happening here?
Linux moment.
settings do nothing
lol
lmao
also DLSS on and getting 40 fps on a 4060 and they are praising it ...
This is happening, which is not good for the fucking menu screen
ultra nightmare deactivates your windows apparently
fake screenshot, i got a 5090 and i dip under 60 in native 4k, i didnt even try to turn on DLAA bet then i would be at 40 fps
Frame gen will fix this
yeah but thats clearly something wrong with your pc
lmao, just turn on DLSS and frame gen, how hard is it?
easy will go from 6 to 30 fps and you dont need more
He's definitely using 4x MFG, like all the shitters talking about how the game runs great.
This is going to be the norm for all modern Nvidia owners, talking about how great a game runs while using MFG to up their framerate from 30 FPS to 120 FPS (but with the latency of 30 FPS).
How do i fix the lighting looking grainy like that in some places?
pleb
You don't.
Enjoy your raytraced lighting.
Holy cope-a-roonie!
Isn't it shader compilation? It happened when I was trying OW2 on Linux.
As far as I'm concerned even up to 3x framegen has less input latency than console at the same real or even framegen fps when going like for like
So why is framegen an issue?
I've asked the same but thanks for pointing out that it's just not some GPU artifact from me.
It's an issue when AMD frame gen is your FRAME of reference for frame gen, and thus a poorfag issue. I bet they think Lossless Scaling is representative of how it actually is in motion, LOL. Point and laugh and Ranjesh.
Do developers really think this looks good, or is this the same obsession as with procedural generation back in the day?
Because it doesn't have the latency of the framerate it's generating to.
That is the issue you retard.
It's why MFG isn't going to help anyone because generating from 40 FPS to 160 FPS is still going to feel like 40 FPS (ie. shit).
Framegen is an issue because it always makes your inputs feel off. Even if you lose 0 base fps, playing at 60 -> generating to 120 -> your actions feel off because what you see is 120 but what you feel isn't.
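The arithmetic behind this is one division. A crude model (latency roughly one base frame time; real pipelines add more on top, and framegen itself adds a little extra):

```python
# Back-of-envelope: frame generation multiplies *displayed* fps, but input
# latency still tracks the base (rendered) framerate. One base frame time is
# used as a rough stand-in for input-to-photon delay; this is a simplification.

def base_latency_ms(base_fps):
    return 1000.0 / base_fps

for base_fps, gen_mult in [(120, 1), (60, 2), (40, 4)]:
    displayed = base_fps * gen_mult
    print(f"{displayed} fps shown, feels like ~{base_latency_ms(base_fps):.1f} ms")
```

Note the first two rows both display 120 fps but feel different, and 4x from 40 displays 160 fps while still feeling like 40.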
DF lost all their credibility since they started shilling UE5 garbage.
This game barely looks better than Eternal and requires way higher specs. Also the first game I've seen that asks for 32GB of RAM. If you have any programming background you'll know that's simply surreal.
While I don't like the idea of framegen, if used well at over 80fps, 9 times out of 10 you won't notice it, especially if you play on other platforms
While Wilds is shit, framegen on my 5090 going at 80-90 fps with reflex overdrive at 2x was no issue for me, and I was able to do all my counters without issue.
It's a good cope for us that know 240fps is still a pipe dream on 4k max settings
absolutely fucking not. I put on ultra nightmare + dlss quality and get 110fps in a big area, which is fine, but I have a 9800x3D and 4080. Now, if I turn the settings down, it barely changes. So it may be heavily CPU bound.
good goy
that said, it runs absolutely fine but I think this is more cpu dependent than eternal. I get good fps and dont need framegen, at nightmare/ultra nightmare.
Current hardware isn't actually powerful enough to do proper, decent raytracing without resorting to tons of tricks to make the games not become completely unplayable.
That "graininess" you're seeing is a result of those tricks, likely because the amount of rays being cast per pixel is probably pathetic (likely sub-native) and on top of that the game uses temporal AA to smooth out jaggies, which also increases that graininess thanks to every frame pulling data from previous frames to help smooth out those jaggies.
PLUS on top of all of that, the game is also upscaling from a lower resolution even if you're not using any upscaling due to the nature of temporal reconstruction, which means fewer pixels, which means less detail.
This is basically why almost all modern games look soft as shit at 1080p, and now many of them also look soft even at 1440p.
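The raw pixel math behind that softness is easy to sanity-check. Assuming the commonly reported DLSS axis scales (~0.67x for Quality, 0.5x for Performance; those ratios are an assumption here, not something from this thread):

```python
# Internal render resolution for common upscaler presets (assumed axis scales).

def internal_res(w, h, axis_scale):
    """Pixels actually rendered before the upscaler reconstructs the output."""
    return int(w * axis_scale), int(h * axis_scale)

for name, s in [("Quality", 2 / 3), ("Performance", 0.5)]:
    iw, ih = internal_res(2560, 1440, s)
    share = iw * ih / (2560 * 1440)
    print(f"{name}: renders {iw}x{ih} ({share:.0%} of 1440p's pixels)")
```

So "DLSS Performance at 1440p" is reconstructing from a quarter of the pixels, which is why it reads as 720p-sourced further down the thread.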
amazing
what is this unpacking shit
The motion fluidity is appreciated, and again, you shouldn't be using that at under 80fps, and even 60fps to 120 fps is more responsive than native 120 on console
Not really for many games, and if you play comp shooters you should go native and reduce gpu strain by disabling retarded eye candy effects to gain an edge spotting enemies etc
I still believe doom runs like dogshit and should not be bought
and even 60fps to 120 fps is more responsive than native 120 on console
Maybe with a controller, but on a KB+M, no, it's not.
You can't "out tech-bro" the laws of physics, anon.
and even 60fps to 120 fps is more responsive than native 120 on console
Why do you keep memeing about console latency? It is still going to make your game feel weird because you see 120 but feel 60.
It means it runs good relative to how it looks and doesn't lag spike or hitch ever. You couldn't run Doom 3 at 60+ fps all maxed on a release year GPU. But it wasn't a badly optimized game.
Yeah I imagine when people are gonna be having like 7000 series GPUs and coming back to this game they gonna say wow this runs so well when the new games struggle in performance.
Consoles have bad input lag on a lot of games for some reason.
I play Fall Guys with my nephews (I play on PC, they play on console) and it's almost unplayable for me on their consoles - it has a shit ton of input lag despite being 60fps.
Retard
barely looks better than doom 2016. Id fell for the raytracing meme.
I would assume that you would use framegen in a controller game; most kb&m-first games won't run like fucking garbage
If console latency is that shit, 2x will feel better if you reach the target initial frames
Yeah but rt shadows always look better so you're wrong.
RTX 4060 with DLSS AND NO RTX
43FPS
MASTERFULLY OPTIMIZED
You couldn't run Doom 3 at 60+ fps all maxed on a release year GPU.
Huh?
looks exactly like doom 2016 but ofc running significantly worse to that game
fuck...
That one doesn't even run Eternal very well, just try going through Super Gore Nest and see it drop to 40s no matter the settings.
Mmm, I love ghosting
shill foundry
it looks like a game from 10 years ago if you can't run it on 10 year old gpus, it is most certainly not optimized
barely looks better than eternal, runs significantly worse
ray tracing is such a fucking meme it's not even funny
kek this nigga retarded
you are paying their development saving with your gpu
The top end gpu on that list was the 5090 equivalent though, most people had the x800 Pro or vanilla 6800 which were closer to the 5060 today.
Expect all Microsoft raped titles to be garbage
Yes not even 60fps on a $1000
I sometimes defend DF, but this is so bullshit. I even sometimes defend RT, but again not this time, since I don't think it's doing anything for the actual end user outside of destroying performance in badly optimised games. I'm pretty certain it's not just the RT that's killing performance in DA, but it sure as fuck doesn't help.
Doom Eternal flies at over 300fps even with RT with the same cards for fuck sakes..
Yes. All maxed.
Should have boughted amd. Polaris, vega, and rdna1 aged better than pascal and can play this game.
Poors can play this game, it's just they refuse to accept that if you have a console level gpu you have to play at console settings or drop your resolution.
NO RTX
RT reflections are turned on in your screenshot though?
That framerate is unironically good for 1440p on what should be an rtx 4045
You're actually correct, but we shouldn't let these devs rely on framegen. It's fine if you reach 240fps, it's not fine if you reach like 80, or 60fps, and we're going that way very fast even on PC (consoles are already doing this)
The top end gpu on that list was the 5090 equivalent though
Not really, the 5090 is $2000+.
The 6800 Ultra was $300, which even with inflation is only ~$500 today.
Even the 6800 Ultra Extreme was only like $450, which is sub-$700 today.
You zoomers and third worlders don't remember when PC part prices weren't fucking retarded, so this argument will never actually work because modern PC part prices are insane.
And you can't even say that the modern GPUs use more of the die than the older GPUs, because current RTX 5000 series GPUs are dogshit in that regard too.
This is what pushing PC hardware forward looks like chud.
but Anon Babble told me that it's not well optimized...
Well, digital foundry cheated by using a "brand new" 4060 and a six year old AMD CPU. We prefer GTX cards here.
You can't "out tech-bro" the laws of physics, anon.
They actually could do this if they wanted to. That sort of tech for twisting or distorting the previous frame was developed for VR. It lets you do small changes to the camera position which makes looking around smooth even if the frame generation isn't. It's different than the normal frame generation tech that everyone is using now because it works based off of outside input before the next frame comes in and only distorts the previous frame.
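The idea in that VR technique is simple enough to show in a toy: re-use the last finished frame and shift it to match a late camera reading instead of waiting for a new render. Real reprojection warps per-pixel using depth; this 1-row "framebuffer" with made-up numbers just shows the principle:

```python
# Toy version of VR-style reprojection ("timewarp"): scroll the previous frame
# to approximate a small yaw turn. FOV, width and the wrap-around behavior are
# illustrative assumptions; real implementations warp in 3D with depth data.

FOV_DEG = 90.0
WIDTH = 8  # pixels across the toy framebuffer

def timewarp_row(row, yaw_delta_deg):
    """Horizontally shift the last frame to fake a small camera turn."""
    px_per_deg = len(row) / FOV_DEG
    shift = round(yaw_delta_deg * px_per_deg)  # turning right scrolls image left
    return row[shift:] + row[:shift]

prev_frame = list(range(WIDTH))            # [0..7] stands in for rendered pixels
print(timewarp_row(prev_frame, 22.5))      # a quarter of the FOV -> shift by 2
```

Because it only distorts an existing image from fresh input, it can cut perceived look latency without rendering anything new, which is the part normal frame generation doesn't do.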
Stop referring to yourself in third person.
i'm getting over 100 fps with a 3080, ultra nightmare and dlss performance at 1440p. going to try and force dlss 4 in nvidia app to see how performance is. the game is so dark and blurry that dlss performance looks fine.
This is equivalent to running a current game at 4k with 4x SSAA.
Show me how well the 5090 manages that in this game.
dlss performance
And it doesn't look like ass with it?
and? I was using a 4200Ti with no problems until 2014
thats a good point, i thought he meant technical optimization, i didn't realize they meant financial optimization
Stuff like this really gives me some perspective... Because I don't believe that I could ever enjoyed Doom 3 on release. But I played it a few years ago and I think it was fun game for 5 euros.
dlss performance at 1440p
so actually 720p, yeah no shit it's blurry.
If you add DLSS4 it's gonna be sharp as fuck, but instead you'll get occlusion issues and lower perf. Make sure to use preset J over preset K.
people actually play with dlss unironically
am I missing something? do you guys not have eyes?
They are way better poorfag
you do know if you turn dlss off the game gets even blurrier right?
It's better than TAA.
You could, though. CRT monitors with that res don't even look worse when running lower res like 800x600, and CRT monitors looked clearer at 60fps than 99.9% of modern displays at 120fps
If I'm trying to check what DLSS ver my game is using what letter is DLSS4?
No the fuck they aren't.
youtu.be
it looks fine. i'm not one to pixel peep and in the case of this game, it's so fast paced and has motion blur that it looks fine with dlss performance. so it's free performance with negligible visual impact in my experience. i'm still early in the game though.
Transformer model, there's no way to cope anymore, Nvidia won.
J and K (I think)
Dark Ages uses dlss4 by default though
Low
Medium
High
Ultra
Nightmare
Ultranightmare
It's fine at 4K and the right preset.
Framegen is also fine when starting at 70fps.
Doesn't excuse these scumbag tech companies, but it's fine in the right conditions.
DLSS4 = preset J and preset K. Anything else is DLSS3.
I think preset J looks way better in motion.