DOOM BENCHMARKS ARE IN

I have a 4k monitor and I'm still gonna play at 1080p

That site makes fake benchmarks btw

benchmarks only a few gpu+cpu combos

pulls the rest of the benchmarks out of his ass

12gb already obsolete

Okay I understand 8gb isn't enough anymore, but wtf are these devs doing?

how is it possible that visuals in games are barely even better than 10 years ago, and yet we require 4x the VRAM to make them run?

mixing upscaling and non-upscaling results in a single graph

gas yourself, kike

The fact that people defend NSHITIA's release of an 8GB 5060. It can barely run any games released the same year it was released.

If you bought 12 GB of VRAM, you only have yourself to blame. The absolute MINIMUM TODAY is 16 GB. There were games released just a couple of years ago that can hit 16 GB VRAM in 1440 fucking P back then.

Let me give you another tip: in 2 years' time when the new generation cards will come out from both AMD and Nvidia, you better get that 24+ GB VRAM GPU. You think the PS6 and the new Xbox won't come with at least 20-24 GB VRAM then? AMD has no problem handing out VRAM like it's candy and game developers will have no issue utilizing all of it without restraint.

i will play at 1080p on my 6600 on medium settings as god intended

Those tests without path-tracing patch.

Results will be even worse next month.

image.png - 619x102, 39.41K

3080 gets absolutely bodied by a 6800

How much fucking VRAM does this shit eat up?

So it's yet another game with crazy spec reqs? Lol devs can keep doing this but I won't be buying and doom and monster hunter are some of my favorite game series ever. I'm just not putting up with this meme shit for a handful of games. I can just play other stuff while you lose money. If you want a consumer base who buys without question then you should be courting mainstream sony or nintendo audiences because PC players have options

Wasn’t the whole point of the “PC Master Race” movement that we can have a “PS5 killer” at lower cost AND free games (with piracy)? Now you’re saying we have to make our PCs more powerful than a PS5 in order to play console ports? What happened to the PC Master Race movement? What about “1050 ti, let me guess you need more?” gigachad memes?

12+
Basically all modern games now need 16gb of vram.

image.png - 1954x1092, 2.39M

my 3080 10g is already dead

Unless you pay 2k+ nvidia doesn't make gpus for gaming anymore

VRAM isn't really an issue if people actually bought AMD as AMD cards come with plenty of VRAM in 99% of their GPUs. The problem is that unfortunately PC gamers are a bunch of fucking retarded fanboys who NEED to have Nvidia. So after they've gotten their ass raped by Nvidia, they demand more ass raping by buying the next Nvidia card. Who is to blame in this situation? Is it Nvidia or the average PC gamer?

I bought my AMD 6900 XT back in 2020 when the PS5 and Xbox Series X were just released. 5 years later and my GPU still takes a massive fucking shit all over these consoles. I haven't had any VRAM issues, and I won't for at least the next 2 years, and 7 years is a very good time for an upgrade.

Do i need a new PSU for the 5080?

5070 12GB is already banned

Why did Nvidia even make it if we’re not allowed to use it for gaming? AI and crypto? I thought they needed more VRAM?

After it blows your old one up yes

8GB was borderline obsolete in 2020.
Now, five years later, 12GB is borderline obsolete and 16GB is the bare minimum for reasonable longevity (4+ years).

Nvidia keeps releasing cards with the absolute minimum amount of VRAM to run current games, and people keep buying them.

"This game that just came out requires 7.90 GB of my 8GB of VRAM? I'm sure this won't become an issue very soon!" - 3070 owners in 2020-2021.

DLSS Q + 4x framegen

When am I waking up from this nightmare

Doom is garbage tho. Every single one. Especially this one coming out.

/thread

4k

Get memed on

AMD had a real shot to capture a massive chunk of the market this generation had they not lied about the availability of the 9070s at MSRP. Since the moment they launched you haven't been able to find one at anything less than 300 bucks over MSRP, and they aren't worth that.

russian language

murder any unkranian babies lately?

clearly RTX 2080 Ti 11GB (wtf?) is at the bottom you blind retard.

my card isn't even on the graph

h-haha

as an nvda investor this is unironically a good thing. fuck "gamer"

Fuark

when dlss 5 arrives cards will cost 3k

8GB was borderline obsolete in 2020.

This is 100% facts, and if you knew but the slightest of things about gaming and the PC hardware market, you would come to this conclusion back in 2020. And if you google something like "RTX 3070 8 GB VRAM enough? Reddit", you'll find plenty of fucking retards on Reddit (and other forums) telling people, who genuinely asked that question, "yes it's plenty enough". Doom Eternal was already using more than 8 GB VRAM back in 2020. The RTX 3070 could easily run that maxed out if it just had a bit more VRAM. The RTX 3070 is a decently fast card even today, but it's handicapped by such a small VRAM buffer.

You have to wonder how many fucking idiots bought a scalped RTX 3070 8 GB instead of a 6800 (XT) 16 GB.

Path-tracing isn't even in at launch?

A lot. Anecdotally I remember there being a massive line of retards at my local microcenter buying out the stock of 3070s while I just walked in and grabbed a 6800 xt for 450 dollars while they were paying 600+ for a 3070.

imagine buying a GPU which can't hit 200fps in Doom. lol

Looks poorly optimized, requiring upscaling to hit good framerates. Sad to see even id Tech fall from grace

murder

killing animals isn't murder.

gamegpu

c'mon now

Priced out of pc gaming because jacket man is max profiting and gamers aren't that important to the bottom line.
I'm not buying a new card every generation to try and keep up. Last gen cards should be more than enough.

Nvidia keeps releasing cards with the absolute minimum amount of VRAM to run current games, and people keep buying them.

It's because if NVidia doesn't create an artificial limitation, no one will buy the most advanced models.
I had an RTX3070, and it was totally okay for 3D rendering. Yes, it took about 10 minutes for something that an RTX5090 will do in 1.5 minutes, but guess what, the whole production of the 3D scene takes several hours, sometimes days, so the gain in the final rendering part isn't significant for the average user who isn't rendering a Pixar movie or something. I wouldn't have minded continuing with the RTX3070 for 3D rendering, leaving the PC at night to render more time-consuming things, etc., but I didn't have this option, because the RTX3070's 8GB of VRAM (much less than a GTX 1080ti released several years earlier) simply makes it unviable for the task.
If the ...60 series GPUs all had 16GB of VRAM, it would be the choice of eight out of ten consumers.

Knew a guy on steam that bought a 3070 scalped for 1200 dollars on ebay, 2 years later was unhappy with its performance so he sold it on ebay for 200 dollars then bought a 4070 and now is unhappy with that too and asking me if he should buy a 5070. Never underestimate the stupidity of nvidiots, if you ask why they're so retarded they'll tell you 'muh drivers'.

Yes, and that was true until very recently. This video from 3 years ago was still made in a reality where people didn't really realize the problem that 8GB of VRAM represented, see how the guys are kinda surprised that the RTX3060 performs better than the RTX3070 because of the difference in VRAM:

youtu.be/igPvgb-uGyo?si=KbZhHBJz9FPemQ32

9070XT buy still looking great bros

What possible reason was there for the RTX 5080 to come with just 16 GB VRAM? To force people who work with AI (those are the people who have money compared to gamers who have no money) to buy the 5090. Nvidia would be livid if people who work with AI could just buy a 5080 and get away with it.

If you were lucky enough to buy it at MSRP it would be 1080 ti tier of a bargain, but unfortunately I can't find it anywhere close to its MSRP.

absolute state of pc fats

because most people only need 16gb of vram. and if you knew you'd need more, then you'd buy the 5090

Wow, forced raytracing really is the future!

I remember all the retards who bought 3070s and 3080s for "futureproofed raytracing," now almost half a decade later those GPUs are shit at raytracing in anything besides maybe Cyberpunk (non-pathtracing) and those crappy RTX remakes if you're okay with upscaling from 540p.
No idea how retards convinced themselves that VRAM wouldn't be an issue with fucking raytracing of all things.

Refer to . 16 GB is enough now, and I would say for the foreseeable future, but the new consoles that will come in 2 years will have plenty more than that. The 5080 should've never come out with 16, but 24. I wouldn't be surprised if they release a Ti/Super version with 20-24 GB soon for 2000+ dollars.

I paid $740 because I wanted a white one

but the new consoles that will come in 2 years will have plenty more than that.

New consoles ain't coming earlier than 2030

thought a 4070 super card would be a good middle ground

it's practically dead last in this list

Holy fuck why do I need to spend 6k on a computer to get decent performance?

2026 or 2027 for the new Xbox and fall 2027 for the new Playstation. This gen is almost over already.

Still not as terrible as the prices I see today, there's nothing under 900 dollars. AMD had a real shot to dominate this generation but they can't seem to get their shit together or they're not even trying to compete with nvidia.

2nd hand 4090 is fucking $2500

gay

citl.jpg - 1264x1136, 346.05K

You think AMD doesn't want to make those cards as cheap as possible? Their margins are already really thin, they can't make them that much cheaper. They still need to run a profit or the Radeon division will get shut down. They don't need to overtake Nvidia's market share tomorrow, they need to chip away at it slowly like they did with their CPUs. Ryzen didn't start dominating until several years after it launched back in 2017.

I just checked and it's priced a hundred dollars higher at Micro Center now from when I bought it lmao

The MSRP is 599 dollars bro and they came out and lied that there would be ample stock at MSRP except that from the moment it launched it was already hundreds over MSRP because they never had stock, it's literally the same shit nvidia has been doing since kung flu and this scam is getting tiresome.

no 5070 ti fe

so it's actually impossible to ever get at msrp now

i hate it

If you want to do raytracing at anything above 1080p60 and you paid anything less than $1500 for an Nvidia GPU, you're a fucking retard.

gamegpu

come on now.jpg - 600x375, 25.94K

And if you google something like "RTX 3070 8 GB VRAM enough? Reddit", you'll find plenty of fucking retards on Reddit (and other forums) telling people, who genuinely asked that question, "yes it's plenty enough"

A ton of people like that here too. I've been fighting this battle since the 770 2GB vs. 280X 3GB back in 2013.

Ask how many cores you need for a gaming PC, and you'll get this answer:

"Current games require at least 4 cores. So I suggest a CPU with 6+ cores, just to be sure / longevity."

Ask how much RAM you need for a gaming PC, and you'll get this answer:

"Current games require at least 16 GB of RAM. So I suggest 32GB of RAM, just to be sure / longevity."

Ask how big your PSU should be, and you'll get this answer:

"Under load your PC should consume a total of 450W. So I suggest a 650W PSU, just to be sure / longevity."

These are all reasonable takes. But ask how much VRAM you need, and an army of retards will say:

"Modern games require 10-11GB of VRAM, so 12GB is plenty! Anything more is a waste, completely useless!"
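The headroom logic in the posts above is the same rule applied four times, just never to VRAM. A throwaway Python sketch of it; the 1.4x factor is my own illustrative assumption, not from any benchmark:

```python
# "Measured need today, plus headroom for longevity" - the rule the posts
# above apply to cores, RAM, and PSU wattage. The 1.4x factor is an
# assumption for illustration, not a measured figure.

def recommend(measured_need: float, headroom: float = 1.4) -> float:
    """Return a spec with breathing room above today's measured usage."""
    return measured_need * headroom

print(recommend(4))    # cores: games use ~4 -> ~5.6, round up to 6+
print(recommend(16))   # RAM: games use ~16 GB -> ~22, so 24-32 GB
print(recommend(450))  # PSU: system draws ~450 W -> ~630, so a 650 W unit
print(recommend(11))   # VRAM: games hit ~11 GB -> ~15.4, so 16 GB, not 12
```

By the thread's own rule of thumb, "games use 10-11 GB today, so 12 GB is plenty" is the one place where the headroom step gets skipped.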

the jews found out about the pc gaming market

BEAST

file.png - 852x50, 5.32K

MINIMUM

STOP BUYING BLOATSLOP GAMES
YOU ARE ONLY ENCOURAGING IT

As much as I want AMD to provide actual competition they are somehow even more retarded than nJudea and completely fails to capitalize on their opponent shooting themselves in both feet multiple times week after week to the point it feels like malicious compliance to keep the antitrust/monopoly jannies away

The problem is that unfortunately PC gamers are a bunch of fucking retarded fanboys who NEED to have Nvidia.

nvidia is better for productivity and people don't want to lock themselves out of potential ai shenanigans in the future. those are the 2 main issues, amd is flat out better for gaming but some lads don't want to 100% commit to that.

8GB is enough if you don't quote gay and fake sites like Gamegpu

frame generation

in a first-person shooter to boot

nvidia is better for productivity

Bro, just say AI, no one believes you're doing shit like rendering work or Blender animation or fucking crunching numbers on your Nvidia GPU in fucking 2025, just say AI, we all know that's what you mean these days.

frame gen 2x/4x

Lol lmao

The effect of the "maybe someday I might need it" logic is really amazing

mfw still running a 2080ti

file.png - 703x508, 413.21K

The 2 CEOs are cousins, AMD definitely has the technology to provide fierce competition but they just don't want to. Within a few generations AMD pretty much crushed Intel into irrelevance in the CPU sector but they don't even try in the GPU sector, usually catering their prices to whatever Nvidia puts out there.

The same thing happened with raytracing.
Just look at the 3000 series and all the tards who bought those GPUs for raytracing.

I've said this for the last 3 generations, only buy Nvidia for their flagship card. If money is no object and you don't need to worry about price to performance then Nvidia's top card is the only choice. If you are even remotely concerned about value for money then AMD offers better value at every price point. There are no valid reasons to buy any Nvidia card besides the 5090 right now.

The game doesn't even look that great, how the fuck is a 2080 Ti struggling so much?

my 6750 xt doesn't even get 60 fps?

the benchmarkers are retarded. there is a texture pool setting in the game which you can adjust depending on your card. if you lower it to 1.5gb from the default 2gb you can run 8gb cards just fine without any image quality difference. source: the hardware unboxed video they just released

supposedly because of the larger maps + many more enemies on screen all at once

Ultra Nightmare + Frame generation 2X/4X

2X/4X

no indication of which is used on each listing

So they're just using 4x on compatible cards and 2x on everything else, i.e. comparing apples to oranges?

Bought a 5060ti 16gb for $490, and will play it in 1080p regardless

Why are modern PCs struggling against the PS5? Has there been a major regression in PC technology? Because this game runs at 60 fps on consoles without a problem.

Why waste money optimizing when framegen will fix it for you? For the consoles they only need to optimize for 1 set of hardware so that's much easier.

Consoles use lower settings.
The issue now is that ultra settings in modern games barely look better than medium settings.

You're not playing the same game as the one on PC and you're not actually getting 60 fps, you're getting fake frames.

the game isn't out so there are no drivers for it on pc yet

Framerate has lost all meaning in the last couple of years because you can just run 480p 8x upscaled at 8x frame gen bro look at all these frames i'm getting whoaaaaa

the PS5 is holdin up just fine!

Jedi_Survivor.webm

It doesn't have anything to do with that. The standards that PC and console gamers have are wildly different. Console gamers are happy with 60 fps with dips to as low as 40 with every setting being the equivalent to medium/low. PC gamers don't want 60 fps anymore, they literally want like 120+. They want even more than that, they want 200+. A lot of PC gamers have monitors that are 144 Hz and above. I've seen how games look on the PS5, and it isn't pretty. It really is not a pretty sight.

Reviewers don't benchmark for fake frames because why would they do that, technically every card on this list would be well over 60 fps if you pretended that fake frames counted

60fps at 240p at lowest settings

no drivers for it on pc yet

amd released theirs already yesterday.

Nah they do for sure, it's high vs ultra that is barely noticeable yet there's a massive fps difference

do not post this shitty site ever again, they only test a few configs and extrapolate to the rest

here's one game we won't see on the switch 2 I guess...

Why are you posting Russian shit?

DLSS shit 4x

fuck off moron

Glad I bought my 7900xt for 670 right before the launch instead of "just waiting".

Is my 1080ti finally obsolete

if you understand 8 (as opposed to smaller numbers) why not 12? retarded normgroid

"just waiting" has unfortunately become a meme because these heebs just pull old gen stock off the market in order to keep up their scarcity illusion to price their new generation at massively inflated prices to make share holders happy

fake frames

What is this thirdie logic? If you see it, it exists. How is that “fake”? Would you seriously rather play at 1080p on low settings just so you get “real frames”? As a PC gamer, I’m jealous that console players can have decent frames and resolution (upscaled) without needing to spend $1,000+. We can’t have that. Why do we have to settle for less? I thought PC was the “master race”?

9070XT chads we stay winning

8gb cards are so obsolete for modern gaming it's not even funny anymore

different between ultra and medium is 5% fps

only cards that can push over 100 fps on 1440p medium is 4090 and 5090

even cards like 3070 will be below 60 fps on 1080p medium

youtube.com/watch?v=Zlzatw1E2vQ

Glad I got 9070 XT

Fake frames are shit because you're still playing with the input latency of the original framerate.
They're only useful for high refresh rates, like taking 90 FPS to 180 FPS.
Going from 30 FPS to 60 FPS like consoles do just feels like fucking shit, even on a controller.
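Rough numbers behind this, as a sketch. Interpolation has to hold the next real frame before it can insert a generated one, so the "one extra real frame of delay" term below is an assumption about a typical implementation, not a measured figure:

```python
# Why "fake frames" don't help input lag: the display rate doubles, but
# input is only sampled on real frames, and interpolation buffers roughly
# one extra real frame (assumption; exact cost varies by implementation).

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def framegen(real_fps: float, factor: int = 2):
    shown_fps = real_fps * factor
    smoothness_ms = frame_time_ms(shown_fps)        # motion follows shown rate
    input_lag_ms = frame_time_ms(real_fps) * 2      # response follows real rate + buffering
    return shown_fps, smoothness_ms, input_lag_ms

print(framegen(90))  # 180 fps shown, ~5.6 ms frame time, but ~22 ms input delay
print(framegen(30))  # 60 fps shown, ~16.7 ms frame time, but ~67 ms input delay
```

Which is the 90-to-180 vs 30-to-60 point: the second case looks like 60 fps while responding worse than native 30.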

doesn't understand what a frame rate even is

Bro stick to consoles lol

same bro, it beats the 5080 and that shit costs 1200€ where i live lmao

In his defense, more framerate isn't only beneficial because it's smoother on your eyes. The higher the framerate, the lower your input lag is. This is why nobody recommends turning on things like Frame Generation in competitive games, because that increases input lag. FG tends to make games feel laggy on input even while it looks smoother to your eyes. Not to mention that artifacting, ghosting and just general errors in the image with FG are very real issues as well.

are you having a melty you dumb nvidia cocksucker?

And considering that 16GB Nvidia cards are $800+ and vastly overpriced and understocked, this will essentially mark the end of mainstream PC gaming. If what you say is true, that is. And yet, no console has games besides Nintendo (according to the gamer movement). Dare I say that gaming itself is basically over?

Optimize your textures nigger

All that blacked AI sloppa isn't gonna gen itself!!!

If you keep insisting on this, then your platform will die out. Steam won’t get new customers and companies won’t port their games. This means you’ll have less games to play in the coming years. All because the PC hobby insists on people buying $1,000+ GPUs to stay compliant. Is that really what you want? You want every “dudebro” to go back to Xbox and PlayStation because of your hubris? Do you understand how much that would hurt your ecosystem?

At the bottom

Uhh, it looks pretty close to the top, comrade.

3060 getting over 60fps on ultra nightmare with nvidia trickery

Cool, honestly this game looks good even on low

Setting aside how retarded your entire post is, consoles have the equivalent of about a 3060 for what is now 700 dollars, so you absolutely can get equivalent performance to one for well under that if your standards are 30 FPS frame gen'd to 60.

What are the settings i don't read moon runes.

im sorry that your poor and brown, i just dont want your kind on my platform

Playing games with mouse in 60 fps is ass no matter how pretty your textures are.

100% usage gpu

but somehow only 49 °C

Is your gpu mexican?

I hear it's worse than doomie ternal so I'm not sad about being unable to play it.

And now the PC movement is segregating itself into an ethnostate. For the record, I’m 100% White and 80% Northern Germanic, but that shouldn’t be a topic of discussion. OUR people will suffer if you insist on siphoning hundreds of dollars from them needlessly. This is how people become poor in the first place.

There could be games that incorporate text based LLMs in the future, that's moreso what you'd be locked out of.

Yeah you faggots only care about muh performance
Enjoy getting ass gaped by njewdia

bro what are you talking about, look at steams hardware chart, no one is buying 1000 dollar gpus, there's a nigh infinite catalog of games to play on pc that don't require these cards and anyone can get console tier performance on any discrete gpu made in the last decade

They play differently anyway. If anything this looks like a toned down Eternal that people who can't into movement and weapon swapping can enjoy, they just look at green parries and go whoa.

10 years ago people weren't playing in 4k

They were definitely trying

muh performance

Well, yeah, performance is very important. The higher the framerate, the smoother the experience is. You literally see more at higher framerates. It also reduces input lag, so it makes the game more responsive. The settings can also be put on high, making the game look prettier as well. So, overall, that translates to a much better experience.

It uses stupid amounts of VRAM in 1080p too. Resolution doesn't increase VRAM consumption that much.

2015 is not as ancient as you think tech wise.

Higher framerates add a lot to the immersion, it isn't just some esports meme thing. Ever played VR at low fps? You feel such a large disconnect from the world because of it.

DUDE GRAPHICS LMAO

Your kind ruined gaming for everyone

What does texture pool size do?

Its over for me

As a PC gamer, I’m jealous that console players can have decent frames and resolution (upscaled) without needing to spend $1,000+.

DLSS is also upscaling so PC players can trick themselves into believing their PC can run the game
FrameGen is essentially AI generated frames being added between actual rendered frames to trick retards into thinking the game is running smoothly but it negates all the benefits of high framerates and keeps all the problems of low framerates.
DLSS and FrameGen are crutch tools to make poorly optimized games appear to be running as expected on overpriced cards. They should never, ever be used in benchmarks.
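For the upscaling half of that, here's what the internal render resolution works out to. The per-axis scale factors below are the commonly cited DLSS preset values; individual games may override them, so treat the exact figures as assumptions:

```python
# "Upscaling" in resolution terms: the GPU renders at a lower internal
# resolution and the upscaler reconstructs the output. Scale factors are
# the commonly cited DLSS per-axis presets (games may override them).

SCALE = {
    "Quality": 1 / 1.5,        # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Quality"))      # (2560, 1440): "4K" Quality is really 1440p
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): "4K" Performance is really 1080p
```

So a "4K DLSS Performance" benchmark bar is a 1080p render with reconstruction on top.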

blaming people for having standards instead of the corporations gouging people

if a game is made in 2025 you'd expect it to look better than something made in 2005, that's kind of the entire point of innovation

dont care, my screen supports 720p at max t.3050 8gb oc chad

Basically a 2nd texture slider.
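For anyone wondering what that slider is actually budgeting, a back-of-envelope sketch. The texel sizes and compression rate are illustrative assumptions, not id Tech's actual numbers:

```python
# Why textures dominate VRAM, and what a "texture pool" caps: an
# uncompressed RGBA8 texture costs width*height*4 bytes, a full mip chain
# adds about 1/3 on top, and block compression (e.g. BC7 at ~1 byte/texel)
# cuts that by 4x. The pool size decides how many stay resident at once.

def texture_bytes(w: int, h: int, bytes_per_texel: float = 4.0, mips: bool = True) -> float:
    base = w * h * bytes_per_texel
    return base * 4 / 3 if mips else base

one_4k_rgba = texture_bytes(4096, 4096)       # ~89 MB uncompressed with mips
one_4k_bc7 = texture_bytes(4096, 4096, 1.0)   # ~22 MB compressed with mips
print(one_4k_rgba / 1e6, one_4k_bc7 / 1e6)

# A 2 GiB pool holds 96 such compressed 4K textures; dropping it to 1.5 GiB
# just keeps fewer resident at once and streams the rest in as needed.
print((2 * 1024**3) / one_4k_bc7)  # 96.0
```

Which is why lowering the pool on an 8 GB card costs streaming churn rather than visible texture quality, as long as the working set still fits.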

This is really bad, right? Like, basically unplayable? I heard these are “fake frames” and the real FPS is half. Also, you need at least 144 FPS otherwise you have insurmountable mouse lag. At least, that’s what the Internet (this thread) told me. Is that true?

bigly true. It's what a lot of people are saying.

PC gamers don't want 60 fps anymore, they literally want like 120+.

Lower the resolution, reduce graphics quality
Oh wait, they're already doing that by turning on DLSS and FrameGen kek

whom'st've'wldve?

Actually, the funny thing is that PC gamers still have options to turn that off, but console gamers are playing with an old dogshit version of FSR they can't turn off. lmao

So you'd be okay paying an $80 MSRP for an RPG maker game that looks like fucking garbage and cost them nothing to produce? I would wager a guess that you're a tendie.

where's the deck in this approximatively?

It has poor scaling. You're talking like 1080p low 60 fps on a 3060 ti. And these tests are done on the earlier areas so the fps is gonna go to the shitter in late fights.

dfghj.png - 1343x709, 286.94K

it isn't on the list, the deck is the equivalent of a GT 1030

It's not like I like him but he is the most blatant shill around boomer shooters and Doom.

2060 bros... it's over...

My 4080 should be fine

Probably around the 5090. The switch 2 is around GT 210 levels in comparison

Ditto
I don't even notice the difference above 1080. Maybe 1440 but definitely not 4K.

I have no idea who he is. Sorry! Haha. I just don't care to watch people I don't like haha sorry! It's just you see... haahahaha its just that I don't care about people I don't know about! I'm really sorry!

The deck has been declared non-compliant. In fact, it’s so non-compliant that we in the PC Movement have completely disavowed it.

less than 144 FPS

requires lower graphics settings

no mouse aim by default (have to plug one in)

no RGB

doesn’t have 16 GB of VRAM

affordable (this is a bad thing)

Steam Deck is only relevant because it lets you contribute money to the PC Master Race ecosystem (Lord GabeN), but make no mistake: you are NOT a member of the PC Master Race. Just a tributary.

Prices for cards are never coming back down again. This is the new norm.

Works on my PS5

so is this shit gonna have denuvo or can I pirate it?

Probably around the 5090

lol

The switch 2 is around GT 210 levels in comparison

It's actually 1050ish (750 handheld), which is still dogshit, so it's not going to be getting this game obviously

NOOOOOOO FAKE FRAMES DLSS NOOOO

Any CPU benchmarks?

I wouldn't be surprised if we get a port of this game to Switch 2. It will be dogshit, but it might happen. If they got both DOOM games on the OG switch, they'll probably try here too

framegen and non-framegen in the same benchmark

different levels of framegen too

disingenuous as fuck if you ask me

The fuck are you blabbering on about with this shartpost, I love my steam deck even if I have a gaming PC as well, it's my comfy mode device when I want to just lay in bed and play emulated games

Doom games used to be the gold standard of optimization. What happened?

Extreme bloatslop

1050ish(750 handheld)

This is complete BS btw. The guy who published that "simulated" benchmark is a fucking retard who is just pulling shit out of his ass

Everyone knows the hardware inside the switch 2 tendiebro, your tablet is not stronger than a series s

I'm pretty sure Anon Babble was crying about 2016 Doom's size

tfw own a 4070

good thing i didnt even like eternal

Yes, we know the hardware. But the benchmark people are referencing did not test that hardware. It tested different hardware at different clock speeds on a different operating system, then drew conclusions about different hardware based on the results. It's nonsense.

is called Competence Crisis anon, they hire pajeets for low costs and the developers can't do anything right so they ask for more power despite the games looking worse or 20% better than the game from 8-10 years ago
Witcher 3 worked on a 960 at 1080p in 2015 i know this because i fucking played it, now modern games need 8 GB of vram to work at 1080p

i am sure it does, less resolution than a smartphone

Same. I watch the content not the person.

thought it was DD1 at the top for a moment kek
what a mess

It doesn't even look any better than Eternal...what the fuck?

How's the NPC density in CP2077?

SAAR DO THE NEEDFUL AND PLAY DARK AGES AT 50 FPS

need to sell those GPUs

i think europoors already do that by default

snagged a 4070 ti super during cyber monday

Prepped for modern 1440p at a budget price.

SONYGODS ALWAYS WIN BABY

honestly it's funny how CP got shat on for poor npc density, but it still mogs every big game released ever since

Nope. They take this day one stuff away to announce it later on Twitter for ''engagement'' and drive more sales.

1080 chads report in!

i hate how well W3 ran and how visually captivating it was, then CDPR started cranking out literal slop after. It's like watching AC Unity degrade into Origins all over again.

I blame UE5
Back in 2021 8GB was fine until it suddenly isn't around a year later

And what of good 6700xt?

latest (15).jpg - 300x450, 40.18K

Benchmarking for 4k

This has to be the most retarded shit in the world considering how tiny a minority it is.

If you don't think the second gaming crash is imminent then you're straight up delusional. It's all a fucking mess on every front.

Everyone has had 4k displays for ages anon, we just can't fucking use them at native 4k in games because modern devs are retards

Resolution doesn't increase VRAM consumption that much

It's literally one of the most resource intensive settings in every game

it's got frame gen turned on to bloat vram usage.

like always drop textures from ultra giganigga nightmare to ultra or high and turn off frame gen, problem solved.

Looks like shit on all three.

That's in 4k I'm pretty sure anon

Why the fuck are all these retards doing benchmarks on Ultra Supra Epic Nightmare settings for fuck sake?
How about medium/high settings benchmarks?

Everyone can Google. What's your endgame?

FM-f.png - 2149x2145, 144.18K

Denuvo

DLSS

FSR

Ultra Nightmare

That explains everything.
Also probably the OS.

6700xt can't do fucking 60fps in 1440p native

It's fucking over for me

either a 4070 ti super or a rx 7900 xt to barely get playable 60 fps with no dogshit 1% lows under 60

remember when doom was known for being absurdly well optimized? goytracing was a mistake and im tired of pretending it isn't

it doesn't help that all the competent programmers get hired by FAANG for way more money either. game dev used to be small passionate teams of nerds that all liked each other and were friends, now it's corporatized and if you're going to work for a soulless corporation that hires people that you'll never get along with then you may as well get paid more to do it.

The memers saying 8gb isn't enough once again left in shambles when reality steps in.

I remember my 386 not being able to get a mere 35fps in doom so no, I don't remember when any of the games were optimised.

It runs on the same engine as Indiana Jones and has similar spec requirements so I suspect it'll run the same.

Indy was surprisingly CPU intensive as well so I hope you guys have upgraded from your 2009 i5s.

SONY WON

I will just play at 1080p anyway on 4K monitor

Just use DLSS 4 you niglets.

thinking of upgrading my 6700xt to 7900xt (i have a 7600x cpu)
fuck gaytracing

All these seething gtx 1650 owners ITT.

DLSS/FSR

Framegen

FUCK
OFFFFFFFFFFF

This reeks of currycoding.

Because devs started designing games for PS5 rather than PS4.
Back in 2013-2014 you could game with 500MB of VRAM and 1GB was plenty. Then devs started making games for PS4, and by 2016 you needed at least 2-3GB minimum.

Back in 2016, the 1060 card came with 6GB of VRAM. Now in 2025... the 5060 has 8GB. So blame Nvidia.

I blame the engine Doom isn't running on

the fuck

another thread where third worlders complain that their 6 year old low end gpu can't play a game at max settings

Blame the dogshit 5080 for that. RTX 4090 has not been replaced by the 5000 series. Unlike the 3090 and 2080 Ti before that.

5070bros... are we really as bad as 6 year old GPUs?

what the actual fuck

A 5070 can play the game at 1440p 60 fps max settings

PS4 had 8GB unified RAM, with 5GB available for games, a 6GB card could run PS4 games no problem

PS5 has 16GB unified RAM, with 12.5GB available for games, a 16GB card could run PS5 games no problem

How much RAM do you think the PS6 will have?
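That unified-pool arithmetic as a quick sketch; the OS-reservation figures are the ones quoted above, and the PS6 line is pure extrapolation, not a spec:

```python
# Consoles share one unified pool between OS and game; the game's slice is
# roughly what a PC GPU's VRAM has to cover (a loose equivalence, since PCs
# also have system RAM). Figures are the ones quoted in the post, not
# official numbers.

consoles = {
    "PS4": {"unified_gb": 8, "game_gb": 5.0},
    "PS5": {"unified_gb": 16, "game_gb": 12.5},
}

for name, c in consoles.items():
    reserved = c["unified_gb"] - c["game_gb"]
    print(f"{name}: {c['game_gb']} GB for games ({reserved} GB reserved)")

# Pure extrapolation: apply the PS5's game/total ratio to a hypothetical
# 32 GB PS6.
ratio = consoles["PS5"]["game_gb"] / consoles["PS5"]["unified_gb"]  # 0.78125
print(f"PS6 guess: {32 * ratio:.1f} GB for games")  # 25.0
```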

ultra nightmare

who. the. hell. cares.
game looks great at medium-high presets

Why can I get almost 300 fps in Eternal while recording a 120 fps video but only 60 in The Dark Ages? That's 5x the frames.

Untitled.png - 2158x652, 1.49M

Id has fallen, billions must perish

So it's actually as good as a 5 year old card.

file.png - 852x68, 30.85K

Eternal uses baked lighting in relatively small, linear maps
Dark Ages uses raytracing and has massive open worlds

They're not really comparable

Didn't read can I get 1440p 144fps with a 4080 super with all the meme scaling and frame gen stuff?

Assuming it's a 256-bit bus, 24GB (3GB GDDR7 memory chips * 8).
But maybe 4GB modules will be a thing in a few years, and cheap enough to be adopted by consoles. In that case, 32GB (4GB * 8).
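The capacity math above works because each GDDR chip occupies its own 32-bit slice of the memory bus, so the chip count is just bus width divided by 32. A quick sketch (assuming a single chip per channel, i.e. no clamshell configuration):

```python
def vram_capacity_gb(bus_width_bits: int, chip_capacity_gb: int) -> int:
    """Total VRAM: one GDDR chip per 32-bit slice of the memory bus."""
    num_chips = bus_width_bits // 32
    return num_chips * chip_capacity_gb

print(vram_capacity_gb(256, 3))  # 256-bit bus, 3 GB GDDR7 chips -> 24 GB
print(vram_capacity_gb(256, 4))  # hypothetical 4 GB modules -> 32 GB
```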

Yes.

Do you need to beat Zoom Eternal to understand Zoom the Dork Ages?
And does the new game still do the stupid thing of only X Gun can kill X monster?

Probably someone who's used to the days when VRAM sizes were a lot smaller. A "small" increase nowadays used to be a big deal until at least the GTX 1080. Remember the 900 series 3.5 GB meme?

And does the new game still do the stupid thing of only X Gun can kill X monster?

Hows that like Eternal? youtube.com/watch?v=Ia7a1dNu4MM

And in the new game you will be parrying the green orbs back to enemies so if anything that's a lot more like do X and have no choice.

It's a prequel, so technically, if you wanted the story pieces from before and right after this game, you'd play Doom 64 and Doom 2016. But they seem to want to do a lot of story cutscenes, so you'll understand what's going on anyway.

This entire rotten gaming and tech industry needs to crash with no survivors. We went from seeing idTech as the pinnacle of optimization to this.

I really hope that china invades taiwan for this to happen now.

The 5060 Ti is too slow but the 5080 is too expensive? What am I supposed to do???

They're getting paid by Nvidia to unoptimize their games just so they can force people into buying paltry upgrades. That's effectively the only explanation that makes sense at this point; this goes beyond mere incompetence.

5070ti and 9070xt exist

competence crisis

The actual problem is normgroids. That's why every nice thing is ruined when it becomes popular.

And Microsoft probably asked them to raise the price so everyone will just play it on Gamepass instead because it's too expensive for what it is.

so much for

le based optimisation chads at ID software will save us!

and

in-house engines are better than Unreal slop!


a 5090 still gets more frames than 5080+dlss

damn son

Welcome to the wonders of raytracing.
Enjoy!

600W gpu provides more fps

Shocking.

Turns out the loudmouths don't actually have STEM degrees and don't know shit about tech, huh?

A $400 GPU in 2008 had 512 MB of VRAM.
8 years later, in 2016, a $400 GPU had 8 GB of VRAM.
8 years after that, a $400 GPU still has 8 GB of VRAM.
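The stagnation in that post is easy to quantify: at the ~$400 tier, VRAM grew 16x over the first eight-year span and not at all over the second (figures taken from the post above; 512 MB = 0.5 GB):

```python
# VRAM of a ~$400 GPU, per the post above (in GB)
vram_by_year = {2008: 0.5, 2016: 8.0, 2024: 8.0}

growth_2008_2016 = vram_by_year[2016] / vram_by_year[2008]  # 16x
growth_2016_2024 = vram_by_year[2024] / vram_by_year[2016]  # 1x (flat)
print(growth_2008_2016, growth_2016_2024)
```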

4k

frame generation

so this means nothing?

Taiwan needs to be invaded with the two there asap

I mean, with dlsslop boost it should be at least the same thing

Stop being poor bro

don't you just love crony capitalism (the only capitalism there is)?

And card manufacturers don't care, because gamers aren't their market anymore. Everyone needs cards that can render video for AI shit; no one needs to have tons of shit on the screen at once. PC cards are essentially becoming tools for making movies and deepfakes, while companies like Sony and Nintendo get custom cards made for their consoles, which Nvidia only does for chump change on the side.

DLSS 4 "slop" looks better than native with any AA you can get, and it's upscaling from 1440p to 4K, so no, it won't have higher FPS than the 5090 at 100% native in this scenario with the DLSS 4 Quality setting.
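For scale: DLSS Quality at a 4K output renders internally at 1440p, which is only 4/9 of the pixels, so the GPU shades 2.25x fewer pixels per frame. A quick check (standard 3840x2160 and 2560x1440 resolutions assumed):

```python
def megapixels(width: int, height: int) -> float:
    """Pixel count of a resolution, in millions of pixels."""
    return width * height / 1e6

native_4k = megapixels(3840, 2160)       # ~8.29 MP shaded at native 4K
internal_1440p = megapixels(2560, 1440)  # ~3.69 MP shaded at DLSS Quality
print(native_4k / internal_1440p)        # 2.25x fewer pixels to shade
```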

Graphics reached "realistic enough" with battlefield 1. There's no reason to """"advance""""" further than that other than selling cards.

P-Zombie NPC DRONE

12 GB was obsolete in 2020

There's no reason to """"advance"""""

Good, because they don't. Why do they charge more for the same thing though

Games are hyper-optimized for consoles, not so much for PC. A PS4 can run RDR2 infinitely better than a GTX 1080 despite the latter being much stronger than a PS4.

Wait for the super versions next year

frame gen

this graph doesn't even count

A PS4 can run RDR2 infinitely better than a GTX 1080

Yea, no.
A 1080 can run RDR2 on high/ultra at 1080p60 just fine, and with some settings tweaking it can even pull off 1440p60.
The PS4 struggles to run the game at 1080p30 with some settings lower than PC's lowest.
The PS4 comparison in RDR2 is more like something along the lines of an RX 460, since the older GCN parts that were actually comparable to the PS4 had less VRAM (3 GB on the PS4 vs. 2 GB on the HD 7870 equivalent).

A PS4 can run RDR2 infinitely better than a GTX 1080

Do consolefags really believe this?

Lol if Gmanshills is saying that then its over.

So this is the power of Unreal Engine 5... Whoa

You don't.

what are you smoking, the ps4 barely does 1080/30 and is significantly hampered by the 2008 tier cpu performance.
a gtx 970 smokes it.

Do consolefags really believe this?

They actually do believe that. They actually believe their games run at a stable framerate: when a game advertises 30/60 fps, they think it holds that framerate constantly. A lot of the more demanding games easily dip from 60 to 40 and from 30 to 20 on consoles. It's a mess. Do not fall for console gaming. Not only do games run like fucking crap on consoles, they look like crap as well. The difference is that console gamers literally don't know any better. Go and ask them how Elden Ring runs on the PS5: it's running at 25 fps most of the time, it's a fucking movie.

Just got mine for $699.99. The additional fees were due to not being in the continental US and Newegg's greed. Newegg apparently stocks the cards one at a time. That is, the single card gets sold, they list the next one. I was able to get it after refreshing until it let me add to cart and then until it let me continue in the check-out form. The single ASUS Prime MSRP model might be gone, though. Wanted a 5070 ti, but those are botted too hard. It's not possible to check-out in time, and MSRP models also appear to be gone. Not to mention Amazon is taking advantage of the situation by locking the highest-demand cards behind prime. The app I used for alerts was InStock, but I also had their stream opened on my PC since they don't have auto-checkout for Newegg, and even then it doesn't work since it delays for several seconds before attempting to check out. The only way I justified going for that $699.99 Asrock model was because it had RGB, but it might not even be ARGB.

Anon Babble is having another meltdown over these benchmark results.

The OP was for 4K and max settings. You're not running it at 4K on a Ryzen 4800S and a gutted, mobile-class RX 6800. Wait for Digital Foundry to put out their Snoy shill video for a decent estimate of the actual resolution and PC-equivalent settings it's running at. And you don't seem tech literate if you're calling that struggling; it'll run at 60+ on a GPU from 2020 on lower settings.

It can handle the game on low or medium, which won't be a big deal because the game barely looks any different for any of the settings. The most you lose out on is reflection quality.