Explain to me why 12gb VRAM is a bad buy if you're not doing 4k gaming.
Explain to me why 12gb VRAM is a bad buy if you're not doing 4k gaming
Why wouldn't you game in 4k? It's not 2010, stop holding games back by playing in 1080p. ffs they're releasing 8k monitors these days
not a single game worth playing uses more than 2gb of vram at 4k
Games are getting more and more unoptimized and for a little bit more (or less if you're willing to go AMD) you can get a 16gb card and future proof yourself better. Some AAA crap like MH:Wilds already exceeds 12gb on anything but the lowest settings
I play at 4k with 10gb vram so 12gb will be fine at 4k. At 1440p 12gb is even better. Nothing wrong with it.
If you want to fuck around with VR in modern titles you will suddenly discover you are a high framerate 4K+ gamer with a desperate need for more vram
Because you want it to last more than a few years?
But 12 is certainly better than 8
I was running out of VRAM at 8gb with my 3070, this thing is going to fall behind too because it's going to require more VRAM intensive features as the years go on
Jeets don't optimize, and Nshitia is forcing RT on new games like Doom The Dark Ages
pro tip: if you inject a 2x2 png to replace all the textures you don't need more than 512mb of vram
It's not a bad buy now but you are not buying your GPU for now, you're buying your GPU for the next few years. Nowadays you need 10GB VRAM for 1080p, and once next gen consoles come out with a larger VRAM buffer then videogame VRAM requirements will drastically rise, leaving 8-12GB VRAM cards in the dust quickly regardless of resolution.
normalfags listen to youtubers and watch shorts about graphics cards and they are all scared shitless and are looking for a used 7900XT because everything else apparently is not worth having. These people will not buy shit, and then one day the 20 GB VRAM card they thought was a good deal will be 3-4 generations old and run games like trash
Nowadays you need 10GB VRAM for 1080p
Up until a month ago I was gaming at 1080 and never even once did I max out an 8GB 5500XT that I had. The 5070 is breezing through everything I throw at it at 1440p
Same I have a 10gb card and it doesn't even get maxed out in native 4k in the games I've played on it like hitman 3 recently or dlss 4k in indiana jones. 10gb for 1080p is a clown statement to make.
5080 super doesn't even do 4k it's marginally better than my 3080 ti
No. Do basic research like a human being that consumes oxygen is supposed to, instead of expecting total strangers to encourage your tech illiteracy by thinking for you.
I have 4070, 59hz monitor, always play at 1080p. To be clear, I can easily afford to get a better monitor and GPU. But I'm not going to.
come with me to the beach, i've got a great deal on sand to sell you.
There are games that spill over 12 gb even at 1440p, never mind the next generation of games.
Ignorance
No good games require more than a phone and the YouTube app.
Give it 2 years and we'll suddenly need 16gb for a game to boot.
Pc gaming is in a fucked up place at the moment, I wouldn't buy anything.
We had a few gens in a row that were good on launch and sucked 2 years in.
Same I have a 10gb card and it doesn't even get maxed out in native 4k in the games I've played on it
"in the games I've played on it" is the telling part of your statement. Go run any graphically intensive recent release with its intended settings, it'll need 11-15 GB VRAM at 4k depending on the game.
I wouldn't worry about 12GB until a new console generation is coming out
Because all the tests you see online are done with quick methodology - load the game, test for 20 sec, write down results.
When you actually PLAY the game, vram consumption can and WILL go up dramatically.
For example pic related - i played the game for 2 hours yesterday. It started at 8 gigs lmao when I launched it.
Some games like FF16 are fucking unplayable with 8 gigs after 5 minutes of gameplay. But in TESTS you will see something like a 3070 getting 100+ fps, though in reality after 5 minutes of gameplay you will get vram spillover into ram and the game starts stuttering every 3 seconds until you relaunch it.
Don't buy 12 gigs. It's a scam.
Here is 1440p+path-tracing+dlaa+mfgx4+ultra-settings.
Frametime graph and fps are irrelevant here. The game was running at locked 15 fps for test purposes.
Notice how much vram it actually needs.
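The quick-benchmark problem described above is easy to verify yourself: log VRAM over a long play session instead of the first 20 seconds. A minimal sketch (assuming an NVIDIA card and that `nvidia-smi` is on PATH; its `--query-gpu` CSV flags are standard, but sample counts and intervals here are arbitrary choices):

```python
import subprocess
import time

QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_usage(csv_line: str) -> tuple[int, int]:
    """Parse one 'used, total' CSV line (values in MiB) into ints."""
    used, total = (int(v.strip()) for v in csv_line.split(","))
    return used, total

def log_vram(interval_s: float = 5.0, samples: int = 720) -> None:
    """Poll VRAM every interval_s seconds (defaults cover about an
    hour of play) and report the running peak, which is the number
    a 20-second benchmark never sees."""
    peak = 0
    for _ in range(samples):
        out = subprocess.check_output(QUERY, text=True).strip()
        used, total = parse_usage(out)
        peak = max(peak, used)
        print(f"{used}/{total} MiB (peak {peak} MiB)")
        time.sleep(interval_s)

if __name__ == "__main__":
    log_vram()
```

Run it in the background while playing; if the peak climbs steadily over an hour like the screenshot shows, that's the spillover the short reviews miss.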
12GB is retarded planned obsolescence crap. If you just want to play old games you don't need to buy a shitty 12GB card to do that. You can use a 6GB 1660 super or something. If you're buying a new card you need 16GB minimum.
go play (game you dont play that's poorly optimized)
then you'll see your graphics card sucks!
we have more than enough hardware to run these games, not my problem devs like you are retarded these days
HFW is actually a fairly well optimized game. "Optimized" does not mean that a game must account for the last 11 years of graphics cards or else it's unoptimized.
game you dont play
One of the main reasons you don't play games that need above 10GB VRAM is because you know they won't run well on your system, so you don't even bother and you justify it to your brain that they're not worth trying anyway. I've been there and I've done that too, nothing wrong with it, but at least don't be delusional about it.
not the guy you're replying to but I don't know if my 5070 can run forbidden west, I actually never thought of it because it would never cross my mind to play it. Also I either set RT to low or turn it off completely because it's a shit technology that I don't like, but that's just me
5 seconds on YouTube proves you wrong
Anyone noticed that amd fags are the only people peddling this vram shit and don't even play games? They try to tell you their random screenshots are more valid than people's actual experiences.
HFW doesn't have raytracing. But yeah I shouldn't have specifically referred to HFW as an example of a game you'd wanna play, it's slop, but it is an example of how newer games don't really play nice with 8-10GB VRAM buffers unless you're willing to accept some visual/stuttering compromises.
need advice, i'm in need of a new GPU because my old one died, should i bite the bullet on a 5070 ti and get fucked out of $900?
i've been watching the market nonstop for a few days now and i'm thinking a good GPU deal won't exist unless you get lucky with something local
skip 5060
7600xt or 5060ti/16
skip 5070
5070ti or 9070/xt
That's about it.
Get a 9070XT or 5070ti, depending on which one is closer to its intended price.
You are right on that, there is no better deal coming. Arguably 9070 XT could get better.
AMD fans are the biggest enemies of PC gaming. God forbid you're having a great experience with your 4060, or whatever, there'll always be an AMD fan around ready to tell you how you're *ackchually* having a bad experience because of some confirmation bias screenshot he had in his GPU folder, which is bigger than his game folder btw.
Because as an RTX4070 owner there are games (Cyberpunk for example) where if it had 16gb I'd be able to run them at 4K
This, NVidia are literally throttling their own cards capabilities
It's fucking ridiculous
Lies, MH:Wilds runs fine for me 4k with 12gb
Because there is a trend of games demanding more vram even for 1080p gaming, and that makes everything under 16gb less future proof. If you get a 12gb card for dirt cheap and plan on upgrading again shortly, it's probably fine. But these cards aren't dirt cheap. I already ran out of the 11gb on my 1080ti from almost 10 years ago. So I definitely knew I needed more.
What are the chances 9060XT will get scalped to death?
I'm waiting for the 9020 that runs at half the speed of the 750Ti, because AMD won.
When it comes to the budget cards there is almost no demand for amd. Everyone always buys the nvidia x60. So there's a very small chance it gets scalped.
IDC I'm using a GTX1080 on a 4K screen
Who said you can't?
Same I have a 10gb card and it doesn't even get maxed out in native 4k in the games I've played
something tells me the games you played are all
here's space marine 2, without any raytracing or framegen, using all my 16gb card's vram at 4k. and if i had a 24gb card, it'd allocate and use over 16gb
But the 5060 is apparently a disaster.
It's not though. It's the current value champ. There isn't a card on the market as of today that provides better dollar per frame value.
nothing wrong with it anon
i had fun on my 4060 ti 8gb it played all the latest games
Are all what? From oblivion to indiana jones to avowed to hitman and more I don't run into any issues at 4k. I can even get a high refresh rate experience in some games if I use dlss on my 4k 144hz TV.
Is 5070 Ti the best card for 1440p high framerate?
Modern gpus make no sense, vram modules are incredibly cheap. So it's just done to keep you upgrading every 2 years instead.
Stop noooticing
4k is a meme, i bought an oled 4k 32 inch monitor and i run it at 2k because there is literally no noticeable difference.
i have 8gb
i don't give a shit
i won't play your jew game
i won't buy a new gpu
i blame devs for not using basic dx12 features like sampler feedback streaming. vram becomes irrelevant with sampler feedback. look at doom the dark ages: it doesn't matter how much vram you have, because the highest quality texture will be in memory when it's needed anyway, since sampler feedback allows quick loading/unloading of textures if you have an ssd (which is a basic requirement for next gen games anyway).
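The streaming idea is simple to sketch: sampler feedback tells the engine which textures were actually sampled, so the engine only keeps recently-used ones resident in a fixed VRAM budget and evicts the rest back to the SSD. A toy model of just the residency policy (not real D3D12 sampler feedback; the class, names, and MB sizes here are illustrative):

```python
from collections import OrderedDict

class TextureStreamer:
    """Toy LRU residency cache: textures stream into a fixed VRAM
    budget and the least recently sampled one gets evicted first."""

    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident: "OrderedDict[str, int]" = OrderedDict()  # name -> size in MB

    def used_mb(self) -> int:
        return sum(self.resident.values())

    def sample(self, name: str, size_mb: int) -> list[str]:
        """Mark a texture as sampled this frame (the signal sampler
        feedback provides); stream it in if it isn't resident and
        return whatever had to be evicted to make room."""
        evicted: list[str] = []
        if name in self.resident:
            self.resident.move_to_end(name)  # refresh LRU position
            return evicted
        while self.resident and self.used_mb() + size_mb > self.budget_mb:
            victim, _ = self.resident.popitem(last=False)  # drop oldest
            evicted.append(victim)
        self.resident[name] = size_mb
        return evicted
```

The point of the sketch: a small fixed budget can serve a texture set much larger than itself, as long as the SSD is fast enough to re-stream evicted textures before they are sampled again. That is why the argument above says total VRAM matters less when streaming is done properly.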
8K is a meme, wait for 32K
Why do you need a new card if you are still an 1080 CHAD??????
From oblivion to indiana jones to avowed to hitman and more I don't run into any issues at 4k
Oblivion at 4k input needs over 10gb VRAM unless you downgrade textures.
Indiana Jones at 4k input needs from 13-17gb VRAM depending on whether you turn off RT or you max out RT.
Avowed is very light on VRAM, that's true. And I don't know about Hitman, never played it.
Your system deals with a VRAM problem differently depending on the game. Some games just become an unplayable slideshow if you're out of VRAM, like Doom Eternal. Some games try to swap data into your system RAM which gives the illusion that VRAM usage isn't high, but your 1% low FPS is much worse and the game's framerate is a lot more inconsistent, like Jedi Fallen Order. And some games just stop loading textures properly once your VRAM is full, your gameplay will remain smooth but your game will visually look awful because half the textures are missing, like Space Marine 2.
4k is a meme
nah, it looks pretty good
not "giving Njewdia chinks 2K$" good tho
same for GayTracing
There is no good deal coming. It will only get worse.
How much do they actually cost per GB? I need my consumer grade AI card NOW.
It's probably the best value midlevel card right now. Can get it for the 600 range and it will do whatever you need at 1440p or lower for a few years.
I have a 3080 and am waiting until I can get a 5070 ti or better for around 800 or so, and who knows how long that will take.
I game at 1440p with 16gb
inflated price because of ai demand, just wait out the ai-bubble
Indiana Jones at 4k input needs from 13-17gb VRAM depending on whether you turn off RT or you max out RT.
this is a screenshot of indiana jones maxed out with medium texture pool (recommended by DF and the game). hits 9gb in complex scenes like this with lots of assets and textures, but that's pretty much the extent of it. the highest i've seen is 9.3gb. never even comes close to 10gb. usually hovers around 9gb.
12gb is more than fine with 1080p, once it becomes a problem you can extend its lifespan for a year or two with DLSS (though i would preferably not use DLSS at 1080p unless your screen is tiny, just not enough pixels for the algorithm to do a good job)
SIX GIGS is more than fine with 1080p
yes
4 if yer not a raytracing fagqueer
youtu.be
youtu.be
I mean yeah you can also lower the texture quality but that's generally not seen as a real solution because that can downgrade the visual quality of the game.
Intel just announced their new GPU which will come with 48GB VRAM at less than 1k$ so I guess it's mostly njewdia being jewish, as always.
"announced"
intel
i sleep
7800xt 540
dude, I paid 400 for that card
texture pool =/= texture quality. the textures are the same, it's just how much is cached in vram. if you have an ssd this is basically a moot point. maybe if you have a shit ssd it will load textures slower at the lower settings, but i have one of the best gen 4 drives and it's never been an issue. a 4090 isn't a good example because the game will use more on it, since the engine can load more. that's how modern engines work nowadays.
12gb VRAM
you don't need more than 8 gb, stop falling in jews trickery.
What type of ram and whats the bandwidth?
implying dlss is not mandatory even with current year hardware
you are coping hard
"extend" my dick in your ass dumbfuck
It's not mandatory, unless you've got outdated hardware or entry level cards.
SAVE ME DADDY JESEN I NEED YOUR FAKE FRAME SMEARS
pathetic
We should go no further than 4K. Long ago I thought that 1440p was more than sufficient for the average consumer and that opinion holds today, but I am also open to the idea of 4K as the ultimate perceivable luxury option. But more than that, like the ridiculous 8K displays coming out now? Display manufacturers, STOP!
can you even see anything when the game is like that?
2k is good enough for now. 4k performance stinks, but you can play most shit in 2k with a good pc without issue and it looks nice.
so these 4k fags are all playing with 60 fps tops?
how fucking embarrassing that is
yea you got nicer pixels but your game is choppy
what kind of a retard would take that trade
inb4 ai frames
should
market has rejected it
8K is stupid and they stopped trying to push it years ago
PS5 box 8K logo
lmao
he said TEXTURES, not resolution.
4k users just resort to upscaling if they want more performance, even 50% input resolution at 4k looks better than native 1440p with shit like DLSS4.
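For the arithmetic behind that claim (a quick sketch; "50% input" is assumed here to mean 50% scaling per axis, as in DLSS performance mode): 50% input at 4k renders only 1920x1080 internally, fewer pixels than native 1440p, so the comparison is really about how well the upscaler reconstructs the missing detail rather than raw pixel counts.

```python
def pixels(width: int, height: int, scale: float = 1.0) -> int:
    """Total rendered pixels at a given per-axis render scale."""
    return int(width * scale) * int(height * scale)

native_4k    = pixels(3840, 2160)        # output resolution: 8,294,400 px
dlss_input   = pixels(3840, 2160, 0.5)   # internal render: 2,073,600 px
native_1440p = pixels(2560, 1440)        # 3,686,400 px, more than the 4k input
```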
monster hunter
8GB is more than enough if you're not doing gaytracing
Yeah with textures on low it's okay.
that's called downscaling you downie
Because I don't buy downgrades
posts literal picture showing it's only using 7gb of the allocated ram on high
why are jews like this?
12Gb is good enough for modern gaming, the only time you're going to need more is for modern games that are optimized like shit. I have 16Gb on my setup and except on Indiana Jones' last circle at 4k with supreme textures and ultra settings, where I sometimes hit 12Gb, I never need more.
Doing ultrawide gaming, and it's not enough. Even 16gb is not enough with everything cranked up with a 2 year old game like re4r
ultrawide gaming
yes, you pay extra for edge meme cases
Keep looking through your window.
Because the less vram you have, the less capable you'll be on future titles.
Keep looking through your window.
says anon looking through a slightly wider window
future titles
GTA6 will be out for PC in ~3 years tho
not necessarily if devs do their job and use features which literally exist for them in directx/vulkan to reduce texture cache requirements. also, nvidia will probably announce AI texture compression soon too.
And I don't care about Grand Assault Tranny 6.