Do you think 12gb cards will be obsolete in 2-3 years from now?
8gb cards are dead
12gb are fine right now but almost every new game eats up 11+ gigs of vram.
Monhun Wilds looks like shit without the hi-res texture pack, which you can't even enable on 12gb cards.
I truly believe that 5070 will age like shit really fast.
I don't play any game released after 2019.
nobody asked
Why would you grab a 12G card when 16gb cards are readily available?
yea, if you want a card to use for at least 5-6 years without having to turn down textures and shit because the vram runs out, you need at least a 16gb card. ironic that the cheaper cards that lean on dlss and framegen as selling points are the ones with no vram, when that stuff requires more vram than running the games natively
not if you play at 1440p or below, omly 4k fags get BTFO
These cards are made so cheap you HAVE to use upscaling tech for playable framerates yet they're still sold at a premium.
Why do hardware requirements keep going up with no discernible quality improvements to visuals?
Is it incompetence or diminishing returns?
I did
I think it will really matter when we start seeing PS6 only games and 12gb cards will just have to deal with running below console textures in the pc version.
Because 5070 is faster than 5060ti.
VRAM requirements scale depending on the amount of unified RAM the newest PS/Xbox console has
For example the PS4 had 8GB RAM but only around 5GB could be used for games, which is why any game released pre-2021 or so will run just fine on a 6GB GTX 1060
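To put rough numbers on that (back-of-envelope sketch only; the OS reservation and the GPU share of the game's budget below are assumptions, not official specs):

```python
# Rough estimate of the VRAM a PC port "targets" based on a console's unified memory.
# All the split ratios here are assumptions for illustration, not measured figures.

def estimated_vram_target(unified_gb: float, os_reserved_gb: float, gpu_share: float) -> float:
    """Rough VRAM target in GB: (unified RAM - OS reservation) * fraction used for GPU data."""
    game_budget = unified_gb - os_reserved_gb
    return game_budget * gpu_share

# PS4-era guess: 8 GB unified, ~3 GB reserved by the OS, ~2/3 of the rest is GPU-side data
print(f"PS4-era: ~{estimated_vram_target(8, 3, 0.66):.1f} GB")   # ~3.3 GB, fits a 6 GB 1060
# PS5-era guess: 16 GB unified, ~3 GB reserved, same split
print(f"PS5-era: ~{estimated_vram_target(16, 3, 0.66):.1f} GB")  # ~8.6 GB, tight on 8 GB cards
```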
Modern game engines just suck ass: everything is automated, so devs don't have to make any effort to tweak things to make the game run better.
The fact they all design games on 4090's which just bruteforces performance doesn't help either.
not if you play at sub console resolutions
That is the real big brain move here.
ray tracing
BAHAHAHAHAHAH
look just because you're a thirdie in a net cafe or whatever doesn't mean everyone is on a pentium 2 or some shit, many of us CAN see the improvements
No. I have a 10gb card and I've been hearing how 16gb is the new baseline for 5 years but games run perfectly fine on it to this day.
almost every new game eats up 11+ gigs of vram
Have you tried not playing unoptimized UE5 slop?
Well if he plays games that are 5+ years old now then it answers OP’s question.
So yeah, this was the whole topic.
Retard.
he didn't notice the textures being downgraded
then claims games look like shit
gimping your GPU's performance with goy tracing
somehow I'm the retard
Stop making me laugh anon.
Who are you quoting?
nvidia was withholding vram because of the ai thing
amd's next line of APUs use unified memory and you will be able to have 48gb of vram much much cheaper than nvidia is offering (you can do similar with the current APUs but their dedicated ai processing is too slow for it to matter for most cases)
nvidia is going to be losing their reason to only give proper amounts of vram to the highest skus
12 gb will last until PS6 comes out in 2027 or 2028. Who knows how much VRAM those consoles will have, but if it's 24/32 then literally every budget card today will be aged.
Radeon owner here. It's really not an issue. This narrative is usually peddled by AMD fanboys on social media because more ram is a selling point on AMD cards, but in reality it's not a big deal. There are people on this great platform who seek to divide and spread hate. Just ignore the AMD fanboys.
90°C
Your GPU is dead in a year
I wonder who could be behind this post?
Given it is 2 years old and memory chips are rated for MUCH higher temperatures, I will conclude you have no idea what you are talking about.
Ray tracing makes a huge difference, you're just a poor retard.
It's over for vramlets.
If you count modern game devs, then it's already obsolete right now
ray tracing makes a huge difference
nah
Even Vagrant Story would look better with ray tracing.
Ray tracing makes a huge difference
I've been testing it in a number of games over the last few weeks.
70% performance drop
forced to use downscaling to get 60 fps
straight up broken in most games
barely any noticable difference when actually playing the game
The only game where I left it on was the Dead Space remake. It made things worse in everything else.
inb4 hurr durr but goy tracing is the future
Maybe but it's not there yet. Not even fucking close. As far as I and most other people are concerned it's still a gay performance hogging gimmick.
unoptimized pajeetslop runs like shit despite looking just like the previous installment
Shocker
Too bad you're cooked unc since most new games have mandatory RT now
gpu usage 100% vs 98%
vram usage 7.5k vs 10k
power consumption 69W vs 144W
What fucking vram is Jensen using for his cards? How is it doubling watts?
rated for MUCH higher temperatures
It doesn't mean it's ok to let it run this hot, and it hints at issues with the cooling. Consider repasting the die and replacing the pads. Deshrouding and swapping the stock fans for proper 92/120 mm ones is always a great option too, since it noticeably cuts both noise and temperatures.
Indiana Jones
most new games have mandatory RT now
Such as? The only one I can think of is Zoom Dark Ages
Indiana Jones
post deleting nigga
Doom TDA
Indiana Jones
'most new games'
I'm not sure you understand what tjmax is.
They are the two biggest releases of the past 5 months.
No, that would be oblivion remastered (OH WAIT THAT ALSO HAS FORCED RT) and expedition 33.
No, they really aren't. They are BOTH gamepass fodder.
ray tracing cum guzzling niggers are most insufferable faggots ( that literally take massive dildos down their ass ( from my experience knowing one )) they do not even comprehend that it is all a software sim and it has no literal correlation with how it would look in reality since they are so low IQ that they shill for shit they did not even bother comprehending and at fucking best case scenario ray tracing is about 1% real ( desnity ) i guess just as much as You are woman )) and the effect of it makes games look like absolute blurry messy noisy shit, literal clutter and shit everywhere, it adds literally nothing to games and visuals are even more so underwhelming due too all shitty artfacts and filters on screen, i would take clean PS2 graphics over this cum guzzling blur mess any day
sour grapes, the post
pretty sure consoles are upscaled from 1080p
there isnt a single game worth playing that requires more than 8gb
the only reason you should get a better card is if you do 3D modeling or use AI
New metro game this autumn will be RT only.
GTA6 RT only on consoles.
Death stranding 2 RT only on ps5.
WItcher 4 will be RT only.
Of all upcoming games only silent hill f will feature the old way of rendering things.
You know that poorfags are coping when they go from "no game requires X to run" to "no GOOD game requires X to run"
I'm still baffled how 8gb somehow is still the standard in lower budget cards. we went from 512mb in 2008 to 6gb in 2016, a 12x increase in 8 years, but in 2025 we're still at 8gb, a 1.33x increase in 9 fucking years. I didn't expect the same leap but we should be getting at least 12gb in the xx60 line.
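Quick sanity check of those ratios (which card counts as "lower budget" in each year is debatable; the arithmetic isn't):

```python
# Growth of mainstream (xx60-class) VRAM over time; (year, GB) pairs from the post above
points = [(2008, 0.5), (2016, 6.0), (2025, 8.0)]

for (y0, g0), (y1, g1) in zip(points, points[1:]):
    print(f"{y0} -> {y1}: {g1 / g0:.2f}x over {y1 - y0} years")
# 2008 -> 2016: 12.00x over 8 years
# 2016 -> 2025: 1.33x over 9 years
```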
I want this slant eyed jew to get CEO'd so bad.
I want this slant eyed jew to get CEO'd so bad
No chance. Amerigoys love sucking corporate cock too much. It's 2nd nature to them.
16gb cards are already obsolete for VR.
Neither, really.
The real reason is development cost.
The software industry has a downright magical ability to instantly negate the benefits of any hardware improvements. They'll just use it to churn out code faster and sloppier than ever before.
Really sucks that vidya industry caught up with that shit. PC gaming used to be the forefront of insane software optimization tricks only a couple of decades ago.
12 GB will be fine until another horribly unoptimized game comes out, which will probably be whatever open-world-adjacent game Capcom makes next
At this point, if I want to play newly released games that require high-spec hardware, I might as well get a good game streaming service sub rather than buying a whole new PC for a single player game I'll play once.
You don't NEED high resolution textures on your expensive new gpu my guy.
In short, Ray tracing.
I got 11gb vram 8 years ago. That's all I gotta say. Both Nvidia and Amd scamming but at different levels.
nigga I'm still on 8GB
unless you only care about AAA slop even 6GB is more than enough for 99.99% of the PC library
2080ti?
OP is a retard. Just use 1080p. No texture packs needed and plenty of vram to spare
8gb has been available on consumer cards since 2014.
No they are not, but 16 is the absolute minimum one should get.
That's why I really feel fucked. 9070xt feels like a good deal but I would either want it to be cheaper or with more vram and 15-20% extra performance.
5070ti same. 5080 is gimped as fuck and 16gb as well.
5090 is 2700euro and I cant justify that pricing with the 1 decent game release per year.
At this point the only option is waiting for 5080ti eoy, probably super uplifts or Amd releasing something stronger.
None of that will happen in the next 6 months.
I hope next gen will have a giant generational leap and every current gpu will be sold at a 70% discount, but these fuckers don't wanna replicate the 1080ti and cut into their own sales, plus even if the next gen leap is great I can see them justifying another 50% price hike or more. I feel like both Amd and nvidia are colluding to fuck us over, so our only chance is fucking Intel cooking something extreme.
tfw 3090let
In Indiana Jones you can't set maximum textures on a 5070 12GB (fps drops to 8), and technologies like frame generation also require VRAM, so 12 is already outdated
I've yet to see a single game where you're actually limited by the card having 8GB of VRAM and nothing else. Usually when you're at the type of settings that utilize more than 8GB on the type of card that would have 8GB, the game would run like shit anyways, even if the card had unlimited VRAM. Maybe I just play different games than all these people and I happen to miss the exact games where this would be an issue, idk.
You retards never learn.
Nothing is going to change till the new consoles generation.
Just like 6gb vram was enough for everything during ps4 era, 12gb will be enough for everything during the ps5 era.
Yes, of course you can crank up everything to max and complain that 12gb is not enough, but if you want to play on 4k with everything on max just buy some highest end card with 24gb vram or something
It's a great card and your only upgrade path is a 5090 for 3k or snatching a used, discount-priced, non-scam (not core/vram stripped) 4090... So your only option is a 5090 for 3k.
They are obsolete right now. I say this as someone with a 10GB card. The future of GPU usage is ML applications and those eat VRAM like crazy. If you want to build a PC today, get yourself a 5090 or don't bother. Seriously. The 16GB """enthusiast""" cards are low/mid-range for AI uses. If you think $2500+ for a GPU is crazy (it is) remember those prices are only going up over time. No matter how bad things get, don't worry comrade, it's going to get much worse.
A card nobody owns lol
can't enable hi-res on 12gb cards
yes you can?
For AI, the meta is getting multiple used 3090s. The 32 GB of VRAM the 5090 has still isn't much for the money. Hopefully AMD saves local AI slop though
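Back-of-envelope on why 24-32 GB "isn't much" for local AI (the ~20% overhead factor and the quantization levels below are assumptions; actual usage varies by runtime and context length):

```python
# Rough VRAM needed just to hold model weights, plus an assumed ~20% for KV cache/activations.
def model_vram_gb(params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    return params_billion * bytes_per_param * overhead

for params in (7, 13, 30, 70):
    fp16 = model_vram_gb(params, 2.0)   # 16-bit weights
    q4   = model_vram_gb(params, 0.5)   # 4-bit quantized weights
    print(f"{params}B params: ~{fp16:.0f} GB fp16 / ~{q4:.0f} GB 4-bit")
# 70B lands around 168 GB fp16 / 42 GB 4-bit, hence stacking multiple 24 GB 3090s
```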
B-but on 4chin every (jeet shit)poster got a 4090/5090, according to them (posting on their free temu tablet)
Well who knows it might be true. Pray tell, how good is your mandarin?
Aren't these stats inaccurate because of old data, or are they from recent steam users?
Who knows? Valve doesn't say how they collate the data they collect.
The steam hardware survey also doesn't track very well with numbers mercury or jon peddie put out. For example radeon ships millions of units a year but apparently their gpus aren't used by gamers and they sure as shit aren't used in workstations (validation yo) so they must just be dumped into the ocean or desktop linux users took over the world without anyone noticing.
It's current, but only from retards who willingly let spyware from companies like steam or microsoft collect their """"telemetry"""" data.
desktop linux users took over the world without anyone noticing
QUIET, YOU FOOL
The vast majority of GPU compute goes into crypto mining. Burning electricity doing useless math because it's technically profitable and therefore infinitely scalable for free money.