pcgamesn.com
But Anon Babble told me 8gb wasn't enough anymore and you needed 16gb minimum. Were those just shills trying to sell me the new cards?
AMD claims most gamers don't need more than 8GB of VRAM
I mean if you play old games or indies then sure, but wouldn't you just be better off getting some older 8gb gpu at this point?
It's only bad when AMD does it
It's only bad when Nintendo does it
It's only bad when America does it
Getting real tired of this shit
"Anons" who told you that were just the usual Nvidia pajeets shilling their new rigs just in time for the new Doom game release that forces ray tracing. I am sure 8gb plays like 95% of old and modern games. It helps a lot that these days the difference between medium and ultra high settings is minimal.
if a game needs more than 8gb of vram it's the most unoptimized pile of shit on the planet.
what a fucking retard, their entire appeal right now is that they offer plenty of VRAM
games notoriously struggle with 8GB of VRAM
I've never seen a game use more than 5-6 gigs of VRAM at 1080p (I have 12 gigs)
Ok so i definitely need 16, got it.
That's true.
You don't need more for Fortnite, Minecraft, GTA V and so on.
It's also enough for those fotm streamer games with their shitty graphics.
AMD are correct.
Digital Foundry are wrong (As usual)
9060s are for the turbo poorfags who play new games at low settings so that vram claim is understandable
Still don't see any reason to upgrade from my RX 7600. Plays Horizon and Remake 4 pretty well.
they're right, games really shouldn't require more than 8gb
We need GPUs that can run 4K easily. 16GB is the new minimum
how about you fix your game instead rasheed
yeah why the fuck can't i play modern games on my voodoo 2
fucking indians
The only thing that will get me to upgrade again is AI porn but the tech isn't there yet. Is getting close but not yet.
Just put on DLSS/FSR.
I bought a 4090 for ERP LLM and it's surprisingly decent, don't write it off entirely just yet
12 GB?! What do you need 8 GB for?!
jumping to extremes
16gb is ludicrous, even for 4k you streetshitter
It all depends on the resolution you play at and the fidelity of the game's graphics.
Even back in 2016/17 with my gtx 1070 at 1440p newer games were getting very close to 8gb vram usage.
All those things suck though. Especially america with your “Antisemitism is the worst thing ever” president.
Can't think of a single 8gb GPU that doesn't struggle with 4k resolutions.
There's no reason for AMD to make such an embarrassing statement.
They really should just keep quiet and sell their shit.
defending anything over 12MB memory
now who's the paki
Please stop trolling and talking out of your arse with the obvious intent of spreading misinformation. Before making these claims, at least admit that you play 10+ year old games and not at 4k.
I'm on ultrawide and maxing out even intensive games from a couple of years ago goes up to 14-15gb vram usage. Imagine 4k.
Play the new Indiana or Alan woke
And they're correct.
Most gamers just play CS:GO/Fortnite/LOL/Valorant etc etc
So he's correct.
The new consoles have 24gb, you will all have to upgrade
not untrue. if developers stopped hitting
auto optimize
in UE5, every game could run like 2007 Crysis (and better) with modern hardware. hell, it's gotten to the point where I see a Unity game that looks as good as UE4 and I am relieved because it actually runs at a stable frame rate instead of micro-stuttering and needing to fuck with mandatory frame gen and upscaling shit.
I don't play shit games
2007 Crysis (and better) with modern hardware
2007 Crysis runs like shit even on modern machines
I wonder who could be behind this post... youtube.com
4gay resolutions
found the problem. (You) fell for the meme.
People play games released past 2001?
Why?
can't say I've tried it recently, but last time I fired up Warhead it was fine. I didn't think there were that many engine optimizations between the two games.
I played Indiana Jones and that shit ate all 6GB of VRAM; luckily it ran smooth, maybe because I'm on 32GB of RAM. Textures were on High, everything else low or off, 1080p and DLSS on. Fuck off with saying 8gb is enough. AMD was always generous, giving away more VRAM than Nvidia for years. Why tf did they change their stance since the RX 7000 scam? They are just a red Nvidia now.
because Ninja Gaiden II came out in 2008 and utterly rapes anything you might say to the contrary
games notoriously struggle with 8GB of VRAM
werks on my machine
Problem is retards driving 4K resolutions with 1080p/1440p GPUs.
8 is barely enough today and will be too little in 5 years
Depends are you going to play shitty unoptimized crap or not.
Stop right there criminal scum!
Old games are not for recreational purposes!
It's painfully single threaded and doesn't scale at all. Remastered was supposed to fix that, and didn't.
For casual games that would run at max settings with a 1080 for under $500? Sure, this GPU is perfect for them.
Were those just shills trying to sell me the new cards?
yes
all games are made to run on consoles which dont have 16gb of vram
the next generation of games will be made to run on switch 2, which has hardware on the level of ps4
you dont need to update for the next 10 years if you only use your pc for video games
I agree. 240p with DLSS should be enforced. Not allowed higher.
Better for the money to go to a better GPU than more RAM too. Might need to make RAM on GPUs upgradable. Or make integration with system RAM better. Silly not to fully utilise system RAM. It's one of the main reasons the PS3 did games worse than the 360.
It is better to have a better GPU than lots of VRAM.
Onimusha 2 was great.
just watched someone playing it at 4K on a 5090. I'm sure if you're blowing up half the screen at any given time, it'll crash out eventually. otherwise, the footage looked amazing for a game that's nearly 20 years old.
it's been over 8 years since 8GB vRAM GPUs were in the midrange (~$300)
releasing 8GB vRAM in 2025 is retarded
people who play in 240hz 1080p are just casuals
but me, a fat fuck who spends 10k on their 4K resolution battlestation, I am the apex gamer
yeah ok
AMD doesn't limit VRAM, though. They are also the only ones selling it. They are telling you that you don't have to feel obligated to buy their own stuff. They are the only one who doesn't do this, schizoman.
PC mustard race: CONSOLES ARE LE BAD BECAUSE THEY HALT THE PROGRESS!
Also PC mustard race: I don't need more than 8GB vram, any game that needs more than that is unoptimized shit
give low end systems dogshit GPUs for years
look at all these people with dogshit systems! most people dont need better GPUs!
the worst part are the fags that bootlick these companies
PC: you can choose what you want to do
Console: no choice
You: ALL GAMING IS JUST ABOUT GRAPHICS
This sounds more like pro-consumer behaviour. Instead of trying to sell you something you don't need they're being honest about it.
Nigger just put your monitor on the table
A 300 dollaridoo GPU, even with 100000 GB, still couldn't run 4K.
and if your choice is to have half the vram of a fucking PS5 then you are a faggot
I'm mentally retarded.
I understand that it may be disheartening to hear the company tell you that you don't need more, but ultimately nobody is stopping you from buying more expensive products if you so wish.
4k resolution is a meme
Having less graffix than a PS5 makes me a faggot. Having more makes me a faggot. There is no winning move here.
Unless you are gay then you always win.
Yeah so a PS5 user.
Indy maxed out + frame gen used up all my vram on a 5070 ti. Vram requirements are going to skyrocket as devs start using more raytracing tech in their games
Oh wow maxed out wooow. Oh no what if I just... what if I don't run my games with maxed out with path tracing and frame gen how about that?
Nvidia holds midrange and low end GPUs back for a decade with an effective market monopoly
AMD takes the blame
K E K
E
K
9060 XT 16gb is looking pretty good so far. Recommended power supply is 450w compared to the 600w needed for the 5060 Ti 16gb. It can make a difference for people like me with a 550w power supply who are looking to upgrade, but I think the 5060 Ti wouldn't have any problems with 550w either. I will wait and see what the actual pricing is like in Europe
what if I don't run my games with maxed out with path tracing and frame gen
Works on my machine.
24GB unshared and 64GB of physical RAM for higher resolutions like 4K+; any lower and you can start trimming.
technically correct because most are still on 1080p
If you stop "holding back" low end gpus then they wouldn't be low end gpus anymore. Just buy a stronger gpu if you want to.
And you are technically retarded, because the VRAM increase from 1080p to 1440p is about 0.2GB.
My dad still games on 3.5gb. medium textures, 2x AF and 1080p.
these retards also NEVER take into account that people run 2 monitors using up more vram with shit running on their second screens
BASED. What games does this king play?
Lately, RDR2 and hoping for a new Tomb Raider.
16GB should be the minimum; textures take very little actual GPU power to render, and high-res textures are noticeable even at 720p, so it's effective on cheap low-transistor GPU models
expensive GPUs spend most of their processing power on meme postprocessing shaders and ray tracing, which are nice but less important than raw texture detail
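Back-of-the-envelope sketch of why high-res textures are cheap in compute but pay rent in VRAM. All numbers are assumptions for illustration: BC7 block compression at roughly 1 byte per texel, and a full mip chain adding about a third on top of the base level.

```python
def texture_mib(width, height, bytes_per_texel=1.0, mips=True):
    """Approximate VRAM footprint of one texture in MiB.

    BC7 block compression stores ~1 byte per texel (assumption);
    a full mip chain adds roughly +1/3 over the base level.
    """
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mips else base
    return total / (1024 ** 2)

# One 4096x4096 BC7 texture with mips: ~21 MiB of VRAM,
# regardless of how fast the GPU cores are.
print(round(texture_mib(4096, 4096)))  # → 21
```

So a few hundred ultra-res textures resident at once eats gigabytes of VRAM while costing almost no shader time, which is the post's point about cheap low-transistor GPUs still benefiting from 16GB.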
Not him but I use 1440p main + 4K secondary and my baseline VRAM usage is 0.7GB
Why is it that AMD just has to be their own worst enemy all the time?
Wow RDR2 runs with that?
We thought textures were good 10 years ago. And they have always gotten better. Same with graphics. FPS. It's all relative. You've gotten used to the 4K meme and now you have a hard time going back, and it makes you feel frustrated.
i like how Anon Babble is just an aggregator for bots to copy paste news off of for twitter
What are you even trying to say? Textures are just as important as they always were. I don't game on a 4k monitor.
Open the catalogue. Do you really think that bots come here to make news off?
Gamers claim most dont need AMD
Schizo
Just because the textures weren't in giga high res it didn't make the games bad.
what will you do if you say yes
i am NOT taking my medications
Kek Totally believable. You can always trust AMD to waste a good opportunity. They could slap Nvidia, but that's asking too much. It's difficult to support a retarded company, bros
isn't VRAM super fucking cheap?
Lmao that RTX 4090 with 24GB I bought in 2022 for msrp turned out to be quite the investment. Still playing Doom TDA and Expedition 33 in 4K at 100-120 fps thanks to DLSS4. Not even using Frame Gen yet.
Okay. 8GB cards were mainstream 10 years ago.
for AyyMD? yes they're using ancient GDDR6 used in RTX 20 series
Nvidia is using latest GDDR7 tech
This is from Twitter.
The 980 had 4 GB of VRAM and even that wasn't a "mainstream" card.
Well yes, it's a twitter screencap posted on Anon Babble posted on twitter, now posted on Anon Babble again.
wasn't everyone losing it a while back when some headline said a majority of people playing games right now are playing 5-10+ year old games?
isn't amd just statistically right? not defending their shitty cards just don't buy them if they don't have what you want.
people still play new games, they just go back to the old games after
not defending their shitty cards just don't buy them if they don't have what you want.
computers do age and at a certain point you NEED new GPUs
computers do age and at a certain point you NEED new GPUs
Haha yeah...
I would legitimately be happy with a 6700XT that has more VRAM but only because I want to generate more furry porn. New video games just suck.
Computers don't age. The GPU is just as fast as day 1. The load is just bigger in new games.
New video games are just as good as they always were. The indie scene just varies and mixes more styles and mechanics due to its constant struggle for identity, which is expected since they usually lack experience and vision. I think older gamers just tend to be jaded as adult life sometimes takes a toll.
I mean, looking at Steam stats and the most played games on there, the majority of people just play random ass esports crap and older shit, so in a sense they do have a point. It's pretty much fine on a niche card made for comp games and older shit, but when you actually want to play newer stuff, 8GB tends to become increasingly more problematic. Overall it really hinges on the price point though, and I doubt anyone would have an issue with 8GB if those cards cost $200. Asking $350+ is simply too much, and that goes for both AMD and Nvidia.
They're right.
less then 24GB is literally unplayable now
but when you actually want to play newer stuff
But these actually-want-to-play games are just 3rd person over the shoulder walking games. Almost all of them run on 8GB at ultra too; some just require you to drop textures from ultra to high.
based AMD. They know 4k is a meme
the majority of pc users are still using 1080p so they don't need much vram for most games.
8 is really pushing it for any modern triple A release. If you want the minimum of 2k and max or near-max graphics (ray tracing or not), 16 gb is far safer. 3090 users are still sitting comfy after all these years with 24gb.
indians aren't even trying to hide it anymore
2k
Stop using this retarded marketing term please and just call it what it is, 1440p. It has 2560 horizontal pixels, nowhere near 2000, so that shit term doesn't even make sense. If anything, 1080p is closer to "2K" than anything else.
I remember when AMD said that 6 GB was too little for a gpu and they promised to never release a new card with that little vram on it in the early RX 6000 series days. Then they deleted the article right before the 6500 xt 4gb was going to release.
this. for 1080p 8gb is enough.
->
you unironically do not need anything above high framerate 4k
but really high framerate 1440 is already good
They need to start making GPUs with VGA ports again.
You support more video memory than me. Using your logic, that makes you more paki than me.
You will never be white.
but you need more than 8 gb for 1080p ULTRA
most people don't play at ultra or "very high" as hardware unboxed calls it. Just because unreal engine gives you the option to maximize certain details doesn't mean you absolutely have to in order to enjoy the game, and if you play competitively you're going to dial down the details anyway.
That card is not going to be playing anything on settings that would push it over 8 GB anyway, and even then textures would merely be one of several settings you're turning down a notch. The GPU market is a total shitshow, but so are all the internet retards spinning dramas about budget tier GPUs potentially not being able to play games all maxed out with ray tracing and shit.
I'm just annoyed when people talk like resolution changes how much VRAM you need. That's 10+ year old information. The 0.2GB increase used to be significant when our cards barely had 1GB of VRAM, but now with 8GB+, that 0.2 is nothing.
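The ~0.2GB figure can be sanity-checked with rough arithmetic. The render-target counts and bytes-per-pixel below are illustrative assumptions (real engines vary wildly): even granting a generous pile of full-resolution buffers, the 1080p-to-1440p delta stays around a quarter of a gigabyte.

```python
def mib(w, h, bytes_per_pixel):
    """Size of one full-resolution render target in MiB."""
    return w * h * bytes_per_pixel / (1024 ** 2)

# Assumption: ~20 full-res targets averaging 8 bytes/pixel
# (color, depth, G-buffer layers, TAA history, bloom chain, etc.).
TARGETS, BPP = 20, 8
d = (mib(2560, 1440, BPP) - mib(1920, 1080, BPP)) * TARGETS
print(round(d / 1024, 2))  # delta in GiB going 1080p -> 1440p, → 0.24
```

Texture and geometry memory is resolution-independent, so the bulk of an 8GB budget doesn't move with the monitor; only these screen-sized buffers scale.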
They're not wrong.