Let me guess, the dude who's the head of AMD's gaming division doesn't know what he's talking about?
Chief AMD Engineer: most gamers "have no use" for more than 8GB of VRAM
The full text is a lot more reasonable than your clickbait headline
Like women can we start excluding third worlders from the statistics? Cs2 f2poor thirdies should not be counted as part of the gamer population
Seems reasonable to me. What am I supposed to be mad about
He would be right if gamedevs weren't retards using UE5 presets that run like dogshit. Most consumers are indeed 1080p thirdies with toaster PCs.
What am I supposed to be mad about
Be mad at this shill for shilling unoptimization and voting against his own interest.
Cool twatter thread but why don't you kill yourself?
The problem is developers making bullshit games.
Red Dead 2 requires less than 8GB to run everything on Ultra settings, and it has better fidelity and complexity than almost every other game out there
It's possible
Modern developers are just talentless hacks and that's the truth
6 million gbs of prophecy-inducing VRAM wouldn't even change that.
does this guy know gamers are also running AI?
tell this cheap jew faggot to eat a dick
my 12gb card cant play oblivion at 1080p is this nigger retarded
I still play on 1080p desu.
be me
buy RTX 3080 on launch
play RDR on ultra 4k etc.
GPU sounds like a fucking airplane taking off
switch to 1080p and have a comfy quiet time with better performance
I don't know. I don't care for 4k that much.
AMD
running AI
Maybe, but what about in 2-5 years? I don't want to buy something that will quickly be obsolete.
Anyhow, I thought the consumer was always right? Why should it matter why we want the extra VRAM?
At 1080p no game uses more than 8GB unless it's UE5 slop, but Unreal Engine isn't a game engine anymore; it's a movie engine that's used in actual movie/CGI studio projects now.
My Radeon card played Oblivion with 512MB of VRAM in 2006. No point playing the remaster when it's the same Gamebryo engine running underneath, except without any mods to fix it.
bro gamers are still playing at 1080p
yeah with bullshit frame generation and god knows what else to pull a solid 30FPS out of some unoptimized milled out UE5 slop
My 12GB card can't handle a reimagining of a 20-year-old game
No one is surprised but this is a bad thing that should not have happened
The fact that you find it to be normal or acceptable is the problem
Esports games.
I haven't played a single esports game in 30 years
Actually, AMD created the ultimate chip for average AI + gaming workloads. Not the cheapest machine but the best for people like me who have little space and want a machine they can use for both productivity and some gaming:
youtu.be
Nobody actually uses 4K for anything. Just look at all the desktop software on your computer. All the UIs are built for 1080p. If you try to run it at higher res it will look microscopic, because the devs didn't support 4K; they know it's dumb for anything but movies
He's talking about the kind of people who only play games like counter strike or league of legends.
Gacha are the new esports games. Those can run on phones and consoles aka toasters. But they can also get super demanding on a dime too, WuWa has full RTX features for instance.
That's true. Just because you got marketed 16GB only to use 3 of them doesn't mean he's wrong
Why not just get a laptop with a 4060? Hell, even the 4070 laptops are pretty cheap nowadays
Hey, bitch! Shut the fuck up! Where is ROCm for Windows?! Why are FSR4 and freaking frame gen stuck on DirectX? When is FSR4 coming to Linux?
Why is vram so expensive anyway?
Just let me bolt an nvme to the card
The 8GB option is mainly for OEM prebuilts. It's probably hard to find at retail too, like how you'd find plenty of new custom RX 560 4GB models at shops while all the 2GB models were generic and used.
Framegen is exclusive to DXGI on Windows. Linux has no concept of DXGI. ROCm is already on Windows; you set it up with one PowerShell copy-paste.
When the FSR4 SDK is actually released
Linux has no concept of DXGI.
Linus even more useless than usual hahahah
No DirectX on Linux is a good thing as RADV direct is 50% faster than Nvidia on Windows in all non RT games
They don’t. The large majority of gamers play 1080p and below. They aren’t manbabies who need 4k 120fps
He's wrong. UE5 is a RAM hog. I got a 5 fps boost going from 16GB to 32GB in Oblivion remastered.
What the fuck are you even mad about? It isn't enough for higher VRAM cards to exist, we also need to abolish all lower VRAM cards?
if the goyvidia PR manager said this, we'd already have six 500 reply threads on the catalog
UE5 is not a game it's a movie
Only poops play in 1080p what the fuck stop marketing to third world countries we only do 4k in America
Oh my, how special you are. Not like the common folk doing their commoner things, you are different.
he's talking about GPU VRAM, not system RAM
The PS5 is still rendering games at 720p-1080p. Its quality mode for FF16 is dynamic 1440p.
Super high textures + Frame gen increase the memory requirements by a lot. And devs are starting to optimize their games assuming people are using frame gen (MH Wilds), which makes it even worse.
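If you want to see it for yourself rather than argue, here's a minimal sketch that polls VRAM use while a game runs, assuming an Nvidia card and the nvidia-ml-py package (imported as pynvml):

```python
# Minimal sketch: poll GPU memory while a game runs, to see how much
# VRAM high-res textures / frame gen actually eat.
# Assumes an Nvidia card and `pip install nvidia-ml-py`.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```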
It’s true. Most gaymers are still at 1080p after all
Yes
Death to poors
I mean the most played games came out 10+ years ago, so he isn't wrong
fortnite
minecraft
gta5
league of legends
well, he's right. i just play RTS games, other old games, and occasionally some newer indie games.
Super high textures + Frame gen
You won't be using either on the tier of card that comes with 8 GB.
why does this 80s Honda shitbox only have trunk capacity of 100L? what if i want to transport half a ton of bricks? the manufacturers need to change this
Fortnite and GTA 5 are the only games that are somewhat intensive since they got engine updates.
LoL and Minecraft are so old they could run on a Windows XP laptop.
ROCm is already on Windows; you set it up with one PowerShell copy-paste.
Only partially. Back down before I tear you a new one.
i'm enjoying the fight between hardware engineers and software devs
engineer: you don't need more than 8gigs vram, clean up your sloppy fucking assets.
devs: PLEASE REDEEM MOAR VRAM
That's the issue. Nvidia is selling 8GB cards and showing benchmarks using multi-frame gen, and they're clearly trying to push it even on 8GB cards.
most people buying/gifting prebuilts or laptops won't know better; they'll just see it's $50 cheaper and click add to cart hoping it's better than what they have, but it won't be
Fully agreed. We hate Counter-Strike.
By the way, do you have a rough date as to when we started hating CS? I have it in my Steam library from long ago but I want to conform to our new definition of Whiteness. You see, I'm Aryan and I can't be seen playing something that isn't approved by Anon Babble. Are CS:S and 1.6 approved?
WSL gives you a full Linux environment on Windows without VM bullshit. You can set up ROCm with PowerShell in seconds as long as you've got NVMe and Windows 11. The only thing you can't do is install RADV, because that would mog Nvidia in gaming and break the monopoly.
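Once it's installed, a quick sanity check (a sketch assuming a ROCm build of PyTorch inside that WSL environment; ROCm piggybacks on the torch.cuda API through HIP, so the same calls work):

```python
# Sketch: verify a ROCm-enabled PyTorch build can see the GPU.
# Assumes the ROCm wheel of PyTorch is installed inside WSL;
# torch.version.hip is None on a CUDA build, a version string on ROCm.
import torch

print("HIP/ROCm version:", torch.version.hip)
print("GPU visible:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```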
I wish vram was modular. Cause I need as much vram as I can possibly cram into a gpu
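The back-of-envelope math on why local AI eats VRAM so fast: the weights alone take params x bytes-per-weight, and KV cache/activations come on top. A rough sketch (the 1.2x overhead factor is my own assumption; real usage varies with context length):

```python
# Rough rule of thumb for the VRAM a local model's weights need.
# The 1.2x overhead factor is an assumption; KV cache and activations
# can add much more at long context lengths.
def vram_gb(params_billions: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

for params, bits in [(7, 16), (7, 4), (13, 4), (70, 4)]:
    print(f"{params}B @ {bits}-bit: ~{vram_gb(params, bits):.1f} GB")

# A 7B model at 16-bit already blows past 8 GB; 4-bit quants are
# why 12-16 GB cards are the local-AI sweet spot.
```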
just don't play games that are running ai?
anything before CS:Gambling is fine, with the exception of jew fortress 2
Correction: games are so shitty these days that people are playing old stuff instead, and the data collection shows that the majority play games which ask for 2GB of VRAM at most
just use a proxy instead of running shitty local models
Correction
No you literally just said the same thing but with different words.
What am I supposed to be mad about
If you have a sound mind that hasn't fallen for the psyop to figuratively "break all your toys," then nothing. The man spoke some sense.
There's a horde of people who think the standard of gaming is 4K, 16GB VRAM minimum. This is what happens when a people/culture are raised on tech celebs like LTT that shill the latest hardware. Now we apparently need to upgrade every generation cycle to enjoy the latest product, like it's the car industry.
What's sad is that this is even worse than the car industry, btw, or rather the people/culture in that industry. What I'm getting at is that some people who spent over a grand on a GPU tend to not like hearing that some other guy can get the same experience on a $300 GPU, and they will bully those guys. When you account for those types, all this drama becomes a lot less mystifying.
Let me guess, the dude who's the head of AMD's gaming division doesn't know what he's talking about?
Hold it. He's talking about 1080p only.
You clickbaited the thread. You should get a job at Polygon... oh wait...
Those people aren't in the market for a 300 dollar card. He's just deflecting and being a disingenuous piece of shit.
This rotten industry needs to crash immediately.
maybe if devs optimized their shit but frankly UE5 and RE Engine are running roughshod on niggas.
What I'm getting at is that some people who spent over a grand on a GPU tend to not like hearing that some other guy can get the same experience on a $300 GPU
Because you're not getting the same experience. Hell, I spent $500 on a 6800 XT a year after release and I would have laughed in your face for saying this.
NJUDEA MIDJOURNEY FRAMES THREAD LMAOOOOOOOOOOOOOOOOOO
CS was outdated by the time BF2 came out and CoD4 put it in the grave.
to be fair I liked RE Engine until they decided to put in the most asinine asset streaming for 'muh open world'
RE Engine was not intended for anything with a heavy amount of logic on screen.
My GPU is almost 9 years old and has 8GB of VRAM; it's almost unthinkable they would sell brand-new cards with that much, even as budget options.
Not him but correct. Dragon's Dogma 2 is such a good example of this.
It gets more exposed every time they release a new game with a bunch of shit needing active work. They were downright insane to think it could handle massive lobbies with Wilds.
destroy demand by marking up cards 1000%
g-gamers are still playing at 1080p because they want to, not because we priced out 90% of our market!
That's a weird sentiment from an AMD guy considering they generally give you more VRAM on a card than the Nvidia equivalent.