Imagine spending $2k for a GPU, just so that bitch can't even play games at 4K 60fps three years later. Literally why would you pay for this shit? It used to be that you paid for a high-end PC and it would last you.
Also funny that the VRAM cope is showcased here: an XFX RX 9070 mogs a 3090 Ti despite having 8GB less. You can buy one new console and that one is gonna LAST. Wtf is going on with graphics cards
PC Gaming is a complete joke, the 3090 Ti, a $2k GPU from 2022, is already obsolete
Couldn't tell you, man. I play at 1080p and 4K is a meme.
yeah a new console is gonna last if you play at a low 30 fps in 720p, retard. do the same with a gpu and don't expect to play every game at 4K 60 fps
This is the only actual way to play 4k, consoles don't push these resolutions, it's more like 1440p with shit FSR2 upscale.
that's a retarded mindset
buying a card for 4k60fps does not mean it will achieve that for the next X years, it just means it's capable of doing that with the latest games at the time of release
the GTX1080Ti was a great card but by your metrics it was dogshit because 12 months after its launch it no longer did 4k60fps in new releases
4k
lol
obsolete
yea man, uh, totally haha
People are going to look back at the 2010s and 2020s and realise that chasing 4K was fucking retarded. After 8 seconds of playing, 1440p and 4K are indistinguishable. Most of the shit people are playing doesn't even run at 4K, it's upscaled sloppa.
I play at 1080p and 4K is a meme.
I play at 4k and 1080p looks horrible. Hideously unstable, low-detail, like there's heavy motion blur and depth of field everywhere all the time. Like going from 20/20 vision to having cataracts. Even upscaled 4k looks pristine compared to 1080p.
Obsolete how exactly? You do realize that if you want the fancy framegen bullshit, it's still entirely accessible to you, right? The 3090 is still a great card and is the 4th best VR/AI card on the market right now. Cope.
Anybody here selling their obsolete 3090 Ti? I'll give you $100.
I have simple buying standards
I bought my 3070 in 2020 and I can still run modern games at good settings.
24gb vram
obsolete
You could've gotten this secondhand for like 600-900usd and it would play games without issue at 1440p and higher AND also be amazing for AI.
now a 3070 with 8gb however.
lol
lmao
You can buy one new console and that one is gonna LAST
Yeah and the game will run in 900p 45 to 55 fps
got my 3090 for 500 bucks 2 years ago :P
it's clearly a 1440p card btw
Literally why would you pay for this shit?
Because to some people $2k for a GPU is not that big of a deal.
...I'm still playing everything fine on my 1060 6GB other than maybe IndiAAAna Jones and other DLSS unoptimized trash
Just... turn down a slider? Disable Gaytracing? Run at a normal resolution below 4K? You deserve it for buying a scam-resolution monitor. Most things worth playing are indies meant to run on older hardware anyway.
You're poor shut u-
I have a 5070 TI and 9800X3D being parted out right now just waiting for stock
Just... turn down a slider? Disable Gaytracing? Run at a normal below 4k?
Show me a couple of modern games you are playing at 1440p resolution, no raytracing or upscaling, with a GTX 1060. And when I say modern game I don't mean MGSV, I mean a modern game.
if you can't see the pores on her asshole is the game really worth playing?
modern gaming is dumb
Lol, I also like 4k but stop deepthroating that resolution
this right here
got it for $700 two years ago and it's been working great for games at 1440p, diffusion and Daz
3070 for 700 dollars 2y ago? Holy scammed
haha, nah bruh. Anon Babble is just full of faggots. I love my 3090, you can't have it for any amount of money.
3090
4k
more than 60fps
OP falling for all the memes - just add in HDR and maybe VR and we'll get the full set!
can't even play games at 4k 60fps
not even the 5090 can lmao
nta but
24gb
same performance as a 4070ti~4080
It's 8GB more than a 5080, anon. Also the 3090 is like more than 1000usd secondhand now, if there's even any stock.
More like 4070ti, it's far from a 4080, and that's without counting DLSS with which the 4070ti btfos the 3090
700 dollars for a 3090 is still expensive for 2 years ago though, got mine for 450 eu at the same time
The 3070 is more powerful than a PS5 Pro.
lucky, still I don't think 700 is that bad. Considering the absolute state right now.
a $2k GPU from 2022 is already obsolete
false
video games are not progressing that fast
the problem is that video games are not getting optimized anymore
Is it, given the current state? I can see some used 3090s at like 600, sometimes even 500, doesn't seem that bad
yea
false
Even the beefier 3090ti can't hit 60fps - the regular 3090 is absolutely obsolete.
It's almost like you're a stupid retarded bitch for paying 3x what someone paid for something a decade ago
8 gigs
That can barely handle 1080p now
just reduce the settings from ultra to high and use dlss4/fsr4 at 85% resolution, voila you just hit over 60fps
this super powerful gpu that should be able to run any video game cant hit 60 fps either
its not because of optimization bro
you are retarded
Unless you do the same thing as the PS5 Pro and use an upscaler.
kills your performance
nothin personnel, saar
Moore's law is dead, if you want more transistors, you pay more.
If you want a system that is 3 times more powerful than a PS5, you pay 3 times more.
However you will get 3 times more power than the PS5.
There's no such thing as "console magic" anymore.
Console magic now is running shit on low settings and upscaling with FSR2.
Works for my game.
3090 for $700
Is there a name for the law that every year reliance on Pajeet outsourcing increases and performance goes down?
Good thing you have another 7 years to upgrade
You could lower details to high from ultra which would give you 99% of the visual clarity at twice the performance but noooooooooooo every placebo slider
MUST
BE
MAXED
OUT
pajeet names
Kind of related: I work for a huge corporation and a lot of the IT-related stuff is done by international teams from all over the world. For some IT tasks we get to choose which handler will deal with our task; we see their name and their upcoming schedule. And what happened naturally over the course of a few years is that nobody wants to pick the pajeets and Filipinos, they get ignored and everyone picks the Anglo-sounding names. Because non-pajeets and non-Filipinos are usually better at their job, they're punctual, their English is much better, when you're on voice comms with them they don't sound like they're screaming from an overcrowded scam-call center, etc.
So what started happening in the last year or two is that pajeets and Filipinos are starting to use FAKE NAMES so people stop avoiding them, names they believe sound Anglo or English. The problem is that these names don't sound Anglo at all, they're very obviously a clueless foreigner's poor notion of an English name. So instead of Rajeesh Pajeeti being your handler you get Raiychurd Koovengson and shit like that, it's hilarious. Everyone at my workplace knows how to spot them even with fake names. So this might start happening in the gaming industry too: if you start seeing a bunch of names in credits that sound like fake English, know it's third-worlder mercenaries trying to hide their identity.
i shouldnt need to buy a new gpu every month because it won't run at 5 fps on lowest with ai slop smoothing
a GPU from 5 years ago should run a game from today at the highest setting at at least 60 fps
keep consooming faggot
People are already doing that. Think back to how absolutely mega retarded PC gamers were in the mid-2010s: retards actually used to buy 2-4 top-end GPUs like the 980 to run quad SLI just to TRY to have even 60 FPS at 4K. Of course now the GPU market is so Jewed up that even accounting for inflation, buying FOUR top-end GPUs back then was cheaper than buying one 5090
The 3090 is absolutely not obsolete.
if you start seeing a bunch of names in credits that sound like fake English, know it's third-worlder mercenaries trying to hide their identity.
We're going full-circle to the days when Japanese devs had to use aliases
grim
Raiychurd Koovengson
holy kek
I bought a 3090ti to play at 1080p actually.
I play old games at 8k and modern games at 4k
I have a RTX 4090 and Im waiting for my new RTX 5090 to arrive.
Yes Im also ESL spic lel.
It used to be that you paid for a high-end PC and it would last you.
wut? my PC from 1999 was outdated by 2000
It's called "game companies are mostly scams to get investor money"
5090
aren't you worried about it melting?
play at 1080p
buy a new pc every 6 years for $1000~
can save up money since you're only rarely buying games
it's a simple life but a good life.
my Radeon HD7770 lasted years and years. Had it from 2012 to 2022. It even managed to run RE8
China got it right, everyone there gets an English name and a Chinese name at birth. Chinese workers are actually extremely competent though, at least if they're part of your own company. If they're part of a different company they will do everything they can to scam you, and unlike Poojeets they're smart enough to know all the legal loopholes.
Luckily when I was in Chongqing we had a Russian there, and Russians know exactly how to deal with them.
No, never had a problem with my RTX 4090 MSI Gaming X Trio OC.
Now I have upgraded to an even better model, the RTX 5090 Gigabyte Aorus Master Ice.
It's top of the line, it's almost impossible for anything to go wrong.
You now realize that the bleeding edge tech is a scam. Even the newest cards struggle to reach 60fps with the newest games at 4k.
Just ignore raytracing, realize that 4k is a meme, and be willing to turn shit off ultra for the newest AAAA trash. Boom, your card is now good for the next decade.
gigabyte
top of the line
anon, please don't. their current gen cards are fucked. Just get an FE and slap a waterblock on it.
9070xt Gigabyte Aorus making a grinding noise on fanstop / also reported on Nvidia 5000s.
Just my experience.
he bought nvidia
blames everything but himself
many such cases.
This one is a 5090 tho. I've done my research; everyone says Gigabyte has made their best work yet.
When I was playing WoW a few years ago, the Mexicans had the same problem where no one picked them for groups because they usually sucked or didn't know English. They were all on the same Latin American servers and usually had Spanish-sounding names, so it was very easy to just blacklist entire servers. Their solution was also to transfer to different servers and use more English-sounding names.
my ASSROCK 9070XT sucks
Another note is that their cards are all leaking thermal paste too. Go ahead if you trust your gut, but I find Gigabyte to be really sloppy on their quality control and ever since that PSU debacle, I stay the fuck away from them.
MSI Suprim (watercooled) is a good alternative. MSI have always been more consistent and their warranty team actually responds to you
went for Assrocked
My Powercolor Red Devil 9070xt fucking slaps.
Shills will tell you that the $2000 card you bought for your 4k monitor being able to play things at 1080p means it's "fine" and works as intended. Shills don't deserve human rights.
name three games released after 2022 worth playing
damn wtf that shit is 200$ more than what I paid. what makes it worth 200$ more
Not really, even the 4090 can't do 60fps 4k in unreal engine 5 games.
that was an exception since things stagnated in that era. It was normal before 2010 for PCs to be obsoleted every couple years
UE being a retarded meme engine is not all of PC gaming
Eat shit. 4K should be standard by now and would be if you didn't buy unoptimized dogshit
i didnt pay that much more but i would
temps way better
way quieter since the massive heatsink takes stress off the fans
gay ass hellstone rgb
idk, i just like it. no issue or fuss, it just works.
4k gaming is dumb because the world's least optimized games on ultra placebo settings hit 59fps on average
this means 4k as a resolution is worthless
average Anon Babbletard take
And that's why the 4090 is only worth 400 bucks max.
1440p is a retarded cope resolution. Doesn't scale 1080p media correctly. 4K isn't difficult at all to reach on decent hardware in engines that don't suck ass.
Nah, a more fair price would be $600-$800 range. $2000 is fucking stupid, but Nvidishits will pay what ever.
PC gaming
Gaming as a whole. Don't forget the games you have to run at 1440p on a 3090Ti are the same a PS5 has to run at 900p and Series S at 540-600p.
You can blame Nvidia for stalling GPU generations, since their money is all on AI now and they're just re-releasing the same cards with software tricks, but you can't blame them for "old" hardware like this struggling to run games that look like they could run on a PS4.
Do you think we like spending $3410?
(Actual price for a top model in Europe)
No, but there is no alternative! AMD sucks, Intel sucks, there is NOTHING.
Nvidia has us by our collective balls.
Where is the 5090 rival by AMD???
Or Intel?
I completely disagree. It's a power hog and unpleasant to be around. 400 is being generous.
Do you think we like spending $3410?
Yes? You don't need a 5090 unless your work involves GPU intensive shit.
PC gaming is an absolute joke for incels only.
No they don't. The 7900xtx is overkill and you'll never need anything more than that unless you're a faggot-ass AI user. Nvidia has tricked you into thinking you need 50 million gigashits so you can render your porn hentai games in 4k
You need it for games not to play like absolute shit at 4k.
Retarded unoptimized dogshit software is the fault of developers and publishers, not hardware makers
yeah i can run 4k on my 4060 with FG tho. so it's a you problem really
Software tricks are basically the only way to keep costs down and pretend that they're still innovating. That's why Sony is dumping money into helping AMD get FSR working better so they can use FSR upscaling as the gimmick for PS6.
a complete joke
3090 ti
obsolete
Of course.
This is good. Because of their jewishness they're slowly weaning me off video games, which was my last major expenditure aside from food.
I'm still on the gtx 1070 and when that dies I probably wouldn't be too tempted to buy an enthusiast tier gpu anymore.
Was originally waiting for an affordable card to max out CP2077 with raytracing but it never came. So I guess now I'm waiting for whichever card can max out The Witcher 4 and The Blood of Dawnwalker, but if it's over 800 USD I'll probably pass on that card and those games too.
And planes that crash are the fault of gravity, doesn't mean I'll fucking ride one knowing it'll crash. Cry about it.
The issue with crying over the hardware instead of the software is that you're not addressing the actual issue.
Think it's worth even upgrading from my 3070TI to a 5070TI or 9070XT? I'll buy whatever is available at a decent deal.
Expedition 33 ran fine on the 3070ti. I feel like I'm just fomoing because GPU prices seem to only get higher.
4k has never been practical for any current gen game
i dont know why people still care about 4k gaming when you'll never see stable framerates unless MFG or AI is mixed in
1080 is still S-tier when it comes to gaming period, 2k if you want a monitor larger than 25"
4k has been and continues to be a bad investment for people who want gaming above 60 fps
Yes, it is worth it but not at MSRP. 3070ti is an aging card.
It's the only way forward when 90% of your wafer output is focused on AI chips, yeah.
Think its worth even upgrading from my 3070TI
No. It's not like you're some retard trying to play new releases on a 1060 and complaining about having to use potato settings.
1080 is still S-tier
in terms of performance sure
in terms of the image quality and the performance you get from it, it's awful
t. spic with no impulse control
We'd have 4K as standard by now if you didn't mindlessly buy whatever lazy sweatshop dogshit ports capcom or EA decides to drop in your lap.
If you want to play at a meme tier resolution you need to be ready to pay out the ass regularly, how it's always been.
console and that one is gonna last
just play games at console settings and suddenly an 8 year old graphics card you buy for $130 is all you need
You're such a retard nigger faggot
please, i'm begging you, use 1080p on a monitor bigger than 24" and you'll see it's atrocious.
playing new games
the most recent games i really liked were new order/old blood and doom 2016. fallout 4, doom eternal, alyx, and terminator resistance were alright. they're all 5-10 years old. nothing has been good for the last four years, at least for shooters. my dad who uses playstation hasn't upgraded to 5 for the same reasons. gaming is a ghost town.
Software adapts to whatever hardware is available. The only issue that needs to be addressed is hardware prices.
except most games still run like crap at 4k and team green has completely given up on rastered framerates at that resolution
So have fun playing AI-generated crap when any resolution below that can raster what is actually in the game
You can't take my 1060 from me.
YES SISTER! Games running at 45 fps 4k WITH upscaling is amazing! Truly worth every dollar
explain AI frames
Discrete GPUs are not that expensive outside of the retardedly powerful meme cards
Software adapts to whatever hardware is available
Software developers are not the tide, they can and should be bullied into not releasing unoptimized dogshit. But, more importantly, you're not even right: these games play like shit on anything less than a flagship GPU almost nobody has.
So have fun playing AI-generated crap when any resolution below that can raster what is actually in the game
my AI upscaled crap will look better than your raster, because output resolution is more relevant to image quality than input resolution
That's why Sony is dumping money into helping AMD get FSR working better so they can use FSR upscaling as the gimmick for PS6.
Sony isn't doing that to save money on hardware. It's due to the nature of consoles. They get obsolete over their lifetime, so lowering resolution is one of the few ways to recoup some performance. With good upscaling, you worry less about the output looking like absolute shit.
Nvidia on the other hand is going hard on gimmicks because they don't see gaming as a priority anymore, so games get subpar hardware with some tricks their marketing team convinced them are good replacements for generational improvements.
bigger than 24"
24" is perfect for focused gaming use.
Eat shit. 4K should be standard by now
nah.. we clearly don't have the computing power yet for mainstream 4k. Consoles can't do it without massive frame drops. PCs can't do it without thermonuclear cards pulling so much power they sometimes just melt wires and catch on fire. Everyone is trying to fall back on fake frames to make it happen.
mainstream 4k would mean even mid-level gear has no problem running it native, medium-high settings and at least 100+ frames. That just isn't happening. People like to blame the engines but even optimized stuff can still struggle. It's just not a mainstream-ready level of tech and I doubt it will be for at least 2 or even 3 more hardware gens.
Even then we need way more efficient hardware too, as you can't just keep pulling more and more power with more heat etc. Because at some point you just run into the limits of home wiring.
you're retarded
Then there we go, 1080p is fine for you. It's way too small for me though; it's perfect for boomer shooters and cod, but not for the games I play.
Wtf is going on with graphics cards
Discrete GPUs are not that expensive outside of the retardedly powerful meme cards
They are, especially when you consider that the midrange is stuck in the same performance tier it was 5 years ago. They are selling you shit that should be entry level at this point for prices that used to belong to mid-high tier hardware.
Look. I get it. I was running an RX480 until last year. I just roll my eyes when I see people complain about having to jump through hoops to get games to run on hardware weaker than current consoles. Because of course you do. Hell, it can even be fun, like solving a puzzle.
"Team green" doesn't develop videogames, retard. We can, and should, simply refuse to buy games that don't run 1080p 60 native on N060 grade cards, 4K60 on N080 tier cards, and 4K144 on N090 tier cards. And refuse to buy games that force ray tracing.
pc fags care more about graphics card discussion than actually playing games
Nvidia should be bullied into not releasing shit cards.
Eat shit. 4K should be standard by now and would be if you didn't buy unoptimized dogshit
Voting with the wallet is dead and buried and not just for games.
Companies cater to investors, not you, your purpose is just to generate good news that bump up the stock, which is why they freak out when Anon Babble starts to attack games and generate "bad buzz".
There is nothing you can do to directly influence them by buying or not buying.
Enjoy concord 3,4,5,6, X, X2, X3, X4, Zero Legends and star force zerker vs ninja
It's not even just shooters. I tried playing RTS on a bigger screen and it's like the UI is coming to my face and I need to move my eyes much more to be able to manage everything.
Even for more relaxed content like watching videos smaller screen works better for me. I can capture the whole scene in my view just like with TV because TVs are so far it becomes smaller and my eyes feel relaxed.
nvidia actually hate their gamer audience. The entire 5000s was a massive fuck you to them. They are actively trying to make you miserable.
Consolefags are constant graphical bitches too.
They are selling you shit that should be entry level at this point for prices that used to belong to mid-high tier hardware
Maybe if you're one of those coping retards who pretends a 2025 USD is the same value as a 2015 USD.
The cards are fine, they're just overpriced.
b-but the 5000 series
Only sucks because apple took all of the 3nm capacity and tsmc has a monopoly over bleeding edge nodes.
The cards are fine. A 4070 can run elden ring 4K60 no problem. It's the devs that are the problem.
Something tells me I'm going to be using my 3080 12GB for a while.
Gamers and battlestation builders are not the same.
Because that game needs RT, which new cards are better at, but the 3090Ti is gonna play raster games perfectly fine, much better than the 5060ti or the next 60xx from that tier. Screencap this if you want, anon.
elden ring as a graphical benchmark
is using the same engine from 2009
llol
mainstream 4k would mean even mid-level gear has no problem running it native, medium-high settings and at least 100+ frames.
We can easily do it if the devs aren't retarded. Just stop using meme tracing, stop using insane unoptimized assets, stop using expensive particle effects that usually just REDUCE visual clarity.
play on low settings if you only care about resolution
they aren't fine, not even a little bit. The 5070 is just a bad card; even at a time when GPUs are selling like hotcakes, you will find fucking pillars of 5070s in a Microcenter.
what set, the set which poors all lose their shit over?
the robot at the top
kek
raytracing is a dev-oriented gimmick you're only going to see more of in the future. raytracing is a big time and resource saver for them. doom dark ages was an early teaser of what you're gonna be seeing later on, except most other forced RT games will also have awful optimization
Yes. Easily. Just like they made VR viable by making the games look like 10 years behind. But most people don't care that much for 4K.
I played at 4k on my TV with DLSS ultra performance and I can barely even tell it's on unless I get close to it
the 5000 series is an anomaly because of tsmc's monopoly and apple buying out 3nm. every past generation has been fine but overpriced
4k will never be standard as ray tracing will never be easily and efficiently rendered without fake frames. With the advent of game engines such as Unreal Engine 5 and ID Tech 8 implementing ray tracing as a requirement (even if some of it is software based) will always take away performance that would have otherwise went to rendering a game at a higher resolution better. 4K will always be difficult to render without AI Upscaling, but when you add in ray tracing, and eventually "forced" ray tracing, 4k takes a back seat in priority.
Exactly, you fucking retard. Use engines that work, not 2020s unoptimized jeetcore dogshit
nvidia did this with Physx before just abandoning it altogether.
I just emulate actual games worth playing and not modern slop
pc gaming is at a bad point imo. like fuck, for 400 dollars you CAN play everything being released just fine on an xbox, without that weird nagging in the back of your head telling you that you're not running the game at true max, like playing on high vs ultra on pc is. Then the studios completely don't optimize games to run on pcs anymore, so super expensive cards just go bust for no reason. It's a frustrating time to be a pc user, minus playing pre-2020 games on steam or your one-off well-optimized AAA game. Also indies.
Games are still being released on PS4 and Unreal Engine 4. We're going to be stuck on RTX 2060/RX 6700 for a long time thanks to something called mobile gaming. Switch 2 is barely stronger than Steam Deck and it's going to be running Cyberpunk Phantom Liberty which PS4 never got. The next pokemon will sell 30+ million copies while barely evolving past Unreal Engine 4 tech.
why the fuck are you buying the latest GPUs to run elden ring at 4k60fps?
The way games are currently developed, they can't even do 1080p native. We need to stop making excuses for them.
And doom the reddit ages flopped. Which is good. We should be relentlessly shitting on lazy devs and costing publishers money until they put out products that fucking work.
Lmaoooo sub-50 average AND 38 fps 1% lows. The way it's meant to be played. Also it's doing even worse on Clair Obscur. E-waste.
not nearly the same thing. physx is a plugin gimmick for physics and interactivity that requires a lot more time and work to implement properly; raytracing is something that can drastically ease the job of lighting and environmental artists for games. once goyvidia and ayymd cram enough RT cores into their cards, you'll see every graphically intensive AAA game made with RT in mind. it's much faster and cheaper for the devs, they don't care if it runs badly for you.
nah, it's not viable. Doom dark ages flopped hard with the RT requirement. Other companies will take note of that.
as ray tracing will never be easily and efficiently rendered without fake frames.
Which is why we can and should refuse to buy games with forced ray shading. Stop buying UE5 games.
wait and see
As opposed to what? Buying a card to run monster hunter at 1080p 60 after DLSS?
The games forcing ray tracing are id tech engine not UE5, anonette...
they don't care if it runs badly for you.
They will when it sells badly and has high refund rates
I play at 1080p and 4K is a meme.
1080p looks worse, but you can suck it up for the sake of being able to run a game well. 4K isn't a meme, but it's not easy to run.
i can barely get above 60 fps on 1080 with my 3060 ti with several big titles that have come out this year
it's getting to the point where I want to invest in the 50xx, only to realize how terrible those cards are, because VRAM isn't just for 4k gamers now and 16gb is chump change for the future of gaming
developers dont want to optimize because publishers keep them on a leash of time constraints and layoffs
$2000
anon, it was like 1200 when it came out
that's nothing if you have a job
they are hoping people will use FG+upscalers to remedy poor optimization, but people are rejecting it, so it's only a matter of time before RT is abandoned if there's no money in it.
When you move you can instantly tell.
When you stand still or use photo mode you can't tell.
Look at any game with a waterfall. DLSS hates waterfalls. It's like it's kryptonite.
I personally just think that cloud gaming and console gaming will eventually take over if it continues like this. People don't want to spend 1.5k just to be told after 3 years that they have to tweak settings and constantly check out the newest fixes so they can play their games at an acceptable framerate. I truly believe that, once internet connection speeds around the world get better and better, we'll eventually just rent those cloud gaming services for like 20-30 bucks a month and get the full 4k experience. This shit is unsustainable. $300 for the entry card?
Bro just spend 750 for the 5070ti, best performance per dollar trust me
You can get a console with 24 gigs of vram next year. Why tf would anyone do this to themselves?
A game that can't run 1080p 60 on a 3060 is shovelware and you should not bother with it even if you had all the money in the world.
says "forced ray shading"
this forced RT is a ue5 thing
Is there some kind of "graphics for retards" video you idiots keep coming from? Where some basement dweller in racing chair tries to get you angry about graphics by feeding you complete nonsense that teaches you nothing?
1080p looks worse
You don't fuckin say, anon.....you don't fuckin say.
it's only a matter of time before RT is abandoned if there's no money in it.
Quite the opposite. Devs will be using RT to save money on development. Your performance being shit doesn't matter when everyone else is doing it too.
Fpbp
4K is a meme
BUT CONSOLSE DO IT
You mean fake 4K, real 4K isn't even feasible
I have a 5090 and the only game I seriously play is star citizen, at max settings and a 4k resolution it tops out at 300 frames in space.
Performance being shit matters when sales fall off
All modern games suck. at this point i'm becoming a schizo thinking chip manufacturers push for modern games to be more demanding just so they can sell more. vidya barely looks any better and the only change i notice is games on 1080p look blurry as fuck, and companies keep rereleasing/updating older games with "next gen updates" that make normal games blurry for no reason. i refuse to update my 6700XT, just going to play classics until i die
If a game can't run 1080p on a 3060ti these days it's trash and not worth playing
Maybe if YOU are. The prices didn't scale only with inflation and the performance stalled since all their top chips go to AI.
It's feasible on engines not made by and for lazy retards
Look at any game with a waterfall. DLSS hates waterfalls.
not really
DLSS is decent with waterfalls, it's fsr2 that hates them at least in Horizon Forbidden West: youtu.be
That's because you have a 4k monitor retard.
PC monitors always look bad /blurry if they are not running at their native resolution, same reason playstation 2 (480p) looks terrible on a 1080p PC display.
Increases in performance stopped being exponential because of the fucking laws of physics.
Kek, I play with GeForce Now at 4k 60 fps. That's 20 bucks a month. The funniest shit is that people like to say it looks grainy; at 4k with AV1 this shit looks perfect, vegetation is perfect. I even compared the frames, might have only grabbed 4 frames per minute, but it got to the point where I had to pull out a magnifying glass, and it still looked about the same.
he believes in public consumption physics
Scammed by the Jews once again
4K can do true integer upscaling of 1080p
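Quick sketch of what "true integer upscaling" actually means here (nothing from the thread, just the arithmetic): 1080p to 4K is exactly 2x on both axes, so every source pixel maps to a clean 2x2 block with no interpolation, while 1080p to 1440p is a non-integer 4/3 and has to blend.

```python
# 1080p -> 4K: scale factor is exactly 2.0 on both axes, so nearest-neighbour
# duplication is lossless. 1080p -> 1440p: 4/3 per axis, so pixels get smeared.
def scale_factor(src, dst):
    return dst[0] / src[0], dst[1] / src[1]

print(scale_factor((1920, 1080), (3840, 2160)))  # (2.0, 2.0) -> integer, clean
print(scale_factor((1920, 1080), (2560, 1440)))  # (1.333..., 1.333...) -> blurry

def integer_upscale(pixels, factor=2):
    """Each source pixel becomes a factor x factor block; values are preserved exactly."""
    out = []
    for row in pixels:
        wide = [p for p in row for _ in range(factor)]  # repeat each pixel horizontally
        out.extend([wide] * factor)                     # repeat each row vertically
    return out

# 2x2 test "image" -> 4x4 with no new values invented
print(integer_upscale([[1, 2], [3, 4]]))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```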
Well the VRAM retards come from hardwareunboxed Youtube channel because they're STILL milking the same VRAM clips for content. UE5 just gets hate because it's the only deferred rendering game engine they know.
the average pc gamer is buying a xx60 prebuilt and doesn't care about anything else as long as the game runs. the min/max autists that want to max out everything are a different breed, most people just go with whatever settings the game puts them on. look at the steam hardware surveys, it's mostly shit gpus.
store.steampowered.com
No it's not because of that. I have a 27 inch 4k monitor and a 27 inch 1080p monitor side by side. 1080p looks unstable and low-detail on both of them. And even if I only had a 4k monitor, 1080p scales perfectly to 4k so it wouldn't matter.
this thread is fake & gay
Kinda.
The push to 4K made the move to 2K feel bad (What? 2K? Are you POOOOOOR?) and the industry is not ready to adopt 4K as a standard. It is terrible for games and movies. The most popular 4K game is Fortnite because with a good GPU you can play it on a wide display and that gives you an unfair advantage over anyone else. Similarly this happens with every esports game to the point it is considered legal cheating.
The only winner in this scenario is Nvidia, as making movies or games for these resolutions is notably more expensive and laborious. If the industry had pushed 2K as the new standard it would have been more convenient; now we are pushing $2000 worth of hardware for games that are essentially the same as GEN VII because resources are allocated heavily into GRAPHICX rather than gameplay.
I want to play the latest sloppa in 4K
Your problem lel
I have a 1080 and my main game still runs butter smooth on 144 fps
1080p4Lyfe
never ever falling for the 4Gay jewish skam
Your definition of which games are worth playing is artifically limited by your hardware. You don't actually know if you do or do not have any interest in playing any recent games, because your brain preemptively discards them from consideration due to your inability to run them.
1080P is so fucking blurry
All the custom TAA methods (DLSS/XESS/FSR) break the waterfall in Horizon and Death Stranding except the default TAA. The video doesn't do a good job at all since it excludes the default method. DLSS (and competitors) can't into motion vectors unless they get the same technical artists in the room, but they never do at AAA studios as they lay them off or outsource everything.
T. Game employee
Biggest changes this month
6th
Rtx 3050
Nigger, literally how? Are people buying this just for GeForce Now? It's also crazy that people don't wait like 1 month and get themselves 5060 laptops instead of 4060 laptops. The framegen and better upscaling utilization would've been extremely valuable to them.
everyone is jumping on 1080 or 4k but forgets about 2k being the real sweet spot for gaming resolutions
You get double the pixels of 1080 so everything looks crisp AND you get good FPS on your games without AI generated BS
empty center node
I don't think you understand flow charts
I don't have to play a game to know if I'll like it or not.
as making movies or games for these resolutions is notably more expensive and laborious.
No, it fucking isn't. It would actually require less work. We don't have 4K because devs insist on using hyper-fidelity assets, high-poly models for random background shit everywhere, and obnoxious and expensive visual effects.
All the custom TAA methods (DLSS/XESS/FSR) break the waterfall in Horizon and Death Stranding except the default TAA.
nope, it looks fine with DLSS and looks fine with FSR4
i linked you the video where it looks fine with FSR4, are you just going to pretend that video is fake or something?
this nigga plays on a 27" monitor
If the dev can't even be bothered to make the game run properly, it's obviously not worth considering as something of quality.
I play 1080p it's still higher than many console games. I bought a midrange card several years ago and I don't plan on upgrading soon. I don't play games at release and I'm rarely interested in doing so. I only buy games for $10 or less and preferably under $5. I also shamefully have a huge backlog from giveaways and bundles that I got for pennies on the dollar. If I upgrade my PC all my old games will still work on it. I don't pay for an online subscription. I don't pay to play old roms. PC gaming is great.
Games still run properly, most of them do, at least on modern hardware. If you've got a GTX1080 then your definition of games that "run properly" is inevitably going to be warped.
2K is a cope resolution and 1080p media (IE, almost all video ever) looks like shit on it
45fps or even 30fps WITHOUT a single dip and upscaling? I prefer it over the 7060 or the 8060 with 100fps and constant dips.
i'm old as shit and I can't see very well so it all looks the same to me.
You can buy one new console and that one is gonna LAST.
Yep, silky smooth 30fps at 720p (upscaled to 1080p). The good stuff.
1080p
Blurry
Get better eyes
Games still run properly, most of them do, at least on modern hardware.
No, they don't. They can't even do 1080p 60 native on discrete GPUs released in the same year.
Nah, modern AA is; 1080p was fine before they started using temporal memes. Just use DLAA, it does the same thing DLSS does when you have a higher-res monitor.
you should get a better monitor. 1080p looks either blurry (AA on) or extremely pixelated and unstable (off)
But this is blurry. Look at the fucking trees.
Shouldn't have spent 2K on silicon waste, retard.
And it can still play the games, retard.
Fine, hold on to your money forever.
Literally a (you) issue, modern games look blurry because they keep throwing in shitty blurry features that tank performance and look fucking awful
Yes they do. My 4070 ti super runs nearly everything I've played so far above 1080p60fps except for games that are fundamentally broken (like ARK or MH:Wilds). Any game that isn't fundamentally broken runs above that.
2K-144hz Gods, where we at?
4k
mario if he dural
a lot of people don't know what the funny numbers mean other than higher being better. ignorance is bliss and all that, if games are running and not a complete mess then they are happy. a Anon Babble user might be waiting months for a 5070 ti to go down in price or for the 5070 super with 18gb of vram to come out, a normalfag just buys a prebuilt in their price range and enjoys playing his games months earlier because he has no idea he just bought a "bad" pc.
a lot of people seem to be confused as to why the 5060 8gb even exists. it's for that crowd that just wants to buy a cheap pc, shove a 14400f in there and they're loving it. no need for a x3d cpu.
he gayme on an LCD
You already fell for the biggest jewish skam retard.
Literally a (you) issue, modern games look blurry because they keep throwing in shitty blurry features that tank performance and look fucking awful
You're just throwing around word salads with 0 insight. Modern games look blurry precisely because of the AA I mentioned (usually TAA). TAA does not tank performance at all, and while it does look awful you'll find most new games also look even worse when you turn AA off.
Well I'm talking about the fundamentally broken games. Though I would say that any game that can't do 4K 60 on a 4070ti S is broken. Even last year that was a 7-800 dollar card.
I want a 9060xt for MSRP
i'm dreading the day where i have to build a completely new pc
still have first pc taking up space, collecting dust
too lazy to list current pc parts on ebay
he fell for the meme
AI GPUs outperform enthusiast tier cards. The current midranges are a fraction of the flagships. If you updated this chart for the 5060, it would be under 20%.
My friend has a 3090 and I have a 4080 super, the disparity in performance between our PCs is substantial. Either the 40 series was actually a large uplift in performance, or the 30 series was underpowered even for its time
6600 is nearly 400 dollars
Cor blimey! Just buy something used at that point. The coin mining days are over.
That's just low quality textures. It's a PS3 game.
This is skewed by the fact that the 4090 and 5090 are fucking insanely powerful.
I was playing at 4K120 on a 3060ti.
I'm currently playing 4K120 on a 5070ti
4K OLEDs are the only way to get rid of LCD sample-and-hold motion blur, and they're literally cheaper than most decent monitors, even cheaper than many 1440p ones.
I got my endgame gaming setup, playing games on a big screen with perfect clarity, and I couldn't be happier, while (you) deal with motion blur worse than on 50-year-old black and white TVs and you're so blind you don't even notice anything.
I have more respect for CRTfags than 1080p fags. If you're gonna be a poorfag, at least do it the right way.
The 40 series was decent
Its the 50 series that sucks
Literally a (you) issue
It's everyone's issue because it's the truth. 1080p looks bad nowadays no matter how you try to salvage it. Temporal rendering methods make 1080p a ghosty mess that loses detail whenever you move, modern graphical fidelity makes 1080p look like a jagged mess of pixelcrawling whenever you move.
You can't seriously sit here and expect me to believe that 1080p has always been blurry while throwing out word-salad terms yourself. Go look up any game from even 10 years ago and look at how sharp it is compared to modern games. Modern games look terrible because developers don't know how to properly make games anymore
forgot to say that's my current pc
modern graphical fidelity makes 1080p look like a jagged mess of pixelcrawling whenever you move
When you disable temporal rendering methods*
Yeah okay, but where the fuck does the 3050 come from? That shit is literally not worth it, you can't play new games with it. I intended to buy one just because it allows cloud G-sync, and that one was in a 450$ pc. The people buying these are getting ripped off hard.
3090 was 1.5k though
Never give scalpers money
The 4080s is about 50% stronger than the 3090. People blaming the cards for running games poorly are coping retards who refuse to accept that they are enabling lazy devs and publishers.
Bro, read. 3090 T I.
I never said 1080p has always been blurry. I said "modern games" which means deferred rendering and higher asset fidelity. Which means either you go TAA and get a blurry mess that loses detail because you don't have enough pixels for good results, or you turn it off and suffer the visual eyesores that happen because modern graphical assets mean too much pixel detail.
So? The other cards should be as well. 3090 was also insanely powerful at its time, so was the 980Ti or any other flagship. They are just labeling what should be lowend cards as midrange and still asking for more money, since retards pay for it. Want more?
1060 to 2060 = +50% cuda cores
2060 to 3060 = +86% cuda cores
3060 to 4060 = -14% cuda cores
4060 to 5060 = +25% cuda cores
But at least the 5060 got +25%, right?
2050 mobile to 3050 = +25% cuda cores
3050 to 4060 = +20% cuda cores
Oh, it's just a xx50 tier card.
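Those deltas check out if you plug in the usual CUDA core counts. Quick sketch below; the counts are from public spec sheets, not from this thread, so treat them as assumptions and double-check before quoting:

```python
# Desktop CUDA core counts per xx60 card (public spec sheets, assumed here).
cores = {
    "GTX 1060": 1280,
    "RTX 2060": 1920,
    "RTX 3060": 3584,
    "RTX 4060": 3072,
    "RTX 5060": 3840,
}

gens = list(cores.items())
for (old_name, old), (new_name, new) in zip(gens, gens[1:]):
    print(f"{old_name} -> {new_name}: {(new - old) / old * 100:+.1f}% cuda cores")

# GTX 1060 -> RTX 2060: +50.0%
# RTX 2060 -> RTX 3060: +86.7%
# RTX 3060 -> RTX 4060: -14.3%
# RTX 4060 -> RTX 5060: +25.0%
```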
All the edges are very soft because there's no AI scaling. These are both 1080p cropped at equal level. AI scaling is just superior to the old ways.
head to the american amazon and type in "3050 pc", there's a couple with 50+ sold this month and one with 100+ sold this month. i assume they're cheap ass entry pcs that people are buying because they're $500-600. it's the kind of pc that a clueless parent would buy their kid.
You mean the white halo? That's literally just a sharpening artifact, you are retarded bro.
Well the image is full of artifacts because it is low res and MSAA isn't enough to make up for it. If it supported DLSS it would look much better just saying.
3060 to 4060 = -14% cuda cores
This is the only objectionable change. All the rest is you throwing a tantrum that the cards aren't getting better for the same price as quickly as you want, which is fucking entitled and retarded and ignorant of the limitations of physics, chemistry, and engineering. Moore's law is dead, dipshit.
There was a time when 3d rendering in real time was impossible too, so I guess we'll never see that!
I play at 1080p and 4K is a meme.
here's what 1080p looks like in new games
1/3
Edges are supposed to be soft, that's the point of supersampling.
That isn't the point, the point is that 1080p is not blurry. DLAA is essentially DLSS but you don't mess with the native res, so you get all the benefits of not relying on shitty TAA implementations while still having a sharp image.
here's what 1080p looks like in new games WHEN YOU TURN POSTPROCESS ANTIALIASING OFF (for all those anons who swear they play without taa because it looks much better)
2/3
And with DLSS this doesn't even look like it's from a 1080p monitor anymore. (even though it is technically running 720p because DLSS Quality lol)
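For anyone lost on the "1080p but technically 720p" part: upscalers render internally at the output resolution times a per-mode scale factor, then reconstruct up. Minimal sketch, using the commonly cited DLSS factors (assumed, not from this thread):

```python
# Commonly cited DLSS per-axis render scales: Quality ~2/3, Balanced ~0.58,
# Performance 1/2, Ultra Performance 1/3. Treat these as assumptions.
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2, "Ultra Performance": 1 / 3}

def internal_res(out_w, out_h, mode):
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(1920, 1080, "Quality"))      # (1280, 720)  -> the "technically 720p" bit
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -> "4k dlss performance" = 1080p internal
```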
i don't play dogshit like this thougheverbeit
The entire RTX 3000 series, or Ampere, was fucking dogshit. There's a reason they immediately went back to TSMC after having used Samsung crap that entire generation. Remember that the RX 6900 XT cost $1000 and was $500 cheaper than the RTX 3090 while being as good at 1080p and 1440p and like 5% slower at 4K.
Ironically, AMD cards, if you could get them as MSRP, were actually pretty damn good that generation. The 6800 and 6900 XT are still damn good cards to this day. They're fast, they have good drivers, they're not starving for VRAM. You had to go with AMD 5 years ago.
Having said all of that, the RTX 4090 is a beast. And it will remain a beast for years to come. The RTX 4090/5090 are the only worthwhile Nvidia cards, everything below that is gimped in some way, mainly from a VRAM aspect, and they're just slow in general. All the RTX 4090 owners did a good job in buying that card.
and here's what the 4k meme looks like in comparison
bear in mind this isn't 4k at all, this is actually 1080p but upscaled to 4k which means it doesn't actually need that much more GPU power than raw 1080p
3/3
Jews. The answer is always Jews.
Edges are supposed to be soft
Eww who told you that...
engines that don't suck ass
Unfortunately, those are in short supply these days.
i'm not buying a 5060 jensen
"ESL spic lel"
HOW CAN (You) AFFORD ALL THAT?? HOOK A FRIEND UP WITH YOUR CARTEL MAN
what game? i want to test this myself
Best goy ITT. The cards are still getting more powerful, just the midrange isn't, simple as that. And you are still paying absurd prices for them. Just compare the flagships.
And Moore's law was always a meme because it doesn't account for anything other than dies getting shrunk. Not the point here anyway; the point is that the midrange used to be ~50-60% of the flagship, but marketing teams have brainwashed retards like you into thinking paying $500+ for a card that can barely do 1080p and performs the same as a card from the same tier from 5 years ago is a good thing actually.
I can't afford $2k/3 dollars on GPUs per year
One upscaled from 250p, the other with any form of AA turned off lmao.
Why not show 1080p with DLAA? You showcase both low-res upscaling (250p or something to 1080p) AND 1080p with no AA. Why not simply 1080p with DLAA?
Annoying Corposlop Gameshow FPS with Destructible Environments
the Finals in case it's not obvious
the first two webms are 1080p with 0 upscaling, integer scaled 4x to be the same resolution as the 4k webm
i cant fucking sell my 3070ti
but i can do a 1080p dlaa webm no worries, give me a few mins
yes that's how its always worked. xx90 owners upgrade to the new xx90 card every gen
1080p vs 4k
why not 1440p, is it really the worst of both worlds or something
FPS>extra pixels you won't see because you developed myopia from staring at a horrid refresh rate
it's the cuck middle of the road resolution just like 720p was compared to 1080p
what new GPU do I buy lads?
gimme an option with fake frames (cuz maybe I want to slop images down the line) and one without AI stuff
1440p. I honestly didn't notice much difference from Ultra settings in some games' screenshots, so at best I will be playing modern games at medium settings and only turning it up to High for bullshots to post in the screenshots thread
here's 1080p dlaa as you wished
What I find funny about 4k is that the "4k card" owners are literally pussywhipped into buying new cards every other year because of the sunk cost fallacy. 2k on 4k monitors looks WORSE than native 2k on 2k monitors. So once their "4k" card fails, they either have to constantly try and change the settings or simply buy a new one. Their choices are literally
See low textures in crispy clear 4k
Pay 2k for the newest 4k card.
Because the alternative would be
Downscale to 2k, which makes it look worse than native full HD.
And they genuinely want to convince you that buying the newest Nvidia slop is a good idea. I legit believe they're desperately trying to convince themselves.
the 3090 ti was obsolete when we learned about the 3060 ti
just the midrange isn't
They are, just slower than you want. Also the 70 series is mid range, not 60.
the point is that midrange used to be ~50-60% of the flagship
Which is a retarded and arbitrary comparison. Yes, the 4090 is bonkers. That's why it cost 2000 bucks.
for a card that can barely do 1080p
This is the fault of devs and publishers. And you refuse to hold them accountable.
2k on 4k monitors looks WORSE than native 2k on 2k monitors.
that's ok, i can just upscale 1080p to 4k and it'll look better than your native 2k lol
inb4 it won't
are you sure you wanna go there?
whatever you can afford and trust that hasn't been run into the ground
you have to be careful about gpus that were sold cheap because the chips were burnt out from crypto mining
Lower the price
No, I just use GeForce Now, which looks better than your upscaled slop kek. The whole "you sure you wanna go there" tells the story. Back to trying out more settings, pajeet.
Or, we can just not buy dogshit unoptimized games. I am not telling you to buy a 5080 ti super whatever the fuck, I am telling you to stop buying ray traced indian shovelware
if global warming was real I'd buy a 5090 to heat my house. it's a good catalyst
kinda want a brand new one.
But in case I buy a used one what are the signs I am getting scammed?
your geforce now streaming is still limited to your monitor's maximum resolution
even if you're playing 4K DLAA with it, you still get 2k on your monitor because that's your monitor's limit
and i get better image quality by upscaling 1080p to 4k than you do with 2k no matter what you do
they either have to constantly try and change the settings
game doesn't hit desired frame rate
Switch from 2160p to 1800p
game looks virtually the same
continue on with life
It's really not that difficult.
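Worked numbers on why the 2160p to 1800p drop buys real frames (assuming 16:9, so 1800p = 3200x1800): shading work scales roughly with pixel count, and pixel count scales with the square of the axis ratio.

```python
# Pixel counts at 16:9; a ~17% drop per axis is ~31% fewer pixels to shade.
full = 3840 * 2160      # 8,294,400 px
drop = 3200 * 1800      # 5,760,000 px
print(drop / full)      # ~0.694 -> each frame is ~69% of the work
print(1 - drop / full)  # ~0.306 -> ~31% saved, for a barely visible difference
```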
start learning what they look like and whether the chips are burnt
if you really want to know, you're going to have to start watching YouTube.
Full set of poorfag cope, it seems. I can see that's the source of your shitposting.
Huh? I have a 4k monitor, who are you replying to?
The switch 2 will have several 4k games, but mostly because nintendo will just use the switch 1 engine with better textures.
Maybe dial back the racism and you too might achieve financial stability, you retarded nigger.
the obvious signs are the chips
obsolete for what? the latest sloppa? shut the fuck up
there's a million games you havent played that run on hardware from 10 years ago but youre too much of a faggot to see that
go buy a console cause the endless possibilities of PCs are wasted on you
You would think people learn after falling for:
3D monitors
curved monitors
2k displays
But no, retards keep falling for silly tricks.
Anon Babble hates Gamers Nexus but that fat bastard will help you learn about GPUs
No it won't... upscaling is not 4k and I doubt it can even do 4k upscale except for the absolute pixeliest games maybe
Metroid 4 is doing actual 4k/60 (or 1080p/120).
But it's because it's a switch 1 game pretty much.
Baked lights, minimal shaders and this kind of stuff.
Epic is working with nvidia to make unreal engine 5 unoptimized so you need multi frame gen to play new games
go 4k to play new games that look blurrier than 1080p did 10 years ago
The absolute state
don't reply to that shill retard.
tfw rtx 2070
Is it time to upgrade bwos?
that's what 1080p looks like with taa.
upscaling doesn't count
If it looks identical then who cares
What are you talking about? My 3080 is still doing 4k 60fps, outside of a few outliers. MH Wilds and the Oblivion remaster are the only games where i had to lower settings and tinker to get a playable experience.
Hideously unstable, low-detail, like there's heavy motion blur and depth of field everywhere all the time.
That's an issue of modern games, not 1080p. Old games in 1080p look sharp as fuck. Modern graphics add a ton of retarded effects that just fuck up and smear the entire screen.
710675393
fuck off retard
twistedvoxel.com
And other better sources point to that.
But the monkey's paw here is that it's literally a switch 1 game but stretched up.
The best case here would be just improving the texture resolution, because texture resolution does not impact performance, just VRAM usage
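Back-of-envelope on the "texture res costs VRAM, not fps" claim. Minimal sketch assuming uncompressed RGBA8 (real games use block compression, which cuts these numbers by 4-8x):

```python
# Uncompressed RGBA8 texture: width * height * 4 bytes; a full mip chain
# adds about one third on top (1/4 + 1/16 + ... = 1/3).
def texture_vram_mib(w, h, bytes_per_px=4, mips=True):
    base = w * h * bytes_per_px
    total = base * 4 / 3 if mips else base
    return total / 2**20

print(texture_vram_mib(2048, 2048))  # ~21.3 MiB
print(texture_vram_mib(4096, 4096))  # ~85.3 MiB -> 4x the memory, same shader cost per frame
```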
This but just cloud computing
It's the only way to finally stop cheaters
Except it looks nowhere near the same unless you're blind, there's no such thing as a free lunch
Exactly what I did. Everything I want to play runs at 4K just fucking fine
why are you retards obsessed with metroid? it's not even good
my 3090 ti was $1100 in november 2022
there's no such thing as a free lunch
With dlss it looks the same. It actually looks better and has better fps than native
The magic of ai
4k is easy to run. It's just when you start slapping all the other effects on that it becomes untenable. I have no doubt mario games will be 4k/60 with baked lighting and (((old))) rendering techniques
The 3090 Ti is the new 1080 Ti.
Screencap this.
Except it looks nowhere near the same
depends on your resolution
at 1080p it clearly doesn't look the same (although some would argue it's still worth the performance gain)
at 4k there's no functional difference between 4k native and 4k Quality mode
I got mine for under msrp at around $350
4k was always a pipe dream resolution only good for movies on 80+ inch TVs. You fell for the memes
It's not about metroid, it's about choices.
Nintendo made the choice of focusing on 4k over gayass modern rendering techniques in metroid.
I'm gaming at 4k on my cheap 4070ti super no problem
You're insane or a marketer
If you ignore all the disocclusion artifacts and other problems. Listen, i use dlss when available, but saying it's better than native is horse shit.
if posting webms of gameplay is insanity or marketing then sure
I can't deny that I fucking hate Nintendo.. but you're still a boner
only compared to TAA smearing
aren't we mp4 yet or is that just
I'm sure the switch 2 will be a battleground between 4k "old techniques" and 380p stretched to 1080p with new slop
if my choice is between dlss shit or taa smearing shit or no antialiasing (see: ) or msaa/smaa/fxaa (they don't actually work) i'll choose the aislop
I just want to run GTA 6 without paying so much what is the best card for me?
nu games are just optimized badly
threat interactive explained all this
Nintendo always has a good console every other gen.. I don't trust the all-digital go-fuck-yourself switch 2. it's all digital
bro in the old days you couldn't even start some games with a video card from 2 years before it was released, if anything, that you can still use a card from 5 years ago is proof of stagnation
Unfortunately I had one
8gb vram causes heavy stuttering
I don't trust the all-digital switch 2
4K isn't a meme, but you have your heart in the right place. Resolution is just one aspect of the visual quality of a game. 1080p on a nice display still looks fine. I have a bunch of old CCFL monitors that I tinker with and honestly 1600x900/1440x900 at around 20" or less still looks good natively with modern games.
a GPU from 5 years ago should run a game from today at the highest setting at at least 60 fps
This has never been true in pc gaming, ever.
didn't read
SHOULD I upgrade my 3070ti? I only play emulated games, arpgs occasionally, rare FPS shooters like Deadlock (rip), and the most intensive work program I use is Microsoft Word.
Will I be price-cucked if I continue to wait? Is there hope left for reduced prices if I'm patient and wait this out?
As soon you add internet to the console, it is fucked.
Your physical or digital collection can be killed by a mere update, or even your entire console.
Not saying that "you should forgive the switch 2", but more like, pirate everything if you want to have anything.
Don't. 3070 Ti is basically a 5060 ti at this point. Unless you want to spend money on a 5070 ti or 5080, don't bother. With the games you play, you probably won't even notice an increase in performance.
I built a PC with a 1050ti years ago when my current PC at the time broke down and I couldn't work out the cause, just to get back online. I needed a new PC anyway.
based
read this
thank you
Only if you want to gen ur waifu taking 600 dicks up her ass. Otherwise you could just stick with ur bad gpu and play games in 800x600 at 30fps
Most of the people on Anon Babble bitching about graphics are phoneposters who don't play games. It's hard to have organic discussion here because there's a campaign to ruin games and discussion about it. If you think the discussion here (or anywhere) is organic and representative of gamer culture, you're sadly mistaken.
You have the exact same posts by the exact same bots on reddit and everywhere else. The difference in most of those places is that you can further stifle organic conversation by downdooting it.
blurry shit smeared mess
jaggy mess
anon doesn't tell us if it's no AA or AI upscaling at 4k
I'll just wait for the next supers and actually get a good card instead of dogshit 16gb, barely play modern games with that little vram.
I have a 4080S and only use it for indies, gachas and emulation. I think I'm good for the next decade.
looks better than the 4k ones
poorfag can't afford a 5090 already
Lmao'ing at you're life
I think you wasted money.
*Solves TAA*
Nothing personnel, kid.
I have a 3080 10gb and I still play most games at 1440p 120+ fps.
I'm having fun gaming.
Now that is truly a waste of money. You own a tesla too?
Will the 60 series be a true 4K60 card generation?
braaaaaapppp
NOOOOOOOOOOOOOOOO YOU CAN'T HAVE FUN AT THOSE RESOLUTIONS OR FPS
IT'S GOTTA BE THE BEEEEEEST!
I still play most games at 1440p 120+ fps.
No you dont
Based, I played E33 at great frames, high settings, and 1440p and didn't have a heart attack from it.
How are chink/gook/nip mercs compared to jeets and seaniggers? Or ones from countries you don't typically see
It's cool man just wait 20 years then you'll be able to afford a 32gb card whilst everyone else is rocking the 512gb ones.
it's obsolete
It isn't though, the past two gens sucked ass and there is no good vidya that needs anything super high spec. I have a 2070 and play everything I have at highest graphics.
b-but... muh meme tracing...
Yea because when Im 30 hours into a game, I really give a shit about slightly better shadows. Fuck off with this shit
Bharat's Law
Poo's law
sorry about the tesla prices, you guys got RAPED badly.
blurry shit smeared mess
yes that's 1080p with taa
jaggy mess
yes that's raw 1080p
anon doesn't tell us if it's no AA or AI upscaling at 4k
read the filename, it's 4k dlss performance so 1080p ai upscaled to 4k. and dlss/fsr/xess upscaling always incorporates antialiasing automatically
looks better than the 4k ones
how?
he didn't buy the dip
No wonder ur poor lmao
literally no good options
He'd only have to play in that resolution and fps if he played dogshit (read: modern) games, althoughbeit.
he says while brainlessly maxing out yet another line of credit.
you know you gotta pay that all back...right?
I have a 5060 and I'm happy. Can literally play any game that interests me over 60 fps on 1440p. Imagine spending 800$ or more for a gpu
not necessarily
OKAY but I actually did take out some paypal credit to buy myself the ROG Astral GeForce RTX™ 5090 32GB GDDR7 OC Edition.
Realistically, what happens if I just "forget" that I did this? How badly will they destroy my credit?
Delinquency will rape your credit monthly as you keep failing to pay, until they charge off, then the debt stays (but the hits stop) for 7 years.
No one uses GPU's for games anymore. Haven't you noticed all the stable diffusion threads on every board?
Why get the newest entry level card instead of a mid range card from a previous generation for the same price or less?
makes half the fine detail fade out of existence as soon as the camera moves
just in time for the next upgrade
a perfect system
Can't get the fake frames with the old cards
also, gaming is unattractive to women
shall I tell you how much UE5 stutters?
you should just stop playing games
Now your getting it
Well heh I'll just sell it for even more after a couple months of winning the graphics battle on Anon Babble. NVIDIA GPU prices ONLY go up
Meh, I'll just stick with my 1070 and continue to play the video games where I have logged the most hours.
exhentai
terraria
women do kinda hate videogames...maybe I'll just buy a fancy watch instead..that'll get me some pussy...
If you're gonna do this, go balls deep. Debtmax then declare bankruptcy. You won't be able to finance anything for 7 years.
Real men use AI to gen nudes of girls on Facebook then blackmail them into sending thousands of dollars so that they can live a comfy neet life with a 5090ti super OC edition.
DLSS looks like shit
1080p looks like shit
1440p chads win again. That said, you can supersample at 1080p if you're not retarded.
if you watch all 3 webms you'll see that 1080p to 4k dlss looks way better
it also looks better than native 1440p
for me it's 1440p DLSS quality
Based Rich Piana enjoyer
I was playing Metro in 1440p DLSS Quality on 1440p 27" display. It looked like ass.
Then I played it on a 1080p 24" display in the same 1440p DLSS Quality resolution and it actually looked less blurry. Shit is magic.
I hate HOW SHIT shadows and lighting are these days when Doom 3 and idTech 4 alone solved the issue
3gb or 6gb?
fpbp
Yes good stuff. But WHAT is the best priced card if I want to gen AI porn
You can thank Creative for that one.