Red won

with those numbers, I don't consider anyone a winner

Can I see 1080p 60 fps

So I can't play it with the 5060 12 I just bought, brother

Now turn on DLSS and Ray Tracing

16*

I'm glad ultra settings are actually ultra again rather than being trumped up medium. You shouldn't be able to play the highest graphics settings comfortably on current hardware. The game will be around for decades.

5070 sisters...

RTX 5080 chad here

sister you cannot turn off gaytracing in this game

Playing FPS games with a framerate lower than 60 FPS is not acceptable

Ultra should murder your frame rate for a graphical improvement that cannot be perceived because uhm I'm a huge faggot that owns Nvidia shares

Kill yourself

RT is directly baked into id Tech 8
You cannot disable it

RTX 5080

Chad

Performs worse than the 9070xt which is like a midrange GPU

id7

principal code by john carmack

runs at 300fps on potatoes

id8

indian code

50fps on a 5090

it keeps happening

another AMDpoorfag validation thread

If I didn't need CUDA features I'd have got a 9070XT.
So a 4080 SUPER barely used, 30% off is good for me. (+ power efficiency)

you are brown and view GPUs the way your less nerdy kindred view sneakers
it's genetic unfortunately

The only way to get better performance than a 9070xt is to shell out 2000 bucks into Jewvidya memetech

Red won spiritually

I'm broke and get laughed at by the way I dress

we won

we won

55fps

ha ha ha

i explained that. the game will be around for decades.

if you do not have 5090 and you do not, your comment is just pathetic and sad, keep posting from 1650 lol

Retard.

throwing 10k on a PC does not make you rich, in fact, every single gypsy throws more on car while living in poverty, You are just like them

1080p

BABAHAHAHAHAHAHAH

Depends on the game. Some engines heavily favor AMD. The 5080 generally has vastly better performance.

frogposter

Yeah, I just wasted seconds of my life typing this, LMAO.

Some engines heavily favor AMD.

You mean like Dark Ages which is a Goyvidia sponsored game?

he only has one 5090

ha ha ha shut up poorfag

Make a 7090 xtx with 24gb in VRAM and I might care.

forced to use dlss

fsr performs so badly that even a 5070 gets near the 9070 xt

state

3060 can't even average 60 FPS at 1080p medium

lol
lmao, even

This unironically runs worse than UE5 slop.

doom.png - 1126x1471, 545.59K

this is what planned obsolescence looks like.
last year's hardware barely functional

1080p monitor

holy kekarooney you're retarded

I can’t max out my 1440p/165hz monitor with my 4090 anymore even if I run 1080p medium

Putler better start that nuclear war rn

IMG_8797.png - 572x744, 637.96K

Red won.

But I see more green than red??

hardware unboxed

21312152311.png - 452x268, 104.74K

9070xt and 5080 can't even get 60fps

why is pc gaming such a joke now?

jewvidia exposed for making e-waste intentionally to push people towards their overpriced xx90 cards

Gigapoors will always buy Njudea despite the fact that all cards below XX90s are scams. The worst that could happen to them, in their opinion, is being called out. Even if it costs them more and gives them less performance.

50-70 fps on $1500-$2500 GPUs

Big oof. So, this is the power of ai generated upscaling huh
Game looks like shit too, how did they manage to do this? Doom 1 and Eternal were running well and looked only slightly worse. It even runs like shit on Microsoft's own console. I still have my goypass subscription because I get it for cheap and I have a Series S and I am not even going to bother with this garbage

Now turn on DLSS

Turning on DLSS defeats the purpose of having a higher frame rate since it introduces input lag. I always try to make the frame times as even as possible and minimize the input lag with MSI Afterburner. Frame rate is not even a good indicator of how good the experience feels.

rendering lower image sizes introduces input lag

what?

You need a $3000 GPU to have a stable 60fps

The modern PC gaming is cancer.

921380128401.png - 1311x451, 142.15K

All of the graphics settings look identical in this game.

That's why I bought a PS5 Pro. 5 more years of 4k gameplay without an issue.

Native dlss does not have any input lag bro

doom has forced on raytracing

he's probably thinking of framegen

Upscaling reduces input lag by virtue of improving performance.

just spend 3k $ to play high speed fps arena shooter at 80fps

id fell off

Upscalers literally render additional fake frames in between the real ones. They don't even take into account mouse movement between a real frame and a fake frame. Therefore, even though you're getting more FPS, the game feels less responsive. You're better off just playing at your monitor's refresh rate with VSync forced on and the frame rate limited to 0.03 less, getting even frame times.
I believe DLSS 2 doesn't have this problem, since all it does is actually run at a lower resolution and try to resolve the image at a higher resolution before showing it to you. But the moment you turn on frame generation your experience will turn to shit even if the frame rate number is higher.
Don't fall for Nvidia marketing, DLSS 3/4 is trash

Yes.

Upscaling sure
Frame generation not so much

Stop using muddied language. Upscaling is upscaling, framegen is framegen. DLSS is a suite that does not necessarily entail either.
DLSS 4 upscaling is fine.

clueless AMD drone is clueless

DLSS doesn't increase latency you retarded bellend

Contrary to popular belief, DLSS introduces a good chunk of latency to the render stack. The benefits you see are only from the performance gains in terms of frame rate. If you hypothetically have the same frame rate with and without DLSS, you'll see less latency with it off.
The latency benefits of DLSS upscaling also become lower the higher your actual frame rate is due to the impact of frame time. So at some point, the game will feel more responsive without DLSS even if the frame rate is higher with upscaling enabled.

Frame gen can get fucked though lmao
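
Rough sketch of the frame-time math above, with invented numbers: the 1.5x render speedup and the 0.5 ms fixed cost of the upscale pass are assumptions for illustration, not measured figures.

def dlss_latency_gain_ms(native_fps, speedup=1.5, upscale_cost_ms=0.5):
    # gain = native frame time minus (faster internal render + fixed DLSS pass)
    native_ft = 1000.0 / native_fps
    internal_ft = native_ft / speedup
    return native_ft - (internal_ft + upscale_cost_ms)

for fps in (60, 120, 240, 480, 720):
    print(fps, round(dlss_latency_gain_ms(fps), 2))
# 60 -> +5.06 ms, 240 -> +0.89 ms, 720 -> -0.04 ms: the render-time saving
# shrinks with frame time while the fixed pass cost stays, and eventually wins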

No it doesn't you retard

all I see is that consolefagging is the only viable option to play AAA games nowadays.

Show me some proof, buddy.
You can't

consoles have the same shit...

I'm not the one making retarded claims. I'm dismissing your stupid shit. You prove it. I know it doesn't.

not in 4k you bought a 1080p card im afraid

I wonder if the console version has denuvo

3080

How? Ultra Nightmare at 4k consumes 10+gb of VRAM

what do you do with the other one? give it to your wife's boyfriend?

this better be troll

Framegen can be fine; extrapolation doesn't add latency. Interpolation, which is what DLSS-FG and AMD FMF do, holds back a minimum of 1 frame.
Intel is working on extrapolation-based FG for Arc.
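
Back-of-the-envelope version of that difference, assuming a 60 fps base and ignoring frame-pacing overhead (illustrative, not measured):

source_fps = 60
source_ft_ms = 1000.0 / source_fps   # ~16.7 ms per real frame

# interpolation (DLSS-FG / AMD FMF style): the in-between frame needs frame
# N+1 to exist first, so frame N is shown at least one source frame late
interp_added_latency_ms = source_ft_ms

# extrapolation: predicts the next frame from history and holds nothing back,
# trading latency for prediction error/artifacts instead
extrap_added_latency_ms = 0.0

print(round(interp_added_latency_ms, 1), extrap_added_latency_ms)  # 16.7 0.0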

These developers know most people are going to have 9th generation consoles and mid-range PCs built in the last 5 years. So why did they create a product that was not going to run well for the majority of users today?

I get your mother to pose with it as she sucks me off

he has a 540hz monitor you retards that's completely valid name one other card that can fill 540fps except the 4090?

hes legally blind console shitter who is content playing 30fps upscaled from 720p at graphical settings lower than even lowest possible settings PC can use.

he isn't a retard because he bought a retarded monitor

lol shut up

Everyone wants to be the new Crysis

extrapolation

how would that work wtf are you smoking

500hz is crt level motion clarity. we all basically have forced motion blur inside our monitors, and some people care about that stuff
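
The usual back-of-the-envelope model for that "forced motion blur", assuming a full-persistence sample-and-hold panel and an eye tracking a moving object:

def persistence_blur_px(track_speed_px_s, refresh_hz):
    # each frame is a still image the eye sweeps across for one refresh,
    # smearing it over (speed x persistence) pixels on the retina
    return track_speed_px_s / refresh_hz

for hz in (60, 144, 240, 540):
    print(hz, round(persistence_blur_px(1000, hz), 1))
# 60hz -> ~16.7 px of smear on a 1000 px/s pan, 540hz -> ~1.9 px
# CRTs flash each frame briefly (low persistence), which is why they stay sharp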

is the concession of using a TN panel for that refresh rate even worth it? there are 1440p 480hz OLED screens out there already

A 540hz monitor is unironically a waste of money. You already get basically no benefits from going from 240hz to 300hz due to diminishing returns, going up to 540hz is pointless. He is a retarded little consoomer.

I'd be more impressed if it wasn't teeny tiny 24"
I had a 24" monitor in 2008.

Where is my 600hz 43" 32768zone miniled?

Turning on DLSS defeats the purpose of having a higher frame rate since it introduces input lag.

No it doesn't, why do you fags STILL not understand how upscaling works? It's been years.

2015

Get a whole new shiny rig with a 980ti

Can literally throw anything at it, ultra settings, 60fps, native 1080p/1440p, no questions asked

Rig lasted me 8 years, was able to play Eternal and even Cyberpunk fairly well with it

2020

Get a whole new shiny rig

Your shit is utterly obsolete after 2-3 years and you'll need to run your games at 1080p with blurry ass upscaling shite (1080p native is already blurry enough)

I guess it's not good business for Nvidia and AMD if no one buys their new shit every couple of years, they must have realized that after the golden era in 2015
Rigs that last you a whole decade without upgrading are a thing of the past

the waste of money here is settling for a TN
you use your monitor all the time, that refresh rate makes regular computing much more comfortable and it doesn't obsolete itself over time like actual hardware

500HZ

Now that's a rate the eyeball literally can't comprehend. Otherwise, we wouldn't see afterimages of things like ceiling fans or your hand shaking with a loose wrist.

there needs to be a new GPU maker at this point
GAYMD sucks for AI and gayming but is good on linux
NSHITTIA sucks overall with the latest generation but is still the best there is for AI and gayming but it sucks on linux
Intel is just intel, they just suck

get bought by microshit

EVERYTHING turns to trash

and water is wet

youtube.com/watch?v=osLDDl3HLQQ

It does. DLSS is another anti-aliasing post-processing effect added on top of the render queue. If the benefits of higher frame rate don't surpass the detrimental effect of having to process more shit, it makes your latency worse.
It's just like any application of TAA if you don't factor in frame rate gains. If you're hard CPU limited, it'll only harm you. Even if the impact in your current testing scenario is small, it's still there.
If you want true latency reduction without overhead, you can render the game natively at a lower resolution. It'll look like shit, but it's pure gain.
You could go try it out yourself with a frame rate limiter active, or activate DLAA to see exactly what the performance impact is without upscaling.

DLSS is cool though, don't be a zealot about it; it's not a flawless god technology with no drawbacks.
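
A minimal sketch of the CPU-limited point, assuming the simplest possible pipeline model where the slower of the CPU and GPU stages sets the frame time (numbers invented):

def displayed_frame_time_ms(cpu_ms, gpu_ms):
    # the pipeline runs at the pace of its slowest stage
    return max(cpu_ms, gpu_ms)

# GPU-bound: upscaling cuts GPU work, so frame time actually drops
print(displayed_frame_time_ms(6.0, 16.0))  # native:   16.0 ms
print(displayed_frame_time_ms(6.0, 11.0))  # upscaled: 11.0 ms

# CPU-bound: the GPU was already waiting, upscaling gains nothing
print(displayed_frame_time_ms(12.0, 8.0))  # native:   12.0 ms
print(displayed_frame_time_ms(12.0, 6.5))  # upscaled: still 12.0 ms, pure overhead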

file.png - 684x448, 80.25K

4k is viable with modern hardwa--ACK

Enjoy your melted cable lol

Refresh rate is pointless if the response time is dogshit. Only OLED can even hope to approximate CRT motion clarity. You also don't have to worry about losing even more contrast by using black frame insertion.

it's higher

posts vid to prove it's higher

vid says it's lower

like I said. you're retarded

settling

That's a word dumb people use to justify wasting money on shit that doesn't actually make much of a difference.

You're a dumb fucking nigger.

AI not only destroys every piece of software it touches it also kills companies and hardware that push it, pottery.

that has nothing to do with that, more frames makes the motion smoother simple as that

I remember CRT well. It was shit. It warped. It had bad contrast. It had bad geometry. It had noise. It had image distortion/chromatic aberration etc
Stop romanticising dogshit old tech. You're like a hifi bellend preferring vinyl noise over CD quality

3060 already outdated

4060 already outdated

5060ti already outdated

5060 about to fall in between and be outdated

wtf, what card should someone buy to replace his ancient 1070 that isn't a complete waste of money?

image says opposite of claim

but you're dumb nigger , not me

retard

9070xt, literally

Concession accepted.

not him but aren't you confusing it with frame generation
dlss renders fewer pixels and then jannies them up so there's overall less work being done on the gpu therefore better latency while frame gen has to make schizo frames that detract from the main job of rendering the game which is an additional burden that results in fewer real frames unless the cpu was the bottleneck

Enjoy your 1600p 30 fps experience

my brother in christ we are talking about motion clarity this is the one thing crt is still the best in this isnt even up for debate

The average CRT was blurry as shit though

PS5 Pro is better than 90% of Anon Babbles "gaming" pcs

I don't get it. There's two green bars on top. How did red win?

is cheaping out on a mattress settling?
you can sleep on anything right?
quality matters and monitors aren't an inherently depreciating asset

9070XT if you think 16GB of VRAM will last with how things are going.
7900XTX otherwise.

mostly the TVs, the desktop monitors weren't as bad

do you not know what motion clarity is? 240p is blurry wow i could have never guessed

AMDpoors count 56fps as a win. If it gets higher than 56fps that's a loss, if it gets lower than 56fps that's also a loss.

No, I'm not. DLSS renders the scene at a lower resolution then upscales it using a machine learning algorithm to make it look good. That machine learning algorithm isn't free, and introduces input latency on its own. If you want pure performance, you run at a lower resolution without the algorithm, namely lower native with no DLSS.
You can try it yourself right now, go record frame rate with native then with DLAA on. DLAA is the DLSS upscaling anti-aliasing algorithm applied without upscaling.

DLSS will usually be faster than no DLSS, but will always be slower than lower native res. Frame gen will naturally always be slower than native.
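
One way to put a number on that, following the DLAA suggestion above: since DLAA is the DLSS network run at native resolution, the DLAA-vs-native frame time delta isolates the cost of the pass itself (fps values invented):

def dlss_pass_cost_ms(native_fps, dlaa_fps):
    # DLAA adds the network without changing render resolution, so the
    # frame time difference is the cost of the DLSS pass alone
    return 1000.0 / dlaa_fps - 1000.0 / native_fps

print(round(dlss_pass_cost_ms(200, 185), 2))  # 0.41 ms with these numbers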

still defending the retards 1080p 500fps monitor

is he your retarded boyfriend or something?

Ps5 pro has no games

I know what you're trying to say, but you're being an obtuse equivocating faggot.
We're talking about upscaling here. DLAA is specifically not doing that, therefore it's irrelevant to the discussion.

B-but acksually it's technically upscaling internally

Pedantry is not a sign of sophistication.

you can dumpster dive a better cpu lmao

Forced RT was mistake

DLSS increases latency

you can try it right now

even though the vid I posted said the opposite

but you have to not actually upscale with DLSS . you have to use DLAA, which isn't upscaling at all , in order for me to be right

just shut up you dumb retard

Honestly, with a high quality VRR display like my C3 that's solid.

The 4090/5090 being 4 times the cost because of AI work is insane. Yes, I'd rather have them, but paying over $1500 for even a bleeding edge GPU is insane.

I bet you can disable a few things like depth of field and annoying light effects to get it a bit better too.

Post HWINFO

yeah, running the unupscaled source resolution will be faster because upscaling gives the gpu more work, that's kinda obvious
i don't think anyone was arguing against that

5700x3d 9070xt pulse

hwinfo

run out of arguments and start calling him gay

great tactic, this totally covers up your retardation and that you have no clue what youre talking about

me and my boyfriend are gay, ok

can't be bothered
the ps5 has a worse r7 3700 anyway

So I'm just not allowed to explain how shit works when somebody says something wrong? Can't allow you to have your feefees hurt regarding your favorite multibillion dollar company's killer tech?

even though the vid I posted said the opposite

It didn't, you didn't watch it and just plucked a screenshot to pretend that it corroborates your argument.

First guy said DLSS doesn't increase input lag. It's a very technical thing obviously but there's a meaningful difference between a raw benefit to input latency and the compromise you get with upscaling tech like DLSS and FSR to still have it look good on a high res monitor.

stop posting video evidence from the video I posted claiming the opposite

lol shut up retard

you win how could i ever argue against someone with this level of retardation, i kneel.

It’s fun to remember it’s cheaper to hire a bunch of jeets to spread misinformation like “post-process upscaling isn’t upscaling because upscaling isn’t post-process” and other such nonsense than it is to just make gpus that are worth buying.

You bought a 1080p 500hz monitor. You didn't have to supply any further evidence

Why don't you make your point to whoever it is you're addressing rather than some random comment?

I can live with 96 fps on 1440p max settings, I'll probably bump down the RT to get it higher. When you consider what ue5 games run at maxed out this isn't bad at all

You left out ARC shill

i haven't, i just know that there are reasons to buy one, but your poor brain can't handle this information without chimping out. if this isn't true, provide a single argument against my position.

it's lower on the chart

5080

1 fps more than 4080

imagine being the fucking cucks buying this shit

So the latest nuDoom is unoptimized slop?

speaking of monitors

amazon.co.uk/LG-UltraGear-27GX790A-Compatible-DisplayPort/dp/B0DT26YLD4

worth? i don't want to go for 4k because that halves your framerate

yes, mixed with force RT for maximum dogshit

typical ESL monkey

Anon said, in his post that has grammar errors every other word.

you must argue against me being retarded

lol no I don't. Retard

4k

Who the fuck cares?
nothing has been that interesting in resolution since the jump to 1080p from standard def.
Sure 1440p is slightly better but i honestly cannot tell the fucking difference between 1440p and 4k. I can tell the performance difference though and 4k runs like fried ass.
Whats the point other than to sell people shit they dont need?

InB4 I can totally tell the difference you must be blind or poor.

No you fucking cant dipshit shill. nobody can

you're proving my point, you can't defend your own position, now go cope somewhere else please.

nobody can

people with working eyes can

Do you even own a 4k tv or monitor? This is such a simple and obvious thing to test and see for yourself so you clearly don't

Got it in 2020, mofo. you're a shill trying to sell useless shit.
or a teenaged wannabe troll
yes i do. It makes so little difference i can't fucking tell, except it runs shittier. Run that shit @ 1440. 4k is bullshit for bragging rights. like a made in china Gucci bag or an apple logo.

i cant afford nice thing

so nice thing must be stupid

haha you bought nice thing you're so stupid

why are you guys like this?

How? The game looks the exact fucking same as doom 4

a thing literally only you stare at is for bragging rights

You know not everyone is a loser bragging on an anonymous website about their computer parts. It looks better and thats really all that needs to be said. If its not worth the performance trade off to you then don't run it

I brag on here and I'm not a loser

yes, you get better latency when you upscale a 4k game from 1080p and get like twice the fps you would usually get. you know how to get even better latency? No dlss, just play at 1080p.

if you don't upscale with the upscale tech then you get the same latency as running native

oh shit, you really are one dumb fuck

You're trying to justify wasting money on what basically amounts to a pointless feature. You're the kind of retard who would buy premium water from the Alps in a matte glass bottle because it reduces the light refraction effects in the water. The kind to buy gold coated HDMI cables despite them not being any different from regular HDMI cables.
540hz is essentially indistinguishable from 300hz

b-but muh motion clarity

Yeah, keep pretending you can tell the difference, audiophiles can hear the difference between ALAC and FLAC files

Is PC the most technically advanced platform for the most cutting edge graphics or not?

optimum disagrees and hes better at games than you

So you paid hundreds more for the mere chance of running some games better than a mid range GPU.

having a better screen is not a pointless feature
you can use it until it dies, it doesn't obsolesce
i wouldn't buy a TN panel personally in this day and age but given how long it can serve you the cost is absolutely just a drop in the bucket given how much mileage you get out of it

I have one and I can tell you it makes such minute difference that it doesn't really matter.

There is no point arguing with clueless retard who doesnt even understand why comparing 1080p with dlss to native 4k is stupid.

Studies suggest humans perceive differences up to and past 1900hz. Our eyes don't operate at fixed frame rates.

Bullshit. In modern games the difference between 4k and 1080p is like putting on glasses

DLSS upscaling increases latency

but only when it's not upscaling

did you forget your brain?

real life has way more motion blur obfuscating your perception though, you won't see anywhere near as much due to literally everything having to move to display new information
it's way easier to perceive refresh rate on a screen where it's a bunch of tiny lights changing colors

he still doesnt get it

im not the anon you were arguing with before, i just pointed out how wrong you are. "upscaling" is just a fancy word for playing at a lower resolution.

the point is DLSS , not upscaling you mouth breathing slobbering spastic

You're probably thinking of how phantom array effects (move your cursor around, see those trails?) make moving objects on computer monitors appear less blurry because the eye interprets each displayed frame as a still image.
Real life doesn't have motion blur, each person just has different rates of interpreting input.
You really do have to reach into thousands of fps before you stop seeing frames. You might not notice it, but your brain does.

mine doesn't

Yours doesn't what?

first time building a PC
thoughts on my planned build?

file.png - 730x152, 11.13K

Dark Ages is this generation's Crysis.

make it a 9800x3d and it's perfect
there's a reason everyone is jerking it off nowadays

Drop that motherboard to a B650 or B650E instead and put some more money into your CPU, get an x3d variant.
If possible do 6000mhz ram, AMD CPUs love that.

go team green trust me

if possible get the 9070xt pulse variant
biggest cooler possible and also tends to be the closest to msrp

Holy shit it's literally always been like this, if you retards insist on chasing diminishing returns and meme resolutions you're going to pay out the ass
Doom and Indy are like two of the least demanding recent AAA games.

Nvidia has better long term support, and the Transformer model actually works in every game unlike fsr4 (and it's just straight up better in every way)

team green

when their drivers are currently shitting themselves and they've released the stinkiest generation to date

Nvidia has better long term support

Lol, their cards age like milk. Nvidia doesn't do anything that doesn't immediately result in a revenue increase.

TRVST THE PLVN

he has a 540hz monitor you retards that's completely valid name one other card that can fill 540fps except the 4090?

Most cards????
At that resolution the CPU is the limiting factor for most esport titles, not the GPU. It's overkill. I'd understand a 5090 if he had one of those fancy two mode monitor swapping between 240hz 4K and 480Hz 1080p, but just a 1080p monitor then a 5090 doesn't do much.

3090 ti doesnt show on charts anymore

it's over

the 3090 is on the chart so just increase the perf of that result a little

Don't buy a 50xx series?

drivers

There were like 5/6 gens straight of amd cards being housefires with drivers that would bluescreen at least twice a week. I'm sure they'll eventually fuck things up again somehow.
Basically all of Turing is still good to go, meanwhile RDNA1 and low end RDNA2 are borderline unusable for games in Windows.

just buy old nvidia hardware out of your pathological hatred for amd

show me 1 modern game where you can get 540fps with a 2080 or 3080 at 1080p. csgo? boomer shooters? yeah right

Show me a game where playing it 1080p 500hz is a better experience to running 4k 150fps?

Piss5 pro

Worse than a 4060ti, barely better than a 4060

No thanks.

csgo? boomer shooters? yeah right

You only buy 540hz monitors if you play CS or OW2, both games which run at 100s of fps at 1080p on cards lower than a 5090. A 5090 shines at higher resolutions like 4k, the CPU will be the major bottleneck at lower resolutions.
And if you bought a 1080p high refresh rate monitor but don't play twitchy esports games (which are by design optimized), well then you're just a retard.

never tried one

just parrots soundbites from others to justify himself

3080 barely gets 300 fps with a ryzen 7 9800x3d in cs2 and ow2 at 1080p so you're making my point

Literally have one sitting in my downstairs living room, used by the kids to play nothing but Apex and Fortnite, try again sweetheart.

used by the kids to play nothing but Apex and Fortnite,

why do you hate your kids? give them PCs so that they can play shooters with proper controls

why do you hate your kids?

they're not his kids silly

lol. lies. Why would you be sitting upstairs on your PC while your "kids" are downstairs?

Just bought nvidia because of all the AMD shilling. Nobody is mentally ill enough to do it for free, there must be kikery involved.
Anyway, TPU aggregate results show that 5080 and even 5070 Ti are faster on average than 9070 XT at every resolution, even with RT-off. I can cherry pick a game where 5070 non-Ti is faster than 9070 XT, doesn't mean it's better. Single game cherrypicks are only done by paid shills or actual sub 80 IQ retards. Literal AMD itself uses aggregate scores on their marketing material and they, as biased as you can be, themselves said 9070 XT OC = 5070 Ti on average.
I think 9070 XT is a really, really good card and you don't need disingenuous homosexuals like OP to sell the card.

Unironically true, but neither are they my girlfriend's children, just our nieces and nephews.

Why waste money? besides I'm quite sure those games have native M&K support on console. I quite literally only bought a PS5 pro to make my living room TV stand look less empty.

What resolution? 9800X3D is useless on most builds that don't involve a 5090 or for some reason go for 1080p super high refresh rate.

I have a PS5 Pro downstairs which someone elses kids play on while I sit upstairs in the bedroom on Anon Babble

imagine just making stupid shit up like this

What you're asking makes no sense.

Are you stupid? Do you not have siblings or any family members? Do you let children into your bedroom?

I'm not asking anything. I'm calling you a liar.

Dark Ages barely runs at 100 FPS while being blurry and fake frames

Eternal is crystal clear at 420 FPS

What a downgrade

I have my shit in the living room. Kids have shit in their bedroom. Not the other way around. You're living with your parents.

Ok, random loser on Anon Babble I'm lying.

Yes. I know

When did I say the children live with me? I said they use my console to play Apex and Fortnite in the living room. We have no children's rooms in my home. Do your nieces and nephews not love you and avoid you at all costs? must suck to be such a freak that even children don't want to be around you.

you can thank Pajeetsoft for that

OP is a /g/ regular that spams these nonstop over there. He has admitted he doesn't even have a gaming PC because he can't afford it.
How sad is that guy, spending the entire day debating high end GPUs and making these threads while he himself is stuck on an iGPU that barely runs Minecraft.

AMD/nvidia won!

Says the guy window shopping this stuff his entire life.

I think posting these threads without posting a CPU-Z or another proof of having a decent rig should be a bannable offense. Filter out these third worlders from my board.

Holy shit it's literally always been like this

Eight years ago, you could build a complete PC for $1,200 with a GTX 1080 Ti and an i7-8700K, the best GPU and CPU at the time. There wasn’t a single game it couldn’t handle at 1080p/120fps. Today, if you want the best CPU, GPU, and 32GB of RAM, it can easily cost over $5,000, and it can barely handle games at 4K/60fps.
There is no justification here.
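
Back-of-the-envelope check on how much of that gap inflation explains, assuming roughly 30% cumulative US CPI inflation from 2017 to 2025 (an approximation, not an official figure):

build_2017 = 1200
cumulative_inflation = 1.30
adjusted = build_2017 * cumulative_inflation
print(adjusted)          # ~1560: the 2017 build in 2025 dollars
print(5000 / adjusted)   # ~3.2x real-terms increase left unexplained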

you made the right choice desu, as long as you didn't pay 1k. amd will shit the bed again sooner or later, it's inevitable

Nvidia drivers are super bad do not buy any RTX GPU. Radeon fixed their drivers at least.

5060ti 4k 42 oled
Turns on dlss
Yep its gaming time

Made to compete against the 5070

priced accordingly

Actually performs like a 5080

Is almost half the price of one

I feel like if AMD knew how weak the 5000 series cards were going to be, they would have priced the XT much higher out of corporate greediness, but instead they made the most high value card of the decade lol. I get the feeling the 9070 XT is the new 1080 ti and nvidia is about to lose a lot of its market share to its own greediness.

there is 1 game where this is true and its a driver issue sorry to crush your dreams

and nvidia is about to lose a lot of its market share to its own greediness.

over 2 decades of GPUs and you fags still dont learn
it has NEVER mattered; when AMD had better GPUs, everyone still bought Nvidia

back in the HD 5000 series

back when nshitia gave you 3.5gbs of vram vs AMDs R9 290x 8gb

back when the RX 480/580 was the most bought AMD gpu the 1060 still beat the everliving shit out of it

Why are Goyvidia tards like this?

Yes we know, Nvidia cattle is retarded

Not him (I personally disagree and think that kind of thing should be left to mods) but basically the game will continue to look good as hardware improves.

stating the truth

Why are Goyvidia tards like this?

idk maybe because i'm not a lying shill like you

kek

This is WITH ray tracing, retard.

Doom just proves that AMD can do raytracing, despite being an Nvidia sponsored title

So you own a PS5 Pro, it's in your living room, and it's not "the kids'" because they just visit you, or do they bring it with them?
Post a time stamp or you're a lying cunt

Should I buy a 9070xt now or pray for a midrange 20-24gb GPU to get released this year?

a 5070 is the best dollar per performance upgrade right now

Why don't we ever see benchmark charts for good games?

16gb of VRAM not being enough is a ways off

I just bought a 5070 12GB. Upgrading from a 3070 8GB.
I game at 1440p. I should be good for a long time, right? I only need 60fps

how and why are people infatuated with the series? they're not good. it's a fucking ADHD sweatlord braindead shooter.

12GB

I game at 1440p

I should be good for a long time, right?

know.jpg - 1280x720, 76.18K

yes

I mean I love AMD prices too, but the shit is only worth it when Vulkan is involved. also there is the fact the greens keep pushing shiny puddles in games to sell you rtx cores

16gb of VRAM is the bare minimum these days, anything lower you are getting scammed

List all these games where its bare minimum at 1440

wasn’t a single game it couldn’t handle at 1080p/120fps

Definitely not. Not to mention games at that point didn't give you the option to genuinely push the game out in a meaningful way beyond the consoles.
Play everything on console equivalent settings (so basically low) and suddenly 98% of what comes out is 1440p/120 on a xx60 card.

nvidia is about to lose a lot of its market share to its own greediness

Imagine being this new. Modern AMD is literally incapable of keeping drivers long term in just an "okay" state, the entire shtick with their cards is something that is a moderately good value from a raw performance point of view with a lacking feature set and a 50/50 shot at the card basically being abandoned after 2 years.

youtube.com/watch?v=Zlzatw1E2vQ

there are six graphic presets

ultra nightmare (highest) and low (lowest) look almost exactly the same, with the only difference being shadows are softer

Holy shit lol

My 5060ti 16gb gets a random black screen with an nvlddmkm something crash every other time i enter fullscreen in mpc

Nvidia have their own issues

Most of the settings in Eternal looked the same, too.

Generally vastly superior performance

The only game I've found where that is true is Black Myth: Wukong. Generally the 5080 only performs 2-4 frames faster than the 9070 xt

Total AMDick domination

anon has never had a tight knit family

sad

Post a time stamp or you're a lying cunt

no. I couldn't care less if you believe me or not. the real problem is that you find it unimaginable that kids would play a game console in the living room, and that the phrase "the kids" is somehow incorrect because you say so.

Does the 9070 do relatively poorly in ue5 like RDNA2/3?

the 5080 is about 20% faster than the 9070xt on average but i'd say it's not worth the markup
this generation is definitely not worth upgrading to between amd bailing on the high end and nvidia shitting the bed, but if you have to the 9070xt is definitely the best pick
if i didn't have my 5700xt die on me i'd stick out until udna

Based.

This is literally what the crysis devs did, and they were all hu-whute people. Anyone against this is brown.

Sdr

Lol RETARD. Cry with your shitty monitor someplace else please.

good games still run at 1080 / 60 on a 970gtx

Can't even reach 60 fps

4090 Chads win again

you bought a premium product with a killswitch connector
no matter what you buy you lose

It seems like the only real choices this gen are 9070xt if you want to be economical or a 5090 if you don't care at all about cost. Though the 5090 is still questionable due to hardware/software issues.

card has been working great for the last 3 years

nuh uh you lost

sure thing anon.

card that can imperceptibly self destruct due to nonexistent load balancing

don't shill for 12vhpwr, it's objectively a bad and cheap connector

dont forget to seatbelt your child

You can easily undervolt and keep performance

1 more fps over an entire gen

lmao wtf

3463677.jpg - 2601x174, 73.16K

starts pakiposting for no reason

Never reply to me again you nonwhite.

So, how is this not false advertising?

forget about the cards, who is that cutie in the backseat?

****with maximum amount of frame generation
where maximum is 2 real frames and 2 fake frames for the 4090 while it's 1 real frame and 3 fake frames for the 5070
that's how
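
The arithmetic behind that, as a sketch (the 60/30 fps base figures are invented for illustration):

def displayed_fps(real_fps, generated_per_real):
    # frame generation multiplies displayed frames, not simulated ones
    return real_fps * (1 + generated_per_real)

print(displayed_fps(60, 1))  # 4090-style 2x FG:  120 fps shown, 60 real
print(displayed_fps(30, 3))  # 5070-style 4x MFG: 120 fps shown, 30 real
# identical bars on a slide, but half the real frames and much worse latency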

some redditors girlfriend, who cares

sure
doesn't change the fact that it's spreading the same power through half as many cables that can't do load balancing because nvidia was desperate for their gay form over function aesthetic on the founders

but i'd say it's not worth the markup

"markup" is a little light of a word for an $800 difference lmao

Why are nvidia fans dyslexic? Because when they see AMD, they get MAD!

9070 XT

faster than 5080

You are lying

Low looked like shit, but it still didn't run much better.
At least the frame rate was already high regardless of settings, so it didn't matter.

red won

Green is clearly in the top AMD shill

With modern Nvidia drivers, anything is possible.

except that doom feels like garbage on anything under 120fps. For me it's an absolute failure if a game like that doesn't run at over 120fps on a midrange card. Given how HFR monitors became mainstream, it still baffles me how we still hold 60fps as some sort of target.

how long are we gonna have these threads now?

Until Jewvidia shills off themselves

Now show the price for each one

forget about the cutie, who is that in the passenger seat?

The current gen xx80 card is not even half the flagship gpu. It's the same as the xx60 card from 6 years ago, but three times more expensive

I don't think this level of performance is acceptable for any card on that list

Nobody plays games at 4k and even if they do they don't because of upscalers

sssss.jpg - 500x556, 23.65K

LOL HE DELETED IT

FAGGOT

1440

dlss performance is 1280x720

1080

dlss performance is 960x540
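
Those numbers are just the standard DLSS scale factors applied per axis; the commonly cited values are Quality 66.7%, Balanced 58%, Performance 50%, Ultra Performance 33.3%:

SCALES = {"quality": 2 / 3, "balanced": 0.58,
          "performance": 0.5, "ultra_performance": 1 / 3}

def internal_res(width, height, mode):
    # DLSS renders at the scaled resolution, then upscales to width x height
    s = SCALES[mode]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "performance"))  # (1280, 720)
print(internal_res(1920, 1080, "performance"))  # (960, 540)
print(internal_res(3840, 2160, "performance"))  # (1920, 1080)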

well, even the newest nvidia cards (except the 4090 and 5090) can't get 56fps

In the past I used to think that the hardware we had was shit and that's why games run like shit, but now I came to the realization that, despite the hardware being shit, developers must choose what performance and hardware target they have and design the game towards it. If the average gamer has a 3060 level gpu and a 1080p 165hz display, then that is what they must target. If the game doesn't run at 144+ fps on a 3060 gpu, then the developer is at fault for not targeting the game at the current dominant hardware. It doesn't matter if the 3060 is shit, it is what people have in their machines. It's like if Toyota made all Corollas have F1 suspension and then complained that the asphalt is shit.

15-20 FPS difference from top to bottom of relevant gaming cards

best gaming card still cant crack 60 FPS

Other than framegen, what is using so much goddamn VRAM?

VRAM.png - 560x570, 48.39K

Damn, look at that 5080 getting raped & gaped by the old hat 4090. Imagine buying a 5000 series card, these things are seriously the worst garbage NVIDIA released in a decade, if not two decades even.

90 class cards should beat 80 class cards for a generation or two after anon

so winning is 56fps, just like what was posted.
It's not the best. it's not the worst. It wins because it scored 56fps

It wins because it doesn't cost 2000 bucks

4k ultra

double full retard

it costs $700 and gets more fps than the $2000 rtx 5080

he's a retard who plays esport slop for teenagers
also 8 bit color lmao

9070 XT is the new 1080 ti and nvidia is about to lose a lot of its market share to its own greediness.

people are retarded and will keep buying nvidia even if they have even worse drivers now
been months and Nvidia still havent fixed their shit

so which metric are you measuring here? Is it because it's the cheaper GPU or is it the frame rate. Because there are cheaper GPUs and there are higher frame rate GPUs.

Blind retard-kun, the 4080 rapes the previous gen 3090 so badly that its 1% lows are equal to the 3090's average, so no, you are wrong, the new xx80 should not lose to the old xx90 at all. Not only that, there is literally only 1 FPS difference between the 4080 and the 5080, so you can clearly see how trash the 5000 series is and how it has made extremely little progress.

The 5000 series is shit and garbage scraped from the gutters.

file.png - 3840x2160, 1.42M

4080 would have been top tier if it had 24gb but yes 4xxx make the 5xxx looks like trash

it really feels like anything that is not 4090 or 5090 is getting the bottom of the barrel engineering

do you buy GPUs just to play a single game?

wish AMD wasn't scared of competing in the top-end market
would have given them a shot but i only upgrade every 5ish years so wasn't gonna buy a mid-range card

it definitely lowers it and you're definitely retarded

they weren't scared, they just locked themselves into a meme not worth sinking big money into with rdna in an age where nvidia developed gimmicks and where they sell the same chips to ai slop corpos while rdna is gaming only

he fell for the 1440p meme with current GPU prices

I only buy a new gpu every 5-10 years, so how does going from a 5700xt to a 5090 affect me if I got the 5090 before the price increase?
DLSS 4 gives me 1440p performance on my 4k monitor while still looking better than 1440p native kukuku
The drivers will improve, plus I use linux, so I enjoy windows users suffering as I do

they don't have something better than nvidia's xx90, and ultimately everyone rich enough to buy that will never choose AMD, because they tend to need CUDA acceleration or they just "always bought nvidia". so ultimately they did make the right choice. it's essentially peak "i want AMD to compete at the high end so i can buy nvidia anyway but for less money"

there's a 40 point delta in relative performance with its primary competition, so there's no surprise you'd see similar performance to a 5080 in titles that favor the 9070 xt. ff14 lmao runs at like half the fps of a 5070 ti or something but who cares about that

8 years ago everything cost 2-3 times less

nta but this thread mainly only caught my interest because i was wondering why the fuck anyone cares about how well this dogshit game runs
all new games are shit
i havent had any problems with 1440p

5090 is higher

red won

TAA

red won

Firefly is a whore

What is inflation you fucking retard.
Do you think prices stay the same always?

i assume the red under the green cost half as much right?
and the green under the red cost twice as much

From the official trailer, it doesn't look remotely good enough to justify those terrible framerates.
Guessing Carmack had nothing to do with this engine.

new game that isn't even out yet is very demanding at the highest possible settings

No way! I wish there was a way to get higher fps somehow on a PC

vastly better

it's like 10% faster than the 4080/5070ti/9070xt. one of the worst 80 gpus along with the 2080.

RT. Huge textures

well done

>TAA

>red won

what are you implying?

Nobody wins using TAA.

4k

Didn't read lol.

Who cares. Just disable it with console variables.

I'd rather use DLSS on a card that supports it.

last time ive checked less than 10% of steam users played in 4k so the benchmark is irrelevant

there are poor souls who bought current gen 8GB GPUs

prayge-pray.gif - 498x354, 127.49K

5070 is worse than the 4070 Super

Why even bother?

It's on by default, same with Oblivion, for example

Wait for the path tracing benchmarks, fellow chuddies
Only then will we be able to truly funpost

fellow chuddies

says the troon saying a word only troons use

That monitor is $1000 because it runs at 540hz, I don't think he wants eye candy

That's unironically the perfect monitor for competitive shooters, but I'd never buy something like that unless I were a pro making money off games.

I am thinking about getting a PC to replace my laptop.
Is 7800X3d+RTX 4060 a good choice?

I guess I can hold out for another year with my 3070
I do kinda want 9700 xt tho

Is 7800X3d+RTX 4060 a good choice?

No, it's a horrible choice. You're MASSIVELY overpaying for your CPU relative to your GPU. Get a 7800X3D if you have something at least as powerful as a 4080 or 7900XTX, otherwise you will never use that CPU to its full capacity in gaming.

Get something like a Ryzen 5 7500F for your 4060, or better yet get the 7500F and use the money you saved to get a better GPU.

I have a 10gb 3080 and it's fine for 1440p gaymen (worst case scenario is I'd have to use optimized settings/DLSS like in indiana jones)
I don't think the true rape will come until the ps6 gen hits

16gb seems like the sweet spot though

It will last you longer than the alternatives

Do take into account that you can always upgrade your GPU later and have the same rig with a good CPU in there
Think long term if that's something that matters

like the other anon said go for ryzen 7500f and 5070ti/9070xt

I actually bought one at launch, upgrading from a 1080. I've been using it to play games my 1080 couldn't really run and it runs everything great. While it can handle ray tracing without an upscaler in some games, in others it needs one, and FSR 2&3 aren't very good. So while the future of the card looks bright, that era from 2020-2024 is gonna be a scar on it unless AMD pushes for FSR4 updates

I don't regret buying it or anything though, in Leaf land the 5080 is literally double the price of the 9070 xt for basically the same performance

5700X3D

9070 XT

32GB RAM

2TB M2

I'm set for the rest of this generation. If current gen game can't run well on this, then it's not worth buying either way

I only upgraded from my 3080 10gb because I play on my LG C1 most of the time. At 1440p I had zero issues, it's not that demanding of a resolution especially using DLSS Q.

use OptiScaler to force FSR4 in non officially supported games

It's AM5, not Intel where you're stuck on the same gen CPUs. He can just chuck in a 9800X3D/10800X3D to replace the 7500f in 5 years.

Spend more money on the gpu and less on the cpu.

9070/xt + 7700x/9700x will do you fine

That's a fair point I just got my first AMD CPU myself so I'm not used to this flexibility
The only negative would be a depreciation in value for the old CPU but whatever, eh?

serious question: why the fuck can't amd implement something like this if a random on github can? it is insane how little they do software-wise with the amount of resources they have.

probably because AMD got a lot of shit for triggering anti-cheat bans with driver-level Anti-Lag+ game injection

please understand

amd lady.jpg - 2000x1333, 589.28K

AMD is a huge company, the guy on Github is a hobbyist

Yeah but the newer CPUs will also drop in price over time. I bought a 5700X3D for £150 to replace my 3600 now I'm good for another few years.

then make 40 disclaimers or something before hand

i want to punch this dyke bitch

yes exactly

Any games I can fully enjoy on my 5090?

Quake

These charts can't be real. What the fuck kind of performance is this? Seems shit in general, not just muh ngreedia and ayymdee.

Minecraft with Rethinking Voxels shader. Just don't get near lava.

It looks like RTX 5090 is already struggling? Wtf?

playing MP games

ever

brought it on themselves, filthy normies

That's not a lot of VRAM usage at all lol. Maybe it would be considered high usage in 2021. 7GB 1080p, 8GB 1440p, 10GB 2160p is pretty normal.

5600X

RX 6600

I'll stay comfy at 1080p for another generation

top of the line cards today can barely break 60fps on 4k max settings on modern games

either the cards are absolutely no different from cards 10 years ago, or gamedevs today care a whole lot less about optimization
like jfc how much longer is that going to be the standard?

Nvidia cucks are this retarded

Enjoy your shitty drivers faggot.

idTech uses Vulkan. It's not exactly a modern engine.

For low settings, it is. Good luck running it on a 2060.

when the fuck has 4k 60 fps max settings ever been possible on anything other than the highest end cards?

it's native res (no upscaling tech like DLSS or FSR) at highest settings

Game devs these days do a lot of what's called box checking. They use a lot of "tools" that do a lot of the work for them with the click of a box, and it produces an incredibly bland looking environment that is horribly optimized.

How is the 5070 such an absolute shit card?

Hot

be me

rocking a PC built during the Obama administration

GTX 750 Ti, FX-4100, 6GB RAM

case sounds like it’s summoning demons when I launch Steam

hear new open-world game dropped

looks like absolute fire on trailers, 4K god traced grass physics and NPCs with actual brain cells

“MY BODY IS READY”

download 100GB

boot it up

PC makes a sound like a whale giving birth

temps hit 110°C

fans spin so hard I achieve local airlift

get 11fps with motion blur making it look like I’m underwater

turn settings to low and get 18fps

post rant on Steam forums and my anime Discord

"how come this game doesn’t run well on my setup? it’s not even that demanding"

see social media posts of console players enjoying 60fps on PS5

rage boils

go to Reddit and post screenshot of game running at 720p with textures set to 'Clay'

title: “Proof next-gen games are scams, consoles are holding gaming back"

ignore the fact I could've afforded a new GPU this year if I hadn't spent all my money on vape juice

brag about being a “real gamer” because I don’t play on consoles

real gamer experience involves 20-minute loading screens and textures not loading in

spend 6 hours tweaking .ini files to completely remove grass and shadows

get 4 extra fps as a result

leave 1/10 steam review with a tirade about Unreal engine (even though the game isn't using it)

Why would you still be on a 2060? The game costs more than that.

Remember when Indy first came out and ESLs were on their "UE5SLOP STUTTER SHIT FART" spiels until it got disseminated through the call centers that it wasn't?

when the fuck has 4k 60 fps max settings ever been possible

The 1080ti had a good run with this until they started making games for the ps5/xbox.
Outside of that, at the time-appropriate high resolution it has always been that you either drop settings or play below 60.
For one thing, until recently resolution and framerate were the only things you could gain over console versions. Most console games now run either low or below the pc minimum except for the textures. Ultra should be considered the future-gpu setting, or the I'm-playing-on-a-1080p60-screen-with-an-xx80/xx90-card setting.
PC releases not being shackled by nearly decade old budget/mid range hardware is a good thing.

Holy fuck. There's no shot the 5090 is that dogshit compared to the 4090.

What the fuck is happening to GPUs? Look at the 4090 compared to the 3090, that's how a new flagship GPU should fucking look. That graph is actually sickening for more than just AMD being insanely uncompetitive.

3090 vs 4090 = Almost double the FPS jump (30+ fps)

4090 vs 5090 = 10~ fps difference

Why would you hold on to a GPU that works fine in 90% of games while the market is so terrible

A.I. VERY GOOD SAAR DO THE REDEEM FAKE FRAMES SAAR! UPSCALE THE 540p FOR THE GOODS PERFORMANCES BITCH LASAGNA!

What the fuck is happening to GPUs?

There were no process node improvements between the 4000 and 5000 generations.

Long story short: Nvidia and AMD don't actually make GPUs. The main engine (so to speak) of a GPU today is a chip made by a Taiwanese company called TSMC. Nvidia or AMD build their GPU around these chips. Whenever TSMC makes a big leap in chip design, such as shrinking the transistors to make each chip denser and more efficient, that's what constitutes 99% of a generational performance leap in GPUs.

The reason the 900>1000 series leap was huge is because it went from 28nm transistors to 16nm transistors. The reason the 3000>4000 series leap was good is because the transistors went from 8nm to a 5nm-class process. However, 4000>5000 had no transistor shrinkage and no new process from TSMC; 3nm capacity is still too expensive and scarce for consumer GPUs. So there was no actual process upgrade for the 5000 series, it's just Nvidia turning up the power usage, reshuffling their own ingredients a little bit, and adding gimmicks like MFG.
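
For reference, the process history being described, from memory and worth double-checking (one correction to the above: the 3000 series was fabbed at Samsung, not TSMC):

NODES = {
    "900 (Maxwell)":    "TSMC 28nm",
    "1000 (Pascal)":    "TSMC 16nm",                  # the big 900 -> 1000 shrink
    "2000 (Turing)":    "TSMC 12nm",
    "3000 (Ampere)":    "Samsung 8nm",
    "4000 (Ada)":       "TSMC 4N (5nm-class)",        # the big 3000 -> 4000 shrink
    "5000 (Blackwell)": "TSMC 4NP (still 5nm-class)", # no shrink, hence small gains
}
for gen, node in NODES.items():
    print(gen, "-", node)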

I am guessing that the game doesn't utilize all of the vram.

zoom: the dark rizzle

who cares

64fps

Guaranteed that drops to like 40 during combat. Anyway you can just turn some settings down. Plus 4k gaming is a meme

Plus 4k gaming is a meme

Why?

montreal?

exactly my point
10 years later and that's still the high bar. you'd think by now any console or any mid-level gpu card could run any game at 4k max settings 60fps, because that would be the new standard
that was exactly what everyone else thought back in 2015

4k. Is. A. Fucking. Meme.

this
i'll be a boomer in 50 years still using 1440p

Why?

So did I make a mistake buying a 5070ti? Keep in mind the 9070xt is more expensive than the 5070ti in the US

are you happy with it in whatever you play?

Diminishing returns when DLSS is a thing, and most games, especially older games and indies, have shit UI scaling at 4k and beyond unless they have dedicated modding

Yes

Dlss4 performance 4k looks better than native 4k

then no

boo.jpg - 836x1024, 82.88K

*1440p native

another game that has mysteriously just become unoptimized

youtube.com/watch?v=WBsEP3pGvzQ

You've never actually compared this yourself, you're a teenager who bought a 4k screen as his first monitor and now autistically defend it.

You could have scrolled down

stretched 720p looking better than anything

kek

You must not have access to it, you seem very upset are you a hue?

Was Carmack involved in development?

I'm kinda surprised the 7900XTX competes with the 5080 in a lot of games

It's over.. Expedition 33 keeps crashing with my 9070 XT....

could it be because I only have 8gbs of ram?

I don't play meme resolutions. I play native 1440p with TSR instead of TAA and no framegen. You play stretched 720p with 2/3 of your frames being fake and quadruple my input lag lol.

A $3000 gpu can't maintain 144fps in a modern AAA game

The absolute state

When has a GPU ever been able to do that exactly

do you have an old version of the fix
need to get newer version since the game updated

I just got it on gay pass today. What fix?

In doom (2016)

There is no input lag schizo and my upscale cums on your native like how a pajeet will cum inside you to make rent money

nevermind sorry it wont stop whatever you got going on
it's just a 'fix' for skipping intros/removing cinematic fps cap/ultrawide stuff

i just know i was getting some random crashing in cut scenes after the update and forgot about it

github.com/Lyall/ClairObscurFix

There is no input lag

KEEEEEEEEK

You're not playing 4k, shit-eating subhuman. You're playing 720 stretched wider than your whore of a mother's shithole with 20 actual fps and 40 fake ones with quadruple my lag. Only 4k is native 4k, everything else is cope, kinda like you being human lol.

No it doesn't. Looks better than 1440p native though.

4080s beating 5080

lmao what the fuck

I get over 700 fake frames in resident evil 3 @ 1440p maxed settings with my steel legend 9070XT

Only framegen (which to be fair is actual dogshit until you're starting at roughly 120fps, and at that point even lsfg and fsr are good) introduces input lag; regular dlss would actually reduce it because of the framerate increase.

4k isn't a meme resolution, it's been a modern standard for over 10 years and has become more affordable than ever. It's the next logical step up from 1080p. No one is talking about 1080p being a meme

No one plays at 4k. Protip: soaped up 1080p isn't 4k.

Based OG Crysis Dec telling it like it is

I'd love to see the dollar to fps ratios of graphs like this. i'm never going to pay 1000 bucks for a gpu. i'd rather play old shitty games

You've never used 4k have you. 4k with DLSS performance looks way better than 1440p DLAA. No reason to not be at 4k when DLSS exists.

4k with DLSS performance

literal stretched 720p

better than 1440p TSR

Kek this shit-eating subhuman is deranged.

It takes 4X the GPU processing power of 1080p. Is it 4x better for playing games? Fuck no. 1440p at high refresh rates is much better than wasting all of your GPU resources on resolution.
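
The 4x figure is straight pixel arithmetic; real scaling is usually somewhat less than linear, since geometry and CPU work don't grow with resolution:

pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k    = 3840 * 2160   # 8,294,400

print(pixels_4k / pixels_1080p)     # 4.0 -> 4k shades 4x the pixels of 1080p
print(pixels_4k / pixels_1440p)     # 2.25
print(pixels_1440p / pixels_1080p)  # ~1.78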

And here's 1440p DLAA. Same performance cost, worse image quality.

1440pdlaa.jpg - 3746x1928, 2.81M

still image

retard

it's been a modern standard for over 10 years

Maybe for movies, we still haven't even hit 1440p being a reasonable gaymes resolution for lowend/console hardware.

Looks like soapy dogshit. TSR deathmogs any AA method in UE5. You're a retarded shitskin subhuman playing in 720p, you've never even sniffed real native 4k kek.

never used 4k but thinks hes an expert

kek. anyone with a 4k and 1440p displays can compare for themselves.

It looks better than 1440p native. I guess that means you think 1440p native looks like PS2 gen?

never used 4k

Sounds like the shitskin upscaling from 720p playing at 15 real fps kek

Still images or youtube videos aren't how you compare this type of thing.
And any blur that might be visible is 95% going to be a product of the screen size or scaling
Also the second image looks better (like it matters)

It looks better than 1440p native.

Only if you're a blind shit-eating subhuman with brain damage. DLSS is a soap brewing technique that produces artifacts that don't even exist in the scene, which is why no competitive gamer EVER uses this dogshit since they never want fake pixels lol

AMD won

two fastest cards are nvidia cards

OP is a flaming homosexual as always.

b-but they cost more

Get a job, gringo.

thats crazy chat
still playing blood

Tchernobog.jpg - 600x450, 66.71K

anyone with a 4k and 1440p displays

You once again out yourself as a teenager or 3rd worlder. literally any adult with a job in the western world that's even remotely into tech/gaymes has seen the progression firsthand from 480p or lower to 4k over the last 20-something years.
4k is still a meme for televisions (for playing video games) and totally placebo for a desktop.

sounds like poorfag cope to me. bet you dont even have a hdr capable display.

HDR is a meme. High framerates are a meme. PCs are a meme. Not living with your parents is a meme.

Sent from my iPhone.®

How fucking poor are you? The difference 4K makes is massive and even normies see it.
I think you need your eyesight checked if you are serious and aren't just a coping poorfag.

Considering how big a selling point Apple makes of XDR and ProMotion, an iPhone user wouldn't dismiss HDR and HRR gaming. I think you meant a PocoPhone or another Xiaomi/Samshit budget phone.

poorfag cope

Any shitskin who needs to upscale to 4k is a poorfag, poorfag shitskin.

I upscale from your target resolution. How sad is that?

I upscale from your target resolution

I don't play at 720p, retarded shitskin.

If you play at 1440p, my original statement is correct.

doesn't even know that dlss uses different resolutions depending on your output resolution

calls others retards

shitskin already outed himself as someone who uses PERFORMANCE DLSS

gets called out as the retarded subhuman he is

now he's trying to claim he's upscaling from 1440p

Kek.

If you had a 4k display you could compare 4k DLAA, Quality, and Performance yourself. There are very minor differences; all look way better than 1440p.
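
For reference, the per-axis scale factors commonly documented for those modes; exact values can differ by game and SDK version, so treat this as approximate:

DLSS_SCALE = {
    "DLAA": 1.0,
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    # Per-axis scaling: the GPU renders at this size, DLSS reconstructs the rest
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, internal_resolution(3840, 2160, mode))
# At 4k output, Quality renders near 2560x1440, Performance near 1920x1080,
# and only Ultra Performance drops to roughly 1280x720.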

Retarded shitskin, everything has already been compared by actual people. You're not fooling anyone with your soap brewery kek:
youtube.com/watch?v=3nfEkuqNX4k&t=840s

It's all fucking dogshit with soap and artifacts everywhere. You're not fooling anyone.

Anon Babble is one person

I guess you're too poor to pay attention.
I use quality, jewnose. Why would I use ultra performance on a 4090? No game needs that shit for a decent framerate.

Why is there only 1% between RTX 4000 and 5000? Something is wrong with these benchmarks.

no one is even talking about frame generation lmfao. Let me guess: you have an AMD 6000 or 7000 series GPU and are stuck with FSR 2/3.

no one is even talking about frame generation

KEEEEEEEEEK 2/3 of your frames at """"4k"""" are fake, shit-eating shitskin. You're not fooling anyone. You're not playing in """4k""" with 60 real frames.
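
The fraction at least checks out as arithmetic: with N-times frame generation, (N-1)/N of the presented frames are generated.

for n in (2, 3, 4):
    print(f"{n}x frame gen: {(n - 1) / n:.0%} of presented frames are generated")
# 2x -> 50%, 3x -> 67% (the "2/3" above), 4x -> 75%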

Lol 3500 dollars for only 8 more frames with the 5090. Nvidia is fuckin COOKED.

You're not playing in """4k""" with 60 real frames.

True, I'm playing at 4k 120 with no frame generation.

hdr capable display.

another meme for video games and general desktop usage.

YES I LOVE TWEAKING EVERYTHING FOR HALF AN HOUR AT A TIME EVERY TIME I LAUNCH A NEW GAME/MOVIE/WHATEVER

4K makes is massive and even normies see it.

No one who's owned both a 1440p screen and a 4k screen would claim this unless they were broke enough that the money they spent made them feel like they needed to deny their remorse on Anon Babble of all places in an attempt to prevent suicide. Even the difference between 1080p and 4k at a reasonable monitor size and distance is marginal at best.
Stop coping, poorfag :)

some literally who cope schizobabble I didn't even watch

when GN already exposed the soap brewery that is DLSS

lol

Show me your resolution and framerate. I'll upscale from that and beat it.

We're not talking about counter strike here, retarded shitskin.

retarded shitskin

retarded shitskin

retarded shitskin

Is that the only thing in your ESL phrasebook, Paco?

hdr is a meme (biggest image quality increase in the past decade with zero performance cost)

4k is a meme

confuses frame gen with dlss upscaling

1080p and 4k difference is marginal

TOP KEK. stay with your shitty 1440p/1080p sdr displays then, maybe you'll catch up in 2035.

You can't even beat a static image you fucking subhuman shit-eater lol

That's blurry as fuck. Average UE slop enjoyer.

I simply call things the way they are. You're a subhuman shitskin upscaling from 720p, so you get called a subhuman shitskin, subhuman shitskin.

No one is upscaling from 720p at 4k.

Try washing the soap out of your eyes after using the DLSS soap brewery, retarded shitskin.

Except I told you I'm not. Post a screenshot of you running any game a reasonable person would have, and I'll match your input resolution and beat your framerate upscaled to 4k. You won't, because you think "your card can't do 4k240" is worse than "my card can't play games, so I'm a smart consumer."

No one is upscaling from 720p at 4k.

Yes, retarded subhuman shitskin, you are indeed no one.

retarded shitskin too poor to buy real games since he can't run them

KEEEEEEEEEEEEEK

Why is the 6800 XT so fucking low at 1440p? What kinda unoptimized shit pile is this?

no framerate

smears vaseline all over the screen

Don't care, still buying Nvidia. Nvidia keeps upgrading DLSS so the GPUs age like fine wine. Also some game devs treat AMD like a retarded stepchild and don't even bother to add FSR. Every game has DLSS.

2070S reporting in

No, you don't smear vaseline over your screen, you smear shit, retarded shitskin.

biggest image quality increase in the past decade with zero performance cost

Is this real? Can someone post a comparison?

4070ti (not a Super)

How over is it for me?

You can't honestly think that looks good. I feel like I'd need to turn the resolution scale to 150% to fix the blurriness (TAA? TSR?).

You can't view it in SDR. I could screenshot a game with HDR metadata, but you wouldn't be able to see it if your monitor doesn't do HDR.

The newer cards that have on-board hardware specifically for RT, which is what Nvidia has been doing, can do RT. This image just proves that the XTX was a beast of a card.

FSR4 is also pretty good, but old games stuck on FSR 1-3 have to be dev-updated, which sucks ass.

You can't see the difference without an HDR display. I thought it was a gimmick too until I got an LG OLED; now games without HDR look very dull (if there is no native HDR you can mod it in with RenoDX or RTX HDR).
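
This is also why HDR screenshots can't be judged on an SDR panel: HDR10 encodes brightness with the ST 2084 PQ curve up to 10,000 nits, while SDR tops out around 100-200. A minimal sketch of the PQ EOTF using its standard constants:

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    # Map a PQ-encoded value in [0, 1] to luminance in cd/m^2 (nits)
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

for s in (0.5, 0.75, 1.0):
    print(f"PQ {s}: {pq_to_nits(s):.0f} nits")
# ~92, ~983 and 10000 nits -- anything past ~100 nits just clips
# or gets tone-mapped away on an SDR display.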

The only thing you feel is my shit sliding down your throat since you can't even launch the game lol

I tried to play Indiana Jones 4k on my 4070Ti Super and ran out of VRAM. It's fucking over.

What if I'm still at 1440p?

still not displaying the framerate

did you try turning down the textures one notch

I'm not underage, so I've already played Oblivion multiple times over the years. No need for me to play Unreal Engine demakes.

Nah, I just switched to playing on my 1440p monitor and used the extra VRAM for framegen. Tbh, the framegen works really well in that game. It's the first time I've turned it on and kept it on.

files.catbox.moe/aveynf.mp4
There you go, faggot. 1440p Ultra upscaled to 4k. This is actually on the low end of the performance scale, but still better than you.

No one's going to launch the game just to make new screenshots for you. I'm locked at 60fps.

60

KEK
KEK
KEK, SEÑORA
KEK YOUR BODY LINE

Nice soap you fucking shit-eating subhuman. Absolute putrid fucking dogshit lol.

60fps

let me guess you think high refresh rates are also a meme

shitskin who can't launch the game opens his shitstained maw

Didn't read.

60fps

Welcome to 2008

I don't speak poorfag. Is this "soap" referring to the fact that I made a 25mb video for catbox? The details are maxed, the source resolution is 1440p, and I get nearly double the frames you do.
That's my video, dumbass. Also, look up "sam72c3" on Google.
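
On the 25 MB point: back-of-envelope bitrate math says heavy compression is unavoidable at that file size, so at least some of the blockiness is the encoder's fault rather than the upscaler's. Clip durations here are guesses:

def bitrate_mbps(size_mb, seconds):
    # 1 megabyte = 8 megabits
    return size_mb * 8 / seconds

for secs in (30, 60):  # hypothetical durations
    print(f"25 MB over {secs}s = {bitrate_mbps(25, secs):.1f} Mbps")
# ~6.7 and ~3.3 Mbps -- well under the ~20+ Mbps usually wanted
# for clean 1440p60 h264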

Those are real fps without any framegen; you don't have even 1/3 of that in your upscaled 720p with fake frames and quadruple my lag kek

still confuses upscaling and frame gen

not the sharpest tool in the box are ya

That's my video

KEK so you're a shit-eating poorfag running the game in unironic upscaled 720p with enough soap to fill a soap brewery LMAO

upscaling and frame gen

You're using both in your """4k""", shitskin. You're not fooling anyone.

Post your specs. You won't.

file.png - 1485x1340, 331.91K

AHAHAHA

OHNONNO

THIS IS THEIR """"4K"""" IN ACTION

OHNONONO.png - 2560x1279, 2.17M

4k

Don't care

Everyone on Anon Babble is one person

You're a few fries short of a happy meal aren't ya

upscaled 720p

So you're just outright lying. Okay. You see in the video the source is 1440p, which is the most you can run at natively. And no, reflex + upscaling counters any lag that framegen introduces, actually making it faster than native. But since you're just openly lying like browns are wont to do, I'll just stop talking to you.

Good morning sar.

Doesn't know what compression artifacts are

Not the brightest bulb on the Christmas tree are ya

intel trash poorfag

AHAHAHAHAHA

KEEEEEEEEEEK.png - 2560x1284, 1.72M

look at these video artifacts from the low bitrate h264

that's in the game

totally

Also

didn't post specs (like I said)

Poorfags like to pretend they're better off than gamers, but they're fooling nobody.

He says, while running a Ryzen 3600 and a 6700 XT

artifacts

Yeah, it's the dogshit the DLSS soap brewery shits out. Looks like a PS1 game lmao

LOOOOOOOOL.png - 2537x1278, 1.83M

Microsoft acquires id Software

their next game turns into barely playable slop

coded entirely by pajeets

Post your specs, poorfag.
13900K/14900K is the second fastest gaming CPU behind the 9800X3D if you aren't tech illiterate and can tune your memory.

I can view it on my phone or on my laptop with an HDR screen.

shitskin subhuman running intel dogshit screeching in agony as he posts literal fucking soap

I CAN'T FUCKING BREATHE AHAHAHAHAHAHAHA

LEEEEEEEL.png - 2535x1279, 2.31M

If you had a decent rig, you'd know that's not what DLSS artifacts look like.
You are a poorfag subhuman rigless cattle; you don't have the right to speak about these topics. Vacate these sacred halls immediately.

intcel shit-eater talking about "decent rigs" while pasting a whole soap brewery worth of soap

dee.jpg - 2544x4000, 381.85K

This is good ray tracing performance

liar just keeps lying

If you don't know the difference between DLSS artifacts and video artifacts, I submit you stole your Oblivion images and don't own a PC.
Meanwhile, same area, spinning and taking screenshots (so blur would show up): files.catbox.moe/v3us5f.png
Also

still hasn't posted specs

still no specs

Stay poor, rigless and angry while I play vidya on my PC you could never afford :)

I'm locked at 60fps

Your dogshit soap brewery has been utterly exposed by GN, shit-eating shitskin. Your own dogshit fucking webm exposed you even harder. No amount of screeching will change that. Snap your fucking neck.

That's roughly 3x the real fps you have, retarded shitskin, kek

still no specs

buzzwords and anger

Yep, you're happy with your lot in life.

still screeching after shitting out more soap than Hitler

lmao fucking shitskin

bro posting like an edgy 15 year old on Anon Babble in 2007

shit fuck retard dogshit shitskin fuck shit fuck

I'm happy. I said lmao, which means I'm laughing.

file.jpg - 640x640, 72.46K

Yep, I think I've mindshattered the shitskin kek

The only blood here is the one spilling from your torn shithole shitskin

the rigless are still screaming

kek.jpg - 1000x667, 74.34K

Yes, the shitskin who has Intcel e-waste instead of a CPU is still screaming lol

Damn, bro is MAD.
He's already devolved to mid-2010s YouTube comment section lingo.

Anon Babble is still one person

Imagine the smell of your village.

mindshattered so hard he's now having a schizo episode

"it wasn't me those were the voices in my head!"

lol

wants the thread to die so bad that he won't link anyone he's talking to, but has to have the last word

want to eat shit so bad he posted a soap brewery's worth of soap in his dogshit webm

lol

Intel is still better than being a rigless incel like you. Do they even have access to clean water in your village? Are you okay with the fact that your parents are siblings? Is being an incel due to you being inbred?

I'm tired of all Anon Babble discourse boiling down to accusing each other of being Indians

If you don't post specs you are automatically wrong in every argument. Hardware threads should require a rig proof to post so we'd filter out all the barely sentient neanderthals.

Intel is still better

605.png - 716x690, 475.58K

Not at all-ultra settings at 4k.

4k

No one plays at 4k.

That is with zero upscaling, you retard.
It's native 4k, all ultra settings.

TAA isn't native.

That's just the anti-aliasing, bro. The native resolution is 4k.

That's just the anti-aliasing

Yes.

it samples a different location within each frame, and it uses past frames to blend the samples together.

Makes the resolution not native. You have literal fake frames.
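
That quoted description matches how TAA is usually sketched: jitter the sample position per frame with a low-discrepancy sequence, then blend against accumulated history. A minimal illustration of the standard technique, not any specific engine's code:

def halton(index, base):
    # Low-discrepancy sequence commonly used for TAA sub-pixel jitter
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def taa_jitter(frame_idx, cycle=8):
    # Sub-pixel offset in [-0.5, 0.5), repeating every `cycle` frames
    i = (frame_idx % cycle) + 1
    return halton(i, 2) - 0.5, halton(i, 3) - 0.5

def taa_resolve(history, current, alpha=0.1):
    # Exponential blend: most of each output pixel comes from past frames,
    # which is where both the smoothing and the ghosting come from
    return (1.0 - alpha) * history + alpha * current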

So, 1440p60fag lost. Screencap this.

Faggot. I got the last word and I linked you because I'm not scared of you like you are of me.

If you want native resolution, you have to use methods like MSAA or SMAA.

The only thing getting lost here was your anal virginity, shit eating subhuman shitskin posting soap lol

I win.