You know, these AI frames aren't so bad

You know, these AI frames aren't so bad...

Miss with that shit in multi-player, fake ass hit boxes thanks to ai frames. Yes I have tried dlss 4 on a 5070 ti it looks amazing but I swear the head hit boxes are fake. Luck will provide you to win. No night nigga is safe.

I will not purchase a 5000s gpu
now fuck, Rajeet

saars... we need to thank Nvidia gods

file.jpg - 1122x698, 60K

You didn't see the frames

but I swear the head hit boxes are fake

In multiplayer games DLSS doesn't account for unseen geometry and you still don't get usable frames for hit markers, so the model appears to move at 100+ FPS even though it's actually moving in-engine at whatever the real FPS is.

This doesn't matter in single player games that are usually slower paced and more predictable.
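
Quick sketch of that point in Python (the fps numbers are the ones from the OP image, the rest is purely illustrative, not engine code): hits are only resolved on real game-state updates, so all the generated frames in between are showing positions the game never actually simulated.

REAL_FPS = 27        # real in-engine updates per second (OP image)
DISPLAYED_FPS = 245  # what frame generation puts on screen (OP image)

def frames_per_real_update(displayed_fps: float, real_fps: float) -> float:
    """How many displayed frames share a single real game-state update."""
    return displayed_fps / real_fps

n = frames_per_real_update(DISPLAYED_FPS, REAL_FPS)
print(f"~{n:.1f} displayed frames per real update")  # ~9.1
# Every one of those in-between frames is interpolated/predicted by the GPU,
# but your shot is still tested against the single real update.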

And you think Nvidia doesn't degrade older graphics cards through drivers? Since the 5000 series came out, my card, which is a 4000 series, doesn't run as well, I find.

now show this to me in motion

stop prodding the poorfags
they'll never be able to afford a whole nvidia gpu

How to AI-generate VRAM?

looks like shit

new floor reflections appear when objects move out of screen

Is this some anti-screenspace reflection tech?

PC latency

Fakest bullshit. All of the latency improvements are coming from the upscaling part of DLSS, because the "DLSS Off" benchmark is being run at a resolution that the card cannot reasonably handle, so it has extreme, unplayable latency. By using that red herring result as a baseline and then combining both the upscaling and the fake frames into one brand name of "DLSS" they can pretend the fake frames are reducing latency, when they're not. Your actual rendering latency if you were getting 245fps (real) would be around 4ms.

Then they pretend the fake frames are real frames so they can claim they're comparing "245fps" to 27fps which is not what's happening. This is extreme jewery and gaslighting
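
The frame-time arithmetic behind that claim, as a tiny sketch (the fps numbers are the ones from the OP image; the latency figure is just 1000 divided by the fps):

def frame_time_ms(fps: float) -> float:
    """Time the engine spends per frame if every frame is actually rendered."""
    return 1000.0 / fps

print(f"27 real fps  -> ~{frame_time_ms(27):.1f} ms per frame")   # ~37.0 ms
print(f"245 real fps -> ~{frame_time_ms(245):.1f} ms per frame")  # ~4.1 ms
# Generated frames don't change the ~37 ms the engine needs per real frame;
# only rendering at a lower (upscaled) resolution does.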

"245fps"

looks like 25fps

Screen tearing style artifacts in a webm

Checks out

Correct.
Problem with this marketing is that upscaling and fake frames can be alright, but they have incredible diminishing returns the wider the gap. Getting 20 extra frames to hit 144 on ultra is worth it. If you're genning up from 30 frames, it's a bad experience. Currently they try to push underpowered cards that rely on the AI shit too heavily. It's not a pleasant experience, but even for an experienced user it's hard to pinpoint what exactly feels off unless you turn everything off to compare. And you can't turn it off if you otherwise get sub-30 frames.
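
Here's that diminishing-returns point as a two-line calculation (the fps pairs are the ones from the post, nothing measured):

def generated_share(real_fps: float, output_fps: float) -> float:
    """Fraction of displayed frames that are generated rather than rendered."""
    return 1.0 - real_fps / output_fps

print(f"{generated_share(124, 144):.0%} of frames generated")  # 124 -> 144: ~14%
print(f"{generated_share(30, 144):.0%} of frames generated")   #  30 -> 144: ~79%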

Yeah, it's very impressive. Finally stopping the climb in resolution. Sony and MS made the mistake of the century switching to AMD hardware. Then again, if NVIDIA were interested in improving humanity they'd have made the tech they took credit for free. They're not good guys, but they're the better option and have done some things right, like the original Dawn demo. The people working for them, that is, not those in charge. The people at the top are the scum of the earth. Deliberately damaging products is disgusting. Inflated prices cost progress.

I only see 1 frame in that comparison, what's up with that

Yeah, they're not bad at all, it's very impressive stuff. If I had the choice to play a game at 27fps or 200+ fps I'd choose the latter. My problem comes down to the fact that they advertise it as performance, which it isn't; it's just improved visual fidelity.

of course this is still a massive upgrade from playing at a visual 27fps, but still, it's going to feel the same or worse to control.

It makes the controls of the game lag and you feel like you're playing underwater or wading through mud; basically everything controls like shit, like Red Dead Redemption 2.

improved visual fluidity*

I'm eating

You are not supposed to see what it looks like in motion. And certainly not a frame by frame comparison in motion since that will reveal too many ugly details.

DLSS 2 is acceptable in most games except fast, twitchy ones, but anything above that and you have input lag.
It's good stuff for turn-based games, but anything besides that shits itself.

Thank you for your service sars

at least read my post before assuming I'm shilling

if I had the choice to play a game at 27fps or 200+ fps

That is not the choice
And if that were the choice, then the first question you should be asking is why your $1000 latest-gen GPU only outputs 27 real frames per second

hey I'm not saying I approve of this new line of marketing, it's fucking awful, I'm just playing devil's advocate.

Indians don't have anything to do with technology. They're scammers. Nothing more.

You know, I know these frames are fake.
I know that when I use this GPU, it's just AI telling my eyes that I'm playing at 4K and averaging 120fps.
After playing at 1080p 40fps for nine years, you know what I realize?
DLSS is bliss.

Yea, DLSS 4 is fucking magic. Sadly 90% of PC users will completely miss it because figuring out how DLSS Swapper works is just too much for them.

Nice screenshot. Now start moving the POV around. That's literally where 99% of the problem lies.

Frame generation is good only if the initial framerate is already high

1080p 40fps

Hey retard have you tried turning memetracing off?
Did you know that the greatest improvement to your gaming experience that you can make is pic related?

I didn't notice that in oblivion remastered at least

I love how Cyberpunk 2077 is like the new Crysis 3. The game is so massively unoptimized for modern hardware that any time we get new hardware we need to immediately see "how it runs Cyberpunk 2077."

If you are able to play Cyberpunk 2077 in 60 FPS at a decent resolution then count yourself lucky, because apparently most people can't do that.

RTX is gorgeous when done properly. I ain't turning it off in cp77, it literally makes the game feel real in VR.

UPSCALING is magic, upscaling done well is legitimately good tech
Fake frames are not upscaling and Nvidia is attempting to conflate multiple memes into one brand name to muddy the waters deliberately and deliver bullshit to you in the form of a trojan horse

Anon, DLSS is an upscaler, not framegen...

did you just miss op's post or what

Turn it on in FFXIV because it got added in the last expansion

Input lag out the ass, everything looks like Vaseline

I'm actually fucking up inputs because of this

The best thing, in fact the ONLY thing I would use it for, is games that put graphics first, see: most walking simulators, like SCORN

Personally I find my games feel the most real when I can move my mouse and see the screen move at the same time as my mouse, instead of half an hour behind it like I'm playing underwater, and your fake frames do not deliver that

You don't even notice it's frame gen above 60 native in most games. Doom TDA is flawless.

Modern games are so bloated and unoptimized they need an AI to hallucinate frames for you

An industry built on fraud

You need an explanation for what a transformer model is? DLSS is upscaled AI frames, anon.

Nvidia is attempting to rebrand all of their shit collectively under the DLSS label so they can ride the positive reception of the upscaler to introduce other shit which shouldn't be positively received

Literally no one is talking about framegen except you two.

imagine being so poor you can't even tell the difference between dlss and fg because you can't afford either

Retard, NVIDIA is talking about framegen, while trying to pretend that they are not talking about framegen, which is why the OP image pretends "DLSS4" results in "245fps"

They don't want to say frame generation because they understand it's a dirty word and people don't want fake frames. They want to pretend they are delivering 245fps for real, which is why they're now calling their frame generation DLSS4

They're pretending they aren't hallucinating 80% of their frames

Why are you lying? The only reason DLSS4 has more frames is because it can upscale an even lower resolution without introducing artifacts.

nta but you're entirely fucking wrong, and you're falling for nvidia's shit

Look at this fucking shill right here

I accept your concession.

Pull the other one

Oh neat, nvidia does "microstutter" now too. That's innovation, frens!

Upscaling yes
Framegen, fuck no

This is a real shame, since Crysis 3 certainly looks better.

FFS those aren't "AI frames". Upscaling frees up resources and allows for higher frame rates.
Frame generation is completely separate from this.

wrong, nvidia have exclusively advertised dlss4 with framegen

Didn't know you can play as a fish

Frame generation is completely separate from this

Check this guy out, Nvidia suckered another one

It also says that DLSS4 is powered by 50 series but works even on 2070. It's just marketing talk, that's all.

I love pulling out my sword in Oblivion Remastered and it leaving afterimages along my screen.

It's just marketing talk

just like their fake marketing benchmarks

All Crysis games look like shit. Cyberpunk 2077 looks amazing. Crysis had the biggest downgrade to its presentation at release, more than The Witcher 3. It's the biggest disappointment of the '00s. Killzone 2 ended up looking great.

has more scratches on the table than the original frame

buy PC

AI frames

buy PS5 / XBOX

AI frames

buy Switch 2

AI frames

its literally impossible to avoid now

its literally impossible to avoid now

I avoid it by only playing good games (games released before 2008)

AI will save gaming

Buy AMD

Turn memetracing off

problem solved

ur VRAM sars.jpg - 2444x1439, 436.46K

I just don't play shitty games that force ai frames on pc. Current consoles don't have much of a choice in that regard.

I wish it would hurry up then because right now all it's doing is making games worse.

you're overcomplicating it. here's an easy way to understand it:
fake frames in a movie game: LE GOOD
fake frames in an action game: LE BAD

simple

launch new game

RTX: off

DLSS: off

framegen: off

motion blur: off

chromatic aberration: off

nvidia reflex: off

graphical settings: lowest

resolution: 1920x1080

ah, yes, it's gaming time.

Games are easier to make than ever

Games are becoming worse

Performance is also awful

Enabling bad practices is a bad idea.

Creepy tentaclehairsisters, I feel so RIGHT

DLSS is not AI frame generation

You should try telling Nvidia that

that screenshot is from
nvidia.com/en-us/geforce/news/dlss4-multi-frame-generation-ai-innovations/

On the GeForce RTX 5090, DLSS 4 with Multi Frame Generation multiplies performance by over 8X versus traditional brute force rendering in this Cyberpunk 2077 scene, PC latency is halved for more responsive gameplay, and image quality is further enhanced

PC latency is halved

These lying small hatted pieces of shit

lmao DF fatties in shambles

actually rendering the scene for real is being referred to as "brute force rendering"

If you tolerate this then your children will be next

it looks fucking good?

Crazy how much njeetia fanboys just eat up such blatant lies kek

DLSS refers to the entire suite of technology, but generally it means the things that all RTX cards have: DLAA, Super Resolution, and Ray Reconstruction. So the anti-aliasing technology, the resolution upscaling, and the raytracing denoising. None of that is fake frames or frame generation. This is apparent because if you use the first two, your fps goes up AND latency is lower than native without turning on any Low Latency features.

Frame Generation and Multi Frame Generation specifically are "fake frames", which DO increase latency. That latency is only decreased by low latency modes.

The issue is that the raytracing tech, which is the visually impressive thing, basically doesn't work without using some or all of the DLSS stuff, even though raytracing is the first thing that came out and they worked backwards to get it running at acceptable FPS/latency through Frame Gen and Multi Frame Gen
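
Same breakdown as plain data, so the naming stops being confusing (this is my reading of the post above, not Nvidia documentation):

DLSS_FEATURES = {
    # per the post: on all RTX cards, no generated frames, no added latency
    "DLAA": {"fake_frames": False, "adds_latency": False},
    "Super Resolution": {"fake_frames": False, "adds_latency": False},
    "Ray Reconstruction": {"fake_frames": False, "adds_latency": False},
    # per the post: the "fake frames" features, which do add latency
    "Frame Generation": {"fake_frames": True, "adds_latency": True},
    "Multi Frame Generation": {"fake_frames": True, "adds_latency": True},
}

for name, props in DLSS_FEATURES.items():
    kind = "generated" if props["fake_frames"] else "rendered"
    lat = "adds latency" if props["adds_latency"] else "no added latency"
    print(f"{name}: frames {kind}, {lat}")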

salt becomes some black substance

graphical settings: lowest

That's where you went from based to poorfag thirdie

Raceswapping gone too far?

None of that is fake frames or frame generation

ok but in that specific instance they are showing off frame generation.

see white npc in video game

quickly spin a circle

hes now a nigger

imagine being so bad at gamedev you need jeetnology to hide how retarded you are

It's depressing if you go back and read interviews or watch postmortems from GDC.

well, we decided to rewrite all this shit in asm because it would save 2 MB and give a 3% performance boost

Now they're too lazy to make proper LOD meshes.

unlike your amerimutt ass I can't stand playing below 300 fps

DLSS 4 with Multi Frame Generation

too poor to afford GPU good enough to play 1080p at 300 FPS

Like I said, poorfag thirdie merely pretending to be based

Anon Babble: AI LE BAD!!!!

also Anon Babble: WHAT YOU DONT USE AI FRAMES WHAT ARE YOU POOR LOL!!!

lower resolution

base framerate increase, decreasing latency

see? fake frames are good actually

nice try jew

that Anon Babble guy seems like a real jerk

oblivion is fucking disgusting because of fake frames

Shut up retard
The benchmark in the OP is trying to compare 1080p rendering with upscaling and 8x imaginary hallucinated frames to 4K rendering with no frame generation and pretending it's a relevant comparison
They took 3 wheels off a car so they can claim their scooter is 8x faster than a car

Sure.
If video games really were still images of dead objects, then sure.

URP

The great thing about DLSS is not running games at high FPS, but pretending online that I did

file.png - 600x821, 1.07M

It's more like they put a car engine on a bicycle with no way to utilize it, and then added car parts each iteration until it was a jalopy that can run and is technically faster than a bike, but it's not a sports car, and the sports car doesn't actually exist.

Native 4K rendering with RT does not exist on any hardware with acceptable framerates without using DLSS anti-aliasing, upscaling, or frame gen features.

dlss

showing close-up pictures where DLSS isn't even applied

fucking hell
dlss mainly works on distant geometry, it has borderline no effect on something close to you

Exactly!
After all if one posts on Anon Babble he isn't playing bideo games anyway.

It's not AI, it's just basic interpolation like TVs did for a decade (proof: FSR framegen doesn't even pretend to use AI and it works and looks the same, it's faster, and it works on every card), and it's not very good. x2 is pretty good starting at 70FPS (the processing cost brings fps back to 60, then it interpolates to 120); at x3 it falls apart and x4 is horrible.
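
The x2/x3/x4 arithmetic from that post as a sketch (the ~10 fps processing cost is the post's own assumption, not a measured number):

def framegen_output(native_fps: float, multiplier: int, fg_cost_fps: float = 10.0):
    """Return (real base fps after FG overhead, displayed fps)."""
    base = native_fps - fg_cost_fps
    return base, base * multiplier

for mult in (2, 3, 4):
    base, shown = framegen_output(70, mult)
    print(f"x{mult}: {base:.0f} real fps -> {shown:.0f} displayed fps")
# x2: 60 real -> 120 displayed (the case the post calls "pretty good")
# x3: 60 real -> 180 displayed
# x4: 60 real -> 240 displayed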

it's not ai it's just basic interpolation like TV did for a decade

god that rustles my jimmies
I turn that off on my mates' tvs if i ever get the chance
who the fuck would actually want that

AyyMD has two kinds of fake frames; one is less shit than the other.
AFMF is driver-side and it's shit, same as Lossless Scaling: simple interpolation without any input from the game needed.
FSR 3 or 4 FG need things like motion vectors to function, but have much better results because they don't fuck up the UI for the most part.

This has nothing to do with frame gen, it's just fucked up SSR that would be there with it on or off

You are adding latency and motion artifacts. Absolute frame numbers on a still image are not a good indicator; above 60fps, latency and frame time are better.

it's not ai

DLSS Multi Frame Generation

Powered by GeForce RTX 50 Series and fifth-generation Tensor Cores, new DLSS Multi Frame Generation boosts FPS by using AI to generate up to three frames per rendered frame.

using AI to generate up to three frames per rendered frame.

using AI

to

generate

up to three frames per rendered frames.

Input lag isn't so bad with dlss4...

DLSS 5 will generate your mouse movements for you :)

Everything will be ok, Eve will save me

Yes, that's what I'm talking about. I have a 5070 Ti and I literally just use FSR3 framegen over DLSS's; it's just faster, looks the same, and feels the same with injected Reflex, and it also allows vsync, which I need for reasons.
It's decent at 120fps output. Still not real performance.

Source Nvidia

Ok goyim retard, keep consooming the lies and slop
Tech illiterate retard.

Sure, if you have a high-end card.
For low end it's awful because neither image quality nor FPS is good enough.

It's awful 100% of the time, just like on TVs.

Why yes, I too fully trust the trillion-dollar company's marketing information, why wouldn't you? The 5070 is AS FAST AS THE 4090! IMPOSSIBLE WITHOUT AI!

Native 4K rendering with RT does not exist on any hardware with acceptable framerates without using DLSS anti-aliasing, upscaling, or frame gen features.

So people should stop asking for it.

This isn't about trusting Nvidia.
It's not a good or cool thing that they use or don't use AI.
This is just a semantics discussion.

FPS views are the best case scenario for framegen, try the same with a character in the middle of the screen.

AI has to generate gameplay one day

it already has but object permanence is still an issue

keep consooming the lies and slop

I have a 5070ti

produce frame

repeat same frame 10000000000000000 times

claim that nvidia and amd are dead

INTEL WHAT ARE YOU DOING

Intel dead

Many many things are "still" issues with fully AI generated gameplay.

but-but intel arch-magus can literally cast power word:death on them

why would they lie about using fake frames to hit 240fps? wouldn't it be better if the card was powerful enough to do that without the fake frames...

"AI" doesn't exist because AI stands for Artificial Intelligence and your GPU is not intelligent
Framegen is not interpolation, it's extrapolation, which is why they like to claim it doesn't add latency. Extrapolation requires more than just averaging frames together, which is why it uses "AI" (which is actually just pattern prediction) to extrapolate imaginary frames and spam them at the screen between actual game updates, so that Nvidia can claim in its marketing that it is performing 4x better than it actually is
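
Toy illustration of the interpolation vs extrapolation distinction that post is drawing, on a single pixel value (nothing like the actual DLSS pipeline, which uses motion vectors and a trained model; which of the two DLSS really does is the thread's argument, not mine):

def interpolate(prev_frame: float, next_frame: float) -> float:
    # Needs the NEXT real frame, so a real frame has to be held back -> added latency.
    return (prev_frame + next_frame) / 2.0

def extrapolate(older_frame: float, newer_frame: float) -> float:
    # Predicts forward from past frames only, so nothing is held back,
    # but the guess can be wrong until the next real frame arrives.
    return newer_frame + (newer_frame - older_frame)

print(interpolate(10.0, 20.0))  # 15.0, halfway between two real frames
print(extrapolate(10.0, 20.0))  # 30.0, a guess at a frame that was never rendered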

Feels strange, 500fps on my screen but it's fake. Life is fake too...

Yes, that's what I'm talking about. I have a 5070 Ti and I literally just use FSR3 framegen over DLSS's; it's just faster, looks the same, and feels the same with injected Reflex, and it also allows vsync, which I need for reasons.

It's decent at 120fps output. Still not real performance.

I've tried multiple fake-frame technologies and it doesn't make sense for me because I only have a 144Hz display. I get a better experience simply capping the framerate to 120 or even 90 FPS.
With 60 -> 120 FPS the input lag is still bad, because 60FPS is just shit, doubly so because modern games stutter, which is even more jarring with FG.
Also, FG has a compute cost, so if you can hold 60 without it, it won't give you double the framerate. And that's before you even mention the artifacting it causes.
If I had a high-end display like a 4K 240Hz OLED and a 5090 I would use it for sure; right now there is no point.
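
Back-of-envelope for why capping can feel better on a 144Hz display (the FG overhead figure here is an assumption for illustration): inputs are only sampled on real frames.

def input_interval_ms(real_fps: float) -> float:
    return 1000.0 / real_fps

FG_OVERHEAD_FPS = 5.0  # assumed cost of running frame generation

fg_base = 60.0 - FG_OVERHEAD_FPS  # ~55 real fps, ~110 displayed with x2 FG
capped = 90.0                     # 90 real fps, 90 displayed

print(f"60 fps + FG: new input every ~{input_interval_ms(fg_base):.1f} ms")  # ~18.2 ms
print(f"90 fps cap : new input every ~{input_interval_ms(capped):.1f} ms")   # ~11.1 ms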

It all sounds quite silly when put in layman's terms like that.

240fps

on a 120hz display

This is why parrying in E33 feels wrong and you have to hit it earlier than you would in better optimized games that were made properly.

you must have a real shitty pc

Looks like garbage. Like there are rotating headlights inside the room

You need to have your eyes checked

brute force rendering.

YOU MEAN PROPER FUCKING RENDERING!!!!
I'm sick of this Fucking trash. OPTIMIZE YOUR FUCKING GAMES.
NATIVE GRAPHICS, 60 FPS.
not games that are designed to be smeared in AI shit.
games that look good and run well at native res, 60fps NO AI SHIT.

I'm so fucking sick of people buying these trash games that aren't even made properly and force AI shit.
demand better.

What are you talking about? E33 doesn't even support framegen and if you force it through the driver that's your own fault.

Native res

Unfortunately, while nobody seems to want to admit it, 4K was a fucking meme and despite over a decade of claiming 4K performance it is not at all viable to run games at native 4K
Remember when Sony claimed the PS4(FOUR) was going to output at 4K, then claimed the PS4 Pro was going to output at 4K, then claimed the PS5 was going to output at 4K
Yea we're going to do native 4K any day now

Upscaling is good tech. Everything else being rammed under the same brand name is a meme

AAAAA THESE FAKE FRAMES ARE DRIVING ME CRAZY! I CAN SEE THEM IN EVERYTHING! EVEN REAL LIFE!!

Every AI Scaling mode (which it forces on) is also frame-gen.

4K is not a meme, playing the latest trash at 4K is.
DLSS is finally getting good enough to be used after 7 years, while hardware accelerated real time RT is still not worth using.

It's not, it's just upscaling.

NATIVE GRAPHICS, 60 FPS.

60 FPS

ew, what is this, 1999?

I wish ;_;

It is viable to run games at 4K, but not to scale them any higher with AA. If you have a 4K monitor you don't get to use supersampling or anything like that, sorry.

The problem with modern games is they're just pieces of shit. They don't run well at 4K, they don't run well at 1080p and full settings; they don't run well.
They're not made properly. They have artifacts everywhere, shader stuttering, frame drops on the most expensive hardware.
And when you render them natively, what do they look like?
This grainy trash. Mid-level TAA is on and still can't hide it.
So they crank the AI upscaling and try to smear the fuck out of it. But a jittery game with dogshit rendering underneath can never truly look good, no matter how much you smear it with AI.
It's just putting lipstick on a pig. The cheap methods they're using to render modern games need to fundamentally change.

UE5 shadows.png - 516x494, 527.77K

multi-player, fake ass hit boxes

Like you have an actual problem with that given that games post CS have had broken ass netcode and hitboxes since forever. Anyone complaining about hitboxes and luck like that shit was ever competitive in the last two decades is a fucking moron. Shit, people be playing that esports shit with valve and CoD garbage where you can check back the shots and shit ain't even right... bullets that don't register, people that aren't where they're supposed to be, angles that are incorrect. No one has been legitimately serious about competitive multiplayer in ages. It's all just a fucking popularity shell game being upheld by cheaters and shit anyway backed by the esports organizations and the publishers. Every pro player in the money tourneys cheats. Period. That doesn't mean they couldn't be good - but that don't matter. What matters is them being the cheaters who walk out with thousands of dollars split between their team and the kickbacks to the cheat developers.
And I know you ain't saying shit about that because you're giving a flying fuck about multi rather than having already even examined the situation of money in esports and the scandals and shit. If you ain't just rolling SP then you ain't know shit.

I hate how right this Anon is.
I don't see things getting better any time soon.

Uh, FFXIV doesn't have frame gen, it only has DLSS

theoretically lower input response and tearing

theoretically lower input response

That doesn't happen because the frames aren't real
You can't get a faster input response by hallucinating more frames, because the fake frames are being hallucinated entirely on the GPU by an image generator which is taking no further input from the game engine or your controller at all

I think it's going to get a lot fucking worse. I unironically think games are going to end up like this in a few years:
youtube.com/watch?v=iyZmpPHq1Ag
They will make a skeleton game and then use AI to texture and light it.

microcenter.com/search/search_results.aspx?Ntt=NVIDIA GeForce RTX 5090
Imagine paying this much for fake frames. Imagine paying for a cable that melts over time. Imagine paying to hope you don't get shitty impedance-mismatched cables. Imagine you think you're beating the odds by not removing that cable for 2 years only to find you can't even remove it because it melted.

Imagine being such a massive faggot to buy nvidia.

comments saying "games could look like this" as if it's a good thing.

Fucking terrifying.

I don't think we will get those any time soon.
We are already hitting the wall with neural networks.
But I bet someone will try to make a game like that.

DLSS4 - GPU hallucinates frames which aren't real

DLSS5 - GPU hallucinates player inputs which aren't real

DLSS6 - GPU hallucinates games which aren't real and then plays them for you while you watch

We're only 2 generations away

Well, if DLSS 4 is already inserting 3 fake frames for every 1 real frame, 75% of the shit on the screen is already AI slop; what's another 10-20%?

DLSS7 - By popular demand Nvidia adds the ability for limited player interaction with the game the GPU is hallucinating

So in 3 generations we go full circle and achieve Sega CD FMV games

Maybe, maybe not.
I think we will see asynchronous reprojection becoming more common to lower input lag first.
You can already just display fake frames if you want, with OptiScaler for example.

sewer shark remaster
lets fucking go

Am I the only one who can't tell the difference between the first two?

Tech illiterate anon here
What are those white lines appearing?

IMG_3022.jpg - 1024x576, 68.01K

nvidia is over saaaar

GPU hallucinations

The point is that you shouldn't be able to tell.
It makes MORE FRAMES with no noticeable difference in picture quality!

However, we're just looking straight at some inanimate objects without moving the camera. No shit an algorithm can predict fairly well what the next frame will be.

Doesn’t that mean AI is advancing if it can hallucinate?

Not him but no. It's unwanted and it's there all the time.

deepdream-40.jpg - 1800x1409, 1.47M

hallucinate is the only thing it can do tho

sddefault.jpg - 640x480, 60.22K

instead of adding textures they make a giant program that auto creates them

total waste of CPU power

completely unnecessary

”industry standard” because it’s so expensive

Here's the thing: you WANT the AI to hallucinate correctly and logically, and not fake shit like those white lines. That's why stuff like the mangled hands in some AI-made fanart is still an issue: AIs have a lot of trouble figuring out how a hand's fingers should be positioned. We have not reached that point yet, which is why a lot of people reject AI in the first place.

I do not want to live in the dogscape.

I hate niggers holy shit.

Industry standard because it's expensive for YOU, not expensive for THEM
modern game engine tech is about offloading work from the developer's artists to the consumer's hardware

Get used to it

To be fair our brain is filtering all the fractals we see and interpreting them.
This is why when you take LSD or Psilocybin and your brain turns off all the filtering the world looks all fucked up.

MFG is only 4x

somehow nearly 5x frames

This shit is so fucking fake man

As stupid as that graph is, the 5060 is around 20% faster than the 4060 and nearly 50% faster than the 3060 at 1080p on average.

now fuck, Rajeet

They already do that enough. Please don't encourage him.

It matters in slower-paced games too. If you play something like MechWarrior (a VERY slow-paced shooter relative to most shooters out there), where you need to aim at specific body parts in order to inflict damage on a desired location, you're dealing with very small hitboxes that will get mangled to dogwater if you have a lot of fake frames. It only really doesn't matter in shooters with no locational damage, but those are rare, as enemies in most shooters have at least one head or some other weak point these days.

high refresh rate monitors solved screen tearing so we had to invent GPU level screen tearing instead

The screen tearing you see in that pic is actually an in-game effect due to electromagnetic interference after being hit by a beam weapon. Note how only the HUD is distorted, but not the actual physical objects.