Yeah bro i'm playing at TRUE NATIVE resolutions...

yeah bro i'm playing at TRUE NATIVE resolutions, i would never use those pesky upscalers because they look too soft and bad in motion

taaa.png - 608x531, 14.07K

DLSS > TAA
DLAA >>> TAA

meaningless alphabet settings

i just set everything to ultra

meaningless

And then you'll wonder why your game looks blurry and soft.

SSAA >>>>>>>>>>>>>>>> AI shit

8 fps

just about every game nowadays has fsr3/xess native aa or dlaa
playing at native resolution without having to deal with taa blurry mess

in terms of quality sure
in terms of the performance you get for that quality, it's retarded

just get rid of half your FPS to smooth out some edges a bit

unfortunately dlss4 is the only working AA solution right now

explain

playing on a 4K meme screen

you don't get to have AA sorry

playing unoptimized unreal engine trash

yeah it's no wonder you need fake AI frames.

SSAA at 1080p is not very expensive. There are different algorithms.
Every emulator including Switch works perfectly with Super Sampled AA.
Any game that uses forward rendering works perfectly with real AA.

You know what's more expensive than SSAA?
-Deferred rendering
-Dynamic reflections
-Robust Physics simulation
-Too many Lighting calculations in a scene
They should get rid of these things first.
The problem isn't real AA. The problem is modern games are bloated unoptimized garbage taped together with low quality solutions.

Nothing to explain. TAA is still native and better than DLSS, at least at 1080p
4k is a different story but it all boils down to how good the TAA is. It's awful in RDR2
It's great in ID Software games (including Starfield)

TAA is the standard antialiasing mode in most slop from the last near-decade. It's good at antialiasing but it degrades image quality and motion clarity, the downsides are just not worth it. Unfortunately the alternative antialiasing modes have been phased out or deprecated over time, and in most games you can't even turn TAAshit off without breaking the game's visual presentation.
Alternative rendering modes like DLSS/FSR have been becoming more prominent over the last few years. Part of DLSS/FSR is usually upscaling and ML algorithms so of course it gets criticized for lacking detail and sharpness compared to native rendering. But here's the joke: most people shitting on DLSS/FSR are the same people who just play with TAA on at native resolutions, not realising their image is already being ruined. And with the recent updates to DLSS/FSR, not only are DLSS4/FSR4 both unironically superior to TAAshit when it comes to image quality and motion clarity, but they're superior even when using lower input resolutions.

TLDR: TAA is dogshit, can't compete with ML rendering modes like DLSS4/FSR4 even when it's given more pixels to work with. Most of the people bragging about "playing at native" are playing with worse image quality and motion clarity than those using ML rendering modes. We need to go back to forward rendering so MSAA becomes viable again and we don't have to deal with this shit anymore.

This anon is retarded, SSAA (typically at minimum 2x2) on 1080p is literally just the same cost as running at 4K with an extra compositing step. The only reason it wouldn't halve your FPS is if you're already bound elsewhere. It's only practical for scenarios where your GPU has fuck all to do anyway, which is typically not any modern game that has any graphics.
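Rough numbers, as a toy sketch (assuming you're fully GPU-bound and cost scales linearly with pixels shaded, which real games only approximate):

base = 1920 * 1080              # native 1080p
ssaa = (2 * 1920) * (2 * 1080)  # 2x2 SSAA renders every frame at 4K internally
print(ssaa / base)              # 4.0: a GPU-bound 60 fps falls toward ~15 fps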

deferred rendering

Literally invented to be cheaper than forward because of the lights problem

physics simulation

Entirely dependent on developer and can run either on CPU/GPU, dumb statement

Too many Lighting calculations in a scene

Read deferred rendering. Forward+ and tiled renderers solve this problem to a degree.

TAA bad

Is there a more reddit opinion?

TAA is fake AI shit that creates unacceptable artifacts. It relies on how fuzzy and grainy Photo-scanned Unreal Engine games look to try and hide how shitty it is.
If you have a game with actual clean stylized graphics it's so obvious how unacceptable TAA is.

DLSS is also fake AA. It uses AI upscaling to create a higher resolution image (because modern games are so shitty they can't render high resolutions the correct way.)
BUT from what I understand it uses traditional supersampling to then downscale the AI upscaled image to smooth out all the aliasing, and that makes it closer to real AA.

Both solutions are fucking garbage and are a scathing sign of the shitty condition of gaming.
-fake transparency
-fake frames
-fake AA
-fake resolution
Modern games can't render anything. they're garbage.

TAA.png - 611x327, 288.43K

If this had DLSS 4 in the comparison it would look better than SSAA because it's trained on 64x supersampled ground truth.

Everything is fake in video games, get real

Shilling for forward rendering but I doubt Reddit even knows what forward rendering is and why devs left it to rot

AI is involved!?!?

I AM AGAINST IT!!!!

DLSS is just super sampling but instead of rendering the higher res image properly, it generates a fake one with AI shit.
Therefore it's less accurate.

FSSGSSAA >>> SSAA >>> MSAA >>> SMAA >>> FXAA >>> DLAA >> TAA

What makes you think that the people who complain about upscalers are the same people that play with TAA?

FXAA >>> DLAA

you retard, FXAA is basically a filter
There are some things AI is actually good at believe it or not and AA is one of them

I hate TAA
I hate raytracing
I hate fake frames and fake pixels
I hate noisy gfx techniques that make all this blur shit necessary
bring back MSAA and stencil shadows

SSAA at 1080p is not very expensive.

SSAA is equally expensive regardless of resolution. Assuming you're GPU limited it would impact your performance by a degree proportional to the internal resolution increase.

DLAA is just DLSS at native resolution
You cannot pinpoint a SINGLE inaccurate artifact in any DLAA image

Literally invented to be cheaper than forward because of the lights problem

What lights problem? I'm sorry, how many fucking lights do you need in your scene?
-The sun or moon
-a few scattered lights that don't need to reach far and are easily optimized

I don't want or care about having 1000 different dynamic lights especially when it just makes games uglier.
I don't give a fuck if I can see a light from 10000 miles away bouncing off my character's ugly exaggerated skin pore normal maps, making 0.01% difference to the overall lighting.
that shit does not make games better. Especially when it means you can't even fucking render transparent objects like hair and your image is grainy trash.

There are games that have almost no dynamic lighting and use baked in lighting for 99% and they still look amazing today.
find an art style and optimize it. Games do not need that many light sources or reflections, nor is that why we play them.

I am seeing more people preferring DLSS over TAA here than usual (and they're right). Is it because America is still mostly asleep?

I am so fucking tired of modern games looking so blurry and smudgy. You basically need to supersample from 2x2 to make it as sharp as older games.

Cutscenes in MGRRRRR are pre-rendered and they take 20 GB of the 24 GB game. You can delete 20 GB worth of files and the game is still playable, it just skips the cutscenes.

I am American and I prefer it. It's way more clear than TAA. It's only crabby old "back in my day" retards who despise anything involved with AI who hate it. And AMD users I guess

you niggers hate TAA but I never saw aliasing in new games with TAA and dlss4 quality with a bit of sharpening

and yeah fucking msaa doesn't just still have aliasing, you also need way better hardware to run it

So they just don't play video games at all?

when you turn on dlss4 it turns off TAA. You have one or the other

I haven't bought nor will ever buy a game with forced TAA

Not that anon but DLSS is essentially just another form of TAA but yeah that anon is really just trying to be an ass about it. It looking much better than the normal TAA is what people are talking about after all.

literally what?? prove it

It's popular to hate AI because artroons and fiverjeets are being sidelined.
People think jeets are using AI, but somehow forget that they cannot afford any subscriptions or hardware.

DLSS does anti aliasing and it does it way better than TAA
DLAA is just DLSS without the upscaling part (all native res) and it's almost as good as MSAA for way less power

not sure what OP's gay fucking retarded argument is, because I don't use TAA either

MSAA or Downsampling or fuck off

NO it's not. Did you know that DLAA uses SSAA? The supersampling is not the expensive part, it's rendering the higher resolution image to downsample.
If your game's render pipeline runs like a dream, rendering it at a higher res is no problem. See pic. All these old games supersample so well. They look so fucking clean and tasty.
But the bloated poorly made rendering engines of today? They can barely render at 1080p native in the first place. It's not the SSAA.
It's the modern games and shit exactly like UE5.

if you use an AI image upscaler and then scale it back down to the original resolution to get SSAA, don't you think those mixed colours would be less than accurate, since the additional details for the sample points were "made up" by an AI?
it's better than nothing but if you can get the real image to downscale and have the GPU power, surely it's better to do that?

TP.png - 1920x1080, 2.46M
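On that upscale-then-downscale question: here's a toy numpy sketch (nearest-neighbour upscale standing in for the AI one) of why the round trip by itself can't add information. A learned upscaler instead invents plausible detail to fill the gap, which is exactly the accuracy worry:

import numpy as np

low = np.random.default_rng(0).random((4, 4))    # stand-in for a low-res render
up = low.repeat(2, axis=0).repeat(2, axis=1)     # naive 2x "upscale"
down = up.reshape(4, 2, 4, 2).mean(axis=(1, 3))  # 2x2 box downsample back
print(np.allclose(down, low))                    # True: the round trip invented nothing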

TAA is one of the biggest leaps forward in games. Cheap AA that looks like super sampling. Eliminates shimmer.
The downside to it is it dulls whites and brights. Blur is fine.

AI upscaling replaces it, but it also integrates it.
Games without good AA are hardly playable anymore. I take notice of the shimmering and aliasing. I use the highest blur settings on my TV to lessen aliasing. Blur is also better for gradients and shading.

People will always complain about anything. Which is unfortunately why complaints should be ignored by default.

I complain about high res. Good to see AI upscale has made low res popular again and devs are moving toward building around low res and AI upscale for better visuals. A little. Severe damage done by degenerate YouTube channels that measure performance, which call higher res higher quality. Such at least shows that incorrect views are from the stupid given a voice, media, manufacturers, parasites. Not sure if the right thing should struggle against such scum using the same platforms. Or they should remain rejected. Good saying that "don't argue with idiots. Those looking on won't be able to tell the difference". It empowers parasites to have contact with them, to be aside them, it lends them credibility they don't deserve.

The competition is using your mind. That done results in ongoing improved viewpoints. Parasites are just passing on lies, no brain, no individuality. Just spam. There's nothing there. Victims try to siphon otherwise to them, so they aren't excluded. They don't get any better if denied though.

It actually kinda hurts playing modern vidya. I hate TAA and its impact on modern visuals. It's unbearable.

proprietary tech locked behind a paywall by a greedy anti consumer corporation is not something i will ever support.

The hair looks that good in engine tho.
here. Forward games could render much better hair. You know why?
because they could actually render something simple and fundamental like transparency.
Deferred shit can not.

since the additional details for the sample points were "made up" by an AI?

they weren't really, not like the image upscalers you use online. It has direct access to the rendering pipeline and engine details to predict what the image will look like without having to improvise so much

Can someone explain to me how other methods of anti aliasing are an example of ai but SSAA isn't? Genuine question because it seems that in the last five years the term "ai" gets thrown around so broadly when it wasn't before.

1.jpg - 652x210, 29.78K

FXAA >>> anything

lol
lmao, even

DLSS shill doesn't even know it's TAA but with AI to enhance the image
I'll take a well implemented TAA over DLAA everytime

If you don't play with upscalers and you don't play with TAA, then you don't play modern slop at all so why the fuck would you even comment on them?

inb4 i play without AA

No you're not. Maybe you did in 2007 but most games visually break without AA.

Was RDR2 the first big game that used it? I remember the game's picture quality being really fuzzy compared to the rest of the games that came out before.

biggest leaps backward

DLSS is temporally stable and trained at 64x which is higher than anything you can run real-time. SSAA will still suffer from temporal aliasing and will be less crisp/smooth as a result. DLSS has other downsides obviously.
t. former OGSSAA and occasional SGSSAA enthusiast, used to force them on all slightly older titles.
You typically get 8 lights on forward bro, e.g. on Unity which still has its forward path.

shilling for even more static comes

Bruh.

No it's not? That's just wrong.
There is no TAA technique at any point, which is why it’s clear and not smeary

when upscale from 360p it’s blurry!

No shit, try at native resolution

comes

Games. Fucking phoneposting.

Xbox 360 ran Ninja Gaiden 2 in 585p. This "look I took 4k screenshot from old game (maybe even from the remastered version) and then compare it to 1080p TAA zoomed in hair strands" is such an ill way of arguing.

You typically get 8 lights on forward bro

You can get more than that but why do I need more than that?
Am I creating a world with 8+ Suns?
Is my level design restricted to city streets at night time with super close together streetlights that all need to be dynamic lights?

I hate raytracing

I hate fake frames and fake pixels

It's funny that you hate fake shit for being fake but also hate raytracing. If you changed the names of raw raster lighting and raytraced lighting, I swear that people would have the exact opposite opinion. "Would you rather have games that have light act like it does in real life, or do 10,000 hackey bullshit things to make fake light that looks similar?" If someone called raster "AI lightgen" I wonder how quickly people would turn against it.
Just to be clear raytracing sucks because it's not a guarantee of better visuals and it's still too heavy.

wtf guys i just realized just now that motion blur is literally ai fake frames

I play with proper AA yes

2x

4x

8x

16x

Modern AA? Foolish.

all anti aliasing methods are ai because they all rely on automatic processes
when these nerds talk about ai they are really using it as an insult to mean something they don't like. ai = the computer equivalent to jews on Anon Babble.

No, every game from PS4 onwards has used it.

Xbox 360 ran Ninja Gaiden 2 in 585p

Yeah and My PC runs that same engine at 4k 60fps with AA.
Even the 360 didn't have trouble rendering the transparent hair. The N64 could render transparent hair cards (but modern games can't)
NG2 ran like that on 360 because of the sheer number of complex NPC enemies it throws at you.

Yeah and My PC runs that same engine at 4k 60fps with AA.

And your PC will run "unoptimized modern crap" in 8K 60 FPS in 18 years.

18 years

Assuming those modern cards even render things any more rather than creating AI hallucinations.
But those games will always look like shit.
because the hair looks shit now it will look shit in 18 years.
NG2 hair looked good on the 360 and will always look good.

The hair problem is solved even today by rendering in 4K because the whole reason it happens is the lack of enough samples. And if you need a comparison for how much our tech has advanced: when Doom 3 was released even the most powerful cards couldn't run it in 1080p 60 fps. Now we're having games that can go to 4K 120 fps with up to date hardware.

temporal accumulation blurs your entire image and even DLAA destroys texture quality compared to true native
now I'd be fine with it if there was an alternative but the big green corpo and shilltubers are really hellbent on convincing you that there's no solution

because none of this shit is "ai" in the first place, it's a stupid fucking marketing term, it's all just more advanced algorithm shit. But the current garbage they refer to as "ai" is the worst version of years-old technology.

temporal accumulation blurs your entire image and even DLAA destroys texture quality compared to true native

true native with high graphical complexity means awful image instability and motion aliasing, so there's no winning

The higher the resolution, the less complex graphics can be. Resolution costs performance.

It is preferable to use a low res regardless what hardware you have. Hardware should be spent on graphics instead.
Hair is costly. The lower the res, the better hair can be.

alyx and cs2

oh wait but no it means devs actually have to do their job properly

yea

and even DLAA destroys texture quality compared to true native

proofs? I have not noticed that

just stand really far away from the screen and don't look at the jittering bro.

Not solved. The effect is fundamentally bad. No amount of smearing or scaling will completely remove the jitter.
Unless you render and then downscale from a resolution so absurdly high that it costs more GPU than using 1000 lights in a forward renderer would have.

The artifacts (of which there are many types) will always be there. Deferred rendering was never worth it. we just have a generation of permanently invalid destroyed games.

oh wait but no it means devs actually have to do their job properly

modern society functions through as low effort as possible for as much profit as possible, you're not getting the "developers alter the entire rendering pipeline and graphical assets around me not having to turn on antialiasing or being able to use msaa" thing outside of rare unicorns anymore

sorry not gonna use your BLMAA

That's Lumen.

It was noticeable on 3, somewhat bad on 2, not true on 4.

Resolution costs performance.

Nigger you're literally running a 585p game in 4K. That is almost FOURTEEN FUCKING TIMES bigger resolution. I am not sure if this can even be called arguing in bad faith or just being a fucking retard.

operates on similar principles to TAA. Like TAA, it uses information from past frames to produce the current frame. Unlike TAA, DLSS does not sample every pixel in every frame. Instead, it samples different pixels in different frames and uses pixels sampled in past frames to fill in the unsampled pixels in the current frame. DLSS uses machine learning to combine samples in the current frame and past frames, and it can be thought of as an advanced and superior TAA implementation made possible by the available tensor cores

en.wikipedia.org/wiki/Deep_Learning_Super_Sampling

Educate yourself
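The sampling scheme that quote describes can be sketched without any ML at all (toy numpy: a static scene, one jittered subpixel sample per output pixel per frame, a plain running average standing in for the trained history blend):

import numpy as np

truth = np.random.default_rng(0).random((8, 8))     # scene at 2x2 subsamples per output pixel
history = np.zeros((4, 4))
for n, (jy, jx) in enumerate([(0, 0), (0, 1), (1, 0), (1, 1)], start=1):
    sample = truth[jy::2, jx::2]                    # each frame samples different pixels
    history += (sample - history) / n               # past frames fill in the unsampled ones

ssaa = truth.reshape(4, 2, 4, 2).mean(axis=(1, 3))  # the true 2x2 supersampled result
print(np.allclose(history, ssaa))                   # True once every offset has been visited

The hard part DLSS spends its ML budget on is everything this toy ignores: the scene moves, so old samples have to be reprojected and weighed against the new ones.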

DLSS4 upscaling

you said DLAA (no upscaling)

Yeah I'm not the other anon, just pointing out DLSS 4 running on literal 720p still looks better than 4K native texture-wise. Will obviously have other issues though.

native 4k taa

begone clown

That guy is defending DLSS, not against it. So with some brain usage you should be able to figure out that it's not the same person.

can someone explain the difference between DLAA and DLDSR

AI in that case referred to the machine learning bit. Hence "DEEP LEARNING Anti-Aliasing".

How in the fuck-
Has anisotropic filtering become a lost art, too?

No. But I *think* that they're the same thing, DLAA is just a much more convenient way of doing it. Like there are some games that I can run with DLDSR but it changes my mouse sens when menuing and my desktop either spazzes out on alt tab or becomes an entirely different resolution from my native.

you're thinking of mip maps and yeah they dont even bother to turn that shit on anymore they let green jesus take the wheel and smooth your image

Fucking hell. Let the industry burn to ashes.

DSR exposes resolutions higher than your monitor can support, DLDSR uses deep learning to do the downsampling back to your monitor's resolution. The reason people often use it together with DLSS is because often something like 4K DLSS will still look better than 1440P DLSS even if the internal resolution is the same. This is not really the case for DLSS 4 anymore.

you hate fake shit for being hate but also hate raytracing

not him but interesting question. It's probably because fake reflections don't have a negative impact on games, but fake AA and fake Transparency do.
in most artstyles it's very bad to use "accurate reflections". You want to make something look shiny, you want something to look "good", not "accurate".

No one really minds when a puddle or something doesn't have completely accurate reflections. In many cases approximations actually look better, ray tracing removes artistic control.
When Ray tracing is turned on (or forced on) it TANKS the fps even on the highest end cards.
people see that as unacceptable for some reflections they were never going to look at, with no noticeable visual improvement.
Same with ray traced lighting. people would rather lighting look "good" rather than "accurate".

Finally people find it absurd that we would use something that often has a higher cost than just drawing the surrounding objects twice.

frozen lake.png - 1920x1080, 2.59M

and it's always going to be in the game.

It's probably because fake reflections don't have a negative impact on games

you kidding? screen space artifacts have been hated for a long time

ray tracing removes artistic control

Have you ever considered the fact that real lighting allows for more artistic control, considering all of film and whatever the fuck was achieved in real life? Being able to way more accurately model light instead of a laser with only one bounce and no radiance (which is how direct lighting is done in all traditional games) allows for way more creative lighting scenarios.

film and whatever the fuck was achieved in real-life

I don't want games to look like live action films. that's gross and pointless.
film and real life look ugly and all live action is the same shit with different color grading. I spend my whole life looking at "real lighting" it looks like shit. I want to see some real art when I play my games.

Photographers are the niggers of artists. they create slop. they do not have as much value as animators or painters who create unique beauty.
the less games look "real" the better. I'm tired of watching ugly facescanned monstrosities cry and wrinkle up their simulated skin folds about being lesbian or whatever, in sub 30 fps.

I spend my whole life looking at "real lighting" it looks like shit.

7b0.png - 423x751, 255.71K

good 3D graphics are not restricted to a resolution.
NG2 will look good at any resolution because the graphics are robust, clean and scalable.

Modern deferred slop will not look good at any resolution. it doesn't even look good at native res and forced AI upscaling.
because the visuals at their core are grainy artifact riddled garbage.

Bilinear interpolation was used on MRI scans way back in the 80s. All of this "ai shit" is decades old.

g005.jpg - 1274x1270, 169.16K

Bilinear interpolation

How is that AI?
that's just
output = mix(subsample1,subsample2);
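To be pedantic it's three of those mixes (two across, one down), but the point stands: fixed weights, no training. A toy Python version, names illustrative:

def mix(a, b, t):
    return a * (1 - t) + b * t

def bilinear(tex, u, v):                  # (u, v) in texel coordinates
    x, y = int(u), int(v)
    tx, ty = u - x, v - y
    top = mix(tex[y][x], tex[y][x + 1], tx)
    bot = mix(tex[y + 1][x], tex[y + 1][x + 1], tx)
    return mix(top, bot, ty)

print(bilinear([[0, 1], [2, 3]], 0.5, 0.5))   # 1.5, the average of all four texels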

How do you think DLSS works? Do you think that you have to train your cards AI or something? You cannot be this stupid. Please.

what the anti-taa people won't tell you is that the cat is out of the bag, and with geometry being this complex and resolutions too low, there is no other solution to shading aliasing, especially specular.

If it looks good on any resolution then why do you play it in 14x resolution? And btw the original had pre-rendered cutscenes.

im agreeing "ai" is just a marketing term and encompasses a bunch of stuff that has been a thing for a while but zoomies and millennials think is new because of the term.

Using a convolutional neural network for 1-3 and a transformer-based model for 4, trained on Nvidia's Saturn V supercomputer. Both commonly referred to as AI or deep learning. Why?

its probably just a matrix transformation

Yes so it isn't learning anything new. It's just automation based on patterns that Nvidia trained it on.

yeah i love playing at 2 fps

ssaa x2 turns 60 fps into 30 fps
ssaa x4 turns 60 fps into 15 fps

and so on, and you need at least 4x to get half decent results, and the best come from sampling a pixel at least 8x

If it looks good on any resolution then why do you play it in 14x resolution?

Because I can? it's not required to hide rendering errors and artifacts. no AI needed either.

it's not required

Then fucking stop doing it.

Because I can

And you can run the current games in 4K too if you shell out for top of the line hardware. And in 10 years you will be able to do 4K in these games on a "potato" just like you did it in your 14 year old game.

I'm a retard, can someone explain the difference between:

MSAA

SSAA

downsampling (e.g. Nvidia DSR or like in emulators)

Cause I'd imagine all 3 methods bloat your internal resolution, but downsampling/DSR doesn't look as nice as SSAA/MSAA, why?

Assuming 4x downsampling, like from 4K to 1080p.

No.
The part it's used for is learning something new because there would be no point in approximating a known algorithm which would only result in loss in accuracy.

No.

AI is a matrix transformation

DLAA is still temporal and temporal uses past frames to eliminate jaggies so there will ALWAYS BE ghosting and smearing. No I don't give a good shitfuck how much propaganda Nvidia has shoved down your throat.

DLAA

GHOSTS

Simple as.

the right one doesn't work on longer hair, there is no way to sort order in a simple shader, and if you do sort it then it's more expensive than ray tracing.

there is just no good way to do transparency in rasterization

Inferring and training AI is a matrix operation saved as weights, the inferred results from the model are not.

Then fucking stop doing it.

don't play in full screen at max settings?
fine I'll play in a standard HD window. game still looks great.
Still renders transparencies without having to fake it with horrible dithering.

RT is cool.
gaming RT is fucking trash though because it's barely any samples per pixel, mostly made up of temporal accumulation and then blurred with an aggressive denoiser because current hardware is nowhere near good enough for the real deal.

Hair cards naturally sort using the Z buffer.
Both games use hair cards and ribbons; the only difference is Team Ninja's engine supports translucency and the Unreal Engine game doesn't.

ssaa = downsampling
just rendering entire screen at higher resolution
so if you are playing at 1080p, ssaa x4 means you are rendering at 4k internally
you will run out of vram very fast, and it ruins framerate by like 99%, you need a game to run at like 1000fps to set high sampling

MSAA is the same, but you only render at higher res on geometry edges, great back in 2000 when models were blocky, but the more complex the models get the closer to SSAA cost it comes, also doesn't fix transparencies like the TAA hair problem or specular aliasing
a dead end tech that becomes more and more demanding with each year, AA just can't cut your fps by 50%+, nobody wants to go from 60 fps to 30 or 15 to have smoother edges
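For what it's worth, the trick that makes MSAA cheaper than SSAA is that shading runs once per pixel while only the coverage test runs per subsample; a toy sketch (a half-plane standing in for triangle edges):

import numpy as np

offsets = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]  # 4x sample pattern
shaded, background = 1.0, 0.0        # the pixel shader runs ONCE per pixel

img = np.zeros((8, 8))
for y in range(8):
    for x in range(8):
        # coverage is tested per subsample against the geometry (here a diagonal edge)
        inside = sum((x + ox) + (y + oy) < 8.0 for ox, oy in offsets)
        img[y, x] = (inside * shaded + (4 - inside) * background) / 4

print(img)  # fractional values only along the edge; aliasing that happens INSIDE the
            # shader (specular sparkle, dithered hair) never gets extra samples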

When downsampling a game from 4K to 1080p via say Nvidia DSR there's still visible shimmering (e.g. I've tried this with Dark Souls II), but running an old (DX9 and below) game with 4x MSAA doesn't have any, it's perfectly smooth. What's the difference? You probably don't know either, since you just regurgitated what I said.

you will run out of vram very fast

Should be pointed out this is a deferred rendering problem.
With forward rendering you can just draw straight to the screen but with deferred you have to render and save several fullscreen 4K buffers like lighting and normals.
so 4K actually becomes 8K and so on.

???? how are you storing 100 positions per pixel? in forward rendering you can go from back to front rendering each layer at a crazy cost per pixel, but not in deferred rendering, and in either solution the z buffer will make zero sense since it will stop at the first hair layer
i mean look at hair in expedition 33, fuck you wanna do here?

I hate how TAA makes everything blurry garbage. I'd rather just have the jaggies

also dosnt fix transparencies like the TAA hair problem or specular aliasing

i hate you so much
please parrot more of that bullshit you'll eventually convince someone at a game studio to bring the death of videogames even faster

ESL lacks reading comprehension

Well color me surprised. Everyone knows what supersampling is, no idea why you felt the need to explain that the cost involved with SSAA is from rendering internally at a higher resolution, nobody was confused about this. That was exactly my point. If you're GPU bound at native res and then use 2x SSAA, your performance is going to drop by roughly the same percentage whether your native res is 720p or 8k and whether you're playing on a 2080 or a 5090.

And DLAA does not use SSAA since SSAA by definition runs the game at a higher than native res. You can see the internal resolution in the DLSS hud and with DLAA it's native res. You could call it a form of supersampling I guess since it's sampling multiple pixels across several frames to approximate one pixel but SSAA in your own definition refers specifically to resolution.

One of the few pros of having a 27" 4k monitor is that i can set the scaling all the way down to dlss balanced and not notice a difference.

Why doesn't UE5 just use the same hair tech as RE Engine? No dithering or shimmering whatsoever even at 1080p like in this screenshot here. The hair in RE4R honestly impressed me a lot.

nothing, it's placebo or a slightly different rescaling algorithm
while there is some difference between all this crap none of them produce much visible difference, some are more blurry, others less
SSAA, so supersampling, means more samples per pixel, both it and nvidia will then have to downsample the result for your display, how much blur it will apply i dunno, nvidia probably just uses one of the sharper algorithms to preserve more detail

You're comparing full 1080p against zoomed-in scenes, that's why it looks better.............

idiota.png - 801x578, 619.79K

Because UE5 doesn't use real programmers and coders and developers. They use Jeets who steal shit from the asset store and just copy and paste. Every single UE5 game is broke and devs just make excuses after excuses after excuses.

being blind helps

fuck you mean, retard?
hair is almost always done on cards, msaa has no way of knowing which pixel is a drawn hair and which is the poly edge, it will lead to crazy frame rate cost with no visible results
valve has some way of resampling transparencies, but it's shader side, not msaa

you only write to the z buffer when you render the 100% opaque hair cards and the rest of the opaque objects.
when it comes to the vertex pieces that have transparent textures you render them last and only read from the z buffer.
you get their center xyz as a depth value and then render them over each other in that order, from furthest to closest to the camera.
it's just about how you space them.
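A one-pixel toy of that recipe (the opaque pass already wrote depth; cards z-test against it but never write, and blend "over" from furthest to closest). Worth noting this sorts whole cards by one depth value, so it breaks on intersecting cards, which is the objection the other anons raise:

opaque_depth, dst = 10.0, 0.2           # depth buffer value and the color behind the hair
cards = [                               # (center depth, color, alpha) of cards over this pixel
    (8.0, 0.9, 0.5),
    (3.0, 0.6, 0.4),
    (12.0, 1.0, 0.8),                   # behind the opaque surface: the z test rejects it
]

for depth, color, alpha in sorted(cards, reverse=True):  # furthest first
    if depth < opaque_depth:            # read the z buffer, never write it
        dst = color * alpha + dst * (1 - alpha)

print(round(dst, 2))                    # 0.57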

hair strands tanks the framerate and acts weird in motion

Photographers are the niggers of artists. they create slop. they do not have as much value as animators or painters who create unique beauty.

So what say you about 3D animators who have been using ray tracing in mainstream productions (Pixar for example) for 20+ years? Did you think they were going through the entire movie frame by frame and drawing the lighting and shadows onto each frame by hand?

You cannot pinpoint a SINGLE inaccurate artifact in any DLAA image

nigger what
raptors in jwe2 were streaks across the screen when they ran around
birds in the sky were also streaks across the screen in spiderman

that shit ghosts like crazy

it does. RE engine just has a better downscaling effect on hair.

test

ReShade SMAA makes Helldivers with noAA look better than TAA. DLAA is basically TAA with a sharpening algo.

you just described forward rendering, the problem is still there
you still get hit with the shading cost for each layer, great for old games with low polycounts or games designed around it, but i will ask again, what the hell do you do in games like expedition 33? the moment the camera zooms on the character's head in a cinematic the game will shit itself

When downsampling a game from 4K to 1080p via say Nvidia DSR there's still visible shimmering (e.g. I've tried this with Dark Souls II), but running an old (DX9 and below) game with 4x MSAA doesn't have any, it's perfectly smooth. What's the difference?

The problem is the complexity of the scene and the types of aliasing you're seeing. Old games didn't have 4k textures, thousand poly models and dozens of effect shaders on top. Play RDR2 with no AA, even at 8k your eyes will bleed from sparkling lights, crawling textures and all the thousands and thousands of jaggies along the edges of all the super fine geometry detail. That dude already explained this in the post you replied to though, it's as if you didn't even read it before you responded.

If you're GPU bound at native res

or at below native res with the game needing AI upscaling...
That's exactly the point. It's not supersampling that's nonviable, it's the unoptimized nature of modern games.

fxaa

lmao, people who complain about taa blur then tell you to use fxaa with a straight face

FXAA.jpg - 1080x564, 42.96K

What does this post have to do with FXAA?

DLAA is basically TAA with a sharpening algo.

Dishonest description. It doesn't take a TAA-blurred image and sharpen it. It's sharpened by virtue of retaining more of the original detail from the image.

This. I hate sharpening algorithms with a passion, it's really just added local contrast.

what the hell you do in games like expedition 33

You optimize them. You pick an art style that's viable and you squeeze out the value of every poly.
mip maps, lod, baked in lighting.
not just using Unreal Engine's cheap systems and lazily slapping on expensive lighting and particle effects.
Lumen this Niagara that.
pre baked hair/clothing animation instead of wasteful and lazy physics simulation that clips through everything.

I can think of so many ways to optimise this game.
maybe make the painted world look like an actual painting. you could get rid of so much clutter and bloat that way.

forward is not viable!!1!!111!

then how does doom eternal manage it?

DLAA is what TAA should have been
no idea why most TAA solutions introduce so much blur when the changing subpixel position should lead to more detail, not less, gotta be some "optimizations" that reduce accuracy

I instantly lose all respect for anyone who suggests post-processing sharpening filters in any game ever and especially as a means to combat blur. The issue with blur is not that the image is too soft, it's loss of detail. You don't recover any of that detail by sharpening a blurred image and turning it from a wet smear to a dry chalky smudge.

You cannot be this stupid. Please.

Why are you such a faggot

Although that manual effort can pay off, it takes a tremendous amount of additional production work that balloons with every scenario and asset. That's a reason the industry is enticed by these "lazy" solutions, because having to squeeze the value out of every poly with mips and lods and bakes is a tremendous pain in the ass and gets in the way of just making games. Sometimes it directly interferes with what you can even make, such as having to compromise with very limited scale environments.

I think that it hurts people more. They've become too desensitized to words like "fucking retard tranny nigger saar, and faggot".

So is there no way to do true native resolution if the game only has TAA, DLSS, FSR and Xess?

Anti-Aliasing is cringe and pointless in general, especially with today's resolutions

So your solution is to ignore the problem and have everybody be bald?
The obvious answer is that there is no simple solution, transparency could be solved by some specifically made solutions like alpha to coverage, but devs prefer a universal solution like TAA

maybe make the painted world, look like an actual painting. you could get rid of so much clutter and bloat that way.

just change your artstyle bro

yeah, thats not happening

featured.jpg - 1251x485, 201.4K

I have accepted the jaggies and turn AA off.

just change your artstyle bro

They don't have an artstyle.
they have generic photo-scanned UE game #5548750646

This is a man defeated and broken. (it must suck when UE games do everything they can to not let you turn off TAA)

SMAA >>> FXAA >>> DLAA

How to spot someone who hasn't even booted up a videogame made in the last decade. The vast majority of aliasing and shimmer happens during motion which neither SMAA nor FXAA can address. They do almost nothing.

That is certainly a take

There's always a way to turn "forced" TAA off but then you're met with the stark realization that half of the rendering artifacts are deliberate, left there for a temporal pass to clean up

hasn't even booted up a videogame made in the last decade

Good.

Anti-Aliasing is cringe and pointless in general, especially with today's resolutions

Vast majority of PC gamers are still at 1080p, so today's resolutions are for most people the same ones they used 20 years ago.

you retards need to stop shilling msaa, it just doesn't work in modern games

heh I just watched a video from a guy explaining this just a few minutes ago

Just increase the sharpness on your monitor

proprietary tech locked behind a paywall

You're telling me that a company created something and wants to make money from it? The horror

Grim state of affairs

when UE games do everything they can to not let you turn off TAA

with UE it's actually a nonissue; it's proprietary engines that are a pain.
the only UE game that gave trouble was NG2B because TN did some really fucking weird hackery to remove AA options.

Sharpness is just an increase in local contrast, it does not increase detail. And it does not help with motion clarity.

Yeah if you are lucky with lighting SMAA can be good for screenshots but it sucks for general use. I mean injecting it as a custom shader can beat FXAA or no AA easily if you don't have any other option but it sucks in general.
I have no idea why it gets glazed so hard.

hurr durr bad implementation

hurr durr you don't know what you are doing

I have tested in many games.

t. NVkike marketer

AMD had no problems sharing FSR with GTX users, giving free Gsync technology for monitors and making drivers open source for linux users. Hope you get killed by a train you stupid jeet.

DLSS Quality: on

Sharpness slider : maxed

yep its crispy gaming time

WHY don't devs just add MSAA?

it's too graphically costly

IDGAF ADD IT AS AN OPTION if the game runs at 10 fps with it on it doesn't matter it's a future proof setting!

I can't tell if this is one of those ironic posts or not. But you can just play games in higher resolution if you want to.

AMD had no problems sharing FSR with GTX users

And how's that working out for their GPU market share?

People who think TAA is bad should check FF16.

The game is shit, but it's a technical masterpiece: raytracing, perfect hair even on a wolf made out of individual hair strands. This is upscaled from 1080p to 4k, and despite such aggressive upscaling there is almost no blur or aliasing even with depth of field. Look at the chick's hair, despite the dof blur having to go around the hair it doesn't destroy it.

No idea how they did it, but man, can it look great. Shame the game is so bad it might be the worst FF to date.

r.resolutionscale=200 or whatever

The issue is that it doesn't work in deferred rendering, which is the standard.

Why?

stalker.png - 3839x1639, 1.27M

the game is shit

stopped reading right there

I think TAA is bad because every case I see it, it looks bad, which is backed up by the reality of how the tech works.
One exception disproves nothing. Also TAA is mostly irrelevant to the reasons that game looks good.

that ends up being even more costly than msaa my guy

What are you even meant to do when the rendering pipeline itself is broken, and the bandaid consists of GPU-generated noise to fill the gaps?

It works in STALKER CS and COP, and STALKER was one of the first games with deferred rendering.

you just know retards will turn it on and complain that the game runs bad
this is literally the only reason why future proof settings aren't in games

AI is used as an insult with predictive methods, aka methods that make shit up on the screen instead of methods that melt adjacent pixels together with color tricks.

add a warning to it when you switch to it, that easy

I think Kingdom Come 2 did that and people are convinced that game is optimized

I know it is easy to dunk on devs but this honestly.
They could name the setting SUPER OVERKILL HYPER DO NOT USE UNTIL 2030 and retards will switch to it and then go to plebbit to bitch about how unoptimized the game is.
We just can't have nice things in life.

if the game runs at 10 fps with it on it doesn't matter it's a future proof setting

the problem is that TAA isn't a name for a single solution, it's the name of a technology, it's like saying "all games are bad" since there is one bad game
each dev will do TAA differently, from the way it does motion vectors, occlusion masking or how it overlaps the frames, also not every game uses dithering and noise for hair, which makes TAA look even worse
also if a game dev fucks up and doesn't generate motion vectors right you get a shitload of ghosting

Epic Games by default has by far the worst TAA implementation, FF16 is the best one i saw so far

This is what most people don't get. Good TAA isn't even noticeable in games like GOTG

Image quality has got so bad in the last few years.
Remember when games were actually sharp and clear?

shame doom the dark ages is such a shitty game, it's technically a big step up from eternal, but the level design got much worse, they removed all colors and dynamic lights

add ray tracing to make it more static than baked light

i dont get it

real lighting allows for more artistic control

It doesn't in a video game. Dynamic realtime lighting means object color, saturation, brightness etc. is constantly shifting making literally everything less readable and harder to distinguish from each other, harder to identify at a quick glance etc. It's not artistic, it's "simulationist". As a dev you have less control over the game's presentation because raytracing doesn't care about artistic intent or gameplay convenience.

Live-action movies don't even have realistic lighting, they work within the boundaries of how photons behave in real life but the lighting setups for sets and scenes are totally artificial with lamps and reflectors all over the place to control how the actors and environments are lit.

FSR AA seems to work fine, although that's maybe because it might just do nothing.

the problem is that TAA isnt a name for a single solution, its a name of technology

It's the name of "temporal anti aliasing" which has a specific function and that function results in garbage rendering anywhere it's used. Don't use semantics to be disingenuous.

each dev will do TAA diffrently, from the way it does motion vectors, oclusion masking or how it overlaps the frames

This is technically possible but seldom is it ever wrangled on a low level. Even then, at best, it minimizes the garbage, does not eliminate or change the nature of it.

This is akin to saying "affine texture mapping isn't a solution, it's a technology" and then pointing to one of the few cases to ever exist where it's less noticeable as a refutation of any complaint that it warps the fuck out of textures in games.

If you actually listen to actual devs implementing raytracing, not even in the final game but in their pipeline, you'll find that most of the time they find it extremely valuable as a reference for how it brings satisfying contrast into a scene. A game without raytracing will rely on a single luminance or colour for all indirect lighting unless it uses GI via probes or some other stopgap, arguably muddying colours together more than the accurate nuanced contrast raytracing can add.

Live-action movies don't even have realistic lighting, they work within the boundaries of how photons behave in real life but the lighting setups for sets and scenes are totally artificial with lamps and reflectors all over the place to control how the actors and environments are lit.

Yeah and you can't do any of this to begin with because there's no photons. It just means you need better lighting artists.

You're mistaking art style for graphics.
Don't worry, a lot of other retards do the same.

Because MSAA does NOT ACTUALLY ANTIALIAS PROPERLY anymore. Read up on deferred vs forward rendering, or if you want a real world example of why MSAA is dead go boot up Deus Ex Mankind Divided and use MSAA. More expensive to run than path tracing and yet it does absolutely nothing to address aliasing.

No shit it means "temporal aa", but all that means is that it's done over more than 1 frame. Even TAA has already split up into nvidia, amd, intel, epic and other solutions, that include upscaling or not. Similarly to "ray tracing" and all the implementations we see now: while it all uses ray tracing as a base, you can trace GI, shadows, reflections, AO, everything, at low or high sample count, with temporal or no denoising
all create different quality results

Even then, at best, it minimizes the garbage, does not eliminate or change the nature of it.

but it can fix the 2 most popular problems, ghosting and blur. ghosting is caused by poor accuracy on disocclusion where motion vectors don't match pixel position, can be fixed by more aggressive masking or higher motion vector accuracy, blur comes again from poor motion vectors or just a shitty way of combining frame info

if TAA is both sharp and has no ghosting, wouldn't it then be considered great?
I mean what the fuck would you complain about then?

You honestly believe that AI is not currently just an algorithm?

Turn off TAA

Game: all surfaces look like Michael Jackson's gloves

It's over...

SOVL non AI slop

So lets go back to forward rendering when it does work properly.

Are you the same anon who made DRG webms without anti-aliasing? I really wish I saved them because the pixels in those were so ugly during movement.

Cool I'm in, now try to convince gamedevs who believe the path forward for videogames is more and more graphixfaggotry. They'll never listen to you.

This isn't even an aliasing issue, I don't mind playing old games like Far Cry 1 without anti-aliasing on, the jaggies on edges aren't that bad.

This is because these shitty games are designed to use TAA so they have these weird mesh and shimmer patterns because the blurring from TAA gives the effect they desire, which is why it's almost always forced and needs third party shit to turn it off.

At that point it's a lot of heavy handed hackish methods just to degarbage an algo with the singular purpose of removing jaggies, which seems like it could be accomplished with something much more straightforward. It's bizarre to sit here praising the technology based on how much bullshit it has to do to not suck at a basic function.

easy, just don't have shiny or metallic surfaces in your games
or water, or hair, or glass, or mirrors (wait, they already do this and people complain)

game devs are just lazy, if you remove everything then it will look great

TAA is fine, but TAAU is an abuse of the system.

No, sorry. I just had this in my image folder because for the longest time I couldn't explain to people what I was trying to describe when devs made games with TAA in mind and what happens when you force it off.

I wish I had some showing things like character hair because that shit is awful.

All this shit worked fine when we used forward rendering. Just look at half-life alyx for a recent game that doesn't use this garbage.

alyx has full on pbr support and doesn't suffer from this issue

AI is involved!?!?

SAAAAAAAARS THIS IS THE GOOD!!!!!! BEST THING EVAR SAAR!!!!!!

all anti aliasing methods are ai because they all rely on automatic processes

Retard alert! Retard alert!
There are two types of anti-aliasing: the type that only uses the current frame, and the type that uses past frames to guess what the new frame should look like.
The result of the latter type of anti-aliasing is that it causes blurriness and smearing. The smearing part is pretty much solved, the blurriness is not.

At that point it's a lot of heavy handed hackish methods just to degarbage an algo with the singular purpose of removing jaggies,

You just described rasterization as a whole.
Everything in rasterization is a hacky method to do what ray tracing does properly.

Also it's not hackish, you just need subpixel accuracy for those effects, but that comes with a lot more vram usage, that's why it's not done, but as tech progresses so will those solutions

no it fucking didn't you retard, you had the same problems in crysis 20 years ago
and during the hl1-hl2 era people played at like 320-640p resolution with no AA, with jaggies so big they just had to mentally filter them out

AI blind haters and AI activists are both annoying and retarded.
If it's good, it's good, if it's not, it's not.
And posting six fingered garbage as "good" is absolutely retarded.

no it fucking didn't you retard, you had the same problems in crysis 20 years ago

Okay retard. You think turning off MSAA in crysis causes a shimmering effect across all surfaces? Because it doesn't you stupid nigger faggot, you'll see jaggies along edges.

i love how disingenuous you can be with videos like these
it's like the perfect example of "hey we have so many problems only taa can fix!!!" and then 95% of the discussion is noise to drown out dissent

how it brings satisfying contrast into a scene

Cutscenes aren't gameplay. The most valuable visual quality during gameplay is consistency, the player being able to quickly spot and identify gameplay critical elements. With raytracing, the player's position and the camera's position changes how object surfaces are lit. If you want to have recognisable designs or even recognisable objects you pretty much have to make them self-illuminating to override the environmental light, or alternatively your game has to be extremely slow paced and static or even use fixed camera. Raytracing is a tool which like all tools works for some cases and doesn't for others. There's a real danger of every game looking even more samey because they have to implement "light control" with object/effect glow and giving the player a flashlight or something so they can always see for gameplay purposes.

That said simulationism vs art/gameplay was already an issue in non-raytrace games, raytracing's biggest negative is introducing a big performance cost on top of that.

I seriously don't get why mipmapping isn't more popular.
It solves almost every issue.

Cutscenes

A scene isn't a cutscene

no shit, alyx uses every new solution they could think of, alpha to coverage for transparencies, msaa only for opaque geo edges and shader filtering to deal with specular flickering, a collection of surface specific solutions rather than something universal
and the game was insanely demanding when it came out

it worked for them since it was a short low scope game, same solutions can be too expensive for other games if they have too much geometry

In that case, the shaders and shit are made with the assumption that there will be TAA.
Anisotropic filtering is mip map on steroids and is the standard.
However it is getting common to make shit that relies on TAA completely.

With raytracing, the player's position and the camera's position changes how object surfaces are lit

It changes how rays are sampled but not how object surfaces are lit. You're basically arguing that in real-life everything blends together and you can't see anything.

What I wrote still applies.

Anisotropic filtering is mip map on steroids

bruh you need mip maps enabled so that your texture filtering can work... it doesn't fucking work with a full res texture at an angle

fuck off, i remember the same specular shimmering in crysis 3 no matter what AA i used
also it's not edge shimmering, it's normal map aliasing. while textures do get filtered in uv space, the shader itself is screen space and the final result creates those walking white dots, msaa CAN'T fix this, since it's not related to geometry, it's a shader problem

Mipmap is having several versions of a texture with reduced detail, and swapping between em at distance.
Trilinear is the same shit, but you smoothly blend between the stage.
Anisotropic filtering is having several texures but squished vertically and horizontally instead of just multiple regular steps.
It's the third evolution step of mip map.

aniso.png - 1200x568, 1.29M
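For reference, the whole mip idea as a toy numpy sketch (point sampling per level for brevity; real trilinear also filters bilinearly within each level, and aniso takes several such samples along the squashed axis):

import numpy as np

def build_mips(tex):                     # each level is a 2x2 box average of the previous
    mips = [tex]
    while mips[-1].shape[0] > 1:
        h, w = mips[-1].shape[0] // 2, mips[-1].shape[1] // 2
        mips.append(mips[-1].reshape(h, 2, w, 2).mean(axis=(1, 3)))
    return mips

def trilinear(mips, y, x, lod):          # smoothly blend the two nearest levels
    lo, t = int(lod), lod - int(lod)
    a = mips[lo][y >> lo, x >> lo]
    b = mips[lo + 1][y >> (lo + 1), x >> (lo + 1)]
    return a * (1 - t) + b * t

mips = build_mips(np.arange(16.0).reshape(4, 4))   # 4x4 -> 2x2 -> 1x1
print(trilinear(mips, 0, 0, 0.5))                  # 1.25, halfway between levels 0 and 1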

Because most aliasing in games these days is temporal. The problem isn't jagged edges on the edges of objects, which is what you need MSAA for. You cannot see this aliasing in still screenshots. You see it in motion, with subpixel detail shimmering and crawling across the screen due to rasterization being an ill-fitted sampling technique for the extreme level of detail in modern games. For temporal aliasing, you need temporal anti-aliasing. The other guy mentioned DE:MD, for a more recent example that you probably own go boot up RDR2, turn off TAA/DLSS and toggle MSAA on/off. It does literally fuck all to remove the kind of aliasing that graphically complex games have.

Mipmapping is standard on any actual texture format anon. As an amusing observation, DLSS actually solves any aliasing you get from not doing mipmapping, I actually tried it. It was pretty interesting because you could adjust the LOD bias in games so you'd get the full resolution texture always and it would still look good.

It changes how rays are sampled but not how object surfaces are lit

It does if the object moves, which wouldn't happen in an older game that doesn't use dynamic realtime lighting.

the extreme level of detail in modern games

What do we need all that detail for, anyway?

new game

open graphics settings

upscaling off

native resolution of monitor always

dlss fsr off, i dont even fucking know what they stand for let alone do

all antialiasing off

motion blur off

bloom off

post processing off or set to minimum if it doesnt let me turn it off

simple as
im old and technology has zoomed past my understanding, and i dont know any of these terms or how to really configure them so i get performance boosts without sacrificing image quality. i just turn all of it off and throw a huge GPU at it. rip.

MSAA has never been good. It hardly did anything and had a massive performance hit.

Wide Tent and Quincunx were effective. But they blurred things. And the same worthless idiots that claim MSAA was good now hated that.

MSAA advocates have shit for brains and will complain regardless.

Free Super Sampling is the best thing since women.

It's the name of "temporal anti aliasing" which has a specific function and that function results in garbage rendering anywhere it's used. Don't use semantics to be disingenuous.

He isn't being disingenuous at all by stating that TAA is a vague umbrella for various techniques with a common usage. Just like Lumen, Nanite, RTX, and various screen-space techniques can all be placed under the umbrella of "ray tracing".

There in fact are methods that are fundamentally different. DLAA is one example, it uses a proprietary and wildly different sampling algorithm from traditional TAA, but is still by definition "temporal anti-aliasing" since it samples multiple frames to remove aliasing.

You are an even funnier old codger. Modern graphics rendering is dependent on temporal accumulation to show up right.

the problem i have with MSAA is that for it to do shit it needs to be 4x+
and 4x is really demanding already when the scene is busy, and costs nothing when the scene is simple, leading to insane frame rate swings and frame rate drops, and if there are a lot of particles on screen? say hi to silky smooth cinematic 20fps

TAA is the specific technique of deforming the last frame to look like the current frame, so you can accumulate partial results into something that was processed across multiple frames.

The common use of anti aliasing means you shift every frame by a sub pixel offset, and it does accumulate into an anti aliased picture if the whole "deform the last frame" process is done well enough.

But now many effects are just done partially or in a noisy fashion so the TAA can "complete" it.
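A one-pixel toy of that loop (a dict standing in for the history buffer): reproject the previous frame through the motion vector, then exponentially blend it with the jittered current sample. When the motion vector is wrong, the same blend drags stale colors along, which is the ghosting anons keep describing:

history = {}                             # previous frame's result per pixel

def taa(pos, motion, current, alpha=0.1):
    # fetch where this pixel was last frame ("deform the last frame")
    prev = history.get((pos[0] - motion[0], pos[1] - motion[1]), current)
    out = alpha * current + (1 - alpha) * prev   # ~90% of the pixel is accumulated history
    history[pos] = out
    return out

# a static edge pixel whose jittered sample alternates 0 and 1 settles near 0.5:
# the sub-pixel offsets accumulate into an anti-aliased coverage value
for frame in range(100):
    out = taa((5, 5), (0, 0), float(frame % 2))
print(round(out, 1))                     # 0.5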

Modern graphics rendering is dependant on temporal accumulation to show up right.

just imagine that i gave you a brainless, slightly slack-jawed stare of uncomprehending ignorance, and that's my reaction to this sentence

DLSS actually solves any aliasing you get from not doing mipmapping

And then introduces an amazing amount of blurring.
Seriously, look at this shit.

no shit, you are upscaling here, use dlaa if you want to look at native result

just drop your fps from 60 down to 20 to get better graphics, bro!

No thanks. Monster Hunter Wilds had the same exact problem.

1080p

The sad truth anon is that no matter what you do, you cannot get a good image at 1080p in deferred rendering.

so why do you complain about blurry results if you aren't rendering at native?
if you rescale your image to what your base render is (guessing here it's balanced), your image becomes crystal sharp with no blur, showing that dlss isn't causing any bad image, it's the upscaling itself

smaller .png - 1114x626, 1.03M