Daily reminder that Unreal Engine 5 does not render graphics

Daily reminder that Unreal Engine 5 does not render graphics.
It renders grainy artifact riddled dogshit and then blurs over it with TAA smear.

Buying UE5 games is like bringing home an ugo because you were fooled by her makeup.

i fucking detest hair in UE5
i'll take what i think is a nice looking screenshot and then when i look at the still image there's a horse with hair that looks more like a digital artifact

Hair renders like that in every game if the hair is modeled very fine like that.
That's why stylized anime games have an edge.
This shit has been a thing since ps3 era when they started trying to have individual hairs on screen. Resolution was just too low to really see it like we can now. Not an Unreal Engine issue

Deferred rendering isn't unique to UE5. And the industry in no way "needed" to adopt deferred rendering, in case some shill comes here to shit up the thread.

They do this because deferred rendering is fucking dogshit at rendering multiple blended alpha layers, which is what you typically do for hair rendering. It's not only bad performance-wise, but also causes alpha layering bugs, and it's inherent to how deferred rendering works so you can't get rid of or fix the problem.

The solution people have mostly settled on is to use alpha masking instead of blending and dither the texture, then smear it around with anti-aliasing blur so it looks more like alpha blended textures than alpha masked + dithered. This is like 1000x more performant than layering alpha blended textures on top of each other and doesn't cause layering/transparency/shading bugs.
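
A rough sketch of that alpha-mask-plus-dither approach, written as plain C++ for illustration (the 4x4 Bayer matrix is the classic ordered-dither table; nothing here is actual UE5 code):

// Sketch of screen-door ("dithered") transparency: instead of blending,
// each fragment is either fully kept or discarded, using a 4x4 Bayer
// matrix as the per-pixel threshold. TAA later averages the on/off
// pattern back into something that looks like a smooth alpha.
#include <cstdio>

// Classic 4x4 Bayer thresholds, normalized to [0, 1).
static const float kBayer4[4][4] = {
    { 0/16.f,  8/16.f,  2/16.f, 10/16.f},
    {12/16.f,  4/16.f, 14/16.f,  6/16.f},
    { 3/16.f, 11/16.f,  1/16.f,  9/16.f},
    {15/16.f,  7/16.f, 13/16.f,  5/16.f},
};

// Returns true if the fragment survives the masked-alpha test.
bool ditheredAlphaTest(float alpha, int pixelX, int pixelY) {
    float threshold = kBayer4[pixelY & 3][pixelX & 3];
    return alpha > threshold;  // keep or discard, no blending
}

int main() {
    // A 50% transparent surface keeps roughly half the pixels in a
    // regular pattern; that pattern is the "grain" being complained about.
    for (int y = 0; y < 4; ++y) {
        for (int x = 0; x < 4; ++x)
            std::printf("%c", ditheredAlphaTest(0.5f, x, y) ? '#' : '.');
        std::printf("\n");
    }
}

The regularity of that pattern is exactly what the anti-aliasing blur is there to hide.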

And you're also not going to like the upcoming solution to this problem

which is ray tracing

but unreal engine renders everything as grainy pixelated abominations.

all """transparent""" objects

foliage

shadows

specular lighting

reflections

ambient occlusion

bloom

Everything.
when you turn TAA off it's just a dirty mess of grainy pixels.

render all effects at 720p or worse

game still runs like dogshit

BRAVO

transparency is:

colour + colour

one of the most simple GPU calculations in existence.
the N64 can do it, the SNES can do it... and modern devs' rendering method of choice is so shit it can't even do that.
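
For reference, the blend being invoked here is the standard "over" operator rather than a straight colour + colour add; a minimal C++ sketch of what a fixed-function blender computes per fragment (names are illustrative):

// Minimal sketch of the standard "over" blend that old fixed-function
// hardware (and forward renderers) performs per fragment: the incoming
// colour is weighted by its alpha, the framebuffer by what's left.
#include <cstdio>

struct Color { float r, g, b; };

Color blendOver(Color src, float srcAlpha, Color dst) {
    float inv = 1.0f - srcAlpha;
    return { src.r * srcAlpha + dst.r * inv,
             src.g * srcAlpha + dst.g * inv,
             src.b * srcAlpha + dst.b * inv };
}

int main() {
    // 50% red over blue gives the expected purple.
    Color result = blendOver({1.f, 0.f, 0.f}, 0.5f, {0.f, 0.f, 1.f});
    std::printf("%.2f %.2f %.2f\n", result.r, result.g, result.b);
}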

Is there a big-ass forum where all the retarded graphics devs in the world decided to agree on using the most consumer-unfriendly techniques known to man? I find it sad that all of this shit happened overnight without me having a chance to prevent it.

BG3 had exactly the same problem and it didn't use UE5.
The "solution" is to either play at 4K or use AI filters.

RE Engine hair looks amazing though. Or even better, Koei Tecmo's Katana engine.

IMG_1848.jpg - 3840x2160, 538.87K

It's a low resolution issue, try playing with DLAA or at 4k and it mostly fixes this.

posts blurry dithered garbage

The only game in recent memory I can think of that still uses traditional rendering for hair is KCD2

probably. gamergate proved journalists have a mailing list they use to collaborate, in order to make their jobs easier and more efficiently shit on their readers. When it was exposed, everyone was called racist and sexist and it was memoryholed. why wouldn't devs do the same?

Most state of the art game engine in 2025

hair rendering is barely above the Dreamcast from over twenty years ago

doa.jpg - 600x451, 33.14K

when I pirated this slopfest, I had to check my settings about 5 times before I realized that's just what the game was supposed to look like. what a fucking abomination. definitely GOTY material
Goyslop Of The Year

AI shit and absurdly high render resolutions to try and hide the fact that the rendering quality is trash.

a literal fucking scam fucking pathetic. Remember when you just rendered the game at native resolution and it looked and ran great because everything was being rendered the correct way?

one of the most simple GPU calculations in existence.

Anon, you're talking about something you don't understand. Transparency is not expensive (it's not cheap either because it's not as simple as that, but let's just go with it) but it becomes extremely expensive in deferred renderers because the renderer has no concept of layer sorting. It doesn't know if some object is behind or in front of another. Furthermore, objects are clustered and grouped and rendered in one pass for performance, and often these are the sort of objects that need transparency sorting, like grass or foliage.
If you want cheap-ish transparency you need to do like id tech that uses a mix between deferred and forward, if you do it right that could give you proper object sorting.

I can see exactly the same artifacts. That's just a 4K screenshot, all these problems are mitigated on high resolutions.

It doesn't know if some object is behind or in front of another

Every pixel processed in a fragment shader has a z-buffer value. If the incoming fragment is closer to the camera than the last written value, it overwrites it; otherwise it's discarded.
if deferred fags are too stupid to implement and save a z-buffer in the deferred render pass, that's on them.
when it comes to doing translucent objects you need to do a forward rendering pass instead of doing them in the deferred pass.

alternatively. get an art director and don't use 5000 light sources in every scene.
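
A minimal sketch of the single-value depth test described a couple of posts up, which also shows why the depth buffer alone can't represent stacked translucent layers (assume a conventional depth range where smaller means closer; everything here is illustrative):

// One depth value per pixel; an incoming fragment either overwrites it
// (closer) or is discarded (farther). That single value is all the buffer
// keeps, which is why several translucent layers can't live in it.
#include <cstdio>
#include <vector>

struct DepthBuffer {
    int width, height;
    std::vector<float> depth;  // one value per pixel, 1.0 = far plane

    DepthBuffer(int w, int h) : width(w), height(h), depth(w * h, 1.0f) {}

    // Returns true if the fragment passes (is closer) and writes its depth.
    bool testAndWrite(int x, int y, float fragmentDepth) {
        float& stored = depth[y * width + x];
        if (fragmentDepth < stored) {   // smaller = closer in this convention
            stored = fragmentDepth;
            return true;
        }
        return false;                   // occluded: discard the fragment
    }
};

int main() {
    DepthBuffer zbuf(4, 4);
    std::printf("%d\n", zbuf.testAndWrite(0, 0, 0.6f));  // 1: nearer than far plane
    std::printf("%d\n", zbuf.testAndWrite(0, 0, 0.8f));  // 0: behind what's already stored
}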

It's deferred rendering that results in dithered transparency, unrelated to UE5; the industry made that choice after gen 7, retard

You really don't understand what you're talking about. It goes way beyond just lights. Most effects rely on TAA and the fact that I haven't seen anyone complaining about them means they must be working as expected, with transparency being the only thing that looks bad.
In any case, the Z buffer only stores one single depth value per pixel. You don't understand how deferred rendering works. You have a salad in your head of half chewed thoughts and you seem to believe deferred renderers work similarly to forward ones, but you probably don't know how either one works.

Ouch, OP's avatarposting with the image in OP so he can't escape the embarrassment when he posts it again. People know he's oblivious.

this game looks like something from 10 years back tech-wise
but it doesn't matter cause it has art style and it runs fine even on old gpus

If it's not an Unreal problem explain KCD2

file.png - 578x575, 585.2K

how autistic do you need to be to care about this?

one of the most simple GPU calculations in existence.

the N64 can do it, the SNES can do it.

You don't know what you're talking about

UE5

RE Engine

RED engine

Luminous Engine

It's like that in all engines with deferred rendering.

it's a cost saving technique that many devs use regardless of engine

regular transparency still exists, they just use this because it's cheaper and has no sorting problems and you can't see the backsides

This bothers me too. We're sending people to Mars and we can't render polygons on a screen.

the Z buffer only stores one single depth value per pixel

That's all you need for opaque objects, since all your lighting calculation is done on a single frame buffer in the deferred pass.
then you can render all the transparent objects in a forward pass while reading from the Z-buffer.
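
A rough sketch of that hybrid idea: opaque geometry fills the depth buffer first, then transparent surfaces are sorted back-to-front and blended in a forward pass that tests against, but doesn't write, the opaque depth. All names here are illustrative:

// Composites a pixel's transparent layers over the already-shaded opaque
// result, skipping any layer hidden behind the opaque surface.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Color { float r, g, b; };

struct TransparentFragment {
    float depth;   // view-space depth at this pixel
    Color color;
    float alpha;
};

Color shadePixel(Color opaqueColor, float opaqueDepth,
                 std::vector<TransparentFragment> frags) {
    // Back-to-front: farthest first, so nearer layers blend over farther ones.
    std::sort(frags.begin(), frags.end(),
              [](const TransparentFragment& a, const TransparentFragment& b) {
                  return a.depth > b.depth;
              });
    Color out = opaqueColor;
    for (const auto& f : frags) {
        if (f.depth >= opaqueDepth) continue;  // hidden behind opaque geometry
        float inv = 1.0f - f.alpha;
        out = { f.color.r * f.alpha + out.r * inv,
                f.color.g * f.alpha + out.g * inv,
                f.color.b * f.alpha + out.b * inv };
    }
    return out;
}

int main() {
    Color c = shadePixel({0, 0, 0}, 0.9f,
                         {{0.5f, {1, 0, 0}, 0.5f}, {0.3f, {0, 1, 0}, 0.5f}});
    std::printf("%.2f %.2f %.2f\n", c.r, c.g, c.b);  // green blended over red over black
}

The per-pixel sort is what gets expensive and bug-prone once layers intersect, which is the tradeoff the thread keeps circling around.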

This guy uses fucking Game Maker to do what Unreal engine devs can't.
youtube.com/watch?v=RPwqoqZu_wY

If I blur out the problem, it doesn't exist

I know that you can mix and match passes. That's not deferred rendering, that's hybrid rendering and in my first reply I mentioned id tech which was one of the pioneers of blending rendering methods.

Use forward shading with reduced lighting for transparent surfaces

we have literally regressed as a society due to feminism and transgender rights
all of our art and technology will fall into obscurity and we'll return to the dark ages, only distant legends of arcane sorcery remaining
we will witness this in our lifetimes

then you can render all the transparent objects in a forward pass while reading from the Z-buffer.

Then that's not deferred rendering anymore

To be fair I'd rather have shitty hair than hair locked behind vendor-specific technologies (not to speak of the fact that HairWorks looked like absolute ass if you weren't making vanilla straight hair)

1st year of cs

yeah, there's a reason most UE5 games force you to use AA. it's a known issue and why devs pretend native doesn't exist nowadays. if I own a 4K monitor, I shouldn't need AA.

good.

stuff like not perfect hair rendering is wayyyyy below in importance to other aspects of video games

Like Clair Obscur doesn't have proper lipsync, it's fucking terrible
Clair Obscur has shit-quality videos for no reason while 2TB NVMe SSDs cost 120€, like the space is there for you to use

A GTX 1060 is the bare minimum to play a game that looks worse than a PS3, and people are ignoring how awfully optimized it is just because the game is apparently "decent enough."

HairWorks tessellation

Fucks performance in the ass as shown in Witcher 3

This shit has been a thing since ps3 era

Wrong. In PS3 era hair was just layers of semi-transparent textures like every old game. That's why hair was so much better in the not-too-distant past.
What happens is that now everyone uses another rendering method, which is supposed to be faster, and in this new method it's harder to deal with transparencies, so instead of being transparent layers, hair is now layers with dithering, just like people used to make fake transparencies in the 1980s-1990s.

Is this why all hair in UE5 looks like absolute dogshit and seems to disintegrate the further you zoom out from characters?

Can only be fixed with heavy temporal AA that blurs everything the moment the camera moves

I kneel...

People don't like to admit it, but there is no such thing as "progress"; there are a few algorithms and the hope that their shortcomings will be mitigated in the future. The industry traded transparency for more light sources

Here's my solution.
You're welcome Tim Sweeney. You can send me my check in the mail.

TAA.jpg - 706x847, 51.39K

No one cares Threatnigger, go copyright strike people who call you out on your bullshit.

Why did Hairworks become the norm rather than the superior TressFX from AMD?
Did Nvidia literally moneyhat developers to force it? That's the only conclusion I can think of.

it didn't, it's so bad it's dead

Why did Hairworks become the norm

It didn't, it got killed and replaced by RTX just like every proprietary Nvidia technology

When both were relevant it did.

I always chuckle when I see two anons arguing, and after each response I change positions on who I think is correct simply because they sound like they know what they're talking about in the response.

Me reading each response:

Ohhh

Ohhhhhh

OHHHHHH

OHHHHHHH!!!!

OHOH@O#HJ$L:HJ@L#$JLK@

All the while I have no idea who the fuck is right until the end (if I find out at all).

Well yes I was mostly referring to when they were both vying for relevancy. I remember TressFX being superior yet everyone still used Hairworks shit.
It mirrors current situations in a lot of ways is all.

You can run 2 render passes, one for opaque objects in deferred shading and one for transparent objects in forward shading, then combine them, but it's not efficient, and making transparent assets for forward rendering requires strict ordering; your artists can't make crazy hairstyles anymore, hair cards have to be layered and must not intersect.

Engines used to use forward rendering, but that had its problems such as lights being hard to calculate, and not having access to utility buffers. Deferred rendering solves those problems but sucks for transparency. Modern engines are leaning toward hybrid approaches.

This is even blurrier

it's because of dithering
dithering is a pixel art technique that has no place in semi-realistic 3d games
it looks weird
like seeing a sprite in a 3d game
vaseline makes it even fuglier

Also I forgot to mention, but this guy is completely right. Forward rendering has the advantage that it renders objects one by one, so sorting is almost trivial. But if you have two hair cards intersecting, then the renderer will render one fully on top of the other.
This problem doesn't exist in deferred so you can make any kind of weird hairstyle that you want.
For example, something as simple as braids (one hair card being both on top and behind another) is almost impossible to do with forward, unless you make each braid segment fully opaque.

I hate unreal engine so much.

first thing you notice when the game starts

autistic

I'm sorry anon but you might be the one with brain damage here

there were studios that were handling Hair in UE3, with the new tech out there this shit should be easy

lights being hard to calculate

They started using too many lights and horribly optimized lighting effects.

The problem is that I don't give a fuck if a character's grotesque skin pores are reflecting 500 irrelevant light sources from 500 miles away, causing a 1% difference to the image that only makes it look worse.
But suits think I do.

The real world isn't limited to 10 lights in your range of vision, tard. Too many lights my ass

I hate unreal engine so much.

so much it's unreal.

the real world looks like shit.

Unreal GOAT era:
UE1
UE2
UE3

Unrealsloppa era:
UE4
UE5

Every game is now Fortnite Era:
UE6

It's not just lights. Deferred gives you the G-Buffer that you can use to run all sort of screen space calculations, and it's easier to work with, see this anon's reply

Fuck you forward+ is the only way mang.

Based anime induced psychosis enjoyer

Don't care use a cubemap instead.
I want muh smooth graphics and transparency back.
Priority 1.
if you can't even get that fundamental, nothing else you do has value.

t. third world monkey who's never seen more than 10 lights in his range of vision

BG3

definitely looked like shit if you didn't turn up the settings but it was ok once you did

tav.jpg - 1920x1080, 293.55K

SDFs can solve the hair problem. If hair were first rendered as SDF curves, then it could be placed into the scene as a single flat object, instead of a hundred individual cards. One transparency. Easy to deal with.
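
A minimal sketch of what "hair as an SDF curve" could mean: a strand approximated as a polyline whose signed distance is the distance to the nearest segment minus the strand radius. Purely illustrative, not how any particular engine does it:

// Evaluating this field on a single screen-aligned quad would give one
// smooth alpha edge for the strand instead of many dithered cards.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

static float segmentDistance(Vec2 p, Vec2 a, Vec2 b) {
    Vec2 ab{b.x - a.x, b.y - a.y}, ap{p.x - a.x, p.y - a.y};
    float t = (ap.x * ab.x + ap.y * ab.y) / (ab.x * ab.x + ab.y * ab.y);
    t = std::max(0.0f, std::min(1.0f, t));          // clamp to the segment
    float dx = ap.x - t * ab.x, dy = ap.y - t * ab.y;
    return std::sqrt(dx * dx + dy * dy);
}

// Signed distance to a strand of given radius; negative means inside.
float strandSdf(Vec2 p, const std::vector<Vec2>& points, float radius) {
    float d = 1e9f;
    for (size_t i = 0; i + 1 < points.size(); ++i)
        d = std::min(d, segmentDistance(p, points[i], points[i + 1]));
    return d - radius;
}

int main() {
    std::vector<Vec2> strand{{0, 0}, {1, 0.2f}, {2, 0.1f}};
    std::printf("%.3f\n", strandSdf({1.0f, 0.25f}, strand, 0.05f));
}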

Anon Babble actually knows how game engines work

wtf I thought you retards just got angry at videogames

Why can't vidya do hair right?

Don't worry, this is an anomaly. The catalog still is 90% turdworlder rage bait or thinly veiled coomer threads.

like bringing home an ugo because you were fooled by her makeup.

so every woman ever

They've been watching that Threat Interactive guy. They're just repeating what the youtuber told them to think.

Isn't that with PhysX on? Hair did look better 10 years ago with PhysX and TressFX.

That's not UE, that's devs making objects grainy so that TAA blends them.

We've lost so much

Screw the hair look at how hard the dithering cucks vegetation

has he actually gone into the problems with deferred rendering and the superiority of forward rendering yet?

These are transparent polygons with some dynamic movement which is still a great technique. Slap in some shader and it's great.
Most nu-engines are using some other faggotry.

That kid is mentally retarded and doesn't know how game engines work though. He also promised to fix UE's problems if people donated a million dollars to him.

You have no clue what you're talking about.

You're free to debunk him. Nobody has so far. They just attack his credentials, and then go on to make the same TAA smear shit that everyone hates.

Pedantic complaints of someone who doesn’t actually enjoy video games

Everyone's hair looking like pixel grit at all times, is acceptable.

Nobody that actually knows anything about rendering has bothered because we all know that arguing with retards is a pointless endeavor.

Nobody thinks that those small strands look good.

How can zoomers look at this and think it looks bad?

Zoomers?

You do know they're romanticizing the early 2000s? They like those graphics because they're retro.

Hair has always looked bad in vidya. You are just jealous of the success of Expedition 33.

Beautiful, it looks perfect, I have myopia btw

DLSS 4 fixes this

Dithering is in every modern game as a way to push high quality materials and shading rendered in a low resolution which causes a grainy look that should be cleaned up with temporal anti-aliasing.

Ironically, the least amount of dither I've ever seen in a modern game comes from Bethesda and people mercilessly criticized their engine. Well, Oblivion Remaster is Unreal now. Happy?

UE5 is much worse than UE4. Also UE3, while not terrible, allowed all the poor development practices to creep in.

Yeah, because anime hair has almost zero animation to it, you might as well be looking at a PNG being moved in Paint

kill yourself shill

No because you can't shill for forward rendering when its graphics are stuck in 1999

Well, it's a choice between semi-translucent and dithering: semi-translucent textures usually have sorting issues and overdraw, and dithering has a lot of pattern artifacts and noise in general.
A better solution would be alpha hashing, but it's expensive, and more so with animation
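
A rough sketch of that alpha hashing idea: instead of a fixed screen-space dither pattern, the discard threshold comes from hashing a position tied to the surface, so the noise sticks to the object instead of crawling across the screen. This is only the core idea; published hashed alpha testing also stabilizes the hash across mip levels and animation, and the hash below is a made-up cheap one:

// Hash a surface-attached position to a threshold in [0, 1) and use it
// in place of the Bayer matrix for the keep/discard decision.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Cheap sine-based hash from a 3D position to [0, 1). Illustrative only.
float hash3(Vec3 p) {
    float n = std::sin(p.x * 127.1f + p.y * 311.7f + p.z * 74.7f) * 43758.5453f;
    return n - std::floor(n);
}

// Keep the fragment if its alpha beats the hashed threshold.
bool hashedAlphaTest(float alpha, Vec3 objectSpacePos) {
    return alpha > hash3(objectSpacePos);
}

int main() {
    // A half-transparent hair card keeps roughly half its texels,
    // but in a noise pattern instead of a regular grid.
    int kept = 0;
    for (int i = 0; i < 1000; ++i)
        kept += hashedAlphaTest(0.5f, {i * 0.013f, i * 0.007f, 0.0f});
    std::printf("kept %d of 1000\n", kept);
}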

No-one's arguing that it looks bad, thankfully. The modern audience has just grown used to eating slop

Anime hair could be swinging around at 480 FPS and still look better than a still image of UE hair.

They don't

what game has the bestest looking hair?

ue5 invented dithering and taa

any kind of thread about the technical aspects of vidya is always the most embarrassing dunning-kruger shit.

Well, unoptimised hash functions are a problem.
If we stopped thinking about performance and used Perlin noise distribution or even multi-octave noise, that shit would look good

If you're so knowledgeable about rendering, then why do you all have shitty TAA smears?

grainy artifact riddled dogshit

To be fair, that's how reality works but on a smaller scale. That's basically what atoms are, anon

Answered above, 4th post in the thread

Yes the problem with real life is alphabet hashing

oh no my video games are ruined, whatever will i do

no wonder it runs like shit

Tranny weeb genshit enjoyer. Stick to unityslop I'm sorry your pc can't run real games

The words you are all looking for are "Forward Rendering"

Forward rendering excels at real time lighting, sucks at real transparencies. So as a bandaid for that hiccup, we dither things that used to use alpha layers. And to fix the grainy dithering, we use temporal anti-aliasing. And to fix the smeary blurry temporal anti-aliasing, we use algorithmic sharpening. And to fix the performance hit of insisting on using real time lighting, we use AI assisted frame interpolation. And to fix the artifacts from AI assisted frame interpolation - we fucking kill ourselves because this industry is cooked.

tbu (this but unironically)

Deferred rendering*

we were talking about the shit devs need to do because they rely on that rendering technique. We know that deferred rendering is

Did all this shit really come from devs wanting real time lighting that much?

So basically, your answer is "we going to make TAA smears, and you just have to get used to it". Sorry, but that's not a satisfactory answer. It sounds like you don't actually know how to get good performance, with good visuals at the same time. So you pass the blame down onto the consumer.

Not the same anon, but what would you prefer? Prioritizing performance, or using expensive distributions like Perlin and multi-octave noise and losing a lot of performance?

I'm asking you the same question: what would you prefer? Prioritizing performance with TAA and screen space, or using expensive distributions like Perlin and multi-octave noise and losing a lot of performance?

Uhhh... Which option actually has smooth alphas, and not faked grainy alpha? I choose the smooth alpha option. 4 channels.

Not the same anon, but the problem is specific to alpha, since transparent objects can and usually will have tons of depth values and it's impossible to capture them correctly in a single G-buffer.

how did the n64 and dreamcast do it

Well, we are talking about dither; if we are talking about semi-translucent materials in a deferred pipeline, we are talking about weighted blended order-independent transparency
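
A minimal sketch of weighted blended order-independent transparency as named above: every transparent fragment is accumulated with a depth-based weight (no sorting), along with a running "revealage" product, and a final resolve recombines them with the opaque background. The weight function here is a made-up simple one for illustration:

#include <algorithm>
#include <cstdio>

struct Color { float r, g, b; };

struct OitPixel {
    float accumR = 0, accumG = 0, accumB = 0, accumA = 0;  // weighted premultiplied sums
    float revealage = 1.0f;                                // product of (1 - alpha)
};

void accumulate(OitPixel& px, Color c, float alpha, float depth) {
    // Closer fragments get larger weights so they dominate the average.
    float w = alpha * std::max(0.01f, 1.0f - depth);
    px.accumR += c.r * alpha * w;
    px.accumG += c.g * alpha * w;
    px.accumB += c.b * alpha * w;
    px.accumA += alpha * w;
    px.revealage *= (1.0f - alpha);
}

Color resolve(const OitPixel& px, Color background) {
    float denom = std::max(px.accumA, 1e-5f);
    Color avg = {px.accumR / denom, px.accumG / denom, px.accumB / denom};
    float show = 1.0f - px.revealage;  // how much transparent stuff covers the pixel
    return { avg.r * show + background.r * px.revealage,
             avg.g * show + background.g * px.revealage,
             avg.b * show + background.b * px.revealage };
}

int main() {
    OitPixel px;
    accumulate(px, {1, 0, 0}, 0.5f, 0.6f);   // red layer
    accumulate(px, {0, 1, 0}, 0.5f, 0.3f);   // green layer; order doesn't matter
    Color out = resolve(px, {0, 0, 1});
    std::printf("%.2f %.2f %.2f\n", out.r, out.g, out.b);
}

The order-independence is the selling point; the price is that the result is an approximation of the true sorted blend, not an exact one.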

Man, from the same engine that gave us BioShock, the Batman Arkham series and Dishonored. UE5 graphics may look better in the modern sense, but beautiful grainy blurry ghosty shit is still shit. It's like taking 1 step forward, 2 steps back with the evolution of this game engine

Forward rendering does it just fine because it's not doing a gorillion calculations concerning said depth values.

Is this a real question? I mean I can answer it if you want

tranny le cinematic slop with almost 0 gameplay that can barely be considered RPG

can't even look good

E33 brownoids... how...

Maaaan... just tell me how to make the dithering grain go away. Like ACTUALLY go away. Not just hidden behind smear. I just don't want to see blurring and jaggies. We didn't used to see it back in the day. Now we do. Make it stop.

You need deferred rendering if you want multiple lights, forward is dogshit for most cases.

Why do gamers pretend they know anything about computer graphics?

katana engine.

lolno
the current hair problem is because of modern rendering techniques

Why do you pretend?

Using hash functions like Perlin distribution and also multi-octave noise, but that would impact performance like crazy

That IF is doing a lot of heavy lifting.

I'd personally prefer Forward+ or Clustered Forward. Fuck this framebuffer heavy, dithered nightmare swamp we're trapped in.

It's not an unreal problem, it's a lazy and untalented dev problem.
Unreal has a lot of solutions that work right out of the box and devs use them whether it suits their project or not.

Yes, you can't correctly capture multiple depth values in a single G-buffer pass

Multiple lights aren't that hard. Just shoot rays from the viewpoint to the geometry, then from the geometry to the light source. If the ray is shorter than the full distance required to reach the light source, then the pixel is in shadow. This is relatively inexpensive, because it's only a single ray per pixel, and only a single bounce per light source.
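
A sketch of the shadow-ray test being described: from the shaded point, cast one ray toward each light, and if anything is hit before reaching the light, the point is in shadow for that light. The sphere scene is only there to keep the example self-contained; names are illustrative:

#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float length(Vec3 a) { return std::sqrt(dot(a, a)); }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct Sphere { Vec3 center; float radius; };

// Nearest hit distance along a normalized ray, or negative if no hit.
static float intersect(Vec3 origin, Vec3 dir, const Sphere& s) {
    Vec3 oc = sub(origin, s.center);
    float b = dot(oc, dir);
    float c = dot(oc, oc) - s.radius * s.radius;
    float disc = b * b - c;
    if (disc < 0) return -1.0f;
    return -b - std::sqrt(disc);
}

bool inShadow(Vec3 point, Vec3 lightPos, const std::vector<Sphere>& scene) {
    Vec3 toLight = sub(lightPos, point);
    float distToLight = length(toLight);
    Vec3 dir = scale(toLight, 1.0f / distToLight);
    Vec3 origin = {point.x + dir.x * 1e-3f,            // offset to avoid self-hit
                   point.y + dir.y * 1e-3f,
                   point.z + dir.z * 1e-3f};
    for (const Sphere& s : scene) {
        float t = intersect(origin, dir, s);
        if (t > 0.0f && t < distToLight) return true;  // occluder before the light
    }
    return false;
}

int main() {
    std::vector<Sphere> scene{{{0, 1, 0}, 0.5f}};
    std::printf("%d\n", inShadow({0, 0, 0}, {0, 3, 0}, scene));  // 1: sphere blocks the light
    std::printf("%d\n", inShadow({2, 0, 0}, {2, 3, 0}, scene));  // 0: clear path
}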

Are you sure that's how it's done? With noise? Noise often implies some kind of dithering is happening. Why can't you just have an alpha channel? Like a PNG image.

saar i am not the brown, you are having the brown

saar i post tv show from whiteland of people like I see?

lmao jeet retard

They should use graphical techniques that don't require them to slather everything in ugly ugly TAA or DLSS or whatever to function. Older games have good looking hair that doesn't need it, so why force it now when the downsides are so bad?

Yes, with noise; without noise you have a grid-like effect in the alpha-hashed object. I said Perlin because of the high-frequency detail, but it's more expensive

works on my machine™

626612.jpg - 3000x2000, 2.4M

with almost 0 gameplay

The only bad review it got was PC Gamer giving it a 7 for having too much gameplay

You used the wrong buzzwords, check the script your trooncord gave you.

Alpha hashed? Why are you talking about Alpha hash? Isn't that the alpha method that creates jaggies? What is the alpha method that doesn't create jaggies? It used to exist. Games used to have it. What was it?

play at medium with 1080p fake resolution

ERM.... THE HAIR IS BAD

Are you niggers for real? The hair looks fine for me playing on Ultra with 100% upscale.

I mean, as I said before, even in a deferred pipeline we can use weighted blended order-independent transparency

mad cuz bad

That would be semi translucent materials

Post a pic.

Strange name. I'm not picking up anything concrete on google. I don't think that's the term used in the industry. How would I find more information on this? What would I look up?

Any decent mods yet?

yeah but the real problem is that developers force things like TAA on as bandaid fixes to cover up for incompetence so even if you've got really good hardware it still looks bad and doesn't run nearly as well as it should

incompetence

Prioritizing performance and good visuals. RDR2 is praised for its graphics yet it's one of the most notorious examples of TAA being heavily used to clean up the image.

You want to search for forward rendering, and it's called alpha blending; search for it that way

Prioritizing performance

Incompetence.

Peak performance

Imagine using multi octave noise

What exactly is preventing developers from using alpha blending? I'm reading that deferred rendering can't handle transparencies at all! Huh. Maybe deferred rendering is the mistake? Maybe we should go back to the drawing board, and figure out something better?

as shit as veilguard was, and having never played it, I remember seeing they did hair really nice

Everything was unity slop in the 10s, now in the 20s it's all unreal engine slop.

What happened to people who could make their own non shit engine for their games? I'm tired of everything looking like a slightly modded unreal tournament or Fortnite clone

ur supposed to use dlss4

the industry in no way "needed" to adopt deferred rendering

They had to on account of deferred rendering being quite a lot more efficient in silicon.

That's not the case, the deferred pipeline has problems with alphas because of the G-buffer.
The way it works is that deferred rendering stores depth and material in G-buffers.
Since you have multiple objects along a ray you get ambiguous depth values, so you can't store completely accurate depth information in the G-buffers, and there's also the problem with colour accumulation....
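
A rough sketch of what a G-buffer texel holds, per the post above: one set of surface attributes and one depth per pixel. The exact layout varies per engine and this one is purely illustrative; the point is that two translucent layers over the same pixel would need two of everything, which this format can't express:

#include <cstdint>
#include <cstdio>

struct GBufferTexel {
    float    depth;        // single depth value for this pixel
    float    normal[3];    // normal of the one visible surface
    uint8_t  albedo[3];    // base colour
    uint8_t  roughness;    // material parameters read later by the lighting pass
    uint8_t  metalness;
};

// The lighting pass reads these attributes and shades the pixel, which is
// why anything needing several depths/colours per pixel (alpha blending)
// has to be handled in a separate pass or faked with dithering.
int main() {
    std::printf("%zu bytes per G-buffer texel in this sketch\n", sizeof(GBufferTexel));
}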

deferred rendering doesn't suck at transparencies, it can't do them at all. Dithered transparencies are an unfortunately ugly performance-saving measure, but they're not used "because" of deferred rendering; we could do alpha blending like we used to, but it'd run like absolute shit.

there's no ultimate solution, clustered or tiled mixed forward/deferred renderers have been a fantastic addition to the development pipeline but until we have near unlimited bandwidth, we're stuck with dithering.

Daily reminder that Unreal Engine 5 does not render graphics.

posts game not made with UE5

???????

Strange... every game for the past 6 years has done this but the consoleniggies only mention it when it suits them.

Sooo.... the problem is deferred buffering then. If it can't do all it needs to do in order to get data for transparency, then that's a MASSIVE problem.

sorry, I meant deferred rendering. Not deferred buffering.

also dithered transparencies are not a new problem in the gaming scene, FF13 replaced almost all of its alpha blended transparencies with dithered ones before release to make up performance, it was a massive downgrade from the pre-release footage.

I'm not sure whether that or the hair animation bothered me more. It felt very floaty and unnatural, as if it were in some low gravity.

because it's harder to market and sell an open source, non-hardware specific technology for your own gain.

It's so fucking insane to me that we're sacrificing so much for graphics that look like this. I feel like games haven't gotten much prettier since like CoD MW2, but they run like dogshit now and feature shit like this.

Did Nvidia literally moneyhat developers to force it?

Most probably. If Nvidia gave them hardware to develop their games, then they must use Nvidia's tech even if it's worse than the competition.

ITT Autists who don't know what a game engine is.

I swear one of you retards said in a previous thread that every Unreal game had cartoon textures, as if it's a requirement of the game engine that your game looks like Fortnite.

complain all you want but just be less of a fucking retard please, you can appreciate the art in older games til the end of time, but you can't say that real time rendering itself hasn't improved to a RIDICULOUS degree since 2009, what an unbelievably silly belief.

what we're gaining in technical progression is a loss of good artists and older, experienced developers with the taste required to get the art out the door we actually like, that's the problem.

This is such bullshit. You're telling me that hardware is 10x more powerful than it used to be 10 years ago, but we can't add a few more alphas without tanking performance? Someone isn't telling the whole truth here. What are you omitting?

There's a bunch of solutions but they kill performance
You can use multiple frequencies of noise; you can search for it as octaves or multi-octave noise.
There's also the Perlin algorithm or Simplex, but those are more expensive than the common fast gradient noise most games use. You can also search for weighted blended order-independent transparencies; that may be more what you are looking for instead of screen-door transparency.

There's a bunch of solutions but they kill performance

Do they kill performance? or do they kill performance *for deferred rendering*?

Have you ever considered that your lack of knowledge and experience with the subject might be why you can't understand what he's saying?

absolutely nothing, you just don't know what shaders are, what lighting is, what resolution they run at now compared to before, the reasons they're expensive to run, the mesh density, the rigging, it all adds up. I'm not going to pretend it's all sunshine and daisies, the industry right now is riddled with piss poor coders who can't be fucked optimizing their work because it's accepted by the layman masses that we can just use upscaling techniques to work out the kinks.

kek what the fuck are you talking about you retard

NTA but you can't really use that argument. Let's say we switch back to a forward renderer? Real-time lighting is now near impossible without some new breakthrough in tiled/clustered forward rendering. You get your transparencies, you lose lighting, the thing that makes games look "good", AND you lose your performance.

Wrong. In PS3 era hair was just layers of semi-transparent textures like every old game

saints row 1, 2006. check out those eyebrows.

file.png - 730x452, 615.4K

Should have stuck with UE4, hair in Rebirth looks 1000x better.

because it doesn't use any upscaling or an overly aggressive TAA, not because it's Unreal 4

there are some old games filled to the fucking brim with things that make them nearly magic products that shouldn't exist if a couple of mega autists hadn't tried their hardest to make something happen. Comparatively, the way the industry treats the customer is fucking obscene and it's genuinely appalling that this is how things have gone.
Gay retarded thalidomide nigger babies will shout

poor

brown

ok but this literally doesn't matter, why do you care

while ignoring that games

look worse

run worse

have more problems

need more solutions to fix problems that didn't used to exist that eat up more power and still look bad

take up 10-1000x as much hard drive space

makes having to turn down your settings a punishment rather than missing out on bells and whistles that you can do without

for uhhhh

uh

LIGHTS

and uh...

4k TEXTURES (that get upscaled from 720p since you can't run it at native btw)

raytracing and anything associated with it is the biggest fucking waste of development time possible. Why spend the time to craft every piece of your several hundred MILLION dollar+ game so that every frame could practically be used as a wallpaper when you could just force the consumer to brute force it and then berate them when they complain about how poorly it runs.

you MUST buy the new gpu

y

because we added a permanent asset of an apple that sits just outside of your character's FOV and every week we add 20k tris to it until you can't play anymore

what we're gaining in technical progression is a loss of good artists and older, experienced developers with the taste required to get the art out the door we actually like, that's the problem.

we are losing fucking both. How much is technical advancement worth if it hardly fucking works? If I step in water and the only thing that happens is a 4-frame splash PNG, I would take that E V E R Y single time over stepping in water and losing 20 fps over it

Well, since we are talking about the deferred pipeline and alpha hashing, the noise is used so you don't get a checkerboard-like pattern; that's why I'm pointing out noise algorithms, since they are the cause of the artifacts OP mentioned. In forward rendering you don't need to use noise

because sorting them in the depth buffer becomes a problem very quickly

That's a distinct possibility. But in my limited experience exploring the world of 3D graphics, I have encountered people who tell me something isn't possible, when it's totally possible. Tech nerds are the biggest "we have no choice" fags in the world, when they totally have a choice.

I refuse to believe that hair transparencies jumping from about 20 during the PS2 days to 80 for modern hair, is destroying performance to the point where you have to switch to dithering.(Don't give me shit about exact numbers. It's just the idea that the transparencies haven't gone up *that* much proportionally.)

What do you mean by real time lighting exactly? Because older games have various lighting solutions. What specific kind of lighting does deferred rendering do, and how is it better than what forward rendering is capable of?

Well maybe deferred rendering is bad and we should stop using it?

Exactly, but what about OIT?

in deferred lighting you light the scene after rendering the opaque polygons, storing all of the materials in the g-buffer etc, it allows for near infinite light sources for almost no performance cost, compared to forward rendering where everything a light touches has to be re-rendered every frame for every surface.

and it's not about the amount of transparencies, it's more about how they have to be re-inserted to the scene, again, in deferred rendering you can't DO transparencies so you have to render the scene, light it, render transparencies in a separate buffer and add them back in.
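
A minimal sketch of the deferred lighting pass being described: per pixel, read back the surface attributes the geometry pass already wrote, then loop over lights and accumulate shading without ever re-rendering geometry. Struct names and the simple Lambert/attenuation terms are illustrative:

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct GBufferSample { Vec3 position, normal, albedo; };   // one surface per pixel
struct PointLight    { Vec3 position, color; float radius; };

Vec3 lightPixel(const GBufferSample& g, const std::vector<PointLight>& lights) {
    Vec3 out{0, 0, 0};
    for (const auto& L : lights) {
        Vec3 toLight = sub(L.position, g.position);
        float dist = std::sqrt(dot(toLight, toLight));
        if (dist > L.radius) continue;                     // outside the light's range
        Vec3 dir{toLight.x / dist, toLight.y / dist, toLight.z / dist};
        float ndotl = std::max(0.0f, dot(g.normal, dir));  // simple Lambert term
        float atten = 1.0f - dist / L.radius;
        out.x += g.albedo.x * L.color.x * ndotl * atten;
        out.y += g.albedo.y * L.color.y * ndotl * atten;
        out.z += g.albedo.z * L.color.z * ndotl * atten;
    }
    return out;
}

int main() {
    GBufferSample g{{0, 0, 0}, {0, 1, 0}, {1, 1, 1}};
    std::vector<PointLight> lights{{{0, 2, 0}, {1, 0.9f, 0.8f}, 5.0f},
                                   {{3, 1, 0}, {0.2f, 0.3f, 1.0f}, 4.0f}};
    Vec3 c = lightPixel(g, lights);
    std::printf("%.2f %.2f %.2f\n", c.x, c.y, c.z);
}

Adding another light is just another loop iteration per covered pixel, which is where the "many cheap lights" claim comes from; nothing in that loop knows anything about transparent surfaces.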

Yeah, I really hate the way it renders hair
At times it looks like a very low res texture

Unreal GOAT era

UE3

You have giga rose-tinted goggles. UE3 was diabolically shit.

DOOM 2016 for example is a hybrid renderer, has fantastic materials, lovely lighting, AMAZING performance, and the transparencies are handled shockingly well, even below native res.

it's not impossible, it's just unlikely that the talent is around to pull it off in most modern dev teams, from their point of view it's like "why optimize this specific aspect of our renderer if we can just deal with the performance with a post process blur or upscale?"

I'm not here to dissuade everyone or change their opinion, I just want it to be clear that it is so very rarely the engine at fault, it's the people.

it allows for near infinite light sources for almost no performance cost

This doesn't make sense to me. Because I haven't seen a great increase to lighting quality. If deferred can really do near infinite light sources, then why don't games look raytrace quality already? Shadows are still jagged along the edges. Shadows are still imprecise. Ambient occlusion is still often disabled in many scenes, or imprecise in other scenes. Where are all the scenes with dramatic lamp lit shadows, like back in the Fear and Doom 3 days?

OIT is very computationally expensive, on cpu and gpu, but yes it would look nice.

why's that?

light source count and resolution are independent of shadow count and shadow resolution; ambient occlusion is a post process, again independent from lighting or shadowing. you can selectively decide whether or not a light source is shadow casting, and it can have an independent shadowmap resolution separate from the light and from the lightmap resolution of the world itself.

moving to RT and unreal's virtual shadow maps is solving this by unifying both, a nice step forward, like when John Carmack's Doom 3 first unified lighting and shadows back in 2004

Buying UE5 games is like bringing home an ugo because you were fooled by her makeup.

I'm sorry I didn't quite catch that, can you say that again but with a food analogy instead please

ambient occlusion is also solved by it actually being real time with RT, instead of something you just pull data from the depth buffer for, we're in a better place with all of these things now.

Is there any engine that can get hair right?

It's not that simple, the lighting calculations in the deferred pipeline are done in a completely separate pass after everything is stored in G-buffers; that makes it super efficient and reduces overhead when the scene has tons of lighting. That doesn't work in forward rendering since they are done in the same pass. So it's alpha blending vs light sources.

also to get back to your point about light quality? light QUALITY is essentially photo-realistic nowadays, the issue of realism now comes down to material response: if the material looks bad, the lighting will reflect that. just take something like Jusant or Fortnite's update from a few years back, the lighting itself is beyond fantastic, so good that it makes these clay-esque rendered games look semi-photo-real.

minecraft RT is another good example, as it takes a world of cubes that is so abstract visually to a level of fidelity, from LIGHTING alone, that is in my opinion "photo-real"

Whoa there. You're starting to sound a little market-y there, shill bros. Calm down on the cocksucking.
What's "RT"?

If you remove all the difficult stuff like shadows and ambient occlusion, then claiming you can create a lot of lights isn't an impressive claim. N64 games could already render multiple lights along surfaces. Current computers are like 1,000x faster than an N64. So I suspect that forward rendering can render 1,000x more lights without an issue. Deferred rendering might be able to do more like you claim. But do you really need 10,000 lights in a scene? Seems like a pointless way to measure the quality of the rendering method.

Have you considered that you're looking at things a little too closely? Most people don't give a shit about these microscopic details and have been singing the praises of E33's visuals.

do you really need 10000 lights in a scene

have you seen cyberpunk? yes, we need the ability to do that when required, we're also not lighting models consisting of 200 polygons anymore, I can't tell if you're messing with me, honestly, no hate.

microscopic

in the center of your screen at all times in a 3rd person game

commit die pls

anyway, legitimately interesting thread but I need to fuck off and do meal prep, if it's dead when I'm back it's been fun talking

you are literally splitting hairs here, you just might be autistic anon

I'm not messing with you. I'm actually considering how many lights might be in a scene at one time. Even if you consider a game like Cyberpunk, how many lights might be enabled at once? Say you're on a street with neon signs. About a light per sign. So maybe like 200 lights in a scene. I don't see how that's so impressive.

Not that anon but you have no idea of how lighting works on the n64; using fixed point arithmetic to calculate the light would not be a good way of comparing modern graphics calculations

NTA but I don't know what you mean by that, of course 200 lights in real time is impressive, please attempt that with a forward renderer and have your game EVER look or run as well.

or don't do it, and again your game will look like shit.

j-j-just bake it all into a lightmap

LOL

Both Veilguard and Assassin's Creed Shadows have hair strands. I'm sensing a pattern.

Why is every reply I'm getting "not that anon"? Are niggas leaving a single reply and disappearing? Or are you actually just lying and samefagging?

Anyway you might have a point about cyberpunk.(MIGHT. I just don't have the knowledge to dispute you there) But most games aren't lighting their scenes with that many light sources. Typically, it's the sun. Or Maybe a few street lamps. Or maybe passing car lights. Or the light emanating from a screen. You get the idea. It's rare for a character to be lit with more than a few lights in the first place. A feat that was already doable in the early 3D days. Now you're telling me it's necessary to have a bajlllin kajillion lights. Even though all the transparencies look like dithered shit now. Doesn't make sense to me.

Why the fuck is the UE5 engine pushed so much when it's so shit.
It just looks awful. Truly the worst.
It made RoboCop look like shit and it made Silent Hill 2R look like shit

"competition" in this case concerns how many of their customers use AMD vs nvidia
the quality of the tech does not even enter the conversation
nvidia dominates gpu market share so developers revolve around nvidia

Why the fuck is the UE5 engine pushed so much when it's so shit.

If you want a public engine for realistic graphics you don't have much choice

there are samefags every thread, but those posts are 30 seconds apart.

The past Unreal engines do not look this bad. UE5 is a downgrade that just clogs up graphics cards

it's not necessary to have infinite lights, it's necessary to have lights, and when your budget allows for multiple hundred lights with overhead to spare, that is much more preferable than every single light source being such a hog that more than 10 becomes too expensive.

overhead is the key here, before you cite that not a lot of games have more than a few lights on screen at a time.

you want those lights to be cheap to render, no matter how few there are.

it's not the engine but a rendering style. Everyone does this now.

Why would anyone use cgi software developed for hollywood (UE5) to make a video game?

ue3 fucking sucked

it is my world now
cope and seethe

It's fine. Even Rockstar can't solve animated finely stranded hair. Just stop kvetching.

hair.png - 560x482, 792.6K

it's only bad if you want games to look like games forever

it's for the future and will quickly outpace old graphics because it's designed for unlimited detail and textures that only scales with the number of screen pixels

Compare this to FFXVI… lmao

XVI has an actually good story as well

What was the last game that broke Anon Babble like this? BG3 and FF16 weren't this bad.

didnt ask + dont care + nigger + touch grass + incorrectly rendered + fake transparency + cheap + smear filter

I meant over the course of the thread. I've been getting multiple people proclaiming to be new anons. I'm not saying those two posts are both the same person. Also, take note of the pattern where one guy makes his argument, and then he always has a crony replying to him in total agreement. Cosigning his every post. It just smells... coordinated.

32 lights. What scene is going to be lit by 32 lights? Also, where is the advantage here? The grass texture in the deferred version is more blown out. The stock version is more evenly lit. So you have more lights with no real advantage that I can see.

In this kind of scenario, with stadium lights, you wouldn't light each light individually. You would make a single area light that spans across 4 of the spot lights. Thus only requiring to render 8 lights in total.

i don't get why anyone would care if the engine "renders graphics" or not, it looks fine

nigga u gonna gloss over the framerate difference?

there are too many "guys" to know who you're talking about but I wouldn't doubt it to some extent, also please look at the framerate again

What dark magic did RE4 use for its hair strands? It looks smooth even without vaseline filters though it does act weird during movement

by lighting it with an invisible hero light and not with the environment lights, it doesn't slot into any scene correctly

days since Anon Babble hasn't seethed about expedition 33: zero

artists want to put any asset they want in the engine and place as many lights as they want

they don't want to optimize everything for shitty old engines. that's why UE5 is better and so many developers are using it

days since they...haven't seethed? so they arent seething?

in 10 years you will look back and think of this as soul. I myself have no problem with this and rather like it as it makes games look unique and proves that we still have room to improve. I like it

Framerate

I saw. But I just gave you a solution that requires 4x fewer lights. A creative solution that competent developers would think to do, instead of just throwing infinite lights into the scene with no advantage.

Also, isn't that Kerbal Space Program? Doesn't it use some unique physics that most other games don't use?

0 days since they have not seethed, so they are seething. That means 0 days have passed since they have not seethed... Which means they seethed today. But wait... he said SINCE they haven't seethed, not days without seething, so it implies they aren't seething. Yeah you're right.

Yeah, the character models look fucking terrible in this

4x fewer lights does not result in 4x higher performance in a forward renderer, it is not linear. you said you had *some* experience in 3d renderers but I've lost faith. if 32 lights cost nearly nothing in a deferred renderer but cost 70% of your performance in a forward renderer, you should be able to apply that to any lit scene in your mind and realise how quickly it becomes a problem. if the LIGHT SOURCES are killing your performance, where the fuck else are you going to pull your framerate from in a real scene? the fucking ether?

puke inducing UE crap mogged by a 15 years old game

hahahaah

57yrfw3s.jpg - 3840x2160, 1.31M

Are there any good UE5 games? I genuinely can not think of one.

robocop gets a pass from me

Can you run one

Come on, Rebirth wasn't that ugly.

The solution is to use Unity, which has had a fully released Forward+ renderer for years now

have they fixed their weird internal refresh rate cap on physics and camera movement?

Rogue One: A Star Wars Story

Unreal 3 didn't have this problem and that's a buggy as fuck engine.

Is there a way to turn OFF this god awful blurred pixelated shit that unreal 5 does??
Is there a cmd line or an ini file that can be used to stop it.

It just looks so bad. Unreal 4 never looked this bad. Who looked at unreal 5 and went "yeah this looks good"

games used to look a certain way because of limitations

with advancements in computer graphics, games now render differently, allowing things that wouldn't be possible before, but with some details the old rendering never had

autists scream because things are different, in complete disregard of the benefits

Summary of this thread

GTA5 used a hybrid rendering method that avoided most of the issues with deferred rendering. Devs are just lazy.

I loved the game but I can kinda agree on this point. At the end of the game there's like 5 different kinds of parries to look out for and at least for me it felt too bloated.

GTA5 used dithering for all transparencies AND fizzled them into existence instead of a soft fade in, it looked atrocious in that regard, the only reason hair looked "better" is because it was all polygons and few hair cards

You are comparing a $265 million budget game with some indie game

Werks on my engine, bro.

Asscreed Noggers cost more than that and didn't do it. Same with pretty much all other AAAs.

Again, Kerbal Space Program is performing physics calculations in the background. So that right there is adding to the cost.

As for the number of lights, wouldn't it work non-linearly in the opposite direction? Meaning 2 lights is exponentially more taxing than 1 light? And 3 lights is more taxing to the power of 3? 4 lights to the power of 4, etc? So by reducing the lights from 32 down to 8, you would actually be reducing cost more than linear. Not less than linear.

"Some details", as in, every hair on all characters looking like pixelated garbage.

ah, I did mix up linear and exponential yes.

but the physics performance is clearly not the factor that's dragging down the framerate, by having one stationary object with no propulsion you're eliminating the cpu bottleneck already, leaving just the fundamental cpu drawcalls and the rest down to the gpu to render the scene.

nothing going on, just lights added, 70% of performance gone down the drain.

I'm pretty burnt out discussing this by now but it's been interesting, I'll leave off by saying a hybrid renderer will probably always be the go-to from now on, forward rendering can make a lot of things look better, but we need aspects of deferred rendering to make anything modern that looks good, run good.

hybrid engines employ techniques from both but isn't a magic bullet, I'm excited to see where we end up in 10 years, especially with the progression of RT

We don't like the trade-offs; they are not good for making games more fun to actually play, they are meant to create cinematic bullshots and trailers.
I want games with old-style beautiful clean graphics, insane draw distances and high FPS.

tanoa.jpg - 2560x1032, 1.49M

You can make your game with forward rendering + MSAA if you want, even in Unreal 5 it's possible to do it the old way. But it's not the direction the industry and gamers want to go.

Go check MGS4

At least we get photorealistic lighting... oh...

lumen.mp4 - 1920x1080, 1.78M

oh no...

Talos2lumen.mp4 - 1920x1080, 3.54M

for me shit kept unloading in the middle of my screen like shadows and character, and the foliage and other various background materials were unloading on the side of my screen as i was turning. goty btw. how the fuck people excuse this type of shit in this game and monster hunter is beyond me. the low fps and shitty looking games are alright as long as the story is "decent" apparently.

Daily reminder that Unreal Engine 5 does not render graphics.

Go learn RED Engine if you are so smart.

God its crazy how fast STALKER 2 fell off people's radars

Every Unreal Engine game is quickly forgotten... I wonder why...
reason: soulless shit

Assassin's Creed is using better hair tech than GTA 6 btw

none of this matters if your game is fun

What's the problem here, his hair looks fine

has reading comprehension truly regressed this far

Be kind with him :)

Anon, what that post meant is that since this supposedly happens in every modern game and not only in UE5, how come KCD2, running on the Cry Engine, doesn't have that problem.