I hate ue5

I hate ue5

lumen.mp4 - 1920x1080, 1.78M

buddy people stopped making ragebait threads about this game like 6 months ago

This looks like a byproduct of shitty optimization

works on my machine

The door closed so fast that it left afterimages. Just like in DBZ

this would look cool in a stalker type game where like the laws of physics and shit are fucked

this. its an anomaly

why yes I am an insufferable contrarian, how could you tell?

ngl that looks cool af

Yes.
It’s called unreal engine

Yes saar it's intended saar

ue5lumen2.mp4 - 1920x1080, 3.79M

he isn’t capable of running hardware lumen and has to settle with the shitty software version

Not my problem

It’s not really ue5’s fault that GSC has no fucking clue what they’re doing, but yeah I don’t like it either

when the Palinopsia hit

It also happens in expedition 33 which is a flawless gem according to Anon Babble

filtered
this is kino

defending this

Kek

Talos2lumen.mp4 - 1920x1080, 3.54M

I thought this was a lira video at first.

you bastard bitch these are my clips get your own!

Talos2lumen3.mp4 - 1920x1080, 2.76M

sounds like user error

woah! A-are those l-lumen reflections??? Holy shit they look better than real life!

Talos2lumen4.mp4 - 1914x1080, 3.52M

effects of being in the zone for too long

that's just how robots see the world

Its just cool after images you wouldn't get it bro

Bloody bitch bastard shit up

lumen.webm - 1920x1080, 3.7M

i refunded this shit game so fast. Not even the first one will be spared from the UEShittification

Why have such """features""" in your engine?

god, stalker 2 is such a soulless piece of subhuman shit.

It's not a UE5 issue, but a memetracing issue of not updating the lighting fast enough with how resource intensive it is.
njudea literally sold us a fucking lie and they still haven't even been able to make it not shit the bed when you move the camera.
Just look at how all the rtx comparisons are always still images after the rays have been properly cast.

isnt this just nvidia virtual resolution/frame gen garbage

No, that's disocclusion artifacts from UE5's Lumen global illumination, which uses temporal accumulation to slowly sample the lighting over time
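For anyone wondering what "temporal accumulation" means in practice: the denoiser blends each frame's noisy lighting into a running history, so a freshly disoccluded pixel (like the wall revealed behind OP's door) starts from stale history and needs many frames to converge. A toy sketch, not Lumen's actual code; the blend factor is a made-up value, and real engines also reproject the history with motion vectors:

```python
# Sketch of temporal accumulation as used by many real-time GI denoisers.
# alpha=0.1 is a hypothetical blend factor, not Lumen's real one.

def accumulate(history, sample, alpha=0.1):
    """Exponential moving average: keep 90% of last frame's result."""
    return (1 - alpha) * history + alpha * sample

# A pixel that was dark (0.0) gets disoccluded and should now be bright (1.0).
value = 0.0
frames_to_converge = 0
while value < 0.95:          # within 5% of the correct lighting
    value = accumulate(value, 1.0)
    frames_to_converge += 1

print(frames_to_converge)    # 29 frames, i.e. about half a second at 60 fps
```

That half-second of lag is exactly the afterimage/ghosting in the webms: the history is wrong after a disocclusion, and the blend takes dozens of frames to wash it out.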

It's not a UE5 issue, but a memetracing issue of not updating the lighting fast enough with how resource intensive it is.

This.
It also happens in those shitty RTX remakes of HL2, Quake 2, and Portal, and 2 of those predate UE5's existence.

ue5 just makes me think of badly made wegs
if a game lacks art direction it probably isnt worth looking at for the graphical fidelity

saar redeem the path tracing

DIE ONE THOUSAND DEATHS

What the fuck I thought valve was based what happened bros?

Valve didn't make it

None of those shitty RTX remakes are made by their original developers, they're all just glorified mods.

Use. Baked. Lighting.

DeS.jpg - 3840x2160, 1.46M

As it stands, the current implementation of raytracing is praised almost exclusively by retards who don't understand how light works.
Raytracing on actual consumer hardware is still probably a decade+ off given the current pace of hardware advancements.
Probably even longer, since everyone decided to forego actual hardware advancements in order to chase the AI buzzword.

Nvidia made all of them

This is more of a case of ray tracing being significantly over hyped.

Yeah the temporal stuff is kind of aids but it looks good when you're not in the middle of a transition

plays game about anomalies

gets triggered when seeing anomalies

Yeah it looks good in marketing screenshots and garbage in motion

So it looks good when you're standing still and not actually moving the camera.
And there's nothing moving in the environment.
And there are no small (or, god help you, sub-pixel) elements on-screen.

Yeah that's what I said.

That looks absolutely nothing like the usual DLSS blur.
I am not sure if the other anon is right, but I am sure that it isn't DLSS.
Also virtual resolution is AMD's name for supersampling (Virtual Super Resolution); the Nvidia equivalent is called DSR, and neither is related here.

The temporal stuff is megaAIDS and defeats the entire point of raytracing: it makes everything look like smeared shit, and the lighting is somehow even less realistic than the approximations games were already using. All light in "raytraced" games now moves at the speed of molasses, because current GPUs can't actually trace rays fast enough to present realistic lighting.

Lots of this in Oblivion too

You mean UE5? You can run the exact same project in UE4 and UE5, the UE5 one will perform a lot worse for some goddamn reason.

It's usually reductive and unproductive to just say an engine is shit, as it's usually the developer that's largely responsible for a game's performance, but UE5 really does just suck total dick, and that's because its lighting solution is Lumen, and Lumen is really bad.

That's not true. You can get UE5 to UE4-level performance by just disabling all of the UE5 features (namely Nanite, Lumen, VSMs, and TSR).
It'll actually run a bit better because they mitigated the shader stutter issue in 5.2.
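For reference, the "turn the UE5 features off" setup described above is usually done with console variables. A minimal `DefaultEngine.ini` sketch; the cvar names and values here are recalled from roughly UE 5.1-5.3 and may differ in your engine version, so treat every line as an assumption to verify:

```ini
; Hypothetical DefaultEngine.ini overrides -- verify cvar names against your UE5 version.
[SystemSettings]
r.DynamicGlobalIlluminationMethod=0  ; 0 = no Lumen GI (fall back to baked/none)
r.ReflectionMethod=2                 ; 2 = screen-space reflections instead of Lumen
r.Shadow.Virtual.Enable=0            ; disable Virtual Shadow Maps
r.AntiAliasingMethod=2               ; 2 = TAA instead of TSR
r.Nanite=0                           ; if exposed in your build; Nanite is otherwise a per-mesh/project setting
```

With those off, the renderer is close to a UE4-style deferred pipeline, which is consistent with the claim that the remaining gap is shader/PSO stutter rather than the headline features.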

buy sh2 remake to give it a chance

leaves blow over puddles looking like tracer rounds in the opening parking lot

refunded

We're almost a decade into raytracing in retail games and it still has the exact same problems it had in 2018.

If anything it's worse lmfao
When it was just raytraced reflections you didn't notice the noise and temporal smearing as much
Now that it's used for everything it's like the entire image

Cool plastic world

it has gotten so much worse
I thought with metro exodus getting full ray tracing and it looking pretty good we were going to head in a better direction
but oh boy was I wrong
a game using UE5 is just a sign that it's going to run like shit and have visual artifacts, smearing, light bleed, and many other issues

it's a great movie/cinematics engine but a shit game engine

It has nothing to do with DLSS lol. It's a combination of TAA and Lumen. If you think that's what frame generation artifacts look like then you have no idea what you're talking about.

Lumen isn't real raytracing. It's the pajeet equivalent of ray tracing with the mindset of "I can do it without specialized hardware and without the considerable FPS cost" of course it looks like shit. Properly tuned hardware raytracing looks good, see: Doom Eternal.

see
None of those games use Lumen or UE5, and they still have the same issues.
And Doom Eternal's RT is baby-mode.

every RTX game has the exact same temporal smearing noise bullshit

nvidia shills continue to pretend it's only a UE5 problem

alright rasheet back to the nvidia farms for you

None of those games are also 'real' games either, they're RTX remixes or fan projects of older games.

casually approach child

What engine do you prefer then? And don't say in-house or handmade per game, because that's unrealistic for 99.9% of games

>every RTX game has the exact same temporal smearing noise bullshit

No it does not. In fact I specifically mentioned a game in which the RT is visually pristine, yet you ignored it. Don't you have some steam deck penis thread to shitpost in?

Is Cyberpunk also not a "real" game?

happens in oblivion remastered too. its kinda hilarious because i've just gotten used to ghosting at this point

It's clearly UE5's fault, and not a shitty third world developer's fault.
Post specs. You won't. You'll start raging about "amerisharts" and then tell me to cope about having a better PC than you.

Same exact shit happens in MW5 Clans and there's no way to get rid of it due to how fucked the engine is

Doom Eternal uses raytracing for reflections only and even then it's only on super smooth surfaces. The rest of the game is standard rasterization. That's why it's not a total disaster.
Older RTX games like Battlefield V did the same thing and also looked fine.

The path tracing mode of Cyberpunk is not, seeing as the game options explicitly say EXPERIMENTAL/beta with the path tracing settings.

Source 2 with forward rendering and baked lighting

Raytracing as a whole is experimental, because it's not realistic to pull off full pathtracing in any game at playable framerates even on an RTX 5090.
If you think this is just a UE5 issue, it's because you're retarded.

It has the same ghosting and temporal instability issues. There are webms that are posted here regularly about it but I don't have them so here:

youtu.be/HwpxLDFtb8c?t=82

Every anti-UE5 pachuco here, post your specs.

I get those with a 4080. Should i be using a 5090?

Ray tracing and path tracing are not synonymous. It is realistic to pull off raytracing in games; there are games that have done it without issue. It's not the same thing as some experimental path tracing tech demo bullshit where the light needs ten fucking years to accumulate.

lazy shit devs fault, the finals uses ue5 and both looks and runs beautifully

a GPU with full DX12 and Vulkan support

x86 CPU

What else does UE5 need to render graphics correctly? Do i need to pay a monthly fee to Timmy Tencent?

Uh, señor (me am Brasil), UE5 stutters. Allow me to post DF video screencaps I took on my phone, my only electric device in my tin favela shack.
How do you even know if it has Vulkan support? Nobody uses Vulkan except trannies.

Source engine is only good for its physics, realistically all the games are ugly as shit

Specs wont fix the engine

Take me down to artifact city
Where everything's a smear
And you can't get rid of it easily.
Take me back (hardware anti aliasing is gone)

Take me down to artifact city
Where everything's a smear
And you can't get rid of it easily.
Take me back (hardware anti aliasing is gone, yeah, yeah)

The only games that pull off decent-looking raytracing are the ones that only use it for reflections.
It looks like shit for lighting because of the issue posted ITT.
And it's not because it's "experimental," it's because not enough rays are being cast or "traced," so everything has to go through a million denoiser filters to not look like absolute shit, and needs to be done at the equivalent of quarter resolution (or lower) to not run like absolute shit even on a 5090.
Reflections can get away with it because they're often already heavily diffused, so you generally won't notice, but the results will be inferior to baked lighting in all instances. In order to create fully dynamic raytraced lighting you would need light to, y'know, move, and when that light moves the effect of current raytracing completely fucking falls apart, because it can't simulate the actual speed at which light moves.
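To put numbers on the "not enough rays" claim, some back-of-envelope arithmetic (illustrative figures only, not measurements from any specific GPU) shows why engines trace at reduced resolution:

```python
# Back-of-envelope per-frame ray budget for a single bounce at 1 sample per pixel.
width, height, fps = 3840, 2160, 60
spp = 1                                   # typical real-time budget

rays_per_second = width * height * fps * spp
print(f"{rays_per_second / 1e9:.2f} G rays/s at native 4K")   # 0.50 G rays/s

# Quarter-resolution tracing (half width, half height) cuts the cost 4x:
quarter = (width // 2) * (height // 2) * fps * spp
print(f"{quarter / 1e9:.2f} G rays/s at quarter resolution")  # 0.12 G rays/s
```

And that is one sample per pixel for one bounce; multi-bounce GI multiplies the cost again, which is why the shortfall gets hidden behind denoisers and temporal accumulation instead.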

They unironically will eventually. Early UE4 had a lot of similar complaints.

look at this blur which is a result of running at a low resolution and framerate

upgrades won't fix that

UE4 was shit right up until the end, a lot of recent UE4 games still run and look like shit (looking at you Homeworld 3).

2050

you have the power to run UE5 at native 8k with lumen cranked beyond cinematic settings and instant final gather

everyone has already moved on to AI generated visuals

Dead Island 2, Stellar Blade, and Callisto Protocol (post launch) all look and run great even on meh hardware.

The only games that pull off decent-looking raytracing are the ones that only use it for reflections.

Nope. Metro Exodus EE, Jedi Survivor, Indiana Jones, KCD2, just to name a few. Games with RTGI that isn't noisy or smeary.

Callisto Protocol ran like absolute fucking dogshit for like a year, you could at least use a game that wasn't a fucking disaster at launch like Lies of P.
And Stellar Blade isn't out on PC yet, we'll see how it runs at launch.

Three literally whats.

BUT I'M MARRIED TO EVE BLAH BLAH BLAH

Callisto runs great now though and it's arguably the best looking UE4 game which is why I used it as an example.
But Lies of P works fine too.

UE4 ruined the remaster of the Arkham games with its shit ass lighting

No, the developers ruined the remasters by taking a bunch of non-PBR assets and trying to render them in a PBR engine. And in some cases they actively made the original models worse by changing them for some godforsaken reason. (see: Harley Quinn)

You see this shit even in pre-rendered animations done in UE5 like in this video at around 8:20 and onwards
youtu.be/ftC82XScxV4?t=499

file.png - 1107x592, 460.82K

Jedi Survivor's RT looks like shit and KCD2 doesn't even use raytracing, it uses Cryengine's own approximation SVOGI which isn't "real" raytracing.
Indiana Jones as a whole is mediocre-looking, especially outdoors where you can see the fucking awful draw distance where shadows disappear 20 feet away from you.

Callisto runs great now though

After a year.
Shit example, don't use it.

I really should dig up the release version of Misery 2.0 and play it just for seeing if the faggots at /sg/ were right in constantly bitching and moaning about it or if they were being retarded faggots as usual
but on the other hand I know some /sg/ members made a mod called "les miserables", and after reading what it's about and its changelog I can safely say it was an attempt at making an easy mode mod for Memery, and that isn't helping /sg/'s case one bit

t. a lil' nig that beat Misery this year and loved it despite some very questionable things that boil down to "we won't explain this because it's common sense" and some actual bullshit like Nimble's guns being shit thanks to their integrated scopes
and for the record I was among the people that got filtered by Misery when 2.0 just released, I didn't get back to it until this february

Jedi Survivor's RT looks like shit

I'll grant you it looks unimpressive compared to standard lighting but that's because that game's standard lighting is already overcranked. And more importantly the RT does not look noisy or smeary.

Indiana Jones as a whole is mediocre-looking, especially outdoors where you can see the fucking awful draw distance where shadows disappear 20 feet away from you.

Correct but entirely unrelated to its raytracing, which is like I said not noisy or smeary.

Because of the temporal ray tracing bullshit in UE5, this scene here is impossible to do without massive light fade/lag.

but that's because that game's standard lighting is already overcranked.

Not really, it's pretty standard for a modern UE4 game that runs as poorly as Jedi Survivor does/did.

Correct but entirely unrelated to its raytracing, which is like I said not noisy or smeary.

Nah, it's definitely noisy, it's just not as bad as your average UE5 game, partially because id put a lot of effort into their TSSAA tech in modern idtech, so temporal elements in that engine generally look a lot better than in UE or most other engines.

RTX 5080
9800x3d
Teamgroup 32gb 6000 cl30 ram
Samsung 2tb 990 plus
What now?

5080

32GB

I'm sorry for your situation. CPU and SSD are fine, though.

file.png - 1360x880, 116.2K

It works fine with a 5090 and an Intel CPU

the finals

The only game that doesn't use ue5 features, just old GI tech?

the hole point of denoisers and upscaled is to not have to do 1:1 pixel cost. The hole engine is designed around that, it will look like dogshit anyway

I honestly have no idea what I'd need 128gb of ram for. I have not had anything come close to even getting to 30.

Im gonna denoise your hole

Not really, it's pretty standard for a modern UE4 game that runs as poorly as Jedi Survivor does/did.

Funnily enough, Jedi Survivor runs perfectly on AMD GPUs. It's a mess on nVidia.

Half-Life Alyx and Counter Strike 2 look great

cs2.jpg - 1280x720, 40.62K

9950x3d
5090
64gb 6000 cl30
990 pro

door so fast it has afterimages

H-Hayai!

just use baked light-ACK

Preach brother.
Half Life Alyx had more

is this real life

moments than all other AAA games combined.
And it doesn't even look like it's trying hard compared to UE5 throwing 80 quadrillion polygons at every surface.

alyx.png - 960x960, 1.61M

architecture visualization/digital production/film industry

prease understand

4.24+ is where the engine started to be in good shape. Then they fucked it all up big time on 5 for no fucking reason, blowing up what little documentation the engine had in the process

lazy fucks

Yeah, UE5

lazy shit devs fault

dont use bad features then?

old "future proof" tech looks like absolute dogshit now; you have no idea what you are talking about tim

autistic sperg genius codes magnificent engine

retarded normalfags struggle to maintain and update it

jeets break future versions, making them even worse

the future is bright

sell out to MS

become soulless

we all seen it coming

I want ultra realistic games at 4K and 200 FPS without drawbacks

Ahh yes, "realistic."

this is more of a bad user settings result than a bad engine result

centennial getting shat on

fuck. that's my alma mater.

Nah, Lumen is just shit.
Nanite is also mostly shit.

Honestly? The remake is pure soul kino.

4K

but i didn't fall for that television marketing meme

I'm not really a fan of how cs2 looks but it's preference. BODYCAM is probably the most photorealistic game ever made and was done in UE5; if you're going for realism that's the best option

It does not run perfectly fine on AMD GPUs, it does not run fine on any GPU, because the stuttering issues of Jedi Survivor happen irrespective of GPU. They're PSO cache issues, which is something the developers failed to address during development or patches.

brown filter

anti soul gas

pure soul kino

Growing up I never expected game technology to go backwards in terms of cheating graphics just so consoles can get away with selling weak hardware. Things were supposed to get more powerful, allowing fewer gfx shortcuts. Instead the engines made the shortcuts automated, so those of us with good hardware have difficulty avoiding them.

Unfortunately the gaming development industry did fall for 4k, and unless you're at 4k output resolution you won't escape image instability or blur or visual issues. Actually you need about 1600p to escape all those but 4k is closest to that.

they captured the European wilderness and its pagan sovl superbly. Not that you'd know a lot about that, Dikshit Sukdeep.

then why use the engine if you have to write half of it from scratch? That's the definition of a shitty engine btw

I run The Finals in 4k 144 on a few-year-old AMD setup atm. That's UE5, though it is one of the more graphically intensive games on the market because of destruction

now we just ended up having more fake frames to make up for that dogshit performance

moore's law died (yeah, rip) so now all they got is betacuck copes

...but enough about the original game.

there are now multiple sources of ghosting.

It was bad enough when it was just TAA but this is fucking bullshit man.

the gaming development industry did fall for 4k

game devs didn't fall for it, consumers did. game devs are giving consumers what the majority of them want

modern "remastering"

i propose we call the modern iteration of this practice "redeeming"

techjeet.jpg - 666x1024, 81K

then what the fuck is the point of higher resolutions if you need to run things at 6 gorillion k to get the same level of crispness as before when we played at 1080?

devs are also paid by gyppy (You) manufacturers to force these memes in games

That law dying doesn't mean the curve flatlined, just that it got less steep.

i just wish devs would start treating texture sizes like language with games and make them an option checkbox on install, instead of just all being included in the base game for a fuckoff file size for resolutions you'll never use

true but people want ray tracing. they're retards, but it's what they want

consumers expect large upgrades as before, i.e. for the pattern to extend into infinity, that's why those copes are needed

I bet he voted for the lefty politicians that made it the way things are.

Don't have kids

Women do act like men

Immigration to fill roles of a million unborn babies (not like they will live in the politician's neighborhood anyway)

Nebraska is less steep than Colorado

Why did you read that?

Irrelevant pseudointelligent response

Devs got really good with rasterized graphics

throw it all away for RT which can't even claim to 100% look better.

everything costs more and runs worse.

Thank god for indies making shit on their 1060's as their performance target.

I read things online, anon posts image and I read it sometimes

I'd love to know how they got arkham knight to look so good on unreal 3

Devs got really good with rasterized graphics

Devs reached the ceiling of what they can realistically do with rasterized graphics, and it takes them 5+ years to develop a rasterized game that looks impressive for 2010s/2020s standards. RT is in a shit state now but the idea is that it can move past that ceiling and can cut down dev time.

where's the liru?

Look at this realistic lighting!

Devs reached the ceiling of what they can realistically do with rasterized graphics

Not really.

and it takes them 5+ years to develop a rasterized game that looks impressive for 2010s/2020s standards

RT games take just as long, what the fuck are you talking about?
You assume that all tech advancement will be used for good, which tells me that you're naive.
In reality RT will be used to lazily do all the shit that used to be done by hand before, so games will look roughly the same but run much much worse.

If there wasn't glowy alien shit in the middle and you told me this is just a photo, I'd believe you

that's how the outside world works, unc. Have you tried going outside? I mean, touch grass.

RT is in a shit state now but the idea is that it can move past that ceiling and can cut down dev time.

Yeah, but why now then? You can cut baking time A LOT just by baking the maps with RT tech, and you don't have to mess with the UVs.
Making it real time because of the meme is retarded. There should be a cheaper approximation for open world GI that doesn't involve ghosting half the screen or taking 5 min to update a light change.

The only reason is money; people will eat anything, so that's what they got.

In reality RT will be used to lazily do all the shit that used to be done by hand before, so games will look roughly the same but run much much worse.

companies just want to shit out slop that makes them money and is easy to produce and doesn't take a long time

Not really.

Yes really. The graphical differences you will get between something like RDR2 and a 2025 game built on raster will be solely created by extra development time, because nothing in raster tech has really advanced in the last few years.

RT games take just as long

No they really don't. Come on now. Developers have talked about this and even released videos showcasing this stuff, lighting is way faster to get working in a game when it has RTGI.

What are some meme features that still don't work and should've been removed years ago?

Raytracing

Motion Blur

Bloom

every dark souls game with the lighting engine mod mogs 2026 games and gta vi

10-.jpg - 2560x1440, 1.68M

that looks impressive for an entire decade span of games

ignored because retarded.

did that stayd dude finally release his ds2 lighting mod?

this game having ray-tracing with lego vaseline graphics is actually a gamechanger

you will not score any gotchas with that trashy looking game.

I don't think so, I think that's another Dark Souls 2 dynamic lighting mod that actually did come out, according to some anons that played it needs a lot of configuration in order to look good and a layman won't be able to do that, as also was made by a stinking pajeet

The graphical differences you will get between something like RDR2 and a 2025 game built on raster

So...nothing?
Even RT games don't look better than RDR2 outside of maybe the reflections.

nanite is the most useless fucking thing ive ever seen.

you can now have infinite triangle scenes!!!!!!!!!!!11

but nobody will see that because it's obscured by fog and the game runs at 50% render resolution, and there's additional bloom mixing with the scattered fog so you can't see past 50ft!!!

1000 Hz monitors will fix this! Trust the plan!

still in beta

Yes really. The graphical differences you will get between something like RDR2 and a 2025 game built on raster will be solely created by extra development time, because nothing in raster tech has really advanced in the last few years.

NTA but first, RT is used for some things; the rest of the game is raster. I don't know what the fuck you mean with RT vs raster. Every single game is raster, because it's the fastest way to render geometry, period. RT lighting only makes sense in non-static environments or absurdly HUGE maps with tons of detailed areas, where the disk space of the lightmap textures would be too big, even though there's plenty of new tech to compress those maps.

There are some aspects that benefit from RT, but atm we don't have the power to run it; there is a good reason why offline rendering exists. That absurd amount of noise only makes the current deferred rendering pipeline 100 times worse. It's like TAA on a 240p image so you can output 1080p, then TAA again to 4k, and it looks like shit even on high end hardware, so they have to use frame generation, which adds even more artifacts.

Old tricks work the same for a small fraction of the cost, because believe it or not, graphics cards are absurdly powerful.

No they really don't. Come on now. Developers have talked about this and even released videos showcasing this stuff, lighting is way faster to get working in a game when it has RTGI.

They take as long if not longer. Optimizing a heavy RT game takes 110x more effort, because it's literally a black box specifically tuned for just enough ghosting; the moment you try to adjust anything it all falls apart if you don't have HUGE raw power. Ironically that's why we get the pieces of shit we get today regarding image quality. Dither/ghosting/smearing/frame pacing issues everywhere.

Do you think optimizing RT on top of complex logic and asset streaming is easy breezy? Not to mention it's fast-changing Nvidia™ tech and is not mature yet.

Alan Wake II was the best implementation of ray tracing I've seen yet, pretty clean looking, stable, gets rid of all sorts of trash screen space artifacts and so on.
Though the rasterized fallbacks are not as good as they could have been, like the non ray traced shadows look like absolute garbage even on the highest settings.

UE5 : the slop is unreal

raytracing makes your workflow faster

how to out yourself as a dev working for a shitty studio that doesn't have software developers and just relies on the latest slop engine to shit out assetflips made in under a year

I think Timmy is here and he's not being a fag for once
why can't Timmy always be like this instead of being a spiteful little prick?

if you've worked with rendering in 3d software before, you'll know this has nothing to do with ue5, and is in fact everything to do with ray traced lighting, which is not exclusive to ue5.

I replayed most of DS2 from start to finish with Lighting Engine Mod a few months ago.
It works fine. Not sure what "it needs a lot of configuration in order to look good" is about; just switch to the alternate preset unless you have a good OLED.
Besides that and activating DLAA I think I messed with one or two parameters only.

halfway there!!

how to out yourself as a dev working for a shitty studio that doesn't have software developers and just relies on the latest slop engine to shit out assetflips made in under a year

That is almost every single modern AAA studio thougheverbeit.

I have, and somehow UE5 has the worst implementation despite being the biggest engine out there.
if you've worked with rendering in 3d software before, you would know that realtime RT has nothing to do with offline-rendering RT

mind you, the posts that mentioned that it needs fucking about with the configuration are more than six months old and the stinkin' jeet seems to be cool kind of fag that constantly updates and tweaks his mod, instead of having an ambitious grand project that is 80% done after 10 years of hard work but will never get released because the dev got a family or killed his tranny ass or whatever

I work in the industry for a living and it's 100% UE5. Their proprietary GI solution heavily relies on temporal accumulation that converges slower than the vast majority of software GI solutions. True hardware raytraced GI solutions don't take nearly as long to denoise as Lumen's. It still needs a lot of work to be even usable in dynamically lit situations. But I don't think Unreal gives a shit at this point, since it works for TV/film when bumping up those values to get rid of the noise and temporal issues, and they think it's fine enough for games.

this isn't an unreal engine 5 problem, it's a "lol let's do pathtraced lighting in real time with 1 sample per pixel" problem. pathtracing requires thousands and thousands of samples per pixel. lighting like this is certainly the future, but it's definitely not the fucking now.
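The "1 sample vs. thousands" point is just Monte Carlo variance: error shrinks like 1/sqrt(N), so a 1-spp frame is pure noise that the denoiser then has to hide. A toy sketch in plain Python (hypothetical numbers, nothing engine-specific), estimating one pixel's brightness under a half-occluded light whose true value is 0.5:

```python
import random

# Monte Carlo noise vs. sample count: average random shadow-ray hits
# against a light that is occluded 50% of the time (true brightness 0.5).
random.seed(0)

def estimate(samples):
    return sum(random.random() < 0.5 for _ in range(samples)) / samples

def mean_abs_error(samples, trials=2000):
    return sum(abs(estimate(samples) - 0.5) for _ in range(trials)) / trials

one_spp = mean_abs_error(1)       # each frame is pure noise: always 0.0 or 1.0
many_spp = mean_abs_error(1024)   # offline-style budget: close to the truth
print(one_spp, many_spp)          # roughly 0.5 vs ~0.01
```

That ~50x error gap is what temporal accumulation papers over in real time, which is where the lag and smearing come from.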

all the bot responses to this

lmao, been working with Unreal Engine for 8 years and never had this issue.

6" display

shit opt.? inside MY shit opt. engine?
it's more likely than you think.

hi tim.

i have this weird ass conspiracy that all raytracing is still screen space based and they access the other depth info for the bare minimum so the fake effect is more believable

if u move ur camera just a tiny bit there are times when the lighting completely shifts and tries to adapt to the latest position without the character moving.
look at this cybershit example: youtu.be/K3ZHzJ_bhaI?t=767
the char doesnt move, just the camera, yet the lighting still takes time to update because it only raytraces the shit that's in your view. how is this any less faggy than ssr

UE5 is what made me decide to just stop trying with new games. There are like 20,000 old games that are decent and now you can play them at 120fps and 4k and they look amazing. I don't need to play the newest whatever the fuck bullshit at 30fps with framegen and dlss that looks like muddy shit. none of them are even fun. the last good game was elden ring.

we have arrived at bluepoint glazing

the demons souls remake looked good graphically; it was the design changes that were fucking stupid

no, he posts on twitter about lgbt rights or some shit and does twitch streams to an audience of 3 where he mods 30 year old armored core games. I just happened to check in on him yesterday

People who play japanese games are sad.

hey now! monster hunter is good

Probably a gullible yet talented guy who was influenced when activist trannies entered casual gaming spaces and dictated how people should think. Now he just repeats their ideas because he was caught up in the trend and never bothered to find a new home.

Seems like every UE game coincidentally has these optimization problems

Reminds me of trails in older GTA games.

This thread is full of Indians defending UE5 saying it's the devs fault every single UE5 game in the last 3-5 years has been a performance hog piece of shit with ghosting & artifacting.

yeah it's really shitty. robocop was on ue5 and looked damn good and i'd honestly argue it was a solid choice for the game because they worked with the engine well, but even though the game was a clear labor of love no matter how much work is done ue5 will still just have that lumen shit.
i think the only real way around it is not doing what op's webm does and having light/dark mix too harsh, it wasn't noticeable at times because the game mostly took place at night
i think if your game has a darker color palette it's better
also fuck all modern anti-aliasing, fuck frame gen, fuck AI, and fuck beautiful women

every single UE5 game in the last 3-5 years has been a performance hog piece of shit with ghosting & artifacting.

The Finals is a well performing UE5 game with good motion clarity.

Post specs to prove you're not a jeet.

strange considering that most devs in the last 5 years have been Indians.

This. Engine isn't excuse, it's just the newest one at a time where gay people and indians (dot not feather) are working in the industry

SAAR I NO RUN GAME ON PHONE ENGINE BAD SAAR

This, UE5 can be fine and would be fine if retarded studios would learn to still bake the lighting and take the time to actually optimize shit