I hate ue5
I hate ue5
buddy people stopped making ragebait threads about this game like 6 months ago
This looks like a byproduct of shitty optimization
works on my machine
The door closed so fast that it left afterimages. Just like in DBZ
this would look cool in a stalker type game where like the laws of physics and shit are fucked
this. it's an anomaly
why yes I am an insufferable contrarian, how could you tell?
ngl that looks cool af
Yes.
It’s called unreal engine
Yes saar it's intended saar
he isn't capable of running hardware Lumen and has to settle for the shitty software version
Not my problem
It’s not really ue5’s fault that GSC has no fucking clue what they’re doing, but yeah I don’t like it either
when the Palinopsia hit
It also happens in expedition 33 which is a flawless gem according to Anon Babble
filtered
this is kino
defending this
Kek
I thought this was a lira video at first.
you bastard bitch these are my clips get your own!
sounds like user error
woah! A-are those l-lumen reflections??? Holy shit they look better than real life!
effects of being in the zone for too long
that's just how robots see the world
It's just cool afterimages you wouldn't get it bro
Bloody bitch bastard shit up
i refunded this shit game so fast. Not even the first one will be spared from the UEShittification
Why have such """features""" in your engine?
god, stalker 2 is such a soulless piece of subhuman shit.
It's not a UE5 issue, but a memetracing issue of not updating the lighting fast enough, given how resource intensive it is.
njudea literally sold us a fucking lie and they still haven't even been able to make it not shit the bed when you move the camera.
Just look at how all the rtx comparisons are always still images after the rays have been properly cast.
isnt this just nvidia virtual resolution/frame gen garbage
No that's disocclusion artifacts from UE5's Lumen global illumination, which uses temporal accumulation to slowly sample the lighting over time
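If you want the gist of why that turns into afterimages: each pixel's GI is mostly last frame's result blended with a little of this frame's noisy rays, and pixels that just got revealed (like behind a fast-moving door) have no history to blend with. Toy version below, not Epic's actual code, the blend factor and noise range are made up purely to illustrate:

```python
# Toy model of temporally accumulated GI - not Epic's code; the blend factor and
# noise range are invented, just to show why freshly revealed pixels ghost/boil.
import random

BLEND = 0.1  # fraction of the new, noisy sample blended in each frame (assumed)

def accumulate(history, new_sample, history_valid):
    """Exponential moving average with history rejection on disocclusion."""
    if not history_valid:
        # Reprojection failed (e.g. the door just swung open and exposed pixels
        # that weren't on screen last frame): no history to reuse, so the pixel
        # restarts from a single noisy sample and needs ~1/BLEND frames to settle.
        return new_sample
    return history * (1.0 - BLEND) + new_sample * BLEND

# One pixel whose true lighting is 1.0 but whose per-frame ray estimate is noisy
# because only a handful of rays are traced per pixel.
history, history_valid = 0.0, False
for frame in range(30):
    noisy_sample = 1.0 + random.uniform(-0.5, 0.5)
    history = accumulate(history, noisy_sample, history_valid)
    history_valid = True
    print(frame, round(history, 3))
```

Run it and the first handful of frames bounce around before settling, which is exactly the boiling/smearing you see on disoccluded areas.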
>It's not a UE5 issue, but a memetracing issue of not updating the lighting fast enough, given how resource intensive it is.
This.
It also happens in those shitty RTX remakes of HL2, Quake 2, and Portal, and 2 of those predate UE5's existence.
ue5 just makes me think of badly made wegs
if a game lacks art direction it probably isnt worth looking at for the graphical fidelity
saar redeem the path tracing
DIE ONE THOUSAND DEATHS
What the fuck I thought valve was based what happened bros?
Valve didn't make it
None of those shitty RTX remakes are made by their original developers, they're all just glorified mods.
Use. Baked. Lighting.
As it stands, the current implementation of raytracing is praised almost exclusively by retards who don't understand how light works.
Raytracing on actual consumer hardware is still probably a decade+ off given the current pace of hardware advancements.
Probably even longer, since everyone decided to forego actual hardware advancements in order to chase the AI buzzword.
Nvidia made all of them
This is more of a case of ray tracing being significantly over hyped.
Yeah the temporal stuff is kind of aids but it looks good when you're not in the middle of a transition
plays game about anomalies
gets triggered when seeing anomalies
Yeah it looks good in marketing screenshots and garbage in motion
So it looks good when you're standing still and not actually moving the camera.
And there's nothing moving in the environment.
And there are no small (or, god help you, sub-pixel) elements on-screen.
Yeah that's what I said.
That looks absolutely nothing like the usual DLSS blur.
I am not sure if the other anon is right, but I am sure that it isn't DLSS.
Also, Virtual Super Resolution is AMD's name for supersampling; the Nvidia equivalent is called DSR, and neither is related here.
The temporal stuff is megaAIDS and defeats the entire point of raytracing: it makes everything look like smeared shit, and the lighting is somehow even less realistic than the approximations games were already using, since all light in "raytraced" games now moves at the speed of molasses because current GPUs can't trace rays fast enough to present actually realistic lighting.
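To put a number on the molasses thing: if each frame only blends in a fixed fraction of the new lighting, a step change (light turns on, door opens) takes a pile of frames to actually show up. The alpha values below are made-up illustrations, real accumulation is adaptive, but the shape of the lag is the same:

```python
# Back-of-the-envelope: how long a step change in lighting takes to show through
# when each frame only blends in a fraction (alpha) of the new result.
# The alpha values are made-up illustrations, not what any engine actually uses.
import math

def frames_to_converge(alpha, remaining_error=0.05):
    # After n frames the stale value still carries (1 - alpha)^n weight; solve for n.
    return math.log(remaining_error) / math.log(1.0 - alpha)

for alpha in (0.05, 0.10, 0.20):
    n = frames_to_converge(alpha)
    print(f"alpha={alpha:.2f}: ~{n:.0f} frames, ~{n / 60:.2f}s of lag at 60 fps")
```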
Lots of this in Oblivion too
You mean UE5? You can run the exact same project in UE4 and UE5, the UE5 one will perform a lot worse for some goddamn reason.
It's usually reductive and unproductive to just say an engine is shit, since it's usually the developer that's largely responsible for a game's performance, but UE5 really does just suck total dick, and that's because its lighting solution is Lumen, and Lumen is really bad.
That's not true. You can get UE5 to UE4-level performance by just disabling all of the UE5 features (namely Nanite, Lumen, VSMs, TSR).
It'll actually run a bit better because they mitigated the shader stutter issue in 5.2.
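For anyone who wants to try that themselves, the usual user-side route is cvar overrides in the game's saved Engine.ini (typically somewhere under the game's Saved\Config\ folder on Windows). Rough sketch below; the cvar names are from memory, so verify them with the console's autocomplete, and plenty of shipped games clamp or flat-out ignore them:

```ini
; Sketch of user-side Engine.ini overrides to switch off the headline UE5 features.
; CVar names are from memory - check them against the console's autocomplete.
[SystemSettings]
r.DynamicGlobalIlluminationMethod=0   ; 1 = Lumen GI, 0 = none / whatever fallback the game ships
r.ReflectionMethod=0                  ; 1 = Lumen reflections, 2 = SSR, 0 = none
r.Shadow.Virtual.Enable=0             ; turn off Virtual Shadow Maps
r.AntiAliasingMethod=2                ; 2 = plain TAA instead of 4 = TSR
r.Nanite=0                            ; if the build honors it, Nanite meshes drop to their fallback meshes
```

Whether this actually gets you UE4-tier lighting depends on the game shipping any non-Lumen fallback at all; a lot of UE5 titles don't.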
buy sh2 remake to give it a chance
leaves blow over puddles looking like tracer rounds in the opening parking lot
refunded
We're almost a decade into raytracing in retail games and it still has the exact same problems it had in 2018.
If anything it's worse lmfao
When it was just raytraced reflections you didn't notice the noise and temporal smearing as much
Now that it's used for everything it's like the entire image
Cool plastic world
it has gotten so much worse
I thought with Metro Exodus getting full ray tracing and it looking pretty good we were going to head in a better direction
but oh boy was I wrong
a game using UE5 is just a sign that it's going to run like shit and have visual artifacts, smearing, light bleed, and many other issues
it's a great movie/cinematics engine but a shit game engine
It has nothing to do with DLSS lol. It's a combination of TAA and Lumen. If you think that's what frame generation artifacts look like then you have no idea what you're talking about.
Lumen isn't real raytracing. It's the pajeet equivalent of ray tracing with the mindset of "I can do it without specialized hardware and without the considerable FPS cost" of course it looks like shit. Properly tuned hardware raytracing looks good, see: Doom Eternal.
see
None of those games use Lumen or UE5, and they still have the same issues.
And Doom Eternal's RT is baby-mode.
every RTX game has the exact same temporal smearing noise bullshit
nvidia shills continue to pretend it's only a UE5 problem
alright rasheet back to the nvidia farms for you
None of those games are also 'real' games either, they're RTX remixes or fan projects of older games.
casually approach child
What engine do you prefer then and don't say inhouse or handmade per game because that's unrealistic for 99.9% of games
>every RTX game has the exact same temporal smearing noise bullshit
No it does not. In fact I specifically mentioned a game in which the RT is visually pristine, yet you ignored it. Don't you have some steam deck penis thread to shitpost in?
Is Cyberpunk also not a "real" game?
happens in oblivion remastered too. it's kinda hilarious because i've just gotten used to ghosting at this point
It's clearly UE5's fault, and not a shitty third world developer's fault.
Post specs. You won't. You'll start raging about "amerisharts" and then tell me to cope about having a better PC than you.
Same exact shit happens in MW5 Clans and there's no way to get rid of it due to how fucked the engine is
Doom Eternal uses raytracing for reflections only and even then it's only on super smooth surfaces. The rest of the game is standard rasterization. That's why it's not a total disaster.
Older RTX games like Battlefield V did the same thing and also looked fine.
The path tracing mode of Cyberpunk is not, seeing as the game options explicitly say EXPERIMENTAL/beta with the path tracing settings.
Source 2 with forward rendering and baked lighting
Raytracing as a whole is experimental, because it's not realistic to pull off full pathtracing in any game at playable framerates even on an RTX 5090.
If you think this is just an UE5 issue it's because you're retarded.
It has the same ghosting and temporal instability issues. There are webms that are posted here regularly about it but I don't have them so here:
Every anti-UE5 pachuco here, post your specs.
I get those with a 4080. Should i be using a 5090?
Ray tracing and path tracing are not synonymous. It is realistic to pull off raytracing in games; there are games that have done it without issue. It's not the same thing as some experimental path tracing tech demo bullshit where the light needs ten fucking years to accumulate.
lazy shit devs' fault, The Finals uses UE5 and both looks and runs beautifully
a GPU with full DX12 and Vulkan support
x86 CPU
What else does UE5 need to render graphics correctly? Do i need to pay a monthly fee to Timmy Tencent?
Uh, señor (me am Brasil), UE5 stutters. Allow me to post DF video screencaps I took on my phone, my only electric device in my tin favela shack.
How do you even know if it has Vulkan support? Nobody uses Vulkan except trannies.
Source engine is only good for its physics, realistically all the games are ugly as shit
Specs wont fix the engine
Take me down to artifact city
Where everything's a smear
And you can't get rid of it easily.
Take me back (hardware anti aliasing is gone)
Take me down to artifact city
Where everything's a smear
And you can't get rid of it easily.
Take me back (hardware anti aliasing is gone, yeah, yeah)
The only games that pull off decent-looking raytracing are the ones that only use it for reflections.
It looks like shit for lighting because of the issue posted ITT.
And it's not because it's "experimental," it's because not enough rays are being cast or "traced," so everything has to go through a million denoiser filters to not look like absolute shit, and needs to be done at the equivalent of quarter resolution (or lower) to not run like absolute shit even on a 5090.
Reflections can get away with it because they're often already heavily diffused, so you generally won't notice, but the results will be inferior to baked lighting in all instances: fully dynamic raytraced lighting needs light to, y'know, move, and when that light moves the effect of current raytracing completely fucking falls apart because it can't simulate the actual speed at which light moves.
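Some napkin math on the "not enough rays" part. Every number here is an assumption picked for illustration, not a measurement of any particular GPU or engine:

```python
# Napkin math on why realtime GI is undersampled and leans on denoisers.
# Every number here is an assumption for illustration, not a measurement.
clean_spp = 512                  # samples per pixel you'd want for visibly clean GI (offline uses more)
bounces = 3
pixels_4k = 3840 * 2160
fps = 60
needed = pixels_4k * fps * clean_spp * bounces
print(f"wanted: ~{needed / 1e12:.1f} trillion rays/s for brute-force clean GI at 4K60")

# What games actually budget: roughly a ray (or less) per pixel per frame,
# traced at a reduced internal resolution.
pixels_internal = 1920 * 1080
actual = pixels_internal * fps * 1
print(f"budgeted: ~{actual / 1e9:.2f} billion rays/s")
print(f"gap: ~{needed / actual:,.0f}x, papered over by denoising and temporal accumulation")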
They unironically will eventually. Early UE4 had a lot of similar complaints.
look at this blur which is a result of running at a low resolution and framerate
upgrades won't fix that
UE4 was shit right up until the end, a lot of recent UE4 games still run and look like shit (looking at you Homeworld 3).
2050
you have the power to run UE5 at native 8k with lumen cranked beyond cinematic settings and instant final gather
everyone has already moved on to AI generated visuals
Dead Island 2, Stellar Blade, and Callisto Protocol (post launch) all look and run great even on meh hardware.
>The only games that pull off decent-looking raytracing are the ones that only use it for reflections.
Nope. Metro Exodus EE, Jedi Survivor, Indiana Jones, KCD2, just to name a few. Games with RTGI that isn't noisy or smeary.
Callisto Protocol ran like absolute fucking dogshit for like a year, you could at least use a game that wasn't a fucking disaster at launch like Lies of P.
And Stellar Blade isn't out on PC yet, we'll see how it runs at launch.
Three literally whats.
BUT I'M MARRIED TO EVE BLAH BLAH BLAH
Callisto runs great now though and it's arguably the best looking UE4 game which is why I used it as an example.
But Lies of P works fine too.
UE4 ruined the remaster of the Arkham games with its shit ass lighting
No, the developers ruined the remasters by taking a bunch of non-PBR assets and trying to render them in a PBR engine. And in some cases they actively made the original models worse by changing them for some godforsaken reason. (see: Harley Quinn)
You see this shit even in pre-rendered animations done in UE5 like in this video at around 8:20 and onwards
youtu.be
why not both?
Jedi Survivor's RT looks like shit and KCD2 doesn't even use raytracing, it uses Cryengine's own approximation SVOGI which isn't "real" raytracing.
Indiana Jones as a whole is mediocre-looking, especially outdoors where you can see the fucking awful draw distance where shadows disappear 20 feet away from you.
>Callisto runs great now though
After a year.
Shit example, don't use it.
I really should dig up the release version of Misery 2.0 and play it just for seeing if the faggots at /sg/ were right in constantly bitching and moaning about it or if they were being retarded faggots as usual
but on the other hand I know some /sg/ members made a mod called "les miserables", and after reading what it is about and its changelog I can safely say it was an attempt at making an easy mode mod for Memery, and that isn't helping /sg/'s case one bit
t. a lil' nig that beat Misery this year and loved it despite some very questionable things that boil down to "we won't explain this because it's common sense" and some actual bullshit like Nimble's guns being shit thanks to their integrated scopes
and for the record I was among the people that got filtered by Misery when 2.0 just released, I didn't get back to it until this february
>Jedi Survivor's RT looks like shit
I'll grant you it looks unimpressive compared to standard lighting but that's because that game's standard lighting is already overcranked. And more importantly the RT does not look noisy or smeary.
>Indiana Jones as a whole is mediocre-looking, especially outdoors where you can see the fucking awful draw distance where shadows disappear 20 feet away from you.
Correct but entirely unrelated to its raytracing, which is like I said not noisy or smeary.
Because of the temporal ray tracing bullshit in UE5, this scene here is impossible to do without massive light fade/lag.
>but that's because that game's standard lighting is already overcranked.
Not really, it's pretty standard for a modern UE4 game that runs as poorly as Jedi Survivor does/did.
>Correct but entirely unrelated to its raytracing, which is like I said not noisy or smeary.
Nah, it's definitely noisy, it's just not as bad as your average UE5 game, partially because id put a lot of effort into their TSSAA tech in modern idtech, so temporal elements in that engine in general look a lot better than in UE, or most other engines in general.
RTX 5080
9800x3d
Teamgroup 32gb 6000 cl30 ram
Samsung 2tb 990 plus
What now?
5080
32GB
I'm sorry for your situation. CPU and SSD are fine, though.
It works fine with a 5090 and an Intel CPU
the finals
The only game that doesn't use UE5 features but old GI tech?
the whole point of denoisers and upscalers is to not have to do 1:1 pixel cost. The whole engine is designed around that, it will look like dogshit anyway
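The 1:1 pixel cost point in numbers. The per-axis scale factors follow the commonly cited quality presets (Quality ~0.67x, Balanced ~0.58x, Performance 0.5x); treat them as approximate:

```python
# Quick numbers on what upscaling saves: internal vs output pixel counts.
# Scale factors are the commonly cited per-axis presets; treat them as approximate.
out_w, out_h = 3840, 2160
for name, scale in (("Quality", 0.67), ("Balanced", 0.58), ("Performance", 0.50)):
    in_w, in_h = int(out_w * scale), int(out_h * scale)
    ratio = (in_w * in_h) / (out_w * out_h)
    print(f"{name}: {in_w}x{in_h} internal -> shades only {ratio:.0%} of the output pixels")
```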
I honestly have no idea what I'd need 128gb of ram for. I haven't had anything come close to even getting to 30.
Im gonna denoise your hole
>Not really, it's pretty standard for a modern UE4 game that runs as poorly as Jedi Survivor does/did.
Funnily enough, Jedi Survivor runs perfectly on AMD GPUs. It's a mess on Nvidia.
Half-Life Alyx and Counter Strike 2 look great
9950x3d
5090
64gb 6000 cl30
990 pro
door so fast it has afterimages
H-Hayai!
just use baked light-ACK
Preach brother.
Half-Life Alyx had more "is this real life" moments than all other AAA games combined.
And it doesn't even look like it's trying hard compared to UE5 throwing 80 quadrillion polygons at every surface.
architecture visualization/digital production/film industry
prease understand
4.24+ is where the engine started to be in decent shape. Then they fucked it all up big time on 5 for no fucking reason, blowing up what little documentation the engine had in the process
lazy fucks
Yeah, UE5
>lazy shit devs' fault
dont use bad features then?
old "future proof tech" looks like absolute dogshit now you have no idea what you are talking about tim
autistic sperg genius codes magnificent engine
retarded normalfags struggle to maintain and update it
jeets break future versions, making them even worse
the future is bright
sell out to MS
become soulless
we all seen it coming
I want ultra realistic games at 4K and 200 FPS without drawbacks
Ahh yes, "realistic."
this is more of a bad user settings result than a bad engine result
centennial getting shat on
fuck. that's my alma mater.
Nah, Lumen is just shit.
Nanite is also mostly shit.
Honestly? The remake is pure soul kino.
4K
but i didn't fall for that television marketing meme
I'm not really a fan of how CS2 looks but it's preference. BODYCAM is probably the most photorealistic game ever made and was done in UE5; if you're going for realism that's the best option
It does not run perfectly fine on AMD GPUs; it does not run fine on any GPU, because Jedi Survivor's stuttering issues happen irrespective of GPU. They're PSO cache issues, something the developers failed to address during development or in patches.
brown filter
anti soul gas
pure soul kino
Growing up I never expected game technology to go backwards in terms of cheating on graphics just so consoles can get away with selling weak hardware. Things were supposed to get more powerful, allowing fewer gfx shortcuts. Instead the engines automated the shortcuts, so those of us with good hardware have difficulty avoiding them.
Unfortunately the gaming development industry did fall for 4k, and unless you're at 4k output resolution you won't escape image instability or blur or visual issues. Actually you need about 1600p to escape all those but 4k is closest to that.
they captured the European wilderness and its pagan sovl superbly. Not that you'd know a lot about that, Dikshit Sukdeep.
then why use the engine if you have to write half of it from scratch? That's the definition of a shitty engine btw
I run The Finals at 4K 144 on a few year old AMD setup atm. That's UE5, and it's one of the more graphically intensive games on the market because of the destruction
now we just ended up having more fake frames to make up for that dogshit performance
moore's law died (yeah, rip) so now all they got is betacuck copes
...but enough about the original game.
there are now multiple sources of ghosting.
It was bad enough when it was just TAA but this is fucking bullshit man.
the gaming development industry did fall for 4k
game devs didn't fall for it, consumers did. game devs are giving consumers what the majority of them want
modern "remastering"
i propose we call the modern iteration of this practice "redeeming"
then what the fuck is the point of higher resolutions if you need to run things at 6 gorillion k to get the same level of crispness as before when we played at 1080?
devs are also paid by gyppy (You) manufacturers to force these memes in games
That law dying doesn't mean the curve flatlined, just that it got less steep.
i just wish devs would start treating texture sizes like languages and make them an option checkbox on install, instead of all being included in the base game for a fuckoff file size at resolutions you'll never use
true but people want ray tracing. they're retards, but it's what they want
consumers expect large upgrades as before, i.e. for the pattern to extend into infinity, that's why those copes are needed
I bet he voted for the lefty politicians that made it the way things are.
Don't have kids
Women do act like men
Immigration to fill roles of a million unborn babies (not like they will live in the politician's neighborhood anyway)