When does it end?
when you upgrade your potato PC
That does nothing to help with the stuttershit.
You would know that if you actually had a good PC.
gay soulless slop shill
LMAO THIS FAG DOES IT FOR FREE
5090 and 9950X3D still stuttering from bad frame pacing in virtually every single UE5 game
nah must be the PC
soon
our savior is upon us
People complain about that shit, but it's definitely an improvement over the age of un*ty games trying to burn your GPU while rendering low poly graphics.
That was really nice of that piece of malware disguised as a game engine to commit marketing suicide.
When does it end?
never. The tech boom is over. Computers are just going to be unreliable pieces of crap with 1000 vulnerabilities and weekly updates thanks to being permanently connected to the internet. Devs also know they can push updates instead of making solid software
Not OP but forget that bullshit.
4080, Ryzen 7 9800X3D, and 95% of UE5 games I've played so far have had stutter, texture blur, texture pop-in, meshes popping in and generally blurry image quality. All these things in various degrees. There's always performance issues. Literally the only UE5 game that ran as smooth as it should was Manor Lords of all things.
All advancements in technology start out rough you whiny faggot. The rough beginning is what can then be optimized.
Never. They are in hand with Nvidia to keep releasing shit games with shit optimization, so you'll need to buy the next $5k GPU to play at a passable frame rate. It's win-win for them, no matter how much the goyim cry about it.
Good thing the realistic looking game on ue5 is absolutely shit so you can play older better games, right? Who is out here being hyped for ue5 games anyway? I guess the next Witcher and cyberpunk? I can't think of anything else
Recently played a couple of Unreal Engine games that didn't look like shit: Dead Island 2, Slitterhead
they do exist just few and far far far between
how those nvidia drivers doin for ya?
Who?
Nah, it's just poorly coded because Apple fucked with the current generation that can barely code in C.
I've got a 4090 and a 7950X3D. I understand how to set up my PC, unlike you.
Remember that time when the Q3 engine (which was used by many games at the time) ran poor like shit and had all sorts of technical issues? Remember when everyone complained about the horrible performance and all that? Yeah, me neither you fucking faggot.
Post specs.
Wow, I fucking hate jews so much it's unreal
DAT ISH UNREL CUH
There's a reason (OpenGL) why nobody used the Q3 engine and everyone went Unreal.
What's so sovlful about this picture?
Our Engine, Who art in Epic, Unreal be Thy name; Thy Graphic come; Thy will be done in game as it is in trailers. Give us this day our daily thread; and forgive us our shitboxes as we forgive those who shitpost against us; and lead us not into timtation, but deliver us from evil.
Awomen...
I came here to say this too. Put together a 5090, 9800X3D build a month ago and UE5 games still run like ass compared to everything else.
I think they mostly fixed them at this point. I haven't seen any issues yet.
AMD
stutters
And you wonder why. Get a Mac like a real gamer.
Cawadooty still has idtech3 DNA. It's beside the point, anyway.
Literally the same issues emerge with any AMD or Intel combination of CPU and GPU. If you don't believe me go run a frametime graph while playing any UE5 game and then move around the game world
Unity has its physics timestep set by default to 50 FPS.
To fix it, you simply have to go to this setting (Project Settings > Time > Fixed Timestep) and change the 0.02 to 0.0167. That makes the physics timestep run at 60 FPS.
NOBODY DOES IT, and as a result, we have thousands of stuttery shit games running internally at 50 FPS, with horrible stuttery cameras, broken ass frame distributions and all sorts of absolute dogshit results, even in big titles.
One fucking number.
You really expect these people to actually know how to optimize shit and remove stutters?
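For anyone curious why that one number matters, here's a minimal sketch of the standard fixed-timestep accumulator loop (a generic illustration, not Unity's actual code; the only values assumed are the 0.02 and 0.0167 from the post plus a 60 FPS frame rate). With the 0.02 s default, the number of physics steps per frame is uneven, so anything physics-driven (camera, rigidbodies) hitches; line the step up with 60 FPS and it stays steady.

```cpp
// Generic fixed-timestep accumulator loop -- a sketch, not Unity's source.
// Shows why a 0.02 s (50 Hz) physics step against 60 FPS rendering stutters.
#include <cstdio>

int main() {
    const double frame_dt   = 1.0 / 60.0;  // render/frame time at 60 FPS
    const double physics_dt = 0.02;        // the 50 FPS default from the post
    // try 1.0 / 60.0 (the 0.0167 from the post) and the step count stays steady

    double accumulator = 0.0;
    for (int frame = 0; frame < 12; ++frame) {
        accumulator += frame_dt;
        int steps = 0;
        while (accumulator >= physics_dt) {   // run physics in fixed-size steps
            accumulator -= physics_dt;
            ++steps;                          // SimulatePhysics(physics_dt) would go here
        }
        // with the 0.02 default, roughly one frame in six runs 0 physics steps:
        // everything physics-driven freezes for that frame, which reads as a stutter
        printf("frame %2d: %d physics step(s)\n", frame, steps);
    }
    return 0;
}
```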
Stable release 1.32b / August 19, 2005; 19 years ago
How's the ambient occlusion? How about the dynamic volumetric lighting? Real-time shadows? Oh wait, it has none of those you disingenuous retard.
idTech has always been sloppy from Q3 onwards, and nobody liked using it for that reason. Unreal dominated because it was consistent in its development principles and offered reasonable performance and visuals.
This does solve the issue by not allowing you to run any games.
No games = no games stuttering.
What the fuck are you saying retard
None of this matters, all the stutters come from the CPU side, caused mostly by the game code being single core because it's entirely implemented in a massive blueprint script instead of using C++ and threads.
nothing, yet it pisses you off
Okay. Remember when (gold)source ran like shit and looked like ass? Remember Cathode having performance problems? Or how about that awful foxengine?
The point is I can keep naming engines that don't run like shit the way only Unreal 5 can. And let's be real, UE4 also had these problems, just less visible.
names a bunch of shitty engines nobody wanted to use
It's about to be winter down there in Brazil, correct?
*grinds your performance to a halt*
Nothing personnel.. kid!
good
unpopular = bad
The shift towards UE5 has nothing to do with quality. It has everything to do with cost reduction and hordes of asset flipping pajeets.
It has to do with the fact that it's an engine that's consistently easy to develop stuff at a high quality for, as proven by all the high quality releases like Oblivion Remaster, The Talos Principle 2 (and the remaster of 1), and so on.
Post some DF screencaps you made on your phone to tell me about the stutters. Extra credit: call me an amerishart.
Cool man. They implemented new features. They tried doing it a certain way and it didn't work well. Nothing you have said contradicts my original point that new features introduce new problems.
They didn't implement "new features". Blueprints are a horrible programming language that delivers nothing new or great; it just allows terrible programmers to write terrible code that runs like absolute dogshit.
That's not a feature, that is a problem.
high quality releases like Oblivion Remaster
This may be the best oxymoron I've read in a year. Oblivion Remaster runs like absolute dogshit, especially when you're out roaming in the world
dogshit
Oh, you're poor.
I didn't mention it stutters, and it doesn't for me. Just the average fps shouldn't be as low as it is for a 2500$ GPU.
Go back to your UserBenchmark bubble intelfag.
consistently easy to develop stuff
True. Hence the push for it. Cheap labour.
high quality
It's just a polished turd and you're blinded by it. Name me one UE5 game with a reactive worldspace that matches something like Selaco. Indie game, should be easy enough.
It's just not there. It's bland and superficial, performs badly, but all you can see is le pretty gfx. Oblivion remaster has been reported running like shit too. It's the de facto slop engine. Only a handful of somewhat creative games are running on UE5, everything else is just the slop you'd expect. And you're eating it up.
And before you get all poorfag on me too, I'm
That's the expected reaction of a poorfag who doesn't actually have a high end machine or know how the game runs on one.
it just allows terrible programmers to write terrible code that runs like absolute dogshit
Sounds like a programmer problem.
It's easy to pretend your magical imaginary machine does not stutter when it's just in your head because you're too poor to afford a real one.
let me guess.. amd?
$ after the number
Third worlder confirmed.
I don't even know what Selaco is.
No, you're poor. I have a shitty 4090 and it runs quite good, as long as you're not one of those "I have to play with only 'real' frames and pixels" dumbasses.
Ultimately yes, but if the blueprint system didn't exist, these people wouldn't be able to touch game code, and games would run better.
heh, you're poor! you dont even have the most expensive equipment!
uh, yea i do
w-well nvidia!
and if he said he had amd you would call him poor. this is why no one likes indians. annoying smarmy little faggots.
Selaco
And actual game with some love and soul put into it instead of outsourced pajeets shitting in the next UE5 slop
I have a shitty 4090
No you don't, and you don't even understand the problem, given you listed the GPU.
Good thing I wait 5 years to play new games. Some niggas have no patience.
not using Matrox
True poors.
Here we go again. Let's turn the thread into shitposting
Meanwhile in reality
youtu.be
Stuttering mess on a fucking 5090 and 9800x3d
5090 bro
No you don't, the game runs like donkey shit
It's crazy how you tech illiterate retards will defend these shitty fucking games. I'm glad people are pushing back and Doom is getting no real purchases the moment they decided to rush some slop out the door that runs like shit
Not many people can handle the superior matrox image quality and driver stability.
They end up just melting.
No, I don't. You sure told me, poorfag mcfaggins.
Why not just make them run at any FPS by default?
intel
Low IQ detected everyone laugh at this faggot, I had one that bought 2 5090s a few days ago seethe at me and shit himself because he was in full cope over having a Intel system and was begging to compare furmark benchmarks
LMAO
It's physics, you need to keep a constant rate or the math doesn't work.
However, the constant rate should by default be higher than the game's refresh rate.
If anything, it should default to 120 FPS, given most games using it will just be 2D slop.
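To illustrate the "constant rate or the math doesn't work" point, here's a tiny generic sketch (no particular engine's code, just semi-implicit Euler, the kind of integrator most game physics uses): step the exact same one-second fall with different timesteps and the answers diverge, so if physics ran at whatever the frame rate happened to be, gameplay would literally change with FPS. That's why engines lock physics to a fixed step and let the frame rate float around it.

```cpp
// Minimal sketch (generic, not any engine's actual code): the same 1-second
// fall stepped with different dt values lands at different positions.
// This is why physics advances at a fixed rate instead of "whatever the
// frame rate happens to be" -- otherwise results depend on FPS.
#include <cstdio>

double simulate_fall(double dt, double total_time) {
    double velocity = 0.0, position = 0.0;
    const double gravity = 9.81;
    for (double t = 0.0; t < total_time; t += dt) {
        velocity += gravity * dt;   // semi-implicit Euler step
        position += velocity * dt;
    }
    return position;
}

int main() {
    // same simulated second, different step sizes -> different answers
    printf("dt = 1/50  s: fell %.3f m\n", simulate_fall(1.0 / 50.0,  1.0));
    printf("dt = 1/60  s: fell %.3f m\n", simulate_fall(1.0 / 60.0,  1.0));
    printf("dt = 1/240 s: fell %.3f m\n", simulate_fall(1.0 / 240.0, 1.0));
    return 0;
}
```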
AMD shill tries to deflect how wrong he was
Any high end CPU from the last few generations is adequate for gaming. I don't worship Intel, but you seem to worship AMD because the saars in charge of your call center told you to. Post your specs, poorfag (you won't, and will continue to assert your ambiguous PC is better).
Le mayonnaise. Lawl. Ruffles have ridges.
Intel has been worthless for 3-4 generations now and you bought it. I find it hilarious how you fucking morons always try to poor-shame people when you can't even do basic research
Also sit down I tower over you
The PC really doesn't help. 7900 XTX + 9800X3D at 1440p, and the games still have insane stutter. As someone already mentioned, even a 5090 gets a bad experience. Arch Linux is supposed to have better 1% lows, and the 9800X3D also greatly improves 1% lows while gaming, but not even these things can save me from the UE5 stutters. The only solution for UE5 is to boycott it.
Now post one with a timestamp.
arch.b4k.dev
Do you blame Ferrari when a drunk asshole kills a family in a minivan?
he has to be fucking stopped
I blamed The Dark Knight for that guy who looked just like the Joker killing those people in the theater.
7800x3d and 4080
a midrange computer should be able to play on medium
If they made a car that can only go at either 0 KPH or 300 KPH, and made driving at normal speeds a lot harder than normal? Yes.
You can play the vast majority with DLSS Quality and Framegen cranked up at 4k target. That's how I play on my 4090, and the 4080 isn't slow enough to be unplayable (especially at medium like you claim). Native is pretty much gone at this point due to transistor size limitations, and people need to accept that. Even the next process that comes along is a bandaid for what's going to be a catastrophic halt.
Are we done yet?
I don't need to larp and you should really try to be humble and have standards
1nm? nah, it will be great, 20% more transistors probably, and $600 per chip die, a thing of beauty.
...fucking what? Want to do me a favor and try that one again?
He shouldn't need that for the majority of these shitty games to run, given they offer hardly any graphical upgrade. Framegen also requires you to have a high base framerate
Me too. I might have considered it if I had paid money to see The Dark Knight Rises,
what's my motivation?
Without the retarded car analogy: UE4 is designed in a way that makes programming well for it harder than for other engines.
You either use blueprint, which is single core trash, or you use C++ with an atrocious API full of retarded dog tricks and massive interfaces.
There's no in-between.
AMD's 1% lows beat Intel's average these days in gaming stress tests. I used to go Intel only but times have changed with how shit they've been for gaming lately, and how far X3D chips have come.
If there was an Intel competitor for X3D I'd probably buy it. I know they can do it too, they have had Xeons in the past that have an X3D like L3 cache on them, but the fact right now is that a consumer grade chip like that doesn't exist and you're gimping yourself hard in gaming by buying an i9 over an X3D chip.
My advice to you is to maybe quit being a fanboy for a company that sold an entire generation of degrading chips and then cut the power to them as the fix.
Crazy how silent the 4090 anon became
Retards like you hurt us all, and there's no way most of these games shouldn't run at 4K native 120 FPS on top end cards when the graphical improvement for many of them is small to nonexistent.
Honestly, I was just seeing how long I could keep my straw man going at this point. I literally don't even know what "blueprint" is. I stand by my retarded car analogy of blaming a company for giving tools to people who can't handle using them, though.
even his monitor is better
Brutal mogging
calls others poor
gets mogged
runs like a bitch
Seriously no rebuttal?
You just turn full bitch?
when people realize it's not an engine problem but a greedy suit problem
it lowers the bar of entry so dramatically that you can get cheap overseas monkeys to make 2/3 of the game for you
any AAA dev that doesn't use their own in-house engine should be bullied out of the industry because they're obviously not serious about their game
poorly optimized
This is the era of fake frames and resolutions
Why render a frame when you can guess?
I have a way shittier PC and Expedition 33 ran fine
Only game so far that just doesn't run well enough to be playable for me is MH Wilds which has its own shitty engine
Crazy how silent the 4090 anon became
Retards like you hurt us all, and there's no way most of these games shouldn't run at 4K native 120 FPS on top end cards when the graphical improvement for many of them is small to nonexistent.
he's away sucking nvidia ceo penis, he'll be back in a sec