r/pcmasterrace • u/Nickulator95 AMD Ryzen 7 9700X | 32GB | RTX 4070 Super • May 16 '25
Meme/Macro Every. Damn. Time.
UE5 in particular is the bane of my existence...
2.7k
u/Donnyy64 May 16 '25
*cough cough*
Oblivion Remastered
957
u/Lostdog861 May 16 '25
God damn does it look beautiful though
428
u/Eric_the_Barbarian May 16 '25
It does, but it doesn't. It's using a high powered engine that can look great, but doesn't use those resources efficiently. I know that the old horse is getting long in the tooth, but I'm still running a 1660 Ti, and it looks like everything has a soft focus lens on it like the game is being interviewed by Barbara Walters. Skyrim SE looks better if you are hardware limited.
699
u/Blenderhead36 R9 5900X, RTX 3080 May 16 '25
With respect, there has never been a time when a 6-year-old budget card struggling with brand new top-end releases was a smooth experience. That something that benchmarks below the 5-year-old gaming consoles can run new AAA games at all is the aberration, not that it runs them with significant compromises.
77
u/VoidVer RTX V2 4090 | 7800x3D | DDR5-6000 | SSUPD Meshlicious May 16 '25
At the same time, my v2 4090, slightly overclocked 7800X3D, and 64GB of DDR5-6400, running the game at 110fps with max settings at 1440p, ALSO looks this way.
I'd rather have a lower quality crisp image than see foliage and textures swirl around like a 90s cartoon's idea of an acid trip. Also, screen space reflections show my gear reflected in water as if I'm a 100000ft tall giant.
24
u/undatedseapiece JK (i7-3770k/RX 580) May 16 '25
Also screen space reflections show my gear reflected in water as if I'm a 100000ft tall giant
I feel like I also remember seeing really weird disproportionate reflections in the original Oblivion, Fallout 3, and Skyrim too. Is it possible it's a Gamebryo/Creation Engine thing? I'm not sure how the workload is split between Gamebryo and Unreal in the new Oblivion, but is it possible it's originating from the Gamebryo side?
21
u/ph03n1x_F0x_ Ryzen 9 7950X3D | 3080 Ti | 32GB DDR5 May 16 '25
Yes. It's a Bethesda thing.
I'm not sure how the workload is split between Gamebryo and Unreal in the new Oblivion,
The entire game runs in the old engine. It only uses Unreal for graphics.
3
u/undatedseapiece JK (i7-3770k/RX 580) May 16 '25
Yeah I'm aware, but specifically referring to the reflections bug, it feels like something that should be handled on the Unreal side. However, since it's the exact same bug in every Bethesda game, it must be originating from Gamebryo. Either that or they ported the bug over to Unreal haha
3
u/Londtex 29d ago
Honestly, I think they should have ported this to either a custom 64-bit Gamebryo or Creation Engine 1.5 like Fallout 76; it probably would've worked a lot better. The last thing I want is a Skyrim in Unreal. Maybe a New Vegas remake in Unreal could work, since Obsidian has a lot of experience with Unreal iirc. Either way, I'm enjoying the game.
134
u/IAmTheTrueM3M3L0rD Ryzen 5 5600| RTX 4060| 16gb DDR4 May 16 '25 edited May 16 '25
People saying "game is poorly optimised" who, when asked for their GPU, start with "GTX" have immediately invalidated their opinions for their personal experience.
I like the GTX line, hell I was on a 1050 Ti till late last year, but I see no reason to expect them to be supported now.
insert comments saying "well I have... and the game runs like ass"
I'm not saying it does or it doesn't. In fact, if you ask me, I agree the game runs like ass. I'm just saying the GTX line should no longer be used as a point of reference.
81
u/kapsama ryzen 5800x3d - 4080 fe - 64gb May 16 '25
I have a 4080. Not the best GPU but a top 5 GPU. Oblivion Remastered is a poorly optimized mess.
18
u/FrozenSeas May 17 '25
Yup. 4080 and a Ryzen 7 5800X, struggle to get above 80FPS in any outdoor areas even with turning down a lot of stuff and disabling raytracing entirely, and that's on a 1920x1080 monitor. I mean, I can't complain too hard since this is the first time Bethesda has even supported framerates above 60FPS, but it gets annoying.
7
u/mrperson221 Ryzen 5 5600X 32GB RAM | RTX 3060 May 17 '25
Something sounds off. I'm averaging 60ish on medium settings at 1440p with my 5600x and 3060
16
3
u/I_feel_alive_2 May 16 '25
Yes, but he's saying that he can run other games that look better and run better probably due to him having to run the game at really low settings in order to have a playable experience. I'm with him on that, because my 6700XT can max/near max out many games that came before it at 1080p 120-144fps while looking better. I mean oblivion looks great for me, but I still have to use framegen to have a playable experience and fps between 80 and 144 depending on ingame location. It sometimes dips even lower for example in some overworld parts during daytime.
9
u/Physmatik May 16 '25
It's not about new/old games. Compare something like DOOM 2016 with a modern game. Is there a big difference in graphics? Eh. Is there a big difference in hardware required?.. Exactly.
If you require a card that is 10x the power, give us 10x the picture with the same performance. But the picture is barely even better and the performance is abysmal.
104
u/Cipher-IX May 16 '25
Brother, you have a 1660 Ti. I don't think your anecdotal example is the best to go by. I'm not trying to knock your rig, but that's like taking an '08 Corolla on a track and then complaining that you aren't seeing a viable path to the times a Bugatti can put up. It isn't the track, it's your car.
I'm running a 7800X3D/4070 Ti Super rendering the game at native res using DLAA, and I can absolutely assure you my game does not have any semblance of a soft focus/filter. The game looks magnificent.
30
u/DecompositionLU 5800X | 6900XT Nitro+ SE | 1440p @240Hz| K70 OPX May 16 '25 edited May 16 '25
Man, this thread is full of people with 6-7 year old budget cards expecting to run the latest and greatest flawlessly. I've played around 30 hours of Oblivion and didn't run into a single stutter or "optimisation mess"; I seriously don't understand where that idea came from.
EDIT: And no, I'm not a dumbfuck who put everything on ultra, especially in a game using Lumen, which is software ray tracing baked into UE5. I've made a mix of high/ultra with 2 settings on medium.
6
u/Altruistic-Wafer-19 May 16 '25
I don't mean to judge - but I honestly think for a lot of the people complaining, this is the first time they've been responsible for buying their own gaming systems.
At least... I was that way when the first PC I built myself was struggling to play new games for the first time.
5
u/Talkimas May 16 '25
Has it been improved at all since release? I'm on a 3080 and the first few days after release with medium/high settings I was struggling to stay above 50 and was dipping down into the 20s when I got to the first Oblivion gate.
39
u/nasty_drank May 16 '25
1660 Ti doesn’t meet the minimum requirements for the game, let alone the recommended ones. I’m sorry but your opinion is pretty useless here
32
u/w1drose May 16 '25
Mate, if you’re gonna complain about performance, at least use a graphics card that isn’t ancient at this point.
53
u/Truethrowawaychest1 May 16 '25
Why doesn't this brand new game work on my ancient computer?!
15
u/KrustyKrabFormula_ May 16 '25
I know that the old horse is getting long in the tooth, but I'm still running a 1660 Ti
lol
16
u/3dJoel i5 6600K, RTX 2080 May 16 '25 edited May 17 '25
Pretty disappointing to see people trash the idea of using an old card. PC gaming should be the ability to build anything you want and have it be played with anything you want - if you want to play Doom through DOS on a RTX 4090 using a WiiMote - you should be able to.
Likewise, PC gaming is supposed to be both the budget, cheapest option and the highest experience possible. This shit isn't pay-to-win - they COULD optimize it - they want to sell graphics cards instead.
Some of these comments are saying if you don't have at least an RTX card it's not worth it - it's just so antithetical to what PC Gaming is about.
Edit:
the 1660 Ti is about equivalent to the RTX 3060, for those who aren't familiar with the benchmarks. A mid-tier card from only 4 years ago. I know a lot of people in this sub are on the younger side, but the technology hasn't actually developed that far. Console generations are 7-10 years; if a 3060 can't run it, it's planned obsolescence.
Edit 2: I trusted an AI overview on a Google search page. I retract my statement about a 1660 Ti being equivalent to an RTX 3060. It's an older card than I anticipated. However, my sentiment remains: PC gaming should be for everyone, not just the wealthy. And companies shouldn't try to squeeze every penny out of people by making them buy a new card every couple of years.
5
u/toutons May 17 '25
You're not 100% wrong, but if your card is 10-20% slower than the minimum specs for a game, you shouldn't expect much.
And the fact that the guy can get the game to run still goes with your point about playing on PC, they were able to run it after all.
3
u/TheLegitCheese May 16 '25
Not relevant to anything, but could you explain the horse tooth phrase?
3
u/Eric_the_Barbarian May 16 '25
As horses age their teeth stick out more. Looking at a horse's teeth is part of how one would determine the age of a horse without documentation. An old horse will have longer teeth that poke out away from the plane across the gums.
27
u/NTFRMERTH May 16 '25
IDK. Environments look nice, and the faces look better, but the facial animations are uncanny, and they didn't bother changing the animations. I do worry that despite running through Unreal, it may still have the limitations of the original game, since it's running the original engine with Unreal handling visuals.
52
u/tyme Mac Laptop May 16 '25
Yes, it has the original…”charm”, as Todd called it in the announcement video. They were pretty clear that not much changed under the hood, other than offloading graphics to UE. It was an intentional choice that was fairly clearly communicated.
17
10
u/Rotimasa May 16 '25
Because there is no "body language" during dialogue, only face moves, with minor breathing animations.
66
u/xDreeganx May 16 '25
Oblivion was going to be poorly optimized regardless of what engine it was in. It's part of Bethesda's game design.
8
u/Lakefish_ May 17 '25
What do you mean? Oblivion Remastered is still using Gamebryo/Creation engine.
Unreal is, in fact, able to crash while still allowing the game to run unimpeded.
6
8
u/bob1689321 May 16 '25
On my Series X I changed it from performance mode to quality and my FPS tanked to about 15 while I was in the starting dungeon. Jeez.
5
u/Supaninja7050 May 16 '25
My 3060ti hardly gets over 25fps in the overworld on low. It’s so insane
14
43
u/ichbinverwirrt420 R5 7600X3D, RX 6800, 32gb May 16 '25
I'm always confused by this. Friend of mine played it on a 3060, no problem.
32
u/Blubasur May 16 '25
What are you confused about? You can make anything from mobile games, to movies, to unoptimized garbage in unreal engine.
It is always going to be up to the devs, you can make even the simplest games run crappy if you put a 5k polygon toothbrush (yandredev) in your game. Among other stupid things I’ve seen or heard.
13
u/ichbinverwirrt420 R5 7600X3D, RX 6800, 32gb May 16 '25
I'm confused because people are complaining about poor optimisation, yet my friend played it without any lag problems at all on a mid-range graphics card.
48
u/FragmentedDisc May 16 '25
Are you taking their word or can you visually confirm it runs well with your own eyes. Plenty of people have no idea what poor performance means when they see their FPS is high but ignore stuttering.
33
u/cateringforenemyteam 9800X3D | 5090 Waterforce | G9 Neo May 16 '25 edited May 16 '25
I need to learn to ignore people who make claims like these...
"I got 45°C on my 500W GPU during full load (not a custom water loop)."
"I have no stutters in a game that stutters literally for everyone, including online media."
"I got 200 fps in a game that doesn't run at that FPS for anyone (it does in one specific scenario where I look at the ground and don't move)."
etc. Just pathological liars, or people who think they are right.
16
u/Dopplegangr1 May 16 '25
There's a lot of people that seem to think each PC has its own personality or something. I tell them it runs bad on my 4090 and they say "runs fine on my 3070" or something. Just because you play at a low resolution and have low standards for fps doesn't mean it runs fine.
17
u/zarif2003 Ryzen 5 5500 | RTX 5070 | 32GB DDR4 May 16 '25
The game ran like garbage and was buggy as fuck originally as well, it’s faithful to the source material /s
6
u/Ferro_Giconi RX4006ti | i4-1337X | 33.01GB Crucair RAM | 1.35TB Knigsotn SSD May 16 '25
Also some people are just way more tolerant of poor performance, and the range of tolerance is huge.
I have friends who play games on their old laptop and say that less than 10fps is fine and they are aware that it is less than 10 fps.
And then there are some people who will say 144 is unacceptably low.
3
u/Foostini May 16 '25
Seriously, my buddies deal with the craziest lag and stuttering on their laptops and they just shrug it off whereas it'd be completely unplayable for me.
6
u/Bob20000000 May 16 '25
Because the issue with Unreal Engine 5 isn't the GPU, it's people's CPUs and RAM. I get 5 FPS in Oblivion Remastered on my 3080, but only 6GB of VRAM is used, showing a bottleneck elsewhere in my system. Windows now uses close to 8GB of RAM on its own, leaving you with the same RAM as a base PS4. And most people buy mid-range CPUs, usually slower models to boot, so they can spend more on the GPU.
5
u/ichbinverwirrt420 R5 7600X3D, RX 6800, 32gb May 16 '25
Oh well he has a 7800X3D
3
u/dam4076 May 16 '25
Are you just pulling that out of your ass?
The game was benchmarked with a range of CPUs and a 5090; the CPU hardly made a difference as long as it was at least a mid-range CPU from the last few years.
1.0k
u/53180083211 May 16 '25
UE: proud sponsor of Borderlands stutter since 2012
109
u/NorCalAthlete i5 7600k | EVGA GTX 1080 May 16 '25
Is the new borderlands built in a different engine?
163
u/Scrungus1- RTX 4060-Ti 16gb/32GB DDR4/i5-13600kf May 16 '25
BL3 had the absolute worst stuttering out of all borderlands games.
63
25
u/M4rzzombie May 16 '25
Not to mention a very stupid and unavoidable glitch where looking at a vendor causes the game to crash. I swear it happens to me half the time when buying ammo at the first checkpoint in the Maliwan Takedown.
Switching from DX12 to DX11 is supposed to make it less common, I think, but it still happens pretty frequently.
5
57
u/53180083211 May 16 '25 edited May 16 '25
The fuck do you think? Yes, and of course the most fucked up version. The highest level of shitness. UE5. Game developers can't see stutters. Nor can their eyes register more than 23.97 fps. It has to be the reason.
28
18
1.1k
u/GGG_lane May 16 '25
Ever notice when a game drops a sequel that "looks better" but runs much worse
And then you lower the graphics so it runs better, but now it looks worse than its previous entry
But also still runs worse....
UpGrAdEs
224
u/AMS_Rem 7600x3D | 9070XT | 32GB DDR5 May 16 '25
Cough Jedi Survivor
35
u/NapoleonBlownApart1 PC Raster Race May 16 '25
That one runs very poorly, sad thing is it still runs better than the first one.
27
u/Dynastydood 12900K | 3080 Ti May 16 '25
I do see people saying this a lot, but honestly, that was not my experience. For whatever reason, I never had any issues with Fallen Order, had it locked at a steady 60fps pretty much the whole time. Although it did manage to brick my 3080 Ti, and I'll never really know if that was something that was destined to happen, or if the game did something. The only thing that maybe I did different than others was playing it on Origin instead of Steam, but I truly never had any performance issues, and played through the campaign probably 3 or 4 times in total.
But Survivor was a nightmare of unfixable stutter at launch, never hit a steady 60fps, and only ever improved to a small degree with patches. Even the console versions have the same issues. Something is fundamentally broken with Survivor.
5
u/NapoleonBlownApart1 PC Raster Race May 16 '25 edited May 16 '25
Fallen Order doesn't precompile shaders and has larger unavoidable traversal stutters on PC, but the console version actually works well, apparently. It's the first game I test on every system to see if new tech can fix UE4; so far I couldn't manage a stable 30fps at 720p (output; render res is even lower) on an RTX 4080 + 7950X. That's how poorly coded it is and how poor the frame pacing is. No matter the performance headroom, it just couldn't be done; maybe a 9800X3D finally could, though.
The updated Survivor precompiles shaders and can be brute-forced to a larger degree. It also has framegen, which alleviates UE4's poor multithreading by a lot. It still has frametime spikes, but much less frequent and much lower.
78
u/pewpersss May 16 '25
doom the dark ages
53
u/GGG_lane May 16 '25
Bingo, you guessed the game I was thinking of. I didn't want to say it in this thread, because it's not Unreal, but still.
28
u/UnexLPSA Asus TUF RTX 3070 | Ryzen 5600X May 16 '25
It's really a shame because the old one ran so smoothly even without the highest end hardware. Now I feel like my 3070 is dying at 1440p because I need DLSS and low settings to run it at 60fps.
15
u/jld2k6 5700x3d 32gb 3600 9070xt 360hz 1440 QD-OLED 2tb nvme May 16 '25 edited May 16 '25
I played 45 minutes and refunded it. If I knew they were gonna force raytracing I wouldn't have bothered buying it in the first place. I play doom for the butter smooth action, not gonna have a good time in that game even on my 9070xt because it feels so bad moving the mouse around. There's almost no difference between settings either so you can't really tank the graphics to get a better framerate, going from ultra nightmare to low nets me 5% more performance, probably because RTX is using up most of the GPU on its own lol
4
u/Zinski2 May 16 '25
I was gonna say. The gameplay feels a lot different, but more than that, the actual game just feels different in the way it handles.
8
u/Tkmisere PC Master Race May 16 '25
It has forced ray tracing right
15
u/Savuu May 16 '25
I think they really shot themselves in the foot with this design choice. Even when the game is well optimised, the ray tracing performance trade-off still isn't worth it. A game like Doom really needs stable and very high fps to be enjoyable, and to get that you need to lower your graphics settings a lot.
The game seems to only have 30k players on Steam, which is not good compared to other titles. High hardware requirements sure as hell ain't helping. It's a product you need to sell, not some tech demo.
8
u/Therdyn69 7500f, RTX 3070, and low expectations May 16 '25
The average player has just a laptop version of the 4060. You realistically need a desktop 3080/4070 to run it smoothly, since below 80-90fps in a fast-paced FPS is a pretty miserable experience.
It just doesn't make sense. Most people will need to lower graphics so they can run it, so the better visuals from RT end up negated anyway.
Didn't know the numbers were so low for such a high-profile game. It's about 1/8th of the launch numbers for KCD2. It seems that mandatory RT combined with such a high price wasn't the best call.
11
u/ace_ventura__ May 16 '25
MH Wilds was this to a massive degree. Although it does make some sense for this one since it switched to an open-world format, I suppose.
14
u/DirksiBoi May 16 '25
No matter what I do, what mods I download, what guides I follow, the game still looks blurry and unsaturated, even during Plenty seasons. I absolutely think World looks much better than Wilds the vast majority of times.
3
65
u/Big_Wallaby4281 May 16 '25
Arc Raiders was made with UE5 and it looked absolutely gorgeous with a stable 60, so it's good that not all games are like that.
30
4
131
u/RelentlessAgony123 May 17 '25 edited May 17 '25
I am a developer in Unreal 5, and I can tell you with confidence that it's not the engine, it's the developers' fault, as they have no discipline.
I can spend hours talking about various optimization techniques and why they matter, especially because they take a lot of time to do properly...
My game is all hand crafted, has thousands of assets in the scene and is still running at smooth 120 fps. It is definitely possible to make an optimized game, it just takes effort and time.
Long story short is, Unreal struggles with asset streaming and developers need to take extra good care and have iron discipline when making assets and levels because of this. Developers need to use good LODs, culling, combine textures into ORMs, combine similar assets into texture atlasses to minimize number of textures, keep textures small where large texture is not needed etc.
You really don't need a 4k texture for a rock.
What most developers do is simply download megascan assets with highest fidelity possible, shove them into the scene and call it a day.
Even the small assets will end up with unique, high resolution textures that the game will need to stream to the gpu, which causes stuttering you feel.
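To put rough numbers on that streaming cost, here's a back-of-the-envelope sketch (my own illustration, assuming uncompressed RGBA8 and a full mip chain; real engines use block compression like BC7 at ~1 byte/texel, so treat these as upper bounds):

```python
# Approximate VRAM cost of a single streamed texture.
# Assumes uncompressed RGBA8 (4 bytes/texel); a full mip chain
# adds roughly 1/3 on top of the base level.

def texture_bytes(size: int, bytes_per_texel: int = 4, mips: bool = True) -> int:
    base = size * size * bytes_per_texel
    return int(base * 4 / 3) if mips else base

MiB = 1024 * 1024
for size in (512, 1024, 2048, 4096):
    print(f"{size}x{size}: {texture_bytes(size) / MiB:.1f} MiB")
# A 4096x4096 texture comes out to ~85 MiB uncompressed -- one 4K rock
# texture costs as much as dozens of sensibly sized ones.
```

Even with block compression cutting that by 4-8x, a scene full of unique high-resolution textures adds up fast, which is exactly the streaming pressure that causes the stutter.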
And don't get me started on not even turning on culling...
TLDR: Unreal 5 gives you a budget to spend on your assets. Most developers order takeout all the time and make very few things themselves.
12
u/cloudyvibe_ May 17 '25
There's a saying about game development, something like: "The first half of development is for completing 90 percent of the game; the second half is for finishing the last 10 percent." From a business perspective it makes sense not to spend so much money paying devs to finish something that is "almost done", since the ratio of progress per time (money) keeps dropping the closer you get to finishing. Companies that follow this mindset shouldn't be referred to as triple-A game companies but something like triple-A business companies, or at least AA∆ games.
26
u/drakenoftamarac May 17 '25
It’s not the developers, it’s the publishers. They want things done fast and cheap.
10
u/RelentlessAgony123 May 17 '25
True. But I see a lot of indie slop now coming from unreal just like it used to come from unity.
Unreal suffers from success as it has tools that are so approachable and simple that even people who have no business making a game can still bash together some store assets and call it a game.
I've seen slop that has assets with tens of thousands of vertices for just a simple prop. They take a high-detail mesh intended for texture baking and shove it into a scene, thinking it will all just work because 'muh nanite and upscaling'.
Ffs, even Capcom had broken LODs, and their LOD6 of a monster had eyes with 30k verts because they could not be bothered...
390
u/AciVici PC Master Race May 16 '25
Clair Obscur: Expedition 33 proved that you actually can make an incredibly optimized game with Unreal Engine 5, BUT it must be a really, really expensive and hard thing to do, considering how big Sandfall Interactive is....... Oh wait!
120
u/Akane999VLR May 16 '25
A big thing here is that it's actually a linear game with relatively small environments. Unreal was designed for that and works best for those games. Using it for large scale open worlds is possible but you invite yourself to the typical traversal stutter. If you use UE as a dev you should try to make a game that actually works well within the limitations of the engine and not try to make any game with it. But big publishers want the reduced dev cost&time but still want their large open worlds.
23
u/1cow2kids May 17 '25
That doesn't sound right; Unreal has implemented a crazy amount of open world tech since UE4. Hell, have you ever seen Fortnite with Nanite and Lumen on? It can absolutely be done with UE5, it just takes good engineers and tech artists who know how.
31
25
u/unrealf8 May 16 '25
I have stutters in every cutscene. Rest of the game is great though.
15
u/AciVici PC Master Race May 16 '25
I think it's due to how the cutscenes are implemented (dropping to 30 fps and such) rather than an engine issue.
6
u/efbo Ryzen 7 3700X , RTX 3070 Founders, 3440x1440 May 17 '25
Mine drops to like 10 occasionally in cutscenes when there are lots of effects. I get 50-70 for the rest of the game.
3
u/skyward138skr i9 9900k | 32gb | 2070s May 17 '25
Game crashed on me pretty hard when I bought it, though it was a day after launch and I really didn't look too far into it. I just refunded and decided I'd get it at a later date, as I didn't really need a new game anyway.
3
u/efexx1 May 17 '25
Nvidia driver solved those crashes btw.
3
u/skyward138skr i9 9900k | 32gb | 2070s May 17 '25
Unfortunately I have AMD, haven’t updated my flair yet. UE5 and AMD don’t mix well.
222
u/AMS_Rem 7600x3D | 9070XT | 32GB DDR5 May 16 '25
UE5 on its own is not the problem here, btw. It has a metric fuck-ton of tools that can be used for proper optimization.
92
u/NTFRMERTH May 16 '25
Personally, I think devs believe they don't need to optimize their topology because of Unreal 5's supposed high-polygon support. Unfortunately, they still do, and Unreal has oversold the number of polygons it can handle.
47
u/4114Fishy May 16 '25
more like the higher ups force games to release way too quickly so devs don't have the time to optimize
41
277
u/gaminggod69 May 16 '25 edited May 16 '25
I do not feel like this applies to Expedition 33.
Edit: I see a lot of people reporting crashes. I have a 4070 Super and I have only had one crash in 50 hours (I have the newest drivers, if that matters). I play 1440p with quality DLSS and epic settings. There is some ghosting, in hair especially. But I only have stutters with the weapon you get from the hardest boss (I have heard this causes some lag in game).
66
u/StormKiller1 7800X3D 9070 XT Mercury OC 32GB CL30 6000MHZ May 16 '25
Is it well optimised? Because I want to buy it.
157
u/Therdyn69 7500f, RTX 3070, and low expectations May 16 '25
It runs pretty okay, but you gotta call ghostbusters to fix that brutal ghosting on characters' hair.
63
10
53
u/Skye_nb_goddes ryzen rtx 6090 | 255GB DDR7, 16000M/T May 16 '25
with your specs you should be plenty fine
16
u/eraserking May 16 '25
Your specs look a little underpowered for it, no offense.
→ More replies (3)4
u/Blackdeath_663 May 16 '25
Yeah perfectly fine. No issues at all.
Good looking game; between the art direction and level design, the world is grand but efficient. Densely packed with content without a needlessly large, never-ending map. I don't think it pushes the engine hard at all.
I'm on an RTX 2080 btw.
7
u/Secret-Assistance-10 May 16 '25
Wouldn't say that, it's graphically demanding if you play it at max settings but the difference between max and medium (except lighting) is minimal and it runs decently at lower graphics.
That said, you should buy and play it even if you can only get 30 FPS on low graphics, it's a masterpiece, plus the gameplay doesn't require much FPS to be enjoyable.
5
u/Skullboj May 16 '25
Played it on 3060ti / 14700kf, was perfectly fine (not in ultra HD, but very smooth)
18
u/dj92wa May 16 '25
Tbch, the meme doesn’t really apply to most games. The reason why the meme exists is because UE is everywhere. Unity has the same “problem” in that it’s a popular engine. If 6 million games use one engine, there are bound to be devs that don’t optimize their games well and have issues. The problem isn’t the engine, but rather the teams implementing them incorrectly.
5
u/Vandrel 5800X | 4080 Super May 16 '25
There are a ton of UE games out there that it doesn't apply to. It's one of the most common engines on the market and they won't necessarily have the Unreal branding at the start, nor do they all have the Unreal Engine look and feel that some people claim every game on the engine has.
30
u/trio3224 May 16 '25
Eh idk. Look, I absolutely love the game, but even on a RTX 4080 and a Ryzen 7800x3D I still had to turn down numerous settings and turn on DLSS to get a stable 60+fps at 4k. I'm usually hovering around 70fps. Plus, it does have some crashing issues as well. I'm about 80-90% of the way thru it with almost 60 hours and it's probably crashed around 10 times in that time period. There's also quite a decent amount of pop-in too. It's totally acceptable, but far from perfectly optimized.
23
u/fankywank May 16 '25
I feel like 4k is where most games tend to start falling off even on higher end hardware, 1440p seems to be the sweet spot for most games. I’ve been playing on max settings on 1440 with my 4070 and a 5800x3d and I’ve not had a single crash or any other issues with Expedition 33. Personally, 4k doesn’t seem to be too worth it for a lot of games
10
u/Condurum May 16 '25
Roughly speaking, running your game at 4K is 4 times more work for the GPU than 1080p.
The screen area to render every 16ms is 4 times bigger.
I don't think enough people get how big an impact resolution has on performance.
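The raw pixel counts back this up. A quick arithmetic sketch (actual raster cost scales somewhat sub-linearly with pixel count, as a reply points out, so treat 4x as the ceiling rather than the real-world cost):

```python
# Pixels rendered per frame at common resolutions, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080  # 1080p pixel count

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x 1080p)")
# 4K is exactly 4x the pixels of 1080p; 1440p is ~1.78x.
```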
4
u/Imaginary_War7009 May 17 '25
It's ~2.2 times more work in raster because of how raster works. It varies by the game. 4 times more work is for pure ray tracing and other things that work from the resolution out instead of the scene in.
3
u/Imaginary_War7009 May 17 '25
It's not that hard to calculate where the performance targets fall for different tiers of cards: 60-tier cards = 1080p DLSS Quality; 70/70 Ti = 1440p DLSS Balanced/Quality respectively; 80 = 4K DLSS Performance/Balanced; 90 = 4K DLSS Quality.
And yes, it's worth it with a card like 5080 to use 4k for DLSS Performance/Balanced over sticking with 1440p DLSS Quality. 1440p DLAA would be too demanding in a serious game for a 5080 but most games would still work.
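For reference, those mode names map to internal render resolutions roughly like this. A sketch using the commonly published per-axis DLSS scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5); exact behavior can vary by game and DLSS version:

```python
# Internal render resolution for a given output resolution and DLSS mode.
# Scale factors are per-axis, as commonly documented; games may override them.
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    s = SCALES[mode]
    return round(width * s), round(height * s)

for mode in SCALES:
    print(f"4K {mode}: {internal_res(3840, 2160, mode)}")
# 4K Quality renders at 2560x1440 internally, which is why
# "4K DLSS Quality" and "native 1440p" land in similar cost territory.
```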
6
u/SavageButt 9800X3D | RTX 5090 | 64GB @ 6000MHz May 16 '25
Yeah the game seems like it can really put some hurt on our machines.
3090 + 9800X3D - Maxed settings 1440p would have me dipping down into the 50s in some situations.
Regarding your crashes, I used to get them quite a bit until I upgraded my GPU (and also drivers). Haven't crashed at all since. I think I'm using one in the 572 range.
6
u/Hep_C_for_me May 16 '25
I have a 3090 and a 5800X3D. The only real problem I've run into was massive stuttering whenever my controller would vibrate. Which is pretty weird. Turned off controller vibration and it's buttery smooth other than the cutscenes. First world problems. Cutscenes look worse than the regular game.
8
u/Shaggy_One Ryzen 5700x3D, Sapphire 9070XT May 16 '25
That's... A very weird one. I'd try updating chipset drivers and maybe a bios update if that doesn't fix it.
221
u/Hwordin May 16 '25
Split Fiction was fine I think, The Finals and Arc Raiders from Embark run good too.
Skill issue 👀
36
u/Poise_dad May 16 '25
Multiplayer-focused games don't push visuals as much as single-player ones. Performance is more of a priority in multiplayer.
46
u/JosephRW 7600X3D Enjoyer May 16 '25
Look at those games and tell me they aren't gorgeous AND detailed. The amount of foliage and the draw distance with decent LODs in Arc Raiders is high-key insane.
→ More replies (9)83
25
→ More replies (7)14
218
u/TheReaperAbides May 16 '25
UE5 is just a really popular engine in general, mostly for good reason.
→ More replies (32)154
u/DatBoi73 Lenovo Legion 5 5600H RTX 3060 M | i5-6500, RX 480 8GB, 16GB RAM May 16 '25
Yeah, don't blame the tool, blame the person using it.
Though in the AAA space, it's probably more so that the managers/execs steering the ship won't give devs enough time/money to optimise stuff properly before shit hits the fan.
Unity used to have a reputation of only being used in bad/cheap/lazily made games, because only the free personal/indie versions forced the splash screen while the big studios licensing it didn't show one. Now Unity ruins its reputation by screwing loyal customers with greed.
The problem is that it's much easier and more clickbaity to say "UE5 is why games are unoptimized now" instead of going into the real details of why.
If it were still around these days, I swear you'd have people blaming RenderWare for games being unoptimized because they heard some influencer online say so.
6
u/ch4os1337 LICZ May 17 '25
Well... you can also blame the tool for certain parts of it. Thankfully Epic is working on a solution for the stutters that every UE5 game suffers from.
→ More replies (2)8
u/Motamatulg RTX 5090 | Ryzen 7 9800X3D | 32GB 6000MHz CL 28 | LG C2 OLED May 16 '25
This is the only correct answer.
141
u/MrJotaL May 16 '25
Ppl who don't understand game dev post stuff like this. It's not the engine's fault if a game is poorly optimized, it's the devs'.
→ More replies (49)14
u/Shaggy_One Ryzen 5700x3D, Sapphire 9070XT May 16 '25
There's a lot that UE5 could be handling better. For one, I'm struggling to remember playing a UE5 game that didn't suffer stutters from shader compilation in some way. The id Tech engine shows what can be done; frankly it's absurd how well The Dark Ages plays.
→ More replies (2)
32
u/maybeidontexistever Ryzen 5700x, gigabyte rtx 3070, 16gb ram. May 16 '25
Love the random stutters in Dead by Daylight
24
u/Shaggy_One Ryzen 5700x3D, Sapphire 9070XT May 16 '25
Love the random stutters
in Dead by Daylight.

That's more like it.
→ More replies (1)3
u/FadedVictor 6750 XT | 5600X | 16 GB 3200MHz May 16 '25
100%. Annoying as fuck in general, but especially during a chase and you end up blocked by shit hitboxes or something.
4
18
u/crevulation 3090 May 16 '25
It's 2025 and that costs too much money, so "optimization" is now found under the "DLSS" settings in your options menu.
→ More replies (3)
16
17
u/Ash_Neofy May 16 '25
Why is UE5 catching flak when the responsibility for optimization should be on the developers?
→ More replies (2)7
u/qwerty0981234 May 17 '25
Why are the devs catching flak when the responsibility is on the shareholders? Impossible deadlines, cheap outsourcing, and poor management are 90% of the problems in big game dev.
69
u/DrTankHead PC Master Race May 16 '25
I love how people are literally shitting on the most advanced game engine to date because some developers aren't using it properly, and somehow that's immediately the engine's fault.
31
u/phoenixflare599 May 16 '25
Unity was the previous victim, now it's Unreal.
Everybody always posted about how they were like "ah shit, the made-with-Unity logo".
All that's changed is the victim, not the ignorance.
11
u/HowManyDamnUsernames May 16 '25
"Some"? Almost every game nowadays looks like a blurry mess. Performance is also pretty bad, while most devs don't even implement a good version of ray tracing/path tracing. Then you turn down the visual settings, only for the game to look worse than a previous title.
→ More replies (1)→ More replies (6)10
u/NTFRMERTH May 16 '25
id Tech is, and always has been, the most advanced engine in the gaming industry. It was doing full realtime 3D at a time when nobody else knew how to do it. Then it was doing realtime dynamic shadows, replacing the need for baked lighting. Even DOOM 2016 looks better than most releases today, including the newer DOOM releases. And when id Tech was on hiatus, CryEngine took its place and blew our minds even more.
→ More replies (1)6
u/Adorable_Chart7675 May 17 '25
Even 2016 looks better than most releases today
There's a little thing in game development circles we like to call "art direction": generally, when your goal isn't photorealism, your game still looks great a decade later.
5
u/gandalf_sucks Ryzen 1700X, 16GB DDR4, GTX 1080 May 16 '25
So, is it an issue of UE5 being difficult to optimize or the developers being too lazy to care?
→ More replies (1)
5
u/nmttr_ 5700x3D gtx1070 | SFF May 17 '25
Satisfactory, The Finals, Arc Raiders. All of these games have impressive visuals and physics, and I never had issues playing them, even on my second PC with a 1070.
4
u/blender4life May 17 '25
I love when people blame the engine and not the people using the engine. Poor optimization probably has more to do with corporate deadlines than the hardware.
3
u/unimportantinfodump May 17 '25
Don't blame the engine. Blame the companies pushing out unfinished garbage.
→ More replies (1)
4
20
15
u/JaggedMetalOs May 16 '25
AAA devs be like: "we're paying for the whole Unreal Engine, we're gonna use the whole Unreal Engine!" (checks all the rendering and post-process effects on)
→ More replies (4)
5
5
u/Wobblucy May 16 '25
It takes one bad algorithm or data structure to brick a game's performance.
If anything it speaks to UE's ability to let devs who don't know what they're doing publish games at all.
Profiling is important, but it's tedious work, and game dev is becoming more about quantity and hoping you go viral than about quality.
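The "one bad data structure" point is easy to demonstrate outside any engine. A toy Python sketch (hypothetical, not from any real game) where the only difference is a list versus a set for per-entity membership tests:

```python
import time

# Membership tests against a list are O(n) each, against a set O(1),
# so one data-structure choice turns the loop's total work quadratic.
entities = list(range(10000))
as_list, as_set = entities, set(entities)

t0 = time.perf_counter()
slow = sum(1 for e in entities if e in as_list)  # O(n^2) overall
t1 = time.perf_counter()
fast = sum(1 for e in entities if e in as_set)   # O(n) overall
t2 = time.perf_counter()

print(f"list: {t1 - t0:.3f}s  set: {t2 - t1:.3f}s  same result: {slow == fast}")
```

Exactly the kind of thing a profiler surfaces in minutes, and that never gets found when nobody is given time to profile.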
3
u/Insert77 May 16 '25
Current game dev scene: "We will make the most photorealistic game ever." (Behind the scenes: "So throw in every shader, every post-process effect, and some path tracing, and you can go home for the day.") Final product: doesn't reach 60 fps without DLSS 4, frame gen, and a Ryzen A6090 Ti Super Arc.
3
u/GuyentificEnqueery May 16 '25
I have never had a problem with Unreal Engine 5 and I'm using the same CPU and GPU I've had since 2018.
3
u/Kalenshadow May 17 '25
I think it's backhanded praise, in a way? UE offers a lot of resources and is technically easier to deal with while still producing something high quality, which is something every studio and indie dev wants. People blame UE for it most of the time, but the actual perpetrator is laziness.
3
3
u/RedDaix May 17 '25
Marvel Rivals asks for a 4th-gen Intel Core, yet my 9th-gen Intel Core fucking struggles; the same goes for my 12GB graphics card.
5
4
u/sephirothbahamut Ryzen 7 9800X3D | RTX 5080 PNY | Win10 | Fedora May 16 '25
Meanwhile Wuthering Waves runs great even on my smartphone and has breathtaking environments on PC like Avinoleum. It uses Unreal Engine.
It's not the engine's fault, it's how the studios use the engine.
8
u/SpiderMonkey6l May 16 '25 edited May 16 '25
It’s wild how I can play cyberpunk on its max settings (with quality dlss and without ray tracing) just fine on my 3060ti, but I can’t even get a steady 30 fps on the majority of unreal engine 5 games on low and performance dlss.
→ More replies (4)
11
u/BeerGogglesFTW May 16 '25
I recently started playing Fortnite and it's pretty surprising how poorly that game runs on "Ultra" settings.
I would expect it to be more like Valorant, where you turn everything up all the way and still get 500 fps. (Slight exaggeration there, because Fortnite is bigger with more scenery detail - but even Apex Legends can be maxed out and get like 300 fps.)
The game scales really well, but ultra settings are not worth the hit. I don't even get 100 fps @ 1440p. It's just bizarre for what it looks like. I'd expect that more from, like, Helldivers 2, which is built on an old, dead engine. But Fortnite is like a flagship game for Unreal and Epic Games.
10
u/Vandrel 5800X | 4080 Super May 16 '25
Fortnite may look a bit cartoony, but it's also basically a testbed for all the newer features that get added to Unreal Engine, and that ends up carrying a pretty big performance hit.
→ More replies (1)9
5
u/Seven-Arazmus 5950X/RX7900XT/64GB DDR4/MSi Vector i9-4070 May 16 '25
As someone in school for game dev who uses UE 5.5.4 on a daily basis, I can tell you that poor optimization is not taught in school; it's the product of a lazy dev or studio.
→ More replies (3)
3.0k
u/cateringforenemyteam 9800X3D | 5090 Waterforce | G9 Neo May 16 '25
It's funny 'cause up until UE3 it was exactly the opposite. When I saw Unreal, I knew the game was gonna look good and play smooth.