Ganker
Community-driven video game blog & discussion
Posted by Anonymous
- Anonymous
1 week ago
Reply
-A ?
- Anonymous
1 week ago
Reply
fippy bippy
- Anonymous
1 week ago
Reply
It runs smooth on my phone
- Anonymous
1 week ago
Reply
>play immortals of aveum
0 stutters
>play lords of the fallen
0 stutters
>play layers of fear
0 stutters
>play jusant
0 stutters
>play remnant 2
0 stutters
>play talos principle 2
0 stutters
The Digital Foundry guys just need to fix their PCs (and also moan less about Stellar Blade's "sexuality" (or lack thereof))
UE5 is fine. They will always cry because the 0.001% FPS lows dropped for a fraction of a millisecond.
- Anonymous
1 week ago
Reply
For real, people literally whine about every little thing even when it doesn't impact the experience.
- Anonymous
1 week ago
Reply
>uses amd
>complains of stuttering
You knew you were buying into the AMDip platform when you bought the CPU.
Developers should just stick with unreal 4, it runs twice as fast as 5 without meme tracing.
See
Isn't this an issue with the shader comp that resolves itself with play time? It's still sh*tty, especially for someone like me that plays fortnite maybe 3 times a year, but this seems a bit disingenuous
Watch the video; 5 seconds after that image it shows it's not a problem, why be so disingenuous?
- Anonymous
1 week ago
Reply
It's only Alex, their resident peesee gamur, who's unironically obsessive about this stuff
- Anonymous
1 week ago
Reply
shill post
- Anonymous
1 week ago
Reply
You're too stupid to notice the stutters
- Anonymous
1 week ago
Reply
So he won. You're miserable he's happy
- Anonymous
1 week ago
Reply
>Immortals of aveum
That game was such a sh*t game holy frick
- Anonymous
1 week ago
Reply
>>play lords of the fallen
>0 stutters
Liar.
- Anonymous
1 week ago
Reply
I don't get stutters either
- Anonymous
1 week ago
Reply
I don't think they're lying, they are simply blind.
Their brains just can't notice those stutters.
I tried playing lords of the fallen and it's awful, the FPS is unstable as frick.
- Anonymous
1 week ago
Reply
More like 0 PLAYERS
- Anonymous
1 week ago
Reply
>>>play immortals of aveum
yeah this is cap fr
nobody would willingly play that sh*t
- Anonymous
1 week ago
Reply
>Those b***hes will always cry because 0.001% FPS dropped for a fraction of a milisecond.
If you're dumb just say that. It's giving too stupid to understand what a stutter is
- Anonymous
1 week ago
Reply
>PC gaming is better than conso-
- Anonymous
1 week ago
Reply
>-le in every way.
- Anonymous
1 week ago
Reply
Half the UE5 games on Console run at 720p. Immortals and Remnant 2 do.
- Anonymous
1 week ago
Reply
they stutter on console too, dead space remake had a particular animation stutter that only happened on PS5 to boot
- Anonymous
1 week ago
Reply
I think dead space had traversal stutter which is a different thing.
- Anonymous
1 week ago
Reply
These problems affect every platform - shader stutter is the only one unique to PC. Most console players don't notice when their resolution is dropping to 560p on the PS5.
- Anonymous
1 week ago
Reply
Unreal's documentation even used to mention that it's impossible to build a complete PSO cache and some shaders are dynamic, but it's alright because they're not noticeable and that typical Fortnite has at least 100 shader compilations.
Epic's own words, not mine.
- Anonymous
1 week ago
Reply
>typical Fortnite
*typical Fortnite match
- Anonymous
1 week ago
Reply
>uses amd
>complains of stuttering
You knew you were buying into the AMDip platform when you bought the CPU.
- Anonymous
1 week ago
Reply
this happens on intel/nvidia you dumb moron
- Anonymous
1 week ago
Reply
Learn to tune your PC before calling people moronic, mental midget.
- Anonymous
1 week ago
Reply
Stfu Jufes jit
- Anonymous
1 week ago
Reply
Still not going back to IntCel
- Anonymous
1 week ago
Reply
Is that why Intel is forcing lower power limits and lower clocks in bios due to stability issues with 13th and 14th gen cpus?
- Anonymous
1 week ago
Reply
I'm genuinely convinced that the only people who unironically defend Intel are senile and tech illiterate boomers and millennials in their 40s+
- Anonymous
1 week ago
Reply
There are people who still think AMD GPUs have terrible drivers.
Old habits are hard to kick.
- Anonymous
1 week ago
Reply
God that would look so much better if those sprites had filtering set to off.
- Anonymous
1 week ago
Reply
Not seeing that issue on Steam Deck and I'm pretty sure that uses AMD.
- Anonymous
1 week ago
Reply
Because massive driver issues were a thing a decade ago.
Nvidia's selling point is that they offer more features in comparison.
- Anonymous
The funny thing is their "feature" selling points are more like cope so they can sell you downgraded GPUs for more money.
For example, DLSS was originally made to make RT possible, but now it's used in non-RT games as cope for not enough VRAM capacity or memory bandwidth due to a 192-bit bus.
Their primary genuine selling point is their encoder/decoder (NVENC), which is a hardware feature that Intel/AMD can't copy. When you get into stuff like MadVR or streaming it's just hilarious how far ahead Nvidia is.
- Anonymous
1 week ago
DLSS is still the king of upscaling; Intel and AMD are getting better as well but are still behind.
They also still do the best at hardware raytracing.
It's the non-gaming stuff they excel at; if you use anything that benefits from it, Nvidia is the way to go.
- Anonymous
1 week ago
Intel and AMD's options are still so sh*t that I'd rather just deal with the TAAU blur at that point than deal with variable resolution + motion artifacts.
- Anonymous
1 week ago
Because XIV's DLSS runs at an extremely low internal resolution for some fricking reason. People already modded it to look vastly better in the benchmark.
- Anonymous
1 week ago
At 4k I find them usable if using more recent versions.
DLSS holds up really well when using, say, 1440p.
- Anonymous
1 week ago
That's probably my issue then. I'm on a 1440p display since I prefer 120hz, which I sure as frick am not going to hit at 4K even with upscaling BS. Problem is a lot of modern games don't even give me the option of not using upscaling based AA.
- Anonymous
1 week ago
Yeah it's fricking trash.
I get it at 4k with a high end GPU it works well and upscaling doesn't frick image quality at that resolution but requiring it for 1440p or lower without said high end GPU is just moronic.
- Anonymous
1 week ago
Yeah it's fricking trash.
I get it at 4k with a high end GPU it works well and upscaling doesn't frick image quality at that resolution but requiring it for 1440p or lower without said high end GPU is just moronic.
Quality mode for FSR/DLSS at 4k is 1440p, which still looks good. At 1440p it goes down to 1707x960, which is worse than 1080p.
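The numbers in that post are just the upscaler's per-axis scale factor at work. A minimal sketch, assuming the commonly published Quality/Balanced/Performance factors (1.5x/1.7x/2.0x per axis):

```python
# Internal render resolution for common upscaler presets. The scale factors
# follow the commonly published FSR/DLSS presets (an assumption here):
# Quality = 1.5x per axis, Balanced = 1.7x, Performance = 2.0x.

PRESETS = {"quality": 1.5, "balanced": 1.7, "performance": 2.0}

def internal_resolution(width, height, preset):
    """Return the internal (pre-upscale) resolution for a given output."""
    scale = PRESETS[preset]
    return round(width / scale), round(height / scale)

# 4k Quality renders internally at 1440p...
print(internal_resolution(3840, 2160, "quality"))  # (2560, 1440)
# ...but 1440p Quality drops below 1080p's total pixel count.
print(internal_resolution(2560, 1440, "quality"))  # (1707, 960)
```

1707x960 is about 1.64 MP versus native 1080p's 2.07 MP, which is why 1440p Quality looks worse than native 1080p in raw pixel terms.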
- Anonymous
1 week ago
I'm not writing specifically to you but isn't this whole "but at least it's 1440p" worthless cope when it's all about getting used to things.
If you got used to 4k then 1440p is still gonna look worse. Just like 1440p into 960p is going to look worse. Both of them look worse. But for some reason every time someone with 4k monitor talks about this it's like the quality never decreased. Or are you saying that you can't tell the difference between 4k and 1440p? Isn't that fricking stupid too?
- Anonymous
1 week ago
most games are designed to only look proper at standard common resolutions for lighting, anti aliasing, and more.
720, 1080, 1440p, 4k are all designed in mind not weird sh*t like 1707x960 or 960x540.
- Anonymous
1 week ago
No they aren't? Everything looks "more proper" on higher resolution. On different sized display devices maybe but that is not tied to resolution.
- Anonymous
1 week ago
>Their primary genuine selling point is their encoder
This. I'm a streamer and I'd have bought a 7800XT if AMD encoders weren't complete garbage, but AMD AV1 image quality is barely better than NVENC H264 at similar bit rates and doesn't compare to NVENC AV1 or HEVC. Even Intel has better encoders integrated into their 14th gen CPUs now. There is no reason to even consider these things if you want to encode.
- Anonymous
1 week ago
does high quality encoding even matter when the twitch bitrate is soo dogsh*t the quality sucks anyway?
- Anonymous
1 week ago
Reply
Because linux and windows drivers are different, have you thought about that?
- Anonymous
1 week ago
Reply
AMD owners are like vegans, it's like they don't realise that saying "I own AMD" means "I'm a fricking moron" to everyone with a clue.
- Anonymous
1 week ago
Reply
>1080p
lol how poor are you?
- Anonymous
1 week ago
Reply
Way to prove my point here
I'm genuinely convinced that the only people who unironically defend Intel are senile and tech illiterate boomers and millennials in their 40s+
lmao
- Anonymous
1 week ago
I'll just leave this here.
Well the 7800X3D beat the 14700k so there's that I guess.
- Anonymous
1 week ago
consider reading up on margin of error
- Anonymous
1 week ago
>it was a coincidence that all the 900k CPUs beat the AMD CPUs, pure margin of error
just what are the chances?
- Anonymous
1 week ago
13900K and 7800X3D have literally the same score, moron-kun.
it's clear there's a GPU bottleneck
- Anonymous
1 week ago
That's really embarrassing, what moron will pay 150(min) more for a cpu that takes 4 times the power for a max of 0.8% increase?
- Anonymous
1 week ago
Reply
1080p is used for CPU benchmarks since anything higher is usually GPU bound
- Anonymous
1 week ago
Is this the new cope? You should look at benchmarks in the resolutions you don't use rather than use the different results in the resolution you do use?
lol
- Anonymous
1 week ago
At higher resolutions CPU differences aren't generally present unless you compare cheap vs expensive chips, with exceptions being on the small number of CPU heavy games.
Also this is how all reviewers compare CPUs.
- Anonymous
1 week ago
13900K and 7800X3D have literally the same score, moron-kun.
it's clear there's a GPU bottleneck
I simply don't care about your excuses and validations.
You say "I own AMD" , my brain reeds it as "I am a fricking moron"
deal with it, moron
- Anonymous
1 week ago
>reads
i'm also a moron in training - Anonymous
1 week ago
How are you dealing with your middle age crisis?
- Anonymous
1 week ago
I had that a while back. I went to ibiza and took lots of drugs
- Anonymous
1 week ago
Reply
Not that guy, but I went 13900k because it can bruteforce way better than the 7800X3D if a game is sh*ttily optimized, and it's also better for emulators because of single-thread perf.
- Anonymous
1 week ago
Reply
>doesn't notice that they didn't use an AMD GPU in the testing
lol
- Anonymous
1 week ago
Reply
>tech illiterate moron complains about AMD CPU
>show CPU benchmarks
>"uhhh... but what about GPU?"
jump off a bridge
- Anonymous
1 week ago
>ESL completely imagined seeing the words GPU or CPU
>still morons anyway
yes
- Anonymous
1 week ago
>imagined
>uses amd
>complains of stuttering
>You knew you were buying into the AMDip platform when you bought the CPU.
You should be ashamed of pretending to be a native English speaker lmao
- Anonymous
1 week ago
not me
That's really embarrassing, what moron will pay 150 (min) more for a cpu that takes 4 times the power for a max of 0.8% increase?
it's more embarrassing that you base your computer purchases on whether it uses an extra 2p an hour in electricity
- Anonymous
1 week ago
Reply
nobody cares about your bogeyman, weirdo
- Anonymous
1 week ago
Reply
nobody cares about your r*ddit-friendly opinions, go back home and stay there.
Vk/dx12 entire point is ray tracing + cpu optimization
pretty sure the Vulkan API predated the announcement of memetracing. Not by much, mind you, but still. Do the games Vulkan works on and excels at even have memetracing? they're mostly high-fps pvp/shooter games.
- Anonymous
1 week ago
Reply
everything after version 3 was a mistake
- Anonymous
1 week ago
Reply
unreal engine 1.5 should have been the last one, honestly
- Anonymous
1 week ago
Reply
>Unity shot itself in the foot
>Godot is abandonware
Unreal won
- Anonymous
1 week ago
Reply
>Godot is abandonware
>but still constantly gets updates
>got a ton of funding after the unity sh*t
>is constantly shilled as THE indie game engine now
hahahahaha
- Anonymous
1 week ago
Reply
cool. how many AAA devs have picked this amazing engine that can rival Unreal and is free?
- Anonymous
1 week ago
Reply
>free
Is UE5 the new front in Epic's efforts to take control of the gaming world and all of our data/devices for their chinese overlords? What kind of business operating in the way a normal business does, guided by normal business principles, just gives out free games and licenses to their proprietary game engine, when they're losing money? What the frick are these Black folk up to?- Anonymous
1 week ago
Reply
They are losing money on anything that isn't Fortnite.
UE makes a good chunk of money (between $100-200 million a year) but at Epic's size that isn't enough to run on alone.
- Anonymous
1 week ago
Reply
learn to read
- Anonymous
1 week ago
Reply
UE5 is only free for smaller games that don't sell too many copies. After you make a certain sales threshold you have to start paying Epic. It's the same as it was with UE3 (UDK) and UE4.
- Anonymous
1 week ago
Reply
It's 5% after a million in revenue.
It's still the indie and small studio cut since larger studios can negotiate having no cut for an upfront payment.
- Anonymous
1 week ago
Reply
>twitter thread
>twitter thread with the user and stats cropped out
Please have a nice day.
- Anonymous
1 week ago
Reply
>older games
>devs all made their own specialized engines
>had to spend a bit more on upkeep but got really good performance and the ability to add unique features as needed
>even some communal engines were around like the quake engine with heavy modifications to suit each game
>newer games
>greedy mbas have killed all custom sh*t as well as fired everyone who worked on them so custom engines all basically dead
>gone all in on the unreal slop train
>should be perfectly fine unless unreal becomes a bloated mess
>oh look unreal is a bloated mess
>performance and feature issues are compounded by hiring bargain-bin outsourced devs to save money (which actually only costs more due to competent people wasting time fixing their sh*t)
I hope unreal continues sh*tting the bed so the crash accelerates and we get back to good sh*t like Fox Engine
- Anonymous
1 week ago
Reply
This is just the way it is now, and not just in game development. I get to see my company's invoices and budgets. It is actually cheaper to hire third-world contractors for 20 dollars an hour, have them frick everything up, and then pay someone like me 100 dollars an hour to fix the mistakes. The only time it bites us in the ass is when we have tight time frames. Same thing was happening at my old job: buying pipe from China knowing we'd have to throw away half of it for not meeting QC, but it was still cheaper than buying American pipe.
- Anonymous
1 week ago
Reply
Played the demo and had zero stutter, same with every other UE 5 game I've played.
- Anonymous
1 week ago
Reply
Upgrade your PC you fricking moron.
- Anonymous
1 week ago
Reply
Homeworld 3 uses UE4, not 5
- Anonymous
1 week ago
Reply
Developers should just stick with unreal 4, it runs twice as fast as 5 without meme tracing.
- Anonymous
1 week ago
Reply
You can get UE5 to run similar to 4 by not using Lumen, Nanite and Virtual shadow maps.
- Anonymous
1 week ago
Reply
That would require devs to not be complete morons and actually know something about the engine they’re using and we all know that’s never going to happen.
- Anonymous
1 week ago
Reply
The stutter is generally a DX12 issue: most games don't generate a cache before release, Epic hasn't made generating them easy or straightforward, and a large number of devs using the engine suck at it.
- Anonymous
1 week ago
Reply
Tell that to DX11-only Star Wars Jedi: Fallen Order. It has ALL the stutters you can imagine and it's a DX11 title.
Hell, they can't even fix their own fortnite sh*t after 10 years. This engine is just a massive pile of sh*t. It should be abandoned by everyone.
- Anonymous
1 week ago
Reply
The stutters in fallen order appear to be from load and unloading assets from streaming, which UE has been fricking awful at for ages.
[...]
it also happens on dx11. the issue is that shaders are compiled at drawtime in unreal engine. even if you precompile them there's still various shaders that can't be accounted for, like particles, lighting, custom characters in online games, etc. the proper fix is to switch to vulkan and make use of the graphics pipeline library, where the shaders are compiled before drawtime.
It's a shame Epic treats Vulkan as a second class citizen, even Apple’s Metal gets more attention.
- Anonymous
1 week ago
Reply
Tell that to DX11-only Star Wars Jedi: Fallen Order. It has ALL the stutters you can imagine and it's a DX11 title.
Hell, they can't even fix their own fortnite sh*t after 10 years. This engine is just a massive pile of sh*t. It should be abandoned by everyone.
it also happens on dx11. the issue is that shaders are compiled at drawtime in unreal engine. even if you precompile them there's still various shaders that can't be accounted for, like particles, lighting, custom characters in online games, etc.
the proper fix is to switch to vulkan and make use of the graphics pipeline library, where the shaders are compiled before drawtime.
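The draw-time-versus-precompile distinction above can be sketched as a toy model. Everything here is hypothetical naming; in a real engine the compile happens inside the graphics driver, and the sleep just stands in for it:

```python
# Toy model of why draw-time shader compilation causes hitches, and why a
# precompile pass fixes them. Names are hypothetical; in a real engine the
# compile happens inside the graphics driver, and sleep() stands in for it.
import time

class ShaderCache:
    def __init__(self):
        self.compiled = {}

    def compile(self, shader_id):
        time.sleep(0.05)  # stand-in for a multi-millisecond driver compile
        self.compiled[shader_id] = f"binary:{shader_id}"

    def get(self, shader_id):
        # Draw-time path: a cache miss forces a compile mid-frame = a stutter.
        if shader_id not in self.compiled:
            self.compile(shader_id)
        return self.compiled[shader_id]

def draw_frame(cache, shaders):
    """Return the frame time in ms for drawing with the given shaders."""
    start = time.perf_counter()
    for s in shaders:
        cache.get(s)
    return (time.perf_counter() - start) * 1000

cache = ShaderCache()
cold = draw_frame(cache, ["pbr", "particle", "ui"])  # first sight: compiles
warm = draw_frame(cache, ["pbr", "particle", "ui"])  # everything cached
print(f"cold frame ~{cold:.0f} ms, warm frame ~{warm:.2f} ms")
```

A "compiling shaders" screen at boot is the equivalent of calling get() on every shader before the first frame; the point of Vulkan's graphics pipeline library approach is to let that happen before draw time rather than during it.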
- Anonymous
1 week ago
Reply
It stutters for like 5 minutes after a big patch, these homies are really overblowing the issue
- Anonymous
1 week ago
Reply
every time you update your gpu drivers it stutters again.
The stutters in fallen order appear to be from load and unloading assets from streaming, which UE has been fricking awful at for ages.
[...]
It's a shame Epic treats Vulkan as a second class citizen, even Apple's Metal gets more attention.
more devs should be adopting vulkan but microsoft is making sure that directx stays dominant. you've seen how good doom & dota 2 run on vulkan. more games can be like that.
- Anonymous
1 week ago
Reply
how often do you update your gpu drivers, truthfully
- Anonymous
1 week ago
Reply
everytime nvidia releases a game ready driver for a game you want to play?
- Anonymous
1 week ago
Reply
so like once every two years at the absolute most? no AAA games are worth playing these days and for smaller games you don't need new drivers.
- Anonymous
1 week ago
Reply
Normies already had to update their drivers four times this year because each month was a huge release.
- Anonymous
1 week ago
>Normies already had to update their drivers four times this year because each month was a huge release.
real
You're genuinely moronic and can't remember how you would have felt if you had breakfast this morning.
normie denial
pretty obvious what you had for breakfast (it was slop)
You realize drivers fix things and add things, right? It's not just games. Chrome was literally broken on Nvidia's driver side last year unless you updated to Windows 24H2 on old drivers or grabbed the latest drivers.
>You realize drivers fix things
yeah, like fixing the problem that old games run well. new drivers fix that all the time kek
Many people have automatic driver downloads enabled in their GeForce panel, and Nvidia puts out a new one every 2 weeks.
>Many people
>people
anyone who leaves auto-update features on is not people.
- Anonymous
1 week ago
Reply
You're genuinely moronic and can't remember how you would have felt if you had breakfast this morning.
- Anonymous
1 week ago
Reply
You realize drivers fix things and add things, right? It's not just games. Chrome was literally broken on Nvidia's driver side last year unless you updated to Windows 24H2 on old drivers or grabbed the latest drivers.
- Anonymous
1 week ago
Reply
Many people have automatic driver downloads enabled in their GeForce panel, and Nvidia puts out a new one every 2 weeks.
- Anonymous
1 week ago
Reply
I'm all for Vulkan being the top dog since anyone using it well gets incredible results.
That would require devs to not be complete morons and actually know something about the engine they’re using and we all know that’s never going to happen.
Studios like UE since they can get any bottom-of-the-barrel, YouTube-tutorial-trained individual for cheap, so it's not surprising.
- Anonymous
1 week ago
Reply
Half-Life Alyx's Vulkan renderer doesn't work great on Nvidia.
- Anonymous
1 week ago
Reply
- Anonymous
1 week ago
Reply
Really, depending on what you do, knowing the engine in depth isn't strictly needed; managers figure out what should be worked on and how.
What do they have you work on?
- Anonymous
1 week ago
Reply
I make vfx, so I'm someone who can really easily tank the performance of the game; luckily for them I'm not a complete moron and know how to make an explosion without people's computers exploding.
- Anonymous
1 week ago
Oh that's fair, when you can just turn everything on and render sh*t out that's much nicer than fighting to keep reasonable frame rates at all times.
For VFX UE is nice since its biggest complaints don't apply to the end result.
- Anonymous
1 week ago
Reply
everytime you update your gpu drivers it stutters again.
[...]
more devs should be adopting vulkan but microsoft is making sure that directx stays dominant. you seen how good doom & dota 2 runs on vulkan. more games can be like that.
Imagine a world where you could casually run AAA games at 4k 200+ fps on a $600 gpu because you changed your api.
- Anonymous
1 week ago
Reply
Doesn't happen on console so I don't care
- Anonymous
1 week ago
Reply
why did I greentext this lol. anyways the engine ran smooth. animations for the bots sucked though.
- Anonymous
1 week ago
Reply
disagreed. the engine is honestly kinda sad. graphically it's outdated in obvious ways, while mysteriously needing a strong GPU to run!
200+ fps in Diabotical feel more like 100 fps in a proper Quake game. and you need to activate prerendered frames >1 to get it to look smooth but of course that means increasing input lag. it's trash.
they were outspoken about optimizing the CPU side, which is great. but when it came to GPUs they seemed to adopt an attitude of looking at framerates (cosmetic averages that mean little for how a game actually feels) and gambled on everybody upgrading to RTX 2000 and up soon enough that it wouldn't matter. it did matter, because the AFPS community is notorious for using sh*t hardware. most players ended up forcing decent performance by using blurry sh*tscaling and prerendered frames >1, which does so much damage to the basic enjoyment of a game. and at least the latter is a serious competitive disadvantage in a game where reacting fast matters.
- Anonymous
1 week ago
Reply
>the dev got a payout from epic to keep it on their platform
If you followed arena FPS development over the past 20 years, you'd know that they make 0 money.
Epic paying the Diabotical devs like 100 or 200k to have it exclusive on their platform was a good thing because the dev actually got paid what they deserved.
The guy who made Reflex Arena got fricking nothing and worked on it for 4 years. All the clones like Toxic, Xonotic and sh*t brought in 0 cash. AFPS are pure fan projects.
Also the eggballs are fine; the animations are made this way so you focus on the center spheres for tracking consistency instead of spazzing realistic models like in Quake Champions.
- Anonymous
1 week ago
Reply
the sphere models are fine. the problem is that they're floating above the ground and half their hitbox is the invisible fricking empty space around their legs and even under their tiny cripple legs. you have to aim at the tiny legs to hit best which is absolutely moronic.
meanwhile hit detection in Quake Champions is nowadays as good as it was in QL. I almost never experience any surprises with it.
- Anonymous
1 week ago
Reply
It's actually hilarious if you think about it: there are literally only 2 RDNA chips and 2 GTX chips selling on the market and apparently it's just too difficult to compile shaders for them, which is even funnier when you realize that consoles use one of those RDNA chips.
- Anonymous
1 week ago
Reply
Because Epic never improved the system to manually collect caches; the documentation was focused on mobile for years, and they are focusing on an async solution that runs on the fly.
- Anonymous
1 week ago
Reply
It's still the industry standard that just lets you start making levels, import assets and make scenes with fancy graphics. Expect to keep seeing it for years to come.
- Anonymous
1 week ago
Reply
Can someone explain why everything needs shader compilation these days? Growing up (1998-2010s) I really don't recall any games having micro stutters. At worst you'd get low fps due to a crappy gpu, but not actual stutters like this the first time you loaded an asset.
- Anonymous
1 week ago
Reply
tl;dr
incompetence and bloat
DX12 and Vulkan give more direct control over the hardware, but this requires more work from the developer. Before that, the GPU driver did the heavy lifting.
Now each game developer has to deal with Pipeline State Objects (PSOs). This should preferably be done by the dev tools instead of manual labor.
If you don't have pre-cached PSOs you will experience stutter because the game is waiting for things like shader compilation.
A PSO cache can be shared between machines if they use the same driver and hardware configuration. This is why consoles don't suffer from PSO stuttering: the caching is done before runtime.
This is also why you get a lot of stuttering after a GPU driver update.
This is the simplified version.
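The driver/hardware coupling described above can be sketched like this. The keying scheme is hypothetical; real drivers use their own opaque cache formats, but the idea is the same:

```python
# Sketch of why PSO caches are tied to a driver + hardware pair, and why a
# driver update brings the stutters back. The keying scheme is hypothetical;
# real drivers use their own opaque cache formats.
import hashlib

cache = {}

def pso_cache_key(driver_version, gpu_model, pso_description):
    """A cached pipeline binary is only valid for one exact driver/GPU combo."""
    blob = f"{driver_version}|{gpu_model}|{pso_description}".encode()
    return hashlib.sha256(blob).hexdigest()

def get_pipeline(driver, gpu, desc):
    key = pso_cache_key(driver, gpu, desc)
    if key not in cache:
        cache[key] = f"compiled({desc})"  # the expensive compile happens here
        return cache[key], "compiled (stutter)"
    return cache[key], "cache hit"

# Same driver and GPU: the second request hits the cache.
print(get_pipeline("551.23", "RTX 4070", "opaque-pbr")[1])  # compiled (stutter)
print(get_pipeline("551.23", "RTX 4070", "opaque-pbr")[1])  # cache hit
# A driver update changes the key, so every pipeline recompiles once.
print(get_pipeline("552.12", "RTX 4070", "opaque-pbr")[1])  # compiled (stutter)
```

This is also why consoles can ship the cache on disc: one driver, one GPU, so one key space that's known before release.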
- Anonymous
1 week ago
Reply
Because shader compilations are made on the fly in unreal engine. If devs are careful they compile them before the game starts, like in cod or bf.
Gotcha, thanks! What about unity games? I don't seem to recall them having such a problem in the latest stuff i played?
- Anonymous
1 week ago
Reply
It's not engine specific. It has more to do with the API and how it's implemented.
Complexity of the game also matters: in a linear game you as a developer will know what your machine needs to draw a frame, and you don't have to stream assets on the fly nearly as often.
How many AAA open world games are made in unity?
- Anonymous
1 week ago
Reply
I guess Genshin?
Unity is rarely used in AAA development.
- Anonymous
1 week ago
Reply
Genshin does have shader stutter but you have to go out of your way to notice it. The game hides the compilation in loading screens but if you move in the open world way too fast using glitch or hacks on a fresh driver it just sh*ts itself.
- Anonymous
1 week ago
Reply
most Unity games to this day use DX11. linux-adjacent contrarians push for vulkan but it has no benefit so to me that's a yellow flag, except maybe on mobile.
- Anonymous
1 week ago
Reply
Vk/dx12 entire point is ray tracing + cpu optimization
- Anonymous
1 week ago
Reply
Also mesh shaders
- Anonymous
1 week ago
Never used tessellation/geometry shader.
Are mesh shaders that good? Or is it just for performance?
- Anonymous
1 week ago
Effectively you don't need to worry about LODs or model poly count in terms of performance.
If you read up on Nanite in UE5, that's mesh shaders. They also have the benefit of lowering draw calls since they batch more, at a rate of 1 + material count.
They can be memory and storage efficient since large normal maps for detail aren't needed. But developers are just getting used to them, whether in UE or in-house tools like Remedy and Massive use.
- Anonymous
1 week ago
Reply
raytracing isn't usable in Unity. it's locked away in the "HDRP" experimental form of Unity which is not production ready and likely never will be because the company has stopped creating technology and are just monetizing mobile sh*t, huffing trendy farts and making fake announcements for short stock price bumps.
- Anonymous
1 week ago
Reply
>Now each game developer has to deal with Pipeline State Objects. This preferably should be done by the dev tools instead of manual labor.
Last year, a Nintendo engineer introduced a new official Vulkan extension, VK_EXT_shader_object:
>It's a very cool extension for Zink. Effectively, it means (unoptimized) shader variants can be generated very fast. So fast that the extension should solve all the remaining issues with shader compilation and stuttering by enabling applications (zink) to create and bind shaders directly without the need for pipeline objects.
- Anonymous
1 week ago
Reply
This post is what happens when laymen try to comment on professional subjects. A console's shader compilation is completely irrelevant since devs ship already-compiled shaders due to standardized hardware; this extension is for edge cases that are only part of console shader packaging issues.
- Anonymous
1 week ago
Reply
>A consoles shader compilation is completely irrelevant since devs ship already compiled shaders due to standardized hardware
I imagine Nintendo prepares for more significant hardware revisions for Switch 2.
- Anonymous
1 week ago
Reply
Because shader compilations are made on the fly in unreal engine. If devs are careful they compile them before the game starts, like in cod or bf.
- Anonymous
1 week ago
Reply
>$590
>7800X3D
>$450
>same game performance
Am I missing something?
- Anonymous
1 week ago
Reply
yeah the clown hat and nose for buying intel
- Anonymous
1 week ago
Reply
>Intel shills.
Why? For what purpose? They stagnated the industry for the better part of a decade.
- Anonymous
1 week ago
Reply
-BOL
TSUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUU
- Anonymous
1 week ago
Reply
Godot
- Anonymous
1 week ago
Reply
Use whatever and just plan on how to solve issues like this.
- Anonymous
1 week ago
Reply
write your own
you weren't going to finish your game anyway, so you might as well
- Anonymous
1 week ago
Reply
I like how the picture is cherry-picked from a legacy data sample from 5.0 he used for comparison
- Anonymous
1 week ago
Reply
Stutter Engine 5
- Anonymous
1 week ago
Reply
It's UE4.
- Anonymous
1 week ago
Reply
her breasts literally shrink and expand, wtf. boob jiggle is pretty much a solved problem so I don't get why someone would develop a novel solution that looks so much worse.
- Anonymous
1 week ago
Reply
>Cropping the user
Frick off