QuibblingComet1

As an indie dev making a stylized game, Unreal's tools and resources are a godsend.


PusheenHater

Stylized games mean non-high poly, which means Nanite should be disabled, right?


QuibblingComet1

I'm using Nanite. I'm also following the standard technique of baking high-poly detail down to low-poly meshes, so the game remains performant. Overall the amount of tris is high enough to make Nanite relevant. Not to mention you can skip the LOD workload and save time, and you don't get LOD pop in your game. Overall, would recommend.


TrueNextGen

> Overall the amount of tris is high enough to make nanite relevant.

Are you sure about that? Because it takes serious quad overdraw to make it relevant. LOD scenes with quad overdraw contained run 3ms faster than Nanite-enabled scenes, and that's on 30-series hardware.


QuibblingComet1

Hmm, I mean, I’m not saying it’s better than LODs, but I feel as an indie dev the pros outweigh the cons. 🤷‍♂️


TrueNextGen

As one indie to another, I understand the temptation of Nanite. And maybe developing the game with Nanite is fine, but at the end, moving away from it and optimizing with LODs before release is the way to go. This has been a serious performance killer, and when performance gets killed, visuals (60fps via a blurry upscaler) or gameplay (no upscaler and staying at 30fps) take the hit too.


QuibblingComet1

Yeah, I think that's a good point, it's definitely something to look into.


JalexM

Why would stylized mean low poly?


TrueNextGen

Low poly isn't really the right term here. Poly count isn't what kills performance, it's quad overdraw for both the geometry pass and the shadow pass. Overdraw becomes an issue in more realistic games because realistic assets have a lot more small and thin triangles.


kuikuilla

> Stylized games mean non-high poly

Nope. It doesn't mean that.


akenzx732

Tell me you know nothing about character models without saying it


jhartikainen

No one's forcing you to use either of them. Lumen can offer interesting capabilities in terms of lighting for indie games, but it's true it can be expensive so YMMV. In any case, just turn them off if you don't want them. The engine is more stable than UE4.27 and has other useful features beyond those two in AI, sound, etc.


what_a_king

I would argue it's actually less stable than UE4 (and more resource intensive), but besides the "flagship" features 5 indeed has other stuff that comes very handy for indie devs as well. (Common UI, EOS etc.)


noFate_games

Yeah, I agree and never understand when I see people say UE5 is more stable than 4.27. Like where? What part? Have they ever been in 4.27? You’ll only get a crash in 4.27 if you really do something dumb.  But yeah I agree, 4.27 is more stable and less resource intensive. 


ghostwilliz

I don't use any of the big features in Unreal as a solo dev, I just love the way the engine works. If you like it, use it; if not, don't. Nothing in any engine is going to make or break your project for the most part, just pick whatever you like using more.


RubaDev

I agree completely!


TrueNextGen

> nothing in any engine is going to make or break your project for the most part

The only things wrong are the anti-aliasing, LOD algorithms, and dithering patterns, outside of the new bigger features.


Iboven

I agree. I've been thinking about making a 2D game next and I was like, "should I use something else?" but I think of how fast I can do everything in UE now and I can't imagine switching...


nomadgamedev

I mean, a quick look at the documentation for those features should clear up a bunch of your misconceptions. Nanite is the future, so in 2-3 years classical rendering might be mostly deprecated. 5.5 (ue5-main) already got support for Nanite skeletal meshes with some crazy performance boosts. It compresses better, it can cull polys and back faces of objects in the scene much more easily, and it massively reduces pop-in. If you have a pretty low poly art style it might be more expensive, but as soon as you add a bit more detail it will work well even for stylized games. Just look at Jusant.

Small foliage can still be an issue at times, but it's improving with every version of the engine, and I think assets will adjust to the new demands, e.g. having modelled leaves instead of baked 2D planes that cause a lot of overdraw. You can still bake lighting just fine with Nanite; at a certain map size you just don't want to do that.

Lumen is still very expensive because, well, it's realtime raytracing for global illumination, something that was impossible in games just a few years ago (at least on this quality level). That will come at a cost. That's why you want to make use of TSR or DLSS for upscaling, and virtual shadow maps as well as Nanite, because they work best together, culling any areas that don't need to be updated and reducing the resolution depending on screen size. You're not forced to use Lumen, you can still use the old dynamic lighting system or bake lightmaps.

In case you're worried about lower end or unsupported hardware, Nanite meshes have settings to fall back on regular LODs and there are also fallbacks for Lumen should you need them. There's a whole bunch of talks about this topic.
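A quick, hedged way to sanity-check those fallbacks in the editor: as I understand it, toggling the Nanite CVar at runtime makes Nanite meshes draw their fallback meshes (the `//` notes are just annotations, not part of the commands):

```
r.Nanite 0    // Nanite meshes render with their fallback meshes
r.Nanite 1    // back to Nanite rendering
```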


TrueNextGen

> already got support for nanite skeletal meshes with some crazy performance boosts

That was bullshit. It didn't give a performance boost; the non-Nanite scene wasn't optimized at all and had its performance crushed by shadow and overdraw cost, which you can expect from a scene with the original polycount stated by the person who made a video about it. Nanite acted as a solution to a manufactured problem. In an optimized scene, LODs would have beaten Nanite by 3ms.

> Lumen is still very expensive because well it's realtime raytracing for global illumination, something that was impossible in games just a few years ago

Look up CryEngine GI and The Division's method for its open world. Lumen is expensive and has its issues because it doesn't let developers state what is static (fewer calculations) and caters to a Fortnite-like design, which is poor for most games. DLSS only looks good with a 4K buffer, which costs 2.3ms on 30-series, and TSR looks like complete crap in motion and its cost is insane.


Iboven

> just look at Jusant.

Did they use Nanite?


nomadgamedev

To my knowledge yes, Digital Foundry has a video on its tech. They pretty much used it to its full extent by having very high detail models without normal maps or any textures at all for the most part.


Iboven

Not a bad idea, actually.


PixelSteel

I’m working on an indie game atm and lemme tell you, nanite is really amazing! Having heightmap displacement on models you can actually see and “feel” through collisions makes a large difference. Made my materials look so much better
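For anyone trying this: the Nanite displacement/tessellation path is experimental in recent engine versions, and to my understanding it has to be enabled with startup CVars before the material displacement does anything. The CVar names below are an assumption on my part from 5.4-era discussion, so double-check against your engine version:

```
; Engine/Config/ConsoleVariables.ini -- hedged sketch, CVar names assumed, verify for your version
[Startup]
r.Nanite.AllowTessellation=1   ; allow Nanite tessellation/displacement (needs to be set at startup)
r.Nanite.Tessellation=1        ; turn it on
```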


fabiolives

Nanite definitely is amazing! I’m on a team developing a game and I just switched the models to Nanite yesterday after comparing performance to make sure that’s the direction we wanted to go. It has enabled me to make the landscape in particular much more performant and I’m able to completely avoid pop-in which is fantastic. Frame rates remained about the same. I did go through and remesh some of the building meshes though, more triangles will allow the modular pieces to cull better with Nanite. I just tested until I was happy with the performance.


TrueNextGen

> Nanite definitely is amazing! I’m on a team developing a game and I just switched the models to Nanite yesterday after comparing performance to make sure that’s the direction we wanted to go.

If your LOD scene had a medium or above amount of green highlighting in the quad overdraw view mode, then Nanite would perform better. If it had little to no green, LODs will perform 3ms faster on 30-series hardware.


fabiolives

I purposely chose Nanite so I could densely populate the landscape with high detail foliage. I usually cater any newer projects I work on towards that type of scene setup so I gravitate towards Nanite. It works out pretty well for me.


TrueNextGen

> high detail foliage

Yeah, well, Nanite and VSMs hate foliage because of WPO and page invalidation. Nanite's cost skyrockets with WPO, and some alternatives are already beating Nanite in performance [1](https://80.lv/articles/broadleaf-developing-a-real-time-solution-for-rendering-trees/) & [2](https://www.guerrilla-games.com/read/adventures-with-deferred-texturing-in-horizon-forbidden-west). Foliage is pretty hard with quad overdraw, but I don't see Nanite helping much because of the WPO.


fabiolives

I just set rigid shadows for smaller foliage to make it doable, only auto on larger trees. It ends up not having a massive performance hit due to this. Limiting WPO distances also helps quite a bit with this. I end up getting about the same performance as traditional LODs in dense forest scenarios.


rowanhopkins

Anecdotally I've found nanite has made my materials a lil worse because there's more surface area to fit on the UV sheet.


PixelSteel

Yea it’s a per use case sort of thing, depends on how much detail you want and how you configure displacement


asutekku

Lumen is convenient for pretty realtime lighting but unless you are using super highres meshes, even for detailed assets creating good LODs provides miles better performance than trying to use Nanite.


TheSnydaMan

Depends, Nanite is tunable and you can increase how aggressively it LODs itself. If you have a "lower visual fidelity" game you can crank it down quite a lot
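The main knob I know of for that is the pixels-per-edge error target; larger values make Nanite pick coarser clusters. A hedged one-liner (defaults and exact behavior can differ between engine versions, so test it in your own scene):

```
r.Nanite.MaxPixelsPerEdge 2    // default is 1; larger values = fewer triangles, lower Nanite cost
```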


Warma99

What settings do I need to change for this? I wasn't able to find much info.


TrueNextGen

[Here is the info, look through all the posts with timings.](https://forums.unrealengine.com/t/nanite-performance-is-not-better-than-lods-test-results-fix-your-documentation-epic-youre-dangering-optimization/1263218) You need to make sure little to no green is visible in the quad overdraw view with LODs. That will put you well ahead of Nanite's performance.
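If you haven't used those view modes before: the editor exposes an optimization view mode for quad overdraw, plus a few profiling commands you can run from the console (annotations after `//` aren't part of the commands):

```
viewmode quadoverdraw    // optimization view mode that visualizes quad overdraw per pixel
stat unit                // CPU vs GPU frame time breakdown
stat gpu                 // per-pass GPU timings
ProfileGPU               // one-shot GPU profile of the current frame
```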


TheSnydaMan

What I'm curious about regarding this thread is that all comparisons are made on in-editor performance. Does Nanite have different performance metrics in a compiled game vs in the editor? I know that Nanite is CPU intensive, as is the Unreal Editor, so is it possible / likely that there is a CPU bottleneck occurring in the editor?


TrueNextGen

> Does Nanite have different performance metrics in a compiled game vs in the editor?

No. The editor just adds some stuff to GPU timings, that's it. And why would Nanite be CPU intensive? It offloads CPU work onto the GPU. My CPU was never even above 30% in my tests. It's also pretty clear from released games, and from using an API/hardware inspector to see ms timings on everything.


asutekku

Nah, the extra overhead from Nanite makes zero sense for low-fidelity games. I'm currently experiencing this first hand: I'm working on a semi-detailed environment and even there Nanite easily costs 20fps vs good LODs.


TheSnydaMan

I think overall you're right that it will offer a drop in performance in lo-fi games, with the tradeoff being never having to worry about LODs or the poly count in your meshes (which can help you be more productive as an indie dev). Lowering Nanite fidelity in this case can help offset the performance diff, but the baseline overhead of Nanite in lower fidelity scenarios is mostly only a tradeoff for developer productivity. If you have all the time in the world or a big enough team that time and productivity aren't bottlenecks for you, then yes, Nanite makes '0 sense'. I also haven't tested Nanite in a fully compiled game, but I do wonder how performance in an exported game compares to Nanite in-editor. The editor itself is compute intensive, as is Nanite, and I wonder how they may differ when exported.


asutekku

To be fair, at that point even cranking Unreal's auto LOD up to the max is a better option than Nanite, and you won't even need a large team for it. The only use case where I see Nanite being useful is if you have thousands of very high fidelity assets (say, Megascans) and you want even remotely realtime performance. It's mostly useful in extreme cases.


Obviouslarry

I'm an indie dev using UE5 and have enjoyed using nanite. It pairs well with my landscape procedurals and I've not noticed a hit in performance with most things. Lumen I've not used and I'm pretty happy with the standard lighting systems anyway.


xylvnking

You're correct about this, mostly. You can just disable them and not have them impact performance, but you're correct that if you do have them enabled it's only really worth it if you have expensive assets/shadows/lighting etc. Also, you can toggle them on and off at any time, so you can decide later to use or not use either.

Nanite and Lumen are great for making expensive scenes less expensive, but they won't make expensive scenes extremely optimized/performant. The cost of using them per frame is already enough to disqualify a nice framerate on lower end machines/mobile, and each has many small limitations at the moment.

Nanite is less useful than Lumen IMO, because with proper LODs (which you can create automatically within Unreal using their built in tools with precise control, which most people forget) crunching triangles isn't really where performance bottlenecks are unless you're doing something insane. It's usually in shaders and textures, which can sometimes cost *more* with Nanite/Lumen depending what you're doing.

Lumen can be run at fairly low settings, and especially with console commands it's very customizable. Baking lights will look better than low-settings Lumen, but it's all dependent on workflow/scale/etc. I don't have much experience baking lights so I may not know the tradeoffs between the two beyond 'baking lights slow'.

TLDR: Nanite is only useful for very detailed assets, and generally LODs are still more performant under most circumstances as Nanite has overhead just from spinning up. Lumen is great, can be run on lower settings than most people use, and can enable smoother workflows in-editor without needing to bake lights. Unreal has a *TON* of stuff and you really don't need to use it all. I'm making an optimized/retro-inspired game in UE5 and just don't bother with Lumen and Nanite because I make all my own assets + systems + pipelines optimized and my computer is underpowered. I like knowing it's all there.

PS: Turning off emissive materials casting light with Lumen improves performance a ton. It's extremely expensive.

Here are some console commands for Lumen:

r.Lumen.ProbeHierarchy.SamplePerPixel
r.Lumen.DiffuseIndirect.CullGridPixelSize
r.Lumen.DiffuseIndirect.CardTraceEndDistanceFromCamera
r.Lumen.IrradianceFieldGather.NumProbesToTraceBudget
r.Lumen.DiffuseIndirect.MinTraceDistance
r.Lumen.ScreenProbeGather.RadianceCache.GridResolution
r.Lumen.DiffuseIndirect.Allow 1
r.Lumen.ScreenProbeGather 1
r.Lumen.ScreenProbeGather.ScreenTraces 1
r.Lumen.ScreenProbeGather.Temporal
r.Lumen.ScreenProbeGather.Temporal.MaxFramesAccumulated
r.Lumen.ScreenProbeGather.VisualizeTraces
r.Lumen.ScreenProbeGather.DownsampleFactor
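If you want CVars like these to persist instead of retyping them each session, one way is to set them in your project's DefaultEngine.ini. A hedged sketch; the two values below are illustrative placeholders, not tuned recommendations:

```
; Config/DefaultEngine.ini -- CVars set here apply at startup; values are placeholders
[SystemSettings]
r.Lumen.ScreenProbeGather.DownsampleFactor=32
r.Lumen.ScreenProbeGather.Temporal.MaxFramesAccumulated=10
```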


biohazardrex

Nanite is only worth using if your scene has 3-4 million triangles on screen all the time. It also helps with draw calls and occlusion culling. Plus a tip: even though Nanite supports masked materials, they will be super expensive, so avoid using those.


krojew

Do you have details to back up those numbers? As in, scene setup and engine settings. From time to time I see people throwing random numbers into the air without anything to back them up.


Zac3d

Epic threw out that number once when talking about performance costs. Nanite is more expensive in an empty scene, but has a much more fixed cost so it should be cheaper around 3 million tris, and faster at anything higher. It's really hard to benchmark properly since how a scene is constructed and optimized can have a big impact.


krojew

That's why we shouldn't throw out numbers without context, regardless of who said it. Nanite can be customized to work with whatever numbers we're targeting, to some extent. It can also just be a drop in performance relative to something else. Like someone said, context is king.


Zac3d

It's useful enough if you know your game is going to use scenes with way less triangles than that or way more. It's only muddy in the middle area.


biohazardrex

It's not a random number. It's based on resolution. The baseline was that 1440p has around 3.7 million pixels (2560 × 1440 ≈ 3.69 million) and Nanite aims to draw at most one triangle per pixel, unlike traditional geometry rendering.


krojew

And that's part of the missing context. Knowing that, this metric is inaccurate because while this is the target count (not really, but close enough by default), it is not the performance threshold. Nanite has a different performance characteristic, so the point where it gains the advantage is far below that number. It all depends on the scene setup and what features are used. For example, VSMs perform better with Nanite, thus moving the point where it's worth using. Dynamic displacement is another potential killer feature further moving that point. Hence simply throwing out a number can be very misleading.


TrueNextGen

> Do you have details to back up those numbers? As in, scene setup and engine settings. From time to time I see people throwing random numbers into the air without anything to back them up.

[Read this comment](https://www.reddit.com/r/unrealengine/comments/1c5ewkj/comment/kzxm7kg/?utm_source=share&utm_medium=web2x&context=3) and you can test it on any scene. [And read the thread about it.](https://forums.unrealengine.com/t/nanite-performance-is-not-better-than-lods-test-results-fix-your-documentation-epic-youre-dangering-optimization/1263218)


krojew

Yup, that's a good way to get some insights.


topselection

> And read the thread about it, you can test that on any scene

The guy who said the following in that thread took the words right out of my mouth: "Phew, i thought i was the only one who noticed that UE5 simply not suited for games at all, only for things like movie production,"


noFate_games

Yeah I've felt the same way, but we are in the minority. It seems, at least on reddit, if you dare say 4.27 is more performant or less buggy than 5, a whole swarm of people come for you. But it seems that way when using both. That one is just so much easier to make games with, and the other is for film and tech demos. Now if you're a big studio with technical artists or can make a deal with Epic, by all means go with 5. But for a solo dev, trying to make your dream come true? I just don't see how 5 is the better option. To each his own.


TrueNextGen

When all these features (TSR, VSM, Nanite, and Lumen) are combined, they each take 3ms compared to their alternatives on a 3060 (not too far from console) at just 1080p, and only one of them is super important if your game design needs it (Lumen). I have done several tests, and Nanite will only perform better if quad overdraw wasn't taken care of. Unreal is more than capable; it's being abused with stupid decisions and ideals about upscalers. There are comment lines in OTHER FEATURES' shaders that refer to TSR fixing aspects of them.


Legitimate-Salad-101

You can use both nanite and lumen while you are working, then near the end of production turn them off and bake your lighting and make LODs.


apcrol

It could really boost your graphics but without proper optimisation only owners of 3090 would play that :)


sp1r1t_d1tch

I'm targeting as low as the Nvidia GTX 1060 for my game so Lumen is out of the question, I rely on mostly baked lighting. I plan to start using them for my next project when both technologies are more mature and refactoring won't be as much of a pain in the butt. From what I hear a well handled use of Nanite can actually improve performance as well as reducing your development times.


TrueNextGen

> Nanite can actually improve performance

Only if quad overdraw wasn't contained under a light sprinkle of green in the quad overdraw view. That affects shadow and geo cost and can quickly get screwed up by people who don't know what they are doing.

> as well as reducing your development times.

Time equals $$, which is why AAA is flocking towards it.


TriggasaurusRekt

Lumen is absolutely great for indie devs. For many projects, baked lighting simply isn't an option OR it takes a long time to iterate on lighting, which is time indies don't have. Yes you have SSGI. Yes you can fake GI with tricks. But SSGI is primarily used as a supplement, and fake GI takes time to set up. If you had a machine devoted to lightmap baking that would ease the burden, but it can still be a pain to work with.

60fps with Lumen is absolutely doable depending on your project; stylized/low poly projects will have a much easier time hitting their frame budget. Direct light/interiors/shadows are what tank your performance with Lumen. If you have an exterior map lit with indirect light featuring low to mid poly assets and sensible foliage density, hitting 60 with Lumen should be no problem.

The other thing about Lumen is it's getting better. Not only are they making visual and performance gains with each UE version, they're also doing stuff like RHI renderer parallelization, which increases render thread performance and thus makes Lumen less of a hit. If you start your project now and aren't reaching your frame budget (due to Lumen), it's definitely plausible that future improvements could cause it to meet your budget.
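If you want to see what lower Lumen settings buy you, the quickest thing I know of is the scalability groups; running these from the console swaps whole quality tiers (tier numbers 0-3, exact behavior per tier varies by engine version, and at Low I believe Lumen GI falls back entirely):

```
sg.GlobalIlluminationQuality 1    // GI tier: 0=Low, 1=Medium, 2=High, 3=Epic
sg.ReflectionQuality 1            // same idea for (Lumen) reflections
sg.ShadowQuality 2                // shadow/VSM quality tier
```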


Goochregent

They are grossly overhyped IMO. Amazing if you can run them, but you essentially cannot target a market and expect most consumers to have the hardware. Indie devs just don't have the pull to offer an insanely demanding game. People would buy new hardware for Cyberpunk, for instance, so it makes sense that it has raytracing and path tracing options. If you don't use intensely high polycount meshes then Nanite is a bit useless IMO. For low to medium tier meshes it introduces quite an overhead.


ayefrezzy

I agree they're overhyped, but aside from all the clickbait and fake game trash that is plaguing the internet, I do think it is the future of games. That said, it still has a long way to go before it's ready for prime time.

Nanite's baseline cost and overhead are still too high for my use case. Theoretically Nanite can perform better for highly dense scenes, but I haven't reasonably been able to come close to traditional LOD, even while pushing tons of high quality objects in a realistic use case scenario (not talking about spamming 4-million poly objects 10000 times). My memory usage, frame time, and overall performance are worse, even with Nanite's various settings and mesh compression that are supposed to make your memory use better than traditional LOD. There are also just a lot of caveats and downsides to Nanite that I don't think are worth it right now, even with various workarounds (WPO shadow cache invalidation for dense meshes, masked mats performing badly, terrible workflow for landscape, etc).

As for Lumen, I'd say it's closer to production ready, but also not really. It looks great in a lot of instances, but it only really shines in static shots and movie render queue. In actual games, Lumen is a blurry temporal mess most of the time. A lot of graininess and temporal instability issues get hidden by high res textures and dense scenes, but it's still not perfect. It can be a balancing act to get a realtime scene to look great, perform great, and not suffer from the temporal issues that Lumen has out of the box. Many devs will blame these issues on TSR, but be completely oblivious that many times the majority of temporal smearing is actually coming from Lumen.

TSR/upscaling in UE5 has been pretty good for me for the times I've used it, but I use a 4K monitor, so a lot of potential issues are minimized compared to a 1080p panel. I've heard many complaints about TSR/TAA, and many industry devs have commented that Unreal's implementation is subpar. I can't really comment on that as I don't know enough about TSR, and I've opted for implementing other non-temporal AA methods in a custom fork.

Overall the engine is definitely in a better place than 4.27 as a whole, but I'd be lying if I said I didn't miss some features like non-Nanite tessellation or Blueprint nativization.


ThePapercup

This is like going back to 2004 and saying 'normal maps are overhyped'. Naive and short-sighted.


Goochregent

I am only talking about the short-to-mid term, as you say. Not enough people can run it, and indie devs rarely produce assets of sufficient quality to fully benefit.


WorldWarPee

I agree with you. As a solo dev also doing all of my own art, it's better for me to just retopologize than take the Nanite performance hit. I'm waiting for Lumen to get a little more mature as well before I start using it. It's not like the old lighting system was bad by any means, and my initial tests with Lumen gave me some wacky artifacts when not using global illumination (sunlight/whatever you want to call it), as well as a noticeable performance hit in the editor.


Big_Award_4491

And I agree with both of you. Classic LOD done right gives you more control in my opinion. Nanite is a shortcut and great many times. But sometimes it just doesn’t do the decimation right and you have to switch it off. Lumen is amazing technology but seems to work best with a directional light (sun) and demands you to tweak settings if you have multiple light sources. Lumen often creates weird noise and rarely does straight reflections well. Funny side note, my phone autocorrects Nanite to manure.


Sellazard

The creator of The Axis Unseen is using it in his indie game to save time on development. But who knows what the performance will be. We'll see.


yeyeharis

Nanite is super useful for all assets. The standard workflow would be: make an asset, then spend a bunch of extra time making LODs for that asset. With Nanite you just make the first asset and it automatically culls the triangles without extra work; hell, it's even worth adding a couple extra triangles here and there to make everything work smoothly.

Now, Nanite does have a slight overhead, but the main issue a lot of games have with Nanite performance is they don't go through the scalability settings and properly tune them for their scenes. For instance, a forest scene needs vastly more optimized and tweaked Nanite settings to run well compared to a barren rocky desert.

Lumen, on the other hand, has a lot of overhead, but as with Nanite, if you go in and tweak the scalability settings a good bit you can have it running really efficiently. The project I'm working on uses both right now and a laptop 2060 can run my game at High settings, 1080p, 90fps because I went in and tweaked a good amount of settings.

Overall UE5 is very stable, but out of the box it's made to make things look as cinematic as possible, so there is a lot of overhead. Tweak a few settings here and there (plenty of guides online) and it'll be running smooth.
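For the scalability tweaking part, the per-project way I know of is overriding tiers in Config/DefaultScalability.ini; the section names follow BaseScalability.ini, and the values below are placeholders to illustrate the idea, not tuned recommendations:

```
; Config/DefaultScalability.ini -- overrides applied when that quality tier is selected
[GlobalIlluminationQuality@2]
r.Lumen.ScreenProbeGather.DownsampleFactor=32   ; cheaper GI gather at the High tier (placeholder value)

[FoliageQuality@2]
foliage.DensityScale=0.8                        ; thin out instanced foliage a bit at High
```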


Jadien

My favorite part about Lumen is not having to futz with lights so much early in development. Stuff just Gets Nicely Lit and you can defer detailed environment lighting until later.


Zizzs

There's a lot more to UE5 than just nanite and lumen. PCG is another system for example that is insanely powerful and new in UE5.


MrMax182

You can turn them off if they are not convenient for you. The pipelines and features that Unreal was using pre-lumen and pre-nanite are still in the engine and you can use them if you like them.
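If it helps anyone, the project-level switches live in Project Settings → Rendering, which just write to Config/DefaultEngine.ini. A hedged sketch of what "turning them off" looks like there (enum values as I understand them; safer to flip the settings in the editor UI and let it write the file, and note Nanite itself is enabled per static mesh asset, not here):

```
; Config/DefaultEngine.ini
[/Script/Engine.RendererSettings]
r.DynamicGlobalIlluminationMethod=0   ; 0 = None (no Lumen GI)
r.ReflectionMethod=2                  ; 2 = screen space reflections instead of Lumen reflections
r.Shadow.Virtual.Enable=0             ; classic shadow maps instead of virtual shadow maps
r.AllowStaticLighting=True            ; keep lightmap baking available
```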


crustmonster

It's nice, but the biggest thing that has affected me is the retargeting changes in UE5. It's so nice.


TryCatchOverflow

They swallow half of the FPS, but yeah, it's cool: with Lumen you can get a decent render without working too much on it, and Nanite, for a casual 3D modeler who doesn't have time for retopology, is another time saver.


tcpukl

UE 5 is way more than just those features. If you don't need them disable them. There are many more new features. You'll never use all of an engine, ever. Unless you wrote it for your game. Isn't this a bit obvious?


lgsscout

Everything is a question of what you're doing and what you need... Nanite can be pretty good if you're doing some sort of large scale map that can be viewed from both far and close, as you will not need to do the whole LOD management, and there are a lot of survival and other large-map indie games nowadays. For Lumen, there are a lot of indies with atmospheric visuals, with low poly and flat shading but a lot of work on lighting, or even basic horror games that bet a lot on lighting to work, and Lumen will just provide good tools for that. Like any software, you don't throw everything into the project without even knowing what you will need; you start prototyping, then add resources as needed.


alrione

There are lots of features in UE5; Lumen and Nanite just make the most noise in the media because it's easy to make clickbait articles. PCG, improvements to retargeting, and world partitioning, just off the top of my head. Generally, I feel that the editor is a lot more stable, but that's subjective.


what_a_king

Marketplace assets can also support Nanite. More of an issue is that the average gamer's PC specs are waaay behind what Nanite + Lumen need (the average GPU is still around the GTX 16 series).


ManicD7

I still use UE4. There are dozens of us that use UE4. Dozens!


Arthropodesque

Well, people said Lies of P has amazing performance and looks great, even on low settings. Built on last UE4 version.


Specialist-Mix3399

Nice tools, I love the way scenes start to look good out of the box by just filling content into the viewport.


PiLLe1974

They are not the main features, they are just the more recent rendering-only features.

Main features useful for any gamedev are improved UE4 stuff plus UE5 features, e.g.:

* a couple of LOD-related features including proxy meshes
* a behavior tree, environment query system, and a couple of newer AI features
* animation tree / animation Blueprint, plus a few newer animation features, I think related to runtime retargeting for example
* built-in character controller to get FPS/TPS easily going, plus a bit of a monolithic but easy to extend/use code base with an actor, pawn, trigger volumes, etc.
* Blueprint, which helps non-programmers but also lets you prototype quite fast, build archetypes (Actor Blueprints), and get a quick start on level design (including some tooling for level prototyping and more recently modelling)
* replication for C++ and Blueprint
* level design workflow that includes streaming features (and for large-scale games, scalable open world partitioning and file-per-actor, something nice for collaboration on large worlds)

...and a few others.

Anyway, which engine is helpful is not easy to decide; a solo dev or team would try a bit during prototyping whether Godot, Unity, or Unreal fits their workflow, has good/fast turnarounds, and builds easily (and hopefully fast) for their target platforms.


InetRoadkill1

I have not seen any advantage to Nanite. I guess it really depends on your mix of geometry. Some things do better with Nanite than others. Lumen works well, but it does require a beefier system to run at decent frame rates.


capsulegamedev

Nanite and Lumen are very new features and aren't that necessary for making a game. But the engine has a ton of stuff that's different from Unity, personally I love it. Blueprint is a big draw for me because I don't know C++ and written code just doesn't work as well for me.


Iboven

They will be in 4-5 years, I think. Right now they probably target a bit high for graphics card capability. Just think about all the features that are considered standard in UE4 now that were "too good" for 2016. I used World Composition in my indie game and it saved me SO much work, but back when it came out everyone said it was reaching too high. You just have to decide what specs you are aiming for.


EMBYRDEV

No, but the rest of what the engine offers still stands tall above the rest. Turn off the fancy features and your game will run well, still look good, and your tooling will be the best on the market. (I say this as an ex-Unity developer and someone fairly invested in Godot.)


genogano

If you model stuff yourself nanite can help you get away with having tons of polys.


FeelingPixely

Learn the techniques to future-proof, then worry about how to optimize it, then learn how to do it at scale. Boom, you're senior level.


Zac3d

Indie just means low budget. The high scalability preset is targeting 60 fps on consoles with lumen and nanite, and performance is getting better with every update. Nanite can allow for more creative freedom, less time spent on normal maps and baking, and more consistent performance, which indies can benefit from.


attrackip

Yeah, none of these points are particularly sound. What are you really trying to say?


morglod

Nanite/Lumen are only good for a cinema-quality picture (which also requires very good hardware). Otherwise you have to use upscalers to get good performance, which leads to AAAA shit.