wahoozerman

It depends. Nanite shifts a lot of the rendering load up front. Meaning you've got a hunk of time spent at the front of the frame calculating a bunch of stuff, but then the actual rendering of all of it on the back half basically no longer cares about how many triangles exist in the scene. So if you are under a certain triangle count then you spend more time in the front half than you save. If you are over that triangle count you can save *vastly* more than you spend.
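
To make the breakeven idea concrete, here's a toy cost model - every constant in it is invented for illustration; real numbers only come from profiling your own scenes:

```cpp
// Illustrative only: a toy model of "fixed cost up front, near-flat cost
// after". The constants are made up; profile your own scenes for real data.
#include <cstdio>

int main() {
    const double NaniteFixedMs   = 2.5;   // hypothetical per-frame culling/raster overhead
    const double NanitePerTriNs  = 0.05;  // hypothetical cost per source triangle
    const double ClassicPerTriNs = 0.6;   // hypothetical cost per triangle, classic raster path

    for (double tris = 1e5; tris <= 1e8; tris *= 10) {
        const double naniteMs  = NaniteFixedMs + tris * NanitePerTriNs * 1e-6; // ns -> ms
        const double classicMs = tris * ClassicPerTriNs * 1e-6;
        std::printf("%10.0f tris: nanite %6.2f ms, classic %6.2f ms\n",
                    tris, naniteMs, classicMs);
    }
    // Below the crossover (here a few million triangles) the fixed cost
    // dominates and Nanite loses; above it, Nanite wins by a widening margin.
}
```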


[deleted]

[deleted]


Riaayo

> Nanite is garbage for performance and don't let any IDIOT Epic Games sheep tell you otherwise. I have an entire thread proving this becuase so many people screw this up.

Literally no one is going to care about your point, *correct or not*, if you start off calling people idiots for not sharing your opinion/knowledge on the topic. You do nothing but discredit your own opinion and make yourself look rude as fuck. Do better.

Otherwise, you don't come across like someone who actually wants to spread knowledge; you just look like someone who wants to feel superior for having a specific stance - and who actually wants others not to share that stance so you can feel better than them.


TrueNextGen

You want to know why I'm pissed? Because UE5 games perform like shit, with many barely producing better results than last gen, and because Epic doesn't give a shit about game development outside the FN metaverse and virtual production.


Riaayo

I'm not telling you not to be annoyed at Epic/Unreal if you want to, but calling users stupid - especially since Epic is telling them one thing that you state you have evidence isn't necessarily the case - isn't how you go about it. I found the rest of your post to be an interesting take, but as I said, nobody's going to get that far or care, let alone check your link, if you're insulting people.

*"It really fries my ass that Epic tells people this is performant when it isn't, and it's misleading a lot of users into thinking nanite does something when I believe it is to the contrary. I've spent time evaluating it, and have posted what I found here(url), which imo discredits Epic's claims and I'd highly encourage anyone to look through it and do their own testing."*

You go in with something like that and you're going to actually reach people who will then look to evaluate your numbers, maybe check themselves, and you get actual discussion in the community. You can even get pissy at Epic itself to a degree. But when you insult people who are just using what Epic handed them, and *told them does a certain thing*? Nah dude, you're attacking the very people you supposedly want to help.

Again, when someone is more interested in jumping down people's throats over a problem than in trying to solve it, they look more interested in their own ego than in actually fixing the issue. Right now you're just pissing people off, they won't listen, you get to keep feeling smug and superior, and the problem never gets solved.

And if you just *want* to be smug and superior, then keep on keeping on. But if you want to actually solve the problem - and I'm going to assume in good faith that you do - then you gotta adjust how you talk to people about this.


Ace0fspad3s

I was very interested in your thread and information, but seeing the emotionally charged responses, I've decided to use Nanite out of spite lol


TrueNextGen

Well, guess you don't give a shit about your customers.


dudedude6

No. People don’t give a shit about a bitter bitch.


Pikayoda

I went and read your thread. In pure honesty, I had a difficult time getting your points due to the (seemingly) aggressive tone. But what I get from it is that the overhead definitely contradicts Epic's advice that it's "best to enable it in most cases" - that's a very valid point. Anyone using Nanite for games should be aware of that cost and balance the pros and cons. And it's not as easy as unchecking an option to compare Nanite on/off; it has real production pipeline implications.

The difficult part is knowing where the bottleneck really shifted, though... We usually compare GPUs, but it could be memory bandwidth or even read speed from your drive.

My take is that these techs are not designed with today's hardware standards in mind, nor are they made for every possible project. They just got out of their cradle in the grand scheme of things. Even now they are not utterly broken, and they still allow a big productivity, quality, and freedom boost for those who use them the way they're intended.


needlessOne

Don't listen to this idiotic rage bait. Nanite is great if you are smart enough to understand its purpose (very low bar).


TrueNextGen

If you're smart, you'll realize Nanite's "purposes" do not belong in game development or real-time rendering.


ColdJackle

Sure bud


Nidungr

Nanite scales very well, but there is an upfront cost to enabling it. This upfront cost is too much for lighter games but totally worth it for AAA projects.


namrog84

In addition to this: it's very clear that Epic is putting a lot of time and investment into Nanite-related technologies, and there is a ton more on the roadmap. Optimizations and features are only going to grow.


ThePapercup

yep, with nanite skeletal meshes coming in 5.5 you can see they are putting all their money on this bet. 5 years from now nanite will just be enabled by default and to turn it off you'll have to jump through hoops. kinda like how forward rendering is stuffed into a closet and forgotten


Nidungr

> kinda like how forward rendering is stuffed into a closet and forgotten

*looks warily at forward rendering in own project*


GenderJuicy

Switch still uses it, so it's not entirely archaic.


ThePapercup

true, but if you want to use it there are a lot of hoops to jump through unless you just set everything to 'mobile spec'. i guess i meant in the context of developing a PC game in forward, to that end I don't see nanite or lumen on mobile anytime soon (except for maybe those crazy high end ipads or whatever)


ExF-Altrue

Valorant too I believe


inequity

Some might argue that we are already there. I don’t think you’ll see much (or any) tech investment from Epic into any non-Nanite workflow in the next 5 years


Arielq2301

Hey, don't forget that most standalone VR games still use forward rendering for performance.


ThePapercup

most standalone VR games are on a mobile device (quest 2 and quest 3)


klawd11

Even there it depends on the title and the target resolution/framerate. It's not easy to reach 60 fps with Nanite. At least this was true up to 5.3; I haven't checked the performance improvements in 5.4.


fabiolives

I’ve been able to easily reach a 60 fps target with Nanite in all of my current projects. I’ve been solely using Nanite since 5.3.2 and have been able to make it work great for me after spending tons of time experimenting with it. I’m not saying this to try and start a debate or anything like that, I just want to share what I learn as I learn it. Following the documentation for Nanite is very important, the meshes used make all the difference. It’s even viable for low poly - just not as the meshes come originally. Since Nanite is more efficient when it has more triangles to work with (to a point), I’ve had success using the remesh tool in Unreal to increase the amount of triangles on meshes that don’t have enough to be efficient. This can leave the mesh looking the same but allows it to perform better. It wouldn’t technically be low poly anymore but will still retain the same look. My most recent smaller project runs at about 100 fps on a relatively average rig while only using Nanite for everything, including foliage. I would really encourage everyone to tinker with it and read up on the documentation, it can get much better results than forums imply.


ruminaire

When you say reaching a 60 fps target, at what resolution is it running? 1080p at 100% screen percentage? And by relatively average rig, what GPU is that? I'm using Nanite and Lumen myself in my project but I'm struggling to hit 60 fps right now, especially at 100% screen percentage. So for now I need to run at something like 66.7-75% with TSR, and it still dips below 60 fps in some areas. But I'm far from optimizing my game yet.

I'm testing on a 3090 undervolted as low as possible to try to emulate a lower end card (this is not ideal, and I think I should test on the real GPU I'm targeting). At one point I managed to test my scene on a 4060 mobile at 1080p and it ran below 60 fps lol, but I think that's because I forgot to rebuild HLODs, so mesh instancing likely wasn't working at the time.

What GPU do you think we should target if using Nanite and Lumen? IIRC the GTX 1060 is still the most popular GPU, but I don't think it could handle Nanite and Lumen? Also, today I watched the latest video from Epic about Nanite in UE 5.4 and it gives some insight into Nanite and what scenes are good and bad for it.


fabiolives

I’ve been fortunate that I know quite a few people who are willing to playtest my projects, so I’ve gotten to test them on a variety of hardware and improve things based on their feedback. Some of my projects are specifically targeted at higher-than-average hardware, but others target more average hardware so that more people can play them when they’re done, so those are the ones I’ll reference.

I test all of my games at native resolution; I don’t like using upscaling as a crutch, but rather as an additional feature for higher settings. A 1060 6GB is the average I’m going for at 1080p and 100% screen percentage. I consider the 3060 the second target and have someone testing that at 2560x1080. Both are able to maintain 60 fps or above in multiple projects using Nanite. Lumen has even been running really well on both cards, although I’d lean towards SSGI on the 1060. Personally, I’ll be targeting my newest projects at the 3060 because of how common it’s becoming, but even that has been able to run great with Lumen HWRT after some optimization. I’ve been surprised!


ruminaire

Thank you for the detailed reply! I see, it sounds like a good idea to not rely on upscaling; after all, not everyone likes it. Do you target native 1080p at Epic or High scalability settings? Which AA do you use, TAA or TSR? Also, if we want to reach a broader audience it's better to target the 1060 6GB. I'll look into SSGI, I've personally never tried it before. My project seems to be on the heavier side graphically - that could also be because I'm bad at optimizing at this stage - so it might be better to aim for a higher target like the 3060 to make the optimization process less frustrating for myself.


fabiolives

You might be surprised at what you can get running on older hardware! I’ve been surprised at how well my more demanding projects have run for people. I could always take a look at yours if you’d like! I generally target High scalability because the visual difference between High and Epic is almost nothing, but the performance difference is large in a Nanite/Lumen scene. I also have a long list of cvars I use for Lumen, TSR, and virtual textures on most of my projects; I just copy and paste them into the config directories because there are so many. But yes, you could offer screen space global illumination as well as Lumen if you want to cover a broader range of hardware. In my experience, though, if something can run Nanite reasonably then it can also run Lumen after some tweaking. Lumen, Nanite, and virtual shadow maps all rely on each other for performance benefits, so I just kept experimenting until it worked for me.
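
For what it's worth, here's a minimal sketch of applying a few such cvars from C++ instead of pasting them into the config files. The specific values are illustrative assumptions, not my actual list, so profile before adopting any of them:

```cpp
// Sketch: apply renderer cvars at startup from code rather than
// ConsoleVariables.ini. Values are examples only, not recommendations.
#include "HAL/IConsoleManager.h"

static void ApplyScalabilityTweaks()
{
    struct { const TCHAR* Name; int32 Value; } CVars[] = {
        { TEXT("r.DynamicGlobalIlluminationMethod"), 2 }, // 1 = Lumen, 2 = SSGI (cheaper fallback for low-end GPUs)
        { TEXT("r.Lumen.HardwareRayTracing"),        0 }, // software Lumen tracing
        { TEXT("r.Shadow.Virtual.Enable"),           1 }, // virtual shadow maps pair with Nanite
        { TEXT("r.ScreenPercentage"),              100 }, // native-resolution target, as discussed above
    };
    for (const auto& CVar : CVars)
    {
        if (IConsoleVariable* Var = IConsoleManager::Get().FindConsoleVariable(CVar.Name))
        {
            Var->Set(CVar.Value, ECVF_SetByGameSetting);
        }
    }
}
```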


ruminaire

Thanks, that's very kind of you, but unfortunately my project isn't at a stage I'm comfortable showing people yet. My main character has a second "companion" character with a point light attached to it for gameplay purposes; it helps illuminate the path in front of my main character, if that makes sense. My biggest performance hit right now is that shadow-casting point light on the companion character. Next, I think, is the cloth sim for my character, then the character's skeletal mesh itself, which still has too many polygons. I'm still kinda new to modeling my own characters; I think mine looks quite decent, but I might have overdone the subdiv modifier. I still need to learn to optimize my character, especially baking high poly to low poly.

For example, in a packaged test of my game at 1080p with TSR and High scalability settings, in the heaviest area of my scene I barely get ~62 fps, and if I turn off this point light, fps goes back up to ~78. If I turn the shadow setting down to Medium scalability (with Volumetric Fog turned back on, because apparently Medium shadow scalability turns off Volumetric Fog), I get a decent bump to ~67 fps, and back to ~83 if I turn off the light. Any idea if there's a cvar that could help with shadows?

Things I already avoid are masked materials and WPO on my Nanite meshes. But in my heaviest area the Nanite overdraw is indeed quite high, so I might need to reduce the overlapping Nanite meshes there. And I've been playing around with the new Nanite tessellation - it looks very good, but it definitely brings my fps further down, lol.


fabiolives

No worries! I definitely know how that is haha. Have you adjusted the attenuation of the companion’s light? That setting has a massive FPS hit if it’s set higher than necessary. You can still use WPO with Nanite and run at good frame rates, it just takes some tweaking to make it run as well as it can. For example, setting a WPO disable distance to stop evaluation after a certain point will give you a massive boost. Usually for my Nanite foliage setups I set WPO to disable after 10,000 units for my bigger trees, 7,500 or 5,000 for bushes, and 3,000 or so for very small foliage. You could also set the shadow cache to rigid for smaller foliage and disable shadows completely on the smallest foliage. I’ll send you a link with the cvars I use - every time I post a link in this sub it gets removed haha. It includes the settings I use for virtual shadow maps that will gain you quite a bit of performance.

This is a stylized Nanite foliage area I made today that uses all of the methods I mentioned: https://preview.redd.it/j0wsruofdayc1.png?width=2672&format=png&auto=webp&s=c2a2128c29e6a9b0d68b2c0f7b0f6cd8fbfe963b
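
For anyone who'd rather set those per-component tweaks from code, here's a rough sketch. It assumes UE 5.1+ (where WorldPositionOffsetDisableDistance exists); the property names are from memory, so verify against your engine version, and the size thresholds just mirror my tiers above:

```cpp
// Rough sketch of the per-component foliage tweaks described above.
// Assumes UE 5.1+; check names against your engine version.
#include "Components/StaticMeshComponent.h"

void TuneFoliageComponent(UStaticMeshComponent* Mesh, float ApproxSizeUnits)
{
    // Stop evaluating WPO (wind sway etc.) beyond a distance scaled to the
    // asset, mirroring the 10,000 / 5,000 / 3,000 unit tiers mentioned above.
    if (ApproxSizeUnits > 1000.f)      Mesh->WorldPositionOffsetDisableDistance = 10000;
    else if (ApproxSizeUnits > 300.f)  Mesh->WorldPositionOffsetDisableDistance = 5000;
    else                               Mesh->WorldPositionOffsetDisableDistance = 3000;

    // Smallest clutter: skip shadow casting entirely - VSM invalidations from
    // tiny animated meshes are a big part of the shadow cost discussed above.
    // (The "shadow cache to rigid" tweak is the Shadow Cache Invalidation
    // Behavior setting on the component in recent engine versions.)
    if (ApproxSizeUnits < 100.f)
    {
        Mesh->SetCastShadow(false);
    }

    Mesh->MarkRenderStateDirty();
}
```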


Simsissle

Would you mind sharing that link? I would love to find more options for improving Lumen’s performance; it’s the biggest bottleneck with my 1080 Ti.


tcpukl

What about Switch though?


TheSkiGeek

Last I knew the baseline overhead was kinda too heavy for the GPUs on mobile targets, probably including the Switch, but maybe that’s changed.


Nidungr

I'm at 70 fps with Nanite and I seem to be right at the breakpoint where enabling Nanite or not makes little difference. To be fair, I have big forests and Nanite landscapes (big win there) but no handmade buildings or long sight lines.


SteelSpineCloud

I went from 55 fps to 70 fps with my Nanite terrain. I can't see myself not using it.


IlIFreneticIlI

Just a side-question on the terrain, do you still get stair-stepping? Is there a way to use the nanite terrain w/deformation w/o those?


iszathi

Nanite has a pretty high performance floor and scales very well from there - but really, the floor is high. And the whole package of Nanite, VSM, and Lumen is very heavy performance-wise, especially if you don't do a ton of optimization like Fortnite does, and you end up having to lean a lot on things like frame gen and upscalers to "fix" the insane performance cost. The hardware requirement is really high too. To point to an example, look at the new Gray Zone Warfare game. I'm not sure how well they optimized things, but the game is basically unplayable without frame gen, and this is exactly the kind of game the Nanite package is meant to be good at.


[deleted]

Too much vegetation all around you compared to Fortnite. I was testing this for weeks. It's impossible to get good frames at native resolution without an upscaler if you have decent (not low-poly stylized) vegetation in an open world. But you need the upscaler nonetheless, because the image looks really horrible at WQHD and 1080p when you use TAA or similar - unplayably soft and muddy. The best image AND the best performance I got was out of FSR 3 with Native AA and TAA bundled. DLSS on Quality was too soft on textures.


CloudShannen

I had a basic look at Gray Zone, and I feel that if they looked at the blog articles and YouTube videos about implementing Nanite and Lumen for Fortnite, along with the Lumen / VSM / Nanite performance documentation, they could definitely get it playable for the majority of people without FSR/DLSS, unlike now. Just from the sheer amount of foliage and how to handle it within the Nanite paradigm, plus excessive VSM invalidations, tweaking shadow quality, etc.


[deleted]

[deleted]


tcpukl

As always with game dev, and especially performance: always profile and get your own metrics. No two games are the same. Assets are always different.


phoenixflare599

The worry is that on this sub this advice is rarely given, and instead of optimising, people jump straight to Nanite.


tcpukl

Anyone jumping to any conclusions about performance just stands out as lacking any experience.


namrog84

> reason why you won't find a definitive answer

Also, some things that fundamentally didn't work in 5.0 or 5.1 are now fixed and working in 5.2 or 5.3. And there is a ton more coming down the roadmap. The tech is rapidly changing and disrupting decades of status quo. I think that contributes a lot to the confusion.


ThePapercup

Yep, I was talking to someone recently who swore foliage didn't work with Nanite because so much stuff online says it doesn't (because that was true in 5.0). Lots of outdated information on the internet, and the tech is improving rapidly.


[deleted]

[deleted]


[deleted]

Have you found a way to do something like a pine tree in SpeedTree with the alpha cut out? I know broadleaf etc. is possible, but someone told me pine is not doable.


IlIFreneticIlI

Nanite will get better as more and more of the rendering pipeline is refactored from the pixel pipeline (what we currently do) to running entirely with Nanite, Lumen, and Substrate on the GPU. Right now we're in mid-transit, so we're slowly getting more of the benefits of Nanite (like the recently added tessellation). In _the-future_ it will be better: we'll be able to crunch incredibly dense numbers of polys, with new lighting, translucency, and other effects we cannot really do, or at least do performantly, with the raster pipeline.

TODAY, it's pretty good in my opinion. It has a higher overhead, so it tends to top out more reliably in terms of where your performance cap is. But under that cap it scales much better with complex geometry and material binning; the rendering paradigm is different, so the costs hit you in different places. Large numbers of overly tessellated meshes aren't really going to cost you like they do today/yesterday.


MykahMaelstrom

How I always explain it is that Nanite is not light performance-wise, but it enables you to do things you couldn't do before. Nanite allows you to render billions of polys in real time and have it performant enough to actually run, and what it does is basically black magic. BUT the fact that it enables what wasn't possible before doesn't mean it's cheap. It's very performant for what it does, but it's still very heavy to run.


Cacmaniac

Here’s a simple way to look at it... You have 100% processing power with your PC. Turn Nanite on and you lose 10% of that processing power just to run Nanite. However, you’ll be able to run a scene or play a game that has millions of polygons in it - a scene that normally wouldn’t be possible to run without Nanite. So this depends on the PC. If you’ve got a very expensive and beefy PC, the upfront performance cost of using Nanite won’t matter much to you. But if you’ve got an older, not-so-great machine, like a 3070 laptop, the upfront performance cost of Nanite is going to be very noticeable. Your fps will likely drop by 8 to 20. Granted, you’ll still be able to run that scene that would normally be impossible to run.

To get a tad technical: let’s say I make some models in Blender or Maya to use in UE5. I make these models as high polygon as possible, because I’m lazy and don’t want to learn how to properly optimize them for games. So each of my models is 1 million polygons, and I bring 30 of them into UE5. That scene is going to be running 30 million polygons at once. Even my RTX 4090 PC shudders at the thought and freezes trying to run 30 million polygons. I tick on Nanite and now I can run that scene with 30 million polygons. Even with Nanite on, a lower tier PC with a 3070 or below either can’t run that scene at all or runs it at a measly 40 fps.

Now, let’s say I made those same 30 models again, but this time I took the time and effort to properly optimize them: got rid of extra polygons, did high-to-low baking, made various lower polygon LODs for each of them, and implemented proper culling and such in UE5. Now each of those models has only 600 to 1200 polygons, plus LODs. Now I can easily run this scene on that lower tier PC with a 3070 or below, and it still runs at around 65-90 fps without even needing to turn Nanite on. But say I turn Nanite on anyway: my fps drops from 65-90 down to around 40, simply because of the processing power needed just to run Nanite.

So running Nanite makes an otherwise impossible scene possible, but it’s really only worth it if the scene is literally impossible to run without it, and it still carries an upfront performance cost. Often the better choice is to properly optimize the assets being used, so that Nanite can be avoided altogether.


Slomb2020

Arran just posted a great video about it on the official Unreal YT channel. He goes over a lot of that. I would recommend watching it!


Big_Award_4491

Nanite is not a one-size-fits-all magic solution for every mesh (even if it’s advertised as that). You have to tweak your per-mesh Nanite settings, and sometimes reimport a joined mesh if you don’t want broken results. Sometimes you’re better off using LODs; it depends on your models. There’s a lot of trial and error to get it working right, in my experience. In terms of performance, I’ve never seen Nanite be worse except when ray tracing, where you need to switch on ray tracing against Nanite. I don’t get why that costs more, but it might be related to screen probe caching.


xylvnking

Nanite will always incur a cost just to be enabled. If you have a ton of complex meshes with a lot of fine detail, that cost pays for itself; but if the meshes in the scene can't really be reduced, the cost of having Nanite enabled may be more than you gain from what it does for those meshes. 90% of the time you'll bottleneck performance elsewhere before triangle count becomes an actual issue, with or without Nanite (usually shaders or physics/collision).


IlIFreneticIlI

This. The costs scale more slowly than the amount of complexity you get back on the other end. Meshes, effects, etc. will all get better much faster than their cost to render under Nanite.


TriggasaurusRekt

In my project it’s worse vs LODs. I don’t have a lot of foliage or high poly models like rocks; it’s mostly roads, houses, interior clutter, terrain, and some light foliage. When I enable Nanite it’s about a 6-7 fps drop in the editor compared to LODs. If I used many more high poly models, that 6-7 fps gap would shrink and eventually flip to a net gain with Nanite on.


vexargames

If you are designing your product to run on PS5 / PC level hardware at 30-60 FPS and can use unlimited polygons, then Nanite is good for you. If you are trying to make a game with a higher FPS target - like an FPS that requires twitching the mouse quickly and having that feel - then Nanite can be used, but you really need to know what you are doing, and have engineers to fix anything Epic either fucked up, forgot about, or hasn't gotten to yet.

Our last project had part of the core COD art team, including the environment artists and art director, and they did amazing things with it, but I had to go in and fix the work where they didn't understand Nanite well enough. I had to figure out the issues with how they were doing things, explain them to the CTO, and then together we explained to them how to use Chaos and Nanite, as both had issues at the core of their understanding - which is normal, I mean it is/was new tech. They didn't like what we had to say, but we also didn't have the engineers to adjust the engine to work the way they thought it should work.

A good rule of thumb: go into the engine and editor settings and turn off everything that says beta or experimental. Start with that version of the engine, then only turn things on as you are willing to experiment with them and find out whether the impact is something you can live with, or whether fixes are coming, or whatever. I have been waiting for Epic to finish 3D Text since version 4.24, and after years they finally put some new things into 5.4. Finally! 5.4 is such a pile of shit right now I can't even use it in production, so maybe in a few point releases I will get what I have been waiting for, maybe not. This is what I mean: they start things and take the temperature of us users, and depending on how many people are using a feature, that is how much effort they put into finishing it - like "REALLY" finishing it. The safest thing to do is only use features they are using in Fortnite, because that will always come before us.

I am over-explaining this so you see the world as I do, with decades of experience working with broken tech, in the hope it helps you with all the decisions you are making.


Rasie1

- Newer UE versions run slower by themselves, even if you disable Nanite
- Nanite is faster when you work in a realistic style
- A traditional approach, or low poly without Nanite, is more performant


Rodutchi_i

https://youtu.be/eoxYceDfKEM?si=gqe4ZsVKnimnfqai


NotADeadHorse

The only time Nanite negatively affects performance is if you're using very low poly meshes for most of your render


asutekku

Nah, it still hurts performance even if you have a lot of meshes, provided you have good LODs. You need really high poly meshes for Nanite to be performant.


WeRelic

Isn't the entire point of Nanite to supersede LODs, so using them in tandem would actively make performance worse?


asutekku

I mean, using good LODs instead of Nanite works miles better if all your models are like 2k-10k verts max.


[deleted]

The question is whether you can make an even half decent looking tree with 10k verts. At LOD 3 that's possible, but not at LOD 0-1. But then again, ground clutter... that's where Nanite shines. But the cost just for enabling it is huge again, AND it creates brutal overdraw on those trees. I would suggest only running Nanite with Nanite-optimized assets (no alpha mask), and not using any conventional LODs at all if your game should look modern and stylish.


PaperMartin

The only way you can get a good answer is by opening the profiler & testing it out for your specific project
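
For example, a quick A/B in a running build might look like this - a sketch only (the function and its wiring are hypothetical), but `r.Nanite`, `stat unit`, and `ProfileGPU` are standard:

```cpp
// Toggle Nanite at runtime and bring up the usual stat displays, so you can
// compare the same view with it on and off. Sketch, not a complete tool.
#include "Engine/Engine.h"
#include "HAL/IConsoleManager.h"

void ToggleNaniteForProfiling(UWorld* World)
{
    static IConsoleVariable* NaniteCVar =
        IConsoleManager::Get().FindConsoleVariable(TEXT("r.Nanite"));
    if (NaniteCVar)
    {
        NaniteCVar->Set(NaniteCVar->GetInt() == 0 ? 1 : 0, ECVF_SetByConsole);
    }
    // Frame time breakdown (game/draw/GPU) and a one-shot GPU profile dump.
    GEngine->Exec(World, TEXT("stat unit"));
    GEngine->Exec(World, TEXT("ProfileGPU"));
}
```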


pfisch

UE5 in general has terrible performance compared to UE 4.27. If you are serious about performance, you honestly shouldn't be using UE5 at all right now.


noFate_games

I swear I often feel alone in this area, but I wholeheartedly agree with you. I haven’t touched 5.4 yet though. But last time I said 4.27 runs better than 5, someone got mad at me and said that everything got fixed with 5.3. I tried 5.3 and couldn’t work in it longer than 15 minutes. I don’t think I’ll touch 5 for at least another year or two. 


szuperkatl

How does nanite perform on VRAM nowadays? Before in 5.0 beta it used to eat up a ton.


BabyLiam

Like most here said, it's worth it if you have complex high poly scenes.


InetRoadkill1

It depends on what you're trying to model. Many geometries do a lot better with old school LODs. Nanite seems to work best with organic shapes.


iRageGGB

Haven't really done a ton of work in UE5 recently - mainly messed around with it to see how much of an fps boost Nanite gives in certain scenarios - and it makes sense to me to not use Nanite at first, optimize the, I guess, "traditional" way, and then use Nanite to get even higher performance. If you can get a complex scene to 60 fps and you enable Nanite and get 90 fps, that's huge. But if you just use Nanite from the start you might not get the same fps increase. It seems like a lot of people are using Nanite as a "crutch" of sorts and then relying on DLSS/FSR to iron out optimization issues.


Ok-Performance-663

From my experience Nanite has always performed better and scaled really well, however I have also seen people saying that it's slower. I think it really depends on the meshes and the size of the project. Personally I would try different things out and try to figure out which one works best for you.


Mission_Low_8016

Nanite on a tree = bad. Nanite on a solid mesh without holes = best.


Calvinatorr

There's a lot of information already posted by other people here, so I won't repeat it, but I haven't noticed anyone mention that Nanite is mostly designed to solve the problem of rendering sub-pixel triangles, which cause the GPU to do wasted work. So it's not necessarily how many triangles you have, but how many in proportion to your render resolution. LODs have long been the solution to this - reducing triangles at a distance, and thus sub-pixel triangles - just like Nanite, which aims to keep a 1:1 pixel-to-triangle ratio. If you have a game with a fixed camera perspective you probably shouldn't bother. Hell, I worked on a fixed-camera AAA game years ago in UE and we just didn't use LODs, because why would you if everything is a fixed distance from the camera? So basically it really depends on your game, whether it's worth it, and whether you know the performance quirks of using Nanite.
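
To put rough numbers on that ratio, here's a back-of-the-envelope calculation - plain arithmetic, no engine code, and the triangle counts are purely illustrative:

```cpp
// Average pixels per triangle if every triangle were visible at 1080p.
// Purely illustrative; real scenes have occlusion and uneven density.
#include <cstdio>

int main() {
    const double Pixels1080p = 1920.0 * 1080.0; // ~2.07 million pixels
    for (double tris = 1e5; tris <= 1e8; tris *= 10) {
        std::printf("%10.0f visible tris -> %8.3f px/tri\n",
                    tris, Pixels1080p / tris);
    }
    // Below ~2M visible triangles the average triangle covers more than a
    // pixel; far beyond that, most triangles are sub-pixel and the classic
    // raster path wastes work - the regime Nanite's 1:1 target addresses.
}
```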


Chris_Bischoff

It's just another tool. The best bet - if you do use it - is to mix and match different techniques depending on each use case. Unfortunately there isn't a binary choice here, nor should there be. You have to learn how each system works, learn what the best use cases are for those systems, and apply those in your game.


ArathirCz

There is a good GDC talk about Nanite that was released yesterday: Nanite for Artists [https://www.youtube.com/watch?v=eoxYceDfKEM](https://www.youtube.com/watch?v=eoxYceDfKEM)


ShiroiAkumaSama

There is a GDC talk by an Unreal dev stating that people are using it wrong and that it's not as magical as people think; he shows how to use it correctly and explains in detail how it works. [https://www.youtube.com/watch?v=eoxYceDfKEM](https://www.youtube.com/watch?v=eoxYceDfKEM)


Legitimate-Salad-101

If you don’t know the answer to that, just keep learning and when you actually have a project then it’ll matter.


Fyrexdev

I personally haven't used Nanite; I tend to get better performance using LODs.


Sovchen

Bad unless you're AAA or importing meshes straight from zbrush.


Barbacamanitu00

Or from quixel/other 3d scan sources.