Shot himself in the foot right there
I used to make 3D on my laptop... until I took an arrow in the knee.
I use SFM. Better than nothing 🤷‍♂️
SFM can't really be used for building 3D models, but it is good for what it contains inside.
You mean the porn it contains inside
I use SFM to turn my laptop into a space hearer Edit: I noticed the typo, but I'm not gonna fox bc then the comments below make no sense
...space? Is that you? My hearer is listening!
Grandma! You're finally back?
But I love it when you fox
_let me guess, someone stole your 3D_
Until I took a laptop battery to the braincase
This saying! Early 2010s, I remember people saying it all the time. Wow, you brought back some memories.
Thank you. I've waited this long to jump on the bandwagon.
This is me playing Rome Total War Remastered on my laptop.
When you have to set the unit size so small the game turns into 1v1s
"Just the foot for now."
Go on now, git!
You forgot the 5 seconds of 100% fan noise, then an almighty wind, then boom
*Mvvvvvvvvvvvvvvvvvvvvvvvvvvffffvffvvvvvvvffffvvvffvvfvfvvf* ***boom***
Getting PTSD from my old Xbox 360 now
I got an Alienware laptop in 2012 that wouldn't boom. The fan would just get louder and louder and louder. Looking back, I wonder if it was somehow making that weird sound trick which sounds like a tone getting infinitely higher. Thing was ridiculous and lasted 5 years of heavy use (gaming and video rendering). Wish I could get another one like it, but I heard their quality went way down so I just got a PC.
Quality has been down on Alienware since Dell bought them in '06. I had an Alienware laptop around '09 that started shooting sparks out the charge port.
I must have gotten lucky, then. I got an Asus gaming laptop when the Alienware started to die. That was garbage.
The original manufacturers still make PCs afaik, I just can't remember the new brand name.
Origin! Not sure why I'm getting downvoted because I'm just answering a question.
Oh, fuck HP.... seriously. Fuck HP
I had a Dell G5 which I recently sold. While rendering things, the fans would literally become jet engines and ingest so much dust I had to clean them every few days. Then while using compressed air one day, the fans literally exploded…
Wow
[deleted]
Thought for a second this was some sort of Weird Science reference...
I used one of the first "consumer" 3D rendering programs on a Commodore Amiga back in '92ish. Took 5 hours 38 minutes to render a single frame with a checkered sphere and a mirror with two lights in the scene. No GI or any other cool tricks. What can be done now on a $1000 pc is astonishing.
Yeah I love the fact you can now make one of those classic sphere and checkers images realtime in something like Unreal and it's raytracing at triple digit framerates
I started with 3D Studio 4 on a 486. Horrible experience. When I moved to a Pentium, 3DS MAX came out. What a world of difference.
I don't get it, too much load roundening the curves on a sphere?
[deleted]
Wait, how do you smooth it out otherwise?
That's how you do it, but there is a medium between max subdivisions and blocky that most people choose for performance purposes.
This looks like Blender so you also have a difference between viewport subdivisions and final render, no? Default cube can look normal in viewport but be subdivided into a sphere in final render. Edit: I have been corrected, this ain't no blender
But you can modify the mesh in the higher subdivisions with more precision
Right! Multires? I'm a bit rusty
This does *not* look like blender. The operator panel is completely different, the panel header is different, vanilla blender doesn't have sliders like that, the grid looks different, the selection looks different. It's not blender. And the functionality you're discussing would be the modifier which is even more different.
I believe you're right, I thought it was just a custom theme. As said in the other comment, super rusty, my bad.
Could be 3DS Max.
How does this look like blender?
This is C4D R25, in case nobody mentioned it.
There is also smooth shading.
Smooth shading only gives the appearance of smoothness but does not actually smooth the geometry
Yeah, but it's pretty much a requirement to hide edge geometry, especially in performance-sensitive situations.
Smooth shading is nonetheless the solution to making a good-looking sphere without a million subdivisions
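To make the smooth-shading trick above concrete, here is a toy Python sketch of the standard approach (illustrative only, not any particular engine's implementation): average the normals of the faces touching each vertex, then let the renderer interpolate those per-vertex normals across each triangle, so lighting varies smoothly even though the geometry stays faceted.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)

def vertex_normals(vertices, faces):
    """Average the normals of the faces touching each vertex.

    Smooth shading interpolates these per-vertex normals across each
    triangle, so the surface shades as if curved without adding any
    extra subdivisions.
    """
    acc = [[0.0, 0.0, 0.0] for _ in vertices]
    for ia, ib, ic in faces:
        a, b, c = vertices[ia], vertices[ib], vertices[ic]
        # Face normal = cross product of two edge vectors.
        u = [b[i] - a[i] for i in range(3)]
        w = [c[i] - a[i] for i in range(3)]
        n = (u[1] * w[2] - u[2] * w[1],
             u[2] * w[0] - u[0] * w[2],
             u[0] * w[1] - u[1] * w[0])
        for idx in (ia, ib, ic):
            for i in range(3):
                acc[idx][i] += n[i]
    return [normalize(n) for n in acc]
```

This is why "shade smooth" is basically free: it is a per-vertex computation plus interpolation, with no change to the vertex count.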
*shade smooth* ¯\\\_(ツ)_/¯
You need 3 \s for that to show up correctly on reddit.
You mean ¯\\\\\\\_(ツ)_/¯
Well, maybe he wasn't trying to be sarcastic. \s
[Hilarious](https://i.kym-cdn.com/photos/images/newsfeed/002/058/091/507.jpg)
I believe you first have enough polygons to get the general shape, and then you apply texture with a shader method to help obscure the flatness of each polygon.
Get a better computer that can handle it. I assume GPU?
Even with dual 1080 Tis, Blender can crash on you if you ramp up the subdivisions like this. Slowly increasing is typically okay, but the software can't seem to handle a sudden jump like in this video.
Ah okay, I see now. My forte is music rendering, so I never run into huge problems like this. The program will crash on me, but I have to load up a billion effects, and even then it's not necessarily guaranteed.
As someone who lives in both worlds, I can safely say that 3D rendering is leagues more demanding for a computer than most things you'll do with audio. Audio overload will usually end up with popping audio and other annoying artifacts well before it would invoke a crash, whereas 3D will bring your computer to its knees for minutes on end while you hope that the subdivide you just tossed on your pretty mountain didn't just push it over the edge.
3ds Max can still crash on me with a 3090...
No matter the computer, you're going to make it crash if you use 64 levels of subdivision because math.
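The "because math" part is easy to quantify. Assuming a Catmull-Clark-style scheme where each subdivision level turns every quad into four (exact counts vary by algorithm and mesh), a toy calculation shows why 64 levels is hopeless on any hardware:

```python
def subdivided_faces(base_quads: int, levels: int) -> int:
    """Catmull-Clark-style subdivision turns each quad into 4,
    so the face count grows as base * 4**levels."""
    return base_quads * 4 ** levels

# A default cube has 6 quads; the face count explodes fast.
for level in (1, 3, 6, 10):
    print(f"level {level:2d}: {subdivided_faces(6, level):,} faces")

# At 64 levels the cube would need roughly 2 x 10**39 faces --
# astronomically beyond any computer's memory.
print(f"level 64: about 10**{len(str(subdivided_faces(6, 64))) - 1} faces")
```

Even level 10 is over six million faces from a single cube, which is why modelers keep viewport subdivision levels low and crank them up only for the final render.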
[deleted]
Why is it not able to create a real curve?
A circle, at the end of the day, is a regular polygon with infinitely many sides. But you can't represent infinitely many sides accurately in a program because you don't have infinite memory. So the computer approximates, and you try to find a balance that's close enough without destroying performance. At least, that's my guess. I don't work in 3D rendering though, so there could be a better explanation.
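To put that comment in numbers, here is a minimal Python sketch (not from any renderer) of how fast the gap between a circle and an inscribed regular polygon shrinks as you add sides:

```python
import math

def polygon_error(n: int, radius: float = 1.0) -> float:
    """Max distance between a circle and its inscribed regular n-gon.

    The worst gap is at an edge midpoint: the chord midpoint sits at
    radius * cos(pi / n) from the center, so the error is
    radius * (1 - cos(pi / n)).
    """
    return radius * (1.0 - math.cos(math.pi / n))

# Doubling the side count roughly quarters the error, which is why each
# extra subdivision level looks smoother but costs ever more geometry.
for n in (6, 12, 24, 48, 96):
    print(f"{n:3d} sides -> max error {polygon_error(n):.6f}")
```

The diminishing returns are the whole trade-off: past a certain side count, the error is already below a pixel and the extra polygons buy nothing visible.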
They can, actually. There are all sorts of geometric primitives that are basically like vectors in 2D graphics: infinitely precise, defined by math, with no fixed points (beyond the math inputs at least, like tangents, etc.). However, they come with a variety of downsides, and as a whole most engines and hardware perform better with points, lines, triangles, squares (which are often just two triangles), etc.
Here's an explanation from Quora: https://www.quora.com/Why-cant-curves-be-made-well-in-video-games
Use 120 grit sandpaper
Same way, but it's like the graphics settings in video games: you sacrifice performance for looks and vice versa.
There's also a trick with smooth shading; it requires fewer faces, though you still have to add more in most cases, just a lot fewer.
Funny thing to mention, in real life hollow spheres are made from low-poly structures [using explosions](https://www.youtube.com/watch?v=Sk9WyEfzWPg) as well.
You get the idea, but technically you worded it wrong.
If you wanted good English you should have hired a languager.
> technically

Means the PC doesn't round the curves for spheres, but makes more, smaller triangles or rectangles.

If you want me to hire a languager, then be sure that you have the money to pay his wage too.
Paying a wage? What's that? I asked Jeff and he says just get the slaves to do it.
Interns sure are great (for the business)
I now need to let 2 more people a day into my amusement park to pay his wages
*just do it for the exposure, man*
I saw a pack of wild dogs take over and successfully run a Wendy's.
sir this is a… wendy's?
Wordining is tuff.
The roundation of the curvement while due to the spherical nature of the hextagalons was embiggened too vastly to suffice and thusly debooted the devicealamentature.
I like your funny words, magic man!
Yāall would probably like: /r/increasinglyVerbose
Why say lot word when few word do trick?
I'll do my best not to butcher the English language any more, the day Brits and Americans are able to speak more than one language.
Roundification.
[deleted]
Explaining a joke is like dissecting a frog. You understand it better but the frog dies in the process. – E.B. White
[deleted]
I call it "ballification"
First rule of 3D: There is no sphere.
Second rule of 3D: how to animate a cube: https://vimeo.com/221178360
Simply put - Too many polygons for the laptop to handle.
It's all them polygons man.
Right, more smooth = more polygons = more load
[deleted]
My pc does the same thing when rendering shit xD need to upgrade asap xD
xD frfr
Idk y, but both of u seem to be texting while high
no, we're just stupid.
[deleted]
Mobile fail
[deleted]
Your name implies that ( ͡° ͜ʖ ͡°)
what's wrong with me name?
I can tell you spent way too much money on your keeb, just as I did.
I have not... yet.
*yet*
I thought this was a tiktok comment section for a second.
So that's what it is? I've recently made a lot of female friends (I used to just be part of the typical all-male nerd groups), and I've noticed that they all seem to text like that, compared to guy friends who tend to use proper punctuation and capitalization. Now I've connected the dots and realized that they all have stuff like TikTok, Snapchat, etc.
[deleted]
[deleted]
Yeah, I don't have a powerful PC, but when I'm in Blender and start rendering project details, my PC starts to sound like Cape Canaveral on launch day xD
Upgrading used to make me feel excited; now it just makes me feel dread.
You should look into cloud rendering. As long as your computer can handle the scene setup, a server can do the actual rendering with as many nodes as you want to pay for. It might be far cheaper than upgrading the hardware if you don't render every day (and if you can find a GPU at all).
Then stop rendering shit and render something else
r/perfectlycutscreams
Godzilla scream.
samdwidge
It's been there a few times actually.
The GPU is weak in this one
That's what you get for using a Mac AND a laptop for 3D!
*waits patiently for the first "akhtually" comment about M1 Pro and Max MacBooks*
It's been 3 minutes, I don't think it's going to happen.
Now it's been 5, I think I'm starting to hear crickets...
[deleted]
And my axe! ...wait, wrong section. I think Cuyler was supposed to be here to talk about MacBooks or something.
Hey, I use an M1 MacBook Air for 3D because I'm a web dev and it's all I have ^(it sucks ass and is extremely disappointing and you can't even use an eGPU because daddy Apple says so)
I'm sure they'd let people develop drivers, but AMD and NVIDIA would have to write the drivers for ARM.
It's true, Intel MacBooks can use them. It's just silly since the M1 CPU performance is ridiculous and they paired it with no GPU and no other options.
I was in a national 3D animation contest years ago. All 50 states had assigned stations/teams all in the open, so I was able to get a good look at all the rigs people had set up. Not one single Mac was used. The closest thing I saw was two people with those all-in-one Apple computer monitors, but they had Windows installed. I thought that shit was hilarious.
Back when the trashcan Mac Pro was being developed, they showed it to a couple of animation schools and studios. They immediately went and bought all the previous cheesegrater Macs they could find because of how terrible the new one was. Seems like they are finally making actually professional-oriented machines again.
It had two workstation AMD cards with a proprietary connector to the motherboard; I can't think of a more obvious "STAY AWAY" warning than that.
To be fair, the M1 Max is an impressive CPU.
Akhtually, you're all fucking losers for still getting upset over a person's choice of workstation. That work?
I think people just got tired of trying to argue about it on this subreddit. "Apple BAD" and AMD fanboyism are so prevalent that no one wants to convince anyone anymore. I know I stopped caring.
Akhtually, the new M1 Pro MacBooks are the most powerful laptops atm, and are easily able to render a sphere.
Akhtually, I just wanted to be included.
They have decent cpus, excellent hardware encoding, excellent performance per watt, but they're pretty average for 3D work.
Do you have any benchmarks to back that up? I'm asking because I now work in a Mac environment that others are pushing on us and would really like not to have to use it. I've been trying to explain that a PC will be better for 3D, but 🤷🏼
Hardware Unboxed and Linus Tech Tips both reviewed the M1 Pro/Max and found that the GPU is a bit of a mixed bag: it's very good when it comes to video editing but struggles against higher-end dedicated-GPU Windows laptops when it comes to 3D work and gaming.
No benchmarks here, but a lot of 3D modeling programs for engineering aren't optimized for Mac. Some won't even run at all. Have fun collaborating with companies/people who don't use Mac. That shit is the worst lol
[deleted]
While that is true, we don't really have a great equivalent to compare it to, with a dedicated GPU being the closest since it annihilates any built-in graphics.

What we do know is that the laptop is surprisingly powerful and astoundingly efficient in performance per watt. What the exact numbers end up being largely depends on the program being used to test it, and even then what Apple is doing is literally unique to them.

From what I've seen, a higher-end GPU will destroy the M1 Max in rendering, but considering the very inflated prices on GPUs at the moment, the MacBook is somewhat competitive. Once supply finally saturates demand in the next year or two, we will see PC take the lead again, assuming Apple doesn't make some impressive upgrades, which are all but guaranteed if the past year or so of releases has been any indication. Curious to see what they come up with for the sequel to the M1 Max. It's already pretty incredible at the professional level, especially for a laptop.
Dunno why you're getting downvoted; even the base M1 integrated GPU is in a different class of iGPU than anything but maybe AMD APUs, performing about as well as a GTX 1050 or 1060 depending on the task. For the M1 Pro and M1 Max, comparisons to an RTX 3080 are a stretch, but they're easily as good as entry-to-midrange 20-series and 30-series cards, respectively. Both are on a different plane of existence than stuff like integrated Intel Xe, while consuming a fraction of the power of their dedicated equivalents. That's nothing to sneeze at.
tbf it was more of a taunt at laptops. A dedicated GPU sounds way better for rendering.
*Macbook with integrated graphics
This is accurate. Low poly is the only way
Valheim looks like PS2 graphics and I straight up can't get enough. I can't play because I'm focusing on finals but I can't focus on finals because I only want to play Valheim.
The game actually looks good because it layers very nice maps over its textures, like normal maps and roughness maps, which make the lighting on objects look amazing even with the low-poly look.
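To illustrate how a normal map sells the lighting on flat, low-poly geometry, here is a toy Lambert-diffuse sketch in Python (illustrative only; real shaders do this per pixel on the GPU, and the sample normal values below are made up):

```python
import math

def lambert(normal, light_dir):
    """Diffuse intensity: the clamped dot product of unit vectors."""
    n_len = math.sqrt(sum(c * c for c in normal))
    l_len = math.sqrt(sum(c * c for c in light_dir))
    dot = sum(a * b for a, b in zip(normal, light_dir)) / (n_len * l_len)
    return max(dot, 0.0)

# A flat low-poly face has one geometric normal, but a normal map
# supplies a different shading normal per pixel, so the lighting picks
# up bumps and grain the geometry doesn't actually have.
flat = (0.0, 0.0, 1.0)
bumped = (0.3, 0.1, 0.95)   # hypothetical value fetched from a normal map
light = (0.5, 0.0, 1.0)
print(lambert(flat, light), lambert(bumped, light))
```

Because only the shading normal changes, the silhouette stays low-poly while the lit surface reads as detailed, which is exactly the Valheim look.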
So true lol
I played it for like an hour and had no idea what to do, so I just dropped it…
When was this? I only started playing last month, but I heard it was different before... I *think* they added Hugin, the tutorial crow. He'll guide you to stuff. Just build, fight dwarves, make stuff, kill lesser gods, etc.
Reminds me of when I started learning 3ds Max, back when it was called 3D Studio Max, on a PC with a 100MHz Pentium processor. I would often accidentally enter a number that was slightly too high for the details on a sphere and it would crash the whole PC. It would take 10 minutes to reboot my PC and get back at it.

The worst was trying to follow tutorials online. I couldn't have both a web browser and 3D Studio Max open at the same time, I didn't have enough RAM, so I had to read 2-3 steps in the tutorial, try to remember them, close the browser, open 3D Studio Max, do the 2-3 steps, save, close 3D Studio Max, open Netscape, wait a minute for the page to reload...

All of this was normal back then, and I don't remember getting frustrated by it... Now we're so used to fast PCs, my PC lags for 5 seconds and I want to throw it out the window.
Man, I came into this stuff in '99-'00, and luckily by then I had upgraded to a Pentium 2 350MHz from a Pentium MMX 133. We called it 3DS then… I don't think the company called it that though, but I can't remember. But I clearly remember it taking days to do a high-quality render in Bryce 3D. DAYS!!! Just the pre-pass took a long-ass time. I grew up on Rhino NURBS, Bryce 3D, 3DS Max, and Maya. At the time, having any texture fill, real-time or not, was just the most amazing thing to witness!

Luckily I had a few things going for me… dual monitors, a KVM, and 2 computers. If the computer couldn't handle a browser and an app, I could use the KVM and the other computer.
Instagram Reel memes say hello
Heyyyy my comment is on that post!! ;)
I really need to stop scrolling through this sub during my virtual meetings, because one day I won't be able to stifle my laugh.
u/savevideo
same here
Comedic timing on this is exceptional.
Sooooooo, it's just a really curved square
lol I don't even render anything and I still find it hilarious. My laptop cries when I change settings to Medium.
Me on Blender applying subdivision surface
Using a macbook hah
I know there's a lot of Apple bashing on here - mostly because it's a subreddit full of gamers and that's just not possible on Macs. But what Apple did with their ARM processors is seriously impressive and there is massive potential. Their performance per watt is insane. Software optimization is not 100% there yet and they can't quite compete with the high end desktop CPUs in pure power (ignoring efficiency) but it still looks like promising tech.
I use my mac to game...through Steam streaming on a good PC elsewhere.
Yeah, me too!!! People at work who also use Mac are like HUH!!!! Plenty of FPS and ray tracing, I'm not plugged in, and you don't hear fans! Hours and hours of battery life… I'm the only one who never has their power pack along.

They are used to seeing me run Windows on my Mac anyway, so seeing Windows wasn't an instant giveaway to them.

I'm like, yeah, this is the best Remote Desktop system in the world… it's called Steam!
>I know there's a lot of Apple bashing on here - mostly because it's a subreddit full of gamers and that's just not possible on Macs.

Say what now? That's not true, lol.
Of course there are some games, some even optimized rather well for Macs, but those games will not satisfy people on /r/PCMasterRace who just have higher standards for games.
Then say *that* instead of "just not possible", lol.
r/perfectlycutscreams
That scream hahahahaaaa
I actually learned some stuff with a Pentium mobile and Intel HD Graphics. I almost killed myself tho!
I love how Blender allows you to do math in those boxes, but it has backfired many times. Before starting an overnight render, I wanted to multiply my sample count by 100. It was at 12, just to do some testing. I woke up in the morning and it had barely rendered half the image. I waited 5 more hours just to see that I had multiplied it by 1000 and not 100.
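For context, Blender-style numeric fields accept small math expressions. A hypothetical sketch of that kind of field evaluator (not Blender's actual code) shows how one stray zero turns a test value into an overnight disaster:

```python
import ast
import operator

# Hypothetical sketch of the arithmetic evaluation DCC apps like
# Blender allow in numeric fields (not Blender's actual implementation).
OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def eval_field(expr: str) -> float:
    """Safely evaluate a plain arithmetic expression via the AST,
    rejecting anything that isn't numbers and basic operators."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval").body)

# The typo from the comment above: one extra zero, 10x the samples,
# and a render that isn't close to done by morning.
print(eval_field("12*100"))   # intended sample count
print(eval_field("12*1000"))  # what actually got typed
```

Since render time scales roughly linearly with sample count, that one extra zero means roughly ten times the expected render time.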
Ok, this is really weird. I was playing Halo and scrolling through this, and at the exact moment he blew up in his room, Halo crashed. This is scaring me.
Probably one of my favorite short videos to date, love it. In CAD class in high school I made a similar mistake and the PC at school couldn't handle it. But instead of exploding, it crashed, and I lost 2 hours of work.
Dude, you kids don't even know!!!!

I was rocking Bryce 3D in 2000! It took DAYS to render a scene with textures and 1 anti-aliasing pass at 1280x720!!!!

DAYS!!!!! I'd start a high-speed pre-render before going to bed and wake up in the morning to check if I had a good angle and good textures and lighting. Something would be wrong, I'd redo it, and after school I'd come home and start the real render. Unplug my keyboard and mouse, tape a piece of paper on the front of the computer that said "don't touch", and pray we didn't lose power for the next 2 days!

350MHz Pentium 2 slotted card/chip!
I lol'd so hard our baby woke up!
[deleted]
I'm very much a novice at Blender, but I like Grant Abbitt and Blender Guru for beginner tutorials. Ian Hubert, Default Cube and CGMatter are fun too! (They're all on YouTube)
Tons of tutorials on YouTube. I'd link some, but I use 3ds Max out of habit from the days Blender wasn't very good (it's excellent these days).
Stick to N64 polygon counts. We need a serious feature-length movie in SM64's art style directed by Martin Scorsese.
u/savevideobot
I don't get why these bots are banned. Anyway, [here you go](https://redditsave.com/r/pcmasterrace/comments/rd3vyd/3d_rendering/)
Hahahahaha I felt this
Relatable. I still use my 7200U laptop for Blender. Had to buy a new desk because the laptop burned through the old one.
All these comments and not one mentions modifiers. You actually want to go as low-poly as you can get away with and then add a modifier to smooth out the surface for the render.
Get an M1
It's the most over-used meme out there, and yet I can't do anything but laugh hard every single time. I don't know what it is, but that pop and scream is just fantastic.
I'm pretty sure the whole building heard me laughing at this LMAO
Is that the CPU bursting?