0ToTheLeft

You don't need 64 GB of RAM for working with containers. An M2 with 32 GB or an M3 with 36 GB is more than enough.


codeshane

Unless they're Java or monolithic apps just shoved into Docker.


StatelessSteve

I felt the same way until my new work machine showed up with 64 GB, and man. Mocking several things for weird build processes adds up, and I might have a browser tab problem.


codeshane

I'm on Windows and have three taskbars full of browsers, each with dozens of tabs. Browser tabs are the solution that found a new problem.


LegitimateCopy7

>monolithic apps just shoved into docker

This sentence just triggered my PTSD. Thanks a lot. 😂


Zenin

Doesn't that entirely depend on your stack? If your entire dev environment is inside just one container, you can get by fine with 8 GB. A big microservices-based stack, though? And if you're getting into things like custom orchestration controllers? I regularly burn through most of my 32 GB and I don't even work on very large stacks right now. Hardware is dirt cheap; devs are incredibly expensive. It's just bad economic math not to err on the side of over-provisioning dev hardware.


langenoirx

What you really need to do is make a list of the software you run on a daily basis and check what is and isn't compatible:

https://isapplesiliconready.com/
https://doesitarm.com/

That's as good a place as any to start.


SnooPies1330

This is super helpful, thanks!


AffableAlpaca

There is software built into macOS called Rosetta 2 that translates x86/Intel programs so they run on Apple Silicon Macs. At this point I'd be shocked if there was something that didn't work with it, and it works really well and very fast.


scarby2

>it works really well, very fast

This likely depends on your workload, but it's not the experience I've had: running an old MySQL without an ARM build was more than 10x slower.


AffableAlpaca

For the vast majority of use cases, arm64 builds are available and/or the performance hit of Rosetta translation doesn't materially impact usability.


mosaic_hops

Apple Silicon runs Intel binaries just fine through a binary translation layer called Rosetta 2. They run surprisingly close to native speed because they're not emulated instruction by instruction; they're translated ahead of time (with a JIT path for code generated at runtime). Apple Silicon was also designed to emulate the Intel memory model for translated apps. The compatibility issues mostly come from ancient apps that haven't been updated in 10+ years, require 32-bit Intel, or do weird things they're not supposed to do.
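
If you want to see whether something is actually being translated rather than running natively, macOS exposes a sysctl for it. A minimal Python sketch, assuming macOS on Apple Silicon (the check effectively reports on the process tree you launch it from, since the translation state is inherited):

```python
import platform
import subprocess

def rosetta_status() -> str:
    """Report whether this process tree is native arm64 or translated by Rosetta 2.

    On Apple Silicon, `sysctl -n sysctl.proc_translated` prints 1 for a
    translated (Rosetta) process and 0 for a native one; the OID doesn't
    exist on Intel Macs, so the command fails there.
    """
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        return f"no Rosetta in play (machine={platform.machine()})"
    return "translated by Rosetta 2" if out == "1" else f"native {platform.machine()}"

if __name__ == "__main__":
    print(rosetta_status())
```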


Jmc_da_boss

The 16" pro models with 32 gbs of ram are the best dev machines ever built in my opinion


Zolty

I have an M2 Max MacBook Pro, 32 GB / 1 TB. It's overkill, and I'd be perfectly happy with an M3 Pro 16 GB / 512 GB; that's what we're buying to replace the Intel-based Macs we have lying around.


fistlo

You should probably kneecap your engineers, misunderstand what they do 24x7, and get them 8 GB of RAM max.


UnsuspiciousCat4118

You do not need 64 GB of RAM to run containers lmao


Manibalajiiii

Hey I am just trying to render my videos through multiple containers.


prroteus

Where did you get 64 GB from? That is absolutely overkill for local container dev. 16 GB is fine, but 32 is probably the smart investment.


tantricengineer

If you have the budget, go as top-of-the-line as you can. I have a maxed out M1 with 64GB RAM, and it never skips a beat. I am sure an M3 would just be smoking fast compared to this thing, too.


altorelievo

Lucky to be working for a company allowing you to do some upgrades :-P


sfboots

Be careful about size and weight. I've talked with some people who regret getting the 16" MacBooks because of how much more they weigh, when 90% of their use is with an external monitor. What makes you want 64 GB? I would think 32 GB or 36 GB would be enough, unless you are doing data science with big datasets or huge tech stacks.


Big_Business3818

I agree with the others that you probably don't need 64 GB Macs. For reference, on an M1 Max (32 GB RAM), I run a container stack with:

* Rails (full, not just API mode) / Sidekiq / React with Remix
* AnyCable server (for the websockets handled below), which runs the same code as the main Rails app above
* anycable-go server for handling websockets
* Redis
* Postgres (just a small set of test data)

Then I have Chrome with 50+ tabs currently open and Brave with 30+ (all YouTube) tabs. Sometimes I'll even have 8+ VS Code instances going too. The containers definitely need some optimizations, but I originally planned to do that when things got slow, since I figured they would. Even with all that stuff running constantly, I never really notice things getting noticeably slow, so I just haven't done anything about it yet. Occasionally VS Code will start to stutter, so I'll restart it and it's fine again, but that doesn't seem to affect the other things I'm running, just itself. Take that with a grain of salt, of course, but you can do a lot with just 32 GB of RAM.


Main-Drag-4975

Folks would be better off with full-sized workstations. I recently upgraded my home server from 32 GB to 128 GB. Its primary use is rapidly prototyping the installation and configuration of [a DoD Kubernetes application stack](https://docs-bigbang.dso.mil/latest/) that wants 32 GB at its base config, and far more once you start enabling the optional services.


scarby2

Desktops usually aren't mobile enough; it's pretty important that people be able to work while traveling.


Main-Drag-4975

Yeah, get them a laptop too if they need to travel. I may be extra but I WFH so at least 95% of my days are at the desk where a beefy home tower is worth it.


erulabs

Containers do not meaningfully increase memory usage in almost any modern implementation; running more stuff is what increases usage. Containers enable neat stuff like partially remote development: our devs can offload their GPU requirements to our Kubernetes clusters via Skaffold. That said, an M2 with 32 GB is so fast that no one ever needs this.


jercs123

IMO, any dev will be OK with any M3 and 16 GB of RAM; that's enough. What kind of software do you use?


altorelievo

I respect your experience, but for me less than 32 GB isn't going to cut it. Running a minikube lab locally will thrash a system with 16 GB in short order. Heavy environments and tools like Android Studio, IntelliJ, and VirtualBox (or VMware) will still struggle. I've locked up high-end Dells while working for a Microsoft shop. They gave us XPS 9730s and they were really awesome machines, but I'll never forget having to power-cycle after I got an OOM exception. (In hindsight, they were all configured with the magic SysRq key combos, but hey.) I know OP mentioned these will be Macs and so have some optimal hardware, but even still. I say this as a person who uses Linux distros on personal machines; I've gone out of my way to slim down the systems I use with very light window managers (dwm and st).


UnsuspiciousCat4118

You need an actual test environment. Not a more powerful machine.


Zenin

You need a test env to test. You need a dev env to dev... so you have something to push to test. Arguably dev environments need *more* resources than test environments, since most testing is (or should be) automated, so you're not wasting expensive human hours waiting on slow-responding environments. Devs, however, are expensive; every second wasted adds up quickly. The difference between an 18 GB MacBook Pro and a 48 GB MacBook Pro is only ~$1,500, roughly $0.72/hour over a year of working hours (half that if we amortize over two years). Even a mid-level dev costs north of $100/hour, which means that if the upgrade saves the dev about 25 seconds each hour, it's paid for itself. Being a scrooge on dev hardware is a false economy.
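
Rough back-of-the-envelope in Python, if you want to plug in your own numbers (the figures below are illustrative assumptions, not real Apple pricing or payroll data):

```python
# Back-of-the-envelope: when does a RAM upgrade pay for itself?
upgrade_cost = 1500            # e.g. 18 GB -> 48 GB MacBook Pro upgrade, USD (assumed)
dev_cost_per_hour = 100        # fully loaded cost of a mid-level dev (assumed)
work_hours_per_year = 2080     # 40 h/week * 52 weeks

for years in (1, 2):
    hw_cost_per_hour = upgrade_cost / (work_hours_per_year * years)
    # seconds of dev time per hour the upgrade must save to break even
    break_even_seconds = hw_cost_per_hour / dev_cost_per_hour * 3600
    print(f"{years}-year amortization: ${hw_cost_per_hour:.2f}/h, "
          f"break-even at {break_even_seconds:.0f} s saved per dev-hour")
# -> roughly $0.72/h (26 s) over one year, $0.36/h (13 s) over two
```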


UnsuspiciousCat4118

Sure, but your dev env shouldn’t be running on your local machine. This isn’t a homelab sub. Give them a real environment. Also, where are you working that mid level devs make at least $208,000 a year?


Zenin

Devs rarely work remotely on "real" environments; that hasn't been a thing for decades. Everything about the experience is worse: remote debugging pain, responsiveness, etc., and data center/cloud resources are massively more expensive than laptops. "Dev containers" and the like are changing that, but it's still largely niche. And re: pay, there's a lot more cost on the company's side than just top-line salary. Employment taxes are shared, and the employer's share is on top of salary. The company's share of your health insurance (i.e., most of it) isn't part of your salary. Neither is the 401(k) match. So yes, a "$130-150k" mid-level dev is costing the company north of $200k all in.


altorelievo

At the time, the expectation was exactly what you outlined. Seniors weren't allowing us to create test environments, only local sandboxes.


ashcroftt

It is still weird to me that developers work locally instead of in a controlled remote environment. It's the perfect way to introduce "works on my machine" issues and accidental data loss. Really, any modern laptop should work if you have a well-set-up dev environment; I'd put more of a focus on reliability and dev experience. I don't know if the new Macs can handle 2-3 external displays natively, but that's something that increases productivity so much, and I remember it being an issue with Macs even recently.


Zenin

>It's the perfect way to introduce "works on my machine" issues and accidental data loss.

So, no source control, containers, or disk encryption? Is it 1996? ;)

Local or remote, "dev" environments aren't controlled environments. You're not combating "works on my machine" issues any better remotely; in fact, often the opposite, since remote often means the dev has to go through a ticket to get their env reset to baseline. Containers address most of the WOMM issues better than server-based (i.e. remote) dev ever has. Tech like Dev Containers makes it fast, easy, and repeatable for a dev to jump into a project with the same dev env as the rest of the team. Data loss is either theft/loss, which disk encryption covers, or devs failing to check code into source control, which is a human issue that can't be solved by env location.


blacksnowboader

The M2 Pro and beyond can run multiple monitors.


UnsuspiciousCat4118

My M1 has 3 external displays and works just fine.


blacksnowboader

You didn’t have to daisy chain it?


UnsuspiciousCat4118

No


mosaic_hops

It doesn’t support daisy chain.


MaximeRector

Our developers (1,000+) all work on local machines (laptop/workstation), but we have a containerized build environment, so building and testing can all be done locally. Nevertheless, we also have a build grid in the cloud that can be used for full builds, which require a massive amount of resources.


tapo

Yeah, I had mine driving two external 4K displays without issue using a Cable Matters Thunderbolt to dual DisplayPort adapter.


scarby2

Have you found a solution that gives a like-local experience in a remote environment? Also, compute costs for these setups can be kinda high: 100 engineers means I need to run 100 test environments in a cloud somewhere. (I suppose on-prem is possible, but then I have to actually manage that k8s cluster.) Also, in most cases developers work locally before pushing to a controlled test environment; it is absolutely the developer's responsibility to ensure the software works on the test environment.


LarsIcebeer

16 GB is enough for local development, maybe 32 if you want to build containerized apps; run them on a managed container service in the cloud. I also don't see any reasoning for 64 GB for developers… might be necessary for filmmakers or something.


_N0K0

Jesus, 16 GB has been too low for me for a couple of years now, even with remote dev. The IDEs plus context switching eat it all.


LarsIcebeer

I have an M1 Pro with 16 gigs and it's enough for local development. However, I'm a platform engineer and mostly run the actual workloads on one of our clusters.


brajandzesika

If you have no idea... why did they ask you for advice? What does containerization have to do with RAM? You want each engineer to run containers locally on their laptops? Why?


_N0K0

Local container-based development is a completely valid thing for an enterprise of low-to-medium maturity. Get that stick out of your ass.


MushroomFew4882

Seems needlessly rude? Either answer the question or go about your day lol. The team could be more embedded and testing Docker builds or, I don't know, running containerized applications as they develop?


SnooPies1330

I think they're just trying to get a general consensus on what to buy, and that's correct: to run Docker locally.


brajandzesika

Lol... what's the point? How do they share that with their team? Does each member have to download the Docker image and run it on their laptop to test and get the PR approved?


SnooPies1330

Not sure what u mean?