The app I’m currently working on runs on Docker. It’s a medium-sized Rails monolith with a bunch of resource-heavy dependencies (Postgres, PostGIS, Elasticsearch, Redis, Next.js and a few more). Docker helps us make sure the whole stack works for everyone, every time.
For those using Linux, it works without noticeable downsides. But on macOS, despite every possible performance tweak, Docker has been a huge pain: a MacBook running Docker always runs hot, the battery drains in less than an hour, the fan spins fast enough for the laptop to take off, and I need an external SSD to fit all the images.
Why not simply use an Ubuntu laptop instead? I’ve tried it (with a Dell XPS) and, for me, it doesn’t work. While Ubuntu Desktop has improved significantly since the last time I tried it, I depend too heavily on macOS software to give it up altogether.
After a few attempts, I’ve found a way to make Docker work for me on macOS. And ironically, the solution is to not run it on macOS.
Docker Will Never Be Fast on the Mac
Docker for Mac will always need some kind of virtualization and file sync. That’s never going to change, so the best way to get the most out of Docker is to run it on the platform it was built for: Linux.
And that’s my solution: I now have a remote Linux development server with Docker.
Currently, I’m using a droplet on DigitalOcean. I initially wanted to set up an Intel NUC in my local network to reduce latency, but thanks to VSCode Remote Development, latency is not really a problem when editing, so I’d much rather not have to deal with an extra computer at home.
Setup & Workflow
The dev machine is a VPS running the latest Ubuntu. If I ever need to scrap it and set up a new one, I run my provisioning scripts and I’m ready to go within minutes.
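I won’t reproduce my actual scripts here, but a minimal provisioning sketch (assuming a fresh Ubuntu server; the package list and script name are illustrative, not my real setup) looks something like this:

```shell
#!/usr/bin/env bash
# provision.sh - minimal sketch: bootstrap Docker on a fresh Ubuntu server
set -euo pipefail

# Basic tooling
sudo apt-get update
sudo apt-get install -y ca-certificates curl git tmux

# Install Docker via the official convenience script
curl -fsSL https://get.docker.com | sh

# Allow running docker without sudo (takes effect on next login)
sudo usermod -aG docker "$USER"
```

Everything else (dotfiles, language runtimes, project checkouts) layers on top of this in the same idempotent style.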
There are a few essential tools that make working on a remote server possible. Apart from VSCode, iTerm with its amazing tmux integration plays a huge part.
With tmux -CC, iTerm turns SSH sessions into regular macOS terminal tabs, making the whole experience seamless.
The server has all my dotfiles installed, so I have access to all my aliases, command-line tools and vim config.
To start working, I simply run the following commands:
```shell
ssh devbox
tmux -CC attach

# optional (in a new iTerm tab, on the host machine):
# (forwards remote ports to localhost - sudo is required for 443 forwarding)
sudo -E ssh -F ~/.ssh/config devbox -N -L 443:localhost:443 -L 3035:localhost:3035
```
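The devbox name is an alias defined in my SSH config. A minimal entry (the IP and username here are placeholders, not my real values) looks something like:

```
Host devbox
  HostName 203.0.113.10    # droplet IP (placeholder)
  User dev                 # remote username (placeholder)
  ForwardAgent yes
  ServerAliveInterval 60   # keep the connection alive on flaky wifi
```

The alias keeps the forwarding command short and means VSCode Remote Development picks up the same host definition.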
What’s So Great About This Setup?
The best part about this setup is that the heavy lifting is done on someone else’s hardware. I can run resource-intensive tasks and my MacBook is no longer running hot all the time, battery life is back to normal and I’ve reclaimed over 40GB of disk space.
My devbox doesn’t depend on my laptop’s state: I can upgrade/restart my laptop without affecting my development machine. If I lose access to my MacBook, I can open up pretty much any other computer (or an iPad), SSH to my devbox and continue working. I don’t need a spec’d out laptop just to get work done because the hardware is outsourced to the cloud.
It’s Not All Roses
Obviously, there are trade-offs. I now need a decent internet connection to work. I usually have one anyway, but on choppy wifi you’ll definitely notice you’re not using a local machine.
Another downside is cost. I currently pay $40/mo for an 8GB/4-CPU machine on DigitalOcean. At some point it will make more sense to buy my own hardware.
There are also security concerns. My devbox doesn’t expose any services to the public (hence the port forwarding) and I have firewall, fail2ban and automated security updates enabled, but there’s still a chance I might have misconfigured something.
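For reference, that baseline hardening amounts to only a few commands (a sketch assuming ufw, fail2ban and unattended-upgrades on Ubuntu; adjust to your own threat model):

```shell
# Firewall: deny all inbound traffic except SSH
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow OpenSSH
sudo ufw enable

# Ban IPs with repeated failed login attempts
sudo apt-get install -y fail2ban

# Automatic security updates
sudo apt-get install -y unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades
```

Since all application ports stay bound to localhost and reach my laptop only through the SSH tunnel, the attack surface is essentially just sshd.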
Is it Worth It?
If you need Docker but don’t want to give up macOS, you should definitely give it a try. After a few weeks of using this setup, I really don’t see any benefits of running Docker locally. Network latency in the terminal and occasional connection glitches are nothing compared to the frustrations caused by Docker for Mac.