In previous posts we went over how to use Docker to limit process memory when doing machine learning, so that a Linux desktop won't freeze when the system runs out of memory. People often use servers with very large amounts of RAM, on the order of 1TB. That is usually out of reach for budget machine learning rigs, but Docker lets us fake this resource by backing a container with 1TB of swap, for example. We can format an entire hard disk as swap and limit the Docker container so it does not use all of the RAM while still taking advantage of the swap, as we saw in previous posts (a short sketch of the swap setup follows below). One lesson from this setup is that the kernel memory limit for the Docker container should be set a bit larger than the kernel memory the container actually ends up allocating:

 -m 5g --kernel-memory 500m --memory-swappiness 100 --memory-swap -1
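
For reference, a complete invocation might look like the line below; the ubuntu image and bash command are placeholders for whatever your training container actually runs, and note that --kernel-memory requires cgroups v1 and is deprecated in recent Docker releases:

 docker run -it -m 5g --kernel-memory 500m --memory-swappiness 100 --memory-swap -1 ubuntu bash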

Kernel memory can't be swapped, so if it runs out the system freezes. Once the limit is a bit bigger than what actually gets allocated, freezes usually stop happening, and swap can be utilized while the desktop keeps running. If you add more hard drives as swap, the kernel can use them in parallel for extra throughput, see here.
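
As a minimal sketch, assuming /dev/sdb and /dev/sdc are spare disks you can dedicate entirely to swap (the device names are illustrative), the whole setup looks like this:

 # write a swap signature to each spare disk
 sudo mkswap /dev/sdb
 sudo mkswap /dev/sdc
 # equal priority tells the kernel to round-robin pages across both disks
 sudo swapon -p 10 /dev/sdb
 sudo swapon -p 10 /dev/sdc
 # verify both swap areas are active
 swapon --show

Giving all swap areas the same priority is what enables the parallel throughput: the kernel distributes swap pages across equal-priority devices in round-robin fashion.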

This can increase the potency of a small machine learning rig, since for now RAM is much more expensive than disk.

It is also useful to run your desktop apps in Docker, so that you don't run out of memory while running your machine learning experiments. There are plenty of examples here of how to do it.
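
As a rough sketch for an X11 session, a memory-capped desktop app could be launched like this; my-browser is a hypothetical image with a GUI app installed:

 # cap RAM at 2g but allow unlimited swap, and forward the host display
 docker run -it -m 2g --memory-swap -1 \
   -e DISPLAY=$DISPLAY \
   -v /tmp/.X11-unix:/tmp/.X11-unix \
   my-browser   # hypothetical image, substitute your own

Depending on your X server configuration you may also need to allow local connections first (for example with xhost +local:).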