When homebrewing machine learning, trivial difficulties often arise. The desktop you use to mix your snowflake concoction can lock up if you exhaust its entire memory and Linux starts to swap. A quick Google search turns up Stack Overflow answers telling you to go buy more memory. So you buy more, but it quickly chokes again on that Wikipedia dump you loaded into a Python array.

What can one do?

Luckily, with awesome tech like Docker, one can limit the amount of memory a container may use, so that a Jupyter notebook does not eat up the entire RAM.
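As a quick sanity check that such a limit actually bites, you can cap a throwaway container at 100 MB and try to allocate more inside it; the kernel's OOM killer should terminate the process. A minimal sketch, assuming Docker is installed and the stock python:3 image is available:

# cap the container at 100 MB, then try to allocate ~500 MB inside it;
# the allocation should get OOM-killed (the container exits with status 137)
docker run --rm -m 100m python:3 python -c "x = bytearray(500 * 1024 * 1024)"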

Here are a few tips and tricks that will get you up and running.

make swap

Create the swap file, format it, and enable it:

# create a ~60 GB swap file called myswap
sudo dd if=/dev/zero of=/mnt/myswap bs=1024 count=61457280
# restrict permissions and format the file as swap (swapon fails without mkswap)
sudo chmod 600 /mnt/myswap
sudo mkswap /mnt/myswap
sudo swapon /mnt/myswap
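
It may be worth confirming the kernel actually picked the file up, and, if you want the swap to survive reboots, adding an fstab entry along these lines (the /mnt/myswap path is the one created above):

# confirm the new swap space is active
swapon --show
free -h

# optional: re-enable the swap file automatically at boot
echo '/mnt/myswap none swap sw 0 0' | sudo tee -a /etc/fstab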

use docker to limit memory

Install nvidia-docker, then give the Jupyter notebook 10 GB of RAM, enable swap, and open the ports for the Jupyter notebook and TensorBoard. The image below is Google's TensorFlow GPU image.

sudo nvidia-docker run --rm -m10g --memory-swappiness=100 --memory-swap=-1 \
    -v /data:/data -v /mycode:/notebooks/code \
    -p 8888:8888 -p 6006:6006 \
    gcr.io/tensorflow/tensorflow:latest-gpu jupyter notebook --allow-root
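
Once the container is up, a quick way to confirm the cap took effect is docker stats; on a cgroup-v1 host you can also read the enforced limit from inside the container. A sketch, with <container-name> standing in for whatever name docker assigned:

# show live memory usage against the 10 GB limit for running containers
docker stats --no-stream

# on cgroup v1, the enforced limit is visible inside the container
sudo docker exec <container-name> cat /sys/fs/cgroup/memory/memory.limit_in_bytes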

Now ethically machine learn away!