NOTE: please visit the home page before you get started.
Many researchers and engineers run machine learning or data mining experiments. For such data engineering tasks, researchers rely on various tools and system libraries which are constantly updated, and installing and updating them causes problems in local environments. Even when we work in hosting environments such as EC2, we are not free from this problem: an experiment that succeeds on one instance can fail on another, since the library versions on each EC2 instance can differ.
By contrast, with a single command Docker creates an identical container in which the needed tools are already installed with the correct versions, without changing the system libraries of the host machine. This aspect of Docker is important for the reproducibility of experiments and for keeping projects running in continuous integration systems.
Unfortunately, running experiments in a Docker container is troublesome. Adding a new library to requirements.txt or the Dockerfile does not install it the way it would on a local machine; we need to rebuild the Docker image and recreate the container each time. We also need to forward ports to see server responses, such as the Jupyter Notebook UI launched in the Docker container, on our local PC. Cookiecutter Docker Science provides utilities to make working in a Docker container simple.
This project is a tiny template for machine learning projects developed in Docker environments. In machine learning tasks, projects grow uniquely to fit their target tasks, but in the initial state most of the directory structure and the targets in the Makefile are common. Cookiecutter Docker Science generates the initial directories that fit simple machine learning tasks.
To generate a project from the cookiecutter-docker-science template, please run the following command.
$ cookiecutter git@github.com:docker-science/cookiecutter-docker-science.git
Then the cookiecutter command asks several questions about the generated project, as follows.
$ cookiecutter git@github.com:docker-science/cookiecutter-docker-science.git
project_name [project_name]: food-image-classification
project_slug [food_image_classification]:
jupyter_host_port [8888]:
description [Please Input a short description]: Classify food images into several categories
Select data_source_type:
1 - s3
2 - nfs
3 - url
data_source [Please Input data source]: s3://research-data/food-images
Then you get the generated project directory, food-image-classification.
The following is the initial directory structure generated in the previous section.
├── Makefile                 <- Makefile contains many targets, such as creating the Docker container
│                               or getting the input files.
├── config                   <- This directory contains configuration files used in scripts
│   │                           or Jupyter Notebook.
│   └── jupyter_config.py
├── data                     <- The data directory contains the input resources.
├── docker                   <- The docker directory contains the Dockerfiles.
│   ├── Dockerfile           <- The base Dockerfile contains the basic settings.
│   ├── Dockerfile.dev       <- Dockerfile for experiments; this Docker image is derived from the base
│   │                           Docker image. It does not copy the files and directories but instead
│   │                           mounts the top directory of the host environment.
│   └── Dockerfile.release   <- Dockerfile for production; this Docker image is derived from the base
│                               Docker image. It copies the files and directories under the project
│                               top directory.
├── model                    <- The model directory stores the model files created in the experiments.
├── my_data_science_project  <- cookiecutter-docker-science creates a directory whose name is the same
│   │                           as the project name. In this directory users put Python files used in
│   │                           scripts or Jupyter Notebook.
│   └── __init__.py
├── notebook                 <- This directory stores the .ipynb files saved in Jupyter Notebook.
├── requirements.txt         <- Libraries needed in the project. The libraries listed in this file
│                               are installed in the Docker images for production as well as development.
├── requirements_dev.txt     <- Libraries needed to run experiments. The libraries listed in this file
│                               are installed only in the Docker images for development.
└── scripts                  <- Users add the script files to generate model files or run evaluations.
Cookiecutter Docker Science provides many Makefile targets to support experiments in a Docker container. Users can run a target with the make [TARGET] command.
init
After cookiecutter-docker-science generates the directories and files, users first run this command. init sets up the resources for experiments. Specifically, init runs the init-docker and sync-from-source commands.
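For example, right after generating the food-image-classification project from the earlier example, the whole setup is:

$ cd food-image-classification
$ make init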
init-docker
init-docker command first creates the Docker image based on docker/Dockerfile.
sync-from-source
sync-from-source downloads the input files specified at project generation. If you want to change the input files, please modify this target to download the new data source.
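As a rough illustration, for the S3 data source chosen in the example above, the target boils down to something like the following (a sketch only; the exact recipe in the generated Makefile may differ):

# Download the input files from the configured S3 data source into data/
$ aws s3 sync s3://research-data/food-images data/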
create-container
create-container command creates a Docker container based on the created image and logs in to the Docker container.
start-container
Users can start and log in to the Docker container created by create-container with the start-container target.
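Typical usage is to create the container once and re-enter it later:

$ make create-container    # build the container from the image and log in
$ exit                     # leave the container
$ make start-container     # start and log in to the existing container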
jupyter
jupyter target launches a Jupyter Notebook server.
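For example, inside the container:

$ make jupyter
# then open http://localhost:8888 (or the jupyter_host_port you chose) on the host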
profile
profile target shows miscellaneous information about the project, such as the port number or the container name.
clean
clean target removes artifacts such as models and *.pyc files.
clean-model
clean-model command removes the model files in the model directory.
clean-pyc
clean-pyc command removes Python bytecode artifacts: *.pyc and *.pyo files and __pycache__ directories.
clean-docker
clean-docker command removes the Docker image and container generated with make init-docker and make create-container. When we update Python libraries in requirements.txt or system tools in the Dockerfile, we need to remove the Docker image and container with this target and then create the updated image and container with make init-docker and make create-container.
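For example, after adding a library to requirements.txt, the rebuild cycle is:

$ make clean-docker        # remove the stale image and container
$ make init-docker         # rebuild the image with the updated libraries
$ make create-container    # recreate the container from the new image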
distclean
distclean target removes all reproducible objects. Specifically, this target runs the clean target and removes all files in the data directory.
clean-data
clean-data command removes all datasets in the data directory.
lint
lint target checks whether the coding style meets the coding standard.
test
test target executes the tests.
sync-to-remote
sync-to-remote target uploads the local files stored in data to the specified data source, such as S3 or an NFS directory.
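For the S3 data source from the example above, this mirrors sync-from-source in the other direction (again a sketch; the generated recipe may differ):

$ make sync-to-remote
# roughly equivalent to: aws s3 sync data/ s3://research-data/food-images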
With Cookiecutter Docker Science, data scientists and software engineers do their development in the host environment. They open Jupyter Notebook in a browser on the host machine, connecting to the Jupyter server launched in the Docker container. They also write the ML scripts and library classes on the host machine. Code modifications in the host environment are reflected in the container environment. Inside the container, they just launch the Jupyter server or start the ML scripts with make commands.
When you log in to a Docker container with the make create-container or make start-container command, the login directory is /work. This directory contains the top-level project directories of the host computer, such as data or model. The Docker container mounts the project directory to /work in the container, so you can edit the files in the host environment with your favorite editor such as Vim, Emacs, Atom, or PyCharm, and the changes in the host environment are reflected in the container environment.
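Conceptually, the mount works like the following docker run call (a minimal sketch with a hypothetical image name; the actual flags are defined in the generated Makefile):

# Mount the project top directory to /work and open a shell in the container.
$ docker run -it -v "$(pwd)":/work -p 8888:8888 my-project-image /bin/bash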
We can run a Jupyter Notebook in the Docker container. The Jupyter Notebook uses the default port 8888 in the Docker container (NOT the host machine), and this port is forwarded to the one you specified with jupyter_host_port in the cookiecutter command. You can see the Jupyter Notebook UI by accessing http://localhost:JUPYTER_HOST_PORT. When you save notebooks, the files are stored in the notebook directory.
make init-docker command creates a Docker image based on docker/Dockerfile.dev, which contains libraries for development. These libraries are not needed in production.
To create a Docker image for production that does not contain development libraries such as Jupyter, we run the make init-docker command with the environment variable MODE set to release: make init-docker MODE=release.
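For example:

# Build the production image without the development libraries:
$ make init-docker MODE=release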
When the project is generated with cookiecutter, the default port of Jupyter Notebook on the host is 8888. This port number is common and could collide with another server process.
If we already have the container, we first need to remove the current container with make clean-container. Then we create the Docker container with a different port number, running the make create-container command with the Jupyter port parameter (JUPYTER_HOST_PORT). For example, the following command creates a Docker container forwarding the Jupyter default port 8888 to 9900 on the host.
$ make create-container JUPYTER_HOST_PORT=9900
When you then launch Jupyter Notebook in the Docker container, you can see the Jupyter Notebook at http://localhost:9900.
Some projects can have multiple Dockerfiles. Dockerfile.gpu contains the settings for GPU machines, and Dockerfile.cpu contains settings that can be used in production on non-GPU machines.
To use one of these specific Dockerfiles, override the settings by adding parameters to the make command. For example, when we want to create a container from docker/Dockerfile.cpu, we run make create-container DOCKERFILE=docker/Dockerfile.cpu.
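Assuming the GPU variant also lives in the docker directory, both variants are selected the same way:

$ make create-container DOCKERFILE=docker/Dockerfile.cpu
$ make create-container DOCKERFILE=docker/Dockerfile.gpu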
help target prints the details of a specified target. For example, to get the details of the clean target:
$ make help TARGET=clean
target: clean
dependencies: clean-model clean-pyc clean-docker
description: remove all artifacts
As we can see, the dependencies and description of the specified target (clean) are shown.
Apache License, Version 2.0
See CONTRIBUTING.md.