Tim Reynolds

Software & Startups

Efficiently building Node.js applications with Docker multi-stage builds

Docker not only provides a great way to package your application for deployment, it also makes it possible to isolate the build process from the quirks of your local machine or your CI's build agent.

Initially this would have been achieved using "docker-in-docker", whereby a build container runs linting and tests, and then builds the application container inside it.

This approach can be cumbersome and, if done incorrectly, can lead to bloated application containers that leak npm registry secrets, and to poorly cached Docker layers that slow build times.

Cutting to the chase, you should end up with a Dockerfile that looks as follows:

    FROM node:10-alpine as builder
    WORKDIR /build
    COPY package.json .
    COPY package-lock.json .
    RUN npm ci --only=production
    RUN cp -R node_modules prod_node_modules
    RUN npm ci
    # Copy source code and tests into the build container.
    # Preferably copy files explicitly rather than everything;
    # if you do copy everything, consider using .dockerignore
    COPY . .
    # Run linting, tests etc.
    RUN npm run lint
    RUN npm run test
    FROM node:10-alpine
    WORKDIR /app
    COPY --from=builder /build/prod_node_modules ./node_modules
    # Again, be explicit about what you copy;
    # server.js stands in for your app's runtime files
    COPY --from=builder /build/server.js ./
    CMD ["node", "server.js"]

Now let's run through each section. First, you'll notice that the setup has multiple FROM statements: this is Docker's multi-stage build feature, which is well covered in their documentation. It's important that these stages share the same OS, as dependencies are installed in one container and then copied into another; a mismatch between binary versions and the OS will lead to headaches later.
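A handy consequence of multiple stages is that you can build just the first one with Docker's --target flag, for example to run only the lint and test steps in CI. A sketch, with illustrative image names:

```shell
# Build only the builder stage (lint and tests run as part of it)
docker build --target builder -t myapp:builder .

# Build the whole Dockerfile to produce the slim production image
docker build -t myapp:latest .
```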

Here you'll also notice we're using Alpine, which leads to the smallest possible production container. However, if npm ci needs to compile anything from source, cough sass, you'll probably be lacking the OS dependencies required. If that's the case I'd recommend creating a base image which includes these dependencies and using it across your containers. Not doing so will result in you needing to run RUN apk add --update make gcc g++ or similar, slowing every build down.
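If you go down that route, a minimal sketch of such a shared base image might look like this (the python package is there because node-gyp needs it; everything here is illustrative):

```dockerfile
# Hypothetical shared builder base: Node plus a native build toolchain
FROM node:10-alpine
RUN apk add --update --no-cache make gcc g++ python
```

Publish it to your registry once, then use it in the FROM line of each builder stage instead of paying the apk add cost on every build.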

Next we copy package.json and package-lock.json into the container before running npm ci. This ensures that if these two files haven't changed, Docker uses a cached layer and skips the npm command, saving you the time required to install dependencies.

If these files have changed, the RUN commands are executed, in this case the two npm ci commands. The ci command functions in much the same way as install, but is specifically designed for automated environments: it enforces that package.json and the lock file are in sync, and provides greater install speed by skipping certain user-oriented features.
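The practical difference is worth spelling out (commands assume you're in a project directory with both files present):

```shell
# npm ci removes any existing node_modules and installs exactly what
# package-lock.json specifies; it fails fast if package.json and the
# lockfile disagree, which is what you want in a build container
npm ci

# npm install, by contrast, may resolve new versions and rewrite
# package-lock.json to satisfy package.json
npm install
```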

Executing this initially with the --only=production flag allows us to take a copy of the runtime-only dependencies for later use in the production image. Luckily the local npm cache assists the second execution of the command, meaning only development dependencies are installed from the online registry.

After this you can execute any test, linting or similar steps within the builder container before producing the application container. Doing so simply requires the runtime code and dependencies to be copied from the builder using the COPY --from command. Note you'll want to take the production node_modules and place them in the node_modules location.
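If you do copy the whole build context with COPY . . rather than listing files explicitly, a .dockerignore file keeps junk and secrets out of the image and out of Docker's cache key. A minimal sketch, with illustrative entries:

```
node_modules
npm-debug.log
.git
.env
Dockerfile
```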

Once done, you can push the resulting container image to your Docker registry for deployment.
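That final step is the usual build, tag, push sequence (registry and image names here are illustrative):

```shell
# Build, tag and push the production image
docker build -t myapp:1.0.0 .
docker tag myapp:1.0.0 registry.example.com/myapp:1.0.0
docker push registry.example.com/myapp:1.0.0
```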
