Evolving Dev

It is becoming more and more common to deploy containerized applications for a variety of reasons:

  • consistency between development and production
  • isolation of the service environments
  • the ability to easily scale modular services

There are many more benefits to using containers, but you get the gist of it. The most commonly used container system is Docker. As its popularity has risen, a growing number of people in the Phoenix community are deploying applications either inside or built within Docker containers.

However, note the first benefit of containers listed above. I develop on a Mac, but most often I deploy to Linux droplets on Digital Ocean. Even if I build my app inside Docker during the deployment process, I am just putting off dealing with potential unseen problems while working on my Mac. This left me wanting to run Phoenix locally inside a Docker container.

You can certainly read the documentation for both Phoenix and Docker to create a setup from scratch, but there are also many great articles on this topic. Here are a couple:

  • How to run your phoenix application with docker
  • Starting a phoenix project with docker

After taking my first crack at it, I had Phoenix up and running in a Docker container. The important elements of the setup are:

  • Dockerfile: specifies the base image, installs dependencies, sets up a directory for our codebase, and runs the shell script
  • docker-compose.yml: defines the Docker services that will be used (the Phoenix image and the Postgres image), provides environment variables, and exposes the application's port
  • shell script file: installs dependencies (mix and npm), handles creating the database, and starts the Phoenix server
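The shell script itself is not shown in this post, so here is a minimal sketch of what an entrypoint.sh along those lines might look like. The database wait loop and the exact mix tasks are my assumptions based on the description above; adjust them to your project.

```shell
#!/bin/sh
# entrypoint.sh - sketch of the container entrypoint described above.

# Install Elixir and Node dependencies.
mix deps.get
cd assets && npm install && cd ..

# Wait until Postgres accepts connections before touching the database.
# pg_isready ships with the postgresql-client package installed in the Dockerfile.
while ! pg_isready -q -h "$PGHOST" -p "$PGPORT" -U "$PGUSER"; do
  echo "Waiting for Postgres..."
  sleep 2
done

# Create and migrate the database, then start the Phoenix server.
mix ecto.create
mix ecto.migrate
exec mix phx.server
```

Using `exec` for the final command makes the Phoenix server PID 1 inside the container, so it receives stop signals from Docker directly.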

Even with the application running fine, there was one issue. A major positive aspect of Phoenix development no longer worked: live reloading. With some minor changes I was able to get my development process back in shape.

Dockerfile:

  • Add a line to install inotify-tools. This is required for responding to file change events.
  • Remove the use of the Docker COPY command. We will be using a different method for providing our codebase to the container. More on that shortly.

FROM elixir:1.8.2-alpine

RUN apk update
RUN apk upgrade --no-cache
RUN apk add nodejs=10.14.2-r0 nodejs-npm=10.14.2-r0
RUN apk add inotify-tools=3.20.1-r1
RUN apk add postgresql-client=11.3-r0
RUN mix local.rebar --force
RUN mix local.hex --force

RUN mkdir /app
# Run from /app so the CMD below finds the entrypoint.sh mounted into the container
WORKDIR /app

CMD ["sh", "./entrypoint.sh"]

docker-compose.yml:

  • Add a Docker volume to the Phoenix service that mounts the project directory into the container at /app. This is what lets code changes on the host show up inside the container.

version: '3.2'

services:
  phoenix:
    build:
      context: .
    volumes:
      - .:/app
    environment:
      PGUSER: postgres
      PGPASSWORD: postgres
      PGDATABASE: app
      PGPORT: 5432
      PGHOST: db
    ports:
      - "4000:4000"
    depends_on:
      - db
  db:
    image: postgres:9.6
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      PGDATA: /var/lib/postgresql/data/pgdata
    restart: always
    volumes:
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
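With both files in place, the whole stack can be started with a single command, assuming the Dockerfile and compose file sit in the project root:

```shell
# Build the Phoenix image and start both the Phoenix and Postgres services.
docker-compose up --build
```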

webpack.config.js:

  • Add a watchOptions entry to tell webpack to poll for file changes, since filesystem events from the host do not reliably reach the container.

const path = require('path');
const glob = require('glob');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const UglifyJsPlugin = require('uglifyjs-webpack-plugin');
const OptimizeCSSAssetsPlugin = require('optimize-css-assets-webpack-plugin');
const CopyWebpackPlugin = require('copy-webpack-plugin');

module.exports = (env, options) => ({
  optimization: {
    minimizer: [
      new UglifyJsPlugin({ cache: true, parallel: true, sourceMap: false }),
      new OptimizeCSSAssetsPlugin({})
    ]
  },
  entry: {
    './js/app.js': ['./js/app.js'].concat(glob.sync('./vendor/**/*.js'))
  },
  output: {
    filename: 'app.js',
    path: path.resolve(__dirname, '../priv/static/js')
  },
  // Poll for changes instead of relying on filesystem events,
  // which do not propagate into the container.
  watchOptions: {
    poll: true
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader'
        }
      },
      {
        test: /\.css$/,
        use: [MiniCssExtractPlugin.loader, 'css-loader']
      }
    ]
  },
  plugins: [
    new MiniCssExtractPlugin({ filename: '../css/app.css' }),
    new CopyWebpackPlugin([{ from: 'static/', to: '../' }])
  ]
});

After making those changes and restarting your container, you will notice that any asset or Elixir file change triggers a server/page reload. For a full example project, check out this repo.

You may have heard about the JAMstack. Like many people, myself included, your first response was probably to bemoan yet another client-side architectural development. “Javascript fatigue” almost certainly crossed your mind. I'm here to reassure you that it's not some complicated new framework or web technology. It's just a set of principles for building applications, along with some tools that make adhering to those principles much easier.

So, what is it?

The JAM in JAMstack stands for client-side Javascript, reusable APIs, and prebuilt Markup. The gist of it is that applications use Javascript for dynamic capabilities such as routing, fetching data from APIs, and updating content, while all static content is rendered via prebuilt markup.

These principles provide several benefits. By removing the need for client-server communication to display non-dynamic content, the client becomes much more autonomous. Deployments of the client are less likely to be gated by updates to backend services, shortening the time to deployment for a significant portion of your codebase. And because prebuilt markup is served from the same CDN as your client-side code, more of your content is immediately available, improving site performance.

All of the associated benefits come without giving up on the things that people do enjoy about modern web development. You can still use your Javascript framework of choice and the latest ECMAscript features.

That being said, there are some new tools to pick up and learn. They provide a ton of value, and the learning curve is negligible compared to picking up an entirely new Javascript framework.

Static Site Generators (SSG)

As I've mentioned above, one of the key elements of JAMstack applications is the use of prebuilt markup. Think of every server-side template you've ever used. Without a doubt, many of the pages generated from those templates did not really need to be rendered on each HTTP request. The templates simply served as a convenient way to avoid writing tons of HTML by hand. That's exactly what these SSGs do: you define a template within the framework of an SSG, and the SSG processes that source and outputs static JS, CSS, HTML, and other assets. You get the convenience without losing the performance benefits of static content.
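To make the idea concrete, here is a toy sketch of what an SSG does at its core: read source content, run it through a template, and write out static HTML ahead of time. The file names and the inline template are made up for illustration; real SSGs like Gatsby or Hugo add layouts, routing, and asset pipelines on top of this.

```shell
#!/bin/sh
# Toy static site "build": turn each content file into a prebuilt HTML page.
mkdir -p content public
printf 'Hello from the prebuilt page' > content/index.txt

for src in content/*.txt; do
  name=$(basename "$src" .txt)
  # Wrap the content in a minimal HTML template at build time,
  # so no per-request server-side rendering is needed.
  cat > "public/$name.html" <<EOF
<html><body><main>$(cat "$src")</main></body></html>
EOF
done
```

After the build, the public/ directory holds finished pages that any static host or CDN can serve as-is.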

Managing Deployment

Updating a codebase every time the static content changes will lead to more frequent deployments. Luckily, there is a large community working to make automated deployments much more straightforward. One of the big names you will come across when searching for a solution is Netlify. Their service updates the published version of your content following commits to a git repository.
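As a sketch, Netlify's build settings can live in a netlify.toml file at the root of the repository. The command and publish directory below assume a Gatsby project and will differ for other SSGs:

```toml
[build]
  # Run the SSG on every push to the connected repository...
  command = "gatsby build"
  # ...and publish the generated static output.
  publish = "public"
```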

Getting Started

Once I understand the concepts and tools around a new framework or technology, my next question is usually around how much is required to get it up and running. For JAMstack applications the answer can be very straightforward, but in true client-side fashion, there are a ton of options. Here are a few that I recommend:

  • Netlify Templates: These templates allow you to deploy an app built with a variety of SSG libraries to Netlify with one click. You will be able to connect and view the code in either Github or Gitlab, and a live version will be available for testing at a randomly generated url.
  • Codesandbox: If you want to get up and running a bit faster, you can use a Codesandbox template to view the code for an SSG like Gatsby, with the option of deployment via a few different services.
  • Local development with Static Site Generators: For a local development option, you can follow the instructions from Gatsby or Hugo to generate your first project.
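For the local route, both generators provide a CLI. Assuming you have Node.js installed (for Gatsby) or the Hugo binary available, scaffolding a first project looks roughly like this; the site names are placeholders:

```shell
# Option 1 - Gatsby: install the CLI, scaffold a site,
# and start a live-reloading dev server.
npm install -g gatsby-cli
gatsby new my-gatsby-site
cd my-gatsby-site && gatsby develop

# Option 2 - Hugo: scaffold a site and serve it locally,
# including draft content.
hugo new site my-hugo-site
cd my-hugo-site && hugo server -D
```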


We've gone over the basic principles of JAMstack applications and the tools to help you get started. I hope that you've found this introduction to be useful. These concepts and tools will certainly play a role in my workflow going forward.
