
Docker in production

Filename for development: Dockerfile.dev

Filename for production: Dockerfile

The question is how to build a Docker image from a custom file name, because by default Docker only looks for a file named Dockerfile.
Answer: docker build -f Dockerfile.dev .
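As a sketch, the full workflow with an image tag might look like this (the tag name myapp:dev is just an example, not from the original post; it saves us from having to copy the generated image id):

```shell
# build using Dockerfile.dev, with the current directory (.) as the build context,
# and tag the image so it can be referenced by name
docker build -f Dockerfile.dev -t myapp:dev .

# run the tagged image instead of pasting the image id
docker run -p 3000:3000 myapp:dev
```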

Another problem:
every time we make a change to our code, we need to build the image again.
We need some way for a code change to automatically reach the container, without rebuilding and restarting it.

Solution:
Docker volumes.
The problem was that we were taking a snapshot of the project and placing it inside the container.
Instead, we will now create a reference that points back to the local machine.

eg. docker run -p 3000:3000 -v /app/node_modules -v $(pwd):/app <image-id>
-v /app/node_modules  -> says don't map this path to the local machine (keep the container's own node_modules)
-v $(pwd):/app -> says map the present working directory to /app inside the container

Writing the same configuration in docker-compose to simplify things:

version: '3'
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - /app/node_modules
      - .:/app

This one uses the normal build, but since our project doesn't contain a Dockerfile and instead contains Dockerfile.dev, we need to make some changes:

version: '3'
services:
  web:
    build:
      context: . # build context: the project's working directory
      dockerfile: Dockerfile.dev
    ports:
      - "3000:3000"
    volumes:
      - /app/node_modules
      - .:/app

When we are using the volume reference we could remove the "COPY . ." line from the Dockerfile, since the files are mounted anyway. But it can serve as inspiration when setting up the production environment, so it's suggested to leave it there.
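With the compose file in place, the whole dev workflow collapses to two commands (a sketch; assumes docker-compose is installed and the file is named docker-compose.yml):

```shell
# build the image (if needed) and start the web service
docker-compose up --build

# stop and remove the containers when done
docker-compose down
```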

So this was all about the dev environment.
Any time we want a test environment, we just need to change "npm run start" to "npm run test":

FROM node:alpine

WORKDIR '/app'

COPY package.json .
RUN npm install

COPY . .

CMD ["npm","run","test"]

Or, using the same build image as dev, we can run the tests like this:


docker build -f Dockerfile.dev .
This returns an image id.
docker run <image-id> npm run test
The above command will only show output; if we want to send input too (e.g. to the test runner's interactive prompt), we need to add -it:
docker run -it <image-id> npm run test
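A common pattern (a sketch, not from the original post) is to add a second service to docker-compose.yml that reuses the same Dockerfile.dev but overrides the default command, so the test suite re-runs live through the same volume mounts:

```yaml
version: '3'
services:
  tests:
    build:
      context: .
      dockerfile: Dockerfile.dev
    volumes:
      - /app/node_modules
      - .:/app
    # override the image's default CMD to run the test suite instead
    command: ["npm", "run", "test"]
```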


For serving our code in production we will need something like nginx.
Since nginx uses a different base image, to set it up we use a multi-step build process:
it will have a build phase and a run phase.

To run nginx with code

FROM node:alpine as builder
# we are tagging this phase so everything under it can be referred to as "builder"

WORKDIR '/app'
COPY package.json .
RUN npm install
COPY . .
RUN npm run build

# the build folder is created inside the working directory; /app/build is the folder we care about

# run phase for serving with nginx

FROM nginx
# a new FROM also states that the previous phase is complete
COPY --from=builder /app/build /usr/share/nginx/html
# with this, the container will by default take care of starting up nginx
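Putting the multi-stage Dockerfile to use might look like this (the tag myapp:prod and host port 8080 are example choices, not from the original post; nginx listens on port 80 inside the container):

```shell
# the multi-stage file is the default Dockerfile, so no -f flag is needed
docker build -t myapp:prod .

# map host port 8080 to nginx's default port 80
docker run -p 8080:80 myapp:prod
```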
