
YAML in Kubernetes

A Kubernetes YAML file always contains four top-level fields (all of them required):

apiVersion:
kind:
metadata:
spec:
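As a sketch, here are the four top-level fields annotated with their purpose (the values are illustrative, not required):

```yaml
apiVersion: v1          # version of the Kubernetes API used to create this object
kind: Pod               # type of object being created
metadata:               # data about the object: name, labels, etc.
  name: myapp-pod
spec:                   # what is inside the object, e.g. its containers
  containers:
    - name: nginx-container
      image: nginx
```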

______________________________________________________________________

kind: Pod (the value can be Pod, Service, ReplicaSet, or Deployment)
metadata:
    name: myapp-pod
    labels:
      app: myapp

The values under metadata form a dictionary.
metadata only accepts the fields Kubernetes expects, such as name and labels;
labels, however, is itself a dictionary that can hold as many key-value pairs as you wish.
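For instance, a metadata section carrying several labels might look like the fragment below (the extra label keys are illustrative):

```yaml
metadata:
  name: myapp-pod
  labels:             # any key-value pairs you like
    app: myapp
    tier: frontend    # illustrative extra label
    env: dev          # illustrative extra label
```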

spec is a dictionary:
spec:
  containers: (a list/array, because a pod can have multiple containers within it)
    - name: nginx-container (the dash indicates the first item in the list)
      image: nginx
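To make the dictionary-vs-list distinction concrete, here is the same pod definition expressed as native Python types; this is just a sketch to show which parts nest as dictionaries and which as lists:

```python
# The pod definition as native Python types: dicts for metadata/spec,
# a list for containers (a pod may run more than one container).
pod_definition = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {                  # metadata is a dictionary
        "name": "myapp-pod",
        "labels": {                # labels is a nested dictionary
            "app": "myapp",
        },
    },
    "spec": {                      # spec is a dictionary
        "containers": [            # containers is a list
            {"name": "nginx-container", "image": "nginx"},
        ],
    },
}

# The four required top-level fields are exactly these:
assert set(pod_definition) == {"apiVersion", "kind", "metadata", "spec"}
# containers is a list, so further containers could simply be appended.
assert isinstance(pod_definition["spec"]["containers"], list)
```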


Create the pod using the command:
kubectl create -f pod-definition.yml

Once you have created the pod, list it with:
kubectl get pods

To get a detailed description of a pod, use:
kubectl describe pod {name from the get pods output}


Example:
apiVersion: v1

kind: Pod

metadata:
  name: myapp-pod
  labels:
    name: myapp
    type: pod

spec:
  containers:
    - name: nginx-image
      image: nginx

Plugin for help (PyCharm):
https://plugins.jetbrains.com/plugin/9354-kubernetes-and-openshift-resource-support

