
YAML in Kubernetes

A Kubernetes YAML definition file always contains four top-level fields (all of them required):

apiVersion:
kind:
metadata:
spec:

______________________________________________________________________

kind: Pod (possible values include Pod, Service, ReplicaSet, and Deployment)
metadata:
    name: myapp-pod
    labels:
      app: myapp

metadata values are in the form of a dictionary
under metadata you can only use the keys Kubernetes expects, such as name and labels
however, labels can hold as many key/value pairs as you wish
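For instance, a metadata block with several custom labels might look like the sketch below (the extra label keys tier and environment are just illustrative, not required by Kubernetes):

```yaml
metadata:
  name: myapp-pod
  labels:
    app: myapp
    tier: frontend       # any key/value pairs you choose
    environment: dev
```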

spec is a dictionary
spec:
   containers: (this is a list/array, because a pod can have multiple containers within it)
       - name: nginx-container (the - indicates that this is the first item in the list)
         image: nginx
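Because containers is a list, a second container is just another list item. A minimal sketch of a multi-container spec (the second container's name and image are hypothetical examples):

```yaml
spec:
  containers:
    - name: nginx-container
      image: nginx
    - name: log-agent      # hypothetical sidecar container
      image: busybox
```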


create the pod using the command
kubectl create -f pod-definition.yml

once you create the pod, view it with the command
kubectl get pods

to get the detailed description about the pods use:
kubectl describe pod {name from get pods command}


e.g.
apiVersion: v1

kind: Pod

metadata:
  name: myapp-pod
  labels:
    name: myapp
    type: pod

spec:
  containers:
    - name: nginx-container
      image: nginx

Plugin for help (PyCharm):
https://plugins.jetbrains.com/plugin/9354-kubernetes-and-openshift-resource-support

