

Showing posts from August, 2018

Sumeru Enterprise Tiger privacy policy

Sumeru Enterprise Tiger Business Solutions Pvt. Ltd. Data Privacy Policy

At Sumeru Enterprise Tiger Business Solutions Pvt. Ltd. we are committed to providing you with digitalization software products and services to meet your needs. Our commitment includes protecting the personally identifiable information we obtain about you when you register to use one of our websites or become our customer (“Personal Information”). We want to earn your trust by providing strict safeguards to protect your Personal Information. This Policy applies to members, customers, former customers, users, and applicants. In the course of our business activities, Sumeru Enterprise Tiger Business Solutions Pvt. Ltd. collects, processes, and shares Personal Information. Indian law gives individuals the right to limit some but not all sharing. This Policy explains what Personal Information we collect, process, and share. We describe how we do so, and why. The Policy also describes your rights to access and correct…

GUI logging in Node.js and Python

For Node.js, use frontail for a web GUI over your logs: https://www.npmjs.com/package/frontail
For Python, use cutelog: https://pypi.org/project/cutelog/

frontail tails a log file, so in Node.js the logs first have to be written to a file. For writing logs to a file we will use winston: https://www.npmjs.com/package/winston

Example of using winston:

const { createLogger, format, transports } = require('winston');
const { combine, timestamp, label, prettyPrint } = format;

const logger = createLogger({
  level: 'info',
  format: format.json(),
  transports: [
    //
    // - Write all logs with level `info` and below to `combined.log`
    // - Write all logs of level `error` (and below) to `error.log`
    //
    new transports.File({ filename: 'error.log', level: 'error' }),
    new transports.File({ filename: 'combined.log' })
  ]
});

logger.log({
  level: 'info',
  message: 'What time is the testing at?'
});
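Once winston is writing to combined.log, frontail can serve it in the browser (assuming a global install: npm install -g frontail, then frontail combined.log).

On the Python side, cutelog acts as a GUI sink for the standard logging module. A minimal sketch, assuming the cutelog app is running locally and listening on its default port 19996:

import logging
from logging.handlers import SocketHandler

logger = logging.getLogger('demo')
logger.setLevel(logging.DEBUG)

# cutelog receives pickled LogRecords over TCP; SocketHandler sends exactly that.
logger.addHandler(SocketHandler('127.0.0.1', 19996))

logger.info('What time is the testing at?')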

Elasticsearch snapshot and restore request commands

PUT /_snapshot/my_backup
{
  "type": "fs",
  "settings": {
    "location": "/home/kishlay/Documents/esbackup"
  }
}

This may throw a permission-denied exception; to get past it, run chmod 777 on the backup folder path.

PUT /_snapshot/my_backup2
{
  "type": "fs",
  "settings": {
    "location": "/home/kishlay/Documents/esbackup2"
  }
}

GET /_snapshot/my_backup/snapshot

GET /_snapshot

PUT /_snapshot/my_backup/snapshot_2?wait_for_completion=true
{
  "indices": "my_index4",
  "ignore_unavailable": true,
  "include_global_state": false
}

GET /_snapshot/my_backup/snapshot_2

DELETE /my_index4

POST /_snapshot/my_backup/snapshot_2/_restore

DELETE /khandialog

DELETE /khanscript

DELETE /_snapshot/my_backup/snapshot_2

PUT /_snapshot/my_ba…
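The same flow can be scripted. A minimal sketch in Python, assuming the official elasticsearch client (6.x-era API), a node on localhost:9200, and that the backup location is whitelisted under path.repo in elasticsearch.yml:

from elasticsearch import Elasticsearch

es = Elasticsearch(['http://localhost:9200'])

# Register the filesystem repository (the location must be listed under
# path.repo in elasticsearch.yml or Elasticsearch will reject it).
es.snapshot.create_repository(
    repository='my_backup',
    body={'type': 'fs',
          'settings': {'location': '/home/kishlay/Documents/esbackup'}})

# Snapshot a single index and block until the snapshot finishes.
es.snapshot.create(
    repository='my_backup',
    snapshot='snapshot_2',
    body={'indices': 'my_index4',
          'ignore_unavailable': True,
          'include_global_state': False},
    wait_for_completion=True)

# Drop the live index, then restore it from the snapshot.
es.indices.delete(index='my_index4')
es.snapshot.restore(repository='my_backup', snapshot='snapshot_2')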

Python code to get the URL list of all videos of a channel

import scrapy
import re
import pickle
import json
import sys

class MySpider(scrapy.Spider):
    name = "allvideos"

    # Output files: one for every URL visited, one for the full video URLs.
    allUrlFile = open('allUrl.txt', 'a')
    fullUrl = open('fullUrl.txt', 'a')

    # Pages are fetched through a local Splash instance so YouTube's
    # JavaScript-rendered video listing is available to the spider.
    localHost = "http://localhost:8050/render.html?url="
    # youtubeUrl = "https://www.youtube.com/channel/UCv1Ybb65DkQmokXqfJn0Eig/channels"  # channel with only one ajax
    # youtubeUrl = "https://www.youtube.com/user/khanacademy/videos"  # khan academy
    youtubeUrl = "https://www.youtube.com/channel/UCU0kWLAbhVGxXarmE3b8rHg/videos"  # khan hindi
    start_urls = [localHost + youtubeUrl]

    def start_requests(self):
        for url in self.start_urls:
            yield scrapy.Request(url, self.parse)

    def parse(self, response):
        self.log("this program just visited " + response.url)
        if 'browse_ajax' not in response.url:
            ...
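For this to run, a Splash instance has to be listening on port 8050; assuming Docker is available, something like docker run -p 8050:8050 scrapinghub/splash starts one, after which the spider can be launched with scrapy runspider on the file above (or scrapy crawl allvideos from inside a Scrapy project).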

Elasticsearch commands

PUT twitter/_doc/1
{
  "user": "kimchy",
  "message": "trying out Elasticsearch having running all the way goes down sizing are is the"
}

PUT /my_index1
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my": {
          "type": "standard",
          "stopwords": [ "is", "having" ]
        }
      }
    }
  }
}

GET /_search?q=having

GET /_analyze
{
  "analyzer": "standard",
  "text": "trying out Elasticsearch having running all the way goes down sizing are is the"
}

GET /_analyze?tokenizer=whitespace
{"You're the 1st runner home!"}

POST _analyze
{
  "analyzer": "my_analyzer",
  "text": "The quick brown fox."
}

PUT /my_index
{
  "mappings": {
    "blog": {
      "properties": …
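These console snippets map one-to-one onto the Python client. A minimal sketch, assuming the elasticsearch package (6.x-era API) and a node on localhost:9200:

from elasticsearch import Elasticsearch

es = Elasticsearch(['http://localhost:9200'])

# PUT twitter/_doc/1: index a document with an explicit id.
es.index(index='twitter', doc_type='_doc', id=1,
         body={'user': 'kimchy',
               'message': 'trying out Elasticsearch'})

# PUT /my_index1: create an index whose analyzer drops "is" and "having".
es.indices.create(index='my_index1', body={
    'settings': {'analysis': {'analyzer': {
        'my': {'type': 'standard', 'stopwords': ['is', 'having']}}}}})

# GET /_analyze: see which tokens the standard analyzer emits.
print(es.indices.analyze(body={'analyzer': 'standard',
                               'text': 'The quick brown fox.'}))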