
Python program to fetch a YouTube channel's video list and save it all to a CSV file

import requests  # HTTP client for calling the YouTube Data API
import csv

# YouTube Data API key
API_KEY = ""  # insert your API key
# YouTube channel ID
channel_id = ""  # insert the channel ID
page_token = ""
videos = []

has_next = True
while has_next:
    url = ("https://www.googleapis.com/youtube/v3/search?key="
           "{}&channelId={}&part=snippet,id"
           "&order=date&maxResults=50&pageToken={}"
           ).format(API_KEY, channel_id, page_token)
    resp = requests.get(url)
    data = resp.json()

    videos.extend(data['items'])

    # follow the result pagination until no nextPageToken is returned
    page_token = data.get('nextPageToken')
    if not page_token:
        has_next = False
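The token-following loop above is a common pagination pattern. It can be factored into a reusable generator that yields items until no nextPageToken is returned; this is a sketch using a stand-in fetch_page function in place of the real HTTP call:

```python
def paginate(fetch_page):
    """Yield items from successive pages until nextPageToken is absent."""
    token = ""
    while True:
        data = fetch_page(token)
        for item in data["items"]:
            yield item
        token = data.get("nextPageToken")
        if not token:
            break

# stand-in for the real API call: two pages of fake results
pages = {
    "": {"items": [1, 2], "nextPageToken": "p2"},
    "p2": {"items": [3]},
}
videos = list(paginate(pages.get))
print(videos)  # [1, 2, 3]
```

In the real script, fetch_page would be a function that builds the search URL for the given token and returns requests.get(url).json().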

# structure the data into rows
rows = []
for i in videos:
    title = i['snippet'].get('title', "")
    description = i['snippet'].get('description', "")
    video_url = "https://www.youtube.com/watch?v={}".format(
        i['id'].get('videoId', ""))
    # =image("url") is a Google Sheets formula, so the thumbnail
    # renders as a picture when the CSV is imported into a spreadsheet
    thumb = '=image("{}")'.format(
        i['snippet']['thumbnails'].get('default', {}).get('url', ""))
    rows.append([title, description, video_url, thumb])

# write the rows to a semicolon-delimited CSV file
path = "videos.csv"
with open(path, "w", newline="", encoding="utf-8") as csv_file:
    writer = csv.writer(csv_file, delimiter=";")
    writer.writerows(rows)

print("wrote {} rows to {}".format(len(rows), path))
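One reason to hand the fields to csv.writer as a list, rather than pre-joining them with ";", is that the writer automatically quotes any field that itself contains the delimiter, so a description with a semicolon in it does not break the column layout. A standalone sketch:

```python
import csv
import io

# a row whose second field contains the delimiter character itself
row = ["My title", "part one; part two", "https://www.youtube.com/watch?v=abc"]

buf = io.StringIO()
writer = csv.writer(buf, delimiter=";")
writer.writerow(buf_row := row)

line = buf.getvalue().strip()
print(line)  # the second field is quoted by the writer

# reading it back recovers the original three fields intact
parsed = next(csv.reader(io.StringIO(line), delimiter=";"))
print(parsed == row)  # True
```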
