gdrive-downloader is a collection of shell scripts runnable on all POSIX-compatible shells ( sh / ksh / dash / bash / zsh / etc ).
It can be used to download files or folders from Google Drive.
As this is a collection of shell scripts, there aren't many dependencies. See the Native Dependencies section below for the list of explicitly required programs.
For Linux or macOS, you hopefully don't need to configure anything extra; it should work by default.
Install Termux.
Then, pkg install curl
and done.
All use cases of this script are fully tested on Termux.
Install iSH
While it has not been officially tested, it should work given the description of the app. If you get it working, report it by creating an issue.
Again, it has not been officially tested on Windows, but there shouldn't be anything preventing it from working. If you get it working, report it by creating an issue.
This repo contains two types of scripts: POSIX-compatible and Bash-compatible.
These programs are required by both the Bash and POSIX scripts.
Program | Role In Script |
---|---|
curl | All network requests |
xargs | For parallel downloading |
mkdir | To create folders |
rm | To remove files and folders |
grep | Miscellaneous |
sed | Miscellaneous |
mktemp | To generate temporary files ( optional ) |
sleep | Self explanatory |
ps | To manage different processes |
du | To get actual file sizes |
If Bash is not available, or is available but the version is less than 4.x, then the programs below are also required:
Program | Role In Script |
---|---|
date | For installation, update and Miscellaneous |
stty or zsh or tput | To determine column size ( optional ) |
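The Bash-version condition above can be checked from any plain POSIX shell. A minimal sketch of such a check — illustrative only, not the actual install.sh logic, and `choose_script_type` is a hypothetical helper name:

```shell
# Illustrative sketch (not the actual install.sh code): pick the script
# flavor based on whether a bash >= 4.x binary is available on the system.
choose_script_type() {
    if command -v bash >/dev/null 2>&1; then
        # Ask the bash binary itself for its major version number
        bash_major="$(bash -c 'printf "%s" "${BASH_VERSINFO[0]}"')"
        if [ "${bash_major:-0}" -ge 4 ]; then
            echo "bash"
            return 0
        fi
    fi
    echo "posix"
}

choose_script_type
```

The extra `date`/`stty`/`tput` dependencies from the table above would only matter when this kind of check falls through to the "posix" branch.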
You can install the script with the automatic installation script provided in the repository.
Default values set by the automatic installation script ( all changeable ):
Repo: Akianonymus/gdrive-downloader
Command name: gdl
Installation path: $HOME/.gdrive-downloader
Source value: master
Shell file: .bashrc or .zshrc or .profile
For custom command name, repo, shell file, etc, see advanced installation method.
Now, for automatic install script, there are two ways:
To install gdrive-downloader in your system, you can run the below command:
curl -Ls --compressed https://drivedl.cf | sh -s
Alternatively, you can use the original GitHub URL instead of https://drivedl.cf
curl -Ls --compressed https://github.com/Akianonymus/gdrive-downloader/raw/master/install.sh | sh -s
and done.
This section provides information on how to use the install.sh script for custom use cases.
These are the flags that are available in the install.sh script:
-p | --path <dir_name>
Custom path where you want to install the script.
Note: For global installs, give a path outside of the home dir, like /usr/bin, and it must already be in the executable path.
-c | --cmd <command_name>
Custom command name; after installation, the script will be available under the given name.
-r | --repo <Username/reponame>
Install the script from your own custom repo, e.g. --repo Akianonymus/gdrive-downloader; make sure your repo's file structure matches the official repo.
-b | --branch <branch_name>
Specify the branch name of the GitHub repo; applies to both custom and default repos.
-s | --shell-rc <shell_file>
Specify a custom rc file, where PATH is appended; by default the script detects .zshrc, .bashrc and .profile.
-t | --time 'no of days'
Specify a custom auto-update interval ( the given input will be taken as a number of days ) after which the script will try to automatically update itself.
Default: 5 ( 5 days )
--sh | --posix
Force install the POSIX scripts even if the system has a compatible Bash binary present.
-q | --quiet
Only show critical error/success logs.
--skip-internet-check
Do not check for an internet connection; recommended for use in sync jobs.
-U | --uninstall
Uninstall the script and remove related files.
-D | --debug
Display script command trace.
-h | --help
Display usage instructions.
Now, run the script and use flags according to your use case.
E.g:
curl -Ls --compressed https://drivedl.cf | sh -s -- -r username/reponame -p somepath -s shell_file -c command_name -b branch_name
If you have followed the automatic method to install the script, then you can automatically update the script.
There are three methods:
Automatic updates
By default, the script checks for updates every 5 days. Use the -t / --time flag of install.sh to modify the interval.
An update log is saved in "${HOME}/.gdrive-downloader/update.log".
Use the script itself to update the script.
gdl -u or gdl --update
This will update the script where it is installed.
If you use this flag without actually installing the script,
e.g. just by sh gdl.sh -u
then it will install the script, or update it if already installed.
Run the installation script again.
Yes, just run the installation script again as we did in install section, and voila, it's done.
Note: The above methods always obey the values set by the user during advanced installation; e.g. if you have installed the script from a different repo, say myrepo/gdrive-downloader, then updates will also be fetched from that same repo.
After installation, no more configuration is needed for public files/folders.
But sometimes, downloading files from a shared drive ( team drive ) errors out. To tackle this, use the --key
flag to bypass that error. In case it still errors out, give your own API key as the argument.
To get your own API key, go to the Retrieve API key section in auth.md.
Note: Even after specifying an API key, don't recklessly download a file over and over; it will lead to a 24-hour IP ban.
To handle the issue ( more of an abuse ) described in the note above, use OAuth authentication.
Another scenario where OAuth authentication is needed is downloading private files/folders. Go to the Authentication section for more info.
gdl gdrive_id/gdrive_url
The script accepts either a gdrive_url or a gdrive_id as the argument, provided it is publicly accessible.
Now that we have covered the basics, move on to the next section for extra features and usage, like skipping sub-folders, parallel downloads, etc.
These are the custom flags that are currently implemented:
-aria | --aria-flags 'flags'
Use aria2c to download. -aria takes no arguments.
To pass custom flags as an argument, use the long form, --aria-flags, e.g.: --aria-flags '-s 10 -x 10'
Note 1: aria2c can only resume Google Drive downloads if the -k/--key or -o/--oauth option is used; otherwise the script will use curl.
Note 2: aria's split downloading won't work in normal mode ( without the -k or -o flag ) because it cannot get the remote file size. The same applies to any other feature that relies on the remote file size.
Note 3: Following from the notes above, aria is basically the same as curl in normal mode, so it is recommended only together with the --key or --oauth flag.
-o | --oauth
Use this flag to trigger oauth authentication.
Note: If both the --oauth and --key flags are used, --oauth is preferred.
--oauth-refetch-refresh-token
Use this flag to refetch the refresh token when the existing refresh token has expired.
-k | --key 'custom api key' ( optional argument )
Download with an API key. If no API key is specified, the predefined API key will be used.
Note: The in-script API key surely works, but it has a smaller quota, so it is recommended to use your own private key.
To save your API key in the config file, use gdl --key default="your api key". The API key will be saved in ${HOME}/.gdl.conf and used from then on.
Note: If both the --key and --oauth flags are used, --oauth is preferred.
-c | --config 'config file path'
Override default config file with custom config file.
Default: ${HOME}/.gdl.conf
-d | --directory 'foldername'
Custom workspace folder where given input will be downloaded.
-s | --skip-subdirs
Skip downloading of sub-folders when downloading a folder.
-p | --parallel "num of parallel downloads"
Download multiple files in parallel.
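Under the hood, the dependency table earlier attributes parallelism to xargs. A minimal sketch of that fan-out pattern, with placeholder ids and a printf standing in for the real per-file download command:

```shell
# Parallel fan-out with xargs -P: up to 3 workers run at once, one id each.
# The inner printf is a stand-in for the real download invocation.
printf '%s\n' id1 id2 id3 |
    xargs -P 3 -I{} sh -c 'printf "got %s\n" "$1"' _ {}
```

Because the workers run concurrently, the output lines may arrive in any order, which is also why the script's verbose progress output is limited in parallel mode.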
--proxy 'http://user:password@host:port'
Specify a proxy to use; should be in the format accepted by curl's --proxy and aria2c's --all-proxy flags.
--speed 'speed'
Limit the download speed, supported formats: 1K and 1M.
-ua | --user-agent 'user agent string'
Specify custom user agent.
-R | --retry 'num of retries'
Retry the file download if it fails; takes a positive integer as argument. Currently only for file downloads.
-in | --include 'pattern'
Only download files whose names contain the given pattern - applicable for folder downloads.
e.g.: gdl gdrive_id --include '1' will only include files with '1' in the name.
Regex can be used; patterns work with the grep -E command.
-ex | --exclude 'pattern'
Only download files whose names do not contain the given pattern - applicable for folder downloads.
e.g.: gdl gdrive_id --exclude '1' will only include files without '1' in the name.
Regex can be used; patterns work with the grep -E command.
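The include/exclude matching can be pictured as plain grep -E filtering over the folder's file list. A small sketch with made-up filenames, where '1' plays the role of the pattern:

```shell
# Hypothetical file list fetched for a folder.
files='photo1.jpg
notes.txt
photo2.jpg'

# --include '1': keep only names matching the pattern (extended regex)
printf '%s\n' "$files" | grep -E '1'

# --exclude '1': keep only names NOT matching the pattern
printf '%s\n' "$files" | grep -vE '1'
```

Any pattern grep -E accepts works here, e.g. '\.pdf$' to include only PDFs.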
-l | --log 'log_file_name'
Save downloaded files info to the given filename.
-q | --quiet
Suppress the normal output; only show success/error messages for file downloads, plus one extra line at the beginning for folders showing the number of files and sub-folders.
--verbose
Display detailed message (only for non-parallel downloads).
--skip-internet-check
Do not check for an internet connection; recommended for use in sync jobs.
-V | --version | --info
Show detailed info about script ( if script is installed system wide ).
-u | --update
Update the installed script in your system, if not installed, then install.
--uninstall
Uninstall the installed script in your system.
-h | --help 'flag name (optional)'
Print help for all flags and basic usage instructions.
To see help for a specific flag, --help flag_name ( with or without dashes )
e.g: gdl --help aria
-D | --debug
Display script command trace.
For oauth or api key authentication, see auth.md
On first run, the script asks for all the required credentials, which we have obtained in the previous section.
Execute the script: gdl gdrive_url/gdrive_id -o
Note: the -o / --oauth flag is needed if the file should be downloaded with authentication.
Now, it will ask for following credentials:
Client ID: Copy and paste from credentials.json
Client Secret: Copy and paste from credentials.json
Refresh Token: If you have previously generated a refresh token for your account, enter it; otherwise leave it blank. If you don't have a refresh token, the script outputs a URL on the terminal; open that URL in a web browser and tap Allow. Copy the code and paste it into the terminal.
If everything went fine, all the required credentials have been set.
After the first run, the credentials are saved in the config file. The config file is ${HOME}/.gdl.conf.
To use a different one temporarily, see the -c / --config flag in Download Script Custom Flags.
This is the format of a config file:
ACCOUNT_default_CLIENT_ID="client id"
ACCOUNT_default_CLIENT_SECRET="client secret"
ACCOUNT_default_REFRESH_TOKEN="refresh token"
ACCOUNT_default_ACCESS_TOKEN="access token"
ACCOUNT_default_ACCESS_TOKEN_EXPIRY="access token expiry"
where default is the name of the account.
You can use a config file on multiple machines; the values that are explicitly required are CLIENT_ID, CLIENT_SECRET and REFRESH_TOKEN.
ACCESS_TOKEN and ACCESS_TOKEN_EXPIRY are automatically generated using the REFRESH_TOKEN.
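Since the config file is plain shell variable assignments, loading it is just a matter of sourcing the file. A small sketch with a throwaway config — the client id value is a placeholder, and this illustrates the mechanism rather than the script's exact code:

```shell
# Write a throwaway config and source it, the way a shell script can load
# a file of KEY="value" assignments such as ${HOME}/.gdl.conf.
conf="$(mktemp)"
printf '%s\n' 'ACCOUNT_default_CLIENT_ID="placeholder-client-id"' > "$conf"
. "$conf"   # after sourcing, the assignments become shell variables
printf '%s\n' "$ACCOUNT_default_CLIENT_ID"
rm -f "$conf"
```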
A pre-generated config file can also be used where interactive terminal access is not possible, like Continuous Integration, Docker, Jenkins, etc.
Just print the values to "${HOME}/.gdl.conf", e.g.:
printf "%s\n" '
ACCOUNT_default_CLIENT_ID="client id"
ACCOUNT_default_CLIENT_SECRET="client secret"
ACCOUNT_default_REFRESH_TOKEN="refresh token"
' >| "${HOME}/.gdl.conf"
Note: Don't drop the quotes around the values; they are necessary to handle spacing.
Note: If you have an old config, nothing extra is needed; just run the script once and the old config will be automatically converted to the new format.
When downloading a file or a folder ( except in parallel downloading ), the script shows a progress bar for the ongoing download.
================[ Downloaded: 4.6 GB | Left: 44.7 GB ]=================
------------------[ Speed: 48.0 MB/s | ETA: 15m10s ]-------------------
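The ETA shown in the bar boils down to simple integer arithmetic: seconds left = bytes left / bytes per second. A minimal sketch of that calculation — illustrative, not the script's actual code, and `eta` is a hypothetical helper name:

```shell
# Compute an "XmYs" ETA string from bytes left and current speed.
eta() { # $1 = bytes left, $2 = bytes per second
    secs=$(( $1 / $2 ))
    printf '%dm%ds\n' $(( secs / 60 )) $(( secs % 60 ))
}

eta 9000 100   # 90 seconds left -> prints 1m30s
```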
You can use multiple inputs without any extra hassle.
Pass arguments normally, e.g.: gdl url1 url2 id1 id2
where url1 and url2 are drive URLs and the other two are gdrive ids.
Downloads interrupted due to a bad internet connection or manual interruption can be resumed from the same position.
You can interrupt as many times as you want; it will resume ( hopefully ).
It will not download again if the file is already present, thus avoiding bandwidth waste.
In normal mode, when aria is used and the download is interrupted, it will be resumed by curl, because aria cannot detect the remote file size.
But when --key or --oauth is used, it will resume successfully with aria too.
If you have followed the automatic method to install the script, then you can automatically uninstall the script.
There are two methods:
Use the script itself to uninstall the script.
gdl --uninstall
This will remove the script-related files and revert the PATH change in the shell rc file.
Run the installation script again with -U/--uninstall flag
curl -Ls --compressed https://drivedl.cf | sh -s -- --uninstall
Yes, just run the installation script again with the flag and voila, it's done.
Note: Above methods always obey the values set by user in advanced installation.
In this section, the mechanism of the script is explained, for anyone curious how it downloads folders when that is not officially supported.
The main catch here is that the script uses the Google Drive API to fetch the details of a given file or folder id/url. But then how does it work without authentication?
Well, it does use an API key, but one that is provided inside the script. I grabbed the API key from the Drive file page itself: just open a gdrive folder in a browser, open the console, watch the network requests, open one of the POST requests and there you have it.
Also, Google API keys have a referer check, so we pass the referer https://drive.google.com with curl to properly use the key.
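As a hedged sketch of what such a request could look like: curl's -e/--referer flag sends the referer, and the Drive v3 files endpoint returns the metadata fields mentioned below. FILE_ID and API_KEY are placeholders, and the command is only printed here so the sketch stays network-free:

```shell
# Placeholders: substitute a real file id and API key to actually use this.
FILE_ID="some_file_id"
API_KEY="some_api_key"
url="https://www.googleapis.com/drive/v3/files/${FILE_ID}?key=${API_KEY}&fields=name,size,mimeType"

# The actual request would be:  curl -s -e "https://drive.google.com" "$url"
# We only print the command line here instead of executing it.
printf 'curl -s -e %s %s\n' "https://drive.google.com" "$url"
```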
Now, next steps are simple enough:
Main Function: _check_id
It parses the input and extracts the file_id, then makes a network request to fetch the name, size and mimeType of the id.
If that doesn't return an HTTP 40* status, it proceeds.
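The id-extraction part of this parsing can be sketched with sed over the common file-URL shapes. This is an illustrative parser, not the script's actual one, and it only covers bare ids, .../file/d/<id>/... URLs and ...?id=<id> URLs:

```shell
# Illustrative id extraction (not the script's actual parser).
extract_id() {
    case "$1" in
        *drive.google.com*)
            printf '%s\n' "$1" |
                sed -e 's|.*/d/||' -e 's|.*id=||' -e 's|[/?].*||'
            ;;
        *) printf '%s\n' "$1" ;;  # assume the input is already an id
    esac
}

extract_id "https://drive.google.com/file/d/1A2b3C4d/view?usp=sharing"
extract_id "https://drive.google.com/open?id=1A2b3C4d"
extract_id "1A2b3C4d"
```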
Depending on whether the id refers to a file or a folder, one of the two functions below is called:
Main Function: _download_file
Before downloading, the script checks if the file is already present. If present, it compares the local file size to the remote file size and resumes the download if applicable.
Recent changes by Google have made the download links IP-specific and very strict about cookies, so a file can only be downloaded on the system where the cookies were fetched. Earlier, cookies were only needed for files greater than 100 MB.
Either way, a partial file can be moved to a different system and the script will resume it from the same position.
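The resume decision described above can be sketched as a plain size comparison — illustrative only, not the script's exact code, and `decide_action` is a hypothetical helper name:

```shell
# Illustrative resume decision: compare local size against remote size.
decide_action() { # $1 = local size in bytes, $2 = remote size in bytes
    if [ "$1" -eq "$2" ]; then
        echo "skip"        # complete file already present, avoid re-download
    elif [ "$1" -lt "$2" ]; then
        echo "resume"      # e.g. curl -C - continues from the local size
    else
        echo "redownload"  # local file is larger than remote: start over
    fi
}

decide_action 100 100
decide_action 40 100
```

This only works when the remote size is known, which ties back to the earlier notes about --key / --oauth and remote-size-dependent features.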
Main Function: _download_folder
First, the details of all files and sub-folders are fetched. Details include id and mimeType.
Then it downloads the files using the _download_file function, and for sub-folders, the _download_folder function is called recursively.
Issues
Use the GitHub issue tracker for any bugs or feature suggestions.
Total Contributors
Pull Requests
Submit patches to code or documentation as GitHub pull requests.
Make sure to run format_and_lint.sh and release.sh before opening a new pull request.
If using a code editor, you can use the shfmt and shellcheck plugins instead of format_and_lint.sh