How to use wget to download a file stored in Google Drive without making a publicly shareable link to the specific file?
Solution 1
Try looking at rclone
https://rclone.org/
It's got a very complete feature set, is command-line based (the syntax is very similar to rsync) and supports Google Drive via OAuth. It should be exactly what you're looking for.
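As a sketch of what that workflow looks like (the remote name gdrive and the Drive path below are placeholders I've invented, not part of the original answer):

```shell
# One-time, interactive OAuth setup in a browser:
#   rclone config        # create a remote of type "drive", e.g. named "gdrive"

REMOTE="gdrive"                  # hypothetical remote name chosen in `rclone config`
SRC="backups/archive.tar.gz"     # hypothetical path to the file inside your Drive

# Copy a single file from Drive into the current directory; rclone handles
# authentication, retries and partial-transfer resume itself:
CMD="rclone copy $REMOTE:$SRC ."
echo "$CMD"
```

Because the OAuth token is stored by `rclone config`, the file never needs to be made publicly shareable.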
Solution 2
While many older methods no longer work, I found that the following method still works (at least as of today):
wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILE_ID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILE_ID" -O FILE_NAME && rm -rf /tmp/cookies.txt
(replace FILE_ID with the file ID from the Google Drive download link, and FILE_NAME with the name you want for the downloaded file)
(answer originally found in https://silicondales.com/tutorials/g-suite/how-to-wget-files-from-google-drive/)
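The inner wget | sed pipeline in that one-liner just scrapes the confirm= token out of the interstitial "virus scan warning" page. The extraction step can be tried in isolation on a canned snippet (the token value t0k3n_AbC below is made up):

```shell
# Stand-in for the fragment of the warning page that carries the token:
SAMPLE='href="/uc?export=download&confirm=t0k3n_AbC&id=FILE_ID"'

# Essentially the same sed expression as in the one-liner
# (GNU sed; use -E instead of -r on BSD/macOS):
TOKEN=$(printf '%s' "$SAMPLE" | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1/p')
echo "$TOKEN"    # t0k3n_AbC
```

The cookie jar saved by the first wget call is what makes the token valid for the second, real download request.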
Solution 3
Create a bash script called getgoogle somewhere in your PATH, e.g. /usr/local/bin or ~/bin (if you've added that to your PATH):
#!/bin/bash
if [[ $1 == *"id="* ]]; then
    ID=${1##*'id='}
else
    ID=$1
fi
echo "Retrieving Google Drive item with ID $ID"
FILENAME=$(curl -s "https://drive.google.com/file/d/$ID/view?usp=sharing" | grep -o '<title>.*</title>' | cut -d'>' -f2 | awk -F ' - Goo' '{print $1}')
wget --no-check-certificate "https://docs.google.com/uc?export=download&id=$ID" -O "$FILENAME"
Make sure you change the permissions on the getgoogle script after saving it by running:
chmod u+x getgoogle
Usage:
getgoogle <URL|ID>
This will download the Google Drive item, whether you pass the full link as URL or just the identifier ID, and rename it upon download to the original filename of the file that was uploaded to Google Drive.
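The original filename comes from the <title> of the file's preview page, which has the form ORIGINAL_NAME - Google Drive. The grep | cut | awk part of the script can be checked on a canned title (myfile.zip is a made-up name):

```shell
# Minimal stand-in for the HTML that `curl -s .../view?usp=sharing` returns:
PAGE='<head><title>myfile.zip - Google Drive</title></head>'

# The same extraction pipeline as in the getgoogle script:
NAME=$(printf '%s' "$PAGE" | grep -o '<title>.*</title>' | cut -d'>' -f2 | awk -F ' - Goo' '{print $1}')
echo "$NAME"    # myfile.zip
```

Splitting on ' - Goo' rather than ' - ' keeps filenames that themselves contain " - " intact.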
Updated on September 18, 2022

Comments
- sub1996, almost 2 years ago:
First make the file publicly shareable, then:
For small files I use
wget --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O FILENAME
for large files
wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O FILENAME && rm -rf /tmp/cookies.txt
But how to do it without making publicly shareable link?
Note: wget is not a must for me. If you know any other software, please recommend it, but it must have the following features:
- Command-line based
- Download-resume feature
- Lightweight and portable (if possible; it's not compulsory)
- Multi-threaded downloading and file-appending feature (if possible; it's not compulsory)
- Anaksunaman, over 4 years ago:
"But how to do it without making a publicly shareable link?" - I would be skeptical that this is even possible, for both technical and security-related reasons. If you want direct access to files via e.g. wget, you are likely better off placing them on a normal web or FTP server, even one you host yourself.
- Avio, over 3 years ago:
Thanks man, this worked almost flawlessly. The only additional step I had to take: after a few dozen redirects, the script downloaded 410294 extra bytes at the beginning of the binary I wanted (these 410294 bytes started with <!DOCTYPE html><html><head><title>Google Drive - Virus scan warning</title>), so I had to perform a dd with skip to remove this HTML header.
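The header-stripping step Avio describes can be done with dd and its skip operand. A sketch, using a 6-byte dummy prefix in place of the 410294-byte HTML page from the comment:

```shell
# Build a file with a known junk prefix (stand-in for the HTML warning page):
printf 'JUNK: real-payload' > mixed.bin

# Skip the first 6 bytes ("JUNK: "); bs=1 skip=N skips exactly N bytes portably,
# at the cost of byte-at-a-time copying (fine for a one-off fix):
dd if=mixed.bin of=clean.bin bs=1 skip=6 2>/dev/null

PAYLOAD=$(cat clean.bin)
echo "$PAYLOAD"    # real-payload
```

For a 410294-byte prefix you would use skip=410294 the same way; with GNU dd, `iflag=skip_bytes` together with a larger bs avoids the slow single-byte copy.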