Downloading a RAR file from Mediafire using WGET
Solution 1
From Mediafire's Terms of Service:
General Use of the Service, Permissions and Restrictions
You agree while using MediaFire Services, that you may not:
Alter or modify any part of the Services;
Use the Services for any illegal purpose;
Use any robot, spider, offline readers, site search and/or retrieval application, or other device to retrieve or index any portion of the Services, with the exception of public search engines
So essentially, by using anything other than the tools Mediafire provides via its website, you are in fact breaking their terms of service.
Solution 2
bash function:
mdl () {
    url=$(curl -Lqs "$1" | grep "href.*download.*media.*" | tail -1 | cut -d '"' -f 2)
    aria2c -x 6 "$url"  # or wget "$url" if you prefer.
}
Example:
$ sudo apt install aria2
$ mdl "http://www.mediafire.com/?tjmjrmtuyco"
01/14 13:58:34 [NOTICE] Downloading 1 item(s)
38MiB/100MiB(38%) CN:4 DL:6.6MiB ETA:9s]
Solution 3
Mediafire now ties the download link to the IP address that requested the page. So first, you need to download the page using the following command:
curl -O "http://www.mediafire.com/file/6ddhdfg/db.zip/file"
Once the file is downloaded, find the URL inside it that looks like
http://download*.mediafire.com/*
and then use wget to download the file:
wget http://download*.mediafire.com/*
P.S. The * part varies from download to download, so you need to find the exact value.
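The two manual steps above can be combined into one small script. This is only a sketch under assumptions: the page URL is the example link from above, and the grep pattern assumes the direct link still matches http://download*.mediafire.com/*.

```shell
# Sketch of the steps above as one script. Replace page_url with your own link;
# the grep pattern assumes the direct link still looks like
# http://download<something>.mediafire.com/...
page_url="http://www.mediafire.com/file/6ddhdfg/db.zip/file"

# Step 1: fetch the landing page and extract the first direct download URL
direct_url=$(curl -Ls "$page_url" | grep -o 'http://download[^"]*\.mediafire\.com/[^"]*' | head -1)

# Step 2: download the file from the same IP that requested the page
wget "$direct_url"
```

If the direct link is served over HTTPS or the markup changes, the grep pattern will need adjusting.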
Solution 4
I've never tried it myself, but there are a few things you could try to "cheat" the website.
For example, --referer will let you specify a referrer URL: maybe the site expects you to come from a specific "home" page or something, and with this option wget will pretend it's coming from there.
Also, --user-agent will make wget "pretend" it's a different agent, namely a browser like Firefox.
--header will let you forge the whole HTTP request to mimic that of a browser.
If none of those work, there are more options dealing with cookies and other advanced settings: see man wget for the whole list.
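Combining those options into one command might look like the sketch below. Everything here is an assumption: both URLs are placeholders, and the user-agent string is just one example of a browser identity, so treat this as a starting point rather than a known-working recipe.

```shell
# Sketch only: both URLs below are hypothetical placeholders, and the
# user-agent string is just one example browser identity.
url="http://download.mediafire.com/yourfile.rar"    # hypothetical direct link
referer="http://www.mediafire.com/?tjmjrmtuyco"     # the file's landing page
agent="Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0"

# Pretend to be Firefox arriving from the landing page
wget --referer="$referer" \
     --user-agent="$agent" \
     --header="Accept: text/html,application/xhtml+xml" \
     "$url"
```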
I hope this helps a bit: if you succeed, please post how you did it!
Solution 5
Actually it can be done. What you have to do is:
- Go to the link as if you were going to download the file to your computer
- When the "download" button comes up, right-click it, copy the link, and pass that link to wget.
It'll be something like
wget http://download85794.mediafire.com/whatever_your_file_is
Admin
Updated on September 18, 2022

Comments
-
Admin almost 2 years
Example:
http://www.mediafire.com/?tjmjrmtuyco
This was what I tried...
wget -A rar [-r [-l 1]] <mediafireurl>
That is to say, I tried with and without the recursive option. It ends up downloading an HTML page of a few KB in size, while what I want is in the range 90-100 MB and RAR.
What happens with MediaFire for those who may not be aware, is that it first says
Processing Download Request...
This text after a second or so turns into the download link and reads
Click here to start download..
I would appreciate it if someone would tell me how to write a proper script for this situation.
-
Joachim Sauer almost 13 years This is probably not allowed according to the Mediafire TOS, and they will do their best to make it as hard as possible for you to do.
-
tumchaaditya about 12 years Seems to be difficult with the captcha, JavaScript timer, and all the other things in place... they also have mechanisms in place to block downloads from much more sophisticated download managers.
-
tumchaaditya about 12 years You can try JDownloader. It automates the download process for such file-sharing sites (Mediafire, FileSonic, etc.).
-
Zibri over 4 years Please mark mine as the answer: superuser.com/a/1517096/635532
-
A.Essam about 8 years That's right! It works this way.
-
Heath Mitchell almost 4 years But if wget counts as a "retrieval application", then a browser does too... I think they're talking about things that crawl the whole site
-
Luc H almost 4 years Sadly, it does not work anymore; they just serve the .html
-
Banee Ishaque K almost 3 yearsWorking successfully...