How to capture cURL output to a file?
Solution 1
curl -K myconfig.txt -o output.txt
Writes the output of the first transfer to the file you specify (overwriting it if an old one exists).
curl -K myconfig.txt >> output.txt
Appends all output you receive to the specified file.
Note: -K myconfig.txt tells curl to read its options (including the URLs to fetch) from a config file; it is optional if you pass the URL directly on the command line.
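For reference, the file passed to -K is just a curl config file with one option per line (long option names may be written without the leading dashes). A minimal hypothetical myconfig.txt might look like this:
# myconfig.txt - hypothetical example, one "url" entry per transfer
url = "https://example.com/page1.html"
url = "https://example.com/page2.html"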
Solution 2
For a single file you can use -O instead of -o filename to use the last segment of the URL path as the filename. Example:
curl http://example.com/folder/big-file.iso -O
will save the result to a new file named big-file.iso in the current folder. In this way it works similarly to wget, but lets you specify other curl options that are not available when using wget.
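For instance, since it is still plain curl, -O can be combined with any other curl options (same illustrative URL as above):
# follow redirects (-L) and resume an interrupted download (-C -)
# while still naming the file after the last URL segment
curl -L -C - -O http://example.com/folder/big-file.iso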
Solution 3
There are several options for making curl write its output to a file:
# saves it to myfile.txt
curl http://www.example.com/data.txt -o myfile.txt
# "#1" is replaced by the string matched by the first glob pattern in the
# URL, so each output filename records which variant was fetched
curl "http://www.example.com/{data,info}.txt" -o "file_#1.txt"
# saves to data.txt, the filename extracted from the URL
curl http://www.example.com/data.txt -O
# saves to the filename given by the server's Content-Disposition header, if any (-J must be combined with -O)
curl http://www.example.com/data.txt -O -J
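curl's URL globbing can also fetch a whole numbered series in one command (the range and filenames here are illustrative):
# downloads part1.txt through part5.txt, saving each as part_1.txt ... part_5.txt
curl "http://www.example.com/part[1-5].txt" -o "part_#1.txt"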
Solution 4
For those of you who want to copy the cURL output to the clipboard instead of writing it to a file, pipe (|) the cURL command into pbcopy. Example:
curl https://www.google.com/robots.txt | pbcopy
This will copy all the content from the given URL to your clipboard.
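On Linux, where pbcopy is not usually available, xclip is a common substitute (assuming it is installed); adding -s just keeps curl's progress meter out of the terminal:
curl -s https://www.google.com/robots.txt | xclip -selection clipboard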
Solution 5
Either curl or wget can be used in this case. All three of these commands do the same thing: they download the file at http://path/to/file.txt and save it locally as "my_file.txt":
wget http://path/to/file.txt -O my_file.txt # my favorite--it has a progress bar
curl http://path/to/file.txt -o my_file.txt
curl http://path/to/file.txt > my_file.txt
Notice that the first command's -O is the capital letter "O". The nice thing about the wget command is that it shows a progress bar.
You can prove that the files downloaded by each of the three techniques above are exactly identical by comparing their SHA-512 hashes. Running sha512sum my_file.txt after each of the commands above and comparing the results reveals that all three files have the exact same hash, meaning they are identical byte-for-byte.
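A minimal sketch of that check, using the same placeholder URL but distinct output names so the three copies can coexist:
wget http://path/to/file.txt -O my_file_1.txt
curl http://path/to/file.txt -o my_file_2.txt
curl http://path/to/file.txt > my_file_3.txt
# all three digests should be identical
sha512sum my_file_1.txt my_file_2.txt my_file_3.txt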
See also: wget command to download a file and save as a different filename

Tony
Updated on July 08, 2022

Comments
-
Tony 6 months
I have a text document that contains a bunch of URLs in this format:
URL = "sitehere.com"
What I'm looking to do is to run curl -K myfile.txt and get the output of the response cURL returns into a file. How can I do this?
-
Tony about 10 years
Sorry, maybe I need to clarify - the doc with all my URLs in the format above is called myfile.txt, so I run curl -K myfile.txt and it runs through each one, but I don't get the output into any file.
-
kodybrown over 6 years
I use the redirect for my command lines:
curl url > destfile.x
-
kris about 5 years
When I do either of these, the output still displays in the terminal, not in the file.
-
jglouie over 4 years
@kris You probably have an ampersand in the URL. Put the URL in double quotes and then try again.
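For example (hypothetical URL) - without the quotes, the shell treats everything after the & as a separate background command, so curl never sees the full URL:
curl "http://example.com/search?q=curl&page=2" -o results.html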
-
Arya Pourtabatabaie about 4 years
It works without the -K. With it, I get "No URL specified."
-
qwr over 3 years
For multiple files, use --remote-name-all. See unix.stackexchange.com/a/265819/171025
-
lacostenycoder about 3 years
pbcopy is only available on macOS. However, xclip can be used in its place on Linux; see this question. In most cases, though, I would prefer curl http://example.com -o example_com.html && cat example_com.html | pbcopy so you wouldn't need to curl again if you accidentally clear your clipboard.
-
lacostenycoder about 3 years
Also, this should be used with caution if you're unsure of the size of the payload. For example, you probably wouldn't want to paste this into a text editor, but opening it in vim is no problem:
curl http://www.textfiles.com/etext/FICTION/fielding-history-243.txt | pbcopy
maybe don't try this!
-
Jacky Supit over 1 year
Thank you. I was thinking about using wget and then finding a way to ignore SSL as curl does. It turned out that we only need to use -o, like wget, instead of -k like curl :)
-
lk_vc 11 months
To follow redirects, add the -L option.
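For example (illustrative URL), so a 3xx redirect is followed to the final location instead of saving the redirect page itself:
curl -L http://example.com/old-path -o output.txt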
-
Thor Galle 7 months
Thanks, the man page mentions that this also outputs the "descriptive information" that -vv displays (SSL info, HTTP verb, headers, ...), which I wanted to store. None of the other answers write that to a file.
-
Little_Ye233 6 months
Another addition to qwr's comment: the argument needs to be put before the URLs, like curl --remote-name-all https://example.tld/resource{1,2,3}. See: curl.se/docs/manpage.html#--remote-name-all
-
wybe 6 months
The >> appends; it does not overwrite. If you would like to overwrite, use a single >.
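For example (illustrative URLs):
curl http://example.com/a.txt > output.txt   # overwrites output.txt
curl http://example.com/b.txt >> output.txt  # appends to output.txt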