How to capture cURL output to a file?


Solution 1

curl -K myconfig.txt -o output.txt 

Writes the output to the file you specify, overwriting it if it already exists. Note that -o pairs one output file with one URL, so if your config file lists several URLs, only the first response lands in output.txt.

curl -K myconfig.txt >> output.txt

Appends all output you receive to the specified file (a single > would overwrite instead).

Note: The -K (read options from a config file) is optional; both redirection forms work with a plain curl URL as well.
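The overwrite-vs-append behavior can be sketched offline with a file:// URL (assumes a curl build with file:// support, which is standard; the /tmp paths are made up for the demo):

```shell
# Create a local file to fetch, so no network is needed.
printf 'hello\n' > /tmp/curl_demo_src.txt
rm -f /tmp/curl_demo_append.txt

# -o truncates and rewrites the output file on every run:
curl -s -o /tmp/curl_demo_out.txt "file:///tmp/curl_demo_src.txt"
curl -s -o /tmp/curl_demo_out.txt "file:///tmp/curl_demo_src.txt"

# >> appends, so repeated runs accumulate output:
curl -s "file:///tmp/curl_demo_src.txt" >> /tmp/curl_demo_append.txt
curl -s "file:///tmp/curl_demo_src.txt" >> /tmp/curl_demo_append.txt
```

After this, the -o file holds one copy of the content while the >> file holds two.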

Solution 2

For a single file you can use -O instead of -o filename to use the last segment of the URL path as the filename. Example:

curl http://example.com/folder/big-file.iso -O

will save the results to a new file named big-file.iso in the current folder. In this way it works similarly to wget, but lets you specify other curl options that wget does not offer.
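A minimal offline sketch of how -O derives the filename from the last URL segment (the file:// URL and /tmp paths are stand-ins for a real download):

```shell
# Fake "remote" file; file:// lets curl run without a network.
printf 'fake iso bytes\n' > /tmp/big-file.iso
mkdir -p /tmp/curl_O_demo
cd /tmp/curl_O_demo

# -O names the local file after the last segment of the URL path.
curl -s -O "file:///tmp/big-file.iso"
ls big-file.iso
```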

Solution 3

There are several options to make curl output to a file:

# saves the response to myfile.txt
curl http://www.example.com/data.txt -o myfile.txt
# #1 is replaced by the first glob pattern in the URL (e.g. {data}),
# so each matched URL gets its own output file
curl "http://www.example.com/{data}.txt" -o "file_#1.txt"
# saves to data.txt, the filename extracted from the URL
curl http://www.example.com/data.txt -O
# saves to the filename given by the server's Content-Disposition header, if any
curl http://www.example.com/data.txt -O -J
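The #1 substitution only kicks in when the URL contains a glob pattern (braces or brackets); it expands to what that pattern matched, not to the whole URL. A small offline sketch, with made-up /tmp filenames:

```shell
# Two local files the brace glob will match.
printf 'A\n' > /tmp/glob_a.txt
printf 'B\n' > /tmp/glob_b.txt
rm -f /tmp/copy_a.txt /tmp/copy_b.txt

# {a,b} is glob pattern 1, so #1 becomes "a" for the first fetch
# and "b" for the second, yielding copy_a.txt and copy_b.txt.
curl -s "file:///tmp/glob_{a,b}.txt" -o "/tmp/copy_#1.txt"
```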

Solution 4

For those of you who want to copy the cURL output to the clipboard instead of a file, you can use pbcopy (macOS) by adding a pipe | after the cURL command.

Example: curl https://www.google.com/robots.txt | pbcopy. This copies all the content from the given URL to your clipboard.
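Since pbcopy is macOS-only, a portable variation uses tee to keep a file copy while the pipe continues. In this sketch, `cat > /dev/null` stands in for the clipboard command so it runs anywhere; the file:// URL and /tmp paths are demo stand-ins:

```shell
printf 'clipboard me\n' > /tmp/clip_src.txt

# tee writes a backup file while passing output down the pipe; replace
# the final `cat > /dev/null` with pbcopy (macOS) or
# `xclip -selection clipboard` (Linux) to actually hit the clipboard.
curl -s "file:///tmp/clip_src.txt" | tee /tmp/clip_backup.txt | cat > /dev/null
```

The tee copy means you don't have to re-run curl if the clipboard gets cleared.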

Solution 5

Either curl or wget can be used in this case. All three of these commands do the same thing: download the file at http://path/to/file.txt and save it locally as "my_file.txt":

wget http://path/to/file.txt -O my_file.txt  # my favorite--it has a progress bar
curl http://path/to/file.txt -o my_file.txt
curl http://path/to/file.txt > my_file.txt

Notice the first one's -O is the capital letter "O".

The nice thing about the wget command is that it shows a progress bar.

You can verify that the files downloaded by the three techniques above are byte-for-byte identical by comparing their sha512 hashes. Run sha512sum my_file.txt after each of the commands above; all three runs produce the exact same hash, meaning the files are identical.
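The hash comparison can be sketched offline (this assumes GNU coreutils sha512sum; on macOS use `shasum -a 512` instead, and the file:// URL replaces a real download):

```shell
printf 'same bytes\n' > /tmp/hash_src.txt

# Download the same content two ways: -o vs. shell redirection.
curl -s "file:///tmp/hash_src.txt" -o /tmp/hash_f1.txt
curl -s "file:///tmp/hash_src.txt" > /tmp/hash_f2.txt

# Identical bytes produce identical sha512 hashes.
h1=$(sha512sum /tmp/hash_f1.txt | cut -d ' ' -f 1)
h2=$(sha512sum /tmp/hash_f2.txt | cut -d ' ' -f 1)
[ "$h1" = "$h2" ] && echo "identical"
```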

See also: wget command to download a file and save as a different filename

Author: Tony

Updated on July 08, 2022

Comments

  • Tony
    Tony 6 months

    I have a text document that contains a bunch of URLs in this format:

    URL = "sitehere.com"
    

    What I'm looking to do is to run curl -K myfile.txt and capture the response output that cURL returns into a file.

    How can I do this?

  • Tony
    Tony about 10 years
    Sorry, maybe I need to clarify - the doc with all my URLs in the format above is called myfile.txt, so I do curl -K myfile.txt and it runs through each one, but I don't get the output into any file.
  • kodybrown
    kodybrown over 6 years
    I use the redirect for my command lines: curl url > destfile.x
  • kris
    kris about 5 years
    When I do either of these the output still displays in the terminal, not in the file
  • jglouie
    jglouie over 4 years
    @kris you probably have an ampersand in the url. put the url in double quotes and then try
  • Arya Pourtabatabaie
    Arya Pourtabatabaie about 4 years
    It works without the -K. With it, I get "No URL specified."
  • qwr
    qwr over 3 years
    for multiple files use --remote-name-all unix.stackexchange.com/a/265819/171025
  • lacostenycoder
    lacostenycoder about 3 years
    pbcopy is only available on macOS. However, xclip can be used in its place on Linux; see this question. In most cases, though, I would prefer curl http://example.com -o example_com.html && cat example_com.html | pbcopy, so you wouldn't need to curl again if you accidentally clear your clipboard.
  • lacostenycoder
    lacostenycoder about 3 years
    Also, this should be used with caution if you're unsure of the size of the payload. For example, you probably wouldn't want to paste this into a text editor, though opening it in vim is no problem: curl http://www.textfiles.com/etext/FICTION/fielding-history-243.txt | pbcopy - maybe don't try this!
  • Jacky Supit
    Jacky Supit over 1 year
    thank you. i was thinking about using wget and then found the solution to ignore ssl like curl. turned out that we only need to use -o like wget instead of -k like curl :)
  • lk_vc
    lk_vc 11 months
    To follow redirect, add -L option.
  • Thor Galle
    Thor Galle 7 months
    Thanks, the man page mentions that this also outputs the "descriptive information" that -vv displays (SSL info, HTTP verb, headers, ...), which I wanted to store. None of the other answers write that to a file.
  • Little_Ye233
    Little_Ye233 6 months
    Another addition to qwr's comment: the argument needs to be placed before the URLs, like curl --remote-name-all https://example.tld/resource{1,2,3}. See: curl.se/docs/manpage.html#--remote-name-all
  • wybe
    wybe 6 months
    The >> appends, it does not overwrite. If you would like to overwrite, use a single >