curl Command in Linux: Usage and Examples

curl is a command-line utility for transferring data to or from a server, designed to work without user interaction. With curl, you can download or upload data using one of the supported protocols, including HTTP, HTTPS, SCP, SFTP, and FTP. curl provides a number of options that allow you to resume transfers, limit bandwidth, use proxies, add user authentication, and much more.
In this guide, we cover curl with real examples and the options you will use most often.
Installing Curl
The curl package is pre-installed on most Linux distributions today.
To check whether the curl package is installed on your system, open your terminal, type curl, and press Enter. If you have curl installed, the system will print curl: try 'curl --help' or 'curl --manual' for more information. Otherwise, you will see something like curl command not found.
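You can also check the installed version directly, which additionally lists the protocols and features your curl build supports:

```shell
# Print curl's version line plus the protocols and features
# this particular build supports
curl --version
```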
If curl is not installed, you can easily install it using the package manager of your distribution.
Install Curl on Ubuntu and Debian
sudo apt update
sudo apt install curl
Install Curl on Fedora, RHEL, and Derivatives
sudo dnf install curl
How to Use Curl
The syntax for the curl command is as follows:
curl [options] [URL...]
In its simplest form, when invoked without any option, curl displays the specified resource to the standard output.
For example, to retrieve the example.com homepage, you would run:
curl example.com
The command will print the source code of the example.com homepage in your terminal window.
If no protocol is specified, curl tries to guess the protocol you want to use, and it will default to HTTP.
Save the Output to a File
To save the result of the curl command, use either the -o or -O option.
Lowercase -o saves the file with a custom filename:
curl -o output.html https://example.com/
Uppercase -O saves the file with its original filename from the URL:
curl -O https://go.dev/dl/go1.24.2.linux-amd64.tar.gz
Download Multiple Files
To download multiple files at once, use multiple -O options, each followed by the URL of a file you want to download.
In the following example, we are downloading the Arch Linux and Debian ISO files:
curl -O https://geo.mirror.pkgbuild.com/iso/latest/archlinux-x86_64.iso \
-O https://www.debian.org/distrib/netinst
Resume a Download
You can resume a download by using the -C - option. This is useful if your connection drops during the download of a large file, and instead of starting the download from scratch, you can continue the previous one.
For example, if you are downloading the Ubuntu 24.04 ISO file using the following command:
curl -O https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso
and suddenly your connection drops, you can resume the download with:
curl -C - -O https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso
Get the HTTP Headers of a URL
HTTP headers are colon-separated key-value pairs containing information such as User-Agent, content type, and encoding. Headers are passed between the client and the server with the request or the response.
Use the -I option to fetch only the HTTP headers of the specified resource:
curl -I https://www.ubuntu.com/
HTTP/2 301
server: nginx/1.14.0 (Ubuntu)
date: Fri, 10 Apr 2026 14:39:14 GMT
content-type: text/html
content-length: 162
location: https://ubuntu.com/
strict-transport-security: max-age=15724800
Each line is a header field. The first line shows the HTTP version and status code.
Save the Response Headers to a File
If you want to inspect the headers later, use -D to save them to a file:
curl -D headers.txt https://www.ubuntu.com/ -o /dev/null
Test if a Website Supports HTTP/2
To check whether a particular URL supports HTTP/2, fetch the HTTP headers with -I along with the --http2 option:
curl -I --http2 -s https://linuxize.com/ | grep HTTP
The -s option tells curl to run in silent mode and hide the progress meter and error messages.
If the remote server supports HTTP/2, curl prints HTTP/2 200:
HTTP/2 200
Otherwise, the response is HTTP/1.1 200:
HTTP/1.1 200 OK
If you have curl version 7.47.0 or newer, you do not need to use the --http2 option because HTTP/2 is enabled by default for all HTTPS connections.
Show Status Code and Timing
To see the status code and basic timing information without printing the response body, use -w with -o /dev/null:
curl -sS -o /dev/null -w "status=%{http_code} time=%{time_total}s\n" https://linuxize.com/
Follow Redirects
By default, curl does not follow the HTTP Location headers.
If you try to retrieve the non-www version of google.com, you will notice that instead of getting the source of the page you will be redirected to the www version:
curl -I google.com
HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
The 301 status and Location header show that google.com redirects to www.google.com. By default, curl stops here and does not follow the redirect.
The -L option instructs curl to follow any redirect until it reaches the final destination:
curl -L google.com
Change the User-Agent
Sometimes when downloading a file, the remote server may be set to block the curl User-Agent or to return different content depending on the visitor's device and browser.
In situations like this, to emulate a different browser, use the -A option.
For example, to emulate Firefox, you would use:
curl -A "Mozilla/5.0 (X11; Linux x86_64; rv:133.0) Gecko/20100101 Firefox/133.0" https://getfedora.org/
Specify a Maximum Transfer Rate
The --limit-rate option allows you to limit the data transfer rate. The value can be expressed in bytes, kilobytes with the k suffix, megabytes with the m suffix, and gigabytes with the g suffix.
In the following example, curl will download the Go binary and limit the download speed to 1 megabyte per second:
curl --limit-rate 1m -O https://go.dev/dl/go1.23.5.linux-amd64.tar.gz
This option is useful to prevent curl from consuming all the available bandwidth.
Transfer Files via FTP
To access a protected FTP server with curl, use the -u option and specify the username and password as shown below:
curl -u FTP_USERNAME:FTP_PASSWORD ftp://ftp.example.com/
Once logged in, the command lists all files and directories in the user’s home directory.
You can download a single file from the FTP server using the following syntax:
curl -u FTP_USERNAME:FTP_PASSWORD ftp://ftp.example.com/file.tar.gz
To upload a file to the FTP server, use the -T option followed by the name of the file you want to upload:
curl -T newfile.tar.gz -u FTP_USERNAME:FTP_PASSWORD ftp://ftp.example.com/
Send Cookies
Sometimes you may need to make an HTTP request with specific cookies to access a remote resource or to debug an issue.
By default, when requesting a resource with curl, no cookies are sent or stored.
To send cookies to the server, use the -b switch followed by a filename containing the cookies or a string.
For example, to send a session cookie with your request:
curl -b "session_id=abc123" https://httpbin.org/cookies
You can also save cookies from a response and reuse them in subsequent requests:
# Save cookies to a file
curl -c cookies.txt https://example.com/login
# Send saved cookies with the next request
curl -b cookies.txt https://example.com/dashboard
Request JSON Data
When you are working with APIs, set the Accept header and keep the output clean:
curl -sS -H "Accept: application/json" https://api.github.com/repos/curl/curl
Send Basic Auth Credentials
If a server requires basic authentication, use -u with a username and password:
curl -u user:passwd https://httpbin.org/basic-auth/user/passwd
Using Proxies
curl supports different types of proxies, including HTTP, HTTPS, and SOCKS. To transfer data through a proxy server, use the -x (--proxy) option, followed by the proxy URL.
The following command downloads the specified resource using a proxy on 192.168.44.1 port 8888:
curl -x 192.168.44.1:8888 https://www.linux.com/
If the proxy server requires authentication, use the -U (--proxy-user) option followed by the user name and password separated by a colon (user:password):
curl -U username:password -x 192.168.44.1:8888 https://www.linux.com/
Send POST Requests
By default, curl sends GET requests. To send data with a POST request, use the -X POST option along with -d to specify the data:
curl -X POST -d "username=admin&password=secret" https://httpbin.org/post
To send JSON data, set the Content-Type header:
curl -X POST -H "Content-Type: application/json" \
-d '{"username":"admin","password":"secret"}' \
https://httpbin.org/post
You can also read data from a file using @:
curl -X POST -H "Content-Type: application/json" -d @data.json https://httpbin.org/post
Set Timeouts
To prevent curl from hanging indefinitely, you can set timeouts:
# Maximum time for the entire operation (30 seconds)
curl --max-time 30 https://example.com/
# Maximum time to establish a connection (10 seconds)
curl --connect-timeout 10 https://example.com/
# Combine both for robust requests
curl --connect-timeout 10 --max-time 60 https://example.com/
Verbose Output
The -v option shows the full request and response exchange, including the TLS handshake, request headers, and response headers. It is the most useful flag for debugging connection issues:
curl -v https://example.com/ -o /dev/null -s
* Host example.com:443 was resolved.
* IPv4: 172.66.147.243, 104.20.23.154
* Trying 172.66.147.243:443...
* ALPN: curl offers h2,http/1.1
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* SSL connection using TLSv1.3 / TLS_AES_256_GCM_SHA384
* ALPN: server accepted h2
* Server certificate:
* subject: CN=example.com
* issuer: C=US; O=CLOUDFLARE, INC.; CN=Cloudflare TLS Issuing ECC CA 1
* SSL certificate verify ok.
* Connected to example.com (172.66.147.243) port 443
> GET / HTTP/2
> Host: example.com
> User-Agent: curl/8.14.1
> Accept: */*
>
< HTTP/2 200
< content-type: text/html
< server: cloudflare
Lines starting with * are connection details from curl itself. Lines with > are the request headers sent to the server, and lines with < are the response headers coming back.
The -o /dev/null -s part discards the response body and hides the progress bar so only the debug output remains.
Quick Reference
For a printable quick reference, see the curl cheatsheet.
| Task | Command |
|---|---|
| Fetch a URL | curl example.com |
| Save to file (custom name) | curl -o file.html example.com |
| Save to file (original name) | curl -O https://example.com/file.tar.gz |
| Download multiple files | curl -O url1 -O url2 |
| Resume a download | curl -C - -O url |
| Fetch HTTP headers only | curl -I example.com |
| Follow redirects | curl -L example.com |
| Verbose output | curl -v example.com |
| POST form data | curl -X POST -d "key=value" url |
| POST JSON | curl -X POST -H "Content-Type: application/json" -d '{}' url |
| Basic auth | curl -u user:pass url |
| Set timeout | curl --max-time 30 url |
| Limit bandwidth | curl --limit-rate 1m -O url |
| Use a proxy | curl -x proxy:port url |
| Change User-Agent | curl -A "agent string" url |
| Send cookies | curl -b "name=value" url |
| Silent with errors | curl -sS url |
FAQ
What does curl -f do?
The -f (fail) option tells curl to return an error code (exit code 22) instead of outputting the HTML error page when the server responds with an HTTP error status (400 or above). This is useful in scripts where you want to detect failures cleanly.
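For example, a script can branch on curl's exit status when -f is set. This is a minimal sketch using httpbin.org's status endpoint as a convenient test URL; any URL that returns an HTTP error would behave the same way:

```shell
# Without -f, curl exits 0 even on a 404 and prints the error page.
# With -f, an HTTP error response (status 400 or above) makes curl
# exit non-zero (exit code 22), so the failure is easy to detect.
if curl -fsS -o /dev/null https://httpbin.org/status/404; then
    echo "request succeeded"
else
    echo "request failed with curl exit code $?"
fi
```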
What is the difference between curl and wget?
Both download files from the command line. curl supports more protocols, handles uploads, and is better suited for API work and scripting with pipes. wget is simpler for recursive downloads and mirroring websites. Most systems have both available.
How do I skip TLS certificate verification?
Use the -k (or --insecure) option. This tells curl to proceed even if the server certificate cannot be verified. Only use this for testing against local or development servers, never in production.
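For example, against a local development server with a self-signed certificate (https://localhost:8443 is a placeholder for your own dev server, not a real endpoint):

```shell
# -k / --insecure skips certificate verification entirely,
# so use it only against servers you control.
# The || branch keeps the sketch from aborting if no dev server
# is actually listening on this placeholder address.
curl -k https://localhost:8443/ || echo "server not reachable"
```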
How do I send a bearer token with curl?
Pass the token in an Authorization header:
curl -H "Authorization: Bearer $API_TOKEN" https://api.example.com/data
To keep the token out of your shell history and scripts, store it in an environment variable or a .env file that is ignored by Git.
Conclusion
The curl command handles everything from simple file downloads to API debugging with verbose output and authentication. For the full option list, run man curl or visit the curl documentation.
About the authors

Dejan Panovski
Dejan Panovski is the founder of Linuxize, an RHCSA-certified Linux system administrator and DevOps engineer based in Skopje, Macedonia. Author of 800+ Linux tutorials with 20+ years of experience turning complex Linux tasks into clear, reliable guides.