Wget Command in Linux with Examples


GNU Wget is a command-line utility for downloading files from the web. It supports HTTP, HTTPS, and FTP protocols and provides features such as recursive downloads, bandwidth limiting, resume support, and background operation.

This guide explains how to use the wget command through practical examples and detailed explanations of the most common options.

Installing Wget

Wget is pre-installed on most Linux distributions. To verify, run:

Terminal
wget --version

If it is not installed, use your distribution’s package manager.

Ubuntu, Debian, and Derivatives

Terminal
sudo apt install wget

Fedora, RHEL, and Derivatives

Terminal
sudo dnf install wget

Wget Command Syntax

The general syntax of the wget command is:

txt
wget [OPTIONS] [URL]
  • OPTIONS — Flags that control download behavior
  • URL — The address of the file or resource to download

Download a File

In its simplest form, wget downloads the resource at the given URL to the current working directory:

Terminal
wget https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso

During the download, wget displays a progress bar along with the file name, file size, download speed, and estimated time remaining. Once the download completes, the file is saved in the current directory.

If a file with the same name already exists, wget appends .N (a number) to the filename to avoid overwriting it.
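
For example, downloading the same file twice (using a placeholder URL) keeps the first copy intact and saves the second as file.tar.gz.1:

Terminal
wget https://example.com/file.tar.gz
wget https://example.com/file.tar.gz   # saved as file.tar.gz.1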

To suppress all output, use the -q (quiet) option:

Terminal
wget -q https://example.com/file.tar.gz

Save Under a Different Name

To save the downloaded file with a specific name, use the -O option:

Terminal
wget -O latest-hugo.zip https://github.com/gohugoio/hugo/archive/refs/heads/master.zip

This saves the file as latest-hugo.zip instead of its original name.

Download to a Specific Directory

By default, wget saves files in the current directory. To save to a different location, use the -P option:

Terminal
wget -P /mnt/iso https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso

Resume a Download

To resume an interrupted download, use the -c (continue) option:

Terminal
wget -c https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso

This is useful when downloading large files over an unreliable connection. If the remote server does not support resuming, wget starts the download from the beginning.

Limit Download Speed

To prevent wget from consuming all available bandwidth, use the --limit-rate option. Append k for kilobytes, m for megabytes, or g for gigabytes:

Terminal
wget --limit-rate=2m https://example.com/large-file.tar.gz

This limits the download speed to 2 MB per second.

Download Multiple Files

To download multiple files at once, create a text file with one URL per line and pass it to wget with the -i option:

urls.txt
https://geo.mirror.pkgbuild.com/iso/latest/archlinux-x86_64.iso
https://www.debian.org/distrib/netinst
https://download.fedoraproject.org/pub/fedora/linux/releases/41/Server/x86_64/iso/Fedora-Server-dvd-x86_64-41-1.4.iso
Terminal
wget -i urls.txt

To read URLs from standard input, use -i -.
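
For example, you can feed wget URLs produced by another command; the grep filter here is only an illustration:

Terminal
grep '^https://' urls.txt | wget -i -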

Download in the Background

To run the download as a background process, use the -b option:

Terminal
wget -b https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso

The output is written to wget-log in the current directory. To monitor progress, use the tail command:

Terminal
tail -f wget-log

Recursive Download and Website Mirroring

To download an entire website for offline browsing, use the -m (mirror) option combined with -k (convert links) and -p (download page requisites):

Terminal
wget -m -k -p https://example.com
  • -m — Mirror mode: enables recursive downloading, timestamping, and infinite depth
  • -k — Convert links in downloaded pages to point to local files
  • -p — Download all resources needed to display each page (CSS, images, JavaScript)

To limit the recursive depth, use the -l option:

Terminal
wget -r -l 2 https://example.com

This downloads pages up to 2 levels deep from the starting URL.

To accept or reject specific file types during a recursive download:

Terminal
wget -r --accept=jpg,png,gif https://example.com/gallery/
Terminal
wget -r --reject=mp4,avi https://example.com/media/

Authentication

HTTP Authentication

To download from a server that requires HTTP basic authentication, use the --user and --password options:

Terminal
wget --user=admin --password=secret https://example.com/protected/file.tar.gz

For token-based authentication, pass the token as a custom header:

Terminal
wget --header="Authorization: Bearer YOUR_TOKEN" https://api.example.com/data.json

FTP Authentication

To download from a password-protected FTP server:

Terminal
wget --ftp-user=FTP_USERNAME --ftp-password=FTP_PASSWORD ftp://ftp.example.com/file.tar.gz

Useful Options

Check if a URL Exists

The --spider option checks whether a URL is reachable without downloading the file:

Terminal
wget --spider https://example.com/file.tar.gz

If the file exists, wget prints “200 OK”. If not, it prints an error status.
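
Because wget exits with a non-zero status when the check fails, --spider also works well in scripts. A minimal sketch, using a placeholder URL:

Terminal
if wget -q --spider https://example.com/file.tar.gz; then
    echo "URL is reachable"
else
    echo "URL is not reachable"
fi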

Download Only if Newer

The -N (timestamping) option downloads the file only if the remote version is newer than the local copy:

Terminal
wget -N https://example.com/data.csv

Custom HTTP Headers

To send custom headers with the request, use the --header option:

Terminal
wget --header="Accept: application/json" https://api.example.com/endpoint

You can pass --header multiple times to add several headers.
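
For example, to combine a content-type preference with the bearer token shown earlier:

Terminal
wget --header="Accept: application/json" \
     --header="Authorization: Bearer YOUR_TOKEN" \
     https://api.example.com/data.json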

Set Retries and Timeouts

To control retry behavior and connection timeouts:

Terminal
wget --tries=5 --timeout=30 https://example.com/file.tar.gz
  • --tries=N — Retry up to N times (default is 20)
  • --timeout=N — Set DNS, connect, and read timeout to N seconds

To also retry when the connection is refused:

Terminal
wget --tries=5 --retry-connrefused https://example.com/file.tar.gz

Change the User Agent

Some servers block requests from the default wget user agent. To send a different user agent string, use the -U option:

Terminal
wget -U "Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0" https://example.com/

Skip Certificate Verification

To download from a host with an invalid or self-signed SSL certificate, use --no-check-certificate:

Terminal
wget --no-check-certificate https://self-signed.example.com/file.tar.gz
Warning
Skipping certificate verification disables an important security check. Only use this option when you trust the remote host and understand the risk.

Download and Pipe to Another Command

To download a file and pipe it directly to another command without saving it to disk, use -O -:

Terminal
wget -q -O - "https://wordpress.org/latest.tar.gz" | tar -xzf - -C /var/www

This downloads the latest WordPress archive and extracts it directly to /var/www.

Use Server-Suggested Filename

Some servers send a Content-Disposition header with a suggested filename. To use it, add --content-disposition:

Terminal
wget --content-disposition "https://example.com/download?file=report"

Troubleshooting

Unable to resolve host address
DNS resolution failed. Verify the hostname is correct and that your DNS settings are working. Try ping hostname to test.

HTTP request sent, awaiting response… 403 Forbidden
The server is blocking the request. Try changing the user agent with -U, or check if the server requires authentication.

ERROR: cannot verify certificate
The server’s SSL certificate is invalid or expired. If you trust the host, use --no-check-certificate. Otherwise, update your system’s CA certificates with sudo apt install ca-certificates or sudo dnf install ca-certificates.

Download speed is extremely slow
Check your connection and whether the remote server is throttling. Try a different mirror if one is available. Use -c to resume if the download stalls.

wget: command not found
Wget is not installed. Install it with sudo apt install wget or sudo dnf install wget depending on your distribution.

Quick Reference

  • wget URL — Download a file
  • wget -O name.txt URL — Save with a specific filename
  • wget -P /path URL — Save to a specific directory
  • wget -c URL — Resume an interrupted download
  • wget --limit-rate=2m URL — Limit download speed
  • wget -i urls.txt — Download URLs from a file
  • wget -b URL — Download in the background
  • wget -m -k -p URL — Mirror a website for offline browsing
  • wget --spider URL — Check if a URL exists without downloading
  • wget -N URL — Download only if the remote file is newer
  • wget --tries=5 --timeout=30 URL — Set retries and timeout
  • wget --header="Auth: token" URL — Send a custom HTTP header
  • wget -q -O - URL | cmd — Pipe the download to another command

For a printable quick reference, see the Wget Cheatsheet.

FAQ

What is the difference between wget and curl?
Both download files from the web. Wget specializes in recursive downloads, website mirroring, and resuming large transfers. curl supports more protocols and is better suited for API interactions and sending data. Wget saves to a file by default, while curl prints to standard output.
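
For example, fetching the same placeholder URL with both tools shows the difference in default output:

Terminal
wget https://example.com/file.txt   # saves file.txt in the current directory
curl https://example.com/file.txt   # prints the content to standard output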

How do I download an entire directory from a web server?
Use wget -r --no-parent URL/directory/ to recursively download the directory without ascending to the parent. Add --accept or --reject to filter file types.
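
For example, to fetch only the PDF files from a hypothetical docs directory:

Terminal
wget -r --no-parent --accept=pdf https://example.com/docs/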

Can wget resume a partially downloaded file?
Yes. Use the -c flag. Wget sends a Range header to request only the remaining bytes. If the server does not support range requests, the download starts from the beginning.

How do I retry a failed download automatically?
Wget retries up to 20 times by default. Use --tries=N to change the limit. Add --retry-connrefused to also retry when the server refuses the connection. Use --waitretry=N to wait up to N seconds between retries; wget backs off linearly, starting at 1 second.
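
For example, a more patient retry policy (the values are arbitrary):

Terminal
wget --tries=10 --waitretry=5 --retry-connrefused https://example.com/file.tar.gz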

How do I download files that require login?
For HTTP basic authentication, use --user and --password. For token-based APIs, pass the token with --header="Authorization: Bearer TOKEN". For FTP, use --ftp-user and --ftp-password.

Conclusion

Wget is a versatile tool for downloading files, mirroring websites, and automating transfers from the command line. Combine it with options like -c for resuming, --limit-rate for bandwidth control, and -m -k -p for offline website copies.

For the complete list of options, see the GNU Wget Manual.

If you have any questions, feel free to leave a comment below.

