Wget Command in Linux with Examples

GNU Wget is a command-line utility for downloading files from the web. It supports HTTP, HTTPS, and FTP protocols and provides features such as recursive downloads, bandwidth limiting, resume support, and background operation.
This guide explains how to use the wget command through practical examples and detailed explanations of the most common options.
Installing Wget
Wget is pre-installed on most Linux distributions. To verify, run:
wget --version
If it is not installed, use your distribution’s package manager.
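In scripts, it is often useful to test for wget before calling it. A minimal sketch using the standard command -v builtin (the have_wget variable name is arbitrary, not a wget feature):

```shell
# Check whether wget is on PATH before using it in a script.
if command -v wget >/dev/null 2>&1; then
    have_wget=yes
else
    have_wget=no
fi
echo "wget available: $have_wget"
```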
Ubuntu, Debian, and Derivatives
sudo apt install wget
Fedora, RHEL, and Derivatives
sudo dnf install wget
Wget Command Syntax
The general syntax of the wget command is:
wget [OPTIONS] [URL]
OPTIONS — Flags that control download behavior
URL — The address of the file or resource to download
Download a File
In its simplest form, wget downloads the resource at the given URL to the current working directory:
wget https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso
During the download, wget displays a progress bar along with the file name, file size, download speed, and estimated time remaining. When the download completes, the file is saved in the current directory.
If a file with the same name already exists, wget appends .N (a number) to the filename to avoid overwriting it.
To suppress all output, use the -q (quiet) option:
wget -q https://example.com/file.tar.gz
Save Under a Different Name
To save the downloaded file with a specific name, use the -O option:
wget -O latest-hugo.zip https://github.com/gohugoio/hugo/archive/refs/heads/master.zip
This saves the file as latest-hugo.zip instead of its original name.
Download to a Specific Directory
By default, wget saves files in the current directory. To save to a different location, use the -P option:
wget -P /mnt/iso https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso
Resume a Download
To resume an interrupted download, use the -c (continue) option:
wget -c https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso
This is useful when downloading large files over an unreliable connection. If the remote server does not support resuming, wget starts the download from the beginning.
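On very unreliable links, the -c option can be wrapped in a small retry loop so the download keeps resuming until it finishes. A sketch; the function name and the 5-second delay are arbitrary choices, not wget features:

```shell
# Retry an interrupted download until wget exits successfully.
# On each attempt, wget -c resumes from the existing partial file.
resume_until_done() {
    url=$1
    until wget -c "$url"; do
        echo "Download interrupted; retrying in 5 seconds..." >&2
        sleep 5
    done
}

# Usage:
# resume_until_done https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso
```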
Limit Download Speed
To prevent wget from consuming all available bandwidth, use the --limit-rate option. Append k for kilobytes, m for megabytes, or g for gigabytes:
wget --limit-rate=2m https://example.com/large-file.tar.gz
This limits the download speed to 2 MB per second.
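If you want a cap applied to every download, the same limit can be set once in your ~/.wgetrc startup file; limit_rate is the wgetrc counterpart of the --limit-rate option:

```
# ~/.wgetrc — apply a default bandwidth cap to all wget invocations
limit_rate = 2m
```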
Download Multiple Files
To download multiple files at once, create a text file with one URL per line and pass it to wget with the -i option:
https://geo.mirror.pkgbuild.com/iso/latest/archlinux-x86_64.iso
https://www.debian.org/distrib/netinst
https://download.fedoraproject.org/pub/fedora/linux/releases/41/Server/x86_64/iso/Fedora-Server-dvd-x86_64-41-1.4.iso
wget -i urls.txt
To read URLs from standard input, use -i -.
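The list file itself can be generated in a script before handing it to wget. A minimal sketch (the URLs are placeholders; the wget call is shown commented out so the snippet stands alone):

```shell
# Build a download list, one URL per line, then pass it to wget -i.
cat > urls.txt <<'EOF'
https://example.com/file1.iso
https://example.com/file2.iso
https://example.com/file3.iso
EOF

# wget -i urls.txt          # download every URL in the list
url_count=$(wc -l < urls.txt)
echo "queued $url_count URLs"
```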
Download in the Background
To run the download as a background process, use the -b option:
wget -b https://releases.ubuntu.com/noble/ubuntu-24.04.4-live-server-amd64.iso
The output is written to wget-log in the current directory. To monitor progress, use the tail command:
tail -f wget-log
Recursive Download and Website Mirroring
To download an entire website for offline browsing, use the -m (mirror) option combined with -k (convert links) and -p (download page requisites):
wget -m -k -p https://example.com
-m — Mirror mode: enables recursive downloading, timestamping, and infinite depth
-k — Convert links in downloaded pages to point to local files
-p — Download all resources needed to display each page (CSS, images, JavaScript)
To limit the recursive depth, use the -l option:
wget -r -l 2 https://example.com
This downloads pages up to 2 levels deep from the starting URL.
To accept or reject specific file types during a recursive download:
wget -r --accept=jpg,png,gif https://example.com/gallery/
wget -r --reject=mp4,avi https://example.com/media/
Authentication
HTTP Authentication
To download from a server that requires HTTP basic authentication, use the --user and --password options:
wget --user=admin --password=secret https://example.com/protected/file.tar.gz
For token-based authentication, pass the token as a custom header:
wget --header="Authorization: Bearer YOUR_TOKEN" https://api.example.com/data.json
FTP Authentication
To download from a password-protected FTP server:
wget --ftp-user=FTP_USERNAME --ftp-password=FTP_PASSWORD ftp://ftp.example.com/file.tar.gz
Useful Options
Check if a URL Exists
The --spider option checks whether a URL is reachable without downloading the file:
wget --spider https://example.com/file.tar.gz
If the file exists, wget prints “200 OK”. If not, it prints an error status.
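Because --spider reports the result through wget's exit status, it slots neatly into shell conditionals. A sketch; url_exists is a hypothetical helper name, not part of wget:

```shell
# Return success if the URL is reachable, failure otherwise.
# -q suppresses output; --spider probes the URL without downloading it.
url_exists() {
    wget -q --spider "$1"
}

# Usage:
# if url_exists https://example.com/file.tar.gz; then
#     echo "file is available"
# fi
```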
Download Only if Newer
The -N (timestamping) option downloads the file only if the remote version is newer than the local copy:
wget -N https://example.com/data.csv
Custom HTTP Headers
To send custom headers with the request, use the --header option:
wget --header="Accept: application/json" https://api.example.com/endpoint
You can pass --header multiple times to add several headers.
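In scripts, repeated --header flags can be wrapped in a helper so the token is supplied once rather than hard-coded at every call site. A sketch; fetch_json is a hypothetical name, not part of wget:

```shell
# Fetch a JSON resource with an Accept header and a bearer token,
# writing the response body to standard output (-O -).
fetch_json() {
    token=$1
    url=$2
    wget -q -O - \
        --header="Accept: application/json" \
        --header="Authorization: Bearer $token" \
        "$url"
}

# Usage:
# fetch_json "$API_TOKEN" https://api.example.com/endpoint
```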
Set Retries and Timeouts
To control retry behavior and connection timeouts:
wget --tries=5 --timeout=30 https://example.com/file.tar.gz
--tries=N — Retry up to N times (default is 20)
--timeout=N — Set DNS, connect, and read timeout to N seconds
To also retry when the connection is refused:
wget --tries=5 --retry-connrefused https://example.com/file.tar.gz
Change the User Agent
Some servers block requests from the default wget user agent. To send a different user agent string, use the -U option:
wget -U "Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0" https://example.com/
Skip Certificate Verification
To download from a host with an invalid or self-signed SSL certificate, use --no-check-certificate:
wget --no-check-certificate https://self-signed.example.com/file.tar.gz
Download and Pipe to Another Command
To download a file and pipe it directly to another command without saving it to disk, use -O -:
wget -q -O - "https://wordpress.org/latest.tar.gz" | tar -xzf - -C /var/www
This downloads the latest WordPress archive and extracts it directly to /var/www.
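The same pipe pattern can be tried without a network connection by letting cat stand in for wget -q -O -; everything after the first pipe is identical. A local simulation:

```shell
# Simulate "wget -q -O - URL | tar -xzf - -C DIR" without a network:
# build a small archive, then stream it into tar exactly as wget would.
mkdir -p src site
echo "hello" > src/index.html
tar -czf site.tar.gz -C src index.html

cat site.tar.gz | tar -xzf - -C site   # with wget: wget -q -O - URL | tar -xzf - -C site
cat site/index.html
```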
Use Server-Suggested Filename
Some servers send a Content-Disposition header with a suggested filename. To use it, add --content-disposition:
wget --content-disposition https://example.com/download?file=report
Troubleshooting
Unable to resolve host address
DNS resolution failed. Verify the hostname is correct and that your DNS settings are working. Try ping hostname to test.
HTTP request sent, awaiting response… 403 Forbidden
The server is blocking the request. Try changing the user agent with -U, or check if the server requires authentication.
ERROR: cannot verify certificate
The server’s SSL certificate is invalid or expired. If you trust the host, use --no-check-certificate. Otherwise, update your system’s CA certificates with sudo apt install ca-certificates or sudo dnf install ca-certificates.
Download speed is extremely slow
Check your connection and whether the remote server is throttling. Try a different mirror if one is available. Use -c to resume if the download stalls.
wget: command not found
Wget is not installed. Install it with sudo apt install wget or sudo dnf install wget depending on your distribution.
Quick Reference
| Command | Description |
|---|---|
| wget URL | Download a file |
| wget -O name.txt URL | Save with a specific filename |
| wget -P /path URL | Save to a specific directory |
| wget -c URL | Resume an interrupted download |
| wget --limit-rate=2m URL | Limit download speed |
| wget -i urls.txt | Download URLs from a file |
| wget -b URL | Download in the background |
| wget -m -k -p URL | Mirror a website for offline browsing |
| wget --spider URL | Check if URL exists without downloading |
| wget -N URL | Download only if remote file is newer |
| wget --tries=5 --timeout=30 URL | Set retries and timeout |
| wget --header="Auth: token" URL | Send custom HTTP header |
| wget -q -O - URL \| cmd | Pipe download to another command |
For a printable quick reference, see the Wget Cheatsheet.
FAQ
What is the difference between wget and curl?
Both download files from the web. Wget specializes in recursive downloads, website mirroring, and resuming large transfers. curl supports more protocols and is better suited for API interactions and sending data. Wget saves to a file by default, while curl prints to standard output.
How do I download an entire directory from a web server?
Use wget -r --no-parent URL/directory/ to recursively download the directory without ascending to the parent. Add --accept or --reject to filter file types.
Can wget resume a partially downloaded file?
Yes. Use the -c flag. Wget sends a Range header to request only the remaining bytes. If the server does not support range requests, the download starts from the beginning.
How do I retry a failed download automatically?
Wget retries up to 20 times by default. Use --tries=N to change the limit. Add --retry-connrefused to also retry when the server refuses the connection. Use --waitretry=N to wait between retries, backing off linearly up to N seconds.
How do I download files that require login?
For HTTP basic authentication, use --user and --password. For token-based APIs, pass the token with --header="Authorization: Bearer TOKEN". For FTP, use --ftp-user and --ftp-password.
Conclusion
Wget is a versatile tool for downloading files, mirroring websites, and automating transfers from the command line. Combine it with options like -c for resuming, --limit-rate for bandwidth control, and -m -k -p for offline website copies.
For the complete list of options, see the GNU Wget Manual.
If you have any questions, feel free to leave a comment below.
About the authors

Dejan Panovski
Dejan Panovski is the founder of Linuxize, an RHCSA-certified Linux system administrator and DevOps engineer based in Skopje, Macedonia. Author of 800+ Linux tutorials with 20+ years of experience turning complex Linux tasks into clear, reliable guides.