Checking the headers of a website can provide valuable information about the server, security policies, and other metadata. This is particularly useful for web developers, system administrators, and security professionals. The Linux command line offers several tools that allow users to inspect website headers easily and efficiently.
In this tutorial you will learn:
- How to use the curl command to check website headers
- How to use the wget command to inspect website headers
- How to use the httpie command to retrieve website headers
- How to use the lynx command to inspect headers

| Category | Requirements, Conventions or Software Version Used |
|---|---|
| System | Linux operating system |
| Software | curl, wget, httpie, lynx |
| Other | Internet connection |
| Conventions | # – requires given Linux commands to be executed with root privileges, either directly as the root user or by use of the sudo command; $ – requires given Linux commands to be executed as a regular non-privileged user |
Introduction to Methods for Checking Website Headers
There are several methods to check website headers using the Linux command line. Each method relies on a different tool that is either pre-installed or easily installable on most Linux distributions. These methods will allow you to retrieve and inspect the headers returned by a web server when a request is made to a website.
- Using the curl Command: The curl command is a widely used tool for transferring data with URLs. It supports various protocols, including HTTP, HTTPS, FTP, and more. To check the headers of a website, you can use the following command:
$ curl -I http://example.com
This command fetches the headers of the specified URL. The -I option tells curl to fetch the headers only, without downloading the entire content. The output will include details such as the HTTP status code, content type, server information, and more.
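If the site redirects, for example from HTTP to HTTPS, the headers of the final destination can be inspected by combining -I with curl's -L option, which follows redirects, and -s, which silences the progress meter. This is a minimal variation on the command above, again using example.com as a placeholder URL:
$ curl -sIL http://example.com
Each redirect hop prints its own header block, ending with the headers of the final URL.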
- Using the wget Command: The wget command is another powerful utility for retrieving files from the web. While it is primarily used for downloading files, it can also be used to fetch and display headers.
$ wget --spider -S http://example.com
In this command, the --spider option tells wget to check the URL without downloading the content, and the -S option displays the headers sent by the server. This method provides a detailed view of the headers, including the HTTP status code and other relevant information.
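Note that wget writes the server response to standard error, so to isolate a single header you can redirect stderr into a pipe and filter it with grep. The sketch below assumes you are only interested in the Content-Type header; adjust the pattern as needed:
$ wget --spider -S http://example.com 2>&1 | grep -i 'content-type'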
- Using the httpie Command: httpie is a user-friendly command-line HTTP client. It is designed for ease of use, allowing for simple and intuitive interaction with web services. Note that httpie needs to be installed first:
$ sudo apt-get install httpie
Once installed, you can use the following command to check headers:
$ http -h GET http://example.com
In this command, http is the command-line tool, -h specifies that only headers should be displayed, and GET is the HTTP method used. This method provides a clean and readable output of the headers, making it easy to interpret the information returned by the server.
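As with curl, httpie can follow redirects while printing only the headers. Assuming a reasonably recent httpie version that supports the --follow option, a quick sketch looks like this:
$ http -h --follow GET http://example.com
The GET method can usually be omitted, since httpie defaults to GET when no request data is supplied.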
- Using the lynx Command: lynx is a text-based web browser for the command line. It can be used to retrieve headers by making a request to the specified URL.
$ lynx -head -dump http://example.com
In this command, -head specifies that only the headers should be fetched, and -dump outputs the headers to the command line instead of opening the interactive browser. This method provides a straightforward way to view the headers of a website using a text-based browser.
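Because lynx writes the headers to standard output, they can be filtered with the usual text tools. A minimal sketch that extracts just the Server header, again using example.com as a placeholder, is:
$ lynx -head -dump http://example.com | grep -i 'server:'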
Conclusion
Checking website headers using the Linux command line is a straightforward process with the right tools. curl, wget, httpie, and lynx are all effective utilities that can help you retrieve and inspect headers for various purposes, such as debugging, monitoring, and security analysis. By mastering these commands, you can gain valuable insights into the behavior and configuration of web servers.
