I have seen people manually checking every page on a site to search for broken links. That is feasible only for websites with very few pages; as the number of pages grows, manual checking becomes impractical. The task becomes easy if we automate the process of finding broken links, which we can do with HTTP manipulation tools. Let's see how to do it.
In order to identify the links and find the broken ones among them, we can use lynx. It has an option, -traversal, which recursively visits pages on the website and builds a list of all the hyperlinks found there. We can then use cURL to verify whether each of those links is broken.
Let's write a Bash script ...
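As a minimal sketch of the idea (the helper name check_links, the example URL, and the --max-time value are my choices, not fixed conventions): lynx -traversal records the visited URLs in a file named traverse.dat in the current directory, and we feed each URL to curl -I, which requests only the response headers.

```shell
#!/bin/bash
# check_links: read URLs (one per line) from the file given as $1 and
# report every URL whose headers curl fails to fetch.
check_links() {
    local broken=0 url
    while read -r url; do
        # -s silences progress output, -I fetches headers only,
        # -f makes curl return a failure exit code on HTTP errors.
        if ! curl -sIf --max-time 10 "$url" > /dev/null; then
            echo "BROKEN: $url"
            broken=$((broken + 1))
        fi
    done < "$1"
    echo "$broken broken link(s)"
}

# Typical usage (example.com is a placeholder):
#   lynx -traversal "http://example.com" > /dev/null
#   check_links traverse.dat
```

Checking headers instead of downloading full pages keeps the script fast even on sites with many large pages.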