This repository provides a tool to check the HTTP status code of every link on a given website.
composer global require fwartner/status-check
This tool will scan all links on a given site.
check-status scan https://example.com
It outputs one line per link found. When the crawl is finished, a summary is shown.
By default it uses 10 concurrent connections to speed up the crawling process. You can change that number by passing a different value to the --concurrency option:
check-status scan https://example.com --concurrency=20
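Under the hood, this amounts to checking many URLs in parallel with a bounded worker pool. A minimal, self-contained sketch of that idea in Python (the hypothetical fetch_status callable stands in for a real HTTP request, so this is an illustration, not the tool's actual implementation):

```python
from concurrent.futures import ThreadPoolExecutor

def check_links(urls, fetch_status, concurrency=10):
    """Check every URL in parallel and return a {url: status_code} map."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        statuses = pool.map(fetch_status, urls)
        return dict(zip(urls, statuses))

# Stub in place of a real HTTP request, so the sketch runs offline.
fake_site = {"https://example.com/": 200, "https://example.com/gone": 404}
results = check_links(list(fake_site), fake_site.get, concurrency=10)
print(results)
```

Raising concurrency speeds up large crawls but puts more simultaneous load on the target server.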
You can also write all URLs that returned a non-2xx or non-3xx response to a file:
check-status scan https://example.com --output=log.txt
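Only responses outside the 2xx and 3xx ranges count as failures here. A short sketch of that classification (the URL list is made up for illustration):

```python
def is_failure(status_code):
    """A link fails when its status is not 2xx/3xx, or the request errored."""
    return status_code is None or not (200 <= status_code < 400)

checked = [("https://example.com/ok", 200),
           ("https://example.com/moved", 301),
           ("https://example.com/gone", 404)]
# Each failing URL would be written to the output file, one per line.
failing = [url for url, code in checked if is_failure(code)]
print(failing)
```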
When the crawler finds a link to an external site, it will by default crawl that link as well. If you don't want the crawler to follow such external URLs, use the --dont-crawl-external-links option:
check-status scan https://example.com --dont-crawl-external-links
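Deciding whether a link is external typically comes down to comparing its host with the host of the site being crawled. A minimal sketch of that check in Python (not the tool's actual code):

```python
from urllib.parse import urlparse

def is_external(base_url, link):
    """A link is external when its host differs from the crawled site's host."""
    return urlparse(link).netloc != urlparse(base_url).netloc

print(is_external("https://example.com", "https://example.com/about"))  # False
print(is_external("https://example.com", "https://other.org/page"))     # True
```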
By default, requests time out after 10 seconds. You can change this by passing the number of seconds to the --timeout option:
check-status scan https://example.com --timeout=30
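A per-request timeout means the crawler stops waiting for a slow response and records the link as failed. A self-contained Python sketch of that behaviour (the slow server is simulated with a sleep, and the 0.5-second limit is chosen only to keep the example fast):

```python
import concurrent.futures
import time

def slow_fetch(url):
    time.sleep(2)  # simulate a server that answers too slowly
    return 200

with concurrent.futures.ThreadPoolExecutor() as pool:
    future = pool.submit(slow_fetch, "https://example.com/slow")
    try:
        status = future.result(timeout=0.5)   # stop waiting after 0.5 s
    except concurrent.futures.TimeoutError:
        status = None                         # counted as a failed link
print(status)  # None
```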
To run the tests, you'll have to start the included Node-based server first in a separate terminal window.
cd tests/server
./start_server.sh
With the server running, you can start testing.
Please see CHANGELOG for more information on what has changed recently.
Please see CONTRIBUTING for details.
If you discover any security related issues, please email email@example.com instead of using the issue tracker.
The MIT License (MIT). Please see License File for more information.