I have to build a backend module that gives editors an overview of all broken links (404, 500, etc.) across the whole site.
I currently see two possible approaches to achieve this:
Use a crawler of some sort (like https://github.com/spatie/crawler) to crawl the whole site for dead links.
The problem I have with this approach is that it takes the crawler quite some time to finish, because it has to wait for every response before it can discover more links to crawl.
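One mitigation worth noting: spatie/crawler can issue multiple requests concurrently, which reduces the total wall-clock time considerably. To illustrate the core of what any such crawler does, here is a minimal sketch of the link-extraction step in Python (Python only because it is easy to demonstrate; the class and function names are my own, not from any library):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of all <a> tags in a page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL
                    self.links.add(urljoin(self.base_url, value))

def extract_links(html, base_url):
    """Return the set of absolute URLs linked from an HTML document."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

A crawler repeatedly fetches a page, runs something like `extract_links` on it, and queues any newly discovered internal URLs; the fetch-and-wait cycle is exactly where the time goes, which is why concurrency helps so much here.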
Build up a database table containing all links on the site and check them one by one with cURL. I quite like this idea, but it would probably be difficult to keep the table up to date.
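If the table of links already exists, the checking step itself can be fast, since the URLs can be probed concurrently with lightweight HEAD requests. A rough sketch in Python of that idea (function names are illustrative, not from any library; a `check` parameter is injected so the logic can be exercised without network access):

```python
from concurrent.futures import ThreadPoolExecutor
import urllib.request
import urllib.error

def head_status(url, timeout=10):
    """Return the HTTP status code of a URL via a HEAD request,
    or None if the host is unreachable."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code          # 4xx/5xx still carry a status code
    except (urllib.error.URLError, OSError):
        return None            # DNS failure, timeout, refused connection

def find_broken(urls, check=head_status, workers=20):
    """Check URLs concurrently; return {url: status} for broken ones."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = dict(zip(urls, pool.map(check, urls)))
    return {u: s for u, s in results.items() if s is None or s >= 400}
```

The editor-facing overview would then just render the dict returned by `find_broken`. The hard part, as noted, is not the checking but keeping the link table in sync with the content.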
Are there any other approaches?
Which one would you recommend?