Introducing gargantua - The fast website crawler

A simple website crawler and load-testing tool called 「 gargantua 」 that crawls all links on your website with as many concurrent workers as your site can handle.
Created by Andreas Koch on 2017-02-15

Recently I wanted to check the response times of all category and product detail pages of a Magento 2 shop that I am working on.

So I searched for existing load-testing and crawling tools.

But as far as I know, none of these tools can easily combine load-testing and crawling.

So I built a little web crawler in Go that

… and all of that concurrently, with 5, 50, or even 100 workers, to get done fast and/or to put a bit of load on the site so I can see how it responds to stress:

gargantua crawl --url https://www.sitemaps.org/sitemap.xml --workers 5

Animation of gargantua v0.1.0 crawling a website with 5 concurrent workers

「 gargantua 」 is just a prototype, but it works pretty well and is really easy to use:

Introduction to gargantua

You can get the latest code and binaries at github.com/andreaskoch/gargantua.

If I missed some tools that can do the same tasks, please let me know via Twitter (@andreaskoch). If you find bugs or have feature requests, please create an issue in the GitHub repository: github.com/andreaskoch/gargantua.

🖖 Andy
