A fast, local-first web crawler and AEO & SEO analysis tool. Crawl entire sites in seconds from the terminal or a native desktop app, run automated SEO checks, extract content as clean Markdown, and export to JSON, CSV, or Sitemap XML.
Hey Product Hunt! I built crawler.sh because I kept running into the same problem: every time I needed to audit a website's SEO or extract its content, I had to choose between bloated enterprise tools, slow cloud services, or stitching together a bunch of scripts.
crawler.sh is a single tool that does all three: crawling, SEO analysis, and content extraction, from the terminal or a native desktop app. It's built in Rust so it's fast, it runs locally so your data stays private, and it outputs standard formats so you can take the results anywhere.
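To make the "automated SEO checks + standard output formats" idea concrete, here is a minimal, illustrative Python sketch of the kind of per-page audit a crawler like this runs: parse a page, flag common title/meta issues, and emit the result as JSON. This is not crawler.sh's actual code or API, just a toy version of the concept using only the standard library.

```python
import json
from html.parser import HTMLParser

# Illustrative only: a tiny stand-in for the kind of per-page SEO
# check a crawler automates (title/meta checks, structured export).
class SeoParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit(html: str) -> dict:
    """Run a few basic SEO checks on one page's HTML."""
    p = SeoParser()
    p.feed(html)
    issues = []
    if not p.title:
        issues.append("missing <title>")
    elif len(p.title) > 60:
        issues.append("title longer than 60 characters")
    if not p.meta_description:
        issues.append("missing meta description")
    return {"title": p.title, "issues": issues}

page = "<html><head><title>Hello</title></head><body></body></html>"
print(json.dumps(audit(page)))
# → {"title": "Hello", "issues": ["missing meta description"]}
```

A real crawler repeats this across every discovered URL and aggregates the per-page dicts into JSON, CSV, or a sitemap; the value of standard output formats is exactly that the results pipe cleanly into whatever comes next.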
Love the positioning: local, fast, no bloat is refreshing. How deep does the AEO analysis go? Is it just schema and structured data checks, or does it also evaluate content for AI-answer readiness?
Love the local-first approach, especially keeping everything fast and private.
Curious: since it can crawl and extract entire sites, how are you handling safeguards around accidentally pulling sensitive or restricted data? Feels like that could become important as usage scales.
Great tool for SEO enthusiasts! How does Crawler.sh handle dynamic JavaScript-heavy websites during the crawl?
Local-first approach is underrated, most SEO tools are cloud-heavy and slow. The fact that this runs from terminal AND has a native desktop app is a nice touch for different types of users.
The Markdown extraction feature caught my eye specifically; I've been looking for something that can cleanly pull content without all the HTML noise. How does it handle JavaScript-heavy sites like React or Next.js? Does it wait for JS to render, or is it a purely static crawl?
Also curious, AEO (Answer Engine Optimization) is still pretty new territory. What kind of checks are you running for that specifically? Most tools haven't caught up there yet.
Congrats on the launch, terminal-based dev tools don't get enough love on PH!
Congratulations on the launch, @mehmetkose! Curious to see how the platform provides feedback on the issues it finds.
Man, markdown extraction is such a lifesaver when dealing with giant websites. Way easier than my old 'copy-paste marathon' method. 😂
Fast, private, and outputs standard formats: that's how tools should be. Congrats on the launch 🙌