Crawling and Indexing

  1. Indexed, though blocked by robots.txt

    Learn how to resolve the "Indexed, though blocked by robots.txt" error in Google Search Console. Follow our detailed guide to prevent Google from indexing blocked or private pages on your site.

  2. Improving Indexation

    Whenever users type a query, Google searches through its index (a massive database of the web pages it has crawled) and retrieves the most relevant content.

  3. Robots.txt file

    Use a robots.txt file to instruct search engine crawlers which pages to crawl and which to ignore. A small mistake in a robots.txt file can have big consequences. (A minimal example file appears after this list.)

  4. Search Console Crawl Stats Report

    Fix your crawl, and everything else becomes a breeze. Discover how to use the Search Console Crawl Stats report like a pro and troubleshoot issues!

  5. Noindex tags

    Even though Google’s bots constantly crawl the internet, that doesn’t mean they have to crawl and index every page on your website. (A sample noindex tag appears after this list.)

  6. Redirects

    301, 302, 307... It’s time to demystify all the main redirect types in SEO! Learn everything you need to know in our redirect guide. (A sample 301 response appears after this list.)

  7. Crawl budget

    Don’t waste your crawl budget! Learn the top strategies for getting your pages indexed and ranking faster.

  8. Removing Pages from Google and Bing

    Removing pages from Google and Bing is easy, but it's important to follow the right steps. This article tells you how to remove content from search engines.

  9. Google Coverage Statuses

    Get indexed & get rankings! Learn everything you need to know about the 20 Google Coverage Statuses (and how to fix them).

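For illustration, a minimal robots.txt file for the guide above might look like the sketch below; the directives (User-agent, Disallow, Allow, Sitemap) are standard, but the paths and sitemap URL are placeholders, not recommendations for any particular site:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/help/
    Sitemap: https://www.example.com/sitemap.xml

Every crawler may fetch the site except /admin/; because Google applies the most specific matching rule, the Allow line re-opens the /admin/help/ subfolder, and the Sitemap line points crawlers at the XML sitemap.
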
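For the noindex guide, the directive can sit in the page’s HTML head:

    <meta name="robots" content="noindex">

or be sent as an HTTP response header, which also works for non-HTML files such as PDFs:

    X-Robots-Tag: noindex

Crawlers can only see a noindex on URLs they are allowed to fetch, which is why pages reported as "Indexed, though blocked by robots.txt" usually need to be unblocked in robots.txt before a noindex can take effect.
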
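And for the redirects guide, a permanent redirect at the HTTP level looks like this (the URLs are placeholders):

    HTTP/1.1 301 Moved Permanently
    Location: https://www.example.com/new-url/

A 302 (Found) or 307 (Temporary Redirect) response uses the same Location header but signals a temporary move; 307 additionally requires the client to repeat the request with the original HTTP method.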