Block AI & Search Engine Bots From Scraping Your Kajabi Site

Kayla M.


Learn to block AI and Search Engine Bots from scraping your Kajabi Site.


In this article:

  • Permanently remove your URL from search engines
  • Using custom code to hide your Kajabi site
  • Adding custom code to hide your Kajabi site
  • Adding custom code to the legacy template, Premier
  • Block AI Bots - How Kajabi can help

Permanently remove your URL from search engines

To permanently remove content or a URL from Google Search, take one of the following actions:

  • Remove or update the content on your site (images, pages, directories) and make sure that your site returns a 404 (Not Found) HTTP status code. Non-HTML files (like PDFs) should be completely removed from your site, since they can also be indexed by bots.
  • Block access to the content: For example, require a password.
  • Indicate that the page should not be indexed using the noindex meta tag (shown below). This is less secure than the other methods.
  • Do not use robots.txt as a blocking mechanism.
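
For reference, the noindex directive takes two common forms. The meta tag goes between a page's <head></head> tags (this is the approach used in the Kajabi snippets later in this article); the X-Robots-Tag response header is the equivalent for non-HTML files such as PDFs, and is only an option if your hosting setup lets you set custom response headers.

    Code snippet (meta tag, inside <head></head>):

    <meta name="robots" content="noindex">

    Equivalent HTTP response header (for non-HTML files such as PDFs):

    X-Robots-Tag: noindex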

Using custom code to hide your Kajabi site

One option is to use the Removals tool in Google Search Console to hide the URL from Google Search results; removals made this way are temporary.

Alternatively, if you are on an account plan that includes code editor access, like the Pro Plan, you can add custom code to hide your Kajabi site from search engines.

Warning:
Introducing custom code to your site can open you up to vulnerabilities and potential malware attacks. We recommend scanning any custom code you intend to use with vulnerability tools like VirusTotal or Snyk and following security best practices to help reduce potential impact.

Continue reading to learn how to add custom code to your pages on Kajabi to hide your login page from search engines.


Adding custom code to hide your Kajabi site

Prevent a page from appearing in most search engine results by including a noindex meta tag in your page's code.

To include the noindex meta tag in your page's code:

  • Open the Website tab from the Dashboard.
  • Select </> Modify code from the dropdown option next to your Live site name.
  • From the Snippets directory, open the global_head.liquid file.
  • Copy the code snippet below and paste it between the <head></head> tags in your file.

    Code snippet:

    <!-- 'noindex' to hide login page from search engines -->
    {% if template == "login" %}
      <meta name="robots" content="noindex">
    {% endif %}


  • Click Save.
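
Note: The snippet above only hides the login page. If you instead want to hide your entire Kajabi site from search engines, a minimal variation is to drop the Liquid conditional so the tag is output everywhere; this assumes global_head.liquid is loaded on every page of your site.

    Code snippet:

    <!-- 'noindex' on every page to hide the entire site from search engines -->
    <meta name="robots" content="noindex">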


Adding custom code to the legacy template, Premier

If you are using the legacy template, Premier, you will add the same code snippet to your Premier template. 

To add your code to your Premier template:

  • Open the Website tab from the Dashboard.
  • Select </> Modify code from the dropdown option next to your Live site name.
  • From the Layouts directory, open the theme.liquid file.
  • Copy the code snippet below and paste it between the <head></head> tags in your file.

    Code snippet:

    <!-- 'noindex' to hide login page from search engines -->
    {% if template == "login" %}
      <meta name="robots" content="noindex">
    {% endif %}


  • Click Save.


Block AI Bots - How Kajabi can help

Completely blocking AI bots from scraping your content can be tricky, but Kajabi can help you make it harder for them and discourage scraping.

Many AI scrapers identify themselves with a unique string called a "user agent," which is sent in the User-Agent header of every request. Kajabi leverages Cloudflare to protect your sites and can help you block scrapers with this method. Reach out to security@kajabi.com and we'd be more than happy to help.
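
For example, a request from OpenAI's crawler identifies itself roughly like this (simplified for illustration; real user agent strings are usually longer, and the path and host here are placeholders):

    GET /blog/my-post HTTP/1.1
    Host: www.example.com
    User-Agent: GPTBot/1.1 (+https://openai.com/gptbot)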

What are some web crawlers associated with AI? (this list is not exhaustive):

  • User agent: anthropic-ai
  • User agent: CCBot
  • User agent: ChatGPT-User
  • User agent: GPTBot
  • User agent: Omgilibot

It's important to consider the trade-offs between blocking access and allowing some scraping to occur. Blocking all bots can hurt your SEO since search engine crawlers are also bots. 
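
For reference, user-agent-based rules are conventionally expressed in a site's robots.txt file as shown below. Keep in mind that robots.txt is advisory only (well-behaved crawlers honor it, but nothing is enforced), and you may not be able to edit this file directly on a Kajabi site, so rely on the Cloudflare route described above for enforced blocking. This sketch disallows the AI user agents listed above while leaving search engine crawlers, such as Googlebot, untouched.

    Code snippet (robots.txt):

    # Ask known AI crawlers not to crawl anything on the site
    User-agent: GPTBot
    User-agent: ChatGPT-User
    User-agent: CCBot
    User-agent: anthropic-ai
    User-agent: Omgilibot
    Disallow: /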


Still have questions? Let us know your question below to have it added to the list or reach out to Support if you need additional assistance. Thanks for being the best part of Kajabi!
