Updated robots.txt. How do I clear the cache?

Using the Hippo robotstxt plugin, I added a new sitemap to the list of sitemaps. I can see my change in production if I append a random query string to bust the cache, but the original URL still serves the previous robots.txt. If this is a cache issue, how do I clear the cache so the updated file is served? It's been almost 24 hours since I made the change. Here are the URLs to compare (a quick comparison sketch follows the list):

  1. https://www.couchbase.com/robots.txt
  2. https://www.couchbase.com/robots.txt?1 (this shows the added sitemap XML)
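For anyone checking the same thing, here is a minimal sketch (assuming Python with the `requests` package, which is not part of the original thread) that fetches both URLs and prints the cache-related response headers next to the Sitemap lines, so you can see whether a stored copy is being served:

```python
import requests

URLS = [
    "https://www.couchbase.com/robots.txt",    # original URL, possibly cached
    "https://www.couchbase.com/robots.txt?1",  # random query string busts the cache
]

for url in URLS:
    resp = requests.get(url, timeout=10)
    # Cache-related headers hint at whether a CDN served a stored copy:
    # Age counts seconds spent in cache; X-Cache (when present) reports HIT/MISS.
    print(url)
    print("  Age:", resp.headers.get("Age"))
    print("  Cache-Control:", resp.headers.get("Cache-Control"))
    print("  X-Cache:", resp.headers.get("X-Cache"))
    print("  Sitemap lines:",
          [line for line in resp.text.splitlines()
           if line.lower().startswith("sitemap")])
```

A large `Age` value on the first URL and a near-zero one on the second would point at an edge cache rather than the browser.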

Is it your browser cache? I see the same files at both URLs…

I tried a few different browsers, cleared the browser cache, and used incognito mode. I'm still seeing the same thing: the first URL lists a single sitemap and the second URL lists two sitemaps.

We do use Akamai as a CDN. I requested that Akamai clear the cache to see if the update takes.
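In case it helps anyone else, purging can also be done programmatically through Akamai's Fast Purge (CCU v3) API. Below is a rough sketch, assuming Python with the `requests` and `edgegrid-python` packages and an `~/.edgerc` file holding CCU-enabled API credentials in a `[default]` section; none of those specifics come from this thread, so adjust to your own setup:

```python
import os
import requests
from akamai.edgegrid import EdgeGridAuth, EdgeRc  # pip install edgegrid-python

# Assumption: ~/.edgerc contains API credentials in a [default] section.
edgerc = EdgeRc(os.path.expanduser("~/.edgerc"))
section = "default"
base_url = "https://%s" % edgerc.get(section, "host")

session = requests.Session()
session.auth = EdgeGridAuth.from_edgerc(edgerc, section)

# Fast Purge v3: invalidate the cached robots.txt on the production network.
resp = session.post(
    base_url + "/ccu/v3/invalidate/url/production",
    json={"objects": ["https://www.couchbase.com/robots.txt"]},
)
print(resp.status_code, resp.json())
```

Invalidation marks the object stale so the edge refetches it from origin; a full delete purge is also available but is harder on the origin.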

Might be location related (CDN); for me the files are still exactly the same…

Thanks, Machak. That's good to know. It sounds like clearing the Akamai cache will resolve this issue.

Resolved by purging the robots.txt file from the Akamai CDN.