FTL robotstxt.yaml (default config) parsing/reading failure


We’ve upgraded from 14.7.3 to 15.2.2 and since the upgrade we are getting this warning:

WARN  http-nio-8080-exec-91 [[org.hippoecm.hst.servlet.HstFreemarkerServlet]] An error has occurred when reading existing sub-variable "allows"; see cause exception! The type of the containing value was: extended_hash+string (org.onehippo.forge.robotstxt.annotated.Section wrapped into f.e.b.StringModel)

FTL stack trace ("~" means nesting-related):
	- Failed at: #if section.allows??  [in template "jcr:/hst:hst/hst:configurations/hst:default/hst:templates/robotstxt.ftl" at line 18, column 3]

and that is precisely the part that changed in robotstxt.ftl:

  <#if section.allows??>
    <#list section.allows as path>
Allow: ${path}
    </#list>
  </#if>
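For reference, one way to keep the template from failing on documents that predate the new property is FreeMarker's attempt/recover mechanism. This is only a workaround sketch, not the shipped robotstxt.ftl: since the exception is raised while reading the allows sub-variable itself, the ?? test alone does not prevent it, but wrapping the block in <#attempt> does:

  <#attempt>
    <#if section.allows??>
      <#list section.allows as path>
Allow: ${path}
      </#list>
    </#if>
  <#recover>
    <#-- Older Robots documents without the robotstxt:allow property
         end up here; there is nothing to render for them -->
  </#attempt>

The real fix remains adding the property to the documents, as described below in this thread; the <#attempt> guard just keeps robots.txt rendering in the meantime.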

Is this a known issue and why is it occurring?

Kind Regards
Mehul Parmar

Since 15.2, the class org.onehippo.forge.robotstxt.annotated.Section has the #getAllows method (see CMS-14837), so this should be compatible with the FTL. Can you check which version of the hippo-plugin-robotstxt-hst-client.jar is being used?


Hey @Mehul_Parmar1, thanks for raising this.

The Allow property was recently introduced to the Robots document type, so reading it now throws an exception for existing Robots documents that don’t yet have the new property in their model.
The quickest solution is to add an empty robotstxt:allow multi-string property to all your existing Robots documents via a Groovy script, or simply to edit and publish each document if there are only a few.
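A minimal sketch of such a Groovy updater script, to be run from the Updater Editor. It assumes the standard BaseNodeUpdateVisitor API; the node type you query for (and whether you target the document variants directly) depends on your project, so treat the selection and property names below as assumptions to verify against your own content model:

  // Sketch only — verify node type and property names for your project.
  import org.onehippo.repository.update.BaseNodeUpdateVisitor
  import javax.jcr.Node

  class AddEmptyAllowProperty extends BaseNodeUpdateVisitor {

      boolean doUpdate(Node node) {
          // Skip documents that already carry the new property
          if (node.hasProperty("robotstxt:allow")) {
              return false
          }
          // Add an empty multi-valued string property
          node.setProperty("robotstxt:allow", new String[0])
          log.info "Added empty robotstxt:allow to ${node.path}"
          return true
      }

      boolean undoUpdate(Node node) {
          throw new UnsupportedOperationException("undo not implemented")
      }
  }

Run it in dry-run mode first to confirm it only touches the intended nodes, then republish the affected documents so the live variants pick up the property.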

I will raise a ticket internally to better handle this case.

The hippo-plugin-robotstxt-hst-client version is 15.2.2.

Thanks @Lef_Karamoulas, adding the field resolved the issue.