Delete robots.txt
Remove specific rules for a user-agent from your robots.txt file. To delete all rules for a user-agent, provide an empty rule set; this removes the user-agent's entry entirely, leaving it subject to your site's default crawling behavior.
Note: Deleting a user-agent's rules leaves that user-agent's access unrestricted unless other directives apply.
Enterprise Only
This endpoint requires an Enterprise workspace.
Required scope: site_config:write
Path parameters
site_id
Unique identifier for a Site
Headers
Authorization
Bearer authentication of the form Bearer <token>, where <token> is your auth token.
Request
This endpoint expects an object.
rules
List of rules for user agents.
sitemap
URL to the sitemap.
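As a sketch, a request body that clears all rules for a single user-agent might look like the following. The "rules" and "sitemap" field names come from this document; the per-rule "userAgent" field name, the user-agent value, and the sitemap URL are illustrative assumptions.

```json
{
  "rules": [
    { "userAgent": "Googlebot" }
  ],
  "sitemap": "https://example.com/sitemap.xml"
}
```

Because the rule entry for Googlebot carries no rules, its entry is removed entirely per the behavior described above.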
Response
Request was successful
rules
List of rules for user agents.
sitemap
URL to the sitemap.
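The request above can be sketched in Python using only the standard library. This is a minimal illustration, not an official client: the API base URL and endpoint path are placeholders, and the per-rule "userAgent" field name is an assumption; only the "rules"/"sitemap" body fields, the Bearer Authorization header, and the site_id path parameter come from this document.

```python
import json
import urllib.request

# Placeholder base URL -- substitute your API's real host and version.
API_BASE = "https://api.example.com/v2"


def build_delete_robots_request(site_id: str, token: str, user_agent: str,
                                sitemap: str) -> urllib.request.Request:
    """Build a DELETE request that clears all rules for one user-agent.

    Sending an empty rule set for the user-agent removes its entry
    entirely (see the note above). The per-rule "userAgent" field name
    is an assumption; "rules" and "sitemap" follow the schema described
    in this document.
    """
    body = {
        "rules": [{"userAgent": user_agent}],  # empty rule set -> entry deleted
        "sitemap": sitemap,
    }
    return urllib.request.Request(
        url=f"{API_BASE}/sites/{site_id}/robots_txt",  # path is illustrative
        data=json.dumps(body).encode("utf-8"),
        method="DELETE",
        headers={
            "Authorization": f"Bearer {token}",  # auth header from this doc
            "Content-Type": "application/json",
        },
    )


req = build_delete_robots_request("your-site-id", "YOUR_TOKEN",
                                  "Googlebot", "https://example.com/sitemap.xml")
print(req.get_method())                 # DELETE
print(req.get_header("Authorization"))  # Bearer YOUR_TOKEN
```

The request object is only constructed here, not sent; in real use you would pass it to `urllib.request.urlopen` and handle the response.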