Delete robots.txt

Remove specific rules for a user-agent in your `robots.txt` file. To delete all rules for a user-agent, provide an empty rule set. This removes the user-agent's entry entirely, leaving it subject to your site's default crawling behavior.

**Note:** Deleting a user-agent's rules leaves that user-agent's access unrestricted unless other directives apply.

<Warning title="Enterprise Only">This endpoint requires an Enterprise workspace.</Warning>

Required scope: `site_config:write`
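A minimal sketch of the call described above, assuming Python's standard `urllib`. The base URL, endpoint path, site ID, and token below are placeholder assumptions for illustration; substitute the values from your own API reference and workspace.

```python
import json
import urllib.request

# Hypothetical values: the base URL, path, site ID, and token are
# assumptions for illustration, not the documented endpoint.
API_BASE = "https://api.example.com/v2"
SITE_ID = "580e63e98c9a982ac9b8b741"  # placeholder Site objectid
TOKEN = "YOUR_AUTH_TOKEN"

# Providing an empty rule set for a user-agent deletes that
# user-agent's entry from robots.txt entirely.
body = json.dumps({"rules": [{"userAgent": "ExampleBot", "rules": []}]}).encode()

req = urllib.request.Request(
    url=f"{API_BASE}/sites/{SITE_ID}/robots_txt",
    data=body,
    method="DELETE",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send the request; it is omitted
# here so the sketch runs without network access.
```

The request object carries the bearer token in the `Authorization` header, matching the authentication scheme below.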

Authentication

`Authorization` (Bearer)

Bearer authentication of the form `Bearer <token>`, where `<token>` is your auth token.

Path parameters

`site_id` (string, required, format: `objectid`)
Unique identifier for a Site.

Request

This endpoint expects an object.
`rules` (list of objects, optional)
List of rules for user agents.
`sitemap` (string, optional)
URL to the sitemap.
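The request body above can be sketched as a small builder. The top-level `rules` list and optional `sitemap` string come from the schema; the inner field names (`userAgent`, `rules`) inside each rule object are illustrative assumptions, since the schema only describes the objects as "rules for user agents".

```python
import json

def clear_user_agents(user_agents, sitemap=None):
    """Build a request body that deletes all rules for each given
    user-agent by supplying an empty rule set. The inner keys
    ("userAgent", "rules") are assumptions for illustration; the
    documented schema guarantees only a top-level "rules" list and
    an optional "sitemap" string."""
    body = {"rules": [{"userAgent": ua, "rules": []} for ua in user_agents]}
    if sitemap is not None:
        body["sitemap"] = sitemap
    return body

payload = clear_user_agents(["ExampleBot"], sitemap="https://example.com/sitemap.xml")
print(json.dumps(payload, indent=2))
```

An empty `rules` list inside a user-agent's entry is what triggers the deletion behavior described at the top of this page.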

Response

Request was successful
`rules` (list of objects)
List of rules for user agents.
`sitemap` (string)
URL to the sitemap.

Errors

400 Bad Request Error
401 Unauthorized Error
404 Not Found Error
429 Too Many Requests Error
500 Internal Server Error