add robots to prevent AI scraping for #225
bahmutov committed Jun 10, 2024
1 parent 5a90b38 commit 82f27ca
Showing 2 changed files with 22 additions and 1 deletion.
3 changes: 2 additions & 1 deletion package.json
@@ -12,7 +12,8 @@
"docs:build": "vuepress build docs",
"precopy:fiddles": "npm run export:examples",
"copy:fiddles": "cp fiddles.json public/cypress-examples",
"postdocs:build": "npm run copy:fiddles",
"copy:robots": "cp robots.txt public/cypress-examples",
"postdocs:build": "npm run copy:fiddles && npm run copy:robots",
"check:links": "find . -type f -name '*.md' ! -path './node_modules/*' | xargs -L1 npx markdown-link-check --config ./links-check-config.json --quiet",
"format": "prettier --write 'docs/**/*.md'",
"format:extracted": "prettier --write 'docs/**/*.js' '!docs/.vuepress/**/*.js'",
20 changes: 20 additions & 0 deletions robots.txt
@@ -0,0 +1,20 @@
+User-agent: CCBot
+Disallow: /
+
+User-agent: ChatGPT-User
+Disallow: /
+
+User-agent: GPTBot
+Disallow: /
+
+User-agent: Google-Extended
+Disallow: /
+
+User-agent: anthropic-ai
+Disallow: /
+
+User-agent: Omgilibot
+Disallow: /
+
+User-agent: Omgili
+Disallow: /

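Each listed AI crawler is disallowed from the entire site, while unlisted agents remain unaffected. A minimal sketch of how to verify that behavior with Python's standard-library `urllib.robotparser` (not part of this repo; the rules are inlined here, abridged to a subset, to keep the check self-contained):

```python
from urllib.robotparser import RobotFileParser

# A subset of the rules added in this commit, inlined for a self-contained check.
ROBOTS_TXT = """\
User-agent: CCBot
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Listed AI crawlers are blocked from every path...
print(rp.can_fetch("GPTBot", "/cypress-examples/"))    # False
# ...while unlisted agents (e.g. regular search crawlers) are unaffected.
print(rp.can_fetch("Googlebot", "/cypress-examples/"))  # True
```

Because there is no `User-agent: *` block, a `Disallow: /` entry applies only to the named crawler, so normal search indexing is untouched.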