Commit 596e810

make robots.txt restrictive by default
ruslandoga committed Mar 17, 2024
1 parent d3586a8 commit 596e810
Showing 3 changed files with 13 additions and 6 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -31,6 +31,7 @@ All notable changes to this project will be documented in this file.
 - Add 'browser_versions.csv' to CSV export
 - Add `CLICKHOUSE_MAX_BUFFER_SIZE_BYTES` env var which defaults to `100000` (100KB)
 - Add alternative SMTP adapter plausible/analytics#3654
+- Add restrictive `robots.txt` for self-hosted plausible/analytics#3905
 
 ### Removed
 - Removed the nested custom event property breakdown UI when filtering by a goal in Goal Conversions
11 changes: 10 additions & 1 deletion lib/plausible_web/endpoint.ex
@@ -33,11 +33,20 @@ defmodule PlausibleWeb.Endpoint do
   plug(PlausibleWeb.Tracker)
   plug(PlausibleWeb.Favicon)
 
+  static_paths = ~w(css js images favicon.ico)
+
+  static_paths =
+    on_full_build do
+      static_paths
+    else
+      static_paths ++ ["robots.txt"]
+    end
+
   plug(Plug.Static,
     at: "/",
     from: :plausible,
     gzip: false,
-    only: ~w(css js images favicon.ico robots.txt)
+    only: static_paths
   )
 
   on_full_build do
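The on_full_build macro used above comes from elsewhere in the codebase and is not part of this diff; it picks one branch at compile time, so the self-hosted (small) build ends up with "robots.txt" in its static paths while the full build leaves it out. The following is only a conceptual sketch of how such a compile-time switch can be written; the module name, the :full_build? config key, and the macro body are assumptions, not Plausible's actual implementation.

# Conceptual sketch only. BuildSwitchSketch and the :full_build? config key
# are hypothetical; Plausible's real build flag and macro live elsewhere.
defmodule BuildSwitchSketch do
  # Read the build flag once, at compile time of this module.
  @full_build? Application.compile_env(:plausible, :full_build?, false)

  # Keeps exactly one branch at expansion time, so the other branch's code
  # never reaches the compiled caller. Callers `require`/`import` this module.
  defmacro on_full_build(clauses) do
    if @full_build? do
      Keyword.fetch!(clauses, :do)
    else
      Keyword.get(clauses, :else)
    end
  end
end

With a switch like this, static_paths compiles down to either ~w(css js images favicon.ico) or that list plus "robots.txt", and Plug.Static's :only option then restricts which top-level request paths are served out of priv/static.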
7 changes: 2 additions & 5 deletions priv/static/robots.txt
@@ -1,5 +1,2 @@
-# See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
-#
-# To ban all spiders from the entire site uncomment the next two lines:
-# User-agent: *
-# Disallow: /
+User-agent: *
+Disallow: /
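After this change a self-hosted instance serves a deny-all robots.txt, asking compliant crawlers to stay away from the entire site, while the full build (which drops "robots.txt" from the static paths above) is free to handle the path differently. One way to confirm the self-hosted behaviour is to hit the endpoint directly; the sketch below is a hypothetical test, not part of this commit, and assumes it runs inside the app's own test suite where PlausibleWeb.Endpoint is started and the small (self-hosted) build is compiled.

# Hypothetical verification sketch, not part of this commit. Assumes the
# small (self-hosted) build, where "robots.txt" is in Plug.Static's :only list.
defmodule RobotsTxtSketchTest do
  use ExUnit.Case, async: true
  import Plug.Test

  test "self-hosted build serves a deny-all robots.txt" do
    conn =
      :get
      |> conn("/robots.txt")
      |> PlausibleWeb.Endpoint.call([])

    assert conn.status == 200
    assert conn.resp_body =~ "User-agent: *"
    assert conn.resp_body =~ "Disallow: /"
  end
end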
