fix robots.txt

Serve robots.txt from Django
This commit is contained in:
Hugh Rundle 2024-01-26 15:48:34 +11:00
parent 41f9431d33
commit a79b771e6c
Signed by: hugh
GPG key ID: A7E35779918253F9
3 changed files with 4 additions and 2 deletions


@@ -7,10 +7,9 @@ A django app running on Docker. Replaces _Aus GLAM Blogs_.
 * `cp .env.example .env` and enter env values for your app
 * set up web server config (nginx example coming soon)
 * `docker compose build`
 * `./glamr-dev makemigrations` (may get a DB connection error here, ignore it or run again)
 * `./glamr-dev migrate`
 * `./glamr-dev createsuperuser`
-* `docker compose up`
+* `docker compose up -d`
-* set up database backups (as cron jobs): `./glamr-dev backup`:
+* set up cron jobs for management commands as below
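The cron-job step above can be sketched as a crontab entry; a minimal sketch, assuming the repository is checked out at `/opt/ausglamr` (that path and the 02:30 schedule are illustrative assumptions, not from this commit):

```text
# m h dom mon dow  command
# nightly database backup (repo path is an assumption)
30 2 * * * cd /opt/ausglamr && ./glamr-dev backup
```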


@@ -3,6 +3,7 @@ URL configuration for ausglamr project.
 """
 from django.contrib import admin
 from django.urls import path, re_path
+from django.views.generic import TemplateView
 from blogs import views
@@ -64,4 +65,6 @@ urlpatterns = [
         views.UnsubscribeEmail.as_view(),
         name="unsubscribe-email",
     ),
+    path('robots.txt', TemplateView.as_view(template_name='robots.txt',
+                                            content_type='text/plain')),
 ]
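The `TemplateView` added above will look for a `robots.txt` file in the project's configured template directories and serve it as `text/plain`. A minimal sketch of what that template might contain (the rules shown are illustrative assumptions, not part of this commit):

```text
# templates/robots.txt -- served at /robots.txt by the TemplateView above
User-agent: *
Disallow: /admin/
```

Serving the file through Django means no separate web-server location block is needed for it, and the template can use Django template tags if the rules ever need to vary per environment.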