Boost Your Website's SEO with Django: A Complete Guide
Django
Improve your Django website's SEO with this in-depth guide. Learn how to set up Django sitemaps and configure a robots.txt file to help search engines crawl and index your site more effectively. This tutorial covers essential SEO techniques to enhance your website's visibility on Google and attract more organic traffic.
Project Structure
Check out the full Django project structure we’ll be using in this guide here.
How to add a Sitemap
A sitemap is an XML file that informs search engines of your website's pages, their relevance, and how frequently they are updated. A sitemap makes your website more visible on search engines because it helps their crawlers index your entire site. It lives at your root URL; for example, on this blog it is located at https://matteoviserta.com/sitemap.xml.
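To make the format concrete, here is a small standalone sketch (plain Python, standard library only, no Django) that builds a one-entry sitemap document like the one Django will generate for us later:

```python
# Standalone sketch: build a minimal sitemap document with the standard
# library, to show the XML format that a sitemap view ultimately serves.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: iterable of (loc, changefreq, priority) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([("https://example.com/", "weekly", "0.5")])
print(sitemap_xml)
```

Django's sitemap framework produces this XML for us from our models, so we never write it by hand; this is only to show what search engines receive.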
Create sitemaps.py in our app directory:
# app/sitemaps.py
from django.contrib.sitemaps import Sitemap
from django.urls import reverse

from .models import Post


class PostSitemap(Sitemap):
    priority = 0.5
    changefreq = "weekly"

    def items(self):
        return Post.objects.all()

    def lastmod(self, obj):
        return obj.updated_at


class StaticSitemap(Sitemap):
    priority = 0.5
    changefreq = "weekly"

    def items(self):
        return ["home", "about"]

    def location(self, item):
        return reverse(item)
Add django.contrib.sitemaps to INSTALLED_APPS in settings.py:
# core/settings.py
INSTALLED_APPS = [
    # ...
    'django.contrib.sitemaps',
    # ...
]
Then we need to configure the URLs:
# core/urls.py
from django.contrib import admin
from django.contrib.sitemaps.views import sitemap
from django.urls import include, path

from app.sitemaps import PostSitemap, StaticSitemap

sitemaps = {
    'posts': PostSitemap,
    'static': StaticSitemap,
}

urlpatterns = [
    path('admin/', admin.site.urls),
    path('', include('app.urls')),
    path('sitemap.xml', sitemap, {'sitemaps': sitemaps},
         name='django.contrib.sitemaps.views.sitemap'),
]
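With this in place, visiting /sitemap.xml returns XML roughly like the following (example.com, the slug, and the date are placeholders; your URLs and lastmod values come from your own data):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/my-first-post/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```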
How to add robots.txt
A robots.txt file tells bots which URLs they can and cannot access on your site. It lives at your root URL; for example, on this blog it is located at https://matteoviserta.com/robots.txt.
First, create a robots.txt file inside the templates directory. Then add a view that serves it:
# app/views.py
from django.views.generic import TemplateView


class RobotsTxtView(TemplateView):
    template_name = "robots.txt"
Then wire up the URL:
# app/urls.py
from django.urls import path

from .views import RobotsTxtView

urlpatterns = [
    path("robots.txt", RobotsTxtView.as_view(content_type="text/plain"), name="robots"),
]
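The templates/robots.txt file itself is plain text. A minimal sketch (the rules and the sitemap URL here are assumptions; adjust them to your own site):

```text
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

Pointing crawlers at the sitemap from robots.txt ties the two pieces together, so bots that find robots.txt also discover every page listed in the sitemap.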
How to add Meta Tags
Meta tags are HTML elements that provide information about your webpages to search engines and users.
First, install django-meta in our virtual environment (pip install django-meta) and add it to INSTALLED_APPS in settings.py:
# core/settings.py
INSTALLED_APPS = [
    # ...
    'meta',
]
Let's modify our model by adding meta information:
# app/models.py
from django.db import models

from meta.models import ModelMeta


class Post(ModelMeta, models.Model):
    title = models.CharField(max_length=20)
    content = models.TextField()
    image = models.ImageField()
    …

    # Map meta attributes to this model's own fields and methods
    _metadata = {
        'title': 'title',
        'description': 'content',
        'image': 'get_meta_image',
        ...
    }

    def get_meta_image(self):
        if self.image:
            return self.image.url
Push metadata into the context using the as_meta method:
# app/views.py
from django.views.generic import DetailView


class MyView(DetailView):
    ...

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context['meta'] = self.get_object().as_meta(self.request)
        return context
Include the meta/meta.html template in our base template:
<!-- templates/base.html -->
{% load meta %}
<html>
  <head {% meta_namespaces %}>
    {% include "meta/meta.html" %}
  </head>
  <body>
  </body>
</html>
Django-Meta has a few configuration options that let you customize it. Two of them are required: META_SITE_PROTOCOL and META_SITE_DOMAIN. Set them in core/settings.py:
# django-meta settings (use 'https' and your real domain in production)
META_SITE_PROTOCOL = 'http'
META_SITE_DOMAIN = 'localhost'
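With the mapping above, the included meta/meta.html renders tags roughly like these for a post (the values here are hypothetical, and the exact tag set depends on your django-meta version and settings):

```html
<meta property="og:title" content="My first post">
<meta property="og:description" content="The post content...">
<meta property="og:image" content="http://localhost/media/cover.jpg">
```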
Finally, there are plenty of useful tools for checking and improving our site. PageSpeed Insights reports on the user experience of a page on both mobile and desktop devices and suggests how the page may be improved. Google Search Console is a free web service from Google that helps website owners monitor and maintain their site's presence in Google search results. Google Analytics is a free web analytics service from Google that tracks and reports website traffic, providing valuable insights into user behavior.