django-url-robots

Django robots.txt generator. Based on decorating django.conf.urls.url. It scans your urlpatterns and replaces the ambiguous parts with *.
Installation & Usage

The recommended way to install django-url-robots is with pip. Install from PyPI:

    pip install django-url-robots
Add 'url_robots' to your INSTALLED_APPS:

    INSTALLED_APPS = (
        ...
        'url_robots',
        ...
    )
Add the url_robots view to your root urlconf:

    import url_robots.views

    urlpatterns += [
        url(r'^robots\.txt$', url_robots.views.robots_txt),
    ]
Describe rules with the boolean keyword argument robots_allow, using url_robots.utils.url in place of django.conf.urls.url:

    from url_robots.utils import url

    urlpatterns += [
        url('^profile/private$', views.some_view, robots_allow=False),
    ]
django-url-robots is tested with Django 1.8+. Unicode characters are percent-encoded.
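The percent-encoding mentioned above can be illustrated with Python's standard library. This is a hypothetical sketch (encode_robots_path is not part of django-url-robots): non-ASCII characters are percent-encoded while the characters robots.txt rules depend on are kept literal.

```python
from urllib.parse import quote

def encode_robots_path(path):
    # Hypothetical illustration: percent-encode non-ASCII characters,
    # keeping '/', '*', and '$', which robots.txt rules rely on.
    return quote(path, safe="/*$")

# ASCII paths pass through unchanged; non-ASCII segments are encoded.
print(encode_robots_path("/profile/*"))   # /profile/*
print(encode_robots_path("/профиль/*"))
```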
Settings

At the moment there is only one option, which defines the template for the robots.txt file:

    urlpatterns += [
        url(r'^robots\.txt$', url_robots.views.robots_txt,
            {'template': 'my_awesome_robots_template.txt'}),
    ]
Example
robots_template.txt:
    User-agent: *
    Disallow: /*  # disallow all
    {{ rules|safe }}
urls.py:

    from django.conf.urls import include, url

    urlpatterns = [
        url(r'^profile', include('url_robots.tests.urls_profile')),
    ]
urls_profile.py:

    from url_robots.utils import url

    urlpatterns = [
        url(r'^s$', views.some_view, name='profiles', robots_allow=True),
        url(r'^/(?P<nick>\w+)$', views.some_view),
        url(r'^/(?P<nick>\w+)/private', views.some_view, name='profile_private', robots_allow=False),
        url(r'^/(?P<nick>\w+)/public', views.some_view, name='profile_public', robots_allow=True),
    ]
Resulting robots.txt:

    User-agent: *
    Disallow: /*  # disallow all
    Allow: /profiles$
    Disallow: /profile/*/private*
    Allow: /profile/*/public*
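How the urlpatterns above turn into these rules can be sketched with a small standalone function. robots_rule is a hypothetical simplification for illustration, not the library's actual implementation: it strips the regex anchors, replaces named groups (the "ambiguous" parts) with *, and appends $ or * depending on whether the pattern was end-anchored.

```python
import re

def robots_rule(prefix, pattern, allow):
    # Hypothetical sketch of the pattern-to-rule translation.
    path = pattern.lstrip("^")
    if path.endswith("$"):
        path, suffix = path[:-1], "$"   # anchored pattern: exact-end match
    else:
        suffix = "*"                    # unanchored: match any continuation
    # Replace each named group like (?P<nick>\w+) with a wildcard.
    path = re.sub(r"\(\?P<\w+>[^)]+\)", "*", path)
    return ("Allow: " if allow else "Disallow: ") + prefix + path + suffix

print(robots_rule("/profile", r"^s$", True))                       # Allow: /profiles$
print(robots_rule("/profile", r"^/(?P<nick>\w+)/private", False))  # Disallow: /profile/*/private*
```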