Re: robots.txt on git.postgresql.org

From: Andres Freund <andres(at)2ndquadrant(dot)com>
To: Greg Stark <stark(at)mit(dot)edu>
Cc: PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: robots.txt on git.postgresql.org
Date: 2013-07-09 15:30:49
Message-ID: 20130709153049.GA4886@alap2.anarazel.de
Lists: pgsql-hackers

On 2013-07-09 16:24:42 +0100, Greg Stark wrote:
> I note that git.postgresql.org's robots.txt refuses permission to crawl
> the git repository:
>
> http://git.postgresql.org/robots.txt
>
> User-agent: *
> Disallow: /
>
>
> I'm curious what motivates this. It's certainly useful to be able to
> search for commits.

Gitweb is horribly slow. I don't think anybody serving a larger git
repository through gitweb can afford to let all the crawlers go through it.
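(For illustration only, and not something the thread proposes or that
postgresql.org deploys: if one wanted to allow some indexing while still
shielding gitweb's most expensive dynamic pages, a more selective
robots.txt might look like the sketch below. Crawl-delay and "*"
wildcards inside paths are nonstandard extensions honored only by some
crawlers, and the a= action names assume gitweb's default URL scheme.)

  # Hypothetical alternative: throttle crawlers and block only the
  # costliest gitweb actions instead of disallowing everything.
  User-agent: *
  Crawl-delay: 60           # nonstandard; only some crawlers honor it
  Disallow: /*a=snapshot    # tarball generation is expensive
  Disallow: /*a=blame       # per-line history walks the file's history
  Disallow: /*a=search      # repository-wide searches

Whether that tradeoff is affordable still depends on the repository
size and crawler behavior, which is the concern raised above.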

Greetings,

Andres Freund

--
Andres Freund http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Training & Services
