[Archivesspace_Users_Group] Robots.txt file for dev and test instances?

Brad Westbrook brad.westbrook at lyrasis.org
Tue Aug 23 13:55:16 EDT 2016

Hi, Kari,

Mark Cooper is going to put a fix in at the webserver level, but it may take a couple of weeks to get it fully implemented.
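[For anyone implementing something similar in the meantime: one common webserver-level approach is to answer /robots.txt directly from the proxy in front of the application. The sketch below assumes the public UI is proxied through nginx; the hostname and port are placeholders, and this is not necessarily the fix Mark is implementing.]

```nginx
# Hypothetical nginx snippet for a dev/test virtual host:
# serve a disallow-all robots.txt before requests reach ArchivesSpace.
server {
    listen 80;
    server_name test.example.org;  # placeholder hostname

    # Answer /robots.txt from the webserver itself.
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }

    # Proxy everything else to the ArchivesSpace public UI.
    location / {
        proxy_pass http://127.0.0.1:8081;  # default public UI port
    }
}
```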


From: archivesspace_users_group-bounces at lyralists.lyrasis.org [mailto:archivesspace_users_group-bounces at lyralists.lyrasis.org] On Behalf Of Kari R Smith
Sent: Tuesday, August 23, 2016 11:01 AM
To: archivesspace_users_group at lyralists.lyrasis.org
Subject: [Archivesspace_Users_Group] Robots.txt file for dev and test instances?

It has come to our attention that data in the Dev and Sandbox instances of ArchivesSpace is being crawled by the Internet Archive, and that Google is returning search results that point to data in these instances.

This is problematic for us because these instances contain early versions of our resource records, as well as records we were editing with unreal test values.

Is it possible to add a robots.txt file so that these sites are not crawled in the future?
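[For reference, a minimal robots.txt that asks all compliant crawlers to skip the whole site would look like the following; it must be served at the site root. Note that this only prevents future crawling — content already indexed generally has to be removed through the search engine's own removal process.]

```
# robots.txt served at the root of the dev/test instance
User-agent: *
Disallow: /
```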


Kari R. Smith
Digital Archivist, Institute Archives and Special Collections
Massachusetts Institute of Technology Libraries
617.253.5690   smithkr at mit.edu   http://libraries.mit.edu/archives/

