[Archivesspace_Users_Group] Robots.txt file for dev and test instances?

Kari R Smith smithkr at mit.edu
Tue Aug 23 11:01:13 EDT 2016


Hello,
It's come to our attention that the data in the Dev and sandbox instances of ArchivesSpace are being crawled by the Internet Archive, and that Google is returning search results pointing to data in these instances.

This is problematic for us because those instances contain early versions of our resource records, as well as records we edited with unrealistic test values.

Is it possible to add a robots.txt file so that these sites are not crawled in the future?
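For reference, a minimal robots.txt that asks all well-behaved crawlers to skip an entire site looks like the sketch below; it would need to be served at the web root of each dev/sandbox host (i.e. at /robots.txt):

```
# Ask all crawlers to skip the whole site
User-agent: *
Disallow: /
```

Note that robots.txt only affects compliant crawlers and does not remove pages that have already been indexed; deindexing typically requires a separate request to Google (removal tools) and to the Internet Archive.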

Thanks,
Kari

Kari R. Smith
Digital Archivist, Institute Archives and Special Collections
Massachusetts Institute of Technology Libraries
617.253.5690   smithkr at mit.edu   http://libraries.mit.edu/archives/


