[Archivesspace_Users_Group] Checking for Broken URLs in Resources

Corey Schmidt Corey.Schmidt at uga.edu
Wed Feb 10 08:44:32 EST 2021

Dear all,

Hello, this is Corey Schmidt, ArchivesSpace PM at the University of Georgia. I hope everyone is doing well and staying safe and healthy.

Would anyone know of a script, plugin, or tool for checking invalid URLs within resources? We are investigating how to grab URLs from exported EAD.xml files and check whether they return any sort of error (mostly 404s, but others as well). My thinking is to build a small app that will export EAD.xml files from ArchivesSpace, parse the raw XML with Python's lxml package, and catch any URLs with a regex. After capturing a URL, the app would use the requests library to check its status code and, if it returns an error, log it to a .csv output file that acts as a report of all the broken links within that resource.
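For what it's worth, the regex-plus-requests step described above could be sketched roughly like this. The directory and report paths are placeholders, and the URL pattern is deliberately simple; this is a sketch of the idea, not a finished tool:

```python
import csv
import re
from pathlib import Path

import requests

# Placeholder paths -- adjust to wherever the EAD exports actually live.
EAD_DIR = Path("ead_exports")
REPORT = Path("broken_links.csv")

# Simple pattern for http(s) URLs; stops at whitespace, quotes, or angle brackets.
URL_PATTERN = re.compile(r'https?://[^\s"\'<>]+')


def check_url(url, timeout=10):
    """Return the HTTP status code, or the exception name if the request fails."""
    try:
        # HEAD is cheaper than GET; some servers reject it (405/501), so fall back.
        resp = requests.head(url, timeout=timeout, allow_redirects=True)
        if resp.status_code in (405, 501):
            resp = requests.get(url, timeout=timeout, allow_redirects=True)
        return resp.status_code
    except requests.RequestException as exc:
        return type(exc).__name__


def main():
    with REPORT.open("w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["file", "url", "status"])
        for xml_file in EAD_DIR.glob("*.xml"):
            text = xml_file.read_text(encoding="utf-8")
            # De-duplicate so each URL in a file is checked once.
            for url in sorted(set(URL_PATTERN.findall(text))):
                status = check_url(url)
                if status != 200:  # log anything that isn't a plain OK
                    writer.writerow([xml_file.name, url, status])


if __name__ == "__main__":
    main()
```

Scanning the raw text with a regex sidesteps namespace handling entirely; if you want to be stricter, lxml can pull just the href attributes from <dao> and <extref> elements instead.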

The problems with this method are: 1. Exporting thousands of resources takes a lot of time and some processing power, as well as a moderate amount of local storage. 2. Even checking the raw XML takes considerable time; the app I'm working on runs overnight to export and check all the XML files. I considered pinging the API for the different parts of a resource, but figured that would take as long as exporting the EAD.xml and be even more complex to write. I've checked Awesome ArchivesSpace, this listserv, and a few institutional script libraries, but haven't found exactly what I'm looking for.
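On the runtime problem: much of an overnight run like this is likely spent waiting on the network rather than parsing, so the status checks can be overlapped with a thread pool from the standard library. A minimal sketch, assuming a check function like the one described above (the worker count is a guess; tune it to what the remote servers tolerate):

```python
from concurrent.futures import ThreadPoolExecutor

import requests


def status_of(url, timeout=10):
    """Return (url, status), where status is an HTTP code or an exception name."""
    try:
        resp = requests.head(url, timeout=timeout, allow_redirects=True)
        return url, resp.status_code
    except requests.RequestException as exc:
        return url, type(exc).__name__


def check_many(urls, workers=20):
    """Check URLs concurrently so network waits overlap instead of running serially."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(status_of, urls))
```

With twenty workers the checking phase should shrink by roughly an order of magnitude, which may matter more than optimizing the export side.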

Any info or advice would be greatly appreciated! Thanks!



Corey Schmidt
ArchivesSpace Project Manager
University of Georgia Special Collections Libraries
Email: Corey.Schmidt at uga.edu
