[Archivesspace_Users_Group] Mass export of EAD

Suda, Phillip J psuda1 at tulane.edu
Tue Jul 28 12:13:56 EDT 2015


Thanks all for your suggestions/scripts/help. This is a great start.

Thank you,

Phil


Phillip Suda
Systems Librarian
Howard-Tilton Memorial Library
Tulane University
psuda1 at tulane.edu
504-865-5607



From: archivesspace_users_group-bounces at lyralists.lyrasis.org [mailto:archivesspace_users_group-bounces at lyralists.lyrasis.org] On Behalf Of Kevin Clair
Sent: Tuesday, July 28, 2015 10:41 AM
To: Archivesspace Users Group <archivesspace_users_group at lyralists.lyrasis.org>
Subject: Re: [Archivesspace_Users_Group] Mass export of EAD

I have a Perl script I run from the command line that handles every batch export I want or need at this point: https://github.com/duspeccoll/as_utils/blob/master/reports.pl

It grabs the JSON list of all the IDs for a given model, and then either dumps everything into a single JSON object or exports to some other format. The EAD export is lines 206-224. This is *extremely* customized for our environment, and I've made no effort yet to modify it for general use, but it gives an idea of how one could go about it.  -k
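
For anyone who would rather not adapt the Perl, here is a minimal shell sketch of the same pattern, assuming curl and jq are installed, the backend listens on the default port 8089, and the host, repository id, and credentials are placeholders to substitute for your own:

HOST="http://localhost:8089"   # backend URL (placeholder)
REPO=2                         # repository id (placeholder)
# log in and pull the session token out of the JSON reply
TOKEN=$(curl -s -F password="admin" "$HOST/users/admin/login" | jq -r '.session')
# fetch the JSON array of all resource ids, then append each record's JSON to one file
for id in $(curl -s -H "X-ArchivesSpace-Session: $TOKEN" "$HOST/repositories/$REPO/resources?all_ids=true" | jq -r '.[]'); do
  curl -s -H "X-ArchivesSpace-Session: $TOKEN" "$HOST/repositories/$REPO/resources/$id"
  echo
done > resources.json

The output here is one JSON document per resource rather than a single merged object, but the id-list-then-loop pattern is the same as in the script above.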

From: archivesspace_users_group-bounces at lyralists.lyrasis.org [mailto:archivesspace_users_group-bounces at lyralists.lyrasis.org] On Behalf Of Steven Majewski
Sent: Tuesday, July 28, 2015 9:34 AM
To: Archivesspace Users Group
Subject: Re: [Archivesspace_Users_Group] Mass export of EAD


There is an ead_export.sh script in the scripts directory.
It exports only published collections, but that can be changed in the code if needed.
The script runs locally on the ArchivesSpace server and writes into the archivesspace/data/
directory, so you need write access there.

Resource ids will not necessarily be sequential after deletions and transfers, but you
can get a JSON list of all of the ids from /repositories/$REPO_ID/resources?all_ids=true
and then loop over those ids.
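
A minimal sketch of that loop in shell, assuming curl and jq, a session token already in hand (see the login example further down), and placeholder host and repository values:

TOKEN="[session token]"        # from POST /users/:username/login
HOST="http://localhost:8089"   # backend URL (placeholder)
REPO=2                         # repository id (placeholder)
# get the JSON array of all resource ids, then export each one as EAD
for id in $(curl -s -H "X-ArchivesSpace-Session: $TOKEN" "$HOST/repositories/$REPO/resources?all_ids=true" | jq -r '.[]'); do
  curl -s -H "X-ArchivesSpace-Session: $TOKEN" "$HOST/repositories/$REPO/resource_descriptions/${id}.xml" > "ead_${id}.xml"
done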

— Steve Majewski


On Jul 28, 2015, at 11:15 AM, Alexander Duryee <alexanderduryee at nypl.org> wrote:

Phil,
As far as I'm aware, there's no bulk EAD export functionality in ASpace.  However, since ASpace's resource identifiers are sequential integers, you can loop over each resource id in a repository and make an API call for its EAD record:
for x in {first..last}; do curl -H "X-ArchivesSpace-Session: [session token]" "https://[address]/repositories/[id]/resource_descriptions/${x}.xml" > aspace_${x}.xml; done
A loop like that should generate EAD records for each resource in your repository.
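
The '[session token]' placeholder comes from the backend's login endpoint; a quick sketch of fetching it, assuming the default admin account, the default backend port 8089, and jq for parsing the JSON reply (all placeholders to adjust):

# POST /users/:username/login returns JSON with a "session" key;
# send that value back on later calls in the X-ArchivesSpace-Session header
TOKEN=$(curl -s -F password="admin" "http://localhost:8089/users/admin/login" | jq -r '.session')
# then: curl -H "X-ArchivesSpace-Session: $TOKEN" ... in the loop above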
Regards,
--Alex

On Tue, Jul 28, 2015 at 10:27 AM, Suda, Phillip J <psuda1 at tulane.edu> wrote:
Greetings all,

Is there an API or mass export feature for exporting all EAD records from a repository? I am only seeing a collection-level export feature.

Thanks,

Phil

Phillip Suda
Systems Librarian
Howard-Tilton Memorial Library
Tulane University
psuda1 at tulane.edu
504-865-5607


_______________________________________________
Archivesspace_Users_Group mailing list
Archivesspace_Users_Group at lyralists.lyrasis.org
http://lyralists.lyrasis.org/mailman/listinfo/archivesspace_users_group



--
Alexander Duryee
Metadata Archivist
New York Public Library
(917)-229-9590
alexanderduryee at nypl.org
_______________________________________________
Archivesspace_Users_Group mailing list
Archivesspace_Users_Group at lyralists.lyrasis.org
http://lyralists.lyrasis.org/mailman/listinfo/archivesspace_users_group


