[Archivesspace_Users_Group] Loading long CSV files for accessions in ArchivesSpace

Blake Carver blake.carver at lyrasis.org
Mon Oct 19 10:01:44 EDT 2020


# specifies whether the "Load Digital Objects" button is available at the Resource Level
AppConfig[:hide_do_load] = false
# upper row limit for an excel spreadsheet
AppConfig[:bulk_import_rows] = 1000
# maximum size (in KiloBytes) for an excel spreadsheet
AppConfig[:bulk_import_size] = 256

https://github.com/archivesspace/archivesspace/blob/master/common/config/config-defaults.rb#L676-L682

These limits are set in config.rb.
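
If you need to load a larger spreadsheet, those defaults can be overridden in your instance's config.rb. A minimal sketch, with illustrative values only (not recommendations); the application typically needs to be restarted for config.rb changes to take effect:

# in config/config.rb -- illustrative values, raise only as far as you need
AppConfig[:bulk_import_rows] = 5000   # default is 1000 rows
AppConfig[:bulk_import_size] = 1024   # default is 256 (kilobytes)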

________________________________
From: archivesspace_users_group-bounces at lyralists.lyrasis.org <archivesspace_users_group-bounces at lyralists.lyrasis.org> on behalf of Bowers, Kate A. <kate_bowers at harvard.edu>
Sent: Monday, October 19, 2020 9:28 AM
To: Archivesspace Users Group <archivesspace_users_group at lyralists.lyrasis.org>
Subject: Re: [Archivesspace_Users_Group] Loading long CSV files for accessions in ArchivesSpace


Do you have an idea of what the limit is? That is: what is the largest number of rows you have successfully imported as accessions via spreadsheet?

Thanks!



Kate



Kate Bowers

Collections Services Archivist for Metadata, Standards, and Systems

Harvard University Archives

kate_bowers at harvard.edu

https://archives.harvard.edu

617-998-5238



From: archivesspace_users_group-bounces at lyralists.lyrasis.org <archivesspace_users_group-bounces at lyralists.lyrasis.org> On Behalf Of philip.webster at sheffield.ac.uk
Sent: Monday, October 19, 2020 9:11 AM
To: Archivesspace Users Group <archivesspace_users_group at lyralists.lyrasis.org>
Subject: [Archivesspace_Users_Group] Loading long CSV files for accessions in ArchivesSpace



Hi,

One thing we have noticed when attempting to batch-load accessions using the CSV import capability in ArchivesSpace is that there appears to be some sort of limit that prevents us from loading large numbers of items.

I can think of several reasons why this might be the case:

  1.  There may be a hard-coded limit on the number of rows that can be processed in a single background process.
  2.  There may be a relationship between the amount of memory allocated to ArchivesSpace and the number of rows that can be processed.
  3.  There may be a limited amount of time the background process can run for.



Our current workaround is to split the accessions import CSV into multiple files and load them in sequence.
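
(A minimal sketch of that splitting step in Ruby, assuming the CSV has a header row and using a 1000-row chunk size and placeholder file names, none of which are prescribed by ArchivesSpace:)

require 'csv'

rows = CSV.read('accessions.csv', headers: true)
rows.each_slice(1000).with_index(1) do |chunk, i|
  CSV.open("accessions_part#{i}.csv", 'w') do |out|
    out << rows.headers                 # repeat the header row in every part
    chunk.each { |row| out << row }
  end
end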

Does anyone on the mailing list know what could be causing the batch process to stop before finishing an entire CSV file?



Regards,

Philip Webster

The University of Sheffield



