[Archivesspace_Users_Group] MARC XML import

Brian Harrington brian.harrington at lyrasis.org
Tue Jan 24 11:13:20 EST 2023

Hi Sarah,

  1.  The MARCXML importer gets the language from 008/35-37.  I’m not sure what happened to it, but the 008 in your record looks short (a bibliographic 008 should be exactly 40 characters).  Looking at the code, if the language positions are blank, the language is set to undefined.  But that only works if those positions actually contain blanks; a truncated 008 may not reach them at all.
  2.  Underneath the error messages, you can see a representation of the JSON record that ASpace is trying to load.  If you look at “notes”=>, the first one (position 0) has an empty "subnotes"=>[{"jsonmodel_type"=>"note_text"}].  You can see this is an odd note with a label of “Publication date”.  In this case, ASpace is trying to create a date note from the 260$c, which is blank in your record, so it creates a bad note.  However, it’s only trying to create that note because it can’t get the date from the 008.  ASpace only recognizes 008 dates if 008/06 is an i, k, or s, and you have a q there.  It probably should recognize the q, but the quick fix would be to change that to an i or an s.
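The two 008 checks above can be sketched in a few lines of Python. This is a hypothetical illustration of the positions described (per the MARC 21 bibliographic 008 definition), not ASpace’s actual importer code, and the function name `check_008` is made up for this example:

```python
def check_008(field_008):
    """Return (date_type_ok, language) for a MARC 21 bibliographic 008."""
    # A bibliographic 008 must be exactly 40 characters; a short field
    # means the language positions (35-37) simply don't exist.
    if len(field_008) != 40:
        raise ValueError(f"008 is {len(field_008)} chars; expected 40")

    date_type = field_008[6]        # 008/06: type of date
    language = field_008[35:38]     # 008/35-37: language code

    # ASpace only reads the 008 dates when 008/06 is i, k, or s.
    date_type_ok = date_type in ("i", "k", "s")
    return date_type_ok, language

# A record with 'q' (questionable date) in 008/06 and language 'eng':
# the date is skipped (so ASpace falls back to 260$c), but the
# language code is still readable.
ok, lang = check_008("230124q19001950xx " + " " * 17 + "eng d")
```

Changing the `q` to an `i` or `s` in position 6 would make the first value come back true, which is the quick fix described above.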

I hope this helps.


Brian Harrington (he/him)
Data Migration Specialist
brian.harrington at lyrasis.org

From: archivesspace_users_group-bounces at lyralists.lyrasis.org <archivesspace_users_group-bounces at lyralists.lyrasis.org> on behalf of Newhouse, Sarah <snewhouse at sciencehistory.org>
Date: Monday, January 23, 2023 at 5:05 PM
To: Archivesspace Users Group <archivesspace_users_group at lyralists.lyrasis.org>
Subject: [Archivesspace_Users_Group] MARC XML import
Hi all,

I would really appreciate another set of eyes on these two errors I’m getting while testing ingest of MARC XML (resource records) from our library catalog into 3.2:

  1.  lang_materials/0/language_and_script/language : Property is required but was missing
I’ve tried a few different things here and can’t pinpoint the problem. There isn’t a MARC field that maps to this ASpace field in the most recent import map<https://docs.google.com/spreadsheets/d/1jU6MYF7UI7a-UKdd5XhYCV6W1UyrMMCzYDFlgb8iNW8/edit#gid=1527709562> (unless I’m misreading it, which is entirely possible – it’s been a long day), but native ASpace resource records exported as MARCXML will import back in with language and script data, so I can tell there is a MARC field that maps to this. Based on MARC XML from resource records created in ASpace, I tried adding language info in 040$a or 041$b, but neither resulted in a successful import, just the same error message. Is there a MARC field I’m missing in the import map and/or the ASpace-generated MARC? The Help Center<https://archivesspace.atlassian.net/wiki/spaces/ArchivesSpaceUserManual/pages/917405730/Languages+Sub-Record+as+of+v2.7.0> does say this field can be imported as MARC XML, but doesn’t say how.
  2.  notes/0/subnotes/0/content : Property is required but was missing

I see that other folks have encountered this error with an index number to point them to the note that’s missing content, but lacking one here I’m not sure where to start. And looking at my sample files with my fallible human eyes I don’t see any empty fields. Has anyone run into this before? Is this just pointing at my missing language field?

If there’s a resource or guide I’ve missed – especially for interpreting import job errors – please point me to it! MARC XML is attached (with the added 040 and 041 fields) in case it’s helpful.

Thank you!


Sarah Newhouse   (she, her, hers)
Digital Preservation Archivist
Othmer Library of Chemical History
t. +1.215.873.8249

Science History Institute
Chemistry • Engineering • Life Sciences
315 Chestnut Street • Philadelphia, PA 19106 • U.S.A.
Learn about the scientific discoveries that changed our world at sciencehistory.org/learn<https://www.sciencehistory.org/learn>