2019-2020 Metadata Standards sub-team Work Plan
2019-2020 Work Plan Tasks and Goals
1. Defining our level of commitment to various library and archives standards
Task | Notes | Timeline/Assignment |
---|---|---|
Tier definitions | 1st tier: ArchivesSpace strives for optimal support of the standard; for structural standards, both import and export. 2nd tier: ArchivesSpace strives for compliant support; for structural standards, export only. | Agreed, 10/17/19; can be revisited as we work through mappings |
Determine what belongs in which tier | 1st tier: EAD2002, MARC, and DACS. 2nd tier: Dublin Core, EAD3, EAC-CPF (to be re-evaluated once the Agent module work is done), OAI-PMH | Agreed, 10/17/19; can be revisited as we work through mappings |
Support for Emerging Standards | Monitoring standards changes, and commenting on behalf of AS community as warranted, e.g. RiC | As needed |
2. Maintaining published metadata mappings
(this will need to be broken down into tasks, and we will need to ensure that we clearly define our level of commitment)
Task | Notes | Timeline/Assignment |
---|---|---|
List existing mappings (in rough priority order) | | Done |
Getting some sample imports and exports | Kevin - MARC (done); Jared - EAD (done). Should be ongoing as we identify testing needs | |
Find workspace to store records | Completed - James and Christine created repos for us to use | |
Determine test process for import | Set up a process to document checks: (1) divide up the elements to check; (2) check the fields that can be checked easily, much of which can be done in sandbox.archivesspace.org; (3) isolate import fields that are harder to check, e.g. long-tail elements such as <colspec>, odd <frontmatter> content, or <ptrloc> (https://www.loc.gov/ead/tglib/elements/ptrloc.html); (4) find or create records that exercise those long-tail elements (see the element-coverage sketch after this table). Google Sheet for tracking review: https://docs.google.com/spreadsheets/d/1jU6MYF7UI7a-UKdd5XhYCV6W1UyrMMCzYDFlgb8iNW8/edit?usp=sharing | |
Determine test process for export | Create a sample record (or records) with every field filled out, with each field's content being the field's name. Make sure we cover expected cases (dates, extents) that need multiple instances of repeated fields (see the export-check sketch after this table). | |
Ensuring that code changes impacting metadata mappings are communicated would be a great contribution | Instructions to community members on how to create JIRA tickets for review. Questions/Comments | |
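For the import test process above, one way to make sure our sample records actually cover the long-tail elements is to inventory element usage across the test files. The sketch below is a rough, illustrative helper, not an agreed part of the process: the `ead-samples` folder name and the `LONG_TAIL` element list are placeholder assumptions.

```python
# Sketch: count element usage across a folder of sample EAD files so we can see
# which long-tail elements (e.g. <colspec>, <frontmatter>, <ptrloc>) our test
# records actually exercise. Folder name and element list are assumptions.
from collections import Counter
from pathlib import Path
import xml.etree.ElementTree as ET

SAMPLES_DIR = Path("ead-samples")                 # hypothetical folder of test EAD files
LONG_TAIL = {"colspec", "frontmatter", "ptrloc"}  # elements we want covered by tests

def local_name(tag: str) -> str:
    """Strip any XML namespace from a tag, e.g. '{urn:isbn:...}ptrloc' -> 'ptrloc'."""
    return tag.rsplit("}", 1)[-1]

counts = Counter()
for path in SAMPLES_DIR.glob("*.xml"):
    for _, elem in ET.iterparse(str(path)):
        counts[local_name(elem.tag)] += 1

missing = LONG_TAIL - set(counts)
print("Element usage across sample files:")
for name, n in counts.most_common():
    print(f"  {name}: {n}")
if missing:
    print("Long-tail elements with no test coverage yet:", ", ".join(sorted(missing)))
```

The output could be pasted into the tracking Google Sheet so gaps in coverage are visible before we start checking imports.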
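For the export test process, the "content equals field name" convention makes the review easy to script: once the sample record is exported, we can check that every field-name string we typed in appears somewhere in the output. The sketch below assumes that convention; the export file name and the `EXPECTED_FIELDS` list are placeholders, not an agreed checklist.

```python
# Sketch: given an export (EAD or MARCXML) of the "every field filled in" sample
# record, confirm that each field-name string entered as content appears in the
# output. File name and field list below are illustrative placeholders.
from pathlib import Path

EXPORT_FILE = Path("sample_resource_export.xml")   # hypothetical export to review
EXPECTED_FIELDS = [                                # values typed into the sample record
    "Title", "Scope and Contents", "Biographical / Historical",
    "Conditions Governing Access", "Extent Number", "Container Summary",
]

text = EXPORT_FILE.read_text(encoding="utf-8")
missing = [name for name in EXPECTED_FIELDS if name not in text]

if missing:
    print("Field names not found in export (candidates for a mapping gap or a JIRA ticket):")
    for name in missing:
        print(f"  - {name}")
else:
    print("All expected field names were found in the export.")
```

A repeated field (e.g. a second date or extent) can be covered by giving each instance a distinct value such as "Extent Number 2" and adding it to the list.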
3. How DACS integration is supported and how tooltips are kept in sync
Task | Notes | Timeline/Assignment |
---|---|---|
Review existing tooltips for DACS compliance | To be reviewed at Nov meeting | |
Look at new DACS fields or suggestions | | |
4. Reviewing metadata-related development tickets and assessing to what extent a requested feature improvement will benefit the community
A new label for metadata-standards-related tickets has been created: https://archivesspace.atlassian.net/browse/ANW-918?jql=labels%20%3D%20metadata%20order%20by%20created%20DESC
→ To be a standing agenda item; 10 min of every monthly call for review, discussion, and if necessary a volunteer to draft a comment
5. Offer a mechanism to provide feedback
- During the TAC Monthly Meeting on October 15, Kevin proposed that the mechanism for capturing feedback could be to use JIRA in coordination with monitoring the listserv
- Other attendees seemed receptive to this, but there was equal interest in integrating Confluence with listserv activity or, perhaps, creating additional listservs
Ideas and Tasks For the Future