SoaF call 2015-10-19

Connection Info:
3pm Eastern
Google Hangout: https://hangouts.google.com/call/nkb5sb5bu5hamuzubmvqmrvnjea 

Attendees: 

Juliet Hardesty
Jeremy Morse
wgcowan 

Agenda

  • Interest from others at U Michigan (yay!)
  • Review deliverables and timeline
    • not just media but also image, disk image, other file types
    • segments of an XML document - TEI and other encoded texts; segments could possibly be referred to using XPath, though other options may be available
  • Review Avalon use case
  • Discuss phased approach
    • phase 1 - how to express information for Hydra to use to talk to players/viewers
    • phase 2 - how to store information in Fedora 4 so PCDM can understand
    • use phased approach for sure
    • not much value for Fedora 3 in storage recommendations
    • establish what needs to be used first (different types of selectors for different types of files), not sure about getting into how it should be stored
    • HydraConnect notes say storage is out of scope
    • media stream - 2 points (start and end)
    • image - more than just 2 points
    • section of XML - subcategory of a document
  • Next steps - Action items
    • set up SoaF Use Cases
    • review OA Selectors for formats, see if any formats we want to include are missing - Julie
    • IIIF use case - Will
    • XML use case - Jeremy
    • Next meeting Monday, Nov. 2, 4pm Eastern - meet every 2 weeks through end of year (mid-December)

Time-based media (Avalon) Use Case

Avalon uses W3C Media Fragments to call up segments of a file (for example, Track 2 is called up using https://pawpaw.dlib.indiana.edu/media_objects/avalon:1854/section/avalon:1855?t=131.0,332.0). This spec is used within the web page showing the player to call up a fragment of an audio or video media object. Avalon does not store URIs containing these parameters; instead it stores start and end time points in custom XML as a bit stream on the file object (MasterFile) that is part of the media object (MediaObject).
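To make the time-point mechanics concrete, here is a minimal sketch of pulling the start and end times out of the `t=start,end` parameter in the example URL above. This handles only the simple two-point form seen in the notes, not the full W3C Media Fragments grammar (clock times, single endpoints in the fragment component, etc.); the function name is ours, not Avalon's.

```python
# Minimal sketch: extract (start, end) seconds from a ?t=start,end
# query parameter, as in the Avalon example URL. Only the simple
# "seconds,seconds" form is covered, not the full Media Fragments spec.
from urllib.parse import urlparse, parse_qs

def parse_temporal_fragment(uri):
    """Return (start, end) in seconds from a t=start,end parameter, or None."""
    query = parse_qs(urlparse(uri).query)
    if "t" not in query:
        return None
    start_str, _, end_str = query["t"][0].partition(",")
    start = float(start_str) if start_str else 0.0
    end = float(end_str) if end_str else None
    return (start, end)

uri = ("https://pawpaw.dlib.indiana.edu/media_objects/"
       "avalon:1854/section/avalon:1855?t=131.0,332.0")
print(parse_temporal_fragment(uri))  # (131.0, 332.0)
```

This matches the "media stream - 2 points (start and end)" selector shape noted in the agenda: a segment of time-based media needs exactly two values.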

Avalon has not yet handled some needs and issues around referring to segments of a file. Right now a single text label is allowed for each pair of start and end time points, but no further descriptive metadata capabilities are available. The custom XML in use requires start and end times to be in a specific element (<Span>), and that element is not allowed to contain any further elements.
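A hedged sketch of the kind of custom XML described above may help illustrate the limitation. The element and attribute names here (`Item`, and `label`/`begin`/`end` on `<Span>`) are assumptions for illustration based on the description in these notes, not a confirmed Avalon schema; the point is that `<Span>` is a leaf element, so nothing richer than one text label can hang off a time segment.

```python
# Hedged illustration of the custom structural XML described above.
# Attribute names (label, begin, end) are assumptions, not confirmed
# Avalon schema. <Span> carries one label plus start/end times and
# may not contain child elements, so no richer metadata fits inside.
import xml.etree.ElementTree as ET

doc = """<Item label="Interview">
  <Span label="Track 2" begin="131.0" end="332.0"/>
</Item>"""

root = ET.fromstring(doc)
span = root.find("Span")
print(span.get("label"), span.get("begin"), span.get("end"))
print(len(list(span)))  # 0 -- <Span> has no child elements
```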

Another path annotations will take for audiovisual files is end-user annotation: making playlists, and making private annotations or segments. These don't go with the MasterFile object but do need to be stored somewhere and called up using the same method (W3C Media Fragments).