|
|
|
# MetaData discussions
|
|
|
|
|
|
|
|
## 1. Records classification & keywords
|
|
|
|
|
|
|
|
- Goal: Filter and classify records
|
|
|
|
- Easy implementation with the Zenodo REST API (and through eOSSR)
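
As a minimal sketch of what this looks like in practice, the query below lists public software records from the OSSR community on Zenodo. The community identifier `escape2020`, the `type` value and the helper name are illustrative assumptions; eOSSR wraps the same API with its own helpers.

```python
# Minimal sketch: list software records of the OSSR community via the Zenodo REST API.
# Assumptions: the community identifier is 'escape2020' and no token is needed for
# public records; eOSSR offers equivalent helpers on top of this API.
import requests

ZENODO_API = "https://zenodo.org/api/records"

def get_ossr_records(record_type="software", size=20):
    """Return public records of the given upload type from the escape2020 community."""
    response = requests.get(
        ZENODO_API,
        params={"communities": "escape2020", "type": record_type, "size": size},
    )
    response.raise_for_status()
    return response.json()["hits"]["hits"]

for record in get_ossr_records():
    print(record["id"], "-", record["metadata"]["title"])
```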
|
|
|
|
|
|
|
|
Issues:
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/eossr/-/issues/75
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/eossr/-/issues/68
|
|
|
|
|
|
|
|
### Analysis classification: analyses appear as software
|
|
|
|
|
|
|
|
- ESAP will probably need a more fine-grained classification (we need to wait for their input), e.g. for notebooks, workflows…?
|
|
|
|
- Keywords implementation:
|
|
|
|
- Use an already existing schema :)
|
|
|
|
- Use the existing Zenodo REST API :)
|
|
|
|
- Good for a first quick search (not too fine-grained); see the filtering sketch below
|
|
|
|
- ESAP does not exclude the possibility of building a database on their side
|
|
|
|
|
|
|
|
- My opinion: for the OSSR, the keywords implementation is enough. Defining a restricted set of generic keywords is possible and reasonable for providers to follow.
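
As a first-pass illustration of such keyword-based filtering, here is a minimal sketch that filters fetched records on their Zenodo `metadata.keywords` list; the keyword `jupyter-notebook` and the page size are purely illustrative.

```python
# Minimal sketch: first-pass, client-side keyword filtering of OSSR records.
# Assumes keywords are exposed in the standard `metadata.keywords` list of each
# Zenodo record; the keyword "jupyter-notebook" is only an illustrative value.
import requests

def filter_records_by_keyword(records, keyword):
    """Keep records whose metadata declares the given keyword (case-insensitive)."""
    keyword = keyword.lower()
    return [
        rec for rec in records
        if keyword in (kw.lower() for kw in rec.get("metadata", {}).get("keywords", []))
    ]

records = requests.get(
    "https://zenodo.org/api/records",
    params={"communities": "escape2020", "size": 100},
).json()["hits"]["hits"]

notebook_records = filter_records_by_keyword(records, "jupyter-notebook")
print(f"{len(notebook_records)} record(s) tagged 'jupyter-notebook'")
```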
|
|
|
|
|
|
|
|
### Domain classification
|
|
|
|
- Research infrastructures
|
|
|
|
- list of partners: LSST, CTA, Virgo, KM3NeT…
|
|
|
|
- Currently, the list appears only in the CodeMeta generator :-/
|
|
|
|
|
|
|
|
- Astronomy Thesaurus (proposal from Mark)
|
|
|
|
|
|
|
|
|
|
|
|
- What is the goal here? Use case?
|
|
|
|
|
|
|
|
- We need to differentiate between what is requested (because it is used by a service) and what is suggested as good practice
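
To make the keyword option concrete, here is a minimal sketch of how domain classification could be carried in a provider's `codemeta.json` through the standard `keywords` property; the terms below are illustrative examples, not a requested vocabulary.

```python
# Minimal sketch: domain classification carried by the standard `keywords` property of
# codemeta.json. The terms below (a research infrastructure and a thesaurus-like domain
# term) are illustrative only, not an agreed or requested vocabulary.
import json

codemeta_fragment = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "name": "example-analysis",          # hypothetical record name
    "keywords": [
        "CTA",                           # research infrastructure
        "gamma-ray astronomy",           # domain term, e.g. from an astronomy thesaurus
        "jupyter-notebook",              # record-type keyword for quick search
    ],
}

with open("codemeta.json", "w") as f:
    json.dump(codemeta_fragment, f, indent=2)
```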
|
|
|
|
|
|
|
|
## 2. Software / Analysis environment building
|
|
|
|
|
|
|
|
- Goal: being able to install the environment for the provided software or analysis
|
|
|
|
- Current status: soft requirement (see the "Reasonable set of software engineering practices")
|
|
|
|
|
|
|
|
- Issues:
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/ossr-pages/-/issues/16
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/eossr/-/issues/62
|
|
|
|
|
|
|
|
- Possible CodeMeta implementation: `buildInstructions` (see the sketch after this list)
|
|
|
|
- but its expected type is `URL`, while these instructions should be provided with the record (env.txt, conda YAML, Docker recipe…)
|
|
|
|
- is a relative path OK?
|
|
|
|
|
|
|
|
- My opinion: we should come up with a way to **require** these instructions
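
A minimal sketch of the `buildInstructions` option, assuming an absolute URL to the environment file shipped with the repository; whether a relative path such as `environment.yml` would also be accepted is exactly the open question above. The repository URL and file names are hypothetical.

```python
# Minimal sketch: `buildInstructions` pointing at the environment definition shipped with
# the record. CodeMeta expects a URL here; whether a relative path ("environment.yml",
# "Dockerfile") is acceptable is the open question above. URL and names are hypothetical.
import json

codemeta_fragment = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "name": "example-analysis",
    "buildInstructions": "https://gitlab.in2p3.fr/escape2020/example-analysis/-/blob/main/environment.yml",
    # Alternative under discussion: a path relative to the record, e.g. "environment.yml".
}

print(json.dumps(codemeta_fragment, indent=2))
```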
|
|
|
|
|
|
|
|
## 3. Software / Analysis running environment
|
|
|
|
|
|
|
|
- Goal: being able to run the analysis in an existing environment (i.e. a Docker container)
|
|
|
|
|
|
|
|
- Issues:
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/wossl/-/issues/6
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/eossr/-/issues/34
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/eossr/-/issues/65
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/eossr/-/issues/44
|
|
|
|
- https://gitlab.in2p3.fr/escape2020/wp3/eossr/-/issues/39
|
|
|
|
|
|
|
|
- CodeMeta implementation: `relatedLink` or `runtimePlatform`
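
A minimal sketch of the two options above, with `runtimePlatform` as a free-text platform string and `relatedLink` pointing to a published container image; both values are hypothetical placeholders.

```python
# Minimal sketch of the two CodeMeta options above for describing a running environment:
# `runtimePlatform` (free-text platform string) and `relatedLink` (URL, e.g. to a
# published container image). Both values are hypothetical placeholders.
import json

codemeta_fragment = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "name": "example-analysis",
    "runtimePlatform": "Python 3.9",
    "relatedLink": ["https://hub.docker.com/r/escape2020/example-analysis"],
}

print(json.dumps(codemeta_fragment, indent=2))
```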
|
|
|
|
|
|
|
|
My opinion:
|
|
|
|
There are two different needs:
|
|
|
|
- ease of re-use:
|
|
|
|
- provided mainly for humans in our metadata (it could be provided in the documentation rather than in the metadata)
|
|
|
|
- should not be required
|
|
|
|
- hard reproducibility with no human interaction:
|
|
|
|
- there are better ways to define a reusable analysis with a clearly defined environment (see e.g. REANA), but these are workflow- or platform-specific and therefore not in the OSSR scope
|
|
|
|
|