# ossr-curation merge requests
https://gitlab.in2p3.fr/vuillaume/ossr-curation/-/merge_requests

## [CURATE] gammapy/gammapy: v.0.19 (!35)
Merge request: https://gitlab.in2p3.fr/vuillaume/ossr-curation/-/merge_requests/35 (opened 2022-03-19 by CI)

=== Record #5721467 ===
Title: gammapy/gammapy: v.0.19
DOI: 10.5281/zenodo.5721467
URL: https://zenodo.org/record/5721467
A Python package for gamma-ray astronomy
## Check the software checklist for the entry
- [ ] Contains valid codemeta.json (see validator output)
- [ ] Documentation is provided in the Zenodo entry (at least through codemeta)
- [ ] It is a stable, versioned release of the project
- [ ] It is under an open-source license (see the [SPDX license list](https://spdx.org/licenses/))
- [ ] Follows a reasonable set of software development / software engineering practices (rough by-eye quality estimate)
## Complete onboarding issue
Related onboarding issue: XXX (to be entered by onboarding manager)
- [ ] Make sure all boxes of the checklist up to "Uploaded to Zenodo" are ticked
- [ ] Tick "software checklist completed" when done with the above
- [ ] When cleared for merging, tick "Added to Zenodo community/published" and change issue status to "closed"
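The codemeta check in the list above can be scripted against a record's file listing. A minimal sketch (the `has_codemeta` helper and the sample payload are hypothetical; a real check would fetch the file list from the Zenodo REST API for the record):

```python
# Hypothetical helper: given the "files" list of a Zenodo record,
# report whether the deposit ships a top-level codemeta.json.
def has_codemeta(files):
    return any(f.get("key", "").lower() == "codemeta.json" for f in files)

# Illustrative payload shaped like a Zenodo file listing.
record_files = [
    {"key": "gammapy-v0.19.zip", "size": 12345},
]
print(has_codemeta(record_files))  # False: no codemeta.json among the files
```

A curation bot could run this per record and post the result to the merge request, which is what the codemeta warnings in these MRs amount to.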
----
**!! There is no codemeta file in record 5721467 !!**

## [CURATE] OrcaNet (!34)
Merge request: https://gitlab.in2p3.fr/vuillaume/ossr-curation/-/merge_requests/34 (opened 2022-03-14 by CI)

=== Record #5996080 ===
Title: OrcaNet
DOI: 10.5281/zenodo.5996080
URL: https://zenodo.org/record/5996080
An open-source Python package for conveniently managing the training of deep neural networks on large datasets. It makes heavy use of Keras, TensorFlow, and h5py.
## Check the software checklist for the entry
- [ ] Contains valid codemeta.json (see validator output)
- [ ] Documentation is provided in the Zenodo entry (at least through codemeta)
- [ ] It is a stable, versioned release of the project
- [ ] It is under an open-source license (see the [SPDX license list](https://spdx.org/licenses/))
- [ ] Follows a reasonable set of software development / software engineering practices (rough by-eye quality estimate)
## Complete onboarding issue
Related onboarding issue: XXX (to be entered by onboarding manager)
- [ ] Make sure all boxes of the checklist up to "Uploaded to Zenodo" are ticked
- [ ] Tick "software checklist completed" when done with the above
- [ ] When cleared for merging, tick "Added to Zenodo community/published" and change issue status to "closed"
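The license item in the list above can be partly automated by matching the declared license against SPDX identifiers. A minimal sketch (only a tiny illustrative subset of the full SPDX list is inlined here; a real check would load the complete list from spdx.org):

```python
# Tiny illustrative subset of the SPDX license list (https://spdx.org/licenses/).
SPDX_IDS = {"MIT", "Apache-2.0", "BSD-3-Clause", "GPL-3.0-or-later", "MPL-2.0"}

def is_spdx_identifier(license_id):
    """Return True when license_id is a known SPDX identifier (subset only)."""
    return license_id in SPDX_IDS

print(is_spdx_identifier("Apache-2.0"))    # True
print(is_spdx_identifier("my-own-terms"))  # False
```

SPDX identifiers are case-sensitive and versioned (e.g. `GPL-3.0-or-later` vs `GPL-3.0-only`), so exact string matching is the right comparison here.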
----
**There are 16 warnings and 6 errors to take care of.
Please check the CI.**

## [CURATE] Dockerfile to extract Gravitational Wave data from the ESCAPE datalake (!33)
Merge request: https://gitlab.in2p3.fr/vuillaume/ossr-curation/-/merge_requests/33 (opened 2022-03-14 by CI)

=== Record #5742053 ===
Title: Dockerfile to extract Gravitational Wave data from the ESCAPE datalake
DOI: 10.5281/zenodo.5742053
URL: https://zenodo.org/record/5742053
This is a container to extract Gravitational Wave (GW) data from the datalake using Rucio and feed 1 second GW frames to the GW pipelines.
To run this container, it first needs to be built:

```shell
(sudo) docker build -t rucio-gwf-shared-vol-writer-public .
```

Then it can be run using the following command:

```shell
(sudo) docker run -e RUCIO_ACCOUNT=<rucio account name> \
    -v /path/to/client.crt:/opt/rucio/etc/client.crt \
    -v /path/to/client.key:/opt/rucio/etc/client.key \
    -v /path/to/output/frames:/dataout \
    rucio-gwf-shared-vol-writer
## Check the software checklist for the entry
- [ ] Contains valid codemeta.json (see validator output)
- [ ] Documentation is provided in the Zenodo entry (at least through codemeta)
- [ ] It is a stable, versioned release of the project
- [ ] It is under an open-source license (see the [SPDX license list](https://spdx.org/licenses/))
- [ ] Follows a reasonable set of software development / software engineering practices (rough by-eye quality estimate)
## Complete onboarding issue
Related onboarding issue: XXX (to be entered by onboarding manager)
- [ ] Make sure all boxes of the checklist up to "Uploaded to Zenodo" are ticked
- [ ] Tick "software checklist completed" when done with the above
- [ ] When cleared for merging, tick "Added to Zenodo community/published" and change issue status to "closed"
**!! There is no codemeta file in record 5742053 !!**

## [CURATE] cWB pipeline library: 6.4.1 (!32)
Merge request: https://gitlab.in2p3.fr/vuillaume/ossr-curation/-/merge_requests/32 (opened 2022-03-14 by CI)

=== Record #5798976 ===
Title: cWB pipeline library: 6.4.1
DOI: 10.5281/zenodo.5798976
URL: https://zenodo.org/record/5798976
This new release (cWB-6.4.1) is a major upgrade of cWB: it fixes minor problems with the previous version and extends some functionalities (such as the cwb_gwosc interface) to GWOSC data from the latest O3b data release. It is fully compatible, in terms of results, with the latest LIGO-Virgo-KAGRA published analyses of the data collected during the third observing run (O3).
Besides the standard Makefiles, we provide an alternative installation method based on cmake, which should simplify the installation process.
See https://gwburst.gitlab.io/ for more details.
Public git repository: https://gitlab.com/gwburst/public/library
## Check the software checklist for the entry
- [ ] Contains valid codemeta.json (see validator output)
- [ ] Documentation is provided in the Zenodo entry (at least through codemeta)
- [ ] It is a stable, versioned release of the project
- [ ] It is under an open-source license (see the [SPDX license list](https://spdx.org/licenses/))
- [ ] Follows a reasonable set of software development / software engineering practices (rough by-eye quality estimate)
## Complete onboarding issue
Related onboarding issue: XXX (to be entered by onboarding manager)
- [ ] Make sure all boxes of the checklist up to "Uploaded to Zenodo" are ticked
- [ ] Tick "software checklist completed" when done with the above
- [ ] When cleared for merging, tick "Added to Zenodo community/published" and change issue status to "closed"
**!! There is no codemeta file in record 5798976 !!**

## [CURATE] Astronomical data organization, management and access in Scientific Data Lakes (!31)
Merge request: https://gitlab.in2p3.fr/vuillaume/ossr-curation/-/merge_requests/31 (opened 2022-03-14 by CI)

=== Record #6077556 ===
Title: Astronomical data organization, management and access in Scientific Data Lakes
DOI: 10.5281/zenodo.6077556
URL: https://zenodo.org/record/6077556
The data volumes stored in telescope archives are constantly increasing due to developments and improvements in instrumentation. Often the archives need to be stored on a distributed storage architecture provided by independent compute centres. Such a distributed data archive requires overarching data-management orchestration: tools that handle data storage and cataloguing and that steer transfers across different storage systems and protocols, while remaining aware of data policies and locality. In addition, it needs a common Authorisation and Authentication Infrastructure (AAI) layer that is perceived as a single entity by end users and provides transparent data access.
The scientific domain of particle physics also uses complex, distributed data-management systems. The experiments at the Large Hadron Collider (LHC) accelerator at CERN generate several hundred petabytes of data per year. These data are globally distributed to partner sites and to users at national compute facilities. Several innovative tools were developed to successfully address the distributed computing challenges in the context of the Worldwide LHC Computing Grid (WLCG).
The work carried out in the ESCAPE project, and in its Data Infrastructure for Open Science (DIOS) work package, is to prototype a Scientific Data Lake using the tools developed in the context of the WLCG, bringing together different scientific disciplines while addressing FAIR standards and Open Data. We present how the Scientific Data Lake prototype is applied to astronomical data use cases, introduce the software stack, and discuss some of the differences between the domains.
**!! There is no codemeta file in record 6077556 !!**