Content from Introduction
Estimated time: 40 minutes
Overview
Questions
- Why is interoperability important when dealing with research data?
Objectives
- Understand the importance of interoperability for data reuse
- Identify the key elements of an interoperable data format
- Identify characteristics that make a NetCDF dataset interoperable
Understand the importance of interoperability for data reuse
Exercise (multiple choice; it can also be run as think-pair-share to encourage discussion): You have two datasets about ocean temperature: one in CSV format with unclear column names, and one in NetCDF format following CF conventions. Which dataset would be easier to reuse, and why?
- CSV, because it is a simple text file
- NetCDF, because it follows shared conventions
- Both are equally reusable

Solution: NetCDF, because it follows shared conventions.
Exercise: True/False (or Agree/Disagree), with discussion afterwards. Decide whether each of the following statements is true or false:
- “As long as data are open access, they are interoperable.”
- “Metadata standards help ensure interoperability.”
- “As long as data uses an open, standard format, it is interoperable.” (hint: this statement connects to the next section)
Solution: False, True, False.
This exercise is intended for discussion in plenary, and it can serve as a good link to the next section.
Identify the key elements of an interoperable data format
Identify characteristics that make a NetCDF dataset interoperable
- Interoperability in the context of research data refers to the ability of systems, datasets, and tools to work together seamlessly.
- Interoperability can occur at multiple levels: technical (compatible formats), semantic (shared vocabularies), organizational (common policies), and legal (licensing).
Content from Protocols to retrieve web-hosted data
Estimated time: 0 minutes
Overview
Questions
- How can you access a NetCDF dataset hosted on the web without downloading the whole file?
Objectives
- Understand the DAP protocol for accessing web-hosted NetCDF data.
- Access a NetCDF file through an OPeNDAP interface, via the DAP protocol.
- Read a NetCDF file programmatically over the DAP protocol, using the `open_dataset` function from the xarray Python library (see the sketch below).
- Explore and manipulate a NetCDF file programmatically.
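As a minimal sketch of that workflow (the OPeNDAP URL below is a placeholder, not a real endpoint from the lesson):

```python
import xarray as xr

# Placeholder OPeNDAP endpoint; replace it with the URL of the dataset
# served by the DAP-enabled server used in this lesson.
url = "https://example.org/opendap/ocean_temperature.nc"

# Opening a remote dataset over DAP reads only the metadata
# (dimensions, coordinates, variables, attributes); the data values
# stay on the server.
ds = xr.open_dataset(url)

print(ds)            # summary of dimensions, coordinates, variables, attributes
print(ds.data_vars)  # data variables available in the dataset
```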
Exercise: TRUE or FALSE?
Is this statement true or false?
> The `xarray.open_dataset()` function you used has downloaded the dataset file to your computer.
Why do you think so?
No. The data has been accessed with the DAP protocol, which allows you to explore and summarise the dimensions and metadata of the dataset without downloading the data values to your computer.
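To see this lazy behaviour in practice, here is a sketch continuing from the dataset opened above (the variable name `sea_water_temperature` and the `time` dimension are hypothetical):

```python
# Inspecting a variable only touches metadata; no values are transferred.
print(ds["sea_water_temperature"].shape)

# Values are transferred only when you explicitly request them,
# for example by selecting a small subset and loading it:
subset = ds["sea_water_temperature"].isel(time=0).load()
print(subset.mean().item())
```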
Content from Conventions and standards
Estimated time: 0 minutes
Overview
Questions
- How do metadata conventions, such as the CF conventions, make a NetCDF dataset interoperable?
Objectives
By the end of this episode, learners will be able to:
- Evaluate the adherence of a NetCDF file to the CF conventions.
- Convert NetCDF file metadata so that it follows the CF conventions (see the sketch below).
- Detect interoperability gaps in a specific data format or data package.
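As a minimal sketch of what fixing CF metadata can look like with xarray (the file name `temperature.nc`, the variable name `temp`, and the attribute values are illustrative assumptions; a dedicated CF compliance checker gives a more thorough report):

```python
import xarray as xr

# Hypothetical input file; replace with the NetCDF file you are evaluating.
ds = xr.open_dataset("temperature.nc")

# Inspect the metadata that the CF conventions rely on.
print(ds.attrs.get("Conventions"))  # e.g. "CF-1.8", or None if missing
for name, var in ds.data_vars.items():
    print(name, var.attrs.get("standard_name"), var.attrs.get("units"))

# Add missing CF attributes (values shown are illustrative).
ds.attrs["Conventions"] = "CF-1.8"
ds["temp"].attrs["standard_name"] = "sea_water_temperature"
ds["temp"].attrs["units"] = "degree_Celsius"

# Write the corrected metadata to a new file.
ds.to_netcdf("temperature_cf.nc")
```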
Content from Publish datasets via REST API
Estimated time: 40 minutes
Overview
Questions
- What is a REST API?
- Why are APIs an example of interoperability?
- How do you create a draft dataset using the 4TU.ResearchData REST API?
- How do you submit a dataset for review using the 4TU.ResearchData REST API?
Objectives
- Understand why APIs are interoperable protocols.
- Know how to submit data to a data repository via its API
APIs are interoperable protocols ….
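As a rough illustration of what creating a draft dataset through a repository REST API can look like, here is a sketch using Python's `requests` library. The base URL, endpoint path, and token scheme follow the Figshare-compatible v2 API style that 4TU.ResearchData uses, but treat them as assumptions and verify them against the repository's API documentation:

```python
import requests

# Personal API token, generated in your 4TU.ResearchData account settings.
TOKEN = "YOUR-API-TOKEN"

# Assumed endpoint for creating a draft dataset (Figshare-compatible v2 API);
# check the 4TU.ResearchData API documentation for the exact URL.
url = "https://data.4tu.nl/v2/account/articles"

metadata = {
    "title": "Ocean temperature measurements (draft)",
    "description": "Example draft dataset created via the REST API.",
}

response = requests.post(
    url,
    headers={"Authorization": f"token {TOKEN}"},
    json=metadata,
    timeout=30,
)
response.raise_for_status()

# The response typically contains the location or identifier of the new draft.
print(response.json())
```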