The TUM Research Data Hub is the central point of contact for Research Data Management (RDM) support. These are our services for the TUM community:
- Training: includes courses like Research Data Management Essentials. More info: Training
- Consulting: e.g., for developing an RDM strategy for a research project. Please contact researchdata(at)tum.de
- Infrastructure and tools: such as TUM DataTagger, mediaTUM, and TUM eLabFTW. More info: Infrastructure & Tools
- Data Stewards: our experts who provide guidance and actively work on implementing the TUM Guidelines for Handling Research Data. More info: Request a Data Steward
TUM Research Data Hub
A joint initiative of the University Library and the Munich Data Science Institute. It is the central contact point for all questions related to research data.
For any inquiries on research data, please contact researchdata(at)tum.de
Research Data Services
The Research Data Services team, including Data Stewards, focuses on research data management. It offers training, tools, and consultation, e.g., on creating data management plans. The Research Data Services are located at the University Library on the main campus.
Munich Data Science Institute (MDSI)
The MDSI Data Services team, including Data Stewards, focuses on research involving data science, machine learning, and AI. It offers support and networking events. The MDSI offices are on the Garching campus.
A data steward manages research data, ensures proper storage, implements guidelines, and supports project teams with data science expertise. A data steward based at the TUM Research Data Hub works partly for third-party funded research projects and also provides research data management services for the whole TUM, such as training and consultation.
More info: Request a Data Steward
Research Data Management (RDM) is the organization, storage, documentation, and preservation of research data throughout a research project. Effective RDM ensures data usability, reproducibility, and compliance with disciplinary standards. It also facilitates data sharing and long-term archiving, aligning with good scientific practice and funding requirements.
Tip: Start RDM early in your project, as it saves time later.
More info: Research Data Management
The acronym FAIR stands for Findable, Accessible, Interoperable, and Reusable. These principles should be followed and documented to increase the usability of research data. In addition, they are often required for funding by the German Research Foundation (DFG) and by Horizon Europe/the European Research Council (ERC). You can learn more about FAIR in our training.
More info: Training
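To make the FAIR principles more tangible, here is a minimal sketch of the kind of documentation a dataset should carry. The field names loosely follow the public DataCite metadata schema; all values, including the DOI, are placeholders for illustration only.

```python
# Minimal, DataCite-style metadata record sketched as a Python dict.
# All values are placeholders; real records are created via the repository.
dataset_metadata = {
    # Findable: a persistent identifier plus a descriptive title
    "identifiers": [{"identifier": "10.xxxx/placeholder", "identifierType": "DOI"}],
    "titles": [{"title": "Example measurement series"}],
    "creators": [{"name": "Doe, Jane", "affiliation": "TUM"}],
    # Accessible: published through a repository such as mediaTUM or Zenodo
    "publisher": "mediaTUM",
    "publicationYear": "2025",
    # Interoperable: open, standard file formats
    "formats": ["text/csv"],
    # Reusable: a clear license and documentation of the methods used
    "rightsList": [{"rights": "CC BY 4.0"}],
    "descriptions": [
        {
            "description": "How, when, and with which instrument the data were collected.",
            "descriptionType": "Methods",
        }
    ],
}
```

Repositories typically collect these fields through their upload forms; the sketch only shows which information you should have ready.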
In a data management plan (DMP), you describe how you will handle your research data during the research project and after it ends. This planning is often relevant for third-party funding.
We can help you write your DMP. Please contact researchdata(at)tum.de
More info: Submit application & planning
Funders often require clear descriptions of data types, workflows, storage, publication plans, and compliance with FAIR principles and legal/ethical standards in research data management (RDM). Proposals must justify RDM-related costs and outline long-term data preservation. These aspects are reviewed during application and project reporting, directly affecting funding decisions.
Data Management Plans (DMPs) are often required when applying for third-party funding and should be updated throughout the project to reflect any changes.
- For Horizon Europe, a full DMP is always required.
- For DFG, it is sometimes sufficient to address data handling in the proposal.
More info: Submit application & planning
When applying for funding (e.g., DFG, Horizon Europe, ERC), you should clearly explain how you will handle data throughout your project. The exact requirements differ, but you can use the DFG checklist, the Horizon Europe template, and DMP tools, all of which are linked and explained in our DMP guidelines.
We can help you describe your research data management strategy in your proposal. Please contact researchdata(at)tum.de
More info: Submit application & planning
Some funding programs recommend and promote having a Data Steward position. If your funding agency offers to finance a part-time Data Steward, you may be eligible for co-financing by TUM, provided the position is requested through the TUM Research Data Hub.
More info: Request a Data Steward
Publishing data increases transparency and citation potential. Please consider the following (a minimal upload example follows the list):
- Use repositories like mediaTUM, Zenodo, or subject-specific repositories.
- Include comprehensive metadata.
- Choose a suitable license, e.g., CC BY.
- Assign a persistent identifier, like a DOI.
- Ensure data complies with ethical and legal standards.
More info: Publish & Share Data
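If you publish via a repository with an API, parts of this checklist can be scripted. The sketch below is based on Zenodo's publicly documented REST API; the access token, file name, and metadata values are placeholders, and other repositories such as mediaTUM have their own upload workflows.

```python
import requests

ZENODO_API = "https://zenodo.org/api/deposit/depositions"
TOKEN = "YOUR_ZENODO_ACCESS_TOKEN"  # placeholder personal access token

# 1. Create an empty draft record (deposition).
r = requests.post(ZENODO_API, params={"access_token": TOKEN}, json={})
r.raise_for_status()
deposition = r.json()

# 2. Upload the data file into the deposition's file bucket.
bucket_url = deposition["links"]["bucket"]
with open("results.csv", "rb") as fh:  # placeholder file name
    requests.put(f"{bucket_url}/results.csv",
                 data=fh, params={"access_token": TOKEN}).raise_for_status()

# 3. Attach descriptive metadata: title, creators, license, description.
metadata = {
    "metadata": {
        "title": "Example dataset",
        "upload_type": "dataset",
        "description": "Short description of the dataset and how it was collected.",
        "creators": [{"name": "Doe, Jane", "affiliation": "TUM"}],
        "license": "cc-by-4.0",  # license identifier; check the repository's list
    }
}
requests.put(deposition["links"]["self"],
             params={"access_token": TOKEN}, json=metadata).raise_for_status()

# 4. Publish the record; the repository then mints a DOI.
requests.post(deposition["links"]["publish"],
              params={"access_token": TOKEN}).raise_for_status()
```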
DOI (Digital Object Identifier): A permanent, unique identifier for a data publication that ensures it can be reliably found and cited. Once assigned, the dataset cannot be changed. Use a DOI when your dataset is finalized and will not change after publication.
Concept DOI: A special DOI used when datasets are expected to change over time. It links all versions of a dataset and always points to the most recent one, while each version also gets its own DOI. Use a Concept DOI if you anticipate future updates to your dataset. This allows others to cite either a specific version or always refer to the latest version.
More info: Publish & share data
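One practical benefit of a DOI is that citation metadata can be retrieved automatically. The sketch below uses DOI content negotiation via doi.org, which works for DataCite and Crossref DOIs; the DOI shown is a placeholder and must be replaced with a real one.

```python
import requests

doi = "10.xxxx/example-dataset"  # placeholder DOI

# doi.org supports content negotiation: request BibTeX instead of the landing page.
response = requests.get(
    f"https://doi.org/{doi}",
    headers={"Accept": "application/x-bibtex"},
)
response.raise_for_status()
print(response.text)  # a ready-to-use BibTeX entry for citing the dataset
```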
Assigning a suitable license to your research data and software supports reuse while protecting your work. It defines how others may use, share, and cite it.
More info: Handout on Information on Licenses for Published Scientific Data and Software Programs
Yes, you can. TUM encourages the reuse of high-quality datasets, which is especially valuable for rare or hard-to-reproduce data and for meta-analyses. When reusing data, please consider the following (a search example follows the list):
- Search data search engines such as DataCite Commons, BASE, and B2FIND, as well as repositories listed in re3data.org. Data journals and journal articles also often link to datasets.
- Check licensing and usage conditions.
- Always cite the original source and the data creators.
- Check if the data is well-documented.
- Evaluate data reliability and relevance before use.
More info: Reuse Data
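As mentioned in the first point, DataCite Commons is one place to search for published datasets. It is backed by the public DataCite REST API, which can also be queried directly; the search term below is only an example.

```python
import requests

# Query the public DataCite REST API (the index behind DataCite Commons).
params = {
    "query": "soil moisture",       # example search term
    "resource-type-id": "dataset",  # restrict results to datasets
    "page[size]": 5,
}
response = requests.get("https://api.datacite.org/dois", params=params)
response.raise_for_status()

for record in response.json()["data"]:
    attrs = record["attributes"]
    title = (attrs.get("titles") or [{}])[0].get("title", "untitled")
    print(attrs["doi"], "-", title)
```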
A data paper describes one or more datasets, including collection methods, quality, and limitations. It is peer-reviewed, linked to the dataset in a repository, and can lead to more citations.
More info: Dataset & data paper
In collaboration with the Leibniz Supercomputing Centre (LRZ) and the University Library, TUM offers a comprehensive range of tools and services to support Research Data Management across all data lifecycle phases. These solutions are tailored to different use cases, data types, and collaboration needs.
You can count on us to help you pick the right infrastructure solutions for your project and integrate them into your workflows. Please contact us at researchdata(at)tum.de
More info: Infrastructure & Tools
These tools support active research workflows, including documentation, collaboration, and short- to mid-term storage:
- TUM DataTagger for storing, versioning, sharing, and annotating research data with metadata, especially in interdisciplinary teams.
- Personal LRZ Cloud Storage for individual storage needs with regular backups and global access via WebDisk. Storage capacity is up to 400 GB per user.
- Institutional LRZ Cloud Storage for shared storage within Schools or research groups. Storage capacity is up to 100 TB per institution.
- Sync+Share for collaborative editing and syncing of files, including external sharing options. Storage capacity is up to 50 GB per user.
- TUM Data Science Storage (TUM-DSS) for storing and sharing large-scale scientific datasets. TUM-DSS can be accessed via TUM DataTagger. It supports storage in the TB to PB range per user.
- Microsoft 365 (OneDrive, SharePoint, Teams) for collaborative document editing and sharing. Storage capacity is up to 250 GB per user in OneDrive and up to 100 GB per user in SharePoint/Teams. Suitable only for non-confidential or encrypted data; make sure to follow data protection laws and to back up data yourself.
- LRZ GitLab for collaborative development and version control of research software.
- GigaMove (RWTH Aachen) for transferring large files to external partners. Suitable only for non-confidential or encrypted data. Storage capacity is up to 100 GB per user.
More info: Infrastructure & Tools
These services are designed for long-term archiving, backup, and publication of completed research data:
- mediaTUM for publishing research data, theses, publications, and multimedia content. It supports both restricted and public access.
- Backup & Archive (LRZ) for backup and archiving of data from systems within the Munich Science Network (MWN). Its standard archiving period is 10 years. It supports storage in the MB to PB range per user.
- ISAR Cloud Storage for archiving data from LRZ Cloud Storage. Data is archived for 10 years and then transferred to the LRZ Backup & Archive for a further 10 years. Storage capacity is up to 100 TB per user.
- Data Science Archive (DSA) for archiving large datasets from TUM-DSS with external sharing options. Storage capacity is up to 260 PB per user.
More info: Infrastructure & Tools
Backup:
Short-term, version-based protection against data loss for changing (“hot”) data.
Infrastructure that provides automatic backups: Sync+Share, TUM DataTagger, and LRZ Cloud Storage (personal or institutional NAS). Please note: Microsoft 365 (OneDrive, SharePoint, Teams) and GigaMove do not create backups, so you are responsible for backing up your data yourself.
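For services without automatic backups, even a small script can help. The sketch below copies a project folder to a time-stamped target, e.g., on a mounted LRZ cloud storage drive; both paths are placeholders and must be adapted to your setup.

```python
import shutil
from datetime import datetime
from pathlib import Path

# Placeholder paths: adapt to your project folder and backup location.
source = Path.home() / "my_project"
backup_root = Path("/mnt/lrz_cloud/backups")

# One time-stamped copy per run; older copies serve as simple versions.
target = backup_root / f"my_project_{datetime.now():%Y-%m-%d_%H%M}"
shutil.copytree(source, target)
print(f"Backed up {source} to {target}")
```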
Archiving:
Long-term, secure storage of finished (“cold”) data for compliance purposes. The default archiving duration recommended by the DFG is 10 years; there may be subject-specific deviations.
Example tools: LRZ Backup & Archive, ISAR Cloud Storage, Data Science Archive.
Long-term Archiving:
Indefinite storage for finished (“cold”) data to be preserved “forever,” suitable for particularly valuable and unique data.
More info: Archive data