About the LSDF

Revision as of 23:22, 6 December 2012

What is the Large Scale Data Facility Project?

The Large Scale Data Facility project at the [Steinbuch Centre for Computing (SCC)(in German)] recognises the importance of derived data for future science. Measurement data of empirical science quickly grows to petabyte scale. These large amounts of data can only be stored, processed and retrieved efficiently by an infrastructure dedicated to data-intensive computing. The LSDF also offers secure internet access to data for exchange between partners, and long-term archival of data for reference. The computing environment of the LSDF is already successfully used for the storage and analysis of images from High Throughput Microscopy and Light Sheet Microscopy projects of the [Institute of Toxicology and Genetics]. The LSDF builds on the extensive [LHC] data-handling experience SCC gained operating [GridKa], the German WLCG Tier 1 centre.

Why do scientists need the LSDF?

Data becomes more important in science and society every day. Experiments, instruments and measurements produce massive amounts of data around the clock. Transforming these data into scientific information, and later into general knowledge, requires services and facilities to manage, archive, explore and analyse this valuable information for the years to come. Delivering state-of-the-art solutions for handling massive amounts of scientific data is therefore a key enabling technology for the society of the 21st century.

Cooperation

The [Institute for Data Processing and Electronics (IPE)] is developing software for scientific workflows and for metadata support, which will eventually allow different scientific disciplines and data sources to interact with the data in similar ways. Graphical user interfaces as well as application programming interfaces will hide the complexity of the attached data storage and access technologies and will provide secure and efficient access worldwide. To enable high-throughput processing, frequently required data workflows will be optimised to run on the LSDF analysis farm.
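To illustrate the idea of a discipline-agnostic metadata interface, the following is a minimal sketch only; all class and field names here are hypothetical and are not part of the actual LSDF or IPE software:

```python
from dataclasses import dataclass, field


@dataclass
class DatasetRecord:
    """Discipline-agnostic description of one stored dataset (hypothetical schema)."""
    dataset_id: str
    discipline: str                            # e.g. "microscopy", "climate"
    storage_path: str                          # location inside the data facility
    extra: dict = field(default_factory=dict)  # discipline-specific key/value metadata


class MetadataCatalog:
    """In-memory stand-in for a metadata service shared across disciplines."""

    def __init__(self):
        self._records = {}

    def register(self, record: DatasetRecord) -> None:
        self._records[record.dataset_id] = record

    def find_by_discipline(self, discipline: str) -> list:
        # The query interface is the same regardless of the scientific domain.
        return [r for r in self._records.values() if r.discipline == discipline]


catalog = MetadataCatalog()
catalog.register(DatasetRecord("itg-001", "microscopy", "/lsdf/itg/plate42",
                               extra={"channels": ["GFP", "brightfield"]}))
catalog.register(DatasetRecord("met-007", "climate", "/lsdf/met/run7"))

hits = catalog.find_by_discipline("microscopy")
```

The point of such a shared record type is that tools built on top of it (search, archival, access control) need no knowledge of the discipline-specific details, which stay confined to the `extra` mapping.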

Image processing will benefit from research at the [Institute of Applied Computer Science (IAI)]. The IAI will extend existing image-processing filter cascades to autonomously identify and quantify heterogeneous structures in different types of microscopic images, e.g. for the mapping between fluorescence and brightfield channels. The main focus of the new algorithms is the handling of 3D information for correlative microscopy, which requires information fusion between 3D models derived at different scales from confocal microscopy, TEM and X-ray microscopy. Possible approaches are the detection of existing landmarks or the use of markers.
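One standard way to fuse 3D models via matched landmarks or markers is rigid point-set registration, for example with the Kabsch algorithm. The sketch below is only an illustration of that general technique under the assumption of already-matched landmark pairs; it is not the IAI's implementation, and it ignores the scale differences a real correlative-microscopy pipeline would also have to estimate:

```python
import numpy as np


def rigid_align(src, dst):
    """Return rotation R and translation t minimising ||R @ src_i + t - dst_i||
    over matched 3D landmark pairs (Kabsch algorithm). src, dst: (N, 3) arrays."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # 3x3 cross-covariance of the landmarks
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t


# Synthetic example: landmarks from one 3D model, and the same points as seen
# in a second modality, related by a 90-degree rotation about z plus a shift.
rng = np.random.default_rng(0)
src = rng.random((6, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
dst = src @ R_true.T + np.array([1.0, 2.0, 3.0])

R, t = rigid_align(src, dst)
residual = np.abs(src @ R.T + t - dst).max()  # near zero when alignment succeeds
```

In a real correlative setting the hard part is establishing the landmark correspondences in the first place; once they exist, the closed-form SVD solution above recovers the rigid transform directly.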

Publications and Marketing Documents

[LSDF doc]