About the LSDF

Revision as of 17:22, 18 October 2010

What is the Large Scale Data Facility Project?

The Large Scale Data Facility project at the [http://www.scc.kit.edu/ Steinbuch Centre for Computing (SCC)] recognises the importance of derived data for future science. Measurement data of empirical science quickly grows to petabyte scale. Such large amounts of data can only be stored, processed and retrieved efficiently by an infrastructure dedicated to data-intensive computing. The LSDF also offers secure internet access to data for exchange between partners, and long-term archival of data for reference. The computing environment of the LSDF is already used successfully for the storage and analysis of images from High Throughput Microscopy projects of the [http://www-itg.fzk.de/itg/itg_home.html Institute of Toxicology and Genetics].

The [http://www.ipe.kit.edu/english/index.php Institute for Data Processing and Electronics (IPE)] is developing software for scientific workflows and for metadata support which will eventually allow different scientific disciplines and data sources to interact with the data in similar ways. Graphical user interfaces as well as application programming interfaces will hide the complexity of the attached data storage and access technologies and will provide secure and efficient access worldwide. To enable high-throughput data processing, frequently required data workflows will be optimized to run on the LSDF analysis farm.
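The idea of hiding different storage and access technologies behind one programming interface can be sketched as follows. This is a minimal illustrative sketch, not the IPE software: the class and function names (`StorageBackend`, `InMemoryBackend`, `copy_dataset`) are hypothetical, and a real deployment would wrap technologies such as a parallel file system or an archive system behind the same interface.

```python
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """Hypothetical common interface in front of different storage technologies."""

    @abstractmethod
    def get(self, key: str) -> bytes:
        """Retrieve the data stored under `key`."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None:
        """Store `data` under `key`."""


class InMemoryBackend(StorageBackend):
    """Toy backend for illustration; real backends would talk to actual storage."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def get(self, key: str) -> bytes:
        return self._store[key]

    def put(self, key: str, data: bytes) -> None:
        self._store[key] = data


def copy_dataset(src: StorageBackend, dst: StorageBackend, key: str) -> None:
    # Client code never needs to know which technology sits behind each backend.
    dst.put(key, src.get(key))
```

Because all backends share the interface, workflow code written against `StorageBackend` runs unchanged whichever storage technology is attached.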

Image processing will benefit from the research at the [Institute of Applied Computer Science (IAI)]. The IAI will extend existing image-processing filter cascades to autonomously identify and quantify heterogeneous structures in different types of microscopic images, e.g. for the mapping between fluorescence and brightfield channels. The main focus of the new algorithms is the handling of 3D information for correlative microscopy, which requires information fusion between 3D models derived at different scales from confocal microscopy, TEM and X-ray microscopy. Possible approaches are the detection of existing landmarks or the use of markers.
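Landmark-based fusion of 3D models from different modalities typically reduces to aligning two sets of corresponding landmark points. A standard building block for this is the Kabsch algorithm, which finds the least-squares rigid transform between two point sets; the sketch below (a generic textbook method, not necessarily the IAI's algorithm, with the hypothetical name `kabsch_align`) shows the core computation.

```python
import numpy as np


def kabsch_align(src, dst):
    """Least-squares rigid alignment (Kabsch algorithm).

    Given two (N, 3) arrays of corresponding landmarks, returns a
    rotation R and translation t such that dst[i] ~ R @ src[i] + t.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Centre both landmark sets on their centroids.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # SVD of the cross-covariance matrix yields the optimal rotation.
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    # Correct the sign to exclude reflections.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

For correlative microscopy with differing scales, this rigid step would be extended with a scale factor (a similarity transform) or followed by non-rigid refinement; the landmark detection itself, which supplies the corresponding points, is the harder research problem mentioned above.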