Studentische Arbeiten am SCC
Revision as of 14:52, 10 March 2016
=== HiWi and Bachelor's Theses ===
* Web-based data transfer
* Development of simulation software for solving quantum kinetic equations
* Specification of a benchmark for data-intensive computing
* Distributed volunteer computing for scientific simulations
* File indexing service for the LSDF (Large Scale Data Facility)
* Integration of virtual machines into a Hadoop cluster
* Construction of a highly redundant Ganesha-based NFS server
* Integrate the WOS GPFS Bridge with GPFS
* Adding a UFTP endpoint to HPSS
* Visualisation of the SCC storage utilisation
* Setting up highly restricted Linux or Docker containers for SFTP access to archives
* Camera readout of webcams installed in the tape library with a Raspberry Pi
* Proof of concept for DSpace or Archivematica
* Open-sourcing an In-house Software Project
* MongoDB as an In-memory Sharded Database
* Optimisation of MongoDB Data Structures for KASCADE Cosmic-ray Data Centre
* Design and Deployment of a Sharded Cluster for the KASCADE Cosmic-ray Data Centre
* Market-based cloud resource allocation
=== Master's Theses ===
* Fast fixity checking with rsync
* Development of a web portal for scientific measurement and simulation data
* Development of a MapReduce framework using in-memory data transfer
* Integrating Kerberos- and Shibboleth-based authentication
* A meta-scheduler for scaling from the grid to the cloud
* Development of a simulation model for data access in digital archiving
* Implementation of a global storage layer on virtual machines
* Development of a profiling tool for monitoring the data traffic of MapReduce jobs
* Building a format-conversion workflow in Hadoop
* Graphical interface to the GPFS policy engine
=== Praxis der Software-Entwicklung (PSE) ===
Template: [[Thesis-Template]]