Distributed Computing Environments (DICE) Team
We are a group of computer scientists and IT experts from the Department of Computer Science AGH and ACC Cyfronet AGH. We are a curiosity- and research-driven team, specializing in large-scale distributed computing, HPC, Web and Cloud technologies. We develop new methods, tools and environments and we apply these solutions in e-Science, healthcare and industrial domains.


Platforms, tools and components developed by our team

DICE Product Portfolio


Atmosphere

The Atmosphere platform is responsible for maintaining an interface between the end-user tools developed in the VPH-Share project and the underlying hardware resources required to perform complex computations in a distributed Cloud environment. The Atmosphere approach is not to develop low-level cloud middleware or services, but rather to build on top of existing solutions – thus, Atmosphere integrates resources acquired from commercial IaaS providers (e.g. Amazon EC2) with privately-deployed open-source cloud platforms (e.g. OpenStack). The result is a hybrid Cloud infrastructure on which VPH collaboration partners can expose and access domain-oriented computational and data storage services representing various areas of medical science. More ...

Publications related to Atmosphere.

Common Information Space (CIS)

Data streams from sensors (e.g. dike sensors, volcano sensors) need to be processed in various ways in order to analyze a current trend, make a prediction, validate a model, or recommend an action. Applications that perform this processing implement sophisticated simulation models and are computationally intensive. Common Information Space (CIS) is an infrastructure for high-performance, high-throughput processing of sensor data streams and an environment in which end users run applications and manage their results. More ...
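The kind of trend analysis described above can be sketched as follows (an illustrative example, not CIS code; the sensor readings, window size and threshold are hypothetical):

```python
# A moving average over dike-sensor readings, used to flag a rising trend --
# the simplest instance of the stream processing an infrastructure like CIS hosts.
from collections import deque

def moving_average(readings, window=3):
    """Yield the running mean of the last `window` readings."""
    buf = deque(maxlen=window)
    for r in readings:
        buf.append(r)
        yield sum(buf) / len(buf)

def rising_trend(readings, window=3, threshold=0.5):
    """Return True if the smoothed signal rises by more than `threshold`."""
    smoothed = list(moving_average(readings, window))
    return smoothed[-1] - smoothed[0] > threshold

# Hypothetical water-pressure readings from a dike sensor:
readings = [1.0, 1.1, 1.4, 1.9, 2.6]
print(rising_trend(readings))
```

A real CIS application would run far heavier simulation models over such streams; the smoothing step merely illustrates the input/output shape of the processing.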

Publications related to CIS.


DataNet

DataNet enables lightweight metadata and data management in applications exploiting high-performance computing (HPC) resources. It lets users create ad-hoc data models and deploy them as actionable repositories with straightforward configuration of access restrictions; data recording and retrieval use a REST-like approach, so access is programming-language neutral. DataNet data models comprise structured data, typical of SQL database storage scenarios, as well as files, which are well suited to storing unstructured data. DataNet ensures scalability by implementing the storage backend on one of the available PaaS platforms combined with storage sites provided by HPC centres. More ...
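An ad-hoc data model and its REST-like access paths might look like the following sketch (the field names and URL layout are assumptions for illustration, not the actual DataNet API):

```python
# A hypothetical ad-hoc data model deployed as a REST-style repository,
# where each entity type maps to a resource path.
MODEL = {
    "measurement": {"sensor_id": str, "value": float},  # structured data
    "result_file": {"name": str, "size": int},          # file metadata
}

def resource_path(repository, entity, entity_id=None):
    """Build the REST-like path used to record or retrieve an entity."""
    path = f"/{repository}/{entity}"
    return f"{path}/{entity_id}" if entity_id is not None else path

def validate(entity, record):
    """Check a record against the ad-hoc model before recording it."""
    schema = MODEL[entity]
    return set(record) == set(schema) and all(
        isinstance(record[k], t) for k, t in schema.items()
    )

print(resource_path("myrepo", "measurement", 42))
print(validate("measurement", {"sensor_id": "d1", "value": 3.5}))
```

Because access is plain HTTP on such paths, any programming language with an HTTP client can record and retrieve data, which is the language-neutrality the paragraph describes.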

Publications related to DataNet.


GridSpace

GridSpace is a novel virtual laboratory framework enabling researchers to conduct virtual experiments on Grid-based resources and other HPC infrastructures. The current generation of GridSpace, GridSpace2, facilitates exploratory development of experiments by means of scripts which can be expressed in a number of popular languages, including Ruby, Python and Perl. The framework supplies a repository of gems enabling scripts to interface low-level resources such as PBS queues, EGEE computing elements, LFC directories and other types of Grid resources. Moreover, GridSpace2 provides a Web 2.0-based Experiment Workbench supporting joint development and execution of virtual experiments by groups of collaborating scientists. More ...
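As a rough sketch of what such a gem hides from the experiment developer, the following (hypothetical, not a real GridSpace gem) renders a minimal PBS batch script for one experiment step:

```python
# A gem wrapping a PBS queue ultimately has to produce and submit a batch
# script like this; the experiment script only supplies the command.
def pbs_script(job_name, command, walltime="01:00:00", nodes=1):
    """Render a minimal PBS batch script that runs `command`."""
    return "\n".join([
        "#!/bin/bash",
        f"#PBS -N {job_name}",
        f"#PBS -l walltime={walltime},nodes={nodes}",
        command,
    ])

script = pbs_script("virtual-experiment", "python analyse.py")
print(script)
```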

Publications related to GridSpace.


HyperFlow

HyperFlow is a lightweight tool that enables orchestration of scientific applications into complex pipelines, or scientific workflows. HyperFlow aids users in composing their applications into workflows, deploying them in the cloud, and executing them. A workflow in HyperFlow is described as a graph of workflow activities (called processes) using a simple JSON-based data structure. The activities can either be implemented in JavaScript or mapped to executable programs. In the latter case, the workflow developer only needs to associate each workflow activity with a previously prepared Virtual Machine image where the appropriate programs are installed. This approach makes HyperFlow equally suitable for experienced programmers who desire low-level programming capabilities and for domain specialists who only wish to construct pipelines out of existing modules. HyperFlow also automates workflow deployment and execution in the cloud. More ...
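A two-activity workflow graph of the shape described above might be assembled like this (the exact key names are an assumption based on the description, not the definitive HyperFlow schema):

```python
# A HyperFlow-style workflow graph: the first activity is mapped to an
# executable (run inside a prepared VM image), the second to a JavaScript
# function; signals carry data between them.
import json

workflow = {
    "name": "preprocess-and-plot",
    "processes": [
        {"name": "preprocess",
         "executable": "preprocess.sh",   # mapped to an executable program
         "ins": ["raw_data"], "outs": ["clean_data"]},
        {"name": "plot",
         "function": "makePlot",          # implemented in JavaScript
         "ins": ["clean_data"], "outs": ["figure"]},
    ],
    "signals": [{"name": s} for s in ("raw_data", "clean_data", "figure")],
}

print(json.dumps(workflow, indent=2))
```

The graph structure is explicit: each process names its input and output signals, so the engine can schedule "plot" as soon as "preprocess" emits "clean_data".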

Publications related to HyperFlow.


Plgapp

Plgapp is a platform for rapid, lightweight development of scientific applications running on high-performance computing resources; it facilitates access to grid resources and handles the entire web application deployment pipeline. Building on available web frameworks enables purely browser-side programming and frees application developers from any server-side dependencies. The platform provides testing and production environments that cover the full application development cycle, including seamless source code synchronization. More ...

Publications related to plgapp.

Visual query construction over RDF data - the QUaTRO2 tool

QUaTRO is a tool for constructing and executing queries over RDF data, leveraging the OWL data model. The tool was designed to work with arbitrary ontologies and a variety of RDF data sources. The query construction model aims to provide high expressiveness while maintaining an easy-to-use graphical interface. More ...
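To make the idea concrete, a visually assembled list of constraints has to end up as a SPARQL query over the RDF data; the following sketch (illustrative only, not QUaTRO2 internals, with hypothetical ontology URIs) shows that translation:

```python
# Turn a class URI plus (property, value) constraints -- the things a user
# picks in a graphical query builder -- into a SPARQL SELECT query.
def build_query(cls, constraints):
    """Render a SPARQL SELECT for instances of `cls` matching constraints."""
    triples = [f"?s a <{cls}> ."]
    for i, (prop, value) in enumerate(constraints):
        triples.append(f"?s <{prop}> ?v{i} .")
        triples.append(f"FILTER (?v{i} = {value})")
    body = "\n  ".join(triples)
    return f"SELECT ?s WHERE {{\n  {body}\n}}"

q = build_query("http://example.org/Patient",
                [("http://example.org/age", 42)])
print(q)
```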

Publications related to QUaTRO2.


Rimrock

The Rimrock application simplifies interaction with remote servers. It allows you to execute applications in batch mode or to start an interactive application whose output can be fetched online and to which new input can be sent through a simple REST interface. Moreover, the same REST interface lets you start new jobs in the infrastructure. You do not need to worry about writing a correct job description in JDL (Job Description Language) – just pass the command you want to execute and Rimrock will do the rest. More ...
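A job submission in this style reduces to one small HTTP request; the sketch below assembles such a request (the endpoint URL, header name and field names are assumptions for illustration, not the documented Rimrock API):

```python
# Instead of a JDL job description, a Rimrock-style client just posts the
# target host and the script to run, authenticated by a proxy credential.
import json

def job_request(host, script, proxy_token):
    """Assemble the HTTP request a Rimrock-style client would send."""
    return {
        "method": "POST",
        "url": "https://rimrock.example.org/api/jobs",  # hypothetical URL
        "headers": {"PROXY": proxy_token,               # hypothetical header
                    "Content-Type": "application/json"},
        "body": json.dumps({"host": host, "script": script}),
    }

req = job_request("ui.cyfronet.pl", "#!/bin/bash\necho hello", "fake-proxy-token")
print(req["method"], req["url"])
```

Sending `req` with any HTTP client is all a caller needs; the service translates it into a proper batch job behind the scenes.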

Publications related to Rimrock.

