About openEASE

openEASE is a web-based knowledge service providing robot and human activity data. It contains semantically annotated data of manipulation actions, including the environment the agent is acting in, the objects it manipulates, the task it performs, and the behavior it generates. The episode representations can include images captured by the robot, other sensor data streams, and full-body poses. A powerful query language and inference tools allow reasoning about the data and retrieving requested information through semantic queries. Based on this data and using the inference tools, robots can answer queries about what they did, why, how, what happened, and what they saw.

openEASE can be used by humans through a browser-based query and visualization interface, and remotely by robots via a WebSocket API.
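The remote interface can also be exercised programmatically. The snippet below is a minimal sketch of how a client might send a semantic query over a WebSocket connection: the endpoint URL, the JSON message schema, and the example Prolog-style query string are assumptions made for illustration only, not the documented openEASE API; see the tutorials and manuals listed further down for the actual interface.

```python
# Minimal sketch of querying a knowledge service over a WebSocket.
# NOTE: the endpoint URL, JSON message schema, and Prolog-style query
# below are illustrative assumptions, not the documented openEASE API.
import asyncio
import json

import websockets  # pip install websockets


ENDPOINT = "wss://example.org/openease/ws"  # hypothetical endpoint


async def ask(query: str) -> dict:
    """Send one semantic query and return the decoded JSON reply."""
    async with websockets.connect(ENDPOINT) as ws:
        # Assumed message schema: a single JSON object carrying the query text.
        await ws.send(json.dumps({"query": query}))
        reply = await ws.recv()
        return json.loads(reply)


if __name__ == "__main__":
    # Example Prolog-style query: collect the objects the agent interacted with.
    result = asyncio.run(ask("findall(Obj, is_physical_object(Obj), Objs)."))
    print(result)
```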

Overview Paper:

Michael Beetz, Moritz Tenorth, and Jan Winkler, “Open-EASE — A Knowledge Processing Service for Robots and Robotics/AI Researchers”, in IEEE International Conference on Robotics and Automation (ICRA), Seattle, Washington, USA, 2015. Finalist for the Best Conference Paper Award and the Best Cognitive Robotics Paper Award. [PDF]

Background and Motivation:

openEASE is an initiative within a more comprehensive research enterprise called EASE (Everyday Activity Science and Engineering), whose motivation is presented in this [Video].

openEASE-related Publications

A list of selected publications about the knowledge representation, perception, plan-based control, and learning methods used in openEASE.

Publications

What's New

KnowRob webpage is released

We are pleased to announce that the KnowRob webpage has been updated to reflect the latest changes in KnowRob 2.0. KnowRob is a knowledge processing system that combines knowledge representation and reasoning methods with techniques for acquiring knowledge and grounding it in a physical system; it can serve as a common semantic framework for integrating information […]

[Read More]

NEEMHub infrastructure for storing and processing NEEMs

In order to make this huge amount of data accessible to the research community, allow the data to be analyzed, enable machine learning models to be built from it, and support version control for both data and models, we have released an infrastructure that handles these requirements in a single solution. NEEM-Hub is a one-stop […]

[Read More]

NEEM Handbook is released

The “NEEM Handbook” describes the EASE system for episodic memories of everyday activities. It is intended to provide EASE researchers with compact yet comprehensive information about what information is contained in NEEMs and how it is represented, acquired, curated, and published. Narrative-Enabled Episodic Memories: When somebody talks about the deciding goal in the […]

[Read More]

SOMA ontology released

An ontological modelling approach called the “Socio-Physical Model of Activities” (SOMA) has been developed within the scope of the CRC EASE research project at the University of Bremen; it attempts to advance our understanding of how human-scale manipulation tasks can be mastered by robotic agents. SOMA handles the abstraction of knowledge about everyday activities in such a way that […]

[Read More]

New Knowledge Base Experiments are added

Data from several experiments are now available online. This includes “Setting Up a Table in Bullet World”, “Kuka KMR-IIWA robot scanning retail shelves”, “Human avatar fetch and place”, and “URobot simulation fetch and place”. Access experiment information through the Data, Knowledge & Tools column of the openEASE homepage.

[Read More]

New openEASE Platform is released

We are very happy to release the new openEASE Platform. This project is associated with the Institute of Artificial Intelligence (IAI) of the University of Bremen. If you want to learn more about the functionality of this platform, please check out the introduction video on our front page or click here.

[Read More]

Tutorials & Manuals

Tutorials, installation guides, instruction videos, and documentation about openEASE.

Tutorials

Software Components

Knowledge Base Experiments

Setting Up a Table in Bullet World

Kuka KMR-IIWA robot scanning retail shelves

Long-term fetch & place (Legacy)

Acquiring everyday manipulation skills through games (Legacy)

Humans setting the table (Legacy)

Perception for everyday manipulation (Legacy)

Robot performing chemical experiments (Legacy)

Safe human-robot activity (Legacy)

Natural-language understanding for intelligent robots