IROS 2018: BDSR Workshop

Workshop on Latest Advances in Big Activity Data Sources for Robotics & New Challenges

 

Workshop Information

Date: October 1st, 2018

Location: Madrid, Spain

Deadline for paper and poster submissions: August 5th, 2018

Deadline for camera-ready versions: September 5th, 2018

Organizers:


Asil Kaan Bozcuoğlu, M.Sc. (University of Bremen)
Prof. Dr. Tamim Asfour (Karlsruhe Institute of Technology)
Dr.-Ing. Karinne Ramirez Amaro (Technical University of Munich)
Prof. Dr. Gordon Cheng (Technical University of Munich)

Workshop Objectives

Recently, we have witnessed robots starting to execute human-level complex tasks such as making popcorn, baking pizza, and carrying out chemical experiments. Although these executions are milestones in their own right, robots are still limited in terms of flexibility, speed, and adaptability. To address these limitations, we believe that big data sources containing activity data from robots, human tracking, and virtual reality play an important role. Having a large activity data source at hand can help robots in many ways, such as learning motion parameterizations, adapting to different conditions, and generalizing their existing knowledge. Although many applications are starting to make use of big data, the research community is still in a phase of “re-inventing the wheel”: designing new data structures, collecting similar data, and implementing interfaces between data and learning/control routines. Our main goal in this workshop is to gather interested researchers among the IROS attendees and take a step towards standardizing research tools and data formats in order to strengthen joint research efforts.

We believe that data coming from different agents should reside in a similar format so that it can be combined and used together. On the other hand, each source clearly has unique aspects. For instance, robotic activity data usually carries annotations from the control and perception routines, whereas we have no access to such “brain dumps” in human-tracking data or in virtual reality (VR). Similarly, force-dynamic events and ground truth can be detected in simulation and VR environments, in contrast to real-world executions. Thus, one of our objectives in this workshop is to discuss and seek an answer to the questions: “Is it possible to come up with a generic data format that covers all these aspects? If so, how can we design such a format?”
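As a purely illustrative sketch, and not a format proposed by the workshop, the hypothetical Python record below shows one way robot, human-tracking, and VR episodes could share a common core while keeping source-specific information optional. All class and field names here are assumptions made for this example only.

    # Hypothetical sketch of a unified activity record; names and fields are
    # illustrative assumptions, not a format endorsed by the workshop.
    from dataclasses import dataclass, field
    from typing import Dict, List, Optional

    @dataclass
    class ActivityEvent:
        timestamp: float                  # seconds since the start of the episode
        event_type: str                   # e.g. "grasp", "contact", "pour"
        annotations: Dict[str, object] = field(default_factory=dict)  # free-form metadata

    @dataclass
    class ActivityEpisode:
        source: str                       # "robot", "human_tracking", or "vr"
        agent_id: str
        task: str                         # e.g. "make_popcorn"
        events: List[ActivityEvent] = field(default_factory=list)
        # Source-specific extras, present only where the source can provide them:
        control_annotations: Optional[dict] = None   # robot "brain dump" from control/perception
        ground_truth_poses: Optional[dict] = None    # available in simulation and VR only

    # Example: a VR episode with a single force-dynamic contact event.
    episode = ActivityEpisode(
        source="vr",
        agent_id="vr_user_01",
        task="set_table",
        events=[ActivityEvent(1.2, "contact", {"objects": ["cup", "table"]})],
        ground_truth_poses={"cup": [0.30, 0.10, 0.85]},
    )
    print(episode.source, len(episode.events))

The only design choice illustrated here is that shared fields stay mandatory while source-specific information is optional, so records from different sources remain directly comparable.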

A more specific sub-problem is the variety of virtual reality engines used by roboticists. The available VR engines offer different input/output devices and capture position and orientation using tracking technologies. The development of VR software must therefore comply with demanding quality standards and timing constraints dictated by the needs of sensory simulation and direct 3D interaction. Working applications combine different packages and implement specific solutions. In every VR setting, activity data is represented and stored in a different format, so virtual scenarios cannot easily be integrated and interpreted uniformly. To this end, we plan to analyze the existing virtual reality set-ups from the accepted papers and assess the needs of the research community. An important question to be asked is: “Can we agree on a pseudo-standard VR system for robotics, like Gazebo for 3D simulation?”

Overall, participants will gain insights into the state of the art through the presentations of the invited speakers and the authors of the accepted papers. We expect the keynotes from invited speakers with recognized expertise in the field to lead to valuable discussions. Since we plan to assess the community’s needs, we will encourage every participant to actively discuss possible ways towards a collaborative research effort in the dedicated discussion slot. In addition, the authors of every accepted paper will have a poster slot to explain their work in detail.

Topics of Interest

– Big Data Sources for Robotics

  • Architectures
  • Cloud-based Big Data Sources
  • Motion databases

– Virtual Reality in Robotics Applications

  • Designing VR-based Games with a Purpose
  • Activity Recognition and Recording in VR Frameworks
  • Learning and Reasoning from VR

– Representations of Activities and Motions in Datasets

– Learning from Human Tracking

– Representations of Activity Data inside Symbolic Knowledge Bases

Call for Papers

The call for papers can be downloaded here.

Invited Speakers (Tentative)

Prof. Dr. Tamim Asfour (Tentative Title: “Large-Scale Human Motion Database for Robotics”)

Prof. Michael Beetz, PhD (Tentative Title: “Virtual Reality-based Mental Simulations for Robots and Hybrid Reasoning”)

Asst. Prof. Dr. Maya Çakmak

Dr. Tetsunari Inamura

Prof. Dr. Katja Mombaur (Tentative Title: “Model-based Studies of Stability in Human Movement Experiments”)

Prof. Dr. Wataru Takano

Schedule (Tentative)

Time            Event                  Comment
9:30 – 9:45     Opening                Opening remarks by the organizers
9:45 – 10:00    Invited talk 1         Prof. Asfour
10:00 – 10:30   Coffee break
10:30 – 11:30   Invited talks 2 & 3    Prof. Beetz & Prof. Çakmak
11:30 – 12:00   Lightning talks
12:00 – 12:30   Poster session
12:30 – 13:30   Lunch break
13:30 – 14:30   Invited talks 4 & 5    Prof. Inamura & Prof. Mombaur
14:30 – 15:00   Lightning talks
15:00 – 15:30   Coffee break           Participants will have a chance to visit the posters
15:30 – 16:00   Invited talk 6         Prof. Takano
16:00 – 16:15   Lightning talks
16:15 – 17:15   Discussion
17:15 – 17:30   Final remarks
17:30           End

Acknowledgements

This full-day workshop is supported by the DFG Collaborative Research Center 1320 “Everyday Activity Science and Engineering” (EASE).