Project CoCoMaps

Welcome to the Collaborative Cognitive Maps (CoCoMaps) project, which ran from September 2016 until March 2018. Please see the Publications page for all results, including four live demonstration videos.

The aims of the project were to:

  • Improve natural human-robot communication and collaboration
  • Work together on task specification and completion
  • Build on real-time dialogue skills
  • Develop a cognitive model of turn-taking, knowledge and goals
  • Provide task-oriented coordination of multi-party task completion
  • Use conversational dialogue to obtain and convey information

The CoCoMaps experiment uses an expanded version of the existing Cognitive Map Architecture, implemented on Honda's ASIMO robot, in an environment with more complex tasks than previously attempted. This allows the robot to interact in more complex ways; in particular, to interact simultaneously with another robot and with more than one person at a time. The project therefore targets a group of two robots and two humans. These systems enable social interactions that coexist with the robots' attention to, and completion of, practical tasks in the workplace. A particular focus is on human detection and tracking algorithms and on an improved dialogue system.

The principal components of the dialogue system targeted in this demonstration have been validated piecewise in laboratory settings, and some subsystems have been shown to work in combination; a final, fully integrated whole has not yet been demonstrated and is therefore targeted here. Deployment on a single robot has been achieved, but operation on multiple robots has so far only been shown in simulation; an integrated multi-robot demonstration is thus also a goal of this project.

CoCoMaps is receiving funding from ECHORD++ which is funded by the European Union’s Seventh Framework Programme for research, technological development and demonstration under grant agreement no 601116.


Project News

  • Final Report

    The CoCoMaps project has now been completed and all the deliverables and milestones have been met. We have produced the final project report which is available here. You can also still catch the project videos and all the reports on this page. We thank the ECHORD++ project and the EU for their support without which […]

  • The video and the report for Demo 3 are now available.

  • Demo 3

    The fourth and final demonstration of the project was completed successfully today. The two robots collaborated with two humans via dialogue to complete a set of tasks specified by the humans. Information needed to complete the tasks was dynamically obtained by asking the humans to provide it. The video and the report will […]

  • We have today released all the open-source parts of the CoCoMaps project. Please visit the download page to view them. Do remember that the project runs inside the Psyclone Platform, which you will also need; it can be downloaded here.

  • We have today released the final version of the Psyclone platform, now officially out of Beta. Please visit the download page and grab either the source code or the binary release. Work will continue on Psyclone beyond the end of the project and we will keep this page updated with new versions.

  • The video and report for Demo 2 are now available.

  • Demo 2

    We have completed Demo 2 of the project, and the report and video are almost ready for publication. The robots behaved very well and worked together nicely to complete the tasks. They start by searching for humans to speak to (as in Demo 1), and once a human has been found, one of the robots […]

  • Last Push

    We are heading into the last two months of the project, scheduled to finish on 31 March 2018. We have completed two demonstrations (Demo 0 and Demo 1) and are working hard on the final elements for the final two (Demo 2 and Demo 3). Our speech interaction is improving day by day and the […]

  • A new look

    Our robots are now boasting a new, longer neck for a raised camera position. It makes them look like giraffes, but it is part of a set of improvements being implemented for our next demo. The improvements also include testing a new camera with better handling of light, implementing better on-board face detection and lowering […]

  • Demo 1 – Video

    Demo 1 went well, with the robots successfully identifying their human partners, and the video editing is finally complete. In this video you can watch the full demo.
