Demo 1 – Video

Demo 1 went well, with the robots successfully identifying their human partners, and the video editing is finally complete. In this video you can watch the full demo.


Demo 1 – Early data analysis results

Demo 1 produced a lot of data and we are working our way through it now. Preliminary figures suggest that the CCMCatalog enabled the robots to collaborate far more efficiently, and that the robots were able to detect and track humans in the scene. Stay tuned for more results…

Showtime

The stage has now been set for our robots to show off their capabilities so far. The robots will map out an area and scan it for humans using search patterns. Each detected human will be identified using facial recognition, and all information is instantly shared with the other robot via the CCMCatalog. Let’s see how they’ll do!
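Conceptually, the sharing step works like a publish/subscribe catalog: one robot posts an observation and the other receives it immediately. The sketch below illustrates that idea only — the `SharedCatalog` and `Observation` names are hypothetical stand-ins, not the real CCMCatalog API.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    robot_id: str     # which robot made the observation
    person_name: str  # identity from facial recognition
    position: tuple   # where the person was seen, e.g. (x, y)

class SharedCatalog:
    """Toy publish/subscribe store standing in for the real CCMCatalog."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, obs):
        # every published observation is pushed to all subscribers at once
        for callback in self._subscribers:
            callback(obs)

# Robot B immediately sees what robot A detects.
catalog = SharedCatalog()
seen_by_b = []
catalog.subscribe(seen_by_b.append)
catalog.publish(Observation("robot_a", "Alice", (2.0, 3.5)))
print(seen_by_b[0].person_name)  # prints "Alice"
```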


Robot jitters before first demo

Preparation of our robots is now in full swing to get them up and running for our first demo.

We have been testing running all the on-board cameras at the same time, as well as getting the robots to communicate through the CCMCatalog. The first trial run went pretty well, but afterwards the processing server decided to (literally) blow up. That meant spending time finding a replacement server, after we had already lost precious time to prolonged discussions with the vendor about robot battery delivery.


Planning the first demo

The CCMCatalog is almost done and can handle the sharing and negotiation of observations and tasks between the two robots. Face detection has gone in, and the new body-and-legs detector is being tested. Basic navigation is working, and we are now building the operator interface in the PsyProbe web interface.


New cool features in Psyclone 2.0: Python integration

Psyclone 2.0 already supports user-authored native C++ modules running in a mixed Windows/Linux cluster environment. We have now added native Python 2.7 and 3.5 support, so our users can create either inline or separate modules written entirely in Python. As part of the CoCoMaps project we are now integrating with ROS (Robot Operating System) via Python.
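To give a feel for what a message-driven module looks like, here is a minimal sketch in the spirit of Psyclone's publish/subscribe whiteboard model. All names (`Bus`, `Module`, `Message`) are illustrative assumptions — this is not the actual Psyclone 2.0 Python API.

```python
class Message:
    """A typed message with an arbitrary payload."""
    def __init__(self, msg_type, payload):
        self.type = msg_type
        self.payload = payload

class Bus:
    """Routes posted messages to every handler registered for their type."""
    def __init__(self):
        self.handlers = {}

    def register(self, msg_type, handler):
        self.handlers.setdefault(msg_type, []).append(handler)

    def post(self, msg):
        for handler in self.handlers.get(msg.type, []):
            handler(msg)

class DoublerModule:
    """A module that listens for input values and posts doubled results."""
    def __init__(self, bus):
        self.bus = bus
        bus.register("Input.Value", self.on_message)

    def on_message(self, msg):
        self.bus.post(Message("Output.Value", msg.payload * 2))

bus = Bus()
results = []
bus.register("Output.Value", lambda m: results.append(m.payload))
DoublerModule(bus)
bus.post(Message("Input.Value", 21))
print(results)  # prints [42]
```

A real module would of course subscribe to perception or speech messages rather than numbers, but the register/handle/post loop is the core pattern.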


New Psyclone 2.0 platform going Open Source soon

Later this year we plan to release version 2.0 of the Psyclone platform. It is a complete rewrite of Psyclone 1.5, keeping all the old features intact while adding massive performance improvements, merging discrete and streaming data messaging, introducing drumbeat signals for simulations, and providing a completely new web interface, PsyProbe 2.0.
You can read more about Psyclone 1 at: https://cmlabs.com/products.


CoCoMaps at the Hannover Messe 2017

The CoCoMaps team is in Hannover demonstrating our turn-taking technology for robot-human natural conversations. You can find us in Hall 17 at the ECHORD stand C70. We are here with three other cool robotics projects and attracting a lot of attention.


Robot-human turn-taking now running on the Psyclone 2 platform

We now have turn-taking for two participants (one robot and one human) up and running on the Psyclone 2 platform. We use Nuance for speech recognition and speech generation and can handle interruptions and long pauses. Now we just need to work on starting and ending the conversation 🙂
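The floor-management logic described above can be illustrated with a tiny state machine — a sketch only, with an assumed pause threshold; the actual Psyclone 2 implementation with Nuance speech events is not shown in this post.

```python
class TurnTaker:
    """Tracks who holds the conversational floor between two participants."""

    PAUSE_THRESHOLD = 1.5  # seconds of silence that releases the floor (assumed value)

    def __init__(self):
        self.speaker = None  # "robot", "human", or None when the floor is open

    def on_speech_start(self, who):
        if self.speaker is None:
            self.speaker = who  # take the open floor
        elif self.speaker != who:
            self.speaker = who  # barge-in: current speaker yields to the interrupter

    def on_silence(self, seconds):
        # a long pause releases the floor so the other party may speak
        if seconds >= self.PAUSE_THRESHOLD:
            self.speaker = None

tt = TurnTaker()
tt.on_speech_start("human")  # human starts talking
tt.on_speech_start("robot")  # robot interrupts; floor passes to the robot
tt.on_silence(0.4)           # short pause: robot keeps the floor
print(tt.speaker)            # prints "robot"
tt.on_silence(2.0)           # long pause: floor opens again
print(tt.speaker)            # prints "None"
```

Handling interruptions then reduces to the barge-in branch, and handling long pauses to the threshold check.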


CoCoMaps invited to exhibit at Hannover Messe 2017

The CoCoMaps project has been invited to exhibit our technology at the Hannover Messe 2017 in Germany. We will be showcasing our turn-taking technology running on the Psyclone 2 platform.
