Our robots now boast a new, longer neck for a raised camera position. It makes them look like giraffes, but it is part of a set of improvements for our next demo, which also includes testing a new camera with better handling of lighting conditions, improving on-board face detection and lowering the volume of network traffic.
Demo 1 produced a lot of data and we are working our way through it now. Preliminary figures suggest that the CCMCatalog enabled the robots to collaborate far more efficiently and that the robots were able to detect and track humans in the scene. Stay tuned for more results…
The stage has now been set for our robots to show off their capabilities so far. The robots will map out an area and scan it for humans using search patterns. Each human detected will be identified using facial recognition. All information is instantly shared with the other robot via the CCMCatalog. Let’s see how they’ll do!
Preparation of our robots is now in full swing to get them up and running for our first demo.
We have been testing running all the on-board cameras at the same time, as well as getting the robots to communicate through the CCMCatalog. The first trial run went pretty well, but afterwards the processing server decided to (literally) blow up. This meant spending time finding a replacement server, after we had already lost precious time to prolonged discussions with the vendor about robot battery delivery.
The CCMCatalog is almost done and can handle the sharing and negotiation of observations and tasks between the two robots. Face detection is in place and the new body-and-legs detector is being tested. Basic navigation is working, and we are now building the operator interface in the PsyProbe web interface.
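To give a feel for the idea, here is a toy Python sketch of sharing observations and negotiating tasks through a shared catalog. The real CCMCatalog API is not shown in this post, so the `Catalog`, `post_observation` and `claim_task` names below are hypothetical, for illustration only.

```python
# Illustrative sketch only: the actual CCMCatalog API is not public, so
# Catalog, post_observation and claim_task are invented names.

class Catalog:
    """Toy shared blackboard: robots post observations and negotiate tasks."""

    def __init__(self):
        self.observations = []   # shared pool of observations
        self.task_owner = {}     # task name -> robot that claimed it

    def post_observation(self, robot, obs):
        # Tag every observation with its source so both robots
        # can reason about who saw what.
        self.observations.append({"robot": robot, "obs": obs})

    def claim_task(self, robot, task):
        # Simplest possible negotiation: the first robot to claim
        # a task owns it; later claims are rejected.
        if task in self.task_owner:
            return False
        self.task_owner[task] = robot
        return True

catalog = Catalog()
catalog.post_observation("robot_a", "face detected at (2.0, 3.5)")
granted = catalog.claim_task("robot_a", "track_human_1")
denied = catalog.claim_task("robot_b", "track_human_1")
```

The point of the sketch is the division of labour: observations are shared freely, while tasks are claimed exclusively so the two robots do not duplicate work.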
Psyclone 2.0 already supports user-authored native C++ modules running in a mixed Windows/Linux cluster environment. Now we have added native Python 2.7 and 3.5 support, so our users can create either inline or separate modules written entirely in Python. As part of the CoCoMaps project we are now integrating with ROS (Robot Operating System) via Python.
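A Python module in this style is essentially a message handler. Psyclone's actual module API is not documented in this post, so the `PsycloneModule` base class, the `process()` hook and the message format below are hypothetical names sketching the general idea only.

```python
# Hypothetical sketch: Psyclone's real Python module API is not shown
# here, so PsycloneModule and process() are invented for illustration.

class PsycloneModule:
    """Base class sketch: a module receives messages and may post replies."""

    def process(self, message):
        raise NotImplementedError

class GreetingModule(PsycloneModule):
    # Reacts to "hello" messages by producing a reply message;
    # ignores everything else.
    def process(self, message):
        if message.get("type") == "hello":
            return {"type": "greeting", "text": "Hello from Python!"}
        return None

module = GreetingModule()
reply = module.process({"type": "hello"})
ignored = module.process({"type": "sensor-data"})
```

The appeal of native Python support is exactly this brevity: a complete module is a small class rather than a compiled C++ plug-in.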
Later this year we plan to release version 2.0 of the Psyclone platform. It is a complete rewrite of Psyclone 1.5 with all the old features intact and with massive performance improvements, merging of discrete and streaming data messaging, introduction of drumbeat signals for simulations and a completely new web interface, PsyProbe 2.0.
You can read more about Psyclone 1 at: https://cmlabs.com/products.
The CoCoMaps team is in Hannover demonstrating our turn-taking technology for robot-human natural conversations. You can find us in Hall 17 at the ECHORD stand C70. We are here with three other cool robotics projects and attracting a lot of attention.
We now have turn-taking for two participants (one robot and one human) up and running on the Psyclone 2 platform. We use Nuance for speech recognition and speech generation and can handle interruptions and long pauses. Now we just need to work on starting and ending the conversation 🙂
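The core of two-party turn-taking can be sketched as a small state machine. The state names and the pause threshold below are illustrative assumptions, not our actual implementation on the Psyclone 2 platform.

```python
# Minimal sketch of two-party turn-taking with interruption and pause
# handling. The 1.5 s threshold and the state names are assumptions
# made for this example only.

PAUSE_THRESHOLD = 1.5   # seconds of silence treated as yielding the turn

class TurnTaker:
    def __init__(self):
        self.state = "LISTENING"   # the robot starts by listening

    def on_human_speech(self):
        # The human barged in while the robot was talking:
        # stop and yield the floor (interruption handling).
        if self.state == "SPEAKING":
            self.state = "LISTENING"

    def on_silence(self, seconds):
        # A long enough pause signals the human has yielded the turn.
        if self.state == "LISTENING" and seconds >= PAUSE_THRESHOLD:
            self.state = "SPEAKING"

    def on_utterance_done(self):
        # The robot finished its utterance and hands the turn back.
        if self.state == "SPEAKING":
            self.state = "LISTENING"

tt = TurnTaker()
tt.on_silence(2.0)                      # long pause: robot takes the turn
robot_has_turn = tt.state == "SPEAKING"
tt.on_human_speech()                    # human interrupts: robot yields
robot_yielded = tt.state == "LISTENING"
```

In practice the silence and speech events would come from the speech recognizer rather than being called by hand, but the yield-on-interruption and take-turn-after-pause logic is the same.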