Read more about some of the many Psyclone platform use cases.
Psyclone 2.0.2 has been released, including a number of performance updates and improved network stability. Please visit the Download page to grab either the new binary builds or the Open Source version.
Finally we have put together a number of use cases for the Psyclone Platform.
The CoCoMaps project has now been completed and all the deliverables and milestones have been met. We have produced the final project report which is available here.
You can also still catch the project videos and all the reports on this page. We thank the ECHORD++ project and the EU for their support, without which none of this would have been possible.
The fourth and final demonstration of the project was completed successfully today. The two robots collaborated with two humans via dialogue to complete a set of tasks specified by the humans. Information needed to complete the tasks was detected and extracted dynamically, with the robots asking the humans to provide it. The video and the report will be posted here when ready.
We have today released all the open source parts of the CoCoMaps project. Please visit the download page to view them. Remember that the project runs inside the Psyclone Platform, which you will also need; it can be downloaded here.
We have today released the final version of the Psyclone platform, now officially out of Beta. Please visit the download page and grab either the source code or the binary release. Work will continue on Psyclone beyond the end of the project and we will keep this page updated with new versions.
We have completed Demo 2 of the project, and the report and video are almost ready for publication. The robots behaved very well and worked together nicely to complete the tasks. They start by searching for humans to speak to (as in Demo 1); once a human has been found, one of the robots strikes up a conversation to collaborate on a task, which the other robot helps with while the conversation is in progress. We have found a number of issues which did not directly affect Demo 2 but which we definitely need to fix for Demo 3, such as tasks getting out of sync and roles playing up. Luckily, the cameras and the vision code work much better now.
We are heading into the last two months of the project, scheduled to finish on 31 March 2018. We have completed two demonstrations (Demo 0 and Demo 1) and are working hard on the final elements for the remaining two (Demo 2 and Demo 3). Our speech interaction is improving day by day, and the robots' 2D and 3D vision is now ready.