QB1 – Interactive Music Robot Recognises and Represents
QB1 could well be the next step in music players, and it brings us closer to having interactive bots in the home with specific tasks.
If you think about it, things like Genius playlists and Last.fm have altered the way we consume and interact with music, and OZWE wants to have the same effect with the devices themselves.
The QB1 is aware of its surroundings, recognises faces and pictures, and can interpret gestures.
The QB1 will turn to face you and display a shadowy version of you and your surroundings, which helps you make gestures on-screen.
It’s still being developed, and the thought of having multi-touch gestures applied to all your media is definitely groovy.
Just think: spin your finger around clockwise to fast forward, counter-clockwise to rewind. Name a song, or the track number of an album that’s already playing, or even hold up an album cover to play it.
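To get a feel for what "spin clockwise to fast forward" involves under the hood, here's a minimal sketch of how a gesture controller might turn a trail of fingertip positions into a transport command. Everything here (the function names, the command strings) is my own illustration, not OZWE's actual API:

```python
# Hypothetical sketch: infer spin direction from a trail of fingertip
# positions and map it to a transport command. All names here are
# illustrative, not OZWE's actual API.

def spin_direction(points):
    """Return 'clockwise' or 'counter-clockwise' for a circular trail.

    Uses the signed area (shoelace formula): in screen coordinates,
    where y grows downward, a positive signed area means the trail
    runs clockwise as the user sees it.
    """
    area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        area += x1 * y2 - x2 * y1
    return "clockwise" if area > 0 else "counter-clockwise"

# Hypothetical mapping from gesture to player command.
COMMANDS = {"clockwise": "fast_forward", "counter-clockwise": "rewind"}

# A rough circle traced clockwise on screen (y axis pointing down).
trail = [(1, 0), (0, 1), (-1, 0), (0, -1)]
print(COMMANDS[spin_direction(trail)])  # → fast_forward
```

A real system would of course have to segment the gesture out of noisy camera data first, which is exactly the hard part discussed below.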
But that’s still future-talk.
There’s a heap of work to be done before all of this is sorted, and you have to keep in mind what still needs to be achieved.
Human behavior and the minute details of gestures are unpredictable and must be shown to an AI like QB1 over and over, not just coded in. Think of handwriting or voice recognition: even after “training” a program for weeks, you’ll still get mistakes due to the limitations of the system. But the only way they can get better, like us humans, is to make mistakes and learn from them. That’s why OZWE is looking for volunteers to have a QB1 in their home for a while, so it can learn the basics of human interaction (and probably to work out some bugs before launch without losing any sales).
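That "improve by making mistakes" idea is literally how the simplest learning algorithms work. A toy perceptron, for example, only adjusts its weights when it misclassifies a training example. This is purely an illustration of the principle, nothing to do with QB1's actual internals:

```python
# Toy illustration of learning from mistakes: a perceptron adjusts its
# weights only when it gets an example wrong, just as the article
# describes. (Illustrative only; not QB1's actual learning method.)

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of (features, label) pairs with label in {-1, +1}."""
    n = len(examples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, label in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != label:  # a mistake: learn from it
                w = [wi + lr * label * xi for wi, xi in zip(w, x)]
                b += lr * label
    return w, b

# Toy, linearly separable data: label is +1 when x0 > x1.
data = [((2, 1), 1), ((3, 0), 1), ((0, 2), -1), ((1, 3), -1)]
w, b = train_perceptron(data)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
         for x, _ in data]
print(preds)  # → [1, 1, -1, -1], matching the labels after training
```

The point of the volunteer programme is the same in spirit: the more wrong guesses QB1 gets corrected on in real homes, the better its model of human gestures becomes.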
I reckon the future is getting closer all the time 🙂
There’s more info and demos of the QB1 at OZWE’s site.