About NIFTi
NIFTi is about human-robot cooperation: teams of robots and humans doing tasks together, interacting to reach a shared goal. NIFTi looks at how the robot could bear the human in mind, literally, when determining what to do or say next in human-robot interaction, and when and how to do it. NIFTi puts the human factor into cognitive robots, and into human-robot team interaction in particular. Each year, NIFTi evaluates its systems together with several USAR organizations: rescue personnel team up with NIFTi robots to carry out realistic missions in real-life training areas.

 

NIFTi Achieves "Excellent" (4 out of 4) at Year 1 Review

The NIFTi Year 1 review took place at DFKI in Saarbrücken (Germany) on Thursday March 31 and Friday April 1, 2011. On Thursday, the consortium presented its scientific results, and on Friday (April Fool's!) we demonstrated the integrated robot system (UGV) and our UAV prototypes. In their first official feedback, the reviewers awarded NIFTi the rating "Excellent."

On Thursday, the consortium presented its scientific results. For Year 1, we placed our work in the context of the Tunnel Accident use case. Across the board, we showed how each work package made progress in relation to the use case, how this progress was driven by the NIFTi user-centric perspective on design, and how the work in each work package connected to that in the others. A summary of the results can be found here: [ pdf ].

On Friday we demonstrated the integrated robot system (UGV) and our UAV prototypes. We built up an accident scenario at the (roofed) DFKI parking lot, and set up an Incident Command Post in a nearby room to remotely control the UGV. It being April Fool's Day, the demos went remarkably well! We showed how a human operator can instruct the robot where and how to explore the environment, using a multi-modal operator control unit (OCU). Instructions range from basic tele-operation, to spoken small-movement commands, to waypoint-based navigation (with or without accompanying commands), to complex landmark-based navigation such as "go to the car." The last example illustrates the novel work we are doing on functional mapping: the robot goes to a place near the car from which it is easy to look inside. That requires a fair amount of semantic understanding of the environment, going from the observation of a car and its position to a projection of the areas around the car that afford looking inside.
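To make the functional-mapping idea concrete, here is a minimal geometric sketch of how such affordance projection could work. It is not the NIFTi implementation; all names and parameters (the 1.5 m stand-off distance, the two side-of-car viewpoints) are illustrative assumptions: given a detected car pose, candidate viewing poses are projected beside the car's windows, oriented back toward it, and the robot picks the nearest one.

```python
import math

def viewing_poses(car_x, car_y, car_yaw, offset=1.5):
    """Project candidate viewpoints beside the car's left and right sides,
    from which a robot could look in through the windows.
    Illustrative sketch only; not the actual NIFTi functional-mapping code."""
    poses = []
    for side in (+1, -1):
        # Unit vector perpendicular to the car's heading (one per side).
        nx = -math.sin(car_yaw) * side
        ny = math.cos(car_yaw) * side
        vx = car_x + offset * nx
        vy = car_y + offset * ny
        # Orient the viewpoint back toward the car, to "look inside".
        heading = math.atan2(car_y - vy, car_x - vx)
        poses.append((vx, vy, heading))
    return poses

def nearest_pose(robot_xy, poses):
    """Pick the candidate viewpoint closest to the robot's position."""
    rx, ry = robot_xy
    return min(poses, key=lambda p: math.hypot(p[0] - rx, p[1] - ry))
```

In a real system the chosen pose would then be sent to the navigation planner as an ordinary waypoint goal, and unreachable candidates (e.g. blocked by debris) would be filtered out first.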

An album with photos of the review is available on our Facebook page: [ WWW ].

 
