The NIFTi consortium recently presented its latest science and technology results for human-robot collaboration in Urban Search & Rescue (USAR) at the Year 2 review. The review was hosted by the Istituto Superiore Antincendi in Rome (Italy) and the Scuola di Formazione Operativa (SFO) in Montelibretti (Italy).
To provide a better view into or onto large objects, NIFTi built a KOHGA3-style arm that lifts a camera platform up to 1.50 m. Using this arm, the robot and its operator can look, for example, into cars or trucks, or onto larger rubble piles. The new arm was demonstrated at the recent Year 2 review at the SFO in Montelibretti (Italy).
In Year 2 the project focused on "human-assisted joint exploration" of a tunnel accident disaster site. We approached the problem from the viewpoint of a human-robot team, involving several humans at a remote command post, one in-field human piloting a UAV, and a UGV operated from the remote command post.
The NIFTi mission statement was recently translated into Haitian Creole for Web Geeks Science.
NIFTi made an appearance in a recent article on rescue robotics published in c't (2/2012, "Autonome Vorhut: Neue Techniken bei Rettungsrobotern" / "Autonomous Vanguard: New Techniques in Rescue Robots"). The article was followed by a two-page interview with Geert-Jan Kruijff (DFKI LT, NIFTi coordinator). The NIFTi robot also recently made the cover of a magazine: the Journal of Field Robotics (Jan/Feb 2012).
The NIFTi consortium wishes you all happy holidays, anywhere and everywhere in the world, and a safe 2012! Courtesy of the ASL lab at ETH Zurich, we even have a video of a NIFTi robot acting as Santa's helper ...
We have just successfully completed this year's end user evaluations at the SFO training area of the Vigili del Fuoco in Montelibretti (Italy). The evaluations were set in the "tunnel accident" use case. A team of three used a semi-autonomous UGV and a tele-operated UAV to explore the disaster site. The human team members performed the roles of Mission Commander, UGV Operator (both at a remote command post), and UAV Operator (in-field). The evaluation focused on the situation awareness of the UGV Operator, and on the ways in which the Operator and the UGV collaborated.
NIFTi organized an AAAI Fall Symposium in November this year on the theme of human-robot teaming. The symposium included invited talks by Ron Arkin, Jeffrey Bradshaw, and Satoshi Tadokoro. One of the main lessons: to make a robot a real team player, its intelligence must be acceptable intelligence.
In November, NIFTi is organizing an autumn school on the theme of human-robot cooperation.
DIE ZEIT recently published an article about the use of micro-UAVs in emergency response. The article discusses NIFTi's efforts in detail, illustrated by the demonstration scenario at the Industry Day we recently organized at the Training Center of the Fire Brigade of Dortmund.
Several written and video reports on the NIFTi Industry Day appeared in the German press.
In early July, NIFTi met at the Training Center of the Fire Department of Dortmund (FDDO) for the NIFTi Joint Exercises 2011. The goal: study human-robot teaming. Given a team of humans (researchers and end users), a microcopter, and the new NIFTi robot, how could and should they interact to jointly explore a disaster site, such as a multi-story apartment building on fire?
After a year of close cooperation with the end user organizations involved in NIFTi, the new NIFTi robot platform has arrived! BlueBotics has built six platforms to specification, and at the recent NIFTi Joint Exercises in Dortmund the robots were put to the test ...
NIFTi organized its first Industry Day this July, at the Dortmund Fire Brigade Training Center. Over 25 invited members of industry, end user organizations, and research institutions attended the event to learn more about practical progress made in NIFTi over the last year.
NIFTi is organizing an AAAI 2011 Fall Symposium on human-robot team-work in dynamic adverse environments. The goal of the symposium is to discuss collaboration between humans and robots in situations which require all to continuously adapt to developing events. Examples are the adverse circumstances encountered during emergency responses in various types of search & rescue missions, or in security or military deployments. Adaptation needs to reflect changes both in what human and robot are to do (the tasks) and in how they are supposed to do it (the roles). It needs to explicitly couple task-work with the social aspects of team-work so that the human-robot team can maintain cohesive, effective operations. What makes this difficult is that humans perform under stress, and that robots are, so far, far from well-tuned to the human factor in human-robot collaboration.
The NIFTi Year 1 review took place at DFKI in Saarbrücken (Germany) on Thursday March 31 and Friday April 1, 2011. On Thursday, the consortium presented its scientific results, and on Friday (April Fool's!) we demonstrated the integrated robot system (UGV) and our UAV prototypes. In their first official feedback, the reviewers awarded NIFTi the rating "Excellent."
NIFTi is nearing the end of its first year. During Year 1, our S&T development was set in the context of human-instructed exploration of a disaster site. Together with end users we developed use cases and robot platform requirements. During the year we performed several field tests for data gathering and experimentation at the end user training sites, leading up to the end user field trials in Dortmund in January 2011.
In early January, NIFTi organized a week-long field trial with end users at the training area of the Dortmund Fire Brigade (FDDO). In the trials, firefighters experimented with several robot systems to explore a tunnel accident. FDDO organized a press conference to help showcase the project.
We have created a project leaflet with a brief description of what we do. This leaflet is available as PDF.
At the SFO in Montelibretti (Italy), we recently performed a large battery of field experiments. We staged a realistic tunnel accident, then ran our robots and UAV to gather data for mapping, vision, human-robot interaction, and attention.