About NIFTi
NIFTi is about human-robot cooperation: teams of robots and humans doing tasks together, interacting to reach a shared goal. NIFTi looks at how the robot can bear the human in mind, literally, when determining what to do or say next in human-robot interaction, and when and how to do it. NIFTi puts the human factor into cognitive robots, and into human-robot team interaction in particular. Each year, NIFTi evaluates its systems together with several USAR organizations: rescue personnel team up with NIFTi robots to carry out realistic missions in real-life training areas.

 

PDFs of PUBLIC Year 1 deliverables

PUBLIC versions of the Year 1 deliverables

File m10-DR 5.1.1: Domain analysis and specifications: context scenario and skill primitives
In this report we provide the specifications for planning activities in the NIFTi architecture, derived through interaction with the end users, and we formalize the skill learning procedure through the analysis of USAR scenarios.
File m10-DR 8.1.2: Market analysis for USAR robots with HRI
The report provides a first analysis of the market for USAR robots with HRI. Its purpose is to serve as an initial orientation on the robotics market, with a specific focus on the USAR segment. The report helps us place the NIFTi UGV prototype, and the end-user requirements it is based on, relative to potential competitors. The report focuses on available robot platforms. An update of this report is scheduled for Year 4.
File m12-DR 1.1.1: Acquisition of spatial maps of semi-structured environments
Building a representation of the robot's environment is essential to two general tasks: autonomous navigation and communicating the robot's perception to the end user. This report presents the first-year results of Work Package 1 on spatio-temporal modeling for situation awareness. The objective was to build a consistent spatial representation of USAR sites (MS1.1) based on multi-modal localization in semi-structured environments and hybrid maps. We used a state-of-the-art metrical 2D mapping algorithm, on top of which we developed an advanced topological decomposition method based on information theory. We show results on 3D perception dedicated to SLAM. We also advanced the tools for a modular and efficient Iterative Closest Point (ICP) algorithm.
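
For readers unfamiliar with ICP, the following is a minimal, illustrative sketch of a single 2D point-to-point ICP iteration in C++ (standard library only). It is not the deliverable's ICP library; the brute-force nearest-neighbour association and the closed-form 2D alignment are simplifications chosen for exposition.

    // Minimal 2D point-to-point ICP iteration (illustrative sketch only).
    #include <cmath>
    #include <vector>

    struct Pt { double x, y; };

    // One iteration: associate each source point with its nearest target point,
    // then compute and apply the rigid transform (rotation + translation) that
    // minimises the summed squared distances between the matched pairs.
    void icpIterate(std::vector<Pt>& src, const std::vector<Pt>& tgt)
    {
        if (src.empty() || tgt.empty()) return;
        // 1. Nearest-neighbour association (brute force; a k-d tree in practice).
        std::vector<Pt> match(src.size());
        for (std::size_t i = 0; i < src.size(); ++i) {
            double best = 1e300;
            for (const Pt& q : tgt) {
                double d = (src[i].x - q.x) * (src[i].x - q.x)
                         + (src[i].y - q.y) * (src[i].y - q.y);
                if (d < best) { best = d; match[i] = q; }
            }
        }
        // 2. Centroids of the source points and of their matches.
        Pt cs{0, 0}, cm{0, 0};
        for (std::size_t i = 0; i < src.size(); ++i) {
            cs.x += src[i].x;   cs.y += src[i].y;
            cm.x += match[i].x; cm.y += match[i].y;
        }
        cs.x /= src.size(); cs.y /= src.size();
        cm.x /= src.size(); cm.y /= src.size();
        // 3. Closed-form optimal 2D rotation and translation.
        double dot = 0.0, cross = 0.0;
        for (std::size_t i = 0; i < src.size(); ++i) {
            double ax = src[i].x - cs.x,   ay = src[i].y - cs.y;
            double bx = match[i].x - cm.x, by = match[i].y - cm.y;
            dot   += ax * bx + ay * by;
            cross += ax * by - ay * bx;
        }
        double th = std::atan2(cross, dot);
        double c = std::cos(th), s = std::sin(th);
        double tx = cm.x - (c * cs.x - s * cs.y);
        double ty = cm.y - (s * cs.x + c * cs.y);
        // 4. Apply the transform to the source cloud.
        for (Pt& p : src) {
            double nx = c * p.x - s * p.y + tx;
            double ny = s * p.x + c * p.y + ty;
            p.x = nx; p.y = ny;
        }
    }

In practice this step is repeated until the estimated transform stops changing, and the nearest-neighbour search is accelerated with a spatial index such as a k-d tree.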
File m12-DR 2.1.1: Basic vision-based situation awareness capabilities
This report summarizes research in WP2 – Visuo-conceptual modeling for situation awareness – during the first 12 months. Work in this package focused mainly on the implementation of computer-vision-based modules that allow the robot to perceive the surrounding world. In addition, we worked on the connections between vision and conceptual modeling, vision and planning, and vision and the Graphical User Interface (GUI) in which detected objects are presented to the human operator. The core of the vision system is implemented in C++ and has been integrated using the Robot Operating System (ROS) middleware. The vision components mainly use data captured by an omni-directional camera, a laser scanner and an odometry sensor.
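
As an illustration of the integration style described above, the sketch below shows a minimal roscpp node that subscribes to laser-scan and odometry topics. The node name and the topic names ("/scan", "/odom") are assumptions made for this example, not the configuration of the NIFTi vision components.

    // Minimal ROS (C++) perception-node sketch: subscribes to laser-scan and
    // odometry data through the ROS middleware.
    #include <ros/ros.h>
    #include <sensor_msgs/LaserScan.h>
    #include <nav_msgs/Odometry.h>

    void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
    {
        // A real detection module would process the range readings here.
        ROS_INFO("Received %zu laser ranges", scan->ranges.size());
    }

    void odomCallback(const nav_msgs::Odometry::ConstPtr& odom)
    {
        ROS_INFO("Robot pose: x=%.2f y=%.2f",
                 odom->pose.pose.position.x, odom->pose.pose.position.y);
    }

    int main(int argc, char** argv)
    {
        ros::init(argc, argv, "perception_sketch");
        ros::NodeHandle nh;
        ros::Subscriber scan_sub = nh.subscribe("/scan", 10, scanCallback);
        ros::Subscriber odom_sub = nh.subscribe("/odom", 10, odomCallback);
        ros::spin();  // hand control to the ROS callback loop
        return 0;
    }

Running each module as a separate node that exchanges data over topics is the usual way ROS-based systems keep perception, mapping and user-interface components loosely coupled.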
File m12-DR 3.1.1: Adaptive situated HRI for human-instructed navigation
This report presents the results of WP3 for the first year of NIFTi. The overall objective of WP3 is to facilitate communication between humans and robots while they jointly explore a disaster area. We see communication as an inherent aspect of this teamwork: how does what we say fit into the context of what we do, and where we are? In Year 1, the work in WP3 focused on communication to support human-guided exploration by the robot. Characteristic of this setting is that the human operator is remotely located, outside the visible range of the robot operating in the hot zone. The communication modalities include spoken dialogue and graphical user interfaces.
File m12-DR 5.1.2: Methods and paradigms for skill learning based on affordances and action-reaction observation
This document describes the progress of the NIFTi Consortium's research on learning skills for functioning processes and task execution. In particular, according to the Description of Work (DOW), research focused on the development of novel methods and paradigms for skill learning based on affordances and action-reaction observation. The planned work, as per the DOW, is introduced and the actual work is discussed, highlighting the relevant achievements and how these contribute to the current state of the art and to the aims of the project.
File m12-DR 6.1.2: Platform manufacturing and sensor integration
This document describes the manufacturing of the NIFTi platform.
File m12-DR 6.1.2: Platform manufacturing and sensor integration (UAV)
This document describes the design and manufacturing of the NIFTi UAV.
File m12-DR 7.1.3: Integration and end-user evaluation for human-instructed exploration
This document describes the integration of the first prototype of the NIFTi robot and its evaluation performed at the test site of Fire Department Dortmund.