Moving multimodally comes naturally to people. Depending on destination and context, they select the most suitable sequence of movement modes (e.g. walking, cycling, bus, train) and constantly adapt it to changing conditions. For planning, people often rely on traffic data and routing services. The route descriptions these services provide, however, are intended for humans, who can localize and contextualize them and intuitively fill in any gaps.
This concept can be taken further to bring the same capabilities and benefits to autonomous mobile systems (e.g. delivery robots). Despite immense advances in robotics and AI, machines still struggle to understand complex real-world environments and route descriptions: such descriptions are either not detailed enough for machines or lack the contextual information they need.
The central project goal is to demonstrate, as a proof of concept, the multimodal navigation of an autonomous mobile robot using different means of public transport in the Graz-Weiz region.
This requires several innovations. The machine use of public transport data such as GIP/VAO, and its augmentation with other external and robot-internal data sources, must be improved. Building on this data, the planning of robot mobility must be extended and the execution of movement modes in the real world improved. These movement modes (robot skills) also need to be extended, made more robust, and adapted to different contexts. Furthermore, the robot must be able to recognize when data is too incomplete for navigation and a mobility plan is in danger of failing, so that it can react autonomously.
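To make the intended interplay of plan data, robot skills, and failure detection more concrete, the following minimal sketch (in Python, with purely hypothetical names and data fields; the actual GIP/VAO interfaces and the robot's skill set are not specified here) models a multimodal plan as a sequence of legs, maps each leg to a movement mode, and checks whether the available data is complete enough before the corresponding skill would be executed.

```python
# Hypothetical sketch: a multimodal plan as a sequence of legs, each tied to a
# movement mode (robot skill), with a data-completeness check before execution.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Mode(Enum):
    WALK = auto()    # robot drives along sidewalks
    BUS = auto()     # board, ride, and alight a bus
    TRAIN = auto()   # board, ride, and alight a regional train


@dataclass
class Leg:
    mode: Mode
    start: str                       # stop/address identifier, e.g. derived from GIP/VAO data
    end: str
    departure: Optional[str] = None  # ISO timestamp; None if unknown
    geometry: Optional[list] = None  # list of (lat, lon) waypoints; None if missing


# Which data a leg must carry before the matching skill may be executed
# (illustrative assumption, not a definitive requirement set).
REQUIRED_FIELDS = {
    Mode.WALK: ["geometry"],
    Mode.BUS: ["departure", "geometry"],
    Mode.TRAIN: ["departure"],
}


def missing_data(leg: Leg) -> list[str]:
    """Return the names of required fields that are absent for this leg."""
    return [f for f in REQUIRED_FIELDS[leg.mode] if getattr(leg, f) is None]


def execute_plan(plan: list[Leg]) -> None:
    for i, leg in enumerate(plan):
        gaps = missing_data(leg)
        if gaps:
            # The plan is in danger of failing: trigger replanning or further
            # data acquisition instead of blindly executing the skill.
            print(f"Leg {i} ({leg.mode.name}): incomplete data {gaps} -> replan")
            continue
        print(f"Leg {i}: executing skill for {leg.mode.name} "
              f"from {leg.start} to {leg.end}")


if __name__ == "__main__":
    plan = [
        Leg(Mode.WALK, "Depot", "Weiz Hauptplatz", geometry=[(47.22, 15.62)]),
        Leg(Mode.BUS, "Weiz Hauptplatz", "Gleisdorf Bahnhof"),  # departure/geometry missing
        Leg(Mode.TRAIN, "Gleisdorf Bahnhof", "Graz Hbf", departure="2024-05-01T09:12:00"),
    ]
    execute_plan(plan)
```

In this sketch the second leg lacks timetable and geometry data, so the robot would flag it for replanning rather than attempt the bus skill, mirroring the intended ability to detect insufficient data and react autonomously.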
The concepts and techniques will be integrated into a proof-of-concept architecture and evaluated under real operating conditions in cooperation with transport operators in the Graz-Weiz region.
The project is intended to show to what extent natural, intuitive mobility concepts can be transferred to machines, and to provide the basis for innovative and sustainable mobility concepts.
