High-quality microphones and cameras can be deployed across city areas for heterogeneous sensing purposes, acquiring situational awareness from the surrounding environment. Augmenting video surveillance with audio can extend security coverage beyond camera views, helping to detect and interpret events as they occur. Moreover, following a mobile crowd sensing approach, smartphones empowered by innovative applications allow citizens to collect audio-visual data streams and thus monitor a large geographical area of the city at no cost and without specialised capture devices. Mobile devices make it possible to relate and correlate geo-referenced sensor streams with physical and social data. Consequently, citizens can acquire greater knowledge of their urban landscape, and city managers can obtain better knowledge of their city, tailoring civic policies more effectively to the real needs of the population.

Given the huge amount of streaming audio and visual data, in addition to the big datasets of current and historical data offered by the heterogeneous IoT sensors scattered across a Smart City environment (e.g. wireless sensor networks, smart meters, and wireless sensors that measure light, CO2, humidity, and noise levels), MARVEL will achieve multimodal perception and intelligence for audio-visual scene recognition, event detection and situational awareness without violating ethical and privacy limits. Based on novel methods, approaches and engineering paradigms in multimodal audio-visual data management and processing, MARVEL will showcase the potential to address civic challenges very effectively, from increasing public safety and security to analysing traffic flows and traffic behaviour.

Key Facts

  • Project Coordinator: Dr. Sotiris Ioannidis
  • Institution: Foundation for Research and Technology Hellas (FORTH)
  • E-mail: marvel-info{at}marvel-project.eu
  • Start: 01.01.2021
  • Duration: 36 months
  • Participating Organisations: 17
  • Number of countries: 12


This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 957337. The website reflects only the view of the author(s), and the Commission is not responsible for any use that may be made of the information it contains.