The Internet of Audio Things: state-of-the-art, vision, and challenges
- Author
György Fazekas, Mathieu Lagrange, Luca Turchet, Carlo Fischione, and Hossein S. Ghadikolaei. Affiliations: KTH Royal Institute of Technology, Stockholm (KTH); Università degli Studi di Trento (UNITN); Queen Mary University of London (QMUL); Laboratoire des Sciences du Numérique de Nantes (LS2N), Université de Nantes, École Centrale de Nantes (ECN), Centre National de la Recherche Scientifique (CNRS), IMT Atlantique Bretagne-Pays de la Loire (IMT Atlantique), Institut Mines-Télécom (IMT); École Polytechnique Fédérale de Lausanne (EPFL). Funding: ANR-16-CE22-0012 (CENSE), "Characterisation of urban sound environments: towards a global approach combining open data, measurements and modelling" (2016).
- Subjects
Internet of Audio Things (IoAuT); Internet of Sounds; Internet of Things; auditory scene analysis; ecoacoustics; sensors; source localization; time synchronization; machine learning; classification; semantic web; interoperability; smart city; ecosystems; music; wireless networking; signal processing; multimedia; artificial intelligence; information and communications technology; electrical engineering, electronic engineering and information engineering; computer networks and communications; hardware and architecture; information systems; computer science applications
- Abstract
The Internet of Audio Things (IoAuT) is an emerging research field positioned at the intersection of the Internet of Things, sound and music computing, artificial intelligence, and human-computer interaction. The IoAuT refers to networks of computing devices embedded in physical objects (Audio Things) dedicated to the production, reception, analysis, and understanding of audio in distributed environments. Audio Things, such as nodes of wireless acoustic sensor networks, are connected by an infrastructure that enables multidirectional communication, both locally and remotely. In this paper, we first review the state of the art of this field and then present a vision for the IoAuT and its motivations. In the proposed vision, the IoAuT connects the digital and physical domains by means of appropriate information and communication technologies, fostering novel applications and services based on auditory information. The ecosystems associated with the IoAuT include interoperable devices and services that connect humans and machines to support human-human and human-machine interactions. We discuss the challenges and implications of this field, which lead to future research directions on the topics of privacy, security, the design of Audio Things, and methods for the analysis and representation of audio-related information.
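The abstract describes Audio Things as networked devices, such as wireless acoustic sensor nodes, that analyse audio where it is captured and exchange the results over a communication infrastructure. The sketch below is purely illustrative and not taken from the paper: it shows, under assumed names (COLLECTOR_HOST, the node identifier, the JSON message layout) and with simulated audio capture, how such a node could compute compact audio features locally and publish them to a remote collector instead of streaming raw audio.

```python
# Illustrative sketch of an "Audio Thing" sensor node (assumptions noted above):
# capture (here, simulate) short audio frames, extract compact features on the
# device, and publish them to a remote collector over UDP.
import json
import socket
import time

import numpy as np

COLLECTOR_HOST = "192.0.2.10"   # hypothetical address of a remote aggregation service
COLLECTOR_PORT = 9000
SAMPLE_RATE = 16000             # Hz
FRAME_SECONDS = 1.0


def capture_frame() -> np.ndarray:
    """Stand-in for microphone capture; a real node would read from an ADC or audio API."""
    return np.random.randn(int(SAMPLE_RATE * FRAME_SECONDS)).astype(np.float32)


def extract_features(frame: np.ndarray) -> dict:
    """Compute lightweight descriptors locally instead of transmitting raw audio."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(frame.size, d=1.0 / SAMPLE_RATE)
    rms = float(np.sqrt(np.mean(frame ** 2)))
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return {"rms": rms, "spectral_centroid_hz": centroid}


def main() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    node_id = "audio-thing-01"  # hypothetical identifier for this sensor node
    while True:
        frame = capture_frame()
        message = {
            "node": node_id,
            "timestamp": time.time(),
            "features": extract_features(frame),
        }
        sock.sendto(json.dumps(message).encode("utf-8"), (COLLECTOR_HOST, COLLECTOR_PORT))
        time.sleep(FRAME_SECONDS)


if __name__ == "__main__":
    main()
```

Sending derived features rather than raw waveforms is one way a node in such an ecosystem could reduce bandwidth and limit the exposure of privacy-sensitive audio, two of the concerns the abstract raises; the transport, feature set, and message format would of course differ in a real deployment.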
- Published
- 2020