20 results for "Camurri, Antonio"
Search Results
2. IEMP Technical Resources
- Author
-
Eerola, Tuomas, Clayton, Martin, Alborno, Paolo, Camurri, Antonio, Jacoby, Nori, Jakubowski, Kelly, and Tarsitani, Simone
- Subjects
Arts and Humanities, Music Performance, Ethnomusicology, Music
- Abstract
This project collects together pieces of code used in preparation of IEMP Collection.
- Published
- 2022
- Full Text
- View/download PDF
3. Interpersonal entrainment in music performance: theory, method and model
- Author
-
Clayton, Martin, Jakubowski, Kelly, Eerola, Tuomas, Keller, Peter E., Camurri, Antonio, Volpe, Gualtiero, and Alborno, Paolo
- Abstract
Interpersonal musical entrainment—temporal synchronization and coordination between individuals in musical contexts—is a ubiquitous phenomenon related to music’s social functions of promoting group bonding and cohesion. Mechanisms other than sensorimotor synchronization are rarely discussed, while little is known about cultural variability or about how and why entrainment has social effects. In order to close these gaps, we propose a new model that distinguishes between different components of interpersonal entrainment: sensorimotor synchronization—a largely automatic process manifested especially with rhythms based on periodicities in the 100–2000 ms timescale—and coordination, extending over longer timescales and more accessible to conscious control. We review the state of the art in measuring these processes, mostly from the perspective of action production, and in so doing present the first cross-cultural comparisons between interpersonal entrainment in natural musical performances, with an exploratory analysis that identifies factors that may influence interpersonal synchronization in music. Building on this analysis we advance hypotheses regarding the relationship of these features to neurophysiological, social, and cultural processes. We propose a model encompassing both synchronization and coordination processes and the relationship between them, the role of culturally shared knowledge, and of connections between entrainment and social processes.
- Published
- 2020
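Sensorimotor synchronization of the kind this abstract describes is commonly quantified through onset asynchronies between performers. The sketch below is a minimal, hypothetical illustration of that measure; the onset times, part names, and 100 ms pairing window are invented, not drawn from the paper.

```python
def mean_asynchrony(onsets_a, onsets_b, window=0.1):
    """Mean signed asynchrony (seconds) between matched onset pairs.
    Each onset in A is paired with the nearest onset in B, provided it
    falls within `window` seconds; unmatched onsets are ignored."""
    diffs = []
    for a in onsets_a:
        nearest = min(onsets_b, key=lambda b: abs(b - a))
        if abs(nearest - a) <= window:
            diffs.append(a - nearest)
    return sum(diffs) / len(diffs) if diffs else None

# Hypothetical onset times (seconds) for two performers
drums = [0.00, 0.50, 1.01, 1.49, 2.02]
clap = [0.02, 0.51, 0.99, 1.50, 2.00]
print(mean_asynchrony(drums, clap))
```

A negative mean indicates that the first performer tends to precede the second within matched pairs; the spread of the pairwise differences (not computed here) is what synchronization studies usually report as asynchrony variability.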
4. Analysis of Movement Quality in Full-Body Physical Activities
- Author
-
Niewiadomski, Radoslaw, Kolykhalova, Ksenia, Piana, Stefano, Alborno, Paolo, and Camurri, Antonio
- Subjects
movement quality, karate, full-body movement, dance, gesture analysis
- Abstract
Full-body human movement is characterized by fine-grain expressive qualities that humans are easily capable of exhibiting and recognizing in others' movement. In sports (e.g., martial arts) and performing arts (e.g., dance), the same sequence of movements can be performed in a wide range of ways characterized by different qualities, often in terms of subtle (spatial and temporal) perturbations of the movement. Even a non-expert observer can distinguish between a top-level and an average performance by a dancer or martial artist. The difference is not in the performed movements (the same in both cases) but in the "quality" of their performance. In this article, we present a computational framework aimed at an automated approximate measure of movement quality in full-body physical activities. Starting from motion capture data, the framework computes low-level (e.g., a limb velocity) and high-level (e.g., synchronization between different limbs) movement features. Then, this vector of features is integrated to compute a value aimed at providing a quantitative assessment of movement quality approximating the evaluation that an external expert observer would give of the same sequence of movements. Next, a system representing a concrete implementation of the framework is proposed. Karate is adopted as a testbed. We selected two different katas (i.e., detailed choreographies of movements in karate) characterized by different overall attitudes and expressions (aggressiveness, meditation), and we asked seven athletes, having various levels of experience and age, to perform them. Motion capture data were collected from the performances and were analyzed with the system. The results of the automated analysis were compared with the scores given by 14 karate experts who rated the same performances. Results show that the movement quality scores computed by the system and the ratings given by the human observers are highly correlated (Pearson's correlations r = 0.84, p = 0.001 and r = 0.75, p = 0.005).
- Published
- 2019
- Full Text
- View/download PDF
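The agreement between automated scores and expert ratings reported above is a plain Pearson correlation, which can be sketched directly; the score values below are illustrative placeholders, not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative placeholder values: one automated movement-quality score
# and one mean expert rating per performance (seven performances).
system_scores = [0.62, 0.71, 0.55, 0.80, 0.66, 0.74, 0.59]
expert_ratings = [6.1, 7.0, 5.2, 8.1, 6.8, 7.4, 5.6]
print(round(pearson_r(system_scores, expert_ratings), 3))
```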
5. AI Serious Games
- Author
-
Marchi, Erik, Schuller, Bjorn, Baird, Alice, Baron-Cohen, Simon, Lassalle, Amandine, O'Reilly, Helen, Pigat, Delia, Robinson, Peter, Davies, Ian, Baltrusaitis, Tadas, Golan, Ofer, Friedenson-Hayo, Shimrit, Tal, Shahar, Newman, Shai, Meir-Goren, Meir, Camurri, Antonio, Piana, Stefano, Bolte, Sven, Sezgin, Metin, Alyuz, Nese, Rynkiewicz, Agnieska, Baranger, Aurelie, Robinson, Peter [0000-0003-0347-3789], and Apollo - University of Cambridge Repository
- Subjects
ComputerSystemsOrganization_COMPUTERSYSTEMIMPLEMENTATION, 46 Information and Computing Sciences, 4607 Graphics, Augmented Reality and Games
- Abstract
‘Serious games’ are becoming extremely relevant to individuals who have specific needs, such as children with an Autism Spectrum Condition (ASC). Often, individuals with an ASC have difficulties in interpreting verbal and non-verbal communication cues during social interactions. The ASC-Inclusion EU-FP7 funded project aims to provide children who have an ASC with a platform to learn emotion expression and recognition, through play in the virtual world. In particular, the ASC-Inclusion platform focuses on the expression of emotion via facial, vocal, and bodily gestures. The platform combines multiple analysis tools, using on-board microphone and web-cam capabilities, and exploits them via training games, text-based communication, animations, video, and audio clips. This paper introduces current findings and evaluations of the ASC-Inclusion platform and provides a detailed description of the different modalities.
- Published
- 2016
6. The Intersection of Art and Technology
- Author
-
Camurri, Antonio and Volpe, Gualtiero
- Subjects
Cultural differences, Electronic music, Synchronization, Real-time systems, Collaboration, Art, Music
- Abstract
As art influences science and technology, science and technology can in turn inspire art. Recognizing this mutually beneficial relationship, researchers at the Casa Paganini-InfoMus Research Centre work to combine scientific research in information and communications technology (ICT) with artistic and humanistic research. Here, the authors discuss some of their work, showing how their collaboration with artists informed work on analyzing nonverbal expressive and social behavior and contributed to tools, such as the EyesWeb XMI hardware and software platform, that support both artistic and scientific developments. They also sketch out how art-informed multimedia and multimodal technologies find application beyond the arts, in areas including education, cultural heritage, social inclusion, therapy, rehabilitation, and wellness.
- Published
- 2016
7. Towards Multimodal, Multi-Party, and Social Brain-Computer Interfacing
- Author
-
Nijholt, Antinus, Camurri, Antonio, and Costa, Cristina
- Subjects
Multimedia, Casual, Computer science, InformationSystems_INFORMATIONINTERFACESANDPRESENTATION(e.g., HCI), EWI-20156, Multi-modal interaction, METIS-289626, HMI-MI: MULTIMODAL INTERACTIONS, Brain-Computer Interfacing, Multimodal interaction, Brain computer interfacing, InformationSystems_MODELSANDPRINCIPLES, Interfacing, Human–computer interaction, Games, IR-81852, Social brain, Brain–computer interface
- Abstract
In this paper we identify the developments that have led to the current interest from computer scientists in Brain-Computer Interfacing (BCI). Non-disabled users have become a target group for BCI applications, and they cannot be treated as patients: they are free to move and use their hands during interaction with an application. BCI should therefore be integrated into a multimodal approach. Games are an important research area, since the shortcomings of BCI can be translated into challenges in multimodal cooperative, competitive, social, and casual games.
- Published
- 2012
8. Social Interaction in a Cooperative Brain-computer Interface Game
- Author
-
Obbink, Michel, Gürkök, Hayrettin, Plass - Oude Bos, D., Hakvoort, Gido, Poel, Mannes, Nijholt, Antinus, Camurri, Antonio, and Costa, Cristina
- Subjects
HMI-HF: Human Factors, IR-81860, Computer science, Interface (computing), Social Interaction, METIS-289714, HMI-MI: MULTIMODAL INTERACTIONS, Social relation, EWI-22306, Co-operation, Human–computer interaction, Brain-Computer Interfaces, Selection method, Games, Simulation, Brain–computer interface, Gesture
- Abstract
Does using a brain-computer interface (BCI) influence the social interaction between people playing a cooperative game? By measuring the amount of speech, utterances, instrumental gestures, and empathic gestures during a cooperative game in which two participants had to reach a certain goal, and by questioning participants about their experience afterwards, this study attempts to answer this question. The results showed that social interaction changed when using a BCI compared to using a mouse: there was a higher number of utterances and empathic gestures, indicating that the participants reacted more to the higher difficulty of the BCI selection method. Participants also reported that they felt they cooperated better when using the mouse.
- Published
- 2012
9. Real-time Automatic Emotion Recognition from Body Gestures
- Author
-
Piana, Stefano, Staglianò, Alessandra, Odone, Francesca, Verri, Alessandro, and Camurri, Antonio
- Subjects
FOS: Computer and information sciences, Computer Vision and Pattern Recognition (cs.CV), Computer Science - Human-Computer Interaction, Computer Science - Computer Vision and Pattern Recognition, ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION, Human-Computer Interaction (cs.HC)
- Abstract
Although psychological research indicates that bodily expressions convey important affective information, research in emotion recognition has to date focused mainly on facial expression or voice analysis. In this paper we propose an approach to real-time automatic emotion recognition from body movements. A set of postural, kinematic, and geometrical features is extracted from sequences of 3D skeletons and fed to a multi-class SVM classifier. The proposed method has been assessed on data acquired through two different systems: a professional-grade optical motion capture system and Microsoft Kinect. The system has been assessed on a "six emotions" recognition problem and, using a leave-one-subject-out cross-validation strategy, reached an overall recognition rate of 61.3%, which is very close to the recognition rate of 61.9% obtained by human observers. To provide further testing of the system, two games were developed in which one or two users have to interact to understand and express emotions with their body.
- Published
- 2014
- Full Text
- View/download PDF
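The leave-one-subject-out protocol mentioned in the abstract can be sketched independently of the SVM itself: every fold holds out all samples of one subject, so the classifier is always tested on an unseen person. The subject ids, feature vectors, and emotion labels below are invented for illustration.

```python
from collections import defaultdict

def leave_one_subject_out(samples):
    """Yield (held_out_subject, train, test) splits in which the test fold
    contains every sample of exactly one subject, so the classifier is
    always evaluated on a person it never saw during training."""
    by_subject = defaultdict(list)
    for s in samples:
        by_subject[s["subject"]].append(s)
    for held_out in by_subject:
        test = by_subject[held_out]
        train = [s for subj, group in by_subject.items()
                 if subj != held_out for s in group]
        yield held_out, train, test

# Invented toy data: subject id, 2-D feature vector, emotion label.
samples = [
    {"subject": "s1", "features": [0.1, 0.9], "label": "joy"},
    {"subject": "s1", "features": [0.2, 0.8], "label": "sadness"},
    {"subject": "s2", "features": [0.7, 0.3], "label": "joy"},
    {"subject": "s2", "features": [0.6, 0.2], "label": "anger"},
]
for held_out, train, test in leave_one_subject_out(samples):
    print(held_out, len(train), len(test))
```

In the paper's setting, a multi-class SVM would be fitted on each train fold and scored on the matching test fold, with the per-fold accuracies averaged into the overall recognition rate.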
10. Elckerlyc in practice - on the integration of a BML Realizer in real applications
- Author
-
Reidsma, Dennis, van Welbergen, H., Camurri, Antonio, Costa, Cristina, and Volpe, Gualtiero
- Subjects
EWI-21993, METIS-287901, IR-80698
- Abstract
Building a complete virtual human application from scratch is a daunting task, and it makes sense to rely on existing platforms for behavior generation. When building such an interactive application, one needs to be able to adapt and extend the capabilities of the virtual human offered by the platform, without having to make invasive modifications to the platform itself. This paper describes how Elckerlyc, a novel platform for controlling a virtual human, offers these possibilities.
- Published
- 2012
11. Single Value Devices
- Author
-
Mader, Angelika H., Dertien, Edwin Christian, Reidsma, Dennis, Camurri, Antonio, and Costa, Cristina
- Subjects
EWI-21978, METIS-287898, IR-80668
- Abstract
We live in a world of continuous information overflow, but the quality of information and communication is suffering. Single value devices contribute to information and communication quality by focussing on one explicit, relevant piece of information. The information is decoupled from a computer and represented in an object, integrated into daily life. The contribution of this paper is on different levels: Firstly, we identify single value devices as a class, and, secondly, illustrate it through examples in a survey. Thirdly, we collect characterizations of single value devices into a taxonomy. The taxonomy also provides a collection of design choices that allow one to more easily find new combinations or alternatives, and that facilitate the design of new, meaningful, effective and working objects. Finally, when we want to step from experimental examples to commercializable products, a number of issues become relevant that are identified and discussed in the remainder of this paper.
- Published
- 2012
12. Design of an interactive playground based on traditional children's play
- Author
-
Tetteroo, Daniel, Reidsma, Dennis, van Dijk, Elisabeth M.A.G., Nijholt, Antinus, Camurri, Antonio, Costa, Cristina, and Volpe, Gualtiero
- Subjects
METIS-287900, ComputingMilieux_PERSONALCOMPUTING, EWI-21991, IR-80697
- Abstract
This paper presents a novel method for interactive playground design, based on traditional children's play. This method combines the rich interaction possibilities of computer games with the physical and open-ended aspects of traditional children's games. The method is explored by the development of a prototype interactive playground, which has been implemented and evaluated over two iterations.
- Published
- 2012
13. Towards Mimicry Recognition during Human Interactions: Automatic Feature Selection and Representation
- Author
-
Sun, X., Nijholt, Antinus, Pantic, Maja, Camurri, Antonio, and Costa, Cristina
- Subjects
EC Grant Agreement nr.: ERC/203143, Computer science, EWI-20157, METIS-289627, Feature extraction, HMI-MI: MULTIMODAL INTERACTIONS, Feature selection, Affect (psychology), motion energy, Human–computer interaction, human-human interaction, Human behavior analysis, Conversation, Computer vision, Interpersonal interaction, Affective computing, Mimicry representation, Representation (systemics), EC Grant Agreement nr.: FP7/231287, Mimicry, Artificial intelligence, IR-81853
- Abstract
During face-to-face interaction, people have a tendency to mimic each other; that is, they adjust their own behavior to the behavior expressed by a partner. In this paper we describe how behavioral information expressed between two interlocutors can be used to detect and identify mimicry and to improve recognition of the interrelationship and affect between them in a conversation. In working towards the automatic extraction and integration of this behavioral information into a mimicry detection framework for affective computing, this paper addresses the main challenge: mimicry representation, in terms of optimal behavioral feature extraction and automatic integration.
- Published
- 2012
14. Elckerlyc in Practice – On the Integration of a BML Realizer in Real Applications
- Author
-
Reidsma, Dennis, van Welbergen, Herwin, Camurri, Antonio, and Costa, Cristina
- Subjects
Multimedia, Computer science, Personalization, Task (project management), Human–computer interaction, System integration, Architecture, Virtual actor
- Abstract
Building a complete virtual human application from scratch is a daunting task, and it makes sense to rely on existing platforms for behavior generation. When building such an interactive application, one needs to be able to adapt and extend the capabilities of the virtual human offered by the platform, without having to make invasive modifications to the platform itself. This paper describes how Elckerlyc, a novel platform for controlling a virtual human, offers these possibilities.
- Published
- 2012
15. Smart Material Interfaces: A Vision
- Author
-
Minuto, A., Vyas, Dhaval, Poelman, Wim, Nijholt, Antinus, Camurri, Antonio, and Costa, Cristina
- Subjects
Ubiquitous computing, Multimedia, METIS-296097, Computer science, HMI-MI: MULTIMODAL INTERACTIONS, EWI-22304, Smart material, Ubiquitous Computing, Tangible User Interfaces, Smart Material Interfaces, Human–computer interaction, IR-81858, User interface
- Abstract
In this paper, we introduce a vision called Smart Material Interfaces (SMIs), which takes advantage of the latest generation of engineered materials that have a special property defined as "smart". They are capable of changing their physical properties, such as shape, size, and color, and can be controlled using certain stimuli (light, potential difference, temperature, and so on). We describe SMIs in relation to Tangible User Interfaces (TUIs) to convey the usefulness of SMIs and a better understanding of them.
- Published
- 2012
16. Mappe per Affetti Erranti: a Multimodal System for Social Active Listening and Expressive Performance
- Author
-
Camurri, Antonio, Canepa, Corrado, Coletta, Paolo, Mazzarino, Barbara, and Volpe, Gualtiero
- Subjects
multimodal interactive systems, expressive gesture, active listening of sound and music content, affective computing, sound and music computing
- Published
- 2008
- Full Text
- View/download PDF
17. Panel: The need of formats for streaming and storing music-related movement and gesture data
- Author
-
Jensenius, Alexander, Camurri, Antonio, Castagné, Nicolas, Maestre, Esteban, Joseph Malloch, Mc Gilvray, Douglas, Schwarz, Diemo, Wright, Matthew, Musical Gestures Group [Oslo], Faculty of Humanities [Oslo], University of Oslo (UiO)-University of Oslo (UiO), Dipartimento di Informatica, Sistemistica e Telematica (DIST), Universita degli studi di Genova -Università di Genova, ACROE - Ingénierie de la Création Artistique (ACROE-ICA), Ministère de la Culture et de la Communication (MCC)-Institut National Polytechnique de Grenoble (INPG), Music Technology Group, Universitat Pompeu Fabra [Barcelona] (UPF), Input Devices and Music Interaction Laboratory (IDMIL), McGill University = Université McGill [Montréal, Canada], Centre Interdisciplinaire de Recherche en Musique, Médias et Technologie [Montréal] (CIRMMT), Schulich School of Music [Montréal], McGill University = Université McGill [Montréal, Canada]-McGill University = Université McGill [Montréal, Canada], Centre for Music Technology - Univ. of Glasgow, University of Glasgow, Institut de Recherche et Coordination Acoustique/Musique (IRCAM), CNMAT - UC Berkeley, Center for Computer Research in Music and Acoustics (CCRMA), Stanford University, and Ann Arbor
- Subjects
[INFO.INFO-SD]Computer Science [cs]/Sound [cs.SD], [INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC]
- Abstract
The last decade has seen the development of standards for music notation (MusicXML), audio analysis (SDIF), and sound control (OSC), but there are no widespread standards, nor structured approaches, for handling music-related movement, action and gesture data. This panel will address the needs for such formats and standards in the computer music community, and discuss possible directions for future development.
18. Instruction of computer music for computer engineering students and professionals
- Author
-
Camurri, Antonio, Dannenberg, Roger, De Poli, Giovanni, and Smith, Julius O.
19. A Summary of Formats for Streaming and Storing Music-Related Movement and Gesture Data
- Author
-
Jensenius, Alexander, Castagné, Nicolas, Camurri, Antonio, Maestre, Esteban, Joseph Malloch, Mc Gilvray, Douglas, Musical Gestures Group [Oslo], Faculty of Humanities [Oslo], University of Oslo (UiO)-University of Oslo (UiO), ACROE - Ingénierie de la Création Artistique (ACROE-ICA), Ministère de la Culture et de la Communication (MCC)-Institut National Polytechnique de Grenoble (INPG), Infomus Lab, University of Genoa (UNIGE), Music Technology Group (MTG), Universitat Pompeu Fabra [Barcelona] (UPF), Centre Interdisciplinaire de Recherche en Musique, Médias et Technologie [Montréal] (CIRMMT), Schulich School of Music [Montréal], McGill University = Université McGill [Montréal, Canada]-McGill University = Université McGill [Montréal, Canada], Input Devices and Music Interaction Laboratory (IDMIL), McGill University = Université McGill [Montréal, Canada], Centre for Music Technology - Univ. of Glasgow, and University of Glasgow
- Subjects
[INFO.INFO-SD]Computer Science [cs]/Sound [cs.SD], [INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC], [INFO.INFO-GR]Computer Science [cs]/Graphics [cs.GR]
- Abstract
This paper summarises a panel discussion at the 2007 International Computer Music Conference on movement and gesture data formats, presents some of the formats currently in development in the computer music community, and outlines some of the challenges involved in future development.
20. Gesture desk - An integrated multi-modal gestural workplace for sonification
- Author
-
Hermann, Thomas, Henning, Thomas, Ritter, Helge, Camurri, Antonio, and Volpe, Gualtiero
- Subjects
ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION
- Abstract
This paper presents the gesture desk, a new platform for a human-computer interface at a regular computer workplace. It extends classical input devices like keyboard and mouse with arm and hand gestures, without the need for inconvenient accessories like data gloves or markers. A central element is a "gesture box" containing two infrared cameras and a color camera, positioned under a glass desk. Arm and hand motions are tracked in three dimensions. A synchronizer board has been developed to provide active glare-free IR illumination for robust body and hand tracking. As a first application, we demonstrate interactive real-time browsing and querying of auditory self-organizing maps (AuSOMs). An AuSOM is a combined visual and auditory presentation of high-dimensional data sets. Moving the hand above the desk surface allows the user to select neurons on the map and to manipulate how they contribute to data sonification. Each neuron is associated with a prototype vector in high-dimensional space, so that a set of 2D-topologically ordered feature maps is queried simultaneously. The level of detail is selected by hand altitude over the table surface, allowing the user to emphasize or deemphasize neurons on the map.
Discovery Service for Jio Institute Digital Library