21 results on '"Meyrueis, Vincent"'
Search Results
2. Advanced Monitoring of Manufacturing Process through Video Analytics.
- Author
-
Hakam, Nisar, Benfriha, Khaled, Meyrueis, Vincent, and Liotard, Cyril
- Subjects
MANUFACTURING processes ,VIDEO processing ,AUTOMATION ,METADATA ,DIGITIZATION ,VIDEO surveillance - Abstract
The digitization of production systems has revolutionized industrial monitoring. Analyzing real-time bottom-up data enables the dynamic monitoring of industrial processes. Data are collected in various types, like video frames and time signals. This article focuses on leveraging images from a vision system to monitor the manufacturing process on a computer numerical control (CNC) lathe machine. We propose a method for designing and integrating these video modules on the edge of a production line. This approach detects the presence of raw parts, measures process parameters, assesses tool status, and checks roughness in real time using image processing techniques. The efficiency is evaluated by checking the deployment, the accuracy, the responsiveness, and the limitations. Finally, a perspective is offered to use the metadata off the edge in a more complex artificial-intelligence (AI) method for predictive maintenance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
3. Towards the Augmentation of Digital Twin Performance
- Author
-
Charrier, Quentin, primary, Hakam, Nisar, additional, Benfriha, Khaled, additional, Meyrueis, Vincent, additional, Liotard, Cyril, additional, Bouzid, Abdel-Hakim, additional, and Aoussat, Améziane, additional
- Published
- 2023
4. Emotional activity in early immersive design: Sketches and moodboards in virtual reality
- Author
-
Rieuf, Vincent, Bouchard, Carole, Meyrueis, Vincent, and Omhover, Jean-François
- Published
- 2017
5. Towards the augmentation of digital twin performance
- Author
-
Charrier, Quentin, Hakam, Nisar, Benfriha, Khaled, Meyrueis, Vincent, Liotard, Cyril, Bouzid, Abdel-Hakim, and Aoussat, Améziane
- Abstract
Digital Twin (DT) aims to provide industrial companies with an interface to visualize, analyze, and simulate the production process, improving overall performance. This paper proposes to extend existing DT by adding a complementary methodology to make it suitable for process supervision. To implement our methodology, we introduce a novel framework that identifies, collects, and analyzes data from the production system, enhancing DT functionalities. In our case study, we implemented Key Performance Indicators (KPIs) in the immersive environment to monitor physical processes through their cyber representation. First, a review of the Digital Twin allows us to understand the status of existing methodologies as well as the problem of data contextualization in recent years. Based on this review, performance data in Cyber–Physical Systems (CPS) are identified, localized, and processed to generate indicators for monitoring machine and production line performance through the DT. Finally, a discussion reveals the difficulties of integration and the possibilities of responding to other major industrial challenges, such as predictive maintenance.
- Published
- 2023
6. TOWARDS REMOTE CONTROL OF MANUFACTURING MACHINES THROUGH ROBOT VISION SENSORS.
- Author
-
Halawi Ghoson, Nourhan, Hakam, Nisar, Shakeri, Zohreh, Meyrueis, Vincent, Loubère, Stéphane, and Benfriha, Khaled
- Subjects
REMOTE control ,ENGINEERING design ,THREE-dimensional printing ,INDUSTRIAL robots ,DETECTORS - Abstract
The remote management of equipment is part of the functionality granted by the design principles of Industry 4.0. However, some critical operations are still handled by operators; machine setup and initialization are a significant illustration. Since initialization is a repetitive task, industrial robots with a smart vision system can undertake these duties, enhancing the autonomy and flexibility of the manufacturing process. The smart vision system is considered essential for implementing several characteristics of Industry 4.0. This paper introduces a novel solution for controlling manufacturing machines using a camera embedded on the robot. This implementation requires the development of an interactive interface, designed in accordance with the supervision system known as the Manufacturing Execution System. The framework is implemented inside a manufacturing cell, demonstrating a quick response time and improved performance across the cameras. [ABSTRACT FROM AUTHOR]
- Published
- 2023
7. IdeAM Running Quiz: A Digital Learning Game to Enhance Additive Manufacturing Opportunities Discovery
- Author
-
Pham Van, Léo, primary, Jean, Camille, additional, Meyrueis, Vincent, additional, Gazo, Claude, additional, Mantelet, Fabrice, additional, Gueguan, Jérôme, additional, Buisine, Stéphanie, additional, and Segonds, Frédéric, additional
- Published
- 2022
8. IdeAM Running Quiz: A Digital Learning Game to Enhance Additive Manufacturing Opportunities Discovery.
- Author
-
Van, Léo Pham, Jean, Camille, Meyrueis, Vincent, Gazo, Claude, Mantelet, Fabrice, Guegan, Jérôme, Buisine, Stéphanie, and Segonds, Frédéric
- Subjects
DIGITAL learning ,EDUCATIONAL games ,ARCADE games ,LITERATURE reviews - Abstract
This paper provides a description of IdeAM Running Quiz, a learning game aimed at enhancing the learning of Additive Manufacturing (AM) opportunities. First, a review of the literature on Design and Additive Manufacturing methodologies and on the educational effectiveness of learning games is presented. Then, the steps involved in the design of the IdeAM Running Quiz game are introduced. IdeAM Running Quiz is a digital learning game inspired by arcade games. It is an infinite race in which the player has to choose the right path that will allow them to advance. It features examples of objects that can be made with AM as well as AM opportunities. For this research, we conducted an experiment evaluating the effectiveness of IdeAM Running Quiz for learning AM opportunities, as well as the satisfaction provided by playing it. This experiment has shown that IdeAM Running Quiz helps players understand additive manufacturing opportunities. It has also shown that participants like to play IdeAM Running Quiz and would recommend it to someone who wants to discover AM opportunities. [ABSTRACT FROM AUTHOR]
- Published
- 2022
9. Catalogue d'exposition de 7 installations artistiques (Scenograffer, Interacte, Solar Insects, CREATIC, Gesture Selfie Diaporama, Deaf Poetry, TypannotSigns)
- Author
-
Meyrueis, Vincent, Boutet, Dominique, Jégo, Jean-François, Image Numérique et Réalité Virtuelle (INREV ), Arts des Images et Art Contemporain (AIAC), Université Paris 8 Vincennes-Saint-Denis (UP8)-Université Paris 8 Vincennes-Saint-Denis (UP8), Dynamique du Langage In Situ (DYLIS), Université de Rouen Normandie (UNIROUEN), Normandie Université (NU)-Normandie Université (NU)-Institut de Recherche Interdisciplinaire Homme et Société (IRIHS), Normandie Université (NU)-Normandie Université (NU)-Université de Rouen Normandie (UNIROUEN), Normandie Université (NU), ZEM Brandenburgische Zentrum für Medienwissenschaften, Vincent Meyrueis, Dominique Boutet, and Jean-François Jégo
- Subjects
[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,[SHS.ART]Humanities and Social Sciences/Art and art history ,[SHS.LANGUE]Humanities and Social Sciences/Linguistics ,ComputingMilieux_MISCELLANEOUS - Abstract
International audience
- Published
- 2019
10. Communication invitée
- Author
-
Boutet, Dominique, Jégo, Jean-François, Meyrueis, Vincent, Dynamique du Langage In Situ (DYLIS), Université de Rouen Normandie (UNIROUEN), Normandie Université (NU)-Normandie Université (NU)-Institut de Recherche Interdisciplinaire Homme et Société (IRIHS), Normandie Université (NU)-Normandie Université (NU)-Université de Rouen Normandie (UNIROUEN), Normandie Université (NU), Arts des Images et Art Contemporain (AIAC), Université Paris 8 Vincennes-Saint-Denis (UP8), Image Numérique et Réalité Virtuelle (INREV ), Université Paris 8 Vincennes-Saint-Denis (UP8)-Université Paris 8 Vincennes-Saint-Denis (UP8), and Viadrina Universität
- Subjects
[SHS.LANGUE]Humanities and Social Sciences/Linguistics ,[SHS.ANTHRO-SE]Humanities and Social Sciences/Social Anthropology and ethnology ,ComputingMilieux_MISCELLANEOUS - Abstract
International audience
- Published
- 2019
11. Body of the gestures / Gestures of the body
- Author
-
Boutet, Dominique, Jégo, Jean-François, Meyrueis, Vincent, and Boutet, Dominique
- Subjects
[SHS.ANTHRO-SE] Humanities and Social Sciences/Social Anthropology and ethnology ,[SHS.LANGUE] Humanities and Social Sciences/Linguistics ,ComputingMilieux_MISCELLANEOUS - Published
- 2019
12. POLIMOD Pipeline: step-by-step tutorial: Motion Capture, Visualization & Data Analysis for gesture studies
- Author
-
Boutet, Dominique, Jégo, Jean-François, Meyrueis, Vincent, Dynamique du Langage In Situ (DYLIS), Université de Rouen Normandie (UNIROUEN), Normandie Université (NU)-Normandie Université (NU)-Institut de Recherche Interdisciplinaire Homme et Société (IRIHS), Normandie Université (NU)-Normandie Université (NU)-Université de Rouen Normandie (UNIROUEN), Normandie Université (NU), Université Paris 8 Vincennes-Saint-Denis (UP8), Université de Rouen, Université Paris 8, Moscow State Linguistic University, and POLIMOD
- Subjects
[INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing ,[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,[MATH.MATH-NA]Mathematics [math]/Numerical Analysis [math.NA] ,[INFO.INFO-CL]Computer Science [cs]/Computation and Language [cs.CL] ,[INFO.INFO-GR]Computer Science [cs]/Graphics [cs.GR] - Abstract
This open-access tutorial describes step-by-step how to use motion capture for gesture studies. We propose a pipeline to collect, visualize, annotate and analyze motion capture (mocap) data for gesture studies. A pipeline is "an implementation of a workflow specification. The term comes from computing, where it means a set of serial processes, with the output of one process being the input of the subsequent process. A production pipeline is not generally perfectly serial because real workflows usually have branches and iterative loops, but the idea is valid: A pipeline is the set of procedures that need to be taken in order to create and hand off deliverables" (Okun, 2010). The pipeline designed here presents two main parts and three subparts. The first part of the pipeline describes the data collection process, including the setup and its prerequisites, the protocol to follow, and how to export data for the analysis. The second part focuses on data analysis, describing the main steps of data processing, data analysis itself following different gesture descriptors, and data visualization in order to understand complex or multidimensional gesture features. We design the pipeline using blocks connected with arrows. Each block presents a specific step using hardware or software. Arrows represent the flow of data between blocks, and the attached 3-letter acronyms refer to the data file formats.
- Published
- 2018
13. POLIMOD Pipeline: documentation. Motion Capture, Visualization & Data Analysis for gesture studies
- Author
-
Boutet, Dominique, Jégo, Jean-François, Meyrueis, Vincent, Dynamique du Langage In Situ (DYLIS), Université de Rouen Normandie (UNIROUEN), Normandie Université (NU)-Normandie Université (NU)-Institut de Recherche Interdisciplinaire Homme et Société (IRIHS), Normandie Université (NU)-Normandie Université (NU)-Université de Rouen Normandie (UNIROUEN), Normandie Université (NU), Université Paris 8 Vincennes-Saint-Denis (UP8), and Université de Rouen, Université Paris 8, Moscow State Linguistic University
- Subjects
[INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing ,[INFO]Computer Science [cs] ,[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,[INFO.INFO-NA]Computer Science [cs]/Numerical Analysis [cs.NA] ,[INFO.INFO-CL]Computer Science [cs]/Computation and Language [cs.CL] ,[INFO.INFO-GR]Computer Science [cs]/Graphics [cs.GR] - Abstract
We propose a pipeline to collect, visualize, annotate and analyze motion capture (mocap) data for gesture studies. A pipeline is "an implementation of a workflow specification. The term comes from computing, where it means a set of serial processes, with the output of one process being the input of the subsequent process. A production pipeline is not generally perfectly serial because real workflows usually have branches and iterative loops, but the idea is valid: A pipeline is the set of procedures that need to be taken in order to create and hand off deliverables" (Okun, 2010). The pipeline designed here (table 1) presents two main parts and three subparts. The first part of the pipeline describes the data collection process, including the setup and its prerequisites, the protocol to follow, and how to export data for the analysis. The second part focuses on data analysis, describing the main steps of data processing, data analysis itself following different gesture descriptors, and data visualization in order to understand complex or multidimensional gesture features. We design the pipeline using blocks connected with arrows. Each block presents a specific step using hardware or software. Arrows represent the flow of data between blocks, and the attached 3-letter acronyms refer to the data file formats (table 2). The development of the pipeline raises three main questions: How to synchronize data? How to pick data and transform it? And what does this change for annotation? To solve the question of data synchronization, we design a protocol detailing how hardware has to be selected according to the type of measures; the protocol also implies specific steps for the participant, such as adopting a T-pose or clapping their hands once at the beginning and the end of the recording, to facilitate data synchronization and export in the subsequent steps.
To pick relevant data and transform it, we propose selecting and preparing the files to export according to the intended analysis and software. Thus, mocap files can be converted to videos to be visualized, for instance in ELAN, to enhance gesture coding; converted to text files to be analyzed in Excel; or processed in Unity to explore the flow of movement, new gesture features, or kinematics. We detail all these processes in a step-by-step tutorial available in open access. Finally, we question what a pipeline involving mocap changes in annotation. We note that mocap no longer imposes a single point of view but offers as many as required, since we can use virtual cameras to study gesture from the "skeleton" of the participant. For example, we show that it is possible to adopt a first-person point of view to embody and thus better understand participants' gestures. We also propose an augmented reality tool developed in the Unity3D software to visualize multidimensional gesture features (such as velocity, acceleration, and jerk), or a combination of them, in real time as simpler curve or surface representations. As a future direction, the data collected here could be used by a machine learning algorithm to automatically extract gesture properties or to automatically detect and tag the aspectuality of gestures. Finally, an embodied visualization tool using virtual reality could offer new possibilities for coding and understanding gestures beyond using a 2D video as a reference or study material.
- Published
- 2018
14. A Workflow for Real-time Visualization and Data Analysis of Gesture using Motion Capture
- Author
-
Jégo, Jean-François, primary, Meyrueis, Vincent, additional, and Boutet, Dominique, additional
- Published
- 2019
15. A Workflow for Real-time Visualization and Data Analysis of Gesture using Motion Capture.
- Author
-
Jégo, Jean-François, Meyrueis, Vincent, and Boutet, Dominique
- Published
- 2019
16. Modification interactive de formes en réalité virtuelle : application à la conception d'un produit
- Author
-
Meyrueis, Vincent, Centre de Robotique (CAOR), MINES ParisTech - École nationale supérieure des mines de Paris, Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL), École Nationale Supérieure des Mines de Paris, and Philippe Fuchs
- Subjects
Conception assistée par ordinateur (CAO) ,Réalité virtuelle ,Environnement immersif ,Real-time 3D object deformation ,Immersive environment ,Virtual reality ,Déformation en temps réel d'objets 3D ,[INFO.INFO-GR]Computer Science [cs]/Graphics [cs.GR] ,Computer-aided design (CAD) - Abstract
The context of this research is the use of virtual reality for product design and development. At the industrial level, virtual integration reviews are currently limited to static project reviews, with no possibility of changing the original design. In this work, we present a new method for virtual design review, which aims to enable the user to modify, as naturally as possible, the digital model inside the virtual environment. This method, called D3, is based on three steps: a selection step by drawing, a deformation step by manipulating the selected area, and a final refining step that allows engineers to adjust the modifications. For the method to be a means of communication, it has to be simple, intuitive, and usable by all project actors. An experiment was performed to evaluate the method in terms of learning curve and the errors made by subjects in a modification task. Finally, in order to facilitate the refining step, the method offers several ways for engineers to reflect the modifications on the CAD model.
- Published
- 2011
17. D3: an Immersive aided design deformation method
- Author
-
Meyrueis, Vincent, Paljic, Alexis, Fuchs, Philippe, Centre de Robotique (CAOR), MINES ParisTech - École nationale supérieure des mines de Paris, Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL), équipe RV&RA, Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL)-MINES ParisTech - École nationale supérieure des mines de Paris, and ACM
- Subjects
Computer-Aided Design (CAD) ,Real-time 3D Object Deformation ,I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling--Curve, surface, solid, and object representations ,I.3.6 [Computer Graphics]: Methodology and Techniques--Interaction techniques ,Virtual Reality ,Immersive Environment ,[INFO.INFO-GR]Computer Science [cs]/Graphics [cs.GR] - Abstract
International audience; In this paper, we introduce a new deformation method adapted to immersive design. The use of Virtual Reality (VR) in the design process implies a physical displacement of project actors and data between the virtual reality facilities and the design office. The decisions taken in the immersive environment are manually reflected on the Computer-Aided Design (CAD) system. This increases the design time and breaks the continuity of the data workflow. On this basis, there is a clear demand in industry for tools adapted to immersive design, but few methods exist that address CAD concerns in VR. For this purpose, we propose a new method, called D3 for "Draw, Deform and Design", based on a two-step manipulation paradigm consisting of 1) area selection and 2) path drawing, followed by a final refining and fitting phase. Our method is discussed on the basis of a set of CAD deformation scenarios.
- Published
- 2009
18. Tâches de conception en environnement immersif par les techniques de réalité virtuelle
- Author
-
Meyrueis, Vincent, Paljic, Alexis, Fuchs, Philippe, Centre de Robotique (CAOR), MINES ParisTech - École nationale supérieure des mines de Paris, Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL), équipe RV&RA, and Université Paris sciences et lettres (PSL)-Université Paris sciences et lettres (PSL)-MINES ParisTech - École nationale supérieure des mines de Paris
- Subjects
[INFO.INFO-RB]Computer Science [cs]/Robotics [cs.RO] ,[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,ComputingMilieux_MISCELLANEOUS ,[INFO.INFO-GR]Computer Science [cs]/Graphics [cs.GR] - Abstract
International audience
- Published
- 2007
19. A template approach for coupling virtual reality and CAD in an immersive car interior design scenario
- Author
-
Meyrueis, Vincent, primary, Paljic, Alexis, additional, Leroy, Laure, additional, and Fuchs, Philippe, additional
- Published
- 2013
20. D3
- Author
-
Meyrueis, Vincent, primary, Paljic, Alexis, additional, and Fuchs, Philippe, additional
- Published
- 2009
21. D3.
- Author
-
Meyrueis, Vincent, Paljic, Alexis, and Fuchs, Philippe
- Published
- 2009