20 results for "Oliver, Schneider"
Search Results
2. Sustainable Haptic Design: Improving Collaboration, Sharing, and Reuse in Haptic Design Research
- Author
- Oliver Schneider, Bruno Fruchard, Dennis Wittchen, Bibhushan Raj Joshi, Georg Freitag, Donald Degraen, and Paul Strohmeier
- Published
- 2022
3. TactJam: An End-to-End Prototyping Suite for Collaborative Design of On-Body Vibrotactile Feedback
- Author
- Dennis Wittchen, Katta Spiel, Bruno Fruchard, Donald Degraen, Oliver Schneider, Georg Freitag, and Paul Strohmeier
- Subjects
- Computer Science, Human-Computer Interaction - Abstract
We present TactJam, an end-to-end suite for creating and sharing low-fidelity prototypes of on-body vibrotactile feedback. With TactJam, designers can create, record, and share vibrotactile patterns online. This opens up new ways of collaboratively designing vibrotactile patterns in both collocated and remote settings. We evaluate TactJam in a two-part distributed online workshop exploring the design of on-body tactons. Participants were able to successfully use TactJam to learn about tacton design. We present an overview of mappings between tactons and their associated concepts before comparing tactons created using solely a GUI with tactons created by experimenting with placements directly on the body. Conducting the two parts of the workshop separately highlighted the importance of designing directly with bodies: fewer implicit assumptions were made, and designs were guided by personal experience. We reflect on these results and close with deliberations on the future development of TactJam.
- Published
- 2022
4. Juicy Haptic Design: Vibrotactile Embellishments Can Improve Player Experience in Games
- Author
- Tanay Singhal and Oliver Schneider
- Subjects
- Computer science, Terminology, Empirical research, Game design, User experience design, Human–computer interaction, Added value, Interactive media, Haptic technology, Meaning (linguistics) - Abstract
Game designers and researchers employ a sophisticated language for producing great player experiences with concepts such as juiciness, which refers to excessive positive feedback. However, much of their discourse excludes the role and value of haptic feedback. In this paper, we adapt terminology from game design to study haptic feedback. Specifically, we define haptic embellishments (HEs) as haptic feedback that reinforce information already provided through other means (e.g., via visual feedback) and juicy haptics as excessive positive haptic feedback with the intention of improving user experience in games and other interactive media. We report two empirical studies of users’ experiences interacting with visuo-haptic content on their phones to 1) study participants’ preferences for ten design principles for HEs and 2) measure the added value of juicy haptics, implemented as HEs, on player experience in a game. Results indicate that juicy haptics can enhance enjoyability, aesthetic appeal, immersion, and meaning.
- Published
- 2021
5. Scrappy: Using Scrap Material as Infill to Make Fabrication More Sustainable
- Author
- Daniel Vogel, Oliver Schneider, Alec Jacobson, and Ludwig Wall
- Subjects
- Engineering drawing, Computer science, Nesting (process), 3D printing, Scrap, Energy consumption, Geometry processing, Workflow, Infill, Software system - Abstract
We present a software system for fused deposition modelling 3D printing that replaces infill material with scrap to reduce material and energy consumption. Example scrap objects include unused 3D prints from prototyping and calibration, household waste like coffee cups, and off-cuts from other fabrication projects. To achieve this, our system integrates into an existing CAD workflow and manages a database of common items, previous prints, and manually entered objects. While modelling in a standard CAD application, the system suggests objects to insert, ranked by how much infill material they could replace. This computation extends an existing nesting algorithm to determine which objects fit, optimize their alignment, and adjust the enclosing mesh geometry. While printing, the system uses custom tool-paths and animated instructions to enable anyone nearby to manually insert the scrap material.
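The suggestion step described in the abstract, ranking scrap objects by how much infill material they could replace, can be sketched roughly as follows. The `ScrapObject` fields, the axis-aligned bounding-box fit check, and the volume cap are illustrative assumptions, not the paper's actual nesting algorithm:

```python
from dataclasses import dataclass
from itertools import permutations

@dataclass
class ScrapObject:
    name: str
    volume_cm3: float   # solid material the object would displace
    bbox_mm: tuple      # (x, y, z) bounding box

def fits(bbox, cavity):
    """Crude stand-in for nesting: does the bounding box fit the
    infill cavity in some axis-aligned orientation?"""
    return any(all(o <= c for o, c in zip(perm, cavity))
               for perm in permutations(bbox))

def rank_scrap(objects, cavity_mm, cavity_infill_cm3):
    """Keep objects that fit, ranked by infill material replaced."""
    usable = [o for o in objects if fits(o.bbox_mm, cavity_mm)]
    # An object can never replace more infill than the cavity contains.
    return sorted(usable,
                  key=lambda o: min(o.volume_cm3, cavity_infill_cm3),
                  reverse=True)
```

The real system additionally optimizes object alignment and adjusts the enclosing mesh geometry, which a bounding-box test cannot capture.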
- Published
- 2021
6. Defining Haptic Experience: Foundations for Understanding, Communicating, and Evaluating HX
- Author
- Erin Kim and Oliver Schneider
- Subjects
- Consistency (negotiation), User experience design, Computer science, Human–computer interaction, Immersion (virtual reality), Usability, Experiential learning, Personalization, Haptic technology - Abstract
Haptic technology is maturing, with expectations and evidence that it will contribute to user experience (UX). However, we have very little understanding about how haptic technology can influence people's experience. Researchers and designers need a way to understand, communicate, and evaluate haptic technology's effect on UX. From a literature review and two studies - one with haptics novices, the other with expert hapticians - we developed a theoretical model of the factors that constitute a good haptic experience (HX). We define HX and propose its constituent factors: design parameters of Timeliness, Density, Intensity, and Timbre; the cross-cutting concern of Personalization; usability requirements of Utility, Causality, Consistency, and Saliency; and experiential factors of Harmony, Expressivity, Autotelics, Immersion, and Realism as guiding constructs important for haptic experience. This model will help guide design and research of haptic systems, inform language around haptics, and provide the basis for evaluative instruments, such as checklists, heuristics, or questionnaires.
- Published
- 2020
7. Keep Calm and Ride Along: Passenger Comfort and Anxiety as Physiological Responses to Autonomous Driving Styles
- Author
- Nicole Belinda Dillen, Lennart E. Nacke, Krzysztof Czarnecki, Oliver Schneider, Edith Law, and Marko Ilievski
- Subjects
- Future studies, Skin response, Applied psychology, Physiological responses, Lead vehicle, Jerk, Anxiety, Affective computing, Psychology, Skin conductance - Abstract
Autonomous vehicles have been rapidly progressing towards full autonomy using fixed driving styles, which may differ from individual passenger preferences. Violating these preferences may lead to passenger discomfort or anxiety. We studied passenger responses to different driving style parameters in a physical autonomous vehicle. We collected galvanic skin response, heart rate, and eye-movement patterns from 20 participants, along with self-reported comfort and anxiety scores. Our results show that the presence and proximity of a lead vehicle not only raised the level of all measured physiological responses, but also exaggerated the existing effect of the longitudinal acceleration and jerk parameters. Skin response was also found to be a significant predictor of passenger comfort and anxiety. By using multiple independent events to isolate different driving style parameters, we demonstrate a method to control and analyze such parameters in future studies.
- Published
- 2020
8. DualPanto
- Author
- Jonas Bounama, Jotaro Shigeyama, Patrick Baudisch, Oliver Schneider, Thijs Roumen, Sebastian Marwecki, Robert Kovacs, Daniel Amadeus Gloeckner, and Nico Boeckhoff
- Subjects
- Computer science, Visually impaired, Virtual world, Metaverse, Spatial registration, Human–computer interaction, Pantograph, Output device, Haptic technology, Avatar - Abstract
We present a new haptic device that enables blind users to continuously interact with spatial virtual environments that contain moving objects, as is the case in sports or shooter games. Users interact with DualPanto by operating the "me" handle with one hand and holding on to the "it" handle with the other. Each handle is connected to a pantograph haptic input/output device. The key feature is that the two handles are spatially registered with respect to each other. When guiding their avatar through a virtual world using the "me" handle, spatial registration enables users to track moving objects by having the device guide the output hand. This allows blind players of a 1-on-1 soccer game to race for the ball or evade an opponent; it allows blind players of a shooter game to aim at an opponent and dodge shots. In our user study, blind participants reported very high enjoyment when using the device to play (6.5/7).
- Published
- 2018
9. Metamaterial Textures
- Author
- Alexandra Ion, Oliver Schneider, Pedro Lopes, Patrick Baudisch, and Robert Kovacs
- Subjects
- Rapid prototyping, Fabrication, Computer science, Metamaterial, 3D printing, Grid, Programmable matter, Computer graphics (images) - Abstract
We present metamaterial textures---3D printed surface geometries that can perform a controlled transition between two or more textures. Metamaterial textures are integrated into 3D printed objects and allow designing how the object interacts with the environment and the user's tactile sense. Inspired by foldable paper sheets ("origami") and surface wrinkling, our 3D printed metamaterial textures consist of a grid of cells that fold when compressed by an external global force. Unlike origami, however, metamaterial textures offer full control over the transformation, such as in-between states and the sequence of actuation. This allows for integrating multiple textures and makes them useful, e.g., for exploring parameters in the rapid prototyping of textures. Metamaterial textures are also robust enough to allow the resulting objects to be grasped, pushed, or stood on. This allows us to make objects, such as a shoe sole that transforms from flat to treaded, a textured door handle that provides tactile feedback to visually impaired users, and a configurable bicycle grip. We present an editor that assists users in creating metamaterial textures interactively by arranging cells, applying forces, and previewing their deformation.
- Published
- 2018
10. A Demonstration of Metamaterial Textures
- Author
- Pedro Lopes, Alexandra Ion, Robert Kovacs, Patrick Baudisch, and Oliver Schneider
- Subjects
- Computer science, Metamaterial, 3D printing, Object (computer science), Programmable matter, Transformation (function), Computer graphics (images), Door handle - Abstract
We present metamaterial textures---3D printed surface geometries that can perform a controlled transition between two or more textures. Metamaterial textures are integrated into 3D printed objects and allow designing how the object interacts with the environment and the user's tactile sense. Metamaterial textures offer full control over the transformation, such as in-between states and the sequence of actuation. This allows for integrating multiple textures that enable functional objects, such as a transformable door handle that integrates tactile feedback for visually impaired users, or a shoe sole that transforms from flat to treaded to adapt to weather conditions. In our hands-on demonstration, we show our 3D printed prototypes and several samples. Attendees can touch the objects and explore their different textured states.
- Published
- 2018
11. The Haptic Bridge
- Author
- Paulo Blikstein, Allison M. Okamura, Karon E. MacLean, Melisa Orta Martinez, Oliver Schneider, and Richard L. Davis
- Subjects
- Design framework, Multimedia, Computer science, Learning environment, Haptic force feedback, Graph, Unit circle, Human–computer interaction, Trigonometric functions, Cartesian coordinate system, Haptic technology - Abstract
Haptic force feedback systems are unique in their ability to dynamically render physical representations. Although haptic devices have shown promise for supporting learning, prior work mainly describes results of haptic-supported learning without identifying underlying learning mechanisms. To address this, we designed a haptic-supported learning environment and analyzed four students who used it to make connections between two different mathematical representations of sine and cosine: the unit circle, and their graph on the Cartesian plane. We highlight moments where students made connections between the representations, and identify how the haptic feedback supported these moments of insight. We use this evidence in support of a proposed theoretical and design framework for educational haptics. This framework captures four types of haptic representations, and focuses on one -- the haptic bridge -- that effectively scaffolds sense-making with multiple representations.
- Published
- 2017
12. Voodle
- Author
- Paul Bucci, Oliver Schneider, Karon E. MacLean, and David Marino
- Subjects
- Direct voice input, Engineering, Social robot, Multimedia, Context (language use), Animation, Human–robot interaction, Sketch, Human–computer interaction, Robot, Set (psychology) - Abstract
Social robots must be believable to be effective; but creating believable, affectively expressive robot behaviours requires time and skill. Inspired by the directness with which performers use their voices to craft characters, we introduce Voodle (vocal doodling), which uses the form of utterances -- e.g., tone and rhythm -- to puppet and eventually control robot motion. Voodle offers an improvisational platform capable of conveying hard-to-express ideas like emotion. We created a working Voodle system by collecting a set of vocal features and associated robot motions, then incorporating them into a prototype for sketching robot behaviour. We explored and refined Voodle's expressive capacity by engaging expert performers in an iterative design process. We found that users develop a personal language with Voodle; that a vocalization's meaning changed with narrative context; and that voodling imparts a sense of life to the robot, inviting designers to suspend disbelief and engage in a playful, conversational style of design.
- Published
- 2017
13. Sketching CuddleBits
- Author
- Merel Jung, Paul Bucci, Karon E. MacLean, David Marino, Anasazi Valair, Xi Laura Cang, Lucia Tseng, Oliver Schneider, and Jussi Rantala
- Subjects
- Social robot, Computer science, Affective computing, Affect (psychology), Physical prototyping, Arousal, Haptics, Do-It-Yourself (DIY), Human–computer interaction, Robot, Valence (psychology), Human-robot interaction (HRI), Haptic technology - Abstract
Social robots that physically display emotion invite natural communication with their human interlocutors, enabling applications like robot-assisted therapy, where a complex robot's breathing influences human emotional and physiological state. Using DIY fabrication and assembly, we explore how simple 1-DOF robots can express affect with economy and user customizability, leveraging open-source designs. We developed low-cost techniques for coupled iteration of a simple robot's body and behaviour, and evaluated its potential to display emotion. Through two user studies, we (1) validated these CuddleBits' ability to express emotions (N=20); (2) sourced a corpus of 72 robot emotion behaviours from participants (N=10); and (3) analyzed it to link underlying parameters to emotional perception (N=14). We found that CuddleBits can express arousal (activation), and to a lesser degree valence (pleasantness). We also show how a sketch-refine paradigm combined with DIY fabrication and novel input methods enables parametric design of physical emotion display, and discuss how mastering this parsimonious case can give insight into layering simple behaviours in more complex robots.
- Published
- 2017
14. Multisensory haptic interactions: understanding the sense and designing for it
- Author
- Hasti Seifi, Oliver Schneider, and Karon E. MacLean
- Subjects
- Focus (computing), Engineering, Multimedia, Kinesthetic learning, Virtual reality, Touchscreen, Joystick, User interface, Haptic technology - Abstract
Our haptic sense comprises both taction, the cutaneous information obtained through receptors in the skin, and kinesthetic awareness of body forces and motions. Broadly speaking, haptic interfaces to computing systems are anything a user touches or is touched by, to control, experience, or receive information from something with a computer in it. A keyboard and mouse, a physical button on a kitchen blender, and the glass touchscreen on your smartphone are energetically passive haptic interfaces: no external energy is pumped into the user's body from a powered actuator. Most readers will have encountered energetically active haptic feedback as a vibrotactile (VT) buzz, forces in a gaming joystick, a force-feedback device in a research lab, or a physically interactive robot. Much more is possible.
When we bring touch into an interaction, we invoke characteristics that are unique or accentuated relative to other modalities. Like most powerful design resources, these traits also impose constraints. The job of a haptic designer is to understand these "superpowers" and their costs and limits, and then to deploy them for an optimally enriched experience.
Both jobs are relatively uncharted, even though engineers have been building devices with the explicit intention of haptic display for over 25 years, and psychophysicists have been studying this rich, complex sense for as many decades. What makes it so difficult? Our haptic sense is really many different senses, neurally integrated; meanwhile, the technology of haptic display is anything but stable, with engineering challenges of a different nature than those for graphics and sound. In the last few years, technological advances from materials to robotics have opened new possibilities for the use of energetically active haptics in user interfaces, our primary focus here. Needs are exposed at a large scale by newly ubiquitous technology like "touch" screens crying out for physical feedback, and high-fidelity virtual reality visuals that are stalled in effectiveness without force display.
- Published
- 2017
15. HapTurk
- Author
- Hasti Seifi, Oliver Schneider, Matthew Chun, Karon E. MacLean, and Salma Kashani
- Subjects
- Communication design, Multimedia, Computer science, Affect (psychology), Crowdsourcing, Visualization, User studies, User experience design, Human–computer interaction, Perception, Haptic technology - Abstract
Vibrotactile (VT) display is becoming a standard component of informative user experience, where notifications and feedback must convey information eyes-free. However, effective design is hindered by incomplete understanding of relevant perceptual qualities, together with the need for user feedback to be accessed in-situ. To access evaluation streamlining now common in visual design, we introduce proxy modalities as a way to crowdsource VT sensations by reliably communicating high-level features through a crowd-accessible channel. We investigate two proxy modalities to represent a high-fidelity tactor: a new VT visualization, and low-fidelity vibratory translations playable on commodity smartphones. We translated 10 high-fidelity vibrations into both modalities, and in two user studies found that both proxy modalities can communicate affective features, and are consistent when deployed remotely over Mechanical Turk. We analyze fit of features to modalities, and suggest future improvements.
- Published
- 2016
16. Tactile Animation by Direct Manipulation of Grid Displays
- Author
- Ali Israr, Karon E. MacLean, and Oliver Schneider
- Subjects
- Computer science, Computer graphics (images), Interpolation (computer graphics), Wearable computer, Animation, Object (computer science), Grid, Graphics pipeline, Haptic technology, Abstraction (linguistics) - Abstract
Chairs, wearables, and handhelds have become popular sites for spatial tactile display. Visual animators, already expert in using time and space to portray motion, could readily transfer their skills to produce rich haptic sensations if given the right tools. We introduce the tactile animation object, a directly manipulated phantom tactile sensation. This abstraction has two key benefits: 1) efficient, creative, iterative control of spatiotemporal sensations, and 2) the potential to support a variety of tactile grids, including sparse displays. We present Mango, an editing tool for animators, including its rendering pipeline and perceptually-optimized interpolation algorithm for sparse vibrotactile grids. In our evaluation, professional animators found it easy to create a variety of vibrotactile patterns, with both experts and novices preferring the tactile animation object over controlling actuators individually.
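The "phantom tactile sensation" behind the tactile animation object is commonly rendered by splitting one target intensity across the two nearest actuators. Below is a minimal sketch of the standard energy-model interpolation (perceived intensity modeled as the sum of squared actuator amplitudes); this is a textbook formulation, not necessarily Mango's exact perceptually-optimized algorithm:

```python
import math

def phantom_intensities(pos, act_a, act_b, amplitude):
    """Drive two actuators so a phantom vibration is felt at `pos` on
    the segment between them. Energy model: a1 = sqrt(1-b)*A and
    a2 = sqrt(b)*A, where b is the normalized distance from act_a."""
    beta = math.dist(act_a, pos) / math.dist(act_a, act_b)
    return math.sqrt(1 - beta) * amplitude, math.sqrt(beta) * amplitude
```

Because the amplitudes are square roots of the linear weights, total vibration energy stays constant as the phantom point sweeps between the two actuators.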
- Published
- 2015
17. Exploring Embedded Haptics for Social Networking and Interactions
- Author
- Ali Israr, Siyan Zhao, and Oliver Schneider
- Subjects
- Multimedia, Social network, Computer science, Interface (computing), Wearable computer, Interpersonal communication, Virtual reality, Human–computer interaction, Mobile device, Mobile interaction, Haptic technology - Abstract
Haptic feedback is frequently used for user interactions with mobile devices, wearables, and handheld controllers in virtual reality and entertainment settings. We explore the use of vibrotactile (VT) feedback for social and interpersonal communication on embedded systems, particularly in a mobile context. We propose an architecture that supports compact packet communication between devices and triggers expressive VT patterns in a typical messenger application. We present a communication API, haptic vocabularies, and an interface for receiving and authoring haptic messages. Finally, we conclude with an informal survey for using haptics in a social setting.
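The "compact packet communication" the abstract mentions could look something like the sketch below; the 4-byte layout (pattern id, intensity, duration) is entirely hypothetical, since the paper's actual wire format is not given here:

```python
import struct

def encode_tacton(pattern_id, intensity, duration_ms):
    """Pack one vibrotactile cue into a fixed 4-byte message:
    1-byte pattern id, 1-byte intensity (0-255), 2-byte duration (ms),
    all big-endian ('network order')."""
    return struct.pack("!BBH", pattern_id, intensity, duration_ms)

def decode_tacton(packet):
    """Inverse of encode_tacton: recover (pattern_id, intensity, duration_ms)."""
    return struct.unpack("!BBH", packet)
```

A fixed, few-byte message like this is what makes triggering expressive VT patterns cheap enough for embedded devices in a messenger setting.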
- Published
- 2015
18. Toward a validated computing attitudes survey
- Author
- Allison Elliott Tew, Brian Dorn, and Oliver Schneider
- Subjects
- Medical education, Class (computer programming), Knowledge management, Iterative design, Computer science, Process (engineering), Expert consensus, Adaptation (computer science) - Abstract
The Computing Attitudes Survey (CAS) is a newly designed instrument, adapted from the Colorado Learning Attitudes about Science Survey (CLASS), for measuring novice to expert-like perceptions about computer science. In this paper we outline the iterative design process used for the adaptation and present our progress toward establishing the instrument's validity. We present results of think-aloud interviews and discuss procedures used to determine expert consensus for CAS items. We also detail results of a pilot of the instrument with 447 introductory students in Fall 2011 along with a preliminary factor analysis of this data. Findings to date show consistent interpretation of statements by faculty and students, establish expert consensus of opinion and identify eight candidate factors for further analysis.
- Published
- 2012
- Full Text
- View/download PDF
19. Towards frabjous
- Author
- Nathaniel D. Osgood, Christopher Dutchyn, and Oliver Schneider
- Subjects
- Domain-specific language, Computer science, Domain (software engineering), Software framework, Programming language specification, Reactive programming, Programming paradigm, Artificial intelligence, Software engineering, Functional reactive programming - Abstract
Agent-based infection-transmission models, which simulate an infection moving through a population, are being employed more frequently by health policy-makers. However, these models present several obstacles to widespread adoption. They are complex entities and impose a high development and maintenance cost. Current tools can be opaque, requiring multidisciplinary collaboration between a modeler and an expert programmer, and another round of translation when communicating with domain experts. In this paper, we describe the use of functional reactive programming (FRP), a programming paradigm created by imbuing a functional programming language with an intrinsic sense of time, to represent agent-based models in a concise and transparent way. We document the conversion of several agent-based models developed in the popular hybrid modeling tool AnyLogic to a representation in FRP. We also introduce Frabjous, a programming framework and domain-specific language for computational modeling. Frabjous generates human-readable and modifiable FRP code from a model specification, allowing modelers to have two transparent representations in which to program: a high-level model specification, and a full functional programming language with an agent-based modeling framework.
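Frabjous itself generates Haskell-style FRP code; as a language-neutral illustration of the underlying idea, that agent state at the next time step is a pure function of current state with no hidden mutation, here is a hypothetical time-stepped SIR infection model:

```python
import random

def sir_step(agents, beta, gamma, rng):
    """One pure update step of an agent-based SIR infection model.
    The new population is computed from the old one without in-place
    mutation, mirroring FRP's view of state as a function of time."""
    infected_frac = sum(1 for a in agents if a == "I") / len(agents)

    def update(state):
        if state == "S" and rng.random() < beta * infected_frac:
            return "I"   # susceptible agent becomes infected
        if state == "I" and rng.random() < gamma:
            return "R"   # infected agent recovers
        return state

    return [update(a) for a in agents]

# Seeded run: 5 initial infections in a population of 100.
rng = random.Random(1)
pop = ["I"] * 5 + ["S"] * 95
for _ in range(50):
    pop = sir_step(pop, beta=0.5, gamma=0.1, rng=rng)
```

Each simulated day is just `pop = sir_step(pop, ...)`, so the whole trajectory can be replayed or inspected step by step, which is the transparency the paper argues FRP buys over imperative agent loops.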
- Published
- 2012
20. Chalk sounds
- Author
- Carl Gutwin, Robert Xiao, Oliver Schneider, and Stephen Brewster
- Subjects
- Collaborative software, Multimedia, Computer science, Awareness information, Workspace, Radar, Visual awareness - Abstract
Awareness of other people's activity is an important part of shared-workspace collaboration, and is typically supported using visual awareness displays such as radar views. These visual presentations are limited in that the user must be able to see and attend to the view in order to gather awareness information. Using audio to convey awareness information does not suffer from these limitations, and previous research has shown that audio can provide valuable awareness in distributed settings. In this paper we evaluate the effectiveness of synthesized dynamic audio information, both on its own and as an adjunct to a visual radar view. We developed a granular-synthesis engine that produces realistic chalk sounds for off-screen activity in a groupware workspace, and tested the audio awareness in two ways. First, we measured people's ability to identify off-screen activities using only sound, and found that people are almost as accurate with synthesized sounds as with real sounds. Second, we tested dynamic audio awareness in a realistic groupware scenario, and found that adding audio to a radar view significantly improved awareness of off-screen activities in situations where it was difficult to see or attend to the visual display. Our work provides new empirical evidence about the value of dynamic synthesized audio in distributed groupware.
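Granular synthesis, as used by the engine above, builds a continuous texture by overlapping many short, windowed copies ("grains") of source audio. A minimal sketch with a Hann window follows; the real chalk engine's grain selection and scheduling are far more sophisticated:

```python
import math

def granular_mix(grain, onsets):
    """Overlay Hann-windowed copies of `grain` at the given sample
    offsets, summing wherever grains overlap, to form one output
    buffer of a continuous sonic texture."""
    n = len(grain)
    # Hann window tapers each grain to zero at both ends to avoid clicks.
    window = [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1))
              for i in range(n)]
    out = [0.0] * (max(onsets) + n)
    for onset in onsets:
        for i, sample in enumerate(grain):
            out[onset + i] += sample * window[i]
    return out
```

Densely spaced onsets of short chalk grains yield a sustained scratching sound whose intensity tracks the off-screen activity being sonified.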
- Published
- 2011
Discovery Service for Jio Institute Digital Library