Exploring Mobile Touch Interaction with Large Language Models
- Publication Year :
- 2025
-
Abstract
- Interacting with Large Language Models (LLMs) for text editing on mobile devices currently requires users to break out of their writing environment and switch to a conversational AI interface. In this paper, we propose to control the LLM via touch gestures performed directly on the text. We first chart a design space that covers fundamental touch input and text transformations. In this space, we then concretely explore two control mappings: spread-to-generate and pinch-to-shorten, with visual feedback loops. We evaluate this concept in a user study (N=14) that compares three feedback designs: no visualisation, text length indicator, and length + word indicator. The results demonstrate that touch-based control of LLMs is both feasible and user-friendly, with the length + word indicator proving most effective for managing text generation. This work lays the foundation for further research into gesture-based interaction with LLMs on touch devices.
- Comment :
- 21 pages, 16 figures, 3 tables, ACM CHI 2025
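To make the spread-to-generate and pinch-to-shorten mappings concrete, here is a minimal sketch of how a two-finger gesture could be translated into an LLM length-transformation request. This is an illustrative assumption, not the paper's actual implementation; all names (gestureScale, targetWordCount, buildPrompt) are hypothetical.

```typescript
// Hypothetical sketch of gesture-to-LLM control, not the authors' code.

type Point = { x: number; y: number };

function distance(a: Point, b: Point): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Scale > 1 means the fingers spread apart (generate/expand);
// scale < 1 means they pinched together (shorten).
function gestureScale(start: [Point, Point], end: [Point, Point]): number {
  return distance(end[0], end[1]) / distance(start[0], start[1]);
}

// Map the gesture scale to a target word count for the selected text.
function targetWordCount(currentWords: number, scale: number): number {
  return Math.max(1, Math.round(currentWords * scale));
}

// Build a prompt asking the LLM to expand or shorten toward the target.
// While the gesture is in progress, the UI could preview this target via
// a length indicator, as in the feedback designs compared in the study.
function buildPrompt(text: string, target: number): string {
  const currentWords = text.trim().split(/\s+/).length;
  const verb = target > currentWords ? "Expand" : "Shorten";
  return `${verb} the following text to roughly ${target} words, ` +
         `preserving its meaning:\n\n${text}`;
}
```

Under this assumed design, the prompt is only sent once the fingers lift, so the user can scrub the gesture back and forth to adjust the target length before committing.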
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2502.07629
- Document Type :
- Working Paper
- Full Text :
- https://doi.org/10.1145/3706598.3713554