PUBLICATION
Exploring an Architecture of an Adaptable GenAI-Driven Multimodal User Interface for Third-Party Systems
Type
Workshop Paper
Year
2024
Authors
Tobias Münch, Martin Gaedke
Research Area
Event
Mensch und Computer 2024 (MuC)
Published in
Mensch und Computer 2024 - Workshopband
Abstract
Nowadays, Generative Artificial Intelligence (GenAI) can outperform humans in creative professions, such as design. As a result, GenAI has attracted considerable attention from researchers and industry. However, GenAI could also be used to augment humans with a multimodal user interface, as proposed by Ben Shneiderman in his recent work on Human-Centred Artificial Intelligence (HCAI). Most studies of HCAI have focused on greenfield projects. In contrast to existing research, we describe a brownfield software architecture approach with a loosely coupled GenAI-driven multimodal user interface that combines human interaction with third-party systems. A domain-specific language for user interaction connects natural language and the signals of the existing system through GenAI. Our proposed architecture enables research and industry to provide user interfaces for existing software systems that allow hands-free interaction.
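To illustrate the idea described in the abstract, the following minimal Python sketch shows one possible reading of the architecture: a GenAI component translates a natural-language utterance into a structured domain-specific command, which an adapter then maps onto signals of an existing third-party system. All names (DslCommand, interpret_utterance, ThirdPartyAdapter) and the rule-based stand-in for the GenAI step are illustrative assumptions, not taken from the paper.

from dataclasses import dataclass
from typing import Callable, Dict

# Hypothetical DSL command: a structured intermediate representation that a
# GenAI model could emit from a natural-language utterance.
@dataclass
class DslCommand:
    action: str
    target: str
    parameters: Dict[str, str]

# Stand-in for the GenAI translation step; a real system would call a
# multimodal model here and constrain its output to the DSL grammar.
def interpret_utterance(utterance: str) -> DslCommand:
    if "start" in utterance.lower():
        return DslCommand(action="start", target="conveyor", parameters={})
    return DslCommand(action="status", target="conveyor", parameters={})

# Adapter layer: maps DSL commands onto signals of the existing third-party
# system without modifying that system (brownfield, loosely coupled).
class ThirdPartyAdapter:
    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[DslCommand], str]] = {
            "start": lambda cmd: f"signal START sent to {cmd.target}",
            "status": lambda cmd: f"signal STATUS requested from {cmd.target}",
        }

    def dispatch(self, command: DslCommand) -> str:
        return self._handlers[command.action](command)

if __name__ == "__main__":
    adapter = ThirdPartyAdapter()
    command = interpret_utterance("Please start the conveyor")
    print(adapter.dispatch(command))  # -> signal START sent to conveyor

The sketch only fixes the role of the DSL as the loose coupling point; how the GenAI model is prompted and how signals are delivered to the third-party system are left open, as in the abstract.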
Reference
Münch, Tobias; Gaedke, Martin: Exploring an Architecture of an Adaptable GenAI-Driven Multimodal User Interface for Third-Party Systems. Mensch und Computer 2024 - Workshopband, 2024.