Socially-Assistive Robots for Dementia Care on Embodied Voice Assistant (Eva)
Overview
The Eva platform is an open-source social robotics platform that provides the resources to build a social robot and its interactions cheaply, quickly, and easily.
Keywords
EVA-SIM, multimodal data, dementia, HCI, HRI, Embodied Voice Assistant (EVA)
Getting started
Welcome to EvaPlatform! We're thrilled that you've decided to join our community. Our goal is to provide you with an exceptional experience every time you use our platform. Don't hesitate to reach out if you have any questions or feedback.
Build your IoT prototypes faster with JavaScript and no servers. You can find some interesting and useful notebooks on GitHub or full experiments on EvaStore. Test your ideas with EvaSim. Ready? Build your own Evabot following the instructions in the official Eva repository.
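As a taste of the notebook workflow, here is a minimal sketch that publishes a command over MQTT on WebSockets, one of the protocols listed in the platform WBS below. The broker URL, topic, and payload format are assumptions for illustration only, not the actual Evabot conventions.

```javascript
// Hedged sketch: broker, topic, and payload are assumptions, not the
// real EvaNotebook/Evabot conventions.
const mqtt = require("mqtt"); // npm package "mqtt"

// MQTT over WebSockets, so the same code can also run in a browser notebook.
const client = mqtt.connect("wss://test.mosquitto.org:8081"); // public test broker (assumption)

client.on("connect", () => {
  // Hypothetical topic/payload asking Evabot to speak a greeting.
  client.publish(
    "eva/commands",
    JSON.stringify({ type: "talk", text: "Hello, I am Eva." })
  );
  client.end();
});
```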
Would you like to contribute to the project(s)?
https://github.com/sanchezcarlosjr/evasim/
https://github.com/sanchezcarlosjr/javascript-notebook
Installation
Issues
Code mobility
Naming
Database
Communication
Goals
Our main goal is to create an EVA platform to aid patients with dementia. To achieve this, we are currently focused on developing an EVA simulator, drawing on the research in [5] and [6] and on tools such as Blender, 3D printing, WebGL, and other relevant technologies. Because researchers need to iterate quickly, we introduce EvaSim [7], a simulator that enables fast prototyping.
The development of the simulator will be followed by rigorous testing and evaluation of our system to ensure its efficacy in aiding dementia patients.
Methodology
The software development process of this project is user-centered and RUP-based; this involves understanding the people who use our systems and their needs before writing a single line of code.
We will identify user needs by studying the discourse of the domain, observing user behavior in the current workflow, and prototyping wherever we have uncertainty or doubts.
So, our goals are to understand the language of the available literature on HCI, Ambient Intelligence, and Social Assistance; print and assemble a robot model to identify improvement opportunities using the 3D printer and Blender; document the available software; write down the system requirements; develop Eva-SIM [7][8], which has been built in Brazil; and collaborate with other interested groups.
Theory
HCI
Computer simulations
Robots as therapy
We use robots to assist people; we call such robots Socially Assistive Robots (SARs). Eva is a low-cost hardware SAR that assists patients with dementia and geriatric cognitive disorders through non-pharmacological interventions. Today, Evabot has grown into the Eva platform, the open-source social robotics ecosystem built around it: EvaSim, Evabot, Eva Language, Evaweb, and so on.
Some of Evabot's multimodal interaction capabilities are Text-To-Speech (you can choose vendors such as Google Cloud or IBM Watson), facial expression recognition, light sensory effects, a Wizard of Oz mode, and connecting to IoT devices.
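As a minimal sketch of the Text-To-Speech capability, the example below substitutes the browser's Web Speech API for the cloud vendors named above, so the idea can be tried without any API keys; the phrase and voice settings are illustrative only.

```javascript
// Stand-in sketch: Evabot uses cloud TTS vendors (Google Cloud, IBM Watson);
// here the browser's built-in Web Speech API plays that role.
const utterance = new SpeechSynthesisUtterance("Hello! Shall we play the Game of Colors?");
utterance.lang = "en-US"; // Evabot's TTS vendors also support other locales
utterance.rate = 0.9;     // slightly slower speech (illustrative choice)
window.speechSynthesis.speak(utterance);
```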
Serious games
The imitation game
Game of Colors
The game of emotions
Construction. Team versions.
Ecosystem
WBS
@startwbs
* Eva Platform
** User-Centered Design Process
*** RUP
*** Prototypes
** Teams
*** CICESE
*** Universidade Federal Fluminense (UFF)
*** Universidad de Castilla-La Mancha (mamilab)
*** UCSD
** EvaNotebook
*** No servers. Please don't confuse it with serverless.
*** Angular
*** EditorJS
*** Capabilities
**** Send emails
**** Send SMS
**** Call people
**** Control EVA and environment
**** Connect to ChatGPT
**** Connect to APIs
**** Web Workers
**** Collaborative notebook
**** WebAssembly
**** Blockchain?
**** Decentralized database
**** Observable Plot
*** Protocols
**** MQTT
**** WebSockets
**** HTTP
**** WebRTC
** EvaSim
*** Unity
*** 3D
*** WebRTC
** EvaLanguage
*** Visual Programming Language
*** Google Blockly
*** JSON Domain-Specific Language
** Evabot
*** 3D printing
**** Blender
*** Software components
**** New version
***** MQTT
***** Rhasspy
**** MEAN stack
***** MongoDB
***** Express
***** AngularJS
***** NodeJS
***** PM2 (i.e. process manager)
***** Interactions by VPL
**** Computer vision
***** Mediapipe
**** Voice and natural language processing
***** IBM Watson
***** Dialogflow by Google
***** Telegram API
***** The Wizard of Oz
*** Hardware components
**** Amoled Touch Display
**** Raspberry Pi 4
**** ArbotiX-M
**** Two servomotors
**** Bluetooth speaker
**** Matrix Voice
**** Webcam
**** Smart Bulb
@endwbs
CICESE
Capabilities
Eva Language
VPL Blocks
https://rhasspy.readthedocs.io/en/latest/
https://github.com/matrix-io/matrixio-models/blob/master/matrix-voice/matrix-voice.stl
Eva Sim
Manuals
https://github.com/midiacom/eva-robot
Mockups
Project vision
We would like to support EVA researchers in training, programming, and testing new interactions.
Observations
What is the current workflow for researchers to develop new interactions?
Without a server or broker, the current system performs slowly, leading to user frustration. To improve the user experience, faster prototyping is essential. Additionally, users interact most frequently through computer vision, voice, and gestures, so the system must be optimized for these interactions.
Although the system could support multiple programming languages, we’ve chosen VPL due to its ease of use for developers.
Prototypes
v1
https://www.figma.com/file/w63waPDDeIInLPiUpxQtM9/EVA?node-id=0%3A1&t=4USALfnKimt3aYEr-1
v3
- EvaNotebook. Build your scenarios faster with no servers.
- EvaSim 3D. Prototype your scenarios on a platform that works right out of the box.
- EvaLanguage. Control everything with custom scenarios.
- EvaStore. Share your ideas and experiments.
- EvaBot. Build a smart and affective assistant with no pain.
References
[14]
Fluminense Federal University (UFF)
[11]
Capabilities
Due to the risk of damaging the assembled robot and the slow feedback loop, Marcelo Marques da Rocha proposed a simulator that interprets the Eva language and emulates Evabot's capabilities [7][11]. This simulator is intended for developers, HRI researchers, and enthusiasts who need to test their real-world scenarios safely and quickly, enabling better training.
To summarize, the goal of the Eva Sim project, and therefore our goal, is to develop a simulator that allows Eva developers to prototype, learn, and iterate more quickly.
graph LR
JSON["JSON (Eva Language)"] --> EvaSimParser["EvaSim Parser"]
EvaSimParser --> XML["XML"]
XML --> EvaSimModule["EvaSim Execution Module"]
EvaSimModule --> UserInterface["User Interface"]
For example, on the simulator you can run serious games such as The Imitation Game (not the movie), the Game of Colors, and the Game of Emotions easily, with the same behavior as on Evabot.
EvaSim only lets you visualize what is going on; you cannot program on it.
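The parsing step in the pipeline above can be sketched in a few lines of JavaScript. This is a hypothetical example: the real Eva Language schema and its XML output are defined by the UFF team [7][11]; the command names and XML layout below are assumptions made only to illustrate the JSON-to-XML direction of the pipeline.

```javascript
// Hypothetical Eva Language script: field names ("talk", "lightEffect", ...)
// are illustrative, not the real schema from [7][11].
const script = {
  name: "game-of-emotions-demo",
  commands: [
    { type: "talk", text: "Hello! Let's play the Game of Emotions." },
    { type: "lightEffect", color: "blue", state: "on" }
  ]
};

// Minimal JSON -> XML conversion, mirroring the "EvaSim Parser" step above.
function toEvaXML({ name, commands }) {
  const body = commands
    .map(({ type, ...attrs }) => {
      const attributes = Object.entries(attrs)
        .map(([key, value]) => `${key}="${value}"`)
        .join(" ");
      return `  <${type} ${attributes}/>`;
    })
    .join("\n");
  return `<script name="${name}">\n${body}\n</script>`;
}

console.log(toEvaXML(script));
```

In the real simulator, the Execution Module then walks the generated XML and drives the user interface, as shown in the diagram.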
TODO: Eva Sim user experience evaluation.
EvaML (EVA Markup Language)
EvaML is an XML vocabulary in the Eva Language domain, produced from the JSON representation by another component. EvaML is deprecated.
Universidad de Castilla-La Mancha
[12]
UCSD
Requirements
We are migrating the current version of EvaSim to the Unity platform on the Web (WebGL) [10]. This will eliminate the need for researchers to go through an installation process, allowing for immediate use. Faster feedback from patients is crucial for research purposes.
Researchers require the ability to visualize the 3D robot and execute interactions in their preferred programming language. Additionally, they need to load customized environments, including keys for services such as Dialogflow, IBM Watson, ChatGPT, and others. EvaSim must support all features available on Evabot.
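A hypothetical sketch of such a customized environment is shown below; the field names are assumptions made to illustrate the requirement and do not reflect an actual EvaSim configuration schema.

```javascript
// Illustrative environment object for EvaSim on the Web (not the real schema).
// Researchers would load something like this instead of installing anything.
const environment = {
  language: "es-MX",
  services: {
    dialogflow: { projectId: "<your-project-id>", key: "<service-account-key>" },
    ibmWatson: { apiKey: "<api-key>", url: "<instance-url>" },
    chatgpt: { apiKey: "<api-key>" }
  }
};

console.log("Environment ready for", Object.keys(environment.services));
```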
Researchers aim to connect to EvaSim through automation tasks such as "The Game of Emotions" and execute interactions automatically. However, due to cost and performance considerations, external servers are not allowed, so EvaSim must run entirely on the client side. To meet these constraints, we propose a Terminal based on WebRTC and WebSockets, which lets EvaSim be driven without user interaction while staying on the client side.
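A minimal sketch of that idea follows, assuming a hypothetical local WebSocket endpoint and message format; neither is the actual EvaSim protocol.

```javascript
// Hypothetical Terminal sketch: endpoint and message shape are assumptions.
const socket = new WebSocket("ws://localhost:9000/evasim"); // assumed local endpoint

socket.addEventListener("open", () => {
  // Ask the simulator to load and run an interaction automatically.
  socket.send(JSON.stringify({ action: "run", interaction: "game-of-emotions" }));
});

socket.addEventListener("message", (event) => {
  // EvaSim could report progress and events back to the automation script.
  console.log("EvaSim event:", event.data);
});
```

Driving the simulator through a plain WebSocket (or a WebRTC data channel) keeps the automation script, the simulator, and the data on the same machine, which is exactly what the no-external-server constraint requires.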
Continuous integration
https://game.ci/docs/github/activation
Testing and evaluations
Conclusions
References
[1] Dix, Alan J., Janet E. Finlay, et al. Human-Computer Interaction. 2nd ed. Prentice Hall, 1998. ISBN: 9780132398640.
[2] Olsen, Dan R. Developing User Interfaces (Interactive Technologies). Morgan Kaufmann, 1998. ISBN: 9781558604186.
[3] Cook, Albert M., and Jan Miller Polgar. Cook & Hussey’s Assistive Technologies: Principles and Practice. 3rd ed. Mosby Elsevier, 2007. ISBN: 9780323039079.
[4] Item 1007/3283 | Repositorio CICESE. (2020, August 06). Retrieved from https://cicese.repositorioinstitucional.mx/jspui/handle/1007/3283
[5] Raskin, Jef. The Humane Interface: New Directions for Designing Interactive Systems. Addison-Wesley Professional, 2000. ISBN: 9780201379372.
[6] D. Cruz-Sandoval and J. Favela, "A Conversational Robot to Conduct Therapeutic Interventions for Dementia," in IEEE Pervasive Computing, vol. 18, no. 2, pp. 10-19, 1 April-June 2019, doi: 10.1109/MPRV.2019.2907020.
[7] M. Marques da Rocha, D. Cruz-Sandoval, J. Favela and D. C. Muchaluat-Saade, "EvaSIM: a Software Simulator for the EVA Open-source Robotics Platform," 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Napoli, Italy, 2022, pp. 714-721, doi: 10.1109/RO-MAN53752.2022.9900561.
[8] EVA - An open-source social robotics platform. (2022, December 20). Retrieved from https://eva-social-robot.github.io
[9] google. "mediapipe." GitHub, 23 Jan. 2023, github.com/google/mediapipe.
[10] Technologies, Unity. "Unity - Manual: Building your WebGL application." 20 Jan. 2023, docs.unity3d.com/Manual/webgl-building.html.
[11] midiacom. "eva-robot." GitHub, 29 Jan. 2023, github.com/midiacom/eva-robot.
[12] Villa, L., Hervás, R., Dobrescu, C.C., Cruz-Sandoval, D., Favela, J. (2022). Incorporating Affective Proactive Behavior to a Social Companion Robot for Community Dwelling Older Adults. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds) HCI International 2022 – Late Breaking Posters. HCII 2022. Communications in Computer and Information Science, vol 1655. Springer, Cham. https://doi.org/10.1007/978-3-031-19682-9_72
[13] Lozano, E.A., Sánchez-Torres, C.E., López-Nava, I.H., Favela, J. (2023). An Open Framework for Nonverbal Communication in Human-Robot Interaction. In: Bravo, J., Urzáiz, G. (eds) Proceedings of the 15th International Conference on Ubiquitous Computing & Ambient Intelligence (UCAmI 2023). UCAmI 2023. Lecture Notes in Networks and Systems, vol 842. Springer, Cham. https://doi.org/10.1007/978-3-031-48642-5_3