Hall2024

BibType ARTICLE
Key Hall2024
Author(s) Lauren Hall, Saul Albert, Elizabeth Peel
Title Doing Virtual Companionship with Alexa
Editor(s)
Tag(s) EMCA, Companions, Conversation analysis, Discursive psychology, Dementia, Virtual assistants, AI Reference List
Publisher
Year 2024
Language English
City
Month
Journal Social Interaction: Video-Based Studies of Human Sociality
Volume 7
Number 3
Pages
URL Link
DOI 10.7146/si.v7i3.150089
ISBN
Organization
Institution
School
Type
Edition
Series
Howpublished
Book title
Chapter

Abstract

Technologists often claim that virtual assistants, e.g., smart speakers, can offer 'smart companionship for independent older people'. However, the concept of companionship manifested by such technologies is rarely explained further. Studies of virtual assistants as assistive technologies have tended to conceptualise companionship as a 'special form of friendship' or as a way of strengthening 'psychological wellbeing' and 'emotional resilience'. While these abstractions can be measured using psychological indices or self-report, they are not necessarily informative about how 'virtual companionship' may be performed in everyday interaction. This case study focuses on how a virtual assistant is used by a person living with dementia and asks to what extent it takes on a role recognizable, from interactional studies, as 'doing companionship'. We draw on naturalistic video data featuring a person living with dementia in her own home using a smart speaker. Our results show how actions such as complaints about and blamings directed towards the device are achieved through shifts of 'footing' between turns that are ostensibly 'talk to oneself' and turns designed to occasion a response. Our findings have implications for the design, feasibility, and ethics of virtual assistants as companions, and for our understanding of the embedded ontological assumptions, interactive participation frameworks, and conversational roles involved in doing companionship with machines.

Notes