Gardner2010a

From emcawiki

Latest revision as of 11:28, 25 November 2019
BibType ARTICLE
Key Gardner2010a
Author(s) Rod Gardner, Mike Levy
Title The coordination of talk and action in the collaborative construction of a multimodal text
Tag(s) EMCA, Language, Action, Collaboration, Kinetic Behavior, Nonverbal, Gesture
Year 2010
Journal Journal of Pragmatics
Volume 42
Number 8
Pages 2189–2203
URL http://www.sciencedirect.com/science/article/pii/S0378216610000196
DOI 10.1016/j.pragma.2010.01.006


Abstract

This paper explores how speech and action are coordinated in a web-based task undertaken by two high school students working collaboratively at the computer. The paper focuses on the coordination involved in the interactions between the two students and the computer screen, keyboard, and mouse, and explores the temporal synchrony and ‘matching’ points between speaking and typing, and speaking and mouse movements, within and between participants. Examples include coordination of speaking words aloud whilst typing, coordination of reading aloud from the screen and mouse movements, and coordination between participants, as when one individual is typing and the other talking. The discussion draws on the literature describing the coordination of language and action, kinesic behaviour, and nonverbal communication, including gesture, which have the potential to mediate conversation. Results indicate most coordination of talk and action is at the beginning of the action. Sometimes work is done to ensure coordination, either by slowing down the talk or pausing or stretching sounds mid-utterance. Talk that is coordinated temporally to some action on the screen is precise; in other words even when action and talk are mismatched (e.g., she is not talking about what she is doing), talk and action can start and finish together.

Notes