McIlvenny2020 | |
---|---|
BibType | ARTICLE |
Key | McIlvenny2020 |
Author(s) | Paul McIlvenny |
Title | New Technology and Tools to Enhance Collaborative Video Analysis in Live 'Data Sessions' |
Editor(s) | |
Tag(s) | EMCA, Data session, Ethnomethodological conversation analysis, Audio-visual technology, Qualitative research, Digital humanities, Immersive qualitative analytics, Virtual reality |
Publisher | |
Year | 2020 |
Language | English |
City | |
Month | |
Journal | QuiViRR: Qualitative Video Research Reports |
Volume | 1 |
Number | |
Pages | |
URL | |
DOI | 10.5278/ojs.quivirr.v1.2020.a0001 |
ISBN | |
Organization | |
Institution | |
School | |
Type | |
Edition | |
Series | |
Howpublished | |
Book title | |
Chapter | |
Abstract
The live ‘data session’ is arguably a significant collaborative practice amongst a group of co-present colleagues that has sustained the fermentation of emerging analyses of interactional phenomena in ethnomethodological conversation analysis for several decades. There has not, however, been much in the way of technological innovation since its inception. In this article, I outline how the data session can be enhanced (a) by using simple technologies to support the ‘silent data session’, (b) by developing software tools to present, navigate and collaborate on new types of video data in novel ways using immersive virtual reality technologies, and (c) by supporting distributed version control to nurture the freedom and safety to collaborate synchronously and asynchronously on the revision of a common transcript used in a live data session. Examples of real cases, technical solutions and best practices are given based on experience. The advantages and limitations of these significant enhancements are discussed in methodological terms with an eye to future developments.
Notes