GESPIN 2023 Nijmegen

GESPIN 2023 Nijmegen
Type: Conference
Categories (tags): Uncategorized
Dates: 2023/09/13 - 2023/09/15
Link: http://bit.ly/3WgsMfO
Address:
Geolocation: 51° 50' 42", 5° 50' 34"
Abstract due:
Submission deadline: 2023/03/15
Final version due:
Notification date: 2023/05/31
Tweet: Submissions for Gesture and Speech in Interaction 2023 Conference are open. Submissions close 15 Mar 2023. More information and how to submit here: http://bit.ly/3WgsMfO

Gesture and Speech in Interaction 2023 Nijmegen

Details:

GESPIN 2023

"Broadening perspectives, integrating views"

Location: Nijmegen

Date: Wed 13th - Fri 15th of September, 2023

Paper submission opens: January 10th, 2023

Paper submission deadline: March 15th, 2023

Notification of acceptance/rejection: end of May, 2023

Registration open: TBA

GeSpIn is an interdisciplinary event for researchers working on the interaction between speech and visual communicative signals, such as articulatory, manual, and bodily gestures co-occurring with speech. At GeSpIn 2023 we hope to bring together researchers studying visual signals in combination with vocalization or speech, from multidisciplinary perspectives, in order to exchange ideas and present the cutting edge of their fields. This 8th edition of GeSpIn will be held in Nijmegen, the Netherlands, and will focus on the theme of “Broadening Perspectives, Integrating Views: Towards General Principles of Multimodal Signaling Systems”.

As such, we encourage researchers working in (multimodal) prosody, social anthropology, philosophy, (psycho)linguistics, psychology, cognitive science, neuroscience, human movement science, computer science (e.g., human-computer interaction), comparative biology, and related fields to submit their research addressing topics such as:

- Do principles of speech-gesture interaction generalize to, or interact with, other multimodal interactions and forms of audiovisual integration (e.g., speech interacting with head gestures or facial signals)?

- What methods in computer science can be used to characterize and synthesize the (temporal) interactions between speech and gesture, within and between agents?

- How is speech-gesture coupling influenced by the immediate dialogic context (e.g., the behavior of the interlocutor, or the speech act being performed)?

- Can multimodal signaling as studied in non-human animals teach us something fundamental about multimodal communication systems that also applies to humans?

- What can cross-linguistic comparisons of speech-gesture interaction teach us about the underlying principles of multimodal coordination?

- Development of gesture-speech coordination: Can general principles of development be identified? Are there sensitive periods and developmental stages?

- What is the role of basic biomechanical or neural processes in visual and auditory signaling and the perception of said multimodal signals?

Please note that all researchers and theoreticians/philosophers working on the interaction between gestural/visual cues and sound-producing cues (e.g., in terms of pragmatics, prosody, semantics) should feel invited, even if their particular study does not fit these topics exactly.

Organizers

Wim Pouw & James Trujillo (main contacts: wim.pouw@donders.ru.nl / james.trujillo@donders.ru.nl)

Hans Rutger Bosker

Linda Drijvers

Marieke Hoetjes

Judith Holler

Lieke van Maastricht

Asli Ozyurek