Here we present the experimental tasks and questionnaires described in our pre-print: 'Perceptual prioritisation of self-associated voices', available at https://psyarxiv.com/xdw6t/. All tasks presented here are for Experiment 1 only. (Experiment 2: https://gorilla.sc/openmaterials/46084; Experiment 3: https://gorilla.sc/openmaterials/46086).
An adapted online version of Sui et al.'s (2012) perceptual matching paradigm, using both female and male voice stimuli. Participants are first familiarised with three different voices and their associations to three different social identities (self, friend, other), and then make speeded judgements about voice-identity pairings. Both reaction times and accuracy are collected.
Tasks include:
1. Headphone check: detects and screens out participants who are listening to the auditory stimuli over loudspeakers rather than through headphones (as instructed). Participants hear 12 sets of three tones and must select the quietest tone in each set. The screening task was designed and validated by Woods et al. (2017), available at https://doi.org/10.3758/s13414-017-1361-2.
2. Familiarisation task: Participants are passively exposed to three different male voices (auditory stimuli) alongside an identity label (visual stimuli: text label, either YOU, FRIEND, or STRANGER).
3. Perceptual matching task: Pairs of stimuli are presented (a voice with an identity label) and participants make speeded judgements about whether the pairs are correctly matched or not. The visual stimulus is presented only once the auditory stimulus has finished, after a 500ms delay. Participants respond via a left/right keyboard press. Trials are time-restricted. Reaction times and accuracy are recorded, and feedback is provided on each trial. The task includes six-way counterbalancing, ensuring that the six possible combinations of a voice and an identity are represented equally and randomised across participants.
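The six-way counterbalancing described above can be sketched in code: the six counterbalance groups correspond to the 3! = 6 possible assignments of three voices to three identity labels, and each participant's trial list mixes matched and mismatched voice-label pairings in randomised order. This is a minimal illustrative sketch, not the actual Gorilla task script; the stimulus names, repetition counts, and group-numbering scheme are assumptions.

```python
import itertools
import random

# Placeholder stimulus names; the real materials use recorded voice clips.
VOICES = ["voice_A", "voice_B", "voice_C"]
LABELS = ["YOU", "FRIEND", "STRANGER"]

# Six-way counterbalancing: the 3! = 6 permutations of voice-to-label
# assignment. Each participant is allocated one permutation by group number.
ASSIGNMENTS = list(itertools.permutations(VOICES))

def voice_label_map(group):
    """Return this participant's voice-to-identity mapping (groups 0-5)."""
    return dict(zip(LABELS, ASSIGNMENTS[group % 6]))

def make_trials(group, reps=2, seed=None):
    """Build a randomised trial list of matched and mismatched pairings.

    `reps` (an assumed parameter) controls how many matched trials each
    identity receives; each label is also paired once with each of the
    two non-matching voices to create mismatched trials.
    """
    mapping = voice_label_map(group)
    trials = []
    for label, voice in mapping.items():
        for _ in range(reps):
            trials.append({"voice": voice, "label": label, "match": True})
        for other in VOICES:
            if other != voice:
                trials.append({"voice": other, "label": label, "match": False})
    random.Random(seed).shuffle(trials)
    return trials
```

Because the six groups exhaust all permutations, every voice serves as "self", "friend", and "stranger" equally often across participants, so identity effects cannot be explained by acoustic differences between the voices.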
Fully open! Accessible by URL and searchable from the Open Materials search page.