Journal ArticleDOI

Alexithymia explains atypical spatiotemporal dynamics of eye gaze in autism

TL;DR: In this article, a multilevel polynomial modeling strategy was used to describe the spatiotemporal dynamics of eye gaze to emotional facial expressions, and it was shown that atypical gaze to the eyes is best predicted by alexithymia in both autistic and non-autistic individuals.
About: This article is published in Cognition. The article was published on 2021-03-22 and is currently open access. It has received 11 citations to date. The article focuses on the topics: Alexithymia & Facial expression.

Summary (6 min read)

1. Introduction

  • Autism Spectrum Disorder (henceforth ‘autism’) is a condition defined by atypical social interaction and communication, and restricted patterns of thought and behaviour (APA, 2013).
  • While the exact mechanisms linking these symptoms have not been determined, two hypotheses specify a role for arousal.
  • Such redundancies may explain why allocation of gaze to facial features is not always predictive of emotion recognition, particularly with dynamic stimuli.
  • Under the alexithymia hypothesis, inconsistent results when testing various socioemotional abilities in the autism literature are due to sampling variance with respect to alexithymia.

1.2.1. Spatiotemporal dynamics

  • In addition to a failure to consider co-occurring alexithymia, there are at least three potentially significant methodological limitations of previous studies.
  • This practice does not adequately describe gaze behaviour, which is a dynamic process serving to optimise eye movements for sequential selection of relevant visual information (Land, 1999).
  • While it is possible that differences between groups in the timecourse of gaze allocation to facial expressions are captured by aggregate metrics, some divergence between aggregate and timecourse metrics is to be expected because dynamic face stimuli convey information which varies in its spatial location over time.
  • Non-linearities in the distribution of gaze allocation are likely in response to more naturalistic and moving stimuli (Tatler et al., 2011).
  • Importantly, these data (e.g., gaze data in visual world studies) have highlighted that the underlying distribution of gaze is not always well captured in the aggregate (“average”) pattern of gaze data collapsed across time (Barr, 2008; Mirman et al., 2008; Seedorff, 2018).
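The divergence between aggregate and timecourse metrics can be made concrete with a toy sketch (hypothetical proportions, not the study's data): two groups can share an identical aggregate gaze proportion while following opposite trajectories across the viewing window.

```python
# Toy illustration: identical aggregate gaze proportions can hide
# opposite temporal trajectories (hypothetical values, not study data).

# Proportion of gaze to the eyes in ten successive time bins.
group_a = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.0]  # early eye preference
group_b = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]  # late eye preference

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)

# Aggregating across the viewing window makes the groups indistinguishable...
print(mean_a, mean_b)  # both 0.45

# ...whereas a per-bin comparison reveals the divergence.
divergence = [abs(a - b) for a, b in zip(group_a, group_b)]
print(max(divergence))  # 0.9, in the first and last bins
```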

1.2.2. Eye-movement differences and data quality

  • Eye movements are typically parsed into fixations (to gaze targets) and saccades.
  • To do so, regions of the visual scene are designated as ‘Areas of Interest’ (AOIs; e.g., the eye-region of a face).
  • It is important to situate any theory of gaze allocation to emotional facial expressions within wider theories of the function of gaze and visual perception.
  • Given the demonstrated dependence of gaze patterns on the task to be performed, thought to be mediated by task-dependent perceptual predictions, it is perhaps unsurprising that different patterns of results have been found across these studies.
  • Under this hypothesis, autistic participants may be expected not to modulate their gaze behaviour based on task requirements or priors about the stimuli, or not to do so to the same degree that neurotypicals do (Król & Król, 2019).

1.4. The present study and experimental hypotheses

  • The overarching goal of this study was to examine whether atypical eye gaze to emotional facial expressions is a product of alexithymia or autism.
  • The authors also aimed to address previous methodological issues in several ways.
  • To investigate the role of task- and emotion-related priors on gaze behaviour, each participant's gaze was recorded when they were, and were not, cued to the upcoming emotion, and when they were asked to complete a range of tasks (free-gaze, emotion recognition, and intensity judgements).
  • It can therefore be determined whether any autism- or alexithymia-related atypicality is dependent upon task parameters, or instead a general feature of gaze behaviour.

2. Methods

2.1. Participants

  • 75 participants were recruited from a database of research volunteers and compensated with either an honorarium or course credit.
  • Data from 45 neurotypical individuals and 25 individuals diagnosed with autism were therefore included in the final sample (for details see Table 1 and Table S.2. in Supplementary materials).
  • Participants were diagnosed by specialist clinicians, all of whom were independent from the research team.
  • As part of their inclusion in the participant database used for recruitment in this study, participants were assessed using the original ADOS algorithm administered by a certified professional.
  • On the day of the study, participants were tested individually in a soundproofed, dimly-lit room, and took short breaks between tasks.

2.2.1. Dynamic Emotional Face Paradigm

  • The dynamic emotional face paradigm consisted of four conditions in which participants were shown videos of naturalistic dynamic emotional facial expressions from a validated dataset (Yitzhak et al., 2017).
  • 192 trials were presented, 48 in each condition (free-gaze, emotion recognition, cued, and intensity judgement), eight of each emotion (neutral, happy, sad, angry, fear, disgust) displayed by four male and four female actors.
  • Note that these analyses do not control for alexithymia.
  • This was supplemented by a short system delay of random length at the end of each trial sequence to allow for time-critical logging and trial preparation.
  • The experiment was programmed and presented in PyGaze (Dalmaijer, Mathôt, & Van der Stigchel, 2014).
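The trial structure described above can be reconstructed combinatorially; the following sketch (actor labels are placeholders, not the identities used in the study) verifies the design arithmetic.

```python
from itertools import product

conditions = ["free-gaze", "emotion recognition", "cued", "intensity judgement"]
emotions = ["neutral", "happy", "sad", "angry", "fear", "disgust"]
actors = [f"actor_{i}" for i in range(1, 9)]  # four male and four female actors

# One trial per condition x emotion x actor: 4 x 6 x 8 = 192 trials,
# i.e. 48 per condition, with eight of each emotion within a condition.
trials = list(product(conditions, emotions, actors))

print(len(trials))                                                 # 192
print(sum(1 for c, _, _ in trials if c == "cued"))                 # 48
print(sum(1 for c, e, _ in trials if c == "cued" and e == "sad"))  # 8
```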

2.2.2. Eye-tracking

  • A Tobii TX300 (Tobii, Sweden) screen-based remote eye-tracker was used with the original screen, sampling at 300Hz and tracking an area of 1920 × 1080 pixels, managed by an experienced operator.
  • A 5-point calibration was administered at the start of each block and repeated until all points were successfully calibrated according to the eye tracker’s default criteria.
  • Recalibration after every 20 trials was based on a pilot experiment showing that participants could complete approximately 20 trials at a time while maintaining an optimal position, with minimal movement, and without triggering drift correction.
  • The first 150ms of each trial were excluded from analysis to account for gaze reorientation during screen transitions and fixations initiated before stimulus onset, and the viewing window was locked to 7800ms of full data after re-zeroing.
  • Trials and participants with more than 35% data loss were excluded from analysis.
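A minimal sketch of the preprocessing arithmetic implied above, assuming the stated 300Hz sampling rate and thresholds (representing lost samples as None is an assumption for illustration; the study's pipeline is not specified at this level):

```python
SAMPLE_RATE_HZ = 300   # Tobii TX300 sampling rate
TRIM_MS = 150          # initial period excluded from each trial
WINDOW_MS = 7800       # analysed viewing window after re-zeroing
LOSS_THRESHOLD = 0.35  # trials/participants above this are excluded

samples_trimmed = SAMPLE_RATE_HZ * TRIM_MS // 1000   # samples dropped per trial
window_samples = SAMPLE_RATE_HZ * WINDOW_MS // 1000  # samples retained per trial

def exclude_trial(samples):
    """Flag a trial whose proportion of missing samples exceeds 35%.

    `samples` is a list of gaze samples in which None marks data loss
    (a hypothetical representation of the tracker's validity flags).
    """
    lost = sum(1 for s in samples if s is None) / len(samples)
    return lost > LOSS_THRESHOLD

print(samples_trimmed, window_samples)                 # 45 2340
print(exclude_trial([None] * 40 + [(0.5, 0.5)] * 60))  # True: 40% loss
```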

2.2.4. Derivation of AOIs

  • Objective AOIs were derived using an adapted implementation of a recently proposed Limited-Radius Voronoi-tessellation method (LRVT) that builds on open-source facial recognition and landmarking software (Baltrušaitis & Robinson, 2016; Hessels et al., 2018).
  • The method allows identification of the location of facial features in a video over time.
  • AOIs are then created by segmenting the faces using the distances to predefined points indicating distinct features (eyes, mouth, nose).
  • This method is noise-robust for sparse stimuli like faces and recommended for group comparisons of gaze metrics (Hessels et al., 2018).
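The core assignment step of an LRVT-style method can be sketched as nearest-landmark labelling with a radius cap. This is a simplified illustration, not the published implementation: the landmark coordinates below are hypothetical, and the actual method derives landmarks per video frame from facial-landmarking software.

```python
from math import dist

# Hypothetical landmark centres (pixels) for a single video frame.
LANDMARKS = {"left_eye": (760, 400), "right_eye": (1160, 400),
             "nose": (960, 560), "mouth": (960, 740)}

def assign_aoi(gaze, landmarks=LANDMARKS, max_radius=250):
    """Label a gaze sample with its nearest landmark (its Voronoi cell),
    unless every landmark lies beyond `max_radius` pixels ('outside')."""
    label = min(landmarks, key=lambda k: dist(gaze, landmarks[k]))
    return label if dist(gaze, landmarks[label]) <= max_radius else "outside"

print(assign_aoi((800, 420)))  # left_eye
print(assign_aoi((100, 100)))  # outside: nearest landmark is too far away
```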
2.3.1. Toronto Alexithymia Scale – TAS-20

  • The TAS-20 was used to measure alexithymia and was scored according to the original norms, with each item rated on a scale from 1 (completely disagree) to 5 (completely agree); item scores were then summed for a total score (Parker, Taylor, & Bagby, 2003).
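A sketch of that scoring scheme, assuming the standard TAS-20 convention in which items 4, 5, 10, 18 and 19 are reverse-keyed (verify against the original norms before reuse):

```python
REVERSED = {4, 5, 10, 18, 19}  # reverse-keyed TAS-20 items (standard convention)

def score_tas20(responses):
    """Total TAS-20 score from 20 responses on a 1-5 Likert scale.

    `responses` maps item number (1-20) to the rating; reverse-keyed
    items are flipped (1 -> 5, ..., 5 -> 1) before summing, so totals
    range from 20 to 100.
    """
    assert len(responses) == 20 and all(1 <= r <= 5 for r in responses.values())
    return sum(6 - r if item in REVERSED else r
               for item, r in responses.items())

# All-3 responses give the scale midpoint of 60.
print(score_tas20({i: 3 for i in range(1, 21)}))  # 60
```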

2.3.2. Autism Spectrum Quotient – AQ28 and AQ50

  • To quantify autistic traits, participants completed the AQ28 (Hoekstra et al., 2011).
  • One total score was computed by scoring the 4-point Likert-scale according to the original AQ scoring criteria.
  • The AQ28 was used to quantify autistic traits as recent work suggests it is more consistent with clinical judgment than the AQ50 (Ashwood et al., 2016), and that the AQ50 in its entirety likely contains redundancies that do not necessarily provide improved precision of measurement (Cuve et al., 2021; Lundqvist & Lindner, 2017).
  • AQ50 scores were also available for the autistic participants, and these were used to establish whether autistic participants' scores were above established clinical thresholds.

2.3.3. Depression, Anxiety and Stress Scale – DASS21

  • To control for any effect of depression and anxiety, these traits were assessed using the DASS21 (Lovibond & Lovibond, 1995).
  • Unlike traditional analyses which aggregate gaze behaviour across time, modelling gaze behaviour in this way allows four metrics to be derived.
  • In order to identify when significant effects were observed, a non-parametric cluster-based bootstrapping analysis was performed (Maris & Oostenveld, 2007).
  • These models could then be directly contrasted to test whether adding alexithymia or autism improved model fit over simpler models.
  • For all models, quasi-maximal random structures were initially fitted.
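Orthogonal polynomial regressors of the kind used in these timecourse models (analogous to R's poly()) can be constructed by Gram–Schmidt orthogonalisation of the powers of the time index; a minimal sketch:

```python
def orthogonal_polynomials(n_points, degree):
    """Orthonormal polynomial regressors (linear, quadratic, ...) over
    n_points equally spaced time bins, via Gram-Schmidt on t**k."""
    t = list(range(n_points))
    basis = [[1.0] * n_points]  # start from the constant (intercept) term
    for k in range(1, degree + 1):
        v = [float(ti ** k) for ti in t]
        for b in basis:  # remove the projection onto each earlier regressor
            coef = sum(vi * bi for vi, bi in zip(v, b)) / sum(bi * bi for bi in b)
            v = [vi - coef * bi for vi, bi in zip(v, b)]
        norm = sum(vi * vi for vi in v) ** 0.5
        basis.append([vi / norm for vi in v])
    return basis[1:]  # drop the intercept column

lin, quad, cub, quart = orthogonal_polynomials(20, 4)
# The regressors are mutually orthogonal, so each polynomial term
# captures an independent component of the gaze timecourse.
print(abs(sum(a * b for a, b in zip(lin, quad))) < 1e-9)  # True
```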

2.4.1. Significance testing and model comparison

  • An additional validation used the step function from the lmerTest package to fit a backwards elimination regression and identify eliminated terms.
  • As part of the model assessment, information criteria (AIC and BIC) were also used to contrast different models (e.g., alexithymia vs. autism models).
  • The full set of comparisons is listed in Table S.3 in Supplementary materials.
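The AIC/BIC contrast works from each model's maximised log-likelihood, parameter count k and sample size n; a sketch with illustrative numbers (not the study's estimates):

```python
from math import log

def aic(log_lik, k):
    """Akaike information criterion: 2k - 2*lnL (lower is better)."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Bayesian information criterion: k*ln(n) - 2*lnL (lower is better)."""
    return k * log(n) - 2 * log_lik

# Hypothetical fits: an 'alexithymia' model with two extra parameters
# that improves the log-likelihood over a simpler baseline model.
n = 2340
baseline = aic(-1520.0, k=6), bic(-1520.0, k=6, n=n)
alexithymia = aic(-1505.0, k=8), bic(-1505.0, k=8, n=n)

# Here the richer model wins on both criteria because the likelihood
# gain outweighs the complexity penalty.
print(alexithymia[0] < baseline[0], alexithymia[1] < baseline[1])  # True True
```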
2.4.2. Control analyses

  • Control analyses ensured that data quality or effects of other covariates (e.g., autistic traits, age, IQ, anxiety, ADOS and DASS scores) did not change the results of the main analyses (due to space limitations these are presented in the Supplementary materials – Control and additional analyses).

3. Results

  • To enable comparison with previous literature, the authors first present the standard fixation measures that aggregate over the stimulus viewing window.
  • Two metrics are reported: the number of fixations and the total duration of fixations.
  • Gaze distribution to face regions followed expected patterns (see Figure 2 A & B).
  • These results are in line with other research on face scanning and supported the focus on eye-gaze in subsequent analyses.
  • There was also a slight preference for the nose over the mouth, and interactions with condition; however, to simplify interpretation of results in relation to the research hypotheses, the authors fit separate models for each region of interest (see below).

3.1.1. Fixation count to the eyes

3.1.1.2. Condition effects relative to the free-gaze baseline: effects of alexithymia and autism

  • Thus, data from aggregate measures suggest that individuals with alexithymia adapt their gaze behaviour according to the task to be performed.
  • Overall, traditional analyses of fixation counts support H1: that alexithymia is a better predictor of reduced eye gaze than autism.

3.2. Timecourse models for eye gaze

  • The timecourse models of eye gaze data use polynomial regressors locked to the stimulus viewing window to describe the changes in eye gaze proportion across time and how these patterns are impacted by alexithymia and autism (whether measured as the presence or absence of an autism diagnosis or the degree of autistic traits) and conditions.
  • Initially, an analysis of the free-gaze condition alone is presented, which is followed by an analysis of how the pattern of eye gaze changes when comparing each condition to the free-gaze condition.
  • This latter analysis also describes how alexithymia and autism impact the changes in eye gaze patterns as a function of condition.
  • Only critical terms are described in the text below.

3.2.1. Free-gaze condition: Effects of alexithymia and autism

  • Similarly, the backwards stepwise validation analysis based on AIC eliminated both autism diagnosis and autistic traits and their interactions with polynomial terms, but not alexithymia and interactions with higher order polynomials.
  • Estimates of the alexithymia model suggested a significant intercept (Estimate = 2.72, SE = .15, p < .001) indicating that overall, participants were more likely to look at the eyes than not.
  • There was a significant main effect of alexithymia (Estimate = -.50, SE = .14, p < .001), indicating that alexithymia is associated with decreased gaze allocation to the eye region.
  • Alexithymia interacted with higher order polynomials suggesting that temporal evolution of gaze behaviour differs as a function of alexithymia, with highly alexithymic individuals reaching near asymptote after reduced attention to the eyes early in the trial (see Figure 3).
  • Full details and mouth gaze models are presented in Table S.4 in Supplementary materials.

3.2.2. Condition effects relative to the free-gaze baseline: Effects of alexithymia and autism

  • Timecourse data and model predictions are illustrated in Figure 3; alexithymia was treated as a continuous predictor, so any grouping shown is for visualisation purposes only.
  • Similarly, the backwards stepwise selection validation analysis removed the majority of the autism terms and none of the alexithymia terms.
  • A potential concern that can be visualised in Figure 3 relates to the rapid increase in gaze to the eyes seen at the start of the viewing window.
  • Excluding the first 1000ms yielded similar results, with the only notable difference being that the best-fitting model required only up to the cubic polynomial, without the quartic term.
  • Thus, with respect to the study hypotheses, the timecourse analysis supported H1 in showing that alexithymia explained atypical eye gaze to emotional facial expressions better than autism (either the presence of a diagnosis or autistic traits).

3.2.3. Non-parametric cluster-based analysis

  • Whereas the previous analyses allow the overall trajectory of gaze to the eyes to be described across time, they do not allow identification of the particular time period(s) over the course of a trial when effects of alexithymia or autism can be seen.
  • There were no significant clusters for the autistic traits model.
  • Overall, these analyses highlight that alexithymia is related to consistently reduced gaze to the eyes across conditions and over time (see Figure 4), and therefore support H1: that alexithymia explains reduced eye gaze better than autism.
  • Simpler models including only the effects of condition and the polynomials were favoured in model selection for the mouth (see ‘Timecourse for the Mouth AOI’ in Supplementary materials for full details).
3.3. Stationary (SGE) and transition (GTE) gaze entropy

  • Hypothesis 4 stated that gaze will be more predictable (as demonstrated by reduced entropy) in the emotion recognition and intensity judgement tasks, and when participants are cued to the upcoming emotion, compared to the free-gaze condition.
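Both entropy measures follow Shannon's definition: SGE is computed over the distribution of fixations across AOIs, and GTE over AOI-to-AOI transition probabilities. A minimal sketch with a hypothetical fixation sequence (an illustration of the general measures, not the paper's exact estimator):

```python
from math import log2
from collections import Counter

def shannon_entropy(counts):
    """Shannon entropy (bits) of a frequency distribution."""
    total = sum(counts)
    return -sum(c / total * log2(c / total) for c in counts if c)

def sge(fixations):
    """Stationary gaze entropy: dispersion of fixations over AOIs."""
    return shannon_entropy(Counter(fixations).values())

def gte(fixations):
    """Gaze transition entropy: entropy of each AOI's outgoing transition
    distribution, weighted by how often that AOI is departed from."""
    transitions = Counter(zip(fixations, fixations[1:]))
    total = sum(transitions.values())
    by_source = {}
    for (src, _dst), c in transitions.items():
        by_source.setdefault(src, []).append(c)
    return sum(sum(cs) / total * shannon_entropy(cs) for cs in by_source.values())

scanpath = ["eyes", "nose", "eyes", "mouth", "eyes", "nose", "eyes", "mouth"]
predictable = ["eyes", "nose"] * 4

# A strictly alternating scanpath has lower transition entropy:
# its next fixation is fully predictable from the current one.
print(gte(predictable) < gte(scanpath))  # True
print(round(sge(predictable), 3))        # 1.0 bit: gaze split evenly over two AOIs
```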

3.4. Emotion recognition and gaze

  • Both autism diagnosis and alexithymia predicted emotion recognition accuracy.
  • There was, however, no effect of looking time to eyes on emotion recognition accuracy (Z = .80, p = .4) nor interactions between looking time and clinical predictors on this measure (all p > .05), which was consistent when replacing eyes with nose and mouth looking measures.
  • Similarly, correlations between gaze measures, accuracy and intensity ratings were all below .1 and non-significant (see Table S2 in Supplementary materials).

4. Discussion

  • This study investigated whether autism or alexithymia was the better predictor of eye gaze while individuals with and without autism and alexithymia completed four tasks with dynamic emotional facial expressions.
  • Previous studies addressing eye gaze in autism have aggregated gaze behaviour across time, analysing the number of fixations on, and the total duration of attention to, the eyes (Grynszpan & Nadel, 2014; Hadjikhani, Åsberg Johnels, et al., 2017; Kliemann et al., 2012).


Figures (5)
Citations
Journal ArticleDOI
TL;DR: Theory of Mind (ToM), the ability to represent the mental states of oneself and others, is an essential social skill disrupted across many psychiatric conditions as discussed by the authors, and it is plausible that ToM impairment is related to alexithymia (difficulties identifying and describing one's own emotions).

11 citations

Journal ArticleDOI
TL;DR: This article evaluated studies that measured the relationship between eye gaze and activity in the 'social brain' when viewing facial stimuli and found that eye avoidance may be used to reduce amygdala-related hyperarousal among people on the autism spectrum.
Abstract: Reduced eye contact early in life may play a role in the developmental pathways that culminate in a diagnosis of autism spectrum disorder. However, there are contradictory theories regarding the neural mechanisms involved. According to the amygdala theory of autism, reduced eye contact results from a hypoactive amygdala that fails to flag eyes as salient. However, the eye avoidance hypothesis proposes the opposite-that amygdala hyperactivity causes eye avoidance. This review evaluated studies that measured the relationship between eye gaze and activity in the 'social brain' when viewing facial stimuli. Of the reviewed studies, eight of eleven supported the eye avoidance hypothesis. These results suggest eye avoidance may be used to reduce amygdala-related hyperarousal among people on the autism spectrum.

11 citations

Journal ArticleDOI
TL;DR: In this article, the authors used factor-analytic and network approaches to determine whether alexithymia should be considered a product of autism or regarded as a separate condition.
Abstract: Despite the heterogeneity in autism, socioemotional difficulties are often framed as universal. Increasing evidence, however, suggests that socioemotional difficulties may be explained by alexithymia, a distinct yet frequently co-occurring condition. If, as some propose, autistic traits are responsible for socioemotional impairments, then alexithymia may itself be a symptom of autism. We aimed to determine whether alexithymia should be considered a product of autism or regarded as a separate condition. Using factor-analytic and network approaches, we provide evidence that alexithymic and autistic traits are distinct. We argue that: (1) models of socioemotional processing in autism should conceptualise difficulties as intrinsic to alexithymia; and (2) assessment of alexithymia is crucial for diagnosis and personalised interventions.

6 citations

Journal ArticleDOI
01 Feb 2022 - Cortex
TL;DR: In this paper, the authors present an objective, data-driven method for the analysis of gaze patterns and their relation to diagnostic test scores. The method was applied to data acquired in an adult sample (N = 111) of psychiatry outpatients while they freely viewed images of human faces.

5 citations

Journal ArticleDOI
TL;DR: A systematic review and meta‐analysis supports the investigation of gaze variables as potential biomarkers of ASD, although future longitudinal studies are required to investigate the developmental progression of this relationship and to explore the influence of heterogeneity in ASD clinical characteristics.
Abstract: Autism spectrum disorder (ASD) is characterized by significant social functioning impairments, including (but not limited to) emotion recognition, mentalizing, and joint attention. Despite extensive investigation into the correlates of social functioning in ASD, only recently has there been focus on the role of low‐level sensory input, particularly visual processing. Extensive gaze deficits have been described in ASD, from basic saccadic function through to social attention and the processing of complex biological motion. Given that social functioning often relies on accurately processing visual information, inefficient visual processing may contribute to the emergence and sustainment of social functioning difficulties in ASD. To explore the association between measures of gaze and social functioning in ASD, a systematic review and meta‐analysis was conducted. A total of 95 studies were identified from a search of CINAHL Plus, Embase, OVID Medline, and psycINFO databases in July 2021. Findings support associations between increased gaze to the face/head and eye regions with improved social functioning and reduced autism symptom severity. However, gaze allocation to the mouth appears dependent on social and emotional content of scenes and the cognitive profile of participants. This review supports the investigation of gaze variables as potential biomarkers of ASD, although future longitudinal studies are required to investigate the developmental progression of this relationship and to explore the influence of heterogeneity in ASD clinical characteristics.

4 citations

References
Journal Article
R Core Team. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria.

272,030 citations

Journal ArticleDOI
Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423, 623–656.

65,425 citations

Journal ArticleDOI
Bates, D., Mächler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67(1), 1–48.

50,607 citations

Journal ArticleDOI
Kuznetsova, A., Brockhoff, P. B., & Christensen, R. H. B. (2017). lmerTest package: Tests in linear mixed effects models. Journal of Statistical Software, 82(13), 1–26.

12,305 citations

Journal ArticleDOI
Lovibond, P. F., & Lovibond, S. H. (1995). The structure of negative emotional states: Comparison of the Depression Anxiety Stress Scales (DASS) with the Beck Depression and Anxiety Inventories. Behaviour Research and Therapy, 33(3), 335–343.

9,443 citations

Frequently Asked Questions (11)
Q1. What are the contributions in this paper?

The authors propose that, where observed, atypical visual exploration of emotional facial expressions is due to alexithymia, a distinct but frequently co-occurring condition. In this eye-tracking study the authors tested the alexithymia hypothesis using a number of recent methodological advances to study eye gaze during several emotion processing tasks (emotion recognition, intensity judgements, free gaze), in 25 adults with, and 45 without, autism.

The current results can inform future research in four ways. Second, models of social attention in autism need to incorporate potential causal influences of alexithymia on socioemotional behaviour. The findings highlight potential underlying visual processing mechanisms of atypical emotion processing in alexithymia and autism.

Non-linearities in the distribution of gaze allocation are likely in response to more naturalistic and moving stimuli (Tatler et al., 2011).

The AQ28 was used to quantify autistic traits as recent work suggests it is more consistent with clinical judgment than the AQ50 (Ashwood et al., 2016), and that the AQ50 in its entirety likely contains redundancies that do not necessarily provide improved precision of measurement (Cuve et al., 2021; Lundqvist & Lindner, 2017).

Moving beyond traditional gaze metrics, separate analyses incorporated the temporal domain by modelling eye gaze behaviour over time.

Hypothesis 4 is that gaze will be more predictable (as demonstrated by reduced entropy) in the emotion recognition and intensity judgement tasks, and when participants are cued to the upcoming emotion, compared to the free-gaze condition.

Free-gaze conditions may therefore be more likely to reveal reduced eye attention in the autistic population, as a result of co-occurring alexithymia, than active tasks such as emotion recognition.

Consistent with the entropy analysis of the dispersion of gaze fixations (SGE), the predictability of gaze transitions as a consequence of condition reveals less of an influence of top-down priors relating to task, or knowledge of the upcoming emotional face stimulus, in alexithymic individuals.

If autism (or alexithymia) is characterised by a reduced reliance on priors, these effects of task on the predictability of gaze behaviour should be reduced.

Such manipulations were motivated by previous findings of task-dependent gaze behaviour (Del Bianco et al., 2018; Hessels et al., 2019; Ricciardelli, Carcagno, Vallar, & Bricolo, 2013) and theories suggesting that autistic perception is less affected by priors than neurotypical perception (Pellicano & Burr, 2012).

To summarise, the authors tested three groups of statistical models for both the gaze timecourse and entropy:

Group A. Separate models for Autism and Alexithymia:

A.1. Autism Diagnosis Models: Simple models (orthogonal polynomials: linear, quadratic, cubic and quartic for timecourse analyses; condition: free gaze, emotion recognition, intensity judgement, cued) compared to more complex models including autism diagnosis, condition, polynomials (for timecourse analyses) and their interactions.