Research

Project “Leeswinst”

Towards dynamic early predictors of reading skills

Children’s reading development shows large variability. Reading problems (dyslexia) are currently diagnosed and treated only after a child has failed to respond adequately to reading instruction, i.e., around the age of 8-9 years. This is problematic because early intervention is crucial for optimal (reading) development and social opportunities. To enable earlier prediction of reading problems, in Project Leeswinst we design tailored learning tasks in which children learn the letters of an artificial language. We characterize individual differences in children’s learning trajectories by modelling their performance and brain activity. With this knowledge, we will develop a digital learning test that predicts who will learn to read fluently and who will need extra support to prevent reading problems.

Interested? More Information? 📧 dyslexie@maastrichtuniversity.nl or fill out our interest form.

In collaboration with

Funded by


Letter-sound learning

Establishing strong associations between visual and spoken language representations is essential for developing reading skills. We have a fairly good understanding of how the brain processes and retains already learnt associations and how this may differ in individuals with reading problems (dyslexia). Yet, our understanding of the dynamic neural and behavioural changes during the actual learning process is limited. This research aims to address this gap by investigating individual differences in learning trajectories for letter-sound associations in children and adults with and without dyslexia, as well as the potential influence of perceptual, cognitive, and socio-emotional factors on these trajectories.

Interested? 📧 dyslexie@maastrichtuniversity.nl.


Letters & Numbers

Numeracy and literacy are fundamental skills that rely on culturally acquired mappings between visual symbols (e.g., 3, v) and their corresponding spoken language representations (e.g., /three/, /v/). The occipito-temporal and superior temporal cortex are involved in processing the visual and auditory content of letters and numbers, respectively. Yet, it is still unclear whether, beyond the identification of single items, these regions are also involved in the more abstract multimodal representation of letters and numbers. In previous research, the superior temporal cortex has been found to respond to letters in both modalities and to integrate them into a fused, coherent percept; the same function for numbers has been associated with the intraparietal sulcus. To our knowledge, no research has explored the neural representations within these multimodal regions and whether their multimodal responses actually “hide” a rather amodal/abstract representation. The present project therefore aims to explore both the unimodal and multimodal representation of letters and numbers using multivariate pattern analysis (MVPA) and fMRI.
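For readers curious how such analyses work in practice, below is a minimal, illustrative MVPA sketch in Python (scikit-learn) that decodes a stimulus category from simulated voxel patterns with a cross-validated linear classifier. The data, trial counts, and voxel counts are hypothetical and do not reflect the project's actual pipeline.

```python
# Minimal MVPA sketch (illustrative only): decode stimulus category from
# voxel patterns with a cross-validated linear classifier. The data here
# are simulated; in a real study the patterns would come from preprocessed fMRI.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200                 # hypothetical trial and voxel counts
labels = np.repeat([0, 1], n_trials // 2)    # 0 = letter trial, 1 = number trial
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[labels == 1, :20] += 0.5            # weak multivariate signal in 20 voxels

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, patterns, labels, cv=cv)
print(f"Decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

Above-chance decoding in such a scheme indicates that the activity patterns in a region carry information about the stimulus category, which is the core logic behind MVPA.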

Interested? 📧 f.gentile@maastrichtuniversity.nl


Coding vs Natural Language

We live in a hyper-digitized world where computer programming has become an essential skill for people’s career development. Like natural language, a programming language is used for communication, i.e., to instruct the “machine” about the task to implement. Moreover, despite some differences, computer programming shares important aspects with natural language. Both types of language are constructed from basic building blocks: variables that, when combined, become an expression (programming), and speech sounds and words whose combination creates sentences (natural language). Most notably, (1) both rely on a set of rules (syntax) needed to correctly structure the building blocks and create a valid expression, statement, or sentence, and (2) the meaning of an expression or statement in a code snippet, or of a sentence (semantics), strongly depends on how the building blocks are combined. In this project, we use behavioral measurements and EEG to explore to what extent syntactic and semantic processes differ while novices and experts process programming and natural language.
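As a toy, hypothetical illustration of this analogy (not material from the study), the snippet below shows how the same programming building blocks can be combined into a valid and sensible expression, a syntactically invalid one, or one that is valid yet carries a different meaning.

```python
# Toy illustration of syntax vs. semantics in a programming language
# (hypothetical example, not a stimulus from the study).
price = 3
quantity = 4

total = price * quantity        # syntactically valid, semantically the intended meaning
total = quantity * price        # same building blocks, same meaning
# total = price * * quantity    # syntax violation: tokens combined against the rules
total = price - quantity        # syntactically valid, but a different (unintended) meaning
```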

Interested? 📧 f.gentile@maastrichtuniversity.nl  


Project Eyeread

Natural reading in adults

Reading involves the dual-route language network in our brain. This network transfers perceptual information from written text to the frontal lobe for understanding and for predicting what comes next. Our project EYEread contributes to the understanding of the neural dynamics of natural reading, in which volunteers read entire sentences or paragraphs. We address two specific topics:

(1) Context effects. We read differently depending on text context, but how? Our volunteers read different sorts of text, e.g., with or without prior knowledge of the story content. We collect and study their EEG and eye-movement patterns to see how context affects fixation trajectories, reading demand (pupil size), and text recall.

(2) Event segmentation. When we read, we segment a story into chunks of meaning and store them, so that we can later recall them in chunks. In memory research, this process is called event segmentation. Recent fMRI and EEG studies have shown how the brain performs this segmentation while people listen to audiobooks or watch movies, with the help of machine-learning techniques such as Hidden Markov Models (HMMs). Our project aims to apply HMMs to natural reading.
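As a rough illustration of the general idea (not the project's actual analysis), the sketch below fits a generic Gaussian HMM from the hmmlearn package to a simulated feature time course and reads event boundaries off changes in the decoded state sequence. Published neuroimaging work typically uses an event-specific HMM variant, and all dimensions here are hypothetical.

```python
# Minimal sketch of HMM-based event segmentation (illustrative only).
# Requires: pip install hmmlearn
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)
n_events, len_event, n_features = 4, 50, 10   # hypothetical dimensions
means = rng.normal(scale=2.0, size=(n_events, n_features))
# Simulated signal: a sequence of "events", each with its own mean pattern.
signal = np.vstack([rng.normal(loc=m, size=(len_event, n_features)) for m in means])

model = hmm.GaussianHMM(n_components=n_events, covariance_type="diag",
                        n_iter=200, random_state=0)
model.fit(signal)
states = model.predict(signal)                 # one hidden state per time point
boundaries = np.where(np.diff(states) != 0)[0] + 1
print("Estimated event boundaries (time points):", boundaries)
print("True boundaries:", [len_event * i for i in range(1, n_events)])
```

The decoded state sequence stays constant within an event and switches at event transitions, so the switch points serve as the estimated boundaries between chunks of meaning.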

Interested? 📧 b.jansma@maastrichtuniversity.nl


Project GaltACS

Galactosemia is a rare genetic metabolic disorder that affects a child’s ability to metabolize the sugar galactose properly. Children are screened for this disease at birth and then follow a strict diet. Still, they develop some motor and cognitive issues over time. Since 2011, our interdisciplinary team has aimed to understand these deficits by means of brain imaging.

In the project GaltACS (2019-2023), we recorded EEG while volunteers named short video clips, to investigate language production. We detected differences in theta oscillations (3-7 Hz) between patients and healthy controls; theta is relevant for language and memory.
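As an illustrative sketch of how theta-band activity can be quantified (not our exact analysis), the Python example below band-pass filters a simulated EEG trace to 3-7 Hz with SciPy and computes its instantaneous power via the Hilbert envelope; the sampling rate and signal are assumed for the example.

```python
# Illustrative sketch: extract theta (3-7 Hz) power from a single EEG channel
# by band-pass filtering and taking the Hilbert envelope. Simulated data.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                                   # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

# 4th-order Butterworth band-pass around the theta band (3-7 Hz)
b, a = butter(4, [3, 7], btype="bandpass", fs=fs)
theta = filtfilt(b, a, eeg)                  # zero-phase filtering
theta_power = np.abs(hilbert(theta)) ** 2    # instantaneous theta power

print(f"Mean theta power: {theta_power.mean():.3f}")
```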

In a next step, we studied the causal relation between theta and language. We stimulated the brains of our volunteers with noninvasive brain stimulation (NIBS), specifically transcranial alternating current stimulation (tACS), while they described the animated scenes. We then compared language performance before, during, and after the stimulation sessions, and observed a significant reduction of naming errors in patients after theta stimulation, but not after sham (placebo) stimulation. This pilot study is promising and might indicate a new, additional treatment approach to improve the cognition and quality of life of these patients.

Interested? 📧 b.jansma@maastrichtuniversity.nl