Ahmad Sohrabi

Academic rank: Assistant Professor
Education: PhD.
ScopusId: 29567584600
Faculty: Faculty of Humanities and Social Sciences

Research

Title
Age of Acquisition Effect: Evidence From Single-Word Reading and Neural Networks
Type
Journal Paper
Keywords
Reading, Neural networks, Age of acquisition, Word recognition, Connectionism
Year
2019
Journal
Basic and Clinical Neuroscience
Researchers
Ahmad Sohrabi

Abstract

Introduction: Many studies show that words learned early in life are read more easily than words learned later and are less vulnerable to brain damage. Methods: In the first part of the current study, 25 fifth-grade primary school students read groups of words that had first been learned in earlier grades. The stimuli were 327 Farsi monosyllabic words matched on the other factors known to affect Farsi word naming. Results: An analysis of covariance (with consistency and frequency as covariates) showed that words learned in earlier grades were read more easily than those learned later, demonstrating the well-known Age of Acquisition (AoA) effect. In the second part of the study, the AoA effect in word naming was simulated with a neural network model developed earlier within the connectionist approach. Whereas previous studies used random patterns, the current study trained the model on words taken from primary school textbooks. As in the behavioral experiment, words the model learned early were read better than words it learned later. However, a previous simulation of AoA in English reading that used the Quickprop algorithm failed to replicate for Farsi. In addition, the model was lesioned by removing some of its hidden units to examine the effect of damage on word reading; words learned earlier were less vulnerable to this damage than words learned later. Conclusion: The findings show that words learned earlier, compared with those learned later, are read better and are less vulnerable to damage. These effects are explained by the nature of learning in neural networks trained with error back-propagation.
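
As a rough illustration of the modeling approach described in the abstract, the following sketch (Python with NumPy, not the authors' code) trains a small one-hidden-layer network with error back-propagation, gives an "early" word set a head start over a "late" word set, and then lesions the network by silencing some hidden units. Layer sizes, learning rate, epoch counts, and the random orthography-to-phonology patterns are illustrative assumptions, not values or stimuli from the study.

import numpy as np

rng = np.random.default_rng(0)

# Assumed layer sizes and stimulus counts (illustrative, not from the study).
N_IN, N_HID, N_OUT = 30, 40, 30
early_x = rng.integers(0, 2, (50, N_IN)).astype(float)   # "early-acquired" words
early_y = rng.integers(0, 2, (50, N_OUT)).astype(float)
late_x  = rng.integers(0, 2, (50, N_IN)).astype(float)   # "late-acquired" words
late_y  = rng.integers(0, 2, (50, N_OUT)).astype(float)

W1 = rng.normal(0.0, 0.1, (N_IN, N_HID))
W2 = rng.normal(0.0, 0.1, (N_HID, N_OUT))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(x, y, epochs, lr=0.5):
    # Plain error back-propagation on a one-hidden-layer network.
    for _ in range(epochs):
        h = sigmoid(x @ W1)
        o = sigmoid(h @ W2)
        d_o = (o - y) * o * (1 - o)          # output-layer error signal
        d_h = (d_o @ W2.T) * h * (1 - h)     # error back-propagated to hidden layer
        W2 -= lr * (h.T @ d_o) / len(x)
        W1 -= lr * (x.T @ d_h) / len(x)

def reading_error(x, y, keep=None):
    # Mean squared error; `keep` masks hidden units to simulate a lesion.
    h = sigmoid(x @ W1)
    if keep is not None:
        h = h * keep
    o = sigmoid(h @ W2)
    return float(np.mean((o - y) ** 2))

# Stage 1: only the early set is trained; Stage 2: early and late sets together,
# mimicking a cumulative training regime in which early words get a head start.
train(early_x, early_y, epochs=3000)
train(np.vstack([early_x, late_x]), np.vstack([early_y, late_y]), epochs=3000)
print("intact   early:", reading_error(early_x, early_y),
      "late:", reading_error(late_x, late_y))

# Lesion: silence roughly 25% of the hidden units, then retest both word sets.
keep = (rng.random(N_HID) > 0.25).astype(float)
print("lesioned early:", reading_error(early_x, early_y, keep),
      "late:", reading_error(late_x, late_y, keep))

Under this kind of staggered training schedule, the early set typically ends up with lower error both before and after the lesion, which is the qualitative pattern the study reports for early- versus late-acquired words.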