Presentation Title: Neural representation of syntactic prediction: A simultaneous eye-tracking and EEG study
Research summary: The human brain rapidly integrates linguistic information and makes online predictions about upcoming words as sentences unfold in time. However, it remains unclear when these predictions are formed and which neural features of prediction are activated in the brain. Using a simultaneous eye-tracking and EEG technique with a verb-bias task, this study found that participants' first-fixation patterns showed a significant verb-bias effect immediately after hearing the verb and before the first noun phrase. In addition, strongly biased verbs elicited a greater negativity 100 ms before the first fixation.
Link for PDF: https://drive.google.com/file/d/1ClNvjfogJxPGlC_ddKWG1mmlUkqVkT_B/view?usp=sharing
Link for video: https://drive.google.com/file/d/1Gm-Aykd6EsyVAY7AvfZLfoGiCb5xs8XX/view?usp=sharing
Presentation Title: Early-life signed language exposure does not impede the development of spoken language: A functional near infrared spectroscopy investigation of phonemic discrimination in cochlear implant (CI) users
Research summary: This study examined neural activation patterns underlying phonemic discrimination (the ability to distinguish phonemes) in individuals with CIs who were exposed to signed language at different ages and who received their CIs at different ages. We used an auditory target phoneme discrimination task in an oddball paradigm. Our findings showed that early-life language exposure (ASL and/or via CI) was associated with greater activation of left-hemisphere language areas critically involved in auditory phoneme detection, as well as their right-hemisphere homologues. We found no negative impact of early-life signed language exposure on spoken-language phonemic discrimination ability in CI users.
Link for PDF: https://sites.udel.edu/shakhlon/cv/files
Presentation Title: Informative use of "not" is N400 blind
Research Summary: This study investigates how negation is processed when it is used informatively. We conducted an experiment utilizing the N400 ERP component and found that the N400 is not modulated by the truth value of negated sentences. Our findings provide new support for the theory that negation is incorporated "late" in sentence processing.
Link for PDF: https://sites.udel.edu/ryanrhodes/files/2020/10/Poster_final.pdf
Link for video: https://youtu.be/7wMK7asGEng