It reminds me of the McGurk Effect, nicely illustrated in the video below, in which you hear a sound halfway between what you see and what is said. This shows that you don't hear only with your ears, but with your eyes and ears together. The actual sound you perceive is constructed by your brain from these combined inputs. (Watch the video, you'll see/hear.)
Problems in phonemic discrimination have traditionally been blamed for slow reading, but lately that blanket explanation is being undermined by findings such as attentional training (see Video game cure for Dyslexia? lower on this page) and studies of early neural detection of visual and auditory stimuli.
But it is still an issue for many, which makes it interesting to learn that this ability is something we are either born with or develop in the womb from the sounds we hear. The study involved babies who were born 3 to 4 months before their due date. Although therapies can access the neural circuits involved and produce improvements in people who have this problem, this finding suggests there may be a genetic link.
We know that much of gene expression is shaped by environmental experience, so there's no need to despair. It does suggest, though, that dynamic therapies aimed not directly at the perceived problem but at connected abilities (such as iLs) may offer promise.
Mahmoudzadeh link provided below.