
"I think we have to be realistic about" is, whether you intend it or not, an attempt to reframe the discussion around your particular set of priors. We need to be careful about language like that.

There are many reasons to challenge the assumption that it's difficult for older people to learn to code:

* It may not be valid to extrapolate from more general neuroplasticity findings to programming

* It may be that changes in neuroplasticity affect the effectiveness of different modalities of learning, so that it's not that coding is harder for older people to learn, but that they need different strategies to learn it --- for instance, 20-year-olds might be able to get away with rote memorization as a crutch, while older adults might need to use more authentic learning strategies

* It's also possible that for large numbers of adults, learning to code is easier, because their brains have been trained through work they've done throughout their careers

* It's also the case that there isn't one "programming", but rather a whole constellation of different skills, some of which reward an orientation towards organization and careful planning, some of which reward tenacity, some of which reward creative and intuitive thinking, some of which reward numeracy, and so on

We should be very careful about generalizing, because one thing we do know is that human brains have incredibly powerful cognitive biases that are working constantly to fill in the blanks in our knowledge --- particularly in our models of the behavior and capabilities of other humans --- and much of what they fill in is accurate only up to the point where it allows us to effectively collect berries and hunt gazelles and stuff.

For whatever it's worth: I'm 40, not 50, but the tempo at which I've been learning new things is accelerating, not decelerating: I work in more languages now, I work in more problem domains (for instance, I never used to do any front-end work), and in particular my math --- which has always been awful --- has been improving by leaps and bounds. Maybe things will suddenly suck for me when I turn 50, but the current trends are not worrying me.



I'm skeptical of arguments based on neuroplasticity or learning styles for the following reason. Given how good people are at learning in general (it is the evolutionary advantage that humans have), it would in fact be extremely surprising if anything short of an illness shut down somebody's ability to learn. Thus the ability to learn things, even when one is older, doesn't need an explanation, whereas the inability to learn things does.


There's definitely research that suggests our ability to learn some things, like verbal languages, slows down with age --- though I wasn't able to find a cite for whether that decline is gradual or whether there's some inflection at adulthood. Either way, I'd just caution that we not automatically extrapolate from those findings to the broader claim that every new skill acquisition is similarly impacted.


Aren't languages really special, though? I remember reading one's ability to learn even new sounds almost vanishes, which explains many people's accents, and which means it's more like a separate process in the brain. Coding, on the other hand, is a general, abstract skill. So I'd expect ability to learn coding to drop off at roughly the same rate as everything else with age. I think it's one of those things where there is a small negative effect, but it's so small that even if you knew about it, you'd still have many much more important things to think about and so you'd be right to ignore it.



