Human knowledge

Thu 15 February 2018
It took Newton the better part of three years to 'invent' calculus. He is considered one of the brightest mathematical minds ever to have lived. Yet most 17-year-olds today are capable of mastering the same calculus he did. Why is that?
It's almost too obvious to say - it's easier to learn something than it is to imagine it. More interestingly, there seems to be a kind of inflation with learning. The best high school students today might learn calculus at 15 rather than 17. And 'best' in this instance actually seems to describe the school rather than the student. It's been well established that school quality is a far better predictor of academic success than almost anything else.
So it's not about how much horsepower your brain has. If subjects are taught to you in the right way, you'll be able to master more difficult concepts than your parents did, at a younger age - both because of this inflation effect and because more will have been discovered by the time you get there.
Anecdotally, there always seem to be folks who learn things far younger than others in their generation. What sets them apart seems to be their precociousness, not their intelligence. Younger students often lack the discipline to focus on more advanced coursework; the precocious ones gain the grit and work ethic of adults early, and once they do, it's hard to stop them.
Thus it's always been a big frustration to me when people say "I just can't do math". Barring a diagnosable learning disability, this is never true. Most often students struggle not because the material is difficult but because they lack the discipline to keep working at it or because they are thinking about it from the wrong perspective. It is alarming how often it is the latter - kids work a lot harder than we give them credit for. But teachers are often remarkably bad at teaching them the right paradigms, often because they never learned those paradigms correctly in the first place.
The most common case I see of an incorrect paradigm is the student who leans on memorization to get through exams. Studies pretty clearly demonstrate that even with clever tricks like spaced repetition or mnemonics, long-term recall is abysmally low for everyone. Relying on memorization to pass your test sounds good in the short term, but it seems to cut off the practice that other parts of your brain, like your reasoning skills, could be getting. Perhaps that's why absent-minded professors have such bad memories. They don't happen to have bad memories despite being able to reason well. They reason well because they have bad memories. They never had the crutch to lean on, so their muscles grew stronger.
Some careers, like law or medicine, are so old-fashioned that they continue to teach as if memorizing lengthy material were necessary, even though practicing physicians and lawyers don't remember the details from licensing exams they took only a few years earlier. The paradigm they are really learning in medical school is a sense of what is "weirder than usual" and what isn't. For symptoms they don't immediately recognize, they google them, like anyone else would.
That isn't to say that paradigms are everything. There will always be some basic facts and theories that need to be learned by rote in any field, and one may also have to hone specific technical skills. But nonetheless, every subject seems to have some kind of limiting reagent - a few concepts that, once students really grok them, make the rest of the field feel like a simple extension of what they already know.
The broader question is whether one can pick up new paradigms beyond the young age at which most of them are initially acquired. I would argue that you can, so long as you already have the paradigm of paradigms: the realization that the knowledge you want to learn was ultimately created by an adult, not a young person. Newton was capable of inventing calculus as an adult, so you must be capable of at least understanding it now.
Don't think you're as smart as Newton was? Surely you're not as dumb as the average 17-year-old. Sandwiched between the two, there is virtually no concept in any subject of human knowledge that you are incapable of understanding. And your children will pick it up faster than you did.
The average length of a PhD is about seven years. Even that overestimates the knowledge one needs to acquire, since the majority of that time is spent trying to cobble together something to contribute to the field. That means any single field of human knowledge, visualized as a swimming pool, is only about seven years deep. The average human lifespan is more than 78 years.
The only reason you aren't an expert at something new is that you haven't yet acquired the discipline and the paradigms needed to get there. I'm not offering a solution to this. But I do expect that within my lifetime, the vast human potential for knowledge acquisition and use will not stay as untapped as it currently is.
Thanks to Otis Reid and Karthik Prasad for reading drafts and providing feedback.
 The conflict between Newton and Leibniz is quite famous. But Fermat and other mathematicians discussed the principles of calculus long before either of them. In fact, Newton's work is considered a new paradigm (along with a mathematical framework) built on top of Fermat's "theory of tangents".
 Good physicians don't use Wikipedia, but UpToDate and AccessMedicine are the medical equivalents, with proper citations. That said, you'd be frightened by how often Wikipedia is used.
 The average length of a PhD appears to have been decreasing over the past few years, which quite surprised me. Source: National Science Foundation, National Center for Science and Engineering Statistics.