The current threads about doctors becoming obsolete have got me thinking about what kind of knowledge/information is worth internalizing or committing to memory. Time is limited. So is brain space. Some things really are best left to look up on UpToDate or a similar database. But other things should be in mind, always. What I want to know is, what are those things? In med school, we indiscriminately try to learn and memorize everything b/c boards or class exams require it. But we should exercise some judgment once we're past those hurdles. That's the stage I'm at now. I want to be really specific about the things for which I spend the time and effort committing to memory (w/ Anki of course🙂). I want the highest-value information to be in mind, and leave the rest to my 'peripheral brain'.
Here is a preliminary list to get us started.
Things to commit to memory:
1. Knowledge that must be used in acute settings where access to a computer is limited
- Ex. trauma protocols, 'codes', emergent situations, etc.
2. Exam techniques
- This is practical skill stuff. You can't be reading Bates while working on a patient.
3. Fundamental, conceptual knowledge that provides a framework for other learning
- This is more vague, and I'd like some help fleshing it out. For example, with respect to autoimmune disease, knowing that HLA-B%^*#(*995 is involved in disease X is not worth memorizing. But knowing that autoimmune diseases often have a genetic component, and that certain classes of molecules are involved, IS worth knowing.
4. Manual skills, i.e. procedures, surgical techniques, etc.
Things to leave to the peripheral brain
1. Detailed management stuff
- Sure, some basic idea of disease management should be in mind, but protocols and standards of care change frequently. This is why we often check UpToDate before prescribing or determining a course of action. What are the most cutting-edge regimens or interventions?
2. Clinical pharmacology - what drugs do we use for disease X? Some of that will undoubtedly be committed to memory, but I'd rather entrust the job of picking the right med to smart applications that tell me what the latest and best drug is.
3. Rare disease diagnosis
- We don't really have a choice here. The zebras don't stay in our heads b/c we don't see them enough. For things that fall outside the scope of the everyday, it is worth having clinical diagnostic support systems, like this, that even diagnostic rockstars like this guy will refer to when uncertain.
That's what I got. What do you think?
PS. I don't want this thread to turn into a flame war about doctors not being needed anymore, or being replaced by computers and/or midlevels. I just want the collective opinion about what things we should spend our valuable, limited time and resources internalizing and what we shouldn't. Ok carry on 🙂