MD Cheating allegations engulf Dartmouth medical school

I gotta say, I really admire that you have such a strong stance on not using insanely high exam scores to stratify applicants, while being somebody who demolished standardized exams yourself. It just makes me happy.

Tangential to this, it always surprised me that there's anyone who disagrees with @efle on this, when so much of our medical education is devoted to the appropriate use of tests and statistics.

Hi all -- I'm one of several external data and tech analysts who have reviewed multiple log files associated with this investigation, and I can confirm that, based on what we've seen, the vast majority of these students are totally innocent (or at least didn't cheat using Canvas, which is the only evidence the school has offered to suggest that they cheated). Every student log file I have seen contains an overwhelming proportion of datapoints that are both senseless and impossible for a human to have produced. Judging by titles alone, most of these log files seem to show Canvas activities and exam questions that were loaded simultaneously and appear relevant to each other -- but actually opening the Canvas assets and doing an exhaustive read through them almost always shows that they don't mention the material in the exam questions they're paired with and would have been absolutely no help in answering them. It's particularly funny that most logs contain things like accessing a course logistics announcement or a link to a page that turns out to be blank. And this is after the school pre-cleaned all of the logs to make them look as damning as possible.

Once you eliminate all of the accusations that make no sense, the files that I have seen contain no more than 1-2 potentially-meaningful correlations per exam. Assuming this pattern holds in all cases (and I know it does in at least one), that means that the students who were expelled had their medical careers ended because it looked like they accessed Canvas for one single question each on 3 or more exams - amidst a sea of data that clearly wasn't related to anything they did.

Anyone who says the Birthday Paradox doesn't apply here either doesn't understand what the school did (multiple waves of targeted cherry-picking, which STILL resulted in very weak data at the end), or doesn't understand statistics. You absolutely cannot definitively assert that students cheated if you're pulling from a dataset that has 20 rows that were more-or-less random mashups of test questions and pages from the course that were automatically being reloaded on an auxiliary device. It's very likely in such a scenario that there will appear to be at least one meaningful/problematic match-up of exam question and Canvas activity - not because there was one, but because of multiple comparisons: if you go on a fishing expedition across 20 random datapoints, each with even a small chance of a superficial match, there's a very good chance you'll find at least one that looks meaningful even though it isn't.
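To make that arithmetic concrete, here's a minimal sketch, assuming 20 independent log rows and a 5% chance that any one of them coincidentally looks topic-relevant (both numbers are illustrative, not measurements from the actual logs):

```python
# Multiple-comparisons sketch: 20 rows, each with an assumed 5% chance
# of a purely coincidental "match" to some exam question.
p_match = 0.05  # illustrative per-row false-match probability
n_rows = 20     # rows left in a student's filtered log

# Probability that at least one purely coincidental match shows up:
p_at_least_one = 1 - (1 - p_match) ** n_rows
print(f"P(at least one spurious match) = {p_at_least_one:.0%}")  # ~64%
```

Under those assumptions, an innocent student is more likely than not to show at least one "suspicious" correlation.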

Of course schools can use Canvas logs to suggest that students did something suspicious. But they can only do that if the logs actually show consistent suspicious behavior - not one random activity that looks potentially problematic in a sea of data that's clearly random noise.
 
I forget where I read this, but at one point I thought the school was claiming that additional information from ExamSoft was the nail in the coffin - namely, that they could show the Canvas access was not only to relevant files, but that it occurred at the same time the person encountered that question on the test. Can you clarify that? I can understand the birthday paradox resulting in a few page views related to a few questions somewhere in the test while I was taking it, but it shouldn't be able to explain a temporal relationship between seeing a question and accessing the related info immediately afterward.

Edit: As a follow-up, if we are crediting these things to the paradox, there ought to be a similar number of instances where related materials were accessed immediately before the question was encountered. That should be exonerating... but I imagine it's inaccessible until you have the raw/uncleaned logs.
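For what it's worth, the tally I'm proposing is easy to sketch. All timestamps and names below are hypothetical - this just shows the shape of the check:

```python
# Before/after symmetry check: if matches are coincidental, "related"
# Canvas hits should land just before a question about as often as
# they land just after it.
WINDOW = 5.0  # minutes, mirroring the school's reported matching window

def before_after_counts(question_times, access_times, window=WINDOW):
    """Count topically-related Canvas hits just before vs. just after
    each question was opened."""
    before = after = 0
    for q in question_times:
        for a in access_times:
            if q - window <= a < q:
                before += 1
            elif q < a <= q + window:
                after += 1
    return before, after

# Hypothetical data, in minutes into the exam:
questions = [12.0, 47.5, 88.0]          # when "related" questions came up
accesses = [9.0, 50.0, 86.5, 120.0]     # Canvas hits on matching topics

print(before_after_counts(questions, accesses))  # -> (2, 1)
# Roughly symmetric counts point to coincidence rather than cheating.
```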
 
God I hope you're right bc I'm really freaked out for these students.
 
Yes, you're correct: the school's evidence sheets consist of ExamSoft question history transcripts lined up, apparently by hand, with Canvas Course Access logs, both of which have precise timestamps. The school appears to have automatically assumed an integrity violation if:
1. A question topic and a piece of Canvas material appeared to share a substantive focus, even if only in a superficial way; AND
2. The exam question was initiated within roughly the same 3-5 minute span as the activity involving the thematically-linked Canvas material.

However, as I mentioned previously, such assumed match-ups were actually wrong more often than not: whoever was doing this correlation thought that a particular piece of Canvas material was relevant to a particular exam question with an error rate of at least 80%. So while the school tried to accuse each student of having between 10 and 30 of these correlations, each student ended up having at most 1 or 2 - and sometimes none at all - that were even slightly suspect. Additionally, the real "hit rate" of substantive correlations was probably much lower even than that, because it looks like the school silently redacted dozens and perhaps hundreds of rows of "clearly irrelevant" data from each student's log - rows which would have shown just how much automated activity was underway. I hear you that the temporal coincidence looks bad - but we believe that automatic refresh operations were repeatedly invoking the same exact files every 5-10 minutes for the full duration of the exams in question. Given that frequency of automatic refreshes, it would have been nigh-on impossible for someone who had kept a few relevant documents open on an inactive device (which is what was doing the automatic syncing in all of these cases) to get through the whole test without at least one randomly-generated point of synchronicity between exam content and Canvas content.
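If you want intuition for why a coincidence is nearly guaranteed, here's a quick Monte Carlo sketch. Every parameter is an assumption for illustration (exam length, refresh interval, number of topically-overlapping questions), not a measurement from the real logs:

```python
import random

# Monte Carlo sketch of the coincidence argument above.
EXAM_MIN = 180                   # exam length, minutes (assumed)
REFRESH_LO, REFRESH_HI = 5, 10   # idle-device auto-refresh interval
WINDOW = 5                       # the school's ~3-5 minute matching window
N_RELATED_QS = 3                 # questions topically overlapping open tabs

def one_exam():
    """Return True if any auto-refresh lands near a 'related' question."""
    refreshes, t = [], 0.0
    while t < EXAM_MIN:
        t += random.uniform(REFRESH_LO, REFRESH_HI)
        refreshes.append(t)
    questions = [random.uniform(0, EXAM_MIN) for _ in range(N_RELATED_QS)]
    return any(abs(r - q) <= WINDOW for r in refreshes for q in questions)

trials = 100_000
hits = sum(one_exam() for _ in range(trials))
print(f"P(at least one damning-looking coincidence) ~ {hits / trials:.0%}")
# With refreshes every 5-10 minutes, this comes out near 100%.
```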

I entirely agree with you that looking at other time periods would have been a great way for the school to check the meaningfulness of their data. We've asked them to look at and provide full, unfiltered data from one hour before and one hour after each exam, and they have so far refused.
 
Ah, I see - if one element (Canvas) is frequently active throughout, the only random event needed is for you to encounter a question on related material. Much more probable than two independent events occurring in sequence.

I have to say, it is a terrible look for the school to refuse to share the full logs. If they're certain enough about an accusation to expel someone and end their medical career, they had better believe the complete set of info will withstand scrutiny. It should be pretty damn obvious that if the powerpoint on ARDS was being refreshed every 10 minutes for the entire test, it had nothing to do with question #79 about ARDS.
 
Put aside the statistical analysis and the likelihood of Canvas automatically reloading for just a minute. The part that makes this sort of cheating sound so improbable to me is that it's... so dumb? Anyone under 30 understands that Canvas and related services track activity while one actively uses them (though this passive-activity thing is a bit of a surprise). Everything posted on Canvas can also be downloaded and accessed surreptitiously later if one were so inclined. I'll withhold judgment until more facts emerge, but I have a hard time believing that the stupidest way of cheating is so pervasive at one of the country's better medical schools.

Full disclosure: I plan on matriculating at Dartmouth over the summer though this controversy has led me to revisit my options.
 
Agreed. If you are a medical student, don't you download all the powerpoints to your computer when you are studying? Why would I go back to Canvas to re-download a set of powerpoints in the middle of an exam when they are already on my computer locally?
 
How do @operaman and @NotAProgDirector respond to these findings?
 
Interesting. I'm definitely skeptical, because it seems far-fetched to think that such a seemingly simple explanation has eluded a large institution with a large team of its own experts. We must also bear in mind that the chance of a statistical error doesn't exonerate anyone either. Surely the school has looked at all the data when making these decisions. If they haven't, I suspect that may come to light in litigation at some point.
 
If they had really considered this, they would be perfectly comfortable releasing the full, unabridged data sets. If you watch the town halls, you will see that Duane Compton seems biased towards punishing potentially innocent students rather than setting a high standard of evidence. Sure, the chance of a statistical error doesn't mean that no one cheated. But it absolutely means that the evidence isn't exactly strong. You seem inclined to tolerate type 1 errors (punishing the innocent). I think justice systems should lean towards type 2 (letting some guilty go free). Guilty until proven innocent is not the way to go.
 
We can argue the logistics of Canvas all day, but I think the truly odious thing here is that Dartmouth accused SEVENTEEN students of cheating. Obviously they did not have the best evidence prior to making these accusations, since so many were exonerated shortly thereafter. They allowed a culture of fear to propagate on their campus, and considering everything else going on in the world, who knows the effect that had on the students' mental health. Those students feared for their entire careers. It's truly astonishing.
 
Am I the only one here who thinks it's not even a big deal? You can't cheat on your steps. You can't cheat on rounds. Those are the big things they look at. Everyone always says preclinical grades are low yield. Just give them a warning and be done with it. Why are they expelling people over this?
 
You should definitely ask a professor "what's the big deal cheating on all the preclinical tests?"

They'll learn you right quick
 
The profession is built on integrity and honesty. Geisel has a P/F curriculum, which means that if you can't pass a preclinical exam without cheating, then don't become a doctor, because you don't deserve to be one. I'm not saying that these students did cheat, but the concept of cheating getting thrown around as no big deal is absurd.
 
The town-hall response from the administration was horribly tone-deaf too. "I'm sorry we terrorized you, but we have to make sure your accomplishments and your degree are real."

It's a small program as well, like the one I belong to. If it's anything like mine in other regards, the administration probably knows some of the students somewhat well professionally, and they can put a face to those names. Yet they still decided to go on a witch-hunt and not consider the consequences for the state of the community. This isn't a massive program like Jefferson or Wayne where you could forgive the administration for being overwhelmed or unaware.

Sounds like a callous disregard for the well-being of their students.
 
Cheating does matter. Hell, I've never cheated on a med school exam and have complained about people who do, but the mechanism through which you crack down on cheating also matters.

I'm also skeptical of the idea that exactly three students in a small program cheated enough to warrant outright expulsion. It sounds like they said "let's kick out the students with the 'strongest evidence' and just fail/force the rest to repeat."
 
It seems unlikely that 17 (?) students cheated in such a way. It seems more likely some random school employee messed this up and hit the nuclear option before realizing that you can't put that genie back in the bottle once you do so.
 
Exactly. People who are extremely accomplished in one field often buy their own BS and think their gifts are transferable to other domains. They usually aren't. It's entirely believable that someone with a big title jumped the gun, accused students of abusing technology they (the admins) didn't really understand, and then totally botched the clean-up when called out on it. One of the central responsibilities of any administrator at a big-name school is to make sure the school keeps its reputation intact and avoids bad press. How's that working out?
 
This scared the hell out of me. My school uses a learning management system called Akila. I would often have Akila windows open on my laptop from the night before, or do last-minute review an hour before the exam (exam starts at 9 and you study until like 8:30-ish - no different than looking at your notebook right up until you walk into an in-person exam, which I did in college all the time).

I then shut my laptop lid, which puts it to sleep, and plug the exam password into my school-issued iPad, which becomes locked to Examplify (meaning that even if an Akila window or Notability is open on it, I obviously can't access it). I have my phone next to me on sleep mode so I can quickly contact admin if there are technical difficulties (but sometimes there are Akila tabs open on that, too).

Over the last couple of months it has occurred to me that I might get flagged for cheating. Even if my other devices are "asleep," I always wondered whether the logs make it look like I'm accessing course materials during the exam. I've recently started making sure I kill every browser window on my laptop before sleeping it, to ensure nothing looks sus. Seems I dodged a bullet.
 
I agree - if I hadn't seen this nightmare firsthand I probably wouldn't believe it either, because it's all just too surreal. For the record, the "experts" you mention produced a research brief to support their accusations, which claimed that it is impossible for browsers to automatically reload content on their own. I suspect everyone in this community knows how ridiculous that statement is, but just in case: multiple technology experts have confirmed that there are a number of things that can cause browsers to refresh on their own, including "background app refresh, meta-refresh, AJAX, automatic browser updates, and computer restart." One tech person said flatly, "if anyone who reported to me wrote a memo like this, I would fire them on the spot."
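If it helps to see how little machinery this takes, here's a toy Python loop standing in for any of those mechanisms (the URL is a placeholder, not a real Canvas endpoint). Every fetch lands in the server's access log exactly as if a human had opened the page:

```python
import time
import urllib.request

# Placeholder URL - illustrative only, not a real Canvas endpoint.
COURSE_PAGE = "https://example.com/courses/123/pages/ards-lecture"

# Re-fetch the page every 10 minutes across a 3-hour "exam" (18 fetches).
# Each request is logged server-side as an access, with no human involved.
for _ in range(18):
    try:
        urllib.request.urlopen(COURSE_PAGE)
    except OSError:
        pass  # a real background poller also shrugs off network errors
    time.sleep(600)
```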

As you suggest, the statistical error argument here is in some ways secondary, and is only intended to explain why Dartmouth thinks they found anything at all, and why that approach is statistically problematic. The bigger problem is that even in the selectively-filtered logs that were presented to students, 80% or more of what Dartmouth thinks they found is bogus -- either impossible for a human to do or clearly of no relevance to the exam questions that these materials were supposedly related to.
 
Unfortunately, I think this is exactly the right response. PSA: everyone who uses a learning management system should completely log out of it on all devices (including phones) during exam periods. Hopefully no other school will be as callously negligent in its use of these tools as Dartmouth has been. But from what I've seen, the overall level of understanding of academic tech in the med school community is atrocious, so I'd say it's pretty much a certainty that, in the absence of some kind of nationwide policy, this situation is going to arise again elsewhere.
 
I find it so hard to believe that Dartmouth admins are complete idiots who don't know how their exam technology works, and who hadn't even considered the possibility that students who'd cheat would likely also have stored or printed the material locally and offline to use in exams.

Come on, we're talking about a mid-tier MD school whose adcoms flex on being popular with nontrads and having small class sizes. Is a school like that really being this idiotic? That's a complete shame to med education.
 
It's a nonprofit in a town of 10k people dude, not Google
 
Oh booo. Seeing surgeons scream at everyone in the OR, seeing doctors covering each other's malpractice, seeing admins, NPs, and PAs ripping EVERYONE off hasn't relieved you guys of your idealism? On top of that, having your eval signed by a doctor you never met and getting 'meets expectations' across the board despite being at their hospital 12 hours a day for 6 weeks doing ##### work?

Cheating on a preclinical quiz/test/whatever doesn't mean you'll be a bad doctor or a bad person. It doesn't mean you have less of a knowledge base. It just means that you panicked and pulled up a powerpoint when it looked like no one was watching. By the way, I don't know a single medical student who isn't/hasn't been looking **** up on Google during these stupid online tests/quizzes. And IDC what some PhD wants to rant about OUR professionalism. They didn't do medical school. They don't know what it's like.

We're in a profession where a little bit of mercy goes a long way. Seems like you guys have none. Another way to look at it is that the administration expelled the people who told the truth and owned up to their mistake, and then couldn't/didn't do anything to the people who stuck to their stories or lawyered up. They expelled the genuine people - who, by the way, would have been great doctors - and selected for the people who would lie to their last breath.

To the premed who thinks we're a profession built on integrity: look a little further on this site, or r/medicalschool, or r/residency. It's all one big joke, and most of us would love to have the years and money back and do whatever else.
 
Med schools: "We care about your mental health because our accrediting body says we must"

Also med schools: "YOU'RE ALL CHEATERS AND ARE ALL EXPELLED! NO, WE WON'T HEAR YOUR VINDICATING OR EXONERATING PROOF!"

:rofl:
 
I just told my buddy about this (she knows about the Geisel story because she has a friend at Geisel) and said she should make sure to close her Akila tabs. She is very freaked out because she never thought this could lead to an erroneous cheating accusation, but she's gonna close all tabs now. (She's thinking, "what if my sleeping laptop shows I accessed something I didn't access?") And now she is informed, albeit stressed.
 
I'm having a hard time seeing how Dartmouth hasn't royally screwed this up and destroyed multiple lives needlessly. Every new piece of info that comes out makes them look worse and worse.

I hope some big-time lawyer decides to take this on.
 
Sorry you're burnt out but it doesn't justify someone cheating their way through preclinicals. People doing unethical things later doesn't mean we should normalize unethical things as students. Two wrongs don't make a right but three lefts do, or some such.
 
They still have massive resources though?
It's a mid-tier program in the middle of nowhere lol. I don't see how they'd manage to attract the best talent in educational technology or statistics.

The only thing they get from the prestigious parent institution is legal counsel.

And just like every other educational institution on the planet - yes, Harvard, Penn, Drexel - whether they can afford a proper investigation or not, they can bungle one. Even if the allegations were true, the mistakes already made, like prematurely accusing students and already having to dismiss the cases of 30 of them, reek of incompetence.
 
But we're talking about technology that had been in use for several months, if not a year, once everything went remote. Dartmouth not knowing how their software works in April 2020 is one thing, but this happened in April 2021. I'm aware they screwed up, but this level of incompetence is just very hard to believe.
 
Yup. I think the plethora of data they had access to, valid or not, hurt them as well... When you have so much information at your fingertips, it's tempting to read into associations and patterns that might not even be meaningful.

I have no doubt that the administrators and IT professionals have a decent baseline level of competence in their day-to-day work, but they should have taken the time to consult professionals with more experience.

And it goes without saying that their failure to do so traumatized some of their innocent students.
 
Yeah, I agree, but unfortunately the Dartmouth meltdown is damaging for students everywhere, since it becomes a lot harder to trust your school's IT now for fear of being falsely accused.

Someone on Reddit said the training license application asks whether you've ever been accused of cheating. Idk if that's true (@Banco @efle @DOVinciRobot ?) but if it is, even a false accusation that's dismissed can lead to problems down the line.
 
Yup.... Our school uses a mixture of proctoring and recording. Since I like to do a first pass of every question then use the restroom/walk and do a second pass, I've always wondered if that raised some eyebrows
 
There were many schools, particularly law schools, using ExamSoft where students could find the exam stored on the PC. You are giving too much credit to admin here. They have no idea how much of it works, and many IT employees at schools are not really IT people but more like "I can't get the printer to work / I can't get the login to work" people, not true network and IT system administrators.
 
You are going to be really upset when you get into the working world and realize people in charge of stuff like this barely have a cursory understanding of the product. It's modern America.
 
Lol, this is just a classic display of admins being globally incompetent. I mean, God...even covering their own butts is a struggle
 
I do think being at the 90th vs 75th percentile on the MCAT gives a lot more confidence in the ability level of the student than Step does, given that the MCAT is specifically designed to stratify applicants and Step is not.
I feel like there are way too many variables that go into an MCAT score to make a statement like that. How long did person X study versus person Y? How many times did they take it? Is your true potential represented in your first score, or your latest score if you took it multiple times? How does that compare to someone who took it once? Additionally, if you look at the FL scoring charts, you can see that it gets to a point where every 1-2 additional correct questions is another point for that section. The difference between your score and a score 10 percentile points higher could be one more question correct in each section. What if you never took a P/S class and just bombed that section, accounting for the entire difference of 6 points? What if you're ESL and just bombed CARS?

Anecdotally, I know someone who scored 90th+ percentile on the MCAT, while I scored 51st percentile, and yet I outperformed them on numerous tests in the same classes in undergrad and have finished classes with an entire letter grade above them. They studied for the MCAT for many months, took a paid prep class on it, and then performed well on test day. I studied for one month and then bombed it on test day after not sleeping. Who will do better in med school? I don't know.
 
Yes to all of the above. This situation is unbelievable, and "a complete shame to med education." That unfortunately doesn't make it any less true.
 
Yeah, and even if the MCAT was highly reflective of one's reasoning ability and preparation wasn't a significant factor, it isn't reflective of the skillset used in medical school. Med school is like 20% reasoning and 80% memorization, and the rate-limiting step to doing well is usually memorization. There's a modest amount of memorization for the MCAT, but it's nothing any reasonably book-smart person can't do with effort - med school memorization on the other hand is a different beast
 
One of the early lessons in medical school is to understand that the first rule of "medical school admin/faculty" is to never, ever admit you were wrong about anything. If you show any self-doubt, then the grift is over. The way these admins see it, if they admit they did anything wrong here they are ****ed, and it's always "better you than me" in the medical school admin/student dynamic.
 
It's hard to say. Just going off the students' letter, it sounds like the school may have initially acted on the noise, but then Appendix A suggests that continually accessing the same file in what's clearly an automatic refresh was one of the cited grounds for exoneration. I wonder if we aren't getting information generated at different points in the story, i.e. is the outside expert talking about the data from before the first big group was exonerated using the Appendix A criteria, or is this still the kind of evidence the school is using?
 
I second this! These IT sysadmins are not JavaScript developers or savvy programmers who scan through web pages and log files frequently. Most of the IT people I have dealt with primarily install software and go through a set of checklists to ensure network security. Starting an investigation like this requires proper procedures. I just don't understand how an accredited medical school could expel multiple students based on such a rash investigation. I hope you guys understand that these poor students are real human beings. You simply can't ruin someone's career like this.

Also, please note that there may have been many medical students this year who looked things up on Google or had the lecture slides printed out. Are they disciplining those students? People make mistakes. Why not help them learn? If I were faculty, I would try to help my students learn from their mistakes and prepare them for their futures. I wouldn't get involved in any of these ethics boards that are destroying people's careers. It's EVIL.
 
Yes, good point - let me try to spell out exactly what happened with these various waves of cleaning.

1. The school did its initial dragnet pull of Canvas data, which generated log files for around 40 students (we're told) whose Canvas accounts showed some activity during times they were taking exams.

2. The school saw that a bunch of the log files they had initially produced looked totally absurd, and promptly threw out all the ones that were so massive/random/unrelated to exam content that anyone could see that they weren't generated by humans.

2a. What the school didn't do at this point is ask, "huh - why did we just have to throw out so many log files that looked totally ridiculous? Is it possible that there are automatic refresh processes at work that we don't understand?" They may have discarded logs that featured only large numbers of repeated refreshes of one or two files, but they definitely sustained multiple cases involving logs that had 2-3 page refreshes in the span of less than ten minutes.

3. Instead, the school simply discarded the logs that didn't tell a story that was useful to them, and kept the ones that superficially appeared to tell a story of academic integrity violations. According to the school, this initial cull brought the number of accused students down to 17.

4. The school then did another wave of data cleaning, to remove from the 17 remaining logs all of the data points that didn't tell a clear story of cheating. This resulted in a bunch of log files that looked far more clean-cut and less messy than they actually were - and it basically hid the fact that there was tons of traffic in these logs that didn't make any sense.

5. But there was one more problem. The matching-up of Canvas and ExamSoft data, and the selective filtering of all data that didn't represent a clean match, appears to have been done by IT people who really didn't understand either the principle of autonomous page refresh or the course material. As a result, a lot of the match-ups the IT folks thought they had found turned out to be bogus, despite their cherry-picking. They accused students of cheating if there was Canvas activity on a course page whose title seemed relevant to an exam question -- but those Canvas pages often turned out to be things like empty discussion forums and instructor announcements with no meaningful content. Actually opening and reviewing the Canvas pages cited in these logs would have fully exonerated at least 4 of the 10 students who still remain accused.

This, by the way, is where the statistical argument comes in: the school basically went on a fishing expedition, discarding datapoints they didn't like at multiple levels of analysis, until they had generated clean-looking logs that seemed to tell a damning story. And even with all of this selective culling, they messed up a significant percentage of the time and included many associations that weren't actually problematic at all. And in case you're wondering: no, there's no evidence that these file-access patterns were the result of students flailing around trying to find specific answers in a bunch of places. The kinds of resources that show up in the logs are just not things that any sane person would access in the hope of finding useful information during an exam. Additionally, in several of these cases there existed one or more very obvious Canvas files devoted to the exact topic of the flagged exam question, which would have been the first place students would have looked for the answers to the questions they were accused of cheating on, had they actually been trying to cheat. The students did not access these very obvious files, because they were apparently too busy staring at blank discussion pages.
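If you want to see how this kind of multi-stage culling can manufacture a damning-looking exhibit out of pure noise, here's a toy simulation. Every number, topic, and timestamp below is made up for illustration:

```python
import random

# Toy model of the "fishing expedition" described above: start with lots
# of random, meaningless log rows, keep only the ones that superficially
# "match" an exam question, and throw the rest away.
random.seed(0)
TOPICS = ["ARDS", "nephron", "glycolysis", "endocarditis", "logistics page"]

raw_log = [(round(random.uniform(0, 180), 1),   # timestamp, minutes
            random.choice(TOPICS))              # page "topic"
           for _ in range(200)]                 # 200 rows of pure noise

exam_questions = [(30.0, "ARDS"), (95.0, "nephron")]  # (time, topic)

# The filter: keep rows that share a topic with some question AND fall
# within 5 minutes of it - then discard everything else from the log.
kept = [row for row in raw_log
        if any(row[1] == topic and abs(row[0] - t) <= 5
               for t, topic in exam_questions)]

print(f"{len(raw_log)} random rows in, {len(kept)} 'damning' rows out:")
for row in kept:
    print(row)  # a clean-looking exhibit, built entirely from noise
```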
 
I mean, there have been residency programs that completely forgot to certify a match list and had to SOAP every spot or go unfilled. Is it that unbelievable that a school could overlook something like this? Or, even worse, that they realized they overlooked it and just doubled down?
 
I don’t know how Dartmouth can come out of this looking like anything but a monster, and they shouldn’t. They accused completely innocent students of cheating without having done their due diligence before making the accusation. It doesn’t appear they made any sort of appropriate apology either. Maybe those 3 who were expelled deserved it, maybe they didn’t, but the school was irresponsible and cruel to students at a time when they need support the most. This should cost people their jobs at the very least.
 
I'm not a Dartmouth student but my dad sent me this article this morning
Cheating allegations engulf Dartmouth medical school - The Boston Globe

TL;DR: Some medical students at Dartmouth were accused of cheating because their accounts pinged Canvas during their exams. Some admitted to cheating; others stated that their Canvas apps were background-refreshing, which caused the pings.
Sequela of Step 1 going to pass/fail. Let's put our classroom exams on a pedestal.
 
Jeez, the entire first half of the article is the students whining about how the school hasn't supported them during the pandemic - which has nothing to do with cheating - and then the latter half is basically all of the interviewed students not taking responsibility, or claiming they were coerced when they did take responsibility. Is personal accountability just out the window these days?? Why would one's personal laptop ping another website unprompted, coincidentally for only 17 students btw? There is literally no reason for that; Canvas is not a live-feed type of website that continuously updates your status to others. Further, they already eliminated the cases where students were allowed to use their course material... which literally supports that they successfully caught students using it to access course material.

It is not an unfathomable idea that M1s are cheating a lot more this year. They easily can. If you have a second laptop, it doesn't matter whether you have kill-switch test software or not. I've already had classmates admit to me that they've been looking stuff up on their second laptop, and I can easily believe certain students doing so on our more challenging exams. It's sad, but I'm not sure why these students are acting as though it is a totally alien concept.

People these days just cannot take personal accountability and must blame their poor choices on others. "COVID made me do it." "I have no emotional support." "Dartmouth is too aggressive about the claims." OK???? If you cheated, you cheated. Everyone understands that's a death sentence in medical school. Own up.

Side note: I also disagree with Dartmouth's censorship of its students by disallowing disparaging expression. That is pretty concerning. Any federal funds should be cut immediately if they're receiving them while actively enforcing censorship.
 