March 30, 2020
Henrietta Lacks, the Tuskegee Experiment, & Ethical Data Collection: Crash Course Statistics #12

Hi, I’m Adriene Hill, and welcome back to Crash Course Statistics. Today we’re going to step back from sampling and regressions to talk about the impact of all that statistical gathering. We’ve seen that the interpretation of this information can have real, lasting effects on our society, but its collection can also have lasting effects on the subjects. The process of gathering and applying statistics can affect real people’s lives, which means there’s a responsibility to gather and use this data ethically. Today we’ll discuss five stories. Four of them are real, and all of them can help us learn where collecting data can go wrong and how we can help prevent these things from happening again.

INTRO

Our first story begins in 1822, when a young fur trapper named Alexis St. Martin got shot in the stomach after another trapper’s gun accidentally went off. The wound was serious, but a local army doctor, William Beaumont, was able to stabilize St. Martin through a series of presumably painful, anesthetic-free surgeries. Dr. Beaumont couldn’t fully close the wound, though, which left a small hole, called a gastric fistula, that allowed access to the stomach.

St. Martin was out of a job, since it’s hard to be an active fur trapper with a hole in your stomach. So he signed a contract to become a servant to Dr. Beaumont. In addition to traditional chores, St. Martin participated in all sorts of experiments at the whim of the doctor. Beaumont used the gastric fistula to study how the body digested food. He made huge strides in the field, including exploring the influence of mental disturbance on the process of digestion and correcting the long-held belief that the stomach digested food by grinding it up. When the two finally parted ways in 1838, Beaumont spent the last 15 years of his life pleading with St. Martin to come back. Unsurprisingly, St. Martin declined.

Without this strange situation, the field of gastroenterology might have progressed more slowly. In fact, St. Martin’s fistula was an inspiration to Pavlov, who used fistulas in dogs during his famous classical conditioning experiments. But all this progress came at a cost, to St. Martin and also to those dogs.

One of the most important ethical considerations in research is whether the humans who participate are feasibly able to say “no.” People with little power, few resources, or little money can be coerced into participating in experiments that they’re uncomfortable with. Most research institutions have a committee, called the Institutional Review Board (IRB), which oversees all the research at that institution to make sure it’s ethical. “Voluntariness” is one of the most important things the IRB checks for. It prohibits people with undue power or influence over us from asking us to participate in a research study. For example, your boss or professor is limited in how they can ask you to participate in a study, because you might feel that you have no choice: you’d have to participate or they might fire you or give you a failing grade. Ethical research needs to be voluntary, at least in humans. Animal rights activists argue that since animals cannot volunteer for a study, we shouldn’t use them at all.

In addition to participating voluntarily, subjects should also know what will happen to them during the study. This was not the case in 1932, when the Tuskegee Institute began a 40-year study of over 600 black men. Under the guise of free medical care, the men were secretly enrolled in a study to observe the long-term progression of syphilis. Over 300 of the men enrolled had the disease, but researchers failed to treat them with anything but fake or innocuous medicines like aspirin, even after it became clear that penicillin was a highly effective treatment. Late-stage symptoms of syphilis include serious neurological and cardiovascular problems, yet the Institute allowed the study to go on, and some of the men’s wives and children also contracted syphilis. In 1972, public outrage forced the study to shut down after news of its unethical conditions was leaked to the media.

In 1951, while the Tuskegee study was still running, a poor tobacco farmer named Henrietta Lacks went to Johns Hopkins Hospital in Maryland, where cells from her tumor were collected without her knowledge or consent. These cells were used to grow a new cell line, called the HeLa line, which scientists use for in vitro experiments. The cells’ ability to thrive and multiply outside her body made the line enormously useful to researchers. It is still used in medical research today, lending itself to cancer and AIDS research as well as immunology studies like the one that helped Jonas Salk develop the polio vaccine. And in 1955, HeLa cells became the first human cells to be successfully cloned. Over time, the cell line and the discoveries it facilitated became extremely lucrative for researchers. But Lacks and her family didn’t receive any financial benefit.

These studies emphasize the need for informed consent. Subjects not only have the right to receive all the facts relevant to their decision to participate; they also have the right to understand them. Many institutions require that information be presented clearly and in a way that’s appropriate for the subjects’ comprehension level. Even children, whose parents are legally allowed to consent for them, must get an age-appropriate explanation of what will happen in the study. This is incredibly important because it respects the dignity and autonomy of the subject, allowing them to stop research procedures at any time. That, in turn, incentivizes researchers to design studies with more acceptable levels of risk.

In all three of those stories, the research procedures offered no benefit to the subjects themselves. In 1947, the Nuremberg Code was created to establish guidelines for the ethical treatment of human subjects. One of its main tenets is beneficence, which requires not only that researchers minimize the potential for risk to subjects, but also that any remaining risk be outweighed by potential benefits to the subject and the scientific community. The Nuremberg Code was created and implemented after the Second World War, during which horrifying experiments were conducted on prisoners in Nazi concentration camps. It lays out ten principles to which modern-day studies must still adhere. These ten principles stand as the basis for much of current research ethics and include things like voluntariness, informed consent, and beneficence.

But as we settle into the age of technology, the application of these ethical principles can get cloudier. Our last story isn’t real, but it illustrates the complexities of research ethics in the digital age. In the seventh season of the hit show Parks and Recreation, a giant internet corporation comes to the small town of Pawnee, Indiana, and offers free WiFi to the entire city. Everyone gladly accepts; they like the free service. But when boxes of personalized gifts arrive at every citizen’s doorstep, some become, understandably, concerned, because the gifts are perfect, matching the exact interests of each recipient. Someone who collects stuffed pigs dressed as celebrities gets a “Hamuel L. Jackson,” and someone obsessed with politics gets the newest Joe Biden poetry collection. These boxes are perfect for the people who received them. Eerily perfect. So how did the internet company know what each person would want? It turns out that the free WiFi came with a pretty high cost: privacy. In exchange for the free WiFi, the internet company, Gryzzl, was collecting all the data transferred over its network, a practice called data mining.

It may seem far-fetched, but it’s happening right now. Not the gift stuff; the data mining. Grocery stores track what we buy with our rewards cards. Netflix keeps track of everything we watch. Amazon knows exactly what we buy and what we look at. And those Terms of Service agreements we click through without reading when we download an app or sign up for a social media account often include some kind of stipulation about our data. When we use “free” internet services, we’re agreeing to pay, not with money, but usually with our information. Facebook and Google offer their services for free in part because they’re profiting off of our data. They might be using it for research, or to customize our experience on their sites so that we buy or watch more stuff on Amazon and YouTube. They also use it to sell targeted ads, giving advertisers the opportunity to select exactly the type of people who will see their ads. And sometimes the way these ads are targeted can be pretty unethical; for example, companies have discriminated based on age by specifying that job ads should be shown only to young people.

Data is being used in ways that affect every facet of your life. But since we’re still in the beginning stages of this huge influx of digital information, we get to see the progression of ethics in this area unfold in front of us. The laws that will protect your data and privacy, the way the Nuremberg Code protects participants in scientific experiments, are still being written, and many of the same concepts are coming up. For example, the internet, Google, and social media have become so entrenched in some societies that it’s almost impossible to hold a job without them. If that’s the case, we need to ask whether it’s ethical to require that users sign over their right to privacy in order to use them, or whether, as in most clinical studies, that would border on coercion. We also need to ask whether companies that use or sell our information should be held to the standard of “informed consent,” which requires agreements to be in language simple enough for the user to understand what they’re agreeing to, even without a law degree. Or, on the other hand, whether they should be exempt from this requirement if they only use the data internally.

It’s possible to draw parallels between data mining and the stories we talked about at the beginning of this episode, though admittedly they’re not quite as harrowing. Just as Alexis St. Martin may have felt pressure to stay with Dr. Beaumont because he couldn’t work as a fur trapper anymore, it can be argued, to a much lesser degree, that we use sites like Google and Twitter because we feel there’s no other option as we try to remain informed in our hyperconnected world. And we might not be getting all the information we need, in an understandable way, to consent, much as Henrietta Lacks was never told that her cells were being taken or what they’d be used for. These situations are obviously not exactly the same, and we, as a society, need to decide how to apply the principles of research ethics in these new digital spaces.

As we move forward and gain the ability to do things like sequence an entire genome in days rather than years, we open the door to amazing advances in personalized medicine that could save millions of lives. But we also open the door to abuse of this sensitive information. The conversation about how to handle these situations is still going on. We are the ones who will decide what is said, and we’re going to be the subjects of those decisions. Thanks for watching. I’ll see you next time.

83 thoughts on “Henrietta Lacks, the Tuskegee Experiment, & Ethical Data Collection: Crash Course Statistics #12”

  1. I read the biography of Henrietta Lacks and was surprised that I hadn't known about her before. I still don't understand why this hasn't been given the attention it deserves before.

  2. I disagree with your choice of the word "fail" for describing how the doctors "treated" the Tuskegee victims. It implies that the doctors didn't know what they were doing. A better word would have been "refused."

  3. If it makes money, companies will find a way to do it even if it's illegal or unethical. Who's going to stop them? The government? If it tries, there will be whines of "Socialism!!!" from business-friendly media companies and politicians. The public won't stop it unless they personally are inconvenienced too much. I'm not sure what could be done to make sure businesses behave ethically, period, because these days it seems like the least ethical companies are the most successful ones.

  4. Make ISPs give internet for free until new laws are written; they mine all data alongside all the "free" service sites. If all stays the same: make all "free" service sites and internet providers pay users for the data they collect, and provide info on what data they collected and how they are using it.

  5. The doctors in the Tuskegee Experiment did not just fail. They actively prevented the patients from seeking treatment, clearly knew there was a cure, but put their experiment before their patients. For those who argue this is not related to stats: every discipline has some sort of ethical code that ought to be followed. Stats has various ethical issues not covered here: John Oliver covered p-hacking. This topic is often covered in political science, sociology, and anthropology research methods courses. Any discipline dealing with stats ought to cover ethics.

  6. I literally just finished reading The Immortal Life of Henrietta Lacks today. Coincidence? Probably.

  7. What about behavioral or psychological experiments, where the participant knowing what's going to happen would ruin the results?

  8. Just throwing this out there – the Japanese also conducted human experiments during WWII that were every bit as horrific as the Nazis'. It was kept quiet for a long time, so it probably didn't contribute to the Nuremberg Code at all, but yeah. Unit 731.

  9. "To gather and USE this data ethically". Ethical considerations should only relate to the gathering of data (as in the case of all the real cases presented ), not how it is used. It is the difference between talking about facts and talking about what we wish were true. Eg. I find the knowledge of X to be used incorrectly by Y% of the population, lets 'ethically' ban seeking knowledge of X even if the gathering is done ethically. Ppl can be quite loose with Y and 'incorrectly'. Moreover, given it is a consequentialist argument of some harm, and given the degree is unspecified as implied by the undefined Y, then correlations at Z significance or deductive arguments about possibility generally make any argument of the sort true.

  11. I absolutely love this episode! I'm writing about ethics in information technology, and that last minute of the video was so on point!

  12. I don't think it is very ethical to let somebody decide the fate of the progress of modern medicine just because they don't want to participate. The HeLa cells allowed doctors to save countless lives and are still doing so today. Letting people know what is involved in a study is all well and good, but more often than not the researchers don't even know what they are going to discover. It is not like they knew from the beginning that HeLa cells would be so valuable. Even once we know the value of these cells, is it ethical to leave them hostage to the will of one person? There has to be a line where the good of the many outweighs the good of the few. Is it ethical that the HeLa cells make money for people who contributed nothing to their application and usage (i.e., Henrietta Lacks' family)?

    Informed consent only works if the information is actually available to be had. Agreements that can be renounced also threaten the people counting on those agreements' boundaries. Should you be able to go back and attack people for doing something you agreed to beforehand, just because you now have access to new information? Is that fair?

    I agree that doctors have a moral duty to explain, as well and as clearly as they can, what they think is going to happen in a study and what their aims are. They also have a duty to keep updating the volunteers on the usage of that data. However, expecting total and absolute control over one's own data is dangerous too; there has to exist such a thing as a common good, something that overrides the selfish impulses of individuals.

  13. The government needs to respect the test subjects' worth to the researchers, not just their usage. And what was with pardoning all the sadistic doctors and evil scientists if they joined up with the winners and shared their findings? We see who taught us our morals and why.

  14. Governments do not understand the current technology (reference the Congressional questioning of Zuckerberg) and they are all firmly in the pockets of business, so do not expect anything "pro-privacy".

  15. You did a video about ethics, and mentioned Pavlov's dogs, but completely left out that he did those exact same studies on children?

  16. Noticed the opening scrolls just a bit too fast; I always have to pause to read the damn labels. It had Yuma as the sunniest place in the world. Glad to report that it is sunny today after two dreadful days of clouds.

  17. How do you feel about 'post-hoc informed consent' (i.e. first gather the data, then ask permission to use it, delete if not granted)?

  18. For Europeans, the question of whether we should get informed, voluntary consent has already been resolved. With the new privacy regulation that will be enforced starting in May, companies that use the personal data of EU citizens without informed, voluntary consent will risk crippling fines. It's still okay to abuse Americans, though. 😉

  19. Human experiments, FTW. They are happy to be part of something larger, I say.

  20. We don't even get the gifts. I'm glad some of the classics, and some hand-wavey vagueness about the truth of modern human experimentation, are covered. We do still want to sleep.

  21. DO AWAY W/THE DIGITAL NUREMBERG CODE! SERVE YOUR COUNTRY/COMPANY(S)!

  22. Wait, wait, you mean to tell me I didn't have to take all those stupid social, borderline class-discriminatory polls my professor made us take? Like I had a choice??!?!! Boy, if I was a citizen and had rights, what I'd do…

  23. I think the important thing to remember with data collection is that you, the user, are never personally identifiable. Technically speaking, it's not you that's being sold, it's the behaviours or indicators of a group of users (that you happen to be a part of) that's being sold.

  24. There's also the experiments companies paid prisons to do on prisoners in the US; does anybody remember the name of that?

  25. A very informative, sobering look at ethics, not just in research and technology, but also in social media and business. In addition to key requirements such as voluntariness and beneficence, privacy and informed consent, we whose googol of data are collected every single moment ought to have the opportunity (and the choice) to share in whatever profit is gained from our very data! I wonder how much Facebook, Google, and Amazon will be inclined to exploit our data, if profit sharing with their users were to become an international requirement.

  26. If you are scared about the data mining thing, get adblock so that you can stay ignorant about what's going on.

  27. …historically, doctoring had much of its root system in sibling responsibility: the parents did all they could for their first child who then carried forward for his/her youngers, their seconds…and then there was the doing-unto-others modus operandi where doctors tried their-own medicine, rendering them willing to try worse on their dying patients…and then there was the financial-gain m.o.m. (motive-opportunity-means used to profile criminals) where treating government-drafted soldiers to return them as ticking-bombs 'puking with vengeance' to the frontline was lucrative and focused as on results-oriented-certainty not necessarily those you mention and giving rise to the double entendre of the hippocritical-hippocratical do-no-worse oath and statistics that imagined best-worstcase scenarios…

  28. …abuses of social media include political attack ads on the State Governor who flustered to remember his Twitter password when he was supposed to inform the public that there was-no, incoming ICBM, as if social media had anything to do with national security, ranting like 'commie-pinko sympathetizers' pretending to legally own their contagious disease samples (unless-of-course if they're guilty they should own-up)…

  29. That is weird. I stumbled upon this video right after I watched another video about HeLa cells. Kinda beneficial, I must say.

  30. Probably the worst unethical consequence of private information gathered by Internet and software companies is income inequality. Poverty is one of the biggest issues our world faces, and poverty is mainly the consequence of the unequal and unfair distribution of economic production. Social network, software, and Internet companies have grown so much that they have billions or trillions of dollars in their bank accounts, while half of the world's population is forced to survive on 3 or fewer dollars a day. Amaz*n, Faceb**k, Micr*s*ft, G**gle: they are the best examples I can imagine of companies causing income inequality and poverty, and one of the ways they do so is by gathering our private information and selling it for commercial, political, etc. purposes.

  31. WHAT I watched the Crash Course Anatomy about this experiment YESTERDAY. First time watching both episodes and also first time I heard about the experiment. Coincidences, man!

  32. Ask yourself: does progress always come at a price? Are some experiments too risky, or just wrong? A little curiosity can't hurt anyone… can it?

  33. Because animals cannot consent, we should not be testing on them. I'm glad she mentioned this– too many discussions of ethics only talk about humans. Animals can suffer, too, and have a right to their own lives, just like you and I do.

  34. Li'l Sebastian flying on a drone LOL, I'm dead LOL!! All fun aside, this video was excellent. The Crash Course episodes have helped me so much in my ethics class!!

  35. Lacks had the first immortalized human cell line. I guess she doesn't want to tell you that a black woman had immortal genes.
