A Brief History of Bloodletting

Several thousand years ago, whether you were an Egyptian with migraines or a feverish Greek, chances are your doctor would try one first-line treatment before all others: bloodletting. He or she would open a vein with a lancet or sharpened piece of wood, causing blood to flow out and into a waiting receptacle. If you got lucky, leeches might perform the gruesome task in place of crude instruments.

Considered one of medicine’s oldest practices, bloodletting is thought to have originated in ancient Egypt. It then spread to Greece, where physicians such as Erasistratus, who lived in the third century B.C., believed that all illnesses stemmed from an overabundance of blood, or plethora. (Erasistratus also thought arteries transported air rather than blood, so at least some of his patients’ blood vessels were spared his eager blade.) In the second century A.D., the influential Galen of Pergamum expanded on Hippocrates’ earlier theory that good health required a perfect balance of the four “humors”—blood, phlegm, yellow bile and black bile. His writings and teachings made bloodletting a common technique throughout the Roman empire. Before long it flourished in India and the Arab world as well.

In medieval Europe, bloodletting became the standard treatment for various conditions, from plague and smallpox to epilepsy and gout. Practitioners typically nicked veins or arteries in the forearm or neck, sometimes using a special tool featuring a fixed blade and known as a fleam. In 1163 a church edict prohibited monks and priests, who often stood in as doctors, from performing bloodletting, stating that the church “abhorred” the procedure. Partly in response to this injunction, barbers began offering a range of services that included bloodletting, cupping, tooth extractions, lancing and even amputations—along with, of course, trims and shaves. The modern striped barber’s pole harkens back to the bloodstained towels that would hang outside the offices of these “barber-surgeons.”

As hairdressers lanced veins in an attempt to cure Europeans’ ailments, in pre-Columbian Mesoamerica bloodletting was believed to serve a very different purpose. Maya priests and rulers used stone implements to pierce their tongues, lips, genitals and other soft body parts, offering their blood in sacrifice to their gods. Blood loss also allowed individuals to enter trance-like states in which they reportedly experienced visions of deities or their ancestors.

Bloodletting as a medical procedure became slightly less agonizing with the advent in the 18th century of spring-loaded lancets and the scarificator, a device featuring multiple blades that delivered a uniform set of parallel cuts. Respected physicians and surgeons extolled the practice, generously prescribing it to their most esteemed patients. Marie-Antoinette, for instance, seemed to benefit from a healthy dose of bloodletting while giving birth to her first child, Marie-Thérèse, in 1778, 14 years before the guillotine would shed more of the queen’s blood. As an excited crowd thronged her bedchamber, hoping to witness a dauphin’s arrival, the mother-to-be fainted, prompting her surgeon to wield his lancet. Marie-Antoinette immediately revived after the bloodletting—perhaps because the windows were simultaneously opened to let in fresh air.

America’s first president was less fortunate than France’s most infamous queen. On December 13, 1799, George Washington awoke with a bad sore throat and began to decline rapidly. A proponent of bloodletting, he asked to be bled the next day, and physicians drained an estimated 5 to 7 pints in less than 16 hours. Despite their best efforts, Washington died on December 14, leading to speculation that excessive blood loss contributed to his demise. Bloodletting has also been implicated in the death of Charles II, who was bled from the arm and neck after suffering a seizure in 1685.

By the late 1800s new treatments and technologies had largely edged out bloodletting, and studies by prominent physicians began to discredit the practice. Today it remains a conventional therapy for a very small number of conditions. The use of leeches, meanwhile, has experienced a renaissance in recent decades, particularly in the field of microsurgery.


A Brief History of Whiteness

By Winona Guo and Priya Vulchi

The racial categories that we’re familiar with were developed only about 200 years ago, primarily by England and Spain. Otherwise cut off from the rest of the world, England repeatedly invaded Ireland, labeling its people savages — in fact, the cruel saying “the only good Indian is a dead Indian” first circulated in England as “the only good Irishman is a dead Irishman.”

A little less than 2,000 miles away from England, Spain, loyal to the Catholic Church, was offering the Jewish and Muslim people under its rule three choices: “leave, convert, or die.” While many Jews and Muslims converted to Catholicism to escape persecution, church leaders questioned their sincerity, leading to the 1478 Spanish Inquisition, during which “interest in religious purity morphed into an obsession with blood purity,” as Matthew Desmond and Mustafa Emirbayer write in Racial Domination, Racial Progress.

In both England and Spain during this time, nationalism and capitalism began to rise. To satisfy Europe’s growing sense of nationalism and hunger for capitalism, the Age of Discovery began — “or, from the standpoint of the indigenous people of Africa and the Americas, the ‘Age of Terrorism,’” write Desmond and Emirbayer. When Christopher Columbus “discovered America” — aka happened upon an island in the Bahamas that was already inhabited — the Americas were populated by approximately 50 million to 100 million indigenous people.

Following Christopher Columbus’ lead, the Spanish colonized the Americas; the English followed a century later. From 1600 to 1900, 90 to 99 percent of America’s indigenous peoples died as a direct result of European colonization.

With the rise of nationalism, capitalism, and European discovery of the “New World” — which, again, was only “new” from a European perspective — a different worldview was desired to make sense of it all. Through colonialism, “race” became a key element of that worldview.


To further their capitalist interests in the “New World,” the English needed a labor force, so indentured servitude began. Indentured servants were often kidnapped; they included Irish, impoverished English, indigenous, and African people. (Note how the English and Irish are identified as people from two separate nations, whereas indigenous and African people, all from different nations, are treated as two monoliths.)

Indentured servitude evolved into chattel slavery. Of all the indentured servants, why were Black people singled out for enslavement? It couldn’t be Native Americans, because their numbers were declining rapidly, they could escape their captors more easily since they were familiar with the land, and they were already relied upon as trappers in the lucrative fur trade. It couldn’t be the “savage Irish” because, upon escaping, Irish slaves could “blend in” with their English captors.

Africans, however, could not blend in. Furthermore, Africans were not accustomed to the American landscape, making escape from captivity more difficult; they were also immune to Old World diseases, unlike Natives, and many were already farmers. Africans soon came to be seen as “the perfect slaves,” though originally not strictly because of their Blackness.

Thus, Whiteness and Blackness were born: “twins birthed from the same womb, that of slavery,” write Desmond and Emirbayer. The White race began to be formed “out of a heterogeneous and motley collection of Europeans who had never before perceived that they had anything in common.”

Whiteness remains the dominant category today — other races are compared and contrasted relative to it. Whiteness positions itself against ideas of, among others, Blackness, Indigenousness, Asianness, and Hispanic-ness. This is why people of color, rather than White people, will frequently be identified by their race. Whiteness has become the norm.


A brief history of AZT

AIDS - The Growing Threat, What’s Being Done. Time Magazine, Aug. 12, 1985.

Editor’s note: The museum is planning a special display of artifacts to go on view in 2011, commemorating the 30th anniversary of the first reports of HIV/AIDS. The museum's medical sciences collection includes artifacts such as a panel from the AIDS Memorial Quilt, an HIV test kit, anti-viral drugs, prevention posters, and red awareness ribbons.

While researching the big events in 20th century medical history for a timeline the museum is putting together, I found that the drug AZT and the HIV/AIDS epidemic are topics that especially interest me.

AZT, also called zidovudine (ZDV) and Retrovir, was the first approved HIV/AIDS drug. It is a reverse transcriptase inhibitor, a type of medicine that blocks the enzyme HIV uses to copy its genetic material into DNA, which reduces the amount of the virus in the blood (the viral load).

AZT samples in the museum’s collections

AZT was approved by the FDA on March 19, 1987. It was approved in record time, with only one human trial instead of the standard three, and that trial was stopped after nineteen weeks. The study was halted because patients on the placebo were dying faster and the need for a treatment outweighed the need for full testing.

Pamphlet, “100 Questions and Answers: AIDS”, New York State Department of Health, July, 1991.

AZT is a controversial drug. For example, some physicians say that a patient can start taking AZT at any time. Some say the threshold is five hundred CD4 cells (T-cells) or below. Others say never to take AZT. AZT is also used to reduce transmission from mother to child during pregnancy and labor. The pro side says it reduces transmission; the con side says it may cause birth defects. Another issue is whether it makes symptoms worse. The medicine can prolong life, but it might also kill healthy cells. There are stories of children dying from AZT.

Would you take a child out of their home because someone is not giving them their meds? What if you were a mother who already had one child you believed had died from the drug’s effects? Would you give another child the drug, or risk having him or her removed from the family? These are questions Valerie Emerson brought up in her 1998 custody battle. The case was about a mother’s right to keep her child even though she did not give the prescribed meds, which was considered child abuse. She won her case.

Editor’s Note: You can find more information on the history of HIV/AIDS around the Web:

  • National Institutes of Health In Their Own Words: NIH Researchers Recall the Early Years of AIDS
  • amfAR Twenty-five Years of HIV/AIDS: Snapshots of an Epidemic
  • Kaiser Family Foundation Global HIV/AIDS Timeline
  • National Library of Medicine Against the Odds: Making a Difference in Global Health
  • Food and Drug Administration HIV/AIDS Historical Time Line 1981-1990
  • Our online exhibition on HIV and AIDS

Meecha Corbett is an intern in the Division of Medicine and Science at the National Museum of American History.


Here's a brief history of the rise and fall of metzitzah b’peh — the blood-sucking circumcision ritual

Here's a brief history of the controversy surrounding metzitzah b'peh:

• Metzitzah b'peh, which literally means oral suction, is first mentioned in the Babylonian Talmud in tractate Shabbos, which dates back to the 4th century. There, Rav Papa says that any mohel who doesn't perform metzitzah b'peh is putting the baby's life at risk, arguing that the suction somehow prevents infection.

• In 1831, a German professor published a handbook for mohelim. He tried to demonstrate that there was no evidence that the ritual served a therapeutic purpose; in fact, he concluded, it could actually harm the baby.

• Six years later, a student of Rabbi Moses Sofer, a leading Talmudic scholar, asked him about several babies who appeared to become sick after they were circumcised by a mohel who did oral suction. In a famous response published in a journal in 1845, Rabbi Sofer concluded that metzitzah does not have to be done orally. Instead, a mohel could use a sponge to clean the wound.


A Brief History of Fake Blood

This Tuesday is Halloween, the time of year when we all douse ourselves in fake blood and watch gory movies. But what is that red stuff actually made of, and how has the recipe changed? In 2013, around the release of the new Carrie, Forrest Wickman explored the history of the ever-evolving recipe for fake blood. The post is reprinted below.

When it comes to adaptations of Carrie, the blood literally comes in buckets. For the newest version, director Kimberly Peirce was determined to get the climactic drop of pig’s blood just right. As she described it in a recent New York Times Magazine profile, she tried three-gallon, four-gallon, and five-gallon buckets, and she tried a three-foot drop, a four-foot drop, and a five-foot drop. Trying all these different configurations required take after take after take. When she asked Brian De Palma, director of the classic original Carrie (1976), how many takes it took him, he apparently replied, “What do you mean? We did one.”

Movie gore has come a long way since the first Carrie. What pumps through our veins hasn’t changed a drop, but what goes in those buckets has been reformulated again and again.

Fake movie blood—sometimes called “Kensington Gore,” after the street of that name in London—began evolving long before 1976. For black-and-white films, when blood was permitted at all (the censorship guidelines of the Hays Code in Hollywood didn’t much allow it), filmmakers used something quite simple: chocolate syrup. On black-and-white film, it made a starker contrast than red blood, and no one in the theater would ever know it was just Bosco or Hershey’s.

At first, technical advances were modest. For Psycho (1960), employing state-of-the-art makeup design didn’t mean using a new kind of blood, just a new method of delivery: the plastic squeeze bottle, then brand new, filled with Shasta chocolate syrup. As makeup supervisor Jack Barron explained it, “This was before the days of the ‘plastic explosion,’ so that was pretty revolutionary. Up to that time in films, we were using Hershey’s, but [with the squeeze bottle] you could do a lot more.”

Color presented new challenges. Starting at least as early as The Curse of Frankenstein (1957), the first color film from the schlockmeisters at Hammer Film Productions—a British studio, exempt from the Hays Code—blood began to splatter the silver screen in Technicolor. But horror filmmakers were still unaccustomed to working in color, and so the blood didn’t look right: In Hammer films like The Curse of Frankenstein and Horror of Dracula (1958), it was cartoonishly bright. The so-called “Godfather of Gore,” Herschell Gordon Lewis, knew this was a problem. While working on what became the first splatter film, Blood Feast (1963), he “realized how purple the fake blood at that time was because it had been prepared for black-and-white movies.” To avoid using these substandard materials, he had his blood custom-made by the charmingly named Barfred Laboratories.


ANCIENT

Two women are shown dancing (and presumably menstruating) in this rock engraving from the Upper Yule River in Western Australia. Wikimedia

Though females have experienced menstruation since before humans fully evolved as a species, there’s very little documentation about periods among ancient peoples. This is likely because most scribes were men and history was mainly recorded by men. As a result, “we don’t know whether women’s attitude [about menstruation] was the same [as men’s] or not,” Helen King, Professor of Classical Studies at the Open University, writes. “We don’t even know what level of blood loss they expected… but the Hippocratic gynecological treatises assume a ‘wombful’ of blood every month, with any less of a flow opening up the risk of being seen as ‘ill.’”

It’s very likely that women in ancient times had fewer periods than they do now, due to the possibility of malnourishment, or even the fact that menopause began sooner in earlier eras — as early as age 40, as Aristotle noted. However, there’s little evidence surrounding how ancient women handled blood flow.

Historians do know that in many parts of the ancient world, menstruating women were strongly associated with mystery, magic, and even sorcery. For example, Pliny the Elder, a Roman author and natural philosopher, wrote that a nude menstruating woman could prevent hailstorms and lightning, and even scare away insects from farm crops. In Mayan mythology, menstruation was believed to have originated as a punishment after the Moon Goddess — who represented women, sexuality, and fertility — disobeyed the rules of alliance when she slept with the Sun god. Her menstrual blood was believed to have been stored in thirteen jars, where it was magically transformed into snakes, insects, poison, and even diseases. Interestingly, in some cases, the ancient Mayans believed the blood could turn into medicinal plants too.

The Mayan moon goddess, associated with womanhood and fertility, is pictured here with a rabbit. Wikimedia

Period blood held plenty of different meanings in ancient cultures, and was often used as a “charm” of sorts based on a belief that it had powerful abilities to purify, protect, or cast spells. In ancient Egypt, the Ebers Papyrus (1550 BC) hinted at vaginal bleeding as an ingredient in certain medicines. In biblical times, ancient Hebrews upheld laws of Niddah, in which menstruating women went into seclusion and had to be separated from the rest of society for seven “clean” days.

Despite these mythological or even medicinal hints at menstruation, however, it’s generally unknown what women used as ancient tampons or pads. Assumptions of ragged cloths that were re-washed, tampons made of papyrus or wooden sticks wrapped in lint, or “loincloths” in Egypt have circulated, but no one really knows what women in fact used during this time.


A Brief History of Blood Transfusion Through The Years

Blood has been used as a therapy for a variety of ailments since as early as the 17th century. Over the years there have been many great advances, and it is no wonder this precious resource is so valuable. Here is a look at some of the bigger milestones in blood transfusion over the years.

1628 English physician William Harvey discovers the circulation of blood. Shortly afterward, the earliest known blood transfusion is attempted.

1665 The first recorded successful blood transfusion occurs in England: Physician Richard Lower keeps dogs alive by transfusion of blood from other dogs.

1818 James Blundell performs the first successful transfusion of human blood to treat postpartum hemorrhage.

1840 The first whole blood transfusion to treat hemophilia is successfully completed.

1900 Karl Landsteiner discovers the first three human blood groups, A, B and O.

1902 Landsteiner’s colleagues, Alfred Decastello and Adriano Sturli, add a fourth blood type, AB.

1907 Blood typing and cross-matching between donors and patients is attempted to improve the safety of transfusions. The O blood group is identified as a universal donor.

1914 Albert Hustin discovers that sodium citrate can anticoagulate blood for transfusion, allowing it to be stored and later transfused safely to patients on the battlefield.

1932 The first blood bank is established at a Leningrad hospital.

1939-1940 The Rh blood group is discovered and recognized as the cause behind most transfusion reactions.

1940 The US government establishes a nationwide blood collection program.

1950 Plastic bags allowing for a safer and easier collection system replace breakable glass bottles used for blood collection and storage.

1961 Platelet concentrates are recognized to reduce mortality from hemorrhaging in cancer patients.

1970 Blood banks move towards an all-volunteer donor base.

1972 The process of apheresis is discovered, allowing a single component of blood to be extracted while the rest is returned to the donor.

1983 Stanford Blood Center is the first blood center to screen for AIDS-contaminated blood, using a surrogate test (T-lymphocyte phenotyping) two years before the AIDS virus antibody test is developed.

1985 The first HIV blood-screening test is licensed and implemented by blood banks.

1987 Stanford Blood Center is the first in the country to screen donors for Human T-Lymphotropic Virus Type I (HTLV-I), a virus believed to cause a form of adult leukemia.

1990 A specific test to identify Hepatitis C is introduced.

2002 West Nile Virus is identified as transfusion-transmissible.

At Stanford Blood Center, we lead the fields of transfusion and transplantation medicine by advancing science and technology. We provide hope for the future by teaching the medical leaders of tomorrow. We enhance lives by connecting donors to patients every day.

SBC is proud to be part of this industry that saves so many lives, and we hope you will consider becoming a donor. For eligibility details, please visit our eligibility page.


A Brief History of Evidence-based Practice

The formalized concept was embraced by many, but it also elicited some criticisms, including that evidence-based medicine relies too heavily on research. It was still being described by some as a “new approach” almost twenty years later (Selvaraj et al., 2010), suggesting that it has taken some time to become integrated into the medical profession worldwide.

Health care practitioners work in a range of clinical environments that are likely to influence decision making using EBP. In 2009, Satterfield et al. developed a transdisciplinary model for evidence-based practice. This model depicts the three core components of EBP (best available research evidence, clinical expertise, and the patient’s preferences) within the broader clinical or organizational context. In a sense, the organizational context is a fourth EBP component.

From Satterfield et al. (2009) Toward a Transdisciplinary Model of Evidence-based Practice. Milbank Quarterly 87(2): 368-390. (With permission.)


Invention of Glass Lenses

Long before, in the hazy unrecorded past, someone picked up a piece of transparent crystal thicker in the middle than at the edges, looked through it, and discovered that it made things look larger. Someone also found that such a crystal would focus the sun's rays and set fire to a piece of parchment or cloth. Magnifiers and "burning glasses" or "magnifying glasses" are mentioned in the writings of Seneca and Pliny the Elder, Roman philosophers of the first century A.D., but apparently they were not used much until the invention of spectacles toward the end of the 13th century. They were named lenses because they are shaped like lentil seeds.

The earliest simple microscope was merely a tube with a plate for the object at one end and, at the other, a lens that gave a magnification of less than ten diameters, that is, less than ten times the actual size. These instruments excited general wonder when used to view fleas or tiny creeping things and so were dubbed "flea glasses."


A Brief, Blood-Boiling History of the Opioid Epidemic

Julia Lurie

A Maryland cop counts pill capsules suspected to contain heroin. Lexey Swall/Grain Images

The scale of the overdose epidemic is hard to fathom. In 2016, overdoses claimed 64,000 lives—more than the US military casualties in Vietnam and Iraq combined. The origins of today’s crisis, a perfect storm of potent, easily accessible opioids, trace back to aggressive pharmaceutical marketing and liberal painkiller prescribing in the 1990s and 2000s. Here’s how it happened:

1970s: Percocet and Vicodin are introduced, but physicians are wary of prescribing them because of their addictive qualities.

1995: The American Pain Society promotes the “Pain Is the Fifth Vital Sign” standard, urging doctors to monitor pain along with pulse, breathing, blood pressure, and temperature. Purdue Pharma is one of 28 corporate donors.

1996: Purdue Pharma debuts OxyContin with the most aggressive marketing campaign in pharmaceutical history, downplaying its addictiveness. Over the next five years, the number of opioid painkiller prescriptions jumps by 44 million.

1997: Arthur Sackler, whose family owns Purdue Pharma, is posthumously inducted into the Medical Advertising Hall of Fame for “bringing the full power of advertising and promotion to pharmaceutical marketing.”

1998: Purdue distributes 15,000 copies of “I Got My Life Back,” a promotional video featuring a doctor saying opioids “do not have serious medical side effects” and “should be used much more than they are.” It also offers new patients a free first OxyContin prescription.

2001: The Joint Commission, a nonprofit charged with accrediting hospitals, promotes the now familiar 0-10 pain scale and begins judging hospitals based on patient satisfaction with pain treatment. The commission and Purdue team up on a guide for doctors and patients that says, “There is no evidence that addiction is a significant issue when persons are given opioids for pain control.”

2002: US doctors prescribe roughly 23 times more OxyContin than they did in 1996; sales of the drug have increased more than thirtyfold.

2004: With input from a Purdue exec, the Federation of State Medical Boards recommends sanctions against doctors who undertreat pain.

2007: Three drug distributors—McKesson, Cardinal Health, and AmerisourceBergen—make $17 billion by flooding West Virginia pharmacies with opioid painkillers between 2007 and 2012, according to a subsequent Pulitzer Prize-winning Charleston Gazette-Mail investigation.

2009: The Joint Commission removes the requirement to assess all patients for pain. By now, the United States is consuming the vast majority of the world’s opioid painkillers: 99 percent of all hydrocodone and 81 percent of oxycodone.

2010: Cheap, strong Mexican heroin makes its way to American rural and suburban areas. Meanwhile, the Affordable Care Act offers addiction treatment coverage to many Americans for the first time. Annual OxyContin sales exceed $3 billion.

2011: The Centers for Disease Control and Prevention declares that painkiller overdoses have reached “epidemic levels.”

2012: Health care providers write 259 million opioid painkiller prescriptions—nearly enough for every American to have a bottle of pills. The increasingly white face of addiction changes how policymakers frame the problem, from a moral failing necessitating prison time to a disease requiring treatment.

2013: Fentanyl, a painkiller up to 50 times more powerful than heroin, starts to make its way into the heroin supply. Most of it is illicitly produced in China.

2015: Seizures of fentanyl have multiplied fifteenfold since 2013. About 12.5 million Americans report misusing painkillers; nearly 1 million report using heroin.

2016: An estimated 64,000 Americans die of drug overdoses—more than all US military casualties in the Vietnam and Iraq wars combined. In December, Congress passes legislation allotting $1 billion to fund opioid addiction treatment and prevention efforts over two years.

2017: President Donald Trump declares a public health state of emergency, which opens up a fund of just $57,000. The GOP tries repeatedly to repeal Obamacare, a move that would take away addiction treatment coverage for an estimated 3 million Americans.

This article has been updated.
