History & Evolution of Intensive (Critical) Care Units


The English nurse Florence Nightingale pioneered efforts to use a separate hospital area for critically injured patients. During the Crimean War in the 1850s, she introduced the practice of moving the sickest patients to the beds directly opposite the nursing station on each ward so that they could be monitored more closely. In 1923, the American neurosurgeon Walter Dandy created a three-bed unit at the Johns Hopkins Hospital, where specially trained nurses cared for critically ill postoperative neurosurgical patients.

The Danish anaesthesiologist Bjørn Aage Ibsen became involved in the 1952 poliomyelitis epidemic in Copenhagen, where 2,722 patients developed the illness in a six-month period and 316 of them developed some form of respiratory or airway paralysis. Some of these patients had been treated using the few available negative-pressure ventilators, but these devices, while helpful, were limited in number and did not protect the patient’s lungs from aspiration of secretions. Ibsen changed the management by instituting long-term positive-pressure ventilation via tracheal intubation, and he enlisted 200 medical students to manually pump oxygen and air into the patients’ lungs around the clock. At this time, Carl-Gunnar Engström had developed one of the first artificial positive-pressure volume-controlled ventilators, which eventually replaced the medical students. With the change in care, mortality during the epidemic declined from 90% to around 25%. Patients were managed in three special 35-bed areas, which aided the charting of medications and other management.

        In 1953, Ibsen set up what became the world’s first intensive care unit in a converted student nurse classroom in Copenhagen Municipal Hospital. He provided one of the first accounts of the management of tetanus using neuromuscular-blocking drugs and controlled ventilation.

         The following year, Ibsen was elected head of the department of anaesthesiology at that institution. He jointly authored the first known account of intensive care management principles in the journal Nordisk Medicin, with Tone Dahl Kvittingen from Norway.

For a time in the early 1960s, it was not clear that specialized intensive care units were needed, so intensive care resources were brought to the room of the patient who needed the additional monitoring, care, and resources. It rapidly became evident, however, that a fixed location where intensive care resources and dedicated personnel were available provided better care than ad hoc provision of intensive care services spread throughout a hospital. In 1962, the first critical care residency in the United States was established at the University of Pittsburgh. In 1970, the Society of Critical Care Medicine was formed.

How an epidemic led to the development of the intensive care unit


The number of hospital admissions was more than the staff had ever seen. And people kept coming. Dozens each day. They were dying of respiratory failure. Doctors and nurses stood by, unable to help without sufficient equipment.

It was the polio epidemic of August 1952, at Blegdam Hospital in Copenhagen. This little-known event marked the start of intensive-care medicine and the use of mechanical ventilation outside the operating theatre — the very care that is at the heart of abating the COVID-19 crisis.

In 1952, the iron lung was the main way to treat the paralysis that stopped some people with poliovirus from breathing. Copenhagen was an epicentre of one of the worst polio epidemics that the world had ever seen. The hospital admitted 50 infected people daily, and each day, 6–12 of them developed respiratory failure. The whole city had just one iron lung. In the first few weeks of the epidemic, 87% of those with bulbar or bulbospinal polio, in which the virus attacks the brainstem or nerves that control breathing, died. Around half were children.

Desperate for a solution, the chief physician of Blegdam called a meeting. Asked to attend: Bjørn Ibsen, an anaesthesiologist recently returned from training at the Massachusetts General Hospital in Boston. Ibsen had a radical idea. It changed the course of modern medicine.

Student saviours                                    

The iron lung used negative pressure. It created a vacuum around the body, forcing the ribs, and therefore the lungs, to expand; air would then rush into the trachea and lungs to fill the void. The concept of negative-pressure ventilation had been around for hundreds of years, but the device that became widely used — the ‘Drinker respirator’ — was invented in 1928 by Philip Drinker and Louis Agassiz Shaw, professors at the School of Public Health in Boston, Massachusetts. Others went on to refine it, but the basic mechanism remained the same until 1952.

Iron lungs only partially solved the paralysis problem. Many people with polio placed in one still died. Among the most frequent complications was aspiration — saliva or stomach contents would be sucked from the back of the throat into the lungs when a person was too weak to swallow. There was no protection of the airway.

Ibsen suggested the opposite approach. His idea was to blow air directly into the lungs to make them expand, and then allow the body to passively relax and exhale. He proposed the use of a tracheostomy: an incision in the neck, through which a tube goes into the windpipe and delivers oxygen to the lungs, and the application of positive-pressure ventilation. At the time, this was often done briefly during surgery, but had rarely been used in a hospital ward.

Ibsen was given permission to try the technique the next day. We even know the name of his first patient: Vivi Ebert, a 12-year-old girl on the brink of death from paralytic polio. Ibsen demonstrated that it worked. The tracheostomy protected her lungs from aspiration, and by squeezing a bag attached to the tube, Ibsen kept her alive. Ebert survived until 1971, when she ultimately died of an infection in the same hospital, almost 20 years later.

The plan was hatched to use this technique on all the patients in Blegdam who needed help to breathe. The only problem? There were no ventilators.

Very early versions of positive-pressure ventilators had been around from about 1900, used for surgery and by rescuers during mining accidents. Further technical developments during the Second World War helped pilots to breathe in the decreased pressures at high altitudes. But modern ventilators, to support a person for hours or days, had yet to be invented.

What followed was one of the most remarkable episodes in health-care history: in six-hour shifts, medical and dental students from the University of Copenhagen sat at the bedside of every person with paralysis and ventilated them by hand. The students squeezed a bag connected to the tracheostomy tube, forcing air into the lungs. They were instructed in how many breaths to administer each minute, and sat there hour after hour. This went on for weeks, and then months, with hundreds of students rotating on and off. By mid-September, the mortality for patients with polio who had respiratory failure had dropped to 31%. It is estimated that the heroic scheme saved 120 people.

Major insights emerged from the Copenhagen polio epidemic. One was a better understanding of why people died of polio. Until then, it was thought that kidney failure was the cause. Ibsen recognized that inadequate ventilation caused carbon dioxide to build up in the blood, making it very acidic — which caused organs to shut down.
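
The chemistry behind this observation is commonly summarized by the Henderson–Hasselbalch equation for the bicarbonate buffer system; the equation is added here as standard physiology background, not something from the original account:

$$\mathrm{pH} = 6.1 + \log_{10}\!\left(\frac{[\mathrm{HCO_3^-}]}{0.03 \times P_{\mathrm{CO_2}}}\right)$$

With the partial pressure of carbon dioxide ($P_{\mathrm{CO_2}}$) in mmHg, a rise in $P_{\mathrm{CO_2}}$ from inadequate ventilation shrinks the ratio inside the logarithm and so lowers blood pH, producing the respiratory acidosis Ibsen identified.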

Three further lessons are central today. First, Blegdam demonstrated what can be achieved by a medical community coming together, with remarkable focus and stamina. Second, it proved that keeping people alive for weeks, and months, with positive-pressure ventilation was feasible. And third, it showed that by bringing together all the patients struggling to breathe, it was easier to care for them in one place where the doctors and nurses had expertise in respiratory failure and mechanical ventilation.

So, the concept of an intensive-care unit (ICU) was born. After the first one was set up in Copenhagen the following year, ICUs proliferated. And the use of positive pressure, with ventilators instead of students, became the norm.

In the early years, many of the safety features of modern ventilators did not exist. Doctors who worked in the 1950s and 1960s describe caring for patients without any alarms; if the ventilator accidentally disconnected and the nurse’s back was turned, the person would die. Early ventilators forced people to breathe at a set rate, but modern ones sense when a patient wants to breathe, and then help provide a push of air into the lungs in time with the body. The original apparatus also gathered limited information on how stiff or compliant the lungs were, and gave everyone a set amount of air with each breath; modern machines take many measurements of the lungs, and allow for choices regarding how much air to give with each breath. All of these are refinements of the original ventilators, which were essentially automatic bellows and tubing.


Most Unethical Medical Study Ever: The Tuskegee Syphilis Study


The Tuskegee Study, one of the ugliest and most unethical human studies in history, raised a host of ethical issues, including informed consent, racism, paternalism, unfair subject selection in research, maleficence, truth-telling and justice. The heinous nature of the study remains almost unbelievable.

    The Public Health Service started the study in 1932 in collaboration with Tuskegee University (then the Tuskegee Institute), a historically Black college in Alabama. In the study, investigators enrolled a total of 600 impoverished African-American sharecroppers from Macon County, Alabama.


The goal was to “observe the natural history of untreated syphilis” in black populations. But the subjects were unaware of this and were simply told they were receiving treatment for “bad blood”. In reality, they received no effective treatment at all. Even after penicillin was established as a safe and reliable cure for syphilis, the majority of the men did not receive it.

In 1932, the USPHS, working with the Tuskegee Institute, began a study to record the natural history of syphilis. It was originally called the “Tuskegee Study of Untreated Syphilis in the Negro Male” (now referred to as the “USPHS Syphilis Study at Tuskegee”).

The study initially involved 600 Black men – 399 with syphilis, 201 who did not have the disease. Participants’ informed consent was not collected. Researchers told the men they were being treated for “bad blood,” a local term used to describe several ailments, including syphilis, anemia, and fatigue. In exchange for taking part in the study, the men received free medical exams, free meals, and burial insurance.

By 1943, penicillin was the treatment of choice for syphilis and becoming widely available, but the participants in the study were not offered treatment.

The purpose of the study was to observe the effects of the disease when untreated, though by the end of the study medical advancements meant it was entirely treatable. The men were not informed of the nature of the experiment, and more than 100 died as a result.

None of the infected men were treated with penicillin despite the fact that, by 1947, the antibiotic was widely available and had become the standard treatment for syphilis.


As an incentive for participation in the study, the men were promised free medical care. While they were provided with some medical and mental care that they otherwise would not have received, they were deceived by the PHS, which never informed them of their syphilis diagnosis and instead provided disguised placebos, ineffective methods, and diagnostic procedures as treatment for “bad blood”.

The men were initially told that the experiment was only going to last six months, but it was extended to 40 years.  After funding for treatment was lost, the study was continued without informing the men that they would never be treated.

The study continued, under numerous Public Health Service supervisors, until 1972, when a leak to the press resulted in its termination on November 16 of that year.  By then, 28 patients had died directly from syphilis, 100 died from complications related to syphilis, 40 of the patients’ wives were infected with syphilis, and 19 children were born with congenital syphilis.

The 40-year Tuskegee Study was a major violation of ethical standards, and has been cited as “arguably the most infamous biomedical research study in U.S. history.”  Its revelation has also been an important cause of distrust in medical science and the US government amongst African Americans.

In 1973, a class-action lawsuit was filed on behalf of the study participants and their families, resulting in a $10 million out-of-court settlement in 1974.

On May 16, 1997, President Bill Clinton formally apologized on behalf of the United States to victims of the study, calling it shameful and racist. “What was done cannot be undone, but we can end the silence,” he said. “We can stop turning our heads away. We can look at you in the eye, and finally say, on behalf of the American people, what the United States government did was shameful and I am sorry.”


History & Evolution of Vaccination


Before the first vaccinations, in the sense of using cowpox to inoculate people against smallpox, people had been inoculated in China and elsewhere with smallpox material itself, a practice later copied in the West and known as variolation.

Variolation was the method of inoculation first used to immunize individuals against smallpox (Variola) with material taken from a patient or a recently variolated individual, in the hope that a mild, but protective, infection would result.

   The procedure was most commonly carried out by inserting/rubbing powdered smallpox scabs or fluid from pustules into superficial scratches made in the skin. 

The earliest hints of the practice of variolation for smallpox in China date to the 10th century. The oldest documented account of the practice is Wan Quan’s (1499–1582) Douzhen Xinfa of 1549. The Chinese method was “nasal insufflation”: powdered smallpox material, usually scabs, was blown up the nostrils.

Various insufflation techniques were recorded throughout the sixteenth and seventeenth centuries in China. Two reports on the Chinese practice of inoculation were received by the Royal Society in London in 1700: one by Martin Lister, who had received a report from an employee of the East India Company stationed in China, and another by Clopton Havers. In France, Voltaire reported that the Chinese had practiced variolation “these hundred years”.

In 1796, Edward Jenner, a doctor in Berkeley in Gloucestershire, England, tested the common theory that a person who had contracted cowpox would be immune to smallpox. He took material from cowpox vesicles on a milkmaid named Sarah Nelmes and used it to inoculate an eight-year-old boy named James Phipps; two months later he inoculated the boy with smallpox, and smallpox did not develop.

   In 1798, Jenner published An Inquiry into the Causes and Effects of the Variolae Vacciniae which created widespread interest. He distinguished ‘true’ and ‘spurious’ cowpox (which did not give the desired effect) and developed an “arm-to-arm” method of propagating the vaccine from the vaccinated individual’s pustule. Early attempts at confirmation were confounded by contamination with smallpox, but despite controversy within the medical profession and religious opposition to the use of animal material, by 1801 his report was translated into six languages and over 100,000 people were vaccinated. The term vaccination was coined in 1800 by the surgeon Richard Dunning in his text Some observations on vaccination.

In 1802, the Scottish physician Helenus Scott vaccinated dozens of children in Mumbai (then Bombay) against smallpox using Jenner’s cowpox vaccine. In the same year Scott penned a letter to the editor of the Bombay Courier, declaring that “We have it now in our power to communicate the benefits of this important discovery to every part of India, perhaps to China and the whole eastern world”. Subsequently, vaccination became firmly established in British India. A vaccination campaign was started in the new British colony of Ceylon in 1803.

    By 1807 the British had vaccinated more than a million Indians and Sri Lankans against smallpox. Also in 1803 the Spanish Balmis Expedition launched the first transcontinental effort to vaccinate people against smallpox. Following a smallpox epidemic in 1816 the Kingdom of Nepal ordered smallpox vaccine and requested the English veterinarian William Moorcroft to help in launching a vaccination campaign. In the same year a law was passed in Sweden to require the vaccination of children against smallpox by the age of two. Prussia briefly introduced compulsory vaccination in 1810 and again in the 1920s, but decided against a compulsory vaccination law in 1829.

    A law on compulsory smallpox vaccination was introduced in the Province of Hanover in the 1820s. In 1826, in Kragujevac,  future prince Mihailo of Serbia was the first person to be vaccinated against smallpox in the principality of Serbia. 

Following a smallpox epidemic in 1837 that caused 40,000 deaths, the British government initiated a concentrated vaccination policy, starting with the Vaccination Act of 1840, which provided for universal vaccination and prohibited variolation.

    The Vaccination Act 1853 introduced compulsory smallpox vaccination in England and Wales.

The law followed a severe outbreak of smallpox in 1851 and 1852. It provided that the poor law authorities would continue to dispense vaccination to all free of charge, but that records were to be kept on vaccinated children by the network of birth registrars. It was accepted at the time that voluntary vaccination had not reduced smallpox mortality, but the Vaccination Act 1853 was so badly implemented that it had little impact on the number of children vaccinated in England and Wales.

In the United States of America compulsory vaccination laws were upheld in the 1905 landmark case Jacobson v. Massachusetts by the Supreme Court of the United States. The Supreme Court ruled that laws could require vaccination to protect the public from dangerous communicable diseases. However, in practice the United States had the lowest rate of vaccination among industrialized nations in the early 20th century.

    Compulsory vaccination laws began to be enforced in the United States after World War II. In 1959 the World Health Organization (WHO) called for the eradication of smallpox worldwide, as smallpox was still endemic in 33 countries.

     In the 1960s six to eight children died each year in the United States from vaccination-related complications. According to the WHO there were in 1966 about 100 million cases of smallpox worldwide, causing an estimated two million deaths.

In the 1970s the risk of contracting smallpox had become so small that the United States Public Health Service recommended that routine smallpox vaccination be ended.

   By 1974 the WHO smallpox vaccination program had confined smallpox to parts of Pakistan, India, Bangladesh, Ethiopia and Somalia.

     In 1977 the WHO recorded the last case of smallpox infection acquired outside a laboratory in Somalia. In 1980 the WHO officially declared the world free of smallpox.

   In 1974 the WHO adopted the goal of universal vaccination by 1990 to protect children against six preventable infectious diseases: measles, poliomyelitis, diphtheria, whooping cough, tetanus, and tuberculosis.

In the 1980s only 20 to 40% of children in developing countries were vaccinated against these six diseases. In wealthy nations the number of measles cases had dropped dramatically after the introduction of the measles vaccine in 1963. WHO figures demonstrate that in many countries a decline in measles vaccination leads to a resurgence in measles cases. Measles is so contagious that public health experts believe a vaccination rate approaching 100% is needed to control the disease. Despite decades of mass vaccination, polio remains a threat in India, Nigeria, Somalia, Niger, Afghanistan, Bangladesh and Indonesia.

   By 2006 global health experts concluded that the eradication of polio was only possible if the supply of drinking water and sanitation facilities were improved in slums. The deployment of a combined DPT vaccine against diphtheria, pertussis (whooping cough), and tetanus in the 1950s was considered a major advancement for public health. But in the course of vaccination campaigns that spanned decades, DPT vaccines became associated with high incidences of side effects. Despite improved DPT vaccines coming onto the market in the 1990s, DPT vaccines became the focus of anti-vaccination campaigns in wealthy nations. As immunization rates decreased, outbreaks of pertussis increased in many countries.

      In 2000, the Global Alliance for Vaccines and Immunization was established to strengthen routine vaccinations and introduce new and under-used vaccines in countries with a per capita GDP of under US$1000.


What is a mechanical ventilator? A machine critical to saving lives


A medical ventilator (or simply ventilator in context) is a machine designed to provide mechanical ventilation by moving breathable air into and out of the lungs, delivering breaths to a patient who is physically unable to breathe, or is breathing insufficiently.

While modern ventilators are computerized machines, patients can be ventilated with a simple, hand-operated bag valve mask.

Ventilators are chiefly used in intensive care medicine, home care, and emergency medicine (as standalone units) and in anesthesiology (as a component of an anesthesia machine).

Medical ventilators are sometimes colloquially called “respirators”, a term stemming from commonly used devices in the 1950s (particularly the “Bird Respirator”). However, in modern hospital and medical terminology, these machines are never referred to as respirators, and use of “respirator” in this context is now a deprecated anachronism signaling technical unfamiliarity.

Function                                  

In its simplest form, a modern positive-pressure ventilator consists of a compressible air reservoir or turbine, air and oxygen supplies, a set of valves and tubes, and a disposable or reusable “patient circuit”. The air reservoir is pneumatically compressed several times a minute to deliver room air or, in most cases, an air/oxygen mixture to the patient. If a turbine is used, the turbine pushes air through the ventilator, with a flow valve adjusting pressure to meet patient-specific parameters. When the positive pressure is released, the patient exhales passively due to the lungs’ elasticity, the exhaled air being released usually through a one-way valve within the patient circuit called the patient manifold.

Ventilators may also be equipped with monitoring and alarm systems for patient-related parameters (e.g. pressure, volume, and flow) and ventilator function (e.g. air leakage, power failure, mechanical failure), backup batteries, oxygen tanks, and remote control. The pneumatic system is nowadays often replaced by a computer-controlled  turbo-pump.

Modern ventilators are electronically controlled by a small embedded system to allow exact adaptation of pressure and flow characteristics to an individual patient’s needs. Fine-tuned ventilator settings also serve to make ventilation more tolerable and comfortable for the patient. In Canada, the United States and many other parts of the world, respiratory therapists are responsible for tuning these settings, while biomedical technologists are responsible for maintenance.
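
To make the idea of “settings” concrete, here is a minimal Python sketch of the kind of parameters a respiratory therapist adjusts in a simple volume-control mode. The class name, field names and default values are illustrative assumptions for this article, not the interface of any real ventilator; the arithmetic (minute ventilation and breath timing) follows directly from the definitions of tidal volume, respiratory rate and I:E ratio.

```python
from dataclasses import dataclass

@dataclass
class VentSettings:
    """Illustrative volume-control settings; not any real device's interface."""
    tidal_volume_ml: float = 450.0   # gas volume delivered with each breath
    respiratory_rate: int = 14       # mandatory breaths per minute
    fio2: float = 0.40               # fraction of inspired oxygen (0.21-1.0)
    peep_cm_h2o: float = 5.0         # positive end-expiratory pressure
    ie_ratio: float = 0.5            # inspiratory:expiratory time ratio (1:2)

    def minute_ventilation_l(self) -> float:
        # Total volume delivered per minute, in litres
        return self.tidal_volume_ml * self.respiratory_rate / 1000.0

    def breath_timing_s(self) -> tuple:
        # Split one breath cycle into inspiratory and expiratory time (seconds)
        cycle = 60.0 / self.respiratory_rate
        insp = cycle * self.ie_ratio / (1.0 + self.ie_ratio)
        return insp, cycle - insp

settings = VentSettings()
print(f"Minute ventilation: {settings.minute_ventilation_l():.1f} L/min")
insp, exp = settings.breath_timing_s()
print(f"Each breath: {insp:.2f} s inspiration, {exp:.2f} s expiration")
```

A real ICU ventilator layers alarms, patient-triggered breaths and pressure limits on top of parameters like these, as the surrounding text describes.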

The patient circuit usually consists of a set of three durable yet lightweight plastic tubes, separated by function (e.g. inhaled air, patient pressure, exhaled air). Depending on the type of ventilation needed, the patient end of the circuit may be either noninvasive or invasive.

Noninvasive methods, which are adequate for patients who require a ventilator only while sleeping and resting, mainly employ a nasal mask. Invasive methods require intubation; for long-term ventilator dependence this is normally a tracheostomy cannula, as it is much more comfortable and practical for long-term care than laryngeal or nasal intubation.

Life-critical system

Because failure may result in death, mechanical ventilation systems are classified as life-critical systems, and precautions must be taken to ensure that they are highly reliable, including their power supply.

Mechanical ventilators are therefore carefully designed so that no single point of failure can endanger the patient. They may have manual backup mechanisms to enable hand-driven respiration in the absence of power (such as the mechanical ventilator integrated into an anesthetic machine). They may also have safety valves, which open to atmosphere in the absence of power to act as an anti-suffocation valve for spontaneous breathing of the patient. Some systems are also equipped with compressed-gas tanks, air compressors, and/or backup batteries to provide ventilation in case of power failure or defective gas supplies, and methods to operate or call for help if their mechanisms or software fail.

History of the mechanical ventilator


                

The history of mechanical ventilation begins with various versions of what was eventually called the iron lung, a form of noninvasive negative pressure ventilator widely used during the polio epidemics of the 20th century after the introduction of the “Drinker respirator” in 1928, improvements introduced by John Haven Emerson in 1931,  and the Both respirator in 1937. Other forms of noninvasive ventilators, also used widely for polio patients, include Biphasic Cuirass Ventilation, the rocking bed, and rather primitive positive pressure machines.

In 1949, John Haven Emerson developed a mechanical assister for anesthesia with the cooperation of the anesthesia department at Harvard University. Mechanical ventilators began to be used increasingly in anesthesia and intensive care during the 1950s. Their development was stimulated both by the need to treat polio patients and the increasing use of muscle relaxants during anesthesia. Relaxant drugs paralyze the patient and improve operating conditions for the surgeon but also paralyze the respiratory muscles.

In the United Kingdom, the East Radcliffe and Beaver models were early examples, the latter using an automotive wiper motor to drive the bellows used to inflate the lungs. Electric motors were, however, a problem in the operating theaters of that time, as their use caused an explosion hazard in the presence of flammable anesthetics such as ether and cyclopropane.

In 1952, Roger Manley of the Westminster Hospital, London, developed a ventilator which was entirely gas-driven and became the most popular model used in Europe. It was an elegant design, and became a great favorite with European anesthetists for four decades, prior to the introduction of models controlled by electronics. It was independent of electrical power and caused no explosion hazard. The original Mark I unit was developed into the Manley Mark II in collaboration with the Blease company, which manufactured many thousands of these units. Its principle of operation was very simple: an incoming gas flow was used to lift a weighted bellows unit, which fell intermittently under gravity, forcing breathing gases into the patient’s lungs. The inflation pressure could be varied by sliding the movable weight on top of the bellows. The volume of gas delivered was adjustable using a curved slider, which restricted bellows excursion. Residual pressure after the completion of expiration was also configurable, using a small weighted arm visible to the lower right of the front panel. This was a robust unit and its availability encouraged the introduction of positive-pressure ventilation techniques into mainstream European anesthetic practice.

The 1955 release of Forrest Bird’s “Bird Universal Medical Respirator” in the United States changed the way mechanical ventilation was performed, with the small green box becoming a familiar piece of medical equipment.  The unit was sold as the Bird Mark 7 Respirator and informally called the “Bird”. It was a pneumatic device and therefore required no electrical power source to operate.

Intensive care environments around the world were revolutionized in 1971 by the introduction of the first SERVO 900 ventilator (Elema-Schönander). It was a small, silent and effective electronic ventilator, with the famous SERVO feedback system controlling what had been set and regulating delivery. For the first time, the machine could deliver the set volume in volume-control ventilation.

Ventilators used under increased pressure (hyperbaric) require special precautions and few ventilators can operate under these conditions. In 1979, Sechrist Industries introduced their Model 500A ventilator which was specifically designed for use with hyperbaric chambers.

In 1991 the SERVO 300 ventilator series was introduced. The platform of the SERVO 300 series enabled treatment of all patient categories, from adult to neonate, with one single ventilator. The SERVO 300 series provided a completely new and unique gas delivery system, with rapid flow-triggering response.

In 1999 the LTV (Laptop Ventilator) series was introduced to the market. The new ventilator was significantly smaller than the ventilators of that time, weighing about 14 lb and around the size of a laptop computer. This new design kept the same functionality as the in-hospital ventilators while opening up a world of mobility for patients.

A modular concept, meaning that the hospital has one ventilator model throughout the ICU department instead of a fleet with different models and brands for the different user needs, was introduced with SERVO-i in 2001. With this modular concept the ICU departments could choose the modes and options, software and hardware needed for a particular patient category.


History of Diphtheria


In 1613, Spain experienced an epidemic of diphtheria. The year is known as El Año de los Garrotillos (The Year of Strangulations) in the history of Spain.

In 1735, a diphtheria epidemic swept through New England.

Before 1826, diphtheria was known by different names across the world. In England, it was known as Boulogne sore throat, as it spread from France. In 1826, Pierre Bretonneau gave the disease the name diphthérite (from Greek diphthera “leather”) describing the appearance of pseudomembrane in the throat.

In 1856, Victor Fourgeaud described an epidemic of diphtheria in California.

In 1878, Queen Victoria’s daughter Princess Alice and her family became infected with diphtheria, causing two deaths: Princess Marie of Hesse and by Rhine, and Princess Alice herself.

In 1883, Edwin Klebs identified the bacterium causing diphtheria and named it the Klebs-Loeffler bacterium. The club shape of this bacterium helped Klebs to differentiate it from other bacteria. Over time, it was called Microsporon diphtheriticum, Bacillus diphtheriae, and Mycobacterium diphtheriae; the current nomenclature is Corynebacterium diphtheriae. Friedrich Loeffler was the first person to cultivate C. diphtheriae, in 1884. He used Koch’s postulates to prove the association between C. diphtheriae and diphtheria. He also showed that the bacillus produces an exotoxin. Joseph P. O’Dwyer introduced the O’Dwyer tube for laryngeal intubation in patients with an obstructed larynx in 1885. It soon replaced tracheostomy as the emergency diphtheric intubation method.

In 1888, Emile Roux and Alexandre Yersin showed that a substance produced by C. diphtheriae caused symptoms of   diphtheria in animals. In 1890, Shibasaburo Kitasato and Emil von Behring immunized guinea pigs with heat-treated diphtheria toxin. They also immunized goats and horses in the same way and showed that an “antitoxin” made from serum of immunized animals could cure the disease in non-immunized animals.

 

Behring used this antitoxin (now known to consist of antibodies that neutralize the toxin produced by C. diphtheriae) for human trials in 1891, but they were unsuccessful. Successful treatment of human patients with horse-derived antitoxin began in 1894, after production and quantification of antitoxin had been optimized.

 Von Behring won the first Nobel Prize in medicine in 1901 for his work on diphtheria.

 

In 1895, H. K. Mulford Company of Philadelphia started production and testing of diphtheria antitoxin in the United States. Park and Biggs described the method for producing serum from horses for use in diphtheria treatment.

In 1897, Paul Ehrlich developed a standardized unit of measure for diphtheria antitoxin. This was the first ever standardization of a biological product, and played an important role in future developmental work on sera and vaccines.

In 1901, 10 of 11 inoculated St. Louis children died from contaminated diphtheria antitoxin. The horse from which the antitoxin was derived died of tetanus. This incident, coupled with a tetanus outbreak in Camden, New Jersey, played an important part in initiating federal regulation of biologic products.

On 7 January 1904, Ruth Cleveland died of diphtheria at the age of 12 years in Princeton, New Jersey. Ruth was the eldest daughter of former President Grover Cleveland and the former first lady Frances Folsom.

In 1905, Franklin Royer, from Philadelphia’s Municipal Hospital, published a paper urging timely treatment for diphtheria and adequate doses of antitoxin.

In 1906, Clemens Pirquet and Béla Schick described serum sickness in children receiving large quantities of horse-derived antitoxin.

Between 1910 and 1911, Béla Schick developed the Schick test to detect pre-existing immunity to diphtheria in an exposed person.

Those shown by the test to lack immunity could then be preferentially vaccinated. A massive, five-year campaign was coordinated by Dr. Schick. As a part of the campaign, 85 million pieces of literature were distributed by the Metropolitan Life Insurance Company with an appeal to parents to “Save your child from diphtheria.” A vaccine was developed in the next decade, and deaths began declining significantly in 1924.

In 1919, in Dallas, Texas, 10 children were killed and 60 others made seriously ill by toxic antitoxin which had passed the tests of the New York State Health Department. Mulford Company of Philadelphia (manufacturers) paid damages in every case.

In the 1920s, an estimated 100,000 to 200,000 cases of diphtheria occurred per year in the United States, causing 13,000 to 15,000 deaths per year. Children represented a large majority of these cases and fatalities. One of the most infamous outbreaks of diphtheria was in Nome, Alaska; the “Great Race of Mercy” to deliver diphtheria antitoxin is now celebrated by the Iditarod Trail Sled Dog Race.

In 1926, Alexander Thomas Glenny increased the effectiveness of diphtheria toxoid (a modified version of the toxin used for vaccination) by treating it with aluminum salts. Vaccination with toxoid was not widely used until the early 1930s.

In 1943, diphtheria outbreaks accompanied war and disruption in Europe. The 1 million cases in Europe resulted in 50,000 deaths. In 1949, 68 of 606 children died after diphtheria immunization due to improper manufacture of aluminum phosphate toxoid.

In 1974, the World Health Organization included DPT vaccine in their Expanded Programme on Immunization for developing countries.

In 1975, an outbreak of cutaneous diphtheria in Seattle, Washington, was reported.

In 1994, the Russian Federation had 39,703 diphtheria cases. By contrast, in 1990, only 1,211 cases were reported.

Between 1990 and 1998, diphtheria caused 5,000 deaths in the countries of the former Soviet Union.

In early May 2010, a case of diphtheria was diagnosed in Port-au-Prince, Haiti, after the devastating 2010 Haiti earthquake. The 15-year-old male patient died while workers searched for antitoxin.

In 2013, three children died of diphtheria in Hyderabad, India.

In early June 2015, a case of diphtheria was diagnosed at Vall d’Hebron University Hospital in Barcelona, Spain. The 6-year-old child who died of the illness had not been previously vaccinated due to parental opposition to vaccination. It was the first case of diphtheria in the country since 1986, as reported by El Mundo, or since 1998, as reported by the WHO.

In March 2016, a 3-year-old girl died of diphtheria in the University Hospital of Antwerp, Belgium.

In June 2016, a 3-year-old, a 5-year-old, and a 7-year-old girl died of diphtheria in Kedah, Malacca and Sabah, Malaysia.

In January 2017, more than 300 cases were recorded in Venezuela.

In November and December 2017, an outbreak of diphtheria occurred in Indonesia with more than 600 cases found and 38 fatalities.


 

Medical Regulation and Medical Community of Ancient Rome


Medical community

Medical services of the late Roman Republic and early Roman Empire were mainly imports from the civilization of Ancient Greece, at first through Greek knowledge imparted to Roman citizens visiting or being educated in Greece, and then through Greeks enslaved during the Roman conquest of Greece. A perusal of the names of Roman physicians shows that the majority were wholly or partly Greek and that many of the physicians were of servile origin.

The servility stigma came from the accident of a more medically advanced society being conquered by a lesser. One of the cultural ironies of these circumstances is that free men sometimes found themselves in service to the enslaved professional or dignitary, or the power of the state was entrusted to foreigners who had been conquered in battle and were technically slaves. In Greek society, physicians tended to be regarded as noble.

Public medicine

The medical art in early Rome was the responsibility of the pater familias, or patriarch. The importation of the Aesculapium established medicine in the public domain. There is no record of fees being collected for a stay at one of these temples, at Rome or elsewhere. Instead, individuals vowed to perform certain actions or contribute a certain amount if certain events happened, some of which were healings. Such a system amounts to graduated contributions by income, as the contributor could only vow what he could provide. The building of a temple and its facilities, on the other hand, was the responsibility of the magistrates. The funds came from the state treasury or from taxes.

Private medicine

A second signal act marked the start of sponsorship of private medicine by the state as well. In the year 219 BCE, a vulnerarius, or surgeon, Archagathus, visited Rome from the Peloponnesus and was asked to stay. The state conferred citizenship on him and purchased him a taberna, or shop, near the compitium Acilii (a crossroads), which became the first officina medica.

The doctor necessarily had many assistants. Some prepared and vended medicines and tended the herb garden. These numbers, of course, are at best proportional to the true populations, which were many times greater.

Roman doctors of any stature combed the population for persons in any social setting who had an interest in and ability for practicing medicine. On the one hand the doctor used their services unremittingly. On the other they were treated like members of the family; i.e., they came to stay with the doctor and when they left they were themselves doctors. The best doctors were the former apprentices of the Aesculapia, who, in effect, served residencies there.

 

The practice of medicine

The physician

The first step in treatment was to secure the cura of a medicus. If the patient was too sick to move, one sent for a clinicus, who went to the clinum, or couch, of the patient.

That the poor paid a minimal fee for the visit of a medicus is indicated by a wisecrack in Plautus. It was less than a nummus. Many anecdotes exist of doctors negotiating fees with wealthy patients and refusing to prescribe a remedy if agreement was not reached. The fees charged were on a sliding scale according to assets. The physicians of the rich were themselves rich. For example, Antonius Musa treated Augustus’ nervous symptoms with cold baths and drugs. He was not only set free but he became Augustus’ physician. He received a salary of 300,000 sesterces. There is no evidence that he was other than a private physician; that is, he was not working for the Roman government.

Legal responsibility

Doctors were generally exempt from prosecution for their mistakes; some writers complain of legal murder. However, holding the powerful up to exorbitant fees ran the risk of retaliation. Pliny reports that the emperor Claudius fined a physician, Alcon, 180 million sesterces and exiled him to Gaul. By chance a law existed at Rome, the Lex Aquilia, passed about 286 BCE, which allowed the owners of slaves and animals to seek remedies for damage to their property, either malicious or negligent. Litigants used this law to proceed against the negligence of medici, such as the performance of an operation on a slave by an untrained surgeon resulting in death or other damage.

Social position

While encouraging and supporting the public and private practice of medicine, the Roman government tended to suppress organizations of medici in society. The constitution provided for the formation of occupational collegia, or guilds. The consuls and the emperors treated these ambivalently. Sometimes they were permitted; more often they were made illegal and were suppressed. The medici formed collegia, which had their own centers, the Scholae Medicorum, but they never amounted to a significant social force. They were regarded as subversive along with all the other collegia.

Doctors were nevertheless influential. They liked to write. Compared to the number of books written, not many have survived; for example, Tiberius Claudius Menecrates composed 150 medical works, of which only a few fragments remain. Some that did remain almost in entirety are the works of Galen, Celsus, Hippocrates and the herbal expert Pedanius Dioscorides, who wrote the 5-volume De Materia Medica.

Military medical corps

Republican

The state of the military medical corps before Augustus is unclear. Corpsmen certainly existed, at least for the administration of first aid, and were enlisted soldiers rather than civilians. The commander of the legion was held responsible for removing the wounded from the field and ensuring that they got sufficient care and time to recover. He could quarter troops in private domiciles if he thought it necessary.

Imperial  

The army of the early empire was sharply and qualitatively different. If military careers were now possible, so were careers for military specialists, such as medici. Under Augustus for the first time occupational names of officers and functions began to appear in inscriptions. The term medici ordinarii in the inscriptions must refer to the lowest ranking military physicians. No doctor was in any sense “ordinary”. They were to be feared and respected. During his reign, Augustus finally conferred the dignitas equestris, or social rank of knight, on all physicians, public or private. They were then full citizens and could wear the rings of knights. In the army there was at least one other rank of physician, the medicus duplicarius, “medic at double pay”, and, as the legion had milites sesquiplicarii, “soldiers at 1.5 pay”, perhaps the medics had that pay grade as well.

Practice

Medical corps in battle worked on the battlefield bandaging soldiers. From the aid station the wounded went by horse-drawn ambulance to other locations, ultimately to the camp hospitals in the area. There they were seen by the medici vulnerarii, or surgeons, the main type of military doctor. They were given a bed in the hospital if they needed it and one was available. The larger hospitals could accommodate 400–500 beds.

A base hospital was quadrangular, with barracks-like wards surrounding a central courtyard. On the outside of the quadrangle were private rooms for the patients. Although unacquainted with bacteria, Roman medical doctors knew about contagion and did their best to prevent it. Rooms were isolated, running water carried the waste away, and the drinking and washing water was tapped up the slope from the latrines.

Within the hospital were operating rooms, kitchens, baths, a dispensary, latrines, a mortuary and herb gardens, as doctors relied heavily on herbs for drugs. They operated or otherwise treated with scalpels, hooks, levers, drills, probes, forceps, catheters and arrow-extractors on patients anesthetized with morphine. Instruments were boiled before use. Wounds were washed in vinegar and stitched. Broken bones were placed in traction. There is, however, evidence of wider concerns. A vaginal speculum suggests gynecology was practiced, and an anal speculum implies knowledge that the size and condition of internal organs accessible through the orifices was an indication of health. They could extract eye cataracts with a special needle. Operating room amphitheaters indicate that medical education was ongoing. Many have proposed that the knowledge and practices of the medici were not exceeded until the 20th century CE.

Regulation of medicine

By the late empire the state had taken more of a hand in regulating medicine. The law codes of the 4th century CE, such as the Codex Theodosianus, paint a picture of a medical system enforced by the laws and the state apparatus. At the top was the equivalent of a surgeon general of the empire. He was by law a noble, a dux (duke) or a vicarius (vicar) of the emperor. He held the title of comes archiatorum, “count of the chief healers.” The Greek word iatros, “healer”, was higher-status than the Latin medicus.

Under the comes were a number of officials called the archiatri, or more popularly the protomedici, supra medicos, domini medicorum or superpositi medicorum. They were paid by the state. It was their function to supervise all the medici in their districts; i.e., they were the chief medical examiners. Their families were exempt from taxes. They could not be prosecuted, nor could troops be quartered in their homes. The archiatri were divided into two groups:

Archiatri sancti palatii, who were palace physicians

Archiatri populares. They were required to provide for the poor; presumably, the more prosperous still provided for themselves.

The archiatri settled all medical disputes. Rome had 14 of them; the number in other communities varied from 5 to 10 depending on the population.

 

 

 

History & Evolution of Anesthesia: 18th and 19th century advancements in the science of anesthesia


The discovery of anesthesia is one of the most important advancements of modern medicine. The Renaissance saw significant advances in anatomy and surgical technique. However, despite all this progress, surgery remained a treatment of last resort. Largely because of the associated pain, many patients with surgical disorders chose certain death rather than undergo surgery. Although there has been a great deal of debate as to who deserves the most credit for the discovery of general anesthesia, it is generally agreed that certain scientific discoveries in the late 18th and early 19th centuries were critical to the eventual introduction and development of modern anesthetic techniques.

Although anesthesia has been known since ancient times, the major advances that allowed the transition to modern surgery occurred in the late 19th century. An appreciation of the germ theory of disease led rapidly to the development and application of antiseptic techniques in surgery.

18th century

Joseph Priestley (1733–1804) was an English polymath who discovered nitrous oxide, nitric oxide, ammonia, hydrogen chloride and oxygen. Beginning in 1775, Priestley published his research in Experiments and Observations on Different Kinds of Air. The recent discoveries about these and other gases stimulated a great deal of interest in the European scientific community. Thomas Beddoes (1760–1808) was a physician and teacher of medicine. With an eye toward making further advances in this new science, as well as offering treatment for diseases previously thought to be untreatable (such as asthma and tuberculosis), Beddoes founded the Pneumatic Institution for inhalation gas therapy in 1798 at Dowry Square in Clifton, Bristol. Beddoes employed the chemist and physicist Humphry Davy (1778–1829) as superintendent of the institute, and the engineer James Watt (1736–1819) to help manufacture the gases.

During the course of his research at the Pneumatic Institution, Davy discovered the anesthetic properties of nitrous oxide. Davy, who coined the term “laughing gas” for nitrous oxide, published his findings the following year.  Davy was not a physician, and he never administered nitrous oxide during a surgical procedure. He was however the first to document the analgesic effects of nitrous oxide, as well as its potential benefits in relieving pain during surgery.

 

19th century

 Eastern hemisphere

Hanaoka Seishu (1760–1835) of  Osaka  was a Japanese surgeon of the  Edo period  with a knowledge of  Chinese herbal medicine, as well as Western surgical techniques. Beginning in about 1785, Hanaoka embarked on a quest to re-create a compound that would have pharmacologic properties similar to Hua Tuo’s mafeisan. After years of research and experimentation, he finally developed a formula which he named tsūsensan . Like that of Hua Tuo, this compound was composed of extracts of several different plants.

Five of its seven ingredients were thought to be elements of Hua Tuo’s anesthetic potion, used 1,600 years earlier.

The active ingredients in tsūsensan are scopolamine, hyoscyamine, atropine, aconitine and angelicotoxin. In sufficient quantity, tsūsensan produces a state of general anesthesia and skeletal muscle paralysis. Shutei Nakagawa (1773–1850), a close friend of Hanaoka, wrote a small pamphlet titled “Mayaku-ko” (“narcotic powder”) in 1796. Although the original manuscript was lost in a fire in 1867, this brochure described the current state of Hanaoka’s research on general anesthesia.

On 13 October 1804, Hanaoka performed a partial mastectomy for breast cancer on a 60-year-old woman named Kan Aiya, using tsūsensan as a general anesthetic. This is generally regarded today as the first reliable documentation of an operation to be performed under general anesthesia. Hanaoka went on to perform many operations using tsūsensan, including resection of malignant masses,  extraction of bladder stones, and extremity amputations. Before his death in 1835, Hanaoka performed more than 150 operations for breast cancer.

Western hemisphere

Friedrich Sertürner (1783–1841) first isolated morphine from opium in 1804; he named it after Morpheus, the Greek god of dreams.

Henry Hill Hickman (1800–1830) experimented with the use of carbon dioxide as an anesthetic in the 1820s. He would make an animal insensible, effectively by nearly suffocating it with carbon dioxide, then determine the effects of the gas by amputating one of its limbs. In 1824, Hickman submitted the results of his research to the Royal Society in a short treatise titled Letter on suspended animation: with the view of ascertaining its probable utility in surgical operations on human subjects. The response was an 1826 article in The Lancet titled ‘Surgical Humbug’ that ruthlessly criticised his work. Hickman died four years later at the age of 30. Though he was unappreciated at the time of his death, his work has since been positively reappraised and he is now recognised as one of the fathers of anesthesia.

By the late 1830s, Humphry Davy’s experiments had become widely publicized within academic circles in the northeastern United States. Wandering lecturers would hold public gatherings, referred to as “ether frolics”, where members of the audience were encouraged to inhale diethyl ether or nitrous oxide to demonstrate the mind-altering properties of these agents while providing much entertainment to onlookers. Four notable men participated in these events and witnessed the use of ether in this manner. They were William Edward Clarke (1819–1898), Crawford W. Long (1815–1878), Horace Wells (1815–1848), and William T. G. Morton (1819–1868).

While attending undergraduate school in Rochester, New York, in 1839, classmates Clarke and Morton apparently participated in ether frolics with some regularity. In January 1842, by now a medical student at Berkshire Medical College, Clarke administered ether to a Miss Hobbie, while Elijah Pope performed a dental extraction. In so doing, he became the first to administer an inhaled anesthetic to facilitate the performance of a surgical procedure. Clarke apparently thought little of his accomplishment, and chose neither to publish nor to pursue this technique any further. Indeed, this event is not even mentioned in Clarke’s biography.

Crawford W. Long was a physician and pharmacist practicing in Jefferson, Georgia in the mid-19th century. During his time as a student at the University of Pennsylvania School of Medicine in the late 1830s, he had observed and probably participated in the ether frolics that had become popular at that time. At these gatherings, Long observed that some participants experienced bumps and bruises, but afterward had no recall of what had happened. He postulated that diethyl ether produced pharmacologic effects similar to those of nitrous oxide. On 30 March 1842, he administered diethyl ether by inhalation to a man named James Venable, in order to remove a tumor from the man’s neck. Long later removed a second tumor from Venable, again under ether anesthesia. He went on to employ ether as a general anesthetic for limb amputations and parturition. Long, however, did not publish his experience until 1849, thereby denying himself much of the credit he deserved.

On 10 December 1844, Gardner Quincy Colton held a public demonstration of nitrous oxide in Hartford, Connecticut. One of the participants, Samuel A. Cooley, sustained a significant injury to his leg while under the influence of nitrous oxide without noticing the injury. Horace Wells, a Connecticut dentist present in the audience that day, immediately seized upon the significance of this apparent analgesic effect of nitrous oxide. The following day, Wells underwent a painless dental extraction while under the influence of nitrous oxide administered by Colton. Wells then began to administer nitrous oxide to his patients, successfully performing several dental extractions over the next couple of weeks.

William T. G. Morton, another New England dentist, was a former student and then-current business partner of Wells. He was also a former acquaintance and classmate of William Edward Clarke (the two had attended undergraduate school together in Rochester, New York). Morton arranged for Wells to demonstrate his technique for dental extraction under nitrous oxide general anesthesia at Massachusetts General Hospital, in conjunction with the prominent surgeon John Collins Warren. This demonstration, which took place on 20 January 1845, ended in failure when the patient cried out in pain in the middle of the operation.

On 30 September 1846, Morton administered diethyl ether to Eben Frost, a music teacher from Boston, for a dental extraction. Two weeks later, Morton became the first to publicly demonstrate the use of diethyl ether as a general anesthetic at Massachusetts General Hospital, in what is known today as the Ether Dome. On 16 October 1846, John Collins Warren removed a tumor from the neck of a local printer, Edward Gilbert Abbott. Upon completion of the procedure, Warren reportedly quipped, “Gentlemen, this is no humbug.” News of this event rapidly traveled around the world. In December of that year, Robert Liston performed the first amputation under ether anesthesia in Europe. Morton published his experience soon after. Harvard University professor Charles Thomas Jackson (1805–1880) later claimed that Morton stole his idea. Morton disagreed, and a lifelong dispute began. For many years, Morton was credited as being the pioneer of general anesthesia in the Western hemisphere, despite the fact that his demonstration occurred four years after Long’s initial experience.

In 1847, Scottish obstetrician James Young Simpson (1811–1870) of Edinburgh was the first to use chloroform as a general anesthetic on a human (Robert Mortimer Glover had written on this possibility in 1842 but only used it on dogs). The use of chloroform anesthesia expanded rapidly thereafter in Europe. Chloroform began to replace ether as an anesthetic in the United States at the beginning of the 20th century. It was soon abandoned in favor of ether when its hepatic and cardiac toxicity, especially its tendency to cause potentially fatal cardiac dysrhythmias, became apparent.

In 1871, the German surgeon Friedrich Trendelenburg (1844–1924) published a paper describing the first successful elective human tracheotomy to be performed for the purpose of administration of general anesthesia.

In 1880, the Scottish surgeon William Macewen (1848–1924) reported on his use of orotracheal intubation as an alternative to tracheotomy to allow a patient with glottic edema to breathe, as well as in the setting of general anesthesia with chloroform. All previous observations of the glottis and larynx (including those of Manuel García, Wilhelm Hack and Macewen) had been performed under indirect vision (using mirrors) until 23 April 1895, when Alfred Kirstein (1863–1922) of Germany first described direct visualization of the vocal cords. Kirstein performed the first direct laryngoscopy in Berlin, using an esophagoscope he had modified for this purpose; he called this device an autoscope.

 

https://en.wikipedia.org/wiki/History_of_general_anesthesia

Permanent link: https://en.wikipedia.org/w/index.php?title=History_of_general_anesthesia&oldid=805843182

History & Evolution of Anesthesia: Ancient, Middle Ages and Renaissance Anesthetics


The discovery of anesthesia is one of the most important advancements of modern medicine. Attempts at producing a state of general anesthesia can be traced throughout recorded history in the writings of the ancient Sumerians, Babylonians, Assyrians, Egyptians, Greeks, Romans, Indians, and Chinese. During the Middle Ages, which correspond roughly to what is sometimes referred to as the Islamic Golden Age, scientists and other scholars made significant advances in science and medicine in the Muslim world and Eastern world.

The Renaissance saw significant advances in anatomy and surgical technique. However, despite all this progress, surgery remained a treatment of last resort. Largely because of the associated pain, many patients with surgical disorders chose certain death rather than undergo surgery. Although there has been a great deal of debate as to who deserves the most credit for the discovery of general anesthesia, it is generally agreed that certain scientific discoveries in the late 18th and early 19th centuries were critical to the eventual introduction and development of modern anesthetic techniques.

Two major advances occurred in the late 19th century, which together allowed the transition to modern surgery. An appreciation of the germ theory of disease led rapidly to the development and application of antiseptic techniques in surgery. Antisepsis, which soon gave way to asepsis, reduced the overall morbidity and mortality of surgery to a far more acceptable rate than in previous eras. Concurrent with these developments were the significant advances in pharmacology and physiology which led to the development of general anesthesia and the control of pain.

In the 20th century, the safety and efficacy of general anesthesia was improved by the routine use of tracheal intubation and other advanced airway management techniques. Significant advances in monitoring and new anesthetic agents with improved pharmacokinetic and pharmacodynamic characteristics also contributed to this trend. Standardized training programs for anesthesiologists and nurse anesthetists emerged during this period. The increased application of economic and business administration principles to health care in the late 20th and early 21st centuries led to the introduction of management practices.

Ancient anesthesia

The first attempts at general anesthesia were probably herbal remedies administered in prehistory. Alcohol is the oldest known sedative; it was used in ancient Mesopotamia thousands of years ago.

Opium

The Sumerians are said to have cultivated and harvested the opium poppy in lower Mesopotamia as early as 3400 BCE, though this has been disputed. A small white clay tablet dating to the end of the third millennium BCE was discovered in 1954 during excavations at Nippur; it is currently considered to be the most ancient pharmacopoeia in existence. About 2225 BCE, the Sumerian territory became a part of the Babylonian empire. Knowledge and use of the opium poppy and its euphoric effects thus passed to the Babylonians, who expanded their empire eastwards to Persia and westwards to Egypt, thereby extending its range to these civilizations. Opium was known to the Assyrians in the 7th century BCE.

  The ancient Egyptians had some surgical instruments, as well as crude analgesics and sedatives, including possibly an extract prepared from the mandrake fruit. The use of preparations similar to opium in surgery is recorded in the Ebers Papyrus, an Egyptian medical papyrus.

Prior to the introduction of opium to ancient India and China, these civilizations pioneered the use of cannabis incense and aconitum. Around 400 BCE, the Sushruta Samhita (a text from the Indian subcontinent on ayurvedic medicine and surgery) advocated the use of wine with incense of cannabis for anesthesia. By the 8th century CE, Arab traders had brought opium to India and China.

Classical antiquity

In Classical antiquity, anaesthetics were described by:

·         Dioscorides (De Materia Medica)

·         Galen

·         Hippocrates

·         Theophrastus (Historia Plantarum)

China


Bian Que (c. 300 BCE) was a legendary Chinese internist and surgeon who reportedly used general anesthesia for surgical procedures.

Hua Tuo (c. 145–220 CE) was a Chinese surgeon of the 2nd century CE. Before surgery, he administered an oral anesthetic potion known as mafeisan, probably dissolved in wine, in order to induce a state of unconsciousness and partial neuromuscular blockade.

The exact composition of mafeisan, like the rest of Hua Tuo’s clinical knowledge, was lost when he burned his manuscripts just before his death. Because Confucian teachings regarded the body as sacred and surgery was considered a form of body mutilation, surgery was strongly discouraged in ancient China. Because of this, despite Hua Tuo’s reported success with general anesthesia, the practice of surgery in ancient China ended with his death.

 

Other substances used from antiquity for anesthetic purposes include extracts of juniper and coca.

Middle Ages and Renaissance

Arabic and Persian physicians may have been among the first to utilize oral as well as inhaled anesthetics.

In 1000, Abu al-Qasim al-Zahrawi (936–1013), an Arab physician from Al-Andalus who is described as the father of surgery, published the 30-volume Kitab al-Tasrif, the first illustrated work on surgery. In this book, he wrote about the use of general anesthesia for surgery. Around 1020, Ibn Sīnā (980–1037) described the use of inhaled anesthesia in The Canon of Medicine. The Canon described the “soporific sponge”, a sponge imbued with aromatics and narcotics, which was to be placed under a patient’s nose during surgical operations. Ibn Zuhr (1091–1161) was another Arab physician from Al-Andalus. In his 12th-century medical textbook Al-Taisir, Ibn Zuhr describes the use of general anesthesia. These three physicians were among many who performed operations under inhaled anesthesia with the use of narcotic-soaked sponges. Opium made its way from Asia Minor to all parts of Europe between the 10th and 13th centuries.

 

Throughout 1200–1500 CE in England, a potion called dwale was used as an anesthetic. This mixture contained bile, opium, lettuce, bryony, and hemlock. Surgeons roused patients by rubbing vinegar and salt on their cheekbones. Records of dwale can be found in numerous literary sources, including Shakespeare’s Hamlet and the John Keats poem “Ode to a Nightingale”. The 13th century gives us the first prescription of the “spongia soporifica”: a sponge soaked in the juices of unripe mulberry, flax, mandragora leaves, ivy, lettuce seeds, lapathum, and hemlock with hyoscyamus. After treatment and/or storage, the sponge could be heated and the vapors inhaled with anesthetic effect.

Alchemist Ramon Llull has been credited with discovering diethyl ether in 1275. Aureolus Theophrastus Bombastus von Hohenheim (1493–1541), better known as Paracelsus, discovered the analgesic properties of diethyl ether around 1525. August Sigmund Frobenius gave the name Spiritus Vini Æthereus to the substance in 1730.

 

https://en.wikipedia.org/wiki/History_of_general_anesthesia

Permanent link: https://en.wikipedia.org/w/index.php?title=History_of_general_anesthesia&oldid=805843182

Ancient Medicine: Introduction of women as nurses and doctors


 

Introduction of Women Nurses and Doctors in 19th-century Modern Medicine

Women as physicians

It was very difficult for women to become doctors in any field before the 1970s. Elizabeth Blackwell (1821–1910) became the first woman to formally study and practice medicine in the United States. She was a leader in women’s medical education. While Blackwell viewed medicine as a means for social and moral reform, her student Mary Putnam Jacobi (1842–1906) focused on curing disease. At a deeper level of disagreement, Blackwell felt that women would succeed in medicine because of their humane female values, but Jacobi believed that women should participate as the equals of men in all medical specialties using identical methods, values and insights. In the Soviet Union, although the majority of medical doctors were women, they were paid less than the mostly male factory workers.

Women as nurses

Florence Nightingale triggered the professionalization of nursing.

Women had always served in ancillary roles, and as midwives and healers. The professionalization of medicine forced them increasingly to the sidelines. As hospitals multiplied in the early 19th century, those in Europe relied on orders of Roman Catholic nun-nurses and German Protestant and Anglican deaconesses. They were trained in traditional methods of physical care that involved little knowledge of medicine. The breakthrough to professionalization based on knowledge of advanced medicine was led by Florence Nightingale in England. She resolved to provide more advanced training than she saw on the Continent. Britain’s male doctors preferred the old system, but Nightingale won out; her Nightingale Training School opened in 1860 and became a model. The Nightingale solution depended on the patronage of upper-class women, and they proved eager to serve. Royalty became involved. In 1902, the wife of the British king took control of the nursing unit of the British army, became its president, and renamed it after herself as the Queen Alexandra’s Royal Army Nursing Corps; when she died, the next queen became president.

In the United States, upper-middle-class women who already supported hospitals promoted nursing. The new profession proved highly attractive to women of all backgrounds, and schools of nursing opened in the late 19th century. They soon became a function of large hospitals, where they provided a steady stream of low-paid idealistic workers. The International Red Cross began operations in numerous countries in the late 19th century, promoting nursing as an ideal profession for middle-class women.

The Nightingale model was widely copied. Linda Richards (1841–1930) studied in London and became the first professionally trained American nurse. She established nursing training programs in the United States and Japan, and created the first system for keeping individual medical records for hospitalized patients. The Russian Orthodox Church sponsored seven orders of nursing sisters in the late 19th century. They ran hospitals, clinics, almshouses, pharmacies, and shelters as well as training schools for nurses. In the Soviet era (1917–1991), with the aristocratic sponsors gone, nursing became a low-prestige occupation based in poorly maintained hospitals.

 

Women: Renaissance to Early Modern Period, 16th–18th Century

Catholic women played large roles in health and healing in medieval and early modern Europe. A life as a nun was a prestigious role. Wealthy families provided dowries for their daughters, and these funded the convents, while the nuns provided free nursing care for the poor.

The Catholic elites provided hospital services because of their theology of salvation, which held that good works were the route to heaven. The Protestant reformers rejected the notion that rich men could gain God’s grace through good works, and thereby escape purgatory, by providing cash endowments to charitable institutions. They also rejected the Catholic idea that poor patients earned grace and salvation through their suffering. Protestants generally closed all the convents and most of the hospitals, sending women home to become housewives, often against their will. On the other hand, local officials recognized the public value of hospitals, and some were continued in Protestant lands, but without monks or nuns and under the control of local governments.

In London, the crown allowed two hospitals to continue their charitable work, under nonreligious control of city officials. The convents were all shut down, but the historian Deborah Harkness finds that women, some of them former nuns, were part of a new system that delivered essential medical services to people outside their family. They were employed by parishes and hospitals, as well as by private families, and provided nursing care as well as some medical, pharmaceutical, and surgical services.

Meanwhile, in Catholic lands such as France, rich families continued to fund convents and monasteries, and enrolled their daughters as nuns who provided free health services to the poor. Nursing was a religious role for the nurse, and there was little call for science. 


Permanent link: https://en.wikipedia.org/w/index.php?title=History_of_medicine&oldid=783167827

Link: https://en.wikipedia.org/wiki/History_of_medicine
