Concept of Death Through the Ages: Ancient to Modern


   Death is an evolving and complex concept. Philosophers and theologians from around the globe have recognised the value that death holds for human life. Death and life are bound together: without death there would be no life. Death makes way for new ideas and new ways of living. Death also reminds us of our fragility and our sameness: we all die.

      Death is the inevitable conclusion of life, a universal destiny that all living creatures share. Even though all societies throughout history have realized that death is the certain fate of human beings, different cultures have responded to it in different ways. Through the ages, attitudes toward death and dying have changed and continue to change, shaped by religious, intellectual, and philosophical beliefs and conceptions. In the twenty-first century advances in medical science and technology continue to influence ideas about death and dying.

ANCIENT TIMES

Archaeologists have found that as early as the Paleolithic period, which began roughly 2.5 million years ago, humans held metaphysical beliefs about death and dying—beliefs beyond what humans can know with their senses. Tools and ornaments excavated at burial sites suggest that these earliest ancestors believed that some element of a person survived the dying experience.

Ancient Hebrews (c. 1020–586 B.C.), while acknowledging the existence of the soul, were not preoccupied with the afterlife. They lived according to the commandments of their God, to whom they entrusted their eternal destiny. By contrast, early Egyptians (c. 2900–950 B.C.) thought that the preservation of the dead body (mummification) guaranteed a happy afterlife. They believed a person had a dual soul: the ka and the ba. The ka was the spirit that dwelled near the body, whereas the ba was the vitalizing soul that lived on in the netherworld (the world of the dead). Similarly, the ancient Chinese (c. 2500–1000 B.C.) also believed in a dual soul, one part of which continued to exist after the death of the body. It was this spirit that the living venerated during ancestor worship.

Among the ancient Greeks (c. 2600–1200 B.C.), death was greatly feared. Greek mythology—which was full of tales of gods and goddesses who exacted punishment on disobedient humans—caused the living to follow rituals meticulously when burying their dead so as not to displease the gods. Even though reincarnation is usually associated with Asian religions, some Greeks were followers of Orphism, a religion that taught that the soul underwent many reincarnations until purification was achieved.

THE CLASSICAL AGE

Mythological beliefs among the ancient Greeks persisted into the classical age. The Greeks believed that after death the psyche (a person’s vital essence) lived on in the underworld. The Greek writer Homer (c. eighth century–c. seventh century B.C.) greatly influenced classical Greek attitudes about death through his epic poems the Iliad and the Odyssey. Greek mythology was freely interpreted by writers after Homer, and belief in eternal judgment and retribution continued to evolve throughout this period.

Certain Greek philosophers also influenced conceptions of death. For example, Pythagoras (569?–475? B.C.) opposed euthanasia (“good death,” or mercy killing) because it might disturb the soul’s journey toward final purification as planned by the gods. By contrast, Socrates (470?–399? B.C.) and Plato (428–348 B.C.) believed people could choose to end their lives if they were no longer useful to themselves or the state.

Like Socrates and Plato, the classical Romans (c. 509–264 B.C.) believed a person suffering from intolerable pain or an incurable illness should have the right to choose a “good death.” They considered euthanasia a “mode of dying” that allowed a person’s right to take control of an intolerable situation and distinguished it from suicide, an act considered to be a shirking of responsibilities to one’s family and to humankind.

THE MIDDLE AGES

During the European Middle Ages (c. 500–1485), death—with its accompanying agonies—was accepted as a destiny everyone shared, but it was still feared. As a defense against this phenomenon that could not be explained, medieval people confronted death together, as a community. Because medical practices in this era were crude and imprecise, the ill and dying person often endured prolonged suffering. However, a long period of dying gave the dying individual an opportunity to feel forewarned about impending death, to put his or her affairs in order, and to confess sins. The medieval Roman Catholic Church, with its emphasis on the eternal life of the soul in heaven or hell, held great power over people’s notions of death.

By the late Middle Ages the fear of death had intensified due to the Black Death—the great plague of 1347 to 1351—which killed more than twenty-five million people in Europe alone. Commoners watched not only their neighbors stricken but also church officials and royalty struck down: Queen Eleanor of Aragon and King Alfonso XI (1311–1350) of Castile met untimely deaths, as did many at the papal court at Avignon, France. With their perceived “proper order” of existence shaken, the common people became increasingly preoccupied with their own deaths and with the Last Judgment, God’s final and certain determination of the character of each individual. Because the Last Judgment determined each soul’s disposition to heaven or hell, such widespread and sudden death was terrifying.

THE RENAISSANCE

From the fourteenth through the sixteenth centuries, Europe experienced new directions in economics, the arts, and social, scientific, and political thought. Nonetheless, obsession with death did not diminish with this “rebirth” of Western culture. A new self-awareness and emphasis on humans as the center of the universe further fueled the fear of dying.

By the sixteenth century many European Christians were rebelling against the established church and had stopped relying on church, family, and friends to help ease their passage to the next life. The religious upheaval of the Protestant Reformation, beginning in 1517, which emphasized the individual nature of salvation, caused further uncertainties about death and dying.

The seventeenth century marked a shift from a religious to a more scientific exploration of death and dying. Lay people drifted away from the now disunited Christian church toward the medical profession, seeking answers in particular to the question of “apparent death,” a condition in which people appeared to be dead but were not. In many cases unconscious patients mistakenly believed to be dead were hurriedly prepared for burial by the clergy, only to “come back to life” during burial or while being transported to the cemetery.

An understanding of death and its aftermath was clearly still elusive, even to physicians who disagreed about what happened after death. Some physicians believed the body retained some kind of “sensibility” after death. Thus, many people preserved cadavers so that the bodies could “live on.” Alternatively, some physicians applied the teachings of the Catholic Church to their medical practice and believed that once the body was dead, the soul proceeded to its eternal fate and the body could no longer survive. These physicians did not preserve cadavers and pronounced them permanently dead.

THE EIGHTEENTH CENTURY

The fear of apparent death that took root in the seventeenth century resurfaced with great intensity during the eighteenth century. Coffins were built with contraptions to enable any prematurely buried person to survive and communicate from the grave.

For the first time, the Christian church was blamed for hastily burying its “living dead,” particularly because it had encouraged the abandonment of pagan burial traditions such as protracted mourning rituals. In the wake of apparent-death incidents, many of these longer burial traditions were revived.

THE NINETEENTH CENTURY

Premature and lingering deaths remained commonplace in the nineteenth century. Death typically took place in the home following a long deathbed watch. Family members prepared the corpse for viewing in the home, not in a funeral parlor. However, this practice changed during the late nineteenth century, when professional undertakers took over the job of preparing and burying the dead. They provided services such as readying the corpse for viewing and burial, building the coffin, digging the grave, and directing the funeral procession. Professional embalming and cosmetic restoration of bodies became widely available, all carried out in a funeral parlor where bodies were then viewed instead of in the home.

Cemeteries changed as well. Before the early nineteenth century, American cemeteries were unsanitary, overcrowded, and weed-filled places bearing an odor of decay. That began to change in 1831, when the Massachusetts Horticultural Society purchased seventy-two acres of fields, ponds, trees, and gardens in Cambridge and built Mount Auburn Cemetery. This cemetery was to become a model for the landscaped garden cemetery in the United States. These cemeteries were tranquil places where those grieving could visit the graves of loved ones and find comfort in the beautiful surroundings.

Literature of the time often focused on and romanticized death. Death poetry, consoling essays, and mourning manuals became available after 1830, which comforted the grieving with the concept that the deceased were released from worldly cares in heaven and that they would be reunited there with other deceased loved ones. The deadly lung disease tuberculosis—called consumption at the time—was pervasive during the nineteenth century in Europe and the United States. The disease caused sufferers to develop a certain appearance—an extreme pallor and thinness, with a look often described as haunted—that actually became a kind of fashion statement. The fixation on the subject by writers such as Edgar Allan Poe (1809–1849) and the English Romantic poets helped fuel the public’s fascination with death and dying. In the late twentieth and early twenty-first centuries the popularization of the Goth look is sometimes associated with the tubercular appearance.

Spiritualism

By the mid-nineteenth century the romanticizing of death took on a new twist in the United States. Spiritualism, in which the living communicate directly with the dead, began in 1848 in the United States with the Fox sisters: Margaret Fox (1833?–1893) and Catherine Fox (1839?–1892) of Hydesville, New York. The sisters claimed to have communicated with the spirit of a man murdered by a former tenant in their house. The practice of conducting “sittings” to contact the dead gained instant popularity. Mediums, such as the Fox sisters, were supposedly sensitive to “vibrations” from the disembodied souls that temporarily lived in that part of the spirit world just outside the earth’s limits.

This was not the first time people tried to communicate with the dead. Spiritualism has been practiced in cultures all over the world. For example, many Native Americans believe shamans (priests or medicine men) have the power to communicate with the spirits of the dead. The Old Testament (I Samuel 28:7–19) recounts the visit of King Saul to a medium at Endor, who summoned the spirit of the prophet Samuel, which predicted the death of Saul and his sons.

The mood in the United States in the 1860s and 1870s was ripe for Spiritualist séances. Virtually everyone had lost a son, husband, or other loved one during the Civil War (1861–1865). Some survivors wanted assurances that their loved ones were all right; others were simply curious about life after death. Those who had drifted away from traditional Christianity embraced this new Spiritualism, which claimed scientific proof of survival after physical death.

THE MODERN AGE

Modern medicine has played a vital role in the way people die and, consequently, the manner in which the dying process of a loved one affects relatives and friends. With advancements in medical technology, the dying process has become depersonalized, as it has moved away from the familiar surroundings of home and family to the sterile world of hospitals and strangers. Certainly, the institutionalization of death has not diminished the fear of dying. Now, the fear of death also involves the fear of separation: for the living, the fear of not being present when a loved one dies, and for the dying, the prospect of facing death without the comforting presence of a loved one.

Changing Attitudes

In the last decades of the twentieth century, attitudes about death and dying slowly began to change. Aging baby boomers (people born between 1946 and 1964), facing the deaths of their parents, began to confront their own mortality. Even though medical advances continue to increase life expectancy, they have raised an entirely new set of issues associated with death and dying. For example, how long should advanced medical technology be used to keep comatose people alive? How should the elderly or incapacitated be cared for? Is it reasonable for people to stop medical treatment, or even actively end their life, if that is what they wish?

The works of the psychiatrist Elisabeth Kübler-Ross (1926–2004), including the pioneering book On Death and Dying (1969), have helped individuals from all walks of life confront the reality of death and restore dignity to those who are dying. Considered to be a highly respected authority on death, grief, and bereavement, Kübler-Ross influenced the medical practices undertaken at the end of life, as well as the attitudes of physicians, nurses, clergy, and others who care for the dying.

During the late 1960s medical education was revealed to be seriously deficient in areas related to death and dying. However, initiatives under way in the late twentieth and early twenty-first centuries have offered more comprehensive training about end-of-life care. With the introduction of in-home hospice care, more terminally ill people have the option of spending their final days at home with their loved ones. With the veil of secrecy lifted and open public discussions about issues related to the end of life, Americans appear more ready to learn about death and to learn from the dying.

Hospice Care

In the Middle Ages hospices were refuges for the sick, the needy, and travellers. The modern hospice movement developed in response to the need to provide humane care to terminally ill patients, while at the same time lending support to their families. The English physician Dame Cicely Saunders (1918–2005) is considered the founder of the modern hospice movement, which took root first in England in 1967 and later in Canada and the United States. The soothing, calming care provided by hospice workers is called palliative care; it aims to relieve patients’ pain and the accompanying symptoms of terminal illness, while providing comfort to patients and their families.

Hospice may refer to a place—a freestanding facility or designated floor in a hospital or nursing home—or to a program such as hospice home care, in which a team of health-care professionals helps the dying patient and family at home. Hospice teams may involve physicians, nurses, social workers, pastoral counsellors, and trained volunteers.

WHY PEOPLE CHOOSE HOSPICE CARE. Hospice workers consider the patient and family to be the “unit of care” and focus their efforts on attending to emotional, psychological, and spiritual needs as well as to physical comfort and well-being. With hospice care, as a patient nears death, medical details recede into the background and personal concerns come to the fore; care the patient does not want is avoided, even when it might offer some clinical benefit.

THE POPULATION SERVED. Hospice facilities served 621,100 people in 2000; of these, 85.5% died while in hospice care. Nearly 80% of hospice patients were sixty-five years of age and older, and 26.5% were eighty-five years of age or older. Male hospice patients numbered 309,300, whereas 311,800 were female. The vast majority (84.1%) were white. Approximately half (46.6%) of the patients served were unmarried, but most of these unmarried patients were widowed. Nearly 79% of patients used Medicare as their primary source of payment for hospice services.

Even though more than half (57.5%) of those admitted to hospice care in 2000 had cancer (malignant neoplasms) as a primary diagnosis, patients with other primary diagnoses, such as Alzheimer’s disease and heart, respiratory, and kidney diseases, were also served by hospice.


History & Evolution of Intensive (Critical) Care Units


              The English nurse Florence Nightingale pioneered efforts to use a separate hospital area for critically injured patients. During the Crimean War in the 1850s, she introduced the practice of moving the sickest patients to the beds directly opposite the nursing station on each ward so that they could be monitored more closely. In 1923, the American neurosurgeon Walter Dandy created a three-bed unit at the Johns Hopkins Hospital, where specially trained nurses cared for critically ill postoperative neurosurgical patients.

           The Danish anaesthesiologist Bjørn Aage Ibsen became involved in the 1952 poliomyelitis epidemic in Copenhagen, where 2,722 patients developed the illness in a six-month period, with 316 of those developing some form of respiratory or airway paralysis. Some of these patients had been treated using the few available negative-pressure ventilators, but these devices, while helpful, were limited in number and did not protect the patient’s lungs from aspiration of secretions. Ibsen changed management directly by instituting long-term positive-pressure ventilation using tracheal intubation, and he enlisted 200 medical students to manually pump oxygen and air into the patients’ lungs round the clock. At this time, Carl-Gunnar Engström had developed one of the first artificial positive-pressure volume-controlled ventilators, which eventually replaced the medical students. With the change in care, mortality during the epidemic declined from 90% to around 25%. Patients were managed in three special 35-bed areas, which simplified the charting of medications and other aspects of care.

        In 1953, Ibsen set up what became the world’s first intensive care unit in a converted student nurse classroom in Copenhagen Municipal Hospital. He provided one of the first accounts of the management of tetanus using neuromuscular-blocking drugs and controlled ventilation.

         The following year, Ibsen was elected head of the department of anaesthesiology at that institution. He jointly authored the first known account of intensive care management principles in the journal Nordisk Medicin, with Tone Dahl Kvittingen from Norway.

      For a time in the early 1960s, it was not clear that specialized intensive care units were needed, so intensive care resources were brought to the room of the patient who needed the additional monitoring, care, and resources. It rapidly became evident, however, that a fixed location where intensive care resources and dedicated personnel were available provided better care than ad hoc provision of intensive care services spread throughout a hospital. In 1962, the first critical care residency in the United States was established at the University of Pittsburgh. In 1970, the Society of Critical Care Medicine was formed.

How an Epidemic Led to the Development of the Intensive Care Unit

The number of hospital admissions was more than the staff had ever seen. And people kept coming. Dozens each day. They were dying of respiratory failure. Doctors and nurses stood by, unable to help without sufficient equipment.

It was the polio epidemic of August 1952, at Blegdam Hospital in Copenhagen. This little-known event marked the start of intensive-care medicine and the use of mechanical ventilation outside the operating theatre — the very care that is at the heart of managing the COVID-19 crisis.

In 1952, the iron lung was the main way to treat the paralysis that stopped some people with poliovirus from breathing. Copenhagen was an epicentre of one of the worst polio epidemics that the world had ever seen. The hospital admitted 50 infected people daily, and each day, 6–12 of them developed respiratory failure. The whole city had just one iron lung. In the first few weeks of the epidemic, 87% of those with bulbar or bulbospinal polio, in which the virus attacks the brainstem or nerves that control breathing, died. Around half were children.

Desperate for a solution, the chief physician of Blegdam called a meeting. Asked to attend: Bjørn Ibsen, an anaesthesiologist recently returned from training at the Massachusetts General Hospital in Boston. Ibsen had a radical idea. It changed the course of modern medicine.

Student saviours

The iron lung used negative pressure. It created a vacuum around the body, forcing the ribs, and therefore the lungs, to expand; air would then rush into the trachea and lungs to fill the void. The concept of negative-pressure ventilation had been around for hundreds of years, but the device that became widely used — the ‘Drinker respirator’ — was invented in 1928 by Philip Drinker and Louis Agassiz Shaw, professors at the School of Public Health in Boston, Massachusetts. Others went on to refine it, but the basic mechanism remained the same until 1952.

Iron lungs only partially solved the paralysis problem. Many people with polio placed in one still died. Among the most frequent complications was aspiration — saliva or stomach contents would be sucked from the back of the throat into the lungs when a person was too weak to swallow. There was no protection of the airway.

Ibsen suggested the opposite approach. His idea was to blow air directly into the lungs to make them expand, and then allow the body to passively relax and exhale. He proposed the use of a tracheostomy — an incision in the neck, through which a tube goes into the windpipe and delivers oxygen to the lungs — combined with positive-pressure ventilation. At the time, this was often done briefly during surgery, but it had rarely been used on a hospital ward.

Ibsen was given permission to try the technique the next day. We even know the name of his first patient: Vivi Ebert, a 12-year-old girl on the brink of death from paralytic polio. Ibsen demonstrated that it worked: the tracheostomy protected her lungs from aspiration, and by squeezing a bag attached to the tube, Ibsen kept her alive. Ebert survived for almost 20 more years, ultimately dying of an infection in the same hospital in 1971.

The plan was hatched to use this technique on all the patients in Blegdam who needed help to breathe. The only problem? There were no ventilators.

Very early versions of positive-pressure ventilators had been around from about 1900, used for surgery and by rescuers during mining accidents. Further technical developments during the Second World War helped pilots to breathe in the decreased pressures at high altitudes. But modern ventilators, to support a person for hours or days, had yet to be invented.

What followed was one of the most remarkable episodes in health-care history: in six-hour shifts, medical and dental students from the University of Copenhagen sat at the bedside of every person with paralysis and ventilated them by hand. The students squeezed a bag connected to the tracheostomy tube, forcing air into the lungs. They were instructed in how many breaths to administer each minute, and sat there hour after hour. This went on for weeks, and then months, with hundreds of students rotating on and off. By mid-September, the mortality for patients with polio who had respiratory failure had dropped to 31%. It is estimated that the heroic scheme saved 120 people.

Major insights emerged from the Copenhagen polio epidemic. One was a better understanding of why people died of polio. Until then, it was thought that kidney failure was the cause. Ibsen recognized that inadequate ventilation caused carbon dioxide to build up in the blood, making it very acidic — which caused organs to shut down.

Three further lessons are central today. First, Blegdam demonstrated what can be achieved by a medical community coming together, with remarkable focus and stamina. Second, it proved that keeping people alive for weeks, and months, with positive-pressure ventilation was feasible. And third, it showed that by bringing together all the patients struggling to breathe, it was easier to care for them in one place where the doctors and nurses had expertise in respiratory failure and mechanical ventilation.

So, the concept of an intensive-care unit (ICU) was born. After the first one was set up in Copenhagen the following year, ICUs proliferated. And the use of positive pressure, with ventilators instead of students, became the norm.

In the early years, many of the safety features of modern ventilators did not exist. Doctors who worked in the 1950s and 1960s describe caring for patients without any alarms; if the ventilator accidentally disconnected and the nurse’s back was turned, the person would die. Early ventilators forced people to breathe at a set rate, but modern ones sense when a patient wants to breathe, and then help provide a push of air into the lungs in time with the body. The original apparatus also gathered limited information on how stiff or compliant the lungs were, and gave everyone a set amount of air with each breath; modern machines take many measurements of the lungs, and allow for choices regarding how much air to give with each breath. All of these are refinements of the original ventilators, which were essentially automatic bellows and tubing.


Most Unethical Medical Study Ever- Tuskegee Study for Syphilis


One of the ugliest and most unethical human studies in history, the Tuskegee Study raised a host of ethical issues: informed consent, racism, paternalism, unfair subject selection in research, maleficence, truth-telling, and justice, among others. The heinous nature of the Tuskegee Study is almost beyond belief.

    The Public Health Service started the study in 1932 in collaboration with Tuskegee University (then the Tuskegee Institute), a historically Black college in Alabama. In the study, investigators enrolled a total of 600 impoverished African-American sharecroppers from Macon County, Alabama.


The goal was to “observe the natural history of untreated syphilis” in black populations. But the subjects were unaware of this and were simply told they were receiving treatment for “bad blood.” In fact, they received no treatment at all. Even after penicillin was established as a safe and reliable cure for syphilis, the majority of the men did not receive it.

In 1932, the USPHS, working with the Tuskegee Institute, began a study to record the natural history of syphilis. It was originally called the “Tuskegee Study of Untreated Syphilis in the Negro Male” (now referred to as the “USPHS Syphilis Study at Tuskegee”).

The study initially involved 600 Black men – 399 with syphilis, 201 who did not have the disease. Participants’ informed consent was not collected. Researchers told the men they were being treated for “bad blood,” a local term used to describe several ailments, including syphilis, anemia, and fatigue. In exchange for taking part in the study, the men received free medical exams, free meals, and burial insurance.

By 1947, penicillin was widely available and had become the standard treatment for syphilis, yet none of the infected participants were offered it.

The purpose of the study was to observe the effects of the disease when left untreated, even though, by the study’s end, medical advances had made it entirely treatable. The men were never informed of the true nature of the experiment, and more than 100 died as a result.


Of these men, 399 had latent syphilis, with a control group of 201 men who were not infected. As an incentive for participation in the study, the men were promised free medical care. While the men were provided with both medical and mental care that they otherwise would not have received, they were deceived by the PHS, which never informed them of their syphilis diagnosis and instead provided disguised placebos, ineffective methods, and diagnostic procedures as “treatment” for “bad blood”.

The men were initially told that the experiment would last only six months, but it was extended to 40 years. After funding for treatment was lost, the study continued without informing the men that they would never be treated.

The study continued, under numerous Public Health Service supervisors, until 1972, when a leak to the press resulted in its termination on November 16 of that year.  By then, 28 patients had died directly from syphilis, 100 died from complications related to syphilis, 40 of the patients’ wives were infected with syphilis, and 19 children were born with congenital syphilis.

The 40-year Tuskegee Study was a major violation of ethical standards, and has been cited as “arguably the most infamous biomedical research study in U.S. history.”  Its revelation has also been an important cause of distrust in medical science and the US government amongst African Americans.

In 1973, a class-action lawsuit was filed on behalf of the study participants and their families, resulting in a $10 million out-of-court settlement in 1974.

On May 16, 1997, President Bill Clinton formally apologized on behalf of the United States to the victims of the study, calling it shameful and racist. “What was done cannot be undone, but we can end the silence,” he said. “We can stop turning our heads away. We can look at you in the eye, and finally say, on behalf of the American people, what the United States government did was shameful and I am sorry.”


History & Evolution of Vaccination


Before the first vaccinations, in the sense of using cowpox to inoculate people against smallpox, people in China and elsewhere were inoculated with smallpox material itself — a practice later copied in the West and known as variolation.

Variolation was the method of inoculation first used to immunize individuals against smallpox (Variola) with material taken from a patient or a recently variolated individual, in the hope that a mild, but protective, infection would result.

The procedure was most commonly carried out by inserting or rubbing powdered smallpox scabs, or fluid from pustules, into superficial scratches made in the skin.

The earliest hints of the practice of variolation for smallpox in China date to the 10th century. The oldest documented account of the Chinese practice comes from Wan Quan’s (1499–1582) Douzhen Xinfa of 1549. The Chinese implemented a method of “nasal insufflation”, administered by blowing powdered smallpox material, usually scabs, up the nostrils.

Various insufflation techniques were recorded in China throughout the sixteenth and seventeenth centuries. Two reports on the Chinese practice of inoculation reached the Royal Society in London in 1700: one from Martin Lister, who had received an account from an East India Company employee stationed in China, and another from Clopton Havers. In France, Voltaire reported that the Chinese had practiced variolation “these hundred years”.

In 1796, Edward Jenner, a doctor in Berkeley in Gloucestershire, England, tested the common belief that a person who had contracted cowpox would be immune to smallpox. He took material from cowpox vesicles on a milkmaid named Sarah Nelmes and used it to infect an eight-year-old boy named James Phipps. Two months later he inoculated the boy with smallpox, and smallpox did not develop.

In 1798, Jenner published An Inquiry into the Causes and Effects of the Variolae Vaccinae, which created widespread interest. He distinguished ‘true’ from ‘spurious’ cowpox (which did not give the desired effect) and developed an “arm-to-arm” method of propagating the vaccine from the vaccinated individual’s pustule. Early attempts at confirmation were confounded by contamination with smallpox, but despite controversy within the medical profession and religious opposition to the use of animal material, by 1801 his report had been translated into six languages and over 100,000 people had been vaccinated. The term vaccination was coined in 1800 by the surgeon Richard Dunning in his text Some observations on vaccination.

In 1802, the Scottish physician Helenus Scott vaccinated dozens of children in Mumbai (previously Bombay) against smallpox using Jenner’s cowpox vaccine. In the same year Scott penned a letter to the editor of the Bombay Courier, declaring that “We have it now in our power to communicate the benefits of this important discovery to every part of India, perhaps to China and the whole eastern world”. Subsequently, vaccination became firmly established in British India. A vaccination campaign was started in the new British colony of Ceylon in 1803.

By 1807 the British had vaccinated more than a million Indians and Sri Lankans against smallpox. Also in 1803, the Spanish Balmis Expedition launched the first transcontinental effort to vaccinate people against smallpox. Following a smallpox epidemic in 1816, the Kingdom of Nepal ordered smallpox vaccine and requested the English veterinarian William Moorcroft to help launch a vaccination campaign. In the same year a law was passed in Sweden requiring the vaccination of children against smallpox by the age of two. Prussia briefly introduced compulsory vaccination in 1810 and again in the 1820s, but decided against a compulsory vaccination law in 1829.

    A law on compulsory smallpox vaccination was introduced in the Province of Hanover in the 1820s. In 1826, in Kragujevac,  future prince Mihailo of Serbia was the first person to be vaccinated against smallpox in the principality of Serbia. 

Following a smallpox epidemic in 1837 that caused 40,000 deaths, the British government initiated a concentrated vaccination policy, starting with the Vaccination Act of 1840, which provided for universal vaccination and prohibited variolation.

    The Vaccination Act 1853 introduced compulsory smallpox vaccination in England and Wales.

The law followed a severe outbreak of smallpox in 1851 and 1852. It provided that the poor law authorities would continue to dispense vaccination to all free of charge, and that records were to be kept on vaccinated children by the network of birth registrars. It was accepted at the time that voluntary vaccination had not reduced smallpox mortality, but the Vaccination Act 1853 was so badly implemented that it had little impact on the number of children vaccinated in England and Wales.

In the United States of America compulsory vaccination laws were upheld in the 1905 landmark case Jacobson v. Massachusetts by the Supreme Court of the United States. The Supreme Court ruled that laws could require vaccination to protect the public from dangerous communicable diseases. However, in practice the United States had the lowest rate of vaccination among industrialized nations in the early 20th century.

    Compulsory vaccination laws began to be enforced in the United States after World War II. In 1959 the World Health Organization (WHO) called for the eradication of smallpox worldwide, as smallpox was still endemic in 33 countries.

In the 1960s, six to eight children died each year in the United States from vaccination-related complications. According to the WHO, in 1966 there were about 100 million cases of smallpox worldwide, causing an estimated two million deaths.

By the 1970s the risk of contracting smallpox had become so small that the United States Public Health Service recommended that routine smallpox vaccination be ended.

   By 1974 the WHO smallpox vaccination program had confined smallpox to parts of Pakistan, India, Bangladesh, Ethiopia and Somalia.

     In 1977 the WHO recorded the last case of smallpox infection acquired outside a laboratory in Somalia. In 1980 the WHO officially declared the world free of smallpox.

   In 1974 the WHO adopted the goal of universal vaccination by 1990 to protect children against six preventable infectious diseases: measles, poliomyelitis, diphtheria, whooping cough, tetanus, and tuberculosis.

In the 1980s only 20 to 40% of children in developing countries were vaccinated against these six diseases. In wealthy nations, the number of measles cases had dropped dramatically after the introduction of the measles vaccine in 1963. WHO figures demonstrate that in many countries a decline in measles vaccination leads to a resurgence in measles cases. Measles is so contagious that public health experts believe a vaccination rate close to 100% is needed to control the disease. Despite decades of mass vaccination, polio remains a threat in India, Nigeria, Somalia, Niger, Afghanistan, Bangladesh and Indonesia.
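The near-total coverage cited for measles follows from its very high basic reproduction number R0. As a hedged sketch (the R0 values below are commonly cited textbook estimates, not figures from this article), the standard herd-immunity threshold 1 - 1/R0 can be computed as:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt sustained spread."""
    return 1.0 - 1.0 / r0

# Assumed, commonly cited R0 estimates for illustration only.
for disease, r0 in {"measles": 15.0, "pertussis": 14.0, "polio": 6.0}.items():
    print(f"{disease}: R0 = {r0} -> threshold ~ {herd_immunity_threshold(r0):.0%}")
```

For measles (R0 often estimated at 12 to 18) the threshold lands in the low-to-mid 90s percent, which is why experts push for coverage approaching 100%.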

By 2006, global health experts had concluded that the eradication of polio would only be possible if the supply of drinking water and sanitation facilities in slums improved. The deployment of a combined DPT vaccine against diphtheria, pertussis (whooping cough) and tetanus in the 1950s was considered a major advance for public health. But over the course of vaccination campaigns spanning decades, DPT vaccines became associated with high incidences of side effects. Despite improved DPT vaccines coming onto the market in the 1990s, they became the focus of anti-vaccination campaigns in wealthy nations. As immunization rates decreased, outbreaks of pertussis increased in many countries.

      In 2000, the Global Alliance for Vaccines and Immunization was established to strengthen routine vaccinations and introduce new and under-used vaccines in countries with a per capita GDP of under US$1000.

Challenges of Covid Vaccination-2021


If 2020 was consumed by the Covid virus, 2021 will be the year of Covid vaccination.

All over the world, billions of people are going to get the vaccine.

Corona vaccination is one of the most anticipated events in every country. In the coming weeks, multiple vaccines are likely to get regulatory approval.

However, while making a good vaccine was the difficult part, earning the public's trust in the vaccine will be another. The hurried development at pandemic speed and the lack of awareness about safety issues, in particular, will be areas of concern.

Adverse events, the unexpected medical problems that occur with drug treatments, are an unavoidable part of any therapy, including vaccination.

Systems need to be in place to identify any causal relationship between the vaccine and an adverse event. Objective criteria have to be in place to identify and treat such events, as the population to be vaccinated is very large.
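As a sketch of what such objective criteria might look like in software, the toy triage function below classifies an adverse-event report by time window and a known-reaction list; the 30-day window and the reaction list are invented placeholders for illustration, not actual regulatory criteria.

```python
# Invented placeholder list of expected, documented vaccine reactions.
KNOWN_REACTIONS = {"injection-site pain", "fever", "headache"}

def triage_aefi(reaction: str, days_since_dose: int, window_days: int = 30) -> str:
    """Toy triage of an adverse event following immunization (AEFI)."""
    if days_since_dose > window_days:
        # Too long after the dose to plausibly attribute to the vaccine.
        return "unlikely related (outside observation window)"
    if reaction in KNOWN_REACTIONS:
        return "consistent with known reactogenicity"
    # Unexpected event inside the window: escalate rather than decide.
    return "indeterminate: refer for expert causality review"

print(triage_aefi("fever", 2))
print(triage_aefi("memory loss", 11))
```

The point of such a rule set is consistency at scale: with millions of reports, every record gets the same first-pass classification before expert review.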

Equally challenging hurdles will be sourcing, distributing and administering the actual vaccine doses.

Preparing for this mammoth exercise will itself be a herculean task. It may take months to build the supply and cold chains, and preparation for this needs to begin now.

A systematic approach needs to be ready so that vaccination proceeds smoothly and quickly as soon as doses are available. For example, arranging transport vehicles and storage facilities for billions of doses at distant places will be one of the challenges.

It will take a mammoth number of healthcare workers to vaccinate people across different towns and cities.
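The scale of the staffing problem can be seen with back-of-envelope arithmetic; every figure below (population, doses per person, campaign length, throughput per worker) is an assumption chosen purely for illustration, not a real plan.

```python
import math

def staff_needed(population: int, doses_per_person: int,
                 campaign_days: int, doses_per_worker_per_day: int) -> int:
    """Minimum vaccinators required to finish the campaign on schedule."""
    total_doses = population * doses_per_person
    # Round up at each stage: a fractional worker still means one more hire.
    daily_doses = math.ceil(total_doses / campaign_days)
    return math.ceil(daily_doses / doses_per_worker_per_day)

# Assumed: 1 billion people, 2 doses each, one year, 100 doses/worker/day.
print(staff_needed(1_000_000_000, 2, 365, 100))  # 54795 full-time vaccinators
```

Even under these optimistic assumptions the answer is tens of thousands of dedicated vaccinators working every day for a year, before accounting for supervisors, cold-chain handlers and record-keepers.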

If not done in a well-planned manner, this exercise could result in chaos. Failure to set up a system will result not only in suboptimal vaccination but also in non-uniform supplies. Maintaining the cold chain will be crucial for effectiveness.

People should get the vaccine based on need rather than black marketing or money power. Issues that look insignificant, like financial complexity among the various stakeholders or customs clearances, need to be settled first, as they may become significant hurdles to smooth distribution.

Most important will be safeguarding citizens' faith in vaccines and clinical trials, for the science of the future will require people's co-operation, faith and participation.

Government regulators and vaccine makers need to recognize the utmost importance of communicating the true results of trials effectively to the public. Misinformation and distrust should not be allowed to undermine the good work of medical science.

Authorities mum on Adverse Event at Covid Vaccine Trial


Safety data of Covid vaccines: disclosure needed

There are two important aspects of a successful vaccine:

1. Efficacy in prevention

2. Safety

Given that the Covid vaccine is needed urgently and will be developed within a year's time, some doubts about the safety aspect are natural. But safety can be assured if the data about side effects is made public.

All the companies, in a bid to rush their vaccines to market, are eager to create hype. But caution needs to be exercised against such hyping, especially when long-term safety data is not available.

Even the sparse details of severe side effects that leak into the public domain may be just the tip of the iceberg, as far as the long-term safety data of a vaccine is concerned.

All side effects, mild or severe, need to be made known and placed in the public domain, rather than exposed later, after use.

More than a month and a half after an adverse event occurred in an Indian clinical trial of the AstraZeneca vaccine, the Central Drug Standard Control Organisation (CDSCO), the regulator for vaccine trials, has not issued any statement on the occurrence. It also has not responded to queries about whether it has completed its investigation to determine if the trial participant's illness was related to the vaccine. Serum Institute, which is partnering with the pharma MNC and Oxford University to produce the vaccine in India, has also refused to comment.

This is in sharp contrast to AstraZeneca and Oxford University, which went public when a trial participant in the UK fell ill, halting the trial until an independent safety monitoring board and the UK's regulatory authority gave safety clearance. Information about the serious adverse event (SAE) during the Indian trial came from the family of the trial participant, which has sent the company and the regulators a legal notice. Serum Institute merely stated that it would issue an official statement next week.

AstraZeneca had issued a statement within days of the UK trial participant falling ill and had halted trials in the UK, Brazil and South Africa. The trial was resumed within a week, after the independent safety review committee and national regulators gave clearance. The Indian Council of Medical Research (ICMR) is a co-sponsor of the trial along with Serum Institute.

According to the ICMR, it is for the DCGI to take a call on whether or not to halt the trial. The DCGI heads the CDSCO.

The 40-year-old trial participant, a business consultant with an MBA from New Zealand who says he took part in the trial deeming it his duty to help such an important venture, was administered the vaccine at SRMC on October 1. Eleven days later he woke up with a severe headache; he progressively lost his memory, showed behavioural changes, became disoriented, and was unable to talk or recognise his family members, according to the legal notice. As soon as he fell ill, he was admitted to the ICU at SRMC.

“Though the legal notice we have served talks of a compensation of Rs 5 crore, our focus is not on monetary compensation. It was sent just last week, more than a month after the occurrence when we saw that none of the authorities was making the adverse event public. They ought to have warned other participants so that they could watch out for similar symptoms. We want to know why the occurrence of the adverse event has been kept under wraps and why the trial was not halted like it was done in the UK. Is an Indian life of less value than that of an UK citizen?” asked a close family friend who has been helping the family cope with the illness.

WHO says more data needed on AstraZeneca dose

Ayurvedic Surgery: 10 Technical Questions About Safety Concerns


      If there are certain doubts about the safety of the patient, the apprehension needs to be addressed.

      The government has issued a notification which authorises post-graduate practitioners in specified streams of Ayurveda to be trained to perform surgical procedures such as excisions of benign tumours, amputation of gangrene, nasal and cataract surgeries.

    The notification by the Central Council of Indian Medicine, a statutory body under the AYUSH Ministry to regulate the Indian systems of medicine, listed 39 general surgery procedures and around 19 procedures involving the eye, ear, nose and throat by amending the Indian Medicine Central Council (Post Graduate Ayurveda Education) Regulations, 2016.

Any surgery, howsoever simple it may look to people sitting on the fence, carries some risk and needs precautions and regulation to make it safe. Therefore, if there are doubts about the safety of the patient, the apprehension needs to be addressed. If the public is to avail itself of surgery by Ayurveda surgeons, a certain confidence needs to be generated about safety and quality assurance. A mere push by an enforced law will not generate trust and confidence. So there needs to be a technical analysis of some kind: is this a genuine, original strategy or merely an imposed law?

If it were already an accepted practice, there would be no need for such a notification. So apparently, if the need was felt to assert it in such a forceful manner, there must be something unusual about the practice.

No doubt, ancient Ayurvedic texts referred to surgical practices. But in the present era of consumerism, patients need to know how it has been practiced over the last 200 to 300 years, and what the results and complication data are.

  There are two main categories for the purpose of discussion.

A. Existence of a robust system

B. Individual competencies.

First, there should be a basic, robust system to generate Ayurvedic surgeons.

To start with, the CCIM needs to answer the following questions about the basic requirements of surgery.

1. What kind of anaesthesia will be used in surgeries by Ayurveda surgeons? Who will be the anaesthesiologist?

2. What post-operative painkillers will be used in surgeries by Ayurveda surgeons?

3. What antibiotics will be used: allopathic or Ayurvedic?

4. What are the principles of pre-operative evaluation?

5. How do the surgical techniques differ? Are they the same as those used in allopathic surgery, or different ones described in Ayurveda?

6. How are post-operative complications managed? Is it by using allopathic medications and investigations?

7. What is the data on surgeries done in the last decade or two in Ayurvedic medical colleges, especially those performed by Ayurvedic surgeons?

8. Who is teaching Ayurveda doctors about surgery? Are there Ayurvedic teachers, or are they taught by allopathic surgeons?

9. Will people in higher positions and government officials avail themselves of such facilities, or are they only for the poor?

10. Will patients be given enough information, and informed consent regarding such Ayurvedic surgeons, before surgery?

More than a law, the whole exercise will require trust-building with the public, along with quality assurance and something unique, to make such surgeries practically happen.

Potential Ray of Hope: Highly effective coronavirus antibodies


The identification of highly effective antibodies will not only provide passive immunity but can be helpful in developing a vaccine as well. This discovery may be a potential ray of hope in the war against Covid.

Highly effective coronavirus antibodies discovered may lead to passive Covid-19 vaccine

BERLIN: Scientists have identified highly effective antibodies against the novel coronavirus, which they say could lead to the development of a passive vaccination for Covid-19. Unlike active vaccination, passive vaccination involves the administration of ready-made antibodies, which are degraded after some time. However, the effect of a passive vaccination is almost immediate, whereas with an active vaccination it has to build up first, the researchers said. The research, published in the journal Cell, also shows that some SARS-CoV-2 antibodies bind to tissue samples from various organs, which could potentially trigger undesired side effects.

The scientists at the German Center for Neurodegenerative Diseases (DZNE) and Charite – Universitatsmedizin Berlin isolated almost 600 different antibodies from the blood of individuals who had overcome Covid-19, the disease triggered by SARS-CoV-2. By means of laboratory tests, they were able to narrow this number down to a few antibodies that were particularly effective at binding to the virus. The researchers then produced these antibodies artificially using cell cultures. The so-called neutralising antibodies bind to the virus, as crystallographic analysis reveals, and thus prevent the pathogen from entering cells and reproducing, they said. In addition, virus recognition by antibodies helps immune cells to eliminate the pathogen.

Studies in hamsters, which, like humans, are susceptible to infection by SARS-CoV-2, confirmed the high efficacy of the selected antibodies. “If the antibodies were given after an infection, the hamsters developed mild disease symptoms at most. If the antibodies were applied preventively, before infection, the animals did not get sick,” said Jakob Kreye, coordinator of the research project.

The researchers noted that treating infectious diseases with antibodies has a long history. For Covid-19, this approach is also being investigated through the administration of plasma derived from the blood of recovered patients; with the plasma, antibodies of donors are transferred, they said. “Ideally, the most effective antibody is produced in a controlled manner on an industrial scale and in constant quality. This is the goal we are pursuing,” said Momsen Reincke, first author of the research.

“Three of our antibodies are particularly promising for clinical development,” explained Harald Pruss, a research group leader at the DZNE and also a senior physician at Charite – Universitatsmedizin Berlin. “Using these antibodies, we have started to develop a passive vaccination against SARS-CoV-2,” Pruss said. In addition to the treatment of patients, preventive protection of healthy individuals who have had contact with infected persons is also a potential application, the researchers said. How long the protection lasts will have to be investigated in clinical studies. In general, the protection provided by a passive vaccination is less persistent than that provided by an active vaccination, the researchers said. “It would be best if both options were available so that a flexible response could be made depending on the situation,” Pruss added.

Plasma therapy- life saving for Covid?


A few months ago, there was hope, and presumed scientific reason to believe, that plasma therapy would be a wonderful option in the Covid pandemic. But that belief needed to be strengthened by robust trials. As trials continue, the belief that plasma therapy saves lives has not been clearly proved, and doctors are again in doubt about whether it will save lives or not. Whatever the future may hold, it is clear that more trials and more evidence are needed. The Covid virus has again proved to be the smarter one.

Delhi: Plasma therapy’s life-saving abilities in question, doctors caution on its use (Times of India)

NEW DELHI: A day after TOI reported on an ICMR study showing that administering convalescent plasma to Covid-19 patients did not reduce death risk, top doctors at AIIMS, the Institute of Liver and Biliary Sciences (ILBS) and Lok Nayak Hospital stressed the need to rethink who should get the therapy.

In the trial by the Indian Council of Medical Research, which involved 464 hospitalised, moderately ill Covid-19 patients, researchers observed that some participants had higher antibody positivity than their plasma donors. “The difference in age and severity of illness, with donors being younger and having milder disease, could have driven this difference. While all Covid-19 survivors were encouraged to donate plasma, an overwhelming majority of the donors were only mildly sick, young survivors. Recovered patients who had moderate or severe disease were generally reluctant to return to hospitals for plasma donation,” the ICMR study noted. Earlier, the institutes did not check the level of neutralising antibodies in the donor, which led to poorer outcomes.

“The ICMR study re-affirms our assessment based on a trial conducted on 29 patients who received plasma therapy at ILBS. It showed no mortality benefit. However, there was significant benefit in terms of clearing of viral load in those who received the therapy in addition to standard care compared to those who received only standard care,” said ILBS director Dr Sarin. He added that only patients with mild-to-moderate illness should be given convalescent plasma. “The therapy has to be given within 24 to 48 hours of diagnosis. Also, detailed assessment of the presence of sufficient levels of neutralising antibodies in the donor should be mandatory,” Dr Sarin said.

At least 100 Covid-19 patients at the state-run Lok Nayak Hospital have been given plasma therapy till date. Dr Suresh Kumar, its medical director, said larger studies might be needed to assess its benefits. “Remdesivir did not show significant benefit in Covid-19 treatment in some studies. Still, the drug is being used in select patients because it has certain benefits and there is no other known cure. Similarly, plasma therapy may not help reduce death risk, but our experience shows it does help in faster recovery in a small subset of patients,” he said.

ILBS and Lok Nayak Hospital are conducting a study involving 400 Covid patients to assess the benefits of plasma therapy. Rajiv Gandhi Super Specialty Hospital is also taking part in the study. The ICMR study was conducted at 39 tertiary care hospitals (29 teaching and 10 private) across the country. According to the study, released on MedRxiv, a preprint service for medicine and health sciences, mortality was documented in 13.6% of patients who received plasma therapy in addition to standard care and in 31 (14.6%) patients who received only standard care. The trial results also indicated that there was no difference in progression to severe disease among moderately ill patients treated with convalescent plasma along with the best standard of care.
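The reported mortality figures (13.6% versus 14.6%) can be checked for statistical significance with a standard two-proportion z-test. A minimal sketch: the equal arm sizes of 232 and the death counts of 32 and 34 below are back-calculated assumptions from the 464-patient total and the reported percentages, so the numbers are illustrative only.

```python
from statistics import NormalDist

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

# Illustrative counts back-calculated from the reported percentages.
z, p = two_proportion_z(32, 232, 34, 232)
print(f"z = {z:.2f}, p = {p:.2f}")  # p well above 0.05
```

Under these assumed counts the test finds no statistically significant mortality difference, consistent with the study's stated conclusion.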

The Visionary princess who built AIIMS :Rajkumari Amrit Kaur


The pages of history celebrate Amrit Kaur’s determination to drive out the British, her feminist zeal, and the many contributions she made to the health infrastructure of the country.

Most people know about the OPD block of AIIMS (India's premier institute) named after Rajkumari Amrit Kaur, but how the vision of an institute of excellence was converted into reality is largely unknown. The Indian Express carries a beautiful report on how the visionary princess and health minister of India turned a dream into reality.

On February 18, 1956, the then minister of health, Rajkumari Amrit Kaur, introduced a new bill in the Lok Sabha. She had no speech prepared. But she spoke from her heart. “It has been one of my cherished dreams that for post graduate study and for the maintenance of high standards of medical education in our country, we should have an institute of this nature which would enable our young men and women to have their post graduate education in their own country,” she said.

The creation of a major central institute for post-graduate medical education and research had been recommended by the Health Survey of the Government of India a decade earlier, in 1946. Though the idea was highly appreciated, money was a concern. It took another 10 years for Kaur to collect adequate funds and lay the foundation of India's number one medical institute and hospital.

Kaur’s speech in the Lok Sabha sparked a vigorous debate in the house over the nature of the institute. But the bill moved fast, gaining the approval of members of both the houses, and by May that year, the motion was adopted.

The All India Institute of Medical Sciences (AIIMS) was born. “I want this to be something wonderful, of which India can be proud, and I want India to be proud of it,” said Kaur, as the bill was passed in the Rajya Sabha.

In the past few months, as India has been battling a global pandemic, the role of the country’s apex medical body has come under discussion on several occasions. Significantly, it is the first prime minister of the country, Jawaharlal Nehru, who is credited for the heights reached by AIIMS. It is true that AIIMS came to be under the Nehru government. However, the real driving force behind it was Kaur.

A princess of the Kapurthala princely state, a student at Oxford University, a devout follower of Mahatma Gandhi, and an important member of the Constituent Assembly, Kaur was all of this and much more. Members of her family like to remember her as someone who believed in simple living and high thinking. The pages of history, on the other hand, celebrate her determination to drive out the British, her feminist zeal, and also the many contributions she made to the health infrastructure of the country.

The Kapurthala princess

As a member of the Kapurthala princely family, Kaur had an interesting history. Her father, Raja Sir Harnam Singh, had converted to Protestant Christianity after a chance meeting with a Bengali missionary named Golakhnath Chatterjee in Jalandhar. Singh went on to marry Chatterjee's daughter, Priscilla, and had ten children with her. Kaur, the youngest among them, was born on February 2, 1889.

Kaur, therefore, was brought up as a Protestant Christian. After spending her early years in India, she was sent off to England for her education. “Princess Amrit Kaur was as much a product of Edwardian England as she was of India,” suggested her obituary in the New York Times in 1964. She completed her schooling at the Sherborne School for Girls in Dorset, then went to study at Oxford University. She returned to India in 1908, at the age of 20, and embarked on a life of nationalism and social reform.

The Gandhian and social reformer

Upon her return from England, Kaur was immediately drawn towards the ideas of nationalism as she interacted with leaders like Gopal Krishna Gokhale and Mahatma Gandhi. She was mesmerised by the teachings of Gandhi and shared an enduring, special friendship with him, as is evident from the collection of letters exchanged between the two, compiled in the book ‘Letters to Rajkumari Amrit Kaur’.

“What drew me to Bapu was his desire to have women in his non-violent army and his faith in womankind. This was an irresistible appeal to a woman in a land where women were fit for producing children and serving their lords as masters,” she is quoted as having said by American philosopher Richard Gregg in his introductory note in ‘Letters to Amrit Kaur.’

Though she wanted to join the nationalist movement soon after she returned, her family was against her involvement in the struggle, and so she kept away till her father passed away in 1930. During this period, though, she was actively involved in social reforms, particularly those related to women. Consequently, she waged a battle against the purdah system, the devadasi system, and child marriage. In 1927, she helped found the All India Women’s Conference and later served as its president.

By 1930, as she joined the Gandhian movement, she was imprisoned for her participation in the Dandi march. She gave up all her princely comforts to join Gandhi at his ashram in Sabarmati. “I remember Rajkumari sitting at the spinning wheel and eating along with other ashramites, the simple fare prescribed by Gandhiji,” wrote political activist Aruna Asaf Ali about her fondest memory of Kaur. “Rajkumari Amrit Kaur belonged to a generation of pioneers. They belonged to well-to-do homes but gave up on their affluent and sheltered lives and flocked to Gandhiji’s banner when he called women to join the national liberation struggle,” she added.

In her battle for a free India, she became one of the few women members of the Constituent Assembly. She, along with Hansa Jivraj Mehta, was among the only female members ardently in support of a uniform civil code in the constitution.

The passionate health minister who created AIIMS

Nihar Mahindar Singh, the 58-year-old grand niece of Kaur, recalls that as a child she would visit Kaur’s house in New Delhi frequently, as she was getting treated at AIIMS. “I never received any preferential treatment for being her family member. I remember spending hours at a stretch on the corridors of AIIMS. I didn’t even know back then that aunt B (as Kaur was referred to in her family), had created the hospital,” she says, adding that it was much later, and by word of mouth from her family members that she learned of her grand aunt’s contribution in building AIIMS.

As an institute of healthcare and medical research, AIIMS had to have some unique features. To begin with, it was the first of its kind in Asia to prohibit doctors from private practice of any kind. Secondly, the doctors at AIIMS were to devote their time not only to treating patients and teaching, but also to carrying out research. “All the staff and students were to be housed in the campus of the Institute in the best traditions of the Guru-Sishya ideal to stay in close touch with each other,” writes V. Srinivas, the deputy director of administration at AIIMS, in his article, ‘The making of AIIMS: The parliamentary debate’.

As health minister, Kaur was the pivotal force in ensuring the unique status enjoyed by AIIMS. Yet it is worth noting that she was in fact not Nehru's first choice for the cabinet. “In August 1947, for the woman member of the cabinet, Nehru thought of Hansa Mehta, but took Rajkumari Amrit Kaur at Gandhi’s insistence,” writes author Sankar Ghose in his book, ‘Jawaharlal Nehru – A Biography’. Explaining why Kaur was not preferred, he writes, “she was sometimes indiscreet and intemperate in her criticism of Congressmen.”

source –The Indian Express
