State of Nineteenth-Century Medicine
Doctors considered it appropriate that these treatments also depleted the strength of their patients, because fever, rapid pulse, and flushed appearance were considered signs of dangerous overstimulation. Unfortunately, most chronically ill soldiers were already weakened by exposure to the elements, inadequate diet, physical and mental stress, and, above all, dehydration from chronic diarrhea. The miasma theory did, however, have one beneficial effect. Physicians believed that low, swampy areas were a primary source of illness, and soldiers, as a rule, avoided these areas when selecting campgrounds. This reduced their exposure to mosquitoes, which, though not known at the time, actually did carry disease. Other insects, such as flies, also spread disease by contaminating food and causing chronic diarrhea. Soldiers and surgeons alike looked forward to the first frost in the fall, which would drastically reduce the incidence of "camp fever" and malaria.
Another significant advance in the decades before the war was the extraction of the active ingredient quinine sulfate from the bark of the Peruvian cinchona tree. The bark had been used for centuries to treat "intermittent fever," or malaria. The availability of quinine sulfate provided a safe, predictable, and reliable method of both preventing and treating intermittent fever. Since most of the Atlantic and Gulf coasts in America were areas where malaria was prevalent, the use of quinine sulfate saved many thousands of lives.
The last significant development to occur prior to the Civil War was vaccination for smallpox. Prior to the work of the English physician Edward Jenner with cowpox, the actual smallpox virus was used to inoculate patients, producing a mild form of the disease and conferring a natural immunity. With Jenner's research, first published in 1798, this could be done more safely with the cowpox virus. The success of vaccination and the isolation of smallpox cases prevented this ancient scourge from becoming a significant problem for Civil War soldiers. Most other contagious diseases, especially erysipelas (a streptococcal infection) and gangrene, were also controlled to a great extent with isolation techniques.
Problems with Civil War–era Treatment
In addition, soldiers faced other natural hazards. These included other insect- and parasite-related diseases, electrocution by lightning, snakebites, and drowning. Certainly the most common natural hazards were the extremes of temperature associated with the change of seasons. During the hottest months, when campaigning was most active, soldiers frequently were incapacitated and occasionally died from heatstroke. During the coldest months, cold-related injuries and even death by freezing were not uncommon. This can be attributed primarily to inadequate clothing and shelter. Although this was most common in the Confederate army, there were times in both armies when the supply system proved inadequate.
The only other well-documented vitamin deficiency was night blindness, which is caused by a deficiency of vitamin A. Nineteenth-century surgeons had no idea what caused night blindness, but one theory held that it was caused by sleeping outdoors with the eyes open and exposed to moonlight (hence the popular term "moon blindness"). At night regiments were sometimes forced to march with the soldiers placing their hands on the shoulder of the person ahead of them because their vision was so impaired.
During the warm months of the year, bacterial diseases consisted mainly of intestinal infections from contaminated food and water. Flies contaminated food, causing infectious diarrhea, or "camp fever," while mosquitoes spread malaria, or "intermittent fever." Both of these scourges virtually disappeared during the winter months, except for a persistence of low-grade infectious diarrhea caused by direct contamination of the water supply from improperly placed latrines and poor camp hygiene. Antebellum U.S. Army regulations took into account the association of filth and disease and called for the proper location and maintenance of latrines. Inexperienced officers were usually unaware of the need to keep their camps clean, and their men tended to ignore such instructions even when given. Over time, however, the connection between cleanliness in camp and a lower rate of illness became obvious, and soldiers adjusted accordingly. Aside from low-grade diarrhea, the winter months saw colds, coughs, sore throats, and pneumonia, the last often fatal for lack of effective treatment.
Still, by the latter part of the war, both Union and Confederate soldiers enjoyed generally good health. The weak had died or gone home, while the remainder developed natural immunities either by surviving disease or by being vaccinated. Ironically, Confederate soldiers benefited from their inability, because of the blockade, to procure harsher medications. By relying on readily available botanicals, they enjoyed similar benefits with far milder side effects.
Wounded or ill soldiers, especially those who simply needed a few days' rest before returning to duty, were seen in modest facilities near camp. More serious cases were transferred to a building or a collection of tents designated as a general hospital, where they would be provided care for up to ten to twelve days. Longer-term patients were transported by water or rail to cities where large, fixed general hospitals were established. Because most of coastal Virginia quickly fell under Union control, transportation of Confederate casualties was primarily by rail. Cities and towns along these rail lines became hospital centers. These included Richmond, Petersburg, Charlottesville, Gordonsville, and Liberty (now Bedford), as well as numerous smaller towns in the upper (or southern) Shenandoah Valley along the Virginia and East Tennessee Railroad. Union casualties were evacuated, primarily by water, to cities in the North. Important Union hospital centers in Virginia included Fort Monroe, Hampton and Portsmouth in the Hampton Roads area, and Alexandria.
The majority of wounded who made it to the larger field hospitals suffered from wounds to the extremities. Those who suffered from wounds to the head, chest, and abdomen occasionally survived, but usually not because of any surgical intervention. Wounds to the extremities fell into two categories: minor wounds that did not require amputation, and major wounds that did. Minor wounds could be successfully treated by thorough cleansing, removal of foreign material, and bandaging. However, wounds complicated by massive tissue destruction or involving a bone or joint usually were considered candidates for amputation, and for good reason. These wounds, which inevitably became infected, often led to sepsis, a systemic infection with a mortality rate of more than 90 percent. Even after amputation, infection was a concern, but the healthy tissue and the unimpeded drainage from the open end of a fresh stump tended to prevent sepsis and reduce the mortality rate to 20 to 25 percent. Surgeons preferred a simple circular amputation, in which the tissues were divided with circular incisions at a slightly higher level in each tissue layer. This left a cuff of tissue to provide coverage of the bone without closing the end of the stump, as would be done with a flap amputation. (This "circular" technique is still the procedure of choice on the modern battlefield.)
In the era before antiseptic surgery, sterility was not considered necessary and probably would have been impossible to achieve and maintain in the Civil War environment anyway. Nevertheless, some surgeons wrote that they were at least clean: they washed away visible dirt when possible and kept their instruments free of the blood and pus that would corrode the metal. Patients were most likely to survive if an amputation was performed within forty-eight hours of wounding; this was known as a primary amputation. At the general hospitals, the stumps occasionally needed to be revised, or re-amputated, at a higher level because of infection in the bone or soft tissues. Otherwise, they were treated with continuous water dressings (still a good method of wound care). Styptics and cautery were used to control occasional bleeding, and various chemicals considered to be antiseptics (because they prevented sepsis) also made the wounds smell better. Of course these compounds, such as iodine and carbolic acid, were actually killing bacteria and would be used in the postwar period during the early days of aseptic surgery.
Transporting the Wounded
First published: May 26, 2010 | Last modified: April 27, 2016