Plasma and Blood Innovation in Wartime
During World War II, advances in the use of plasma and blood played a crucial role in the Allies’ success. On battlefields where life and death were decided in minutes, plasma transfusion offered an invaluable strategic advantage. Collaboration between American and Canadian scientists, including Dr. Charles Drew and Dr. Charles Best, led to innovative methods for collecting, processing, and storing blood, transforming plasma into an essential wartime resource.
The use of dried plasma was a game-changer. Under harsh battlefield conditions, the ability to transport plasma without refrigeration, and with a far longer shelf life than whole blood, was a remarkable advance. From August 1940 onward, the United States shipped plasma to the United Kingdom, not only for military use but also to assist civilians in war-affected areas. This stability allowed doctors and medics to carry plasma directly to the front lines without it losing effectiveness over long distances.
Plasma, the liquid portion of blood that accounts for about 55% of its volume, contains nutrients, hormones, and proteins, including vital clotting factors. For soldiers in shock from blood loss, transfused plasma restored circulating volume, stabilizing blood pressure and helping to prevent death from hemorrhage. Dried plasma was produced by dehydrating it into a powder that could be reconstituted with distilled water in about three minutes, making it a fundamental tool for wartime medics.
Despite its vital importance, wartime plasma use was not without risks. Pooling plasma from many donors, a common practice during the conflict, was intended to dilute and neutralize harmful antibodies. However, this approach, rudimentary by modern standards, also meant that a single infected donation could contaminate an entire batch, and it led to outbreaks of hepatitis and other health complications. Plasma compatibility was also complex: transfusing incompatible plasma could destroy the recipient’s red blood cells, potentially causing shock and death.
World War I laid the groundwork for national blood services, which were first tested under wartime conditions during the Spanish Civil War. That experience demonstrated the value of anticoagulant-treated, refrigerated blood, which remained usable for up to 18 days. At the start of World War II, both the United States and the United Kingdom rapidly expanded their blood programs. “Blood for Britain,” an American program launched in 1940, shipped some 5,000 liters of bottled plasma to the United Kingdom. Plasma was far easier to transport over long distances than whole blood, and distribution methods improved steadily throughout the war.
The development of new methods for separating and using plasma components, such as human serum albumin, further revolutionized the treatment of soldiers in shock. Albumin, a concentrated and effective volume expander, was as stable as dried plasma and could be shipped in small 100 ml bottles, allowing quick and effective administration on the battlefield. This technique saved thousands of soldiers from imminent death.
Not all nations benefited equally from these innovations, however. Germany, Japan, and the Soviet Union lagged in blood technologies during the war. The German program, for example, suffered from the ideological restrictions of the era, which required blood donors to be of “pure Aryan descent,” hindering the service’s expansion and effectiveness. Donors of non-European descent were often rejected or had their blood segregated, restricting access to a vital resource.
The challenges were broad and ongoing. Even with significant advances, major gaps remained in blood transfusion systems. Britain, despite having one of the best blood services of the 1930s, still did not universally test its soldiers for blood type. In the United States, only about 85% of soldiers had their blood type correctly recorded, meaning roughly one in seven records was wrong, a dangerous gap amid the chaos and urgency of battlefield medicine.
Curiously, there were reports that both British and Japanese medics used coconut water as a plasma substitute. According to post-war studies, this unproven practice had some value for short-term intravenous hydration, but coconut water’s chemical composition meant it could not truly replace human plasma.
The innovations in plasma and blood technology during World War II highlight a frequently underestimated facet of the conflict. Amid countless stories of epic battles and military strategy, these medical advances saved thousands of lives. Although the methods of the time were rudimentary and sometimes risky, they paved the way for the blood transfusion services we know today: a quiet yet monumental achievement in the long history of war and medicine.