Keeping microbes at bay
28 December 2023
The vectors for microbial transmission are numerous in hospitals. Items such as catheters, intravenous bags and devices, dialysis tubing, disposable syringes, gloves, implanted devices and hospital beds all carry the risk of nosocomial infection. Despite enhanced cleaning regimes, several microbial species can survive in a hospital setting. To find ways to reduce this risk, researchers are constantly experimenting with coatings that have been physically or chemically designed to prevent the spread of pathogens or kill them on contact. But how did we reach the stage of knowledge we’re at now, with numerous coatings available that leverage different mechanisms of action? Peter Littlejohns explores how infection control has evolved throughout history, and why antimicrobial coatings haven’t become a core part of the field yet.
When we look around at the advanced state of medicine, marked by examples like robotic surgeries, AI-driven diagnostics and immunotherapies for cancer, it’s easy to forget there were a number of developments that led to what we see today. That’s just as true for infection control as it is for, say, cell and gene therapies. Just as the latter required the Human Genome Project as a foundation of understanding from which new treatments for diseases could be developed, the former had a long history of discoveries that led to the current practices used to keep pathogenic microbes at bay. Although there’s some evidence of infectious disease awareness in ancient and medieval history – isolating lepers in colonies and quarantining those with bubonic plague – Hungarian physician Ignaz Philipp Semmelweis was the first to highlight the role that physical touch could play in transmitting disease.
His analysis, carried out in Vienna between 1840–47 at the height of the puerperal fever epidemic, is now seen as a milestone in the history of infection control. In it, he drew the conclusion that the high mortality rate of 98.4 deaths per 1000 births in a maternity clinic run by medical students was caused by them conducting post-mortem examinations of women who had died of puerperal fever and then performing vaginal examinations on live patients as part of their training. By contrast, the midwives in a second maternity clinic engaged in neither activity, and had a much lower mortality rate of 36.2 deaths per 1000 births. Semmelweis theorised that the culprit was the transmission of what he called ‘decomposing animal organic matter’, and in the last year of his analysis he instituted disinfectant handwashing within both clinics. The result was a dramatic drop in mortality: 12.7 deaths per 1000 births for the students and 13.3 for the midwives. Semmelweis, of course, had no knowledge of microbes and the mechanisms through which they spread and cause disease, but his analysis laid the groundwork for innovations that help to prevent the spread of pathogens – including antimicrobial coatings.
The development of these coatings is intertwined with the history of hospital-acquired infections (HAIs) – the name given to infectious diseases that occur due to pathogen transmission while patients are in hospital. While physicians and healthcare providers noticed patterns of infections in healthcare environments before and after the observations made by Semmelweis, it wasn’t until the late 19th century, with the advent of germ theory, that the microbial causes of infectious diseases, including HAIs, began to be systematically understood. The two main contributors to germ theory were Louis Pasteur (of pasteurised milk fame) and Robert Koch. Pasteur proposed that specific microorganisms such as bacteria were responsible for causing diseases, while Koch’s postulates set out a framework for establishing a causal relationship between specific microorganisms and particular diseases, such as Mycobacterium tuberculosis and tuberculosis, Vibrio cholerae and cholera, and Bacillus anthracis and anthrax.
The influence of germ theory was wide-reaching. Although his ideas were still in their infancy, Pasteur influenced the work of British surgeon Joseph Lister, who in the 1860s introduced antiseptic agents to sterilise surgical instruments. The acceptance of antiseptic techniques and hand hygiene practices in surgical and clinical settings grew slowly, prodded along by the growing body of evidence supporting germ theory, and by 1910 the mortality rate of major operations had fallen from about 40% to less than 3%. HAIs were seemingly being brought under control, and the 1928 discovery of penicillin by Alexander Fleming gave clinicians a weapon against disease caused by bacteria. With penicillin being so effective at treating HAIs – the majority of which were and still are caused by bacteria – it enjoyed liberal use within hospitals. But with this widespread use came a problem that still plagues clinics to this day: antibiotic-resistant bacteria.
The emergence of strains of bacteria resistant to penicillin, streptomycin and several other antibiotics of the time led to a resurgence in the importance of hand hygiene, sterilisation and disinfection, marked by the earliest formal infection control programmes. By the 1960s, hospital-based infection control efforts had been established in scattered hospitals throughout the US. The number of hospitals with HAI control programmes increased substantially during the 1970s, and such programmes were established in virtually every US hospital by the early 1990s. Pharmaceutical companies attempted to keep pace with the antibiotic resistance of bacteria until the early 1980s. At this point the search for new antibiotics was mostly abandoned due to the high cost of development and approval, the speed at which resistance developed, and a growing awareness that the drugs had to be used sparingly – all of which meant a poor return on investment.
Antimicrobial stewardship has developed to provide scientific rigour to the decision of when and when not to use certain antibiotics, through the use of antimicrobial susceptibility testing (AST). But HAIs have remained a major problem in all corners of the world. The WHO reports that out of every 100 patients in acute-care hospitals today, seven in high-income countries and 15 in low- and middle-income countries will acquire at least one HAI; of these patients, one in every ten will die from their infection. There’s a recognition among the healthcare community that infection control procedures need to be maintained to avoid HAIs, but even with superbugs like methicillin-resistant Staphylococcus aureus (MRSA) or antibiotic-resistant Clostridium difficile (C. diff) remaining common causes of infection, hand hygiene, sterilisation and disinfection procedures can vary from hospital to hospital. The Covid-19 pandemic added another reason for hospitals to stay on top of infection control procedures, but it also highlighted the role that high-touch surfaces like countertops and doorknobs play in pathogen transmission. A consequence of this was a reignition of the discussion around antimicrobial coatings as a strategy to keep microbes at bay in hospitals. Unlike human-led sterility and disinfection procedures, intrinsic antimicrobial surfaces offer a passive system that requires no human intervention, and their action is continuous.
The ability to create antimicrobial coatings is nothing new. Early examples used developments in nanotechnology to embed silver ions produced at the nanoscale into a matrix that could be applied to surfaces to give them antimicrobial properties. Silver was an obvious choice and still features in many current antimicrobial coatings due to its ability to interact with four main components of bacterial cells: the cell wall, plasma membrane, bacterial DNA and proteins. There, the ions cause degradation of the cell wall and cell lysis, preventing the bacteria from reproducing. The ions also penetrate the cell interior and bind to DNA bases, which prevents replication.
Although this level of knowledge was acquired long afterwards, the use of silver dates back to 1850 BCE Egypt, where it was directly applied to wounds to improve healing. Of course, the potential for toxicity was unknown, and although silver is still employed in wound care today, it’s at much lower concentrations and accompanied by other ingredients, like antibiotics.
With an established method of killing bacteria, it’s worth asking why antimicrobial coatings that use silver, copper (another bactericidal element) or another of the various elements and compounds in these products aren’t employed in hospitals as standard practice. Part of the reason is that, in order to make claims related to the product, manufacturers must provide safety and efficacy data, and as of right now there’s no standardised test to produce it.
A 2023 paper titled ‘Antimicrobial coatings: Reviewing options for healthcare applications’ noted that the best available protocol was produced by the US Environmental Protection Agency (EPA). The EPA suggests that an effective product should reduce the number of microorganisms on a surface by a factor of 1,000 (so 1,000,000 microorganisms become 1,000) within an hour of application. But the authors went on to say that of the ten coatings they evaluated, only one (cupric oxide) would pass using that criterion. An additional point here is that in order to be viable for reducing HAIs, coatings must demonstrate efficacy against a broad spectrum of pathogens, especially now that the Covid-19 pandemic has added an impetus for defence against viruses. As of now, most manufacturers tend to only provide efficacy data in the form of laboratory testing against gram-positive and gram-negative bacteria.
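The factor-of-1,000 reduction the EPA suggests is what microbiologists call a 3-log reduction. As a quick illustration of the arithmetic (a minimal sketch, not part of any official test protocol; the function name and counts are hypothetical), the reduction is the base-10 logarithm of the ratio of viable organisms before and after exposure:

```python
import math

def log_reduction(cfu_before: float, cfu_after: float) -> float:
    """Base-10 log reduction in viable organisms (colony-forming units)."""
    return math.log10(cfu_before / cfu_after)

# The example cited above: 1,000,000 organisms reduced to 1,000
# within an hour is a factor-of-1,000, i.e. 3-log, reduction.
reduction = log_reduction(1_000_000, 1_000)
print(reduction)            # 3.0
print(reduction >= 3.0)     # True: would meet the suggested criterion
```

Expressed this way, a coating that left 100,000 of the original million organisms alive would achieve only a 1-log reduction, well short of the suggested threshold.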
Both of these points lead the authors of the paper to conclude: “Although many possible antimicrobial surface options can be proposed, there are much fewer studies that investigate the ‘fitness for purpose’ where there is an evaluation of durability and sustained activity for activity against the key nosocomial pathogens, including drug-resistant bacteria, endospores, fungi and viruses.”
To put it another way, antimicrobial coatings simply haven’t proven themselves to be effective enough to mandate their use in infection control yet – at least when it comes to high-touch surfaces. On the other hand, products like silver alloy coatings for catheters have proven themselves adept at keeping the bacteria responsible for UTIs under control. That’s a lot different to applying an antimicrobial coating to a large surface area, but it proves that biocidal activity is possible. With renewed interest in antimicrobial coatings spurred on by the increasing numbers of antibiotic-resistant bacteria and the experience of the pandemic, hopefully we’re not far from a breakthrough that will give hospitals another weapon against HAIs.