Personal space

25 November 2020



Still blushing from the Covid-19 spotlight, which swung their way when the rest of the industry realised they could continue their work as if nothing had changed, OEMs’ computational modelling and simulation experts are in a position to enable true personalisation and shape the industry for years to come. They just need to make sure the regulators are on board. Tim Gunn assesses the state of the art with Jeffrey Bischoff, director of biomechanics research for Zimmer Biomet, and Walter Schmidt, senior manager of the modelling and simulation team at Stryker Orthopaedics.


If your ability to keep designing, refining and approving new technologies has been hampered by the fact that you can’t access your test lab, spare a thought for those who blast their creations into space. After that, consider how they do it.

Behind Walter Schmidt in his home office is a computer model of a communications satellite. It’s an apt diorama. He hasn’t had to move far, but his knowledge of computational modelling – using computer simulations to study a structure’s resistance to fatigue, fracture and other stressors – has taken him from civil engineering and aeronautics, via Las Vegas hotel facade design, all the way to Stryker Orthopaedics, where he’s now the senior manager of the modelling and simulation team. The common denominator, as he calls it, is one mathematical Lego set: finite element analysis (FEA).
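
That Lego set is easier to picture with the smallest possible brick in hand. The sketch below – purely illustrative, and no reflection of the tools Schmidt’s team actually uses – builds a finite element model of the simplest structure going: a steel bar fixed at one end and pulled at the other, split into ten elements whose small stiffness matrices are assembled into one system of equations and solved for displacement and stress.

```python
# A 1D finite element model of a bar fixed at one end and pulled at the other.
# Each element contributes a small stiffness matrix; assembling them and
# solving K u = f is the core of every FEA package, however sophisticated.
import numpy as np

E = 200e9        # Young's modulus of steel, Pa
A = 1e-4         # cross-sectional area, m^2
L = 1.0          # total bar length, m
n_elem = 10      # number of finite elements (the "Lego bricks")
le = L / n_elem  # length of each element

# Global stiffness matrix: one row/column per node.
K = np.zeros((n_elem + 1, n_elem + 1))
ke = (E * A / le) * np.array([[1, -1], [-1, 1]])  # element stiffness matrix
for e in range(n_elem):
    K[e:e + 2, e:e + 2] += ke  # assemble each element into the global matrix

f = np.zeros(n_elem + 1)
f[-1] = 1000.0  # 1 kN axial pull on the free end

# Fix the first node (the boundary condition) and solve for displacements.
u = np.zeros(n_elem + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

# Recover element stress from nodal displacements: sigma = E * strain.
stress = E * np.diff(u) / le
print(f"tip displacement: {u[-1]:.3e} m, stress: {stress[0]/1e6:.1f} MPa")
```

Real orthopaedic models swap the bar for three-dimensional bone and implant geometries, and the handful of equations for millions, but the assemble-and-solve core is the same.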

$703bn

Estimated amount the Nasa Structure Analysis Programme saved the US space agency between 1971 and 1984.

Nasa

A model example

Although the history of FEA, the best-known form of computational modelling, goes back as far as the 1940s, Schmidt traces the development of his particular toy box to the 1960s and the Nasa Structure Analysis (Nastran) programme. Nastran was instrumental in the design of the space shuttle, as well as innumerable cars, bridges, railways and skyscrapers.

Software like Nastran first came to prominence in industries like aeronautics because of the difficulty of physically testing expensive, precisely engineered systems and components for their ability to stand up to extremes humans have never encountered on Earth. In short, FEA was well poised to shine in the context of the Covid-19 pandemic. While lab tests and clinical trials ground to a halt, Schmidt’s team was busier than ever. “We can run our simulations remotely and we’re still able to communicate as well as before,” he says proudly. “If anything, this has shined a light on the untapped promise of simulation, and it may be accelerating its influence in an unanticipated way.”

Of course, it had a fair amount of influence already. Even in its most limited application, FEA greatly reduces the number of make-and-break cycles required to reach a final product. That’s hardly surprising – ‘time to market’ never meant more than it did during the space race. In recent years, however, medical device development has begun to change from just another area in which computational modelling can be usefully applied into a discipline whose advancement is increasingly tied to its ability to make full use of the technology’s potential.

Jeff Bischoff, director of biomechanics research at Zimmer Biomet, puts it well. “With the technologies in the field right now, we’re getting to the point where we’ve learned about as much as we can from standardised methods that do not incorporate patient specificity,” he says. “Computational modelling should become the dominant way to do that because, if we believe in the technology, it is the best way to quickly, efficiently and accurately get results across the full range of patient conditions.”

As Schmidt explains, that’s not as big a jump as it might seem. Just as hip replacements are impacted by different physiologies, satellites carried by different launch vehicles need to account for discrepancies in design and operation. “Whether they’re launched from China, the US or even the EU,” he says, “they all have their own unique vibrational fingerprints.” Hip replacements, thankfully, are rarely subjected to forces seven times that of gravity.

Computational modelling and simulation was first introduced into the medical device industry to help developers make decisions in the earliest stages of prototyping. “It was a way to take hand calculations one step further,” says Bischoff, “to try and get a quick initial read on how different prototypes might work so that you can triage them.” Schmidt is still quick to emphasise the cost and time savings that can be achieved by “front-loading” computer models into design processes. But as the technology has developed over the past two decades, simulations have become capable of handling increasingly complex calculations and giving increasingly accurate results, which means they can be used to expedite processes much further along in the development cycle.
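
What does taking a hand calculation “one step further” look like? Something like the sketch below, which triages hypothetical stem prototypes with a textbook bending-stress formula before any of them earns a full simulation. Every number in it is invented for illustration – no Zimmer Biomet or Stryker design data is involved.

```python
# Triage sketch: rank hypothetical implant-stem prototypes by a closed-form
# bending-stress estimate (sigma = M*c/I for a solid circular section) so
# only the promising ones advance to full FEA. All numbers are illustrative.
import math

def bending_stress(moment_nm: float, diameter_m: float) -> float:
    """Peak bending stress in a solid circular cross-section, in Pa."""
    c = diameter_m / 2                # distance to the outer fibre
    i = math.pi * diameter_m**4 / 64  # second moment of area
    return moment_nm * c / i

prototypes = {"slim": 0.009, "standard": 0.011, "stout": 0.013}  # diameters, m
moment = 40.0      # assumed worst-case bending moment, N*m
allowable = 550e6  # assumed fatigue allowable for the alloy, Pa

for name, d in sorted(prototypes.items(),
                      key=lambda p: bending_stress(moment, p[1])):
    sigma = bending_stress(moment, d)
    verdict = "advance to FEA" if sigma < allowable else "rework"
    print(f"{name}: {sigma/1e6:6.1f} MPa -> {verdict}")
```

The point is speed: a screen like this runs in milliseconds and tells a designer which ideas deserve the expensive analysis.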

For instance, modelling and simulation is now commonly used to identify the worst-case sizes within a family of implants, meaning only the configurations most susceptible to issues need to be bench-tested – greatly reducing the burden of ensuring a product is safe and getting it approved. Beyond that, leading companies are now using evidence from validated computational models in place of benchtop tests as the final verification of devices for select failure modes in regulatory submissions.
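
In workflow terms, worst-case size determination is a loop over a product family. The sketch below fakes the expensive part – each call to the hypothetical run_model() would in practice be a full, validated simulation – but the selection logic is the recognisable core.

```python
# Worst-case size determination, as a workflow sketch. run_model() stands in
# for a validated FEA solve; here it is faked with an invented formula so the
# script runs. In practice each call would be a complete simulation.

def run_model(size_mm: float) -> float:
    """Hypothetical stand-in: peak von Mises stress (MPa) for a given size."""
    return 900.0 / size_mm + 4.0 * size_mm  # invented stress-vs-size curve

family = [10, 12, 14, 16, 18, 20]  # implant sizes in a product family, mm
peak = {size: run_model(size) for size in family}
worst = max(peak, key=peak.get)    # highest predicted stress = worst case

print("size  peak stress (MPa)")
for size in family:
    flag = "  <- bench-test this one" if size == worst else ""
    print(f"{size:>4}  {peak[size]:8.1f}{flag}")
```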

Digital twins

Not that the process is entirely virtual. Even in the best cases, benchtop tests still need to be conducted to provide objective evidence that a model’s predictions are accurate. Nevertheless, validating a reusable model through physical testing is a far less labour-intensive task than subjecting device after device to the same procedures. “There’s still physical testing in the background, in the story,” Bischoff says, “but the final evaluation of the device can absolutely be made strictly on the use of a well-validated computational model.”

Schmidt ventures further into the background to point out that the benchtop tests themselves can be twinned and prototyped virtually. Particularly when researchers are designing tests to mimic the pressures and forces to which novel devices might be subjected, simulation analysts can help get them on the right track early. “You can find weaknesses in the test in terms of its constraints, alignments and sensitivities,” he explains. “[Otherwise], perhaps when you put this test together and run it for the first time, you’ll just be breaking the equipment.”

If anything, it’s short-sighted to confine computational models to representing devices alone. By simulating manufacturing processes and operations alongside their physical counterparts, developers can evaluate, optimise and predict production in real time, drawing on live data from the physical manufacturing line, the state of its digital twin and historical production records. Schmidt uses an example from additive manufacturing: whereas CAD models don’t account for the thermal stresses that can shift and warp an underlying layer before the next is deposited atop it, linked simulations can combine historical and real-time data feeds to adjust production as required.
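
As a schematic – and only that, since real machine-control loops are vendor-specific and considerably more sophisticated – the additive-manufacturing example might look like this: before each layer, blend the thermal model’s prediction with historical build data, compare the result with a live sensor reading and trim the laser power accordingly. Every function and figure below is hypothetical.

```python
# Digital-twin feedback sketch for additive manufacturing: blend the thermal
# model's prediction with historical run data, compare with the live sensor
# reading, and trim laser power to limit distortion on the next layer.
# Everything here is hypothetical; real systems are vendor-specific.

HISTORY_WEIGHT = 0.3  # how much past builds inform the expected temperature

def expected_temp(model_pred_c: float, historical_mean_c: float) -> float:
    """Blend the simulation's prediction with historical production data."""
    return (1 - HISTORY_WEIGHT) * model_pred_c + HISTORY_WEIGHT * historical_mean_c

def adjust_power(power_w: float, measured_c: float, expected_c: float) -> float:
    """Simple proportional correction: hotter than expected -> less power."""
    gain = 0.5  # watts per degree of error, tuned offline
    return power_w - gain * (measured_c - expected_c)

power = 200.0  # starting laser power, W
# (model prediction, historical mean, live sensor reading) per layer, deg C
layers = [(710, 705, 722), (715, 712, 719), (718, 716, 714)]

for n, (pred, hist, measured) in enumerate(layers, start=1):
    target = expected_temp(pred, hist)
    power = adjust_power(power, measured, target)
    print(f"layer {n}: expected {target:.1f} C, measured {measured} C, "
          f"next-layer power {power:.1f} W")
```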

Realising the full potential

As important as all of this might sound for optimal design and manufacturing, it’s a long way from being uniformly implemented across the industry. That’s not just an issue for the laggards. The risk that developers inexperienced in computational modelling will misuse it can make regulators wary of even the experts.

Necessity forced aeronautics and civil engineering to regulate and adapt to the use of computational modelling early. The software was so unwieldy and the hardware so slow that it took experts with what Schmidt calls “tribal knowledge” to run computational models and simulations – but you either tested a nuclear reactor’s ability to withstand earthquakes virtually or you didn’t test it at all. In the medical industry, however, the suitability of lab tests, animal tests and clinical trials for evaluating most devices has only left small gaps for the piecemeal implementation of modelling and simulation technologies, with very little standardisation between companies. Now the software is simple enough to be used without much training, but that doesn’t mean everyone’s using it well.

“Probably starting about ten years ago, there was a general recognition in the field that we don’t really have a consensus view on the appropriate level of validation for using models in this context,” explains Bischoff. “If, as a device manufacturer, you did what you thought was good modelling work, there was a lot of uncertainty as to whether the regulators would view it positively or negatively. And from the regulatory side, they were seeing tremendous variability in the level of rigour and the level of credibility of modelling studies. There was no common way to establish a reasonable baseline expectation for model validation and remove that uncertainty.”

As vice-chair of the committee behind ASME’s 2018 medical device modelling verification and validation standard (V&V 40), Bischoff helped give regulators and industry a platform for using simulations to their full potential. Though it doesn’t specify a particular “recipe” for how to implement a simulation, V&V 40 details how developers and manufacturers can ensure their models are credible and accurate. “In the historical approach to testing standards, there has been a very strong emphasis on repeatability,” he explains. “But you can be reproducibly wrong. With ASME, we want to make sure the core methodology itself is in fact the right methodology based on the clinical scenario that we’re trying to represent.”
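
Bischoff’s “reproducibly wrong” warning is easy to put in numbers. The toy comparison below – an illustration of the idea, not of V&V 40 itself, which is a risk-based framework rather than a formula – pits two hypothetical models against four repeats of the same bench test. One repeats itself almost perfectly and misses the truth by 30%; the other scatters more but sits where the bench says it should.

```python
# "You can be reproducibly wrong": a numerical illustration. Both models
# predict the same bench test, repeated four times; the bench truth is
# about 100 MPa. All figures are invented.
import statistics

bench_runs = [99.2, 100.5, 100.1, 99.9]    # repeated bench measurements, MPa
model_a    = [130.0, 130.0, 130.1, 129.9]  # perfectly repeatable, and wrong
model_b    = [97.0, 103.5, 101.0, 98.8]    # noisier, but centred on truth

truth = statistics.mean(bench_runs)
for name, preds in [("A", model_a), ("B", model_b)]:
    repeatability = statistics.stdev(preds)  # run-to-run scatter
    bias = statistics.mean(preds) - truth    # systematic error vs the bench
    print(f"model {name}: scatter {repeatability:4.2f} MPa, bias {bias:+5.1f} MPa")
```

Repeatability only describes the scatter; validation against the bench is what exposes the bias.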

With that philosophy established, the reproducible recipes come courtesy of ASTM International’s medical device committee, which has approved three standards for developing a computational model to replicate or replace a bench test, and is now working with ASME to match them to V&V 40. Schmidt – like Bischoff, a key member of ASTM’s Orthopaedics and FEA subcommittee – uses worst-case size determination as an example. “We’re using these standards to help us establish the worst-case size using a method that is consistent between all companies in the joint replacement industry,” he explains. “So, if you’re a regulatory body seeing a submission, you now see that everyone in the industry is doing the analysis in the same way. This expedites the approval process and instils confidence in the regulatory bodies that these models do represent a new body of evidence that can be trusted.”

As Schmidt sees it, by using standards to ensure regulators are comfortable with simpler models, bodies like ASME and ASTM, which organised a joint workshop late in 2019, are laying the foundations for the more impactful device simulations to come. “Models will get more sophisticated, and the trust in them has to be built upon the positive experiences provided by the simpler models,” he says.

Already, the most innovative companies have begun to focus on using computational modelling to better understand and guide their clinical trials, often by identifying critical patient populations and increasing the statistical power of smaller sample sizes, lessening the need to recruit large numbers of people from rare groups. As you read this, the data gathered by the sensors that manufacturers have placed in some of their recent implants is supporting the development of ‘simulytics’, which could help us better understand and encourage the characteristics that coincide with optimal clinical outcomes.
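
The statistical-power claim has a back-of-envelope form, too. If a validated model can explain part of the variation in a trial outcome – in effect, acting as a covariate adjustment – the standard sample-size formula for comparing two arms shrinks accordingly. The sketch below assumes, purely for illustration, that a model explains half the outcome variance.

```python
# Back-of-envelope sketch: a model that explains part of the outcome variance
# (like a covariate adjustment) shrinks the required trial size. The standard
# normal-approximation formula is n = 2 * sigma^2 * (z_a + z_b)^2 / delta^2.
import math
from statistics import NormalDist

def n_per_arm(sigma: float, delta: float,
              alpha: float = 0.05, power: float = 0.80) -> int:
    """Sample size per arm for comparing two means, normal approximation."""
    z = NormalDist().inv_cdf
    return math.ceil(2 * (sigma * (z(1 - alpha / 2) + z(power)) / delta) ** 2)

delta = 5.0       # clinically meaningful difference in an outcome score
sigma_raw = 12.0  # outcome standard deviation, unadjusted (assumed)
r_squared = 0.5   # share of variance a validated model explains (assumed)
sigma_adj = sigma_raw * math.sqrt(1 - r_squared)  # residual SD after adjustment

print(f"without model: {n_per_arm(sigma_raw, delta)} patients per arm")
print(f"with model:    {n_per_arm(sigma_adj, delta)} patients per arm")
```

Halving the variance roughly halves the recruitment burden – which matters most exactly where it hurts most, in rare patient groups.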

This is where computational modelling comes into its own. Bischoff notes how much of a stretch it is to suggest an animal model or a standardised bench test can really be used to develop personalised devices to support outcomes for a specific patient. “Certainly, we could do that in a clinical study, but only as long as that patient was in that clinical study,” he laughs. “Then we lose the predictive capability we want.”

By contrast, a flexible simulation tool verified for representative conditions with benchtop tests, and validated against patient-specific outcomes through clinical trials, would enable developers to confidently explore the full variability of clinical conditions and tailor their products to patient-specific anatomical models. “Everyone wants to know that a device is going to work for them,” says Bischoff. “That’s something that I think only modelling and simulation can provide.” We, it would seem, are the final frontier.

Computational modelling enables developers to explore the full variability of clinical conditions and tailor their products to specific patients.

