3D Technology Improves Surgical Implants and Procedures

Three-dimensional (3D) technologies are famous for improving product design, manufacturing and even the quality of movies, but their application in medical fields has been much slower. While the use of 3D for orthotic devices is visibly on the rise, 3D for improving orthopaedic implant design and surgical procedures has been hindered by the need for more accurate and usable software, and for scanning devices that can more easily tackle the 3D “challenge” inside the human body.

It Takes All Kinds

Designing consumer products is relatively straightforward. Think of a one-size-fits-all product, such as a smartphone – design it once and reproduce it endlessly. But add the complexities of the human body and the challenge becomes much greater. No two human shapes are exactly the same, and each patient subjects joints to different stresses and unique injuries. Even fully understanding specific joint movements remains complex.

As software and the technologies to scan the human body have improved, 3D has started to enter the world of orthopaedic implants, improving our knowledge of joint movement and the design of the products that replace joints.

Better Understanding of Joint Movement

Readers may think they already know how the bones in the wrist move and how to treat them when they are injured, but that premise was challenged several years ago by researchers at Brown Medical School/Rhode Island Hospital.

Under a five-year study funded by the National Institutes of Health, begun in 2004, researchers at Brown Medical School/Rhode Island Hospital used a combination of computed tomography (CT), Geomagic reverse-engineering software, CAD/CAM solid modeling tools and computer animations to study the human wrist in completely new ways. Early discoveries indicated that the wrist does not move in previously accepted ways, and that widely practiced treatments of wrist injuries could be dramatically improved.

Joseph Crisco, director of the Orthopaedic Bioengineering Laboratory at Brown Medical School/Rhode Island Hospital, and his colleagues were among the first researchers to gather data on wrist movement from live subjects as opposed to cadavers. The process was non-invasive and used data from CT scans of the human wrist through a full range of motion, yielding 3D images of a kind traditional x-rays cannot provide. (See Exhibit 1.)

Exhibit 1: 3D data from live subjects’ wrists in various positions. The data was imported into Geomagic Studio where NURBS surfaces of the bones were created.

The scan data was collected and exported to the Mayo Foundation’s Analyze software to extract the contours of the small bones present in the scans. Crisco and his team then developed programs to generate point-cloud data from the extracted bone contours. The point-cloud data was imported into Geomagic Studio reverse-engineering software, where it was turned into 3D polygonal models and non-uniform rational B-spline (NURBS) surfaces.

Crisco’s group also combined scanned point-cloud data of varying resolutions to create final 3D models of the wrist bones. The surfaces of the higher-resolution scans were automatically aligned within Geomagic Studio with corresponding point clouds from other positions and saved as new files. With these files, the team created accurate and easily viewed digital models of complete wrists in multiple positions.
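
Geomagic Studio’s alignment method is proprietary, but the core operation – rigidly registering one point set onto another – can be sketched with the SVD-based Kabsch algorithm. The sketch below assumes point correspondences are already known; production scan alignment (e.g. ICP) must also estimate those correspondences iteratively.

```python
import numpy as np

def kabsch_align(source, target):
    """Find the rigid rotation R and translation t that best map
    `source` points onto `target` points (least-squares, via SVD)."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Demo: recover a known pose change between two "scans" of the same bone.
rng = np.random.default_rng(0)
bone = rng.normal(size=(100, 3))                # stand-in for bone surface points
angle = np.radians(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -2.0, 1.0])
moved = bone @ R_true.T + t_true                # same bone in another wrist position
R_est, t_est = kabsch_align(bone, moved)
```

Recovering the rotation and translation between positions is exactly what makes the per-bone motion patterns discussed below quantifiable.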

“Without accurate digital models, we wouldn’t be able to quantitatively study various wrist positions and motions,” said Crisco.


Crisco’s team then imported the digital wrist models into SolidWorks CAD software for additional fine-tuning. The final CAD data was output to a rapid prototyping machine to create physical models of the wrist. These models were accurate to within two degrees of rotation and 0.2 mm in dimension, and preserved the bone orientation of each wrist position.

Brown Medical researchers also used the digital models to create 3D animations of wrist movement, with the help of proprietary mathematical algorithms that fill in the gaps between scanned positions. The animations helped the researchers see, in three dimensions, how the transitions between wrist positions play out in real life.
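
The gap-filling algorithms themselves are proprietary, so the following is only a generic illustration: a standard way to interpolate smoothly between two scanned bone orientations is spherical linear interpolation (slerp) of unit quaternions.

```python
import numpy as np

def slerp(q0, q1, u):
    """Spherical linear interpolation between unit quaternions q0 and q1,
    for interpolation parameter u in [0, 1]."""
    dot = np.dot(q0, q1)
    if dot < 0.0:                 # take the shorter arc on the quaternion sphere
        q1, dot = -q1, -dot
    theta = np.arccos(min(dot, 1.0))
    if theta < 1e-8:              # poses nearly identical; avoid divide-by-zero
        return q0
    return (np.sin((1 - u) * theta) * q0 + np.sin(u * theta) * q1) / np.sin(theta)

# Demo: halfway between the identity and a 90-degree rotation about z
# should be a 45-degree rotation about z.
q_identity = np.array([1.0, 0.0, 0.0, 0.0])                   # (w, x, y, z)
q_90z = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
q_mid = slerp(q_identity, q_90z, 0.5)
```

Interpolating rotations this way (rather than interpolating coordinates directly) keeps each intermediate bone pose a valid rigid orientation, which is what makes the resulting animation physically plausible.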

Based on the animations, Crisco found that wrist motion and function are more complicated than previous theories indicated. Descriptions of wrist motion at the time suggested that the eight wrist bones move as three columns or two rows. Instead, Crisco discovered that each bone has a separate pattern of motion associated with each unique direction of wrist motion. (See Exhibits 2 and 3.)

Exhibit 2: NURBS surface models of the human wrist bones in flexed (blue) and extended (gray) positions.

Exhibit 3: NURBS surface model of human wrist in neutral position, as if looking at top of the hand. Neutral wrist positions were used as comparison for the wrist models created through a complete range of motion.

The team studied injured wrists as well as healthy ones to determine if diagnosis and treatment methods could be improved. They examined soft tissue injuries, such as scapholunate ligament tears, which are often the consequence of a fall on an outstretched hand. At the time, evaluations of ligament tears used x-rays to examine the gap between the scaphoid and lunate bones. If the gap is wider than expected, a ligament tear is the most probable diagnosis.

“Our current findings indicate that altered 3D bone motion of the wrist – rather than x-rays – is a better indicator of ligament injuries,” says Crisco.

Better Understanding of Implant Wear

In 2006, Dr. B.J. Fregly, assistant professor in the Department of Mechanical and Aerospace Engineering (MAE) at the University of Florida, was quoted as saying, “More engineering analysis goes into the washing machine in your home than into the artificial knee joints implanted in people.”

He set out to change that, taking technologies traditionally used for virtual prototyping of mechanical systems and extending them to more complex and variable biomechanical systems. His toolbox for studying and predicting joint contact stresses and motions in artificial and natural knees included dynamic modeling and simulation, parallel processing, image processing of CT and MRI data, 3D modeling from Geomagic, video-based motion capture and fluoroscopic imaging.

To utilize all these technologies, Fregly collaborated with Drs. Greg Sawyer and Rafi Haftka of the university’s MAE department and Dr. Scott Banks of the Biomotion Foundation in West Palm Beach, Florida. (See Exhibit 4.)

Exhibit 4: (Foreground) Complete knee assembled in Geomagic Studio, overlaying a fluoroscopic image of the knee. The lab can calculate the 3D relative motion of the artificial knee components through image matching. (Background) Fluoroscopy is used to measure detailed knee joint motion. Reflective markers on the patient’s skin and clothing capture gross movement.

One of the immediate goals of the University of Florida study was to identify specific knee implant design or surgical positioning issues that contribute to the wear, and ultimately failure, of artificial knees. (See Exhibit 5.) Researchers typically evaluate wear in new implant designs using knee simulator machines. While these machines can test several implants of the same design simultaneously, this approach does have its drawbacks.

First, a single series of tests can cost tens of thousands of dollars and take months to run. Additionally, loads experienced by the implant in the body cannot be measured experimentally. Because it is unclear what loads should be applied to the implant, test machines do not reliably reproduce wear patterns observed in real-world situations. Worse yet, identical implants tested in the same machine often produce different wear results, making it difficult to predict how the implants will function in patients.

Exhibit 5: Comparison of actual knee implant wear with simulated wear predicted by computer model. Actual wear was measured from two aligned laser scans, one of the worn tibial insert and the other of an unworn insert of the same size. The team visualized the simulated wear by creating worn surface geometry in Geomagic Studio (based on outputs from the computer simulation) and superimposing the data on the unworn geometry in Geomagic Qualify.

Fregly believed that a computational wear model based on patient-specific computer models would help scientists understand and accurately predict joint wear and failure.

He first evaluated and tested artificial knee designs by simulating a real-world environment specific to each patient using video-based motion capture to record gross movement data.

For more accurate measurements of knee joint motion, Dr. Scott Banks used fluoroscopy, and then matched 3D CAD models to each 2D fluoroscopic image as though he were orienting an object to a photograph of its shadow. Since the CAD models have embedded coordinate systems, the image-matched components are used to quantify the 3D motion of the patient’s knee under real-life loading conditions, such as walking and climbing stairs.
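
The lab’s image-matching software is not described in detail; conceptually, it searches for the pose whose projected 3D model best matches the 2D fluoroscopic image. The sketch below is a deliberately simplified, hypothetical version: it assumes known point correspondences and searches a single rotation angle over a grid, whereas a real system optimizes all six degrees of freedom against image silhouettes or edges.

```python
import numpy as np

def rot_z(a):
    """Rotation matrix for angle `a` (radians) about the camera z-axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def project(points, R, t, f=1000.0):
    """Pinhole projection of 3D model points, posed by (R, t), onto the image plane."""
    cam = points @ R.T + t
    return f * cam[:, :2] / cam[:, 2:3]

def best_pose(model, image_pts, t, angles):
    """Pick the candidate rotation whose projection best matches the image points."""
    errors = [np.mean((project(model, rot_z(a), t) - image_pts) ** 2)
              for a in angles]
    return angles[int(np.argmin(errors))]

# Demo: synthesize an "image" from a known pose, then recover the angle.
rng = np.random.default_rng(1)
model = rng.normal(size=(50, 3)) * 20.0      # stand-in for implant CAD points (mm)
t = np.array([0.0, 0.0, 500.0])              # implant ~500 mm from the x-ray source
image_pts = project(model, rot_z(0.4), t)    # the "fluoroscopic" observation
best_angle = best_pose(model, image_pts, t, np.linspace(0.0, 1.0, 101))
```

Because the CAD model carries its own coordinate system, the recovered pose directly yields the implant’s 3D position and orientation for each video frame, which is what lets the team measure joint motion under real loading.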

To create models of natural knees, the team used CT scans, which produced static 2D image slices from the top to the bottom of a patient’s leg. The CT data was then imported into sliceOmatic image-processing software from TomoVision (www.tomovision.com), where a 3D point-cloud model was created by stacking the 2D axial images of the patient’s leg.
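
sliceOmatic’s internals are not described here, but the stacking step itself is easy to illustrate: each segmented pixel in an axial slice becomes a 3D point, with the slice index supplying the z-coordinate. A minimal sketch, with made-up pixel and slice spacings:

```python
import numpy as np

def slices_to_point_cloud(slices, pixel_mm=0.5, slice_mm=1.0):
    """Stack binary segmentation masks (one per axial CT slice) into a 3D
    point cloud. Each nonzero pixel becomes a point; z comes from the
    slice index times the slice spacing."""
    points = []
    for k, mask in enumerate(slices):
        rows, cols = np.nonzero(mask)
        z = np.full(rows.shape, k * slice_mm)
        points.append(np.column_stack([cols * pixel_mm, rows * pixel_mm, z]))
    return np.vstack(points)

# Demo: two tiny 4x4 "slices", each with a 2x2 segmented region (the "bone").
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
cloud = slices_to_point_cloud([mask, mask])   # 8 points across two slices
```

The 0.5 mm pixel and 1.0 mm slice spacings are illustrative placeholders; in practice they come from the CT scanner’s acquisition metadata.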

Before this data could be used, the point clouds needed to be transformed into highly accurate 3D surface models, and for this the team used Geomagic Studio. The software “cleans up” point-cloud data and turns it into 3D polygon and surface models that can be used immediately for analysis, design and testing. (See Exhibit 6.)

Exhibit 6: Geomagic is used to create a detailed polygonal model from the point-cloud data.

Once the researchers had developed contact stress predictions from the original movement data, they created a final, comprehensive wear model. Combining accurate knee motion data with the contact stress predictions produced a wear model that pinpointed exactly where an artificial knee was likely to fail.
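
The article does not give the form of the wear model, but published computational wear simulations for polyethylene knee inserts are commonly built on Archard’s wear law, in which linear wear depth scales with contact pressure times sliding distance, accumulated over the gait cycle. A minimal sketch with purely illustrative numbers (the wear factor, pressures and sliding distances below are not data from the University of Florida study):

```python
import numpy as np

# Archard-style wear estimate: depth = k * contact_pressure * sliding_distance,
# summed over one gait cycle and scaled by the number of cycles.
K_WEAR = 1.0e-7   # mm^2/N -- illustrative polyethylene wear factor, not measured

def wear_depth(pressures_mpa, sliding_mm, cycles):
    """Linear wear depth (mm) at one surface node after `cycles` gait cycles.
    pressures_mpa: contact pressure at each instant of the cycle (MPa = N/mm^2)
    sliding_mm:    sliding distance between those instants (mm)"""
    per_cycle = K_WEAR * np.sum(pressures_mpa * sliding_mm)   # mm per gait cycle
    return per_cycle * cycles

# Illustrative pressure/sliding history at one node over one gait cycle.
p = np.array([2.0, 8.0, 12.0, 6.0])    # MPa at four instants of the cycle
s = np.array([0.5, 1.0, 1.2, 0.8])     # mm of sliding between instants
depth = wear_depth(p, s, cycles=1_000_000)   # roughly a year of walking
```

Running such a calculation at every node of the contact surface, with pressures from the contact model and sliding from the measured motion, is what yields a spatial wear map that can be compared against a retrieved implant.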

Researchers have been able to compare the simulated wear predicted by an early version of a computational model with the actual wear of an artificial knee recovered from a patient. (See Exhibit 7.) Their computational model’s prediction came within three-tenths of a millimeter of the actual maximum wear depth and accurately predicted the locations of the worst wear.

Exhibit 7: A completed 3D model with the implant data ready for computational testing and comparison to a used implant.

By developing a better understanding of how and where stress and movement produce wear, these researchers are pursuing advances to extend the life span and functionality of artificial knees.

Improved Implant Design

With the intensity of research shown by just these two examples, improved implant design and better treatment are an inevitable result of improved 3D technology in the medical fields. By taking data from various sources and turning it into accurate, usable 3D models, the world of “mass customization” is starting to become a reality. Taking scan data from the human body, designing a device personalized to a specific joint or limb, and manufacturing it rapidly from the 3D data means ultra-fast turnaround of products with higher quality, fewer failures and less demand for additional treatment.

Rachael Dalton-Taggart is Director of Marketing Communications at Geomagic with more than 20 years in the 3D CAD/CAM industries. She started her career in 3D CAD as a CAD Manager in an architectural firm in the U.K., and swiftly moved into journalist and editor roles at key CAD magazines in London. She has worked in various marketing and PR roles for Bentley Systems, Spatial Inc. and Lattice Technology, and also ran her own successful PR and marketing business targeted at 3D, CAD, design and manufacturing vendors for six years.

Geomagic, Inc.
919-474-0122 (phone)
www.geomagic.com