Three-dimensional (3D) technologies are well known for improving product design, manufacturing, and even the quality of movies, but their adoption in medicine has been much slower. While the use of 3D for orthotic devices is visibly on the rise, applying 3D to orthopaedic implant design and surgical procedures has been hindered by the need for more accurate and usable software, and for scanning devices that can more easily tackle the 3D “challenge” inside the human body.
It Takes All Kinds
Designing consumer products is relatively straightforward. Think of a one-size-fits-all product such as a smartphone: design it once and manufacture it repeatedly. Throw in the complexities of the human body, however, and the challenge becomes much greater. No two human shapes are exactly the same, and each patient places different stresses and unique injuries on their joints. Even fully understanding specific joint movements remains complex.
As software and the technologies to scan the human body have improved, 3D is starting to enter the world of orthopaedic implants, improving knowledge of joint movement and the products designed to replace damaged joints.
Better Understanding of Joint Movement
While readers may think they already know how the bones in the wrist move and how to treat them when they are injured, that premise was challenged several years ago by researchers at Brown Medical School/Rhode Island Hospital.
Under a five-year study funded by the National Institutes of Health started in 2004, researchers at Brown Medical School/Rhode Island Hospital began using a combination of computed tomography (CT), Geomagic reverse-engineering software, CAD/CAM solid modeling tools and computer animations to study the human wrist in completely new ways. Early discoveries indicated that the wrist doesn’t move in previously accepted ways, and that widely practiced treatments of wrist injuries could be dramatically improved.
Joseph Crisco, director of the Orthopaedic Bioengineering Laboratory at Brown Medical School/Rhode Island Hospital, and his colleagues were among the first researchers to gather data on wrist movement from live subjects as opposed to cadavers. The process was non-invasive and utilized data from CT scans of the human wrist through a full range of motion, revealing 3D images, which traditional x-rays do not. (See Exhibit 1.)
Exhibit 1: 3D data from live subjects’ wrists in various positions. The data was imported into Geomagic Studio where NURBS surfaces of the bones were created.
The scan data was collected and exported to Mayo Foundation’s Analyze software to extract the contours of the small bones present in the scans. Crisco and his team then developed programs to generate point-cloud data from the extracted bone contours. The point-cloud data was imported into Geomagic Studio reverse-engineering software, where the data was turned into 3D polygonal models and non-uniform rational B-spline (NURBS) surfaces.
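The article doesn’t describe the team’s actual programs, but the basic idea of turning per-slice contours into a 3D point cloud can be sketched as follows: each CT slice yields 2D contour points, and stacking them with each slice’s known z position produces the 3D cloud. This is a minimal illustration with a hypothetical `contours_to_point_cloud` helper, not the researchers’ code.

```python
import numpy as np

def contours_to_point_cloud(contours, slice_spacing):
    """Stack per-slice 2D bone contours into one 3D point cloud.

    contours: list of (N_i, 2) arrays of (x, y) contour points, one per CT slice.
    slice_spacing: distance between adjacent slices (e.g., in mm).
    Returns an (M, 3) array of (x, y, z) points.
    """
    points = []
    for i, contour in enumerate(contours):
        # Every point on slice i lies at the same z height.
        z = np.full((len(contour), 1), i * slice_spacing)
        points.append(np.hstack([contour, z]))
    return np.vstack(points)

# Toy example: two identical square contours on slices 0.5 mm apart.
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
cloud = contours_to_point_cloud([square, square], slice_spacing=0.5)
print(cloud.shape)  # (8, 3)
```

A real pipeline would also carry the scanner’s in-plane pixel spacing and slice orientation, but the stacking step is the core of contour-to-cloud conversion.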
Crisco’s group also combined scanned point-cloud data of varying resolutions to create final 3D models of the wrist bones. The surfaces of the higher-resolution scans were automatically aligned within Geomagic Studio with corresponding point clouds from other positions and saved as new files. With these files, the team created accurate and easily viewed digital models of complete wrists in multiple positions.
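Geomagic Studio performs this alignment internally, but the underlying operation is rigid registration: finding the rotation and translation that best map one bone’s point cloud onto the same bone scanned in another position. Assuming known point correspondences (real tools estimate them, e.g. via ICP), the least-squares solution is the classic Kabsch algorithm; this sketch is illustrative, not Geomagic’s implementation.

```python
import numpy as np

def kabsch_align(source, target):
    """Least-squares rigid transform (R, t) mapping source onto target.

    source, target: (N, 3) arrays of corresponding 3D points.
    Returns rotation matrix R and translation vector t such that
    target ≈ source @ R.T + t.
    """
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    # Correct for possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Toy check: rotate a point set 90° about z, translate it, then recover the move.
rng = np.random.default_rng(0)
pts = rng.random((20, 3))
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
moved = pts @ Rz.T + np.array([1.0, 2.0, 3.0])
R, t = kabsch_align(pts, moved)
aligned = pts @ R.T + t
print(np.allclose(aligned, moved))  # True
```

The reflection check matters for anatomical data: without it, a least-squares fit can return a mirror image of the bone rather than a rotated copy.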
“Without accurate digital models, we wouldn’t be able to quantitatively study various wrist positions and motions,” said Crisco.