Armed with an industrial grinder and a super-high-resolution camera, Princeton geoscientists Adam Maloof and Akshay Mehra have ground down rock samples and created 3D digital versions of them. These digital renditions let researchers examine a rock from any angle, including from inside it. The same team developed software that can segment the images and isolate objects without human bias.
This technology was used alongside field observations made by the geoscientists who collected the rocks. Studying samples this way, the researchers found a thin-shelled creature that lived more than 545 million years ago. The creature, called Cloudina, was the first known “biomineralizer,” an organism able to build hard parts, such as a shell, out of minerals.
Previously, some researchers thought that Cloudina were reef-building organisms. But the 3D reconstructions showed that its tube-like structures had been transported from other areas, suggesting that Cloudina played only a minor role in the reef systems.
"I thought going in we would learn all sorts about this amazing first biomineralizer and first reef builder, but Cloudina turned out to be more like a reef dweller," said Maloof, an associate professor of geosciences. He has now turned his focus to the next-oldest potential reef builder, a sponge called Archaeocyathid that lived about 520 million years ago.
Cloudina could not previously be studied this closely: it was too fragile to extract from the limestone, and it could not be imaged with traditional X-ray techniques. Because its shells are chemically identical to the surrounding limestone, there is no density contrast between fossil and rock, so Cloudina is effectively invisible to X-rays.
The technology was assembled around five years ago by Maloof and Situ Studio collaborator Brad Samuels, who created what he called “flipbooks”: digital renderings that step through thousands of wafer-thin slices of rock. GIRI, the Princeton Grinding Imaging and Reconstruction Instrument, is the key to seeing what is inside a rock.
GIRI cuts slices as thin as a few microns and runs 24 hours a day, 7 days a week, taking about 90 seconds to grind and image each slice. Researchers can adjust the machine's speed and slice thickness. Most of the slices Maloof and Mehra imaged were 30 microns thick; a sample about an inch thick, cut into 1,500 slices, takes about a day and a half to grind and image. The only required maintenance is to replace the machine fluids and clean the wipers periodically.
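The quoted throughput figures can be checked with simple arithmetic. This is a sketch using only the numbers in this article; note that 1,500 slices at 30 microns works out to roughly 45 mm, a bit more than an inch:

```python
# Back-of-the-envelope check of GIRI's throughput, using figures from the article.
slice_time_s = 90        # seconds to grind and image one slice
slice_thickness_um = 30  # microns per slice
n_slices = 1500

sample_thickness_mm = n_slices * slice_thickness_um / 1000  # 45 mm
total_hours = n_slices * slice_time_s / 3600                # 37.5 hours

print(f"sample thickness: {sample_thickness_mm:.0f} mm "
      f"(~{sample_thickness_mm / 25.4:.1f} in)")
print(f"total run time: {total_hours:.1f} hours "
      f"(~{total_hours / 24:.1f} days)")
```

Thirty-seven and a half hours of continuous grinding matches the "day and a half" quoted above.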
"It's destructive of course, that's the disadvantage, but what's so nice is that you get to see photographs and make direct observations," said Maloof. "That's what's been so life-changing to me: I love that it's not a model. You can just see it. On any given slice, if you find something great, you can just find the slice and say, 'What did it look like?' ...We're on a virtual tour inside, rather than looking at waveforms and trying to interpret them."
Even though the sample is pulverized in the process, every detail is preserved in the high-resolution images, along with the rock's structural information. It can be argued that this makes it one of the less destructive methods currently in use.
"When people have attempted to get 3D information from rocks like this that are opaque to X-rays, they've always tried to dissolve the material out. But then you lose all the in-situ information. You don't know how they grew. They have no relationship to each other. And you don't know how they are related to perhaps smaller or less resilient parts. You may preferentially dissolve the ornamentation or other key details. So this is a way to keep them in their habitat while still trying to figure out what they looked like,” said Maloof.
Since GIRI was developed it has been significantly improved, including a redesigned camera housing and wiper. Temperature and humidity monitors now record conditions for every photograph. Before these improvements, grinding a single slice took seven minutes; the team has since cut that to 90 seconds. The software has also been substantially improved.
"From the ground up, Akshay has designed machine-learning solutions to make the process of image segmentation -- differentiating fossils from a matrix, cement, etc., in every slice -- automated and reliable," said Maloof. "He has developed techniques that ultimately will be important for any tomographic applications, including X-ray CT. Akshay also has developed ways to make quantitative measurements in the reconstructed 3D volumes. You'd be surprised how much 3D modeling out there only leads to visualization and qualitative interpretation, whereas Akshay actually measures the size, shape and 3D orientation of these critters."
"We can directly measure these Cloudina specimens," said Mehra. "We can directly measure what directions the tubes are bending, what their diameters are, what their curvatures are -- none of them are actually straight -- and based on that information, we can determine whether they are in situ or not in situ."
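Measurements like the ones Mehra describes can be made from a tube's reconstructed centerline. The following is a minimal sketch of two such quantities, tortuosity (how far a tube is from straight) and discrete curvature; the formulas are standard geometry, and the helix data is purely illustrative, not from the study:

```python
import numpy as np

def tortuosity(points):
    """Arc length divided by end-to-end distance; exactly 1.0 for a straight tube."""
    pts = np.asarray(points, dtype=float)
    seg_lens = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    return seg_lens.sum() / np.linalg.norm(pts[-1] - pts[0])

def discrete_curvature(points):
    """Turning angle per unit arc length at each interior vertex of the centerline."""
    pts = np.asarray(points, dtype=float)
    v = np.diff(pts, axis=0)                 # edge vectors along the centerline
    lens = np.linalg.norm(v, axis=1)
    u = v / lens[:, None]                    # unit tangent of each edge
    cos_ang = np.clip((u[:-1] * u[1:]).sum(axis=1), -1.0, 1.0)
    angles = np.arccos(cos_ang)              # turning angle at each interior vertex
    ds = 0.5 * (lens[:-1] + lens[1:])        # local arc-length step
    return angles / ds

# Illustrative example: a gently helical "tube" centerline.
t = np.linspace(0, 2 * np.pi, 50)
helix = np.column_stack([np.cos(t), np.sin(t), 0.3 * t])
print(f"tortuosity: {tortuosity(helix):.2f}")
print(f"mean curvature: {discrete_curvature(helix).mean():.2f}")
```

A specimen whose tortuosity and curvature match those of in-place growth, rather than of jumbled transported debris, is evidence that it is in situ.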
Neural networks have been developed to identify rock types by visual properties such as color and texture. After the user defines which classes are present in the images, the network predicts which class each pixel belongs to with 90 percent accuracy.
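As a toy illustration of this kind of per-pixel classification, the sketch below trains a small neural network on synthetic pixel features (brightness and local texture, with made-up distributions for "fossil" versus "matrix" classes). It uses scikit-learn's `MLPClassifier` for brevity; the team's actual segmentation software is custom and far more sophisticated:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic per-pixel features: [brightness, local texture variance].
# Assumption for illustration: fossil pixels are brighter and smoother than matrix.
n = 2000
fossil = np.column_stack([rng.normal(0.8, 0.05, n), rng.normal(0.1, 0.03, n)])
matrix = np.column_stack([rng.normal(0.4, 0.05, n), rng.normal(0.5, 0.05, n)])
X = np.vstack([fossil, matrix])
y = np.array([1] * n + [0] * n)  # user-defined classes: 1 = fossil, 0 = matrix

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X, y)

# Classify pixels from a held-out "slice" and report accuracy.
X_new = np.vstack([
    np.column_stack([rng.normal(0.8, 0.05, 500), rng.normal(0.1, 0.03, 500)]),
    np.column_stack([rng.normal(0.4, 0.05, 500), rng.normal(0.5, 0.05, 500)]),
])
y_new = np.array([1] * 500 + [0] * 500)
print(f"held-out accuracy: {clf.score(X_new, y_new):.2f}")
```

Run over every pixel of every slice, predictions like these produce the automated, unbiased segmentation the article describes.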
The paper on the new machine was published in the Proceedings of the National Academy of Sciences.