3D mouse shape reconstruction based on phase-shifting algorithm for fluorescence molecular tomography imaging system.
Summary (4 min read)
1. INTRODUCTION
- Fluorescence molecular tomography (FMT) emerged almost two decades ago and has been used widely in biomedical research labs because of its unique features, such as non-ionizing radiation, low cost, and the wide availability of molecular probes [1] [2].
- In FMT, fluorophores are injected intravenously into a mouse and then excited with lasers so that they emit fluorescence photons, some of which propagate to the mouse surface and are measured [7].
- In one study, the mouse was hung and rotated so that a camera could view it from different angles, allowing the 3D geometry to be reconstructed [9] [10].
- In section 2, the steps of the 3D surface reconstruction method are introduced, including the basic principles of phase shifting, selection of the phase-shifting step number, calibration of the pico-projector and webcam pairs, phase-to-coordinate conversion, merging of the two point clouds, an introduction to the Digiwarp method, and a brief description of their FMT imaging system.
- Section 4 concludes the paper with discussions.
2. METHODOLOGY
- There are two pico-projectors (AAXA p4x, AAXA Technologies Inc., Tustin, CA) and two webcams (C615, Logitech, Apples, Switzerland).
- The pico-projectors project fringe patterns onto the surface of the object.
- The minimum focal distance of the webcam is 200 mm.
- The pico-projectors and the webcams are small and inexpensive.
- These components with small size can be easily mounted inside the FMT imaging system as described in [7].
A. Phase Shifting Algorithm
- In the phase shifting method, N fringe patterns with a phase-shift step of 2π/N are generated by a computer and delivered to the pico-projector for sequential projection onto the object surface.
- The points with best quality are unwrapped first, and then the points with lower quality, until all points are unwrapped.
- After that an additional centerline image is used to obtain the absolute phase at each pixel [25].
- It is worth noting that the spatial frequency of the projected fringe pattern should be chosen carefully.
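The N-step phase retrieval the summary describes can be sketched as follows. This is a minimal illustration with numpy (the function and variable names are ours, not the paper's): N fringe images of the form I_n = a + b·cos(φ − 2πn/N) are combined to recover the wrapped phase at every pixel.

```python
import numpy as np

def wrapped_phase(images):
    """Recover the wrapped phase from N phase-shifted fringe images.

    images: array of shape (N, H, W); pattern n is assumed to follow
    the usual N-step model I_n = a + b*cos(phi - 2*pi*n/N).
    Returns the wrapped phase in (-pi, pi].
    """
    N = images.shape[0]
    shifts = 2 * np.pi * np.arange(N) / N
    # Standard least-squares phase estimate for N >= 3 steps.
    num = np.tensordot(np.sin(shifts), images, axes=(0, 0))
    den = np.tensordot(np.cos(shifts), images, axes=(0, 0))
    return np.arctan2(num, den)

# Synthetic check: a known phase ramp is recovered up to 2*pi wrapping.
H, W, N = 4, 64, 9
phi = np.linspace(-np.pi, np.pi, W, endpoint=False)[None, :].repeat(H, 0)
imgs = np.stack([0.5 + 0.4 * np.cos(phi - 2 * np.pi * n / N) for n in range(N)])
rec = wrapped_phase(imgs)
```

The centerline image mentioned above then supplies the absolute-phase reference needed to resolve the 2π ambiguities during unwrapping.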
B. Selection of Phase Shifting Step Number
- The average phase errors for each step number from 3 to 15 are shown in Fig.
- From this result the authors can see how the nonlinearity of the projectors affects the accuracy as the step number increases.
- Generally, the phase error decreases as the number of steps increases.
- But the phase errors caused by the non-linearity of the projectors are relatively small once the number of fringe patterns is larger than 7.
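The effect of projector nonlinearity on the choice of step number can be reproduced numerically. The sketch below (our illustration; the paper's exact simulation is not given in this summary) models the nonlinearity as a gamma curve applied to ideal fringes and measures the resulting mean phase error for each step number: the gamma distortion injects harmonics, and an N-step algorithm only passes harmonics k ≡ ±1 (mod N), so the error drops quickly as N grows.

```python
import numpy as np

def phase_error_with_gamma(N, gamma=2.2, width=256):
    """Mean absolute phase error of an N-step algorithm when the
    projector applies an (assumed) gamma nonlinearity to each fringe."""
    phi = np.linspace(0, 2 * np.pi, width, endpoint=False)
    shifts = 2 * np.pi * np.arange(N) / N
    # Ideal fringes in [0, 1], then distorted by the projector gamma.
    imgs = 0.5 + 0.5 * np.cos(phi[None, :] - shifts[:, None])
    imgs = imgs ** gamma
    rec = np.arctan2(np.sin(shifts) @ imgs, np.cos(shifts) @ imgs)
    err = np.angle(np.exp(1j * (rec - phi)))   # wrap the difference
    return np.mean(np.abs(err))

errors = {N: phase_error_with_gamma(N) for N in range(3, 16)}
```

Under this toy model the 3-step error is dominated by the leaked second harmonic, while for N = 9 the lowest leaking harmonics (k = 8, 10) are already tiny, consistent with the observation above.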
C. Pico-projector and Webcam Pairs Calibration
- For one pair of pico-projector and webcam, there are 3 coordinate systems: the webcam coordinate system, the pico-projector coordinate system and the world coordinate system [25].
- System calibration is required to obtain the intrinsic parameters of the webcam and the pico-projector and to create relationships among the three coordinate systems.
- The calibration process is similar to the method described in [25].
- The camera calibration is performed with the Matlab Camera Calibration Toolbox [29].
- The projector checker board images are generated by pixel to pixel mapping from camera images, where the 9-step phase shifting algorithm is utilized again.
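Since the projector cannot see the checkerboard itself, its "images" are synthesized by mapping each camera pixel to a projector pixel through the measured absolute phase. A minimal sketch of that mapping (our illustration; the paper's exact conventions, e.g. fringe orientation and pitch, are assumptions here):

```python
import numpy as np

def camera_to_projector(phase_h, phase_v, pitch):
    """Map each camera pixel to a projector pixel via absolute phase.

    phase_h / phase_v: absolute phase maps (radians) measured with
    horizontal and vertical fringes; pitch: fringe period in projector
    pixels. One fringe period spans `pitch` projector pixels, so the
    projector coordinate is phase / (2*pi) * pitch.
    """
    u_p = phase_v / (2 * np.pi) * pitch   # vertical fringes encode columns
    v_p = phase_h / (2 * np.pi) * pitch   # horizontal fringes encode rows
    return u_p, v_p

# Round-trip check on synthetic absolute phases.
pitch = 16.0
u_true = np.array([[0.0, 8.0], [24.0, 40.0]])
v_true = np.array([[4.0, 4.0], [12.0, 12.0]])
u_p, v_p = camera_to_projector(v_true / pitch * 2 * np.pi,
                               u_true / pitch * 2 * np.pi, pitch)
```

With this mapping, the checkerboard corners detected in the camera image can be transferred to projector coordinates, letting the same toolbox calibrate the projector as a "reverse camera".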
D. Phase to Coordinates Conversion
- After all the calibration parameter matrices are obtained, the phase map generated in section 2.A can be converted to the 3D coordinates in the world coordinate system.
- R denotes the rotation matrix, t the translation vector, and m the elements of the matrix A[R t].
- The above parameters are known and Xw, Yw, Zw are the 3D coordinates in the world coordinate system to be determined.
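The conversion amounts to a small linear triangulation: each projection matrix M = A[R t] gives equations of the form (m_i − w·m_3) · [Xw, Yw, Zw, 1]ᵀ = 0, two from the camera pixel (u_c, v_c) and one from the projector coordinate recovered from the absolute phase. A sketch under that standard model (the matrices and poses below are synthetic, not the paper's calibration values):

```python
import numpy as np

def triangulate(Mc, Mp, uc, vc, up):
    """Solve for (Xw, Yw, Zw) from a camera pixel (uc, vc) and the
    projector column up recovered from the absolute phase.

    Mc, Mp: 3x4 matrices A[R t] for the webcam and the pico-projector.
    Two equations come from the camera and one from the projector,
    forming a 3x3 linear system.
    """
    rows = np.stack([
        Mc[0] - uc * Mc[2],
        Mc[1] - vc * Mc[2],
        Mp[0] - up * Mp[2],
    ])
    return np.linalg.solve(rows[:, :3], -rows[:, 3])

def proj(M, X):
    """Pinhole projection of a 3D point with matrix M."""
    h = M @ np.append(X, 1.0)
    return h[:2] / h[2]

# Synthetic check: project a known point, then recover it.
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
Mc = K @ np.hstack([np.eye(3), [[0.0], [0.0], [5.0]]])
th = 0.3  # hypothetical projector pose, rotated about the y axis
Rp = np.array([[np.cos(th), 0, np.sin(th)],
               [0, 1, 0],
               [-np.sin(th), 0, np.cos(th)]])
Mp = K @ np.hstack([Rp, [[-1.0], [0.0], [5.0]]])
X_true = np.array([0.2, -0.1, 1.0])
(uc, vc), (up, _) = proj(Mc, X_true), proj(Mp, X_true)
X_rec = triangulate(Mc, Mp, uc, vc, up)
```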
E. Alignment of Two Point Clouds
- As the two pairs of pico-projector and webcam are calibrated independently, they have different world coordinate systems, as shown in Fig. 2a.
- So the authors need to align the two point clouds, which live in different coordinate systems, before merging them into a single point cloud.
- The authors use a calibration bar as shown in Fig. 2b to transform both coordinate systems to the conical mirror coordinate system {Ocon, xcon, ycon, zcon}.
- During their experiments the authors find that the two 3D point clouds of a mouse shaped phantom cannot be merged precisely after the alignment.
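Given corresponding points (for example, markers on the calibration bar), the optimal rigid transform between two point clouds has a well-known closed-form SVD solution (Kabsch/Procrustes). The sketch below is our illustration of that standard step; the summary does not specify which solver the authors used.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rotation R and translation t with R @ P_i + t ~= Q_i,
    for corresponding point sets P, Q of shape (n, 3).
    Closed-form Kabsch solution via SVD of the cross-covariance."""
    cp, cq = P.mean(0), Q.mean(0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # Fix a possible reflection so that det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Check: recover a known rigid motion from noiseless correspondences.
rng = np.random.default_rng(1)
P = rng.normal(size=(50, 3))
th = 0.7
R_true = np.array([[np.cos(th), -np.sin(th), 0],
                   [np.sin(th),  np.cos(th), 0],
                   [0, 0, 1]])
t_true = np.array([1.0, -2.0, 0.5])
Q = P @ R_true.T + t_true
R, t = rigid_align(P, Q)
```

A residual misalignment like the one reported above is typically reduced afterwards by a local refinement such as ICP on the overlapping region.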
G. FMT Reconstruction
- To validate FMT reconstruction with the mesh generated from the proposed surface extraction method, the authors perform an FMT experiment with a mouse shaped phantom embedded with a capillary tube that is 20 mm long and 1 mm in diameter.
- Briefly, the FMT imaging system consists of a conical mirror, a line pattern laser mounted on a rotary stage and a CCD camera, as shown in Fig.
- A 643 nm line laser (Stocker Yale Canada Inc.) is used to excite the fluorescence photons.
- The authors use 30 line laser source positions and 14,723 detectors.
- The propagation of the excitation and emission light is modeled by the diffusion equation, which is solved by the finite element method [32].
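For reference, the diffusion approximation commonly used as the FMT forward model takes the following continuous-wave form (the summary does not reproduce the equation; this is the standard form, with Φ the photon fluence, μ_a the absorption coefficient, μ_s' the reduced scattering coefficient, and S the source term):

```latex
-\nabla \cdot \bigl( D(\mathbf{r}) \, \nabla \Phi(\mathbf{r}) \bigr)
  + \mu_a(\mathbf{r}) \, \Phi(\mathbf{r}) = S(\mathbf{r}),
\qquad
D(\mathbf{r}) = \frac{1}{3\bigl(\mu_a(\mathbf{r}) + \mu_s'(\mathbf{r})\bigr)}
```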
3. RESULTS
A. System Calibration
- Fig. 4a and Fig. 4b show one example of their camera checker board image and its corresponding projector checker board image generated from 9-step phase shifting method.
- The checker board images are used for calibrating the webcam and the pico-projector.
C. Alignment of Two Point Clouds
- The authors use the calibrated 3D shape extracting system to measure the calibration bar as shown in Fig. 2b.
- The authors have calculated the optimal rotation and translation matrices for the two point clouds and merged them as shown in Fig. 6a.
- The authors plot a cross section as shown in Fig. 6b, and compare it with the ground truth.
- The authors see that the measured calibration bar surface closely overlaps the ground truth.
D. Mouse Shaped Phantom Surface Extraction
- Fig. 7b and 7c show the fringe patterns captured by two webcams from two different views.
- There are in total nine such fringe patterns with a phase shifting step of 2π/9 for each webcam, and an additional centerline picture, which is used to determine the absolute phase.
- Figs. 8a and 8b plot the wrapped and unwrapped phase maps for webcam 1, and Figs. 8c and 8d those for webcam 2.
- Fig. 9 shows the 3D reconstructed results of the mouse geometry after the alignment of two point clouds and 3D registration, from which the authors can see that the reconstructed size is quite close to the true size.
E. Accuracy Evaluation
- In order to evaluate their system’s accuracy, the authors have fabricated a step object for which the step height between the two planes is 8.13 mm.
- Its photo is shown in Fig. 10a and the reconstructed step is shown in Fig. 10b.
- From the results the authors can see that the standard deviations are less than 0.2 mm for both planes and that their system retrieves the step height with errors within 0.5 mm, which is sufficient for FMT imaging.
- All the data sets are shown in Fig. 11a, from which the authors can see they overlap very well.
- The cross section also proves that the surface data obtained from their method is very close to the CT data.
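The step-object evaluation above can be reproduced with a simple least-squares plane fit per surface: the per-plane residual standard deviation and the fitted height difference give the two accuracy numbers reported. A sketch on synthetic data (our illustration, assuming planes of the form z = ax + by + c; the paper's exact fitting procedure is not given in this summary):

```python
import numpy as np

def fit_plane_z(points):
    """Least-squares fit z = a*x + b*y + c; returns the coefficients
    (a, b, c) and the standard deviation of the z residuals."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coef, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    resid = points[:, 2] - A @ coef
    return coef, resid.std()

# Synthetic step object: two horizontal planes 8.13 mm apart, with noise.
rng = np.random.default_rng(2)
xy = rng.uniform(0, 10, size=(500, 2))
lower = np.column_stack([xy, rng.normal(0.0, 0.05, 500)])
upper = np.column_stack([xy, 8.13 + rng.normal(0.0, 0.05, 500)])
(cl, sl), (cu, su) = fit_plane_z(lower), fit_plane_z(upper)
step_height = cu[2] - cl[2]   # difference of the fitted intercepts
```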
F. Digiwarp Results
- After the mouse surface point cloud is obtained, the authors apply Digiwarp to the point cloud and generate the finite element mesh.
- Among these 932 corresponding points, 8 points are chosen manually from the nose, arms and legs, and the other 924 points are chosen automatically slice by slice from the trunk along the x axis.
- To map the 924 points on the trunk, the authors divide the trunk section of the point cloud and of the Digimouse into 30 even slices.
- Fig. 13b shows the corrected posture of Digimouse, in which the limbs and the head match the position of those of the subject mouse point cloud and Fig. 13c plots the first volume warping result.
- Fig. 13d shows the surface fitting result while Fig. 13e shows the final volume warping result.
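The slice-by-slice correspondence picking on the trunk can be sketched as follows. This simplified illustration (ours, not the paper's code) bins the point cloud evenly along the x axis and returns one representative point per slice; the real method picks several matched points per slice on both the point cloud and the Digimouse.

```python
import numpy as np

def slice_correspondences(points, n_slices=30):
    """Divide a point cloud evenly into n_slices bins along the x axis
    and return one representative (the centroid) per slice."""
    x = points[:, 0]
    edges = np.linspace(x.min(), x.max(), n_slices + 1)
    # Bin index per point, clamped so x.max() falls in the last slice.
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_slices - 1)
    return np.array([points[idx == i].mean(axis=0) for i in range(n_slices)])

# Synthetic trunk: 3000 points spread along a 60 mm x axis.
rng = np.random.default_rng(3)
trunk = rng.uniform([0, -1, -1], [60, 1, 1], size=(3000, 3))
reps = slice_correspondences(trunk, 30)
```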
G. FMT reconstruction Results
- Fig. 14 plots the transverse, coronal and sagittal views of the overlaid FMT and gray-scale CT images.
- The red color line plots the mouse phantom boundary from the warped mesh.
- From these results the authors observe that, with the finite element mesh generated by the proposed 3D shape extraction method, the FMT reconstruction is consistent with the CT reconstruction.
4. DISCUSSIONS AND CONCLUSION
- Compared with the approach described in [34], their paper differs in several aspects.
- The authors use two pairs of pico-projector and webcam to cover the surface from two views.
- They also warp a Digimouse mesh onto the extracted point cloud to generate the finite element mesh easily and robustly.
- Experimental results indicate that the accuracy of the proposed surface extraction method is within 0.5 mm, which is sufficient for FMT reconstruction as validated with the FMT images.
- The authors thank Dr. Simon Cherry at UC Davis for lending them the CRI camera and the line laser, Michael Lun for proofreading, and Y.