Digital stereophotogrammetry based on circular markers and zooming cameras: evaluation of a method for 3D analysis of small motions in orthopaedic research

Abstract

Background

Orthopaedic research projects focusing on small displacements within a small measurement volume require a radiation-free, three-dimensional motion analysis system. A stereophotogrammetrical motion analysis system can track wireless, small, light-weight markers attached to the objects, so the disturbance of the measured objects by the marker tracking is kept to a minimum. The purpose of this study was to develop and evaluate a compact motion analysis system that is not fixed in position, is configured for a small measurement volume, and can zoom while tracking small round flat markers with respect to a fiducial marker used for the camera pose estimation.

Methods

The system consisted of two web cameras and the fiducial marker placed in front of them. The markers to track were black circles on a white background. The algorithm to detect a centre of the projected circle on the image plane was described and applied. In order to evaluate the accuracy (mean measurement error) and precision (standard deviation of the measurement error) of the optical measurement system, two experiments were performed: 1) inter-marker distance measurement and 2) marker displacement measurement.

Results

In the first experiment, the 10 mm distances were measured with a total accuracy of 0.0086 mm and a precision of ± 0.1002 mm. In the second experiment, translations from 0.5 mm to 5 mm were measured with a total accuracy of 0.0038 mm and a precision of ± 0.0461 mm. Rotations in steps of 2.25° were measured with a total accuracy of 0.058° and a precision of ± 0.172°.

Conclusions

The description of this non-proprietary measurement device with very good levels of accuracy and precision may provide opportunities for new, cost-effective applications of stereophotogrammetrical analysis in musculoskeletal research projects focusing on the kinematics of small displacements within a small measurement volume.

Background

"Three-dimensional (3D) measurements play a vital role in a diversity of industries and disciplines, ranging from the manufacturing and process sectors to healthcare" [1]. Orthopaedic research often focuses on qualitative and quantitative measurements of an object's motion [2, 3]. An image based 3D motion analysis system can track wireless, small, light-weight markers attached to the surface of an object. Measurement of strain on the ligaments or spatial changes of small bones can be determined with a high accuracy. Malicky et al. used a stereoradiogrammetry to measure the strain on the glenohumeral capsule by means of spherical markers [4]. Video-based 3D motion analysis systems based on black [5] or retroreflective [6] spherical markers were developed for tracking objects in small (less than 1 m3) measurement volume. In contrast to gait analysis tracking systems configured for large measurement volume [7], "motion analysis configured for registration within small volumes allows measurement of minuscule displacements with great accuracy" [6].

A possibility to optimize the accuracy of a motion analysis system was successfully tested by Mössner and co-workers [8]. They used the zoom, tilt and pan of their cameras to track the movement of athletes during downhill skiing. The zoom was used to overcome the limited camera resolution, while tilt and pan kept the tracked object within the field of view of the stationary camera while zooming. To perform this task, at least 6 control points [9] had to be visible in each camera frame, as required by the Direct Linear Transformation (DLT) method [10, 11] used to determine the intrinsic and extrinsic parameters of a fully projective camera. When the intrinsic parameters are known, the camera lens distortion can be minimized, which improves the spatial information about the object derived from the captured image. In order to reconstruct the 3D positions of markers captured by two or more cameras, the extrinsic parameters of each camera also have to be known. These extrinsic parameters describe the geometrical relation between the camera and the captured calibration body. The calibration body is not required during subsequent motion tracking as long as all cameras remain in fixed positions relative to one another [5, 6]. Otherwise, when the experimental conditions require flexible camera positioning and the camera's intrinsic parameters have already been determined (the camera was pre-calibrated), the extrinsic parameters of each camera can be determined using at least 3 [12] or 4 [13] control points. This process is also called "camera pose estimation".

Ansar and Daniilidis [13] and Lepetit et al. [14] showed, in comparisons of different pose estimation methods, very good accuracy and robustness of the orthogonal iterations algorithm developed by Lu et al. [15]. The orthogonal iterations algorithm searches for an optimal orthogonal projection of the control points presented as a perspective projection on the image plane. The orthogonality constraint is enforced by using singular value decomposition rather than a specific parameterization of rotations, e.g., Euler angles [15], which is typical for DLT methods.

A large calibration body used for camera calibration in DLT methods can be replaced by a small fiducial marker when the cameras are pre-calibrated. Fiducial markers are artificial landmarks added to a scene to facilitate locating point correspondences between images, or between images and a known model [16]. In our study, the fiducial marker was defined as an aggregate of coplanar control points used for the camera pose estimation. Coplanar reference objects are especially easy to manufacture and measure [17]. In addition, a control point detection based on detecting the centre of a circular target is advantageous because the circle is centrosymmetric and the detection of the circle centre is not sensitive to the thresholding error [18]. Nevertheless, most 3D tracking systems are based on spherical markers because their circular image is almost independent of the viewing direction [19] of the camera. In contrast, the perspective projection of a circular marker onto the image-plane has an elliptical form, except when the image-plane is parallel to the circle-plane. The ellipse centre differs from the centre of the projected circle depending on the angle and displacement between the circle surface and the image-plane. This effect is known as eccentricity [20]. In order to avoid the systematic geometric image measurement error caused by the eccentricity, its correction is required [18, 20, 21].

Combining all advantageous aspects of the different techniques mentioned above, the purpose of this study was to develop and evaluate a motion analysis system that is not fixed in position and is configured for a small measurement volume. The system used zoom to track small round flat markers with respect to a fiducial marker consisting of four coplanar circles. Additionally, a unique solution to find the centre of a circle projected onto the image had to be developed and applied. This 3D motion analysis system was specifically configured to measure spatial changes of small bones in the foot region.

Methods

Tracking Device and Image Acquisition

Two web cameras (Logitech®, Webcam Pro 9000) were used to acquire static pictures with a resolution of 1600 × 1200 pixels. A camera holder was constructed so that the cameras' viewing axes crossed at an angle of 40° on the object of interest. The plastic fiducial marker consisted of four black circles (ø 5 mm) on a white background and was placed on a 27 cm long pin between the cameras (Figure 1). The black circles on the fiducial marker were mounted with an accuracy of 0.005 mm. The arrangement of the four black circles is similar to that used to build a reference marker in a previous study [22]. The markers to be tracked were single black circles (ø 1 or 2 mm) printed (HP Laserjet 1300, resolution of 1200 × 1200 dpi) on a white background.

Figure 1

Stereophotogrammetrical tracking device. The plastic fiducial marker (left), consisting of four black circles on a white background, was placed on the pin between the two web cameras.

Both cameras were pre-calibrated with a fixed focus and a camera zoom factor of 2.2 using a freely available calibration toolbox [23]. A 6.5 × 6.5 cm calibration board was printed by laser printer as a black-and-white checkerboard presenting a grid of 144 control points. The calibration board was captured within the field of view of each camera in 18 different positions, filling the volume in which the markers to be tracked and the fiducial marker had to be captured. In this way the intrinsic parameters were determined in order to pre-calibrate each camera [23]. Due to the 90° tilted cameras (Figure 1), the measurement volume (approximately 0.1 × 0.1 × 0.1 m) was behind and above the fiducial marker, which was placed in the middle of the lower part of each camera's field of view.
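The pre-calibration itself was performed with the MATLAB toolbox of reference [23]. For illustration only, the following Python/OpenCV sketch shows an equivalent intrinsic calibration of one camera; the file names are placeholders, and the board geometry (12 × 12 inner corners with 5 mm squares, giving 144 control points on a 6.5 cm board) is inferred from the text rather than taken from the authors' setup.

```python
# Illustrative sketch only: intrinsic pre-calibration of one camera with OpenCV.
# The authors used the MATLAB toolbox of [23]; file names and board geometry
# are assumptions inferred from the description above.
import glob
import cv2
import numpy as np

PATTERN = (12, 12)   # inner corners per row and column (assumed)
SQUARE_MM = 5.0      # checkerboard square size in mm (assumed)

# 3D board coordinates of the corners (z = 0 on the planar board)
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for fname in sorted(glob.glob("calib_left_*.png")):   # the 18 board poses
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

assert img_points, "no calibration images found"
# Intrinsics: camera matrix K (focal length, principal point) and distortion
rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("re-projection RMS (px):", rms)
```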

The image acquisition tool was programmed in MATLAB (The MathWorks Inc., Natick, MA, USA). This tool allowed a real-time streaming view from both cameras. With the fiducial marker placed near the tracked objects, two static images were acquired simultaneously from both cameras under optimal light conditions (150 Watt, Ministudio 606, Multiblitz Dr. Ing. D. A. Mannesmann GmbH & Co. KG, Köln, Germany) as described in the manual of the cameras used.

Determination of the circle centre projected onto the image-plane

The black circle on the white background was used to locate the fiducial as well as the object-surface markers. Edge detection between the black and white areas was performed by means of a gray-value threshold. An ellipse Π_i(c_i, a_i, b_i, α_i) was fitted to this edge, where i denotes "initial". The centre c_i corresponded only approximately to the centre of the projected circle when the ellipse axes ratio a_i/b_i was not equal to one.
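A minimal Python/OpenCV sketch of this step is given below; the fixed threshold value and minimum blob area are assumptions, since the paper only states that a gray-value threshold was used.

```python
# Sketch of the initial ellipse fit (Pi_i) to each dark circular blob.
# Threshold value and minimum area are assumptions, not the authors' settings.
import cv2

def fit_initial_ellipses(gray, threshold=128, min_area=20):
    """Return a list of ((cx, cy), (axis1, axis2), angle) ellipses fitted
    to the black-on-white circle edges found in an 8-bit grayscale image."""
    # Binarize: the black markers become foreground
    _, bw = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY_INV)
    # The contour of each blob approximates the black/white edge
    contours, _ = cv2.findContours(bw, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    ellipses = []
    for c in contours:
        if len(c) >= 5 and cv2.contourArea(c) >= min_area:
            ellipses.append(cv2.fitEllipse(c))   # least-squares ellipse fit
    return ellipses
```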

Consider the cone with the base Π_i on the image plane and the apex in the perspective projection centre O, and let v be the unit vector of the bisecting line of this cone from O to Π_i. The cone was rotated about O until its bisecting line coincided with the image Z-axis. The rotation matrix R(n, β) described this rotation, where n was the unit vector derived from the cross product of v with the image Z-axis and β was the angle between them. The intersection of the rotated cone with the image XY-plane was an ellipse Π(c, a, b, α) (Figure 2). If a = b, the ellipse Π was a circle and the normal vector of the circle-plane coincided with the image Z-axis; the angle γ between the circle-plane and the image XY-plane was in this case equal to zero.

Figure 2

Schematic diagram of the circle centre determination. Elliptical projection Π of the real imaged circle K_r and the imaginary circle K_i on the XY-image plane with corresponding centres c_r and c_i; |AB| = 2a, |CD| = 2b, ∠ABA′ = ∠BAB′ = γ; the distance between the projection centre O and the XY-plane is equal to d. The bisecting line of the cone with base Π on the XY-plane and apex in the perspective projection centre O coincides with the Z-axis. See the text for explanation.

In the case a < b,

$$\sin \gamma = \pm \sqrt{\frac{1 - a^{2}/b^{2}}{1 + a^{2}/d^{2}}},$$
(1)

where d was the distance from O to the image XY-plane. Rotating the minor axis a about the long ellipse axis b by ± γ yielded two vectors which were used to find the two sections AB′ and BA′, whose midpoints lay on the rays passing through O and the centres of the real imaged (c_r) and imaginary (c_i) circles (Figure 2). The reverse rotation with the transposed matrix R^t(n, β) yielded the searched rays passing through O and the centres of the real imaged and the imaginary circles. The selection criterion for the fiducial marker was that the four real imaged circles were coplanar whereas their imaginary circles were not. Furthermore, during reconstruction of the 3D marker position from the left and right images, two normal vectors were found for each imaged circle (n_r and n_i, Figure 2, where r denotes "real" and i "imaginary"). The normal vectors of the real imaged circle from the left (l) image, n_r^l, and from the right (r) image, n_r^r, coincided. The normal vectors of the imaginary circles, n_i^l and n_i^r, did not coincide.
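The two geometric building blocks of this step, the angle γ from equation (1) and the axis-angle rotation R(n, β), can be sketched as follows in Python/NumPy; this is an illustration of the formulas above, not the authors' MATLAB implementation.

```python
import numpy as np

def circle_plane_angle(a, b, d):
    """Eq. (1): angle gamma between the circle-plane and the image XY-plane
    for an ellipse with semi-axes a < b observed at principal distance d.
    The +/- ambiguity yields the 'real' and 'imaginary' candidate circles."""
    s = np.sqrt((1.0 - (a / b) ** 2) / (1.0 + (a / d) ** 2))
    return np.arcsin(s)

def rotation_matrix(n, beta):
    """Rodrigues formula for the rotation R(n, beta) about the unit axis n."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    K = np.array([[0.0, -n[2], n[1]],
                  [n[2], 0.0, -n[0]],
                  [-n[1], n[0], 0.0]])
    return np.eye(3) + np.sin(beta) * K + (1.0 - np.cos(beta)) * (K @ K)
```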

Determination of the 3D position and scaling factor of the fiducial marker

To calculate the 3D position of the fiducial marker relative to the camera, the perspective projections of the four black circle centres of the fiducial marker (r_i in Eq. 2, 1 ≤ i ≤ 4), referred to from now on as the fiducial quadrangle, had to be converted into orthogonal projections on the same image-plane (p_i in Eq. 3). It was assumed that the image-plane passed through the cross point of the two diagonals of the fiducial quadrangle.

r_i = E + t_i e_i,
(2)
p_i = E + h_i e_i,
(3)

where E is the principal point of the image, 1 ≤ i ≤ 4, e_i is the unit vector, and t_i and h_i are the lengths of the corresponding vectors. The four orthogonal projection points of the fiducial quadrangle vertices (p_i) lay on the rays from E to r_i. The cross point divides the diagonals of the fiducial quadrangle in known length ratios, which were used to calculate the p_i.

Let M_0(x_{0i}, y_{0i}, z_{0i}) denote the known coordinates of the fiducial quadrangle with the coordinate origin in the cross point of the diagonals. The orthogonal projection of M_0 onto the image-plane can be described as M_1(x_{1i}, y_{1i}, z_{1i}):

M_1 = k R M_0 + t,
(4)

where k is a scaling factor, R is a rotation matrix and t(x_t, y_t, z_t) is a translation vector. The z-components of M_1 were unknown. Therefore equation (4) was reduced to the known x and y components, and the following equation was derived:

s = A \ q ,
(5)

where

$$A = \begin{bmatrix} x_{01} & y_{01} & 0 & 0 \\ 0 & 0 & x_{01} & y_{01} \\ \vdots & \vdots & \vdots & \vdots \\ x_{04} & y_{04} & 0 & 0 \\ 0 & 0 & x_{04} & y_{04} \end{bmatrix}, \qquad q = \begin{bmatrix} x_{11} - x_t \\ y_{11} - y_t \\ \vdots \\ x_{14} - x_t \\ y_{14} - y_t \end{bmatrix},$$

and "\" is the backslash or matrix left division operator. If A were a square matrix, A\q would be roughly the same as A^{-1}q. In our case, however, A is an m-by-n matrix (m = 8, n = 4) with m ≠ n and q is a column vector with m components, so s = A\q is the solution in the least squares sense of the over-determined system of equations As = q.
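As a sketch of this step in Python/NumPy (MATLAB's A\q corresponds to a least-squares solve), the fragment below builds A and q from the four fiducial points and returns the solution vector s rearranged as the 2 × 2 matrix S used in Eq. (6); the argument names are placeholders.

```python
import numpy as np

def solve_fiducial_ls(M0_xy, M1_xy, t_xy):
    """Least-squares solution of Eq. (5).
    M0_xy : (4, 2) known x,y fiducial coordinates (origin at the diagonal cross point)
    M1_xy : (4, 2) x,y coordinates of the orthogonal projections on the image-plane
    t_xy  : (2,)   x,y components of the translation vector t
    Returns the 2 x 2 matrix S obtained by rearranging the solution vector s."""
    A = np.zeros((8, 4))
    q = np.zeros(8)
    for i, ((x0, y0), (x1, y1)) in enumerate(zip(M0_xy, M1_xy)):
        A[2 * i]     = [x0, y0, 0.0, 0.0]
        A[2 * i + 1] = [0.0, 0.0, x0, y0]
        q[2 * i]     = x1 - t_xy[0]
        q[2 * i + 1] = y1 - t_xy[1]
    s, *_ = np.linalg.lstsq(A, q, rcond=None)   # equivalent of s = A \ q
    return s.reshape(2, 2)
```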

In order to calculate the scaling factor k, the elements of the vector s were rearranged into a 2 by 2 matrix S. After the singular value decomposition (svd) [15, 24] of matrix S the largest singular value of S was equal to the scaling factor k:

k = max(svd(S))
(6)

Moreover, the four elements of the matrix S divided by the factor k form the upper-left 2 × 2 block of the matrix R:

$$R = \begin{bmatrix} c_{11} & c_{12} & c_{13} \\ c_{21} & c_{22} & c_{23} \\ c_{31} & c_{32} & c_{33} \end{bmatrix}, \quad \text{where} \quad \frac{S}{k} = \begin{bmatrix} c_{11} & c_{12} \\ c_{21} & c_{22} \end{bmatrix}.$$

In order to reconstruct the remaining elements of the matrix R, the property of an orthonormal matrix was used, namely that the sum of the squared elements of each column or row is equal to one. The signs in the third column and row of R were reconstructed by means of a further property, namely that the determinant of R is equal to one. After this, only two right-handed orthogonal matrices remained, corresponding to the two possible 3D positions of the fiducial quadrangle. In order to choose the matrix R describing the real imaged position of the fiducial quadrangle, the property of the perspective projection of converging to the perspective projection centre O(0, 0, 0) was used: the lines passing through the four points of the fiducial marker p_i(x_{pi}, y_{pi}, z_{pi}) and their perspective projections r_i(x_{ri}, y_{ri}, d) had to converge to O, where d was the distance from O to the image XY-plane and the coordinates of p_i were calculated as in equation (4).
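The scaling factor of Eq. (6) and the completion of R from its upper-left block can be sketched as follows; the sign search over the orthonormality and determinant constraints mirrors the description above and, in the generic case, leaves the two candidate rotations mentioned in the text. This is an illustrative Python/NumPy sketch, not the authors' implementation.

```python
import numpy as np
from itertools import product

def scale_and_candidate_rotations(S, tol=1e-6):
    """Eq. (6) plus reconstruction of R: k is the largest singular value of S,
    S / k gives the upper-left 2 x 2 block of R, and the remaining elements
    follow from the orthonormality constraints (unit rows/columns, det R = 1).
    All sign combinations are tested; the valid right-handed rotations remain."""
    k = np.linalg.svd(S, compute_uv=False).max()
    c11, c12, c21, c22 = (S / k).ravel()
    c13 = np.sqrt(max(0.0, 1.0 - c11**2 - c12**2))
    c23 = np.sqrt(max(0.0, 1.0 - c21**2 - c22**2))
    c31 = np.sqrt(max(0.0, 1.0 - c11**2 - c21**2))
    c32 = np.sqrt(max(0.0, 1.0 - c12**2 - c22**2))
    c33 = np.sqrt(max(0.0, 1.0 - c13**2 - c23**2))
    candidates = []
    for s13, s23, s31, s32, s33 in product((1.0, -1.0), repeat=5):
        R = np.array([[c11, c12, s13 * c13],
                      [c21, c22, s23 * c23],
                      [s31 * c31, s32 * c32, s33 * c33]])
        if (np.allclose(R @ R.T, np.eye(3), atol=tol)
                and np.isclose(np.linalg.det(R), 1.0, atol=tol)):
            candidates.append(R)
    return k, candidates
```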

3D reconstruction of markers

The 3D position of the fiducial marker with respect to the camera coordinate system was determined by means of the rotation matrix R described above, the scaling factor k and the translation vector t. The two cameras were used to simultaneously acquire two images of the same scene. After rigid body transformation of O_j^c and of the perspective projections of the markers m_{jn}^c into the fiducial marker coordinate system, the 3D positions of the markers with respect to the fiducial marker were reconstructed. The single 3D marker position m_n^f was estimated as the intersection point of the two rays from the perspective projection centres O_j^f to the corresponding marker perspective projections m_{jn}^f [25], where 1 ≤ j ≤ 2, n was an integer between 1 and the number of markers, c denoted the camera coordinate system and f the fiducial marker coordinate system.
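Because two skew rays rarely intersect exactly, a common way to estimate the intersection point is the midpoint of the shortest segment between them. The following Python/NumPy sketch illustrates such a ray "intersection"; the numerical details are an assumption here, since the paper delegates them to [25].

```python
import numpy as np

def intersect_rays(o1, p1, o2, p2):
    """Estimate the 3D marker position from two rays o_j -> p_j (both given in
    the fiducial marker coordinate system) as the midpoint of the shortest
    segment between the rays."""
    o1, p1, o2, p2 = (np.asarray(v, dtype=float) for v in (o1, p1, o2, p2))
    d1 = (p1 - o1) / np.linalg.norm(p1 - o1)   # unit direction of ray 1
    d2 = (p2 - o2) / np.linalg.norm(p2 - o2)   # unit direction of ray 2
    # Minimise |o1 + s*d1 - (o2 + t*d2)| with respect to s and t
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    s, t = np.linalg.solve(A, b)               # fails only for parallel rays
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))
```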

Evaluation experiments

Experiment 1: Inter-marker distance measurement

In order to evaluate the accuracy (agreement between the measured and reference values) and precision (closeness of measurement values to each other under similar experimental conditions [26]) of the presented measurement system, an 8 × 8 grid of black circles was printed on a white surface and adhered to a 10 × 10 cm plate. The test distance between adjacent circle centres in the horizontal and vertical directions amounted to 10 mm. Two grids with circles of 1 and 2 mm diameter were prepared to perform the following tests:

Test A - Translating Camera

The plate with the grid of circles was initially captured in close proximity to and above the fiducial marker. The Z_obj axis of the test object was perpendicular to the plate surface and parallel to the pin between the cameras; the X_obj axis was directed horizontally and the Y_obj axis vertically (Figure 3). The cameras were then moved away from the circle grid seven times in steps of approximately 1 cm along the Z_obj axis, and in each camera position the scene was captured. The test was repeated five times.

Figure 3

Top view of the camera setup. The Y_obj axis is directed towards the reader.

Test B - Rotating Plate

The grid of circles was captured eight times after the plate was rotated about the X_obj axis. The rotation angles were approximately -60, -45, -30, -15, 15, 30, 45 and 60°. The test was repeated five times.

Test C - Rotating Cameras

As in test B, but here the cameras were rotated about the Y_obj axis. The rotation angles were approximately -60, -45, -30, -15, 15, 30, 45 and 60°. The test was repeated five times.

Precision test of image processing

The image processing precision was determined by examining the variation of the 112 measured distances between adjacent circle centres in the horizontal and vertical directions during image processing. The 8 × 8 grid of circles was captured in five different positions, and each of the five image pairs was then processed 10 times. The captured positions were: one from test A (the initial position), two from test B (-30 and 30° rotation about X_obj) and two from test C (-30 and 30° rotation about Y_obj). The variation in distances was calculated by means of the Root Mean Squares error as shown in the following equation:

$$\varepsilon = \sqrt{\frac{1}{l \cdot m \cdot n} \sum_{k=1}^{l=5} \; \sum_{i=1}^{m=112} \; \sum_{j=1}^{n=10} \left( d_{kij} - \bar{d}_{ki} \right)^{2}},$$
(7)

where d̄_{ki} is the mean value of the i-th inter-marker distance on the grid of circles in the k-th position, measured 10 times (d_{kij}).
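Equation (7) can be evaluated directly from the stack of repeated distance measurements; a short NumPy sketch (variable names are placeholders) follows.

```python
import numpy as np

def rms_precision(d):
    """Eq. (7): root-mean-square deviation of repeated distance measurements.
    d has shape (l, m, n): l grid positions, m inter-marker distances,
    n repeated processing runs of the same image pair."""
    d = np.asarray(d, dtype=float)
    d_mean = d.mean(axis=2, keepdims=True)       # d_bar_ki over the n repetitions
    return float(np.sqrt(np.mean((d - d_mean) ** 2)))

# Example with the dimensions used in the paper: 5 positions x 112 distances x 10 runs
# eps = rms_precision(distances)   # distances.shape == (5, 112, 10)
```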

System repeatability test

In this test the influence of rebooting the system on the 112 measured distances between adjacent circle centres in the horizontal and vertical directions was examined. A single position of the grid of circles from test A (the initial position) was captured 10 times. After each acquisition the measurement system was shut down and started again. Each time, the zoom and focus had to be adjusted to the default values used during the pre-calibration. The variation in distances was calculated similarly to the image processing precision test by means of the Root Mean Squares error (7).

Precision test on cadaveric specimen

In order to test how the background colours of the connective tissue influence the precision of marker detection, a human foot specimen was used. The foot was fresh frozen and stored in a plastic bag at -20°C. The specimen (left, female, 79 years old, without degenerative changes of the connective tissue) was thawed at room temperature for 24 hours. The first metatarsal bone was then freed of the surrounding tissue, proximally osteotomized and fixed with a locking screw plate. Nine markers were attached with acrylic superglue to the parts of the first metatarsal bone. The markers were 2 mm black circles on a white, water-resistant background (0.5 mm thick pieces of approximately 4 mm diameter, Figure 4).

Figure 4

Marker test on the cadaveric foot specimen. Picture from the right camera only. The nine markers were attached to the fragments of the first metatarsal bone, which were fixed with a locking screw plate after a proximal osteotomy. The fiducial marker was placed in the middle of the lower part of the image.

Five different pictures of the specimen were acquired by means of the two cameras and each was then processed 10 times. The variation in all possible distances between the nine markers (m = C(9,2) = 36) was calculated similarly to the image processing precision test by means of the Root Mean Squares error (7).

Experiment 2: marker displacement measurement

In this experiment two plastic plates were used to determine the translational and rotational accuracy and precision when one plate was moved with respect to a second, static plate. Each plate was equipped with four black circles (mounting accuracy 0.005 mm). The circles were located at the corners of a square with a side length of 15 mm. Two plate types were manufactured, with circle diameters of 1 and 2 mm.

Rotation test

In this test the Z_obj axis of the stationary plate was perpendicular to the plate surface and parallel to the pin between the cameras. In the zero position the non-stationary plate was coplanar with the stationary plate and the circles on the corresponding square sides were collinear. The non-stationary plate was rotated about the X_obj, Y_obj and Z_obj axes by means of a headpiece on a milling unit (Deckel Maho Pfronten GmbH, Germany) used for precise rotation in steps of 2.25 ± 0.025° through a range of 90° (± 45° from the zero position).

The magnitude of rotation between the stationary and non-stationary plates was calculated by determining the rotation matrix describing the rotation between the plates [27]. From the rotation matrix, the attitude vector ā = n̄α [28] was calculated, where n̄ is the unit vector about which the scalar rotation α occurs. The attitude vector was then orthogonally decomposed onto the X_obj, Y_obj and Z_obj axes [29] of the stationary plate.
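A compact NumPy sketch of this decomposition, assuming the rotation matrix between the plates has already been estimated (e.g. with the marker-based fit of [27]), is given below.

```python
import numpy as np

def attitude_vector(R):
    """Attitude vector a = n * alpha of a rotation matrix R [28]:
    alpha is the rotation angle, n the unit axis (valid for alpha < pi)."""
    alpha = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(alpha, 0.0):
        return np.zeros(3)
    n = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(alpha))
    return alpha * n

def decompose_on_axes(a, x_obj, y_obj, z_obj):
    """Orthogonal decomposition of the attitude vector onto the axes of the
    stationary plate (axes given as unit vectors)."""
    return np.array([a @ x_obj, a @ y_obj, a @ z_obj])
```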

Translation test

In order to determine the accuracy and precision of the translational measurements, the non-stationary plate was translated from the zero position along each of the X_obj, Y_obj and Z_obj axes by 0.5, 1 and 5 mm. The translations were performed by means of a translational manipulator (ThorLabs Inc. Europe, Karlsfeld, Germany) whose accuracy (0.005 mm) and precision (± 0.002 mm) were characterized using laser interferometry in a previous study [30]. The test was repeated five times.

Statistical analysis

The error value was calculated as the difference between the reference value and the measured value. All collected error values were examined with the Jarque-Bera test to verify whether the data were normally distributed. According to this test, many of the error value sets were significantly (p < 0.05) non-normally distributed. Therefore, non-parametric statistical methods were applied. The accuracy of the presented measurement system was represented by the mean and median error [31]. The precision was calculated as a standard deviation [31] or as the Root Mean Squares error (7).

The median error was reported because of the applied non-parametric methods. The comparison of medians was performed by means of the Kruskal-Wallis test. The variance of the error values was compared by means of Levene's test. All statistics were performed using MATLAB (The MathWorks Inc., Natick, MA, USA). The thresholds for significant difference and significant sameness [32] were set at p < 0.05 and p > 0.95, respectively.
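For readers without access to the MATLAB statistics routines, the same three tests are available in SciPy; the sketch below uses placeholder data, not study results.

```python
import numpy as np
from scipy import stats

# Placeholder error samples for two marker sizes (illustration only)
rng = np.random.default_rng(0)
errors_1mm = rng.normal(0.0, 0.10, size=200)
errors_2mm = rng.normal(0.0, 0.10, size=200)

jb_stat, jb_p = stats.jarque_bera(errors_1mm)            # normality of the errors
kw_stat, kw_p = stats.kruskal(errors_1mm, errors_2mm)    # compare medians
lv_stat, lv_p = stats.levene(errors_1mm, errors_2mm)     # compare variances
print(f"Jarque-Bera p={jb_p:.3f}, Kruskal-Wallis p={kw_p:.3f}, Levene p={lv_p:.3f}")
```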

Results

Experiment 1: detection of inter-marker distance

Tests A, B and C - 10 mm distance detection

The accuracy and precision results in this part of the experiment were excellent for both marker sizes (Table 1). The total accuracy and precision (mean (median) ± standard deviation) amounted to 0.009 (-0.002) ± 0.1002 mm for 1 mm markers and 0.008 (-0.004) ± 0.1003 mm for 2 mm markers. Levene's test showed that the variances for 1 mm and 2 mm markers during tests A, B and C were significantly the same (p = 0.956). For both marker sizes, the variance of the distance detection errors occurring during test B was significantly higher (p < 0.001 for 1 mm and p = 0.002 for 2 mm markers) than during tests A and C. Kruskal-Wallis tests showed that the medians for 2 mm markers differed significantly (p < 0.001) between tests A, B and C. The medians for 1 mm markers differed significantly (p < 0.001) between tests A and B; for the other comparisons the p values were just above the significance level (0.05 < p < 0.08).

Table 1 Detection of the 10 mm inter marker distance

Image processing precision and repeatability tests

The image processing precision test revealed a Root Mean Squares error for 1 and 2 mm markers of 0.0044 and 0.0051 mm, respectively. Levene's multiple-sample test showed a significantly (p < 0.001) higher variance of the mean distance deviation for 2 mm markers in comparison to 1 mm markers.

The system repeatability test yielded a Root Mean Squares error for both marker sizes of 0.013 mm. Levene's test showed that the variances of the distance detection errors for both marker sizes were significantly the same (p = 0.98).

The image processing precision test on the foot specimen showed a Root Mean Squares error of 0.0035 mm.

Experiment 2: measurement of displacement

Rotation test

The accuracy and precision results in this part of the experiment were very good for both marker sizes (Table 2). The total accuracy and precision (mean (median) ± standard deviation) amounted to 0.054 (0.027) ± 0.190° for 1 mm markers and 0.062 (0.028) ± 0.154° for 2 mm markers. According to Levene's test, no significant difference in variance (p = 0.259) between 1 and 2 mm markers was observed. According to the Kruskal-Wallis test, the accuracy of the rotation about the Z_obj axis was significantly better than about the X_obj and Y_obj axes, with p = 0.004 and p < 0.001 for 1 and 2 mm markers, respectively. Levene's test also showed better precision for the rotation about the Z_obj axis, with p = 0.012 and p < 0.001 for 1 and 2 mm markers, respectively.

Table 2 Rotation test

Translation test

The stereophotogrammetrical system delivered very good results for translational accuracy and precision (Table 3). The total accuracy and precision (mean (median) ± standard deviation) across the translation test were 0.008 (0.005) ± 0.053 mm for 1 mm markers and -0.001 (-0.002) ± 0.038 mm for 2 mm markers. Levene's test revealed a significantly (p = 0.002) higher variance of the measurement errors with 1 mm markers than with 2 mm markers. The variance of the 5 mm translation errors for both marker sizes was significantly higher (p = 0.0004) than the error variance of the smaller 0.5 and 1 mm translations. The error variance of translations along the Z_obj axis was significantly higher (p < 0.001) than along the X_obj and Y_obj axes.

Table 3 Translation test

Discussion

This study demonstrated that a 3D stereophotogrammetrical system based on the tracking of flat round markers can accurately measure distances and movements within a small measurement volume. The algorithms to detect the centre of the projected circle and to estimate the camera pose using the fiducial marker were presented. The evaluation tests were designed to assess the measurement accuracy and precision for the distances and movements expected during the tracking of markers fixed on small bones or ligaments. Although the presented system processes only static images, it costs a very small fraction of comparable systems based on proprietary, vendor-specific hardware. Future goals include enabling the presented measurement system to extract the tracked markers from a video stream.

The developed algorithm for detecting the centre of the projected circle functioned with a single projected circle, whereas other algorithms require, for example, concentric circles [33] or coplanar circles [18, 20, 21] for the detection. On the other hand, the presented algorithm required the principal point and principal distance to be known from the pre-calibration. The quality of the pre-calibration could therefore play a decisive role in the accuracy and precision [34] of the presented measurement system when the measurement conditions were optimal. This was particularly important during the camera pose estimation using the fiducial marker, where the pose estimation was optimized by the direct transformation of the fiducial marker perspective projection into its orthogonal projection, the least squares algorithm used to solve equation 5, and the reconstruction of the rotation matrix using the orthogonal matrix constraints.

The remaining inaccuracies of the camera pre-calibration could explain the reduction in precision when larger distances were measured. This is corroborated by the image processing precision (± 0.005 mm), which was better than the precision of ± 0.011 to ± 0.11 mm obtained when distances and movements were measured. The system repeatability (± 0.013 mm) also influenced the precision of the distance and movement measurements because of small discrepancies occurring while the zoom and focus values were readjusted. Lujan et al. also reported that "larger translations/rotations reduced kinematic accuracy" [5].

In the presented study, the precision was calculated as the standard deviation or as the Root Mean Squares error (equation 7) because these quantities are calculated in a similar way and are therefore more directly comparable than the mean standard deviation [6] or two standard deviations [5], which were used instead of the Root Mean Squares error in the related studies [5, 6].

The accuracy values were very close to zero and ranged from -0.045 to 0.052 mm for translations and from -0.008 to 0.105° for rotations. The presented system showed accuracy comparable to that reported for similar displacement measurements in the study of Lujan et al. (0.034 mm for translations and 0.132° for rotations) [5] and in the study of Everaert et al., where the accuracy ranged from 0 to 0.05 mm [6].

The measurement with the flat round markers was limited by the angle λ between the normal vector to the marker plane and the camera viewing axis (0° ≤ λ < 90°). When this angle came too close to 90°, it became difficult to fit an ellipse to the projected circle because the ellipse became too slim. Therefore, the axes ratio of the ellipse could serve as a criterion for the critical value of the angle λ. This axes ratio criterion was set at 0.2 (corresponding approximately to λ = 78.5°). Accordingly, the angle range between the Z_obj axis and the pin with the fiducial marker, which was mounted between the cameras, was set at ± 60° (Figures 1 and 3, tests B and C).

It has been stated that: "The accuracy of the target location deteriorates if the number of edge pixels compared to central pixels increases, because of the uncertain grey values of the edge" [19]. This phenomenon occurs when the marker size becomes smaller [5] and, for the flat round markers, when the angle λ becomes bigger. Owing to the use of zoom and the good camera resolution, markers of 1 and 2 mm diameter showed significant agreement in precision in tests A, B and C. "Zoom lenses are used extensively in computer vision to overcome the limited resolution" [35]. Both cameras of the presented system were calibrated for fixed zoom and focus settings because Wiley and Wong noted in their study that: "There were significant changes in the distortion characteristics with changes in the focal setting. However, the pattern of change for a given camera-lens combination was very systematic and stable over time" [35], which was confirmed by the good precision (± 0.013 mm) of the repeatability test.

The black circle on a white background, a colour combination advantageous for edge detection, remained usable during the test on the cadaveric foot specimen. The colours of the connective tissues surrounding the markers therefore did not deteriorate the precision of the presented measurement system. Nevertheless, the use of flat round retroreflective markers might be even more advantageous because it would allow the marker size to be reduced.

The measurement of the displacements in the second experiment showed the distribution of measurement errors typical for this camera setup [5, 31]. In line with this distribution, the presented measurement system performed translation measurements best in the XY_obj-plane and rotation measurements best when the rotations occurred about the Z_obj axis.

Conclusions

The study demonstrated that the compact 3D stereophotogrammetrical system, based on tracking flat round markers within a small measurement volume with respect to the fiducial marker, can accurately measure distances and movements. The evaluation experiments of the 10 mm distance measurement showed a total accuracy of 0.0086 mm (mean error) and a precision of ± 0.1002 mm (standard deviation). Translations from 0.5 mm to 5 mm were measured with a total accuracy of 0.0038 mm and a precision of ± 0.0461 mm. Rotations in steps of 2.25° were measured with a total accuracy of 0.058° and a precision of ± 0.172°. These levels of accuracy and precision may provide opportunities for new applications of stereophotogrammetrical analysis in orthopaedic research projects focusing on small displacements within a small measurement volume.

References

  1. Bogue R: Three-dimensional measurements: A review of technologies and applications. Sensor Review 2010, 30: 102–106. 10.1108/02602281011022670

  2. Malicky DM, Kuhn JE, Frisancho JC, Lindholm SR, Raz JA, Soslowsky LJ: Neer Award 2001: nonrecoverable strain fields of the anteroinferior glenohumeral capsule under subluxation. J Shoulder Elbow Surg 2002, 11: 529–540. 10.1067/mse.2002.127093

  3. Phatak NS, Sun Q, Kim SE, Parker DL, Kent Sanders R, Veress AI, Ellis BJ, Weiss JA: Noninvasive determination of ligament strain with deformable image registration. Annals of Biomedical Engineering 2007, 35: 1175–1187. 10.1007/s10439-007-9287-9

  4. Malicky DM, Soslowsky LJ, Kuhn JE, Bey MJ, Mouro CM, Raz JA, Liu CA: Total strain fields of the antero-inferior shoulder capsule under subluxation: a stereoradiogrammetric study. J Biomech Eng 2001, 123: 425–431. 10.1115/1.1394197

  5. Lujan TJ, Lake SP, Plaizier TA, Ellis BJ, Weiss JA: Simultaneous measurement of three-dimensional joint kinematics and ligament strains with optical methods. Journal of Biomechanical Engineering 2005, 127: 193–197. 10.1115/1.1835365

  6. Everaert DG, Spaepen AJ, Wouters MJ, Stappaerts KH, Oostendorp RA: Measuring small linear displacements with a three-dimensional video motion analysis system: determining its accuracy and precision. Arch Phys Med Rehabil 1999, 80: 1082–1089. 10.1016/S0003-9993(99)90065-5

  7. Ehara Y, Fujimoto H, Miyazaki S, Tanaka S, Yamamoto S: Comparison of the performance of 3D camera systems. Gait and Posture 1995, 3: 166–169. 10.1016/0966-6362(95)99067-U

  8. Mössner M, Kaps P, Nachbauer W: A method for obtaining 3-D data in alpine skiing using pan-and-tilt cameras with zoom lenses. ASTM Special Technical Publication 1996, 1266: 155–164.

  9. Chen L: An investigation on the accuracy of three-dimensional space reconstruction using the direct linear transformation technique. Journal of Biomechanics 1994, 27: 493–500. 10.1016/0021-9290(94)90024-8

  10. Abdel-Aziz YI, Karara HM: Direct Linear Transformation from Comparator Coordinates into Object Space Coordinates in Close-Range Photogrammetry. ASP Symposium on Close Range Photogrammetry 1971, 1–18.

  11. Hatze H: High-precision three-dimensional photogrammetric calibration and object space reconstruction using a modified DLT-approach. J Biomech 1988, 21: 533–538. 10.1016/0021-9290(88)90216-3

  12. Haralick BM, Lee CN, Ottenberg K, Nölle M: Review and analysis of solutions of the three point perspective pose estimation problem. International Journal of Computer Vision 1994, 13: 331–356. 10.1007/BF02028352

  13. Ansar A, Daniilidis K: Linear pose estimation from points or lines. IEEE Transactions on Pattern Analysis and Machine Intelligence 2003, 25: 578–589. 10.1109/TPAMI.2003.1195992

  14. Lepetit V, Moreno-Noguer F, Fua P: EPnP: An accurate O(n) solution to the PnP problem. International Journal of Computer Vision 2009, 81: 155–166. 10.1007/s11263-008-0152-6

  15. Lu CP, Hager GD, Mjolsness E: Fast and globally convergent pose estimation from video images. IEEE Transactions on Pattern Analysis and Machine Intelligence 2000, 22: 610–622. 10.1109/34.862199

  16. Fiala M: Designing highly reliable fiducial markers. IEEE Transactions on Pattern Analysis and Machine Intelligence 2010, 32: 1317–1324. 10.1109/TPAMI.2009.146

  17. Triggs B: Camera pose and calibration from 4 or 5 known 3D points. In Proceedings of the Seventh IEEE International Conference on Computer Vision (ICCV). Kerkyra, Greece; 1999:278–284.

  18. Liu Q, Su H: Correction of the asymmetrical circular projection in DLT camera calibration. Sanya, Hainan; 2008:344–348.

  19. Trinder JC, Jansa J, Huang Y: An assessment of the precision and accuracy of methods of digital target location. ISPRS Journal of Photogrammetry and Remote Sensing 1995, 50: 12–20. 10.1016/0924-2716(95)98211-H

  20. Ahn SJ, Warnecke HJ, Kotowski R: Systematic geometric image measurement errors of circular object targets: Mathematical formulation and correction. Photogrammetric Record 1999, 16: 485–502. 10.1111/0031-868X.00138

  21. Heikkilä J: Geometric camera calibration using circular control points. IEEE Transactions on Pattern Analysis and Machine Intelligence 2000, 22: 1066–1077.

  22. Bobrowitsch E, Imhauser C, Graichen H, Durselen L: Evaluation of a 3D object registration method for analysis of humeral kinematics. J Biomech 2007, 40: 511–518. 10.1016/j.jbiomech.2006.02.016

  23. Camera calibration toolbox for matlab [http://www.vision.caltech.edu/bouguetj/calib_doc/]

  24. Horn BKP, Hilden HM, Negahdaripour S: Closed-form solution of absolute orientation using orthonormal matrices. J Opt Soc Am A 1988, 5: 1127–1135. 10.1364/JOSAA.5.001127

  25. Köhler M: Vision Based Remote Control in Intelligent Home Environments. In 3D Image Analysis and Synthesis'96. Edited by: Girod B, Niemann H, Seidel HP. University of Erlangen-Nürnberg/Germany: Inx-Verlag; 1996:147–154.

  26. Maletsky LP, Sun J, Morton NA: Accuracy of an optical active-marker system to track the relative motion of rigid bodies. J Biomech 2007, 40: 682–685. 10.1016/j.jbiomech.2006.01.017

  27. Soderkvist I, Wedin PA: Determining the movements of the skeleton using well-configured markers. J Biomech 1993, 26: 1473–1477. 10.1016/0021-9290(93)90098-Y

  28. Spoor CW, Veldpaus FE: Rigid body motion calculated from spatial co-ordinates of markers. J Biomech 1980, 13: 391–393. 10.1016/0021-9290(80)90020-2

  29. Woltring HJ: 3-D attitude representation of human joints: a standardization proposal. J Biomech 1994, 27: 1399–1414. 10.1016/0021-9290(94)90191-0

  30. Seehaus F, Emmerich J, Kaptein BL, Windhagen H, Hurschler C: Experimental analysis of Model-Based Roentgen Stereophotogrammetric Analysis (MBRSA) on four typical prosthesis components. J Biomech Eng 2009, 131: 041004. 10.1115/1.3072892

  31. Wiles A, Thompson D, Frantz D: Accuracy assessment and interpretation for optical tracking systems. In Medical Imaging: Visualization, Image-Guided Procedures, and Display Proceedings of the SPIE Edited by: Galloway, Robert L Jr. 2004, 5367: 421–432.

  32. Lindsay RM: Reconsidering the status of tests of significance: An alternative criterion of adequacy. Accounting, Organizations and Society 1995, 20: 35–53. 10.1016/0361-3682(93)E0004-Z

  33. Kim JS, Gurdjos P, Kweon IS: Geometric and algebraic constraints of projected concentric circles and their applications to camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence 2005, 27: 637–642. 10.1109/TPAMI.2005.80

  34. Xiaopeng L, Faig W: Digital camera calibration: a summary of current methods. Geomatics Info Magazine 1997, 11: 40–41.

  35. Wiley AG, Wong KW: Geometric calibration of zoom lenses for computer vision metrology. Photogrammetric Engineering & Remote Sensing 1995, 61: 69–74.


Acknowledgements

We are grateful to the German Research Foundation for the financial support (DFG-HU 873_2-1), and to Joerg Viering, Michael Breyvogel and their colleagues from the machine shop for technical support. We thank Christopher Müller, who helped with the design of the figures.

Author information

Corresponding author

Correspondence to Evgenij Bobrowitsch.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

EB developed the measurement system design and the algorithms, wrote the manuscript, carried out the evaluation study and analyzed the data. GO and CP were involved in the drafting and revision of the manuscript. CH contributed to the design of the evaluation tests and was involved in the drafting and revision of the manuscript. HW contributed to the design and performance of the evaluation tests. HA was involved in the study design and the performance of the evaluation tests. CS was involved in the study design. All authors read and approved the final version of the manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

About this article

Cite this article

Bobrowitsch, E., Hurschler, C., Olender, G. et al. Digital stereophotogrammetry based on circular markers and zooming cameras: evaluation of a method for 3D analysis of small motions in orthopaedic research. BioMed Eng OnLine 10, 12 (2011). https://doi.org/10.1186/1475-925X-10-12
