3-D shape measurement of complex objects by combining photogrammetry and fringe projection

Carsten Reich, Reinhold Ritter, and Jan Thesing
Technical University of Braunschweig, Institute of Metrology and Experimental Mechanics, Schleinitzstraße 20, D-38106 Braunschweig, Germany
E-mail: [email protected]

Abstract. A new concept for 3-D shape measurement of complex objects is proposed, which is based on photogrammetry and fringe projection. The shape of a partial view is determined by a combination of the phase-shift method for fringe evaluation and a photogrammetric triangulation to calculate the 3-D coordinates related to the sensor coordinate system. For the measurement of complex objects, partial views are taken from different sensor positions. The problem of matching the partial views into each other is solved by transforming each sensor position and the related point cloud of the shape into a global coordinate system by photogrammetric matching of reference targets. © 2000 Society of Photo-Optical Instrumentation Engineers. [S0091-3286(00)02701-X]

Subject terms: 3-D shape measurement, fringe projection, phase shifting, photogrammetry, transformation of point clouds. Paper SM-27 received May 5, 1999; revised manuscript received Aug. 10, 1999; accepted for publication Aug. 10, 1999.

1 Introduction

Optical 3-D shape measurement systems are gaining importance in industrial applications. Sensor systems based on triangulation and structured light provide a noncontact and fast measurement compared to conventional coordinate measurement machines. The light-stripe technique in particular is highly developed.1,2 Besides devices with moving components like line scanners, fringe projection sensors based on the phase-shift method are widespread in surface inspection, quality control, and reverse engineering. The phase-shift method provides fast acquisition of dense point clouds. In general, systems with one camera and one projecting unit are used to acquire whole-field depth information on the inspected surface.3 Shape measurement of complex objects requires registration from different views, resulting in point clouds related to different coordinate systems. The precise transformation of each sensor coordinate system into a common global coordinate system is necessary to obtain good accuracy of the whole measurement result. Several approaches are currently used to determine the relative orientations of the sensor. The sensor can be fixed to a mechanical device that is positioned with high precision, and the sensor orientation is then derived from the position information.4 Besides the high cost of such a system, its accuracy is limited, because small angular inaccuracies can result in large errors in the coordinate calculation. Another transformation method is the matching of point clouds, which are transformed by optimal fitting.5 Problems arise if the object does not offer contoured features; in this case it is impossible to obtain unambiguous solutions.


For practical applications, none of the described approaches is free from disadvantages. They are either too expensive or too restrictive. Unlike most other fringe projection systems, the proposed shape sensor is based on a passive stereophotogrammetric setup with two cameras and one projection unit. Only the cameras have to be calibrated, using an accurate bundle adjustment process. The problem of image correspondence is solved by projecting coded fringes with different orientations. Typical features from photogrammetry, such as self-checking and self-calibration, are transferred to fringe projection. High spatial resolution and high measurement precision are obtained by the phase-shifting technique with a large number of phase samples. Systematic error sources like phase-shifting errors or nonsinusoidal waveforms are eliminated by special correlation algorithms in addition to the passive setup. Furthermore, the photogrammetric matching of point clouds will be presented. Photogrammetry is used to determine the global coordinates of circular targets that are attached to the object, resulting in a precise reference frame. The targets are also measured by the two cameras of the shape sensor with respect to the sensor coordinate system. They are used for orientation of the sensor relative to the reference frame. This information is needed for transforming the shape coordinates of the partial views to the global coordinate system.

2 Photogrammetry

According to the new concept, photogrammetry is used for measurement of sensor coordinates as well as for global matching of the partial views. Photogrammetry is the reconstruction of 3-D coordinates by combining two or more 2-D images taken from different points of view.6 The correspondence problem (i.e., the precise determination of the image coordinates of corresponding structures like circles, crosses, or characteristic gray-value patterns) is solved for two or more image planes. After finding homologous image points, the object-point coordinates are calculated by spatial intersection using triangulation. If a set of multiple points is imaged under different viewing angles, the sensor orientation itself can be calculated by a bundle adjustment process.7 In photogrammetry, highly developed camera models based on the principle of central projection are used. By combining these models with digital image processing, a high degree of automation and precision can be achieved.8,9 Depending on the signal content of the gray-value images, the optical properties of the object, and the surface slope, accuracies of object-point determination as good as 1/70,000 relative to the field of view can be achieved with standard hardware components.10 As in general the number of observations exceeds the number of unknown variables, photogrammetry is closely linked to adjustment calculation, which allows a statistical error analysis.11

Fig. 1 Principle of central projection.

Figure 1 shows the camera model used, which is based on the principle of central projection. The relation between the image coordinates and the object coordinates is given by the collinearity equations

$$\begin{pmatrix} x_p \\ y_p \end{pmatrix} = -\frac{c}{Z_P^*}\begin{pmatrix} X_P^* \\ Y_P^* \end{pmatrix} + \begin{pmatrix} x_O \\ y_O \end{pmatrix} + \begin{pmatrix} dx \\ dy \end{pmatrix}, \tag{1}$$

$$\begin{pmatrix} X_P^* \\ Y_P^* \\ Z_P^* \end{pmatrix} = R\begin{pmatrix} X_P - X_O \\ Y_P - Y_O \\ Z_P - Z_O \end{pmatrix}, \tag{2}$$

where
$X_P, Y_P, Z_P$ = object coordinates,
$X_O, Y_O, Z_O$ = coordinates of the perspective center,
$X_P^*, Y_P^*, Z_P^*$ = object coordinates in the camera system,
$x_O, y_O$ = image coordinates of the principal point,
$x_p, y_p$ = image coordinates,
$dx, dy$ = lens distortions,
$R$ = rotation matrix,
$c$ = principal distance.
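As an illustration, Eqs. (1) and (2) can be evaluated directly in code. The following minimal sketch is not part of the original system; the omega-phi-kappa rotation parameterization and all numeric values are assumptions chosen for the example.

```python
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Rotation matrix R from three angles (radians), using the common
    photogrammetric omega-phi-kappa convention (rotations about x, y, z)."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(omega), -np.sin(omega)],
                   [0, np.sin(omega),  np.cos(omega)]])
    Ry = np.array([[ np.cos(phi), 0, np.sin(phi)],
                   [0, 1, 0],
                   [-np.sin(phi), 0, np.cos(phi)]])
    Rz = np.array([[np.cos(kappa), -np.sin(kappa), 0],
                   [np.sin(kappa),  np.cos(kappa), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(X_P, X_O, R, c, pp, distortion=(0.0, 0.0)):
    """Collinearity equations: object point -> image point.
    Eq. (2) transforms into the camera system, Eq. (1) projects."""
    Xs, Ys, Zs = R @ (X_P - X_O)                  # Eq. (2)
    x_p = -c * Xs / Zs + pp[0] + distortion[0]    # Eq. (1)
    y_p = -c * Ys / Zs + pp[1] + distortion[1]
    return np.array([x_p, y_p])

# Made-up example: camera 1 m from the object, principal distance 16 mm
X_P = np.array([0.10, 0.05, 0.0])   # object point [m]
X_O = np.array([0.0, 0.0, 1.0])     # perspective center [m]
print(project(X_P, X_O, rotation_matrix(0, 0, 0), c=0.016, pp=(0.0, 0.0)))
```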

3 Shape Measurement of Partial Views by Fringe Projection

3.1 Optical Setup and Calibration of the Shape Sensor

Fig. 2 Setup of the shape sensor.

Figure 2 shows the optical setup of the shape sensor, which is used for measuring the 3-D coordinates of partial views. The projecting unit is needed only for structuring the surface and is not involved in the determination of object coordinates. Because of this passive approach, just the two cameras of the shape sensor have to be calibrated: a reference object with a large number of circular targets attached to it is imaged by both cameras at different viewing angles. The relative orientation of the two cameras to each other can now be computed by a photogrammetric bundle adjustment process using the proposed model for central projection. Once these orientations are known, object coordinates can be calculated by a photogrammetric triangulation based on four measured image coordinates.
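The triangulation step can be sketched as follows. This is an illustration only, assuming each calibrated camera has been reduced to a 3×4 projection matrix, which is a simplification of the model of Eqs. (1) and (2); the DLT-style least-squares intersection shown here is a standard stand-in, not necessarily the implementation used in the system.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Spatial intersection: recover one 3-D point from the four measured
    image coordinates (two per camera). P1 and P2 are 3x4 projection
    matrices of the calibrated cameras; the redundant 4x4 system is
    solved by SVD, and the smallest singular value serves as a simple
    per-point intersection error."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, s, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3], s[-1]

# Synthetic check with two made-up cameras separated by a 0.2 m baseline
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
Xh = np.array([0.1, 0.05, 1.0, 1.0])          # homogeneous test point
u1, u2 = P1 @ Xh, P2 @ Xh
X, err = triangulate(P1, P2, u1[:2] / u1[2], u2[:2] / u2[2])
assert np.allclose(X, Xh[:3]) and err < 1e-12
```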

3.2 Phase Measurement by Fringe Projection

In phase-shifting fringe projection, the fringes should have a sinusoidal shape when they are projected onto a plane object. By shifting the phase with a fixed phase step $\Delta\varphi$, the recorded intensity for one pixel changes. This follows from

$$I = I_0[1 + M \cos(\varphi + \Delta\varphi)], \tag{3}$$



where
$I$ = recorded intensity,
$I_0$ = mean intensity,
$M$ = fringe modulation,
$\varphi$ = phase.

At least three intensities are necessary for determining the unknown phase $\varphi$. Using, for example, four steps $(0, \pi/2, \pi, 3\pi/2)$, the phase $\varphi$ can be calculated from the related intensities $(I_1, I_2, I_3, I_4)$ by

$$\varphi = \arctan\frac{I_4 - I_2}{I_1 - I_3}. \tag{4}$$

Redundancy of the phase determination is increased by introducing more samples of the signal function. An $N$-step phase shift with relative phase steps of $\Delta\varphi = 2\pi/N$ leads, for example, to

$$\varphi = \arctan\frac{-\sum_{i=1}^{N} I_i \sin[(2\pi/N)(i-1)]}{\sum_{i=1}^{N} I_i \cos[(2\pi/N)(i-1)]}. \tag{5}$$
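Equation (5) translates directly into code. The sketch below is illustrative only; it computes the wrapped phase per pixel from a stack of N phase-shifted images, uses arctan2 to resolve the quadrant, and verifies the formula on a synthetic phase ramp. For N = 4 the sums reduce to Eq. (4).

```python
import numpy as np

def phase_from_stack(images):
    """Wrapped phase per pixel from N phase-shifted images via Eq. (5);
    images has shape (N, H, W), with phase steps of 2*pi*i/N."""
    N = len(images)
    delta = (2 * np.pi * np.arange(N) / N).reshape(-1, 1, 1)
    num = -np.sum(images * np.sin(delta), axis=0)
    den = np.sum(images * np.cos(delta), axis=0)
    return np.arctan2(num, den) % (2 * np.pi)   # wrapped to [0, 2*pi)

# Synthetic check: recover a known phase map from N = 8 steps of Eq. (3)
phi = np.linspace(0.1, 6.0, 16).reshape(4, 4)
steps = 2 * np.pi * np.arange(8) / 8
stack = np.array([100.0 * (1 + 0.7 * np.cos(phi + d)) for d in steps])
assert np.allclose(phase_from_stack(stack), phi)
```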

Phase-shift formulas like these can be deduced from a least-squares approach.12,13 The higher redundancy does not imply a higher absolute precision of phase detection if systematic error sources like phase-shift errors or nonsinusoidal waveforms appear. The influence of such error sources on the reliability of phase measurement has been investigated by several authors, and new algorithms have been presented that are insensitive to some of them.14–16 Figure 3 shows fringes projected on a curved object surface and the corresponding phase map, which is obtained by computing the phase from Eq. (5) for each pixel of the recording CCD chip.

Fig. 3 Fringes projected on a real object and the corresponding mod 2π phase map.

3.3 Absolute Phase Measurement

Because the fringe pattern projected onto the object surface is periodic, the original phase map is ambiguous perpendicular to the fringe direction. A common method to solve this problem is to apply additional Gray-code patterns, which provide the wanted fringe orders.17 However, this procedure is time-consuming without increasing the accuracy. An alternative proposal to solve the ambiguity problem was presented by Zumbrunn.18 Two phase maps are combined, one of them consisting of only one fringe order and covering the whole field of view. This phase distribution can be used for obtaining the fringe orders of a second fringe pattern with a smaller fringe spacing. Malz19 showed how to calculate the fringe orders from the beat of two periodic functions with high frequencies. Here, a multiple-wavelength strategy is used to obtain the wanted fringe orders: for example, two different phase functions $\varphi_1(x)$ and $\varphi_2(x)$ with the wavelengths $\lambda_1$ and $\lambda_2$ are superimposed (Fig. 4). The superposition produces the phase-correct subtraction of these two functions, yielding the desired beat function $\phi(x)$ with the wavelength $\lambda_b$:

$$\lambda_b = \frac{\lambda_1 \lambda_2}{\lambda_1 - \lambda_2}. \tag{6}$$

Fig. 4 Phase-correct subtraction.
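The following sketch illustrates the two-wavelength idea of Eq. (6) on synthetic data; the fringe wavelengths (given in pixels) and the row length are made up for the example. The beat phase is unambiguous across the field of view and fixes the fringe order of the finer pattern:

```python
import numpy as np

lam1, lam2 = 72.0, 64.0                      # fringe wavelengths in pixels (made up)
lam_b = lam1 * lam2 / (lam1 - lam2)          # Eq. (6): beat wavelength = 576 pixels

x = np.arange(512)                           # one image row, shorter than lam_b
phi1 = (2 * np.pi * x / lam1) % (2 * np.pi)  # wrapped phase of the coarse pattern
phi2 = (2 * np.pi * x / lam2) % (2 * np.pi)  # wrapped phase of the fine pattern

# Phase-correct subtraction (Fig. 4): the beat phase is unambiguous here
phi_b = (phi2 - phi1) % (2 * np.pi)

# The beat phase predicts the absolute phase of the fine pattern; rounding
# gives its integer fringe order, which unwraps phi2
order = np.round((phi_b * lam_b / lam2 - phi2) / (2 * np.pi))
phi2_abs = phi2 + 2 * np.pi * order
assert np.allclose(phi2_abs, 2 * np.pi * x / lam2)
```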

The frequencies of the initial functions have to be selected such that the beat frequency is low enough to be unambiguous over the field of view of the cameras. Then the phase values of the beat function can be used to unwrap the periodic phase values of the initial functions. Figure 5 shows two phase maps with the initial frequencies selected to obtain an unambiguous beat function. Because of phase errors, it is not possible to unwrap a large number of phase periods (for example, 64 periods) in a single step. In this case, three initial phase functions are introduced in order to obtain an unambiguous beat.

Fig. 5 Two-wavelength technique adapted to fringe projection.

3.4 Projection of Fringes with Different Orientations

In classical photogrammetry, structures like circles, crosses, or natural patterns are attached or projected onto the object surface. Image operators like ellipse operators are applied in order to find the image coordinates of corresponding image points in different cameras or image planes. As the phase information is only one-dimensional, sequences of fringes with different orientations $g_i$ and $h_i$ have to be projected onto the object surface (Fig. 6). In this case, the image coordinates of homologous image points $p_1$ and $p_2$ are determined by evaluation of the absolute phase maps, and the corresponding object coordinates $(X_P, Y_P, Z_P)$ can be calculated by a redundant spatial intersection.

Fig. 6 Projection of image sequences.

3.5 Image-Point Correlation

In this section, the basic principles of the new image-point correlator based on absolute phase maps of different orientations in two image planes are presented.20 Systematic phase errors are largely compensated because they appear in both image planes in the same manner. Figure 7 shows two absolute phase maps $\phi(x,y)$ and $\kappa(x,y)$ with different orientations in the image plane $B_1$.

Fig. 7 Absolute phase distributions $\phi$ and $\kappa$ in the image plane $B_1$.

As a given combination of the two phases $(\phi, \kappa)$ exists at only one point in $B_1$, the image plane is coded in an unambiguous manner by these phase distributions. Thus, an image point $p_2(x_{p_2}, y_{p_2})$ in $B_2$ homologous to a given image point $p_1(x_{p_1}, y_{p_1})$ in $B_1$ with the known phases $(\phi_p, \kappa_p)$ can be determined if the following conditions are fulfilled:

$$\phi(x_2, y_2) = \phi_p, \tag{7}$$

$$\kappa(x_2, y_2) = \kappa_p. \tag{8}$$
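Conditions (7) and (8) form a two-equation system for the subpixel position $(x_2, y_2)$. One way to solve it, sketched below with illustrative names only, is a Newton iteration on bilinearly interpolated absolute phase maps. This is a simplified stand-in for the correlator of Ref. 20, whose actual procedure (local bilinear fits intersected with the searched phase planes) is described next; the initial guess and robustness checks are omitted here.

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinear interpolation of img at the subpixel position (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    return ((1 - fx) * (1 - fy) * img[y0, x0] + fx * (1 - fy) * img[y0, x0 + 1]
            + (1 - fx) * fy * img[y0 + 1, x0] + fx * fy * img[y0 + 1, x0 + 1])

def find_homologous(phi2, kappa2, phi_p, kappa_p, x, y, iters=10):
    """Newton iteration for the point in B2 where phi2 = phi_p and
    kappa2 = kappa_p, i.e., Eqs. (7) and (8), from a pixel-level guess."""
    for _ in range(iters):
        f = np.array([bilinear(phi2, x, y) - phi_p,
                      bilinear(kappa2, x, y) - kappa_p])
        # Jacobian from central differences of both phase maps
        J = np.array([[bilinear(phi2, x + 0.5, y) - bilinear(phi2, x - 0.5, y),
                       bilinear(phi2, x, y + 0.5) - bilinear(phi2, x, y - 0.5)],
                      [bilinear(kappa2, x + 0.5, y) - bilinear(kappa2, x - 0.5, y),
                       bilinear(kappa2, x, y + 0.5) - bilinear(kappa2, x, y - 0.5)]])
        dx, dy = np.linalg.solve(J, -f)
        x, y = x + dx, y + dy
    return x, y

# Synthetic check: two linear absolute phase ramps with different orientations
yy, xx = np.mgrid[0:32, 0:32].astype(float)
phi2, kappa2 = 0.2 * xx + 0.05 * yy, -0.05 * xx + 0.2 * yy
x, y = find_homologous(phi2, kappa2, phi2[10, 12], kappa2[10, 12], 5.0, 5.0)
assert np.allclose((x, y), (12.0, 10.0))
```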

Figure 8 shows how to detect this point: the real phase distribution $\phi(x_2, y_2)$ in $B_2$ is fitted with a bilinear function that intersects the searched phase plane $\phi(x_2, y_2) = \phi_p$. By repeating this procedure with $\kappa$, two isophase lines are obtained with the common point $p$ (Fig. 9).

Fig. 8 Local intersection of the real phase distributions $\phi$ (left) and $\kappa$ (right) in $B_2$ with the wanted phase planes.

Fig. 9 Local intersection of isophase lines in $B_2$.

3.6 Determination of 3-D Coordinates

As the phase maps provide measurement values for each picture element, the image correlation algorithm is repeated for each pixel. Dense object point clouds are generated by triangulation of all of these image points. Because of the redundancy in the triangulation, an intersection error is now computable for each object point, and the exterior sensor orientations can be checked and corrected if necessary. Figure 10 shows an object point cloud of one partial view.

Fig. 10 Point cloud of 3-D coordinates measured by the shape sensor.

4 Matching of Partial Views by Photogrammetry

For the 3-D measurement of complex objects, the shape sensor has to be placed at different positions around the object. Each object point cloud of a partial view is measured in the respective sensor coordinate system. Knowing the sensor orientation for each view, these clouds can be transformed into the global coordinate system.


4.1 Demonstration of Photogrammetric Matching

For demonstrating the concept of photogrammetric matching of point clouds from partial views, the door of a car was chosen. First, it was marked with circular reference targets (Fig. 11). The global coordinates of these targets were determined in advance by a photogrammetric bundle adjustment (Fig. 12). The result was an accurate reference frame for the orientation of the different positions of the shape sensor.

Fig. 11 Door of a car with circular reference targets.

Fig. 12 Global coordinates of reference targets.

Next, a partial view of the door was measured by the shape sensor (Fig. 13), resulting in a point cloud of 3-D coordinates related to the sensor coordinate system (Fig. 10). For sensor orientation, each partial view has to contain at least three reference targets, whose sensor coordinates are calculated by the shape sensor. With this information, a transformation from the sensor coordinate system into the global coordinate system of the reference frame was calculated (Fig. 14). The remaining parts of the object were measured similarly, resulting in the complete point cloud (Fig. 15). A shaded plot of the result is shown in Fig. 16.

Fig. 13 Partial view of the door with four circular reference targets.

Fig. 14 Partial point cloud transformed into the global coordinate system.

Fig. 15 Complete point cloud composed of several partial shape measurements.

Fig. 16 Shaded plot of the complete point cloud.

All partial views were transformed independently into the reference frame, without accumulation of orientation errors. The additional errors caused by the reference frame were small compared to the errors of the shape measurement. This follows from the accuracy of the reference target coordinates calculated by the photogrammetric bundle adjustment, which is about three times higher than the accuracy of the shape measurements by fringe projection. The accuracies of the transformations can be calculated by an error-propagation model, yielding important information for the user. An additional advantage of the proposed concept is the flexibility and transportability of the measurement system,21 which consists only of a digital camera (for the photogrammetric measurement of the reference frame) and the shape sensor. The following subsections describe some implementation details of this measurement concept.
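Before those details, the transformation step itself can be illustrated. If the sensor coordinates and global coordinates of at least three non-collinear targets are known, the rigid transformation can be computed in closed form with the standard SVD-based least-squares fit sketched below. Note that this is a simplified alternative for illustration; the system described here estimates the sensor orientation in image space instead (Sec. 4.3).

```python
import numpy as np

def rigid_fit(sensor_pts, global_pts):
    """Least-squares rigid transformation with global ~= R @ sensor + t,
    from >= 3 non-collinear point correspondences (Kabsch method)."""
    cs, cg = sensor_pts.mean(axis=0), global_pts.mean(axis=0)
    H = (sensor_pts - cs).T @ (global_pts - cg)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                            # guard against a reflection
    t = cg - R @ cs
    return R, t

# Synthetic check: recover a known rotation about z and a translation
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
S = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
G = S @ Rz.T + np.array([0.1, -0.2, 0.3])
R, t = rigid_fit(S, G)
assert np.allclose(R, Rz) and np.allclose(t, [0.1, -0.2, 0.3])
# A whole partial-view point cloud then transforms as: cloud @ R.T + t
```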

4.2 Photogrammetric Measurement of Reference Targets

Figure 17 shows the principle of the photogrammetric bundle adjustment for the calculation of the coordinates of the reference targets $P_i$ from different images $B_i$.

Fig. 17 Principle of photogrammetric bundle adjustment.

4.3 Transformation of Point Clouds into the Reference Coordinate System

For the transformation from the sensor coordinate system into the reference coordinate system, the method of sensor-system orientation is presented. The two cameras of the shape sensor have a certain fixed relative orientation to each other, which has already been determined by the sensor calibration process. Now the orientation parameters of the sensor related to the observed reference targets are calculated. For this purpose, the deviations between the measured image coordinates and the projected image coordinates of the reference targets [Eq. (1)] have to be minimized under the restriction that the relative orientation $b$ of the sensor cameras is fixed (Fig. 18). Using this method, even reference targets that are visible in only one camera can be included in the sensor orientation process.

Fig. 18 Orientation of the sensor system.
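A schematic of the described minimization is sketched below. It reuses rotation_matrix and project from the central-projection sketch in Sec. 2 and assumes one particular convention for coupling camera 2 to camera 1 through the fixed relative orientation (R_rel, t_rel); the parameter names and the six-parameter pose parameterization are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def orient_sensor(targets, obs1, obs2, R_rel, t_rel, c, pp, p0):
    """Estimate the sensor pose (three rotation angles plus the
    perspective center of camera 1) by minimizing the reprojection
    error of the reference targets while the relative orientation of
    the two cameras stays fixed. obs1/obs2 map target index -> measured
    image point; a target seen by only one camera still contributes."""
    def residuals(p):
        R1 = rotation_matrix(*p[:3])     # camera 1 orientation
        X1 = p[3:6]                      # camera 1 perspective center
        R2 = R_rel @ R1                  # camera 2 coupled to camera 1
        X2 = X1 - R2.T @ t_rel           # assumed coupling convention
        res = []
        for i, Xt in enumerate(targets):
            if i in obs1:
                res.extend(project(Xt, X1, R1, c, pp) - obs1[i])
            if i in obs2:
                res.extend(project(Xt, X2, R2, c, pp) - obs2[i])
        return np.asarray(res)
    return least_squares(residuals, p0)  # p0: initial pose guess (6-vector)
```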

5 Conclusion

In this paper, a new concept for shape measurement of complex objects has been proposed. It is based on combining photogrammetry and fringe projection. Partial views of the object shape are acquired by a passive fringe projection sensor consisting of a projector and two cameras. The projected fringes with different orientations are evaluated by a multiple-wavelength phase-shift technique, including the compensation of systematic phase errors. A redundant triangulation of homologous image points leads to dense object point clouds. Additionally, the object coordinates of circular reference targets on the object surface are determined in the sensor coordinate system. They are transformed into a global framework using the concept of photogrammetric matching. According to this strategy, point clouds taken from different sensor positions can be transformed into the global coordinate system with very high precision.

References

1. P. Besl, "Active optical range imaging sensors," in Advances in Machine Vision, J. Sanz, Ed., pp. 1–63, Springer-Verlag, New York (1989).
2. T. Strand, "Optical three-dimensional sensing for machine vision," Opt. Eng. 24(1), 33–40 (1985).
3. B. Breuckmann, "Topometric sensors for prototyping and manufacturing," in Rapid Prototyping, Proc. SPIE 2787, 2–11 (1996).
4. D. Walter, "Integration eines Streifenprojektionssystems in ein Koordinatenmesszentrum," in GMA-Bericht 30, Optische Formerfassung, pp. 23–27, VDI/VDE-Gesellschaft Mess- und Automatisierungstechnik (1997).
5. G. Häusler, S. Karbacher, D. Ritter, and H. Schönfeld, "Reverse engineering mit Hilfe optischer 3D-Sensoren," in GMA-Bericht 30, Optische Formerfassung, pp. 83–86, VDI/VDE-Gesellschaft Mess- und Automatisierungstechnik (1997).
6. G. Konecny and G. Lehmann, Photogrammetrie, de Gruyter, Berlin (1984).
7. W. Wester-Ebbinghaus, "Ingenieur-Photogrammetrie—Neue Möglichkeiten," FORUM, Z. BDVI 4, 193–213 (1987).
8. F. Ackermann, W. Schneider, and G. Vosselmann, "Experimental investigation into the precision of digital image correlation," Int. Arch. Photogramm. Remote Sensing 26(3/3), 115–130 (1986).
9. J. Trinder, "Precision of digital target location," Photogramm. Eng. Remote Sens. 55(6), 883–886 (1989).
10. H. Beyer, "Geometric and radiometric analysis of a CCD-camera based photogrammetric close-range system," in Mitteilungen des Instituts für Geodäsie und Photogrammetrie an der ETH Zürich, Nr. 51 (1992).
11. R. Ludwig, Methoden der Fehler- und Ausgleichsrechnung, Vieweg, Braunschweig (1969).
12. J. E. Greivenkamp, "Generalized data reduction for heterodyne interferometry," Opt. Eng. 23(7), 350–352 (1984).
13. T. Kreis, Holographic Interferometry: Principles and Methods, Akademie Verlag, Berlin (1996).
14. P. Hariharan, B. F. Oreb, and T. Eiju, "Digital phase-shifting interferometry: a simple error-compensating phase calculation algorithm," Appl. Opt. 26(13), 2504–2507 (1987).
15. K. Freischlad and C. L. Koliopoulos, "Fourier description of digital phase-measurement interferometry," J. Opt. Soc. Am. A 7(4), 542–551 (1990).
16. K. Hibino, B. Oreb, D. Farrant, and K. Larkin, "Phase shifting for nonsinusoidal waveforms with phase-shift errors," J. Opt. Soc. Am. A 12(4), 761–768 (1995).
17. H. Wolf, "Strukturierte Beleuchtung zur schnellen dreidimensionalen Vermessung von Objekten," in Abstraktband des 2. bundesweiten Transputer-Anwender-Treffens TAT '90 (1990).
18. R. Zumbrunn, "Automated fast shape determination of diffuse reflecting objects at close range by means of structured light and digital phase measurement," in Proc. ISPRS Intercommission Conf. on Fast Processing of Photogrammetric Data, pp. 363–379, Interlaken, Switzerland (1987).
19. R. W. Malz, Codierte Lichtstrukturen für 3-D-Messtechnik und Inspektion, PhD Thesis, Universität Stuttgart, Institut für Technische Optik (1992).
20. J. Thesing, "New approaches for phase determination," in Interferometry IX: Techniques and Analysis, M. Kujawinska, G. Brown, and M. Takeda, Eds., Proc. SPIE 3478, 133–141 (1998).
21. C. Reich, "3D-Measurement of complex objects," in Rapid Prototyping, Proc. SPIE 2787, 53–61 (1996).

Carsten Reich received his diploma in computer science from the Technical University of Braunschweig in 1995. Since 1995, he has worked as a research assistant at the Institute for Measurement Techniques and Experimental Mechanics in Braunschweig. He received his PhD degree in 1999. His interests are photogrammetry, contour measurement, and digital image processing.

Reinhold Ritter received his diploma in mechanical engineering from the Technical University of Braunschweig in 1963. From 1963 to 1968 he worked as a research assistant at the Institute of Mechanics and Strength of Materials and received his PhD degree in 1967. From 1968 to 1972 he worked as a department manager at Rheinstahl, Hannover. In 1972 he became head of the Department of Experimental Mechanics at the Technical University of Braunschweig. Since April 1997 he has been the head of the new Institute for Measurement Techniques and Experimental Mechanics in Braunschweig. His working topics are optical field methods for measurement of mechanical and geometrical properties, including digital image processing.

Jan Thesing received his diploma in mechanical engineering from the Technical University of Braunschweig in 1996. Since 1996, he has worked as a research assistant at the Institute for Measurement Techniques and Experimental Mechanics in Braunschweig. He received his PhD degree in 1999. His interests are deformation measurement, contour measurement, and digital image processing.