Optics and Lasers in Engineering 48 (2010) 191–204

Contents lists available at ScienceDirect

Optics and Lasers in Engineering journal homepage: www.elsevier.com/locate/optlaseng

Dynamic 3-D shape measurement method: A review

Xianyu Su, Qican Zhang

Opto-Electronics Department, Sichuan University, Chengdu 610064, China

Available online 8 May 2009

Abstract

Three-dimensional (3-D) shape measurement for a dynamic object or process, whose height distribution varies with time, has been a hot topic in recent years due to its wide field of application, and a number of techniques have been presented and studied in depth. Among the noncontact 3-D shape measurement methods for a dynamic object or process, an optical 3-D measurement system based on 2-D grating pattern projection and fast Fourier transform (FFT) fringe analysis has been developed and widely used, owing to its particular merits: it requires only low-cost, easy-to-use equipment, records the full-field information simultaneously, and needs only one frame of the deformed fringe pattern to reconstruct the height distribution with fast data processing. In this paper, after an overview of dynamic 3-D shape measurement techniques, the basic principles and typical applications of this technique based on grating projection and fringe analysis, which has attracted our attention and research effort over the past ten years, are reviewed as the main objective. Finally, the high-definition real-time depth-mapping TV camera, a system combining high-resolution 2-D color imaging with depth sensing, is briefly restated as a promising development trend for 3-D modeling, robotics and graphics animation.

© 2009 Elsevier Ltd. All rights reserved.

Keywords: Dynamic 3-D shape measurement; Grating projection; Structured illumination; Fringe analysis; Fourier transform profilometry; Phase unwrapping; Stroboscope; Rotating blade; Vibration analysis

1. Introduction

Any object's three-dimensional (3-D) curved surface can be represented by an array of points with known Cartesian coordinates (x, y, z), and a mathematical model can then be formulated to describe the surface shape based on these points. Therefore, the ultimate goal of 3-D shape measurement techniques, both mechanical and optical, is the determination of these Cartesian coordinates. For a dynamic object or scene, the Cartesian coordinates of these points vary with time. From these varying Cartesian coordinates, other quantities, such as displacement and curvature, can be calculated.

Optical non-contact 3-D shape measurement (namely 3-D sensing) [1,2] is concerned with extracting geometric information from images of the measured object. With its high speed and high accuracy, it has been widely used for 3-D sensing, machine vision, robot simulation, industrial monitoring, dressmaking, biomedicine, etc. It can be divided into two types of techniques: active 3-D shape measurement and passive 3-D shape measurement.

In passive 3-D shape measurement, for measuring a dynamic object such as human facial animation, Mova [3] provides 3-D performance capture services and technology for capturing the motion. For measuring dynamic deformation,

* Corresponding author. Tel.: +86 28 85463879; fax: +86 28 85464568.

E-mail address: [email protected] (X. Su).

0143-8166/$ - see front matter © 2009 Elsevier Ltd. All rights reserved.
doi:10.1016/j.optlaseng.2009.03.012

using the fine-grid method [4], a regular pattern, such as a well-defined array of lines, is adhered to the measured sample's surface, and an image of this grid is obtained before and after deformation. Automatic analysis of the dynamic deformation of this fine grid has allowed full-field measurement of in-plane displacement [5].

Usually, by their basic physical principles, active 3-D shape measurement techniques can be classified into two categories: one is based on time delay, employing the speed of light or laser coherence to measure 3-D shape; the other is triangulation-based, mainly using structured pattern projection to demodulate the measured object's information. The subdivisions of the first category are time-of-flight (TOF) and interferometry (optically coherent detection). Among interferometric methods, electronic speckle pattern interferometry (ESPI) has been an intense research subject in recent years and shows good performance for full-field vibration or rotation measurements [6–10]. In this area, for the dynamic characterization of microelectromechanical systems (MEMS), Doppler interferometry, optical microscopic interferometry, digital holography and stroboscopic interferometry have been developed to achieve full-field out-of-plane measurements [11–15].

The subdivisions of the triangulation-based technique, by its projected patterns, are single-spot projection, 1-D line-shaped light projection and 2-D fringe projection. In the single-spot projection method, accelerometers and laser Doppler vibrometers (LDV) [16] are pointwise measurement techniques and are used in conjunction with spectrum analyzers and modal analysis software to


characterize the dynamic object, especially its vibration behavior [17]. These methods must consume time in 1-D or 2-D scanning. A new point-array encoding method has been proposed to calculate object height directly through an affine transformation [18]. Yoshihiro Watanabe [19] uses a multi-spot laser projection method, in which numerous pre-calibrated spots are projected onto the measured dynamic object, together with a high-speed vision system with a co-processor for numerous-point analysis, to reconstruct the 3-D shape of the dynamic object. In the 1-D line-shaped light projection method, a variation of machine vision techniques uses line-shaped laser light combined with machine vision sensors to give the repeatability advantages of coordinate measuring machines (CMMs) and the speed advantages of machine vision [20–23].

Unlike the 1-D line-shaped light projection method, the 2-D fringe projection method has the advantages of speed and of providing whole-field information. Several 3-D shape measurement methods based on 2-D fringe projection, including the moiré technique (MT) [24–27], phase measuring profilometry (PMP) [28–32], Fourier transform profilometry (FTP) [33–37], modulation measurement profilometry (MMP) [38–40], spatial phase detection (SPD) [41–43], color-coded fringe projection [44–49] and gray-coded binary fringe sequences [50–52], have been exhaustively studied. Some of them need a series of fringes to be projected or a phase-shifting process to be executed, which makes them unfit for measuring the 3-D shape of a dynamic object. A technique that is fit for dynamic measurement recovers the shape information from only one single image, so it is also called a 'one-shot' structured light technique. A simple and stable grid pattern, formed by a number of straight lines distinguishable only as vertical or horizontal lines with different colors, has been used to densely measure the shapes of both static and dynamic scenes and objects [48,49].
A set of gray-scale random stripe patterns has been projected onto a dynamic object, and the time-varying depth maps have then been recovered by a space-time stereo method [51]. Actually, the more commonly used fringe is a 2-D Ronchi or sinusoidal grating [53–61]. This grating is projected onto the measured dynamic object's surface to modulate its height distribution; a series of deformed fringe patterns is then recorded from the other view and processed by full-field fringe analysis to demodulate the varying 3-D shape information. Among the studied fringe analysis methods, FTP [33–36], first introduced by Takeda, is particularly fit for high-speed measurement because of its merits: only one (or two) fringe pattern(s) needed, full-field analysis, high precision, etc. In the past years, the FTP method has been extensively studied and improved. A two-dimensional Fourier transform and 2-D Hanning filtering have been applied to provide a better separation of the height information from noise when speckle-like structures and discontinuities exist in the fringe pattern [35]. Compared with MT, FTP can accomplish fully automatic distinction between a depression and a protrusion of the measured object shape. It requires no fringe-order assignment or fringe-center determination, and no interpolation between fringes, because it gives the height distribution at each pixel over the entire field. Compared with PMP and MMP, FTP requires only one or two images of the deformed fringe pattern, making real-time and dynamic data processing possible [36,37]. Recently, a high-definition real-time depth-mapping television (TV) camera (HDTV Axi-vision camera) [62–66] has been proposed. It achieves simultaneous high-resolution 2-D color imaging and depth sensing in real time.
At first appearance, this 3-D TV camera system's main purpose might not be the measurement of 3-D shape, but it can detect 3-D information and has a wide range of applications in television program production,

3-D modeling, robotic vision and graphics animation. It could be a good development trend for dynamic 3-D shape measurement, naturally speeding it toward real time.

After an overview of the classification and achievements of dynamic 3-D shape measurement techniques, the technique based on grating projection and fringe analysis, which has attracted our attention and research effort over the past ten years, is explained and illustrated in this review. It is organized as follows. Section 2 gives a theoretical analysis of the basic principles of dynamic 3-D shape measurement based on fringe projection and FTP. Section 3 describes the process of 3-D phase calculation and unwrapping. Section 4 illustrates the applications of this method in different fields. Finally, Section 5 reviews 3-D TV as a prominent development trend.

2. Basic principles of dynamic 3-D shape measurement

FTP for dynamic 3-D shape measurement is usually implemented as follows. A Ronchi grating or sinusoidal grating is projected onto an object's surface. Then, a sequence of dynamic deformed fringe images is grabbed by a CCD camera from the other view and saved in a computer rapidly. Next, the data are processed in three steps. First, by using the Fourier transform, we obtain their spectra, which are isolated in the Fourier plane when the sampling theorem is satisfied. Second, by adopting a suitable band-pass filter (e.g., a 2-D Hanning window) in the spatial frequency domain, all the frequency components are eliminated except the fundamental component, and by calculating the inverse Fourier transform of the fundamental component, a sequence of phase maps can be obtained. Third, by applying a phase unwrapping algorithm in 3-D phase space, the height distributions of the measured object at different times can be reconstructed under a suitable phase-to-height mapping.

The optical geometry of the measurement system for a dynamic object is similar to that of traditional FTP, as shown in Fig. 1, in which the optical axis E′pEp of the projector lens crosses the optical axis E′cEc of the camera lens at point O on a reference plane, which is normal to the optical axis E′cEc and serves as a fictitious reference from which to measure the object height Z(x, y); d is the distance between the projector system and the camera system, and l0 is the distance between the camera system and the reference plane.
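The three processing steps above can be sketched numerically. The following is a minimal NumPy illustration on a synthetic deformed fringe; the carrier frequency, the test phase, and the Gaussian band-pass (used here in place of the 2-D Hanning window) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ftp_wrapped_phase(fringe, f0):
    """Wrapped phase from one fringe image by Fourier transform
    profilometry: FFT -> band-pass around the carrier frequency f0
    (cycles/pixel along x) -> inverse FFT -> arctangent."""
    ny, nx = fringe.shape
    F = np.fft.fft2(fringe)
    fx = np.fft.fftfreq(nx)                 # spatial frequencies along x
    fy = np.fft.fftfreq(ny)
    FX, FY = np.meshgrid(fx, fy)
    # Gaussian band-pass centred on the fundamental component (+f0, 0);
    # the paper uses a 2-D Hanning window, the Gaussian is an assumption.
    H = np.exp(-((FX - f0) ** 2 + FY ** 2) / (2 * (f0 / 3) ** 2))
    g_hat = np.fft.ifft2(F * H)             # complex signal, cf. Eq. (3)
    # remove the carrier 2*pi*f0*x, leaving only the phase modulation
    x = np.arange(nx)
    return np.angle(g_hat * np.exp(-2j * np.pi * f0 * x))

# synthetic deformed fringe with a known smooth phase bump
ny, nx, f0 = 128, 256, 1 / 8
x = np.arange(nx); y = np.arange(ny)[:, None]
phi = 1.5 * np.exp(-(((x - 128) / 40) ** 2 + ((y - 64) / 30) ** 2))
g = 128 + 100 * np.cos(2 * np.pi * f0 * x + phi)

phi_rec = ftp_wrapped_phase(g, f0)
```

Because the test phase stays below π, `phi_rec` matches the true phase directly; in general it is wrapped and must still be unwrapped as described in Section 3.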


Fig. 1. Optical geometry of a dynamic 3-D shape measurement system based on FTP.


By projecting a grating fringe onto the reference plane, the grating image (with period p0) on the reference plane, observed through the CCD camera from the other view, can be represented by

g0(x, y) = Σ_{n=−∞}^{+∞} An r0(x, y) exp{i[2πnf0x + nφ0(x, y)]},    (1)

where r0(x, y) is the non-uniform distribution of reflectivity on the reference plane, An are the weighting factors of the Fourier series, f0 (f0 = 1/p0) is the fundamental frequency of the observed grating image, and φ0(x, y) is the original phase on the reference plane (i.e., Z(x, y) = 0). The coordinate axes are chosen as shown in Fig. 1.

When the measured object is stationary, the image intensity obtained by the CCD camera is independent of time and is usually expressed as g(x, y). But when a dynamic 3-D object, whose height distribution varies with time, is placed into the optical field, the intensity of these fringe patterns is obviously a function of time and can be written as g(x, y, z(t)); the phase distribution that carries the height variation of the measured dynamic object is also a function of time and can be denoted φ(x, y, t). Strictly speaking, in dynamic scenes the x and y coordinates also change with time, but these changes are much smaller than those of the z coordinate and are usually ignored. A sequence of the deformed fringe patterns can be grabbed by the CCD camera and stored in a computer rapidly. The intensity distributions of these fringe patterns at different times can be expressed as

g(x, y, z(t)) = Σ_{n=−∞}^{+∞} An r(x, y, t) exp{i[2πnf0x + nφ(x, y, t)]}    (t = 1, 2, …, m),    (2)

where r(x, y, t) and φ(x, y, t) respectively represent the non-uniform distribution of reflectivity on the object surface and the phase modulation caused by the object height variation at different times, and m is the number of fringe images grabbed by the CCD camera.

A Fourier transform, filtering of only the first-order term (n = 1) of the Fourier spectra, and an inverse Fourier transform are carried out on each fringe pattern grabbed by the CCD camera at the different times. The complex signals at different times can be calculated:

ĝ(x, y, z(t)) = A1 r(x, y, t) exp{i[2πf0x + φ(x, y, t)]}.    (3)

The same operations are applied to the fringe pattern on the reference plane to obtain the complex signal of the reference plane:

ĝ0(x, y) = A1 r0(x, y) exp{i[2πf0x + φ0(x, y)]}.    (4)

Noting the geometric relationship between the two similar triangles ΔEpHEc and ΔCHD in Fig. 1, we can write

CD = d·Z(x, y, t)/(l0 − Z(x, y, t)).    (5)

The phase variation resulting from the object height distribution is

Δφ(x, y, t) = φ(x, y, t) − φ0(x, y) = 2πf0(BD − BC) = −2πf0·CD.    (6)

Substituting Eq. (5) into Eq. (6) and solving for Z(x, y, t), the formula of the height distribution can be obtained:

Z(x, y, t) = l0·ΔΦ(x, y, t)/(ΔΦ(x, y, t) − 2πf0d),    (7)

where ΔΦ(x, y, t) is the unwrapped phase distribution of Δφ(x, y, t). The measurable slope of the height variation is [34]

|∂Z(x, y)/∂x|max < l0/(3d).    (8)

This means that FTP can only be used for surfaces whose slopes do not exceed this limitation. When the slope of the height variation exceeds this limit, the fundamental component overlaps the other components and the reconstruction fails.

In practice, to determine the height of each point of the object, the dual-direction nonlinear phase-to-height mapping technique is usually adopted; a detailed description is given in Ref. [67]. Generally, the relation between the phase and the height Z(x, y, t) can be written as

1/Z(x, y, t) = a(x, y) + b(x, y)/ΔΦr(x, y, t) + c(x, y)/ΔΦr²(x, y, t),    (9)

where, for a sampling instant t, ΔΦr(x, y, t) = Φ(x, y, t) − Φr(x, y) is the phase difference between the two unwrapped phase distributions, Φ(x, y, t) for the measured object and Φr(x, y) for the reference plane. Z(x, y, t) is the relative height from the reference plane, and a(x, y), b(x, y) and c(x, y) are the mapping parameters, which can be calculated from the continuous phase distributions of four or more standard planes with known heights. The height distribution of the measured object at each sampling instant can be obtained from Eq. (9), as long as its 3-D phase distribution has been unwrapped.

3. 3-D phase calculation and unwrapping

3.1. 3-D wrapped phase calculation

For convenience of discussion, the fringe pattern on the reference plane is chosen as a special deformed fringe obtained at t = 0, so we can rewrite Eq. (4) as

ĝ0(x, y) = ĝ(x, y, z(t = 0)) = ĝ(x, y, z(0)) = A1 r(x, y, 0) exp{i[2πf0x + φ(x, y, 0)]}.    (10)

There are two methods to obtain the phase variation Δφ(x, y, t) = φ(x, y, t) − φ0(x, y) resulting from the height variation at different times. Details are discussed below.

3.1.1. Direct phase algorithm

Directly calculating the product of ĝ(x, y, z(t)) with the conjugate ĝ*(x, y, z(0)) in each 2-D space at the different sampling times, we obtain

ĝ(x, y, z(t))·ĝ*(x, y, z(0)) = |A1|² r(x, y, 0) r(x, y, t) exp[iΔφ(x, y, t)].    (11)

The phase distribution Δφ(x, y, t) can be calculated by

Δφ(x, y, t) = φ(x, y, t) − φ(x, y, 0) = arctan{Im[ĝ(x, y, z(t))ĝ*(x, y, z(0))]/Re[ĝ(x, y, z(t))ĝ*(x, y, z(0))]},    (12)

where Im and Re represent the imaginary and real parts of ĝ(x, y, z(t))ĝ*(x, y, z(0)), respectively. By this method, we can obtain a sequence of phase distributions, one per deformed fringe, which carry the fluctuation information of the dynamic object.
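The direct phase algorithm reduces to one complex multiplication and a four-quadrant arctangent per pixel. A minimal numerical sketch, with synthetic complex fields standing in for the band-pass filtered signals (the reflectivity map and phase shapes are invented for illustration):

```python
import numpy as np

# The phase change between frame t and frame 0 is the angle of the
# product of the filtered complex signal at time t with the conjugate
# of the one at time 0; carrier and reflectivity phases cancel.

rng = np.random.default_rng(0)
ny, nx, f0 = 64, 64, 1 / 8
x = np.arange(nx)
r = 0.5 + rng.random((ny, nx))           # non-uniform reflectivity r(x, y)
phi0 = 0.3 * np.sin(2 * np.pi * np.arange(ny)[:, None] / 64)
dphi_true = 1.2 * np.ones((ny, nx)) * np.hanning(nx)  # height-induced change

g0_hat = r * np.exp(1j * (2 * np.pi * f0 * x + phi0))
gt_hat = r * np.exp(1j * (2 * np.pi * f0 * x + phi0 + dphi_true))

# product with the conjugate cancels the carrier and reflectivity phase
product = gt_hat * np.conj(g0_hat)
# wrapped phase difference via the four-quadrant arctangent
dphi = np.arctan2(product.imag, product.real)
```

Since the synthetic change stays below π, `dphi` equals the true change here; larger changes would come out wrapped.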

3.1.2. Phase difference algorithm

By calculating the product of every two frame complex signals ĝ*(x, y, z(t − 1)) with ĝ(x, y, z(t)) at neighboring times, we obtain

ĝ(x, y, z(t))·ĝ*(x, y, z(t − 1)) = |A1|² r(x, y, t) r(x, y, t − 1) exp[iΔφt(x, y)],    (13)

where

Δφt(x, y) = φ(x, y, t) − φ(x, y, t − 1) = arctan{Im[ĝ(x, y, z(t))ĝ*(x, y, z(t − 1))]/Re[ĝ(x, y, z(t))ĝ*(x, y, z(t − 1))]}.    (14)

Summing all the Δφt(x, y) from t = 0 to t = m − 1, we get the phase difference between time t and time 0:

Δφ(x, y, t) = Σ_{n=1}^{t} Δφn(x, y) = φ(x, y, t) − φ(x, y, 0).    (15)

Evidently, the phase difference φ(x, y, t) − φ(x, y, t − 1) is always smaller than φ(x, y, t) − φ(x, y, 0), so it leads to an easier phase unwrapping process in 3-D space. More details can be found in Section 3.2.4.
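The accumulation in Eqs. (13)–(15) can be sketched as follows. The per-frame phase maps are synthetic, chosen so that each frame-to-frame increment stays well below π while the total change greatly exceeds π, so no spatial unwrapping is needed:

```python
import numpy as np

# Frame-to-frame phase increments are computed from neighbouring complex
# signals and summed along t to give the total change relative to frame 0.

ny, nx, m = 32, 32, 20
y = np.arange(ny)[:, None]
# true phase grows by 0.8 rad per frame at the centre: total 15.2 rad >> pi
frames_phi = [0.8 * t * np.exp(-((y - 16) ** 2 + (np.arange(nx) - 16) ** 2) / 80.0)
              for t in range(m)]
g_hats = [np.exp(1j * p) for p in frames_phi]        # unit-amplitude signals

total = np.zeros((ny, nx))
for t in range(1, m):
    product = g_hats[t] * np.conj(g_hats[t - 1])     # cf. Eq. (13)
    total += np.arctan2(product.imag, product.real)  # cf. Eqs. (14)-(15)
```

The accumulated `total` recovers the full 15.2 rad excursion even though every individual arctangent is confined to (−π, π].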

3.1.3. 3-D Fourier fringe analysis [68]

The above-mentioned processes deal with the 3-D data (a time sequence of fringes) on an individual, 2-D, frame-by-frame basis. Alternatively, the fringe analysis can be done in a single 3-D volume rather than on a set of individual 2-D frames processed in isolation: a 3-D FFT and 3-D filtering in the frequency domain yield the 3-D wrapped phase volume, with performance equivalent to that of 2-D fringe analysis.
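A sketch of this volume-wise processing, under stated assumptions (Gaussian band-pass rather than a specific window from Ref. [68], illustrative carrier frequency and temporal phase):

```python
import numpy as np

# The whole (t, y, x) fringe volume is transformed at once with a 3-D FFT,
# band-pass filtered around the carrier, and inverse transformed, instead
# of filtering each frame separately.

nt, ny, nx, f0 = 16, 32, 64, 1 / 8
x = np.arange(nx)
t = np.arange(nt)[:, None, None]
phi = 0.9 * np.sin(2 * np.pi * t / nt) * np.ones((nt, ny, nx))  # slow phase
volume = 128 + 100 * np.cos(2 * np.pi * f0 * x + phi)

F = np.fft.fftn(volume)
ft = np.fft.fftfreq(nt)[:, None, None]
fy = np.fft.fftfreq(ny)[None, :, None]
fx = np.fft.fftfreq(nx)[None, None, :]
# Gaussian band-pass around the fundamental (+f0, 0, 0); widths are
# illustrative and would be tuned to the data in practice
H = np.exp(-((fx - f0) ** 2 + fy ** 2) / (2 * (f0 / 3) ** 2)
           - ft ** 2 / (2 * 0.2 ** 2))
g_hat = np.fft.ifftn(F * H)
phase = np.angle(g_hat * np.exp(-2j * np.pi * f0 * x))
```

The temporal filter dimension is what distinguishes this from frame-by-frame FTP: noise that is not coherent over time can be suppressed along ft as well.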

3.2. 3-D phase unwrapping

Since the phase calculation by any inverse trigonometric function (i.e., the arctangent) provides principal phase values ranging from −π to π, the phase values have discontinuities with 2π phase jumps. For phase data without noise, these discontinuities can easily be corrected by adding or subtracting 2π according to whether the phase jumps from π to −π or vice versa. This is called the phase unwrapping procedure [69–73]. 3-D phase unwrapping is conducted not only along the x- and y-directions but also, necessarily, along the t-direction, so it provides more choices of path for the unwrapping process. Some points of discontinuity, which result from noise, shadow or under-sampling and cannot be unwrapped along the x- or y-direction within their own frame, can be unwrapped successfully along the t-direction. So, compared with 2-D unwrapping [69–73], 3-D phase unwrapping is easier and more accurate.
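The 2π-correction rule just described can be written out explicitly for a 1-D phase sequence (this is essentially what NumPy's `np.unwrap` does internally):

```python
import numpy as np

# Whenever successive wrapped samples jump by more than pi, an integer
# multiple of 2*pi is added so that neighbouring samples differ by less
# than pi.

def unwrap_1d(wrapped):
    out = np.asarray(wrapped, dtype=float).copy()
    for i in range(1, out.size):
        d = out[i] - out[i - 1]
        out[i] -= 2 * np.pi * np.round(d / (2 * np.pi))
    return out

true_phase = np.linspace(0.0, 12.0, 50)         # smooth ramp exceeding 2*pi
wrapped = np.angle(np.exp(1j * true_phase))     # principal values in (-pi, pi]
recovered = unwrap_1d(wrapped)
```

The recovery is exact only because the true phase is noise-free and changes by less than π per sample; the 3-D strategies below exist precisely because real data violate this along some paths.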

3.2.1. Direct 3-D phase unwrapping

If the wrapped phase is reliable everywhere and the phase difference between neighboring pixels in the 3-D phase space is less than π, the unwrapping problem is trivial. In the 3-D phase space, direct phase unwrapping can be carried out along any direction. First, we calculate the wrapped phase maps at the different times and select any pixel at the initial time as the starting point to be unwrapped. A 1-D phase unwrapping procedure is carried out along the t-direction. Afterwards, a column through the unwrapped pixel in every phase map is selected and unwrapped. Finally, all the rows from the initial column in every phase map are unwrapped, and the natural phases in the whole 3-D space are obtained. Of course, the column and row operations can be interchanged.

When the time interval between the t-th frame fringe image and the initial one exceeds a certain threshold, the wrapped phase at that time is no longer reliable everywhere, and the phase difference between neighboring pixels may exceed π. In this condition, we cannot obtain a correct surface by this unwrapping method.
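The scan order described above (one pixel along t, then one column per frame, then every row) can be sketched with `np.unwrap`, assuming noise-free wrapped phase; the synthetic phase ramp is invented for illustration:

```python
import numpy as np

def unwrap_3d_direct(wrapped):                 # shape (t, y, x)
    out = np.asarray(wrapped, dtype=float).copy()
    # 1. unwrap the starting pixel (y=0, x=0) along the t-direction
    out[:, 0, 0] = np.unwrap(out[:, 0, 0])
    # 2. unwrap the column x=0 of every frame, seeded by the t-line
    out[:, :, 0] = np.unwrap(out[:, :, 0], axis=1)
    # 3. unwrap every row, seeded by the unwrapped column
    return np.unwrap(out, axis=2)

# smooth synthetic phase exceeding 2*pi in both space and time
t = np.arange(8)[:, None, None]
y = np.arange(32)[None, :, None]
x = np.arange(32)[None, None, :]
phi = 0.5 * t + 0.3 * y + 0.3 * x
rec = unwrap_3d_direct(np.angle(np.exp(1j * phi)))
```

`np.unwrap` keeps the first sample along the chosen axis fixed, which is why each step must be seeded by the previously unwrapped line, exactly as in the path described in the text.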

3.2.2. 3-D diamond phase unwrapping

The 2-D phase unwrapping based on adjacent pixels (with a diamond-like unwrapping path) can certainly be extended to deal with the 3-D wrapped phase. If the wrapped phase is reliable everywhere, or all the branch cuts have been placed and all the unreliable points (residues) have been covered by a 3-D mask, the 3-D phase unwrapping process can be propagated from the start point to the 6 adjacent pixels in 3-D space. In this condition, the path of the phase unwrapping process spreads quickly, like an inflating regular octahedron, as shown in Fig. 2. This method is time-saving and more feasible.

3.2.3. 3-D phase unwrapping based on modulation ordering [60]

In an actual measurement, many factors, such as noise, fringe breaks, shadow, under-sampling resulting from very high fringe density, and under-modulation from very sparse fringe density, make the actual unwrapping procedure complicated and path-dependent. Combined with a modulation analysis technique, the 2-D phase unwrapping method based on modulation ordering has been well discussed in Refs. [36,69]. Here we extend the 2-D phase unwrapping procedure, with modulation as a reliability function, to 3-D phase space. The unwrapping path always runs from a pixel with higher modulation to a pixel with lower modulation until the entire 3-D wrapped phase is unwrapped. The unwrapping scheme is shown in Fig. 3; the lines with arrows display one of the phase unwrapping paths based on modulation ordering.

3.2.4. 3-D phase unwrapping based on a phase difference algorithm

When the frame rate of the frame grabber is high enough, the time interval between two grabbed frame fringe patterns is small, so the phase difference between two neighboring pixels with the same image coordinates is less than π in the t-direction.
We select only one phase map (assuming its number to be S), which can be unwrapped by a 2-D phase unwrapping procedure easily and accurately; its natural phase is ΦuwS(x, y, S). The unwrapped phase at any time (t = K) can then be obtained by calculating the phase sum from t = S to t = K along the t-direction. We can describe it as

Fig. 2. Sketch map of 3-D diamond phase unwrapping.

Fig. 3. Sketch map of 3-D phase unwrapping based on modulation ordering.


ΦuwK(x, y, K) = ΦuwS(x, y, S) + Σ_{i=S+1}^{K} Δφi(x, y),    (16)

where ΦuwK(x, y, K) stands for the unwrapped phase map at t = K, and Δφi(x, y) represents the phase difference between the two frames t = i and t = i − 1. A schematic diagram of 3-D phase unwrapping based on the phase difference algorithm is shown in Fig. 4. The advantage of this method is that only one frame of wrapped phase, with reliable phase everywhere, needs to be unwrapped, and the unwrapping procedure is very simple under this condition. The precise phase values in the whole 3-D phase field can then be obtained by calculating the sum along the t-direction.

3.2.5. Marked/coded fringe pattern for unwrapping the phase of spatially isolated dynamic objects

When the measured dynamic object is spatially isolated or breaks into several isolated parts (e.g., in an impact process), phase unwrapping becomes difficult. We embedded a special mark into the projected sinusoidal gratings to identify the fringe order [74], as shown in Fig. 5. The orientation of the Fourier spectra of the mark is perpendicular to that of the fringe; therefore the mark does not affect the Fourier spectra of the deformed fringe and can be extracted easily with a proper band-pass filter in the other direction. The phase values on the same marked strip are equal and known, such as at points B and D in Fig. 6, so these phase values on the marked strip preserve the relation between the separated fringes. The phase unwrapping of each broken part is then done from its local marked strip. For the same purpose, Wei-huang Su [75] made a structured pattern in which the sinusoidal fringes are encoded with both binary stripes and colorful grids. The binary stripes play the key role of identifying the local fringe order, while the colorful grid provides an additional degree of freedom to identify the stripes. Even though

Fig. 4. Sketch map of 3-D phase unwrapping based on the phase difference algorithm.

Fig. 6. Sketch map for phase unwrapping of a separated fringe.

the inspected surfaces are colorful, the proposed encoding scheme can still identify the local fringe orders. This encoding scheme gives a more reliable performance in identifying the fringe orders, since it is not sensitive to the observed colors.

In Guo and Huang's research work [76,77], a small cross-shaped marker is embedded in the fringe pattern to facilitate the retrieval of the absolute phase map from a single fringe pattern, which can be used with the Fourier transform method to improve its coordinate measurement accuracy while preserving its potential measurement speed.

4. Dynamic 3-D shape measurement methods and applications

4.1. 3-D shape and deformation measurement of a rotating object [78,79]

Real-time measurement of periodic motion such as rotation and vibration is important for studies of material properties, shape deformation and tension analysis. However, real-time 3-D shape measurement is difficult because most optical 3-D profilometry methods require multiple measurements. Even FTP has not been successful, due to the difficulties of object tracking and position detection. The stroboscope [80] is an intense, high-speed light source used for the visual analysis of objects in periodic motion and for high-speed photography [81–83]. Objects in rapid periodic motion can be studied by using the stroboscope to produce an optical illusion of stopped or slowed motion. When the flash repetition rate of the stroboscope is exactly the same as the object's movement frequency, or an integral multiple thereof, the moving object appears stationary. This is called the "stroboscopic effect". The combination of the stroboscopic effect and FTP can overcome the multiple-measurement requirement. In Ref. [78], we used stroboscopic structured illumination for the surface shape determination of a rotating blade. The deformation of the rotating blade measured in each cycle has been recovered by this method.

4.2. Stroboscopic structured illumination system and control

Fig. 5. Marked fringes image.

Synchronization is very important for the snapshot measurement of high-speed motion. To "freeze" the motion and record an instantaneous, stationary image of a rotating object, the flash


Fig. 7. Block diagram of sync control unit.

Fig. 8. Schematic diagram of experimental setup for measuring 3-D shape of a rotating object.

repetition rate of the stroboflash and the shooting of the CCD camera must be precisely synchronized with the periodic rotation. The block diagram of the stroboflash sync control unit is shown in Fig. 7. The unit has two control modes: interior trigger and exterior trigger.

The interior trigger mode triggers the LED to flash and the frame grabber to capture an image according to a frequency set manually through software. The current frequency range covers 0 Hz to 20 MHz. This mode is used for studying motion without obvious repetition, such as expansion, contraction or ballistic flight, and dynamic objects with known frequencies.

The exterior trigger mode triggers the LED to flash and the frame grabber to capture an image according to the frequency signals output by the position detector, which tracks the motion of the dynamic object. In this mode, the system works automatically, without any manual interference. This mode is used for real-time measurement of dynamic objects with obvious repetition, such as rotation and vibration. When the control unit works in exterior trigger mode, the output signal of the position detector triggers the frame grabber to start a new capture, triggers the LED flash, and resets the CCD after a negligible lag (to ensure that the frame grabber's capture precedes the video signal output).

4.3. Dynamic 3-D shape measurement of a rotating object [78]

A schematic diagram of the experimental setup used for measuring a rotating object is given in Fig. 8. The image of a sinusoidal grating of 2 lines/mm is projected and focused onto a rotating blade surface. The deformed grating image is observed by a CCD camera (PULNiX-6AS) with a 16 mm focal-length lens, via a video frame grabber with an external-trigger function, digitizing the image at 512 × 512 pixels. The fan is located at a distance l0 = 625 mm from the exit pupil of the projection system, and d = 650 mm.

In practice, we started to capture the deformed fringe images at the instant the fan began to rotate. From quiescence to stable rotation (about 1112 rpm, revolutions per minute), with one image per revolution, 100 frame images of the fan blade were obtained in total; two of them are shown in Figs. 9(a) and (b), corresponding to the 1st and 50th revolutions. Fig. 9(c) gives the grid chart of the reconstructed shape at the 50th revolution. Fig. 9(d) shows the deformation (relative to the first revolution) of the whole reconstructed blade at the 50th revolution. Figs. 9(e) and (f) show the deformations (also relative to the first revolution) of the reconstructed blades at different revolutions along the line shown in each inset; the numbers marked on the left side of the curves are the corresponding revolutions. They distinctly depict the twisting deformations of the blade with increasing speed: the farther a point is from the rotating shaft, the bigger its deformation. We can see that the maximal deformation is in excess of 8 mm.

Besides high-speed moving objects with obvious and variable repetition, such as rotation and vibration, this method can also be used to study high-speed motion without obvious repetition, such as expansion, contraction or ballistic flight, and high-speed objects with known and invariable frequencies. The method can also be extended using different detectors or sensors. For example, explosion phenomena with sound signals can be studied using a sound control unit instead of the optical position detector.

4.4. High-speed optical measurement of the vibrating drumhead [84,85]

The drum plays an important role in many musical styles. Understanding the characteristics of the vibration of the drum membrane is important for studying the acoustics and improving the manufacturing technique of the drum. The drumhead has been studied to a certain extent using various methods. Bertsch [17] employed laser interferometry, with a Polytec scanning vibrometer, to measure the vibration patterns of the Viennese timpani by quickly scanning 110 points on the membrane. His method is non-contact and achieves nanometer-level accuracy in vibration amplitude measurement. A schematic diagram of the experimental setup used for the vibrating drum measurement is shown in Fig. 10. The measured object is a Chinese double-sided drum with a diameter of 250 mm, shown in Fig. 11. An image of a sinusoidal grating, 2 lines/mm, is projected and focused onto the drum membrane, which is located at a distance of 780 mm (l0) from the exit pupil of the projection system. The deformed grating image is observed by a high frame rate CCD camera (SpeedCam Visario, sampling rate up to 10,000 fps) with a zoom lens (Sigma Zoom, Φ82 mm, 1:2.8 DG,

ARTICLE IN PRESS X. Su, Q. Zhang / Optics and Lasers in Engineering 48 (2010) 191–204

197

Fig. 9. Dynamic 3-D shape measurement of a rotating blade. (a) and (b) Two deformed fringes recorded at the 1st and 50th revolutions; (c) the reconstructed shape at the 50th revolution; (d) the deformation (relative to the first revolution) at the 50th revolution; (e) and (f) the deformations (relative to the first revolution) along the line shown in the inset.

Fig. 10. Schematic diagram of the experimental setup.

24–70 mm). The distance between the projection system and the high-speed camera is 960 mm (d). We hit the batter side of the drumhead three times in quick succession with a wooden drumstick and measured the whole vibration of the resonant side. During the vibration we recorded nearly 1000 fringe images of the vibrating membrane at 900 × 900 pixels with a 1000 fps sampling rate.

Fig. 11. The measured drumhead.

One of the


Fig. 12. Dynamic 3-D shape measurement of the vibrating drum. (a) The deformed fringe; (b) Fourier spectrum of (a); (c) position of the center point of the drumhead; (d) restored profiles at six sampling instants in one period; (e) and (f) grid charts of the restored height distribution of the vibrating drumhead.

deformed fringes and its Fourier spectrum are shown in Figs. 12(a) and (b), respectively. Fig. 12(c) shows the vibration position of the center point of the drumhead, from which we can see the times of the hits and the damped motion of the drumhead. The height distributions of the vibrating drumhead at the corresponding sampling instants are exactly restored. The vibration amplitude ranges from −1.79 to 1.84 mm. Fig. 12(d) gives the profiles of the center row at six sampling instants in one period; the number above each line is the corresponding sampling instant. Figs. 12(e) and (f) show the grid charts at two sampling instants, which are given in their respective titles. Observing them, we find two modes, (1,0) and (1,1), in one period of the principal vibration of the drumhead, which indicates that the point we hit is not the exact center of the drumhead. Furthermore, there are nonlinear effects (such as non-uniform surface tension across the drumhead), which exert

their own preference for certain modes by transferring energy from one vibration mode to another. This method has the advantage of high-speed, full-field data acquisition and analysis, so its application can be extended to other fields, such as dynamic 3-D shape measurement and analysis of instantaneous phenomena like inflation and deflation.
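The (1,0) and (1,1) labels above refer to the classical modes of an ideal circular membrane, whose frequencies scale with the zeros of Bessel functions. A minimal sketch follows; note that mode-labeling conventions vary (here (m, n) means m nodal diameters and the n-th zero of J_m, so the fundamental is (0,1), which some texts write as (1,0)), and the tension and surface density are made-up illustrative values, not measured drum parameters:

```python
import math

# First zeros alpha_mn of the Bessel functions J_m (standard tabulated values).
BESSEL_ZEROS = {(0, 1): 2.40483, (1, 1): 3.83171, (2, 1): 5.13562, (0, 2): 5.52008}

def membrane_freq(m, n, radius, tension, area_density):
    """Mode frequency of an ideal circular membrane clamped at the rim:
    f_mn = alpha_mn * c / (2 * pi * a), with wave speed c = sqrt(T / sigma)."""
    c = math.sqrt(tension / area_density)
    return BESSEL_ZEROS[(m, n)] * c / (2 * math.pi * radius)

# Illustrative (not measured) parameters for a 250 mm diameter head:
a, T, sigma = 0.125, 2000.0, 0.26   # radius [m], tension [N/m], density [kg/m^2]
ratio = membrane_freq(1, 1, a, T, sigma) / membrane_freq(0, 1, a, T, sigma)
# The frequency ratio of these two modes is ~1.593 for any ideal membrane,
# independent of T, sigma and a.
```

Real drumheads deviate from these ratios exactly because of the nonlinear effects and non-uniform tension discussed above.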

4.5. Dynamic measurement of vortex shape [86,87]

Another experiment on measuring a dynamic object is digitizing the motion of the vortex formed while a surface of poster paint is stirred. The deformed grating image is observed by a low-distortion TV CCD camera (TM560, 25 fps) with a 12 mm focal-length lens, via a video frame grabber digitizing the image at

Fig. 13. Dynamic measurement of vortex shape. (a) One of the deformed fringes while the stirrer is working; (b) profiles of the reconstructed vortices at different times.

Fig. 14. Six frames of deformed fringe images.

128 × 128 pixels. The paint is located at a distance l0 = 890 mm from the exit pupil of the projection system, and d = 290 mm. In this experiment, we captured a fringe pattern of the reference plane while the paint was still. Then we began to capture the deformed fringe images at the moment the stirrer was turned on. From quiescence to stable rotation, 205 images of the paint were obtained in 10.4 s; one of them is shown in Fig. 13(a). Fig. 13(b) shows the profiles of the reconstructed vortices; the number under each line is the corresponding sampling instant. It clearly illustrates the formation and evolution of the vortex.

4.6. Dynamic 3-D shape measurement of a breaking object [74]

Another application is testing a breaking ceramic tile. The tile obviously breaks into four spatially isolated blocks, and fringe discontinuities follow. The marked-fringe method, detailed in Section 3.2.5, has been used to keep the phase unwrapping process error free. The CCD camera is a Lumenera Lu050M (320 × 240 pixels, with a 12 mm lens and a 200 fps sampling rate). The whole breaking process contains 47 frames and lasts 235 ms. Six of them are shown in Fig. 14; their corresponding sampling instants are 745, 750, 755,

765, 775 and 785 ms, respectively (from the beginning of capture). Fig. 15 shows the reconstructed 3-D shape of the breaking tile at the corresponding sampling instants in Fig. 14.
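The idea behind marked-fringe unwrapping can be illustrated with a one-dimensional toy example: each spatially isolated block is unwrapped internally, and the absolute fringe order carried by the tracked marked fringe fixes the unknown multiple of 2π for the whole block. The segment boundaries and fringe orders below are synthetic assumptions, not data from the tile experiment:

```python
import numpy as np

def unwrap_isolated(wrapped, segments, start_orders):
    """Phase unwrapping over spatially isolated segments.  Each segment is
    unwrapped internally with np.unwrap; the absolute fringe order of its
    first sample (obtained in practice by tracking the marked fringe) then
    fixes the unknown 2*pi offset of the whole segment."""
    out = np.full_like(wrapped, np.nan)          # NaN where no object exists
    for (lo, hi), k in zip(segments, start_orders):
        out[lo:hi] = np.unwrap(wrapped[lo:hi]) + 2 * np.pi * k
    return out

# Two isolated blocks of a linear phase ramp, with a gap (the "crack") between.
x = np.arange(100, dtype=float)
true_phase = 0.1 * x
wrapped = np.angle(np.exp(1j * true_phase))      # wrapped into (-pi, pi]
segments = [(0, 40), (60, 100)]
orders = [0, 1]          # the second block starts after one full 2*pi of phase
restored = unwrap_isolated(wrapped, segments, orders)
```

Without the per-block order information, each isolated block could only be recovered up to an arbitrary 2π offset, which is exactly the failure mode the marked fringe prevents.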

4.7. Dynamic 3-D shape measurement of the face in chewing [88,89]

We designed a low-cost measurement system and used it to reconstruct the 3-D facial shape and its movement in chewing. The projected image was a sinusoidal grating of 3 lines/mm, and the CCD camera was a commercial one (MINTRON MTV-1881EX) with a 50 mm focal-length lens. The image was digitized at 512 × 512 pixels. The head was located at a distance l0 = 2220 mm from the exit pupil of the projection system, and d = 590 mm. One hundred and seventeen frames of the facial movement were recorded in 4.68 s at video rate (25 fps). One frame is shown in Fig. 16(a); the area inside the white dashed frame is to be reconstructed. Fig. 16(b) draws the contour chart of the reconstructed facial shape at t = 1.04 s (the corresponding frame number is 26). Fig. 16(c) gives the grid chart of the reconstructed facial shape at the same instant. Such digital facial shapes can be used in biomedicine for cosmetology and orthopedics, or in security systems for face recognition.

Fig. 15. Six grid charts of the restored shape of the breaking tile.

Fig. 16. Dynamic 3-D measurement of facial shape and movement. (a) One frame of the deformed fringe; (b) contour chart at t = 1.04 s; (c) grid chart of the reconstructed shape of the face.

4.8. Dynamic 3-D shape measurement of a woofer's vibration [90]

An experiment was performed on a woofer in order to estimate whether it would be an effective radiator of sound and whether it performs as expected. The projected grating was 2 lines/mm. The images were captured at 630 × 630 pixels at 800 fps. A function generator and a power amplifier were used to create a sine wave, which was fed to the measured woofer and synchronously captured by an oscillograph as the input signal, and a high-speed camera (Basler 504k with a 16 mm focal-length lens) was employed to record the deformed fringe patterns produced by the cone's vibration. The reconstructed vibration was taken as the output of the woofer and compared with the input obtained from the oscillograph. The measured woofer's cone was driven by a 5 V sine wave. Five hundred deformed fringe images of the vibrating woofer were obtained in 625 ms, and one of them is shown in Fig. 17(a). The height distributions of the vibrating woofer at the corresponding sampling instants are exactly restored and shown in Fig. 17(b). Fig. 17(c) shows the positions of the center point of the woofer during 50 ms of vibration (40 frames)

and indicates each sampling instant with an asterisk (*). From this figure we can clearly see that the motion of one point on the woofer's cone is also a sine wave, and we can work out that the principal vibration frequency of this woofer is near 50 Hz (there are two and a half periods in 50 ms), which corresponds exactly with the input signal. Fig. 17(d) gives the profiles of the center rows at nine sampling instants in one period; the number (unit: ms) labeled near each line is the corresponding sampling instant. The vibration range (the amplitude displacement of the speaker cone, pumping in and out) is from −1.25 to 1.14 mm.
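The principal-frequency estimate quoted above (counting periods in the center-point trace) can equally be obtained from the amplitude spectrum of the trace. A sketch with a synthetic stand-in signal (the real input would be the restored height of one pixel across frames):

```python
import numpy as np

def dominant_frequency(signal, fps):
    """Estimate the principal vibration frequency of a point's height trace
    from the largest peak of its amplitude spectrum."""
    signal = signal - np.mean(signal)            # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

# Synthetic stand-in for the measured center-point trace: a 50 Hz sine with
# a little noise, sampled at the 800 fps used in the experiment.
fps, n = 800.0, 400                              # 0.5 s of data
t = np.arange(n) / fps
rng = np.random.default_rng(0)
trace = 1.2 * np.sin(2 * np.pi * 50.0 * t) + 0.05 * rng.standard_normal(n)
f_est = dominant_frequency(trace, fps)           # ~50 Hz
```

With 0.5 s of data the frequency resolution is 2 Hz, which is ample for confirming agreement with the 50 Hz drive signal.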

4.9. Dynamic 3-D shape measurement of an impact [91]

Another experiment was carried out on a thin plate impacted by a flying rammer. The projected grating was 2 lines/mm. The thin plate was fixed over the central hole of a stable mounting plate, located at a distance l0 = 550 mm from the exit pupil of the projection system, with d = 360 mm. The deformed grating image (180 × 180 pixels) was observed by a high frame rate CCD camera (Lumenera Lu050M, 200 fps) with a 12 mm focal


Fig. 17. Dynamic 3-D shape measurement of the vibrating woofer. (a) One deformed fringe; (b) the reconstructed shape; (c) positions of the center point during vibration; (d) profiles of the center rows at nine sampling instants in one period.

Fig. 18. Three frames of deformed fringes at different times.

Fig. 19. 3-D reconstructed height distributions of the impacted object's surface.

length lens. We employed a flying rammer, tightly bound to the end of a string, to impact the thin plate horizontally. The rammer was flying towards the thin plate at the instant image capture commenced. From the beginning to the end of capture, 200 deformed fringe images were obtained in one second. In subsequent image processing, we found that the

front part, nearly 70 frames, was obtained before the impact and the rear part, nearly 70 frames, after the impact had completed. That is to say, the whole impact lasted only 0.3 s. Three of the 200 deformed fringes are shown in Fig. 18; their frame sequence numbers are 86, 101 and 130 (corresponding to 430, 505 and 650 ms after the start of capture).


Fig. 19 gives the grid charts of the reconstructed height distributions for the corresponding frames shown in Fig. 18.

5. Real-time 3-D TV camera [62–66]

Among the several methods used for 3-D surface measurement, triangulation is the most common way of detecting depth information. Systems based on triangulation use either stereoscopic images taken by several cameras [62] or structured light projection [63]. These triangulation methods are sometimes limited by shadowing. The time-of-flight (TOF) method has the advantage of quick, straightforward information processing [64]. The conventional TOF system, however, requires two-dimensional scanning of a laser beam to build a depth image of an object, so it is only suitable for stationary objects. Recently, a high-definition real-time depth-mapping TV camera, named the HDTV Axi-Vision camera, was proposed [65,66]. The camera achieves simultaneous high-resolution 2-D color imaging and depth sensing, and the depth information can be calculated in real time. The principle of depth mapping is shown in Fig. 20 [66]. The intensity of an ultra-fast snapshot of an object becomes dependent on the distance from the camera to the object if the intensity of the illuminating light is varied at a speed comparable to the speed of light. In Fig. 20, the upper section shows the arrangement of the components, with the infrared intensity-modulated LED illuminating sources on the left and a CCD camera with an ultra-fast shutter utilizing an image intensifier on the right. The lower section of the figure shows a time diagram of the illuminating and reflected light during the first three video-frame cycles. The solid triangular line indicates the triangularly intensity-modulated illuminating LED light at the source; the dotted triangular line indicates the light intensity at the camera after reflection from one particular point P on the object. The triangle formed by the dotted line is delayed by Δt = 2d/v relative to that formed by the solid line because of the time required for the light to make the round trip, where d is the distance from the camera to point P and v is the velocity of light. The pair of vertical black dotted lines indicates the interval during which the ultra-fast shutter in front of the CCD camera is open. If the shutter opens during the ascending intensity modulation, as in the first video frame, the input light to the CCD camera during the exposure time (the hatched section in Fig. 20) decreases with the distance d to the object, because the dotted line shifts to the right; during this cycle of illumination, less input light means a longer distance to the object. If, on the other hand, the shutter opens during the descending intensity modulation, as in the second video frame, more input light means a longer distance. Such 3-D TV camera systems detect 3-D information comprising high-resolution 2-D color imaging and depth-mapping imaging. They have a wide range of applications in TV program production, 3-D modeling and robotic vision, as well as in graphics animation.
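Under idealized assumptions (perfectly linear triangular modulation, identical shutter gates in the two frames, no ambient light), the scene reflectance cancels in the normalized ratio of the ascending- and descending-frame exposures, leaving a quantity linear in the round-trip delay. A sketch; the modulation period and scene values are illustrative, not the camera's actual calibration:

```python
import numpy as np

V_LIGHT = 3.0e8     # speed of light, m/s
T_MOD = 40e-9       # triangular modulation period; Ref. [66] quotes 20-100 ns

def depth_map(img_ascend, img_descend):
    """Depth from one frame taken on the ascending and one on the descending
    slope of the triangular modulation: the per-pixel reflectance cancels in
    the normalized ratio, which is linear in the round-trip delay 2d/v."""
    total = img_ascend + img_descend
    ratio = np.where(total > 0, img_descend / np.maximum(total, 1e-12), 0.0)
    delay = ratio * T_MOD / 2.0          # recovered round-trip time
    return V_LIGHT * delay / 2.0         # depth in meters

# Synthetic scene: depths of 1-3 m with arbitrary per-pixel reflectance.
depth_true = np.linspace(1.0, 3.0, 5)
reflect = np.array([0.2, 0.9, 0.5, 0.7, 0.3])
tau = 2.0 * depth_true / V_LIGHT                 # true round-trip delays
img_a = reflect * (1.0 - 2.0 * tau / T_MOD)      # ascending: dimmer when far
img_b = reflect * (2.0 * tau / T_MOD)            # descending: brighter when far
```

The actual Axi-Vision processing is more involved (ambient light, non-ideal ramps and the shutter response must be calibrated out); this only illustrates why the ratio of the two frames encodes distance independently of surface reflectance.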

Fig. 20. Principle of acquiring depth information by using intensity-modulated illumination and an ultra-fast camera shutter based on an image intensifier (from Ref. [66]).

6. Conclusion

There is an increasing demand for obtaining 3-D information, and the trend is to extend existing techniques to deal with dynamic processes. As a non-contact shape measurement technique, Fourier transform profilometry has been successfully applied to dynamic processes in recent years, owing to its merits of requiring only one fringe pattern, full-field analysis and high precision. In this paper, we have mainly reviewed our research work based on this method over the past ten years. The basic principles of this


method and a variety of phase unwrapping techniques are also introduced. We also list a number of typical applications, such as the combination of the stroboscopic effect and FTP to test objects in rapid periodic motion, high-speed optical measurement of the vibrating drum membrane, and dynamic measurement of a rotating vortex, a breaking tile, a chewing face, a vibrating woofer and an impacted plate. Although this method has been well studied and applied in different fields, some challenges still need to be addressed in future work.

(1) Real-time 3-D shape measurement, which is the key to successfully implementing 3-D coordinate display and measurement, manufacturing control and on-line quality inspection, is more difficult for a dynamic process. At present, the data processing for a dynamic process is executed after the event, so there is still a long way to go to realize real-time dynamic 3-D shape measurement.

(2) Because the object is varying, the accuracy of dynamic 3-D shape measurement is usually lower than that of static 3-D shape measurement, and the different experimental setup of each measurement leads to a different accuracy. In the applications presented in this paper, the accuracy of this kind of method is on the order of tens of micrometers: in the vibrating drumhead measurement the standard deviation of the restored height is 0.075 mm, and in the rotating blade measurement the standard deviation of the heights is less than 0.022 mm. The primary task of future work is to improve the accuracy of dynamic 3-D shape measurement to meet industrial application requirements. The overall accuracy of a measurement system is determined by the system parameter d/l0 and the grating period p0; it can be improved by increasing d/l0 or decreasing p0, subject to the maximum variation range of the measured height.

(3) The environmental requirements of dynamic measurement are tougher than those of static 3-D shape measurement, for example, the influence of varying ambient light on the stroboscope and of unexpected motion of peripheral equipment.

(4) There is also a lack of research on overcoming the shading problem inherent in grating-projection-based techniques.

This dynamic 3-D shape measurement method has the advantage of high-speed, full-field data acquisition and analysis. With the development of high-resolution CCD cameras and high-frame-rate frame grabbers, the method reviewed here should be a promising one for studying high-speed motion, including rotation, vibration, explosion, expansion, contraction and even shock waves.
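The dependence of accuracy on d/l0 and p0 can be made explicit with the standard FTP phase-to-height relation for the crossed-optical-axes geometry (see Refs. [34,36]), where f0 = 1/p0 is the fundamental frequency of the projected grating and Δφ is the phase difference between the object and the reference plane:

```latex
% Phase-to-height mapping in Fourier transform profilometry
h(x,y) = \frac{l_0\,\Delta\phi(x,y)}{\Delta\phi(x,y) - 2\pi f_0 d}
\quad\Longrightarrow\quad
h(x,y) \approx -\frac{p_0\, l_0}{2\pi d}\,\Delta\phi(x,y)
\quad\text{for } |\Delta\phi| \ll 2\pi f_0 d,
```

so a given phase error translates into a height error proportional to p0 l0/d, which is why increasing d/l0 or decreasing p0 improves the accuracy.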

Acknowledgements

This project was supported by the National Natural Science Foundation of China (Nos. 60838002, 10876021, 60527001 and 60807006).

References

[1] Chen F, Brown GM, Song M. Overview of three-dimensional shape measurement using optical methods. Opt Eng 2000;39(1):10–22.
[2] Jähne B, Haußecker H, Geißler P. Handbook of computer vision and applications, volume 1: sensors and imaging. San Diego: Academic Press; 1999 (Chapters 17–21).
[3] http://www.mova.com/.
[4] Rae PJ, Goldrein HT, Bourne NK, et al. Measurement of dynamic large-strain deformation maps using an automated fine grid technique. Opt Laser Eng 1999;31:113–22.
[5] Sevenhuijsen PJ, Sirkis JS, Bremand F. Current trends in obtaining deformation data from grids. Exp Tech 1993;17(3):22–6.
[6] López CP, Santoyo FM, Rodríguez VR, Funes-Gallanzi M. Separation of vibration fringe data from rotating object fringes using pulsed ESPI. Opt Laser Eng 2002;38(2):145–52.
[7] Servin M, Davila A, Quiroga JA. Extended-range temporal electronic speckle pattern interferometry. Appl Opt 2002;41(22):4541–7.


[8] Fan H, Wang J, Tan YS. Simultaneous measurement of whole in-plane displacement using phase-shifting ESPI. Opt Laser Eng 1997;28:249–57.
[9] Petitgrand S, Yahiaoui R, Danaie K, Bosseboeuf A, Gilles JP. 3-D measurement of micromechanical devices vibration mode shapes with a stroboscopic interferometric microscope. Opt Laser Eng 2001;36(2):77–101.
[10] Ma CC, Huang CH. Experimental whole-field interferometry for transverse vibration of plates. J Sound Vib 2004;271:493–506.
[11] Chen LC, Huang YT, Fan KC. A dynamic 3-D surface profilometer with nanoscale measurement resolution and MHz bandwidth for MEMS characterization. IEEE/ASME Trans Mechatron 2007;12(3):299–307.
[12] Chen LC, Huang YT, Nguyen XL, Chen JL, Chang CC. Dynamic out-of-plane profilometry for nano-scale full-field characterization of MEMS using stroboscopic interferometry with novel signal deconvolution algorithm. Opt Laser Eng 2009;47:237–51.
[13] Guo T, Chang H, Chen JP, Fu X, Hu XT. Micro-motion analyzer used for dynamic MEMS characterization. Opt Laser Eng 2008.
[14] Brown GC, Pryputniewicz RJ. New test methodology for static and dynamic shape measurements of microelectromechanical systems. Opt Eng 2000;39:127–36.
[15] Osten W, Jüptner W, Seebacher S, Baumbach T. The qualification of optical measurement techniques for the investigation of materials parameters of microcomponents. Proc SPIE 1999;3825:152–64.
[16] Nakagawa H, Kurita Y, Ogawa K, et al. Experimental analysis of chatter vibration in end-milling using laser Doppler vibrometers. J Autom Technol 2008;2(6):431–8.
[17] Bertsch M. Vibration patterns and sound analysis of the Viennese timpani. Proc ISMA 2001;2:281–4.
[18] Tian J, Peng X. 3-D digital imaging based on shifted point-array encoding. Appl Opt 2005;44(26):5491–6.
[19] Watanabe Y, Komuro T, Ishikawa M. 955-fps real-time shape measurement of a moving/deforming object using high-speed vision for numerous-point analysis. IEEE Robot Autom 2007:3192–7.
[20] Cheng XX, Su XY, Guo LR.
Automatic measurement method for 360° profilometry of 3-D diffuse objects. Appl Opt 1991;30(10):1274–8.
[21] Li JL, Su XY, Zhou WS. The 3-D sensing using laser sheet projection: influence of speckle. Opt Rev 1995;2(2):144–9.
[22] Asundi A, Zhou WS. Unified calibration technique and its applications in optical triangular profilometry. Appl Opt 1999;38(16):3556–61.
[23] Asundi A, Zhou WS. Mapping algorithm for 360-deg profilometry with time delayed integration imaging. Opt Eng 1999;38(2):339–44.
[24] Takasaki H. Generation of surface contours by moiré pattern. Appl Opt 1970;9(4):942–7.
[25] Yoshizawa T. The recent trend of moiré metrology. J Robot Mechatron 1991;3(3):80–5.
[26] Wang B, Luo X, Pfeifer T, Mischo H. Moiré deflectometry based on Fourier-transform analysis. Measurement 1999;25(4):249–53.
[27] Sajan MR, Tay CJ, Shang HM, Asundi A. TDI imaging and scanning moiré for online defect detection. Opt Laser Technol 1997;29(6):327–31.
[28] Srinivasan V, Liu HC, Halioua M. Automated phase-measuring profilometry of 3-D diffuse objects. Appl Opt 1984;23(18):3105–8.
[29] Halioua M, Liu HC. Optical three-dimensional sensing by phase measurement profilometry. Opt Laser Eng 1989;11:115–85.
[30] Su XY, Zhou WS, von Bally G, Vukicevic D. Automated phase-measuring profilometry using defocused projection of a Ronchi grating. Opt Commun 1992;94(6):561–73.
[31] Li J, Su HJ, Su XY. Two-frequency grating used in phase-measuring profilometry. Appl Opt 1997;36(1):277–80.
[32] Zhang H, Lalor MJ, Burton DR. Spatiotemporal phase unwrapping for the measurement of discontinuous objects in dynamic fringe-projection phase-shifting profilometry. Appl Opt 1999;38(16):3534–41.
[33] Takeda M, Ina H, Kobayashi S. Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry. J Opt Soc Am 1982;72(1):156–60.
[34] Takeda M, Mutoh K. Fourier transform profilometry for the automatic measurement of 3-D object shapes. Appl Opt 1983;22(24):3977–82.
[35] Li J, Su XY, Guo LR. An improved Fourier transform profilometry for automatic measurement of 3-D object shapes. Opt Eng 1990;29(12):1439–44.
[36] Su XY, Chen WJ. Fourier transform profilometry: a review. Opt Laser Eng 2001;35(5):263–84.
[37] Su XY, Chen WJ, Zhang QC, Chao YP. Dynamic 3-D shape measurement method based on FTP. Opt Laser Eng 2001;36:49–64.
[38] Su XY, Su LK. New 3D profilometry based on modulation measurement. Proc SPIE 1998;3558:1–7.
[39] Su XY, Su LK, Li WS. A new Fourier transform profilometry based on modulation measurement. Proc SPIE 1999;3749:438–9.
[40] Su LK, Su XY, Li WS. Application of modulation measurement profilometry to objects with surface holes. Appl Opt 1999;38(7):1153–8.
[41] Toyooka S, Tominaga M. Spatial fringe scanning for optical phase measurement. Opt Commun 1984;51(2):68–70.
[42] Sajan MR, Tay CJ, Shang HM, et al. TDI imaging—a tool for profilometry and automated visual inspection. Opt Lasers Eng 1998;29(6):403–11.
[43] Sajan MR, Tay CJ, Shang HM, Asundi A. Improved spatial phase detection for profilometry using a TDI imager. Opt Commun 1998;150(1–6):66–70.
[44] Häusler G, Ritter D. Parallel three-dimensional sensing by color-coded triangulation. Appl Opt 1993;32(35):7164–9.


[45] Zhang L, Curless B, Seitz SM. Rapid shape acquisition using color structured light and multi-pass dynamic programming. IEEE Int Symp 3D Data Process Visualization Transm 2002:24–36.
[46] Geng ZJ. Rainbow 3-D camera: new concept of high-speed three-dimensional vision system. Opt Eng 1996;35:376–83.
[47] Xu ZQ, Ye SH, Fan GZ. Color 3D reverse engineering. J Mater Process Technol 2002;129(1):495–9.
[48] Kawasaki H, Furukawa R, Sagawa R, Yagi Y. Dynamic scene shape reconstruction using a single structured light pattern. IEEE Comput Vision Pattern Recogn (CVPR) 2008:1–8.
[49] Pagès J, Salvi J, Matabosch C. Implementation of a robust coded structured light technique for dynamic 3D measurements. IEEE Image Process 2003;3(2):1073–6.
[50] Besl PJ, et al. Active optical image sensors. In: Sanz JLC, editor. Advances in machine vision. Berlin: Springer; 1989.
[51] Zhang L, Curless B, Seitz S. Spacetime stereo: shape recovery for dynamic scenes. IEEE Comput Vision Pattern Recogn 2003:367–74.
[52] Davis J, Ramamoorthi R, Rusinkiewicz S. Spacetime stereo: a unifying framework for depth from triangulation. IEEE Trans Pattern Anal Mach Intell 2005;27(2):1–7.
[53] Rodriguez-Vera R, Servin M. Phase locked loop profilometry. Opt Laser Technol 1994;26(6):393–8.
[54] Li NB, Peng X, Xi JT, Chicharo JF, Yao JQ, Zhang DW. Multi-frequency and multiple phase shift sinusoidal fringe projection for 3D interferometry. Opt Exp 2005;13(5):1561–9.
[55] Zhang Z, Zhang D, Peng X. Performance analysis of 3-D full field sensor based on fringe projection. Opt Laser Eng 2004;42:341–53.
[56] Quan C, Tay CJ, Huang HY. 3-D deformation measurement using fringe projection and digital image correlation. Optik 2004;115(4):164–8.
[57] Cheng P, Hu J, Zhang G, Hou L, Xu B, Wu X. Deformation measurements of dragonfly's wings in free flight by using windowed Fourier transform. Opt Laser Eng 2008;46:157–61.
[58] Huang PS, Zhang C, Chiang FP. High-speed 3-D shape measurement based on digital fringe projection.
Opt Eng 2003;42(1):163–8.
[59] Zhang S, Huang PS. High-resolution, real-time three-dimensional shape measurement. Opt Eng 2006;45(12):123601–18.
[60] Zhang GJ, Ye SH. Online measurement of the size of standard wire sieves using an optical Fourier transform. Opt Eng 2000;39(4):1098–102.
[61] Li Y, Nemes JA, Derdouri A. Optical 3-D dynamic measurement system and its application to polymer membrane inflation tests. Opt Laser Eng 2000;33:261–76.
[62] Kimura S, Kano H, Kanade T, Yoshida A, Kawamura E, Oda K. CMU video-rate stereo machine. Mobile Mapping Symp Photogrammetry Remote Sensing 1995:9–18 (also available from http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.33.3692).
[63] Sato K, Inokuchi S. Range-imaging system utilizing nematic liquid crystal mask. ICCV 1987:657–61.
[64] Jarvis RA. A laser time-of-flight range scanner for robotic vision. IEEE Trans Pattern Anal Mach Intell 1983;PAMI-5:505–12.
[65] Kawakita M, Iizuka K, Aida T, Kikuchi H, Fujikake H, Yonai J, et al. Axi-Vision Camera (real-time depth-mapping camera). Appl Opt 2000;39:3931–9.
[66] Kawakita M, Iizuka K, Aida T, Kikuchi H, Fujikake H, Yonai J, et al. High-definition real-time depth-mapping TV camera: HDTV Axi-Vision Camera. Opt Exp 2004;12(12):2781–94.

[67] Li WS, Su XY, Liu ZB. Large-scale three-dimensional object measurement: a practical coordinate mapping and imaging data-patching method. Appl Opt 2001;40(20):3326–33.
[68] Abdul-Rahman HS, Gdeisat MA, Burton DR, Lalor MJ, Lilley F, Abid A. Three-dimensional Fourier fringe analysis. Opt Laser Eng 2008;46:446–55.
[69] Su XY, Chen WJ. Reliability-guided phase unwrapping algorithm: a review. Opt Laser Eng 2004;42(3):245–61.
[70] Judge TR, Bryanston-Cross PJ. A review of phase unwrapping techniques in fringe analysis. Opt Laser Eng 1994;21:199–239.
[71] Su XY. Phase unwrapping techniques for 3-D shape measurement. Proc SPIE 1996;2866:460–5.
[72] Li JL, Su XY, Li JT. Phase unwrapping algorithm based on reliability and edge detection. Opt Eng 1997;36(6):1685–90.
[73] Asundi AK, Zhou WS. Fast phase-unwrapping algorithm based on a gray-scale mask and flood fill. Appl Opt 1999;38(16):3556–61.
[74] Xiao YS, Su XY, Zhang QC, Li ZR. 3-D profilometry for the impact process with marked fringes tracking. Opto-Electron Eng 2007;34(8):46–52 (in Chinese).
[75] Su WH. Projected fringe profilometry using the area-encoded algorithm for spatially isolated and dynamic objects. Opt Exp 2008;16(4):2590–6.
[76] Guo H, Huang PS. 3-D shape measurement by use of a modified Fourier transform method. Proc SPIE 2008;7066:70660E-1–8.
[77] Guo H, Huang PS. Absolute phase retrieval for 3D shape measurement by the Fourier transform method. Proc SPIE 2007;6762:676110–204.
[78] Zhang QC, Su XY, Cao YP, Li Y, Xiang LQ, Chen WJ. An optical 3-D shape and deformation measurement for rotating blades using stroboscopic structured illumination. Opt Eng 2005;44(11):113601–17.
[79] Su XY, Zhang QC, Xiang LQ, Cao YP, Chen WJ. A stroboscopic structured illumination system used in dynamic 3D visualization of high-speed motion objects. Proc SPIE 2005;5852:796–9.
[80] Edgerton HE. Stroboscope. http://web.mit.edu/museum/exhibits/.
[81] Asundi AK, Sajan MR.
Low-cost digital polariscope for dynamic photoelasticity. Opt Eng 1994;33(9):3052–5.
[82] Asundi AK, Sajan MR. Digital drum camera for dynamic recording. Opt Eng 1996;35(6):1707–13.
[83] Asundi AK, Sajan MR. Multiple LED camera for dynamic photoelasticity. Appl Opt 1995;34(13):2236–40.
[84] Zhang QC, Su XY. High-speed optical measurement for the drumhead vibration. Opt Exp 2005;13(8):3310–6.
[85] Su XY, Zhang QC, Li J, Li ZR. Optical 3D shape measurement for vibrating drumhead. Proc SPIE 2006;6027:60271.
[86] Zhang QC, Su XY. An optical measurement of vortex shape at a free surface. Opt Laser Technol 2002;34(2):107–13.
[87] Zhang QC, Su XY. Dynamic liquid surface measurement. Acta Optica Sinica 2001;21:1506–8 (in Chinese).
[88] Zhang QC, Su XY, Chen WJ, Cao YP, Xiang LQ. An optical real-time 3-D measurement for facial shape and movement. Proc SPIE 2003;5254:214–21.
[89] Zhang QC, Su XY, Chen WJ, et al. An optical real-time 3-D measurement for facial shape and movement in chewing. J Optoelectron Laser 2004;15(2):194–9 (in Chinese).
[90] Zhang QC, Su XY, Xiang LQ. Whole-field vibration analysis of a woofer's cone using a high-speed camera. Proc SPIE 2007;6279:627900-1–5.
[91] Zhang QC, Su XY. Three-dimensional shape visualization of object in impact. Proc SPIE 2006;6027II:60273E.