Title:
SYNCHRONOUS MODULATE, GATE AND INTEGRATE 3D SENSOR
Document Type and Number:
WIPO Patent Application WO/2024/081846
Kind Code:
A1
Abstract:
A confocal three-dimensional sensor (22) for measuring height of a point on an object (10) is provided. The sensor (22) includes a light source (2) and a light source modulator (16) configured to temporally modulate the light source intensity. A source pinhole aperture (4) is positioned to be illuminated by the light source (2) and a focus-tunable lens (8) is configured to focus illumination passing through the source pinhole aperture (4) onto the object (10). A detector pinhole aperture (12) is configured to receive reflected light from the object (10), wherein the focus-tunable lens (8) is configured to image the reflected light from the object (10) onto the detector pinhole aperture (12). A detector (14) and integrator (15) are configured to output a measurement indicative of total light transmitted through the detector pinhole aperture (12). A processor (20) is operably coupled to the detector (14), integrator (15), and light source modulator (16). The processor (20) is configured to synchronously cause the light source modulator (16) to modulate light source intensity while causing the focus tunable lens (8) to sweep axial focal position, the processor (20) being further configured to calculate a height of the point on the object (10) based on the output from the detector (14) and integrator (15).

Inventors:
HAUGAN CARL E (US)
SKUNES TIMOTHY A (US)
Application Number:
PCT/US2023/076784
Publication Date:
April 18, 2024
Filing Date:
October 13, 2023
Assignee:
CYBEROPTICS CORP (US)
International Classes:
G01B11/06; G01B11/24; G02B21/24; G02B21/36; G02B26/02
Domestic Patent References:
WO2013105922A2 (2013-07-18)
Foreign References:
JP2019200327A (2019-11-21)
US20180164562A1 (2018-06-14)
US20110310395A1 (2011-12-22)
US20070263226A1 (2007-11-15)
EP3985423A1 (2022-04-20)
Attorney, Agent or Firm:
CHRISTENSON, Christopher R. (US)
Claims:
What is claimed is:

1. A single point confocal sensor for measuring height of a point on an object, the sensor comprising: a light source; a light source modulator configured to temporally modulate the light source intensity; a source pinhole aperture positioned to be illuminated by the light source; a focus-tunable lens configured to focus illumination passing through the source pinhole aperture onto the object; a detector pinhole aperture configured to receive reflected light from the object, wherein the focus-tunable lens is configured to image the reflected light from the object onto the detector pinhole aperture; a detector and an integrator configured to output a measurement indicative of total light transmitted through the detector pinhole aperture; a processor operably coupled to the detector, integrator, and light source modulator, wherein the processor is configured to synchronously cause the light source modulator to modulate light source intensity while causing the focus-tunable lens to sweep axial focal position.

2. The single point confocal sensor of claim 1, wherein the processor is further configured to calculate a height of the point on the object based on the output from the detector and integrator.

3. The single point confocal sensor of claim 2: wherein during a first integration period, the processor is configured to control the light source modulator to modulate the light source at a first phase and first frequency while synchronously causing the focus-tunable lens to sweep axial focal position through a focus range, the processor being configured to read and store the integrator output for the first integration period and then reset the integrator; wherein during a second integration period, the processor is configured to control the light source modulator to modulate the light source at a second phase and first frequency while synchronously causing the focus-tunable lens to sweep axial focal position through the focus range, the processor being configured to read and store the integrator output for the second integration period and then reset the integrator; wherein during a third integration period, the processor is configured to control the light source modulator to modulate the light source at a third phase and first frequency while synchronously causing the focus-tunable lens to sweep axial focal position through the focus range, the processor being configured to read and store the integrator output for the third integration period; and wherein the processor is configured to compute a phase of the light source corresponding to focus position based on the first, second, and third integrator outputs and to convert the phase of light source to a height of the point on the object.

4. The single point confocal sensor of claim 1, wherein the light source is a light emitting diode.

5. The single point confocal sensor of claim 1, wherein the light source is a laser.

6. The single point confocal sensor of claim 1, wherein the light source is an incandescent source.

7. The single point confocal sensor of claim 1, wherein the focus-tunable lens is mechanically scanned by a voice coil.

8. The single point confocal sensor of claim 1, wherein the focus-tunable lens is mechanically scanned by a linear stage.

9. The single point confocal sensor of claim 1, wherein the focus-tunable lens is a liquid lens.

10. The single point confocal sensor of claim 9, wherein focus of the liquid focus-tunable lens is adjusted by electrostatically changing curvature of a liquid lens surface.

11. The single point confocal sensor of claim 1, wherein focus of the focus-tunable lens is adjusted by using sound waves to change the refractive index of the focus-tunable lens.

12. The single point confocal sensor of claim 1, wherein the temporal modulation is based on Gray codes.

13. The single point confocal sensor of claim 1, wherein the temporal modulation is based on ramps.

14. The single point confocal sensor of claim 1, wherein the temporal modulation is based on Hamiltonian codes.

15. The single point confocal sensor of claim 1, wherein the temporal light source modulation is a periodic function.

16. The single point confocal sensor of claim 15, wherein the periodic function is a sinusoidal function.

17. The single point confocal sensor of claim 15, wherein the periodic function has a period that is equal to an amount of time necessary for the focus-tunable lens to sweep axial focal position through the entire focus range.

18. A confocal three-dimensional sensor for measuring height of an array of points on an object, the sensor comprising: a light source; a light source modulator configured to temporally modulate light source intensity; a pinhole aperture array configured to be illuminated by the light source and receive reflected light from the object; a focus-tunable lens configured to focus illumination passing through the pinhole aperture array onto the object and image reflected light from the object onto the pinhole aperture array; a focus modulator operably coupled to the focus-tunable lens, the focus modulator being configured to sweep axial focal position; an imaging system to image the pinhole aperture array onto a camera detector; a camera detector configured to receive reflected light from the object imaged onto the pinhole aperture array by the focus-tunable lens and to provide an output measurement indicative of total transmitted light through the pinhole aperture array for each point in the array of points; a processor operably coupled to the camera detector, the processor being configured to synchronously cause the light source modulator to modulate light source intensity while causing the focus modulator to sweep axial focal position.

19. The confocal three-dimensional sensor of claim 18, wherein the processor is further configured to calculate a height of each point in the array of points based on the output from the camera detector.

20. The confocal three-dimensional sensor of claim 19, wherein the processor is configured: during a first integration period, to cause the light source modulator to modulate the light source at a first phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through a focus range, the processor being configured to read and store a video frame from the camera detector relative to the first integration period; during a second integration period, to cause the light source modulator to modulate the light source at a second phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store a video frame from the camera detector relative to the second integration period; during a third integration period, to cause the light source modulator to modulate the light source at a third phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store a video frame from the camera detector relative to the third integration period; and to compute phase of the light source corresponding to focus position for video frame pixels based on the first, second, and third stored video frames, and then convert the phase of the pixels to a height value.

21. The confocal three-dimensional sensor of claim 18, wherein the pinhole aperture array is a moving pinhole aperture array.

22. The confocal three-dimensional sensor of claim 21, wherein the moving pinhole aperture array is a spinning Nipkow disk.

23. The confocal three-dimensional sensor of claim 21, wherein the moving pinhole aperture is a reciprocally-translating array.

24. The confocal three-dimensional sensor of claim 18, wherein the light source is a light emitting diode.

25. The confocal three-dimensional sensor of claim 18, wherein the light source is a laser.

26. The confocal three-dimensional sensor of claim 18, wherein the light source is an incandescent source.

27. The confocal three-dimensional sensor of claim 18, wherein the focus-tunable lens is mechanically scanned by a voice coil.

28. The confocal three-dimensional sensor of claim 18, wherein the focus-tunable lens is mechanically scanned by a linear stage.

29. The confocal three-dimensional sensor of claim 18, wherein the focus-tunable lens is a liquid lens.

30. The confocal three-dimensional sensor of claim 29, wherein focus of the liquid focus-tunable lens is adjusted by electrostatically changing curvature of a liquid lens surface.

31. The confocal three-dimensional sensor of claim 18, wherein focus of the focus-tunable lens is adjusted by using sound waves to change the refractive index of the focus-tunable lens.

32. The confocal three-dimensional sensor of claim 18, wherein the temporal modulation is based on Gray codes.

33. The confocal three-dimensional sensor of claim 18, wherein the temporal modulation is based on ramps.

34. The confocal three-dimensional sensor of claim 18, wherein the temporal modulation is based on Hamiltonian codes.

35. The confocal three-dimensional sensor of claim 18, wherein the temporal light source modulation is a periodic function.

36. The confocal three-dimensional sensor of claim 35, wherein the periodic function is a sinusoidal function.

37. The confocal three-dimensional sensor of claim 35, wherein the periodic function has a period that is equal to an amount of time necessary for the focus-tunable lens to sweep axial focal position through the focus range.

38. The confocal three-dimensional sensor of claim 18, wherein the camera detector is a complementary metal-oxide semiconductor (CMOS) area array having a two-dimensional array of pixels.

39. The confocal three-dimensional sensor of claim 18, wherein the camera detector is a charge coupled device (CCD) area array having a two-dimensional array of pixels.

40. A confocal three-dimensional sensor for measuring height of an array of points on an object, the sensor comprising: a light source; a light source modulator configured to temporally modulate light source intensity; a pinhole aperture array configured to be illuminated by the light source and receive reflected light from the object; a focus modulator configured to sweep axial focal position; an imaging system to image the pinhole aperture array onto a camera detector; an interference objective configured to focus illumination passing through the pinhole aperture array onto the object and to provide a coherence interference signal onto the pinhole aperture array during a focus sweep; a camera detector configured to receive the coherence interference signal transmitted through the pinhole aperture array and to provide an output measurement indicative of total transmitted light through the pinhole aperture array for each point in the array of points; a processor operably coupled to the camera detector, the processor being configured to synchronously cause the light source modulator to modulate light source intensity while causing the focus modulator to sweep axial focal position.

41. The confocal three-dimensional sensor of claim 40, wherein the processor is further configured to calculate a height of each point in the array of points based on output from the camera detector.

42. The confocal three-dimensional sensor of claim 41, wherein the processor is configured: during a first integration period, to cause the light source modulator to modulate the light source at a first phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through a focus range, the processor being configured to read and store a video frame from the camera detector relative to the first integration period; during a second integration period, to cause the light source modulator to modulate the light source at a second phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store a video frame from the camera detector relative to the second integration period; during a third integration period, to cause the light source modulator to modulate the light source at a third phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store a video frame from the camera detector relative to the third integration period; and to compute phase of the light source corresponding to focus position for video frame pixels based on the first, second, and third stored video frames, and then convert the phase of the pixels to a height value.

43. The confocal three-dimensional sensor of claim 40, wherein the pinhole aperture array is a moving pinhole aperture array.

44. The confocal three-dimensional sensor of claim 43, wherein the moving pinhole aperture array is a spinning Nipkow disk.

45. The confocal three-dimensional sensor of claim 43, wherein the moving pinhole aperture is a reciprocally-translating array.

46. The confocal three-dimensional sensor of claim 40, wherein the light source is a light emitting diode.

47. The confocal three-dimensional sensor of claim 40, wherein the light source is a laser.

48. The confocal three-dimensional sensor of claim 40, wherein the light source is an incandescent source.

49. The confocal three-dimensional sensor of claim 40, wherein the temporal modulation is based on Gray codes.

50. The confocal three-dimensional sensor of claim 40, wherein the temporal modulation is based on ramps.

51. The confocal three-dimensional sensor of claim 40, wherein the temporal modulation is based on Hamiltonian codes.

52. The confocal three-dimensional sensor of claim 40, wherein the temporal light source modulation is a periodic function.

53. The confocal three-dimensional sensor of claim 52, wherein the periodic function is a sinusoidal function.

54. The confocal three-dimensional sensor of claim 52, wherein the periodic function has a period that is equal to an amount of time necessary for the focus-tunable lens to sweep axial focal position through the focus range.

55. The confocal three-dimensional sensor of claim 40, wherein the camera detector is a complementary metal-oxide semiconductor (CMOS) area array having a two-dimensional array of pixels.

56. The confocal three-dimensional sensor of claim 40, wherein the camera detector is a charge coupled device (CCD) area array having a two-dimensional array of pixels.

58. The confocal three-dimensional sensor of claim 40, wherein the interference objective is a focus tunable interference objective.

59. A confocal three-dimensional sensor for measuring height of an array of points on an object, the sensor comprising: a light source; a light source modulator configured to temporally modulate light source intensity; a spatial light modulator configured to be illuminated by the light source and receive reflected light from the object; a focus-tunable lens configured to focus illumination gated by the spatial light modulator onto the object and image reflected light from the object onto the spatial light modulator; a focus modulator operably coupled to the focus-tunable lens, the focus modulator being configured to sweep axial focal position; an imaging system to image the spatial light modulator onto a camera detector; a camera detector configured to receive reflected light from the object imaged onto the spatial light modulator by the focus-tunable lens and to provide an output measurement indicative of total light gated by the spatial light modulator for each point in the array of points; a processor operably coupled to the camera detector, the processor being configured to synchronously cause the light source modulator to modulate light source intensity while causing the focus modulator to sweep axial focal position.

60. The confocal three-dimensional sensor of claim 59, wherein the processor is further configured to calculate a height of each point in the array of points based on the output from the camera detector.

61. The confocal three-dimensional sensor of claim 60, wherein the processor is configured: during a first integration period, to cause the light source modulator to modulate the light source at a first phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through a focus range, the processor being configured to read and store a video frame from the camera detector relative to the first integration period; during a second integration period, to cause the light source modulator to modulate the light source at a second phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store a video frame from the camera detector relative to the second integration period; during a third integration period, to cause the light source modulator to modulate the light source at a third phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store a video frame from the camera detector relative to the third integration period; and to compute phase of the light source corresponding to focus position for video frame pixels based on the first, second, and third stored video frames, and then convert the phase of the pixels to a height value.

62. The confocal three-dimensional sensor of claim 59, wherein the light source is a light emitting diode.

63. The confocal three-dimensional sensor of claim 59, wherein the light source is a laser.

64. The confocal three-dimensional sensor of claim 59, wherein the light source is an incandescent source.

65. The confocal three-dimensional sensor of claim 59, wherein the focus-tunable lens is mechanically scanned by a voice coil.

66. The confocal three-dimensional sensor of claim 59, wherein the focus-tunable lens is mechanically scanned by a linear stage.

67. The confocal three-dimensional sensor of claim 59, wherein the focus-tunable lens is a liquid lens.

68. The confocal three-dimensional sensor of claim 67, wherein focus of the liquid focus-tunable lens is adjusted by electrostatically changing curvature of a liquid lens surface.

69. The confocal three-dimensional sensor of claim 59, wherein focus of the focus-tunable lens is adjusted by using sound waves to change the refractive index of the focus-tunable lens.

70. The confocal three-dimensional sensor of claim 59, wherein the temporal modulation is based on Gray codes.

71. The confocal three-dimensional sensor of claim 59, wherein the temporal modulation is based on ramps.

72. The confocal three-dimensional sensor of claim 59, wherein the temporal modulation is based on Hamiltonian codes.

73. The confocal three-dimensional sensor of claim 59, wherein the temporal light source modulation is a periodic function.

74. The confocal three-dimensional sensor of claim 73, wherein the periodic function is a sinusoidal function.

75. The confocal three-dimensional sensor of claim 73, wherein the periodic function has a period that is equal to an amount of time necessary for the focus-tunable lens to sweep axial focal position through the focus range.

76. The confocal three-dimensional sensor of claim 59, wherein the camera detector is a complementary metal-oxide semiconductor (CMOS) area array having a two-dimensional array of pixels.

77. The confocal three-dimensional sensor of claim 59, wherein the camera detector is a charge coupled device (CCD) area array having a two-dimensional array of pixels.

78. The confocal three-dimensional sensor of claim 59, wherein the spatial light modulator is a pixelated spatial light modulator.

79. The confocal three-dimensional sensor of claim 59, wherein the spatial light modulator is a digital mirror device (DMD).

80. The confocal three-dimensional sensor of claim 59, wherein the spatial light modulator is a liquid crystal on silicon (LCOS) device.

81. A single point confocal sensor for measuring height of a point on an object, the sensor comprising: a light source; a light source modulator configured to temporally modulate the light source intensity; a lens configured to focus illumination passing therethrough onto the object; a source pinhole aperture positioned to be illuminated by the light source; a focus modulator configured to generate relative motion between the object and the single point confocal three-dimensional sensor; a detector pinhole aperture configured to receive reflected light from the object; a detector and integrator configured to output a measurement indicative of total light transmitted through the detector pinhole aperture; a processor operably coupled to the detector, light source modulator, and focus modulator, wherein the processor is configured to synchronously cause the light source modulator to modulate light source intensity while causing the focus modulator to sweep axial focal position.

82. The single point confocal sensor of claim 81, wherein the processor is further configured to calculate a height of the point on the object based on an output from the detector and integrator.

83. The single point confocal sensor of claim 82: wherein during a first integration period, the processor is configured to control the light source modulator to modulate the light source at a first phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through a focus range, the processor being configured to read and store the detector output for the first integration period and then reset the detector; wherein during a second integration period, the processor is configured to control the light source modulator to modulate the light source at a second phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store the detector output for the second integration period and then reset the detector; wherein during a third integration period, the processor is configured to control the light source modulator to modulate the light source at a third phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store the detector output for the third integration period; and wherein the processor is configured to compute a phase of the light source corresponding to focus position based on the first, second, and third detector outputs and to convert the phase of light source to a height of the point on the object.

84. The single point confocal sensor of claim 81, wherein the light source is a light emitting diode.

85. The single point confocal sensor of claim 81, wherein the light source is a laser.

86. The single point confocal sensor of claim 81, wherein the light source is an incandescent source.

87. The single point confocal sensor of claim 81, wherein the focus modulator is operably coupled to a moving stage assembly.

88. The single point confocal sensor of claim 81, wherein the temporal modulation is based on Gray codes.

89. The single point confocal sensor of claim 81, wherein the temporal modulation is based on ramps.

90. The single point confocal sensor of claim 81, wherein the temporal modulation is based on Hamiltonian codes.

91. The single point confocal sensor of claim 81, wherein the temporal light source modulation is a periodic function.

92. The single point confocal sensor of claim 91, wherein the periodic function is a sinusoidal function.

93. The single point confocal sensor of claim 91, wherein the periodic function has a period that is equal to an amount of time necessary for the focus-tunable lens to sweep axial focal position through the entire focus range.

94. A confocal three-dimensional sensor for measuring height of an array of points on an object, the sensor comprising: a light source; a light source modulator configured to temporally modulate light source intensity; a pinhole aperture array configured to be illuminated by the light source and receive reflected light from the object; a lens configured to focus illumination passing through the pinhole aperture array onto the object; a focus modulator configured to generate relative motion between the object and confocal three-dimensional sensor; a camera detector configured to receive reflected light from the object imaged onto the pinhole aperture array by the lens and to provide an output for each point in the array of points; a processor operably coupled to the light source modulator, the focus modulator, and the camera detector, the processor being configured to synchronously cause the light source modulator to modulate light source intensity while causing the focus modulator to sweep axial focal position.

95. The confocal three-dimensional sensor of claim 94, wherein the processor is further configured to calculate a height of each point in the array of points based on the output from the camera detector.

96. The confocal three-dimensional sensor of claim 95, wherein the processor is configured: during a first integration period, to cause the light source modulator to modulate the light source at a first phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through a focus range, the processor being configured to read and store a video frame from the camera detector relative to the first integration period; during a second integration period, to cause the light source modulator to modulate the light source at a second phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store a video frame from the camera detector relative to the second integration period; during a third integration period, to cause the light source modulator to modulate the light source at a third phase and first frequency while synchronously causing the focus modulator to sweep axial focal position through the focus range, the processor being configured to read and store a video frame from the camera detector relative to the third integration period; and to compute phase of the light source corresponding to focus position for video frame pixels based on the first, second, and third stored video frames, and then convert the phase of the pixels to a height value.

97. The confocal three-dimensional sensor of claim 94, wherein the light source is a light emitting diode.

98. The confocal three-dimensional sensor of claim 94, wherein the light source is a laser.

99. The confocal three-dimensional sensor of claim 94, wherein the light source is an incandescent source.

100. The confocal three-dimensional sensor of claim 94, wherein the temporal modulation is based on Gray codes.

101. The confocal three-dimensional sensor of claim 94, wherein the temporal modulation is based on ramps.

102. The confocal three-dimensional sensor of claim 94, wherein the temporal modulation is based on Hamiltonian codes.

103. The confocal three-dimensional sensor of claim 94, wherein the temporal light source modulation is a periodic function.

104. The confocal three-dimensional sensor of claim 103, wherein the periodic function is a sinusoidal function.

105. The confocal three-dimensional sensor of claim 103, wherein the periodic function has a period that is equal to an amount of time necessary for the focus-tunable lens to sweep axial focal position through the focus range.

106. The confocal three-dimensional sensor of claim 94, wherein the camera detector is a complementary metal-oxide semiconductor (CMOS) area array having a two-dimensional array of pixels.

107. The confocal three-dimensional sensor of claim 94, wherein the camera detector is a charge coupled device (CCD) area array having a two-dimensional array of pixels.

108. A confocal three-dimensional sensor for measuring height of an array of points on an object, the sensor comprising: a light source; a light source modulator configured to temporally modulate light source intensity; a spatial light modulator configured to be illuminated by the light source and receive reflected light from the object; a focus modulator configured to sweep axial focal position; an imaging system to image the spatial light modulator onto a camera detector; an interference objective configured to focus illumination gated by the spatial light modulator onto the object and to provide a coherence interference signal onto the spatial light modulator during a focus sweep; a camera detector configured to receive the coherence interference signal gated by the spatial light modulator and to provide an output measurement indicative of total reflected light by the spatial light modulator for each point in the array of points; a processor operably coupled to the camera detector, the processor being configured to synchronously cause the light source modulator to modulate light source intensity while causing the focus modulator to sweep axial focal position.

109. The confocal three-dimensional sensor of claim 108, wherein the processor is further configured to calculate a height of each point in the array of points based on the output from the camera detector.

110. The confocal three-dimensional sensor of claim 108, wherein the light source is a light emitting diode.

111. The confocal three-dimensional sensor of claim 108, wherein the light source is a laser.

112. The confocal three-dimensional sensor of claim 108, wherein the temporal modulation is based on Gray codes.

113. The confocal three-dimensional sensor of claim 108, wherein the temporal modulation is based on ramps.

114. The confocal three-dimensional sensor of claim 108, wherein the temporal modulation is based on Hamiltonian codes.

115. The confocal three-dimensional sensor of claim 108, wherein the temporal light source modulation is a periodic function.

116. The confocal three-dimensional sensor of claim 108, wherein the periodic function is a sinusoidal function.

117. A single point confocal sensor for measuring height of a point on an object, the sensor comprising: a light source; a source pinhole aperture positioned to be illuminated by the light source; a focus-tunable lens configured to focus illumination passing through the source pinhole aperture onto the object; a detector pinhole aperture configured to receive reflected light from the object, wherein the focus-tunable lens is configured to image the reflected light from the object onto the detector pinhole aperture; a detector and an integrator configured to output a measurement indicative of total light transmitted through the detector pinhole aperture; a source modulator configured to temporally modulate at least one of the light source, the light reflected by the object, integrator gain, and detector gain; a processor operably coupled to the detector, integrator, and source modulator, wherein the processor is configured to synchronously cause the source modulator to temporally modulate while causing the focus-tunable lens to sweep axial focal position.

118. A confocal three-dimensional sensor for measuring height of an array of points on an object, the sensor comprising: a light source; a pinhole aperture array configured to be illuminated by the light source and receive reflected light from the object; a focus-tunable lens configured to focus illumination passing through the pinhole aperture array onto the object and image reflected light from the object onto the pinhole aperture array; a focus modulator operably coupled to the focus-tunable lens, the focus modulator being configured to sweep axial focal position; an imaging system to image the pinhole aperture array onto a camera detector; a camera detector configured to receive reflected light from the object imaged onto the pinhole aperture array by the focus-tunable lens and to provide an output measurement indicative of total transmitted light through the pinhole aperture array for each point in the array of points; a source modulator configured to temporally modulate at least one of the light source, light reflected from the object, integrator gain, and detector gain; and a processor operably coupled to the camera detector, the processor being configured to synchronously cause the source modulator to temporally modulate while causing the focus modulator to sweep axial focal position.

119. A confocal three-dimensional sensor for measuring height of an array of points on an object, the sensor comprising: a light source; a pinhole aperture array configured to be illuminated by the light source and receive reflected light from the object; a focus modulator configured to sweep axial focal position; an imaging system to image the pinhole aperture array onto a camera detector; an interference objective configured to focus illumination passing through the pinhole aperture array onto the object and to provide a coherence interference signal onto the pinhole aperture array during a focus sweep; a camera detector configured to receive the coherence interference signal transmitted through the pinhole aperture array and to provide an output measurement indicative of total transmitted light through the pinhole aperture array for each point in the array of points; a source modulator configured to temporally modulate at least one of the light source, light reflected from the object, integrator gain, and detector gain; and a processor operably coupled to the camera detector, the processor being configured to synchronously cause the source modulator to temporally modulate while causing the focus modulator to sweep axial focal position.

Description:
SYNCHRONOUS MODULATE, GATE AND INTEGRATE 3D SENSOR

BACKGROUND

[0001] One known technology for 3D optical sensing is triangulation-based phase measurement profilometry. In triangulation-based phase measurement profilometry, spatially modulated patterns of light are projected onto an object and then viewed by an imaging system from a different direction than the projection system. The three-dimensional topography of the object under inspection distorts the projected patterns as viewed by the imaging system, and the 3D topography may be calculated by measuring the distortions. Triangulation-based phase measurement profilometry is well suited for high-speed industrial applications since the number of projected patterns, and hence the number of imaging system video frames required, is low. Typically, three to twelve patterns and video frames are required for good performance. Triangulation-based systems are not suitable for applications much below lateral resolutions of 2 µm, however, since the required numerical apertures at these resolutions force the size of the projector and imaging systems to expand to the point where they physically interfere with each other. In addition, as the numerical aperture of the optics increases, the depth of field of the system decreases, limiting such systems to very small height ranges.

[0002] Confocal 3D optical sensing systems are applicable for applications requiring high numerical apertures and lateral resolutions finer than 2 µm since, by definition, the same optical system that illuminates the object also serves to collect the light reflected from the object under inspection. There are many confocal 3D optical sensing technologies, including White Light Interferometry (WLI), conventional confocal microscopy, Structured Illumination Microscopy (SIM), and chromatic confocal sensing. All of these technologies can deliver high accuracy and large depth of field but suffer from relatively slow speeds, which makes them unsuitable for many industrial applications.

[0003] White Light Interferometry (WLI) axially scans either the object or the reference mirror, and peak interference at each image pixel is observed when the optical path lengths to the object and to the reference mirror are equal. One hundred or more axial positions and corresponding video frames are often required to accurately measure the 3D topography of an object, making WLI too slow for many industrial applications. An example white light interferometer is disclosed in US 5,706,085.

[0004] Conventional 3D confocal microscopes project an array of individual point sources onto an object with a source aperture array, the reflected light is imaged onto a detection aperture array, and the detection aperture array is subsequently imaged onto a camera detector. In some arrangements, the same aperture array can function as both the source and detection aperture arrays. Either the object is mechanically scanned or the focal position is scanned in an axial direction, and the peak intensity at each pixel in the camera image is observed when the object is in best focus at that pixel. US 9,041,940 notes that 200 confocal images are conventionally required, and the invention of US 9,041,940 claims to reduce that number to 20 images or fewer.

[0005] Structured illumination microscopy (SIM) projects spatially modulated patterns of light onto the object, and the peak contrast for each point on the object during an axial scan determines best focus and the three-dimensional coordinate at that object point. Techniques to speed up SIM have been developed, but they still often require fifty or more axial locations and video frames to accurately measure the 3D topography of an object. Example structured illumination microscopes are disclosed in US 8,649,024 and US 10,634,487.

[0006] Chromatic confocal 3D sensors encode depth through axial chromatic aberration. By measuring the peak spectral value at each pixel, the 3D topography can be accurately measured. Chromatic confocal 3D sensors do not require mechanical axial scanning and can have large depths of field. However, the spectrometer that determines the peak spectral value typically requires sixty-four or more pixels for a single point on the object to obtain the required 3D measurement accuracy. Effectively, sixty-four or more detector readings are required at each point on an object, again making this technology too slow for many industrial applications. An example chromatic confocal 3D sensor is disclosed in US 9,494,529.

SUMMARY

[0007] A confocal three-dimensional sensor for measuring height of a point on an object is provided. The sensor includes a light source and a light source modulator configured to temporally modulate the light source intensity. A source pinhole aperture is positioned to be illuminated by the light source and a focus-tunable lens is configured to focus illumination passing through the source pinhole aperture onto the object. A detector pinhole aperture is configured to receive reflected light from the object, wherein the focus-tunable lens is configured to image the reflected light from the object onto the detector pinhole aperture. A detector and integrator are configured to output a measurement indicative of total light transmitted through the detector pinhole aperture. A processor is operably coupled to the detector, integrator, and light source modulator. The processor is configured to synchronously cause the light source modulator to modulate light source intensity while causing the focus tunable lens to sweep axial focal position, the processor being further configured to calculate a height of the point on the object based on the output from the detector and integrator.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] FIG. 1 is a diagrammatic view of an example single point 3D confocal sensor with temporal modulation in accordance with an embodiment disclosed herein.

[0009] FIGS. 2A-C show example light source modulator current waveforms.

[0010] FIGS. 2D-2F are diagrammatic views of alternate focal sweeps in accordance with one embodiment.

[0011] FIGS. 2G-2M are diagrammatic views of alternate focal sweeps in accordance with one embodiment.

[0012] FIGS. 3A-3B are flow diagrams of a method of a 3D confocal measurement process in accordance with an embodiment disclosed herein.

[0013] FIG. 4 is a diagrammatic view of an example area scan 3D confocal sensor with temporal modulation in accordance with one embodiment.

[0014] FIG. 5 is a flow diagram of a method of measuring a surface using a confocal 3D sensor in accordance with an embodiment disclosed herein.

[0015] FIG. 6 is a diagrammatic view of an example 3D confocal sensor with temporal modulation in accordance with another embodiment described herein.

[0016] FIG. 7 is a diagrammatic view of an example 3D confocal sensor with temporal modulation in accordance with another embodiment described herein.

[0017] FIG. 8 is a diagrammatic view of an example 3D confocal sensor with temporal modulation in accordance with another embodiment described herein.

[0018] FIGS. 9A-D demonstrate a short coherence length source, such as an LED or incandescent source.

[0019] FIG. 9E shows the response of a traditional interferometer when using a longer coherence length source (such as a multi-mode laser).

[0020] FIG. 9F is a magnified portion of the scan showing the central portion of FIG. 9E.

[0021] FIG. 9G shows a response/scan using a long coherence length source with a 3D confocal sensor in accordance with an embodiment described herein.

[0022] FIG. 9H is a magnified portion of the scan showing the central portion of FIG. 9G.

[0023] FIG. 10 is a diagrammatic view of an example 3D confocal sensor with temporal modulation utilizing a focus tunable interference objective in accordance with an embodiment described herein.

[0024] FIG. 11 is a diagrammatic view of an example 3D confocal sensor with temporal modulation in accordance with another embodiment described herein.

[0025] FIG. 12 shows an example SLM spatial pattern that can be used in combination with embodiments described herein.

[0026] FIG. 13 is a diagrammatic view of an example 3D confocal sensor with temporal modulation utilizing focus tunable interference objective in accordance with another embodiment described herein.

[0027] FIGS. 14A-14F show an example light source modulation phase and frequency pattern for measuring double return.

DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

[0028] Embodiments disclosed herein include improvements to dramatically reduce the number of confocal images required in high accuracy 3D confocal measurement systems, resulting in high speed, high resolution and large depth of field 3D measurements. In contrast with prior art confocal 3D technologies where, for example, fifty or more images are captured in a single focus sweep, the light source of a confocal 3D measurement system is temporally modulated synchronously with a complete focus sweep during a single detector integration period or image capture period. A selected coding scheme determines the number of distinct temporal light source modulation patterns. The modulation patterns are changed between subsequent image captures and focus sweeps. The resulting image intensities from each modulation pattern and focus sweep may then be used to decode the peak focus position for every pixel in an image. For sinusoidally varying temporal modulation patterns, the phase of the light source is changed between subsequent image captures and the peak focus position for every pixel in the image is computed using standard spatial phase shift algorithms. In this manner, embodiments can synchronously modulate, gate and integrate. The light source and focus can be synchronously modulated. Pinhole apertures are the gates that only allow light from best focus to transmit through. The detector can integrate during an entire modulation of the light source and focus sweep.

[0029] FIG. 1 is a diagrammatic view of an example single point 3D confocal sensor 22 with temporal modulation. Light from source 2 illuminates source pinhole aperture 4, transmits through beam splitter 6, and is converged towards object under test 10 by focus tunable lens 8. Light then reflects from object 10, passes back through focus tunable lens 8, is reflected by beam splitter 6, and converges towards detection pinhole aperture 12. Light passing through detection pinhole 12 is collected by detector 14 and integrator 15. Due to the optical sectioning properties of confocal microscope systems, light transmitting through detection pinhole 12 will be maximized when the transmitted light from pinhole aperture 4 is at best focus on object 10. Away from best focus, most of the light reflected from object 10 will be blocked by detection pinhole 12. The optical sectioning properties of confocal microscopes are also described in the literature as "confocal gating". The source and detection pinhole apertures create a "gate" that essentially allows only light from the best focus location to transmit through the detection pinhole aperture.

[0030] Light source 2 may be, but is not limited to, an LED, a laser (such as a solid state laser), or an incandescent source, whereby the output intensity may be temporally modulated by light source modulator 16. Focus tunable lens 8 may be, but is not limited to, a lens mechanically scanned by a voice coil or linear stage. Alternatively, focus tunable lens 8 may be a liquid lens where focus is adjusted by electrostatically changing the curvature of a liquid lens surface or by using sound waves to change the refractive index of focus tunable lens 8.

[0031] FIGS. 2A-C show example light source modulator 16 current waveforms I_src for single-frequency sinusoidal modulation patterns. I_src is sinusoidally modulated at frequency f_k according to Equation 1a, where n = 0, 1, 2, t is time, and I_peak is the peak LED current. The value for peak LED current I_peak is selected to provide suitable illumination levels for the object being inspected.

I_src,n(t) = (I_peak / 2) · [1 + sin(2π f_k t + 2πn/3)]     Equation 1a

[0032] Focus modulator 18 is also synchronized with light source modulator 16 to sweep focus position Z_foc of tunable lens 8 as shown in FIGS. 2A-C. In the example of FIG. 2A, there is a peak in the detected current I_det, proportional to I_src,0, at detector 14 at the Z_foc location indicated by the vertical dashed lines. This detected current is integrated by integrator 15 to record a level I_0. This return is visible in the integrated signal I_int. In other words, the position of best focus is encoded by the phase of sinusoidal waveform I_src as the focus position is being swept synchronously with light source modulator 16.
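
To make this encoding concrete, the following Python sketch (illustrative only; the sweep parameters, the Gaussian stand-in for the confocal axial response, and all numeric values are assumptions, not taken from the application) simulates one modulate-gate-and-integrate cycle: a sinusoidally modulated source, a pinhole gate that only passes light near best focus, and a detector integral whose value depends on the source phase at the moment the sweep crosses the object height.

import numpy as np

# Hypothetical sweep parameters, for illustration only.
I_peak = 1.0            # peak source current (arbitrary units)
f_k = 1.0               # modulation frequency: one cycle per sweep (sweep = 1 s)
sweep_time = 1.0        # duration of one focus sweep (s)
z_low, z_high = 0.0, 100.0   # focus sweep range (micrometers)
z_object = 37.0         # height of the object point (micrometers)
fwhm = 2.0              # width of the confocal axial response (micrometers)

t = np.linspace(0.0, sweep_time, 10000)
z_foc = z_low + (z_high - z_low) * t / sweep_time   # linear focus sweep

def integrated_signal(phase_offset):
    """Integrate the gated, modulated return over one full sweep."""
    # Source modulation (assumed Equation 1a form, kept non-negative).
    i_src = 0.5 * I_peak * (1.0 + np.sin(2 * np.pi * f_k * t + phase_offset))
    # Confocal gate: sharply peaked around best focus (Gaussian stand-in).
    gate = np.exp(-4.0 * np.log(2.0) * ((z_foc - z_object) / fwhm) ** 2)
    return np.trapz(i_src * gate, t)

# Three sweeps with source phases 0, 2*pi/3, 4*pi/3 (n = 0, 1, 2): the relative
# sizes of I0, I1, I2 encode where in the sweep best focus occurred.
I0, I1, I2 = (integrated_signal(2 * np.pi * n / 3) for n in range(3))
print(I0, I1, I2)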

[0033] In FIG. 2B, the phase of light source modulator 16 is shifted by 2π/3 radians, corresponding to n = 1 in Equation 1a. The peak detected current I_det is again at best focus and is proportional to I_src,1, resulting in an integrated value I_1. In FIG. 2C, the phase of light source modulator 16 is shifted by 4π/3 radians, corresponding to n = 2 in Equation 1a. The peak detected current I_det is again at best focus and is proportional to I_src,2, resulting in an integrated value I_2. To solve for the phase of I_src, and hence the position of best focus, standard phase shift techniques from interferometry or phase shift profilometry may be used, such as the technique known as three-phase reconstruction. From standard phase shift algorithms, the phase Φ, which encodes the position of best focus, is given by Equation 2. The inverse tangent function in Equation 2 returns values between -π/2 and π/2; the signs of the numerator and denominator of the equation may be used to map this phase to a 0 to 2π range. Once the phase is adjusted to the 0 to 2π range, the time t associated with that phase can be calculated using Equation 10. The reflectance, R, of object 10 at the measurement point is given by Equation 3 and is directly proportional to the sum of the three detected peak currents with a scaling factor a. The contrast of the received signal, C, is given by Equation 4. Contrast is distinct from reflectivity (defined in Equation 3): reflectivity measures all light received, while contrast is a measure of the strength of the detected sine wave.

t = Φ / (2π f_k)     Equation 10

[0034] Other phase shift techniques may be used, such as the four-phase technique, which uses four sinusoidal light source modulations I_src,n, n = 0, 1, 2, 3, where the phase is shifted by π/2 radians each time n is incremented by 1. Additionally, higher-frequency sinusoidal waveforms I_src, which would go through several periods as the focus is swept, may be used to increase the sensitivity of the phase detection. This creates the so-called 2π ambiguity problem, which can be handled by using multiple sinusoidal waveform I_src frequencies and phases. For example, two different frequencies can be used to create a longer synthetic wavelength and eliminate the 2π ambiguity.
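
The three-phase decode itself can be sketched in a few lines of Python. Because Equations 2, 3, 4 and 10 are only described in prose here, the expressions below use the standard three-step phase-shift formulas as an assumed reading of them; the scale factor alpha and the linear, calibrated sweep model are likewise assumptions.

import numpy as np

def decode_three_phase(I0, I1, I2, f_k, v_z, z_start, alpha=1.0):
    """Recover height, reflectance, and contrast from three integrated values.
    Assumed standard forms of Equations 2, 3, 4, and 10."""
    # Equation 2 (assumed form): arctan2 resolves the quadrant, playing the role
    # of mapping the phase into the 0..2*pi range described in the text.
    phi = np.arctan2(np.sqrt(3.0) * (I0 - I2), 2.0 * I1 - I0 - I2) % (2.0 * np.pi)
    # Equation 3: reflectance is proportional to the sum of the three values.
    R = alpha * (I0 + I1 + I2)
    # Equation 4 (assumed form): strength of the detected sine relative to total light.
    C = np.sqrt(3.0 * (I0 - I2) ** 2 + (2.0 * I1 - I0 - I2) ** 2) / (I0 + I1 + I2)
    # Equation 10: time of best focus within the sweep, then height (linear sweep).
    t_best = phi / (2.0 * np.pi * f_k)
    return z_start + v_z * t_best, R, C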

[0035] The wavelength for frequency f_k is given by Equation 5, where v_z is the velocity of the focus position change; in MKS units, if v_z is in m/s and f_k is in cycles/s, then λ_k will be in m/cycle.

λ_k = v_z / f_k     Equation 5

[0036] A synthetically longer wavelength λ_syn may be created from wavelengths λ_1 and λ_2 by Equation 6.

λ_syn = λ_1 λ_2 / |λ_1 − λ_2|     Equation 6
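
As a quick numerical illustration of Equations 5 and 6 (the sweep velocity and frequencies below are hypothetical, chosen only to show the scale of the effect):

# Hypothetical sweep and modulation values, purely for illustration.
v_z = 1.0e-3                  # focus sweep velocity: 1 mm/s
f_1, f_2 = 100.0, 110.0       # two modulation frequencies (cycles/s)

lam_1 = v_z / f_1             # Equation 5: 10 micrometers per cycle
lam_2 = v_z / f_2             # about 9.1 micrometers per cycle
lam_syn = lam_1 * lam_2 / abs(lam_1 - lam_2)   # Equation 6: about 100 micrometers

# The synthetic wavelength spans a much larger unambiguous height range than
# either single-frequency wavelength, removing the 2*pi ambiguity over that range.
print(lam_1, lam_2, lam_syn)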

[0037] If two or more frequencies are used, then the closed-form solutions for phase and contrast, Equations 3 and 4, no longer apply. Since the integrated light level model includes trigonometric functions, the most straightforward method of estimating the object characteristics is an iterative least-squares solver. A number of math libraries offer tools to minimize the fitting residual defined in equations such as Equation 8; for instance, Matlab® (version 2022b, The MathWorks Inc.) includes the function fminsearch. Minimizing the fitting residual begins by defining the fitting residual in Equation 8:

r_k,n = Ĩ_k,n − I_k,n     Equation 8

[0038] Here Ĩ_k,n are the measured image levels for each phase n and frequency k, and I_k,n is the estimated image level for an estimated reflectivity, phase, and contrast.

Equation 1b

[0039] Equation 1b models the integrated returned intensity. The modeled integrated returned intensity is identified as I_k,n; this is distinct from the measured integrated value Ĩ_k,n. In Equation 1b, R is the estimated reflectivity of the object, including detector dark level and ambient light reaching the detector. The estimated signal contrast is modeled as C_0. The estimated position of the object surface is identified by t_0, the time point at which the focus plane sweep crossed the object surface.

[0040] The typical approach is to minimize the sum of squared residuals, calculated as S in Equation 9:

S = Σ_k Σ_n (Ĩ_k,n − I_k,n)²     Equation 9

[0041] Supplying this residual function along with initial parameter estimates to the iterative least-squares solver results in best-fit estimates for R, C, and t_0.
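
The text cites Matlab's fminsearch as one such solver; the Python sketch below uses SciPy's Nelder-Mead minimizer as an analogous tool. The explicit model function is an assumed reading of Equation 1b (an offset R plus a sinusoid of contrast C_0 whose phase encodes t_0), since that equation is only described in prose above; the initial guesses are likewise hypothetical.

import numpy as np
from scipy.optimize import minimize

def fit_surface_point(I_meas, freqs, n_phases=3):
    """Iterative least-squares estimate of reflectivity R, contrast C0, and
    best-focus time t0 from measured integrals I_meas[k][n] taken at
    frequencies freqs[k] and phase steps n (assumed Equation 1b model)."""
    I_meas = np.asarray(I_meas, dtype=float)
    freqs = np.asarray(freqs, dtype=float)
    n = np.arange(n_phases)

    def model(params):
        R, C0, t0 = params
        # Assumed Equation 1b: background/reflectivity offset plus a sinusoid
        # whose phase encodes the time t0 when the sweep crossed the surface.
        return R + C0 * np.sin(2 * np.pi * freqs[:, None] * t0 + 2 * np.pi * n / n_phases)

    def residual_sum(params):
        # Equations 8 and 9: sum of squared measured-minus-modeled residuals.
        return np.sum((I_meas - model(params)) ** 2)

    # Rough initial guesses (hypothetical): mean level, half the spread, mid period.
    x0 = [I_meas.mean(), 0.5 * (I_meas.max() - I_meas.min()), 0.5 / freqs.min()]
    result = minimize(residual_sum, x0, method="Nelder-Mead")  # analogue of fminsearch
    return result.x  # best-fit R, C0, t0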

[0042] FIGS. 2A-2C show a linear sweep of focus position Z_foc. A calibration process, not shown, may accurately characterize any non-linearities of focus position Z_foc as well as the precise range of the focus sweep.

[0043] Other temporal light source modulation techniques that encode the location of best focus during the focus sweep may include, but are not limited to, Gray codes, linearly ascending and descending ramps, and Hamiltonian codes. An example measurement coding scheme with three Hamiltonian light source modulation patterns is shown in FIGS. 2G-2I, and an example measurement coding scheme with four Hamiltonian light source modulation patterns is shown in FIGS. 2J-2M.

[0044] FIG. 3A is a flow diagram of a method of a 3D confocal measurement process with a phase measurement coding scheme and sinusoidal modulation patterns in accordance with an embodiment disclosed herein. FIG. 3A further illustrates the measurement process of confocal 3D sensor 22. The process begins with processor 20 resetting integrator 15 at step 28. The process proceeds to step 30, where current from detector 14 begins to be integrated by integrator 15. Immediately after step 30, processor 20 signals light source modulator 16 to sinusoidally modulate light source 2 at an initial phase and frequency at step 32, while processor 20 also signals focus modulator 18 to synchronously begin sweeping the focus plane at step 34, as shown in FIG. 2A, for example. When the focus sweep is complete and the light source has gone through a predetermined number of cycles, processor 20 signals integrator 15 to stop integration at step 36, and the integrated current from the detector is then read out as a voltage at step 38 by processor 20. The voltage for each focus sweep is stored by processor 20 at step 40 before proceeding to decision block 42. It is determined at block 42 whether the last phase is complete. If it is not the last sweep, then processor 20 increments the phase, and the next frequency if applicable, at step 46. Processor 20 then resets integrator 15 at step 28 and signals detector 14 integration to begin, as well as signaling the light source modulator with the next phase and next frequency, if applicable. The process then repeats until processor 20 determines the last sweep is complete at step 42. The stored voltages corresponding to the phase of best focus for each focus sweep are retrieved at step 48, and processor 20 computes the phase of light source I_src corresponding to the position of best focus at step 48 using standard phase shift techniques such as Equation 2. The time corresponding to the position of best focus may then be calculated using Equation 10, for example. Reflectance at the measurement location is calculated using Equation 3, and contrast may be calculated using Equation 4, for example. At step 49, the time of best focus is converted to a calibrated height value by processor 20, accounting for the precise range of the focus sweep and any non-linearities of the focus sweep.
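
The loop of FIG. 3A can be summarized in a short Python sketch. The integrator, source_mod, and focus_mod objects below are hypothetical hardware interfaces invented purely for illustration; only the step sequencing and the final phase-to-height conversion follow the description above, and the decode uses the assumed three-phase form already noted.

import numpy as np

def measure_point(integrator, source_mod, focus_mod, f_k, v_z, z_start, n_phases=3):
    """Sketch of the FIG. 3A single-point measurement loop (hypothetical interfaces)."""
    readings = []
    for n in range(n_phases):
        integrator.reset()                                               # step 28
        integrator.start()                                               # step 30
        source_mod.start_sine(freq=f_k, phase=2 * np.pi * n / n_phases)  # step 32
        focus_mod.start_sweep()                                          # step 34 (synchronous)
        focus_mod.wait_until_done()
        integrator.stop()                                                # step 36
        readings.append(integrator.read_voltage())                       # steps 38, 40
    I0, I1, I2 = readings
    # Step 48: phase of best focus (standard three-phase reconstruction, assumed form).
    phi = np.arctan2(np.sqrt(3.0) * (I0 - I2), 2.0 * I1 - I0 - I2) % (2 * np.pi)
    t_best = phi / (2 * np.pi * f_k)                                     # Equation 10
    return z_start + v_z * t_best                                        # step 49: calibrated height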

[0045] The measurement process may be sped up by alternating the direction of the focus sweep between each integration period to take advantage of the focus retrace. Referring to FIGS. 2D-2F, the focus position Z_foc is swept from low to high in FIG. 2D. The sweep direction of Z_foc in FIG. 2E is from high to low, and from low to high in FIG. 2F. To accommodate the polarity change of the sweep direction, the phase of I_src,1 is time-reversed in FIG. 2E relative to I_src,1 in FIG. 2B. Equation 2 may then be used to calculate the phase, Φ, which encodes the position of best focus. Equation 3 is also used to calculate the reflectance at the measurement location.

[0046] FIG. 3B is a flow diagram of a method of a 3D confocal measurement process with selectable coding patterns in accordance with an embodiment disclosed herein. FIG. 3B further illustrates the measurement process of confocal 3D sensor 22. Process 220 begins by selecting an appropriate coding scheme and light source modulation patterns at step 227, such as the coding scheme of FIGS. 2J-2M with four Hamiltonian light source patterns. Processor 20 then resets integrator 15 at step 228. The process proceeds to step 230, where current from detector 14 begins to be integrated by integrator 15. Immediately after step 230, processor 20 signals light source modulator 16 to modulate light source 2 at step 232 in accordance with an initial modulation pattern of the selected coding scheme, while processor 20 also signals focus modulator 18 to synchronously begin sweeping the focus plane at step 234, as shown in FIG. 2J, for example. When the focus sweep and modulation pattern are complete, processor 20 signals integrator 15 to stop integration at step 236, and the integrated current from the detector is then read out as a voltage at step 238 by processor 20. The voltage for each focus sweep is stored by processor 20 at step 240 before proceeding to decision block 242. It is determined at block 242 whether the last modulation pattern is complete. If it is not the last sweep, then processor 20 increments the modulation pattern at step 246. Processor 20 then resets integrator 15 at step 228 and signals detector 14 integration to begin, as well as signaling the light source modulator with the modulation pattern. The process then repeats until processor 20 determines the last sweep is complete at step 242. The stored voltages corresponding to the time of best focus for each focus sweep are retrieved at step 248, and processor 20 decodes the time corresponding to the position of best focus at step 248 in accordance with the selected coding scheme. Reflectance is also calculated at step 248. At step 249, the time of best focus is converted to a calibrated height value by processor 20, accounting for the precise range of the focus sweep and any non-linearities of the focus sweep.
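
For binary coding schemes such as Gray codes, one conventional way to decode the stored integrals is to threshold each sweep's integral into a bit and convert the resulting Gray code word to a focus-bin index. The description above does not detail its decoding step, so the following sketch is purely an assumption about how such on/off modulation could be decoded; the threshold value and bit ordering are hypothetical.

def gray_to_binary(g):
    """Convert a Gray-code integer to its binary-weighted value."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def decode_gray_sweeps(integrals, threshold):
    """Hypothetical decode for binary (on/off) Gray-code source modulation:
    sweep m contributes bit m, set when the integrated signal exceeds a
    threshold (i.e. the source was 'on' when the sweep crossed best focus)."""
    code = 0
    for m, value in enumerate(integrals):
        if value > threshold:
            code |= 1 << m
    return gray_to_binary(code)   # index of the focus bin along the sweep

# Example: four sweeps -> 16 focus bins; the bin index is later converted to a
# calibrated height using the known sweep range.
bin_index = decode_gray_sweeps([0.9, 0.1, 0.8, 0.2], threshold=0.5)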

[0047] FIG. 4 is a diagrammatic view of an example area scan 3D confocal sensor 90 with temporal modulation. Light from source 50 is modulated by light source modulator 52, collected by condenser lens 54, transmitted through beam splitter 56 and projected onto Nipkow disk 58. Nipkow disk 58 contains an array of pinhole apertures and is rotated by motor 59. An example Nipkow disk is disclosed in US 4,927,254. Lens 60, aperture stop 62 and focus tunable lens 64 create an imaging system to image Nipkow disk 58 pinhole apertures onto object 10. Light reflected from object 10 is imaged back onto Nipkow disk 58 by lens 64, aperture stop 62 and lens 60. Reflected light passing through Nipkow disk 58 apertures reflects off beam splitter 56 and is imaged onto camera detector 70 by the imaging system formed by lens 65, aperture stop 66 and lens 68. Again, due to the optical sectioning properties of confocal microscopes, reflected light transmitted through Nipkow disk 58 pinhole apertures will have a peak intensity when the point on the object is in best focus and the intensity will decrease rapidly away from best focus. Camera detector 70 may be, but is not limited to, a CMOS or CCD area array with a two-dimensional array of pixels. Focus tunable lens 64 may be, but is not limited to, a lens mechanically scanned by a voice coil or linear stage. Alternatively, focus tunable lens 64 may be a liquid lens where focus is adjusted by electrostatically changing the curvature of a liquid lens surface or by using sound waves to change the refractive index of focus tunable lens 64. Light source 50 may be, but is not limited to, a LED, solid state laser, or incandescent source whereby the output intensity may be temporally modulated by light source modulator 52.

[0048] Object 10 is conveyed by stage assembly 51. Stage assembly 51 may include one or more linear or rotary stages.

[0049] Timing controller 72 signals the temporal modulation pattern for each focus sweep to light source modulator 52. Timing controller 72 also synchronizes the timing of light source modulator 52 and focus modulator 74 to sweep the focus while simultaneously temporally modulating light source 50 during one integration period of camera detector 70.

[0050] Nipkow disk 58 may be designed with pinhole patterns along Archimedean spirals; the pattern may consist of a single continuous spiral or of multiple interleaved spirals. If a single spiral is used, then the disk must spin an entire revolution to sample all radial distances. If there are N spirals, then the disk must spin 1/N of a revolution to sample all radial distances. Because the focus sweep results in only a brief period of being near best focus (when maximum light levels are returned to the detector), the pinhole pattern may be designed to sample all necessary radial positions over a small rotation angle. This can be accomplished by utilizing a very large number of spirals and by staggering the radius of the pinholes in each spiral to maximize the radial coverage over short rotation angles. The geometric shape of the pinhole apertures may be, but is not limited to, circular, square, or octagonal. The geometric shape of the pinhole apertures may also be thin straight or curved lines. In another embodiment, rotating Nipkow disk 58 may be replaced by an array of pinhole apertures that are linearly translated. The design of the aperture patterns may be optimized to balance light throughput, axial resolution, and cross-talk from out of focus regions passing through adjacent apertures. The cross-talk contributes to background intensity Idet away from the location of best focus.
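
The 1/N relationship above translates directly into a requirement on the number of interleaved spirals for a given disk speed and in-focus dwell time. The short calculation below is a hypothetical numerical illustration, not taken from the disclosure: the disk speed and dwell time are assumed values.

```python
import math

# Assumed example values (not from the disclosure): a disk spinning at
# 6000 RPM and an in-focus dwell of 100 microseconds per focus sweep.
rpm = 6000.0
dwell_s = 100e-6

rotation_per_dwell = (rpm / 60.0) * dwell_s        # revolutions completed during the dwell
min_spirals = math.ceil(1.0 / rotation_per_dwell)  # need 1/N revolution <= rotation_per_dwell

print(f"Disk rotates {rotation_per_dwell:.4f} rev during the dwell; "
      f"at least {min_spirals} interleaved spirals are needed to cover all radii.")
```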

[0051] FIG. 5 is a flow diagram of a method of measuring a surface using a confocal 3D sensor in accordance with embodiments disclosed herein. Method 300 begins in step 96 by computer 76 providing a selected coding scheme and light source modulation patterns to timing controller 72, such as the coding scheme of FIGS. 2G-2I with three Hamiltonian light source modulation patterns. Next, camera detector 70 is reset by timing controller 72 at step 98. Method 300 proceeds to step 100 where the integration begins for a single video frame of detector 70. Immediately following step 100, timing controller 72 signals light source modulator 52 to modulate light source 50 with an initial modulation pattern at step 102, while timing controller 72 also signals focus modulator 74 to synchronously begin sweeping the focus plane at step 104, as is shown in FIG. 2G, for example. When the focus sweep is complete and light source modulator 52 has completed the modulation pattern, timing controller 72 signals detector 70 to stop integration at step 106. Readout of video data begins at step 108 and is transferred to memory in computer 76 at step 110. The process proceeds to decision block 112 where it is determined by timing controller 72 whether the last modulation pattern is complete. If it is not the last sweep, then timing controller 72 increments the modulation pattern at step 116. Detector 70 is then reset at step 98 and timing controller 72 then signals camera detector 70 to begin the integration of the next video frame at step 100. Method 300 then repeats until timing controller 72 determines the last sweep is complete at step 112. Stored pixel values corresponding to the time of best focus for each focus sweep are retrieved at step 118, and computer 76 decodes the time of light source Isrc corresponding to the time of best focus at step 118 for all pixels of camera detector 70. Reflectance for all pixels is also calculated at step 118. At step 120, the time of best focus for each pixel is converted to a calibrated height value for each pixel by computer 76, accounting for the precise range of the focus sweep and any non-linearities of the focus sweep or optical aberrations. The calibration process of 3D confocal sensor 90 may also accommodate other design and manufacturing tolerances such as field curvature across the field of view of camera detector 70. At this point, computer 76 may command stage assembly 51 to translate object 10 to a new location and begin another measurement cycle at a different field of view.
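
The conversion in step 120 from time of best focus to calibrated height can be expressed as a per-pixel mapping. The sketch below assumes, purely for illustration, that the sweep non-linearity and field curvature are captured by a per-pixel polynomial calibration fitted beforehand against known target heights; the actual calibration model used by sensor 90 is not specified here.

```python
import numpy as np

def time_to_height(t_best, coeffs):
    """Convert a per-pixel time-of-best-focus map to a calibrated height map.

    t_best -- 2D array of decoded times of best focus (one value per pixel)
    coeffs -- array of polynomial coefficients with shape (order+1, rows, cols),
              highest order first, one polynomial per pixel (illustrative model only)
    """
    height = np.zeros(t_best.shape)
    for c in coeffs:                     # Horner-style evaluation, highest order first
        height = height * t_best + c
    return height
```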

[0052] In another example, detector 70 may also be a line scan detector which is configured as a one-dimensional array of photodetectors or pixels, or a Time Delay and Integration (TDI) image sensor, each of which creates a line field of view. In this example, stage assembly 51 may move continuously during detector 70 integration in a direction perpendicular to the line field of view. The velocity of stage assembly 51, the integration time of detector 70 and the number of focal sweeps per measurement then affect the lateral resolution in the direction of stage movement.

[0053] FIG. 6 is a diagrammatic view of an example 3D confocal sensor 92 with temporal modulation similar to 3D confocal sensor 90. Like numbered elements supply the same functionality. Focus tunable lens 64 has been replaced by fixed lens 84 in 3D confocal sensor 92. Focus tunable lens 80 in FIG. 6 is placed at or near aperture stop 62. Focus tunable lens 80 may be, but is not limited to, a lens mechanically scanned by a voice coil or linear stage. Alternatively, focus tunable lens 80 may be a liquid lens where focus is adjusted by electrostatically changing the curvature of a liquid lens surface or by using sound waves to change the refractive index of focus tunable lens 80.

[0054] FIG. 7 is a diagrammatic view of an example 3D confocal sensor 91 with temporal modulation. Focus modulator 82 synchronously sweeps the position of object 10 through focus by moving stage assembly 51 in an axial direction of lens 84.

[0055] FIG. 8 is a diagrammatic view of an example 3D confocal sensor 93 with temporal modulation similar to 3D confocal sensor 91. Like numbered elements supply the same functionality. Fixed lens 84 has been replaced by interference objective 69 in 3D confocal sensor 93. Interference objective 69 may be, but is not limited to, known Mirau, Michelson, or Linnik type interferometer objectives. Light source 53 may be a short coherence length source or a long coherence length source. Light source 53 may be, but is not limited to, a LED, super-luminescent LED (SLED), laser, or incandescent source whereby the output intensity may be temporally modulated by light source modulator 52. Focus modulator 82 synchronously sweeps the position of object 10 through focus by moving stage assembly 51 in an axial direction of lens 84.

[0056] Individual pixels of camera detector 70 will receive a coherence interference signal during the focus sweep, due to interference objective 69, which is superimposed on the confocal response due to the pinhole apertures of Nipkow disk 58.

[0057] FIGS. 9A-9D illustrate operation with a short coherence length source, such as an LED or incandescent source. FIG. 9A shows the interference pattern for a traditional interferometer, typically called a White Light Interferometer (WLI). FIG. 9B is a magnified portion of the scan showing the central portion of FIG. 9A. For a traditional WLI there are no interference ripples over most of the scan range, and over a very narrow height range defined by the coherence length of the source, a strong interference pattern, known as a correlogram, is visible. Away from best focus, the detector receives a high background level. For sensor 93, a characteristic pixel response function, Idet, is shown in FIG. 9C as a function of the focus position Zfoc. FIG. 9D is the same pixel response, Idet, as FIG. 9C over a smaller range of focus position Zfoc. As can be seen in FIG. 9C, the background level away from best focus is much lower, due to the gating properties of the source and detection pinhole apertures, than in the traditional WLI response shown in FIG. 9A. This background level reduction allows the modulated return to be integrated onto a detector without adding excessive signal level or noise. Method 300 may be used to calculate height values of object 10 at each pixel location for 3D confocal sensor 93. As is described in method 300, several different light source modulation frequencies may be used while sweeping focus of object 10 with stage assembly 51. A minimum of two frequencies may be used to find the position of the modulation envelope and the position of the peak of the correlogram. For instance, a relatively low modulation frequency may be used to localize the confocal pinhole return and a high frequency used to find the peak position of the interference pattern.

[0058] FIG. 9E shows the response of a traditional interferometer when using a longer coherence length source (such as a multi-mode laser). FIG. 9F is a magnified portion of the scan showing the central portion of FIG. 9E. For a long coherence length source, the interference fringes are visible over a very broad range, but it is difficult to determine the peak location. This is the wrapping problem common to laser-based interferometers. Using a long coherence length source with sensor 93 results in the return shown in FIG. 9G. FIG. 9H is a magnified portion of the scan showing the central portion of FIG. 9G. The sectioning properties of the pinhole confocal system confine the return to a small region near best focus. The peak of the correlogram is much more easily seen in the sensor 93 response (FIG. 9G) compared to a standard interferometer (FIG. 9F).

[0059] A common problem with scanning white light interferometers is the need for a large number of images, sampled at many focus height planes, particularly if the height range is large. Method 300 applied to 3D confocal sensor 93 affords a method of overcoming this limitation. By applying method 300 with a low frequency temporal modulation pattern to find the position of object 10, a first height estimate may be found in as few as two to three images. Low modulation frequencies are insensitive to the interference ripples near best focus; only the envelope is detected. This envelope detection is identical to the operating mode for sensor 90. Method 300 is then applied again with a high frequency modulation pattern, where the modulation is only applied in a smaller focal sweep region near the first height estimate. Limiting illumination to a smaller scan region reduces scan time and reduces the integration of background light level and the shot noise associated with the integration of excess background level.
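
The coarse-then-fine sequence of [0059], which also reflects the low/high frequency pairing described in [0057], can be outlined as follows. This is an illustrative sketch only: the callable run_method_300, the frequency values, and the window width are hypothetical placeholders rather than details from the disclosure.

```python
def coarse_then_fine_height(run_method_300, full_range, f_low=1.0, f_high=10.0, window=0.05):
    """Two-pass measurement per [0059] (all values illustrative).

    run_method_300 -- hypothetical callable(modulation_frequency, sweep_range)
                      returning a height estimate (a scalar here, for simplicity)
    full_range     -- (z_min, z_max) of the full focus sweep
    f_low, f_high  -- assumed low/high modulation frequencies
    window         -- half-width of the fine sweep window around the coarse estimate
    """
    # Pass 1: low frequency over the full range; only the envelope is detected,
    # so a coarse height estimate is obtained from as few as two to three images.
    z_coarse = run_method_300(modulation_frequency=f_low, sweep_range=full_range)

    # Pass 2: high frequency confined to a narrow window around the coarse
    # estimate, reducing scan time and integrated background / shot noise.
    fine_range = (z_coarse - window, z_coarse + window)
    return run_method_300(modulation_frequency=f_high, sweep_range=fine_range)
```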

[0060] FIG. 10 is a diagrammatic view of an example 3D confocal sensor 94 with temporal modulation utilizing focus tunable interference objective 67. Focus tunable interference objective 67 may be, but is not limited to, an interference objective mechanically scanned by a voice coil or linear stage. Alternatively, focus tunable interference objective 67 may include a liquid lens element where focus is adjusted by electrostatically changing the curvature of a liquid lens surface, or focus may be adjusted by using sound waves to change the refractive index of a lens element of focus tunable interference objective 67. Method 300 may be used to calculate height values of object 10 at each pixel location for 3D confocal sensor 94.

[0061] FIG. 11 is a diagrammatic view of an example 3D confocal sensor 190 with temporal modulation. 3D confocal sensor 190 operates under the same principle as 3D confocal sensor 90, with the functionality provided by Nipkow disk 58 replaced by spatial light modulator (SLM) 158. Both Nipkow disk 58 and spatial light modulator 158 reduce the background intensity away from best focus on the pixels of camera 70 and camera 170, respectively, during a focal sweep when the measurement position of object 10 is away from best focus. Light from source 150 is modulated by light source modulator 152, collected by condenser lens 154, transmitted through beam splitter 163, and incident on spatial light modulator 158. Pixelated spatial light modulator 158 may be, but is not limited to, a digital micromirror device (DMD) or a liquid crystal on silicon (LCOS) device. Light then reflects from the actively on pixels of SLM 158. Lens 160, aperture stop 162 and focus tunable lens 164 create an imaging system to image SLM 158 pixels onto object 10. Light reflected from object 10 is imaged back onto SLM 158 by lens 164, aperture stop 162 and lens 160. Light then reflects off the actively on pixels of SLM 158, reflects off beam splitter 163, and is then imaged onto camera 170 by the imaging system formed by lens 165, aperture stop 166 and lens 168. Spatial light modulator 158 may emulate the optical sectioning properties of a Nipkow disk system by utilizing a temporal series of spatial patterns similar to the spatial pattern of the Nipkow aperture array. An example SLM 158 spatial pattern is shown in FIG. 12. During a single focal sweep, the spatial patterns are temporally switched at high speed to effectively emulate the sweeping aperture array of a spinning Nipkow disk. Due to the optical sectioning properties of confocal microscopes, light reflected from the actively on pixels of SLM 158 will have a peak intensity when the point on the object is in best focus, and the intensity will decrease rapidly away from best focus.
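
One way to visualize the temporal series of SLM patterns described in [0061] is to generate a sparse pinhole grid and shift it between sub-frames so that, over a full focus sweep, every pixel location has been illuminated. The sketch below is a hypothetical illustration of that idea; the actual patterns (FIG. 12) and pitch used by sensor 190 are not specified here.

```python
import numpy as np

def pinhole_pattern_series(rows, cols, pitch=8):
    """Generate a temporal series of shifted sparse pinhole masks for an SLM.

    Each boolean mask turns on one pixel per pitch-by-pitch cell; shifting the
    mask through all pitch*pitch offsets over a focus sweep emulates the
    sweeping aperture array of a spinning Nipkow disk (illustrative only).
    """
    patterns = []
    for dy in range(pitch):
        for dx in range(pitch):
            mask = np.zeros((rows, cols), dtype=bool)
            mask[dy::pitch, dx::pitch] = True   # one 'on' pixel per cell
            patterns.append(mask)
    return patterns

# Example: 64 sub-frame patterns for a 256 x 256 SLM region.
series = pinhole_pattern_series(256, 256, pitch=8)
```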

[0062] Fig. 13 is a diagrammatic view of an example 3D confocal sensor 194 with temporal modulation utilizing focus tunable interference objective 167. 3D confocal sensor 194 is similar to 3D confocal sensor 190 with focus tunable lens 164 replaced by focus tunable interference objective 167.

[0063] A common measurement task is the estimation of the thickness of a transparent layer, for example the thickness of a mask layer on the surface of a printed circuit board or the photoresist on a semiconductor wafer. For a single reflecting surface, the 3D confocal sensor must estimate the surface reflectivity, height (phase of the returned signal), and contrast level of the returned signal. Including a second return results in five unknowns: object reflectivity, heights of both surfaces, and contrast levels of both surfaces. At least five data points are needed to solve for these five unknowns.

[0064] FIGS. 14A-14F show an example light source modulation phase and frequency pattern for measuring a double return. In FIGS. 14A-14C, Isrc is sinusoidally modulated at frequency fk = 1 according to Equation 1, where n = 0, 1, 2. In FIGS. 14D-14F, Isrc is sinusoidally modulated at frequency fk = 3 according to Equation 1, where n = 0, 1, 2. For FIGS. 14A-14F the focus sweep is identical, and the detected current Idet shows a double return. The times of the two peak returns are marked by the dashed lines labeled 'Surface 0' and 'Surface 1'. This double return is visible in the integrated signal Iint. Integrated values I0, I1, I2 correspond to measurements with frequency fk = 1. Integrated values I3, I4, I5 correspond to measurements with frequency fk = 3. As described, there are five unknowns; using two modulation frequencies with three phases each provides six measurements.

Equation 7

[0065] The integrated return for each phase and frequency for a double return object may be modeled by Equation 7. The object reflectivity estimate is R, and the returned contrasts for the two surfaces are estimated as C0 and C1. The times at which light is sensed at the two focus positions are estimated as t0 and t1. Images of the object are collected at multiple phases n and frequencies fk. Once t0 and t1 have been estimated, these values may be converted to phase using Equation 10 and to height using step 120 of method 300.

[0066] Since the integrated light level model includes trigonometric functions, the most straightforward method of estimating the object characteristics is an iterative least squares solver. A number of math libraries offer tools to minimize a fitting residual such as that defined in Equation 8; for instance, Matlab® includes the function fminsearch. The process begins by defining the fitting residual in Equation 8, where Ik,n is the measured integrated value and Îk,n is the corresponding value modeled by Equation 7:

rk,n = Ik,n − Îk,n        Equation 8

[0067] One approach is to minimize the sum of squared residuals, calculated as S in Equation 9.

S = Σk,n (Ik,n − Îk,n)²        Equation 9

[0068] Supplying this residual function along with initial parameter estimates to the iterative least squares solver results in best fit estimates of target reflectivity, phase, and contrast.

[0069] In practice, lens blurring will cause the contrast C to decrease as the modulation frequency f is increased. To achieve best results, the C terms in Equation 7 should be weighted to account for this expected blurring with frequency. The fitting accuracy and robustness may be improved by including more frequencies in the image set.
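
As a concrete illustration of [0066]-[0069], the double-return fit can be set up with a generic iterative least squares solver. The sketch below is hypothetical in two respects: Equation 7 is not reproduced in this section, so a simple stand-in model (reflectivity times a DC term plus one cosine per surface at phase step 2πn/3 and frequency fk) is assumed; and the solver shown is SciPy's least_squares rather than Matlab's fminsearch. Frequency-dependent contrast weighting per [0069] could be added by scaling the contrast terms with an expected blur factor.

```python
import numpy as np
from scipy.optimize import least_squares

# Measurement grid: two modulation frequencies, three phase steps each (six values,
# matching I0..I5 of FIGS. 14A-14F). The frequencies and data below are placeholders.
freqs = np.array([1, 1, 1, 3, 3, 3], dtype=float)
steps = np.array([0, 1, 2, 0, 1, 2], dtype=float)
measured = np.array([1.10, 0.85, 1.05, 0.95, 1.15, 0.90])  # hypothetical Ik,n values

def model(params, fk, n):
    """Assumed stand-in for Equation 7: reflectivity R plus one cosine per surface."""
    R, c0, c1, t0, t1 = params
    phase_n = 2.0 * np.pi * n / 3.0
    return R * (1.0
                + c0 * np.cos(2.0 * np.pi * fk * t0 + phase_n)
                + c1 * np.cos(2.0 * np.pi * fk * t1 + phase_n))

def residuals(params):
    """Equation 8: difference between measured and modeled integrated values."""
    return measured - model(params, freqs, steps)

# Initial estimates for R, C0, C1, t0, t1; the solver minimizes the sum of squared
# residuals (Equation 9) internally and returns the best-fit parameter values.
x0 = np.array([1.0, 0.3, 0.3, 0.25, 0.45])
fit = least_squares(residuals, x0)
R, c0, c1, t0, t1 = fit.x
```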

[0070] As described above, temporal modulation is provided by modulating the light source. The same functionality can be achieved by temporally modulating sensitivity or transmission in other portions of the signal path. The gain of integrator 15 in sensor 22, or of the integrator which is part of camera detectors 70 and 170, may be modulated temporally according to the same methods described above to achieve the same performance. On-camera detector integrator modulation is used, for example, in Time Of Flight sensors such as the Texas Instruments OPT8241.

[0071] Alternatively, the gain of the optical path may be temporally modulated by the addition of a ferroelectric or liquid crystal light valve in the optical path. Other means of modulating the optical throughput include variable absorbers and crossed polarizers. Modulating the transparency or reflectance of a light valve may be used instead of temporally modulating light source 2, light source 50, or light source 53.

[0072] Alternatively, in sensors 190 and 194, SLM 158 may temporally modulate the light by temporally modulating the throughput of its actively on pixels.