WO2018123992A1 - Measuring device, optical sensor, measuring method, and program - Google Patents

Measuring device, optical sensor, measuring method, and program

Info

Publication number
WO2018123992A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
light receiving
receiving unit
optical path
measurement
Prior art date
Application number
PCT/JP2017/046484
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
真幸 丸山
和田 智之
徳人 斎藤
Original Assignee
国立研究開発法人理化学研究所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 国立研究開発法人理化学研究所
Publication of WO2018123992A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication

Definitions

  • the present invention relates to a measuring apparatus, an optical sensor, a measuring method, and a program.
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2004-170437
  • the method of receiving light from an object with a two-dimensional sensor has a problem that the measurement accuracy or measurement range is easily affected by the element arrangement of the two-dimensional sensor.
  • a measuring device may measure the shape of the object with light.
  • the measurement apparatus may include a control unit that changes at least one optical path of light incident on the object and light from the object with respect to the light receiving unit for detecting light from the object.
  • the measurement apparatus may include a measurement unit that measures the shape of the object based on the optical path when the light receiving unit detects light from the object.
  • the measuring device may include an optical element that changes the optical path.
  • the control unit may change the optical path with respect to the light receiving unit by changing the state of the optical element.
  • the optical element may change the optical path by refraction or diffraction.
  • the control unit may change the optical path with respect to the light receiving unit by changing the direction of the optical element.
  • the measurement unit may measure the shape of the object based on the direction of the optical element when the light receiving unit detects light from the object.
  • the optical element may have one or more prisms.
  • the control unit may change the optical path with respect to the light receiving unit by changing the direction of the prism.
  • the measurement unit may measure the shape of the object based on the orientation of the prism when the light receiving unit detects light from the object.
  • the optical element may change the optical path by reflection.
  • the control unit may change the optical path with respect to the light receiving unit by changing the direction of the optical element.
  • the measurement unit may measure the shape of the object based on the direction of the optical element when the light receiving unit detects light from the object.
  • the control unit may change the optical path with respect to the light receiving unit until light from the object is detected by the light receiving unit.
  • the control unit may change the position of the light receiving unit until light from the object is detected by the light receiving unit.
  • the light receiving unit may be a line sensor provided so as to be substantially orthogonal to the direction of change of the optical path.
  • the light receiving unit may be an area sensor.
  • the control unit may cause light from the object to enter the first region of the area sensor, with the optical path of the light from the object set as a first optical path.
  • the control unit may change the optical path of light from the object from the first optical path to the second optical path, and allow the light from the object to enter the second region of the area sensor.
  • the measurement unit may measure the shape of the object based on the results of detecting the light from the object by the plurality of light receiving elements provided in the first area of the area sensor and by the plurality of light receiving elements provided in the second area of the area sensor.
  • the light receiving unit may be an area sensor.
  • the control unit may change the optical path with respect to the area sensor.
  • the measurement unit may measure the shape of the object based on an optical path when light from the object is detected at a predetermined position on the area sensor.
  • the control unit may cause the area sensor to receive light from the object in a state where the optical path is fixed with respect to the area sensor.
  • the measurement unit may measure the shape of the object based on the position on the area sensor where the light from the object is detected.
  • the measurement apparatus may include a switching unit that switches between the first measurement mode and the second measurement mode.
  • an optical sensor may include the measuring device and a light receiving unit.
  • the optical sensor may include a light emitting unit that emits light incident on the object.
  • a measurement method may measure the shape of the object with light.
  • the measurement method may include a step of changing at least one optical path of light incident on the object and light from the object with respect to a light receiving unit for detecting light from the object.
  • the measurement method may include a step of measuring the shape of the object based on an optical path when the light receiving unit detects light from the object.
  • a program may be a program for measuring the shape of the object.
  • the program may cause the computer to execute a procedure of changing at least one optical path of light incident on the object and light from the object with respect to the light receiving unit for detecting light from the object.
  • the program may cause the computer to execute a procedure for measuring the shape of the object based on the optical path when the light receiving unit detects light from the object.
  • FIG. 1 schematically shows an example of the functional configuration of the optical sensor 10 in the first embodiment.
  • FIG. 2 schematically shows a prism system 200 as an example of the optical element 100.
  • FIG. 3 shows the state of the prism system 200 when light from the H1 position enters the light receiving unit 130.
  • FIG. 4 shows the state of the prism system 200 when light from the H2 position enters the light receiving unit 130.
  • FIG. 5 shows the state of the prism system 200 when light from the H3 position enters the light receiving unit 130.
  • FIG. 6 is a graph showing the dependence of the received light intensity I on the rotation angle θ of the prism system 200.
  • FIG. 7 shows, in table format, the correspondence between the rotation angle θ of the prism system 200 and the height H of the object.
  • FIG. 8 is a flowchart showing an example of a measurement method for measuring the shape of an object with light.
  • FIG. 9 schematically shows an example of the functional configuration of the optical sensor 910 in the second embodiment.
  • FIG. 10 schematically shows the irradiated line light together with the signal of the received light intensity I obtained by the light receiving unit 130.
  • FIG. 11 schematically shows an example of the functional configuration of the optical sensor 1110 in the third embodiment.
  • FIG. 12 schematically shows the arrangement of the light receiving elements in the light receiving unit 1130 and the intensity distribution of reflected light.
  • FIG. 13 schematically shows the intensity distribution of reflected light when the rotation angle θ of the prism system 200 is changed.
  • FIG. 14 schematically shows a light projecting system 90 and a light receiving system 1460 of an optical sensor 1410 as a first modification of the first embodiment.
  • FIG. 15 schematically shows a light projecting system 90 and a light receiving system 1560 of an optical sensor 1510 as a second modification of the first embodiment.
  • FIG. 16 schematically shows a light projecting system 90 and a light receiving system 1660 of an optical sensor 1610 as a third modification of the first embodiment.
  • FIG. 17 schematically shows an example of the functional configuration of the optical sensor 1710 in the fourth embodiment.
  • FIG. 18 schematically shows an example of the functional configuration of the optical sensor 1810 in the fifth embodiment.
  • FIG. 19 illustrates an example of a computer 2000 in which embodiments can be implemented in whole or in part.
  • FIG. 1 schematically shows an example of a functional configuration of the optical sensor 10 in the first embodiment.
  • the optical sensor 10 includes a measuring device 20, a driving device 110, a light projecting system 90, and a light receiving system 60.
  • the light projecting system 90 includes a light emitting unit 180 and a lens 170.
  • the light receiving system 60 includes an optical element 100, a lens 120, and a light receiving unit 130.
  • the measurement apparatus 20 includes a control unit 140 and a measurement unit 150.
  • the measuring device 20 measures the shape of the object with light. Specifically, the measuring device 20 measures the shape of the object by measuring the height of the object. In the present embodiment, the measuring device 20 measures the height H of the surface of the object from a predetermined reference plane. In the present embodiment, “height” indicates the distance from the reference plane in the direction along the optical axis of the light projecting system 90. In the present embodiment, the height of the surface of the object from a predetermined reference plane may be referred to as “the height of the object”.
  • the surface of the object is at a height H1 from the reference surface
  • the surface of the object is at a height H2 from the reference surface
  • the surface of the object is at a height H3 from the reference surface.
  • the reflected light of an object whose surface is at a height H1 from the reference plane may be referred to as light from the H1 position.
  • the reflected light of the object whose surface is at a height of H2 from the reference plane is referred to as light from the H2 position
  • the reflected light of the object whose surface is at a height of H3 from the reference plane is referred to as light from the H3 position.
  • the direction or the like may be expressed using an xyz coordinate system.
  • the z-axis of the orthogonal coordinate system is determined in a direction parallel to the optical axis of the light projecting system 90.
  • the direction in which the light from the light projecting system 90 travels is the negative z-axis direction.
  • the y axis is determined so that the plane including the optical axis of the light projecting system 90 and the optical axis of the light receiving system 60 is parallel to the yz plane.
  • the x-axis, y-axis, and z-axis are a right-handed orthogonal coordinate system.
  • an XYZ coordinate system may be used to indicate the direction of the light receiving unit 130 or the like.
  • the Z axis is determined in a direction parallel to the optical axis of the light receiving unit 130.
  • the X axis is defined in the same direction as the x axis.
  • the X axis, Y axis, and Z axis are a right-handed orthogonal coordinate system.
  • the light emitting unit 180 emits light for measuring the shape of the object.
  • the light emitting unit 180 is a laser diode (LD), for example, and emits laser light.
  • the light emitted from the light emitting unit 180 enters the object.
  • the light incident on the object is reflected by the surface of the object. At least a part of the light reflected by the surface of the object enters the light receiving system 60.
  • the light incident on the light receiving system 60 passes through the optical element 100 and the lens 120 and enters the light receiving unit 130.
  • the light receiving unit 130 is a member for detecting light from the object.
  • the light receiving unit 130 includes a single light receiving element.
  • the light receiving element may be a photoelectric conversion element such as a photodiode.
  • the optical element 100 is provided in the optical path of light from the object.
  • the optical element 100 changes the optical path of light from the object.
  • the optical element 100 is an optical element whose amount of change in the optical path is variable.
  • the control unit 140 changes the optical path with respect to the light receiving unit 130 by changing the state of the optical element 100.
  • the driving device 110 changes the change amount of the optical path changed by the optical element 100 according to the control of the control unit 140.
  • the light from the H1 position passes through the optical element 100 and enters the light receiving unit 130.
  • the driving device 110 can change the change amount of the optical path by the optical element 100 so that the light from the H2 position enters the light receiving unit 130. Further, the driving device 110 can change the amount of change of the optical path by the optical element 100 so that light from the H3 position enters the light receiving unit 130.
  • the control unit 140 changes the optical path of light from the object with respect to the light receiving unit 130. Specifically, the control unit 140 changes the amount of change of the optical path by the optical element 100 by controlling the driving device 110. The measurement unit 150 then measures the shape of the object based on the optical path when the light receiving unit 130 detects the light from the object. For example, the measurement unit 150 acquires the drive amount of the driving device 110 from the control unit 140 and calculates the height H of the surface of the object from the reference plane based on the drive amount at the time the light receiving unit 130 detects light from the object. In this way, the measurement unit 150 measures the surface shape of the object.
  • the shape of the object can be measured by controlling the optical element 100 so that the light receiving unit 130 receives light from the object. Therefore, compared with the case where the shape of the object is measured from the light receiving level of the two-dimensional sensor, the measurement result of the object shape can be made less susceptible to the element arrangement of the two-dimensional sensor.
  • FIG. 2 schematically shows a prism system 200 as an example of the optical element 100.
  • the prism system 200 includes a first prism 101 and a second prism 102. Each of the first prism 101 and the second prism 102 is a wedge prism.
  • the prism system 200 forms a Risley prism pair.
  • the prism system 200 changes the optical path of light passing through the prism system 200. Specifically, as shown in FIG. 2, the prism system 200 deflects incident light to the prism system 200. More specifically, the incident light to the prism system 200 is deflected by the first prism 101 and the second prism 102, respectively. Thereby, the outgoing light from the prism system 200 travels in a different direction from the incident light.
  • the first prism 101 and the second prism 102 are provided to be rotatable about the optical axis AX of the prism system 200.
  • the driving device 110 rotates the first prism 101 and the second prism 102 in the opposite directions.
  • the control unit 140 outputs an angle θ indicating the rotation angle of the first prism 101 and the second prism 102 to the driving device 110 as a driving amount.
  • the driving device 110 rotates the first prism 101 and the second prism 102 such that the rotation angle of the first prism 101 is θ from the reference angle and the rotation angle of the second prism 102 is −θ from the reference angle. Thereby, the deflection angle of the outgoing light with respect to the incident light changes according to the rotation angle θ.
  • the control unit 140 changes the optical path with respect to the light receiving unit 130 by changing the directions of the first prism 101 and the second prism 102.
  • the directions of the first prism 101 and the second prism 102 correspond to the rotation angle around the optical axis AX of the prism system 200.
  • the measurement unit 150 can measure the shape of the object based on the orientation of the prism when the light receiving unit 130 detects light from the object. It will be described later.
  • the prism system 200 is an example of an optical element that changes the optical path by refraction.
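  • as a rough illustration of how the counter-rotation of the wedge pair translates into a one-dimensional change of the optical path, the following Python sketch uses the thin-prism approximation; the wedge angle, refractive index, and function names are illustrative assumptions, not values from the specification.

```python
import math

def wedge_deviation(wedge_angle_deg, n=1.5):
    """Thin-prism approximation: a single wedge deviates a ray by roughly (n - 1) * alpha."""
    return (n - 1.0) * wedge_angle_deg

def risley_deflection(theta_deg, wedge_angle_deg=2.0, n=1.5):
    """Net deviation of a counter-rotated wedge pair (rotation angles +theta and -theta).

    With the two wedges counter-rotated by the same angle, the components
    perpendicular to the scan plane cancel and the in-plane deviation varies
    roughly as 2 * delta * cos(theta), where delta is the single-wedge deviation.
    """
    delta = wedge_deviation(wedge_angle_deg, n)
    return 2.0 * delta * math.cos(math.radians(theta_deg))

if __name__ == "__main__":
    for theta in (0, 45, 90, 135, 180):
        print(f"theta = {theta:3d} deg -> in-plane deviation ~ {risley_deflection(theta):+.3f} deg")
```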
  • the driving device 110 may be a galvano motor.
  • various actuators such as a servo motor and a stepping motor can be applied in addition to the galvano motor.
  • the sign of the rotation angle θ is taken to be positive in the clockwise direction when viewed along the traveling direction of the incident light.
  • the rotation angle of the first prism 101 with respect to the reference angle may be referred to as “the rotation angle of the prism system 200”.
  • FIG. 3 shows a state of the prism system 200 when light from the H1 position enters the light receiving unit 130. This state is referred to as a first state.
  • the incident-side surface of the first prism 101 is parallel to the exit-side surface of the second prism 102. In this case, the light from the H1 position travels substantially straight in the yz plane.
  • FIG. 4 shows a state of the prism system 200 when light from the H2 position enters the light receiving unit 130. This state is referred to as a second state.
  • the second state is a state in which the first prism 101 is rotated by −90° and the second prism 102 is rotated by 90° with respect to the first state of FIG. 3.
  • FIG. 5 shows the state of the prism system 200 when light from the H3 position enters the light receiving unit 130. This state is referred to as a third state.
  • the third state is a state in which the first prism 101 is rotated by 90° and the second prism 102 is rotated by −90° with respect to the first state of FIG. 3.
  • FIG. 6 is a graph showing the dependence of the received light intensity I on the rotation angle θ of the prism system 200.
  • the control unit 140 controls the driving device 110 to change the rotation angle θ of the prism system 200 with time.
  • based on the received light amount signal output from the light receiving unit 130, the measurement unit 150 detects the temporal change in the received light intensity I of the light receiving unit 130. As described above, when the rotation angle θ of the prism system 200 is changed, the peak of the received light intensity I of the light receiving unit 130 is obtained at an angle corresponding to the position of the surface of the object.
  • the measurement unit 150 analyzes the received light intensity I of the light receiving unit 130 to identify the rotation angle θ1 of the prism system 200 at which the peak of the received light intensity I is obtained.
  • the measurement unit 150 calculates the height H of the surface of the object based on the identified rotation angle θ1.
  • FIG. 7 shows the correspondence between the rotation angle θ of the prism system 200 and the height H of the object in a table format.
  • the measurement unit 150 stores association information that associates the rotation angle θ of the prism system 200 with the height H of the object.
  • here, the rotation angle θ is the rotation angle at which the peak of the received light intensity I is obtained.
  • FIG. 7 shows a table as an example of the association information. The table associates a height H of the object with each of a plurality of rotation angles θ.
  • the measurement unit 150 refers to the table and specifies the H that the table associates with the rotation angle θ as the height of the object. For example, when the rotation angle of the prism system 200 at which the peak of the received light intensity I is obtained is θ1, the measurement unit 150 specifies the H1 associated with θ1 as the height of the object. When the rotation angles θ listed in the association information are discrete values, the measurement unit 150 may take, as the height of the object, the height H associated with the rotation angle θ closest to the identified rotation angle. Alternatively, the measurement unit 150 may calculate the height of the object by interpolation or the like using a plurality of values of H associated with a plurality of rotation angles θ.
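  • a minimal Python sketch of such table-based association information follows; the table values, function names, and the choice between nearest-neighbour lookup and linear interpolation are hypothetical, since the specification only requires that the peak rotation angle θ be mapped to a height H.

```python
import bisect

# Hypothetical association table: rotation angle theta (deg) -> object height H (mm).
THETA_TO_H = [(-90.0, 0.0), (-45.0, 2.5), (0.0, 5.0), (45.0, 7.5), (90.0, 10.0)]

def height_nearest(theta):
    """Return the H associated with the tabulated theta closest to the measured one."""
    return min(THETA_TO_H, key=lambda row: abs(row[0] - theta))[1]

def height_interpolated(theta):
    """Linearly interpolate H between the two tabulated angles that bracket theta."""
    thetas = [row[0] for row in THETA_TO_H]
    i = bisect.bisect_left(thetas, theta)
    if i <= 0:
        return THETA_TO_H[0][1]
    if i >= len(THETA_TO_H):
        return THETA_TO_H[-1][1]
    (t0, h0), (t1, h1) = THETA_TO_H[i - 1], THETA_TO_H[i]
    return h0 + (h1 - h0) * (theta - t0) / (t1 - t0)

if __name__ == "__main__":
    print(height_nearest(50.0))        # nearest tabulated entry (45 deg -> 7.5)
    print(height_interpolated(50.0))   # interpolation between 45 deg and 90 deg
```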
  • the height H of the object is determined, based on the principle of triangulation, from the positions of the light emitting unit 180 and the light receiving unit 130, the angle of the optical axis of the irradiation light from the light emitting unit 180, the angle of the optical axis of the optical system composed of the light receiving unit 130 and the lens 120, and the amount of optical path change by the prism system 200.
  • in triangulation, the angle from each of two reference points to a measurement target point is measured, and the position of the measurement target point is calculated from the measured angles, the positions of the two reference points, and the base line connecting the two reference points.
  • when the prism system 200 does not change the optical path of the incident light, the light emitting unit 180 and the light receiving unit 130 correspond to the two reference points in triangulation, and the angle of the optical axis of the irradiation light from the light emitting unit 180 and the angle of the optical axis of the optical system composed of the light receiving unit 130 and the lens 120 correspond to the angles from the respective reference points to the measurement target point.
  • the position of the triangulation reference point corresponding to the light receiving unit 130 and the angle from that reference point to the measurement target point change according to the amount of optical path change by the prism system 200.
  • when the rotation angle θ of the prism system 200 is determined, the position of the light receiving unit 130 and the angle of the optical axis of the optical system composed of the light receiving unit 130 and the lens 120 are determined. Therefore, the height H of the object can be calculated from the positions of the light emitting unit 180 and the light receiving unit 130, the angle of the optical axis of the light emitted from the light emitting unit 180, the angle of the optical axis of the optical system composed of the light receiving unit 130 and the lens 120, and the rotation angle θ of the prism system 200.
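  • the triangulation relation above can be sketched for a simplified two-dimensional geometry; in the following Python snippet the projected beam is assumed to travel along the z axis and the effective viewing ray of the light receiving system (already including the optical path change by the prism system 200) is described by a reference point and an angle. The layout and numbers are assumptions for illustration, not values from the specification.

```python
import math

def surface_height(y_receiver, z_receiver, beta_deg):
    """Simplified 2-D triangulation in the y-z plane.

    The projected beam travels along the z axis at y = 0. The effective viewing
    ray of the receiving system passes through the point (y_receiver, z_receiver)
    and makes the angle beta with the z axis. The spot seen by the receiver is
    the intersection of the two rays; its z coordinate is the height H measured
    from the reference plane z = 0.
    """
    beta = math.radians(beta_deg)
    if abs(math.tan(beta)) < 1e-12:
        raise ValueError("viewing ray parallel to the projection axis")
    return z_receiver - y_receiver / math.tan(beta)

if __name__ == "__main__":
    # Hypothetical layout: receiver reference point 30 mm to the side and 100 mm
    # above the reference plane, viewing ray tilted 20 degrees from the z axis.
    print(f"H ~ {surface_height(30.0, 100.0, 20.0):.2f} mm")
```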
  • applying the above-described relationship, the measurement unit 150 may calculate the height H of the object without using a table, from the positions of the light emitting unit 180 and the light receiving unit 130, the angle of the optical axis of the irradiation light from the light emitting unit 180, the angle of the optical axis of the optical system composed of the light receiving unit 130 and the lens 120, and the rotation angle θ of the prism system 200 when the peak of the received light intensity I is obtained.
  • that is, the measurement unit 150 may calculate the height H of the object by computation using the rotation angle θ of the prism system 200 when the peak of the received light intensity I is obtained.
  • for this computation, a function may be used that takes as arguments the positions of the light emitting unit 180 and the light receiving unit 130, the angle of the optical axis of the irradiation light from the light emitting unit 180, the angle of the optical axis of the optical system composed of the light receiving unit 130 and the lens 120, and the rotation angle θ of the prism system 200.
  • the table and the function are examples of association information for specifying the height H of the object based on the rotation angle θ of the prism system 200.
  • the association information is not limited to a table or a function; various forms of data other than tables and functions can be adopted as the association information.
  • FIG. 8 is a flowchart showing an example of a measurement method for measuring the shape of an object with light.
  • the control unit 140 controls the driving device 110 to rotate the prism system 200 until the rotation angle of the prism system 200 reaches a predetermined scanning start angle.
  • the control unit 140 controls the light emitting unit 180 so that the light emitting unit 180 emits light.
  • the measurement unit 150 starts reading the received light amount signal from the light receiving unit 130.
  • the control unit 140 controls the driving device 110 to start the rotation of the prism system 200.
  • the control unit 140 determines whether or not the rotation angle of the prism system 200 has reached a predetermined scanning end angle. If it is determined in S810 that the rotation angle of the prism system 200 has not reached the scanning end angle, the determination in S810 is repeated until the rotation angle of the prism system 200 reaches the scanning end angle.
  • the measurement unit 150 calculates, based on the data of the received light intensity I obtained from the light receiving unit 130, the rotation angle θ at which the received light intensity I of the light receiving unit 130 reaches its peak.
  • the measurement unit 150 calculates the height H of the object based on the rotation angle θ at which the received light intensity I of the light receiving unit 130 reaches its peak.
  • the measurement apparatus 20 outputs the data on the height H of the object calculated in S814 to the outside of the measurement unit 150, and ends the process.
  • the prism system 200 is rotated until the scanning end angle is reached.
  • the control unit 140 may stop the rotation of the prism system 200 when the peak of the received light intensity I is detected.
  • the control unit 140 may change the optical path with respect to the light receiving unit 130 until the light receiving unit 130 detects light from the object.
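  • a compact Python sketch of the scan procedure of FIG. 8 might look as follows; the `drive`, `sensor`, and `theta_to_height` interfaces are hypothetical stand-ins for the driving device 110, the light receiving unit 130, and the association information, and only the overall flow (rotate from a scan start angle to a scan end angle, record the received light intensity I, find the peak angle, convert it to a height H) follows the flowchart.

```python
import math

class SimDrive:
    """Hypothetical stand-in for the driving device 110."""
    def __init__(self):
        self.theta = 0.0
    def rotate_to(self, theta):
        self.theta = theta

class SimSensor:
    """Hypothetical stand-in for the light receiving unit 130; its reading peaks at theta = 30 deg."""
    def __init__(self, drive):
        self.drive = drive
    def read(self):
        return math.exp(-((self.drive.theta - 30.0) / 5.0) ** 2)

def measure_height(drive, sensor, theta_start, theta_end, theta_step, theta_to_height):
    """Scan the prism rotation angle, find the intensity peak, and convert it to a height."""
    samples = []
    theta = theta_start
    while theta <= theta_end:                   # loop until the scanning end angle (cf. S810)
        drive.rotate_to(theta)                  # set the prism rotation angle
        samples.append((theta, sensor.read()))  # record the received light intensity I
        theta += theta_step
    theta_peak, _ = max(samples, key=lambda s: s[1])  # angle at which I peaks
    return theta_to_height(theta_peak)                # height H from the peak angle (cf. S814)

if __name__ == "__main__":
    drive = SimDrive()
    sensor = SimSensor(drive)
    # Illustrative linear association between the peak angle and the height H.
    print(measure_height(drive, sensor, -90.0, 90.0, 1.0, lambda t: 5.0 + 0.05 * t))
```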
  • the function of the measuring apparatus 20 may be realized by a computer.
  • the functions of the control unit 140 and the measurement unit 150 may be realized by a processor included in a computer.
  • the program loaded in the computer may cause the processor to execute the procedure shown in the flowchart of FIG. 8 and the procedure for realizing the functions of the control unit 140 and the measurement unit 150.
  • the control unit 140 outputs control signals for the driving device 110, the light receiving unit 130, and the light emitting unit 180, and the measurement unit 150 may acquire, from the light receiving unit 130, a received light amount signal indicating the amount of received light.
  • a specific hardware configuration of the measuring device 20 will be described later.
  • in the optical sensor 10, it is not necessary to provide a plurality of light receiving elements two-dimensionally as in a two-dimensional sensor. Moreover, it is not necessary to calculate a peak from the outputs of a plurality of light receiving elements of a two-dimensional sensor. Therefore, the measurement result is not easily affected by the resolving power of the light receiving elements of a two-dimensional sensor. For example, it is possible to suppress the degradation of the measurement accuracy of the height H of the object that would otherwise be caused by the pitch width of the light receiving elements of a two-dimensional sensor.
  • in the optical sensor 10, the prism system 200 is used as the optical element 100. Thereby, even when the optical path is changed, the optical path length does not change greatly, so the distortion and spread of the light do not differ greatly between optical paths. Therefore, the measurement accuracy of the height H of the object can be increased.
  • the light receiving unit 130 includes a single light receiving element.
  • a line sensor can be used instead of the light receiving unit 130.
  • a mode in which a line sensor is used as the light receiving unit will be described as a second embodiment.
  • FIG. 9 schematically shows an example of the functional configuration of the optical sensor 910 in the second embodiment.
  • the optical sensor 910 includes a measuring device 20, a driving device 110, a light projecting system 990, and a light receiving system 960.
  • the light projecting system 990 includes a light emitting unit 980 and a lens 970.
  • the light receiving system 960 includes an optical element 100, a lens 120, and a light receiving unit 930.
  • the measurement apparatus 20 includes a control unit 140 and a measurement unit 150.
  • among the constituent elements included in the optical sensor 910, those having substantially the same functions as the constituent elements included in the optical sensor 10 in the first embodiment are denoted by the same reference numerals. Therefore, regarding the optical sensor 910, differences from the optical sensor 10 will be mainly described, and redundant description may be omitted.
  • the light emitting unit 980 emits light having a spread at least in the x-axis direction.
  • the lens 970 converts the light from the light emitting unit 980 into line light having a spread in the x-axis direction and irradiates the object.
  • the light projecting system 990 irradiates the target with line light having a spread in the x-axis direction. That is, the light projecting system 990 emits line light that spreads in a direction substantially orthogonal to the yz plane.
  • the light receiving unit 930 is a line sensor.
  • the light receiving unit 930 is arranged so that the longitudinal direction is along the x-axis.
  • the plurality of light receiving elements of the light receiving unit 930 are arranged in a direction along the x axis. That is, the light receiving unit 930 is provided so as to be substantially orthogonal to a plane including the optical axis of the irradiation light and the optical axis of the light receiving unit 930.
  • the optical element 100 changes the optical path in the yz plane. Therefore, the light receiving unit 930 is provided so as to be substantially orthogonal to the direction of change of the optical path by the optical element 100.
  • FIG. 10 schematically shows irradiated line light together with a signal of received light intensity I obtained by the light receiving unit 130.
  • FIG. 10 is a view of the arrangement of the light receiving unit 930, the light emitting unit 980, and the lens 970 and the state of the line light viewed from the direction along the y-axis.
  • the line light is represented by a solid line.
  • the range of reflected light that can be received by the light receiving unit 930 is represented by a broken line in FIG.
  • the alternate long and short dash line indicates the optical path of the reflected light from the surface of the portion where the height H of the object 1000 is low.
  • a graph 1001 shows the received light intensity I of the reflected light from the surface of the portion where the height H of the object 1000 is low.
  • the two-dot chain line indicates the optical path of the reflected light from the surface of the portion where the height H of the object 1000 is high.
  • a graph 1002 shows the received light intensity I of the reflected light from the surface of the portion where the height H of the object 1000 is high. Both the graph 1001 and the graph 1002 show the dependence of the received light intensity I on the rotation angle θ of the prism system 200.
  • as shown, the rotation angle θ at which the peak of the received light intensity I is obtained varies depending on the height H of the surface of the object 1000.
  • the measurement unit 150 can therefore calculate the height H of the position corresponding to each light receiving element from the rotation angle θ at which the received light intensity I peaks at that light receiving element of the light receiving unit 930.
  • the shape of the object 1000 can be measured by applying the light cutting method.
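  • for the line-sensor case, the light-section idea above can be sketched as follows in Python; the data layout (`intensity[i][k]`), the sample values, and the θ-to-H mapping are all hypothetical, and the peak is taken as a simple argmax for brevity.

```python
def height_profile(thetas, intensity, theta_to_height):
    """Light-section sketch for a line sensor (hypothetical data layout).

    thetas:          prism rotation angles used during the scan
    intensity[i][k]: received light intensity of line-sensor element i at angle thetas[k]
    theta_to_height: mapping from a peak rotation angle to a height H

    For every element (every x position on the line light), the rotation angle
    giving the intensity peak is converted to a height, yielding a profile H(x).
    """
    profile = []
    for element_samples in intensity:
        k_peak = max(range(len(thetas)), key=lambda k: element_samples[k])
        profile.append(theta_to_height(thetas[k_peak]))
    return profile

if __name__ == "__main__":
    thetas = [-10.0, 0.0, 10.0, 20.0]
    intensity = [
        [0.1, 0.9, 0.3, 0.1],   # element 0: peak at theta = 0
        [0.1, 0.2, 0.8, 0.3],   # element 1: peak at theta = 10
    ]
    print(height_profile(thetas, intensity, lambda t: 5.0 + 0.1 * t))
```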
  • in comparison with an area sensor, a line sensor can easily increase the number of light receiving elements per row. Therefore, by applying a line sensor as the light receiving unit 930, the measurable range of the surface shape of the object 1000 can be expanded compared with the case of applying an area sensor.
  • a line sensor can also relatively easily increase the arrangement density of the light receiving elements arranged in a row. Therefore, by applying a line sensor as the light receiving unit 930, the spatial resolution in the x-axis direction of the object 1000 can be increased compared with the case where an area sensor is applied. Further, as described in relation to the optical sensor 10 of the first embodiment, since the optical path is changed by the optical element 100 and the light is then detected by the light receiving unit 930, the measurement result is not easily influenced by the resolving power of the plurality of light receiving elements. Therefore, for example, it is possible to suppress the degradation of the measurement accuracy of the height H of the object 1000 that would otherwise be caused by the pitch width of the plurality of light receiving elements.
  • FIG. 11 schematically illustrates an example of a functional configuration of the optical sensor 1110 according to the third embodiment.
  • the optical sensor 1110 includes a measuring device 1120, a driving device 110, a light projecting system 990, and a light receiving system 1160.
  • the light projecting system 990 includes a light emitting unit 980 and a lens 970.
  • the light receiving system 1160 includes the optical element 100, a lens 120, and a light receiving unit 1130.
  • the measuring device 1120 includes a control unit 140, a measuring unit 150, and a switching unit 1142.
  • among the components of the optical sensor 1110, those having substantially the same functions as the components provided in the optical sensor 910 in the second embodiment are denoted by the same reference numerals. Therefore, regarding the optical sensor 1110, differences from the optical sensor 910 will be mainly described, and redundant description may be omitted.
  • the light receiving unit 1130 is an area sensor.
  • the light receiving unit 1130 is arranged so that the longitudinal direction is along the x-axis.
  • the control unit 140 changes the optical path with respect to the light receiving unit 1130.
  • the measurement unit 150 measures the shape of the object based on the optical path when light from the object is detected at a predetermined position on the light receiving unit 1130.
  • in the first measurement mode, as described in relation to the second embodiment, the height H of the object is measured by applying the light cutting method while the optical path is changed by the optical element 100 and the received light intensity I of each of the plurality of light receiving elements provided in a specific row is detected.
  • in the second measurement mode, the control unit 140 causes the light receiving unit 1130 to receive light from the object in a state where the optical path is fixed with respect to the light receiving unit 1130. The measurement unit 150 then measures the shape of the object based on the position on the light receiving unit 1130 where the light from the object is detected. Specifically, the measurement unit 150 calculates the height H of the object from the received light intensity distribution obtained from the plurality of light receiving elements arranged in the Y direction in the light receiving unit 1130.
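  • a minimal Python sketch of this second measurement mode follows, assuming a hypothetical captured frame and a hypothetical linear calibration from a (fractional) row position to a height H; the specification does not fix how the detected position is estimated, so a simple intensity centroid along the Y direction is used here.

```python
def second_mode_heights(frame, row_to_height):
    """Sketch of the second measurement mode on an area sensor.

    frame:         frame[y][x] = received light intensity at row y, column x,
                   captured with the optical path fixed (hypothetical layout).
    row_to_height: calibration mapping a (possibly fractional) row position to
                   a height H, assumed to be supplied externally.
    """
    heights = []
    n_rows = len(frame)
    for x in range(len(frame[0])):
        column = [frame[y][x] for y in range(n_rows)]
        total = sum(column)
        if total == 0.0:
            heights.append(None)            # no reflected light detected in this column
            continue
        centroid = sum(y * v for y, v in enumerate(column)) / total
        heights.append(row_to_height(centroid))
    return heights

if __name__ == "__main__":
    frame = [
        [0.0, 0.1],
        [0.2, 0.8],
        [0.9, 0.7],
        [0.1, 0.0],
    ]
    print(second_mode_heights(frame, lambda row: 10.0 - 2.0 * row))
```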
  • the switching unit 1142 switches between the first measurement mode and the second measurement mode.
  • the switching unit 1142 may switch between the first measurement mode and the second measurement mode based on the selection of the user who uses the optical sensor 1110.
  • the switching unit 1142 may switch between the first measurement mode and the second measurement mode based on the required measurement accuracy. As an example, switching to the first measurement mode may be performed when high measurement accuracy is required, and switching to the second measurement mode may be performed when it is not.
  • FIG. 12 schematically shows the arrangement of the light receiving elements in the light receiving unit 1130 and the intensity distribution of reflected light.
  • in FIG. 12, the grid indicates the positions of the light receiving elements. Reflected light from a portion where the height H of the object is constant in the x-axis direction is incident on the light receiving unit 1130 with a band-shaped intensity distribution, indicated by the intensity distribution 1200 in FIG. 12.
  • the shading of the intensity distribution 1200 indicates the magnitude of the light intensity, with darker portions representing higher light intensity.
  • the peak position of the intensity distribution in the Y-axis direction of the light receiving unit 1130 exists between the row 1201 and the row 1202.
  • an error corresponding to the pitch width of the light receiving elements in the Y-axis direction can be included in the calculation result of the peak position. Therefore, an error corresponding to the pitch width of the light receiving elements in the Y-axis direction can be included in the measurement result of the shape of the object.
  • FIG. 13 schematically shows the intensity distribution of the reflected light when the rotation angle θ of the prism system 200 is changed.
  • the portion corresponding to the intensity distribution 1200 in FIG. 12 shifts in the Y-axis direction.
  • the measurement unit 150 calculates the height H of the object based on the rotation angle θ when the peak position of the intensity distribution matches the row 1201 and on the position of the row 1201. Thereby, since the peak position can be specified with high accuracy, the measurement error of the height H of the object can be reduced.
  • control unit 140 changes the optical path of light from the object in the yz plane so that the peak position of the intensity distribution matches the row of the light receiving elements.
  • the measurement unit 150 can reduce a measurement error of the height H of the target object using a plurality of measurement results obtained by the control unit 140 changing the optical path.
  • the optical element 100 may be configured to change the optical path in the xz plane instead of, or in addition to, changing the optical path in the yz plane. “Changing the optical path in the xz plane” means changing the optical path as projected onto the xz plane. If the optical path is changed in the xz plane, the position on the object that can be detected by a light receiving element of the light receiving unit 1130 can be shifted in the x-axis direction. Therefore, the control unit 140 changes the optical path of light from the object using the optical element 100 so that the position on the object that can be detected by a light receiving element of the light receiving unit 1130 is shifted by a distance less than the pitch width of the light receiving elements. The measurement unit 150 can then calculate the height H of the object with high spatial resolution using the multiple measurement results obtained as the control unit 140 changes the optical path.
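  • the refinement described above, in which the optical path is shifted until the intensity peak lands on a known row, can be sketched as follows; `scan(theta)` is a hypothetical interface returning the brightest row and its intensity for a given prism rotation angle, and the simulated numbers in the demo are arbitrary.

```python
def refine_peak_row(scan, target_row, thetas):
    """Return the prism rotation angle at which the intensity peak falls on target_row."""
    best = None
    for theta in thetas:
        row, intensity = scan(theta)           # brightest row and its intensity at this angle
        if row == target_row and (best is None or intensity > best[1]):
            best = (theta, intensity)
    return None if best is None else best[0]

if __name__ == "__main__":
    import math
    def scan(theta):
        # Simulated: the true peak sits at row 3.4 for theta = 0 and moves by
        # 0.1 row per degree of prism rotation (arbitrary numbers).
        peak = 3.4 + 0.1 * theta
        row = round(peak)
        return row, math.exp(-(peak - row) ** 2)
    print(refine_peak_row(scan, target_row=3, thetas=[t * 0.5 for t in range(-20, 21)]))
```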
  • the control unit 140 sets the optical path of the light from the object to the first optical path and causes the light from the object to enter the first region of the light receiving unit 1130. Further, the control unit 140 changes the optical path of light from the object from the first optical path to the second optical path and causes the light from the object to enter the second region of the light receiving unit 1130. The measurement unit 150 then measures the shape of the object based on the results of detecting the light from the object by the plurality of light receiving elements provided in the first region of the light receiving unit 1130 and by the plurality of light receiving elements provided in the second region of the light receiving unit 1130.
  • FIG. 14 schematically shows a light projecting system 90 and a light receiving system 1460 of an optical sensor 1410 as a first modification of the first embodiment.
  • the optical sensor 1410 is different from the optical sensor 10 in that a flat plate 1400 is used instead of the prism system 200 as the optical element 100. Therefore, with respect to the optical sensor 1410, differences from the optical sensor 10 will be mainly described, and overlapping description may be omitted.
  • the flat plate 1400 has an incident surface that is a surface on which light is incident and an output surface that is a surface from which light is emitted.
  • the entrance surface is parallel to the exit surface.
  • the entrance surface and the exit surface are provided in parallel to the x axis.
  • the flat plate 1400 is provided to be rotatable around an axis AX1 parallel to the x axis.
  • the control unit 140 controls the driving device 110 to rotate the flat plate 1400 around the axis AX1.
  • the control unit 140 outputs, to the driving device 110, information indicating the angle of the flat plate 1400 with respect to a predetermined reference angle.
  • the optical path from the H2 position is indicated by a solid line
  • the optical path from the H3 position is indicated by a broken line.
  • the control unit 140 changes the optical path with respect to the light receiving unit 130 by changing the direction of the flat plate 1400.
  • the measurement unit 150 measures the shape of the object based on the direction of the flat plate 1400 when the light receiving unit 130 detects the light from the object.
  • the flat plate 1400 is an example of an optical element that changes the optical path by refraction.
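  • the way a tilted plane-parallel plate shifts the optical path can be quantified with the standard lateral-displacement formula, as in the Python sketch below; the plate thickness and refractive index are illustrative values, not taken from the specification.

```python
import math

def lateral_shift(t_mm, n, tilt_deg):
    """Lateral displacement of a ray passing through a tilted plane-parallel plate.

    A plate of thickness t and refractive index n, hit at incidence angle i,
    transmits the ray parallel to itself but shifted sideways by
    d = t * sin(i - r) / cos(r), with sin(i) = n * sin(r).
    """
    i = math.radians(tilt_deg)
    r = math.asin(math.sin(i) / n)          # refraction angle inside the plate
    return t_mm * math.sin(i - r) / math.cos(r)

if __name__ == "__main__":
    for tilt in (0, 10, 20, 30):
        print(f"tilt {tilt:2d} deg -> shift {lateral_shift(5.0, 1.5, tilt):.3f} mm")
```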
  • FIG. 15 schematically shows a light projecting system 90 and a light receiving system 1560 of an optical sensor 1510 as a second modification of the first embodiment.
  • the optical sensor 1510 is different from the optical sensor 10 in that a flat mirror 1500 is used instead of the prism system 200 as the optical element 100. Therefore, with respect to the optical sensor 1510, differences from the optical sensor 10 will be mainly described, and redundant description may be omitted.
  • the flat mirror 1500 is provided to be rotatable around an axis AX2 parallel to the x axis.
  • the control unit 140 controls the driving device 110 to rotate the flat mirror 1500 around the axis AX2.
  • the control unit 140 outputs, to the driving device 110, information indicating the angle of the flat mirror 1500 with respect to a predetermined reference angle.
  • the optical path from the H2 position is indicated by a solid line
  • the optical path from the H3 position is indicated by a broken line.
  • the control unit 140 changes the optical path of light from the object with respect to the light receiving unit 130 by changing the direction of the flat mirror 1500.
  • the measurement unit 150 calculates the height H of the object based on the angle of the flat mirror 1500 when light is detected by the light receiving unit 130. In this way, the measurement unit 150 measures the shape of the object based on the orientation of the flat mirror 1500 when the light receiving unit 130 detects light from the object.
  • a flat mirror 1500 having one reflecting surface is illustrated.
  • the flat mirror 1500 is an example of an optical element that changes an optical path by reflection.
  • a polygon mirror, a galvanometer mirror, or the like can be applied instead of the flat mirror 1500.
  • FIG. 16 schematically shows a light projecting system 90 and a light receiving system 1660 of an optical sensor 1610 as a third modification of the first embodiment.
  • the optical sensor 1610 is different from the optical sensor 10 in that an acoustooptic deflector 1600 is used instead of the prism system 200 as the optical element 100. Therefore, with respect to the optical sensor 1610, differences from the optical sensor 10 will be mainly described, and redundant description may be omitted.
  • the acousto-optic deflector 1600 includes a first acousto-optic deflector 1601 and a second acousto-optic deflector 1602.
  • Each of the first acousto-optic deflector 1601 and the second acousto-optic deflector 1602 has an elastic medium to which ultrasonic waves are applied by a vibrator such as a piezoelectric element.
  • incident light can be diffracted by a diffraction grating formed by the density variations (compression waves) that the ultrasonic waves generate in the elastic medium.
  • the first acoustooptic deflector 1601 and the second acoustooptic deflector 1602 diffract incident light by the Bragg effect.
  • the incident angle at which diffraction occurs varies with the frequency of the ultrasonic wave. Therefore, the control unit 140 outputs information indicating the frequency f of the electric signal applied to the vibrator provided in the first acoustooptic deflector 1601 and the second acoustooptic deflector 1602 to the driving device 110.
  • the driving device 110 drives the respective vibrators of the first acousto-optic deflector 1601 and the second acousto-optic deflector 1602 at the frequency f specified by the control unit 140, thereby deflecting light incident at a specific incident angle.
  • the optical path from the H2 position is indicated by a solid line
  • the optical path from the H3 position is indicated by a broken line.
  • the optical path of the light from the object is moved using two acoustooptic deflectors and received by the light receiving unit 130.
  • the number of acoustooptic deflectors is not limited to two.
  • the light from the object may be received by the light receiving unit 130 by changing the angle of the optical path using one acousto-optic deflector.
  • the optical path can be changed using one or more acousto-optic deflectors.
  • the light from the object may be received by any one of the light receiving elements on the line sensor or the area sensor.
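  • for orientation, the amount by which an acousto-optic deflector steers the beam scales with the drive frequency; a common small-angle estimate puts the angle between the undiffracted and first-order beams at roughly λf/v. In the Python sketch below, the wavelength, drive frequencies, and acoustic velocity are illustrative assumptions, not values from the specification.

```python
import math

def aod_deflection_deg(wavelength_nm, drive_freq_mhz, acoustic_velocity_m_s):
    """Small-angle estimate of the Bragg deflection of an acousto-optic deflector.

    The angle between the undiffracted beam and the first diffraction order is
    approximately lambda * f / v (radians), so sweeping the drive frequency f
    sweeps the optical path.
    """
    wavelength_m = wavelength_nm * 1e-9
    f_hz = drive_freq_mhz * 1e6
    return math.degrees(wavelength_m * f_hz / acoustic_velocity_m_s)

if __name__ == "__main__":
    for f in (40, 60, 80):   # drive frequency in MHz
        print(f"f = {f} MHz -> deflection ~ {aod_deflection_deg(650, f, 4200):.3f} deg")
```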
  • in the optical sensor 1610, an electro-optic deflector may be applied in place of the acousto-optic deflector 1600.
  • examples of the electro-optic deflector include optical elements having an electro-optic effect, in which the refractive index changes according to an electric field, such as a KTN crystal and a liquid crystal.
  • the control unit 140 controls the optical path of light from the object by controlling the voltage applied to the electro-optic deflector.
  • the measurement unit 150 calculates the height H of the object based on the voltage applied to the electro-optic deflector when light is detected by the light receiving unit 130.
  • An electro-optic deflector is an example of an optical element that changes an optical path by refraction.
  • FIG. 17 schematically illustrates an example of a functional configuration of the optical sensor 1710 according to the fourth embodiment.
  • the optical sensor 1710 includes a measuring device 20, a driving device 110, a light projecting system 1790, and a light receiving system 1760.
  • the light projecting system 1790 includes a light emitting unit 980, a lens 970, and an optical element 1700.
  • the light receiving system 1760 includes a lens 120 and a light receiving unit 930.
  • the measurement apparatus 20 includes a control unit 140 and a measurement unit 150.
  • among the components of the optical sensor 1710, those having substantially the same functions as the components provided in the optical sensor 910 in the second embodiment are denoted by the same reference numerals. Therefore, regarding the optical sensor 1710, differences from the optical sensor 910 will be mainly described, and redundant description may be omitted.
  • the optical element 1700 is provided in an optical path of incident light that is light incident on an object.
  • the optical element 1700 changes the optical path of the incident light to the object in at least one of the y-axis direction and the x-axis direction.
  • the optical element 1700 deflects incident light on the object.
  • the light from the position on the optical axis of the light receiving unit 930 is incident on the light receiving unit 930.
  • the measurement unit 150 calculates the height H3 of the object at the position P in the y-axis direction based on the angle at which the optical element 1700 deflects the incident light.
  • the measurement unit 150 calculates the position at which the height is H1 from the position of the light receiving element, among the plurality of light receiving elements included in the light receiving unit 930, at which the light is detected, and from the deflection angle of the incident light by the optical element 1700.
  • in this way, the measurement unit 150 can measure the shape of the object by the optical path of the incident light to the object being changed.
  • when the optical element 1700 deflects the incident light in the x-axis direction, the position on the object that can be detected by a light receiving element of the light receiving unit 930 can be shifted. Therefore, according to the optical sensor 1710, a position having the height H1 in the object can be detected with high spatial resolution.
  • the control unit 140 changes the optical path of the light incident on the object with respect to the light receiving unit 930.
  • the measurement unit 150 measures the shape of the object based on the optical path when the light receiving unit 930 detects the light from the object.
  • a mode may also be employed in which the light projecting system and the light receiving system each include an optical element. That is, the control unit 140 may change, with respect to the light receiving unit 930, the optical paths of both the light incident on the object and the light from the object.
  • FIG. 18 schematically illustrates an example of a functional configuration of the optical sensor 1810 according to the fifth embodiment.
  • the optical sensor 1810 includes a measuring device 20, a driving device 110, a light projecting system 990, and a light receiving system 1860.
  • the light projecting system 990 includes a light emitting unit 980 and a lens 970.
  • the light receiving system 1860 includes a lens 120 and a light receiving unit 930.
  • the measurement apparatus 20 includes a control unit 140 and a measurement unit 150.
  • among the components of the optical sensor 1810, those having substantially the same functions as the components provided in the optical sensor 910 in the second embodiment are denoted by the same reference numerals. Therefore, regarding the optical sensor 1810, differences from the optical sensor 910 will be mainly described, and redundant description may be omitted.
  • the light receiving unit 930 is provided to be movable in a direction orthogonal to the optical axis of the light receiving unit 930.
  • the driving device 110 moves the light receiving unit 930 in a direction along the Y axis.
  • the driving device 110 acquires information indicating the amount of movement of the light receiving unit 930 from the reference position from the control unit 140.
  • the driving device 110 moves the light receiving unit 930 in the Y-axis direction of the light receiving unit 930 according to the information acquired from the control unit 140. In this way, the control unit 140 changes the position of the light receiving unit 930 with respect to the optical path of light from the object.
  • the light receiving unit 930 can scan the light from the object in the Y-axis direction.
  • the measurement unit 150 calculates the height H of the object based on the position of the light receiving unit 930 in the Y-axis direction when the peak of the received light intensity of the light receiving unit 930 is obtained.
  • with a line sensor, the number of light receiving elements per row can be increased easily compared with an area sensor. Therefore, by applying a line sensor as the light receiving unit 930, the measurable range of the surface shape of the object in the x-axis direction can be widened compared with the case of applying an area sensor.
  • the spatial resolution in the x-axis direction of the surface shape of the object can be increased as compared with the case where an area sensor is applied.
  • the light receiving unit 930 can be scanned in the Y-axis direction, so that the measurable range and spatial resolution in the Y-axis direction can be increased.
  • control unit 140 may change the position of the light receiving unit 930 until light from the object is detected by the light receiving unit 930.
  • the control unit 140 may stop the change in the position of the light receiving unit 930 when the light receiving unit 930 detects light from the object.
  • the light receiving system detects a part of the diffuse reflected light generated by the object.
  • a method for detecting total reflection light generated by an object may be applied.
  • an example of the measurement target of the optical sensor in each embodiment described above is an electrode formed on a substrate.
  • the measurement target of the optical sensor may be a bump or the like formed on the substrate.
  • the measurement target of the optical sensor may be the inner wall of the tunnel.
  • the measurement target of the optical sensor is not limited to these.
  • the measurement accuracy and measurement range can be made less affected by the element arrangement of the two-dimensional sensor, compared to the case where only the two-dimensional sensor is used.
  • Each block in the block diagram may represent (1) a stage of the process in which the operation is performed or (2) a section of the device responsible for performing the operation.
  • each of the control unit 140 and the measurement unit 150 may represent one section of the measurement device 20.
  • Certain stages and sections may be implemented by dedicated circuitry, by programmable circuitry supplied with computer-readable instructions stored on a computer-readable medium, and/or by a processor supplied with computer-readable instructions stored on a computer-readable medium.
  • Dedicated circuitry may include digital and / or analog hardware circuitry and may include integrated circuits (ICs) and / or discrete circuits.
  • Programmable circuitry may include reconfigurable hardware circuits comprising logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logical operations, as well as flip-flops, registers, and memory elements such as field-programmable gate arrays (FPGA) and programmable logic arrays (PLA).
  • Computer-readable media may include any tangible device capable of storing instructions to be executed by a suitable device, such that the computer-readable medium having the instructions stored thereon constitutes at least part of a product including instructions that may be executed to provide means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • Computer-readable media may include floppy disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile discs (DVD), Blu-ray (RTM) discs, memory sticks, integrated circuit cards, and the like.
  • Computer-readable instructions may include assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk, JAVA, C++, and the like, and conventional procedural programming languages such as the "C" programming language or similar programming languages.
  • Computer-readable instructions may be provided to a processor or programmable circuitry of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus, either locally or via a local area network (LAN), a wide area network (WAN) such as the Internet, or the like, and the computer-readable instructions may be executed to provide means for performing the operations specified in the flowcharts or block diagrams.
  • processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 19 illustrates an example of a computer 2000 in which a plurality of embodiments may be embodied in whole or in part.
  • A program installed in the computer 2000 can cause the computer 2000 to function as the apparatus of the embodiments or as one or more sections of that apparatus, to perform operations associated with that apparatus or those sections, and/or to execute the process of the embodiments or stages of that process.
  • Such a program may be executed by CPU 2012 to cause computer 2000 to perform certain operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
  • the computer 2000 includes a CPU 2012, a RAM 2014, a graphic controller 2016, and a display device 2018, which are connected to each other by a host controller 2010.
  • Computer 2000 also includes ROM 2030.
  • the ROM 2030 is connected to the host controller 2010 via the input / output controller 2020.
  • the computer 2000 also includes input / output units such as a communication interface 2022, a hard disk drive 2024, a DVD-ROM drive 2026, and a memory card drive 2028, which are connected to the host controller 2010 via the input / output controller 2020.
  • the computer 2000 also includes legacy input / output units such as a keyboard 2042, which are connected to an input / output controller 2020 via an input / output chip 2040.
  • the CPU 2012 operates according to a program stored in the ROM 2030 and / or the RAM 2014, thereby controlling each unit.
  • The graphic controller 2016 acquires image data generated by the CPU 2012 in a frame buffer or the like provided in the RAM 2014, or in the RAM 2014 itself, and causes the display device 2018 to display the image data.
  • the communication interface 2022 communicates with other electronic devices via a network.
  • the hard disk drive 2024 stores programs and data used by the CPU 2012 in the computer 2000.
  • the DVD-ROM drive 2026 reads a program and / or data from the DVD-ROM 2001 and provides the program and / or data to the hard disk drive 2024 via the RAM 2014.
  • the memory card drive 2028 reads programs and / or data from the memory card 2003, and provides the programs and / or data to the hard disk drive 2024 via the RAM 2014.
  • the memory card drive 2028 may write a program and / or data to the memory card 2003.
  • the ROM 2030 stores therein a boot program executed by the computer 2000 at the time of activation and / or a program depending on the hardware of the computer 2000.
  • the input / output chip 2040 may also connect various input / output units to the input / output controller 2020 via a parallel port, serial port, keyboard port, mouse port, and the like.
  • the program is provided by a computer-readable medium such as a DVD-ROM 2001 or a memory card 2003.
  • the program is read from a computer-readable medium, installed in the hard disk drive 2024, the RAM 2014, and / or the ROM 2030, which are also examples of the computer-readable medium, and executed by the CPU 2012.
  • The information processing described in these programs is read by the computer 2000 and brings about cooperation between the programs and the various types of hardware resources described above.
  • An apparatus or method may be constituted by realizing operations or processing of information in accordance with the use of the computer 2000.
  • The CPU 2012 may execute a communication program loaded in the RAM 2014 and instruct the communication interface 2022 to perform communication processing based on the processing described in the communication program.
  • Under the control of the CPU 2012, the communication interface 2022 reads transmission data stored in a transmission buffer area provided in a recording medium such as the RAM 2014, the hard disk drive 2024, the DVD-ROM 2001, or the memory card 2003, transmits the read transmission data to the network, or writes reception data received from the network to a reception buffer area provided on the recording medium.
  • The CPU 2012 may read all or a necessary part of a file or database stored in an external recording medium, such as the hard disk drive 2024, the DVD-ROM 2001, or the memory card 2003, into the RAM 2014, perform various types of processing on the data in the RAM 2014, and then write the processed data back to the external recording medium.
  • The CPU 2012 may perform, on the data read from the RAM 2014, various types of processing described in this specification and specified by the instruction sequences of the programs, including various types of operations, information processing, condition determination, conditional branching, unconditional branching, and information search/replacement, and may write the results back to the RAM 2014. The CPU 2012 may also search for information in files, databases, and the like in the recording medium. For example, when a plurality of entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 2012 may search the plurality of entries for an entry whose attribute value of the first attribute matches a specified condition, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition (a minimal sketch of this kind of lookup appears after the reference signs list below).
  • the above-described program or software module may be stored on a computer-readable medium on or near the computer 2000.
  • A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet may be used as the computer-readable medium, thereby providing the program to the computer 2000 via the network.
  • Reference signs: 10 control system; 10 optical sensor; 20 two-dimensional image sensor; 60 light receiving system; 90 floodlight system; 100 optical element; 101 first prism; 102 second prism; 110 drive device; 120 lens; 130 light receiving unit; 140 control unit; 150 measurement unit; 170 lens; 180 light emitting unit; 200 prism system; 910 optical sensor; 930 light receiving unit; 960 light receiving system; 970 lens; 980 light emitting unit; 990 floodlight system; 1001 graph; 1002 graph; 1110 optical sensor; 1120 measuring device; 1130 light receiving unit; 1142 switching unit; 1160 light receiving system; 1200 intensity distribution; 1201 line; 1202 line; 1400 flat plate; 1410 optical sensor; 1460 light receiving system; 1500 flat mirror; 1510 optical sensor; 1560 light receiving system; 1600 acousto-optic deflector; 1601 first acousto-optic deflector; 1602 second acousto-optic deflector; 1610 optical sensor; 1660 light receiving system; 1700 optical device; 1710 optical sensor; 1760 light receiving system; 1790 floodlight system; 1810 optical sensor; 1860 light receiving system; 2000 computer; 2001 DVD-ROM; 2003 memory card; 2010 host controller
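The scan-until-detected behaviour noted earlier (the control unit changes the position of the light receiving unit 930 until light from the object is detected, then stops) can be pictured as a simple control loop. The following Python sketch is purely illustrative and is not taken from the disclosure: the move/detect interfaces and the calibration that maps the stopping position to a surface height are hypothetical placeholders.

    # Minimal sketch (not from the disclosure) of the scan-until-detected loop:
    # the position of the light receiving unit is stepped until light from the
    # object is detected, and the stopping position is mapped to a height value
    # through a hypothetical calibration function.
    from typing import Callable, Optional

    def measure_point(move_to: Callable[[float], None],
                      light_detected: Callable[[], bool],
                      position_to_height: Callable[[float], float],
                      start: float, stop: float, step: float) -> Optional[float]:
        position = start
        while position <= stop:
            move_to(position)                    # change the position of the light receiving unit
            if light_detected():                 # stop changing the position once light is detected
                return position_to_height(position)
            position += step
        return None                              # no light detected in the scanned range

    # Example with stand-in hardware stubs: light is "detected" near 2.4 mm.
    if __name__ == "__main__":
        detected_at = 2.4
        current = {"pos": 0.0}
        height = measure_point(
            move_to=lambda p: current.update(pos=p),
            light_detected=lambda: abs(current["pos"] - detected_at) < 0.05,
            position_to_height=lambda p: 10.0 - p,   # hypothetical linear calibration
            start=0.0, stop=5.0, step=0.1)
        print(height)

Repeating such a measurement per point while scanning along the object would yield a surface profile; in the actual embodiments the quantity recorded at detection may be any optical-path state (for example a prism orientation or the light receiving unit position), not necessarily a linear stage coordinate.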
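The information retrieval described for the CPU 2012 (searching stored entries for one whose first attribute satisfies a condition and reading the associated second attribute) corresponds to a plain filtered lookup. The sketch below is an assumption-laden illustration; the Entry record and the predicate are invented for the example and are not part of the disclosure.

    # Illustrative lookup (names invented for this sketch): among entries that
    # associate a first attribute with a second attribute, return the second
    # attribute of the first entry whose first attribute satisfies the condition.
    from dataclasses import dataclass
    from typing import Callable, Iterable, Optional

    @dataclass
    class Entry:
        first_attribute: float
        second_attribute: str

    def lookup_second_attribute(entries: Iterable[Entry],
                                condition: Callable[[float], bool]) -> Optional[str]:
        for entry in entries:
            if condition(entry.first_attribute):
                return entry.second_attribute
        return None

    if __name__ == "__main__":
        entries = [Entry(0.8, "out of range"), Entry(1.3, "electrode"), Entry(2.1, "bump")]
        print(lookup_second_attribute(entries, lambda v: v >= 1.0))  # -> electrode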

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
PCT/JP2017/046484 2016-12-27 2017-12-25 測定装置、光学式センサ、測定方法及びプログラム WO2018123992A1 (ja)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016254494A JP6890797B2 (ja) 2016-12-27 2016-12-27 測定装置、光学式センサ、測定方法及びプログラム
JP2016-254494 2016-12-27

Publications (1)

Publication Number Publication Date
WO2018123992A1 true WO2018123992A1 (ja) 2018-07-05

Family

ID=62710416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/046484 WO2018123992A1 (ja) 2016-12-27 2017-12-25 測定装置、光学式センサ、測定方法及びプログラム

Country Status (2)

Country Link
JP (1) JP6890797B2
WO (1) WO2018123992A1

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7380865B2 (ja) * 2020-05-19 2023-11-15 日本電信電話株式会社 角度計測装置
EP4517255B1 (de) * 2023-08-30 2025-06-04 Sick Ag Aufnahme von 3d-bilddaten mit einem lichtschnittverfahren

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0452619A (ja) * 1990-06-20 1992-02-20 Fujitsu Ltd 光学走査方法およびその装置
US5032023A (en) * 1990-07-02 1991-07-16 General Electric Company Optical fiber based sensor for a variable depth range camera
JPH0566116A (ja) * 1991-09-09 1993-03-19 Sumitomo Electric Ind Ltd 三次元位置測定装置
JPH0571930A (ja) * 1991-09-10 1993-03-23 Amada Co Ltd 溶接線検出装置及び溶接線検出方法
JP2015111148A (ja) * 2009-09-02 2015-06-18 ケーエルエー−テンカー・コーポレーションKla−Tencor Corporation スピンウェーハ検査システム

Also Published As

Publication number Publication date
JP6890797B2 (ja) 2021-06-18
JP2018105796A (ja) 2018-07-05

Similar Documents

Publication Publication Date Title
JP5740321B2 (ja) 距離計測装置、距離計測方法及び制御プログラム
JP6359466B2 (ja) サブ解像度での光学的検出
JP5618898B2 (ja) 変位検出装置
WO2018123992A1 (ja) 測定装置、光学式センサ、測定方法及びプログラム
JP6288280B2 (ja) 表面形状測定装置
CN113884180A (zh) 衍射光波导的测试系统、方法及装置
JP6331985B2 (ja) 形状測定装置及び方法
US8908084B2 (en) Electronic device and method for focusing and measuring points of objects
CN114838916A (zh) 衍射光波导的测试系统、方法及装置
EP2873946A1 (en) Measuring apparatus, and method of manufacturing article
US11913776B2 (en) Interference in-sensitive Littrow system for optical device structure measurement
CN114088349B (zh) 合色棱镜的测试方法、装置及系统
US10900987B2 (en) Robust particle velocity measurement
TWI843228B (zh) 測量系統和方法
US11223816B2 (en) Multi-image projector and electronic device having multi-image projector
US20210173080A1 (en) Optical position-measurement device
KR20220137370A (ko) 복수의 홀로그램 소자를 이용한 비전 검사 장치 및 방법
JP2016133334A (ja) レーザレンジファインダ、3次元スキャナおよびレーザ光偏向装置
US12203870B2 (en) Measurement system and measurement method
US20240142339A1 (en) Methods of geometry parameters measurement for optical gratings
JP7246543B1 (ja) 通信装置、プログラム、及び通信方法
US20080316878A1 (en) Method for measuring thickness and measuring device using the same
KR102103919B1 (ko) 다중 이미지 프로젝터 및 다중 이미지 프로젝터를 구비한 전자 기기
EP3637044A1 (en) Multi-image projector and electronic device having multi-image projector
JP4320101B2 (ja) 光遮断検出装置および情報表示システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17889374

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17889374

Country of ref document: EP

Kind code of ref document: A1