WO2022048311A1 - Ranging device, ranging method, camera, and electronic device - Google Patents

Ranging device, ranging method, camera, and electronic device

Info

Publication number
WO2022048311A1
WO2022048311A1 PCT/CN2021/105726 CN2021105726W
Authority
WO
WIPO (PCT)
Prior art keywords
light
distance
wavelength
measured object
determined
Prior art date
Application number
PCT/CN2021/105726
Other languages
English (en)
French (fr)
Inventor
邵明天
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Priority to EP21863379.0A priority Critical patent/EP4194897A4/en
Publication of WO2022048311A1 publication Critical patent/WO2022048311A1/zh
Priority to US18/090,062 priority patent/US20230194667A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4811Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4816Constructional features, e.g. arrangements of optical elements of receivers alone
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/4865Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491Details of non-pulse systems
    • G01S7/4912Receivers
    • G01S7/4915Time delay measurement, e.g. operational details for pixel components; Phase measurement
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/02Diffusing elements; Afocal elements
    • G02B5/0273Diffusing elements; Afocal elements characterized by the use
    • G02B5/0278Diffusing elements; Afocal elements characterized by the use used in transmission
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B2210/00Aspects not specifically covered by any group under G01B, e.g. of wheel alignment, caliper-like sensors
    • G01B2210/50Using chromatic effects to achieve wavelength-dependent depth resolution

Definitions

  • the present application relates to the field of camera technology, and in particular, to a ranging device, a ranging method, a camera, and electronic equipment.
  • An electronic device with a shooting function usually uses a ranging module in the electronic device to detect the distance between the lens and the object to be photographed, so as to perform automatic focusing based on the distance.
  • However, the distance detected by the ranging module in the related art deviates significantly from the actual distance, resulting in poor focusing by the electronic device and unclear captured images.
  • embodiments of the present application provide a ranging apparatus, a ranging method, a camera, and an electronic device.
  • An embodiment of the present application provides a ranging device, including:
  • a light source configured to emit detection light in a set wavelength band
  • a beam splitter configured to transmit the probe light and output the transmitted light corresponding to the probe light; the probe light is incident on the first surface of the beam splitter;
  • a lens group including at least one dispersive lens configured to disperse the transmitted light from the beam splitter to the lens group to focus light of different wavelengths to different locations;
  • a first light limiter having a light-passing area that allows the first reflected light to pass through; the first reflected light is the light generated when the second reflected light, transmitted from the lens group to the second surface of the beam splitter, is reflected at the second surface; the second reflected light is the light reflected from the surface of the measured object by the light focused on that surface; the second surface is disposed opposite to the first surface;
  • a spectral sensor configured to output first information when receiving the first reflected light; the first information at least represents the light intensity corresponding to the wavelength of the first reflected light, and is configured to determine the distance between the light source and the measured object.
  • the embodiment of the present application also provides a ranging method based on any of the above ranging devices, including:
  • determining, based on the first information output by the spectral sensor, at least one first wavelength of the light focused on the surface of the measured object, and determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibration distance and the determined at least one first wavelength; wherein
  • the light intensity corresponding to the first wavelength is the largest.
  • the embodiment of the present application also provides a camera, including:
  • a lens group, a focus motor, and any one of the above distance measuring devices; wherein,
  • the focus motor is configured to drive the lens group to move to a corresponding focus position based on a focus distance; the focus distance is determined based on the positional relationship between the lens group and the light source in the ranging device and on a first distance; the first distance represents the distance between the light source in the distance measuring device and the measured object.
  • the embodiment of the present application also provides a camera, including:
  • a processor; a memory configured to store a computer program executable on the processor; and any of the above distance measuring devices;
  • the processor is configured to implement any of the above ranging methods when running the computer program.
  • the embodiment of the present application also provides an electronic device, including:
  • a processor; a memory configured to store a computer program executable on the processor; and any of the foregoing cameras;
  • the processor is configured to implement any one of the above ranging methods when running the computer program.
  • Embodiments of the present application further provide a storage medium, on which a computer program is stored, and when the computer program is executed by a processor, any one of the foregoing ranging methods is implemented.
  • FIG. 1 is a schematic structural diagram of a distance measuring device according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a spectral curve provided in an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of a distance measuring device according to another embodiment of the present application.
  • FIG. 4 is a schematic flowchart of implementing a ranging method based on any of the above ranging devices provided by an embodiment of the present application;
  • FIG. 5 is a schematic structural diagram of a hardware composition of a camera according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a hardware composition structure of a camera according to another embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of a hardware composition of an electronic device according to an embodiment of the present application.
  • The related art provides an automatic focusing method in which a laser emission device of a ranging module in an electronic device emits a first infrared laser; the first infrared laser is reflected when it reaches the surface of the object to be photographed, generating a second infrared laser.
  • The laser receiving device of the ranging module receives the second infrared laser, and a time difference is calculated from the emission time of the first infrared laser and the receiving time of the second infrared laser.
  • The distance between the camera of the electronic device and the object to be photographed is then calculated from the propagation speed of the infrared laser in the propagation medium and the calculated time difference, and the focus motor in the electronic device drives the lens or lens group of the camera to the corresponding focus position based on the calculated distance, completing autofocus.
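The time-of-flight calculation described in the related-art method above can be sketched as follows (a minimal illustration with hypothetical timing values; the measured distance is half the round-trip path):

```python
# Time-of-flight ranging sketch: distance = propagation speed * time difference / 2.
C_AIR = 299_792_458.0  # approximate propagation speed of infrared light in air, m/s

def tof_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Return the one-way distance from the emission and receiving timestamps."""
    time_diff = t_receive_s - t_emit_s  # round-trip travel time
    return C_AIR * time_diff / 2.0

# A 2 ns round trip corresponds to roughly 0.3 m between camera and object.
print(tof_distance(0.0, 2e-9))
```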
  • However, when the actual distance between the camera of the electronic device and the object to be shot is relatively small, the error between the distance measured by the ranging module and the corresponding actual distance is large, resulting in an unclear image when the electronic device performs autofocus based on the measured distance.
  • an embodiment of the present application provides a ranging device, including:
  • a light source configured to emit detection light in a set wavelength band
  • a beam splitter configured to transmit the probe light, and output the transmitted light corresponding to the probe light; the probe light is incident on the first surface of the beam splitter;
  • a lens group including at least one dispersive lens configured to disperse the transmitted light from the beam splitter to the lens group to focus light of different wavelengths to different locations;
  • a first light limiter having a light-passing area that allows the first reflected light to pass through; the first reflected light is the light generated when the second reflected light, transmitted from the lens group to the second surface of the beam splitter, is reflected at the second surface; the second reflected light is the light reflected from the surface of the measured object by the light focused on that surface; the second surface is disposed opposite to the first surface;
  • a spectral sensor configured to output first information when receiving the first reflected light; the first information at least represents the light intensity corresponding to the wavelength of the first reflected light, and is configured to determine the distance between the light source and the measured object.
  • the light-passing area of the first light limiter is a through hole; the spectral sensor is a lattice spectral sensor.
  • the distance measuring device further comprises: a first lens and a second light limiter disposed between the light source and the beam splitter;
  • the first lens is configured to convert the first detection light incident to the first lens into a corresponding second detection light; the second detection light represents a parallel beam corresponding to the first detection light;
  • the second light limiter has a slit allowing the second probe light to pass therethrough; the second probe light passing through the slit of the second light limiter is configured to be incident on the beam splitter.
  • the first lens includes at least one collimating mirror and at least one cylindrical mirror arranged oppositely; wherein,
  • a collimating mirror configured to convert the first probe light into a parallel light beam
  • the cylindrical mirror is configured to condense the parallel light beams into the corresponding second detection light.
  • the light-passing area of the first light limiter is a slit; the spectral sensor is an area array spectral sensor; the first information also represents the focus position, on the surface of the measured object, of the second reflected light corresponding to the first reflected light.
  • the lens group includes at least two dispersive lenses; the lens group is further configured to adjust the dispersion range; wherein,
  • the dispersion range represents the distance range between the focus position of the light of different wavelengths in the corresponding transmitted light and the lens group when the transmitted light is transmitted to the lens group for dispersion.
  • the transmittance of the first surface of the beam splitter is greater than the reflectance
  • the first surface of the beam splitter is coated with at least one layer of anti-reflection coating
  • the second surface of the beam splitter is coated with at least one layer of reflection-enhancing coating
  • the first information includes a spectral curve
  • the spectral curve represents the corresponding relationship between the wavelength of the first reflected light and the light intensity.
  • the embodiment of the present application also provides a ranging method using the above-mentioned ranging device, including:
  • determining, based on the first information output by the spectral sensor, at least one first wavelength of the light focused on the surface of the measured object, and determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibration distance and the determined at least one first wavelength; wherein
  • the light intensity corresponding to the first wavelength is the largest.
  • In some embodiments, a first wavelength of the light focused on the surface of the object to be measured is determined; determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibration distance and the determined at least one first wavelength includes:
  • determining a first calibration distance corresponding to the first wavelength; the first calibration distance represents the distance between the light source and the measured object.
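The wavelength-to-calibration-distance lookup can be illustrated with a small table (the wavelengths and distances below are hypothetical; in practice the table would come from calibrating the dispersive lens group):

```python
# Hypothetical calibration table: peak wavelength (nm) -> calibration distance (mm).
CALIBRATION = [(760.0, 5.0), (900.0, 7.5), (1100.0, 11.0), (1500.0, 20.0)]

def first_calibration_distance(first_wavelength_nm: float) -> float:
    """Interpolate the calibration distance for the determined first wavelength."""
    for (w0, d0), (w1, d1) in zip(CALIBRATION, CALIBRATION[1:]):
        if w0 <= first_wavelength_nm <= w1:
            t = (first_wavelength_nm - w0) / (w1 - w0)
            return d0 + t * (d1 - d0)  # linear interpolation between table rows
    raise ValueError("wavelength outside the calibrated band")

print(first_calibration_distance(900.0))  # exact table entry
```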
  • When the spectral sensor is an area array spectral sensor, at least two first wavelengths of the light focused on the surface of the object to be measured are determined; determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibration distance and the determined at least one first wavelength includes:
  • determining the distance between the light source and the measured object based on the distribution characteristics corresponding to the at least two first wavelengths.
  • the ranging method further includes:
  • Clustering at least two first wavelengths to obtain a clustering result
  • Distribution characteristics corresponding to at least two first wavelengths are determined based on the clustering result.
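The clustering step is not tied to a particular algorithm in the text; a minimal one-dimensional sketch using a gap threshold (an assumed parameter) might look like:

```python
def cluster_wavelengths(wavelengths_nm, gap_nm=20.0):
    """Group sorted wavelengths; a jump larger than gap_nm starts a new cluster."""
    ws = sorted(wavelengths_nm)
    clusters = [[ws[0]]]
    for w in ws[1:]:
        if w - clusters[-1][-1] <= gap_nm:
            clusters[-1].append(w)  # close to the previous wavelength: same cluster
        else:
            clusters.append([w])    # large gap: start a new cluster
    return clusters

# Two clusters suggest two surface regions focused at distinct depths.
print(cluster_wavelengths([805.0, 810.0, 812.0, 905.0, 910.0]))
```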
  • the embodiment of the present application also provides a camera, including:
  • a lens group, a focus motor, and any one of the above distance measuring devices; wherein,
  • the focus motor is configured to drive the lens group to move to a corresponding focus position based on a focus distance; the focus distance is determined based on the positional relationship between the lens group and the light source in the ranging device and on a first distance; the first distance represents the distance between the light source in the distance measuring device and the measured object.
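As a rough sketch of how the focus distance might be derived: the offset geometry below is an assumption for illustration only; the text states only that the focus distance is determined from the positional relationship and the first distance.

```python
def focus_distance_mm(first_distance_mm: float, axial_offset_mm: float) -> float:
    """first_distance_mm: distance from the ranging device's light source to the
    measured object; axial_offset_mm: assumed signed offset between the light
    source and the camera lens group along the optical axis (hypothetical).
    The focus distance is the first distance corrected by that offset."""
    return first_distance_mm - axial_offset_mm

print(focus_distance_mm(120.0, 4.0))
```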
  • the embodiment of the present application also provides a camera, including:
  • a processor; a memory configured to store a computer program executable on the processor; and the above distance measuring apparatus;
  • the processor is configured to execute the steps of the above-mentioned ranging method when running the computer program.
  • the embodiment of the present application also provides an electronic device, including:
  • a processor; a memory configured to store a computer program executable on the processor; and the aforementioned camera;
  • the processor is configured to execute the steps of the above-mentioned ranging method when running the computer program.
  • FIG. 1 shows a schematic structural diagram of a ranging apparatus provided by an embodiment of the present application.
  • The distance measuring device 1 includes: a light source 11, a beam splitter 12, a lens group 13 including at least one dispersive lens, a first light limiter 14, and a spectral sensor 15; wherein,
  • the light source 11 is configured to emit detection light in a set wavelength band
  • the beam splitter 12 is configured to transmit the probe light and output the transmitted light corresponding to the probe light; the probe light is incident on the first surface of the beam splitter;
  • a lens group 13 including at least one dispersive lens, configured to disperse the transmitted light transmitted from the beam splitter 12 to the lens group 13, so as to focus light of different wavelengths to different positions;
  • the first light limiter 14 has a light-passing area that allows the first reflected light to pass through; the first reflected light is the light generated when the second reflected light, transmitted from the lens group 13 to the second surface of the beam splitter 12, is reflected at the second surface; the second reflected light is the light reflected from the surface of the measured object by the light focused on that surface; the second surface is disposed opposite to the first surface;
  • the spectral sensor 15 is configured to output first information when receiving the first reflected light; the first information at least represents the light intensity corresponding to the wavelength of the first reflected light, and is configured to determine the distance between the light source and the measured object.
  • the beam splitter 12 has a first surface and a second surface disposed opposite to each other.
  • the first light limiter 14 is disposed between the beam splitter 12 and the spectral sensor 15 , and the first light limiter 14 is close to the second surface of the beam splitter 12 and away from the first surface of the beam splitter 12 .
  • the working principle of the distance measuring device 1 is described below with reference to FIG. 1 :
  • the light source 11 emits probe light in a set wavelength band.
  • the light source 11 is a point light source, which can emit near-infrared light in a set wavelength band.
  • the probe light has a wide wavelength range.
  • the set wavelength band may be 760 nanometers (nm) to 1500 nanometers.
  • When the probe light emitted by the light source 11 propagates to the first surface of the beam splitter 12, part of the probe light is reflected at the first surface of the beam splitter 12, and the other part passes through the first and second surfaces of the beam splitter 12 and exits from the second surface of the beam splitter 12.
  • the first surface and the second surface of the beam splitter 12 are disposed opposite to each other.
  • the beam splitter 12 does not change the propagation direction of the probe light, therefore, the propagation direction of the transmitted light transmitted from the second surface of the beam splitter 12 is the same as the propagation direction of the corresponding probe light.
  • When the transmitted light passes through the lens group 13 including at least one dispersive lens, dispersion occurs, so that the transmitted light (which is polychromatic) is decomposed into monochromatic light, and the monochromatic light of different wavelengths included in the transmitted light is focused to different positions along the axial direction; for example, e, f and g in Fig. 1 represent focus positions, and g represents the focus position at the surface of the measured object.
  • the lens group 13 does not change the propagation direction of the transmitted light transmitted from the beam splitter 12 to the lens group 13 .
  • the second reflected light is a light beam.
  • the second reflected light returns to the lens group 13 along the emission optical path of the corresponding transmitted light, and is transmitted to the second surface of the beam splitter 12 through the lens group 13 .
  • That is, the second reflected light travels back along the emission light path of the transmitted light and eventually returns to the second surface of the beam splitter 12.
  • The transmitted light transmitted from the area corresponding to ab of the beam splitter 12 is incident on the area corresponding to cd of the lens group 13; the transmitted light is dispersed when passing through the lens group 13, light of different wavelengths in the transmitted light is focused at different positions, and the light focused on the surface of the measured object 2 is reflected at the focus position g on the surface of the measured object to generate the corresponding second reflected light.
  • The second reflected light takes the form of an inverted cone defined by the three points g, c and d. Based on the principle of reversibility of the optical path, the second reflected light returns through the area corresponding to cd of the lens group 13 to the area corresponding to ab on the second surface of the beam splitter 12.
  • the dispersion range represents the distance range between the focus position of light of different wavelengths in the corresponding transmitted light and the optical center o of the lens group 13 when the transmitted light is transmitted to the lens group 13 for dispersion.
  • the dispersion range is determined by the minimum and maximum distances between the focus position and the optical center o of the lens group 13 .
  • the spectral sensor 15 outputs first information when detecting the first reflected light passing through the light-transmitting region of the first light limiter 14, and the first information at least represents the light intensity corresponding to the wavelength of the first reflected light.
  • The triangle formed by the three points a, b and h is an isosceles triangle, where h represents the center of the light-passing area of the first light limiter 14 and ab corresponds to the maximum diameter of the light spot projected by the probe light onto the surface of the beam splitter 12; this arrangement ensures that the first reflected light, generated by the reflection of the second reflected light at the second surface of the beam splitter 12, can pass through the first light limiter 14 and be received by the spectral sensor 15.
  • the beam splitter 12 is configured to transmit the probe light incident from the first surface of the beam splitter 12 and to reflect the second reflected light transmitted from the lens group 13 to the second surface of the beam splitter 12 .
  • The transmittance of the first surface of the beam splitter 12 is greater than its reflectance; for example, the transmittance of the first surface of the beam splitter is greater than 50%. Therefore, for any beam of probe light emitted by the light source 11 that propagates to the first surface of the beam splitter, the light transmitted through the second surface of the beam splitter 12 exceeds the light reflected at the first surface of the beam splitter 12, which reduces the energy lost during propagation of the probe light.
  • The transmittance of the second surface of the beam splitter 12 is less than its reflectance; for example, the reflectance of the second surface of the beam splitter is greater than 50%. This ensures that the second reflected light is reflected at the second surface of the beam splitter 12 to generate the corresponding first reflected light, so that the spectral sensor 15 can detect the corresponding first reflected light.
  • The first surface of the beam splitter 12 is coated with at least one layer of anti-reflection coating, and the second surface of the beam splitter 12 is coated with at least one layer of reflection-enhancing coating.
  • The anti-reflection coating is configured to increase the amount of light transmitted through the first surface of the beam splitter 12, while the reflection-enhancing coating is configured to reduce the amount of light transmitted through the second surface of the beam splitter 12 and increase the reflected light.
  • the size of the light passing area of the first light limiter 14 is adjustable.
  • the first light limiter 14 is configured to limit the intensity of the first reflected light passing through the light passing area.
  • the light passing area of the first light limiter 14 ensures that only the first reflected light corresponding to the light focused on the surface of the measured object 2 can pass through.
  • the spectral sensor 15 can receive light in the visible light band.
  • the spectral sensor 15 may be a charge-coupled device (CCD, Charge-coupled Device) spectral sensor, or a complementary metal-oxide semiconductor (CMOS, Complementary Metal-Oxide-Semiconductor) spectral sensor.
  • the first information may include a spectral curve. As shown in FIG. 2 , the spectral curve represents the corresponding relationship between the wavelength of the first reflected light and the light intensity.
  • After the transmitted light transmitted from the second surface of the beam splitter 12 is dispersed by the lens group 13, the light intensity corresponding to the wavelength of the light focused on the surface of the measured object 2 is the largest. Therefore, the first wavelength of the light focused on the surface of the measured object 2 can be determined based on the correspondence between wavelength and light intensity represented by the first information, and the distance between the light source 11 and the measured object 2 can be determined based on the set relationship between wavelength and calibration distance and the determined first wavelength.
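Determining the first wavelength from the spectral curve amounts to locating the intensity peak; a minimal sketch with synthetic curve data:

```python
def peak_wavelength(spectral_curve):
    """spectral_curve: (wavelength_nm, intensity) pairs from the first information.
    The first wavelength is the one with the largest light intensity."""
    return max(spectral_curve, key=lambda point: point[1])[0]

# Synthetic spectral curve: intensity peaks at 900 nm.
curve = [(800.0, 0.2), (850.0, 0.6), (900.0, 1.0), (950.0, 0.5)]
print(peak_wavelength(curve))
```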
  • the probe light in the set wavelength range emitted by the light source has a wide wavelength range.
  • When the transmitted light transmitted from the beam splitter to the lens group is dispersed, light of different wavelengths in the transmitted light can be focused to multiple different focus positions; moreover, compared with a spectrometer, the spectral sensor is smaller in size, so the ranging device can be applied in short-distance ranging scenarios.
  • Because the receiving surface of the spectral sensor configured to receive the first reflected light is larger than the receiving surface of a spectrometer configured to receive the corresponding reflected light, the spectral sensor receives relatively more first reflected light and the utilization rate of light energy is high. Therefore, even when the distance between the distance measuring device and the measured object is short, the accuracy of the distance measured by the distance measuring device is high.
  • The lens group can be configured to adjust the dispersion range, thereby changing the focus positions corresponding to light of different wavelengths and, in turn, the corresponding focus distances, so that the measured distance retains high accuracy.
  • The light-passing area of the first light limiter 14 is a through hole; in this case, the first light limiter 14 is a diaphragm, and the spectral sensor 15 is a lattice spectral sensor.
  • the diameter of the aperture of the diaphragm is adjustable.
  • When the probe light emitted by the light source 11 is projected onto the beam splitter 12, a corresponding light spot is formed.
  • the light spot is in the area corresponding to ab in FIG. 1 .
  • the transmitted light is projected onto the surface of the lens group 13 to form a light spot; for example, the light spot lies in the region corresponding to cd.
  • after the transmitted light passes through the lens group 13 and undergoes dispersion, the light focused on the surface of the measured object 2 forms a light spot, which corresponds to a point on the surface of the measured object 2.
  • the first information output by the spectral sensor 15 is configured to determine the distance between the focal position where the light spot focused on the surface of the measured object is located and the light source.
  • the distance measuring device 1 further includes: a first lens 16 and a second light limiter 17 disposed between the light source 11 and the beam splitter 12;
  • the first lens 16 is configured to convert the first detection light incident to the first lens 16 into a corresponding second detection light; the second detection light represents a parallel beam corresponding to the first detection light;
  • the second light limiter 17 has a slit allowing the second probe light to pass through; the second probe light passing through the slit of the second light limiter 17 is incident on the first surface of the beam splitter 12.
  • FIG. 3 shows a schematic structural diagram of a ranging apparatus provided by another embodiment of the present application.
  • the ranging device shown in FIG. 3 adds a first lens 16 and a second light limiter 17 arranged between the light source 11 and the beam splitter 12 on the basis of FIG. 1 .
  • the first lens 16 is a linear lens. After the first probe light emitted by the light source 11 passes through the first lens 16, it is converted into the corresponding second probe light; part of the second probe light is incident through the slit of the second light limiter 17 onto the first surface of the beam splitter 12, and part of the second probe light is transmitted from the second surface of the beam splitter 12 to the lens group 13 for dispersion.
  • the slit of the second light limiter 17 is a narrow and elongated slit hole with adjustable width.
  • the second detection light is a parallel light beam corresponding to the first detection light.
  • the distance between two adjacent light rays in the first probe light becomes farther as the propagation distance increases, and the light rays in the second probe light are parallel to each other.
  • the second detection light passing through the second light limiter 17 in FIG. 3 corresponds to a linear light spot.
  • the light focused on the surface of the measured object 2 in FIG. 3 also corresponds to a linear light spot.
  • the second reflected light passing through the first light limiter 14 also corresponds to a linear light spot.
  • the spectral sensor 15 detects a linear light spot.
  • the light focused on the surface of the measured object 2 in FIG. 1 corresponds to a point-shaped light spot.
  • the second reflected light passing through the first light limiter 14 also corresponds to a point-shaped light spot.
  • the spectral sensor 15 detects a point light spot.
  • the position of the linear light spot in FIG. 3 is the detection range of the surface of the measured object 2 .
  • the position of the point-shaped light spot in FIG. 2 is the detection range of the surface of the measured object 2 .
  • the first lens 16 includes at least one collimating mirror 161 and at least one cylindrical mirror 162 disposed oppositely; wherein,
  • a collimating mirror configured to convert the first probe light into a parallel light beam
  • the cylindrical mirror is configured to condense the parallel light beams into the corresponding second detection light.
  • the light-transmitting area of the first light limiter 14 in FIG. 3 is a slit; the spectral sensor 15 is correspondingly an area array spectral sensor; the first information also represents the focal positions, on the surface of the measured object 2, of the second reflected light corresponding to the first reflected light.
  • the spectral sensor 15 may output at least two spectral curves, each of which represents the corresponding relationship between wavelength and light intensity. Each spectral curve corresponds to a focus position, and different spectral curves correspond to different focus positions.
  • any beam of first detection light emitted by the light source 11 is converted into a corresponding second detection light after passing through the first lens 16 .
  • the second probe light is a parallel beam corresponding to the first probe light
  • the second probe light passing through the slit of the second light limiter 17 forms a linear light spot when projected to the beam splitter 12.
  • the portion of the second probe light passing through the slit of the second light limiter 17 that is transmitted from the second surface of the beam splitter 12 is dispersed by the lens group 13.
  • the first reflected light passing through the first light limiter 14 also forms a line-shaped light spot
  • the first information output by the spectral sensor 15 can represent the focus positions, on the surface of the measured object 2, corresponding to the linear light spot, together with the light intensity corresponding to the wavelength of the first reflected light; thus, based on the light intensity corresponding to the wavelength of the first reflected light at each focusing position, and based on the set correspondence between the wavelength and the calibration distance, the distance between the light source 11 in the ranging device and the measured object 2 can be determined.
  • wavelengths with larger errors can be discarded based on the distribution characteristics, represented by the first information, of the wavelengths of the first reflected light corresponding to the respective focusing positions on the surface of the measured object 2. Based on the remaining wavelengths and their corresponding light intensities, and on the set correspondence between the wavelength and the calibration distance, the distances between the different focusing positions on the surface of the measured object 2 and the light source 11 are determined, and the distance between the light source 11 and the measured object 2 is finally determined from the distances corresponding to the different focusing positions. Therefore, when the surface of the measured object 2 is uneven, for example has bumps and depressions, the measurement error of the distance between the light source 11 and the measured object 2 can be reduced and the measurement accuracy improved.
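One possible way to realize this discard-and-average step is sketched below. The median-based outlier test, the 2% tolerance, and the `to_distance` lookup are assumptions for illustration; the embodiment does not fix a specific rule:

```python
import statistics

def object_distance(peak_wavelengths, to_distance, rel_tol=0.02):
    """Estimate the light-source-to-object distance from the peak (first)
    wavelengths measured at the focus positions along the linear spot.

    Wavelengths whose relative deviation from the median exceeds
    `rel_tol` are treated as large-error measurements and discarded;
    the per-position distances of the remaining wavelengths are averaged.
    `to_distance` maps a wavelength to its calibrated distance.
    """
    median_w = statistics.median(peak_wavelengths)
    kept = [w for w in peak_wavelengths
            if abs(w - median_w) <= rel_tol * median_w]
    distances = [to_distance(w) for w in kept]
    return statistics.fmean(distances)
```

With peaks of 548–552 nm at most positions and one spurious 700 nm reading (e.g. from a deep depression), the 700 nm value is dropped before averaging.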
  • the lens group 13 includes at least two dispersive lenses; the lens group 13 is further configured to adjust the dispersion range; wherein,
  • the dispersion range represents the distance range between the focus position of the light of different wavelengths in the corresponding transmitted light and the lens group when the transmitted light is transmitted to the lens group for dispersion.
  • the surface shape and structure of the lens group 13 can be adjusted to adjust the corresponding focusing positions of different wavelengths in the transmitted light, thereby adjusting the optical center of the lens group 13 and the focusing point on the surface of the measured object 2 The distance between them, and then adjust the focusing distance corresponding to the light of different wavelengths.
  • the dispersion range is determined by the nearest focus position and the farthest focus position corresponding to different wavelengths in the corresponding transmitted light.
  • the closest focus position indicates that the focus position is closest to the optical center of the lens group 13
  • the farthest focus position indicates that the focus position is farthest from the optical center of the lens group 13 .
  • the lens group 13 includes at least two dispersive lenses, which corresponds to adjusting the dispersion range by adjusting the structure of the lens group 13 .
  • when the dispersion range of the lens group 13 is shortened, the distance between the focus position on the surface of the measured object 2 and the optical center of the lens group 13 can be reduced; when the dispersion range of the lens group 13 is enlarged, that distance can be increased.
  • the distance measuring device can be arranged in the camera, and the dispersion range of the lens group 13 can be shortened, which is suitable for the macro focusing mode. Enlarging the dispersion range of lens group 13 is suitable for telephoto mode.
  • FIG. 4 shows a schematic flowchart of the implementation of a ranging method based on any of the above ranging devices provided by an embodiment of the present application.
  • the executing subject of the ranging method is a processor, which may be arranged in the ranging device, in the camera, or in an electronic device with a shooting function; the electronic device may be a mobile phone, a microscope, or the like.
  • the ranging method in this embodiment is applicable to the close-range zoom shooting mode, the wide-angle macro shooting mode, and the short-distance focusing mode.
  • the ranging method provided by this embodiment includes:
  • S401 Based on the first information, determine at least one first wavelength of light focused on the surface of the measured object; wherein, at the same focusing position on the measured object surface, the light intensity corresponding to the first wavelength is the largest.
  • the first information at least represents the light intensity corresponding to the wavelength of the first reflected light.
  • the processor determines a first wavelength corresponding to at least one focusing position corresponding to the light on the surface of the measured object based on the corresponding relationship between the wavelength and the light intensity represented by the first information.
  • the light focused on the surface of the object to be measured may correspond to one focusing position, or may correspond to at least two different focusing positions. At the same focus position on the surface of the measured object, the light intensity corresponding to the first wavelength is the largest.
  • S402 Determine the distance between the light source and the measured object based on the set correspondence between the wavelength and the calibration distance and based on the determined at least one first wavelength.
  • the processor stores the set correspondence between the wavelength and the calibration distance.
  • after the processor determines the first wavelength corresponding to each of the at least one focusing position of the light on the surface of the object to be measured, it determines, based on the set correspondence between the wavelength and the calibration distance and on the determined first wavelengths, the first calibration distance corresponding to each first wavelength in the at least one first wavelength, and then determines the distance between the light source 11 in the ranging device and the measured object 2 based on those first calibration distances.
  • the processor can, based on an auxiliary ranging device, measure the wavelength of the light focused on the surface of the same measured object and the corresponding calibration distance in different environments, and thereby establish the set correspondence between the wavelength and the calibration distance.
  • measuring the wavelength of the light focused on the surface of the object to be measured and the corresponding calibration distance multiple times in different environments, and determining the final wavelength and corresponding calibration distance from these measurements, can reduce the error of a calibration distance measured in a single environment.
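A minimal sketch of establishing the correspondence from such repeated measurements; the data format and the simple per-wavelength averaging are assumptions, not prescribed by the embodiment:

```python
from collections import defaultdict
from statistics import fmean

def build_calibration(measurements):
    """Build the wavelength -> calibration-distance correspondence from
    repeated measurements taken in different environments.

    `measurements` is a list of (wavelength, distance) pairs gathered with
    an auxiliary ranging device under different color temperatures and
    brightness levels; averaging the distances recorded for the same
    wavelength reduces the error of any single-environment measurement.
    """
    by_wavelength = defaultdict(list)
    for wavelength, distance in measurements:
        by_wavelength[wavelength].append(distance)
    return {w: fmean(ds) for w, ds in sorted(by_wavelength.items())}
```

Two readings of 1.49 mm and 1.51 mm at 550 nm, taken under different color temperatures, would average to a 1.50 mm calibration entry.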
  • the different environments may include at least one of the following: environments with different color temperatures; environments with different brightness.
  • when the processor is installed in an electronic device with a photographing function, after the electronic device leaves the factory it can use the wavelength of the light focused on the surface of the object to be measured and the corresponding distance measured by a first focusing method supported by the electronic device to correct the set correspondence between the wavelength and the calibration distance, so as to ensure the accuracy of the calibration distance corresponding to the first wavelength determined based on the first information output by the ranging device and the set correspondence between the wavelength and the calibration distance.
  • the first focusing method includes at least one of the following:
  • Phase focus (PDAF, Phase Detection Auto Focus);
  • Contrast Focus (CDAF, Contrast Detection Auto Focus).
  • since, at each focusing position, the light intensity corresponding to the first wavelength is the largest, the processor can determine, based on the first information, the first wavelength corresponding to each of the at least one focusing position of the light on the surface of the object to be measured.
  • based on the set correspondence between the wavelength and the calibration distance, and based on the determined at least one first wavelength, the distance between the light source in the ranging device and the measured object can be accurately measured; based on the positional relationship between the lens group of the camera in the electronic device and the light source in the ranging device, the distance between the lens group and the measured object is then obtained, and from that distance the focusing distance of the electronic device can be accurately determined, which improves the clarity of images captured by autofocus based on the focusing distance.
  • the processor determines a first wavelength of light focused on the surface of the object to be measured based on the first information.
  • the distance between the light source and the measured object is determined based on the set correspondence between the wavelength and the calibration distance and based on the determined at least one first wavelength, including:
  • a first calibration distance corresponding to the first wavelength is determined; the first calibration distance represents the distance between the light source and the measured object.
  • the dot-matrix spectral sensor is suitable for the scene of point detection, thereby detecting the second reflected light corresponding to a focus position on the surface of the measured object.
  • the focus position corresponds to the position where a point-shaped light spot is formed by the light focused on the surface of the object to be measured.
  • the transmitted light from the second surface of the beam splitter 12 is dispersed through the lens group 13
  • the light focused on the surface of the measured object 2 in the transmitted light corresponds to a point-shaped light spot
  • the first reflected light received by the second surface of the beam splitter 12 is the reflected light of the same focal position of the surface of the measured object 2.
  • the first reflected light received by the dot-matrix spectral sensor 15, which is incident on the dot-matrix spectral sensor 15 through the through hole of the first light limiter 14, also forms a point-shaped light spot, and the first information output by the dot-matrix spectral sensor 15 represents the wavelength and light intensity of the second reflected light corresponding to the same focus position on the surface of the measured object 2.
  • having determined, based on the first information, a first wavelength with the maximum light intensity, the processor determines the first calibration distance corresponding to that first wavelength based on the set correspondence between the wavelength and the calibration distance; this first calibration distance is the distance between the light source 11 and the measured object 2.
  • the processor can determine the distance between a focus position on the surface of the measured object 2 and the light source 11 , and the distance determined by the processor is more accurate when the surface of the measured object 2 is relatively flat.
  • when the spectral sensor is an area array spectral sensor, at least two first wavelengths of light focused on the surface of the object to be measured are determined; determining the distance between the light source and the measured object based on the set correspondence between the wavelength and the calibration distance and based on the determined first wavelengths then includes:
  • determining the distance between the light source and the measured object based on the distribution characteristics of the at least two first wavelengths and the set correspondence between the wavelength and the calibration distance.
  • the light passing area of the first light limiter 14 is a slit
  • the spectral sensor 15 is an area array spectral sensor.
  • the area array spectral sensor is suitable for the scene of linear detection, thereby detecting the second reflected light corresponding to different focal positions of the surface of the measured object.
  • the first information output by the area array spectral sensor 15 represents the wavelength and light intensity of the second reflected light corresponding to each focus position.
  • the processor determines, based on the first information, the first wavelength corresponding to each of the at least two focal positions on the surface of the measured object, and determines the distribution characteristics of the at least two first wavelengths based on those first wavelengths.
  • the distribution feature represents the distribution of the first wavelengths corresponding to different focus positions on the surface of the measured object.
  • at least two first wavelengths may be clustered to obtain a clustering result, and distribution characteristics corresponding to the at least two first wavelengths may be determined based on the clustering result.
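A simple gap-based 1-D clustering could produce such a clustering result; the 5 nm gap threshold and the choice of the largest cluster as the valid wavelengths are illustrative assumptions:

```python
def cluster_wavelengths(wavelengths, gap=5.0):
    """Group first wavelengths into clusters: a new cluster starts whenever
    the gap to the previous wavelength (in sorted order) exceeds `gap`
    nanometres. The clusters describe the distribution of the first
    wavelengths across the focus positions.
    """
    ordered = sorted(wavelengths)
    clusters = [[ordered[0]]]
    for w in ordered[1:]:
        if w - clusters[-1][-1] > gap:
            clusters.append([w])
        else:
            clusters[-1].append(w)
    return clusters

def valid_first_wavelengths(wavelengths, gap=5.0):
    """Take the largest cluster as the valid first wavelengths."""
    return max(cluster_wavelengths(wavelengths, gap), key=len)
```

A set of peaks {548, 550, 551, 700, 702} nm would split into two clusters, and the three-member cluster around 550 nm would be kept.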
  • the processor determines the distance between the light source 11 and the measured object 2 based on the distribution characteristics of the at least two first wavelengths and the set correspondence between the wavelengths and the calibration distance.
  • the processor may determine at least two valid first wavelengths based on the distribution characteristics of the at least two first wavelengths, then determine, based on the set correspondence between the wavelength and the calibration distance, the first calibration distance corresponding to each valid first wavelength, so that the distance between the light source 11 and the measured object 2 is calculated based on the determined at least two first calibration distances.
  • the processor may calculate the corresponding mean value, variance value, etc. based on the determined at least two first calibration distances, so as to obtain the distance between the light source 11 and the measured object 2 .
  • the effective first wavelength represents the first wavelength within a set tolerance range.
  • the effective first wavelength can also represent the first wavelength whose wavelength difference is within a set range.
  • the processor may also determine the first calibration distance corresponding to each first wavelength in the at least one first wavelength based on the set correspondence between the wavelength and the calibration distance; determine at least two effective first wavelengths based on the distribution characteristics of the at least two first wavelengths; and then, based on the at least two effective first wavelengths, select from the first calibration distances the effective first calibration distance corresponding to each effective first wavelength, so that the distance between the light source 11 and the measured object 2 is calculated based on the determined at least two effective first calibration distances.
  • the processor may calculate the corresponding mean value, variance value, and the like based on the determined at least two effective first calibration distances, so as to obtain the distance between the light source 11 and the measured object 2.
  • the effective first calibration distance represents the first calibration distance within the set allowable error range.
  • the effective first calibration distance may also represent the first calibration distance where the distance difference is within a set range.
  • the method for determining the at least two effective first wavelengths includes one of the following:
  • when a set wavelength interval contains the first wavelengths corresponding to a set number of focus positions, those first wavelengths are determined as valid first wavelengths.
  • here the set number is determined based on the total number of focus positions; for example, the set number may be 80% of the total number of focus positions, taken as an integer.
  • the processor may discard the first wavelengths corresponding to the focus positions whose first wavelengths fall outside the set wavelength interval.
  • alternatively, when a set wavelength interval contains a set number of first wavelengths, those first wavelengths are determined as valid first wavelengths.
  • here the set number is determined based on the total number of first wavelengths; for example, the set number may be 80% of the total number of first wavelengths, taken as an integer.
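One way to pick such an interval is to find the narrowest window that covers the set number of first wavelengths; the 80% fraction follows the example above, while the sliding-window strategy itself is an assumption for illustration:

```python
import math

def interval_valid_wavelengths(wavelengths, fraction=0.8):
    """Return the first wavelengths inside the narrowest wavelength
    interval containing at least the set number of them.

    The set number is `fraction` of the total count, taken as an integer
    (e.g. 80% of the total number of first wavelengths).
    """
    ordered = sorted(wavelengths)
    k = max(1, math.floor(fraction * len(ordered)))  # the "set number"
    # Slide a window of k wavelengths and keep the one with minimal span.
    best = min(range(len(ordered) - k + 1),
               key=lambda i: ordered[i + k - 1] - ordered[i])
    return ordered[best:best + k]
```

For five peaks {548, 550, 551, 552, 700} nm, the set number is 4 and the 548–552 nm window is chosen, discarding the 700 nm outlier.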
  • the set wavelength interval may be set based on the set wavelength band, or may be determined based on at least two first wavelengths.
  • based on the first information output by the area array spectral sensor and the set correspondence between the wavelength and the calibration distance, the processor can determine the distances between different focus positions on the surface of the measured object 2 and the light source 11, and thus finally determine the distance between the light source 11 in the ranging device and the measured object 2. In this process, first wavelengths with larger errors, or the corresponding distances, can be discarded based on the distribution characteristics of the at least two first wavelengths. Therefore, when the surface of the measured object 2 is uneven, for example has bumps and depressions, the measurement error of the distance between the light source 11 and the measured object 2 can be reduced and the measurement accuracy improved.
  • when the processor configured to execute the above ranging method based on any one of the above ranging devices is installed in the electronic device, after the processor measures the first distance between the light source and the surface of the object to be measured using the above ranging method and determines the first focusing distance based on the first distance and on the positional relationship between the lens group in the camera of the electronic device and the light source in the ranging device, it can calibrate the first focusing distance using the corresponding compensation value to obtain the second focusing distance, and send the second focusing distance to the focus motor in the camera, so that when the focus motor receives the second focusing distance it drives the lens group in the camera to move to the corresponding focus position and complete the autofocus.
  • the corresponding compensation value is determined based on the third focusing distance, measured by the first focusing mode supported by the electronic device, and the fourth focusing distance; the fourth focusing distance is the focusing distance measured with the ranging device under the measurement conditions corresponding to the first focusing mode.
  • the processor measures the distance between the measured object and the lens of the camera by using the first focusing method supported by the electronic device, and determines the corresponding third focusing distance based on the distance.
  • the processor uses the distance measuring device in the camera to measure the distance between the measured object and the lens of the camera, and determines the corresponding fourth focusing distance based on the distance.
  • the measurement conditions represent parameters of the measurement environment, and the parameters include at least the distance between the electronic device and the measured object, and may also include color temperature, brightness, and the like.
  • the processor calibrates the fourth focusing distance based on the third focusing distance, and determines a compensation value corresponding to the ranging device.
  • the corresponding compensation value when the third focusing distance is greater than the fourth focusing distance, the corresponding compensation value is a positive compensation value; when the third focusing distance is smaller than the fourth focusing distance, the corresponding compensation value is a negative compensation value.
  • the absolute value of the compensation value is equal to the absolute value of the difference between the third focusing distance and the fourth focusing distance. When the third focusing distance is equal to the fourth focusing distance, the corresponding compensation value is zero.
  • the processor calibrates the first focusing distance by using the corresponding compensation value to obtain the second focusing distance.
  • when the compensation value is negative, the absolute value of the compensation value is subtracted from the first focusing distance to obtain the second focusing distance.
  • when the compensation value is positive, the absolute value of the compensation value is added to the first focusing distance to obtain the second focusing distance.
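Following the sign and magnitude rules above, the compensation value reduces to the signed difference between the third and fourth focusing distances, and applying it reduces to a simple addition. The function names are illustrative:

```python
def compensation_value(third_focus_distance, fourth_focus_distance):
    """Compensation value for the ranging device: positive when the third
    focusing distance (from the first focusing mode, e.g. PDAF or CDAF)
    exceeds the fourth (measured with the ranging device), negative when
    smaller, zero when equal; its magnitude is their absolute difference.
    """
    return third_focus_distance - fourth_focus_distance

def calibrate(first_focus_distance, compensation):
    """Apply the compensation: adding a signed value covers both the
    "add the absolute value" (positive) and "subtract the absolute
    value" (negative) cases."""
    return first_focus_distance + compensation
```

For example, a PDAF reading of 10.2 mm against a ranging-device reading of 10.0 mm yields a +0.2 mm compensation, which is then added to every first focusing distance measured under those conditions.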
  • when the processor calibrates the fourth focusing distance based on the third focusing distance and determines the compensation value corresponding to the ranging device, a correlation can be established between the corresponding measurement conditions (for example, color temperature, brightness, and the like) and the corresponding compensation value.
  • in this way, when the processor determines the first focusing distance using the first distance measured by the ranging device under the current measurement conditions, it determines the compensation value corresponding to those measurement conditions based on, for example, their color temperature and brightness, and uses that compensation value to calibrate the first focusing distance to obtain the second focusing distance; this can improve the calibration accuracy and, in turn, the accuracy of the second focusing distance.
  • FIG. 5 shows a schematic structural diagram of a hardware composition of a camera provided by an embodiment of the present application.
  • the camera includes: a lens group 51 , a focus motor 52 and the distance measuring device 53 described in any of the above embodiments; wherein,
  • the focus motor 52 is configured to drive the lens group 51 to move to a corresponding focus position based on the focus distance; the focus distance is determined based on the positional relationship between the lens group 51 and the light source in the ranging device 53 and on the first distance; the first distance represents the distance between the light source in the ranging device 53 and the measured object.
  • a relevant calculation equation can be configured based on the positional relationship between the lens group 51 and the light source in the ranging device 53, and the corresponding focus distance can be obtained by substituting the first distance into the calculation equation.
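As an illustration only, since the actual calculation equation depends on the camera geometry, the configured equation might reduce to subtracting a fixed axial offset between the ranging light source and the lens group; the offset value and names below are assumptions:

```python
# Hypothetical geometry: the ranging light source sits a fixed axial
# offset behind the camera lens group, so the lens-to-object distance
# (and hence the focus distance) follows from the first distance by a
# configured equation. The offset value is illustrative.
LIGHT_SOURCE_TO_LENS_OFFSET_MM = 3.5

def focus_distance(first_distance_mm,
                   offset_mm=LIGHT_SOURCE_TO_LENS_OFFSET_MM):
    """Substitute the first distance (light source to measured object)
    into the configured equation to obtain the focus distance (lens
    group to measured object)."""
    return first_distance_mm - offset_mm
```

A real implementation would configure this equation from the measured mechanical layout of the module rather than a constant.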
  • the first distance is determined based on the first information output by the distance measuring device 53, and the first information at least represents the corresponding relationship between wavelength and light intensity.
  • the first distance can be calculated by the processor in the camera.
  • the first distance can also be calculated by a processor in the electronic device used with the camera.
  • when the focus motor 52 acquires the focus distance, it drives the lens group 51 to move to the corresponding focus position based on the focus distance to complete the autofocus.
  • the camera in this embodiment can perform auto-focusing in the close-range zoom shooting mode, can also perform auto-focusing in the wide-angle macro shooting mode, and can also perform auto-focusing in the short-distance focusing mode.
  • the distance between the light source in the ranging device and the measured object can be determined based on the first information output by the ranging device, which improves the accuracy of the distance; the focusing distance determined based on that distance is therefore more accurate, and the autofocus based on the focusing distance works better, which can improve the clarity of the image captured in the autofocus mode.
  • when the spectral sensor in the ranging device in the camera is an area array spectral sensor, the corresponding distance can finally be determined based on the wavelengths and light intensities corresponding to different focusing positions on the surface of the measured object, so that even when the surface of the measured object is not flat, the sharpness of the image captured by autofocus remains high.
  • the camera includes a wide-angle camera or a macro camera.
  • the dispersion range of the lens group in the distance measuring device 53 can be increased.
  • the dispersion range of the lens group in the ranging device 53 can be shortened.
  • when the dispersion range of the lens group in the ranging device 53 is shortened, the accuracy of the distance measured by the ranging device is relatively high, so when autofocus is performed based on the distance measured by the ranging device, a clearer image can be obtained, improving the quality of images taken at close range.
  • the lens group in the ranging device includes at least two dispersive lenses, which can be configured to adjust the dispersion range, so as to adjust the focus position of light of different wavelengths, and then adjust the focus distance corresponding to light of different wavelengths.
  • the accuracy of the distance measured by the ranging device is high, and the image obtained by the camera using the distance measured by the ranging device to automatically focus and shoot has high definition.
  • the embodiment of the present application further provides a camera. As shown in FIG. 6 , the camera includes:
  • a communication interface 61 capable of information interaction with other devices such as electronic devices;
  • the processor 62 is connected to the communication interface 61 to realize information interaction with other devices, and is configured to execute the ranging method provided by one or more of the above technical solutions when running a computer program; the computer program is stored in the memory 63.
  • the ranging device 65 can interact with the processor 62 for information. For example, the processor 62 sends the focusing distance determined based on the first distance to the focusing motor 66 .
  • bus system 64 is configured to enable connection communication between these components.
  • bus system 64 also includes a power bus, a control bus and a status signal bus.
  • for clarity, the various buses are all labeled as bus system 64 in FIG. 6.
  • the memory 63 in the embodiment of the present application is configured to store various types of data to support the operation of the camera. Examples of such data include: any computer program configured to operate on the camera.
  • FIG. 7 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application. As shown in FIG. 7 , the electronic device includes:
  • a communication interface 71 capable of information interaction with other devices such as network devices;
  • the processor 72 is connected to the communication interface 71 to exchange information with other devices, and is configured to execute, when running a computer program, the ranging method provided by one or more of the above technical solutions; the computer program is stored in the memory 73.
  • the camera 75 can exchange information with the processor 72.
  • when determining the corresponding focus distance using the distance measured with the above distance measuring method, the processor 72 sends the determined focus distance to the camera 75.
  • the camera 75 is the camera provided by any of the above technical solutions.
  • bus system 74 is configured to enable connection communication between these components.
  • bus system 74 also includes a power bus, a control bus and a status signal bus.
  • for clarity, the various buses are all labeled as bus system 74 in FIG. 7.
  • the memory 73 in the embodiment of the present application is configured to store various types of data to support the operation of the electronic device. Examples of such data include: any computer program configured to operate on an electronic device.
  • the memory 63 or the memory 73 may be a volatile memory or a non-volatile memory, and may also include both volatile and non-volatile memory.
  • the non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a magnetic random access memory (FRAM, ferromagnetic random access memory), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory can be disk memory or tape memory.
  • RAM: Random Access Memory
  • SRAM: Static Random Access Memory
  • SSRAM: Synchronous Static Random Access Memory
  • DRAM: Dynamic Random Access Memory
  • SDRAM: Synchronous Dynamic Random Access Memory
  • DDR SDRAM: Double Data Rate Synchronous Dynamic Random Access Memory
  • ESDRAM: Enhanced Synchronous Dynamic Random Access Memory
  • SLDRAM: SyncLink Dynamic Random Access Memory
  • DRRAM: Direct Rambus Random Access Memory
  • the methods disclosed in the above embodiments of the present application may be applied to the processor 62 or the processor 72 , or implemented by the processor 62 or the processor 72 .
  • the processor 62 or the processor 72 may be an integrated circuit chip with signal processing capability. In the implementation process, each step of the above-mentioned method may be completed by an integrated logic circuit of hardware in the processor 62 or the processor 72 or instructions in the form of software.
  • the aforementioned processor 62 or processor 72 may be a general-purpose processor, a DSP, or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like.
  • the processor 62 or the processor 72 may implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present application.
  • a general purpose processor may be a microprocessor or any conventional processor or the like.
  • the steps of the method disclosed in the embodiments of the present application can be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in a storage medium; the storage medium may be located in the memory 63, with the processor 62 reading the program in the memory 63 and completing the steps of the foregoing method in combination with its hardware, or in the memory 73, with the processor 72 reading the program in the memory 73 and completing the steps of the foregoing method in combination with its hardware.
  • an embodiment of the present application further provides a storage medium, that is, a computer storage medium, specifically a computer-readable storage medium, for example a memory 63 storing a computer program executable by the processor 62, or a memory 73 storing a computer program executable by the processor 72, to perform the steps described in the aforementioned methods.
  • the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, Flash Memory, magnetic surface memory, optical disk, or CD-ROM.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation there may be other divisions: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be electrical, mechanical, or in other forms.
  • the units described above as separate components may or may not be physically separate, and components displayed as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • the functional units in the embodiments of the present application may all be integrated into one processing module, or each unit may serve separately as one unit, or two or more units may be integrated into one unit; the integrated unit can be implemented either in the form of hardware or in the form of hardware plus software functional units.
  • the aforementioned program can be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments; the aforementioned storage medium includes various media that can store program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Provided are a distance measuring apparatus, a distance measuring method, a camera, and an electronic device. The distance measuring apparatus includes: a light source (11), a beam splitter (12), a lens group (13) including at least one dispersive lens, a first light limiter (14), and a spectral sensor (15). The light source (11) is configured to emit probe light in a set waveband; the beam splitter (12) is configured to transmit the probe light; the lens group (13) is configured to disperse the light transmitted from the beam splitter (12) to the lens group (13), focusing light of different wavelengths to different positions; the first light limiter (14) has a light-passing region that allows first reflected light to pass, the first reflected light being the light reflected at the second surface of the beam splitter (12) when second reflected light is transmitted from the lens group (13) to that second surface, and the second reflected light being the light reflected at the surface of the measured object (2) by the light focused onto that surface; the spectral sensor (15) is configured to output first information upon receiving the first reflected light, the first information representing at least the light intensity corresponding to the wavelength of the first reflected light and being configured to determine the distance between the light source (11) and the measured object (2).

Description

Distance measuring apparatus, distance measuring method, camera, and electronic device
Cross-reference to related applications
This application is based on, and claims priority to, Chinese patent application No. 202010911834.3 filed on September 2, 2020, the entire contents of which are incorporated herein by reference.
Technical field
This application relates to the field of imaging technology, and in particular to a distance measuring apparatus, a distance measuring method, a camera, and an electronic device.
Background
An electronic device with a shooting function typically uses a ranging module in the device to detect the distance between the lens and the photographed object, and performs autofocus based on that distance. In macro shooting scenarios, the distance detected by the ranging module in the related art deviates considerably from the actual distance, resulting in poor focusing and blurred captured images.
Summary
To solve the above technical problems, embodiments of the present application provide a distance measuring apparatus, a distance measuring method, a camera, and an electronic device.
An embodiment of the present application provides a distance measuring apparatus, including:
a light source configured to emit probe light in a set waveband;
a beam splitter configured to transmit the probe light and output transmitted light corresponding to the probe light, the probe light being incident on a first surface of the beam splitter;
a lens group including at least one dispersive lens, configured to disperse the light transmitted from the beam splitter to the lens group, so as to focus light of different wavelengths to different positions;
a first light limiter having a light-passing region that allows first reflected light to pass, the first reflected light being the light reflected at a second surface of the beam splitter when second reflected light is transmitted from the lens group to the second surface, the second reflected light being the light reflected at the surface of a measured object by the light focused onto that surface, and the second surface being arranged opposite the first surface; and
a spectral sensor configured to output first information upon receiving the first reflected light, the first information representing at least the light intensity corresponding to the wavelength of the first reflected light and being configured to determine the distance between the light source and the measured object.
An embodiment of the present application further provides a distance measuring method based on any of the above distance measuring apparatuses, including:
determining, based on the first information, at least one first wavelength of the light focused onto the surface of the measured object; and
determining the distance between the light source and the measured object based on a set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength, wherein
at a same focusing position on the surface of the measured object, the first wavelength corresponds to the maximum light intensity.
An embodiment of the present application further provides a camera, including:
a lens group, a focusing motor, and any of the above distance measuring apparatuses, wherein
the focusing motor is configured to drive the lens group to a corresponding focus position based on a focus distance; the focus distance is determined based on the positional relationship between the lens group and the light source in the distance measuring apparatus and on a first distance; and the first distance represents the distance between the light source in the distance measuring apparatus and the measured object.
An embodiment of the present application further provides a camera, including:
a processor, a memory configured to store a computer program runnable on the processor, and any of the above distance measuring apparatuses, wherein
the processor is configured to implement any of the above distance measuring methods when running the computer program.
An embodiment of the present application further provides an electronic device, including:
a processor, a memory configured to store a computer program runnable on the processor, and any of the above cameras, wherein
the processor is configured to implement any of the above distance measuring methods when running the computer program.
An embodiment of the present application further provides a storage medium storing a computer program which, when executed by a processor, implements any of the above distance measuring methods.
Brief description of the drawings
FIG. 1 is a schematic structural diagram of a distance measuring apparatus according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a spectral curve according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a distance measuring apparatus according to another embodiment of the present application;
FIG. 4 is a schematic flowchart of an implementation of a distance measuring method based on any of the above distance measuring apparatuses according to an embodiment of the present application;
FIG. 5 is a schematic diagram of the hardware structure of a camera according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the hardware structure of a camera according to another embodiment of the present application;
FIG. 7 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application.
Detailed description
Before introducing the technical solutions of the present application, autofocus methods in the related art are first introduced.
The related art provides an autofocus method in which a laser emitting device of a ranging module in an electronic device emits first infrared laser light; the first infrared laser light is reflected when it reaches the surface of the photographed object, producing second infrared laser light; a laser receiving device of the ranging module receives the second infrared laser light; a time difference is calculated from the emission time of the first infrared laser light and the reception time of the second infrared laser light; the distance between the camera of the electronic device and the photographed object is calculated from the propagation speed of the infrared laser light in the propagation medium and the calculated time difference; and a focusing motor in the electronic device drives the lens group of the camera to the corresponding focus position based on the calculated distance, completing autofocus.
In close-range or macro shooting scenarios, the actual distance between the camera of the electronic device and the photographed object is small. In the related art, the test distance between the camera and the photographed object measured by the ranging module deviates considerably from the corresponding actual distance, so images captured when the electronic device autofocuses based on that test distance are blurred.
To solve the above technical problems, an embodiment of the present application provides a distance measuring apparatus, including:
a light source configured to emit probe light in a set waveband;
a beam splitter configured to transmit the probe light and output transmitted light corresponding to the probe light, the probe light being incident on a first surface of the beam splitter;
a lens group including at least one dispersive lens, configured to disperse the light transmitted from the beam splitter to the lens group, so as to focus light of different wavelengths to different positions;
a first light limiter having a light-passing region that allows first reflected light to pass, the first reflected light being the light reflected at a second surface of the beam splitter when second reflected light is transmitted from the lens group to the second surface, the second reflected light being the light reflected at the surface of a measured object by the light focused onto that surface, and the second surface being arranged opposite the first surface; and
a spectral sensor configured to output first information upon receiving the first reflected light, the first information representing at least the light intensity corresponding to the wavelength of the first reflected light and being configured to determine the distance between the light source and the measured object.
In an embodiment, the light-passing region of the first light limiter is a through hole, and the spectral sensor is correspondingly a dot-matrix spectral sensor.
In an embodiment, the distance measuring apparatus further includes a first lens and a second light limiter arranged between the light source and the beam splitter;
the first lens is configured to convert first probe light incident on the first lens into corresponding second probe light, the second probe light being a parallel beam corresponding to the first probe light; and
the second light limiter has a slit that allows the second probe light to pass, the second probe light passing through the slit of the second light limiter being configured to be incident on the beam splitter.
In an embodiment, the first lens includes at least one collimating lens and at least one cylindrical lens arranged opposite each other, wherein
the collimating lens is configured to convert the first probe light into a parallel beam; and
the cylindrical lens is configured to converge the parallel beam into the corresponding second probe light.
In an embodiment, the light-passing region of the first light limiter is a slit; the spectral sensor is correspondingly an area-array spectral sensor; and the first information further represents the focusing position, on the surface of the measured object, of the second reflected light corresponding to the first reflected light.
In an embodiment, the lens group includes at least two dispersive lenses, and the lens group is further configured to adjust a dispersion range, wherein
the dispersion range represents, when transmitted light is dispersed by the lens group, the range of distances between the lens group and the focusing positions of light of different wavelengths in the corresponding transmitted light.
In an embodiment, the transmittance of the first surface of the beam splitter is greater than its reflectance.
In an embodiment, the first surface of the beam splitter is coated with at least one anti-reflection layer, and the second surface of the beam splitter is coated with at least one reflection-enhancing layer.
In an embodiment, the first information includes a spectral curve representing the correspondence between the wavelength and the light intensity of the first reflected light.
An embodiment of the present application further provides a distance measuring method using the above distance measuring apparatus, including:
determining, based on the first information, at least one first wavelength of the light focused onto the surface of the measured object; and
determining the distance between the light source and the measured object based on a set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength, wherein
at a same focusing position on the surface of the measured object, the first wavelength corresponds to the maximum light intensity.
In an embodiment, when the spectral sensor is a dot-matrix spectral sensor, one first wavelength of the light focused onto the surface of the measured object is determined; and determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength includes:
determining, based on the set correspondence between wavelength and calibrated distance, a first calibrated distance corresponding to the first wavelength, the first calibrated distance representing the distance between the light source and the measured object.
In an embodiment, when the spectral sensor is an area-array spectral sensor, at least two first wavelengths of the light focused onto the surface of the measured object are determined; and determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength includes:
determining the distance between the light source and the measured object based on the distribution characteristics of the at least two first wavelengths and on the set correspondence between wavelength and calibrated distance.
In an embodiment, the distance measuring method further includes:
clustering the at least two first wavelengths to obtain a clustering result; and
determining the distribution characteristics corresponding to the at least two first wavelengths based on the clustering result.
An embodiment of the present application further provides a camera, including:
a lens group, a focusing motor, and any of the above distance measuring apparatuses, wherein
the focusing motor is configured to drive the lens group to a corresponding focus position based on a focus distance; the focus distance is determined based on the positional relationship between the lens group and the light source in the distance measuring apparatus and on a first distance; and the first distance represents the distance between the light source in the distance measuring apparatus and the measured object.
An embodiment of the present application further provides a camera, including:
a processor, a memory configured to store a computer program runnable on the processor, and the above distance measuring apparatus, wherein
the processor is configured to execute the steps of the above distance measuring method when running the computer program.
An embodiment of the present application further provides an electronic device, including:
a processor, a memory configured to store a computer program runnable on the processor, and the above camera, wherein
the processor is configured to execute the steps of the above distance measuring method when running the computer program.
The technical solutions of the present application are further elaborated below with reference to the drawings and specific embodiments.
FIG. 1 shows a schematic structural diagram of a distance measuring apparatus according to an embodiment of the present application. Referring to FIG. 1, the distance measuring apparatus 1 includes: a light source 11, a beam splitter 12, a lens group 13 including at least one dispersive lens, a first light limiter 14, and a spectral sensor 15, wherein
the light source 11 is configured to emit probe light in a set waveband;
the beam splitter 12 is configured to transmit the probe light and output transmitted light corresponding to the probe light, the probe light being incident on a first surface of the beam splitter;
the lens group 13 including at least one dispersive lens is configured to disperse the light transmitted from the beam splitter 12 to the lens group 13, so as to focus light of different wavelengths to different positions;
the first light limiter 14 has a light-passing region that allows first reflected light to pass, the first reflected light being the light reflected at a second surface of the beam splitter 12 when second reflected light is transmitted from the lens group 13 to the second surface, the second reflected light being the light reflected at the surface of the measured object by the light focused onto that surface, and the second surface being arranged opposite the first surface; and
the spectral sensor 15 is configured to output first information upon receiving the first reflected light, the first information representing at least the light intensity corresponding to the wavelength of the first reflected light and being configured to determine the distance between the light source and the measured object.
It should be noted that the beam splitter 12 has a first surface and a second surface arranged opposite each other. The first light limiter 14 is arranged between the beam splitter 12 and the spectral sensor 15, close to the second surface of the beam splitter 12 and facing away from its first surface.
The working principle of the distance measuring apparatus 1 is described below with reference to FIG. 1:
The light source 11 emits probe light in a set waveband. In practical applications, the light source 11 is a point light source and may emit near-infrared light in the set waveband. Here, the probe light has a relatively wide wavelength range; the set waveband may be 760 nanometers (nm) to 1500 nm.
When the probe light emitted by the light source 11 reaches the first surface of the beam splitter 12, part of it is reflected at the first surface, and the other part passes through the first and second surfaces and is transmitted out of the second surface of the beam splitter 12. Here, the first and second surfaces of the beam splitter 12 are arranged opposite each other. The beam splitter 12 does not change the propagation direction of the probe light; therefore the light transmitted out of the second surface propagates in the same direction as the corresponding probe light.
When the light transmitted out of the second surface of the beam splitter 12 passes through the lens group 13 including at least one dispersive lens, dispersion occurs, decomposing the transmitted (polychromatic) light into monochromatic light, so that the monochromatic components of different wavelengths included in the transmitted light are focused to different axial positions, for example e, f and g in FIG. 1, where g denotes the focusing position on the surface of the measured object. The lens group 13 does not change the propagation direction of the light transmitted from the beam splitter 12 to the lens group 13.
When the measured object 2 is within the dispersion range of the lens group 13, the light focused onto the surface of the measured object 2 is reflected at that surface, producing corresponding second reflected light. Here, the second reflected light is a light beam. Based on the principle of optical path reversibility, the second reflected light returns to the lens group 13 along the emission path of the corresponding transmitted light and is transmitted through the lens group 13 to the second surface of the beam splitter 12; the second reflected light thus finally returns to the second surface of the beam splitter 12 along the emission path of the light transmitted through that surface. For example, in FIG. 1, the transmitted light emerging from the region ab of the beam splitter 12 is incident on the region cd of the lens group 13; it is dispersed when passing through the lens group 13, the components of different wavelengths are focused to different positions, and the light focused onto the surface of the measured object 2 is reflected at the focusing position g on the object surface, producing corresponding second reflected light in the shape of an inverted cone formed by the three points g, c and d in FIG. 1. Based on optical path reversibility, the second reflected light returns through the region cd of the lens group 13 to the region ab of the second surface of the beam splitter 12. The dispersion range represents, when transmitted light is dispersed by the lens group 13, the range of distances between the focusing positions of the different wavelengths in the corresponding transmitted light and the optical center o of the lens group 13; it is determined by the minimum and maximum distances between the focusing positions and the optical center o of the lens group 13.
When the second reflected light reaches the second surface of the beam splitter 12, it is reflected there, producing corresponding first reflected light, which passes through the light-passing region of the first light limiter 14. Upon detecting the first reflected light passing through the light-passing region of the first light limiter 14, the spectral sensor 15 outputs first information representing at least the light intensity corresponding to the wavelength of the first reflected light. Here, in FIG. 1, the triangle formed by the three points a, b and h is isosceles, h denotes the center of the light-passing region of the first light limiter 14, and ab corresponds to the maximum diameter of the probe light projected onto the surface of the beam splitter 12; this ensures that the first reflected light produced by reflection of the second reflected light at the second surface of the beam splitter 12 can pass through the first light limiter 14 and be received by the spectral sensor 15.
It should be noted that the beam splitter 12 is configured to transmit the probe light incident on its first surface and to reflect the second reflected light transmitted from the lens group 13 to its second surface.
In practical applications, the transmittance of the first surface of the beam splitter 12 is greater than its reflectance; for example, the transmittance of the first surface is greater than 50%. Thus, when any beam of probe light emitted by the light source 11 reaches the first surface of the beam splitter, the portion transmitted through the second surface exceeds the portion reflected at the first surface, reducing the energy the probe light loses during propagation. The transmittance of the second surface of the beam splitter 12 is smaller than its reflectance; for example, the reflectance of the second surface is greater than 50%, ensuring that the second reflected light is reflected at the second surface of the beam splitter 12 to produce the corresponding first reflected light, so that the spectral sensor 15 can detect it.
In an embodiment, the first surface of the beam splitter 12 is coated with at least one anti-reflection layer, and the second surface of the beam splitter 12 with at least one reflection-enhancing layer. The anti-reflection layer is configured to increase the amount of light transmitted through the first surface; the reflection-enhancing layer is configured to reduce the amount of light transmitted through the second surface and increase the reflected light.
The size of the light-passing region of the first light limiter 14 is adjustable. The first light limiter 14 is configured to limit the intensity of the first reflected light passing through the light-passing region, and its light-passing region ensures that only the first reflected light corresponding to the light focused on the surface of the measured object 2 can pass. The narrower the light-passing region of the first light limiter 14, the lower the intensity of the light passing through it, the less stray light passes through the first light limiter 14, and the more accurate the distance determined based on the first information output by the spectral sensor 15.
The spectral sensor 15 can receive light in the visible waveband. The spectral sensor 15 may be a charge-coupled device (CCD) spectral sensor or a complementary metal-oxide-semiconductor (CMOS) spectral sensor.
In an embodiment, the first information may include a spectral curve, as shown in FIG. 2, representing the correspondence between the wavelength and the light intensity of the first reflected light.
In the solution provided by this embodiment, when the light transmitted out of the second surface of the beam splitter 12 is dispersed by the lens group 13, the wavelength of the component focused onto the surface of the measured object 2 corresponds to the maximum light intensity. Therefore the first wavelength of the light focused onto the surface of the measured object 2 can be determined from the wavelength-intensity correspondence represented by the first information, and the distance between the light source 11 and the measured object 2 can be determined from the set relationship between wavelength and calibrated distance and the determined first wavelength. The probe light in the set waveband has a relatively wide wavelength range; when the light transmitted from the beam splitter to the lens group is dispersed, components of different wavelengths can be focused to multiple different focusing positions, and the spectral sensor is small compared with a spectrometer, so the distance measuring apparatus is suitable for short-distance ranging scenarios. In addition, the receiving surface of the spectral sensor configured to receive the first reflected light is larger than the corresponding receiving surface of a spectrometer, so the spectral sensor receives relatively more first reflected light and the utilization of light energy is higher; the measured distance therefore remains accurate even when the distance between the distance measuring apparatus and the measured object is short. The lens group can be configured to adjust the dispersion range, thereby changing the focusing positions of light of different wavelengths and hence the focus distances corresponding to different wavelengths, so the distance measured by the apparatus is also accurate in close-range zoom shooting scenarios.
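The mapping described above, from a measured spectral curve to the wavelength of maximum intensity and then to a calibrated distance, can be sketched as follows. This is only an illustration, not part of the patent: the calibration table and the spectrum values are hypothetical.

```python
def peak_wavelength(spectrum):
    """Return the wavelength with maximal intensity (the 'first wavelength')
    from a list of (wavelength_nm, intensity) samples."""
    return max(spectrum, key=lambda s: s[1])[0]

def distance_from_wavelength(wl_nm, calib):
    """Piecewise-linear lookup of the calibrated distance for a wavelength,
    using the set correspondence between wavelength and calibrated distance.
    `calib` is a list of (wavelength_nm, distance_mm) pairs sorted by wavelength."""
    if wl_nm <= calib[0][0]:
        return calib[0][1]
    for (w0, d0), (w1, d1) in zip(calib, calib[1:]):
        if wl_nm <= w1:
            t = (wl_nm - w0) / (w1 - w0)
            return d0 + t * (d1 - d0)
    return calib[-1][1]

# Hypothetical calibration table: 760-1500 nm mapped to 5-25 mm.
calib = [(760.0, 5.0), (1130.0, 15.0), (1500.0, 25.0)]

# Hypothetical spectral curve whose intensity peaks at 1130 nm.
spectrum = [(w, -abs(w - 1130.0)) for w in range(760, 1501, 10)]
first_wl = peak_wavelength(spectrum)
print(distance_from_wavelength(first_wl, calib))  # 15.0
```

In practice the calibration table would hold the wavelength/distance pairs established with the auxiliary ranging device described later in the text.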
In an embodiment, the light-passing region of the first light limiter 14 is a through hole, in which case the first light limiter 14 is an aperture stop, and the spectral sensor 15 is correspondingly a dot-matrix spectral sensor. The diameter of the through hole of the aperture stop is adjustable.
Here, the smaller the aperture of the through hole of the aperture stop, the steeper the spectral curve output by the spectral sensor 15 and the higher the precision of the distance determined from the spectral curve.
It should be noted that the probe light emitted by the light source 11 forms a corresponding light spot when projected onto the beam splitter 12; for example, the spot lies in the region ab in FIG. 1.
Any beam of probe light emitted by the light source 11, after passing through the second surface of the beam splitter 12, forms a spot when projected onto the surface of the lens group 13, for example in the region cd. Correspondingly, when the transmitted light is dispersed by the lens group 13, the light focused onto the surface of the measured object 2 forms a spot corresponding to a single point on that surface. The first information output by the spectral sensor 15 is thus configured to determine the distance between the light source and the focusing position of the spot focused onto the surface of the measured object.
In an embodiment, the distance measuring apparatus 1 further includes a first lens 16 and a second light limiter 17 arranged between the light source 11 and the beam splitter 12;
the first lens 16 is configured to convert first probe light incident on the first lens 16 into corresponding second probe light, the second probe light being a parallel beam corresponding to the first probe light; and
the second light limiter 17 has a slit that allows the second probe light to pass; the second probe light passing through the slit of the second light limiter 17 is configured to be incident on the first surface of the beam splitter 12.
Referring also to FIG. 3, which shows a schematic structural diagram of a distance measuring apparatus according to another embodiment of the present application, the apparatus in FIG. 3 adds, on the basis of FIG. 1, the first lens 16 and the second light limiter 17 arranged between the light source 11 and the beam splitter 12.
Here, the first lens 16 is a line lens. The first probe light emitted by the light source 11 is converted into corresponding second probe light after passing through the first lens 16; part of that second probe light enters the first surface of the beam splitter 12 through the slit of the second light limiter 17, and part of it is transmitted out of the second surface of the beam splitter 12 to the lens group 13, where dispersion occurs. The slit of the second light limiter 17 is a narrow, elongated aperture of adjustable width.
It should be noted that the second probe light is the parallel beam corresponding to the first probe light: in the first probe light, the distance between two adjacent rays grows with propagation distance, whereas the rays of the second probe light are parallel to one another.
It should be noted that since the second probe light passing through the second light limiter 17 in FIG. 3 corresponds to a line-shaped spot, the light focused onto the surface of the measured object 2 in FIG. 3 also corresponds to a line-shaped spot, the second reflected light passing through the first light limiter 14 also corresponds to a line-shaped spot, and the spectral sensor 15 detects a line-shaped spot. In FIG. 1, by contrast, the light focused onto the surface of the measured object 2 corresponds to a point-shaped spot, the second reflected light passing through the first light limiter 14 also corresponds to a point-shaped spot, and the spectral sensor 15 detects a point-shaped spot. Compared with FIG. 1, the detection range of the distance measuring apparatus 1 of FIG. 3 on the surface of the measured object 2 is larger: the position of the line-shaped spot in FIG. 3 is the detection range on the surface of the measured object 2, while the position of the point-shaped spot in FIG. 1 is the detection range on the surface of the measured object 2.
In an embodiment, the first lens 16 includes at least one collimating lens 161 and at least one cylindrical lens 162 arranged opposite each other, wherein
the collimating lens is configured to convert the first probe light into a parallel beam; and
the cylindrical lens is configured to converge the parallel beam into the corresponding second probe light.
In an embodiment, the light-passing region of the first light limiter 14 in FIG. 3 is a slit; the spectral sensor 15 is correspondingly an area-array spectral sensor; and the first information further represents the focusing position, on the surface of the measured object, of the second reflected light corresponding to the first reflected light.
Since the first reflected light is produced by reflection of the second reflected light at the second surface of the beam splitter 12, and the second reflected light is produced by reflection, at the surface of the measured object 2, of the light focused onto that surface, the first reflected light corresponds to the focusing position of the second reflected light on the surface of the measured object. In practical applications, different numbers or letters can denote the different focusing positions on the surface of the measured object 2 corresponding to the first reflected light. The spectral sensor 15 may output at least two spectral curves, each representing a correspondence between wavelength and light intensity; each curve corresponds to one focusing position, and different curves correspond to different focusing positions.
In the solution of this embodiment, with the first lens 16 added to the distance measuring apparatus 1, any beam of first probe light emitted by the light source 11 is converted into corresponding second probe light by the first lens 16. Since the second probe light is a parallel beam corresponding to the first probe light, the second probe light passing through the slit of the second light limiter 17 forms a line-shaped spot when projected onto the beam splitter 12; accordingly, when the portion of that light transmitted out of the second surface of the beam splitter 12 is dispersed by the lens group 13, the light focused onto the surface of the measured object also forms a line-shaped spot, and so does the first reflected light passing through the first light limiter 14. The first information output by the spectral sensor 15 can thus represent the light intensity corresponding to the wavelength of the first reflected light at each focusing position within the line-shaped spot on the surface of the measured object 2. Based on those per-position intensities and on the set correspondence between wavelength and calibrated distance, the distance between the light source 11 of the distance measuring apparatus and the measured object 2 is determined. Here, based on the distribution of the wavelengths of the first reflected light at the various focusing positions on the surface of the measured object 2 represented by the first information, wavelengths with large errors can be discarded; the distances between the different focusing positions on the surface of the measured object 2 and the light source 11 are then determined from the intensities corresponding to the remaining wavelengths and the set correspondence between wavelength and calibrated distance; and the distance between the light source 11 of the distance measuring apparatus and the measured object 2 is finally determined from the distances corresponding to the different focusing positions. Thus, when the surface of the measured object 2 is uneven, for example bumpy, the measurement error of the distance between the light source 11 and the measured object 2 can be reduced and the measurement precision improved.
In an embodiment, the lens group 13 includes at least two dispersive lenses, and the lens group 13 is further configured to adjust a dispersion range, wherein
the dispersion range represents, when transmitted light is dispersed by the lens group, the range of distances between the lens group and the focusing positions of light of different wavelengths in the corresponding transmitted light.
In practical applications, the surface shape and structure of the lens group 13 can be adjusted to change the focusing positions of the different wavelengths in the corresponding transmitted light, thereby changing the distance between the optical center of the lens group 13 and the focus point on the surface of the measured object 2, and hence the focus distances corresponding to light of different wavelengths. Here, the dispersion range is determined by the nearest and farthest focusing positions of the different wavelengths in the corresponding transmitted light; the nearest focusing position is the one closest to the optical center of the lens group 13, and the farthest focusing position is the one farthest from it.
Here, the lens group 13 including at least two dispersive lenses corresponds to adjusting the dispersion range by adjusting the structure of the lens group 13.
Shortening the dispersion range of the lens group 13 reduces the distance between the focusing position on the surface of the measured object 2 and the optical center of the lens group 13; enlarging the dispersion range of the lens group 13 increases that distance.
In practical applications, the distance measuring apparatus may be arranged in a camera; shortening the dispersion range of the lens group 13 suits macro focus mode, while enlarging it suits telephoto mode.
FIG. 4 shows a schematic flowchart of an implementation of a distance measuring method based on any of the above distance measuring apparatuses according to an embodiment of the present application. The method is executed by a processor, which may be arranged in the distance measuring apparatus, in a camera, or in an electronic device with a shooting function. The electronic device may be a mobile phone, a microscope, or the like. The method of this embodiment is applicable to close-range zoom shooting mode, to wide-angle macro shooting mode, and to short-distance focus mode.
Referring to FIG. 4, the distance measuring method of this embodiment includes:
S401: determining, based on the first information, at least one first wavelength of the light focused onto the surface of the measured object, wherein at a same focusing position on the surface of the measured object, the first wavelength corresponds to the maximum light intensity.
Here, the first information represents at least the light intensity corresponding to the wavelength of the first reflected light.
Based on the wavelength-intensity correspondence represented by the first information, the processor determines the first wavelength corresponding to at least one focusing position of the light on the surface of the measured object. The light focused onto the surface of the measured object may correspond to one focusing position or to at least two different focusing positions. At a same focusing position on the surface of the measured object, the first wavelength corresponds to the maximum light intensity.
S402: determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength.
The processor stores a set correspondence between wavelength and calibrated distance. Having determined the first wavelength corresponding to at least one focusing position of the light on the surface of the measured object, the processor determines, based on the set correspondence and on the determined first wavelengths, the first calibrated distance corresponding to each of the at least one first wavelengths, and determines the distance between the light source 11 of the distance measuring apparatus and the measured object 2 from those first calibrated distances.
It should be noted that, before ranging, the processor may use an auxiliary ranging device to measure, under different environments, the wavelengths of light focused onto the surface of the same measured object and the corresponding calibrated distances, so as to establish the set correspondence between wavelength and calibrated distance.
In the actual calibration procedure, the wavelength of the light focused onto the surface of the measured object and the corresponding calibrated distance can be measured multiple times under different environments using the auxiliary ranging device, and the final wavelength and calibrated distance determined from those measurements, reducing the error of a calibrated distance obtained in a single environment. The different environments may include at least one of: environments with different color temperatures; environments with different brightness.
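The multi-environment calibration procedure above can be sketched as follows. This is an illustrative outline only; the per-environment measurement values are hypothetical stand-ins for readings from the auxiliary ranging device.

```python
from statistics import mean

def build_calibration(samples):
    """Build the set correspondence between wavelength and calibrated distance.
    `samples` maps each wavelength (nm) to the list of distances (mm) measured
    with the auxiliary ranging device under different environments; averaging
    over environments reduces the error of a single-environment calibration."""
    return sorted((wl, mean(dists)) for wl, dists in samples.items())

# Hypothetical repeated measurements under environments with different
# color temperatures / brightness (three environments per wavelength).
samples = {
    800.0: [6.0, 6.5, 5.5],
    1000.0: [11.0, 10.5, 11.5],
    1200.0: [16.0, 15.75, 16.25],
}
calibration = build_calibration(samples)
print(calibration)  # [(800.0, 6.0), (1000.0, 11.0), (1200.0, 16.0)]
```

The resulting table is the kind of wavelength-to-calibrated-distance correspondence the processor stores and later corrects against a supported first focus method.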
When the processor is arranged in an electronic device with a shooting function, after the electronic device leaves the factory, the set correspondence between wavelength and calibrated distance can be corrected based on the wavelength of the light focused onto the surface of the measured object and the corresponding distance measured with a first focus method supported by the electronic device, thereby ensuring the accuracy of the calibrated distance corresponding to the first wavelength determined from the first information output by the distance measuring apparatus and from the set relationship between wavelength and calibrated distance. Here, the first focus method includes at least one of:
phase detection autofocus (PDAF);
laser detection autofocus (LDAF);
contrast detection autofocus (CDAF).
In the solution of this embodiment, at a same focusing position on the surface of the measured object, the first wavelength corresponds to the maximum light intensity. The processor determines, based on the first information, the first wavelength corresponding to at least one focusing position of the light on the surface of the measured object and, based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength, can accurately measure the distance between the light source of the distance measuring apparatus and the measured object. The distance between the lens group and the measured object is then obtained from the positional relationship between the lens group of the camera in the electronic device and the light source of the distance measuring apparatus; the focus distance of the electronic device is in turn accurately determined from the distance between the lens group and the measured object, improving the sharpness of images captured by autofocus based on that focus distance.
In an embodiment, when the spectral sensor is a dot-matrix spectral sensor, the processor determines, based on the first information, one first wavelength of the light focused onto the surface of the measured object. Determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength includes:
determining, based on the set correspondence between wavelength and calibrated distance, the first calibrated distance corresponding to the first wavelength, the first calibrated distance representing the distance between the light source and the measured object.
Here, the dot-matrix spectral sensor suits point-detection scenarios, detecting the second reflected light corresponding to a single focusing position on the surface of the measured object, namely the position of the point-shaped spot formed by the light focused onto the surface. As shown in FIG. 1, when the light transmitted out of the second surface of the beam splitter 12 is dispersed by the lens group 13, the component focused onto the surface of the measured object 2 forms a point-shaped spot corresponding to a single focusing position on that surface; the first reflected light received at the second surface of the beam splitter 12 is the reflected light of that same focusing position, so the first reflected light entering the dot-matrix spectral sensor 15 through the through hole of the first light limiter 14 also forms a point-shaped spot, and the first information output by the dot-matrix spectral sensor 15 represents the wavelength and intensity of the second reflected light corresponding to that same focusing position. Having determined from the first information the single first wavelength with maximum intensity, the processor determines, from the set correspondence between wavelength and calibrated distance, the first calibrated distance corresponding to that first wavelength, i.e. the distance between the light source 11 and the measured object 2.
In this embodiment, the processor can determine the distance between one focusing position on the surface of the measured object 2 and the light source 11; when the surface of the measured object 2 is relatively flat, the determined distance is accurate.
In an embodiment, when the spectral sensor is an area-array spectral sensor, at least two first wavelengths of the light focused onto the surface of the measured object are determined. Determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength includes:
determining the distance between the light source and the measured object based on the distribution characteristics of the at least two first wavelengths and on the set correspondence between wavelength and calibrated distance.
As shown in FIG. 3, the light-passing region of the first light limiter 14 is a slit and the spectral sensor 15 is an area-array spectral sensor. The area-array spectral sensor suits line-detection scenarios, detecting the second reflected light corresponding to different focusing positions on the surface of the measured object.
Here, when the light transmitted out of the second surface of the beam splitter 12 is dispersed by the lens group 13, the light focused onto the surface of the measured object 2 forms a line-shaped spot; the first reflected light received at the second surface of the beam splitter 12 is the reflected light of different focusing positions on that surface, so the first information output by the area-array spectral sensor 15 represents the wavelengths and intensities of the second reflected light corresponding to the different focusing positions on the surface of the measured object 2. Having determined from the first information the first wavelength corresponding to each of at least two focusing positions on the surface of the measured object, the processor determines the distribution characteristics of the at least two first wavelengths from those per-position first wavelengths. Here, the distribution characteristics represent how the first wavelengths corresponding to the different focusing positions on the surface of the measured object are distributed. In practical applications, the at least two first wavelengths can be clustered to obtain a clustering result, from which the distribution characteristics corresponding to the at least two first wavelengths are determined.
The processor determines the distance between the light source 11 and the measured object 2 based on the distribution characteristics of the at least two first wavelengths and on the set correspondence between wavelength and calibrated distance.
Here, the processor may determine at least two valid first wavelengths from the distribution characteristics of the at least two first wavelengths, then determine, from the set correspondence between wavelength and calibrated distance, the first calibrated distance corresponding to each valid first wavelength, and calculate the distance between the light source 11 and the measured object 2 from the determined at least two first calibrated distances. The processor may compute the mean, variance, or similar statistics of the determined first calibrated distances to obtain the distance between the light source 11 and the measured object 2. A valid first wavelength is one within a set allowable error range; it may also be one whose wavelength difference lies within a set range.
The processor may instead determine, from the set correspondence between wavelength and calibrated distance, the first calibrated distance corresponding to each of the at least one first wavelengths; determine at least two valid first wavelengths from the distribution characteristics of the at least two first wavelengths; select, from the first calibrated distances corresponding to the first wavelengths, the valid first calibrated distance corresponding to each valid first wavelength; and calculate the distance between the light source 11 and the measured object 2 from the determined at least two valid first calibrated distances, for example via their mean, variance, or similar statistics. A valid first calibrated distance is one within a set allowable error range; it may also be one whose distance difference lies within a set range.
In practical applications, the method of determining at least two valid first wavelengths from the distribution characteristics of the at least two first wavelengths includes one of the following:
When the number of focusing positions whose first wavelengths fall within at least one set wavelength interval is greater than or equal to a set number, the first wavelengths within that set wavelength interval are determined to be valid. The set number is determined from the total number of focusing positions; for example, the set number may be 80% of the total number of focusing positions, and it is an integer. In this process, when the difference between the first wavelength corresponding to a first focusing position and the first wavelengths corresponding to each of at least two second focusing positions is greater than or equal to a set threshold, the processor may discard the first wavelength corresponding to the first focusing position.
When the number of first wavelengths within at least one set wavelength interval is greater than or equal to a set number, the first wavelengths within that set wavelength interval are determined to be valid. The set number is determined from the total number of first wavelengths; for example, the set number may be 80% of the total number of first wavelengths, and it is an integer. The set wavelength interval may be set based on the set waveband, or determined from the at least two first wavelengths.
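The validity rule above — keep first wavelengths only when enough of them fall inside a set wavelength interval — can be sketched as follows. The interval bounds, 80% ratio, and sample wavelengths are illustrative assumptions, not values from the patent.

```python
import math

def valid_first_wavelengths(first_wls, interval, ratio=0.8):
    """Return the first wavelengths inside `interval` (lo, hi), but only when
    their count reaches the set number (`ratio` of the total number of first
    wavelengths, as an integer); otherwise none in the interval is valid."""
    lo, hi = interval
    inside = [w for w in first_wls if lo <= w <= hi]
    set_number = math.floor(ratio * len(first_wls))  # integer threshold
    return inside if len(inside) >= set_number else []

# Hypothetical per-focusing-position first wavelengths (nm); 1450.0 is an
# outlier (e.g. a bump on the object surface) and is discarded.
first_wls = [1098.0, 1100.0, 1101.0, 1099.0, 1450.0]
print(valid_first_wavelengths(first_wls, (1090.0, 1110.0)))
# [1098.0, 1100.0, 1101.0, 1099.0]
```

Each surviving wavelength would then be mapped to its first calibrated distance and the results averaged, as described above.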
In this embodiment, based on the first information output by the area-array spectral sensor and on the set correspondence between wavelength and calibrated distance, the processor can determine the distances between the different focusing positions on the surface of the measured object 2 and the light source, and from them finally determine the distance between the light source 11 of the distance measuring apparatus and the measured object 2. In this process, first wavelengths (or the corresponding distances) with large errors can be discarded based on the distribution characteristics of the at least two first wavelengths; thus, when the surface of the measured object 2 is uneven, for example bumpy, the measurement error of the distance between the light source 11 and the measured object 2 can be reduced and the measurement precision improved.
It should be noted that when the processor configured to execute the above distance measuring method based on any of the above distance measuring apparatuses is arranged in an electronic device, after measuring the first distance between the light source and the surface of the measured object with the above distance measuring method, and having determined a first focus distance based on that first distance and on the positional relationship between the lens group in the camera of the electronic device and the light source in the distance measuring apparatus, the processor may calibrate the first focus distance with a corresponding compensation value to obtain a second focus distance, and send the second focus distance to the focusing motor in the camera, so that upon receiving it the focusing motor drives the lens group in the camera to the corresponding focus position based on the second focus distance, completing autofocus. The corresponding compensation value is determined based on a third focus distance measured with a first focus method supported by the electronic device and on a fourth focus distance; the fourth focus distance is the focus distance measured with the distance measuring apparatus under the measurement conditions corresponding to the first focus method.
Here, in different environments, the processor measures the distance between the measured object and the camera lens with the first focus method supported by the electronic device and determines the corresponding third focus distance from that distance. Under the measurement conditions corresponding to the first focus method, the processor measures the distance between the measured object and the camera lens with the distance measuring apparatus in the camera and determines the corresponding fourth focus distance from that distance. The measurement conditions are parameters of the measurement environment, including at least the distance between the electronic device and the measured object, and possibly also color temperature, brightness, and the like.
The processor calibrates the fourth focus distance against the third focus distance and determines the compensation value corresponding to the distance measuring apparatus.
Here, when the third focus distance is greater than the fourth focus distance, the corresponding compensation value is positive; when the third focus distance is smaller than the fourth focus distance, the corresponding compensation value is negative. The absolute value of the compensation value equals the absolute value of the difference between the third and fourth focus distances. When the third focus distance equals the fourth focus distance, the corresponding compensation value is zero.
Having determined the first focus distance, the processor calibrates it with the corresponding compensation value to obtain the second focus distance.
Here, when the compensation value corresponding to the distance measuring apparatus is negative, the absolute value of the compensation value is subtracted from the first focus distance to obtain the second focus distance.
When the compensation value corresponding to the distance measuring apparatus is positive, the absolute value of the compensation value is added to the first focus distance to obtain the second focus distance.
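The signed compensation above can be sketched as follows; since the compensation value carries the sign of (third − fourth), adding |comp| when positive and subtracting |comp| when negative is equivalent to adding the signed value. The millimeter values are hypothetical.

```python
def compensation_value(third_focus, fourth_focus):
    """Compensation determined by calibrating the fourth focus distance
    (measured by the distance measuring apparatus) against the third focus
    distance (measured with a supported first focus method): positive when
    the third exceeds the fourth, negative when smaller, zero when equal."""
    return third_focus - fourth_focus

def second_focus_distance(first_focus, comp):
    """Apply the compensation to the first focus distance: add |comp| for a
    positive compensation, subtract |comp| for a negative one."""
    return first_focus + comp

comp = compensation_value(12.5, 12.25)  # positive: +0.25 mm
print(second_focus_distance(10.0, comp))  # 10.25
```

A mapping from measurement conditions (color temperature, brightness) to such compensation values could be stored, as the following paragraph describes.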
It should be noted that when the measurement conditions include color temperature or brightness, the processor, while calibrating the fourth focus distance against the third focus distance and determining the compensation value corresponding to the distance measuring apparatus, can establish an association between the corresponding measurement conditions (e.g. color temperature, brightness) and the corresponding compensation values. When the processor later determines the first focus distance from the first distance measured with the distance measuring apparatus under the current measurement conditions, it determines the compensation value corresponding to those conditions from the color temperature, brightness, etc. of the measurement conditions corresponding to the first focus distance, and calibrates the first focus distance with that compensation value to obtain the second focus distance. This improves the calibration precision and hence the accuracy of the second focus distance.
To implement the distance measuring method of the embodiments of the present application, an embodiment of the present application further provides a camera. Referring to FIG. 5, which shows a schematic diagram of the hardware structure of a camera according to an embodiment of the present application, the camera includes: a lens group 51, a focusing motor 52, and the distance measuring apparatus 53 of any of the above embodiments, wherein
the focusing motor 52 is configured to drive the lens group 51 to the corresponding focus position based on a focus distance; the focus distance is determined based on the positional relationship between the lens group 51 and the light source in the distance measuring apparatus 53 and on a first distance; and the first distance represents the distance between the light source in the distance measuring apparatus 53 and the measured object.
Here, a calculation equation can be configured based on the positional relationship between the lens group 51 and the light source in the distance measuring apparatus 53, and the corresponding focus distance is obtained by substituting the first distance into that equation. The first distance is determined from the first information output by the distance measuring apparatus 53, the first information representing at least the correspondence between wavelength and light intensity. When the camera is equipped with a processor, the first distance may be calculated by the processor in the camera; it may also be calculated by the processor of the electronic device used with the camera. For the method of calculating the first distance for each type of distance measuring apparatus, reference is made to the description above of calculating the distance between the light source and the measured object, which is not repeated here.
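The patent leaves the exact calculation equation to the configured positional relationship between the lens group 51 and the light source. A minimal sketch, under the assumption (not stated in the patent) that the lens group sits at a fixed axial offset from the ranging light source, so the focus distance is simply the first distance minus that offset:

```python
def focus_distance(first_distance_mm, axial_offset_mm):
    """Hypothetical calculation equation: the first distance is measured from
    the ranging light source, so subtracting the assumed fixed axial offset
    between the light source and the lens group yields the focus distance."""
    return first_distance_mm - axial_offset_mm

print(focus_distance(15.0, 2.5))  # 12.5
```

A real camera would configure this equation from the module geometry; lateral offsets or non-rigid mounts would need a different relationship.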
Upon obtaining the focus distance, the focusing motor 52 drives the lens group 51 to the corresponding focus position based on the focus distance, completing autofocus.
It should be noted that the camera of this embodiment can autofocus in close-range zoom shooting mode, in wide-angle macro shooting mode, and in short-distance focus mode.
In the solution of this embodiment, the distance between the light source in the distance measuring apparatus and the measured object can be determined from the first information output by the apparatus, improving the accuracy of that distance and hence of the focus distance determined from it; autofocus based on that focus distance works well and improves the sharpness of images captured in autofocus mode.
When the spectral sensor in the distance measuring apparatus of the camera is an area-array spectral sensor, the corresponding distance can be finally determined from the wavelengths and light intensities corresponding to different focusing positions on the surface of the measured object, so images captured with autofocus remain sharp even when the surface of the measured object is uneven.
In an embodiment, the camera includes a wide-angle camera or a macro camera.
Here, when the wide-angle camera is in telephoto shooting mode, the dispersion range of the lens group in the distance measuring apparatus 53 can be enlarged; when the wide-angle camera is in macro shooting mode, the dispersion range of the lens group in the distance measuring apparatus 53 can be shortened; when the macro camera is in macro shooting mode, the dispersion range of the lens group in the distance measuring apparatus 53 is shortened.
In the solution of this embodiment, in close-range shooting scenarios, since the distance measured by the distance measuring apparatus is accurate, clearer images can be obtained when autofocusing based on that distance, improving the quality of close-range images. The lens group in the distance measuring apparatus includes at least two dispersive lenses and can be configured to adjust the dispersion range, thereby adjusting the focusing positions of light of different wavelengths and hence the focus distances corresponding to different wavelengths; in close-range zoom shooting scenarios, the distance measured by the apparatus is accurate, and images captured by the camera using that distance for autofocus are sharp.
To implement the distance measuring method of the embodiments of the present application, an embodiment of the present application further provides a camera. As shown in FIG. 6, the camera includes:
a communication interface 61 capable of exchanging information with other devices such as electronic devices;
a processor 62 connected to the communication interface 61 to exchange information with other devices, and configured to execute, when running a computer program, the distance measuring method provided by one or more of the above technical solutions, the computer program being stored in a memory 63; and
a distance measuring apparatus 65 capable of exchanging information with the processor 62; for example, the processor 62 sends the focus distance determined based on the first distance to a focusing motor 66.
Of course, in practical applications, the components of the camera are coupled together through a bus system 64. It is understood that the bus system 64 is configured to enable connection and communication between these components; besides a data bus, it includes a power bus, a control bus, and a status signal bus. For clarity, however, the various buses are all labeled as bus system 64 in FIG. 6.
The memory 63 in the embodiment of the present application is configured to store various types of data to support the operation of the camera; examples of such data include any computer program configured to operate on the camera.
To implement the distance measuring method of the embodiments of the present application, an embodiment of the present application further provides an electronic device. FIG. 7 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application; as shown in FIG. 7, the electronic device includes:
a communication interface 71 capable of exchanging information with other devices such as network devices;
a processor 72 connected to the communication interface 71 to exchange information with other devices, and configured to execute, when running a computer program, the distance measuring method provided by one or more of the above technical solutions, the computer program being stored in a memory 73; and
a camera 75 capable of exchanging information with the processor 72; for example, when determining the corresponding focus distance using the distance measured with the above distance measuring method, the processor 72 sends the determined focus distance to the camera 75. The camera 75 is the camera provided by any of the above technical solutions.
Of course, in practical applications, the components of the electronic device are coupled together through a bus system 74. It is understood that the bus system 74 is configured to enable connection and communication between these components; besides a data bus, it includes a power bus, a control bus, and a status signal bus. For clarity, however, the various buses are all labeled as bus system 74 in FIG. 7.
The memory 73 in the embodiment of the present application is configured to store various types of data to support the operation of the electronic device; examples of such data include any computer program configured to operate on the electronic device.
It is understood that the memory 63 or the memory 73 may be a volatile memory or a non-volatile memory, and may also include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a magnetic random access memory (FRAM, ferromagnetic random access memory), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory may be a disk memory or a tape memory. The volatile memory may be a random access memory (RAM), used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static random access memory (SRAM), synchronous static random access memory (SSRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), SyncLink dynamic random access memory (SLDRAM), and direct Rambus random access memory (DRRAM). The memories described in the embodiments of the present application are intended to include, but are not limited to, these and any other suitable types of memory.
The methods disclosed in the above embodiments of the present application may be applied to, or implemented by, the processor 62 or the processor 72. The processor 62 or the processor 72 may be an integrated circuit chip with signal processing capability; in implementation, the steps of the above methods may be completed by integrated logic circuits of hardware in the processor 62 or the processor 72, or by instructions in the form of software. The processor 62 or the processor 72 may be a general-purpose processor, a DSP, or another programmable logic device, discrete gate or transistor logic device, discrete hardware component, or the like; it may implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium; the storage medium may be located in the memory 63, with the processor 62 reading the program in the memory 63 and completing the steps of the foregoing methods in combination with its hardware, or in the memory 73, with the processor 72 reading the program in the memory 73 and completing the steps of the foregoing methods in combination with its hardware.
When the processor 62 or the processor 72 executes the program, the corresponding flow of each method of the embodiments of the present application is implemented, which is not repeated here for brevity.
In an exemplary embodiment, an embodiment of the present application further provides a storage medium, that is, a computer storage medium, specifically a computer-readable storage medium, for example a memory 63 storing a computer program executable by the processor 62, or a memory 73 storing a computer program executable by the processor 72, to complete the steps of the foregoing methods. The computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disc, or CD-ROM.
In the several embodiments provided in the present application, it should be understood that the disclosed devices and methods may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other divisions in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and components displayed as units may or may not be physical units, i.e. they may be located in one place or distributed over multiple network units; some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may all be integrated into one processing module, or each unit may serve separately as one unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by hardware related to program instructions; the aforementioned program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments; and the aforementioned storage medium includes various media that can store program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that "first", "second", and the like are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence.
In addition, the technical solutions and technical features recorded in the embodiments of the present application may be combined arbitrarily without conflict.
The above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto; any changes or substitutions readily conceivable by those skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

  1. A distance measuring apparatus, comprising:
    a light source configured to emit probe light in a set waveband;
    a beam splitter configured to transmit the probe light and output transmitted light corresponding to the probe light, the probe light being incident on a first surface of the beam splitter;
    a lens group comprising at least one dispersive lens, configured to disperse the light transmitted from the beam splitter to the lens group, so as to focus light of different wavelengths to different positions;
    a first light limiter having a light-passing region that allows first reflected light to pass, the first reflected light being the light reflected at a second surface of the beam splitter when second reflected light is transmitted from the lens group to the second surface, the second reflected light being the light reflected at the surface of a measured object by the light focused onto that surface, and the second surface being arranged opposite the first surface; and
    a spectral sensor configured to output first information upon receiving the first reflected light, the first information representing at least the light intensity corresponding to the wavelength of the first reflected light and being configured to determine the distance between the light source and the measured object.
  2. The distance measuring apparatus according to claim 1, wherein the light-passing region of the first light limiter is a through hole, and the spectral sensor is correspondingly a dot-matrix spectral sensor.
  3. The distance measuring apparatus according to claim 1, further comprising: a first lens and a second light limiter arranged between the light source and the beam splitter;
    the first lens being configured to convert first probe light incident on the first lens into corresponding second probe light, the second probe light being a parallel beam corresponding to the first probe light; and
    the second light limiter having a slit that allows the second probe light to pass, the second probe light passing through the slit of the second light limiter being configured to be incident on the beam splitter.
  4. The distance measuring apparatus according to claim 3, wherein the first lens comprises at least one collimating lens and at least one cylindrical lens arranged opposite each other, wherein
    the collimating lens is configured to convert the first probe light into a parallel beam; and
    the cylindrical lens is configured to converge the parallel beam into the corresponding second probe light.
  5. The distance measuring apparatus according to claim 3, wherein the light-passing region of the first light limiter is a slit; the spectral sensor is correspondingly an area-array spectral sensor; and the first information further represents the focusing position, on the surface of the measured object, of the second reflected light corresponding to the first reflected light.
  6. The distance measuring apparatus according to any one of claims 1 to 5, wherein the lens group comprises at least two dispersive lenses, and the lens group is further configured to adjust a dispersion range, wherein
    the dispersion range represents, when transmitted light is dispersed by the lens group, the range of distances between the lens group and the focusing positions of light of different wavelengths in the corresponding transmitted light.
  7. The distance measuring apparatus according to claim 1, wherein the transmittance of the first surface of the beam splitter is greater than its reflectance.
  8. The distance measuring apparatus according to claim 1 or 7, wherein the first surface of the beam splitter is coated with at least one anti-reflection layer, and the second surface of the beam splitter is coated with at least one reflection-enhancing layer.
  9. The distance measuring apparatus according to claim 1, wherein the first information comprises a spectral curve representing the correspondence between the wavelength and the light intensity of the first reflected light.
  10. A distance measuring method using the distance measuring apparatus according to any one of claims 1 to 9, comprising:
    determining, based on the first information, at least one first wavelength of the light focused onto the surface of the measured object; and
    determining the distance between the light source and the measured object based on a set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength, wherein
    at a same focusing position on the surface of the measured object, the first wavelength corresponds to the maximum light intensity.
  11. The distance measuring method according to claim 10, wherein when the spectral sensor is a dot-matrix spectral sensor, one first wavelength of the light focused onto the surface of the measured object is determined; and determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength comprises:
    determining, based on the set correspondence between wavelength and calibrated distance, a first calibrated distance corresponding to the first wavelength, the first calibrated distance representing the distance between the light source and the measured object.
  12. The distance measuring method according to claim 10, wherein when the spectral sensor is an area-array spectral sensor, at least two first wavelengths of the light focused onto the surface of the measured object are determined; and determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength comprises:
    determining the distance between the light source and the measured object based on the distribution characteristics of the at least two first wavelengths and on the set correspondence between wavelength and calibrated distance.
  13. The distance measuring method according to claim 10, further comprising:
    clustering the at least two first wavelengths to obtain a clustering result; and
    determining the distribution characteristics corresponding to the at least two first wavelengths based on the clustering result.
  14. A camera, comprising:
    a lens group, a focusing motor, and the distance measuring apparatus according to any one of claims 1 to 9, wherein
    the focusing motor is configured to drive the lens group to a corresponding focus position based on a focus distance; the focus distance is determined based on the positional relationship between the lens group and the light source in the distance measuring apparatus and on a first distance; and the first distance represents the distance between the light source in the distance measuring apparatus and the measured object.
  15. A camera, comprising:
    a processor, a memory configured to store a computer program runnable on the processor, and the distance measuring apparatus according to any one of claims 1 to 9, wherein
    the processor is configured to execute the steps of the distance measuring method according to any one of claims 10 to 13 when running the computer program.
  16. An electronic device, comprising:
    a processor, a memory configured to store a computer program runnable on the processor, and the camera according to claim 14, wherein
    the processor is configured, when running the computer program, to execute:
    determining, based on the first information, at least one first wavelength of the light focused onto the surface of the measured object; and
    determining the distance between the light source and the measured object based on a set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength, wherein
    at a same focusing position on the surface of the measured object, the first wavelength corresponds to the maximum light intensity.
  17. The electronic device according to claim 16, wherein when the spectral sensor is a dot-matrix spectral sensor, one first wavelength of the light focused onto the surface of the measured object is determined; and determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength comprises:
    determining, based on the set correspondence between wavelength and calibrated distance, a first calibrated distance corresponding to the first wavelength, the first calibrated distance representing the distance between the light source and the measured object.
  18. The electronic device according to claim 16, wherein when the spectral sensor is an area-array spectral sensor, at least two first wavelengths of the light focused onto the surface of the measured object are determined; and determining the distance between the light source and the measured object based on the set correspondence between wavelength and calibrated distance and on the determined at least one first wavelength comprises:
    determining the distance between the light source and the measured object based on the distribution characteristics of the at least two first wavelengths and on the set correspondence between wavelength and calibrated distance.
  19. The electronic device according to claim 18, wherein the processor further executes:
    clustering the at least two first wavelengths to obtain a clustering result; and
    determining the distribution characteristics corresponding to the at least two first wavelengths based on the clustering result.
  20. A storage medium storing a computer program which, when executed by a processor, implements the steps of the distance measuring method according to any one of claims 10 to 13.
PCT/CN2021/105726 2020-09-02 2021-07-12 Distance measuring apparatus, distance measuring method, camera and electronic device WO2022048311A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21863379.0A EP4194897A4 (en) 2020-09-02 2021-07-12 DISTANCE MEASURING DEVICE, DISTANCE MEASURING METHOD, CAMERA AND ELECTRONIC DEVICE
US18/090,062 US20230194667A1 (en) 2020-09-02 2022-12-28 Distance Measuring Apparatus, Distance Measuring Method, Camera and Electronic Device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010911834.3 2020-09-02
CN202010911834.3A CN112147622B (zh) Distance measuring apparatus, distance measuring method, camera and electronic device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/090,062 Continuation US20230194667A1 (en) 2020-09-02 2022-12-28 Distance Measuring Apparatus, Distance Measuring Method, Camera and Electronic Device

Publications (1)

Publication Number Publication Date
WO2022048311A1 true WO2022048311A1 (zh) 2022-03-10

Family

ID=73890486

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/105726 WO2022048311A1 (zh) 2020-09-02 2021-07-12 测距装置、测距方法、摄像头及电子设备

Country Status (4)

Country Link
US (1) US20230194667A1 (zh)
EP (1) EP4194897A4 (zh)
CN (1) CN112147622B (zh)
WO (1) WO2022048311A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112147622B (zh) * 2020-09-02 2024-02-06 Oppo广东移动通信有限公司 Distance measuring apparatus, distance measuring method, camera and electronic device
CN114001931B (zh) * 2021-11-02 2024-04-30 Oppo广东移动通信有限公司 Test apparatus and test method for an imaging assembly
CN114666509A (zh) * 2022-04-08 2022-06-24 Oppo广东移动通信有限公司 Image acquisition method and apparatus, detection module, terminal and storage medium

Citations (9)

Publication number Priority date Publication date Assignee Title
JP2000205998A (ja) * 1999-01-13 2000-07-28 Nikon Corp Reflection eccentricity measuring apparatus
CN103267482A (zh) * 2013-04-08 2013-08-28 辽宁科旺光电科技有限公司 High-precision displacement detection apparatus and method
CN104034268A (zh) * 2014-07-01 2014-09-10 西安工业大学 Double-slit interference fringe decoding spectral confocal displacement sensor and displacement measurement method thereof
CN106802129A (zh) * 2016-12-30 2017-06-06 中国科学院光电研究院 High-resolution and self-calibrating spectral confocal displacement measurement system
CN107044822A (zh) * 2016-02-05 2017-08-15 株式会社三丰 Chromatic confocal sensor and measurement method
CN108981579A (zh) * 2018-07-25 2018-12-11 浙江大学 Spectral confocal measurement system and method for large-range measurement
CN110849271A (zh) * 2019-12-23 2020-02-28 海伯森技术(深圳)有限公司 Spectral confocal measurement system and method
CN211012841U (zh) * 2019-12-23 2020-07-14 海伯森技术(深圳)有限公司 Spectral confocal measurement system
CN112147622A (zh) * 2020-09-02 2020-12-29 Oppo(重庆)智能科技有限公司 Distance measuring apparatus, distance measuring method, camera and electronic device

Family Cites Families (16)

Publication number Priority date Publication date Assignee Title
IL146174A (en) * 2001-10-25 2007-08-19 Camtek Ltd Confocal system for testing woofers
DE10242374A1 (de) * 2002-09-12 2004-04-01 Siemens Ag Confocal distance sensor
TWI490444B (zh) * 2009-01-23 2015-07-01 Univ Nat Taipei Technology Linear multi-wavelength confocal microscopy method and system
DE102011083718A1 (de) * 2011-09-29 2013-04-04 Siemens Aktiengesellschaft Confocal spectrometer and method for imaging in a confocal spectrometer
CN103018010B (zh) * 2012-11-30 2016-01-13 北京振兴计量测试研究所 Light source spectrum modulation apparatus
CN106017340A (zh) * 2016-07-06 2016-10-12 北京大恒图像视觉有限公司 Machine-vision-based apparatus and method for detecting the wall thickness of light-transmitting containers
CN106907998B (zh) * 2017-03-20 2018-07-06 深圳立仪科技有限公司 Linearly optimized spectral confocal measurement apparatus and method
CN107084665B (zh) * 2017-05-02 2019-03-19 中北大学 Spectral confocal displacement sensor
CN207300162U (zh) * 2017-09-14 2018-05-01 东莞市三姆森光电科技有限公司 Non-contact displacement measurement apparatus based on spectral wavelength
CN107462405B (zh) * 2017-09-27 2019-03-19 北京理工大学 Broadband differential confocal method and apparatus for measuring the refractive index of infrared lens elements
CN107870149B (zh) * 2017-11-01 2020-07-31 武汉能斯特科技有限公司 Spectrum measurement method and apparatus and use thereof
JP6986235B2 (ja) * 2018-12-20 2021-12-22 オムロン株式会社 Confocal sensor
CN210293627U (zh) * 2019-09-10 2020-04-10 宁波法里奥光学科技发展有限公司 Multi-wavelength lens refractive index detection apparatus
CN110553820B (zh) * 2019-09-10 2024-01-05 宁波法里奥光学科技发展有限公司 Multi-wavelength lens refractive index detection apparatus and method
CN111220090A (zh) * 2020-03-25 2020-06-02 宁波五维检测科技有限公司 Line-focus differential chromatic confocal system and method for measuring three-dimensional surface topography
CN111486953A (zh) * 2020-06-02 2020-08-04 南京引创光电科技有限公司 Optical measurement system


Non-Patent Citations (1)

Title
See also references of EP4194897A4 *

Also Published As

Publication number Publication date
EP4194897A4 (en) 2024-01-24
US20230194667A1 (en) 2023-06-22
CN112147622A (zh) 2020-12-29
EP4194897A1 (en) 2023-06-14
CN112147622B (zh) 2024-02-06

Similar Documents

Publication Publication Date Title
WO2022048311A1 (zh) Distance measuring apparatus, distance measuring method, camera and electronic device
JP6087993B2 (ja) Method and apparatus for image scanning
TWI406025B (zh) Automatic focusing apparatus and method
JP4972960B2 (ja) Focus adjustment device and imaging device
US10969299B2 (en) Lens refractive index detection device and method
KR20150084656A (ko) Information processing apparatus and method
CN109813435B (zh) Static light reflection microscopic thermal imaging method, apparatus, and terminal device
WO2013091404A1 (zh) Normal-incidence broadband polarization spectrometer including a reference beam and optical measurement system
US10175041B2 (en) Measuring head and eccentricity measuring device including the same
JP7504189B2 (ja) Film thickness measuring apparatus and film thickness measuring method
WO2019159427A1 (ja) Camera module adjustment apparatus and camera module adjustment method
CN114923671A (zh) Apparatus and method for measuring the spectral transmittance of an infrared optical system
JP5733846B2 (ja) Thickness measuring apparatus and method using digital optical technology
JP4547526B2 (ja) Microscope focus control device and control method
JP5157073B2 (ja) Focus adjustment device and imaging device
JP5794665B2 (ja) Imaging device
JP2007033653A (ja) Focus detection device and imaging device using the same
JPS60233610A (ja) Distance measuring device
JP2017219791A (ja) Control device, imaging device, control method, program, and storage medium
US5845159A (en) Optical apparatus
US20230090825A1 (en) Optical element assembly, optical apparatus, estimation method, and non-transitory storage medium storing estimation program
TWI843228B (zh) Measurement system and method
JP4933274B2 (ja) Focus adjustment device, control method therefor, and imaging device
US6389229B1 (en) Optical fstop/resolution apparatus and method for specified depth-of-field
JP2016008816A (ja) Shape measuring device, control method of shape measuring device, and shape measuring program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21863379

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021863379

Country of ref document: EP

Effective date: 20230307

NENP Non-entry into the national phase

Ref country code: DE