WO2023143593A1 - Laser system and laser measurement method


Info

Publication number: WO2023143593A1
Application number: PCT/CN2023/073760
Authority: WO (WIPO, PCT)
Other languages: English (en), French (fr)
Inventors: 陈如新, 杜德涛
Original assignee / applicant: 睿镞科技(北京)有限责任公司
Priority claimed from: CN202210113638.0A (external priority; see also CN116559825B)
Prior art keywords: light, scanning, signal, view, emitted light

Classifications

    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • G01S7/4814: Constructional features of transmitters alone
    • G01S7/4815: Constructional features of transmitters alone using multiple transmitters
    • G01S7/4817: Constructional features relating to scanning
    • G01S7/484: Details of pulse systems; transmitters
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles

Definitions

  • the present disclosure relates to the field of radar technology, and more particularly, to a laser system and a laser measurement method.
  • Radar is an electronic device that uses electromagnetic waves to detect target objects.
  • A radar emits electromagnetic waves toward a target object and receives their echoes; after processing, information such as the distance, azimuth, and height of the target object relative to the emission point can be obtained.
  • A lidar is a radar that uses laser light as its working beam.
  • In a conventional lidar, the receiving field of view and the transmitting field of view are essentially the same size.
  • Along the scanning direction, the lidar has multiple transmitting fields of view, so it must match an equal number of receiving fields of view; that is, within the preset time period each transmitting field of view has exactly one receiving field of view corresponding to it.
  • To match the transmitting field of view and the receiving field of view synchronously and precisely at high speed, the lidar needs a complex control system that makes the optical scanning part deflect the emitted light and the reflected light accurately. This not only significantly increases the complexity of the entire lidar but also increases its cost, and the higher the resolution, the higher the complexity and cost become.
  • the present disclosure relates to laser systems and laser measurement methods.
  • a laser system may include:
  • a light emitting component that generates emission signals and, according to those signals, sequentially emits multiple groups of emitted light within the scanning duration of the current frame, where the emission signals include time information representing the emission start moment of each group of emitted light;
  • a receiving end component that converts at least one group of reflected light, produced when the emitted light is reflected by at least one target object in the target scene, into an output signal, where the output signal is an electrical signal;
  • wherein the position of the receiving field of view of the receiving end component in the target scene changes according to a first specified law and/or the shape of the receiving field of view changes according to a second specified law;
  • wherein, from the emission start moment of the corresponding emitted light, the emitting field of view of the light emitting component is located within the current receiving field of view throughout the preset receiving duration, and the area of the receiving field of view is greater than or equal to twice the area of the emitting field of view;
  • the first specified rule includes a change along a specified direction;
  • the emitting field of view is the projection area of each group of emitted light in the target scene, and the receiving field of view is the area in the target scene corresponding to all light beams that can be received by the receiving end component within the preset receiving duration.
  • the laser measurement method may include:
  • the position of the receiving field of view in the target scene changes according to a first specified law and/or the shape of the receiving field of view changes according to a second specified law; from the emission start moment of each group of emitted light, the emitting field of view is located within the current receiving field of view throughout the preset receiving duration, and the area of the receiving field of view is greater than or equal to twice the area of the emitting field of view;
  • the first specified rule includes changing along a specified direction; the emitting field of view is the projection area of each group of emitted light in the target scene, and the receiving field of view is the area in the target scene corresponding to all light beams that can be converted into the output signal within the preset receiving duration.
  • Because, from the emission start moment of the corresponding emitted light, the emitting field of view of the light emitting component is located within the current receiving field of view of the receiving end component throughout the preset receiving duration, and the area of the receiving field of view is greater than or equal to twice the area of the emitting field of view, it is not necessary to use an optical scanning component to match the emitting field of view and the receiving field of view precisely and synchronously at high speed. The complexity and cost of the entire system can therefore be reduced while the resolution is preserved.
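  • The following is a minimal illustrative sketch (not part of the disclosure) that models the two fields of view as axis-aligned rectangles and checks the two conditions stated above: the emitting field of view lies inside the current receiving field of view, and the area of the receiving field of view is at least twice the area of the emitting field of view. The rectangular shapes and all names are assumptions made for illustration.

```python
# Illustrative sketch only: rectangular fields of view are an assumption,
# not something the disclosure mandates.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float      # lower-left corner in scene coordinates
    y: float
    w: float      # width
    h: float      # height

    @property
    def area(self) -> float:
        return self.w * self.h

    def contains(self, other: "Rect") -> bool:
        return (self.x <= other.x and self.y <= other.y and
                other.x + other.w <= self.x + self.w and
                other.y + other.h <= self.y + self.h)

def fov_constraints_ok(emit_fov: Rect, recv_fov: Rect) -> bool:
    """Check the two conditions: containment and area ratio >= 2."""
    return recv_fov.contains(emit_fov) and recv_fov.area >= 2.0 * emit_fov.area

# Example: a strip-shaped receiving field of view holding a small emitting spot.
recv = Rect(x=0.0, y=0.0, w=2.0, h=40.0)   # strip-shaped continuous area
emit = Rect(x=0.5, y=10.0, w=1.0, h=1.0)   # one group's projection (emitting FOV)
print(fov_constraints_ok(emit, recv))       # True
```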
  • FIG. 1 is a schematic diagram of a receiving field of view and a transmitting field of view of a laser system according to an embodiment of the present disclosure;
  • FIG. 2 is a block diagram of a laser system according to an embodiment of the present disclosure;
  • FIG. 3 is a block diagram of a laser system according to another embodiment of the present disclosure;
  • FIG. 4 is a schematic diagram of the working principle of a receiving end component according to an embodiment of the present disclosure;
  • FIG. 5 is a schematic diagram of the working principle of a receiving end component according to another embodiment of the present disclosure;
  • FIG. 6 is a schematic diagram of a partial working principle of a laser system according to an embodiment of the present disclosure;
  • FIG. 7 is a schematic diagram of a receiving field of view and an emitting field of view of a laser system according to another embodiment of the present disclosure;
  • FIG. 8 is a schematic diagram of a receiving field of view and an emitting field of view of a laser system according to yet another embodiment of the present disclosure;
  • FIG. 9 is a schematic structural diagram of an optical scanning assembly according to an embodiment of the present disclosure;
  • FIG. 10 is a schematic structural diagram of an optical scanning assembly according to another embodiment of the present disclosure;
  • FIG. 11 is a schematic diagram of the working principle by which a laser system determines superpixels of the receiving field of view according to an embodiment of the present disclosure;
  • FIG. 12 is a flowchart of a laser measurement method according to an embodiment of the present disclosure.
  • An embodiment of the present disclosure provides a laser system 100 that includes a light emitting component 200 and a receiving end component 400. The light emitting component 200 generates emission signals and, according to those signals, sequentially emits multiple groups of emitted light within the scanning duration of the current frame; the emission signals include time information representing the emission start moment of each group of emitted light. The receiving end component 400 converts at least one group of reflected light, produced when the emitted light is reflected by at least one target object 600 in the target scene 700, into an output signal, the output signal being an electrical signal. Within the scanning duration of the current frame, the position of the receiving field of view 102 of the receiving end component 400 in the target scene 700 changes according to a first specified law and/or the shape of the receiving field of view 102 in the target scene 700 changes according to a second specified law. From the emission start moment of the corresponding emitted light, the emitting field of view 101 of the light emitting component 200 is located within the current receiving field of view 102 throughout the preset receiving duration, and the area of the receiving field of view 102 is greater than or equal to twice the area of the emitting field of view 101.
  • The first specified rule includes changing along the specified direction.
  • The emitting field of view 101 is the projection area of each group of emitted light in the target scene 700, and the receiving field of view 102 is the area in the target scene 700 corresponding to all light beams that can be received by the receiving end component 400 within the preset receiving duration.
  • Because, from the emission start moment of the corresponding emitted light, the emitting field of view 101 of the light emitting component 200 is located within the current receiving field of view 102 of the receiving end component 400 throughout the preset receiving duration, and the area of the receiving field of view 102 is greater than or equal to twice the area of the emitting field of view 101, it is not necessary to use the light scanning component 300 to match the emitting field of view 101 and the receiving field of view 102 precisely and synchronously at high speed. The complexity and cost of the entire system can therefore be reduced while the resolution is preserved.
  • "The position of the receiving field of view 102 in the target scene 700 changes according to the first specified law" generally means that the position of the receiving field of view 102 in the target scene 700 changes once after the light emitting component 200 has sequentially emitted several groups of emitted light. For example, if the emitting fields of view 101 corresponding to the groups of emitted light within the scanning duration of the current frame are distributed in a rectangular lattice, the receiving field of view 102 moves along the width direction of the rectangular lattice at regular intervals.
  • "The shape of the receiving field of view 102 in the target scene 700 changes according to the second specified law" generally means that the shape of the receiving field of view 102 in the target scene 700 changes once after the light emitting component 200 has sequentially emitted several groups of emitted light. For example, if the emitting fields of view 101 corresponding to the groups of emitted light within the scanning duration of the current frame are distributed in a circular dot matrix, the receiving field of view 102 can be a ring-shaped area whose width increases at regular intervals.
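  • As a concrete illustration of one possible "first specified law" (a hypothetical sketch, not the only form the disclosure allows): the receiving field of view stays fixed while a batch of emitted-light groups is fired into it and then steps once along the specified direction. The function name and step values below are invented.

```python
# Hypothetical example of a first specified law: step the receiving field of
# view along the specified (here: horizontal) direction once every N groups.
def receiving_fov_positions(num_groups: int, groups_per_step: int,
                            start_x: float, step_x: float):
    """Yield the receiving-FOV x position used for each emitted-light group."""
    for group_index in range(num_groups):
        step = group_index // groups_per_step     # FOV is constant within a step
        yield start_x + step * step_x

# 8 groups, receiving FOV moves after every 4 groups (as in the FIG. 1 example).
print(list(receiving_fov_positions(num_groups=8, groups_per_step=4,
                                   start_x=0.0, step_x=2.0)))
# [0.0, 0.0, 0.0, 0.0, 2.0, 2.0, 2.0, 2.0]
```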
  • In some embodiments, the receiving field of view 102 includes at least one strip-shaped continuous area; the emitting fields of view 101 corresponding to the groups of emitted light within the scanning duration of the current frame are distributed in a dot matrix, the length direction of the dot matrix is adapted to the length direction of the receiving field of view 102, and the width direction of the dot matrix is parallel to the specified direction.
  • Here, "the length direction of the dot matrix is adapted to the length direction of the receiving field of view 102" generally means that the length direction of the dot matrix corresponds to the length direction of the receiving field of view 102.
  • If the reflected light is not deflected before reaching the receiving end component 400, that is, the light receiving assembly 410 does not include a deflection mirror such as a 45° reflector, then the length direction of the dot matrix is parallel to the length direction of the receiving field of view 102. Otherwise, the length direction of the dot matrix is no longer parallel to the length direction of the receiving field of view 102 but is parallel to the length direction of the receiving field of view 102 after a 45° deflection; in other words, the angle between the length direction of the dot matrix and the length direction of the receiving field of view 102 is greater than zero.
  • Consider first the case in which the receiving field of view 102 is a single continuous strip-shaped area. Because the area of the receiving field of view 102 corresponding to each group of emitted light is greater than or equal to twice the area of the emitting field of view 101, that is, the area of the receiving field of view 102 is much larger than the area of the emitting field of view 101, the emission angle of the emitted light and the direction in which the reflected light travels to the receiving end assembly 400 do not need to be controlled precisely; in other words, the emitting field of view 101 and the receiving field of view 102 do not need to be matched precisely.
  • Therefore, the laser system in the embodiment of the present disclosure does not need to precisely deflect the emitted light and the reflected light through the light scanning component 300 in order to match the emitting field of view 101 and the receiving field of view 102 precisely and synchronously.
  • Suppose the number of groups of emitted light emitted by the light emitting component 200 within the scanning duration of the current frame is greater than four, and take the first four groups as an example. Within the specified period, that is, from the emission start moment of the first group of emitted light until the end of the preset receiving duration following the emission of the fourth group, the position of the receiving field of view 102 of the receiving end component 400 in the target scene 700 does not change. In other words, the emitting fields of view 101 corresponding to the four groups of emitted light sequentially emitted by the light emitting component 200 correspond to the same receiving field of view 102, and the receiving field of view 102 changes its position along the specified direction once per such period.
  • Assume that each area defined by a dotted circle in the target scene 700 in FIG. 1 is an emitting field of view 101, and that the area defined by the dotted rectangular box in the target scene 700 in FIG. 1 is the area corresponding to all light beams that can be received by the receiving end component 400, that is, the current receiving field of view 102.
  • For any one of the four groups of emitted light, its emitting field of view 101 can be the area defined by any dotted circle in FIG. 1, and the reflected light coming from that dotted circle can be received by the receiving end assembly 400. It can be seen that neither the transmitting field of view 101 nor the receiving field of view 102 needs to be precisely controlled in the embodiment of the present disclosure.
  • Consider next the case in which the receiving field of view 102 includes a plurality of strip-shaped continuous areas. From the emission start moment of the corresponding emitted light, the emitting field of view 101 of the light emitting component 200 is located within the current receiving field of view 102 throughout the preset receiving duration, and the length direction of the dot matrix is adapted to the length direction of the receiving field of view. Each continuous area of the receiving field of view 102 therefore corresponds one-to-one with an emitting field of view 101 of the light emitting component 200; that is, for any group of emitted light, from the emission start moment of that group, its emitting field of view 101 is located within the corresponding continuous area throughout the preset receiving duration.
  • The multiple strip-shaped continuous areas of the receiving field of view 102 exist simultaneously.
  • Therefore, the laser system in the embodiment of the present disclosure does not need the light scanning component 300 to accurately deflect the reflected light coming from the target object 600 in order to match the emitting field of view 101 and the receiving field of view 102 precisely and synchronously.
  • the "strip-shaped continuous area” generally refers to an area with an aspect ratio greater than 1, and the continuous area can be a polygonal area such as a rectangular area, or a curved area such as an S-shaped area. , or other irregularly shaped areas such as special-shaped areas, etc.
  • the ratio of the maximum width to the total length of at least one continuous region is smaller than the first ratio threshold, and the first ratio threshold is not greater than 0.5, for example, the first ratio threshold may be but not limited to 0.5, 0.1, 0.01, or 0.001.
  • the ratio of the area of the transmitting field of view 101 to the area of the receiving field of view 102 is smaller than a first ratio threshold, which may be but not limited to 0.5, 0.1, 0.01, or 0.001.
  • In some embodiments, along the length direction of the lattice, the ratio of the magnitude of the change in direction and angle between two adjacent groups of emitting fields of view 101 to the magnitude of the change in direction and angle of the receiving field of view 102 is greater than a second ratio threshold, and the second ratio threshold is not less than 1.
  • The second ratio threshold can be, but is not limited to, 1, 10, 100, 10000, or 1000000; that is to say, unlike the position of the receiving field of view 102 corresponding to each emitting field of view 101, the position change range of the emitting field of view 101 is greater than or equal to the position change range of the receiving field of view 102.
  • Here, "the magnitude of the change in direction and angle between two adjacent emitting fields of view 101" generally refers to the angular interval between the directions in which two adjacent groups of emitted light, taken along the length direction of the lattice, are projected into the target scene 700, and "the magnitude of the change in direction and angle of the receiving field of view 102" generally refers to the angle by which the receiving field of view 102 deflects along the specified direction each time.
  • For example, suppose the angle between the vertical direction and the optical path direction of the first group of emitted light directed toward the target scene 700 is α1, and the angle between the vertical direction and the optical path direction of the second group of emitted light directed toward the target scene 700 is α2.
  • The projection area of the first group of emitted light in the target scene 700, that is, the emitting field of view 101 of the first group, is located within the current receiving field of view 102 throughout the preset receiving duration, and the deflection angle of the current receiving field of view 102 along the horizontal direction is β1.
  • For the second group of emitted light, the deflection angle of the current receiving field of view 102 along the horizontal direction becomes β2, and the projection area of the second group of emitted light in the target scene 700, that is, the emitting field of view 101 of the second group, is located within the current receiving field of view 102 throughout the preset receiving duration.
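  • A small worked example of the second ratio threshold, with invented angle values: if the emission direction changes from α1 to α2 between the two adjacent groups while the horizontal deflection of the receiving field of view changes from β1 to β2, the ratio of the two changes should exceed the second ratio threshold.

```python
# Invented numbers, for illustration only.
alpha1, alpha2 = 10.0, 12.0   # emission-direction angles of two adjacent groups (degrees)
beta1, beta2 = 5.0, 5.2       # receiving-FOV horizontal deflection for those groups (degrees)

emit_change = abs(alpha2 - alpha1)        # 2.0 degrees
recv_change = abs(beta2 - beta1)          # 0.2 degrees
second_ratio_threshold = 1.0

ratio = emit_change / recv_change         # about 10
print(ratio > second_ratio_threshold)     # True: the emitting FOV moves far more than the receiving FOV
```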
  • In some embodiments, the ratio of the area of the emitting field of view 101 to the area of the target scene 700 is smaller than a third ratio threshold, and the third ratio threshold is not greater than 0.1; for example, the third ratio threshold may be, but is not limited to, 0.1, 0.01, 0.001, or 0.0001.
  • In some embodiments, the emitted light includes a plurality of light pulses, and the included angle between at least two light pulses in the emitted light is larger than a preset included angle; the ratio of the preset included angle to the viewing angle of the receiving field of view 102 is smaller than a fourth ratio threshold, and the fourth ratio threshold is not less than 0.01; for example, the fourth ratio threshold may be, but is not limited to, 0.01, 0.1, 0.3, 0.5, or 0.9.
  • In some embodiments, the ratio of the area of the target scene 700 to the area of the receiving field of view 102 is greater than or equal to a fifth ratio threshold, and the fifth ratio threshold is not less than 2; for example, the fifth ratio threshold may be, but is not limited to, 2, 4, 8, 16, 100, 1000, or 10000.
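  • The five ratio thresholds above can be summarized as simple numeric checks. The sketch below is illustrative only; the concrete threshold values and the helper name are assumptions chosen within the ranges stated above.

```python
# Illustrative consolidation of the ratio thresholds described above.
# All concrete numbers are example values permitted by the stated ranges.
thresholds = {
    "first":  0.5,    # upper bound on strip width/length and on emit/recv area ratio (threshold <= 0.5)
    "second": 1.0,    # lower bound on emit-FOV angular change / recv-FOV angular change (threshold >= 1)
    "third":  0.1,    # upper bound on emit-FOV area / target-scene area (threshold <= 0.1)
    "fourth": 0.1,    # upper bound on preset included angle / recv-FOV viewing angle (threshold >= 0.01)
    "fifth":  2.0,    # lower bound on target-scene area / recv-FOV area (threshold >= 2)
}

def check(emit_area, recv_area, scene_area, strip_w, strip_len,
          emit_angle_change, recv_angle_change, preset_angle, view_angle):
    return all([
        strip_w / strip_len < thresholds["first"],
        emit_area / recv_area < thresholds["first"],
        emit_angle_change / recv_angle_change > thresholds["second"],
        emit_area / scene_area < thresholds["third"],
        preset_angle / view_angle < thresholds["fourth"],
        scene_area / recv_area >= thresholds["fifth"],
    ])

print(check(emit_area=1.0, recv_area=80.0, scene_area=1000.0, strip_w=2.0,
            strip_len=40.0, emit_angle_change=2.0, recv_angle_change=0.2,
            preset_angle=0.5, view_angle=10.0))  # True
```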
  • In some embodiments, the receiving end component 400 includes a light receiving component 410 and a photoelectric conversion component 420; the light receiving component 410 sequentially receives the groups of reflected light reflected by the target object 600 and sequentially converts them into corresponding first optical signals, and the photoelectric conversion component 420 sequentially converts the first optical signals into corresponding first electrical signals.
  • the photoelectric conversion component 420 can adopt the following structural form, for example:
  • For example, the photoelectric conversion assembly 420 includes a photoelectric conversion element 421 and an optical element 422. The photoelectric conversion element 421 has a continuous photoelectric conversion area; the light-entry end of the optical element 422 faces the light receiving assembly 410, and the light-exit end of the optical element 422 faces the photoelectric conversion area. The light-exit end of the optical element 422 is strip-shaped, and its length direction is adapted to the length direction of the receiving field of view 102. The optical element 422 is used to selectively direct the first optical signal to the photoelectric conversion area, and the photoelectric conversion area converts the first optical signal into a first electrical signal.
  • The optical element 422 may include, but is not limited to, at least one of a microlens array, at least one diaphragm, a light cone 425, and a light guide. Here, "the light-exit end of the optical element 422 is strip-shaped and its length direction is adapted to the length direction of the receiving field of view 102" generally means that the length direction of the optical element 422 corresponds to the length direction of the receiving field of view 102.
  • If the reflected light from the target object 600 is not deflected by the light scanning assembly 300 and the light receiving assembly 410 likewise does not deflect the reflected light, that is, the light receiving assembly 410 does not include a deflection mirror such as a 45° reflector, then the length direction of the optical element 422 is parallel to the length direction of the receiving field of view 102. Otherwise, the length direction of the optical element 422 is no longer parallel to the length direction of the receiving field of view 102 but is parallel to the length direction of the receiving field of view 102 after a 45° deflection; that is, the angle between the length direction of the optical element 422 and the length direction of the receiving field of view 102 is greater than zero.
  • Taking the case in which the optical element 422 is a diaphragm as an example, the area of the photoelectric conversion region of the photoelectric conversion element 421 is greater than or equal to the area of the diaphragm. Because the light-entry end of the diaphragm faces the receiving lens 411 and the light-exit end of the diaphragm faces the photoelectric conversion area of the photoelectric conversion element 421, at least part of the reflected light coming from the area defined by the dotted rectangle in the target scene 700 in FIG. 4 can, after passing through the receiving lens 411, directly illuminate the light-entry end of the diaphragm; the first optical signal exiting the diaphragm is received by the photoelectric conversion area of the photoelectric conversion element 421, which converts it into a first electrical signal. Because the light-exit end of the diaphragm is strip-shaped with its length direction adapted to the length direction of the receiving field of view 102, and the side of the photoelectric conversion element 421 facing the receiving lens 411 has a continuous photoelectric conversion area, at least part of the reflected light coming from any position within the defined area can reach the photoelectric conversion element 421 through the diaphragm and be converted into a first electrical signal by the photoelectric conversion area.
  • Therefore, the receiving field of view 102 of the receiving end component 400 in this embodiment is the area defined by the dotted rectangular box in FIG. 4.
  • Because the reflected light, after passing in turn through the receiving lens 411 and the diaphragm, only needs to reach some position within the photoelectric conversion area, only one photoelectric conversion element 421 needs to be installed on the transmission light path of the diaphragm, and the direction of the reflected light does not need to be controlled quickly and precisely; that is, the reflected light does not need to be directed to a specific position on the photoelectric conversion component 420, which significantly reduces the complexity of the entire laser system 100.
  • Alternatively, the photoelectric conversion component 420 can adopt the following structural form:
  • For example, the photoelectric conversion assembly 420 includes a photoelectric cell array 423 and at least one optical element 422. The optical element 422 is located between the light receiving assembly 410 and the photoelectric cell array 423, and the photoelectric cell array 423 includes a plurality of photoelectric conversion units 424 arranged in sequence along a preset direction. The optical element 422 is used to deflect any first optical signal that the light receiving assembly 410 directs toward the gap between two adjacent photoelectric conversion units 424 so that it reaches a photoelectric conversion unit 424, and the photoelectric conversion unit 424 converts the first optical signal into a first electrical signal.
  • The preset direction is adapted to the length direction of the receiving field of view 102.
  • "The preset direction is adapted to the length direction of the receiving field of view 102" here has a meaning similar to the above: whether the preset direction is parallel to the length direction of the receiving field of view 102 depends on whether the reflected light is deflected by the light scanning component 300 and/or the light receiving component 410 before reaching the photoelectric cell array 423.
  • The photoelectric conversion unit 424 can be, but is not limited to, an APD (avalanche photodiode), a SPAD (single-photon avalanche diode), a SiPM (silicon photomultiplier), a PIN diode, or a PD (photodiode).
  • the photosensitive material of the photoelectric conversion unit 424 includes at least one of Si, GaAs, InP and InGaAs.
  • the optical element 422 may include, but is not limited to, at least one of a microlens array, at least one diaphragm, a light cone, and a light guide.
  • Taking the case in which the length direction of the receiving field of view 102 is the vertical direction and the optical element 422 is a microlens array as an example, the length direction of the microlens array is parallel to the vertical direction. At least part of the reflected light coming from the area defined by the dotted rectangular frame passes through the receiving lens 411 and then travels to the microlens array. A first optical signal directed toward the gap between two adjacent photoelectric conversion units 424 is refracted and deflected by the corresponding microlens in the array toward the nearby photoelectric conversion unit 424.
  • the photoelectric conversion component 420 includes a photoelectric cell array 423.
  • the photoelectric cell array 423 includes a plurality of photoelectric conversion units 424 sequentially arranged along a predetermined direction.
  • the photoelectric conversion units 424 are used to convert a first optical signal into a first electrical signal.
  • the preset direction is adapted to the length direction of the receiving field of view 102 .
  • In this embodiment, the receiving end component 400 further includes electrical amplification modules 430; the number of electrical amplification modules 430 is less than the number of photoelectric conversion units 424 in the photoelectric cell array 423, and the output terminals of at least two photoelectric conversion units 424 are connected to the input terminal of the same electrical amplification module 430.
  • Taking the case in which the photoelectric cell array 423 includes a plurality of photoelectric conversion units 424 arranged in sequence along the vertical direction as an example, the reflected light coming from any position within the dotted rectangular area in the target scene 700 in FIG. 5 passes through the receiving lens 411 and at least partly illuminates several photoelectric conversion units 424, which convert the received first optical signals into first electrical signals.
  • Because each first electrical signal is a pulse electrical signal, all the first electrical signals generated by these photoelectric conversion units 424 are input in sequence into the corresponding electrical amplification module 430 and form a continuous electrical wave signal, which the electrical amplification module 430 amplifies into a second electrical signal with a continuous waveform. Therefore, one of the continuous areas of the receiving field of view 102 in this embodiment is the area defined by the dotted rectangle in FIG. 5.
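  • The following toy sketch (with invented sample values) illustrates the behaviour described above: the pulse-shaped first electrical signals from several photoelectric conversion units, fed into one shared electrical amplification module, merge into a second electrical signal with a continuous waveform.

```python
# Toy illustration: several photoelectric conversion units each produce a short
# pulse (first electrical signal); summing them at the shared amplification
# module yields a continuous waveform (second electrical signal).
import numpy as np

t = np.linspace(0.0, 10.0, 1000)          # nanoseconds, invented time base

def pulse(center_ns: float, width_ns: float = 1.0) -> np.ndarray:
    """Gaussian-shaped first electrical signal from one photoelectric conversion unit."""
    return np.exp(-((t - center_ns) / width_ns) ** 2)

# Reflected light spreads over adjacent units, so their pulses overlap in time.
first_signals = [pulse(c) for c in (3.0, 3.6, 4.2, 4.8)]

gain = 20.0
second_signal = gain * np.sum(first_signals, axis=0)  # amplified, continuous waveform
print(second_signal.max() > gain)          # overlapping pulses merge into one continuous wave
```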
  • the photoelectric conversion component 420 includes a photoelectric cell array 423.
  • the photoelectric cell array 423 includes a plurality of photoelectric conversion units 424 arranged in sequence along a predetermined direction.
  • the photoelectric conversion unit 424 is used to convert the first optical signal into a first electrical signal.
  • the preset direction is adapted to the length direction of the receiving field of view 102 .
  • In this embodiment, the receiving end assembly 400 further includes electrical amplification modules 430, and the number of electrical amplification modules 430 is greater than or equal to the number of photoelectric conversion units 424 in the photoelectric cell array; the output terminal of each photoelectric conversion unit 424 is electrically connected to the input terminal of at least one electrical amplification module 430, and the output terminals of at least two electrical amplification modules 430 connected to different photoelectric conversion units 424 are connected together to form a common output terminal.
  • For example, each photoelectric conversion unit 424 is electrically connected to the input terminal of a different electrical amplification module 430, that is, the electrical amplification modules 430 correspond one-to-one with the photoelectric conversion units 424, and the output terminals of at least two electrical amplification modules 430 are connected together to form the common output terminal.
  • Because each first electrical signal is a pulse electrical signal, the first electrical signals generated by the photoelectric conversion units 424 are input in sequence into the corresponding electrical amplification modules 430, and the pulse electrical signals amplified by those modules are output in sequence from the common output terminal; the signal at the common output terminal therefore forms a second electrical signal with a continuous waveform.
  • In this case as well, one of the continuous areas of the receiving field of view 102 is the area defined by the dotted rectangle in FIG. 5.
  • the light receiving component 410 includes at least one lens group, and the lens group includes at least one receiving lens 411 located on the optical path of the reflected light.
  • the multiple sets of lens groups are sequentially arranged along a specified direction.
  • The mirror surface of the receiving lens 411 can be parallel to the vertical direction or can form a certain angle with the vertical direction; for example, the mirror surface of the receiving lens 411 can be inclined at 45° relative to the vertical direction.
  • In some embodiments, the laser system 100 further includes a scanning control part, a light scanning assembly 300, and a processing device 500; the scanning control part generates a scanning control signal.
  • In response to the scanning control signal, the light scanning assembly 300 deflects the direction of the emitted light from the light emitting component 200 so that it illuminates at least one target object 600 in the target scene 700, and/or deflects at least one group of reflected light reflected by the at least one target object 600 so that it is received by the receiving end component 400.
  • The processing device 500 is electrically connected to the light emitting component 200, the scanning control part, and the receiving end component 400, respectively, and is used to determine at least one of the distance of the target object 600, the direction angle of the target object 600, the reflectivity of the target object 600, and the profile of the target object 600 according to the emission signal and/or the output signal.
  • In some embodiments, the light scanning assembly 300 includes multiple optical scanning parts arranged in sequence along the optical path of the emitted light; one of two adjacent optical scanning parts deflects the direction of the emitted light and directs it toward the other optical scanning part. The scanning methods of at least two of the optical scanning parts differ, where the scanning method includes at least one of the area of the reflective surface of the optical scanning part, the scanning direction, the scanning angle range, the scanning frequency, and the scanning dimension.
  • the scanning dimension of the light scanning component 300 may be, but not limited to, one-dimensional or two-dimensional.
  • In some embodiments, the multiple optical scanning parts include a first scanning part 310 and a second scanning part 320, and the scanning direction of the first scanning part 310 differs from that of the second scanning part 320.
  • The specified direction mentioned above is the first scanning direction. Specifically, within the scanning duration of the current frame, the first scanning part 310 sequentially deflects the groups of emitted light along the second scanning direction and directs them toward the second scanning part 320; the second scanning part 320 then deflects the emitted light received from the first scanning part 310 along the first scanning direction and directs it toward the target object 600. The second scanning direction is parallel to the length direction of the receiving field of view 102, and the first scanning direction differs from the second scanning direction.
  • By combining two one-dimensional scanning parts, namely the first scanning part 310 and the second scanning part 320, the embodiment of the present disclosure performs compound scanning: two-dimensional scanning can be realized and the scanning range of the light scanning component 300 can be enlarged while keeping the cost low. A schematic sketch of the resulting scan pattern is given below.
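  • A minimal sketch of the compound scan pattern, assuming a simple raster-style ordering and invented angle values: the fast first scanning part sweeps the second scanning direction for every position of the slower second scanning part, yielding a two-dimensional grid of emission directions.

```python
# Hypothetical compound-scan pattern built from two one-dimensional scanners:
# a fast axis (first scanning part, second scanning direction) nested inside a
# slow axis (second scanning part, first scanning direction).
def compound_scan(slow_angles_deg, fast_angles_deg):
    """Yield (slow, fast) emission directions in the order they are emitted."""
    for slow in slow_angles_deg:            # e.g. rotating mirror, wide horizontal sweep
        for fast in fast_angles_deg:        # e.g. MEMS mirror, rapid vertical sweep
            yield slow, fast

slow = [-60, -30, 0, 30, 60]                # degrees, invented values
fast = [-10, -5, 0, 5, 10]
directions = list(compound_scan(slow, fast))
print(len(directions), directions[:3])      # 25 (slow, fast) angle pairs
```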
  • the first scanning part 310 and the second scanning part 320 may include, but are not limited to, at least one of a MEMS vibrating mirror, a rotating prism, a rotating wedge mirror, an optical phased array, a photoelectric deflection device, and a liquid crystal scanning part;
  • the liquid crystal scanning part includes a liquid crystal spatial light modulator, a liquid crystal supercrystal, a liquid crystal line array, a see-through one-dimensional liquid crystal array, a transmissive two-dimensional liquid crystal array or a liquid crystal display module.
  • the first scanning direction and the second scanning direction may be, but not limited to, a horizontal direction, a vertical direction or an oblique direction; wherein, the oblique direction is between the vertical direction and the horizontal direction.
  • the first scanning part 310 is a MEMS vibrating mirror 330
  • the second scanning part 320 is a rotating mirror 360
  • the first scanning direction is a horizontal direction
  • the second scanning direction is a vertical direction.
  • the rotating mirror 360 may be, but not limited to, a rotating prism or a rotating wedge mirror.
  • Because the scanning frequency of the MEMS vibrating mirror 330 is very high while the scanning frequency of the rotating mirror 360 is comparatively low, the MEMS vibrating mirror 330 rapidly deflects the groups of emitted light along the vertical direction in sequence and directs them toward the rotating mirror 360, and the rotating mirror 360 then deflects the received groups of emitted light along the horizontal direction in sequence and directs them at the target object 600 over a large horizontal scanning angle; that is, the trajectories of the groups of emitted light deflected by the rotating mirror 360 toward the target object 600 form a fan-shaped surface with a large central angle in the horizontal plane.
  • In this way, the entire optical scanning assembly 300 can realize vertical high-frequency scanning combined with horizontal wide-angle scanning, which not only improves the scanning resolution but also increases the receiving area of the reflected light: after the reflected light from the target object 600 is reflected by the rotating mirror 360 and passes through the light receiving component 410, the area it illuminates on the photoelectric conversion component 420 is increased.
  • Because the cost of the rotating mirror 360 is much lower than that of the MEMS vibrating mirror 330 while the scanning speed of the MEMS vibrating mirror 330 is higher, using the MEMS vibrating mirror 330 together with the rotating mirror 360 to deflect the emitted light not only expands the receiving field of view of the light receiving component 410 but also improves the resolving ability of the laser system with respect to the target scene 700.
  • The light scanning component 300 can also adopt other structural forms.
  • For example, the optical scanning assembly 300 includes a MEMS vibrating mirror 330 and an optical phased array 340; the optical phased array 340 is fixed on the reflective surface of the MEMS vibrating mirror 330, the light inlet of the optical phased array 340 is connected to the light emitting component 200 through a line, and the light outlet of the optical phased array 340 faces the target object 600.
  • The optical phased array 340 (Optical Phased Array, OPA for short) includes a plurality of waveguides distributed in an array, and the material of the waveguides includes at least one of silicon crystal, silicon oxide, and silicon nitride.
  • In response to the scanning control signal generated by the scanning control part, the relative feeding phase of each waveguide of the optical phased array 340 changes accordingly to produce a phase difference; the phase difference causes the emitted light to interfere and changes the direction of the optical path.
  • The MEMS vibrating mirror 330 also responds to the scanning control signal, and its mirror surface undergoes slight translational and torsional reciprocating motions. Because the optical phased array 340 is fixed on the reflective surface of the MEMS vibrating mirror 330, the optical phased array 340 as a whole moves synchronously with the mirror surface of the MEMS vibrating mirror 330, so the optical phased array 340 can realize omnidirectional scanning. In this way, the optical phased array 340 can scan based on the phase difference while also rotating as a whole, which expands the scanning range of the optical scanning assembly 300 and increases the scanning rate.
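  • For background, beam steering in a uniform optical phased array is commonly described by the standard relation sin(θ) = λ·Δφ/(2π·d) for waveguide pitch d and a constant phase increment Δφ between adjacent waveguides; this is general phased-array theory rather than a formula taken from the disclosure, and the numbers below are invented.

```python
# Standard phased-array steering relation (general background, not from the disclosure):
# for waveguide pitch d and a constant phase increment dphi between adjacent
# waveguides, the main lobe satisfies sin(theta) = wavelength * dphi / (2 * pi * d).
import math

def steering_angle_deg(wavelength_m: float, pitch_m: float, dphi_rad: float) -> float:
    s = wavelength_m * dphi_rad / (2.0 * math.pi * pitch_m)
    return math.degrees(math.asin(s))

# Invented example: 1550 nm light, 2 um pitch, pi/4 phase step between waveguides.
print(round(steering_angle_deg(1550e-9, 2e-6, math.pi / 4), 2))  # about 5.6 degrees
```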
  • As another example, the optical scanning assembly 300 includes a MEMS vibrating mirror 330 and a grating array 350, and the grating array 350 is fixed on the reflective surface of the MEMS vibrating mirror 330.
  • In this case the emission signal also includes wavelength information representing the wavelength of each group of emitted light, and the deflection direction of the emitted light is determined based on that wavelength information. Because the direction of the light beam reflected by the grating array 350 is related to the wavelength of the incident beam, the light emitting component 200 emits light of a specified wavelength according to the wavelength information, and the light is reflected in the corresponding direction after illuminating the grating array 350.
  • The MEMS vibrating mirror 330 responds to the scanning control signal, and its mirror surface undergoes slight translational and torsional reciprocating motions. Because the grating array 350 is fixed on the reflective surface of the MEMS vibrating mirror 330, the grating array 350 as a whole moves synchronously with the mirror surface of the MEMS vibrating mirror 330, so the grating array 350 can realize omnidirectional scanning. It can be seen that, by fixing the grating array 350 on the reflective surface of the MEMS vibrating mirror 330, the grating array 350 can change the direction of the optical path of the emitted light based on its wavelength while also rotating as a whole, thereby expanding the scanning range of the light scanning component 300 and increasing the scanning rate.
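  • The wavelength dependence exploited here follows the standard diffraction-grating equation d·(sin θi + sin θm) = m·λ (general optics background, not a formula from the disclosure): changing the emitted wavelength changes the outgoing angle θm. The groove spacing, incidence angle, and wavelengths below are invented.

```python
# Standard grating equation (general optics, not a formula from the disclosure):
# groove_spacing * (sin(theta_incident) + sin(theta_m)) = m * wavelength
import math

def diffraction_angle_deg(wavelength_m, groove_spacing_m, theta_incident_deg, order=1):
    s = order * wavelength_m / groove_spacing_m - math.sin(math.radians(theta_incident_deg))
    return math.degrees(math.asin(s))

# Invented example: 2 um groove spacing, 30 degree incidence, first diffraction order.
for wl_nm in (1540, 1550, 1560):
    print(wl_nm, round(diffraction_angle_deg(wl_nm * 1e-9, 2e-6, 30.0), 2))
# A 10 nm wavelength step shifts the outgoing beam by roughly 0.3 degrees here.
```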
  • the light scanning component 300 is further configured to generate a current scanning angle signal while deflecting the direction of the reflected light reflected by the target object 600 .
  • the current scanning angle signal may be a horizontal scanning angle signal: for example, in the case that the optical scanning assembly 300 includes a rotating mirror 360 , a code wheel is disposed on the rotating mirror 360 . The code wheel detects the current horizontal scanning angle of the rotating mirror 360 in real time, and sends the detection result, that is, the current scanning angle signal, to the processing device 500 .
  • a torque detector is disposed on the MEMS vibrating mirror 330 .
  • the torque detector detects the torque of the MEMS vibrating mirror 330 in real time, converts the torque of the MEMS vibrating mirror 330 into a current scanning angle signal and sends it to the processing device 500 .
  • The processing device 500 is further configured to determine the irradiation angle at which the emitted light illuminates the target object 600 according to at least one of the emission signal, the scanning control signal, the current scanning angle signal, the output signal, and the position on the photoelectric conversion assembly 420 at which the first electrical signal is output.
  • the position on the photoelectric conversion component 420 that outputs the first electrical signal generally refers to the position where the photoelectric conversion unit 424 that outputs the first electrical signal is located.
  • In some embodiments, the multiple groups of emitted light include at least one group of first emitted light and at least one group of second emitted light; the emission moment of the first emitted light is earlier than the emission moment of the second emitted light; the reflected light produced after the first emitted light is reflected by the corresponding target object 600 is converted into the output signal; and the second emitted light is visible light. In other words, the first emitted light is used to measure at least one of distance, reflectivity, and profile, and the second emitted light is used to project an image.
  • The light scanning component 300 is configured to project the second emitted light, with a preset effect, onto the surface of one of the plurality of target objects 600 according to at least one of the distance, irradiation angle, reflectivity, and contour obtained after the first emitted light illuminates the plurality of target objects 600. Because the second emitted light is projected onto the surface of the target object 600 according to at least one of the distance of the target object 600, the irradiation angle, the reflectivity of the target object 600, and the outline of the target object 600, the image projected on the surface of the target object 600 can reproduce the real image.
  • For example, the light emitting assembly 200 first emits at least one group of first emitted light toward the surface of the target object 600 through the probe assembly and then emits at least one group of second emitted light. The processing device 500 determines at least one of the distance of the target object 600, the reflectivity of the target object 600, and the profile of the target object 600 according to the emission signal and/or output signal corresponding to the first emitted light; at the same time, the processing device 500 determines the irradiation angle of the emitted light on the target object 600 according to at least one of the scanning control signal, the current scanning angle signal, the output signal, and the position on the photoelectric conversion component 420 at which the first electrical signal is output.
  • The light scanning assembly 300 then projects the second emitted light, for example an image of an insect, onto the surface of the target object 600. Because the second emitted light is projected onto the surface of the target object 600 according to at least one of the distance of the target object 600, the irradiation angle, the reflectivity of the target object 600, and the outline of the target object 600, the image of the insect is not distorted by the curved surface of the target object 600 but covers that curved surface with the appropriate curvature, so that the insect is rendered realistically on the target object 600.
  • the second emitted light may include, but is not limited to, at least one of red light, blue light and green light.
  • As another example, the light scanning component 300 first projects the first emitted light onto a car windshield or AR glasses and then, according to at least one of the distance of the target object 600, the irradiation angle, the reflectivity of the target object 600, and the outline of the target object 600, projects a preset virtual AR image, that is, the second emitted light, onto the car windshield or AR glasses, so that the user can see a scene in which the real world and the virtual world are combined (augmented reality).
  • Alternatively, the light scanning component 300 can also project the first emitted light and the second emitted light directly onto the surfaces of two different target objects 600; in this case the laser system 100 is equivalent to an ordinary projection device.
  • In some embodiments, the current scanning angle signal includes a first scanning angle signal, which is the scanning angle signal generated when the light scanning component 300 deflects the reflected light along the first scanning direction. The processing device 500 is configured to determine the component of the irradiation angle along the first scanning direction according to the first scanning angle signal; at the same time, the processing device 500 determines the component of the irradiation angle along the second scanning direction according to at least one of the scanning control signal, the current scanning angle signal, the output signal, and the position on the photoelectric conversion component 420 at which the first electrical signal is output. Here the specified direction is the first scanning direction.
  • For example, where the second scanning part 320 is a rotating mirror 360, the second scanning part 320 deflects by a specified angle according to the scanning control signal and feeds the first scanning angle signal back to the processing device 500, from which the processing device 500 can determine the component, along the first scanning direction, of the irradiation angle at which the emitted light reaches the target object 600.
  • In the case where the current scanning angle signal includes a second scanning angle signal, the processing device 500 may determine the component of the irradiation angle along the second scanning direction directly from the second scanning angle signal.
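  • A minimal sketch (with assumed axis conventions) of how the two components determined above could be combined into a single emission direction vector: the component along the first scanning direction comes from the first scanning angle signal, and the component along the second scanning direction comes from the other sources listed.

```python
# Hypothetical combination of the two irradiation-angle components into a unit
# direction vector; the axis conventions are assumptions made for illustration.
import math

def irradiation_direction(first_dir_deg: float, second_dir_deg: float):
    """first_dir_deg: component along the first scanning direction (e.g. horizontal),
    second_dir_deg: component along the second scanning direction (e.g. vertical)."""
    h = math.radians(first_dir_deg)
    v = math.radians(second_dir_deg)
    # Forward along +x, first scanning direction about z, second about y.
    return (math.cos(v) * math.cos(h), math.cos(v) * math.sin(h), math.sin(v))

print(irradiation_direction(30.0, 5.0))
```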
  • In some embodiments, the laser system 100 also includes a communication component used to transmit specified information to the outside and/or receive external information; the specified information includes at least one of the distance of the target object, the reflectivity of the target object, the direction angle of the target object, the outline of the target object, and the irradiation angle.
  • the processing device 500 is further configured to determine at least one of the three-dimensional fused image of the target object 600, the superpixel 802 of the target object, the superpixel 803 of the receiving field of view, the first specified rule and the second specified rule according to the target parameter; wherein,
  • the target parameter includes at least one of the emission signal, the scanning control signal, the current scanning angle signal, the output signal, the position on the photoelectric conversion component 420 where the first electrical signal is output, and external information.
  • A superpixel 802 of the target object generally refers to a collection of several of the pixels that constitute the image of the target object, and a superpixel 803 of the receiving field of view generally refers to a collection of several of the pixels that constitute the image of the receiving field of view.
  • the shape of the superpixel 802 of the target object and the shape of the superpixel 803 of the receiving field of view may include, but not limited to, at least one of straight line, polygon, circle and ellipse.
  • the laser system 100 further includes an image sensor for acquiring a two-dimensional image of the target scene 700; the target parameters include the two-dimensional image.
  • the communication component is also used to communicate the superpixels 802 of the target object to the outside world.
  • For example, the processing device 500 determines the three-dimensional point cloud image 801 of the target object according to at least one of the emission signal, the scanning control signal, the current scanning angle signal, the output signal, the position on the photoelectric conversion component 420 at which the first electrical signal is output, and the external information; the processing device 500 then performs superpixel segmentation on the 3D point cloud image 801 of the target object 600, decomposing it into the 13 superpixels 802 of the target object shown in FIG. 11, and determines the corresponding superpixels 803 of the receiving field of view.
  • In the example of FIG. 11, the receiving field of view 102 is first located at the position of the upper superpixel 803 of the receiving field of view, and the light emitting component 200 first emits several groups of emitted light toward that receiving field of view 102 in sequence, so that the emitting fields of view 101 of these groups in the target scene are distributed in a dot matrix and located exactly within the receiving field of view 102. The receiving field of view 102 then moves, according to the first specified law, to the position of the lower superpixel 803 of the receiving field of view, and the light emitting component 200 again sequentially emits several groups of emitted light toward the current receiving field of view 102, so that each emitting field of view 101 is distributed in the dot matrix and located exactly within the receiving field of view 102.
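  • As a rough illustration of this superpixel-guided scanning (the grouping logic and all values are simplifications invented for the example), the receiving field of view can be scheduled to visit the regions occupied by the superpixels 803 of the receiving field of view one after another, with a dot matrix of emitting fields of view fired into each region.

```python
# Simplified, invented illustration of superpixel-guided scanning: the receiving
# FOV visits each receiving-field superpixel region in turn and a dot matrix of
# emitting FOVs is fired inside it.
def schedule(receiving_superpixels, emitters_per_region):
    """receiving_superpixels: list of (x, y, w, h) regions in the target scene."""
    plan = []
    for region in receiving_superpixels:
        x, y, w, h = region
        # Place emitting-FOV centers on a simple grid inside the region.
        spots = [(x + (i + 0.5) * w / emitters_per_region, y + h / 2)
                 for i in range(emitters_per_region)]
        plan.append({"receiving_fov": region, "emitting_fovs": spots})
    return plan

regions = [(0, 0, 8, 2), (0, 4, 8, 2)]      # e.g. upper and lower superpixels 803
for step in schedule(regions, emitters_per_region=4):
    print(step)
```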
  • In some embodiments, the receiving end component 400 includes a light receiving component 410 and a photoelectric conversion component 420; the light receiving component 410 sequentially receives the groups of reflected light reflected by the target object 600 and converts them into corresponding first optical signals, and the photoelectric conversion component 420 sequentially converts the first optical signals into corresponding first electrical signals.
  • In this case, the first electrical signal serves as the output signal.
  • In other embodiments, the receiving end component 400 includes a light receiving component 410, a photoelectric conversion component 420, and an electrical amplification module 430; the light receiving component 410 sequentially receives the groups of reflected light reflected by the target object 600 and converts them into corresponding first optical signals, the photoelectric conversion component 420 sequentially converts the first optical signals into corresponding first electrical signals, and the electrical amplification module 430 amplifies the first electrical signals into second electrical signals. In this case, the second electrical signal serves as the output signal.
  • the processing device 500 may determine at least one of the distance of the target object 600, the reflectivity of the target object 600, and the profile of the target object 600 by various methods; for example, the distance of the target object 600 may be determined by the time-of-flight method, phase-based ranging, or triangulation ranging; the sketch below lists the corresponding distance relations.
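As a quick reference, the snippet below writes out the textbook distance relations behind the three ranging methods just named. It is a generic illustration, not taken from the patent; the symbols (round-trip time, modulation frequency, phase shift, baseline, focal length, disparity) and the example numbers are assumptions.

```python
# Generic ranging relations (illustrative only; not specific to this laser system).
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s):
    """Time-of-flight: light travels to the target and back, so divide by two."""
    return C * round_trip_time_s / 2.0

def phase_distance(phase_shift_rad, mod_freq_hz):
    """Phase-based ranging on an amplitude-modulated beam (unambiguous within one period)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def triangulation_distance(baseline_m, focal_len_px, disparity_px):
    """Triangulation: distance from the lateral offset of the returned spot."""
    return baseline_m * focal_len_px / disparity_px

print(tof_distance(200e-9))                       # ~30 m for a 200 ns round trip
print(phase_distance(1.0, 10e6))                  # phase ranging at 10 MHz modulation
print(triangulation_distance(0.1, 800, 4.0))      # 0.1 m baseline, 800 px focal length
```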
  • in the case where the processing device 500 determines the distance of the target object 600 based on the time-of-flight method, the processing device 500 includes a processor, at least one comparator, and duration determination modules corresponding one-to-one to the comparators.
  • the electrical amplification module 430 includes multiple amplifiers connected in series or in parallel, and the intensity of the amplified electrical signal output by at least one of the amplifiers is less than half of the intensity of the amplified electrical signal output by another amplifier.
  • at least the output end of the amplifier that outputs the largest amplified electrical signal is connected to the input end of at least one comparator, and the comparison inputs of the comparators correspond to the amplifiers one-to-one.
  • for example, when multiple amplifiers are connected in series, the amplified electrical signal output by the last-stage amplifier is the largest; if there is only one comparator, that comparator is connected between the last-stage amplifier and the duration determination module; when there are multiple comparators, the output end of each amplifier is connected to a comparator, and the voltage values of the comparison inputs of the comparators differ from one another.
  • each comparator receives a comparison input and compares the voltage value of the comparison input with the electrical signal output by the corresponding amplifier to determine a trigger start time, a trigger end time and a pulse width; the trigger start time and the trigger end time are, respectively, the moments at which the intensity of the electrical signal output by the amplifier rises above and falls back below the voltage value of the comparison input, and the pulse width is the difference between the trigger end time and the trigger start time; the duration determination modules correspond one-to-one to the comparators and determine the light flight time from the emission start time and the trigger start time output by the corresponding comparator; a threshold-crossing sketch is given below.
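The comparator behaviour described above can be mimicked on sampled data. The sketch below is a hypothetical software model, not the hardware comparator itself: it scans a sampled return pulse for the first sample above a fixed comparison voltage and the first sample after that which falls back below it, then reports the trigger times and the pulse width.

```python
# Hypothetical software model of a single comparator channel (illustrative only).
def comparator_trigger(samples, sample_period_s, threshold_v):
    """Return (trigger_start, trigger_end, pulse_width) in seconds, or None if never triggered."""
    start_idx = end_idx = None
    for i, v in enumerate(samples):
        if start_idx is None and v > threshold_v:
            start_idx = i                      # first sample above the comparison input
        elif start_idx is not None and v <= threshold_v:
            end_idx = i                        # first sample back below the comparison input
            break
    if start_idx is None:
        return None
    if end_idx is None:
        end_idx = len(samples)                 # pulse still high at the end of the record
    t_start = start_idx * sample_period_s
    t_end = end_idx * sample_period_s
    return t_start, t_end, t_end - t_start

# Usage: a 1 ns-sampled return pulse crossing a 0.3 V comparison input.
pulse = [0.0, 0.1, 0.5, 0.9, 0.8, 0.4, 0.2, 0.0]
print(comparator_trigger(pulse, 1e-9, 0.3))
```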
  • the processor determines at least one of distance, reflectivity, and profile based on at least one of light flight time, pulse width, intensity of the second electrical signal, and speed of light.
  • taking distance measurement as an example, the processor determines the distance of the target object 600 by the time-of-flight method. The trigger start time depends on the voltage value of the comparison input, and signals that cross different comparison-input voltages also produce different pulse widths; to reduce this influence, the processor first corrects the light flight time according to the pulse width and then determines the distance of the target object 600 from the speed of light and the corrected light flight time, as sketched below.
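A leading-edge trigger fires earlier on strong (wide) pulses than on weak (narrow) ones, which is the influence corrected for above. The snippet below is a minimal sketch under assumed values: it applies a hypothetical linear pulse-width correction to the measured flight time and converts the result to distance. The correction coefficients are invented for illustration and would in practice come from calibration.

```python
# Hypothetical pulse-width (time-walk) correction followed by TOF ranging.
C = 299_792_458.0  # m/s

def corrected_flight_time(raw_tof_s, pulse_width_s, ref_width_s=5e-9, k_s_per_s=0.2):
    """Assumed linear walk model: wider pulses trigger earlier, so add back k * (width - ref)."""
    return raw_tof_s + k_s_per_s * (pulse_width_s - ref_width_s)

def distance_from_tof(raw_tof_s, pulse_width_s):
    tof = corrected_flight_time(raw_tof_s, pulse_width_s)
    return C * tof / 2.0  # round trip, so halve

# Usage: the same raw flight time with two different pulse widths gives different distances.
print(distance_from_tof(100e-9, 5e-9))   # reference width: no correction applied
print(distance_from_tof(100e-9, 9e-9))   # wider (stronger) return: correction added back
```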
  • the comparison input may be a dynamic voltage curve input to the comparator from outside, or a dynamic voltage curve pre-stored in the comparator.
  • the duration determination module may be, but is not limited to, a TDC (time-to-digital converter); a toy model is sketched below.
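For intuition, the duration determination can be modelled as a time-to-digital conversion. The sketch below is a generic coarse-counter TDC model, not the patent's hardware: it quantises the interval between the emission start time and the trigger start time to a clock period. The 1 ns clock period is an assumption.

```python
# Generic coarse-counter TDC model (illustrative assumption, not the patent's module).
def tdc_flight_time(emission_start_s, trigger_start_s, clock_period_s=1e-9):
    """Quantise the emission-to-trigger interval to whole clock periods."""
    ticks = int((trigger_start_s - emission_start_s) / clock_period_s)
    return ticks * clock_period_s

print(tdc_flight_time(0.0, 66.7e-9))  # ~66 ns quantised to the 1 ns clock
```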
  • the duration determining module and the processor may be independent components, or may be integrated into one component.
  • the laser system 100 in the embodiments of the present disclosure further includes a main housing and at least one probe housing; the probe housing is provided separately from the main housing, and the probe housings correspond to the target scenes one-to-one.
  • the light emitting component 200, the scanning control element and the processing device 500 are arranged in the main housing; the light receiving component 410 and the light scanning component 300 are arranged in the probe housing; the photoelectric conversion component 420 is arranged in either the main housing or the probe housing.
  • because the probe housing and the main housing are provided separately in the embodiments of the present disclosure, they can be fixed and installed separately; compared with the entire laser system 100, the probe housing is very small and can therefore be installed on a small application object or at a confined application location.
  • the probe housing can be fixed on the frame of the glasses for the blind, and the main housing is clamped on the user's waist or placed in the user's clothes pocket.
  • the probe housing can be fixed on the rearview mirror of the car, and the main housing can be fixed on the ceiling of the car.
  • the light scanning components 300 in each probe housing can irradiate the corresponding emitted light to the target objects 600 in different target scenes 700 .
  • the light emitting component 200 is connected to the light scanning component 300 through a first optical fiber, and the light receiving component 410 is connected to the photoelectric conversion component 420 through a second optical fiber.
  • the processing device 500 is electrically connected via cables to the light emitting component 200, the scanning control element, the photoelectric conversion component 420 and the light scanning component 300. It should be noted that, besides optical fibers and cables, the above components may also realize the optical/electrical connections through other optical elements 422 and wireless communication elements that transmit electrical signals and/or optical signals through free space.
  • the laser system 100 also includes a display component and/or a prompting component; the display component is used to display at least one of the distance of the target object 600, the irradiation angle, the reflectivity of the target object 600 and the outline of the target object 600; the prompting component is used to output a prompt signal according to at least one of the distance of the target object 600, the irradiation angle, the reflectivity of the target object 600 and the outline of the target object 600.
  • the prompting component may be, but is not limited to, a microphone or a vibrator.
  • the receiver assembly 400 further includes a bias voltage module.
  • the bias voltage module provides a dynamic bias voltage; the absolute value of the dynamic bias voltage changes, according to a first preset rule, from the emission start time to a first predetermined threshold within a first preset duration, then remains not less than the first predetermined threshold for a second preset duration, and is less than the first predetermined threshold during the first preset duration; the photoelectric conversion component 420 sequentially converts the first optical signals into corresponding first electrical signals according to the dynamic bias voltage; the first preset duration is less than the maximum difference between the emission start time and the reception time, where the reception time is the time at which the reflected light is received by the receiving end component 400.
  • if the target object 600 is far from the light emitting component 200, the reflected light received by the light receiving component 410 is significantly attenuated relative to the emitted light. Because the absolute value of the dynamic bias voltage rises from the emission start time to the first predetermined threshold within the first preset duration and remains not less than that threshold for the second preset duration, and because light reflected from a distant target object 600 takes a long time to return, the absolute value of the dynamic bias voltage at the moment the light receiving component 410 receives the reflected light is not less than the first predetermined threshold; the photoelectric conversion unit 424 can therefore convert the weak optical signal into a relatively strong first electrical signal.
  • conversely, if the target object 600 is close to the light emitting component 200, the light intensity of the reflected light received by the light receiving component 410 is attenuated less relative to the emitted light.
  • because the absolute value of the dynamic bias voltage is below the first predetermined threshold within the first preset duration after the emission start time, and because light reflected from a nearby target object 600 returns quickly, the absolute value of the dynamic bias voltage at the moment the light receiving component 410 receives the reflected light is below the first predetermined threshold; the photoelectric conversion unit 424 therefore converts the strong optical signal into a relatively weak first electrical signal, avoiding saturation and distortion of the strong signal after photoelectric conversion and amplification.
  • in short, the radar system of the embodiments of the present disclosure exploits the fact that beam intensity attenuates as the propagation distance, i.e. the propagation time, increases. By using a time-varying dynamic bias voltage, reflected light returning from a distant target object 600 sees a dynamic bias voltage with a large absolute value (not less than the first predetermined threshold), while reflected light returning from a nearby target object 600 sees a dynamic bias voltage with a reduced absolute value (less than the first predetermined threshold). This improves short-range measurement accuracy and prevents short-range reflected beams from saturating and distorting after photoelectric conversion and amplification, without impairing long-range detection.
  • in some embodiments, the absolute value of the dynamic bias voltage also changes, according to a second preset rule, from a first adjustment time to a second predetermined threshold within a third preset duration, then remains not less than the second predetermined threshold for a fourth preset duration, and is less than the second predetermined threshold during the third preset duration; the first adjustment time is earlier than the reception time; the processing device 500 is further configured to determine the adjustment time according to at least one of the emission signal, the scanning control signal, the current scanning angle signal, the output signal and the position on the photoelectric conversion component 420 at which the first electrical signal is output. A piecewise sketch of such a bias profile is given below.
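The two-stage bias behaviour above can be visualised with a simple piecewise function. The following is a minimal sketch under assumed numbers (the ramp shapes, thresholds and durations are invented for illustration): the bias magnitude ramps linearly from the emission start time to a first threshold, holds, and optionally ramps again from an adjustment time to a second threshold.

```python
# Hypothetical piecewise model of the dynamic bias voltage magnitude (all values are assumptions).
def dynamic_bias_abs(t_s, t_emit_s=0.0, ramp1_s=200e-9, v1=30.0,
                     t_adjust_s=None, ramp2_s=100e-9, v2=45.0):
    """Absolute bias at time t_s: linear ramp to the first threshold, hold, optional second ramp."""
    dt = t_s - t_emit_s
    if dt < 0:
        return 0.0
    # First preset duration: magnitude stays below the first predetermined threshold v1.
    v = v1 * dt / ramp1_s if dt < ramp1_s else v1
    # Optional second ramp starting at the adjustment time (earlier than the reception time).
    if t_adjust_s is not None and t_s >= t_adjust_s:
        dt2 = t_s - t_adjust_s
        v = max(v, v2 if dt2 >= ramp2_s else v1 + (v2 - v1) * dt2 / ramp2_s)
    return v

# Usage: a nearby return (50 ns) sees a reduced bias, a distant return (600 ns) sees the full bias.
print(dynamic_bias_abs(50e-9))                       # below the first predetermined threshold
print(dynamic_bias_abs(600e-9))                      # not less than the first predetermined threshold
print(dynamic_bias_abs(600e-9, t_adjust_s=500e-9))   # raised again after the adjustment time
```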
  • an embodiment of the present disclosure also provides a laser measurement method, which includes:
  • S100 Generate an emission signal and, according to the emission signal, sequentially emit multiple groups of emitted light within the scanning duration of the current frame;
  • S200 Convert at least one group of reflected light, obtained after the emitted light is reflected by at least one target object 600 in the target scene 700, into an output signal, the output signal being an electrical signal;
  • S300 Determine at least one of the distance of the target object 600, the reflectivity of the target object 600, and the outline of the target object 600 according to the emission signal and/or the output signal;
  • during the scanning duration of the current frame, the position of the receiving field of view 102 in the target scene 700 changes according to the first specified rule and/or the shape of the receiving field of view 102 changes according to the second specified rule; from the emission start time of the corresponding emitted light, the emission field of view 101 lies within the current receiving field of view 102 for the preset receiving duration, and the area of the receiving field of view 102 is greater than or equal to twice the area of the emission field of view 101.
  • the first specified rule includes changing along a specified direction; the emission field of view 101 is the projection area of each group of emitted light in the target scene 700, and the receiving field of view 102 is the area in the target scene 700 corresponding to all light beams that can be converted into the output signal within the preset receiving duration; the sketch below checks these geometric constraints on example values.
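The geometric constraints just restated (each emission FOV inside the current receiving FOV, and the receiving-FOV area at least twice the emission-FOV area) are easy to check numerically. The snippet below is an illustrative sketch with made-up rectangle and circle dimensions, not a description of the actual optics.

```python
# Illustrative geometric checks on the FOV constraints (all dimensions are assumptions).
import math

def circle_in_rect(cx, cy, r, rect):
    """Emission FOV modelled as a circle; receiving FOV modelled as an axis-aligned strip."""
    x0, y0, x1, y1 = rect
    return x0 + r <= cx <= x1 - r and y0 + r <= cy <= y1 - r

def area_ratio_ok(r, rect, factor=2.0):
    """Receiving-FOV area must be at least `factor` times the emission-FOV area."""
    x0, y0, x1, y1 = rect
    return (x1 - x0) * (y1 - y0) >= factor * math.pi * r * r

receive_fov = (0.0, 0.0, 1.0, 8.0)                          # tall strip, arbitrary units
emission_fovs = [(0.5, 0.5 + k, 0.3) for k in range(8)]     # dot matrix of circular spots

print(all(circle_in_rect(cx, cy, r, receive_fov) for cx, cy, r in emission_fovs))
print(area_ratio_ok(0.3, receive_fov))   # 8 >= 2 * pi * 0.09, so the ratio condition holds
```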
  • Step S200 includes:
  • S210 Sequentially receive multiple groups of reflected light reflected by the target object 600, and sequentially convert the multiple groups of reflected light into corresponding first optical signals;
  • S220 Sequentially convert the multiple first optical signals into corresponding first electrical signals.
  • after step S100 and before step S200, the laser measurement method also includes:
  • S110 Generate a scanning control signal;
  • S120 Deflect the emitted light according to the scanning control signal and irradiate it onto at least one target object 600 in the target scene 700, and/or deflect at least one group of reflected light reflected by the at least one target object 600 toward the receiving direction.
  • step S120 includes:
  • S121 Sequentially deflect the multiple groups of emitted light along a second scanning direction within the scanning duration of the current frame;
  • S122 Deflect the emitted light already deflected along the second scanning direction along a first scanning direction and direct it toward the target object 600; the second scanning direction is parallel to the length direction of the receiving field of view 102, the first scanning direction differs from the second scanning direction, and the specified direction is the first scanning direction; a two-axis scheduling sketch follows below.
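The two-stage deflection in S121/S122 (a fast axis along the receiving field of view's length, then a slow axis along the specified direction) can be sketched as a simple angle schedule. The code below is a hypothetical raster generator with invented angle ranges and step counts; it is not the scanning controller itself.

```python
# Hypothetical two-axis scan schedule: fast second-scanning axis, slow first-scanning axis.
def scan_schedule(slow_steps=4, fast_steps=8,
                  slow_range_deg=(-60.0, 60.0), fast_range_deg=(-10.0, 10.0)):
    """Yield (first_scan_angle, second_scan_angle) pairs, one per group of emitted light."""
    for i in range(slow_steps):            # e.g. a rotating mirror making a wide, slow sweep
        slow = slow_range_deg[0] + i * (slow_range_deg[1] - slow_range_deg[0]) / (slow_steps - 1)
        for j in range(fast_steps):         # e.g. a MEMS mirror making a fast sweep
            fast = fast_range_deg[0] + j * (fast_range_deg[1] - fast_range_deg[0]) / (fast_steps - 1)
            yield slow, fast

# Usage: each yielded pair is the deflection applied to one group of emitted light.
for first_angle, second_angle in scan_schedule(slow_steps=2, fast_steps=3):
    print(f"first-scan {first_angle:+.1f} deg, second-scan {second_angle:+.1f} deg")
```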
  • after step S110, the laser measurement method further includes: generating a current scanning angle signal while deflecting the reflected light reflected by the target object 600;
  • determining the irradiation angle at which the emitted light strikes the target object 600 according to at least one of the emission signal, the scanning control signal, the current scanning angle signal, the output signal and the conversion position of the first optical signal.
  • Step S100 includes: sequentially emitting at least one group of first emitted light and at least one group of second emitted light within the scanning duration of the current frame, the emission time of the first emitted light being earlier than the emission time of the second emitted light; the second emitted light is visible light;
  • Step S200 includes: converting the reflected light obtained after the first emitted light is reflected by the corresponding target object 600 into the output signal.
  • in some embodiments, deflecting the emitted light according to the scanning control signal and irradiating it onto at least one target object 600 in the target scene 700 in step S120 includes: irradiating the first emitted light onto a plurality of target objects 600 according to the scanning control signal, and then projecting the second emitted light, with a preset effect, onto the surface of one of those target objects 600 according to at least one of the distance, the irradiation angle, the reflectivity and the profile.
  • the advantage of this arrangement is that a preset virtual AR image, i.e. the second emitted light, is projected onto the target object 600, so that the user sees an augmented scene combining the real world and the virtual world; a projection-compensation sketch is given below.
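Projecting the visible second emitted light "with a preset effect" implies compensating each projected point for the measured geometry of the surface. The snippet below is a deliberately simplified, hypothetical sketch: it scales the commanded intensity of each projector point by the measured range and incidence angle so that the pattern appears uniform on a tilted or curved surface. The compensation law is an assumption for illustration, not the patent's method.

```python
# Hypothetical per-point projection compensation using measured range and incidence angle.
import math

def compensate_point(base_intensity, range_m, incidence_deg,
                     ref_range_m=2.0, max_intensity=1.0):
    """Boost intensity with range squared and with oblique incidence (assumed law)."""
    range_gain = (range_m / ref_range_m) ** 2
    angle_gain = 1.0 / max(math.cos(math.radians(incidence_deg)), 0.1)
    return min(base_intensity * range_gain * angle_gain, max_intensity)

# Usage: three points of the virtual image landing on differently oriented surface patches.
measured = [(1.8, 5.0), (2.4, 35.0), (3.0, 60.0)]   # (range in m, incidence angle in deg)
for rng, ang in measured:
    print(f"range {rng} m, incidence {ang} deg -> drive {compensate_point(0.2, rng, ang):.3f}")
```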
  • alternatively, deflecting the emitted light according to the scanning control signal and irradiating it onto at least one target object 600 in the target scene 700 in step S120 includes: after deflecting the emitted light according to the scanning control signal, irradiating the first emitted light and the second emitted light onto two different target objects 600 respectively.
  • in this case, the step is equivalent to an ordinary projection operation.


Abstract

A laser system and a laser measurement method. The laser system (100) includes: a light emitting component (200) that generates an emission signal and, according to the emission signal, sequentially emits multiple groups of emitted light within the scanning duration of the current frame; and a receiving end component (400) that converts at least one group of reflected light, obtained after the emitted light is reflected by at least one target object (600) in a target scene (700), into an output signal. Within the scanning duration of the current frame, the position of the receiving field of view (102) of the receiving end component (400) within the target scene (700) changes according to a first specified rule and/or the shape of the receiving field of view (102) changes according to a second specified rule. From the emission start time of the corresponding emitted light, the emission field of view (101) of the light emitting component (200) lies within the current receiving field of view (102) for a preset receiving duration, and the area of the receiving field of view (102) is greater than or equal to twice the area of the emission field of view (101). The system and method do not require a light scanning component to synchronize the emission field of view (101) and the receiving field of view (102) precisely and at high speed, and can therefore reduce the complexity and cost of the whole system while maintaining resolution.

Description

激光系统及激光测量方法
相关申请的交叉引用
本公开要求于2022年01月30日提交于中国国家知识产权局(CNIPA)的专利申请号为202210113638.0的中国专利申请的优先权和权益,上述中国专利申请通过引用整体并入本文。
技术领域
本公开涉及雷达技术领域,更具体地,涉及激光系统及激光测量方法。
背景技术
雷达是利用电磁波探测目标物体的电子设备,雷达对目标物体发射电磁波并接收其回波,通过处理后可获得目标物体至电磁波发射点的距离、方位、高度等信息。
以激光为工作光束的雷达称为激光雷达。相关技术中,接收视场与发射视场大小基本相同,为了提高某一方向的分辨率,激光雷达沿该方向具有多个发射视场,从而需要对应匹配与发射视场相同数量的接收视场,也即,对于每个发射视场来说,激光雷达在预设时长内只有一个接收视场与其对应。而为了能够高速精确的同步匹配发射视场与接收视场,激光雷达需要设置复杂的控制系统来控制光扫描件精准的偏转发射光和反射光,这不仅显著增大了整个激光雷达的复杂程度,而且还提高了成本。并且,分辨率越高,激光雷达的复杂程度和成本随之越高。
发明内容
本公开涉及激光系统及激光测量方法。
根据本公开的实施方式,激光系统可以包括:
光发射组件,生成发射信号并根据所述发射信号在本帧扫描时长内依次射出多组发射光;其中,所述发射信号包括表示每组所述发射 光的发射起始时刻的时刻信息;
接收端组件,将所述发射光经过目标场景中的至少一个目标物体反射后的至少一组反射光转换为输出信号;其中,所述输出信号的类型为电信号;
其中,在所述本帧扫描时长内,所述接收端组件的接收视场在所述目标场景内的位置按照第一指定规律变化和/或所述接收视场的形状按照第二指定规律变化;自对应所述发射光发出的发射起始时刻起,所述光发射组件的发射视场在预设接收时长内位于当前的所述接收视场中,且所述接收视场的面积大于或等于所述发射视场的面积的两倍;其中,所述第一指定规律包括沿指定方向变动;所述发射视场为每组所述发射光在所述目标场景内的投射区域,所述接收视场为在所述预设接收时长内所述接收端组件能够接收到的所有光束在所述目标场景内对应的区域。
根据本公开的实施方式,激光测量方法可以包括:
生成发射信号并根据所述发射信号在本帧扫描时长内依次射出多组发射光;
将所述发射光经过目标场景中的至少一个目标物体反射后的至少一组反射光转换为输出信号;其中,所述输出信号的类型为电信号;
根据所述发射信号和/或所述输出信号确定所述目标物体的距离、所述目标物体的反射率和所述目标物体的轮廓中的至少一个;
其中,在所述本帧扫描时长内,接收视场在所述目标场景内的位置按照第一指定规律变化和/或所述接收视场的形状按照第二指定规律变化;自对应所述发射光发出的发射起始时刻起,发射视场在预设接收时长内位于当前的所述接收视场中,且所述接收视场的面积大于或等于所述发射视场的面积的两倍;其中,所述第一指定规律包括沿指定方向变化;所述发射视场为每组所述发射光在所述目标场景内的投射区域,所述接收视场为在所述预设接收时长内能够被转换为所述输出信号的所有光束在所述目标场景内对应的区域。
在本公开中,由于自对应发射光发出的发射起始时刻起光发射组件的发射视场在预设接收时长内位于接收端组件的当前的接收视场 中,且接收视场的面积大于或等于发射视场的面积的两倍,因此无需利用光扫描组件精确高速的同步匹配发射视场与接收视场。由此便能在保证分辨率的情况下,降低整个系统的复杂程度和成本。
本领域技术人员将理解的是,以上发明内容仅是说明性的,并且不旨在以任何方式进行限制。除了上述说明性方面、实施方式和特征之外,通过参考附图和以下详细描述,其他方面、实施方式和特征将变得显而易见。
附图说明
通过阅读参照以下附图所作的对非限制性实施方式的详细描述,本公开的其它特征、目的和优点将会变得更明显。其中:
图1是根据本公开一个实施方式的激光系统的接收视场和发射视场的示意图;
图2是根据本公开一个实施方式的激光系统的框图;
图3是根据本公开另一个实施方式的激光系统的框图;
图4是根据本公开一个实施方式的接收端组件的工作原理示意图;
图5是根据本公开另一个实施方式的接收端组件的工作原理示意图;
图6是根据本公开实施方式的激光系统的局部工作原理示意图;
图7是根据本公开另一个实施方式的激光系统的接收视场和发射视场的示意图;
图8是根据本公开又一个实施方式的激光系统的接收视场和发射视场的示意图;
图9是根据本公开一个实施方式的光扫描组件的结构示意图;
图10是根据本公开另一个实施方式的光扫描组件的结构示意图;
图11是根据本公开实施方式的激光系统确定接收视场的超像素的工作原理示意图;
图12是根据本公开实施方式的激光测量方法的流程图。
具体实施方式
为了更好地理解本公开,将参照附图对本公开的各个方面做出更详细的说明。应理解的是,这些详细说明只是对本公开的示例性实施方式的描述,而非以任何方式限制本公开的范围。为了便于描述,附图中仅示出了与本发明相关的部分而非全部结构。
除非另外限定,否则本文中使用的所有术语(包括工程术语和科技术语)具有与本公开所属领域普通技术人员通常理解的含义相同的含义。还应理解的是,除非本公开中有明确的说明,否则诸如在常用词典中限定的术语应被解释为具有与它们在相关技术的上下文中的含义一致的含义,而不应以理想化或过于形式化的意义解释。
需要说明的是,在不冲突的情况下,本公开中的实施方式及实施方式中的特征可以相互组合。另外,除非明确限定或与上下文相矛盾,否则本公开所记载的方法中包含的具体步骤不必限于所记载的顺序,而可以任意顺序执行或并行地执行。下面将参照附图并结合实施方式来详细说明本公开。
如图1和图6所示,本公开实施例提供了一种激光系统100,该激光系统100包括光发射组件200和接收端组件400;其中,光发射组件200生成发射信号并根据发射信号在本帧扫描时长内依次射出多组发射光;发射信号包括表示每组发射光的发射起始时刻的时刻信息;其中,接收端组件400将发射光经过目标场景700中的至少一个目标物体600反射后的至少一组反射光转换为输出信号,输出信号的类型为电信号;其中,在本帧扫描时长内,接收端组件400的接收视场102在目标场景700内的位置按照第一指定规律变化和/或接收视场102的在目标场景700内的形状按照第二指定规律变化;自对应发射光发出的发射起始时刻起,光发射组件200的发射视场101在预设接收时长内位于当前的接收视场102中,且接收视场102的面积大于或等于发射视场101的面积的两倍。其中,第一指定规律包括沿指定方向变动;发射视场101为每组发射光在目标场景700的投射区域,接收视场102为在预设接收时长内接收端组件400能够接收到的所有光束在目标场景700内对应的区域。
由于本公开实施例中自对应发射光发出的发射起始时刻起光发射组件200的发射视场101在预设接收时长内位于接收端组件400的当前的接收视场102中,且接收视场102的面积大于或等于发射视场101的面积的两倍,因此无需利用光扫描组件300精确高速的同步匹配发射视场101与接收视场102。由此便能在保证分辨率的情况下,降低整个系统的复杂程度和成本。
需要说的是,“接收视场102在目标场景700内的位置按照第一指定规律变化”一般指代的是每当光发射组件200依次发出多组发射光后接收视场102在目标场景700内的位置变动一次。例如,若本帧扫描时长内多组发射光对应的发射视场101呈矩形点阵分布,那么接收视场102则每间隔一定时长沿矩形点阵的宽度方向移动一次位置。同理,“接收视场102的在目标场景700内的形状按照第二指定规律变化”一般指代的是每当光发射组件200依次发出多组发射光后接收视场102在目标场景700内的形状变化一次。例如,若本帧扫描时长内多组发射光对应的发射视场101呈环形点阵分布,那么接收视场102可为环形区域,则每间隔一定时长接收视场102的宽度增大一次。
考虑到接收视场102大于发射视场101的情况下,接收的背景噪音随之增大,所以,为了适当降低噪音,以平衡噪音、成本和分辨率,如图1所示,接收视场102包括至少一个条形的连续区域,本帧扫描时长内多组发射光对应的发射视场101呈点阵分布,点阵的长度方向与接收视场102的长度方向相适应,点阵的宽度方向与指定方向平行。需要说明的是,“点阵的长度方向与接收视场102的长度方向相适应”一般指代的是点阵的长度方向与接收视场102的长度方向相对应。例如,若目标物体600反射的反射光不经过光扫描组件300偏转,同时光接收组件410对反射光也不进行偏转,也就是说,光接收组件410不包括偏转镜例如45°反射镜,那么点阵的长度方向与接收视场102的长度方向平行。若目标物体600反射的反射光经过光扫描组件300或光接收组件410偏转方向,例如偏转45°,那么点阵的长度方向不再平行于接收视场102的长度方向,而平行于将接收视场102偏转45°以后的长度方向,也即,此时点阵的长度方向与接收视场102的 长度方向之间的夹角大于零。
在接收视场102为一整块条形的连续区域的情况下:由于每组发射光对应的接收视场102的面积大于或等于发射视场101的面积的两倍,也即接收视场102的面积远大于发射视场101的面积,因此发射光的发射角度以及反射光射向接收端组件400的方向无需被精确控制,也就是说,发射视场101和接收视场102均无需被精确控制,只要每组发射光经目标物体600反射后的反射光能够从当前的接收视场102的任意位置射出,则均能被接收端组件400接收。从而本公开实施例中的激光系统无需通过光扫描组件300精准地偏转发射光和反射光来精确的同步匹配发射视场101与接收视场102。例如,如图1所示,在本帧扫描时长内光发射组件200射出的发射光数量大于四组。以前四组发射光为例,在指定时长内也即自第一组发射光发出的发射起始时刻起一直到第四组发射光发出后的预设接收时长终止,接收端组件400的接收视场102在目标场景700内的位置未变化,也就是说,光发射组件200依次射出的四组发射光对应的发射视场101均对应同一个接收视场102,接收视场102每间隔上述指定时长沿指定方向变化一次位置。假设图1中目标场景700内每个虚线圆圈所限定的区域即为一个发射视场101,图1中目标场景700内虚线矩形框所限定的区域即为接收端组件400能够接收到的所有光束在目标场景700内对应的区域也即当前的接收视场102。那么,对于每组发射光来说,其发射视场101可以为图1中任意一个虚线圆圈所限定的区域,也就是说,发射光的发射角度无需被精确控制,发射光投射至上述任意一个虚线圆圈所限定的区域其反射光均能被接收端组件400接收。可见,本公开实施例中发射视场101和接收视场102均无需被精确控制。
在接收视场102包括多个条形的连续区域的情况下:由于自对应发射光发出的发射起始时刻起,光发射组件200的发射视场101在预设接收时长内位于当前的接收视场102中,点阵的长度方向与所述接收视场的长度方向相适应,因此,接收视场102的各个连续区域与光发射组件200各个发射视场101一一对应,也就是说,对于任意一组发射光来说,自该发射光发出的发射起始时刻起,在预设接收时长内 接收视场102的多个条形的连续区域同时存在。由此,每组发射光只要按照既定的方向射出,该发射光经目标物体600反射后的反射光必定能从接收视场102的对应连续区域射出,进而被接收端组件400接收。从而本公开实施例中的激光系统无需通过光扫描组件300精准地偏转从目标物体600反射的反射光来精确的同步匹配发射视场101与接收视场102。
此外,还需要说明的是,“条形的连续区域”一般指代的是长宽比大于1的区域,该连续区域既可以为多边形区域例如长方形区域,也可以为曲形区域例如S形区域,或者其他不规则形状的区域例如异形区域等。其中,至少一个连续区域的最大宽度和总长度之比小于第一比例阈值,第一比例阈值不大于0.5,例如第一比例阈值可以但不限于是0.5、0.1、0.01、或者0.001。
在一些实施例中,发射视场101的面积与接收视场102的面积之比小于第一比例阈值,第一比例阈值可以但不限于是0.5、0.1、0.01、或者0.001。
在一些实施例中,自相邻两组发射光中前一组发射光发出的发射起始时刻起至后一组发射光发出预设接收时长后止,沿点阵的长度方向的相邻两个发射视场101之间的方向角度变化幅度与接收视场102的方向角度变化幅度之比大于第二比例阈值,第二比例阈值不小于1,例如第二比例阈值可以但不限于是1、10、100、10000或者1000000,也就是说,与各个发射视场101对应的接收视场102的位置不同,发射视场101的位置变化幅度大于或等于接收视场102的位置变化幅度。需要说明的是,“相邻两个发射视场101之间的方向角度变化幅度”一般指代的是沿点阵长度方向的相邻两组发射光在目标场景700的投射方向之间的夹角;同理,“接收视场102的方向角度变化幅度”一般指代的是接收视场102每次沿指定方向偏转的角度。下面以接收视场102的长度方向沿竖直方向、指定方向为水平方向为例,假设射向目标场景700的第一组发射光的光路方向与垂直方向的夹角为α1,射向目标场景700的第一组发射光的光路方向与垂直方向的夹角为α2,那么对于第一组发射光来说,自第一组发射光发出的发射起始时刻起, 第一组发射光在目标场景700内的投射区域也即第一组发射光的发射视场101在预设接收时长内位于当前的接收视场102中。自第一组发射光发出的发射起始时刻起至第二组发射光发出的发射起始时刻止,当前的接收视场102沿水平方向的偏转角度为γ1;而对于第二组发射光来说,由于接收视场102的位置发生了变化,因此自第二组发射光发出的发射起始时刻起至第三组发射光发出的发射起始时刻止,当前的接收视场102沿水平方向的偏转角度变为γ2,第二组发射光在目标场景700内的投射区域也即第二组发射光的发射视场101在预设接收时长内位于当前的接收视场102中。其中,(α21)÷(γ21)≥T,其中,T表示第二比例阈值。
在一些实施例中,发射视场101的面积与目标场景700的面积之比小于第三比例阈值,第三比例阈值不大于0.1,例如第三比例阈值可以但不限于是0.1、0.01、0.001、0.0001或0.0001。
如图8所示,发射光包括多个光脉冲,发射光中的至少两个光脉冲的夹角大于预设夹角α;其中,预设夹角α与接收视场102的视场角β之比小于第四比例阈值,第四比例阈值不小于0.01,例如第四比例阈值可以但不限于是0.01、0.1、0.3、0.5或0.9。
在一些实施例中,目标场景700的面积与接收视场102的面积之比大于或等于第五比例阈值,第五比例阈值不小于2,例如第五比例阈值可以但不限于是2、4、8、16、100、1000或10000。
如图3所示,接收端组件400包括光接收组件410和光电转换组件420;其中,光接收组件410依次接收经目标物体600反射的多组反射光并将多组反射光依次转换为对应的第一光信号;光电转换组件420将多个第一光信号依次转换为对应的第一电信号。
在接收视场102为一整块条形的连续区域的情况下:为了使接收视场102为条形的连续区域,光电转换组件420可以采用如下结构形式,例如:
如图4所示,光电转换组件420包括光电转换件421和光学元件422;其中,光电转换件421具有连续的光电转换区域,光学元件422的进光端朝向光接收组件410,光学元件422的出光端朝向光电转换 区域;且光学元件422的出光端呈条形且长度方向与接收视场102的长度方向相适应,光学元件422用于将第一光信号选择性射向光电转换区域,光电转换区域用于将第一光信号转换为第一电信号。其中,光学元件422可以但不限于包括微透镜阵列、至少一个光阑、光锥425和光导器中的至少一种。需要说明的是,“光学元件422的出光端呈条形且长度方向与接收视场102的长度方向相适应”一般指代的是光学元件422的长度方向与接收视场102的长度方向相对应。例如,若目标物体600反射的反射光不经过光扫描组件300偏转,同时光接收组件410对反射光也不进行偏转,也就是说,光接收组件410不包括偏转镜例如45°反射镜,那么光学元件422的长度方向与接收视场102的长度方向平行。若目标物体600反射的反射光经过光扫描组件300或光接收组件410偏转方向,例如偏转45°,那么光学元件422的长度方向不再平行于接收视场102的长度方向,而是平行于将接收视场102偏转45°以后的长度方向,也即,此时光学元件422的长度方向与接收视场102的长度方向之间的夹角大于零。
以接收视场102的长度方向为竖直方向、光学元件422为光阑为例,如图4所示,光接收组件410为接收透镜411,光阑为位于光电转换件421与接收透镜411之间,光电转换件421的光电转换区域的面积大于或等于光阑的面积。由于光阑的进光端朝向接收透镜411,光阑的出光端朝向光电转换件421的光电转换区域,因此从图4中目标场景700内虚线矩形框所限定的区域射出的至少部分反射光透过接收透镜411后可直接照射至光阑的进光端,从光阑的出光端射出的第一光信号则被光电转换件421的光电转换区域接收,光电转换区域将第一光信号转换为第一电信号。由于光阑的出光端呈条形且长度方向与接收视场102的长度方向相适应,光电转换件421朝向接收透镜411的一侧具有连续的光电转换区域,因此从图4中虚线矩形框所限定的区域中任意位置射出的至少部分反射光均能通过光阑照射至光电转换件421,并被光电转换件421的光电转换区域转换为第一电信号。可见,本公开实施例中接收端组件400的接收视场102即为图4中虚线矩形框所限定的区域。并且,由上可知本公开实施例中反射光依次透 过接收透镜411和光阑后只要能够射到光电转换区域的任意位置即可,光阑的透射光路上只需设置一个光电转换件421,无需快速精确控制反射光的反射方向,也就是说,无需使反射光照射至光电转换组件420的某个特定位置,从而显著降低了整个激光系统100的复杂程度。
在接收视场102包括多个条形的连续区域的情况下:为了使接收视场102具有多个条形的连续区域,光电转换组件420可以采用如下结构形式,例如:
形式一、如图5所示,光电转换组件420包括光电单元阵列423和至少一个光学元件422;其中,光学元件422位于光接收组件410与光电单元阵列423之间,光电单元阵列423包括多个沿预设方向依次设置的光电转换单元424;光学元件422用于将光接收组件410射向相邻两个光电转换单元424之间的第元件一光信号偏转方向后射向光电转换单元424,光电转换单元424用于将第一光信号转换为第一电信号。其中,预设方向与接收视场102的长度方向相适应。需要说明的是,此处“预设方向与接收视场102的长度方向相适应”与上文类似,也就是说,预设方向是否平行于接收视场102的长度方向取决于反射光在照射至光电单元阵列423之前是否被光扫描组件300和/或光接收组件410偏转方向。其中,光电转换单元424可以但不限于是APD(Avalanche Photo Diode,全称为雪崩光电二极管)、SPAD(Single Photon Avalanche Diode,全称为单光子雪崩二极管)、SIPM(Silicon photomultiplier,全称为硅光电倍增管)、PIN二极管和PD(Photo-Diode),全称为光电二极管)中的至少一种。光电转换单元424的光敏材料包括Si、GaAs、InP和InGaAs中至少一种。其中,光学元件422可以但不限于包括微透镜阵列、至少一个光阑、光锥和光导器中的至少一种。
以接收视场102的长度方向为竖直方向、光学元件422为微透镜阵列为例,如图5所示,光接收组件410为接收透镜411,微透镜阵列(图中未示出)位于光电转换件421与接收透镜411之间,微透镜阵列的长度方向平行于竖直方向。从图5中目标场景700内虚线矩形 框所限定的区域射出的至少部分反射光透过接收透镜411后射向微透镜阵列,因微透镜阵列的存在,射向相邻两个光电转换单元424之间的第一光信号经过微透镜阵列中对应微透镜的折射后偏转方向射向附近的光电转换单元424。
形式二、光电转换组件420包括光电单元阵列423,光电单元阵列423包括多个沿预设方向依次设置的光电转换单元424,光电转换单元424用于将第一光信号转换为第一电信号。其中,预设方向与接收视场102的长度方向相适应。在此情况下,接收端组件400还包括电放大模块430,电放大模块430的数量少于光电单元阵列423的光电转换单元424的数量,至少两个光电转换单元424的输出端连接至同一个电放大模块430的输入端。
以接收视场102的长度方向为竖直方向为例,由于光电单元阵列423包括多个沿竖直方向依次设置的光电转换单元424,因此从图5中目标场景700内虚线矩形框所限定的区域中任意位置射出的反射光穿过接收透镜411后至少部分能够照射至多个光电转换单元424,这些光电转换单元424将接收到的第一光信号转换为第一电信号。假设,这些光电转换单元424的输出端均连接至同一个电放大模块430的输入端,那么若第一电信号为脉冲电信号,这些光电转换单元424转换生成的所有第一电信号依次输入对应的电放大模块430时就会形成一个连续的电波信号,该电波信号经过电放大模块430放大后形成波形连续的第二电信号。由此,本公开实施例中接收视场102的其中一个连续区域即为图5中虚线矩形框所限定的区域。
形式三、光电转换组件420包括光电单元阵列423,光电单元阵列423包括多个沿预设方向依次设置的光电转换单元424,光电转换单元424用于将第一光信号转换为第一电信号。其中,预设方向与接收视场102的长度方向相适应。在此情况下,接收端组件400还包括电放大模块430,电放大模块430的数量大于或等于光电单元阵列的光电转换单元424的数量;每个光电转换单元424的输出端与至少一个电放大模块430的输入端电连接,至少两个与不同的光电转换单元424的连接的电放大模块430的输出端相互连接形成总输出端。下面 以电放大模块430的数量与光电单元阵列的光电转换单元424的数量相同为例,假设各个光电转换单元424分别与不同的电放大模块430的输入端电连接,也即,电放大模块430与光电单元阵列423的光电转换单元424一一对应时,至少两个电放大模块430的输出端相互连接形成总输出端。由此,每当第一电信号为脉冲电信号时,各个光电转换单元424生成的第一电信号依次分别输入对应的电放大模块430。对于输出端相互连接的至少两个电放大模块430来说,输入这些电放大模块430的第一电信号经过对应的电放大模块430放大后的脉冲电信号依次从总输出端输出,由此从总输出端输出的信号便可形成波形连续的第二电信号。可见,本公开实施例中接收视场102的其中一个连续区域即为图5中虚线矩形框所限定的区域。
在一些实施例中,光接收组件410包括至少一组透镜组,透镜组包括至少一个位于反射光的光路上的接收透镜411。在光接收组件410包括多组透镜组的情况下,多组透镜组沿指定方向方向依次设置。由此,从接收视场102的任意位置射出的至少部分反射光能够照射至其中至少一组透镜组,反射光穿过该透镜组的接收透镜411并最终被光电转换组件420转换为第一电信号。当接收视场102的长度方向平行于竖直方向时,接收透镜411的镜面既可以平行于竖直方向,也可以与竖直方向之间呈一定夹角例如接收透镜411的镜面可相对竖直方向倾斜45°。
如图2所示,本公开实施例中的激光系统100还包括扫描控制件、光扫描组件300和处理装置500;其中,扫描控制件生成扫描控制信号;光扫描组件300根据扫描控制信号将光发射组件200射出的发射光偏转方向后照射至目标场景700内的至少一个目标物体600,和/或将至少一个目标物体600反射的至少一组反射光偏转方向以被接收端组件400接收;处理装置500分别与光发射组件200、扫描控制件和接收端组件400电连接,处理装置500用于根据发射信号和/或输出信号确定目标物体600的距离、目标物体600的方向角度、目标物体600的反射率和目标物体600的轮廓中的至少一个。
为了扩大光扫描组件300的扫描范围,光扫描组件300包括多个 沿发射光的光路依次设置的光扫描件,相邻两个光扫描件中其中一个光扫描件将发射光偏转方向后射向另外一个光扫描件;其中,至少两个光扫描件的扫描方式不同;其中,扫描方式包括光扫描件的反射面的面积、扫描方向、扫描角度范围、扫描频率和扫描维度中的至少一个。其中,光扫描组件300的扫描维度可以但不限于是一维或二维。
下面以二维扫描为例,如图3所示,多个光扫描件包括第一扫描件310和第二扫描件320,第一扫描件310的扫描方向和第二扫描件320的扫描方向不同,上文中指定方向为第一扫描方向,具体地,第一扫描件310在本帧扫描时长内将多组发射光沿着第二扫描方向依次偏转方向后射向第二扫描件320;第二扫描件320将经过第一扫描件310偏转的发射光沿第一扫描方向偏转后射向目标物体600;其中,第二扫描方向平行于接收视场102的长度方向,第一扫描方向与第二扫描方向不同向。相比于直接采用价格昂贵的二维扫描件例如2D的MEMS振镜330来说,本公开实施例通过采用两个一维扫描件即第一扫描件310和第二扫描件320进行复合扫描便能在降低成本的前提下实现二维扫描、扩大光扫描组件300的扫描范围。
在一些实施例中,第一扫描件310和第二扫描件320可以但不限于包括MEMS振镜、旋转棱镜、旋转楔镜、光学相控阵列、光电偏转器件和液晶扫描件中的至少一个;所述液晶扫描件包括液晶空间光调制器、液晶超晶面、液晶线控阵、透视式一维液晶阵列、透射式二维液晶阵列或液晶显示模组。第一扫描方向和第二扫描方向可以但不限于是水平方向、竖直方向或倾斜方向;其中,倾斜方向介于竖直方向与水平方向之间。
例如,如图6所示,第一扫描件310为MEMS振镜330,第二扫描件320为旋转镜360,第一扫描方向为水平方向,第二扫描方向为竖直方向。其中,旋转镜360可以但不限于是旋转棱镜或旋转楔镜。由于,MEMS振镜330的扫描频率很快,旋转镜360的扫描频率较慢,因此MEMS振镜330将多组发射光快速沿竖直方向依次偏转方向后射向旋转镜360,旋转镜360再将接收到的多组发射光沿水平方向依次偏转方向后以很大的水平扫描角度射向目标物体600,也就是说,多 组发射光经旋转镜360偏转方向后射向目标物体600的轨迹在水平面围设形成一个圆心角很大的扇形面。由此,整个光扫描组件300便能实现垂直高频扫描+水平广角扫描,从而不仅可以提高扫描分辨率,而且还可以增大反射光的接收面积,也就是说,目标物体600的反射光经过旋转镜360的反射,可以增大该反射光穿过光接收组件410后照射在光电转换组件420的面积。另外,由于旋转镜360的成本远低于MEMS振镜330的成本,而MEMS振镜330扫描速度更快,因此通过依次利用MEMS振镜330和旋转镜360偏转发射光不仅能够以较低的成本扩大光接收组件410的接收视场角,而且还可以提高激光系统对目标场景700的分辨能力
当然,为了实现二维扫描,光扫描组件300还可以采用其他结构形式:
例如,如图9所示,光扫描组件300包括MEMS振镜330和光学相控阵列340,光学相控阵列340固定于MEMS振镜330的反射面,光学相控阵列340的进光口通过线缆与光发射组件200连接,光学相控阵列340的出光口朝向目标物体600。其中,光学相控阵列340(Optical Phased Array,简称OPA)包括多个呈阵列分布的波导,波导的材质包括硅晶体、氧化硅和氮化硅中的至少一个。由此,光学相控阵列340响应于扫描控制件生成的扫描控制信号使得各个波导的相对馈电相位随之发生改变进而产生相位差,相位差的存在又会使发射光发生干涉进而改变光路方向。与此同时,MEMS振镜330响应于扫描控制信号其镜面发生微小的平动和扭转往复运动,而由于光学相控阵列340固定于MEMS振镜330的反射面,因此光学相控阵列340整体随MEMS振镜330的镜面同步运动,由此便可使光学相控阵列340实现全方位扫描。可见,本公开实施例通过将光学相控阵列340固定在MEMS振镜330的反射面,就可使光学相控阵列340基于相位差进行扫描的同时自身整体也发生转动,从而便可扩大光扫描组件300的扫描范围、提高扫描速率。
又如,如图10所示,光扫描组件300包括MEMS振镜330和光栅阵列350,光栅阵列350固定于MEMS振镜330的反射面;其中, 发射信号还包括表示每组发射光的波长的波长信息,发射光的偏转方向基于波长信息确定。由于光栅阵列350反射光束的方向与入射光束的波长相关,因此光发射组件200根据波长信息射出指定波长的发射光,发射光照射至光栅阵列350后则按照相应的方向反射出去。与此同时,MEMS振镜330响应于扫描控制信号其镜面发生微小的平动和扭转往复运动,而由于光栅阵列350固定于MEMS振镜330的反射面,因此光栅阵列350整体随MEMS振镜330的镜面同步运动,由此便可使光栅阵列350实现全方位扫描。可见,本公开实施例通过将光栅阵列350固定在MEMS振镜330的反射面,就可使光栅阵列350基于发射光的波长改变发射光光路方向的同时自身整体也发生转动,从而便可扩大光扫描组件300的扫描范围、提高扫描速率。
在一些实施例中,光扫描组件300还被配置为将目标物体600反射的反射光偏转方向的同时生成当前扫描角度信号。当前扫描角度信号可为水平扫描角度信号:例如,在光扫描组件300包括旋转镜360的情况下,旋转镜360上设置有码盘。码盘实时检测旋转镜360当前的水平扫描角度,并将检测结果即当前扫描角度信号发送给处理装置500。又如,在光扫描组件300包括MEMS振镜330的情况下,MEMS振镜330上设置有扭矩检测器。扭矩检测器实时检测MEMS振镜330的扭矩,并将MEMS振镜330的扭矩转换为当前扫描角度信号后发送给处理装置500。处理装置500还被配置为根据发射信号、扫描控制信号、当前扫描角度信号、输出信号以及光电转换组件420上输出第一电信号的位置中的至少一个确定发射光照射至目标物体600的照射角度。例如,在光电转换组件420包括多个光电转换单元424的情况下,“光电转换组件420上输出第一电信号的位置”一般指代的是输出第一电信号的光电转换单元424所在的位置。
如图7所示,为了扩大该激光系统100的应用领域,使其能够应用于AR、VR和元宇宙领域,多组发射光包括至少一组第一发射光和至少一组第二发射光,第一发射光的发射时刻早于第二发射光的发射时刻,第一发射光经对应的目标物体600反射后的反射光被转换为输出信号,第二发射光为可见光,也就是说,第一发射光用于测量距离、 反射率或轮廓中的至少一个,第二发射光用于投影图像。光扫描组件300被配置为将第一发射光照射至多个目标物体600后,根据距离、照射角度、反射率和轮廓中的至少一个将第二发射光按照预设效果投影于多个目标物体600中的其中一个目标物体600的表面。由于第二发射光是根据目标物体600的距离、照射角度、目标物体600的反射率和目标物体600的轮廓中的至少一个投影在目标物体600表面的,因此第二发射光在目标物体600表面的成像可以再现真实图像。
例如,目标物体600的表面为球面时,光发射组件200通过探头组件先发射至少一组第一发射光至目标物体600表面,然后再射出至少一组第二发射光。处理装置500根据与第一发射光对应的发射信号和/或输出信号确定目标物体600的距离、目标物体600的反射率和目标物体600的轮廓中的至少一个,与此同时,处理装置500还根据扫描控制信号、当前扫描角度信号、输出信号以及光电转换组件420上输出第一电信号的位置中的至少一个确定发射光照射至目标物体600的照射角度。之后,光扫描组件300根据处理装置500基于第一发射光确定的目标物体600的距离、照射角度、目标物体600的反射率和目标物体600的轮廓中的至少一个,将第二发射光例如昆虫图像投影在目标物体600的表面。由于,第二发射光是根据目标物体600的距离、照射角度、目标物体600的反射率和目标物体600的轮廓中的至少一个投影在目标物体600的表面的,因此昆虫的图像并未被目标物体600的曲面扭曲,而是按照一定的曲率覆盖在目标物体600的曲面,使得目标物体600真实还原了昆虫。其中,第二发射光可以但不限于包括红光、蓝光和绿光中的至少一种。
又如,目标物体600为汽车挡风玻璃或AR眼镜时,光扫描组件300先将第一发射光投影在汽车挡风玻璃或AR眼镜上,然后再根据目标物体600的距离、照射角度、目标物体600的反射率和目标物体600的轮廓中的至少一个,将预设的虚拟AR图像也即第二发射光投影在汽车挡风玻璃或AR眼镜上,以使用户能够看到增强后的现实世界和虚拟世界的景象。
当然,光扫描组件300也可以直接将第一发射光和第二发射光分 别投影在两个不同的目标物体600的表面,在此情况下该激光系统100相当于一个普通的投影设备。
在一些实施例中,当前扫描角度信号包括第一扫描角度信号;其中,第一扫描角度信号为光扫描组件300将反射光沿第一扫描方向偏转时生成的扫描角度信号;处理装置500被配置为根据第一扫描角度信号确定照射角度沿第一扫描方向的分量,与此同时,处理装置500还根据扫描控制信号、当前扫描角度信号、输出信号以及光电转换组件420上输出第一电信号的位置中的至少一个确定照射角度沿第二扫描方向的分量;其中,指定方向为第一扫描方向。例如,第二扫描件320为旋转镜360,由于旋转镜360的扫描频率较慢,因此第二扫描件320根据扫描控制信号偏转指定角度后反馈第一扫描角度信号至处理装置500,处理装置500根据第一扫描角度信号就可确定出目标物体600的照射角度沿第一扫描方向的分量。当然,当第一扫描件310的扫描频率较慢时,当前扫描角度信号包括第二扫描角度信号,处理装置500也可以直接通过第二扫描角度信号确定照射角度沿第二扫描方向的分量。
在一些实施例中,激光系统100还包括通讯部件,通讯部件用于向外界传送指定信息和/或接收外界信息;其中,指定信息包括目标物体的距离、目标物体的反射率、目标物体的方向角度、目标物体的轮廓和照射角度中的至少一个。
处理装置500还被配置为根据目标参数确定目标物体600的三维融合图像、目标物体的超像素802、接收视场的超像素803、第一指定规律和第二指定规律中的至少一个;其中,目标参数包括发射信号、扫描控制信号、当前扫描角度信号、输出信号、光电转换组件420上输出第一电信号的位置和外界信息中的至少一个。需要说明的是,“目标物体的超像素802”一般指代的是构成目标物体的图像的所有像素点中的多个像素点的集合;“接收视场的超像素803”一般指代的是构成接收视场的图像的所有像素点中多个像素点的集合。其中,目标物体的超像素802的形状和接收视场的超像素803的形状可以但不限于包括直线、多边形、圆形和椭圆形中的至少一个。
在一些实施例中,激光系统100还包括图像传感器,图像传感器用于获取目标场景700的二维图像;目标参数包括二维图像。通讯部件还用于向外界传送目标物体的超像素802。
下面以确定接收视场的超像素803为例,如图11所示,光发射组件200在本帧扫描时长内依次射出多组发射光,光扫描组件300根据扫描控制信号将光发射组件200射出的多组发射光依次偏转方向后照射至目标场景700内的目标物体600。处理装置500根据发射信号、扫描控制信号、当前扫描角度信号、输出信号、光电转换组件420上输出第一电信号的位置和外界信息中的至少一个确定目标物体的三维点云图像801,处理装置500将目标物体600的三维点云图像801进行超像素分割,将其分解为图11中的13个目标物体的超像素802,进而在根据这13个目标物体的超像素802生成两个接收视场的超像素803。下一帧扫描时长内,接收视场102先位于上方的接收视场的超像素803所在位置,光发射组件200先朝接收视场102依次射出多组发射光,以使各组发射光在目标场景700内的投射区域也即各个发射视场101呈点阵分布并恰好位于接收视场102内;接着,接收视场102按照第一指定规律移动至下方的接收视场的超像素803所在位置,光发射组件200再朝当前的接收视场102依次射出多组发射光,以使各个发射视场101呈点阵分布并恰好位于该接收视场102内。
在一些实施例中,接收端组件400包括光接收组件410和光电转换组件420;其中,光接收组件410依次接收经目标物体600反射的多组反射光并将多组反射光依次转换为对应的第一光信号;光电转换组件420将多个第一光信号依次转换为对应的第一电信号。在此情况下,第一电信号作为输出信号。
当然,考虑到第一电信号的信号强度可能较弱,为了提高测量的准确性,在一些实施例中,接收端组件400则包括光接收组件410、光电转换组件420和电放大模块430;其中,光接收组件410依次接收经目标物体600反射的多组反射光并将多组反射光依次转换为对应的第一光信号;光电转换组件420将多个第一光信号依次转换为对应的第一电信号,电放大模块430用于将第一电信号放大为第二电信号。 在此情况下,第二电信号作为输出信号。
此外,需要说明的是,处理装置500可以基于多种方法来确定目标物体600的距离、目标物体600的反射率和目标物体600的轮廓中的至少一个,例如,处理装置500可以基于飞行时间法、相位法测距或三角式测距法等方法来确定目标物体600的距离。
在处理装置500基于飞行时间法确定目标物体600的距离的情况下,处理装置500包括处理器、至少一个比较器以及与比较器一一对应的时长确定模块。其中,电放大模块430包括多个相互串联或并联的放大器,多个放大器中至少其中一个放大器输出的放大电信号的强度小于另外一个放大器输出的放大电信号的强度的一半。其中,至少输出最大的放大电信号的放大器的输出端连接至少一个比较器的输入端,比较器的比较输入与放大器一一对应。例如,当多个放大器依次串联时,最后一级放大器输出的放大电信号最大,若比较器的数量为一个,那么在比较器的数量为一个的情况下,这个比较器通过最后一级放大器与时长确定模块连接;当比较器的数量为多个时,多个放大器的输出端均连接有比较器,且每个比较器的比较输入的电压值不同。比较器接入比较输入,用于将比较输入的电压值与对应放大器输出的电信号进行比较,以确定触发起始时刻、触发结束时刻和脉冲宽度;其中,触发起始时刻和触发结束时刻分别为放大器输出的电信号的强度高于比较输入的电压值的起始时刻和终止时刻,脉冲宽度为触发结束时刻与触发起始时刻的差值;时长确定模块与比较器一一对应;时长确定模块用于根据发射起始时刻与对应比较器输出的触发起始时刻确定光飞行时长。处理器根据光飞行时长、脉冲宽度、第二电信号的强度和光速中的至少一个确定距离、反射率和轮廓中的至少一个。
以测量目标物体600的距离为例,在此情况下处理器根据飞行时间法确定目标物体600的距离。由于触发起始时刻受到比较输入的电压值大小的影响,而触发放大器输出的电信号的比较输入的电压值不同时对应的脉冲宽度也不同,因此为了减小上述影响处理器先根据脉冲宽度修正光飞行时长,然后再根据光速和修正后的光飞行时长确定目标物体600的距离。
其中,比较输入可以是从外部输入比较器的动态电压曲线,也可以是预存在比较器内的动态电压曲线。此外,时长确定模块可以但不限于是TDC(时间数字转换器,全称为时间数字转换器)。时长确定模块与处理器可以均为独立部件,也可以集成为一个部件。
在一些实施例中,本公开实施例中的激光系统100还包括主壳体和至少一个探头壳体,探头壳体与主壳体分体设置,探头壳体与目标场景一一对应。其中,主壳体内设置有光发射组件200、扫描控制件和处理装置500;探头壳体内设置有光接收组件410和光扫描组件300;其中,光电转换组件420设于主壳体或探头壳体。
由于本公开实施例中探头壳体与主壳体分体设置,因此探头壳体和主壳体可以分开固定安装,相比于整个激光系统100来说,探头壳体的体积很小,探头壳体能够安装在小体积的应用对象或应用位置上。以应用对象是盲人眼镜为例,探头壳体可以固定在盲人眼镜的镜架上,主壳体夹持在用户的腰部或者放置在用户的衣服口袋内。再以应用对象是汽车的后视镜为例,探头壳体可以固定在汽车的后视镜上,主壳体则固定在汽车的天花板。可见,安装该激光系统100时便只需将探头壳体安装于应用对象或应用位置,而无需将整个激光系统100安装在应用对象或应用位置上,从而便可扩大激光系统100的适用范围。此外,由于向目标物体600射出发射光的光扫描组件300以及接收目标物体600的反射光的光接收组件410均设置在探头壳体上,而探头壳体安装在应用对象或应用位置上,因此可以保证整个激光系统100的探测范围不受影响。
在探头壳体的数量为多个的情况下,各个探头壳体内的光扫描组件300可分别将对应的发射光照射至不同目标场景700内的目标物体600。
在一些实施例中,光发射组件200通过第一光纤与光扫描组件300连接,光接收组件410通过第二光纤与光电转换组件420连接,处理装置500通过线缆分别与光发射组件200、扫描控制件、光电转换组件420和光扫描组件300电连接。需要说明的是,上述各个部件除了可以借助光纤或线缆实现光/电连接以外,还可以借助其他光学元件 422和无线通信元件通过空间传送电信号和/或光信号来实现光/电连接。
在一些实施例中,激光系统100还包括显示部件和/或提示部件;其中,显示部件用于显示目标物体600的距离、照射角度、目标物体600的反射率和目标物体600的轮廓中的至少一个;提示部件用于根据目标物体600的距离、照射角度、目标物体600的反射率和目标物体600的轮廓中的至少一个输出提示信号。其中,提示部件可以但不限于是麦克风或震动器。
在一些实施例中,接收端组件400还包括偏置电压模块。其中,偏置电压模块提供动态偏置电压;动态偏置电压的绝对值从发射起始时刻起按照第一预设规律在第一预设时长变化至第一预定阈值、并保持不小于第一预定阈值第二预设时长,且动态偏置电压的绝对值在第一预设时长内小于第一预定阈值;其中,光电转换组件420用于根据动态偏置电压将第一光信号依次转换为对应的第一电信号;第一预设时长小于发射起始时刻与接收时刻的最大差值,接收时刻为反射光被接收端组件400接收的时刻。
若目标物体600距离光发射组件200较远,那么相比于光发射组件200射出的发射光来说,光接收组件410接收到的反射光的光强显著衰减。由于动态偏置电压的绝对值从发射起始时刻起在第一预设时长变化至第一预定阈值、并保持不小于第一预定阈值第二预设时长,而由上文可知发射光经远距离目标物体600反射回来的耗时较长,因此光接收组件410接收反射光的时刻对应的动态偏置电压的绝对值不小于第一预定阈值,从而光电转换单元424根据该动态偏置电压就可将较弱的光信号转换为较强的第一电信号。
同理,若目标物体600距离光发射组件200较近,那么相比于光发射组件200射出的发射光来说,光接收组件410接收到的反射光的光强衰减较少。由于动态偏置电压的绝对值从发射起始时刻起在第一预设时长内小于第一预定阈值,而由上文可知发射光经近距离目标物体600反射回来的耗时较短,因此光接收组件410接收反射光的时刻对应的动态偏置电压的绝对值小于第一预定阈值,从而光电转换单元 424根据该动态偏置电压就可将较强的光信号转换为相对较弱的第一电信号,以避免较强的光信号经光电转换放大后饱和失真。
由上可知,本公开实施例中的雷达系统基于光束在传播过程中其强度随传播距离即传播时间的增大而衰减的原理,通过采用随时间变化的动态偏置电压,在光电转换过程中就可使自远距离目标物体600反射回来的反射光对应绝对值较大的动态偏置电压也即该动态偏置电压的绝对值不小于第一预定阈值,使自近距离目标物体600反射回来的反射光对应绝对值减小的动态偏置电压也即该动态偏置电压的绝对值小于第一预定阈值,从而不仅可以提高近距离的测量精度、避免近距离反射光束经光电转换放大后饱和失真,而且又不影响远距离的探测能力。
在一些实施例中,动态偏置电压的绝对值从第一调整时刻起按照第二预设规律在第三预设时长变化至第二预定阈值、并保持不小于第二预定阈值第四预设时长,且动态偏置电压的绝对值在第三预设时长内小于第二预定阈值;其中,第一调整时刻早于接收时刻;处理装置500还被配置为根据发射信号、扫描控制信号、当前扫描角度信号、输出信号和光电转换组件420上输出第一电信号的位置中的至少一个确定调整时刻。
如图12所示,本公开实施例还提供了一种激光测量方法,该方法包括:
S100、生成发射信号并根据发射信号在本帧扫描时长内依次射出多组发射光;
S200、将发射光经过目标场景700中的至少一个目标物体600反射后的至少一组反射光转换为输出信号;其中,输出信号的类型为电信号;
S300、根据发射信号和/或输出信号确定目标物体600的距离、目标物体600的反射率和目标物体600的轮廓中的至少一个;
其中,在本帧扫描时长内,接收视场102在目标场景700内的位置按照第一指定规律变化和/或接收视场102的形状按照第二指定规律变化;自对应发射光发出的发射起始时刻起,发射视场101在预设接 收时长内位于当前的接收视场102中,且接收视场102的面积大于或等于发射视场101的面积的两倍。其中,第一指定规律包括沿指定方向变化;发射视场101为每组发射光在目标场景700的投射区域,接收视场102为在预设接收时长内能够被转换为输出信号的所有光束在目标场景700内对应的区域。
步骤S200包括:
S210、依次接收经目标物体600反射的多组反射光并将多组反射光依次转换为对应的第一光信号;
S220、将多个第一光信号依次转换为对应的第一电信号。
在执行步骤S100之后,并在执行步骤S200之前,该激光测量方法还包括:
S110、生成扫描控制信号;
S120、根据扫描控制信号将发射光偏转方向后照射至目标场景700内的至少一个目标物体600,和/或将至少一个目标物体600反射的至少一组反射光偏转至接收方向。
进一步地,步骤S120包括:
S121、在本帧扫描时长内将多组发射光沿着第二扫描方向依次偏转方向;
S122、将沿第二扫描方向偏转后的发射光沿第一扫描方向偏转后射向目标物体600;其中,第二扫描方向平行于接收视场102的长度方向,第一扫描方向与第二扫描方向不同向,指定方向为第一扫描方向。
在一些实施例中,在执行步骤S110之后,该激光测量方法还包括:
将目标物体600反射的反射光偏转方向的同时生成当前扫描角度信号;
根据发射信号、扫描控制信号、当前扫描角度信号、输出信号以及第一光信号的转换位置中的至少一个确定发射光照射至目标物体600的照射角度。
步骤S100包括:在本帧扫描时长内依次射出至少一组第一发射光和至少一组第二发射光;第一发射光的发射时刻早于第二发射光的发 射时刻;其中,第二发射光为可见光;
步骤S200包括:将第一发射光经对应的目标物体600反射后的反射光转换为输出信号。
在一些实施例中,步骤S120中根据扫描控制信号将发射光偏转方向后照射至目标场景700内的至少一个目标物体600,包括:根据扫描控制信号将第一发射光照射至多个目标物体600后,根据距离、照射角度、反射率和轮廓中的至少一个将第二发射光按照预设效果投影于多个目标物体600中的其中一个目标物体600的表面。这样设置的好处在于,将预设的虚拟AR图像也即第二发射光投影在目标物体600上,以使用户能够看到增强后的现实世界和虚拟世界的景象。
在一些实施例中,步骤S120中根据扫描控制信号将发射光偏转方向后照射至目标场景700内的至少一个目标物体600,包括:根据所述扫描控制信号将所述发射光偏转方向后,将第一发射光和第二发射光分别照射至两个不同的目标物体600。在此情况下,该步骤相当于一个普通的投影操作。
以上描述仅为本公开的实施方式以及对所运用技术原理的说明。本领域技术人员应当理解,本公开中所涉及的保护范围,并不限于上述技术特征的特定组合而成的技术方案,同时也应涵盖在不脱离技术构思的情况下,由上述技术特征或其等同特征进行任意组合而形成的其它技术方案。例如上述特征与本公开中公开的(但不限于)具有类似功能的技术特征进行互相替换而形成的技术方案。

Claims (50)

  1. 一种激光系统,其特征在于,包括:
    光发射组件,生成发射信号并根据所述发射信号在本帧扫描时长内依次射出多组发射光;其中,所述发射信号包括表示每组所述发射光的发射起始时刻的时刻信息;
    接收端组件,将所述发射光经过目标场景中的至少一个目标物体反射后的至少一组反射光转换为输出信号;其中,所述输出信号的类型为电信号;
    其中,在所述本帧扫描时长内,所述接收端组件的接收视场在所述目标场景内的位置按照第一指定规律变化和/或所述接收视场的形状按照第二指定规律变化;自对应所述发射光发出的发射起始时刻起,所述光发射组件的发射视场在预设接收时长内位于当前的所述接收视场中,且所述接收视场的面积大于或等于所述发射视场的面积的两倍;其中,所述第一指定规律包括沿指定方向变动;所述发射视场为每组所述发射光在所述目标场景内的投射区域,所述接收视场为在所述预设接收时长内所述接收端组件能够接收到的所有光束在所述目标场景内对应的区域。
  2. 根据权利要求1所述的激光系统,其中,所述接收视场包括至少一个条形的连续区域,所述本帧扫描时长内多组所述发射光对应的所述发射视场呈点阵分布,所述点阵的宽度方向与所述指定方向平行。
  3. 根据权利要求2所述的激光系统,其中,所述连续区域为曲形区域。
  4. 根据权利要求2或3所述的激光系统,其中,所述发射视场的面积与所述接收视场的面积之比小于第一比例阈值,所述第一比例阈值为0.5、0.1、0.01、或者0.001。
  5. 根据权利要求4所述的激光系统,其中,至少一个所述连续区域的最大宽度和总长度之比小于所述第一比例阈值。
  6. 根据权利要求2所述的激光系统,其中,自相邻两组所述发射光中前一组所述发射光发出的发射起始时刻起至后一组所述发射光发出所述预设接收时长后止,沿所述点阵的长度方向的相邻两个所述发射视场之间的方向角度变化幅度与所述接收视场的方向角度变化幅度之比大于第二比例阈值,所述第二比例阈值为1、10、100、10000或者1000000。
  7. 根据权利要求1至3任一项所述的激光系统,其中,所述发射视场的面积与所述目标场景的面积之比小于第三比例阈值,所述第三比例阈值为0.1、0.01、0.001、0.0001或0.0001。
  8. 根据权利要求1至3任一项所述的激光系统,其中,所述发射光包括多个光脉冲,所述发射光中的至少两个所述光脉冲的夹角大于预设夹角;其中,所述预设夹角与所述接收视场的视场角之比小于第四比例阈值,所述第四比例阈值为0.01、0.1、0.3、0.5或0.9。
  9. 根据权利要求1至3任一项所述的激光系统,其中,所述目标场景的面积与所述接收视场的面积之比大于或等于第五比例阈值,所述第五比例阈值为2、4、8、16、100、1000或10000。
  10. 根据权利要求2所述的激光系统,其中,所述接收端组件包括:
    光接收组件,依次接收经所述目标物体反射的多组反射光并将多组所述反射光依次转换为对应的第一光信号;以及
    光电转换组件,将多个所述第一光信号依次转换为对应的第一电信号。
  11. 根据权利要求10所述的激光系统,其中,所述光电转换组件包括:
    光电转换件,具有连续的光电转换区域;
    光学元件,所述光学元件的进光端朝向所述光接收组件,所述光学元件的出光端朝向所述光电转换区域;所述光学元件用于将所述第一光信号选择性射向所述光电转换区域,所述光电转换区域用于将所述第一光信号转换为所述第一电信号。
  12. 根据权利要求10所述的激光系统,其中,所述光电转换组件包括:
    光电单元阵列,包括多个沿预设方向依次设置的光电转换单元;所述光电转换单元用于将所述第一光信号转换为所述第一电信号。
  13. 根据权利要求12所述的激光系统,其中,所述光电转换组件还包括:
    至少一个光学元件,位于所述光接收组件与所述光电单元阵列之间,用于将所述光接收组件射向相邻两个所述光电转换单元之间的所述第一光信号偏转方向后射向所述光电转换单元。
  14. 根据权利要求12所述的激光系统,其中,所述光电转换单元包括APD、SPAD、SIPM、PIN和PD中的至少一种。
  15. 根据权利要求11或13所述的激光系统,其中,所述光学元件包括微透镜阵列、至少一个光阑、光锥和光导器中的至少一种。
  16. 根据权利要求10至14任一项所述的激光系统,其中,所述光接收组件包括至少一组透镜组,所述透镜组包括至少一个位于所述反射光的光路上的接收透镜。
  17. 根据权利要求10至14任一项所述的激光系统,其中,所述 激光系统还包括:
    扫描控制件,生成扫描控制信号;
    光扫描组件,根据所述扫描控制信号将所述光发射组件射出的所述发射光偏转方向后照射至所述目标场景内的至少一个所述目标物体,和/或将至少一个所述目标物体反射的至少一组所述反射光偏转方向以被所述接收端组件接收;
    处理装置,分别与所述光发射组件、所述扫描控制件和所述接收端组件电连接,所述处理装置用于根据所述扫描控制信号、所述发射信号和所述输出信号中的至少一个确定所述目标物体的距离、所述目标物体的反射率、所述目标物体的方向角度、所述目标物体的轮廓中的至少一个。
  18. 根据权利要求17所述的激光系统,其中,所述光扫描组件包括多个沿所述发射光的光路依次设置的光扫描件,相邻两个所述光扫描件中其中一个所述光扫描件将所述发射光偏转方向后射向另外一个所述光扫描件;其中,至少两个所述光扫描件的扫描方式不同;所述扫描方式包括所述光扫描件的反射面的面积、扫描方向、扫描角度范围、扫描频率和扫描维度中的至少一个。
  19. 根据权利要求18所述的激光系统,其中,多个所述光扫描件包括第一扫描件和第二扫描件;所述第一扫描件在本帧扫描时长内将多组所述发射光沿着第二扫描方向依次偏转方向后射向所述第二扫描件;所述第二扫描件将经过所述第一扫描件偏转的所述发射光沿第一扫描方向偏转后射向所述目标物体;
    其中,所述第二扫描方向平行于所述接收视场的长度方向,所述第一扫描方向与所述第二扫描方向不同向,所述指定方向为所述第一扫描方向。
  20. 根据权利要求19所述的激光系统,其中,所述第一扫描方向和所述第二扫描方向为水平方向、竖直方向或倾斜方向;其中,所述 倾斜方向介于所述竖直方向与所述水平方向之间。
  21. 根据权利要求19所述的激光系统,其中,所述第一扫描件和所述第二扫描件包括MEMS振镜、旋转棱镜、旋转楔镜、光学相控阵列、光电偏转器件和液晶扫描件中的至少一个;所述液晶扫描件包括液晶空间光调制器、液晶超晶面、液晶线控阵、透视式一维液晶阵列、透射式二维液晶阵列或液晶显示模组。
  22. 根据权利要求17所述的激光系统,其中,所述光扫描组件包括MEMS振镜和光学相控阵列,所述光学相控阵列固定于所述MEMS振镜的反射面,所述光学相控阵列的进光口通过线缆与所述光发射组件连接,所述光学相控阵列的出光口朝向所述目标物体。
  23. 根据权利要求22所述的激光系统,其中,所述光学相控阵列包括多个呈阵列分布的波导,所述波导的材质包括硅晶体、氧化硅和氮化硅中的至少一个。
  24. 根据权利要求17所述的激光系统,其中,所述光扫描组件包括MEMS振镜和光栅阵列,所述光栅阵列固定于所述MEMS振镜的反射面;其中,所述发射信号还包括表示每组所述发射光的波长的波长信息,所述发射光的偏转方向基于所述波长信息确定。
  25. 根据权利要求17所述的激光系统,其中,所述光扫描组件还被配置为将所述目标物体反射的反射光偏转方向的同时生成当前扫描角度信号;所述处理装置还被配置为根据所述发射信号、所述扫描控制信号、所述当前扫描角度信号、所述输出信号以及所述光电转换组件上输出所述第一电信号的位置中的至少一个确定所述发射光照射至所述目标物体的照射角度。
  26. 根据权利要求25所述的激光系统,其中,多组所述发射光包 括至少一组第一发射光和至少一组第二发射光,所述第一发射光的发射时刻早于所述第二发射光的发射时刻,所述第一发射光经对应的所述目标物体反射后的反射光被转换为所述输出信号,所述第二发射光为可见光;
    其中,所述光扫描组件将所述第一发射光照射至多个所述目标物体后,根据所述距离、所述照射角度、所述反射率和所述轮廓中的至少一个将所述第二发射光按照预设效果投影于多个所述目标物体中的其中一个所述目标物体的表面;或者,
    所述光扫描组件将所述第一发射光和所述第二发射光分别照射至两个不同的所述目标物体。
  27. 根据权利要求26所述的激光系统,其中,所述第二发射光包括红光、蓝光和绿光中的至少一种。
  28. 根据权利要求25所述的激光系统,其中,所述当前扫描角度信号包括第一扫描角度信号;其中,所述第一扫描角度信号为所述光扫描组件将所述反射光沿第一扫描方向偏转时生成的扫描角度信号;
    所述处理装置被配置为根据所述第一扫描角度信号确定所述照射角度沿所述第一扫描方向的分量,以及根据所述扫描控制信号、所述当前扫描角度信号、所述输出信号以及所述光电转换组件上输出所述第一电信号的位置中的至少一个确定所述照射角度沿第二扫描方向的分量;其中,所述指定方向为所述第一扫描方向。
  29. 根据权利要求25所述的激光系统,其中,所述激光系统还包括通讯部件,所述通讯部件用于向外界传送指定信息和/或接收外界信息;其中,所述指定信息包括所述目标物体的距离、所述目标物体的反射率、所述目标物体的方向角度、所述目标物体的轮廓和所述照射角度中的至少一个。
  30. 根据权利要求29所述的激光系统,其中,所述处理装置还被 配置为根据目标参数确定所述目标物体的三维融合图像、所述目标物体的超像素、所述接收视场的超像素、所述第一指定规律和所述第二指定规律中的至少一个;
    其中,所述目标参数包括所述发射信号、所述扫描控制信号、所述当前扫描角度信号、所述输出信号、所述光电转换组件上输出所述第一电信号的位置和所述外界信息中的至少一个。
  31. 根据权利要求30所述的激光系统,其中,所述激光系统还包括图像传感器,所述图像传感器用于获取所述目标场景的二维图像;所述目标参数包括所述二维图像。
  32. 根据权利要求30所述的激光系统,其中,所述指定信息还包括所述目标物体的超像素。
  33. 根据权利要求17所述的激光系统,其中,所述接收端组件还包括电放大模块,所述电放大模块用于将所述第一电信号放大为第二电信号。
  34. 根据权利要求33所述的激光系统,其中,所述光电转换组件包括光电单元阵列,所述电放大模块的数量少于所述光电单元阵列的光电转换单元的数量,至少两个所述光电转换单元的输出端连接至同一个所述电放大模块的输入端。
  35. 根据权利要求33所述的激光系统,其中,所述光电转换组件包括光电单元阵列,所述电放大模块的数量大于或等于所述光电单元阵列的光电转换单元的数量;每个所述光电转换单元的输出端与至少一个所述电放大模块的输入端电连接,至少两个与不同的所述光电转换单元的连接的所述电放大模块的输出端相互连接形成总输出端。
  36. 根据权利要求33所述的激光系统,其中,所述电放大模块包 括多个相互串联或并联的放大器,多个所述放大器中至少其中一个所述放大器输出的放大电信号的强度小于另外一个所述放大器输出的放大电信号的强度的一半。
  37. 根据权利要求36所述的激光系统,其中,所述处理装置包括:
    至少一个比较器;至少输出最大放大电信号的所述放大器的输出端连接至少一个所述比较器的输入端,所述比较器的比较输入与所述放大器一一对应;所述比较器用于将所述比较输入的电压值与对应所述放大器输出的电信号进行比较,以确定触发起始时刻、触发结束时刻和脉冲宽度;其中,所述触发起始时刻和所述触发结束时刻分别为所述放大器输出的电信号的强度高于所述比较输入的电压值的起始时刻和终止时刻,所述脉冲宽度为所述触发结束时刻与所述触发起始时刻的差值;
    时长确定模块,与所述比较器一一对应;所述时长确定模块用于根据所述发射起始时刻与对应所述比较器输出的所述触发起始时刻确定光飞行时长;以及
    处理器,根据所述光飞行时长、所述脉冲宽度、所述第二电信号的强度和光速中的至少一个确定所述距离、所述反射率和所述轮廓中的至少一个。
  38. 根据权利要求17所述的激光系统,其中,所述激光系统还包括:
    主壳体,设置有所述光发射组件、所述扫描控制件和所述处理装置;
    至少一个探头壳体,与所述主壳体分体设置;每个所述探头壳体内均设置有所述光接收组件和所述光扫描组件,所述探头壳体与所述目标场景一一对应;
    其中,所述光电转换组件设于所述主壳体或所述探头壳体。
  39. 根据权利要求38所述的激光系统,其中,所述光发射组件通 过第一光纤与所述光扫描组件连接,所述处理装置通过线缆分别与所述光发射组件、所述扫描控制件、所述光电转换组件和所述光扫描组件电连接。
  40. 根据权利要求17所述的激光系统,其中,所述激光系统还包括:
    显示部件,显示所述距离、所述反射率和所述轮廓中的至少一个;和/或
    提示部件,根据所述距离、所述反射率和所述轮廓中的至少一个输出提示信号。
  41. 根据权利要求25所述的激光系统,其中,所述接收端组件还包括:
    偏置电压模块,提供动态偏置电压;所述动态偏置电压的绝对值从所述发射起始时刻起按照第一预设规律在第一预设时长变化至第一预定阈值、并保持不小于所述第一预定阈值第二预设时长,且所述动态偏置电压的绝对值在所述第一预设时长内小于所述第一预定阈值;
    其中,光电转换组件用于根据所述动态偏置电压将所述第一光信号依次转换为对应的第一电信号;所述第一预设时长小于所述发射起始时刻与接收时刻的最大差值,所述接收时刻为所述反射光被所述接收端组件接收的时刻。
  42. 根据权利要求41所述的激光系统,其中,所述动态偏置电压的绝对值从第一调整时刻起按照第二预设规律在第三预设时长变化至第二预定阈值、并保持不小于所述第二预定阈值第四预设时长,且所述动态偏置电压的绝对值在所述第三预设时长内小于所述第二预定阈值;其中,所述第一调整时刻早于所述接收时刻;
    所述处理装置还被配置为根据所述发射信号、所述扫描控制信号、所述当前扫描角度信号、所述输出信号和所述光电转换组件上输出所述第一电信号的位置中的至少一个确定所述调整时刻。
  43. 一种激光测量方法,其特征在于,包括:
    生成发射信号并根据所述发射信号在本帧扫描时长内依次射出多组发射光;
    将所述发射光经过目标场景中的至少一个目标物体反射后的至少一组反射光转换为输出信号;其中,所述输出信号的类型为电信号;
    根据所述发射信号和/或所述输出信号确定所述目标物体的距离、所述目标物体的反射率和所述目标物体的轮廓中的至少一个;
    其中,在所述本帧扫描时长内,接收视场在所述目标场景内的位置按照第一指定规律变化和/或所述接收视场的形状按照第二指定规律变化;自对应所述发射光发出的发射起始时刻起,发射视场在预设接收时长内位于当前的所述接收视场中,且所述接收视场的面积大于或等于所述发射视场的面积的两倍;其中,所述第一指定规律包括沿指定方向变化;所述发射视场为每组所述发射光在所述目标场景内的投射区域,所述接收视场为在所述预设接收时长内能够被转换为所述输出信号的所有光束在所述目标场景内对应的区域。
  44. 根据权利要求43所述的激光测量方法,其中,将所述发射光经过目标场景中的至少一个目标物体反射后的至少一组反射光转换为输出信号,包括:
    依次接收经所述目标物体反射的多组反射光并将多组所述反射光依次转换为对应的第一光信号;
    将多个所述第一光信号依次转换为对应的第一电信号。
  45. 根据权利要求44所述的激光测量方法,其中,执行将所述发射光经过目标场景中的至少一个目标物体反射后的至少一组反射光转换为输出信号的步骤之前,所述激光测量方法还包括:
    生成扫描控制信号;
    根据所述扫描控制信号将所述发射光偏转方向后照射至所述目标场景内的至少一个所述目标物体,和/或将至少一个所述目标物体反射 的至少一组所述反射光偏转至接收方向。
  46. 根据权利要求45所述的激光测量方法,其中,所述根据所述扫描控制信号将所述发射光偏转方向后照射至所述目标场景内的至少一个所述目标物体,包括:
    在本帧扫描时长内将多组所述发射光沿着第二扫描方向依次偏转方向;
    将沿所述第二扫描方向偏转后的所述发射光沿第一扫描方向偏转后射向所述目标物体;
    其中,所述第二扫描方向平行于所述接收视场的长度方向,所述第一扫描方向与所述第二扫描方向不同向,所述指定方向为所述第一扫描方向。
  47. 根据权利要求45所述的激光测量方法,其中,执行生成扫描控制信号的步骤之后,所述激光测量方法还包括:
    将所述目标物体反射的反射光偏转方向的同时生成当前扫描角度信号;
    根据所述发射信号、所述扫描控制信号、所述当前扫描角度信号、所述输出信号以及所述第一光信号的转换位置中的至少一个确定所述发射光照射至所述目标物体的照射角度。
  48. 根据权利要求47所述的激光测量方法,其中,生成发射信号并根据所述发射信号在本帧扫描时长内依次射出多组发射光的步骤,包括:
    在本帧扫描时长内依次射出至少一组第一发射光和至少一组第二发射光;所述第一发射光的发射时刻早于所述第二发射光的发射时刻;其中,所述第二发射光为可见光;
    其中,将所述发射光经过目标场景中的至少一个目标物体反射后的至少一组反射光转换为输出信号的步骤,包括:
    将所述第一发射光经对应的所述目标物体反射后的反射光转换为 所述输出信号。
  49. 根据权利要求48所述的激光测量方法,其中,根据所述扫描控制信号将所述发射光偏转方向后照射至所述目标场景内的至少一个所述目标物体的步骤,包括:
    根据所述扫描控制信号将所述第一发射光照射至多个所述目标物体;以及
    根据所述距离、所述照射角度、所述反射率和所述轮廓中的至少一个将所述第二发射光按照预设效果投影于多个所述目标物体中的其中一个所述目标物体的表面。
  50. 根据权利要求48所述的激光测量方法,其中,根据所述扫描控制信号将所述发射光偏转方向后照射至所述目标场景内的至少一个所述目标物体的步骤,包括:
    根据所述扫描控制信号将所述发射光偏转方向后,将所述第一发射光和所述第二发射光分别照射至两个不同的所述目标物体。
PCT/CN2023/073760 2022-01-30 2023-01-30 激光系统及激光测量方法 WO2023143593A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210113638.0A CN116559825B (zh) 2022-01-30 激光系统及激光测量方法
CN202210113638.0 2022-01-30

Publications (1)

Publication Number Publication Date
WO2023143593A1 true WO2023143593A1 (zh) 2023-08-03

Family

ID=87470746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/073760 WO2023143593A1 (zh) 2022-01-30 2023-01-30 激光系统及激光测量方法

Country Status (1)

Country Link
WO (1) WO2023143593A1 (zh)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110998365A (zh) * 2017-07-05 2020-04-10 奥斯特公司 具有电子扫描发射器阵列和同步传感器阵列的光测距装置
CN110346781A (zh) * 2018-04-02 2019-10-18 探维科技(北京)有限公司 基于多激光束的雷达发射接收装置及激光雷达系统
CN110988842A (zh) * 2018-10-01 2020-04-10 英飞凌科技股份有限公司 激光雷达2d接收器阵列架构
CN113454421A (zh) * 2018-12-26 2021-09-28 奥斯特公司 具有用于增加通道的高侧和低侧开关的固态电子扫描激光器阵列
WO2020232016A1 (en) * 2019-05-13 2020-11-19 Ouster, Inc. Synchronized image capturing for electronic scanning lidar systems
CN112130161A (zh) * 2019-06-07 2020-12-25 英飞凌科技股份有限公司 1d扫描lidar中的发送器和接收器的校准

Also Published As

Publication number Publication date
CN116559825A (zh) 2023-08-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23746470

Country of ref document: EP

Kind code of ref document: A1