US20220187426A1 - Distance detection system and distance detection method - Google Patents
- Publication number: US20220187426A1
- Authority
- US
- United States
- Prior art keywords
- specific
- light
- modulation
- image frames
- polarization direction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4814—Constructional features, e.g. arrangements of optical elements of transmitters alone
- G01S7/4815—Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4861—Circuits for detection, sampling, integration or read-out
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0242—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
Abstract
The disclosure provides a distance detection method and a distance detection system. The distance detection method includes: capturing multiple image frames at multiple timing points based on a field of view, in which the field of view includes an object, and each image frame includes a pixel corresponding to the object; obtaining a first modulation and a second modulation presented by the pixel at the timing points; finding a first specific light-emitting unit based on the first modulation and a second specific light-emitting unit based on the second modulation; and estimating a specific distance between the distance detection system and the object based on the first specific light-emitting unit and the second specific light-emitting unit.
Description
- This application claims the priority benefit of Taiwan application serial no. 109144212, filed on Dec. 15, 2020. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
- The disclosure relates to a technique for distance detection, and in particular, to a distance detection system and a distance detection method.
- In current technology, there are automotive vision systems which can be used to assist driving. However, these automotive vision systems may not exhibit good recognition performance in some cases. For example, in an environment with strong sunlight, there may be shadows cast by structures such as bridges and trees. If the contrast between a shadow on the road surface and the strongly sunlit portion of the road surface is too high, the image recognition function of conventional automotive vision systems may fail to determine the distance between a vehicle and an object such as a road marking or an obstacle, which may lead to collisions or car accidents.
- Similarly, in a dim environment, since it is not easy for conventional automotive vision systems to recognize a road marking, a vehicle, or the outline of an object, collisions or car accidents may occur due to a failure to correctly recognize a distance.
- The disclosure is directed to a distance detection system and a distance detection method.
- The disclosure provides a distance detection system, the distance detection system including a first light source, a second light source, an image capturing circuit, and a processor. The first light source has a first polarization direction and includes multiple first light-emitting units. Each of the first light-emitting units emits a first light to illuminate a specific object based on the first polarization direction and a first modulation of each of the first light-emitting units. The second light source has a second polarization direction and includes multiple second light-emitting units. Each of the second light-emitting units emits a second light to illuminate the specific object based on the second polarization direction and a second modulation of each of the second light-emitting units. The image capturing circuit is configured to capture multiple image frames in a specific field of view of the image capturing circuit at multiple timing points. The specific object is within the specific field of view. Each of the image frames includes a specific pixel corresponding to the specific object. The processor is coupled to the first light source, the second light source, and the image capturing circuit, and is configured to execute the following. A first specific modulation and a second specific modulation presented by the specific pixel at the multiple timing points based on the image frames are obtained. The first specific modulation corresponds to the first polarization direction, and the second specific modulation corresponds to the second polarization direction. A first specific light-emitting unit among the multiple first light-emitting units is found based on the first specific modulation, and a second specific light-emitting unit among the multiple second light-emitting units is found based on the second specific modulation. 
A specific distance between the distance detection system and the specific object is calculated based on the first specific light-emitting unit and the second specific light-emitting unit.
- The disclosure provides a distance detection method which is adapted for a distance detection system. The distance detection method includes the following. A first light is emitted to illuminate a specific object by multiple first light-emitting units of a first light source respectively based on a first polarization direction and a first modulation of each of the first light-emitting units. A second light is emitted to illuminate the specific object by multiple second light-emitting units of a second light source respectively based on a second polarization direction and a second modulation of each of the second light-emitting units. Multiple image frames are captured by an image capturing circuit in a specific field of view at multiple timing points. The specific object is within the specific field of view. Each of the image frames includes a specific pixel corresponding to the specific object. A first specific modulation and a second specific modulation presented by the specific pixel at the multiple timing points are obtained by a processor based on the image frames. A first specific light-emitting unit among the multiple first light-emitting units is found based on the first specific modulation by the processor, and a second specific light-emitting unit among the multiple second light-emitting units is found based on the second specific modulation by the processor. A specific distance between the distance detection system and the specific object is calculated based on the first specific light-emitting unit and the second specific light-emitting unit by the processor.
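The flow of the method above (emit per-unit modulated polarized light, capture frames, recover the two modulations at a pixel, identify the two emitting units, then triangulate) can be pictured with a small numeric sketch. All unit names, modulation codes, angles, and the baseline value below are illustrative assumptions, not values from the disclosure; the distance formula used is the standard triangulation relation for two emitters on a known baseline.

```python
import math

# Illustrative sketch of the method's flow: decode which unit lit the pixel
# from its (polarization, modulation-code) pair, then triangulate using the
# two identified units' assumed geometry.
UNITS = {
    ("horizontal", (1, 0, 1, 0)): {"id": "A1", "angle_deg": 60.0},
    ("vertical", (0, 0, 1, 1)): {"id": "B1", "angle_deg": 60.0},
}
BASELINE_M = 1.2  # assumed distance between the two identified units

def decode(polarization, observed_code):
    # Map a modulation pattern observed under one polarization to its unit.
    return UNITS[(polarization, tuple(observed_code))]

def distance(u1, u2, baseline):
    # Standard triangulation: l = d * sin(a) * sin(b) / sin(a + b).
    a = math.radians(u1["angle_deg"])
    b = math.radians(u2["angle_deg"])
    return baseline * math.sin(a) * math.sin(b) / math.sin(a + b)

u1 = decode("horizontal", [1, 0, 1, 0])   # bright-dark-bright-dark
u2 = decode("vertical", [0, 0, 1, 1])     # dark-dark-bright-bright
print(u1["id"], u2["id"], round(distance(u1, u2, BASELINE_M), 3))  # -> A1 B1 1.039
```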
- Therefore, even in a bright light/dim light environment, the specific distance between the distance detection system and the specific object can still be accurately determined by the method of the disclosure. As a result, the chances of collisions and car accidents are reduced.
- FIG. 1 is a schematic diagram of a distance detection system according to an embodiment of the disclosure.
- FIG. 2 is a flow chart of a distance detection method according to an embodiment of the disclosure.
- FIG. 3 is an application scenario diagram according to an embodiment of the disclosure.
- FIG. 4 is a schematic diagram of a light source adopting a digital micro-mirror device technology according to an embodiment of the disclosure.
- Referring to
FIG. 1, FIG. 1 is a schematic diagram of a distance detection system according to an embodiment of the disclosure. In different embodiments, a distance detection system 100 may be configured to measure a distance between the distance detection system 100 and an object located in a field of view (FOV) of the distance detection system 100 in various devices/scenarios. For ease of description of the concept of the disclosure, it is assumed below that the distance detection system 100 is disposed on a vehicle and is configured to measure a distance between the vehicle and an object located in front of the vehicle; however, the disclosure is not limited thereto. - As shown in
FIG. 1, the distance detection system 100 may include a first light source 101, a second light source 102, an image capturing circuit 103, and a processor 104. In the embodiments of the disclosure, the first light source 101 and the second light source 102 may respectively be a left headlight and a right headlight of the vehicle, which may be configured to illuminate the area in front of the vehicle. In different embodiments, the first light source 101 and the second light source 102 may each be realized as a pixel light device adopting digital light processing (DLP)/digital micro-mirror device (DMD) technology, a matrix light device adopting light-emitting diodes (LEDs), and/or a scan light device. However, the disclosure is not limited thereto. - In the embodiments of the disclosure, the
first light source 101 may have a first polarization direction and include multiple first light-emitting units. Each of the first light-emitting units may emit a first light to illuminate a specific object based on the first polarization direction and a first modulation of each of the first light-emitting units. Similarly, the second light source 102 may have a second polarization direction and include multiple second light-emitting units. Each of the second light-emitting units may emit a second light to illuminate the specific object based on the second polarization direction and a second modulation of each of the second light-emitting units. - In an embodiment, the
first light source 101 includes a polarization device/component (e.g. a polarizer) corresponding to the first polarization direction so that each of the first light-emitting units may emit the first light in the first polarization direction. Similarly, the second light source 102 includes another polarization device/component corresponding to the second polarization direction so that each of the second light-emitting units may emit the second light in the second polarization direction. - In the embodiments of the disclosure, a first combination composed of the first modulation of each of the first light-emitting units and the first polarization direction is unique in the
distance detection system 100, and a second combination composed of the second modulation of each of the second light-emitting units and the second polarization direction is unique in the distance detection system 100. - For convenience of description, it is assumed below that the first polarization directions respectively corresponding to the first light-emitting units are all the same (e.g. all being a horizontal polarization direction), and the first modulations respectively corresponding to the first light-emitting units are all different. In addition, it is assumed that the second polarization directions respectively corresponding to the second light-emitting units are all the same (e.g. all being a vertical polarization direction), and the second modulations respectively corresponding to the second light-emitting units are all different. Furthermore, in some embodiments, the first polarization direction may be orthogonal to the second polarization direction; however, the disclosure is not limited thereto.
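The uniqueness requirement above can be pictured as a small registry keyed by the (polarization direction, modulation) pair: because each pair occurs only once in the system, a pair decoded from a pixel identifies exactly one light-emitting unit. The names and codes below are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative sketch: every light-emitting unit is registered under a unique
# (polarization direction, modulation code) pair.
registry = {}

def register_unit(polarization, code, unit_id):
    key = (polarization, tuple(code))
    if key in registry:
        raise ValueError("each (polarization, modulation) pair must be unique")
    registry[key] = unit_id

register_unit("horizontal", [1, 0, 1, 0], "first unit A1")
# The same code may repeat under the other polarization; the pair stays unique.
register_unit("vertical", [1, 0, 1, 0], "second unit B1")
print(registry[("horizontal", (1, 0, 1, 0))])  # -> first unit A1
```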
- In an embodiment, the
first light source 101 includes N1 first light-emitting units, and a single horizontal polarizer may be disposed in front of the N1 first light-emitting units so that the first polarization directions of the first lights emitted by the first light-emitting units are all horizontal polarization directions. Furthermore, the first modulations of the N1 first light-emitting units may be pulse-amplitude modulations (PAMs) corresponding to different amplitudes. In this case, the first lights respectively emitted by the N1 first light-emitting units have the same first polarization direction (e.g. the horizontal polarization direction) but different pulse-amplitude modulations. - In addition, the second
light source 102 includes N2 second light-emitting units, and a single vertical polarizer may be disposed in front of the N2 second light-emitting units so that the second polarization directions of the second lights emitted by the second light-emitting units are all vertical polarization directions. Furthermore, the second modulations respectively corresponding to the N2 second light-emitting units may be pulse-amplitude modulations corresponding to different amplitudes. In this case, the second lights respectively emitted by the N2 second light-emitting units have the same second polarization direction (e.g. the vertical polarization direction) but different pulse-amplitude modulations. - In addition, in other embodiments, the first modulation corresponding to each of the first light-emitting units and/or the second modulation corresponding to each of the second light-emitting units may also be realized by other modulation schemes (e.g. pulse width modulation (PWM)); however, the disclosure is not limited thereto.
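As a concrete sketch of this per-unit pulse-amplitude scheme, each of the N1 or N2 units can simply be assigned a distinct drive amplitude. The amplitude range and unit count below are assumptions for illustration; the disclosure does not specify values.

```python
# Illustrative sketch: give each light-emitting unit of a source its own
# pulse amplitude, so the units' PAM waveforms are mutually distinguishable.
def assign_pam_amplitudes(n_units, a_min=0.2, a_max=1.0):
    if n_units < 2:
        return [a_max]
    step = (a_max - a_min) / (n_units - 1)
    return [a_min + i * step for i in range(n_units)]

n1 = 8  # assumed number of first light-emitting units
amplitudes = assign_pam_amplitudes(n1)
assert len(set(amplitudes)) == n1  # every unit gets a distinct amplitude
```

A receiver that recovers the pulse amplitude observed at a pixel can then map that amplitude back to the emitting unit, which is the role the per-unit modulations play in the identification steps of the method.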
- In the embodiments of the disclosure, the
image capturing circuit 103 is, for example, any type of polarization image capturing device. For example, in a first embodiment, the image capturing circuit 103 may include a first lens and a second lens respectively corresponding to the first polarization direction and the second polarization direction. The first lens includes a first polarizer corresponding to the first polarization direction, and the second lens includes a second polarizer corresponding to the second polarization direction. In this case, the first lens may capture multiple first image frames having the first polarization direction in a specific field of view through the first polarizer at multiple timing points. The second lens may capture multiple second image frames having the second polarization direction in the specific field of view through the second polarizer at the multiple timing points.
image capturing circuit 103. The polarization component array may include multiple polarization component sets, and each of the polarization component sets may correspond to one of image capturing pixels of theimage capturing circuit 103. In addition, each of the polarization component sets may include a first polarization component and a second polarization component respectively corresponding to the first polarization direction and the second polarization direction. In this case, each of the image frames captured by theimage capturing circuit 103 includes multiple pixels. Each of the pixels may include a first sub-pixel and a second sub-pixel respectively corresponding to the first polarization direction and the second polarization direction; however, the disclosure is not limited thereto. - In an embodiment, the
image capturing circuit 103 may be designed to photograph/capture image frames of the area in front of the vehicle. That is, the field of view (hereinafter referred to as the specific field of view) in which the image capturing circuit 103 captures the image frames extends forward of the vehicle; however, the disclosure is not limited thereto. - In different embodiments, the
processor 104 may be coupled to the first light source 101, the second light source 102, and the image capturing circuit 103. The processor 104 may be a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor (DSP), multiple microprocessors, one or more microprocessors integrated with a DSP core, a controller, a microcontroller, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), any other type of integrated circuit, a state machine, an advanced RISC machine (ARM)-based processor, or the like. - In the embodiments of the disclosure, the
processor 104 may access a required module or program code to realize a method for determining a distance based on polarization vision provided by the disclosure. The method is described in detail below. - Referring to
FIG. 2, FIG. 2 is a flow chart of a distance detection method according to an embodiment of the disclosure. The method of the embodiment may be executed by the distance detection system 100 in FIG. 1, and the details of each step in FIG. 2 are described below with reference to the elements shown in FIG. 1. Furthermore, in order to make the disclosure more comprehensible, further descriptions will be provided below with reference to FIG. 3. FIG. 3 is an application scenario diagram according to an embodiment of the disclosure. - First, in step S210, multiple first light-emitting
units 101 a of the first light source 101 may respectively emit a first light to illuminate a specific object 499 based on a first polarization direction and a first modulation of each of the first light-emitting units 101 a. As shown in FIG. 3, the multiple first light-emitting units 101 a may be arranged in a matrix. A polarizer P1 (e.g. a horizontal polarizer) may be disposed in front of the matrix. Furthermore, the first modulation corresponding to each of the first light-emitting units 101 a may be one of the multiple pulse width modulations MM as shown, and the first modulations corresponding to the first light-emitting units 101 a are different from each other; however, the disclosure is not limited thereto. - In this case, the first lights respectively emitted by the first light-emitting
units 101 a (the first polarization direction being, for example, a horizontal polarization direction) may integrally form a first illumination range 411. The first illumination range 411 may include a first illumination sub-range 411 a corresponding to each of the first light-emitting units 101 a. - In addition, in step S220, multiple second light-emitting
units 102 a of the second light source 102 may respectively emit a second light to illuminate the specific object 499 based on a second polarization direction and a second modulation of each of the second light-emitting units 102 a. As shown in FIG. 3, the multiple second light-emitting units 102 a may be arranged in a matrix. A polarizer P2 (e.g. a vertical polarizer) may be disposed in front of the matrix. Furthermore, the second modulation corresponding to each of the second light-emitting units 102 a may be one of the multiple pulse width modulations MM as shown, and the second modulations corresponding to the second light-emitting units 102 a are different from each other; however, the disclosure is not limited thereto. - In this case, the second lights respectively emitted by the second light-emitting
units 102 a (the second polarization direction being, for example, a vertical polarization direction) may integrally form a second illumination range 412. The second illumination range 412 may include a second illumination sub-range 412 a of each of the second light-emitting units 102 a. - Next, in step S230, the
image capturing circuit 103 may capture multiple image frames in a specific field of view at multiple timing points. The specific object 499 illuminated by the first light source 101 and the second light source 102 is present in the specific field of view. In this case, each of the image frames captured by the image capturing circuit 103 includes an image area (including at least one specific image pixel) corresponding to the specific object 499. In other words, each of the image frames includes at least one specific image pixel corresponding to the specific object 499. - In the embodiments of the disclosure, based on a degree of polarization of a specific image pixel (hereinafter referred to as the specific pixel) corresponding to the
specific object 499 in each of the image frames, the processor 104 may learn which of the first light-emitting units 101 a emits the first light and which of the second light-emitting units 102 a emits the second light that contribute to the light corresponding to the specific pixel. For example, the specific pixel is a pixel of the specific object 499 located at the center of each of the image frames; however, the disclosure is not limited thereto. Any pixel which may represent the specific object 499 falls within the scope of the disclosure. - Next, in step S240, the
processor 104 may obtain a first specific modulation and a second specific modulation presented by the specific pixel at the timing points based on the image frames. - For example, in an embodiment, the image frames may include the multiple first image frames (corresponding to the first polarization direction) and the multiple second image frames (corresponding to the second polarization direction) of the first embodiment. In this case, the
processor 104 may obtain the first specific modulation presented by the specific pixel at the timing points based on the first image frames and obtain the second specific modulation presented by the specific pixel at the timing points based on the second image frames. - In another embodiment, the
image capturing circuit 103 obtains the image frames through the method described in the second embodiment above. In this case, the processor 104 obtains the first sub-pixels and the second sub-pixels respectively corresponding to the first polarization direction and the second polarization direction among the pixels of each of the image frames. The processor 104 combines the first sub-pixels into multiple first image frames and combines the second sub-pixels into multiple second image frames. - Specifically, with regard to an ith (i being a positive integer) image frame among the image frames, the
processor 104 may combine the first sub-pixels of each pixel in the ith image frame into a first image frame corresponding to the ith image frame and combine the second sub-pixels of each pixel in the ith image frame into a second image frame corresponding to the ith image frame. Furthermore, the processor 104 may analyze the first image frame corresponding to each of the image frames to obtain the first specific modulation presented by the specific pixel at the timing points. Similarly, the processor 104 may obtain the second specific modulation presented by the specific pixel at the timing points based on the second image frame corresponding to each of the image frames. - In addition, in step S250, the
processor 104 may find a first specific light-emitting unit among the first light-emitting units 101 a based on the first specific modulation and find a second specific light-emitting unit among the second light-emitting units 102 a based on the second specific modulation. - For example, a first light-emitting unit A1 of the first
light source 101 is modulated to emit the first light (e.g. having a horizontal polarization direction) by the first modulation having a bright-dark-bright-dark pattern. In this case, after the processor 104 analyzes the specific pixel of the first image frame and discovers that the degree of polarization of the specific pixel in the first image frame presents the first specific modulation with the bright-dark-bright-dark pattern, the processor 104 may determine that the light of the specific pixel is contributed by at least the first light of the first light-emitting unit A1. Accordingly, the processor 104 may determine that the first light-emitting unit A1 is the first specific light-emitting unit. Furthermore, a second light-emitting unit B1 in the second light source 102 is modulated to emit the second light (e.g. having a vertical polarization direction) by the second modulation having a dark-dark-bright-bright pattern. In this case, after the processor 104 analyzes the specific pixel of the second image frame and discovers that the degree of polarization of the specific pixel of the second image frame presents the second specific modulation with the dark-dark-bright-bright pattern, the processor 104 may determine that the light of the specific pixel is also contributed by the second light of the second light-emitting unit B1. Accordingly, the processor 104 may determine that the second light-emitting unit B1 is the second specific light-emitting unit. - Then, in step S260, the
processor 104 may calculate a specific distance LL between the distance detection system 100 and the specific object 499 based on the first specific light-emitting unit and the second specific light-emitting unit. - In an embodiment, the
processor 104 may obtain a predetermined distance between the first specific light-emitting unit and the second specific light-emitting unit. Then, the processor 104 may execute a triangulation location method based on a first emission direction of the first specific light-emitting unit, a second emission direction of the second specific light-emitting unit, and the predetermined distance to calculate the specific distance LL between the distance detection system 100 and the specific object 499. - For example, in the scenario in
FIG. 3, the first specific light-emitting unit and the second specific light-emitting unit found by the processor 104 after the execution of step S250 are respectively a first light-emitting unit 401 a and a second light-emitting unit 402 a as shown. Hence, the processor 104 may estimate the specific distance LL. Specifically, in the embodiments of the disclosure, the locations of each first light-emitting unit 101 a in the first light source 101 and each second light-emitting unit 102 a in the second light source 102 are known. A distance between any first light-emitting unit 101 a and any second light-emitting unit 102 a may be measured in advance and thus considered known. In other words, a distance (hereinafter referred to as the predetermined distance DD) between the first light-emitting unit 401 a and the second light-emitting unit 402 a is also known. - Furthermore, the first emission direction in which each of the first light-emitting
units 101 a emits the first light may be fixed, and the second emission direction in which each of the second light-emitting units 102 a emits the second light may also be fixed. - Therefore, after it is determined that the first specific light-emitting unit and the second specific light-emitting unit are respectively the first light-emitting
unit 401 a and the second light-emitting unit 402 a, the processor 104 may obtain the predetermined distance DD (having a value of, for example, d) between the first light-emitting unit 401 a and the second light-emitting unit 402 a. Then, the processor 104 may execute the triangulation location method based on a first emission direction D1 of the first light-emitting unit 401 a, a second emission direction D2 of the second light-emitting unit 402 a, and the predetermined distance DD to calculate the specific distance LL (having a value of, for example, l) between the distance detection system 100 and the specific object 499. - In
FIG. 3, an included angle AN1 (having a value of, for example, α) is formed between a connecting line between the first light-emitting unit 401a and the second light-emitting unit 402a and the first emission direction D1, and an included angle AN2 (having a value of, for example, β) is formed between the same connecting line and the second emission direction D2. In this case, the specific distance LL may be, for example, calculated by the processor 104 as

l = d · sin α · sin β / sin(α + β),

but is not limited thereto.
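The triangulation step above can be sketched as follows. This is a minimal illustration assuming the angles α and β are measured from the baseline between the two light-emitting units, as in FIG. 3; the function name and the numeric example are illustrative and not part of the disclosure.

```python
import math

def triangulation_distance(d: float, alpha: float, beta: float) -> float:
    """Distance from the baseline between the two light-emitting units to
    the object. The two emission directions meet at the object, where the
    angle is pi - alpha - beta, so by the law of sines the perpendicular
    distance is d * sin(alpha) * sin(beta) / sin(alpha + beta)."""
    return d * math.sin(alpha) * math.sin(beta) / math.sin(alpha + beta)

# Illustrative example: baseline d = 1.5 (same length unit as the result),
# alpha = beta = 60 degrees
specific_distance = triangulation_distance(1.5, math.radians(60), math.radians(60))
```

Note that when α + β approaches 180 degrees the denominator approaches zero, which reflects the usual triangulation degeneracy of nearly parallel emission directions.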
- Based on the above, in the method of the disclosure, the specific distance LL between the distance detection system 100 and the specific object 499 may be determined in a different manner from that of the current technology. Even in a bright-light or dim-light environment, the specific distance LL between the distance detection system 100 and the specific object 499 can still be accurately determined by the method of the disclosure. Therefore, the chances of collisions and car accidents are reduced.
- In other embodiments, when multiple specific objects are present in the specific field of view of the
image capturing circuit 103, the method of the disclosure may still be employed to determine a distance between the distance detection system 100 and each of the specific objects. The disclosure is not limited thereto.
- In addition, the disclosure further provides a mechanism below to improve the efficiency of finding the first specific light-emitting unit and the second specific light-emitting unit. Specifically, in the scenario of FIG. 3, the first light source 101 may be, for example, considered to be located on the left side of the distance detection system 100, and the second light source 102 may be, for example, considered to be located on the right side of the distance detection system 100. In this case, the first illumination range 411 of the first light source 101 corresponds to a first image area located on the left side of each of the image frames, and the second illumination range 412 of the second light source 102 corresponds to a second image area located on the right side of each of the image frames.
- For example, assuming that an image frame IM is one of the image frames, the image frame IM may include a first image area IM1 and a second image area IM2 respectively corresponding to the
first illumination range 411 and the second illumination range 412. As shown in FIG. 3, the first image area IM1 and the second image area IM2 have an overlapping area OR, and the specific pixel corresponding to the specific object 499 may be located in the overlapping area OR.
- As shown in
FIG. 3, among the first light-emitting units 101a of the first light source 101, only a part of the first light-emitting units 101a located near the right side are more likely to contribute the first light to the specific pixel. Similarly, among the second light-emitting units 102a of the second light source 102, only a part of the second light-emitting units 102a located near the left side are more likely to contribute the second light to the specific pixel. Therefore, the processor 104 may find the first specific light-emitting unit among the first light-emitting units 101a near the right side and find the second specific light-emitting unit among the second light-emitting units 102a near the left side.
- In other words, the processor 104 may find the first specific light-emitting unit and the second specific light-emitting unit within a relatively small range. Therefore, the efficiency of finding the first specific light-emitting unit and the second specific light-emitting unit may be improved.
- Referring to
FIG. 4, FIG. 4 is a schematic diagram of a light source adopting a digital micro-mirror device technology according to an embodiment of the disclosure. In FIG. 4, a first light source 501 of the distance detection system 100 may include a sub-light source 511 and a micro-mirror array 512. The micro-mirror array 512 may include multiple micro-mirrors. Each of the micro-mirrors may emit a first light by reflecting a light of the sub-light source 511. In other words, the micro-mirrors in the micro-mirror array 512 may be understood as the first light-emitting units of the embodiment. In addition, a second light source 502 of the distance detection system 100 may include a sub-light source 521 and a micro-mirror array 522. The micro-mirror array 522 may include multiple micro-mirrors. Each of the micro-mirrors may emit a second light by reflecting a light of the sub-light source 521. In other words, the micro-mirrors in the micro-mirror array 522 may be understood as the second light-emitting units of the embodiment.
- In the embodiment, assuming that the first specific light-emitting unit and the second specific light-emitting unit found are respectively a micro-mirror 501a and a micro-mirror 502b, the processor 104 may still calculate the specific distance between the distance detection system 100 and a specific object 599 based on the teaching above. The details thereof are not repeated here.
- In summary of the above, each of the first light-emitting units in the first light source emits the first light based on the first polarization direction and the corresponding first modulation, and each of the second light-emitting units in the second light source emits the second light based on the second polarization direction and the corresponding second modulation. Therefore, in the method of the disclosure, after the first specific modulation and the second specific modulation presented by the specific pixel in multiple image frames are determined, the first specific light-emitting unit and the second specific light-emitting unit can be found accordingly. Then, in the method of the disclosure, the specific distance between the distance detection system and the specific object corresponding to the specific pixel can be estimated based on relative positions of the first specific light-emitting unit and the second specific light-emitting unit. Accordingly, even in a bright-light or dim-light environment, the specific distance between the distance detection system and the specific object can still be accurately determined by the method of the disclosure. Therefore, the chances of collisions and car accidents are reduced.
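The step of finding a specific light-emitting unit from the modulation presented by the specific pixel over the image frames can be sketched as follows. The disclosure does not specify a matching algorithm, so this sketch assumes zero-mean correlation against known per-unit on/off modulation codes; the function name, the `unit_codes` mapping, and the `candidate_ids` restriction (reflecting the search-range mechanism described above) are illustrative assumptions.

```python
import math

def find_specific_unit(pixel_series, unit_codes, candidate_ids):
    """Return the candidate light-emitting unit whose modulation code best
    matches the intensity sequence the specific pixel presents over the
    captured image frames, scored by zero-mean (normalized) correlation.
    unit_codes maps unit id -> modulation code sampled at the same timing
    points; candidate_ids restricts the search to a subset of units."""
    n = len(pixel_series)
    s_mean = sum(pixel_series) / n
    s = [v - s_mean for v in pixel_series]          # center the pixel series
    s_norm = math.sqrt(sum(v * v for v in s))
    best_id, best_score = None, float("-inf")
    for uid in candidate_ids:
        code = unit_codes[uid]
        c_mean = sum(code) / n
        c = [v - c_mean for v in code]              # center the unit's code
        denom = s_norm * math.sqrt(sum(v * v for v in c))
        score = sum(a * b for a, b in zip(s, c)) / denom if denom > 0 else 0.0
        if score > best_score:
            best_id, best_score = uid, score
    return best_id
```

Restricting `candidate_ids` to the units near the side facing the overlapping area, as described for FIG. 3, shrinks the loop and thus the cost of the search.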
- Although the disclosure has been described with reference to the above embodiments, they are not intended to limit the disclosure. It will be apparent to one of ordinary skill in the art that modifications to the described embodiments may be made without departing from the spirit and the scope of the disclosure. Accordingly, the scope of the disclosure will be defined by the attached claims and their equivalents and not by the above detailed descriptions.
Claims (11)
1. A distance detection system, comprising:
a first light source having a first polarization direction and comprising a plurality of first light-emitting units, wherein each of the first light-emitting units emits a first light to illuminate a specific object based on the first polarization direction and a first modulation of each of the first light-emitting units;
a second light source having a second polarization direction and comprising a plurality of second light-emitting units, wherein each of the second light-emitting units emits a second light to illuminate the specific object based on the second polarization direction and a second modulation of each of the second light-emitting units;
an image capturing circuit configured to capture a plurality of image frames in a specific field of view at a plurality of timing points, wherein the specific object is within the specific field of view, and each of the image frames comprises a specific pixel corresponding to the specific object; and
a processor coupled to the first light source, the second light source and the image capturing circuit and configured to:
obtain a first specific modulation and a second specific modulation presented by the specific pixel at the timing points based on the image frames, wherein the first specific modulation corresponds to the first polarization direction, and the second specific modulation corresponds to the second polarization direction;
find a first specific light-emitting unit among the first light-emitting units based on the first specific modulation and find a second specific light-emitting unit among the second light-emitting units based on the second specific modulation; and
calculate a specific distance between the distance detection system and the specific object based on the first specific light-emitting unit and the second specific light-emitting unit.
2. The distance detection system according to claim 1 , wherein the image capturing circuit comprises a first lens and a second lens respectively corresponding to the first polarization direction and the second polarization direction, the image frames comprise a plurality of first image frames captured by the first lens and a plurality of second image frames captured by the second lens, and in obtaining the first specific modulation and the second specific modulation, the processor is further configured to:
obtain the first specific modulation presented by the specific pixel at the timing points based on the first image frames; and
obtain the second specific modulation presented by the specific pixel at the timing points based on the second image frames.
3. The distance detection system according to claim 1 , wherein in obtaining the first specific modulation and the second specific modulation, the processor is further configured to:
obtain, among a plurality of pixels of each of the image frames, a plurality of first sub-pixels and a plurality of second sub-pixels respectively corresponding to the first polarization direction and the second polarization direction;
combine the first sub-pixels into a plurality of first image frames and combine the second sub-pixels into a plurality of second image frames;
obtain the first specific modulation presented by the specific pixel at the timing points based on the first image frames; and
obtain the second specific modulation presented by the specific pixel at the timing points based on the second image frames.
4. The distance detection system according to claim 1 , wherein in calculating the specific distance between the distance detection system and the specific object, the processor is further configured to:
obtain a predetermined distance between the first specific light-emitting unit and the second specific light-emitting unit; and
execute a triangulation location method based on a first emission direction of the first specific light-emitting unit, a second emission direction of the second specific light-emitting unit, and the predetermined distance to calculate the specific distance between the distance detection system and the specific object.
5. The distance detection system according to claim 1 , wherein the first modulations respectively corresponding to the first light-emitting units are all different, and the second modulations respectively corresponding to the second light-emitting units are all different.
6. The distance detection system according to claim 1 , wherein the first polarization direction and the second polarization direction are orthogonal to each other.
7. A distance detection method adapted for a distance detection system, comprising:
emitting a first light to illuminate a specific object by a plurality of first light-emitting units of a first light source respectively based on a first polarization direction and a first modulation of each of the first light-emitting units;
emitting a second light to illuminate the specific object by a plurality of second light-emitting units of a second light source respectively based on a second polarization direction and a second modulation of each of the second light-emitting units;
capturing a plurality of image frames in a specific field of view at a plurality of timing points by an image capturing circuit, wherein the specific object is within the specific field of view, and each of the image frames comprises a specific pixel corresponding to the specific object;
obtaining a first specific modulation and a second specific modulation presented by the specific pixel at the timing points by a processor based on the image frames, wherein the first specific modulation corresponds to the first polarization direction, and the second specific modulation corresponds to the second polarization direction;
finding a first specific light-emitting unit among the first light-emitting units based on the first specific modulation by the processor and finding a second specific light-emitting unit among the second light-emitting units based on the second specific modulation by the processor; and
calculating a specific distance between the distance detection system and the specific object based on the first specific light-emitting unit and the second specific light-emitting unit by the processor.
8. The distance detection method according to claim 7 , wherein capturing the image frames by the image capturing circuit comprises:
capturing a plurality of first image frames having the first polarization direction in the specific field of view at the timing points by a first lens of the image capturing circuit; and
capturing a plurality of second image frames having the second polarization direction in the specific field of view at the timing points by a second lens of the image capturing circuit;
wherein obtaining the first specific modulation and the second specific modulation by the processor comprises:
obtaining the first specific modulation presented by the specific pixel at the timing points based on the first image frames; and
obtaining the second specific modulation presented by the specific pixel at the timing points based on the second image frames.
9. The distance detection method according to claim 7 , wherein obtaining the first specific modulation and the second specific modulation by the processor comprises:
obtaining, among a plurality of pixels of each of the image frames, a plurality of first sub-pixels and a plurality of second sub-pixels respectively corresponding to the first polarization direction and the second polarization direction;
combining the first sub-pixels into a plurality of first image frames and combining the second sub-pixels into a plurality of second image frames;
obtaining the first specific modulation presented by the specific pixel at the timing points based on the first image frames; and
obtaining the second specific modulation presented by the specific pixel at the timing points based on the second image frames.
10. The distance detection method according to claim 7 , wherein calculating the specific distance between the distance detection system and the specific object by the processor comprises:
obtaining a predetermined distance between the first specific light-emitting unit and the second specific light-emitting unit; and
executing a triangulation location method based on a first emission direction of the first specific light-emitting unit, a second emission direction of the second specific light-emitting unit, and the predetermined distance to calculate the specific distance between the distance detection system and the specific object.
11. The distance detection method according to claim 7 , wherein the first polarization direction and the second polarization direction are orthogonal to each other.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW109144212 | 2020-12-15 | ||
TW109144212A TWI746313B (en) | 2020-12-15 | 2020-12-15 | Method and system for distance detection |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220187426A1 true US20220187426A1 (en) | 2022-06-16 |
Family
ID=78709235
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/510,159 Pending US20220187426A1 (en) | 2020-12-15 | 2021-10-25 | Distance detection system and distance detection method |
Country Status (4)
Country | Link |
---|---|
US (1) | US20220187426A1 (en) |
EP (1) | EP4016475A1 (en) |
CN (1) | CN114636996A (en) |
TW (1) | TWI746313B (en) |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8538636B2 (en) * | 1995-06-07 | 2013-09-17 | American Vehicular Sciences, LLC | System and method for controlling vehicle headlights |
US6476943B1 (en) * | 1999-02-23 | 2002-11-05 | Virtual Pro, Inc. | Distance measurement apparatus |
US7711441B2 (en) * | 2007-05-03 | 2010-05-04 | The Boeing Company | Aiming feedback control for multiple energy beams |
DE102013219344A1 (en) * | 2013-09-26 | 2015-03-26 | Conti Temic Microelectronic Gmbh | Method for determining the distance of an object by means of a polarization-modulated transmitted light beam |
US9840003B2 (en) * | 2015-06-24 | 2017-12-12 | Brain Corporation | Apparatus and methods for safe navigation of robotic devices |
TWI604979B (en) * | 2017-03-07 | 2017-11-11 | 和碩聯合科技股份有限公司 | Vehicle distance detecting method |
US10222474B1 (en) * | 2017-12-13 | 2019-03-05 | Soraa Laser Diode, Inc. | Lidar systems including a gallium and nitrogen containing laser light source |
CN109188451A (en) * | 2018-10-15 | 2019-01-11 | 北京径科技有限公司 | A kind of laser radar system |
- 2020-12-15 TW TW109144212A patent/TWI746313B/en active
- 2021-10-25 US US17/510,159 patent/US20220187426A1/en active Pending
- 2021-11-02 CN CN202111287479.8A patent/CN114636996A/en active Pending
- 2021-11-18 EP EP21208931.2A patent/EP4016475A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
EP4016475A1 (en) | 2022-06-22 |
TW202225000A (en) | 2022-07-01 |
TWI746313B (en) | 2021-11-11 |
CN114636996A (en) | 2022-06-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108919294B (en) | Distance measurement imaging system and solid-state imaging element | |
US20150243017A1 (en) | Object recognition apparatus and object recognition method | |
US10635896B2 (en) | Method for identifying an object in a surrounding region of a motor vehicle, driver assistance system and motor vehicle | |
CN103782307A (en) | Method and device for detecting objects in the area surrounding a vehicle | |
US20140257644A1 (en) | Method and device for controlling a headlamp of a vehicle | |
US11908119B2 (en) | Abnormality detection device for vehicle | |
CN110293973B (en) | Driving support system | |
US11704910B2 (en) | Vehicle detecting device and vehicle lamp system | |
KR20220139933A (en) | Car's ambient monitoring system | |
US20200236338A1 (en) | Sensor system | |
US11303817B2 (en) | Active sensor, object identification system, vehicle and vehicle lamp | |
US10025995B2 (en) | Object detecting arrangement | |
US11170517B2 (en) | Method for distance measurement using trajectory-based triangulation | |
US20220187426A1 (en) | Distance detection system and distance detection method | |
US11924554B2 (en) | Imaging system which determines an exposure condition based on a detected distance | |
CN111971527B (en) | Image pickup apparatus | |
WO2019176418A1 (en) | Vehicular lamp, vehicle detection method, and vehicle detection device | |
KR20140048529A (en) | Vehicle lamp detecting light-source generated by reflective structure | |
JP7276304B2 (en) | object detector | |
WO2022263683A1 (en) | Method for detecting an object in a road surface, method for autonomous driving and automotive lighting device | |
WO2022263685A1 (en) | Method for detecting an object in a road surface, method for autonomous driving and automotive lighting device | |
US20220018964A1 (en) | Surroundings detection device for vehicle | |
CN113721252A (en) | Light supplement method, device and system | |
CN117584853A (en) | Vehicle and control method thereof | |
JP2021075163A (en) | Camera system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PEGATRON CORPORATION, TAIWAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HSU, PO-CHING;REEL/FRAME:057905/0835
Effective date: 20211019
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |