CN105651245A - Optical ranging system and optical ranging method - Google Patents


Info

Publication number
CN105651245A
CN105651245A (application CN201410636269.9A)
Authority
CN
China
Prior art keywords
image
region
optical ranging
exposure time
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410636269.9A
Other languages
Chinese (zh)
Other versions
CN105651245B (en)
Inventor
王国振
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN201410636269.9A priority Critical patent/CN105651245B/en
Priority to CN201810018459.2A priority patent/CN108332720B/en
Priority claimed from CN201410636269.9A external-priority patent/CN105651245B/en
Publication of CN105651245A publication Critical patent/CN105651245A/en
Application granted granted Critical
Publication of CN105651245B publication Critical patent/CN105651245B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Studio Devices (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention relates to an optical ranging system comprising an image sensor and a processing unit. The processing unit generates an image to be calculated from at least one image captured by the image sensor, and different regions of that image correspond to different exposure times, which increases the accuracy of the distance calculation.

Description

Optical ranging system and method
Technical field
The present invention relates to a ranging system, and in particular to an optical ranging system and method that use a multiplexed exposure mechanism.
Background
An optical ranging system can use triangulation to calculate the distance of an object. For example, a ranging device may comprise a light source and a camera. The light source projects light toward an object under test, and the camera receives the light reflected from the object to form an image frame. When the spatial relationship between the light source and the camera is known, the distance of the object can be calculated by triangulation from the position of the object image in the image frame.
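As background, the triangulation relation the paragraph above relies on can be sketched as follows. The function name, the pinhole-style parameters, and the similar-triangles form are illustrative assumptions: the patent only states that distance follows from the spot position once the source-camera geometry (such as the baseline L) is known.

```python
def triangulate_distance(pixel_offset, baseline_m, focal_px):
    """Estimate object distance from the imaged spot position.

    pixel_offset: spot displacement from the optical axis, in pixels
    baseline_m:   source-to-camera separation (the baseline L), in meters
    focal_px:     focal length expressed in pixels

    All parameter names are illustrative, not taken from the patent.
    """
    if pixel_offset <= 0:
        raise ValueError("spot must be displaced from the optical axis")
    # Similar triangles: distance / baseline = focal / offset
    return baseline_m * focal_px / pixel_offset
```

With this form, a nearer object reflects to a larger pixel offset, so distance falls as the offset grows.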
However, when multiple objects at different distances are present in the space at the same time, a nearby object may be overexposed while a distant object is underexposed, so the calculation accuracy of the optical ranging system may be degraded. In particular, when a distant object is underexposed, its object distance may not be computable at all.
Summary of the invention
In view of this, the present invention proposes an optical ranging system and method that simultaneously preserve the information of both near and far objects in an image frame, thereby increasing calculation accuracy.
The present invention provides an optical ranging system and method that use a time-multiplexed exposure mechanism.
The present invention provides an optical ranging system and method that use a spatially multiplexed exposure mechanism.
The present invention provides an optical ranging system comprising an image sensor and a processing unit. The image sensor captures a first image with a first exposure time and a second image with a second exposure time, the first exposure time being different from the second. The processing unit receives the first and second images, divides the first image into a plurality of first image regions, divides the second image into a plurality of second image regions, compares the signal features of corresponding first and second image regions, and combines the first image regions with the larger signal features and the second image regions with the larger signal features into a combined image.
The present invention also provides an optical ranging system comprising an image sensor and a processing unit. The image sensor captures a reference image with a reference exposure time and captures different regions of a current image with a plurality of exposure times. The processing unit receives the reference image, divides it into a plurality of image regions, calculates the average brightness of each region of the reference image, and, according to those average brightnesses, controls the plurality of exposure times with which the image sensor captures the different regions of the current image.
The present invention also provides a ranging method for an optical ranging system, comprising the following steps: capturing a first image with an image sensor using a first exposure time; capturing a second image with the image sensor using a second exposure time; dividing the first image into a plurality of first image regions and calculating a first signal feature of each first image region; dividing the second image into a plurality of second image regions and calculating a second signal feature of each second image region; comparing the first signal feature of each first image region with the second signal feature of the corresponding second image region; and combining the first image regions whose first signal feature exceeds the second signal feature with the second image regions whose second signal feature exceeds the first signal feature into a combined image.
The present invention also provides a ranging method for an optical ranging system, comprising the following steps: capturing a reference image with an image sensor using a reference exposure time; dividing the reference image into a plurality of image regions and calculating the average brightness of each image region; and capturing different regions of a current image with the image sensor using a plurality of exposure times determined from those average brightnesses.
The present invention further provides an optical ranging system comprising an image sensor and a processing unit. The image sensor captures a first image and a second image with different exposure times. The processing unit receives the first and second images and combines partial image regions of the first image with partial image regions of the second image into a combined image.
To make the above and other objects, features, and advantages of the present invention more apparent, a detailed description is given below in conjunction with the accompanying drawings. In the description of the present invention, identical components are denoted by identical reference numerals.
Brief description of the drawings
Fig. 1 is a block diagram of an optical ranging system according to an embodiment of the invention;
Fig. 2 is a schematic diagram of an optical ranging system according to an embodiment of the invention;
Fig. 3 is a flowchart of the ranging method of the optical ranging system according to the first embodiment of the invention;
Fig. 4A is a timing diagram of image acquisition in the optical ranging system of the first embodiment of the invention;
Fig. 4B is an operational schematic diagram of the optical ranging system of the first embodiment of the invention;
Fig. 5 is a flowchart of the ranging method of the optical ranging system according to the second embodiment of the invention;
Fig. 6A is a timing diagram of image acquisition in the optical ranging system of the second embodiment of the invention;
Fig. 6B is an operational schematic diagram of the optical ranging system of the second embodiment of the invention.
Description of reference numerals
1 optical ranging system
11 image sensor
13 processing unit
131 exposure control unit
133 multiplexing module
135 distance calculation unit
15 light source
9 object under test
I9 reflected light image
Fm image to be calculated
AV1-AV4 average brightnesses
F, F_S, F_L, F_T, F_{T+1} images
C1-C4, C1'-C4' signal features
A1-A4, A1'-A4' image regions
ET1-ET4, ET_S, ET_L, ETr exposure times
S31-S36, S51-S53 steps
Detailed description of the embodiments
Please refer to Fig. 1, which is a block diagram of an optical ranging system according to an embodiment of the invention. The optical ranging system 1 comprises an image sensor 11 and a processing unit 13. The image sensor 11 is preferably an active image sensor, for example a CMOS (complementary metal-oxide-semiconductor) image sensor, which can change the exposure time when capturing an image F, or capture different regions of the image F with a plurality of exposure times (examples are detailed below).
The processing unit 13 may be, for example, a digital signal processor (DSP), a microcontroller (MCU), or a central processing unit (CPU), which receives the image F output by the image sensor 11 for post-processing and controls the image acquisition of the image sensor 11. In one embodiment, the processing unit 13 comprises an exposure control unit 131, a multiplexing module 133, and a distance calculation unit 135; these are data processing units within the processing unit 13 and may be implemented in software or hardware without particular limitation. It should be understood that, although Fig. 1 divides the processing unit 13 into different functional modules for the purpose of illustration, all functions performed by the modules can be said to be performed by the processing unit 13.
The exposure control unit 131 controls the image sensor 11 either to capture entire different images F with different exposure times (one exposure time per image) or to capture different regions of the same image F with a plurality of exposure times (multiple exposure times per image). The multiplexing module 133 uses time or spatial multiplexing to process the images F received by the processing unit 13 and produces an image to be calculated Fm (such as the combined image and the current image described below). The distance calculation unit 135 calculates at least one object distance from the image Fm using a predetermined algorithm, for example triangulation.
Please refer to Fig. 2, which is a schematic diagram of an optical ranging system according to an embodiment of the invention. The optical ranging system 1 may further comprise a light source 15 for projecting a two-dimensional light region (for example, a light line of predetermined width) onto an object under test 9; the light source 15 may be, for example, a coherent light source, a partially coherent light source, or a non-coherent light source, without particular limitation, and emits visible or invisible light. After the image sensor 11 receives the light reflected from the object 9, the image F it produces, which contains the reflected light image I9, is transferred to the processing unit 13. The processing unit 13 first uses the multiplexing mechanism of the present invention (examples are detailed below) to produce the image to be calculated Fm from the image F, and then calculates at least one object distance D from Fm; the image Fm likewise contains the reflected light image I9. More specifically, at least some of the exposure times corresponding to the different regions of the image Fm differ from one another (examples are detailed below), so that the brightness of the reflected light image I9 in each image region is suitable for calculating the object distance D. In addition, in some embodiments the processing unit 13 may output the image Fm, by wire or wirelessly, to an external device for post-processing, for example to an external host. It should be noted that although the two-dimensional light region projected by the light source 15 is shown in Fig. 2 as discontinuous, this is for illustration only and is not intended to limit the invention.
In one embodiment, the processing unit 13 may comprise a storage unit (not shown) storing a lookup table that relates the position of the reflected light image I9 to the object distance D. Thus, after the processing unit 13 obtains the position of the reflected light image I9 in the image Fm, it can obtain at least one object distance D directly from the lookup table; the table is computed in advance from the spatial relationship between the light source 15 and the image sensor 11 (for example the distance L) and the illumination angle of the light source 15, and is stored in the storage unit beforehand. In another embodiment, the storage unit of the processing unit 13 may store a distance algorithm, which is used to calculate at least one object distance D once the position of the reflected light image I9 in the image Fm is obtained.
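A minimal sketch of such a precomputed lookup table, assuming the same similar-triangles geometry as the background section; the helper names, the interpolation between stored entries, and all parameter values are illustrative and not taken from the patent.

```python
def build_lookup(baseline_m, focal_px, offsets):
    """Precompute (pixel_offset, distance) pairs, as the stored
    lookup table would. Parameters are illustrative assumptions."""
    return [(o, baseline_m * focal_px / o) for o in offsets]

def distance_from_table(table, offset):
    """Read a distance back out, linearly interpolating between the
    two nearest stored offsets (an illustrative choice)."""
    table = sorted(table)
    for (o0, d0), (o1, d1) in zip(table, table[1:]):
        if o0 <= offset <= o1:
            t = (offset - o0) / (o1 - o0)
            return d0 + t * (d1 - d0)
    raise ValueError("offset outside table range")
```

Storing the table trades a small amount of memory for avoiding the per-frame division of a live triangulation.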
In embodiments of the present invention, since the light source 15 projects a two-dimensional light region, the image F output by the image sensor 11 contains a linear reflected light image I9, and the processing unit 13 can simultaneously calculate the distances of multiple objects corresponding to different (and differently located) sections of the reflected light image, which gives the system good applicability. Finally, the processing unit 13 outputs the calculated object distance D for corresponding control, for example to a host or computer system; the control function associated with the object distance D depends on the application.
Please refer to Fig. 3, which is a flowchart of the ranging method of the optical ranging system according to the first embodiment of the invention, comprising the following steps: capturing a first image with a first exposure time (step S31); capturing a second image with a second exposure time (step S32); dividing the first image into a plurality of first image regions and calculating a first signal feature of each first image region (step S33); dividing the second image into a plurality of second image regions and calculating a second signal feature of each second image region (step S34); comparing the first signal features with the second signal features (step S35); and combining the first image regions whose first signal feature exceeds the second signal feature with the second image regions whose second signal feature exceeds the first signal feature into a combined image (step S36).
A detailed implementation of the first embodiment is described below with reference to Figs. 1-3 and Figs. 4A-4B. The processing unit 13 turns on the light source 15 while the image sensor 11 captures an image F, so that the captured image F contains the reflected light image I9 from the object 9, from which the object distance D of the object 9 is calculated.
Step S31: the image sensor 11, controlled by the exposure control unit 131 of the processing unit 13, captures a first image F_S with a first exposure time ET_S.
Step S32: the image sensor 11, controlled by the processing unit 13, then captures a second image F_L with a second exposure time ET_L; the first image F_S and the second image F_L may be two images F captured by the image sensor 11 consecutively or separated by at least one image, and the first exposure time ET_S differs from the second exposure time ET_L. It should be noted that although Fig. 4A shows ET_S shorter than ET_L, the invention is not so limited; in some embodiments ET_S is longer than ET_L. In one embodiment, the exposure control unit 131 of the processing unit 13 controls the image sensor 11 to capture images alternately with the first exposure time ET_S and the second exposure time ET_L.
Step S33: after the processing unit 13 receives the first image F_S, the multiplexing module 133 divides it in a predetermined manner into a plurality of first image regions, for example A1-A4 (Fig. 4B), and calculates a first signal feature C1-C4 (Fig. 4B) of each first image region A1-A4; each first image region A1-A4 may be one pixel column, multiple pixel columns, one pixel row, multiple pixel rows, or a rectangular pixel block of the first image F_S, and is not limited to the arrangement shown in Fig. 4B. In one embodiment, the signal features C1-C4 are the signal-to-noise ratios (SNR) of the first image regions A1-A4; for example, the multiplexing module 133 separates signal data from noise data according to a dynamic threshold in each first image region A1-A4, and calculates the ratio of the summed energy of all signal data to the summed energy of all noise data in each region as the SNR. In one embodiment, the dynamic threshold is chosen, for example, as the mean of the maximum energy value and the average energy value within the first image region, although the invention is not so limited; a threshold can thus be obtained for each first image region A1-A4. Since the threshold of each first image region A1-A4 is calculated from the acquired image data, the thresholds may differ from one another, which is why they are called dynamic thresholds in this description.
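The dynamic-threshold SNR of step S33 can be sketched as follows, assuming the example threshold given in that step (the mean of the region's maximum and average values); treating pixel values directly as energy values is an illustrative simplification.

```python
import numpy as np

def region_snr(region):
    """SNR of one image region using a dynamic threshold.

    The threshold follows the embodiment's example: the mean of the
    region's peak value and its average value. Pixels above it count
    as signal data, the rest as noise data; the SNR is the ratio of
    the summed signal energy to the summed noise energy.
    """
    region = np.asarray(region, dtype=float)
    threshold = (region.max() + region.mean()) / 2.0  # dynamic threshold
    signal = region[region > threshold]
    noise = region[region <= threshold]
    noise_energy = noise.sum()
    return np.inf if noise_energy == 0 else signal.sum() / noise_energy
```

Because the threshold is recomputed per region from the data itself, a dim distant spot can still stand out within its own region.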
Step S34: similarly, after the processing unit 13 receives the second image F_L, the multiplexing module 133 divides it in the same predetermined manner (identical to step S33) into a plurality of second image regions, for example A1'-A4' (Fig. 4B), and calculates a second signal feature C1'-C4' (Fig. 4B) of each second image region A1'-A4'; each second image region A1'-A4' may be one pixel column, multiple pixel columns, one pixel row, multiple pixel rows, or a rectangular pixel block of the second image F_L, and is not limited to the arrangement shown in Fig. 4B. Likewise, the signal features C1'-C4' may be the signal-to-noise ratios (SNR) of the second image regions A1'-A4'; for example, the multiplexing module 133 separates signal data from noise data according to a dynamic threshold in each second image region A1'-A4', and calculates the ratio of the summed energy of all signal data to the summed energy of all noise data as the SNR. The dynamic threshold is determined as described in step S33 and is not repeated here.
Step S35: the multiplexing module 133 then compares the signal features of corresponding first image regions A1-A4 and second image regions A1'-A4': for example, it compares the first signal feature C1 of the first image region A1 with the second signal feature C1' of the second image region A1'; the first signal feature C2 of A2 with the second signal feature C2' of A2'; the first signal feature C3 of A3 with the second signal feature C3' of A3'; and the first signal feature C4 of A4 with the second signal feature C4' of A4'.
Step S36: the multiplexing module 133 then uses a time multiplexing mechanism to recombine some image regions of the first image F_S with some image regions of the second image F_L into a combined image Fm. In one embodiment, the multiplexing module 133 combines the first image regions with the larger signal features and the second image regions with the larger signal features into the combined image Fm. For example, suppose here that the first signal features C1 and C4 exceed the second signal features C1' and C4' respectively, meaning that the first image regions A1 and A4 are more suitable than the second image regions A1' and A4' for calculating a correct object distance; and suppose that the first signal features C2 and C3 are smaller than the second signal features C2' and C3' respectively, meaning that the second image regions A2' and A3' are more suitable than the first image regions A2 and A3. The multiplexing module 133 then assembles the combined image Fm from the image regions A1, A2', A3', and A4, as shown in Fig. 4B.
It should be understood that although Fig. 4B shows the combined image Fm containing partial image regions of the first image F_S (such as A1, A4) and partial image regions of the second image F_L (such as A2', A3'), the invention is not so limited. Depending on the images F actually captured by the image sensor 11, the combined image Fm may be identical to the first image F_S or to the second image F_L.
Finally, the distance calculation unit 135 of the processing unit 13 calculates at least one object distance D from the combined image Fm. It should be noted that the number of object distances in this embodiment may be determined, for example, by the number of pixel rows of the image F: a corresponding object distance may be obtained for every pixel row, or for every several pixel rows (for example every 2-5 rows), depending on the required resolution. The distance calculation unit 135 may also determine the number of objects from the calculated object distances and merge the distances belonging to the same object into a single object distance, so that it finally outputs only as many object distances D as there are objects.
In addition, although Figs. 4A and 4B show the processing unit 13 comparing the signal features of the different image regions of two images F to produce the combined image Fm, the invention is not so limited. In some embodiments, the processing unit 13 may compare the signal features of the different image regions of more than two images and produce a combined image; the implementation only needs to select, in step S36, the region with the largest signal feature among the corresponding regions of those images to produce the combined image Fm, the other steps S31-S35 being similar to the first embodiment and not repeated here. In other words, the multiplexing module 133 of this embodiment divides every image F captured by the image sensor 11 into identical image regions (i.e., identical positions and identical sizes), so that the combined image Fm and the images F have the same size.
In general, in the above embodiments the processing unit 13 can recombine different partial image regions from different image frames into a combined image according to the image quality of those regions, and calculate at least one object distance from the combined image; the shape and size of the partial image regions are not particularly limited. For example, the processing unit 13 can, according to image quality (such as a signal feature), recombine partial image regions of the first image F_S (such as some of A1-A4) with partial image regions of the second image F_L (such as some of A1'-A4') into a combined image Fm.
Please refer to Fig. 5, which is a flowchart of the ranging method of the optical ranging system according to the second embodiment of the invention, comprising the following steps: capturing a reference image with a reference exposure time (step S51); dividing the reference image into a plurality of image regions and calculating the average brightness of each region (step S52); and capturing different regions of a current image with a plurality of exposure times determined from those average brightnesses (step S53).
A detailed implementation of the second embodiment is described below with reference to Figs. 1-2, Fig. 5, and Figs. 6A-6B. As before, the processing unit 13 turns on the light source 15 while the image sensor 11 captures an image F.
Step S51: the image sensor 11, controlled by the exposure control unit 131 of the processing unit 13, captures a reference image F_T with a reference exposure time ETr. In this embodiment, the reference image F_T serves to determine the plurality of exposure times ET used when capturing the current image (such as F_{T+1}), not to calculate the object distance D.
Step S52: after the processing unit 13 receives the reference image F_T, the multiplexing module 133 uses a spatial multiplexing mechanism to calculate the average brightness of a plurality of image regions of the reference image F_T, in order to determine the plurality of exposure times used when capturing the image to be calculated Fm. For example, the multiplexing module 133 divides the reference image F_T into a plurality of image regions A1-A4 (Fig. 6B) and calculates their average brightnesses AV1-AV4 (Fig. 6B); each region A1-A4 may be one pixel column, multiple pixel columns, one pixel row, multiple pixel rows, or a rectangular pixel block of the current image F_{T+1}, and is not limited to the arrangement shown in Fig. 6B.
Step S53: finally, the exposure control unit 131 of the processing unit 13 controls, according to the average brightnesses AV1-AV4, the plurality of exposure times ET1-ET4 (Figs. 6A-6B) with which the image sensor 11 captures the different image regions A1-A4 of the current image F_{T+1}. In one embodiment, the multiplexing module 133 of the processing unit 13 determines the exposure times ET1-ET4 by comparing the average brightnesses AV1-AV4 of the image regions A1-A4 of the reference image F_T with at least one threshold; for example, when the multiplexing module 133 judges that the average brightness AV1 falls between two of a plurality of thresholds (or into one of a plurality of brightness bands), the exposure time associated with those two thresholds (preset and stored) directly determines the exposure time ET1 with which the region A1 of the current image F_{T+1} is captured, and the exposure times ET2-ET4 of the other image regions A2-A4 are determined in the same way. In this embodiment, the current image F_{T+1} then serves as the image to be calculated Fm.
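The threshold-band selection of step S53 can be sketched as follows; the band boundaries and exposure values below are illustrative placeholders for the preset, stored threshold/exposure pairs the embodiment describes.

```python
def pick_exposure(mean_brightness, bands):
    """Map a region's average brightness to a preset exposure time.

    `bands` is a list of (upper_threshold, exposure) pairs sorted by
    ascending threshold, mirroring the stored threshold/exposure pairs
    in the embodiment; all values are illustrative assumptions.
    """
    for upper, exposure in bands:
        if mean_brightness <= upper:
            return exposure
    return bands[-1][1]  # brighter than every band: shortest exposure
```

A dark region thus receives a long exposure and a bright region a short one, which is exactly what keeps near and far objects simultaneously measurable.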
Finally, the distance calculation unit 135 of the processing unit 13 calculates at least one object distance D from the current image F_{T+1}.
In another embodiment, the multiplexing module 133 can adjust the exposure time by only one step at a time, so that a single reference image F_T may not suffice to bring the exposure times ET1-ET4 of all image regions A1-A4 of the current image F_{T+1} to their target values. In that case, when the average brightness of any of the image regions A1-A4 of the current image F_{T+1} is not within the predetermined brightness range, the exposure control unit 131 of the processing unit 13 can control, according to the average brightnesses of the regions A1-A4 of F_{T+1}, the plurality of exposure times with which the image sensor 11 captures the image regions A1'-A4' of a next image F_{T+2} (Fig. 6A). When the multiplexing module 133 of the processing unit 13 judges that the average brightnesses of all regions A1'-A4' of the next image F_{T+2} lie within the predetermined brightness range suitable for calculating object distances, the distance calculation unit 135 of the processing unit 13 calculates at least one object distance D from the next image F_{T+2}. It should be understood that the exposure times of the image regions A1'-A4' of the image F_{T+2} may be partly equal to or entirely different from those of the image regions A1-A4 of the current image F_{T+1}, depending on the average brightnesses of the regions A1-A4 of F_{T+1}. If the average brightness of some region of the next image F_{T+2} is still not within the predetermined brightness range, the adjustment continues until the average brightnesses of all regions lie within the predetermined brightness range.
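The one-step-per-frame adjustment described above can be sketched as follows; the parameter names, the fixed step size, and the symmetric up/down rule are illustrative assumptions.

```python
def step_adjust(exposure, mean_brightness, lo, hi, step):
    """One step-rank adjustment per frame: lengthen the exposure when
    the region is too dark, shorten it when too bright, and leave it
    unchanged once its average brightness lies within [lo, hi].
    Repeated over successive frames until every region converges."""
    if mean_brightness < lo:
        return exposure + step
    if mean_brightness > hi:
        return max(step, exposure - step)
    return exposure
```

Limiting the change to one step per frame keeps the control loop stable at the cost of needing several frames to converge after a large scene change.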
Should be noted that, although image sensor 11 described in above-mentioned steps S51 is described for reference exposure time ETr, but relative different images region, described image sensor 11 can be obtain described reference picture F with identical multiple identical reference exposure time ETrTDifferent images region, the such as image-region A1-A4 shown in Fig. 6 B.
Should be noted that, although reference picture F described in above-mentioned 2nd embodimentTNot in order to calculate described object distance D, but as described reference picture FTThe mean flow rate AV1-AV4 of all image-region A1-A4 all between predetermined luminance range, described metrics calculation unit 135 can directly according to described reference picture FTCalculate described object distance D, and need not notify that described exposure control unit 133 controls described image sensor 11 and obtains described current image F with different exposure time ET by described multiple station die group 133T+1; Wherein, described predetermined luminance range can set in advance and be stored in storage element.
Similarly, the number of the at least one object distance D in this embodiment may be determined, for example, according to the number of pixel rows of the image F and the number of objects 9 to be measured, without particular limitation.
It should be noted that, although FIG. 6A shows each image region A1-A4 corresponding to a different exposure time ET1-ET4, this is for illustration only and is not intended to limit the present invention. Depending on the image content actually acquired, only a part of the exposure times ET1-ET4 used to acquire the image regions A1-A4 of the current image F_T+1 may differ from one another.
In addition, in order to further eliminate the influence of ambient light, the processing unit 13 may also control the light source 15 to turn on and off in coordination with the image acquisition of the image sensor 11, e.g. acquiring a bright image when the light source 15 is on and a dark image when the light source 15 is off. The processing unit 13 may then calculate the difference image of the bright image and the dark image to serve as the first image F_S and the second image F_L of the first embodiment described above, or as the reference image F_T, the current image F_T+1 and the next image F_T+2 of the second embodiment described above.
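The bright-minus-dark subtraction described above can be illustrated with a minimal sketch. The 8-bit pixel format and saturating subtraction are assumptions for illustration; ambient light common to both frames cancels out, leaving mainly the reflection of the light source 15.

```python
import numpy as np

def difference_image(bright, dark):
    """Return the ambient-suppressed difference image that serves as
    F_S/F_L (first embodiment) or F_T/F_T+1/F_T+2 (second embodiment).

    Widen to a signed type before subtracting so that regions where the
    dark frame happens to be brighter clamp to zero instead of wrapping."""
    b = bright.astype(np.int16)
    d = dark.astype(np.int16)
    return np.clip(b - d, 0, 255).astype(np.uint8)
```

A pixel lit mainly by the light source keeps most of its value, while a pixel lit only by ambient light is driven toward zero.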
In the above embodiments, the multiplexing module 133 of the processing unit 13 is configured to segment the image F so as to calculate a signal feature, e.g. a signal-to-noise ratio or an average brightness, of each image region, and thereby determine whether to output an image to be calculated Fm for the distance calculation unit 135 to calculate at least one object distance D. In the first embodiment, the exposure control unit 131 controls the image sensor 11 to acquire different images (e.g. F_S and F_L) with preset exposure times, so the exposure times with which the exposure control unit 131 controls the image sensor 11 to acquire the different images F are preset fixed values (e.g. ET_S and ET_L of FIG. 4A). In the second embodiment, the multiplexing module 133 determines the exposure times of the different image regions according to the average brightness of the image regions and notifies the exposure control unit 131; therefore, the exposure times with which the exposure control unit 131 controls the image sensor 11 to acquire the different image regions may not be preset fixed values but are instead determined according to calculation results (e.g. the average brightness).
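The region-wise combination of the first embodiment can be sketched as follows. This is a hedged illustration, not the patent's implementation: the SNR formula and the fixed threshold (standing in for the dynamic threshold of the embodiments) are assumptions, and regions are taken as single row bands for simplicity.

```python
import numpy as np

def snr(region, threshold=50):
    """Crude per-region SNR: pixels at or above the threshold are treated
    as signal data, pixels below it as noise data (the patent uses a
    dynamic threshold; a fixed one is assumed here for illustration)."""
    signal = region[region >= threshold]
    noise = region[region < threshold]
    if signal.size == 0:
        return 0.0
    if noise.size == 0:
        return float(signal.mean())
    return float(signal.mean()) / (float(noise.std()) + 1e-6)

def combine_by_snr(f_s, f_l, rows_per_region):
    """Build the combination image Fm band by band, keeping whichever of
    the short-exposure image F_S and long-exposure image F_L has the
    larger signal feature in each corresponding region."""
    fm = np.empty_like(f_s)
    for top in range(0, f_s.shape[0], rows_per_region):
        band = slice(top, top + rows_per_region)
        a, b = f_s[band], f_l[band]
        fm[band] = a if snr(a) >= snr(b) else b
    return fm
```

In this sketch, a region that is well exposed in F_S (bright signal, quiet background) wins over its saturated or under-exposed counterpart in F_L, and vice versa, so that objects at different distances survive in the single combination image Fm.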
In summary, conventional optical ranging systems cannot accurately measure the distances of objects at different positions, and in particular may fail to measure a distant object at all. Therefore, the present invention provides an optical ranging system (FIGS. 1 and 2) and an optical ranging method (FIGS. 3 and 5) that simultaneously preserve the image data of objects at different distances by means of a time-multiplexed exposure mechanism or a spatially multiplexed exposure mechanism, thereby improving calculation accuracy.
Although the present invention is disclosed by the foregoing embodiments, they are not intended to limit the present invention. Any person skilled in the art to which the present invention pertains may make various changes and modifications without departing from the spirit and scope of the present invention. Therefore, the scope of protection of the present invention shall be defined by the appended claims.

Claims (21)

1. An optical ranging system, comprising:
an image sensor configured to acquire a first image with a first exposure time and a second image with a second exposure time, wherein the first exposure time is different from the second exposure time; and
a processing unit configured to receive the first image and the second image, segment the first image into a plurality of first image regions, segment the second image into a plurality of second image regions, compare signal features of corresponding first image regions and second image regions, and combine the first image regions having the larger signal feature and the second image regions having the larger signal feature into a combination image.
2. The optical ranging system according to claim 1, wherein the processing unit further calculates at least one object distance according to the combination image.
3. The optical ranging system according to claim 1, wherein the signal feature is a signal-to-noise ratio.
4. The optical ranging system according to claim 3, wherein signal data and noise data in the first image regions and the second image regions are distinguished according to a dynamic threshold.
5. The optical ranging system according to claim 1, wherein the processing unit controls the image sensor to alternately perform image acquisition with the first exposure time and the second exposure time.
6. The optical ranging system according to any one of claims 1-5, wherein
each of the first image regions is a pixel column region, a plurality of pixel column regions, a pixel row region, a plurality of pixel row regions or a rectangular pixel region of the first image; and
each of the second image regions is a pixel column region, a plurality of pixel column regions, a pixel row region, a plurality of pixel row regions or a rectangular pixel region of the second image.
7. An optical ranging system, comprising:
an image sensor configured to acquire a reference image with a reference exposure time and to acquire different image regions of a current image with a plurality of exposure times; and
a processing unit configured to receive the reference image, segment the reference image into a plurality of image regions, respectively calculate an average brightness of each of the image regions of the reference image, and control, according to the average brightness, the plurality of exposure times with which the image sensor acquires the different image regions of the current image.
8. The optical ranging system according to claim 7, wherein the processing unit further calculates at least one object distance according to the current image.
9. The optical ranging system according to claim 7, wherein at least a part of the plurality of exposure times for acquiring the different image regions of the current image are different from one another.
10. The optical ranging system according to claim 7, wherein, when the average brightness of one of the different image regions of the current image is not within a predetermined brightness range, the processing unit further controls, according to the average brightness of the different image regions of the current image, a plurality of exposure times with which the image sensor acquires different image regions of a next image.
11. The optical ranging system according to claim 7, wherein the processing unit determines the plurality of exposure times according to a comparison result between the average brightness of the image regions of the reference image and at least one threshold.
12. The optical ranging system according to any one of claims 7-11, wherein each of the different image regions is a pixel column region, a plurality of pixel column regions, a pixel row region, a plurality of pixel row regions or a rectangular pixel region.
13. A distance measuring method of an optical ranging system, the distance measuring method comprising:
acquiring a first image with a first exposure time using an image sensor;
acquiring a second image with a second exposure time using the image sensor;
segmenting the first image into a plurality of first image regions and calculating a first signal feature of each of the first image regions;
segmenting the second image into a plurality of second image regions and calculating a second signal feature of each of the second image regions;
comparing the first signal feature of each of the first image regions with the second signal feature of the corresponding second image region; and
combining the first image regions whose first signal feature is larger than the second signal feature and the second image regions whose second signal feature is larger than the first signal feature into a combination image.
14. The distance measuring method according to claim 13, further comprising:
calculating at least one object distance according to the combination image.
15. The distance measuring method according to claim 13, further comprising:
determining a signal-to-noise ratio of each of the first image regions according to a dynamic threshold to serve as the first signal feature; and
determining a signal-to-noise ratio of each of the second image regions according to the dynamic threshold to serve as the second signal feature.
16. The distance measuring method according to any one of claims 13-15, wherein
each of the first image regions is a pixel column region, a plurality of pixel column regions, a pixel row region, a plurality of pixel row regions or a rectangular pixel region of the first image; and
each of the second image regions is a pixel column region, a plurality of pixel column regions, a pixel row region, a plurality of pixel row regions or a rectangular pixel region of the second image.
17. A distance measuring method of an optical ranging system, the distance measuring method comprising:
acquiring a reference image with a reference exposure time using an image sensor;
segmenting the reference image into a plurality of image regions and calculating an average brightness of each of the image regions; and
acquiring different image regions of a current image respectively with a plurality of exposure times according to the average brightness using the image sensor.
18. The distance measuring method according to claim 17, further comprising:
calculating at least one object distance according to the current image.
19. The distance measuring method according to claim 17, wherein at least a part of the plurality of exposure times of the different image regions of the current image are different from one another.
20. The distance measuring method according to any one of claims 17-19, wherein each of the different image regions is a pixel column region, a plurality of pixel column regions, a pixel row region, a plurality of pixel row regions or a rectangular pixel region.
21. An optical ranging system, comprising:
an image sensor configured to acquire a first image and a second image respectively with different exposure times; and
a processing unit configured to receive the first image and the second image and to combine partial image regions of the first image and partial image regions of the second image into a combination image.
CN201410636269.9A 2014-11-12 2014-11-12 Optical ranging system and method Active CN105651245B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201410636269.9A CN105651245B (en) 2014-11-12 Optical ranging system and method
CN201810018459.2A CN108332720B (en) 2014-11-12 2014-11-12 Optical distance measuring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410636269.9A CN105651245B (en) 2014-11-12 Optical ranging system and method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201810018459.2A Division CN108332720B (en) 2014-11-12 2014-11-12 Optical distance measuring system

Publications (2)

Publication Number Publication Date
CN105651245A true CN105651245A (en) 2016-06-08
CN105651245B CN105651245B (en) 2018-02-09



Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6005936A (en) * 1996-11-28 1999-12-21 Ibm System for embedding authentication information into an image and an image alteration detecting system
US7171057B1 (en) * 2002-10-16 2007-01-30 Adobe Systems Incorporated Image blending using non-affine interpolation
US20130131473A1 (en) * 2011-11-18 2013-05-23 Pixart Imaging Inc. Optical distance measurement system and operation method thereof
CN104050651A (en) * 2014-06-19 2014-09-17 青岛海信电器股份有限公司 Scene image processing method and device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114785964A (en) * 2019-08-14 2022-07-22 原相科技股份有限公司 Image pickup system having two exposure modes
CN114785964B (en) * 2019-08-14 2024-03-01 原相科技股份有限公司 Image pickup system having two exposure modes
WO2021077358A1 (en) * 2019-10-24 2021-04-29 华为技术有限公司 Ranging method, ranging device, and computer-readable storage medium
CN114556048A (en) * 2019-10-24 2022-05-27 华为技术有限公司 Distance measuring method, distance measuring device and computer readable storage medium
CN114556048B (en) * 2019-10-24 2023-09-26 华为技术有限公司 Ranging method, ranging apparatus, and computer-readable storage medium

Also Published As

Publication number Publication date
CN108332720B (en) 2021-01-26
CN108332720A (en) 2018-07-27

Similar Documents

Publication Publication Date Title
USRE48595E1 (en) Method and system for determining optimal exposure of structured light based 3D camera
TWI713547B (en) Method and apparatus for determining a depth map for an image
US20240106971A1 (en) Method and system for generating at least one image of a real environment
WO2018161758A1 (en) Exposure control method, exposure control device and electronic device
Bimber et al. The visual computing of projector-camera systems
KR102086509B1 (en) Apparatus and method for obtaining 3d image
EP2824923B1 (en) Apparatus, system and method for projecting images onto predefined portions of objects
WO2019047985A1 (en) Image processing method and device, electronic device, and computer-readable storage medium
US20150207975A1 (en) Dct based flicker detection
CN107864342B (en) Image brightness adjusting method and device
US20160125616A1 (en) Apparatus and method of detecting motion mask
KR101941801B1 (en) Image processing method and device for led display screen
US11221207B2 (en) Optical distance measurement system
WO2019107060A1 (en) Illumination control system and illumination control method
EP3381015B1 (en) Systems and methods for forming three-dimensional models of objects
CN105744173B (en) A kind of method, device and mobile terminal of differentiation image front and back scene area
CN105872392B (en) Optical ranging system with the dynamic exposure time
WO2022198862A1 (en) Image correction method, and under-screen system
US20150229896A1 (en) Projector drift corrected compensated projection
JP2004133919A (en) Device and method for generating pseudo three-dimensional image, and program and recording medium therefor
WO2021190745A1 (en) Determination of illumination sections
CN105651245A (en) Optical ranging system and optical ranging method
KR101653649B1 (en) 3D shape measuring method using pattern-light with uniformity compensation
CN105651245B (en) Optical ranging system and method
CN108377383A (en) A kind of mostly projection 3D systems light field setting contrast method and its system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant