CN112399090A - Camera with two exposure modes and camera system using same - Google Patents


Info

Publication number
CN112399090A
CN112399090A (application CN202010734127.1A)
Authority
CN
China
Prior art keywords
image
camera
exposure
brightness
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010734127.1A
Other languages
Chinese (zh)
Other versions
CN112399090B (en)
Inventor
姚文翰
颜文正
林汉昌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/881,437 external-priority patent/US11614322B2/en
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN202210298031.4A priority Critical patent/CN114785964B/en
Publication of CN112399090A publication Critical patent/CN112399090A/en
Application granted granted Critical
Publication of CN112399090B publication Critical patent/CN112399090B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Abstract

An image pickup system includes a camera and a main controller. The camera determines the ambient brightness and performs activation event detection in a low-power mode. When the camera detects an activation event in the low-power mode, the main controller is woken up. The camera also determines an exposure mode according to the ambient brightness and informs the main controller of that exposure mode, so that after the main controller wakes up, its operation corresponds to the exposure mode selected by the camera.

Description

Camera with two exposure modes and camera system using same
Technical Field
The present invention relates to an image pickup system, and more particularly, to a camera for determining an exposure mode according to ambient brightness in a low power consumption mode and an image pickup system using the same.
Background
The optical ranging system may calculate the distance of an object using a triangulation method. For example, an optical rangefinder system may include a light source and a camera. The light source projects light towards an object to be measured, and the camera receives reflected light from the object to be measured to form an image frame. When the spatial relationship between the light source and the camera is known, the distance of the object to be measured can be calculated according to triangulation based on the position of the object image in the image frame.
However, when there are multiple targets with different distances in the space, the short-distance target may be over-exposed (over exposure), and the long-distance target may be under-exposed (under exposure), and the calculation accuracy of the optical ranging system may be reduced accordingly. In particular, when the exposure of the remote object to be measured is insufficient, the object distance of the remote object to be measured cannot be calculated.
Therefore, it is desirable for a camera system to obtain a high signal-to-noise ratio (SNR) in both bright and dark areas of the captured image, so as to increase the accuracy of subsequent judgment and control. Accordingly, a camera system capable of acquiring high-SNR images under different ambient brightness is needed.
Disclosure of Invention
The invention provides a camera that determines the exposure mode used during recording according to the ambient brightness, and a camera system using the camera, in which the main control device, when woken up to receive the first image frame from the camera, correctly corresponds to the exposure mode selected by the camera.
The invention provides a camera comprising an image sensor and a processing unit. The image sensor is used for sequentially generating images. The processing unit is coupled with the image sensor and used for judging the ambient brightness and carrying out activation event detection in a low energy consumption mode, when the activation event detection is true and the ambient brightness is larger than a first brightness threshold value, the image sensor is controlled to output a first exposure image and a second exposure image in a multiple exposure mode, and when the activation event detection is true and the ambient brightness is smaller than a second brightness threshold value, the image sensor is controlled to operate in a double gain mode and output a combined image.
The invention also provides a camera system comprising the camera and the main control device. The camera is used for judging ambient brightness and carrying out activation event detection in a low-energy-consumption mode, outputting a multi-exposure mode signal, a first exposure image and a second exposure image when the activation event detection is true and the ambient brightness is greater than a first brightness threshold value, and outputting a double-gain mode signal and a gain combined image when the activation event detection is true and the ambient brightness is less than a second brightness threshold value. The main control device is used for generating an exposure combination image according to the first exposure image and the second exposure image when receiving the multiple exposure mode signal and recording the gain combination image when receiving the double gain mode signal.
The invention also provides a camera system comprising the camera and the main control device. The camera is used for carrying out automatic exposure to determine exposure time and a gain value when an activation signal is generated in a low-power-consumption mode, calculating and outputting a brightness parameter according to the exposure time and the gain value, outputting different exposure images when the brightness parameter is smaller than a parameter threshold value, and outputting a gain combined image when the brightness parameter is larger than the parameter threshold value. The main control device is used for ending the low energy consumption mode when receiving the activation signal, receiving the brightness parameter and judging the data format of the image transmitted by the camera according to the received brightness parameter, wherein the difference between the activation signal and the brightness parameter is received by the main control device within a preset time interval, and the automatic exposure is completed within the preset time interval.
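As an illustration of the brightness-parameter branch above, the following Python sketch combines an assumed brightness parameter with the parameter-threshold comparison that selects the transmitted data format. The product of exposure time and gain is only an illustrative formula (a larger product means a darker scene, since more amplification was needed); the patent does not specify how the parameter is computed.

```python
def brightness_parameter(exposure_time_us: float, gain: float) -> float:
    # Illustrative assumption: combine the auto-exposure results into a
    # single parameter as their product; the patent only states the
    # parameter is calculated from the exposure time and gain value.
    return exposure_time_us * gain

def select_image_format(param: float, param_threshold: float) -> str:
    # Below the threshold (bright scene): two differently exposed images
    # for the multiple exposure mode. Above it (dark scene): one
    # gain-combined image from the dual gain mode.
    return "multi_exposure" if param < param_threshold else "dual_gain"
```

The main control device can apply the same comparison to the received brightness parameter to know which data format to expect from the camera.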
In order that the manner in which the above and other objects, features and advantages of the present invention are obtained will become more apparent, a more particular description of the invention is rendered below by reference to the appended drawings. In the description of the present invention, the same components are denoted by the same reference numerals.
Drawings
FIG. 1 is a block diagram of an optical ranging system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an optical ranging system according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a distance measuring method of an optical distance measuring system according to a first embodiment of the present invention;
FIG. 4A is a timing diagram of image acquisition of the optical ranging system according to the first embodiment of the present invention;
FIG. 4B is a schematic diagram illustrating an optical ranging system according to a first embodiment of the present invention;
FIG. 5 is a flowchart illustrating a distance measuring method of an optical distance measuring system according to a second embodiment of the present invention;
FIG. 6A is a timing diagram of image acquisition of an optical ranging system according to a second embodiment of the present invention;
FIG. 6B is a diagram illustrating an optical ranging system according to a second embodiment of the present invention;
FIG. 7 is a block diagram of a camera system according to an embodiment of the present invention;
FIG. 8 is a block diagram illustrating an exemplary embodiment of a camera system operating in a low power mode;
FIG. 9 is a block diagram illustrating an embodiment of a camera system operating in a dual gain mode;
FIG. 10 is a block diagram illustrating an exemplary embodiment of a camera system operating in a multiple exposure mode;
FIG. 11A is a diagram illustrating ambient brightness and brightness thresholds of a camera system according to an embodiment of the present invention;
FIG. 11B is a schematic diagram of a luminance parameter and a parameter threshold of the image capturing system according to an embodiment of the invention; and
FIG. 12 is a flowchart of an operating method of the image capturing system according to an embodiment of the invention.
Description of the reference numerals
700 camera system
71 camera
711 image sensor
713 internal processing unit
73 main control device
731 external processor
733 image recorder
79 external device
St1, St2 activation signals
Im image
Sgain_c gain control signal
Sexp_c exposure control signal
Detailed Description
Fig. 1 is a block diagram of an optical ranging system according to an embodiment of the invention. The optical ranging system 1 includes an image sensor 11 and a processing unit 13. The image sensor 11 is preferably an active image sensor, such as a complementary metal-oxide-semiconductor (CMOS) image sensor, which can change the exposure time for acquiring the image F, or acquire different image areas of the image F with a plurality of exposure times respectively (described in detail later).
The processing unit 13, which may be a Digital Signal Processor (DSP), a Micro Control Unit (MCU), a Central Processing Unit (CPU), etc., is configured to receive the image F output by the image sensor 11 for post-processing and control image acquisition of the image sensor 11. In one embodiment, the processing unit 13 may include an exposure control unit 131, a multiplexing module 133, and a distance calculation unit 135; the exposure control unit 131, the multiplexing module 133 and the distance calculation unit 135 are data processing units in the processing unit 13, and can be implemented in software or hardware without specific limitation. It is understood that although the processing unit 13 is divided into different work modules in fig. 1 for convenience of illustration, all functions performed by the work modules in the processing unit 13 can be said to be performed by the processing unit 13.
The exposure control unit 131 is used to control the image sensor 11 to obtain all image areas of different images F with different exposure times (i.e. one image corresponds to one exposure time), or to obtain different image areas of the same image F with multiple exposure times (i.e. one image corresponds to multiple exposure times). The multiplexing module 133 processes the image F received by the processing unit 13 by using time multiplexing or spatial multiplexing and generates an image Fm to be calculated (e.g., a combined image and a current image described later in this specification). The distance calculation unit 135 calculates at least one object distance from the image Fm to be calculated by using a predetermined algorithm, for example, by using a triangulation method.
Fig. 2 is a schematic view of an optical ranging system according to an embodiment of the invention. The optical ranging system 1 may further include a light source 15 for projecting a two-dimensional light region (e.g., a light line with a predetermined width) onto the object 9; the light source 15 can be, for example, a coherent light source, a partially coherent light source, or a non-coherent light source, and is not particularly limited to emitting visible or invisible light. After the image sensor 11 receives the reflected light of the object 9, an image F including a reflected light image I9 is generated and transmitted to the processing unit 13. The processing unit 13 first generates the image Fm to be calculated from the image F by using the multiplexing mechanism of the present invention (described in detail later), and calculates at least one object distance D from the image Fm to be calculated, wherein the image Fm to be calculated likewise contains the reflected light image I9. More specifically, at least a portion of the plurality of exposure times corresponding to different image areas of the image Fm to be calculated may differ from each other (described in detail in the examples later), so that the brightness of the reflected light image I9 in each image area is suitable for calculating the object distance D. Furthermore, in some embodiments, the processing unit 13 may output the image Fm to be calculated, in a wired or wireless manner, to an external device for post-processing, such as an external host. It should be noted that although the two-dimensional light region projected by the light source 15 is shown as discontinuous in fig. 2, this is only for illustration and does not limit the present invention.
In one embodiment, the processing unit 13 may include a storage unit (not shown) for storing a look-up table including the relationship between the position of the reflected light image I9 and the object distance D. Thus, after the processing unit 13 obtains the position of the reflected light image I9 in the image Fm to be calculated, at least one object distance D can be directly obtained according to the comparison table; the comparison table is calculated according to the spatial relationship (e.g., distance L) between the light source 15 and the image sensor 11 and the illumination angle of the light source 15, and is stored in the storage unit in advance. In another embodiment, the storage unit of the processing unit 13 may store a distance algorithm, and when the position of the reflected light image I9 in the image Fm to be calculated is obtained, the distance algorithm may be used to calculate at least one object distance D.
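The look-up-table approach can be sketched as follows. The triangulation geometry assumed here (light beam parallel to the optical axis, so that object distance equals baseline times focal length divided by the spot offset on the sensor) and all numeric parameters are illustrative assumptions, not values from the patent:

```python
def build_lookup_table(baseline_mm: float, focal_mm: float,
                       pixel_pitch_mm: float, n_cols: int) -> dict:
    # Precompute, for each pixel column the reflected spot can land on,
    # the corresponding object distance by triangulation. Column 0 (the
    # optical center, zero offset) is skipped to avoid division by zero.
    table = {}
    for col in range(1, n_cols):
        x = col * pixel_pitch_mm              # spot offset on the sensor
        table[col] = baseline_mm * focal_mm / x
    return table

# At run time the processing unit only needs the column position of the
# reflected light image I9 to read out the object distance D directly.
lut = build_lookup_table(baseline_mm=50.0, focal_mm=4.0,
                         pixel_pitch_mm=0.01, n_cols=100)
```

A larger spot offset maps to a shorter distance, which is why the table is monotonically decreasing in the column index under this assumed geometry.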
In the embodiment of the present invention, since the light source 15 is used to project a two-dimensional light region, the image F output by the image sensor 11 includes the linear reflected light image I9, and the processing unit 13 can calculate a plurality of object distances (different objects correspond to different sections of the reflected light image and are located at different positions) at the same time, so that the present invention has better applicability. Finally, the processing unit 13 outputs the calculated object distance D for corresponding control, for example, to a host or a computer system; the control function of the object distance D depends on different applications.
Referring to fig. 3, a flowchart of a distance measuring method of an optical distance measuring system according to a first embodiment of the invention is shown, which includes the following steps: acquiring a first image at a first exposure time (step S31); acquiring a second image with a second exposure time (step S32); dividing the first image into a plurality of first image regions and calculating a first signal characteristic of each of the first image regions (step S33); dividing the second image into a plurality of second image regions and calculating a second signal characteristic of each of the second image regions (step S34); comparing the first signal characteristic with the second signal characteristic (step S35); and combining the first image region where the first signal characteristic is greater than the second signal characteristic and the second image region where the second signal characteristic is greater than the first signal characteristic into a combined image (step S36).
Referring to fig. 1-3 and fig. 4A-4B, a detailed implementation of the first embodiment of the present invention will be described. The processing unit 13 controls the light source 15 to be turned on when the image sensor 11 acquires the image F, so that the image F acquired by the image sensor 11 includes the reflected light image I9 from the object 9, thereby calculating the object distance D of the object 9.
Step S31: the image sensor 11 is controlled by the exposure control unit 131 of the processing unit 13 to acquire a first image F_S with a first exposure time ET_S.
Step S32: then, the image sensor 11 is controlled by the processing unit 13 to acquire a second image F_L with a second exposure time ET_L, wherein the first image F_S and the second image F_L are two images F acquired by the image sensor 11 either consecutively or at least one image apart, and the first exposure time ET_S is different from the second exposure time ET_L. It should be noted that although fig. 4A shows the first exposure time ET_S as less than the second exposure time ET_L, the present invention is not limited thereto; in some embodiments, the first exposure time ET_S is greater than the second exposure time ET_L. In one embodiment, the exposure control unit 131 of the processing unit 13 controls the image sensor 11 to acquire images alternately with the first exposure time ET_S and the second exposure time ET_L.
Step S33: after the processing unit 13 receives the first image F_S, the multiplexing module 133 divides the first image F_S in a predetermined manner into a plurality of first image regions, e.g., A1-A4 (fig. 4B), and calculates a first signal characteristic C1-C4 (fig. 4B) of each first image region A1-A4; wherein each of the first image regions A1-A4 may be a column of pixels of the first image F_S, a plurality of pixel columns, a row of pixels, a plurality of pixel rows, or a rectangular pixel region, and is not limited to that shown in fig. 4B. In one embodiment, the signal characteristics C1-C4 are the signal-to-noise ratios (SNRs) of the first image regions A1-A4, respectively; for example, the multiplexing module 133 distinguishes signal data and noise data according to a dynamic threshold in each of the first image regions A1-A4, and calculates, in each first image region A1-A4, the ratio of the sum of the energy values of all signal data to the sum of the energy values of all noise data as the signal-to-noise ratio. In one embodiment, the dynamic threshold is selected as the average of the maximum energy value and the average energy value in one first image region, but the invention is not limited thereto; each of the first image regions A1-A4 can thus have its own threshold value. Since the threshold of each of the first image regions A1-A4 is calculated from the acquired image data and may thus differ from region to region, it is referred to as a dynamic threshold in this description.
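The dynamic-threshold SNR of steps S33/S34 can be sketched as follows, treating an image region as a flat list of pixel energy values (an illustrative simplification):

```python
def region_snr(region: list) -> float:
    # Dynamic threshold: average of the region's maximum energy value
    # and its mean energy value (per the embodiment described above).
    mx = max(region)
    mean = sum(region) / len(region)
    threshold = (mx + mean) / 2
    # Pixels above the threshold count as signal data, the rest as
    # noise data; the SNR is the ratio of the two energy sums.
    signal = sum(v for v in region if v > threshold)
    noise = sum(v for v in region if v <= threshold)
    return signal / noise if noise else float("inf")
```

For a region [10, 10, 10, 10, 100] the threshold is (100 + 28) / 2 = 64, so the single bright pixel is signal and the rest are noise, giving an SNR of 100 / 40 = 2.5.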
Step S34: in the same way, after the processing unit 13 receives the second image F_L, the multiplexing module 133 divides the second image F_L in the predetermined manner (the same as in step S33) into a plurality of second image regions, e.g., A1'-A4' (fig. 4B), and calculates a second signal characteristic C1'-C4' (fig. 4B) of each of the second image regions A1'-A4'; wherein each of the second image regions A1'-A4' may be a column of pixels of the second image F_L, a plurality of pixel columns, a row of pixels, a plurality of pixel rows, or a rectangular pixel region, and is not limited to that shown in fig. 4B. Similarly, the signal characteristics C1'-C4' may be the signal-to-noise ratios (SNRs) of the second image regions A1'-A4', respectively; for example, the multiplexing module 133 distinguishes signal data and noise data according to a dynamic threshold in each of the second image regions A1'-A4', and calculates the ratio of the sum of the energy values of all signal data to the sum of the energy values of all noise data as the signal-to-noise ratio. The dynamic threshold is determined as described in step S33, and its detailed description is therefore omitted.
Step S35: then, the multiplexing module 133 compares the signal characteristics of the corresponding first image regions A1-A4 and second image regions A1'-A4'; for example, it compares the first signal characteristic C1 of the first image region A1 with the second signal characteristic C1' of the second image region A1', the first signal characteristic C2 of the first image region A2 with the second signal characteristic C2' of the second image region A2', the first signal characteristic C3 of the first image region A3 with the second signal characteristic C3' of the second image region A3', and the first signal characteristic C4 of the first image region A4 with the second signal characteristic C4' of the second image region A4'.
Step S36: next, the multiplexing module 133 uses a time multiplexing mechanism to combine a part of the image regions of the first image F_S with a part of the image regions of the second image F_L to generate a combined image Fm. In one embodiment, the multiplexing module 133 combines the first image regions having the larger signal characteristics and the second image regions having the larger signal characteristics into the combined image Fm. For example, assuming that the first signal characteristics C1 and C4 are larger than the second signal characteristics C1' and C4', respectively, the first image regions A1 and A4 are better suited for calculating correct object distances than the second image regions A1' and A4'; and assuming that the first signal characteristics C2 and C3 are smaller than the second signal characteristics C2' and C3', respectively, the second image regions A2' and A3' are better suited for calculating correct object distances than the first image regions A2 and A3. The multiplexing module 133 then recombines the combined image Fm from the image regions A1, A2', A3' and A4, as shown in fig. 4B.
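Steps S35-S36, comparing per-region SNRs and keeping the better region from each pair, can be sketched as follows (regions again simplified to flat lists of pixel energies; the SNR routine is redefined inline so the example is self-contained):

```python
def combine_images(first_regions: list, second_regions: list) -> list:
    # For each pair of corresponding regions, keep the one whose
    # dynamic-threshold SNR is larger (time multiplexing mechanism).
    def region_snr(region):
        mx, mean = max(region), sum(region) / len(region)
        thr = (mx + mean) / 2                 # dynamic threshold
        sig = sum(v for v in region if v > thr)
        noi = sum(v for v in region if v <= thr)
        return sig / noi if noi else float("inf")

    return [a if region_snr(a) > region_snr(b) else b
            for a, b in zip(first_regions, second_regions)]
```

Because each region keeps its original position, the combined image Fm has the same size as the images F it was recombined from, as the embodiment requires.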
It will be appreciated that although fig. 4B shows the combined image Fm as comprising partial image regions of the first image F_S (e.g., A1, A4) and partial image regions of the second image F_L (e.g., A2', A3'), the invention is not limited thereto. Depending on the images F actually acquired by the image sensor 11, the combined image Fm may be identical to the first image F_S or to the second image F_L.
Finally, the distance calculation unit 135 of the processing unit 13 then calculates at least one object distance D from the combined image Fm. It should be noted that the number of at least one object distance in the present embodiment may be determined according to the number of pixel columns of the image F, for example, each pixel column obtains a corresponding object distance or each of a plurality of pixel columns (for example, 2-5 columns) obtains a corresponding object distance, depending on the determination resolution. The distance calculating unit 135 can determine the number of objects to be measured according to the determined object distances and combine the object distances related to the same object to be measured into the same object distance, so that the distance calculating unit 135 only outputs the object distances D corresponding to the number of the objects to be measured.
In addition, although fig. 4A and 4B show the processing unit 13 comparing signal characteristics of different image areas of two images F to generate the combined image Fm, the invention is not limited thereto. In some embodiments, the processing unit 13 may compare the signal characteristics of different image regions of more than two images F to generate the combined image; in such implementations, step S36 simply selects, for each corresponding image region, the image with the largest signal characteristic to generate the combined image Fm, and the other steps S31-S35 are similar to the first embodiment and are not described again here. In other words, the multiplexing module 133 of this embodiment divides each image F acquired by the image sensor 11 into identical (e.g., same position and same size) image areas, so that the combined image Fm and the images F have the same size.
In summary, in the above embodiment, the processing unit 13 may recombine different partial image regions of different image frames into a combined image according to the image quality of those partial image regions, so as to calculate at least one object distance from the combined image, wherein the shape and size of the partial image regions are not particularly limited. For example, the processing unit 13 may recombine some of the image regions of the first image F_S (e.g., some of A1-A4) with some of the image regions of the second image F_L (e.g., some of A1'-A4') into the combined image Fm according to image quality (e.g., signal characteristics).
Referring to fig. 5, a flowchart of a distance measuring method of an optical distance measuring system according to a second embodiment of the present invention is shown, which includes the following steps: acquiring a reference image at a reference exposure time (step S51); dividing the reference image into a plurality of image regions and calculating an average luminance of each of the image regions (step S52); and acquiring different image areas of the current image for a plurality of exposure times respectively according to the average brightness (step S53).
Referring to fig. 1-2, fig. 5 and fig. 6A-6B, a detailed implementation of the second embodiment of the present invention will be described. Similarly, the processing unit 13 also controls the light source 15 to be turned on when the image sensor 11 acquires the image F.
Step S51: the image sensor 11 is controlled by the exposure control unit 131 of the processing unit 13 to acquire a reference image F_T with a reference exposure time ETr. In this embodiment, the reference image F_T is used for determining the plurality of exposure times ET with which the current image (e.g., F_T+1) is acquired, and is not used to calculate the object distance D.
Step S52: after the processing unit 13 receives the reference image F_T, the multiplexing module 133 uses a spatial multiplexing mechanism to calculate, from the reference image F_T, the plurality of exposure times with which the image Fm to be calculated is acquired. For example, the multiplexing module 133 divides the reference image F_T into a plurality of image areas A1-A4 (fig. 6B) and calculates the average brightness AV1-AV4 (fig. 6B) of the image areas A1-A4, respectively; wherein each of the different image areas A1-A4 may be a column of pixels of the current image F_T+1, a plurality of pixel columns, a row of pixels, a plurality of pixel rows, or a rectangular pixel region, and is not limited to that shown in fig. 6B.
Step S53: finally, the exposure control unit 131 of the processing unit 13 controls, according to the average brightness AV1-AV4, the plurality of exposure times ET1-ET4 (figs. 6A-6B) with which the image sensor 11 acquires the different image areas A1-A4 of the current image F_T+1. In one embodiment, the multiplexing module 133 of the processing unit 13 determines the plurality of exposure times ET1-ET4 by comparing the average brightness AV1-AV4 of the image areas A1-A4 of the reference image F_T with at least one threshold; for example, when the multiplexing module 133 determines that the average brightness AV1 is between two of the plurality of thresholds (i.e., within one of a plurality of brightness intervals), it directly determines, from the exposure times preset and stored for those two thresholds, that the exposure time for acquiring the image area A1 of the current image F_T+1 is ET1, and the exposure times ET2-ET4 of the other image areas A2-A4 are determined in the same manner. In this embodiment, the current image F_T+1 is then the image Fm to be calculated.
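The threshold-interval selection of step S53 can be sketched as follows; the threshold and exposure-time values used in the example are illustrative assumptions, since the patent only states that exposure times are preset and stored per brightness interval:

```python
def exposure_for_region(avg_brightness: float,
                        thresholds: list, exposure_steps: list) -> float:
    # thresholds: ascending brightness thresholds dividing the range
    # into intervals; exposure_steps: one preset exposure time per
    # interval (one more entry than thresholds). Brighter regions map
    # to shorter exposure times.
    for i, t in enumerate(thresholds):
        if avg_brightness < t:
            return exposure_steps[i]
    return exposure_steps[-1]

# Illustrative presets: three thresholds, four exposure times (ms).
THRESHOLDS = [64, 128, 192]
EXPOSURES_MS = [8, 4, 2, 1]
```

Applying this per image area yields the exposure times ET1-ET4 with which the different areas of the current image are then acquired.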
Finally, the distance calculating unit 135 of the processing unit 13 calculates at least one object distance D from the current image F_T+1.
In another embodiment, the multiplexing module 133 can adjust the exposure time by only one step at a time, so it may not be possible, based on only one reference image F_T, to adjust the exposure times ET1-ET4 of all image areas A1-A4 of the current image F_T+1 to their target values. In this case, when the average brightness of any of the different image areas A1-A4 of the current image F_T+1 is not within a preset brightness range, the exposure control unit 131 of the processing unit 13 controls, according to the average brightness of the different image areas A1-A4 of the current image F_T+1, the plurality of exposure times with which the image sensor 11 acquires the different image areas A1'-A4' of the next image F_T+2 (fig. 6A). When the multiplexing module 133 of the processing unit 13 determines that the average brightness of all the image areas A1'-A4' of the next image F_T+2 is within the preset brightness range and is suitable for calculating the object distance, the distance calculating unit 135 of the processing unit 13 calculates at least one object distance D from the next image F_T+2. It can be understood that the plurality of exposure times of the different image areas A1'-A4' of the next image F_T+2 may be partially equal to, or entirely different from, the exposure times of the different image areas A1-A4 of the current image F_T+1, depending on the average brightness of the different image areas A1-A4 of the current image F_T+1. When the average brightness of the different image areas of the next image F_T+2 is not within the preset brightness range, the adjustment may continue until the average brightness of all the image areas is within the preset brightness range.
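The one-step-at-a-time adjustment described above can be sketched as a single iteration that is repeated over successive images until all regions fall inside the preset brightness range; the step size and numeric ranges here are illustrative assumptions:

```python
def adjust_exposures(avg_brightness: list, exposures: list,
                     target_range: tuple, step: int = 1) -> list:
    # One adjustment iteration: each region whose average brightness is
    # outside the preset range has its exposure time moved by a single
    # step for the next image; in-range regions keep their exposure.
    low, high = target_range
    nxt = []
    for avg, et in zip(avg_brightness, exposures):
        if avg < low:
            nxt.append(et + step)     # too dark: expose longer
        elif avg > high:
            nxt.append(et - step)     # too bright: expose shorter
        else:
            nxt.append(et)            # in range: unchanged
    return nxt
```

Repeating this over successive images converges every image area into the preset brightness range, at which point the image can be used for distance calculation.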
It should be noted that although step S51 above is illustrated with the image sensor 11 being exposed with a single reference exposure time ETr, the image sensor 11 may acquire all image areas of the reference image FT with the same reference exposure time ETr, such as the image areas A1-A4 shown in fig. 6B.
It should be noted that although in the second embodiment above the reference image FT is not used to calculate the object distance D, when the average brightness AV1-AV4 of all the image areas A1-A4 of the reference image FT is within a preset brightness range, the distance calculating unit 135 can directly calculate the object distance D based on the reference image FT, without the multiplexing module 133 informing the exposure control unit 131 to control the image sensor 11 to acquire the current image FT+1 with different exposure times ET; the preset brightness range can be set in advance and stored in the storage unit.
Similarly, the number of the at least one object distance D in the present embodiment can be determined according to, for example, the number of pixel rows of the image F and the number of objects 9, and is not particularly limited.
It should be noted that although fig. 6A shows different exposure times ET1-ET4 for each of the image areas A1-A4, this is only illustrative and not intended to limit the present invention. Depending on the actually acquired image content, only at least a portion of the plurality of exposure times ET1-ET4 of the different image areas A1-A4 used for acquiring the current image FT+1 may differ from each other.
In addition, in order to further eliminate the influence of ambient light, the processing unit 13 may be further configured to control the light source 15 to be turned on and off in correspondence with the image acquisition of the image sensor 11, for example, to acquire a bright image when the light source 15 is turned on and a dark image when the light source 15 is turned off. The processing unit 13 may then calculate a difference image of the bright image and the dark image to serve as the first image FS and the second image FL of the first embodiment described above, or as the reference image FT, the current image FT+1 and the next image FT+2 of the second embodiment described above.
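The bright/dark difference operation can be sketched in a few lines; representing frames as lists of 8-bit pixel rows and clamping at zero are assumptions made for illustration:

```python
# Minimal sketch of the ambient-light cancellation described above: the
# difference of a bright image (light source on) and a dark image (light
# source off) removes the ambient contribution common to both frames.
def difference_image(bright, dark):
    """Per-pixel difference of two equal-sized frames, clamped at zero."""
    return [[max(b - d, 0) for b, d in zip(row_b, row_d)]
            for row_b, row_d in zip(bright, dark)]
```

A pixel lit only by ambient light cancels to (near) zero, while a pixel also lit by the light source 15 keeps the source's contribution.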
In the above embodiments, the multiplexing module 133 of the processing unit 13 is configured to divide the image F in order to calculate signal characteristics, such as signal-to-noise ratio or average brightness, of different image regions, so as to determine the image Fm to be output for the distance calculating unit 135 to calculate at least one object distance D. In the first embodiment, the exposure control unit 131 controls the image sensor 11 to acquire different images (e.g., FS and FL) with preset exposure times; therefore, the exposure times with which the exposure control unit 131 controls the image sensor 11 to acquire the different images are preset fixed values (e.g., ETS and ETL of fig. 4A). In the second embodiment, the multiplexing module 133 determines the exposure times corresponding to the different image areas according to the average brightness of those areas and notifies the exposure control unit 131; therefore, the exposure times with which the exposure control unit 131 controls the image sensor 11 to acquire the different image areas may not be preset fixed values but are determined according to an actual calculation result (e.g., the average brightness).
The present invention also provides a camera system that uses the above method of obtaining a combined image (e.g., Fm of fig. 4B) together with a dual-gain detection technique, so as to obtain a better signal-to-noise ratio both for dark regions of an image frame acquired in a bright environment and for bright regions of an image frame acquired in a dark environment.
The combined image of the above embodiments is obtained using a so-called multiple-exposure (DOL) detection technique. By using the multiple exposure detection technique, the dark area of the combined image (synthesized image) acquired under bright ambient light has a better signal-to-noise ratio. In addition to the manner of obtaining the combined image in the above embodiments, the present invention is applicable to other DOL detection techniques, such as, but not limited to, those of U.S. Patent Publication Nos. US 2016/0119575 and US 2017/0339325, the entire contents of which are incorporated herein by reference.
By using a dual gain (DCG) detection technique, a bright area of the combined image obtained under dark ambient light has a better signal-to-noise ratio. The present invention is applicable to any suitable DCG detection technique; see, for example, but not limited to, U.S. Patent Publication Nos. US 2004/0251394 and US 2007/0013797, the entire contents of which are incorporated herein by reference.
Fig. 7 is a block diagram of a camera system 700 according to an embodiment of the present invention. The camera system 700 includes a camera 71 and a main control device 73 connected to each other by wire or wirelessly. After the camera system 700 enters the low power consumption mode, the main control device 73 stops receiving or recording images from the camera 71 to reduce the power consumption of the system; this state is therefore referred to as the low power consumption mode in the present invention. The manner of entering the low power mode is known and not a primary objective of the present invention, and therefore will not be described herein.
A feature of the present invention is that, when the camera system 700 in the low power consumption mode is woken up by an activation event (which can be detected by the camera 71 or by an external device 79), the main control device 73 can correctly correspond to the operation mode of the camera 71 upon receiving the first image frame transmitted from the camera 71. In the present invention, the operation mode includes a multiple exposure (DOL) mode or a dual gain (DCG) mode.
The camera 71 includes an image sensor 711 and a processing unit 713; fig. 7 labels the processing unit 713 as an internal processing unit to indicate that it is located inside the camera 71. In the present invention, the functions performed by the image sensor 711 and the processing unit 713 may be regarded as being performed by the camera 71.
The image sensor 711 is, for example, a CCD image sensor or a CMOS image sensor, and is configured to detect light in its field of view and generate an image Im to the processing unit 713.
The processing unit 713 is, for example, an application specific integrated circuit (ASIC) or a digital signal processor (DSP). In the low power mode, the processing unit 713 is used for determining the ambient brightness and performing activation event detection. When the activation event is detected as true and the ambient brightness is greater than a first brightness threshold, the processing unit 713 controls the image sensor 711 to output a first exposure image and a second exposure image in a multiple exposure mode; referring to fig. 10, for example, image 1 is acquired according to the image-capturing parameters AGain1 and Texp1 and image 2 is acquired according to the image-capturing parameters AGain2 and Texp2. When the activation event is detected as true and the ambient brightness is less than a second brightness threshold, the processing unit 713 controls the image sensor 711 to operate in a dual gain mode and output a combined image; referring to fig. 9, for example, one image is acquired according to the exposure time Texp and a bright image and a dark image are generated with the gain values AGainL and AGainS respectively, for the camera 71 to perform image synthesis.
In one embodiment, the first brightness threshold is the same as the second brightness threshold, such as Thd of fig. 11A. In another embodiment, the first brightness threshold is different from the second brightness threshold, such as TH1 and TH2 of fig. 11A. When the ambient brightness is between the first brightness threshold and the second brightness threshold, the camera 71 may adopt either the multiple exposure mode or the dual gain mode after waking up, because the difference between the signal-to-noise ratios of the bright and dark regions of the images obtained by the two operation modes is small. For example, the camera system 700 (including the camera 71 and the main control device 73) may operate in the mode it used before the low power mode was last entered, but is not limited thereto.
In another embodiment, when the ambient brightness is between the first brightness threshold and the second brightness threshold, the camera 71 is preset to use one of the multiple exposure mode or the dual gain mode (i.e., one of the multiple exposure mode and the dual gain mode is a preset mode) after waking up. In another embodiment, the camera 71 is configured to use one of a multiple exposure mode or a dual gain mode after being powered on.
The camera 71, more precisely the processing unit 713, may determine whether the activation event detection is true based on its own detection result or based on a detection result of an external device 79, wherein the external device 79 includes, for example, a thermal sensor (PIR sensor), a door sensor, a doorbell, a touch pad, or another detector capable of detecting moving objects or living bodies. In the present invention, an activation event being detected as true indicates the presence of a person, or any other condition requiring video recording or surveillance, in the field of view of the camera 71.
In one embodiment, the processing unit 713 of the camera 71 performs the activation event detection based on the image Im generated by the image sensor 711. For example, when the image sensor 711 acquires an image of a person or a moving object, the processing unit 713 generates an activation signal St1 and sends it to the main control device 73 to wake up it and start recording and related control.
In another embodiment, the processing unit 713 of the camera 71 performs the activation event detection based on a detection signal generated by a thermal sensor external to the camera 71. For example, the thermal sensor acquires a thermal image of a person and generates the activation signal St2. In one embodiment, the activation signal St2 is sent to both the camera 71 and the main control device 73 to wake them up. In another embodiment, the activation signal St2 is first sent to the camera 71 to wake up the camera 71, and then the camera 71 sends the activation signal St1 to the main control device 73 to wake it up. That is, when the processing unit 713 confirms that the activation event has occurred according to the activation signal St2 of the external thermal sensor, the processing unit 713 generates an activation signal St1 to the main control device 73 to wake it up.
In another embodiment, the processing unit 713 of the camera 71 performs the activation event detection based on a press signal generated by a doorbell or a touch pad (touch panel) external to the camera 71. This embodiment is suitable for applications in which the activation signal St2 is generated by a visitor pressing a doorbell or a touch pad. Similarly, the activation signal St2 may be sent to the camera 71 and the main control device 73 simultaneously, or may be sent to the camera 71 first, depending on the application. For example, when the processing unit 713 confirms that an activation event has occurred according to the external doorbell or touch pad, the processing unit 713 generates an activation signal St1 to the main control device 73 to wake it up.
Waking up the camera 71 may mean that the camera 71 acquires images at a higher frame rate and determines an operation mode according to the ambient brightness. Waking up the main control device 73 may mean that the main control device 73 starts to receive and record the different exposure images or the combined image (described later) transmitted from the camera 71, and performs corresponding control according to the received images, such as unlocking a door lock, turning on a light source, or notifying related personnel.
The camera 71 (more precisely, the processing unit 713) can determine the ambient brightness according to its own detection and computation or according to the detection and computation of the external device 79, wherein the external device 79 includes at least one of a light source and an ambient light sensor.
In one embodiment, the processing unit 713 of the camera 71 determines the ambient brightness based on the gain value and/or exposure time obtained by an auto-exposure process. As mentioned above, the present invention is primarily applied when waking up the camera system 700, and thus the auto exposure is preferably a fast auto exposure. For example, the fast auto exposure starts to be performed when the processing unit 713 generates the activation signal St1 or receives the activation signal St2. The fast auto exposure uses a higher frame rate (and thus a shorter frame period) than the camera 71 uses in the normal mode; for example, the frame rate of the camera 71 in the normal mode is 30 frames/second, while the fast auto exposure runs at a rate of at least 240 frames/second. The camera 71 completes the auto-exposure process within a predetermined time (for example, but not limited to, 50 ms) after the activation signal is generated, before the auto-exposure parameter or the brightness parameter is sent to the main control device 73. The processing unit 713 may determine the ambient brightness according to the gain value, the exposure time, or the product (gain value × exposure time) obtained when the auto-exposure process is completed, or may use a function of the gain value and the exposure time (e.g., LGEP = 64 × log2(gain value × exposure time) + 512) as a brightness parameter, as described below.
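A minimal sketch of the example brightness-parameter function quoted above; the form LGEP = 64 × log2(gain × exposure time) + 512 is a reconstruction of the garbled expression in the translated text, and the units are illustrative:

```python
import math

# Sketch of the brightness parameter computed from the fast auto-exposure
# result. A brighter scene needs a smaller gain*exposure product, so LGEP
# is smaller when the ambient brightness is higher, as the text notes.
def lgep(gain, exposure_time):
    return 64 * math.log2(gain * exposure_time) + 512
```

Doubling the gain-exposure product raises LGEP by exactly 64, so each 64-count band of LGEP corresponds to one stop of scene brightness.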
In another embodiment, the processing unit 713 of the camera 71 determines the ambient brightness based on a sum or average of intensities of the image Im generated by the image sensor 711 (e.g., after an auto-exposure process is completed) in the low power mode.
In another embodiment, the processing unit 713 of the camera 71 determines the ambient brightness according to the driving intensity or the operating frequency of the light source in the low power mode, wherein the light source can be disposed on the camera 71 or separately to provide illumination during image capturing. For example, when the ambient brightness is low (e.g., evening hours), the driving power or actuation frequency of the light source is high; when the ambient brightness is high (e.g., daytime hours), the driving power or actuation frequency of the light source is low.
In another embodiment, the processing unit 713 of the camera 71 determines the ambient brightness according to the frequency of the image sensor 711 operating in the low power mode, wherein the camera 71 acquires images in the low power mode at a lower frame rate than in the normal mode. For example, when the ambient brightness is low (e.g., during the evening hours), the frequency of operation of the image sensor 711 is high; when the ambient brightness is high (e.g., during daytime), the activation frequency of the image sensor 711 is low.
In another embodiment, the processing unit 713 of the camera 71 determines the ambient brightness according to a detection signal of an ambient light sensor outside the camera 71 in the low power consumption mode, wherein the operation of the ambient light sensor is known and therefore not described herein again. In another embodiment, the ambient light sensor is built into the camera 71.
Referring to fig. 7 again, the main control device 73 includes a processor 731 and an image recorder 733. Here, fig. 7 labels the processor 731 as an external processor to show that it is located outside the camera 71. The processor 731 is, for example, a central processing unit (CPU) or a microcontroller (MCU) of the main control device 73. The image recorder 733 includes storage, such as volatile and/or non-volatile storage, for recording images from the camera 71, which images can be transmitted to a display (not shown) for playback.
In one embodiment, the main control device 73 receives from the camera 71 a mode signal related to the post-wake-up exposure mode, so as to know the data format of the images it will receive (e.g., starting with the first image) when it is woken up. In other words, when the camera 71 determines that the activation event is detected as true (i.e., the activation signal is generated) and the ambient brightness is greater than the first brightness threshold, in addition to outputting the first exposure image and the second exposure image, it further outputs a multiple exposure mode signal, represented by at least one digital value 1 or 0, for example, to the main control device 73; when the camera 71 determines that the activation event is detected as true and the ambient brightness is less than the second brightness threshold, in addition to outputting the combined image (sometimes referred to herein as a gain-combined image to indicate how it is generated), it further outputs a dual gain mode signal, represented by at least one digital value 0 or 1, for example, to the main control device 73.
The main control device 73 starts a wake-up procedure when receiving the activation signal St1 or St2, wherein the wake-up procedure of the main control device 73 includes starting to receive and record images from the camera 71. In addition, when the main control device 73 (more specifically, its processor 731) receives the multiple exposure mode signal, an exposure-combined image is generated from the first exposure image and the second exposure image.
In one embodiment, the first exposure image, the second exposure image and the exposure-combined image are respectively the first image FS, the second image FL and the combined image Fm of the above embodiments; see fig. 4B. That is, the camera 71 acquires a first exposure image with a first exposure time and a second exposure image with a second exposure time, the first exposure time being different from the second exposure time. The main control device 73 divides the first exposure image into a plurality of first image areas, divides the second exposure image into a plurality of second image areas, compares the signal characteristics of corresponding first and second image areas, and combines the first image areas with the larger signal characteristic and the second image areas with the larger signal characteristic into the exposure-combined image, so that the exposure-combined image includes partial image areas of both the first exposure image and the second exposure image. That is, in the present invention, the different exposure images are acquired by the camera 71 while the image synthesis is performed by the main control device 73, which is why fig. 10 shows the image-synthesis functional block in the camera 71 as crossed out.
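The area-wise exposure combination attributed to the main control device 73 can be sketched as follows; representing frames as flat pixel lists and using a simple area mean as the compared signal characteristic are stand-in assumptions (the patent names signal-to-noise ratio or average brightness):

```python
# Hypothetical sketch of the exposure-combination step: split both
# exposure images into corresponding areas, compare a signal
# characteristic per area (here, the area mean), and keep the area
# with the larger value in the combined output.
def combine_exposures(img_a, img_b, num_areas):
    """img_a/img_b: flat pixel lists of equal length, split into
    num_areas equal areas; returns the combined image as a flat list."""
    n = len(img_a) // num_areas
    out = []
    for i in range(num_areas):
        area_a = img_a[i * n:(i + 1) * n]
        area_b = img_b[i * n:(i + 1) * n]
        # keep whichever area has the larger signal characteristic
        out.extend(area_a if sum(area_a) / n >= sum(area_b) / n else area_b)
    return out
```

The result contains partial areas of both input images, matching the description of the combined image Fm.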
When the main control device 73 (more specifically, its processor 731) receives the dual gain mode signal, it directly records the gain-combined image to the image recorder 733. In the present invention, the gain-combined image includes different image regions of the same image acquired by the camera 71 amplified by different gain values (e.g., AGainL and AGainS of fig. 9). For example, a dark region of the gain-combined image is obtained by amplifying the corresponding region of the image with the larger gain value AGainL, and a bright region of the gain-combined image is obtained by amplifying the corresponding region of the image with the smaller gain value AGainS, so that the bright region of the gain-combined image has a better signal-to-noise ratio, wherein the bright and dark regions are determined, for example, by comparing pixel gray-level values with at least one brightness threshold.
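The dual-gain combination can be sketched as follows; the gain values, the brightness threshold and the 8-bit clamping are invented for illustration:

```python
# Hypothetical sketch of the dual-gain (DCG) combination: the same raw
# frame is amplified with a large gain (AGainL) for dark pixels and a
# small gain (AGainS) for bright pixels, with bright/dark judged by
# comparing the pixel value to a brightness threshold.
AGAIN_L, AGAIN_S = 4.0, 1.0   # illustrative gain values
DARK_THRESHOLD = 50           # illustrative gray-level threshold

def gain_combine(raw_pixels):
    """Apply the high gain to dark pixels and the low gain to bright
    pixels, clamping to an 8-bit range."""
    return [min(p * (AGAIN_L if p < DARK_THRESHOLD else AGAIN_S), 255)
            for p in raw_pixels]
```

Per-pixel thresholding is used here for brevity; the patent describes the same idea applied per image region.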
In embodiments where the camera 71 and the main control device 73 are manufactured by different manufacturers, the main control device 73 has various thresholds, such as the first brightness threshold and the second brightness threshold, built into its software and/or firmware before leaving the factory. When the camera 71 is connected to the main control device 73 for the first time (for example, during system installation), the main control device 73 stores the first brightness threshold, the second brightness threshold and other operation algorithms into the memory of the camera 71. In this way, the camera 71 and the main control device 73 share the same brightness determination mechanism.
In the present invention, when the camera 71 determines that the activation event is detected as true, the processing unit 713 further outputs auto-exposure parameters (including, for example, the gain value and exposure time obtained by the fast auto exposure) together with the activation signal St1 to the main control device 73. The auto-exposure parameters are used by the main control device 73 for the image-capturing control of the camera 71 after wake-up. As described above, when the activation signal (e.g., St2) is detected by the external device 79 and directly transmitted to the main control device 73, the processing unit 713 may not output the activation signal St1 to the main control device 73. In another embodiment, when the activation signal (e.g., St2) is detected by the external device 79 and transmitted to the camera 71 but not to the main control device 73, the camera 71 still outputs the activation signal St1 to the main control device 73 based on the activation signal St2.
In another embodiment, the camera 71 transmits a brightness parameter (e.g., LGEP shown in fig. 11B) to the main control device 73. The higher the ambient brightness, the smaller the LGEP value; the lower the ambient brightness, the larger the LGEP value. In other words, in the foregoing embodiments, the ambient brightness being greater than the brightness threshold Thd (fig. 11A) corresponds to the brightness parameter LGEP being less than the parameter threshold Thd' (fig. 11B); conversely, the ambient brightness being less than the brightness threshold Thd corresponds to the brightness parameter LGEP being greater than the parameter threshold Thd'. In the present invention, the ambient brightness and the brightness parameter are both examples of a parameter representing the ambient brightness.
In one embodiment, in the low power consumption mode, the camera 71 is configured to perform auto exposure (i.e., the fast auto exposure described above) to determine an exposure time and a gain value when the activation signal (St1 or St2) is generated, and to calculate and output the luminance parameter LGEP accordingly. Referring to fig. 11B, when the luminance parameter LGEP is smaller than a parameter threshold (e.g., Thd'), different exposure images are output; when the luminance parameter LGEP is greater than the parameter threshold (e.g., Thd'), the gain-combined image is output, wherein the different exposure images and the gain-combined image are described above, and thus are not described herein again.
When the main control device 73 receives the activation signal (St1 or St2), it starts to exit the low power mode. When the main control device 73 receives the brightness parameter LGEP, it determines the data format of the images transmitted by the camera 71, i.e., whether the camera 71 transmits different exposure images or a gain-combined image. As described above, when the main control device 73 determines (e.g., by comparing LGEP with a parameter threshold) that the camera 71 will operate in the multiple exposure mode after waking up, it recombines the different exposure images into one exposure-combined image, such as Fm of fig. 4B, and then records it; when the main control device 73 determines that the camera 71 will operate in the dual gain mode after waking up, it directly receives and records the gain-combined image.
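The mode decision made by comparing the brightness parameter LGEP with the parameter threshold Thd' can be sketched in one function; the threshold value is illustrative:

```python
# Sketch of the post-wake-up mode decision: a smaller LGEP means a
# brighter scene, so the multiple exposure mode is chosen in bright
# light and the dual gain mode in dim light.
THD_PRIME = 640  # illustrative parameter threshold Thd'

def select_mode(lgep_value):
    return "multiple_exposure" if lgep_value < THD_PRIME else "dual_gain"
```

Because both sides derive the threshold from the same stored value, the camera 71 and the main control device 73 reach the same decision independently.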
In this embodiment, the main control device 73 receives the brightness parameter LGEP from the camera 71 and compares it with its internal parameter threshold by itself, instead of directly receiving a mode signal from the camera 71. As described above, the thresholds in the camera 71 are the same as those in the main control device 73 since they originate from the main control device 73. The main control device 73 can therefore know the operation mode of the camera 71 by comparing the brightness parameter LGEP with the parameter threshold.
There is a predetermined time interval between the main control device 73 receiving the activation signal (St1 or St2) and receiving the brightness parameter, and the auto exposure of the camera 71 is preferably completed within this predetermined time interval. In other words, the camera 71 preferably completes the fast auto-exposure process to determine the ambient brightness before the main control device 73 wakes up completely.
In another embodiment, the camera 71 transmits only the gain value and the exposure time obtained by the fast auto exposure to the main control device 73. The main control device 73 then calculates the brightness parameter (e.g., LGEP) by itself to determine the post-wake-up exposure mode.
In other embodiments, the camera 71 transmits the ambient brightness (which may be obtained by the camera 71 or the external device 79 as described above) to the main control device 73. The main control device 73 then compares it with the associated brightness threshold by itself to determine the post-wake-up exposure mode.
In general, in the normal mode, the camera 71 acquires images based on the gain control signal Sgain_c and the exposure control signal Sexp_c transmitted from the main control device 73; see fig. 7. However, in the low power consumption mode, since the control signals Sgain_c and Sexp_c are not transmitted from the main control device 73, the storage (not shown) of the camera 71 preferably records a plurality of first exposure times (e.g., Texp01-Texp31 of fig. 11B) and a plurality of first gain values (e.g., Gain01-Gain31 of fig. 11B) for acquiring the first exposure image, and a plurality of second exposure times (e.g., Texp02-Texp32 of fig. 11B) and a plurality of second gain values (e.g., Gain02-Gain32 of fig. 11B) for acquiring the second exposure image, for use under different ambient brightness levels.
Referring to fig. 12, a flowchart of an operating method of the camera system 700 according to an embodiment of the present invention includes the following steps: determining an activation event (step S1211); performing auto exposure (step S1213); determining the ambient brightness (step S1215); performing dual-gain exposure when the ambient brightness is less than a brightness threshold (step S1217); and performing multiple exposure when the ambient brightness is greater than the brightness threshold (step S1219). As mentioned above, two different brightness threshold values, e.g., TH1 and TH2, may be used in different applications.
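The flow of fig. 12 can be sketched in pseudo-driver form; the helper callables stand in for the hardware operations described above and are assumptions:

```python
# Hypothetical sketch of the fig. 12 flow. detect_activation,
# fast_auto_exposure and measure_brightness are stand-ins for the
# hardware steps performed by the camera 71.
def wake_up_flow(detect_activation, fast_auto_exposure, measure_brightness,
                 brightness_threshold):
    if not detect_activation():                    # step S1211
        return None                                # stay in low power mode
    gain, exposure = fast_auto_exposure()          # step S1213
    ambient = measure_brightness(gain, exposure)   # step S1215
    if ambient < brightness_threshold:
        return "dual_gain"                         # step S1217
    return "multiple_exposure"                     # step S1219
```

The chosen string corresponds to the exposure mode the camera system 700 adopts after waking up.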
Referring to figs. 7-12, this embodiment is described below. In figs. 8-10, a cross over a functional block indicates that the functional block is turned off or not executed.
Step S1211: the operating method is executed when the camera system 700 is in the low power consumption mode. The camera system 700 determines whether an activation event has occurred, as shown in the block diagram of fig. 8. In the camera 71 of fig. 8, the parts other than the image sensor 711 are functions performed by the processing unit 713 through software, hardware and/or firmware. As previously described, in one embodiment, the processing unit 713 performs activation event detection based on the image output by the image sensor 711. When the activation event is detected as true, an activation signal St1 is generated and sent to the main control device 73. It will be appreciated that the image may also undergo other processing, such as amplification by a color gain amplifier, and the image processing is not limited to that shown in fig. 8.
In another embodiment, the activation signal St2 is generated by the external device 79. When the activation signal St1 or St2 is generated, it indicates that the camera system 700 needs to be woken up, and the process proceeds to step S1213.
Step S1213: next, the camera 71 performs fast auto exposure to determine the exposure time and gain value for the current environment. During the fast auto exposure, since the main control device 73 has not yet been woken up, the processing unit 713 itself generates the color gain, the gain control signal Sgain_c and the exposure control signal Sexp_c to perform the auto exposure. Auto exposure itself is known; the difference is that the auto exposure of this step is the fast auto exposure described above.
Step S1215: next, the processing unit 713 of the camera 71 determines the ambient brightness. As described above, the processing unit 713 may determine the ambient brightness from the image generated by the camera 71, or from the detection result of the external device 79. Next, the exposure mode of the camera system 700 after wake-up is determined according to the ambient brightness, wherein the comparison between the ambient brightness and the brightness threshold can be performed by the camera 71 or the main control device 73, which is not described herein again because the details thereof are described above. As described above, the processing unit 713 may also decide the exposure mode according to a brightness parameter.
Step S1217: as shown in fig. 11A, when the ambient brightness is less than the brightness threshold Thd (or the brightness parameter of fig. 11B is greater than the parameter threshold Thd'), the dual gain mode is performed. As shown in fig. 9, when the camera system 700 operates in the dual gain mode after waking up (i.e., in the normal mode), the camera 71 amplifies the same image acquired by the image sensor 711 with the high analog gain AGainL and the low analog gain AGainS to obtain a bright image and a dark image. The processing unit 713 then performs image synthesis and transmits the gain-combined image Isyn to the main control device 73.
Step S1219: as shown in fig. 11A, when the ambient brightness is greater than the brightness threshold Thd (or the brightness parameter of fig. 11B is less than the parameter threshold Thd'), the multiple exposure mode is performed. As shown in fig. 10, when the camera system 700 operates in the multiple exposure mode after waking up (i.e., in the normal mode), the camera 71 sequentially acquires the first exposure image Iexp1 (or image 1) according to the first set of image-capturing parameters AGain1 and Texp1, and the second exposure image Iexp2 (or image 2) according to the second set of image-capturing parameters AGain2 and Texp2. The processing unit 713 then transmits the first exposure image Iexp1 and the second exposure image Iexp2 to the main control device 73. The main control device 73 then synthesizes the first exposure image Iexp1 and the second exposure image Iexp2 to generate an exposure-combined image, such as Fm of fig. 4B, and records it.
After the camera system 700 is woken up, the camera 71 and the main control device 73 need to operate in the same mode so that the main control device 73 can correctly receive the gain-combined image, or the first exposure image Iexp1 and the second exposure image Iexp2, transmitted from the camera 71 for subsequent operations. As mentioned above, the main control device 73 may receive the mode signal from the camera 71, or may itself determine the operation mode according to the received brightness information (including the exposure time and gain value, or the brightness parameter), depending on the application.
In the present invention, the low power consumption mode is carried out before the camera 71 outputs the gain-combined image, or the first and second exposure images, to the master device 73.
In summary, different exposure modes provide different exposure effects in bright and dark environments, so that the bright and dark areas of the combined image have a better signal-to-noise ratio. The present invention therefore further provides a camera and an image capturing system using the camera (figs. 7 to 10) that determine the intensity of ambient light at wake-up and operate the image capturing system in the appropriate exposure mode. The signal-to-noise ratio of bright and dark areas in the recorded image frames is thereby increased, which improves the accuracy of judgment and control based on these images.
Although the present invention has been disclosed by way of the above embodiments, they are not intended to limit the invention, and various changes and modifications may be made by those of ordinary skill in the art without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention is defined by the appended claims.

Claims (20)

1. A camera, the camera comprising:
an image sensor for sequentially generating images; and
a processing unit coupled to the image sensor and configured for, in a low power consumption mode,
determining the ambient brightness and performing activation event detection,
controlling the image sensor to output a first exposure image and a second exposure image in a multiple exposure mode when the activation event is detected as true and the ambient brightness is greater than a first brightness threshold, and
controlling the image sensor to operate in a dual gain mode and output a combined image when the activation event is detected as true and the ambient brightness is less than a second brightness threshold.
2. The camera of claim 1, wherein,
the first brightness threshold is different from the second brightness threshold, and
when the ambient brightness is between the first brightness threshold and the second brightness threshold, the image sensor operates in the mode used before entering the low power consumption mode.
3. The camera of claim 1, wherein the processing unit is configured for, in the low power consumption mode,
performing the activation event detection according to the image generated by the image sensor,
performing the activation event detection according to a detection signal generated by a thermal sensor external to the camera, or
performing the activation event detection according to a pressing signal generated by a doorbell or a touch pad external to the camera.
4. The camera of claim 1, wherein the processing unit is configured for, in the low power consumption mode,
determining the ambient brightness according to at least one of a gain value and an exposure time obtained by an automatic exposure process,
determining the ambient brightness according to a sum or an average of intensities of the images generated by the image sensor,
determining the ambient brightness according to a driving intensity or an actuation frequency of a light source,
determining the ambient brightness according to an actuation frequency of the image sensor, or
determining the ambient brightness according to a detection signal of an ambient light sensor external to the camera.
5. The camera of claim 1, wherein the processing unit is further configured to output an auto-exposure parameter and at least one of an activation signal and a wake-up exposure mode signal when the activation event is detected as true.
6. The camera of claim 1, wherein the combined image includes different image areas obtained by amplifying the same image generated by the image sensor with different gain values.
7. The camera of claim 1, wherein the camera further comprises a memory for recording, at different ambient brightness levels, a plurality of first exposure times and a plurality of first gain values for acquiring the first exposure image and a plurality of second exposure times and a plurality of second gain values for acquiring the second exposure image.
8. A camera system, the camera system comprising:
a camera for, in a low power consumption mode,
determining the ambient brightness and performing activation event detection,
outputting a multiple exposure mode signal, a first exposure image and a second exposure image when the activation event detection is true and the ambient brightness is greater than a first brightness threshold, and
outputting a dual gain mode signal and a gain-combined image when the activation event detection is true and the ambient brightness is less than a second brightness threshold; and
a master control device configured for,
generating an exposure combination image based on the first exposure image and the second exposure image when the multiple exposure mode signal is received, and
recording the gain-combined image when the dual gain mode signal is received.
9. The camera system of claim 8,
the camera acquires the first exposure image with a first exposure time and acquires the second exposure image with a second exposure time, wherein the first exposure time is different from the second exposure time; and is
The main control device is configured to divide the first exposure image into a plurality of first image areas, divide the second exposure image into a plurality of second image areas, compare signal characteristics of the corresponding first image areas and the corresponding second image areas, and combine the first image areas with the larger signal characteristics and the second image areas with the larger signal characteristics into the exposure combination image.
10. The camera system of claim 8,
the first brightness threshold is different from the second brightness threshold, and
when the ambient brightness is between the first brightness threshold and the second brightness threshold, the camera outputs a mode signal of the mode in which it operated before entering the low power consumption mode.
11. The camera system of claim 8, wherein the gain-combined image contains different image regions obtained by amplifying the same image acquired by the camera with different gain values.
12. The camera system of claim 8, wherein the master device is further configured to store the first and second brightness thresholds in the memory of the camera when the camera is first connected to the master device.
13. The camera system of claim 12, wherein the memory of the camera further records, at different ambient brightness levels, a plurality of first exposure times and a plurality of first gain values for acquiring the first exposure image and a plurality of second exposure times and a plurality of second gain values for acquiring the second exposure image.
14. The camera system of claim 8, wherein the camera is configured for, in the low power consumption mode,
performing the activation event detection according to the acquired image,
performing the activation event detection according to a detection signal generated by an external thermal sensor, or
performing the activation event detection according to a pressing signal generated by an external doorbell or touch pad.
15. The camera system of claim 8, wherein the camera is configured for, in the low power consumption mode,
determining the ambient brightness according to at least one of a gain value and an exposure time obtained by an automatic exposure process,
determining the ambient brightness according to a sum or an average of intensities of the acquired images,
determining the ambient brightness according to a driving intensity or an actuation frequency of a light source,
determining the ambient brightness according to an actuation frequency of the camera, or
determining the ambient brightness according to a detection signal of an external ambient light sensor.
16. The camera system of claim 8, wherein the camera is further configured to sequentially output an activation signal and auto-exposure parameters to the master device when the activation event is detected as true.
17. A camera system, the camera system comprising:
a camera for, in a low power consumption mode,
performing, when an activation signal is generated, an automatic exposure to determine an exposure time and a gain value, and then calculating and outputting a brightness parameter,
outputting different exposure images when the brightness parameter is less than a parameter threshold, and
outputting a gain-combined image when the brightness parameter is greater than the parameter threshold; and
a master control device configured for,
ending the low power consumption mode when the activation signal is received, and
receiving the brightness parameter and determining, according to the brightness parameter, the data format of images transmitted by the camera, wherein the activation signal and the brightness parameter are received by the master control device within a predetermined time interval of each other, and the automatic exposure is completed within the predetermined time interval.
18. The camera system of claim 17, wherein the gain-combined image contains different image regions obtained by amplifying the same image acquired by the camera with different gain values.
19. The camera system of claim 17, wherein the camera is configured for, in the low power consumption mode,
generating the activation signal according to the image it acquires,
receiving the activation signal from an external thermal sensor, or
receiving the activation signal from an external doorbell or touch pad.
20. The camera system of claim 17, wherein the activation signal is transmitted to the camera and the master device.
CN202010734127.1A 2019-08-14 2020-07-27 Camera with two exposure modes and camera system using same Active CN112399090B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210298031.4A CN114785964B (en) 2019-08-14 2020-07-27 Image pickup system having two exposure modes

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962886429P 2019-08-14 2019-08-14
US62/886,429 2019-08-14
US16/881,437 2020-05-22
US16/881,437 US11614322B2 (en) 2014-11-04 2020-05-22 Camera having two exposure modes and imaging system using the same

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210298031.4A Division CN114785964B (en) 2019-08-14 2020-07-27 Image pickup system having two exposure modes

Publications (2)

Publication Number Publication Date
CN112399090A true CN112399090A (en) 2021-02-23
CN112399090B CN112399090B (en) 2022-04-15

Family

ID=74603003

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010734127.1A Active CN112399090B (en) 2019-08-14 2020-07-27 Camera with two exposure modes and camera system using same
CN202210298031.4A Active CN114785964B (en) 2019-08-14 2020-07-27 Image pickup system having two exposure modes


Country Status (1)

Country Link
CN (2) CN112399090B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040251394A1 (en) * 2003-06-11 2004-12-16 Rhodes Howard E. Dual conversion gain imagers
TWI427508B (en) * 2010-12-01 2014-02-21 Pixart Imaging Inc An optical sensing device and a method of adjusting an exposure condition for the same
CN104247398A (en) * 2012-04-11 2014-12-24 佳能株式会社 Imaging device and method for controlling same
US20150201140A1 (en) * 2014-01-10 2015-07-16 Omnivision Technologies, Inc. Dual conversion gain high dynamic range sensor
US20160119575A1 (en) * 2014-10-24 2016-04-28 Texas Instruments Incorporated Image data processing for digital overlap wide dynamic range sensors

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3993043B2 (en) * 2002-08-05 2007-10-17 富士フイルム株式会社 Digital still camera
KR20060124119A (en) * 2005-05-31 2006-12-05 주식회사 팬택 Method for fast camera preview and apparatus implementing the method
US20060284895A1 (en) * 2005-06-15 2006-12-21 Marcu Gabriel G Dynamic gamma correction
JP4629002B2 (en) * 2006-06-21 2011-02-09 三菱電機株式会社 Imaging device
KR100910361B1 (en) * 2007-07-26 2009-08-04 노키아 코포레이션 Exposure of digital imaging
KR100950465B1 (en) * 2007-12-21 2010-03-31 손승남 Camera control method for vehicle enrance control system
TWI539816B (en) * 2013-12-25 2016-06-21 恆景科技股份有限公司 Image sensor
KR101637637B1 (en) * 2014-01-06 2016-07-07 재단법인 다차원 스마트 아이티 융합시스템 연구단 Method and apparatus for local auto exposure in video sensor systems
CN103888681B (en) * 2014-04-18 2017-05-31 四川华雁信息产业股份有限公司 A kind of automatic explosion method and device
US9467632B1 (en) * 2015-07-13 2016-10-11 Himax Imaging Limited Dual exposure control circuit and associated method
JP2017028637A (en) * 2015-07-27 2017-02-02 キヤノン株式会社 Photographing device, and control method and program for the same


Also Published As

Publication number Publication date
CN114785964A (en) 2022-07-22
CN112399090B (en) 2022-04-15
CN114785964B (en) 2024-03-01

Similar Documents

Publication Publication Date Title
US11614322B2 (en) Camera having two exposure modes and imaging system using the same
CN108513078B (en) Method and system for capturing video imagery under low light conditions using light emission by a depth sensing camera
JP4254841B2 (en) Imaging apparatus, imaging method, image processing apparatus, image processing method, and image processing program
JP3004382B2 (en) TV camera device with variable shutter
JP4792976B2 (en) Imaging device
US20070263099A1 (en) Ambient Light Rejection In Digital Video Images
US20050220450A1 (en) Image-pickup apparatus and method having distance measuring function
US20210127049A1 (en) Optical distance measurement system and imaging system with dynamic exposure time
US8587713B2 (en) Digital camera and method of controlling the same that calculates needed flash emission
JP2002208493A (en) Illumination control system
JP2020202489A (en) Image processing device, image processing method, and program
JP5544223B2 (en) Imaging device
US8005354B2 (en) Image pickup apparatus, light emission device, and image pickup system
CN112399090B (en) Camera with two exposure modes and camera system using same
TW201617639A (en) Optical distance measurement system and method
JP2006339741A (en) Monitoring device and monitoring method
JP4619981B2 (en) Image recognition device
US20230093565A1 (en) Camera having two exposure modes and imaging system using the same
JP2009266502A (en) Lighting system
JP4329677B2 (en) Motion detection device
JP6525723B2 (en) Imaging device, control method therefor, program, and storage medium
CN108332720B (en) Optical distance measuring system
KR101692831B1 (en) Active-type system for optimizing image quality
JP2010147817A (en) Imaging apparatus, method and program for controlling same
JP3808254B2 (en) Imaging device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant