US20240048852A1 - Apparatus, control method, and storage medium - Google Patents
- Publication number
- US20240048852A1 (U.S. application Ser. No. 18/488,884)
- Authority
- US
- United States
- Prior art keywords
- exposure
- image
- value
- exposure correction
- correction value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/28—Circuitry to measure or to take account of the object contrast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/72—Combination of two or more compensation controls
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/20—Circuitry for controlling amplitude response
- H04N5/202—Gamma control
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/08—Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
- G03B7/091—Digital circuits
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
Definitions
- aspects of the embodiments generally relate to an apparatus, a control method, and a storage medium.
- aspects of the embodiments are generally directed to an apparatus capable of communicating with a capturing apparatus, the apparatus including a first acquisition unit configured to acquire an image captured by the capturing apparatus, a second acquisition unit configured to acquire first exposure information determined by the capturing apparatus based on a luminance of a first region in the image, a first determination unit configured to determine second exposure information based on a luminance of a second region, a second determination unit configured to determine correction information based on a difference between the first exposure information and the second exposure information, and an output unit configured to output the correction information to the capturing apparatus.
- FIG. 1 is a diagram illustrating a configuration example of an image capturing control system.
- FIG. 2 is a diagram illustrating an internal configuration example of a monitoring camera.
- FIG. 3 is a diagram illustrating an internal configuration example of a client apparatus.
- FIG. 4 is a diagram illustrating a functional configuration example of the client apparatus.
- FIG. 5 is a diagram used to explain processing to be performed between the monitoring camera and the client apparatus.
- FIGS. 6 A and 6 B are diagrams used to explain a luminance state in an example illustrated in FIG. 5 .
- FIG. 7 is a flowchart of exposure control processing according to a first exemplary embodiment.
- FIG. 8 is a diagram used to explain processing for changing exposure control.
- FIG. 9 is a diagram used to explain processing to be performed in a case where exposure control in the first exemplary embodiment has been performed.
- FIG. 10 is a flowchart of exposure control processing according to a second exemplary embodiment.
- FIG. 11 is a diagram illustrating characteristics of gamma processing and degamma processing.
- One or more functional blocks to be described below can be implemented by hardware, such as an ASIC or programmable logic array (PLA), or can be implemented by a programmable processor, such as a CPU or an MPU, executing software. Moreover, they can be implemented by a combination of software and hardware. Accordingly, in the following description, even in a case where different functional blocks are described as actors, the same hardware can implement them.
- ASIC is an abbreviation for application specific integrated circuit.
- CPU is an abbreviation for central processing unit.
- MPU is an abbreviation for micro processing unit.
- FIG. 1 is a diagram illustrating a configuration example of an image capturing control system 100 according to a first exemplary embodiment.
- the image capturing control system 100 includes a monitoring camera 101 , a network 102 , a client apparatus 103 , an input device 104 , and a display device 105 .
- the monitoring camera 101 is an image capturing apparatus for acquiring a moving image and is an apparatus capable of performing image capturing of, for example, a subject and image processing.
- the monitoring camera 101 and the client apparatus 103 are interconnected via the network 102 in such a way as to be able to communicate with each other.
- the client apparatus 103 is connected to the input device 104 and the display device 105 in such a way as to be able to communicate with them.
- the client apparatus 103 is an apparatus for processing various pieces of information and, therefore, can be referred to as an “information processing apparatus”.
- the client apparatus 103 is an apparatus for controlling an image capturing operation of the monitoring camera 101 and, therefore, can be referred to as an “image capturing control apparatus”.
- the input device 104 is configured with, for example, a mouse and a keyboard, and is configured to be operated by the user of the client apparatus 103 .
- the display device 105 is an apparatus including, for example, a monitor which displays an image received from the client apparatus 103 . Furthermore, the display device 105 is also able to function as a UI, such as a touch panel. In this case, the display device 105 becomes able to also function as an input device for inputting, for example, an instruction, information, and data to the client apparatus 103 .
- UI is an abbreviation for user interface.
- while the client apparatus 103 , the input device 104 , and the display device 105 are illustrated as respective individual devices, the present exemplary embodiment is not limited to such a configuration.
- the client apparatus 103 and the display device 105 can be integrated together, or the input device 104 and the display device 105 can be integrated together.
- the client apparatus 103 , the input device 104 , and the display device 105 can be integrated together.
- an apparatus obtained by integration can be in the form of, for example, a personal computer, a tablet terminal, or a smartphone.
- FIG. 2 is a block diagram illustrating an internal configuration example of the monitoring camera 101 .
- the monitoring camera 101 includes an image capturing optical system 201 and an image sensor 202 .
- the monitoring camera 101 further includes a camera CPU 203 , a ROM 204 , a RAM 205 , an image capturing system control unit 206 , a control unit 207 , an A/D conversion unit 208 , an image processing unit 209 , an encoder unit 210 , and a network I/F 211 .
- the camera CPU 203 through the network I/F 211 of the monitoring camera 101 are interconnected via a system bus 212 .
- ROM is an abbreviation for read-only memory.
- A/D is an abbreviation for analog-to-digital.
- RAM is an abbreviation for random access memory.
- I/F is an abbreviation for interface.
- the image capturing optical system 201 is configured with, for example, a zoom lens, a focus lens, an image shake correction lens, a diaphragm, and a shutter, and is an optical member group for collecting light coming from a subject.
- An optical image of light coming from, for example, a subject collected by the image capturing optical system 201 is formed on an imaging plane of the image sensor 202 .
- the image sensor 202 is a charge accumulation-type solid-state image sensor, such as a CMOS sensor or a CCD sensor, which converts the optical image of light collected by the image capturing optical system 201 into a current value (signal value), and is an image capturing unit which acquires color information in combination with, for example a color filter.
- CMOS is an abbreviation for complementary metal-oxide semiconductor.
- CCD is an abbreviation for charge-coupled device.
- the image sensor 202 is connected to the A/D conversion unit 208 .
- the A/D conversion unit 208 converts the amount of light detected by the image sensor 202 into a digital signal (image data).
- the A/D conversion unit 208 transmits the digital signal to the image processing unit 209 .
- the image processing unit 209 performs image processing on image data which is a digital signal received from the image sensor 202 .
- the image processing unit 209 is connected to the encoder unit 210 .
- the encoder unit 210 performs processing for converting image data processed by the image processing unit 209 into a file format, such as Motion JPEG, H.264, or H.265.
- the encoder unit 210 is connected to the network I/F 211 .
- the camera CPU 203 is a control unit which comprehensively controls an operation of the monitoring camera 101 .
- the camera CPU 203 reads an instruction stored in the ROM 204 or the RAM 205 and performs processing corresponding to the instruction.
- the image capturing system control unit 206 controls each component of the monitoring camera 101 based on instructions issued from the camera CPU 203 .
- the image capturing system control unit 206 performs control operations, such as focus control, shutter control, and aperture adjustment, with respect to the image capturing optical system 201 .
- the network I/F 211 is an interface for use in communicating with an external apparatus, such as the client apparatus 103 , via the network 102 , and is controlled by the control unit 207 .
- the control unit 207 controls communication to be performed with the client apparatus 103 , and performs control to, for example, transmit, to the camera CPU 203 , a control instruction (control signal) issued by the client apparatus 103 to each component of the monitoring camera 101 .
- the network 102 is an Internet Protocol (IP) network used to interconnect the monitoring camera 101 and the client apparatus 103 .
- the network 102 is configured with, for example, a plurality of routers, switches, and cables compliant with a communication standard such as Ethernet.
- the network 102 is a network capable of being used to enable communication between the monitoring camera 101 and the client apparatus 103 , and does not have restrictions on, for example, its communication standard, scale, and configuration.
- the network 102 can be configured with, for example, the Internet, a wired local area network (LAN), a wireless LAN, or a wide area network (WAN).
- FIG. 3 is a block diagram illustrating an internal configuration example of the client apparatus 103 .
- the client apparatus 103 includes a client CPU 301 , a main storage device 302 , an auxiliary storage device 303 , an input I/F 304 , an output I/F 305 , and a network I/F 306 .
- the respective components of the client apparatus 103 are interconnected via a system bus 307 in such a way as to be able to communicate with each other.
- the client CPU 301 is a central processing unit which comprehensively controls an operation of the client apparatus 103 . Furthermore, the client CPU 301 can be configured to comprehensively control the monitoring camera 101 via the network 102 .
- the main storage device 302 is a storage device, such as a RAM, which functions as a temporary data storage location for the client CPU 301 .
- the auxiliary storage device 303 is a storage device, such as an HDD, a ROM, or an SSD, which stores, for example, various programs and various pieces of setting data. Furthermore, HDD is an abbreviation for hard disk drive. SSD is an abbreviation for solid state drive. Moreover, a program concerned with the present exemplary embodiment is stored in the auxiliary storage device 303 . In the present exemplary embodiment, functions and processing operations of the client apparatus 103 illustrated in FIG. 4 are implemented by the client CPU 301 performing processing based on a program read out from the auxiliary storage device 303 and then loaded onto the main storage device 302 . Details of this processing are described below.
- the auxiliary storage device 303 can be configured to previously store therein, for example, patterns for pattern matching (patterns corresponding to characteristic portions of faces or characteristic portions of human bodies), which are used for the client apparatus 103 to perform face detection or human body detection based on image data. Furthermore, the patterns for pattern matching can be formed by execution of a program and then stored in the main storage device 302 .
- the input I/F 304 is an interface used for the client apparatus 103 to receive an input (signal) from, for example, the input device 104 .
- the output I/F 305 is an interface used for the client apparatus 103 to output information (signal) to, for example, the display device 105 .
- the network I/F 306 is an interface for use in communication with an external apparatus, such as the monitoring camera 101 , via the network 102 .
- FIG. 4 is a functional block diagram illustrating functions which the client apparatus 103 executes.
- the functional units (functional blocks) illustrated in FIG. 4 are functional units which the client CPU 301 executes, and in the following description these functional units are treated as synonymous with the client CPU 301 .
- the client CPU 301 of the client apparatus 103 includes, as functional units, an input signal acquisition unit 401 , a communication control unit 402 , an input image acquisition unit 403 , a camera information acquisition unit 404 , and a detection method setting unit 405 .
- the client CPU 301 further includes, as functional units, a subject detection unit 406 , an exposure determination unit 407 , and a display control unit 408 .
- the functional units illustrated in FIG. 4 , i.e., the input signal acquisition unit 401 through the display control unit 408 , can be configured with hardware (or software) different from the client CPU 301 .
- the input signal acquisition unit 401 receives an input from the user via the input device 104 .
- the communication control unit 402 performs control to receive an image transmitted from the monitoring camera 101 (an image captured by the monitoring camera 101 ) via the network 102 . Moreover, the communication control unit 402 performs control to transmit a control instruction issued by the client apparatus 103 to the monitoring camera 101 via the network 102 .
- the input image acquisition unit 403 acquires an image received from the monitoring camera 101 via the communication control unit 402 , as an input image targeted for processing for detecting a subject (an image to which subject detection processing is applied). Details of the detection processing are described below.
- the camera information acquisition unit 404 acquires, via the communication control unit 402 , camera information to be used for the monitoring camera 101 to perform image capturing of, for example, a subject.
- the camera information includes various pieces of camera setting information and image processing information used for performing image capturing of, for example, a subject to acquire an image.
- the camera information includes, for example, exposure parameters such as aperture value, shutter speed, and gain (setting values concerning exposure) and information concerning image processing related to luminance, such as gamma correction, edge enhancement, and white balance.
- the detection method setting unit 405 sets a predetermined (appropriate) detection method from among various detection methods including detection of a face region (face detection) or detection of a human body region (human body detection) with respect to an input image acquired by the input image acquisition unit 403 .
- the detection method setting unit 405 sets (selects) a detection method for face detection or a detection method for human body detection.
- the subject detection unit 406 detects a specific subject region from within an input image captured by the monitoring camera 101 and acquired by the input image acquisition unit 403 . For example, in a case where performing face detection has been set by the detection method setting unit 405 , the subject detection unit 406 detects a face region from the input image. Moreover, for example, in a case where performing human body detection has been set by the detection method setting unit 405 , the subject detection unit 406 detects a human body region from the input image.
- the present exemplary embodiment is not limited to such a setting.
- a detection method for detecting a feature region of a part of a person, such as the upper body of the person, the head, or an organ of the face such as the eye, nose, or mouth, can be set (can be selected).
- while a specific subject targeted for detection is assumed to be a person, a configuration capable of detecting a feature region related to a specific subject other than a person can be employed.
- for example, a configuration capable of detecting a specific subject previously set in the client apparatus 103 , such as the face of an animal or an automobile, can also be employed.
- the exposure determination unit 407 has the function of determining, based on an exposure setting value for the monitoring camera 101 acquired by the camera information acquisition unit 404 and image information about a subject region detected by the subject detection unit 406 , an exposure amount of the detected subject region. In addition, the exposure determination unit 407 also has the function of performing exposure control of the monitoring camera 101 based on the determined exposure amount. Exposure control of the monitoring camera 101 in the exposure determination unit 407 is performed by transmitting an exposure control value that is based on the determined exposure amount to the monitoring camera 101 via the communication control unit 402 .
- the exposure determination unit 407 calculates an exposure correction amount representing an amount of change of exposure for bringing a subject region into a correct exposure state, based on a difference between the exposure setting value of the monitoring camera 101 and the image information about the subject region, and transmits an exposure control value corresponding to the calculated exposure correction amount to the communication control unit 402 . Then, the communication control unit 402 transmits a control instruction corresponding to the exposure control value (exposure correction amount) to the monitoring camera 101 via the network I/F 306 . With this processing, in the monitoring camera 101 having received the control instruction, exposure control is performed by the control unit 207 or the image capturing system control unit 206 .
- the exposure determination unit 407 can be configured to transmit not the exposure control value but the calculated exposure correction amount to the communication control unit 402 , and to transmit not the control instruction but the exposure correction amount (exposure correction value) to the monitoring camera 101 via the network I/F 306 .
- the monitoring camera 101 calculates an exposure control value corresponding to the input exposure correction amount and controls exposure based on the calculated exposure control value.
- the exposure control value as mentioned here is a parameter for use in controlling exposure for the monitoring camera 101 , and indicates, for example, an aperture value, an exposure time, and analog and digital gains.
- the exposure determination unit 407 determines correction information (exposure correction amount or exposure control value) for bringing a subject region into a correct exposure state, based on a difference between the exposure setting value for the monitoring camera 101 and luminance information about the subject region, and outputs the correction information to the monitoring camera 101 via the network I/F 306 .
- the exposure determination unit 407 is configured to perform exposure control using at least a first exposure correction method and a second exposure correction method. While details thereof are described below, the exposure determination unit 407 performs, for example, coarse adjustment that changes the exposure of the monitoring camera 101 in one step up to a predetermined amount as the first exposure correction method and, from that point onwards, performs exposure control by the second exposure correction method, which finely adjusts the exposure of the monitoring camera 101 . Particularly, in a case where the calculated exposure correction amount is larger than a predetermined amount (in a case where the amount of change of exposure is large), the exposure determination unit 407 performs coarse adjustment that changes exposure in one step up to the predetermined amount and, from that point onwards, performs fine adjustment.
- the exposure determination unit 407 performs control to change exposure in one step up to the exposure correction amount during the period of coarse adjustment. Details of such processing performed by the exposure determination unit 407 are described below with reference to, for example, the flowchart of FIG. 7 .
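The two-stage control described above can be sketched as follows; the coarse-step limit of 2.0 EV and the fine step of 0.25 EV per update cycle are illustrative assumptions, not values from the embodiment.

```python
def exposure_steps(correction_ev, coarse_limit=2.0, fine_step=0.25):
    """Split an exposure correction amount (in EV stops) into a coarse
    one-step change followed by fine adjustments, mirroring the first and
    second exposure correction methods (limits are assumed values)."""
    steps = []
    remaining = correction_ev
    # First method: coarse adjustment, changing exposure in one step
    # up to the predetermined amount (coarse_limit).
    if abs(remaining) > coarse_limit:
        coarse = coarse_limit if remaining > 0 else -coarse_limit
        steps.append(coarse)
        remaining -= coarse
    else:
        # Small corrections are applied in one step during coarse adjustment.
        steps.append(remaining)
        remaining = 0.0
    # Second method: fine adjustment toward the target, one small step
    # per update cycle.
    while abs(remaining) > 1e-9:
        step = min(fine_step, abs(remaining))
        steps.append(step if remaining > 0 else -step)
        remaining -= step if remaining > 0 else -step
    return steps

# A +2.5 EV correction: one coarse +2.0 EV step, then two +0.25 EV steps.
steps = exposure_steps(2.5)
```

A correction at or below the coarse limit is applied in a single step, which matches the behavior described for the coarse-adjustment period.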
- the display control unit 408 outputs, to the display device 105 , a captured image in which exposure correction using the exposure correction amount determined by the exposure determination unit 407 has been reflected, in response to an instruction from the client CPU 301 .
- since the client apparatus 103 performs exposure control of the monitoring camera 101 , the client apparatus 103 performs degamma processing as mentioned above and can thus correctly determine the brightness of an image which the image sensor 202 of the monitoring camera 101 outputs.
- since the monitoring camera 101 and the client apparatus 103 differ from each other in the method for determining an exposure state, it is necessary to take such a difference in the determination method into consideration.
- the monitoring camera 101 is assumed to set the whole image as a light metering area and determine an exposure amount based on a luminance acquired in the whole image.
- the client apparatus 103 is assumed to set, for example, a subject region, such as a person, extracted from the image as a light metering area and perform exposure determination based on information about a luminance acquired in the subject region.
- the monitoring camera 101 determines an exposure amount with the whole image set as an evaluation range for exposure setting
- the client apparatus 103 performs exposure determination with a subject region in the image set as an evaluation range for exposure setting.
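The difference between the two evaluation ranges above can be illustrated with a minimal sketch; the toy backlit image and its luminance values are assumptions for illustration only.

```python
import numpy as np

# Toy backlit scene: bright surroundings, dark subject (person) region.
# Values are illustrative 8-bit luminance levels, not from the embodiment.
image = np.full((4, 6), 180.0)   # bright background
image[1:3, 1:3] = 40.0           # dark person region
subject_region = image[1:3, 1:3]

# Monitoring-camera side: the whole image is the light metering area.
whole_image_luma = image.mean()

# Client-apparatus side: only the subject region is the light metering area.
subject_luma = subject_region.mean()

# The two determinations disagree: metering the whole image suggests the
# scene is already fairly bright, while the subject itself is underexposed.
```

This disagreement between the whole-image evaluation and the subject-region evaluation is exactly the situation the following scene description assumes.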
- the following assumes a scene in which the monitoring camera 101 and the client apparatus 103 differ from each other in the evaluation region for exposure state determination, and in which the aim is image capturing in which the subject region is exposed with appropriate brightness.
- FIG. 5 is a diagram used to explain the exchange of processing between a monitoring camera and a client apparatus, and illustrates a specific example in which exposure control for the monitoring camera has not been appropriately performed.
- FIGS. 6 A and 6 B are diagrams used to explain the manner of luminance of an image in each state illustrated in FIG. 5 .
- FIG. 6 A illustrates histograms each representing a relationship between a luminance value and the number of pixels in an image
- FIG. 6 B is a diagram illustrating average values of luminance in an image.
- An image 501 illustrated in FIG. 5 indicates an image in an initial state obtained by the monitoring camera capturing an image of the subject, and is an example of an image captured in a backlit scene.
- in a backlit scene, the region of a person serving as a subject becomes dark and the surroundings thereof become bright. Even in such a case, to accurately recognize the person, the image is to be set to an exposure state in which the brightness of the person region becomes appropriate.
- a graph 601 illustrated in FIG. 6 A represents a distribution of luminance obtained in a state of brightness such as an image 501 illustrated in FIG. 5 .
- the luminance distribution is, therefore, in a state in which a dark portion and a bright portion are dominant in the image 501 as shown in the graph 601 .
- a graph 602 illustrated in FIG. 6 B represents the amount of brightness of the entire image serving as a light metering area (an evaluation region for exposure determination) for use in the monitoring camera, into which the luminance distribution represented by the graph 601 has been converted.
- an image 502 illustrated in FIG. 5 represents an example of an image obtained by the client apparatus performing degamma processing to reproduce a state of brightness which the actual image sensor captures, in such a way as not to be affected by an image capturing condition for use in the monitoring camera.
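Degamma processing as described here can be sketched as inverting the camera's gamma correction to approximate the linear brightness the image sensor captured; the gamma exponent of 2.2 below is a common assumption, not necessarily the value the monitoring camera uses.

```python
def gamma_encode(linear, gamma=2.2):
    """Apply gamma correction as a camera would (values normalized to 0..1).
    The exponent 2.2 is an assumed, typical value."""
    return linear ** (1.0 / gamma)

def degamma(encoded, gamma=2.2):
    """Inverse (degamma) processing: recover a value proportional to the
    linear sensor brightness from a gamma-corrected pixel value."""
    return encoded ** gamma

# Round trip: a linear sensor value of 0.25 survives encode -> degamma,
# so the client apparatus can evaluate brightness unaffected by the
# camera's gamma setting.
recovered = degamma(gamma_encode(0.25))
```

The point of the round trip is that exposure evaluation on degamma'd values is independent of the image-capturing (gamma) condition of the camera.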
- the client apparatus extracts the brightness while focusing on persons as a light metering area (an evaluation region for exposure determination), and, since persons are dark as shown in the image 502 , the client apparatus seeks to increase exposure up to an appropriate brightness.
- the client apparatus seeks to perform adjustment to bring the present brightness of persons indicated by a dashed line shown near the base of the arrow 603 into an appropriate brightness indicated by a solid line shown near the tip of the arrow 603 .
- the processing indicated by the arrow 603 is assumed to be, for example, processing for adjusting an exposure level in such a manner that the present brightness obtained before processing becomes a twofold brightness, as indicated by a graph 604 illustrated in FIG. 6 B .
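Making the brightness twofold corresponds to a +1 EV exposure change, so the adjustment indicated by the arrow 603 can be expressed as a correction amount computed from the luminance ratio. The log2 relation is standard; the luminance values below are illustrative assumptions.

```python
import math

def correction_ev(current_luma, target_luma):
    """Exposure correction amount, in EV stops, that scales the current
    average luminance to the target: +1 EV doubles the brightness."""
    return math.log2(target_luma / current_luma)

# The adjustment of graph 604: making the present brightness twofold is +1 EV.
ev = correction_ev(60.0, 120.0)
```

A negative result means the exposure should be reduced, i.e. the region is brighter than the target.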
- overcorrection may be performed as in an image 503 illustrated in FIG. 5 , so that a loss of highlight detail may occur in the region of persons.
- since the monitoring camera adjusts brightness with the entire captured image used as an evaluation region, a high-luminance portion enters a saturated state, in which the portion cannot be made any brighter.
- the monitoring camera performs adjustment of exposure in such a way as to make the brightness of a low-luminance portion, indicated by a dashed line shown near the base of the arrow 605 , two or more times as high, and to make the entire brightness two times as high, as indicated by a solid line shown near the tip of the arrow 605 .
- a graph 606 illustrated in FIG. 6 B represents the amounts of brightness obtained before and after such exposure control.
- correction information for controlling exposure for the monitoring camera 101 is described as an exposure correction amount, but can be an exposure control value as mentioned above.
- FIG. 7 is a flowchart illustrating the flow of subject detection processing through exposure control processing which are performed by the client CPU 301 of the client apparatus 103 according to the first exemplary embodiment.
- The monitoring camera 101, the client apparatus 103, the input device 104, and the display device 105 are assumed to be previously powered on, and a connection (communication) between the monitoring camera 101 and the client apparatus 103 is assumed to be previously established.
- image capturing of, for example, a subject with a predetermined updating cycle by the monitoring camera 101 , transmission of image data from the monitoring camera 101 to the client apparatus 103 , and image display by the display device 105 are assumed to be continuously repeated.
- the processing illustrated in the flowchart of FIG. 7 is assumed to be started by the client CPU 301 in response to a captured image of, for example, a subject being input from the monitoring camera 101 to the client apparatus 103 via the network 102 .
- the subject detection unit 406 performs processing for detecting a subject from an image transmitted from the monitoring camera 101 .
- the detection method setting unit 405 sets a detection method for a face or human body to the subject detection unit 406 .
- the subject detection unit 406 performs face detection processing or human body detection processing on an input image according to the setting performed by the detection method setting unit 405 .
- Patterns respectively corresponding to feature portions of faces or feature portions of human bodies are previously stored in the auxiliary storage device 303 of the client apparatus 103 , and the subject detection unit 406 performs detection of a face region or a human body region by pattern matching that is based on the stored patterns.
- In the case of detecting a face region, it is usually possible to detect a face with a high degree of accuracy, and it is possible to clearly discriminate between the face region of the subject and a region other than the face region.
- However, if the direction of the face, the size of the face, or the brightness of the face is not in a condition suited to face detection, it may not be possible to accurately detect the face region.
- In the case of detecting a human body, it is possible to detect a region in which a person is present, irrespective of, for example, the direction of the face, the size of the face, or the brightness of the face.
- human body detection in the present exemplary embodiment does not necessarily need to detect the whole body, but can detect the upper half of the body, the upper body above the breast, or a head region including the face.
- While a pattern matching method using the stored patterns as classifiers is described here as the detection method for a subject, subject detection can be performed by using a method other than the pattern matching method.
- subject detection can be performed with use of a luminance gradient within a local region in the image.
- the detection method for a subject is not limited to a specific detection method, and can include various methods, such as detection that is based on machine learning and detection that is based on distance information.
- In step S 702, the subject detection unit 406 determines whether a subject (a face region or a human body region) has been detected from within the image by the subject detection processing performed in step S 701. If it is determined that at least one subject has been detected (YES in step S 702), the client CPU 301 advances the processing to step S 703; on the other hand, if it is determined that no subject has been detected (NO in step S 702), the client CPU 301 ends the present processing.
- In step S 703, the exposure determination unit 407 measures the brightness (luminance) of the subject region.
- In step S 704, the exposure determination unit 407 acquires the current exposure value (the current exposure setting value of the monitoring camera 101) acquired from the monitoring camera 101 by the camera information acquisition unit 404.
- In step S 705, the exposure determination unit 407 calculates an exposure correction amount representing a change amount (correction amount) for exposure (exposure value) for bringing the subject region into a correct exposure state, based on information about the brightness of the subject region and the current exposure setting value of the monitoring camera 101. More specifically, the exposure determination unit 407 calculates an exposure value for bringing the subject region into a correct exposure state from luminance information about the subject region, and then determines correction information concerning exposure for the monitoring camera 101 based on a difference between the calculated exposure value and the current exposure setting value of the monitoring camera 101.
- the correction information can be an exposure correction amount (exposure correction value) for the monitoring camera 101 , or can be an exposure control value (an exposure time or a gain control value).
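When the correction information is output as an exposure control value rather than as a correction amount, an EV-denominated correction has to be mapped onto an exposure time and/or a gain control value. The split below (exposure time first, remainder as gain at roughly 6 dB per EV) and the exposure-time limit are illustrative assumptions, not values taken from the disclosure:

```python
import math

def to_control_values(correction_ev: float,
                      current_exposure_s: float,
                      current_gain_db: float,
                      max_exposure_s: float = 1 / 30):
    """Map an EV exposure correction onto camera control values:
    lengthen the exposure time first (1 EV = 2x time), then make up
    any remainder with sensor gain (assumed ~6 dB per EV)."""
    desired_s = current_exposure_s * (2.0 ** correction_ev)
    new_exposure_s = min(desired_s, max_exposure_s)
    remaining_ev = math.log2(desired_s / new_exposure_s)
    new_gain_db = current_gain_db + 6.0 * remaining_ev
    return new_exposure_s, new_gain_db
```

For example, a +3 EV correction from 1/120 s at 0 dB becomes 1/30 s (the assumed limit, covering +2 EV) plus +6 dB of gain for the remaining +1 EV.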
- In step S 706, the exposure determination unit 407 determines whether the exposure correction amount (exposure change amount) is less than or equal to a predetermined amount. If it is determined that the exposure correction amount is less than or equal to the predetermined amount (YES in step S 706), the client CPU 301 advances the processing to step S 707; on the other hand, if it is determined that the exposure correction amount exceeds the predetermined amount (NO in step S 706), the client CPU 301 advances the processing to step S 708.
- That is, the exposure determination unit 407 determines how large the exposure correction amount (correction information) representing the change amount for exposure is.
- Since the evaluation region differs between the monitoring camera and the client apparatus, the exposure correction amount calculated by the client apparatus is likely to be greatly different from the correction actually required. Accordingly, in a case where the exposure correction amount (correction information) is greater than the predetermined amount (a case where the change amount for exposure is large), the exposure determination unit 407 does not change exposure in one step by the calculated exposure correction amount, but performs coarse adjustment to change exposure in one step up to a certain amount and, from that point onwards, progressively performs fine adjustment.
- the exposure determination unit 407 additionally determines correction information in such a way as to modify (coarsely adjust) the correction information according to a predetermined value and, from that point onwards, finely adjust the luminance of the evaluation region on the side of the client apparatus. In this way, it becomes easy to perform matching of brightness, and it is possible to prevent or reduce an adverse effect in which, if coarse adjustment is performed, hunting of luminance occurs. Moreover, it is possible to reduce the possibility that the exposure correction for the monitoring camera to be performed with the correction information determined by the exposure determination unit 407 becomes over correction.
- Here, the evaluation region (light metering region) for the client apparatus is referred to as a first region, and the evaluation region (light metering region) for the monitoring camera is referred to as a second region. The first region, i.e., the evaluation region for the client apparatus, is assumed to be a region including a main subject (for example, a human body, a face, or an animal such as a bird or cat).
- FIG. 8 is a diagram used to explain an example of such exposure adjustment.
- a previously determined specification upper limit is taken as a predetermined amount (a predetermined value) for use in determination with respect to the exposure correction amount.
- If the exposure correction amount is less than or equal to the specification upper limit, the exposure determination unit 407 follows the transition for a small correction amount indicated by a solid line 801 illustrated in FIG. 8, thus changing the exposure for the monitoring camera in one step by the exposure correction amount during a coarse adjustment period.
- If the exposure correction amount exceeds the specification upper limit, the exposure determination unit 407 follows the transition for a large correction amount indicated by a solid line 802 illustrated in FIG. 8, thus changing the exposure for the monitoring camera in one step up to the specification upper limit during the coarse adjustment period.
- the exposure determination unit 407 updates the exposure correction amount to the specification upper limit (the predetermined value).
- the exposure determination unit 407 performs exposure control in such a way as to gradually change the exposure level until correct exposure is attained in the subject region.
- the exposure determination unit 407 determines the exposure correction amount again based on luminance information about the second region.
- The predetermined amount, which is used for comparison with the exposure correction amount, can be changed as appropriate according to a degree of coincidence indicating to what degree the evaluation region for the client apparatus and the evaluation region for the monitoring camera coincide with each other.
- the specification upper limit which is used for changing exposure at a stretch, can be a value different from the predetermined amount for use in comparison with the exposure correction amount.
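The branch between steps S 706 to S 708 and the clamping to the specification upper limit can be sketched as follows (the function and parameter names are ours; the disclosure fixes the behavior, not an API):

```python
import math

def plan_exposure_step(correction_ev: float,
                       predetermined_ev: float,
                       spec_upper_limit_ev: float):
    """Decide the next exposure change to send to the camera.
    Small corrections are applied in full in one step (solid line 801);
    large corrections are clamped to the specification upper limit for
    a coarse step (solid line 802), leaving fine adjustment to follow."""
    if abs(correction_ev) <= predetermined_ev:
        return correction_ev, False   # apply fully; no fine adjustment needed
    coarse = math.copysign(spec_upper_limit_ev, correction_ev)
    return coarse, True               # fine adjustment still required
```

The boolean flag corresponds to whether the processing proceeds to the fine-adjustment loop of steps S 709 to S 711.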
- In step S 707, to which the processing has proceeded after it is determined in step S 706 that the exposure correction amount is less than or equal to the predetermined amount, the exposure determination unit 407 outputs, to the monitoring camera 101 via the communication control unit 402, an instruction for controlling exposure for the monitoring camera 101 according to the exposure correction amount calculated in the above-described way.
- the monitoring camera 101 performs exposure control based on the instruction corresponding to the exposure correction amount, thus becoming able to capture an image in which the subject has a correct brightness.
- In step S 708, to which the processing has proceeded after it is determined in step S 706 that the exposure correction amount exceeds the predetermined amount, the exposure determination unit 407 outputs, to the monitoring camera 101 via the communication control unit 402, an instruction for controlling exposure for the monitoring camera 101 as indicated by the large correction amount illustrated in FIG. 8.
- In step S 709, the exposure determination unit 407 measures the brightness of the subject region of an image obtained after exposure control is performed in the above-described way, and then, in step S 710, the exposure determination unit 407 determines whether exposure of the subject region is currently set to a desired correct value. If, in step S 710, it is determined that the exposure is currently set to the correct value (YES in step S 710), the client CPU 301 ends the processing in the flowchart of FIG. 7.
- On the other hand, if it is determined in step S 710 that the exposure is not currently set to the correct value (NO in step S 710), then, in step S 711, the exposure determination unit 407 additionally performs exposure adjustment for the monitoring camera 101 via the communication control unit 402, and then returns the processing to step S 709.
- In step S 709, the exposure determination unit 407 measures the brightness of the subject region again, and then performs the determination processing in step S 710.
- The exposure adjustment in step S 711 does not greatly change exposure as mentioned above, but progressively changes exposure while performing fine adjustment.
- the exposure determination unit 407 performs measurement after an interval of a predetermined time in consideration of a time until exposure adjustment performed by the monitoring camera 101 is reflected.
- Performing the above-described processing enables obtaining an image in which the subject region is set to a correct exposure.
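The measure-compare-adjust cycle of steps S 709 to S 711 amounts to a small feedback loop. The step size, tolerance, and settling wait below are illustrative assumptions; the disclosure specifies only that adjustment is gradual and that measurement waits a predetermined time:

```python
import time

def fine_adjust(measure_ev_error, apply_correction,
                step_ev=0.25, tolerance_ev=0.1,
                settle_s=0.5, max_iters=20):
    """Repeatedly measure the subject region's exposure error (positive
    means the subject is still too dark) and nudge exposure by at most
    `step_ev`, waiting `settle_s` for each camera-side adjustment to be
    reflected in the captured image."""
    for _ in range(max_iters):
        error = measure_ev_error()
        if abs(error) <= tolerance_ev:
            return True               # correct exposure reached
        step = max(-step_ev, min(step_ev, error))
        apply_correction(step)
        time.sleep(settle_s)
    return False                      # gave up; the scene may be changing
```

The two callbacks stand in for measuring the subject region of a newly received image (step S 709) and issuing an adjustment instruction to the monitoring camera (step S 711).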
- FIG. 9 is a diagram used to explain an exchange of processing between the monitoring camera and the client apparatus in the above-described first exemplary embodiment, and illustrates a specific example in a case where exposure control for the camera has been appropriately performed.
- an image 901 illustrated in FIG. 9 indicates an image in an initial state obtained by the monitoring camera capturing an image of the subject, and is an example of an image captured in a backlit scene.
- The client apparatus performs processing for exposure evaluation such as that described above, based on an image 902 similar to the image 502 illustrated in FIG. 5. At this time, since the region of persons in the image 902 is dark, the client apparatus issues an instruction to increase exposure; in this case, since the exposure correction amount becomes large, the client apparatus first performs coarse adjustment as mentioned above, restricting the exposure correction amount to a certain degree.
- An image 903 is an example of an image captured by the monitoring camera which has been controlled for exposure by the client apparatus in the above-described way.
- the client apparatus detects the brightness of a subject again, performs fine adjustment in such a way as to gradually change the brightness as indicated in an image 904 , and performs exposure control of the monitoring camera in such a manner that the subject is set to a correct brightness.
- An image 905 is an example of an image which is obtained by such exposure control being performed in the monitoring camera.
- performing changing of the brightness in a stepwise fashion enables acquiring an image with an appropriate brightness even in a case where an evaluation region differs between the monitoring camera and the client apparatus.
- In a second exemplary embodiment, processing obtained by adding degamma processing to the processing described in the first exemplary embodiment is described.
- the degamma processing is processing used to attain a stable brightness which does not depend on a setting condition, such as exposure, of the monitoring camera 101 , as mentioned above.
- adding degamma processing enables implementing more appropriate processing than in the first exemplary embodiment.
- FIG. 10 is a flowchart illustrating the flow of processing which is performed by the client CPU 301 in the second exemplary embodiment.
- The monitoring camera 101, the client apparatus 103, the input device 104, and the display device 105 are assumed to be previously powered on, and a connection (communication) between the monitoring camera 101 and the client apparatus 103 is assumed to be previously established.
- transmission of image data from the monitoring camera 101 to the client apparatus 103 and image display by the display device 105 are assumed to be continuously repeated.
- the processing illustrated in the flowchart of FIG. 10 is assumed to be started by the client CPU 301 in response to a captured image of a subject being input from the monitoring camera 101 to the client apparatus 103 via the network 102 .
- In step S 901, the subject detection unit 406 performs processing for detecting a subject from an image transmitted from the monitoring camera 101.
- In step S 902, the subject detection unit 406 determines whether a subject (in this example, assumed to be a face) has been detected. If, in step S 902, it is determined that no subject has been detected (NO in step S 902), the client CPU 301 ends the processing illustrated in the flowchart of FIG. 10; on the other hand, if it is determined that a subject has been detected (YES in step S 902), the client CPU 301 advances the processing to step S 903.
- In step S 903, the exposure determination unit 407 measures the brightness of the subject region (face region).
- In step S 904, the exposure determination unit 407 performs degamma processing by referring to a luminance conversion table (degamma table).
- FIG. 11 is a diagram used to explain an example of degamma processing.
- A characteristic 1101 illustrated in FIG. 11 represents an image characteristic in the image sensor 202 of the monitoring camera 101, and a characteristic 1102 represents an input-output characteristic (in the present exemplary embodiment, a gamma characteristic) in the luminance conversion which is performed in the monitoring camera 101.
- the monitoring camera 101 outputs an image obtained by performing, in the monitoring camera 101 , gamma processing ( 1102 ) with respect to the image characteristic ( 1101 ) input from the image sensor 202 .
- Input-output characteristic information for the luminance conversion which is performed in the monitoring camera 101, i.e., information indicating a gamma characteristic, is acquired by the camera information acquisition unit 404 as a gamma table (luminance conversion table).
- the gamma characteristic information can be acquired in the state of being held as metadata about an input image, or a plurality of pieces of gamma characteristic information having respective different patterns corresponding to types of monitoring cameras 101 connectable to the client apparatus 103 can be previously stored and the applicable gamma characteristic information can be acquired from among the stored plurality of pieces of gamma characteristic information.
- such a plurality of pieces of gamma characteristic information having respective different patterns can be stored in, for example, the auxiliary storage device 303 , or can be formed by executing a program and then stored in the main storage device 302 .
- The camera information acquisition unit 404 performs identification information acquisition processing for acquiring, for example, an identifier (ID) indicating the type of the monitoring camera 101, a serial number, and an individual discrimination number. Then, the camera information acquisition unit 404 selects the applicable gamma characteristic information from among the previously stored plurality of pieces of gamma characteristic information based on at least one of these pieces of identification information.
- The gamma characteristic (a first input-output characteristic) which the camera information acquisition unit 404 acquires can be acquired together with the exposure setting value for the monitoring camera, which is acquired as described below.
- the exposure determination unit 407 acquires gamma characteristic information from the above-mentioned gamma table, and performs degamma processing on an input image coming from the monitoring camera in such a way as to return the input image to a state obtained before the input image is subjected to gamma processing in the monitoring camera, i.e., a state of an image captured by the image sensor 202 .
- The exposure determination unit 407 performs processing for inversely transforming the image based on a degamma characteristic (a second input-output characteristic), i.e., an input-output characteristic inverse to the gamma characteristic (the first input-output characteristic) used for converting the luminance of an image on the side of the monitoring camera, and thus returns the image to the state obtained before being converted on the side of the monitoring camera.
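As a concrete, simplified illustration: if the camera's luminance conversion were a pure power-law gamma, the degamma step would just apply the inverse exponent. Real cameras use table-based characteristics, so the power-law form and the normalized [0, 1] values below are assumptions for illustration only:

```python
def gamma_encode(linear: float, gamma: float = 1 / 2.2) -> float:
    """Camera-side luminance conversion (characteristic 1102),
    modeled as a power law on values normalized to [0, 1]."""
    return linear ** gamma

def degamma(encoded: float, gamma: float = 1 / 2.2) -> float:
    """Inverse transform (the second input-output characteristic):
    recover the sensor-linear value (characteristic 1101) from the
    gamma-processed image."""
    return encoded ** (1.0 / gamma)
```

Round-tripping a mid-gray value, `degamma(gamma_encode(0.18))` recovers 0.18, so exposure evaluation can operate on sensor-linear luminance regardless of the camera's gamma setting.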
- In step S 905, the exposure determination unit 407 performs calculation processing for an exposure correction amount based on the luminance value subjected to degamma processing.
- In step S 906, the exposure determination unit 407 acquires the current exposure setting value for the monitoring camera 101, acquired by the camera information acquisition unit 404 from the monitoring camera 101.
- In step S 907, the exposure determination unit 407 calculates an exposure correction amount for setting the subject region to a correct exposure, in a way similar to that in the above-described first exemplary embodiment.
- In step S 908, the exposure determination unit 407 determines whether the exposure correction amount calculated in step S 907 is less than or equal to a predetermined amount.
- That is, the exposure determination unit 407 determines how large the exposure correction amount (the change amount for exposure) is.
- the determination processing in step S 908 is performed.
- In that case, the exposure determination unit 407 performs coarse adjustment to change exposure in one step up to a certain amount and, from that point onwards, progressively performs fine adjustment. In this way, it becomes easy to perform matching of brightness, and it is possible to prevent or reduce an adverse effect in which hunting of luminance occurs.
- If, in step S 908, it is determined that the exposure correction amount is less than or equal to the predetermined amount (YES in step S 908), the client CPU 301 advances the processing to step S 909; on the other hand, if it is determined that the exposure correction amount exceeds the predetermined amount (NO in step S 908), the client CPU 301 advances the processing to step S 910.
- In step S 909, the exposure determination unit 407 outputs, to the monitoring camera 101 via the communication control unit 402, an instruction for controlling the monitoring camera 101 according to the exposure correction amount calculated in step S 907.
- the monitoring camera 101 performs exposure control based on the instruction corresponding to the exposure correction amount, thus becoming able to capture an image in which the subject has a correct brightness.
- In step S 910, to which the processing has proceeded after it is determined in step S 908 that the exposure correction amount exceeds the predetermined amount, the exposure determination unit 407 outputs, to the monitoring camera 101 via the communication control unit 402, an instruction for changing exposure for the monitoring camera 101 as indicated by the large correction amount illustrated in FIG. 8 in the first exemplary embodiment.
- In step S 911, the exposure determination unit 407 measures the brightness of the subject region (face region), and then, in step S 912, the exposure determination unit 407 determines whether exposure of the subject region is currently set to a correct value. If, in step S 912, it is determined that the exposure is currently set to the correct value (YES in step S 912), the client CPU 301 ends the processing in the flowchart of FIG. 10.
- On the other hand, if it is determined in step S 912 that the exposure is not currently set to the correct value (NO in step S 912), then, in step S 913, the exposure determination unit 407 additionally performs exposure adjustment for the monitoring camera 101 via the communication control unit 402, and then returns the processing to step S 911.
- In step S 911, the exposure determination unit 407 measures the brightness of the subject region again, and then performs the determination processing in step S 912.
- The exposure adjustment in step S 913 does not greatly change exposure as mentioned above, but progressively changes exposure while performing fine adjustment.
- the exposure determination unit 407 performs measurement after an interval of a predetermined time.
- In the second exemplary embodiment, it becomes possible to acquire a stable brightness which does not depend on an exposure setting condition of the monitoring camera 101, so that the monitoring camera 101 becomes able to capture a correctly exposed image.
- With the technique discussed in Japanese Patent Application Laid-Open No. 2007-102284, it is supposed that it is impossible to correctly perform camera control which brings about an appropriate brightness with respect to a target image.
- In contrast, according to the above-described exemplary embodiments, exposure control which enables obtaining an image with a correct brightness is implemented.
- the present disclosure can also be implemented by performing processing for supplying a program for implementing one or more functions of the above-described exemplary embodiments to a system or apparatus via a network or a storage medium and causing one or more processors included in a computer of the system or apparatus to read and execute the program.
- the present disclosure can also be implemented by a circuit which implements such one or more functions (for example, an application specific integrated circuit (ASIC)).
- Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
- the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
- the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
- The storage medium may include, for example, one or more of a hard disk, a random access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
Abstract
An apparatus capable of communicating with a capturing apparatus includes a first acquisition unit configured to acquire an image captured by the capturing apparatus, a second acquisition unit configured to acquire first exposure information determined by the capturing apparatus based on a luminance of a first region in the image, a first determination unit configured to determine second exposure information based on a luminance of a second region, a second determination unit configured to determine correction information based on a difference between the first exposure information and the second exposure information, and an output unit configured to output the correction information to the capturing apparatus.
Description
- This application is a Continuation of U.S. application Ser. No. 17/459,915, filed Aug. 27, 2021, which claims priority from Japanese Patent Application No. 2020-151228 filed Sep. 9, 2020, which is hereby incorporated by reference herein in its entirety.
- Aspects of the embodiments generally relate to an apparatus, a control method, and a storage medium.
- Conventionally, there is known a technique of, to bridge the difference between apparatuses in a case where a processing apparatus and an evaluation apparatus differ from each other, performing degamma processing to obtain an appropriate exposure determination value which does not depend on processing to be performed between the apparatuses, as discussed in Japanese Patent Application Laid-Open No. 4-165876 and Japanese Patent Application Laid-Open No. 2007-102284.
- Aspects of the embodiments are generally directed to an apparatus, capable of communicating with a capturing apparatus, that includes a first acquisition unit configured to acquire an image captured by the capturing apparatus, a second acquisition unit configured to acquire first exposure information determined by the capturing apparatus based on a luminance of a first region in the image, a first determination unit configured to determine second exposure information based on a luminance of a second region, a second determination unit configured to determine correction information based on a difference between the first exposure information and the second exposure information, and an output unit configured to output the correction information to the capturing apparatus.
- Further features of the present disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
- FIG. 1 is a diagram illustrating a configuration example of an image capturing control system.
- FIG. 2 is a diagram illustrating an internal configuration example of a monitoring camera.
- FIG. 3 is a diagram illustrating an internal configuration example of a client apparatus.
- FIG. 4 is a diagram illustrating a functional configuration example of the client apparatus.
- FIG. 5 is a diagram used to explain processing to be performed between the monitoring camera and the client apparatus.
- FIGS. 6A and 6B are diagrams used to explain a luminance state in an example illustrated in FIG. 5.
- FIG. 7 is a flowchart of exposure control processing according to a first exemplary embodiment.
- FIG. 8 is a diagram used to explain processing for changing exposure control.
- FIG. 9 is a diagram used to explain processing to be performed in a case where exposure control in the first exemplary embodiment has been performed.
- FIG. 10 is a flowchart of exposure control processing according to a second exemplary embodiment.
- FIG. 11 is a diagram illustrating characteristics of gamma processing and degamma processing.
- Various exemplary embodiments, features, and aspects of the disclosure will be described in detail below with reference to the drawings.
- First, common subject matter for use in describing various exemplary embodiments is described, and, then, the detailed description of exemplary embodiments is performed. Furthermore, the following exemplary embodiments are not intended to limit the present disclosure, and not all of the combinations of features described in the respective exemplary embodiments are necessarily essential for solutions in the present disclosure. The configuration of each exemplary embodiment can be modified or altered as appropriate according to specifications and various conditions (for example, use conditions and usage environments) of an apparatus to which the present disclosure is applied. Moreover, parts of respective exemplary embodiments to be described below can be configured in combination as appropriate. In the following exemplary embodiments, the same constituent elements are assigned the respective same reference characters for description thereof.
- One or more functional blocks to be described below can be implemented by hardware, such as an ASIC or programmable logic array (PLA), or can be implemented by a programmable processor, such as a CPU or an MPU, executing software. Moreover, they can be implemented by a combination of software and hardware. Accordingly, in the following description, even in a case where different functional blocks are described as actors, the same hardware can be implemented as an actor. Furthermore, ASIC is an abbreviation for application specific integrated circuit. CPU is an abbreviation for central processing unit. MPU is an abbreviation for micro processing unit.
-
FIG. 1 is a diagram illustrating a configuration example of an image capturing control system 100 according to a first exemplary embodiment.
- The image capturing control system 100 includes a monitoring camera 101, a network 102, a client apparatus 103, an input device 104, and a display device 105. The monitoring camera 101 is an image capturing apparatus for acquiring a moving image and is an apparatus capable of performing image capturing of, for example, a subject and image processing. The monitoring camera 101 and the client apparatus 103 are interconnected via the network 102 in such a way as to be able to communicate with each other. The client apparatus 103 is connected to the input device 104 and the display device 105 in such a way as to be able to communicate with them. The client apparatus 103 is an apparatus for processing various pieces of information and, therefore, can be referred to as an "information processing apparatus". Moreover, the client apparatus 103 is an apparatus for controlling an image capturing operation of the monitoring camera 101 and, therefore, can be referred to as an "image capturing control apparatus".
- The input device 104 is configured with, for example, a mouse and a keyboard, and is operated by the user of the client apparatus 103.
- The display device 105 is an apparatus including, for example, a monitor, which displays an image received from the client apparatus 103. Furthermore, the display device 105 is also able to function as a UI, such as a touch panel. In this case, the display device 105 is also able to function as an input device for inputting, for example, an instruction, information, and data to the client apparatus 103. UI is an abbreviation for user interface.
- While, in FIG. 1, the client apparatus 103, the input device 104, and the display device 105 are illustrated as respective individual devices, the present exemplary embodiment is not limited to such a configuration. For example, the client apparatus 103 and the display device 105 can be integrated together, or the input device 104 and the display device 105 can be integrated together. Moreover, the client apparatus 103, the input device 104, and the display device 105 can all be integrated together. In a case where the client apparatus 103 and the display device 105 are integrated together, the integrated apparatus can be in the form of, for example, a personal computer, a tablet terminal, or a smartphone.
-
FIG. 2 is a block diagram illustrating an internal configuration example of the monitoring camera 101.
- The monitoring camera 101 includes an image capturing optical system 201 and an image sensor 202. The monitoring camera 101 further includes a camera CPU 203, a ROM 204, a RAM 205, an image capturing system control unit 206, a control unit 207, an A/D conversion unit 208, an image processing unit 209, an encoder unit 210, and a network I/F 211. The camera CPU 203 through the network I/F 211 of the monitoring camera 101 are interconnected via a system bus 212. Furthermore, CPU is an abbreviation for central processing unit. ROM is an abbreviation for read-only memory. A/D is an abbreviation for analog-to-digital. RAM is an abbreviation for random access memory. I/F is an abbreviation for interface.
- The image capturing optical system 201 is configured with, for example, a zoom lens, a focus lens, an image shake correction lens, a diaphragm, and a shutter, and is an optical member group for collecting light coming from a subject. An optical image of light coming from, for example, a subject collected by the image capturing optical system 201 is formed on an imaging plane of the image sensor 202.
- The image sensor 202 is a charge accumulation-type solid-state image sensor, such as a CMOS sensor or a CCD sensor, which converts the optical image of light collected by the image capturing optical system 201 into a current value (signal value), and is an image capturing unit which acquires color information in combination with, for example, a color filter. CMOS is an abbreviation for complementary metal-oxide semiconductor. CCD is an abbreviation for charge-coupled device. The image sensor 202 is connected to the A/D conversion unit 208.
- The A/D conversion unit 208 converts the amount of light detected by the image sensor 202 into a digital signal (image data). The A/D conversion unit 208 transmits the digital signal to the image processing unit 209.
- The image processing unit 209 performs image processing on the image data, which is a digital signal received from the image sensor 202. The image processing unit 209 is connected to the encoder unit 210.
- The encoder unit 210 performs processing for converting image data processed by the image processing unit 209 into a file format, such as Motion JPEG, H.264, or H.265. The encoder unit 210 is connected to the network I/F 211.
- The camera CPU 203 is a control unit which comprehensively controls the operation of the monitoring camera 101. The camera CPU 203 reads an instruction stored in the ROM 204 or the RAM 205 and performs processing corresponding to the instruction.
- The image capturing system control unit 206 controls each component of the monitoring camera 101 based on instructions issued from the camera CPU 203. For example, the image capturing system control unit 206 performs control operations, such as focus control, shutter control, and aperture adjustment, with respect to the image capturing optical system 201.
- The network I/F 211 is an interface for use in communicating with an external apparatus, such as the client apparatus 103, via the network 102, and is controlled by the control unit 207.
- The control unit 207 controls communication to be performed with the client apparatus 103, and performs control to, for example, transmit, to the camera CPU 203, a control instruction (control signal) issued by the client apparatus 103 to each component of the monitoring camera 101.
- The network 102 is an Internet Protocol (IP) network used to interconnect the monitoring camera 101 and the client apparatus 103. The network 102 is configured with, for example, a plurality of routers, switches, and cables compliant with a communication standard such as Ethernet. In the present exemplary embodiment, the network 102 can be any network that enables communication between the monitoring camera 101 and the client apparatus 103, without restrictions on, for example, its communication standard, scale, and configuration. For example, the network 102 can be configured with the Internet, a wired local area network (LAN), a wireless LAN, or a wide area network (WAN).
-
FIG. 3 is a block diagram illustrating an internal configuration example of the client apparatus 103.
- The client apparatus 103 includes a client CPU 301, a main storage device 302, an auxiliary storage device 303, an input I/F 304, an output I/F 305, and a network I/F 306. The respective components of the client apparatus 103 are interconnected via a system bus 307 in such a way as to be able to communicate with each other.
- The client CPU 301 is a central processing unit which comprehensively controls the operation of the client apparatus 103. Furthermore, the client CPU 301 can be configured to comprehensively control the monitoring camera 101 via the network 102.
- The main storage device 302 is a storage device, such as a RAM, which functions as a temporary data storage location for the client CPU 301.
- The auxiliary storage device 303 is a storage device, such as an HDD, a ROM, or an SSD, which stores, for example, various programs and various pieces of setting data. Furthermore, HDD is an abbreviation for hard disk drive. SSD is an abbreviation for solid state drive. Moreover, a program concerned with the present exemplary embodiment is stored in the auxiliary storage device 303. In the present exemplary embodiment, the functions and processing operations of the client apparatus 103 illustrated in FIG. 4 are implemented by the client CPU 301 performing processing based on a program read out from the auxiliary storage device 303 and then loaded onto the main storage device 302. Details of this processing are described below. Moreover, the auxiliary storage device 303 can be configured to previously store, for example, patterns for pattern matching (patterns corresponding to characteristic portions of faces or characteristic portions of human bodies), which are used by the client apparatus 103 to perform face detection or human body detection based on image data. Furthermore, the patterns for pattern matching can be generated by execution of a program and then stored in the main storage device 302.
- The input I/F 304 is an interface used by the client apparatus 103 to receive an input (signal) from, for example, the input device 104.
- The output I/F 305 is an interface used by the client apparatus 103 to output information (a signal) to, for example, the display device 105.
- The network I/F 306 is an interface for use in communication with an external apparatus, such as the monitoring camera 101, via the network 102.
-
FIG. 4 is a functional block diagram illustrating the functions which the client apparatus 103 executes. In other words, the functional units (functional blocks) illustrated in FIG. 4 are functional units which the client CPU 301 is able to execute, and these functional units are synonymous with the client CPU 301.
- As illustrated in FIG. 4, the client CPU 301 of the client apparatus 103 includes, as functional units, an input signal acquisition unit 401, a communication control unit 402, an input image acquisition unit 403, a camera information acquisition unit 404, and a detection method setting unit 405. Moreover, the client CPU 301 further includes, as functional units, a subject detection unit 406, an exposure determination unit 407, and a display control unit 408. Furthermore, in the client apparatus 103, the functional units illustrated in FIG. 4, i.e., the input signal acquisition unit 401 through the display control unit 408, can be configured with hardware (or software) different from the client CPU 301.
- The input signal acquisition unit 401 receives an input from the user via the input device 104.
- The communication control unit 402 performs control to receive an image transmitted from the monitoring camera 101 (an image captured by the monitoring camera 101) via the network 102. Moreover, the communication control unit 402 performs control to transmit a control instruction issued by the client apparatus 103 to the monitoring camera 101 via the network 102.
- The input image acquisition unit 403 acquires an image received from the monitoring camera 101 via the communication control unit 402, as an input image targeted for processing for detecting a subject (an image to which subject detection processing is applied). Details of the detection processing are described below.
- The camera information acquisition unit 404 acquires, via the communication control unit 402, camera information to be used by the monitoring camera 101 to perform image capturing of, for example, a subject. The camera information includes various pieces of camera setting information and image processing information used for performing image capturing of, for example, a subject to acquire an image. Specifically, the camera information includes, for example, exposure parameters such as aperture value, shutter speed, and gain (setting values concerning exposure) and information concerning image processing related to luminance, such as gamma correction, edge enhancement, and white balance.
- The detection method setting unit 405 sets a predetermined (appropriate) detection method, from among various detection methods including detection of a face region (face detection) and detection of a human body region (human body detection), with respect to an input image acquired by the input image acquisition unit 403. In the present exemplary embodiment, the detection method setting unit 405 sets (selects) a detection method for face detection or a detection method for human body detection.
- The subject detection unit 406 detects a specific subject region from within an input image captured by the monitoring camera 101 and acquired by the input image acquisition unit 403. For example, in a case where performing face detection has been set by the detection method setting unit 405, the subject detection unit 406 detects a face region from the input image. Moreover, for example, in a case where performing human body detection has been set by the detection method setting unit 405, the subject detection unit 406 detects a human body region from the input image.
- Furthermore, the present exemplary embodiment is not limited to such a setting. For example, a detection method for detecting a feature region of a part of a person, such as the upper body of the person or an organ such as the head or the eyes, nose, or mouth of the face, can be set (selected). Moreover, while, in the present exemplary embodiment, a specific subject targeted for detection is assumed to be a person, a configuration capable of detecting a feature region related to a specific subject other than a person can be employed. For example, a configuration capable of detecting a specific subject previously set in the client apparatus 103, such as the face of an animal or an automobile, can also be employed.
- The
exposure determination unit 407 has the function of determining, based on an exposure setting value of the monitoring camera 101 acquired by the camera information acquisition unit 404 and image information about a subject region detected by the subject detection unit 406, an exposure amount for the detected subject region. In addition, the exposure determination unit 407 also has the function of performing exposure control of the monitoring camera 101 based on the determined exposure amount. Exposure control of the monitoring camera 101 by the exposure determination unit 407 is performed by transmitting an exposure control value that is based on the determined exposure amount to the monitoring camera 101 via the communication control unit 402. Specifically, the exposure determination unit 407 calculates an exposure correction amount, representing an amount of change of exposure for bringing the subject region into a correct exposure state, based on a difference between the exposure setting value of the monitoring camera 101 and the image information about the subject region, and transmits an exposure control value corresponding to the calculated exposure correction amount to the communication control unit 402. Then, the communication control unit 402 transmits a control instruction corresponding to the exposure control value (exposure correction amount) to the monitoring camera 101 via the network I/F 306. With this processing, in the monitoring camera 101 having received the control instruction, exposure control is performed by the control unit 207 or the image capturing system control unit 206. Furthermore, the exposure determination unit 407 can be configured to transmit not the exposure control value but the calculated exposure correction amount to the communication control unit 402, and to transmit not the control instruction but the exposure correction amount (exposure correction value) to the monitoring camera 101 via the network I/F 306. In this case, the monitoring camera 101 calculates an exposure control value corresponding to the input exposure correction amount and controls exposure based on the calculated exposure control value. The exposure control value mentioned here is a parameter for use in controlling the exposure of the monitoring camera 101 and indicates, for example, an aperture value, an exposure time, and an analog gain and a digital gain. In other words, the exposure determination unit 407 determines correction information (an exposure correction amount or an exposure control value) for bringing a subject region into a correct exposure state, based on a difference between the exposure setting value of the monitoring camera 101 and luminance information about the subject region, and outputs the correction information to the monitoring camera 101 via the network I/F 306.
- Moreover, in the present exemplary embodiment, the exposure determination unit 407 is configured to perform exposure control using at least a first exposure correction method and a second exposure correction method. While details thereof are described below, for example, the exposure determination unit 407 performs, as the first exposure correction method, coarse adjustment for changing the exposure of the monitoring camera 101 at a stretch up to a predetermined amount and, from that point onwards, performs exposure control by the second exposure correction method, which finely adjusts the exposure of the monitoring camera 101. Particularly, in a case where the calculated exposure correction amount is larger than the predetermined amount (a case where the amount of change of exposure is large), the exposure determination unit 407 performs coarse adjustment for changing exposure at a stretch up to the predetermined amount and, from that point onwards, performs fine adjustment. On the other hand, in a case where the calculated exposure correction amount is smaller than or equal to the predetermined amount, the exposure determination unit 407 performs control to change exposure at a stretch up to the exposure correction amount in the period of coarse adjustment. Details of such processing performed by the exposure determination unit 407 are described below with reference to, for example, the flowchart of FIG. 7.
- The display control unit 408 outputs, to the display device 105, a captured image in which the exposure correction using the exposure correction amount determined by the exposure determination unit 407 has been reflected, in response to an instruction from the client CPU 301.
- With the above-described configuration employed, in a case where the client apparatus 103 performs exposure control of the monitoring camera 101, the client apparatus 103 performs degamma processing as mentioned above and thus correctly determines the brightness of an image which the image sensor 202 of the monitoring camera 101 outputs. However, since the monitoring camera 101 and the client apparatus 103 differ from each other in the method for determining an exposure state, it is necessary to take such a difference into consideration.
- In the following description, for example, the monitoring camera 101 is assumed to set the whole image as its light metering area and determine an exposure amount based on the luminance acquired over the whole image. On the other hand, the client apparatus 103 is assumed to set, for example, a subject region, such as a person, extracted from the image as its light metering area and perform exposure determination based on information about the luminance acquired in the subject region. Thus, while the monitoring camera 101 determines an exposure amount with the whole image set as the evaluation range for exposure setting, the client apparatus 103 performs exposure determination with a subject region in the image set as the evaluation range for exposure setting. The present exemplary embodiment assumes a scene directed to enabling image capturing in which a subject region is exposed with appropriate brightness in a case where the monitoring camera 101 and the client apparatus 103 differ from each other in the evaluation region for exposure state determination.
-
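The mismatch between the two light metering areas described above can be illustrated with a small sketch. All pixel values, the frame size, and the subject-region coordinates below are hypothetical numbers chosen for illustration, not values from the embodiment:

```python
# Sketch: the monitoring camera and the client apparatus can disagree
# about scene brightness because they average luminance over different
# evaluation regions (whole frame vs. subject region).

def mean_luminance(pixels):
    """Average luminance over a rectangular block of pixel rows."""
    values = [v for row in pixels for v in row]
    return sum(values) / len(values)

# Tiny 8x8 backlit "image": bright background (200), dark person (40).
frame = [[200] * 8 for _ in range(8)]
for y in range(3, 7):
    for x in range(3, 5):
        frame[y][x] = 40  # dark subject pixels

# Camera-side metering: the whole frame (the second region).
camera_metering = mean_luminance(frame)      # 180.0

# Client-side metering: the subject region only (the first region).
subject = [row[3:5] for row in frame[3:7]]
client_metering = mean_luminance(subject)    # 40.0

print(camera_metering, client_metering)
```

To the camera the frame already looks reasonably bright, while to the client the person is badly underexposed; this disagreement is what the remainder of the section addresses.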
FIG. 5 is a diagram used to explain an exchange of processing between a monitoring camera and a client apparatus, and illustrates a specific example in a case where exposure control of the monitoring camera has not been appropriately performed. Moreover, FIGS. 6A and 6B are diagrams used to explain the luminance of an image in each state illustrated in FIG. 5. FIG. 6A illustrates histograms each representing a relationship between a luminance value and the number of pixels in an image, and FIG. 6B is a diagram illustrating average values of luminance in an image.
- An image 501 illustrated in FIG. 5 is an image in an initial state obtained by the monitoring camera capturing an image of the subject, and is an example of an image captured in a backlit scene. In a backlit scene, usually, the region of a person serving as a subject becomes dark and the surroundings thereof become bright. Even in such a case, to accurately perform recognition of a person, the image is to be set to an exposure state in which the brightness of the region of the person becomes appropriate.
- A graph 601 illustrated in FIG. 6A represents a distribution of luminance obtained in a state of brightness such as that of the image 501 illustrated in FIG. 5. Since the image 501 is in a backlit state, the luminance distribution is in a state in which a dark portion and a bright portion are dominant in the image 501, as shown in the graph 601. Then, a graph 602 illustrated in FIG. 6B represents the brightness of the entire image serving as the light metering area (the evaluation region for exposure determination) used by the monitoring camera, into which the luminance distribution represented by the graph 601 has been converted.
- Moreover, an image 502 illustrated in FIG. 5 represents an example of an image obtained by the client apparatus performing degamma processing to reproduce the state of brightness which the actual image sensor captures, in such a way as not to be affected by the image capturing condition used by the monitoring camera. At this time, the client apparatus extracts the brightness while focusing on the persons as its light metering area (evaluation region for exposure determination), and, since the persons are dark as shown in the image 502, the client apparatus seeks to increase exposure up to an appropriate brightness. Thus, as indicated by an arrow 603 illustrated in FIG. 6A, the client apparatus seeks to perform adjustment to bring the present brightness of the persons, indicated by a dashed line shown near the base of the arrow 603, to an appropriate brightness, indicated by a solid line shown near the tip of the arrow 603. The processing indicated by the arrow 603 is assumed to be, for example, processing for adjusting the exposure level in such a manner that the brightness obtained before processing becomes a twofold brightness, as indicated by a graph 604 illustrated in FIG. 6B.
- Then, if exposure control of the monitoring camera is performed based on an instruction from such a client apparatus, over-correction may be performed, as in an image 503 illustrated in FIG. 5, so that a loss of highlight detail may occur in the region of the persons. For example, in the case of performing exposure control to obtain a twofold brightness, since the monitoring camera adjusts brightness with the entire captured image used as its evaluation region, a high-luminance portion enters a saturation state, in which the portion cannot be made any brighter. Thus, as indicated by an arrow 605 illustrated in FIG. 6A, the monitoring camera adjusts exposure in such a way as to make the brightness of a low-luminance portion, indicated by a dashed line shown near the base of the arrow 605, two or more times its present value in order to make the overall brightness two times its present value, as indicated by a solid line shown near the tip of the arrow 605. A graph 606 illustrated in FIG. 6B represents the brightness obtained before and after exposure control performed in such a way as to make the brightness of the low-luminance portion two or more times and the overall brightness two times their present values.
- As mentioned above, in a case where the evaluation region differs between the monitoring camera side and the client apparatus side, over-correction may be performed even when attempting to make the brightness of a person appropriate. Thus, in a case where the evaluation region differs between the monitoring camera side and the client apparatus side, there is a high possibility that appropriate exposure control cannot be performed.
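The over-correction shown in image 503 can be reproduced numerically. The sketch below is an interpretive model, not the embodiment's algorithm: it assumes linear (post-degamma) 8-bit luminances, a 64-pixel frame with hypothetical values of 200 (background, 56 pixels) and 40 (person, 8 pixels), and a camera that repeatedly raises its gain while metering the clipped whole-frame average toward a doubled target:

```python
# Sketch: chasing a doubled whole-frame average when highlights clip
# drives the gain far past 2x and eventually blows out the person too.

def clip(v):
    return min(v, 255.0)  # 8-bit saturation

SUBJECT = 40.0      # dark person region, 8 of 64 pixels
BACKGROUND = 200.0  # bright surroundings, 56 of 64 pixels

def frame_mean(gain):
    """Whole-frame average after applying a linear gain with clipping."""
    return (56 * clip(BACKGROUND * gain) + 8 * clip(SUBJECT * gain)) / 64

target_mean = 2.0 * frame_mean(1.0)  # camera is asked to double brightness

gain = 1.0
for _ in range(6):  # a few metering/adjustment cycles
    gain *= target_mean / frame_mean(gain)

# The person only needed a 2x gain (40 -> 80), but ends up clipped:
print(clip(SUBJECT * gain))  # 255.0 -> loss of highlight detail
```

Because the saturated background can no longer contribute, the whole-frame metering keeps demanding more gain, and the person region overshoots its appropriate level and clips, which is the over-correction described above.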
- In the first exemplary embodiment, a configuration and a processing operation are described which enable performing appropriate exposure control even if, as mentioned above, the target for exposure evaluation (evaluation region) differs between the monitoring camera side and the client apparatus side. Furthermore, in the present exemplary embodiment, the correction information for controlling the exposure of the monitoring camera 101 is described as an exposure correction amount, but it can instead be an exposure control value as mentioned above.
-
FIG. 7 is a flowchart illustrating the flow of processing, from subject detection processing through exposure control processing, performed by the client CPU 301 of the client apparatus 103 according to the first exemplary embodiment. Furthermore, in the image capturing control system 100 illustrated in FIG. 1, the monitoring camera 101, the client apparatus 103, the input device 104, and the display device 105 are assumed to be previously powered on, and a connection (communication) between the monitoring camera 101 and the client apparatus 103 is assumed to be previously established. Moreover, in this state, image capturing of, for example, a subject with a predetermined updating cycle by the monitoring camera 101, transmission of image data from the monitoring camera 101 to the client apparatus 103, and image display by the display device 105 are assumed to be continuously repeated. Then, the processing illustrated in the flowchart of FIG. 7 is assumed to be started by the client CPU 301 in response to a captured image of, for example, a subject being input from the monitoring camera 101 to the client apparatus 103 via the network 102.
- First, in step S701, the subject detection unit 406 performs processing for detecting a subject from an image transmitted from the monitoring camera 101. In the present exemplary embodiment, an example in which a human body or a face is detected as a subject is used; therefore, prior to the detection processing for a subject, the detection method setting unit 405 sets a detection method for a face or human body in the subject detection unit 406. Then, the subject detection unit 406 performs face detection processing or human body detection processing on an input image according to the setting performed by the detection method setting unit 405. Patterns respectively corresponding to feature portions of faces or feature portions of human bodies are previously stored in the auxiliary storage device 303 of the client apparatus 103, and the subject detection unit 406 performs detection of a face region or a human body region by pattern matching based on the stored patterns.
- Furthermore, in the case of detecting a face region, usually, it is possible to detect a face with a high degree of accuracy and to clearly discriminate between the face region of the subject and regions other than the face region. However, in a case where, for example, the direction of the face, the size of the face, or the brightness of the face is not in a condition suited to face detection, it may not be possible to accurately detect the face region. On the other hand, in the case of detecting a human body, it is possible to detect a region in which a person is present irrespective of, for example, the direction, size, or brightness of the face. Furthermore, human body detection in the present exemplary embodiment does not necessarily need to detect the whole body, but can detect the upper half of the body, the upper body above the chest, or a head region including the face.
- Moreover, in the case of employing a pattern matching method as a detection method for a subject, patterns (classifiers) created by using statistical learning can be used as patterns for use in the pattern matching method. Alternatively, subject detection can be performed by using a method other than the pattern matching method. For example, subject detection can be performed with use of a luminance gradient within a local region in the image. Thus, the detection method for a subject is not limited to a specific detection method, and can include various methods, such as detection that is based on machine learning and detection that is based on distance information.
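As a toy illustration of the pattern matching idea only (the stored patterns in the embodiment represent characteristic portions of faces or human bodies, and practical detectors use statistically learned classifiers), a pattern can be slid across the image and each position scored by its sum of absolute differences (SAD); the template and image values below are invented:

```python
# Sketch: exhaustive template matching by sum of absolute differences.
def match_pattern(image, pattern, threshold):
    """Return (x, y) of every window whose SAD score is <= threshold."""
    ih, iw = len(image), len(image[0])
    ph, pw = len(pattern), len(pattern[0])
    hits = []
    for y in range(ih - ph + 1):
        for x in range(iw - pw + 1):
            sad = sum(
                abs(image[y + dy][x + dx] - pattern[dy][dx])
                for dy in range(ph)
                for dx in range(pw)
            )
            if sad <= threshold:
                hits.append((x, y))
    return hits

pattern = [[10, 240],
           [240, 10]]                 # toy "feature" template
image = [[0,   0,   0, 0],
         [0,  10, 240, 0],
         [0, 240,  10, 0],
         [0,   0,   0, 0]]

print(match_pattern(image, pattern, threshold=0))  # [(1, 1)]
```

A luminance-gradient approach, also mentioned above, would compare gradient orientations rather than raw pixel values, which makes it less sensitive to the brightness problems this embodiment corrects.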
- Next, in step S702, the subject detection unit 406 determines whether a subject (a face region or a human body region) has been detected from within the image by the subject detection processing performed in step S701. Then, if it is determined that at least one subject has been detected (YES in step S702), the client CPU 301 advances the processing to step S703; on the other hand, if it is determined that no subject has been detected (NO in step S702), the client CPU 301 ends the present processing.
- In step S703, the exposure determination unit 407 measures the brightness (luminance) of the subject region.
- Next, in step S704, the exposure determination unit 407 acquires the current exposure value acquired from the monitoring camera 101 by the camera information acquisition unit 404 (the current exposure setting value of the monitoring camera 101).
- Next, in step S705, the exposure determination unit 407 calculates an exposure correction amount, representing a change amount (correction amount) of exposure (exposure value) for bringing the subject region into a correct exposure state, based on information about the brightness of the subject region and the current exposure setting value of the monitoring camera 101. More specifically, the exposure determination unit 407 calculates, from the luminance information about the subject region, an exposure value for bringing the subject region into a correct exposure state. The exposure determination unit 407 then determines correction information concerning the exposure of the monitoring camera 101 based on a difference between the calculated exposure value and the current exposure setting value of the monitoring camera 101. The correction information can be an exposure correction amount (exposure correction value) for the monitoring camera 101, or it can be an exposure control value (an exposure time or a gain control value).
- Next, in step S706, the exposure determination unit 407 determines whether the exposure correction amount (exposure change amount) is less than or equal to a predetermined amount. Then, if, in step S706, it is determined that the exposure correction amount is less than or equal to the predetermined amount (YES in step S706), the client CPU 301 advances the processing to step S707; on the other hand, if it is determined that the exposure correction amount exceeds the predetermined amount (NO in step S706), the client CPU 301 advances the processing to step S708. Thus, in step S706, the exposure determination unit 407 performs processing for determining how large the exposure correction amount (correction information) representing the change amount of exposure is. This is because, as mentioned above, in a case where the evaluation region (light metering region) differs between the monitoring camera side and the client apparatus side, the exposure correction amount calculated by the client apparatus is likely to be a correction amount greatly different from reality. Accordingly, in a case where the exposure correction amount (correction information) is greater than the predetermined amount (a case where the change amount of exposure is large), the exposure determination unit 407 does not change exposure at a stretch by the calculated exposure correction amount, but performs coarse adjustment to change exposure at a stretch up to a certain amount and, from that point onwards, progressively performs fine adjustment.
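Steps S705 and S706 can be sketched as follows. The correction amount is expressed in EV (so that +1 EV doubles brightness); the correct-exposure target and the predetermined amount are hypothetical values, since the embodiment leaves the exact units and thresholds open:

```python
import math

def exposure_correction_ev(subject_mean, target_mean):
    """Step S705: exposure change, in EV, that brings the measured
    subject luminance to the correct-exposure target (positive = brighten)."""
    return math.log2(target_mean / subject_mean)

PREDETERMINED_EV = 1.0   # hypothetical step S706 threshold

subject_mean = 40.0      # brightness measured in step S703
target_mean = 80.0       # hypothetical correct-exposure level

correction = exposure_correction_ev(subject_mean, target_mean)
print(correction)                       # 1.0 EV
# Step S706: small enough to apply at a stretch (proceed to step S707)?
print(correction <= PREDETERMINED_EV)   # True
```

Here doubling the subject brightness corresponds to a +1 EV correction, which in this hypothetical case just passes the step S706 test and would be applied in one step.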
In other words, in a case where the exposure correction value or exposure control value serving as correction information is larger than a predetermined value, theexposure determination unit 407 additionally determines correction information in such a way as to modify (coarsely adjust) the correction information according to a predetermined value and, from that point onwards, finely adjust the luminance of the evaluation region on the side of the client apparatus. In this way, it becomes easy to perform matching of brightness, and it is possible to prevent or reduce an adverse effect in which, if coarse adjustment is performed, hunting of luminance occurs. Moreover, it is possible to reduce the possibility that the exposure correction for the monitoring camera to be performed with the correction information determined by theexposure determination unit 407 becomes over correction. Furthermore, the evaluation region (light metering region) for the client apparatus is a first region, and the evaluation region (light metering region) for the monitoring camera is a second region. Moreover, the evaluation region (light metering region) for the client apparatus, which is the first region, is to be a region including a main subject (for example, a human body, a face, or an animal such as a bird or cat). -
FIG. 8 is a diagram used to explain an example of such exposure adjustment. In the example illustrated in FIG. 8, a previously determined specification upper limit is taken as the predetermined amount (predetermined value) used in the determination with respect to the exposure correction amount. In a case where the exposure correction amount calculated in step S705 is less than or equal to the specification upper limit, the exposure determination unit 407 follows the small-correction-amount transition indicated by a solid line 801 illustrated in FIG. 8, thus changing the exposure for the monitoring camera at once up to the exposure correction amount in a coarse adjustment period. On the other hand, in a case where the exposure correction amount calculated in step S705 is greater than the specification upper limit, the exposure determination unit 407 follows the large-correction-amount transition indicated by a solid line 802 illustrated in FIG. 8, thus changing the exposure for the monitoring camera at once up to the specification upper limit in the coarse adjustment period. Thus, the exposure determination unit 407 updates the exposure correction amount to the specification upper limit (the predetermined value). Then, after the specification upper limit is reached, the exposure determination unit 407 performs exposure control in such a way as to gradually change the exposure level until correct exposure is attained in the subject region. Thus, the exposure determination unit 407 determines the exposure correction amount again based on luminance information about the second region. - Furthermore, the predetermined amount, which is used for comparison with the exposure correction amount, can be changed as appropriate according to a degree of coincidence indicating to what degree the evaluation region for the monitoring camera and the evaluation region for the client apparatus coincide with each other.
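The coarse/fine scheme of FIG. 8 can be sketched as follows. This is an illustrative sketch only: the function and parameter names are assumptions, not taken from the embodiment, and the correction is expressed in EV for concreteness.

```python
def plan_exposure_correction(correction_ev, upper_limit_ev):
    """Clamp a computed exposure correction to a specification upper
    limit, mirroring the coarse/fine scheme of FIG. 8.

    Returns (coarse_step_ev, needs_fine_adjustment).  Names and the EV
    representation are illustrative assumptions.
    """
    if abs(correction_ev) <= upper_limit_ev:
        # Small correction: apply it all at once (solid line 801).
        return correction_ev, False
    # Large correction: apply only up to the upper limit now
    # (solid line 802); the remainder is handled by fine adjustment.
    coarse = upper_limit_ev if correction_ev > 0 else -upper_limit_ev
    return coarse, True
```

A correction within the limit is passed through unchanged, while a larger one is clamped and flagged so that the caller knows to continue with gradual fine adjustment.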
Moreover, in a case where exposure control such as that indicated by the solid line 801 illustrated in FIG. 8 is performed, the specification upper limit, which is used for changing the exposure at once, can be a value different from the predetermined amount used for comparison with the exposure correction amount. - In step S707, to which the processing has proceeded after it is determined in step S706 that the exposure correction amount is less than or equal to the predetermined amount, the
exposure determination unit 407 outputs, to the monitoring camera 101 via the communication control unit 402, an instruction for controlling exposure for the monitoring camera 101 according to the exposure correction amount calculated in the above-described way. With this processing, the monitoring camera 101 performs exposure control based on the instruction corresponding to the exposure correction amount, thus becoming able to capture an image in which the subject has a correct brightness. - On the other hand, in step S708, to which the processing has proceeded after it is determined in step S706 that the exposure correction amount exceeds the predetermined amount, the
exposure determination unit 407 outputs, to the monitoring camera 101 via the communication control unit 402, an instruction for controlling exposure for the monitoring camera 101 as indicated by the large-correction-amount case illustrated in FIG. 8. - Next, in step S709, the
exposure determination unit 407 measures the brightness of the subject region of an image obtained after exposure control is performed in the above-described way, and then, in step S710, the exposure determination unit 407 determines whether exposure of the subject region is currently set to a desired correct value. Then, if, in step S710, it is determined that the exposure is currently set to the correct value (YES in step S710), the client CPU 301 ends the processing in the flowchart of FIG. 7. On the other hand, if, in step S710, it is determined that the exposure is not currently set to the correct value (NO in step S710), then in step S711, the exposure determination unit 407 additionally performs exposure adjustment for the monitoring camera 101 via the communication control unit 402, and then returns the processing to step S709. Then, in step S709, the exposure determination unit 407 measures the brightness of the subject region again, and performs the determination processing in step S710. Furthermore, in one embodiment, the exposure adjustment in step S711 does not greatly change the exposure as mentioned above but progressively changes the exposure while performing fine adjustment. Moreover, in a case where the exposure determination unit 407 returns the processing to step S709 and re-measures the brightness of the subject region, the exposure determination unit 407 performs the measurement after an interval of a predetermined time, in consideration of the time until the exposure adjustment performed by the monitoring camera 101 is reflected. - Performing the above-described processing enables obtaining an image in which the subject region is set to a correct exposure.
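The measure-check-adjust loop of steps S709 to S711 could be sketched as below. All names, the step size, and the settle time are illustrative assumptions; the callbacks stand in for measuring the subject-region brightness and sending a small correction to the camera.

```python
import time

def fine_adjust_until_correct(measure_luminance, apply_correction,
                              target, tolerance=0.05, step_ev=0.125,
                              settle_time_s=0.5, max_iterations=40):
    """Sketch of the S709-S711 loop: re-measure the subject region and
    nudge exposure in small steps until it is within tolerance of the
    target.  Names and default values are illustrative assumptions.
    """
    for _ in range(max_iterations):
        luminance = measure_luminance()
        if abs(luminance - target) <= tolerance * target:
            return luminance               # correct exposure reached
        # Fine adjustment: a small step toward the target, never a jump.
        apply_correction(step_ev if luminance < target else -step_ev)
        # Wait for the camera to reflect the new exposure setting
        # before re-measuring (the "predetermined time" in the text).
        time.sleep(settle_time_s)
    return measure_luminance()
```

The settle delay matters in practice: re-measuring before the camera has applied the previous correction would make the loop over-correct.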
-
FIG. 9 is a diagram used to explain an exchange of processing between the monitoring camera and the client apparatus in the above-described first exemplary embodiment, and illustrates a specific example in a case where exposure control for the camera has been appropriately performed. - As with the
image 501 illustrated in FIG. 5, an image 901 illustrated in FIG. 9 indicates an image in an initial state obtained by the monitoring camera capturing an image of the subject, and is an example of an image captured in a backlit scene. - The client apparatus performs processing for exposure evaluation such as that described above based on an
image 902 similar to the image 502 illustrated in FIG. 5. At this time, since the region of the persons in the image 902 is dark, the client apparatus issues an instruction to increase the exposure; however, in this case, since the exposure correction amount is large, the client apparatus performs coarse adjustment once as mentioned above and then performs control to restrict the exposure correction amount to a certain degree. An image 903 is an example of an image captured by the monitoring camera whose exposure has been controlled by the client apparatus in the above-described way. - Then, the client apparatus detects the brightness of the subject again, performs fine adjustment in such a way as to gradually change the brightness as indicated in an
image 904, and performs exposure control of the monitoring camera in such a manner that the subject is set to a correct brightness. Animage 905 is an example of an image which is obtained by such exposure control being performed in the monitoring camera. - As described above, in the first exemplary embodiment, performing changing of the brightness in a stepwise fashion enables acquiring an image with an appropriate brightness even in a case where an evaluation region differs between the monitoring camera and the client apparatus.
- In a second exemplary embodiment, processing obtained by adding degamma processing to the processing described in the first exemplary embodiment is described. The degamma processing is processing used to attain a stable brightness which does not depend on a setting condition, such as exposure, of the
monitoring camera 101, as mentioned above. In the second exemplary embodiment, adding degamma processing enables implementing more appropriate processing than in the first exemplary embodiment. -
FIG. 10 is a flowchart illustrating the flow of processing performed by the client CPU 301 in the second exemplary embodiment. In the processing illustrated in the flowchart of FIG. 10, as with the above-described processing, the monitoring camera 101, the client apparatus 103, the input device 104, and the display device 105 are assumed to be previously powered on, and a connection (communication) between the monitoring camera 101 and the client apparatus 103 is assumed to be previously established. Moreover, in this state, transmission of image data from the monitoring camera 101 to the client apparatus 103 and image display by the display device 105 are assumed to be continuously repeated. Then, the processing illustrated in the flowchart of FIG. 10 is assumed to be started by the client CPU 301 in response to a captured image of a subject being input from the monitoring camera 101 to the client apparatus 103 via the network 102. - First, in step S901, as with step S701 illustrated in
FIG. 7, the subject detection unit 406 performs processing for detecting a subject from an image transmitted from the monitoring camera 101. - Next, in step S902, the
subject detection unit 406 determines whether a subject (in this example, assumed to be a face) has been detected. If, in step S902, it is determined that no subject has been detected (NO in step S902), the client CPU 301 ends the processing illustrated in the flowchart of FIG. 10, and, on the other hand, if it is determined that a subject has been detected (YES in step S902), the client CPU 301 advances the processing to step S903. - In step S903, the
exposure determination unit 407 measures the brightness of the subject region (face region). - In step S904, the
exposure determination unit 407 performs degamma processing by referring to a luminance conversion table (degamma table). -
FIG. 11 is a diagram used to explain an example of degamma processing. A characteristic 1101 illustrated in FIG. 11 represents an image characteristic in the image sensor 202 of the monitoring camera 101, and a characteristic 1102 represents an input-output characteristic (in the present exemplary embodiment, a gamma characteristic) in the luminance conversion performed in the monitoring camera 101. Thus, the monitoring camera 101 outputs an image obtained by performing, in the monitoring camera 101, gamma processing (1102) with respect to the image characteristic (1101) input from the image sensor 202. - In the case of the present exemplary embodiment, input-output characteristic information for luminance conversion which is performed in the
monitoring camera 101, i.e., information indicating a gamma characteristic, is assumed to be previously acquired and prepared by, for example, the camerainformation acquisition unit 404 as a gamma table (luminance conversion table). Furthermore, the gamma characteristic information can be acquired in the state of being held as metadata about an input image, or a plurality of pieces of gamma characteristic information having respective different patterns corresponding to types ofmonitoring cameras 101 connectable to theclient apparatus 103 can be previously stored and the applicable gamma characteristic information can be acquired from among the stored plurality of pieces of gamma characteristic information. Furthermore, such a plurality of pieces of gamma characteristic information having respective different patterns can be stored in, for example, theauxiliary storage device 303, or can be formed by executing a program and then stored in themain storage device 302. For example, in a case where a plurality of pieces of gamma characteristic information having respective different patterns is previously stored, the camerainformation acquisition unit 404 performs identification information acquisition processing for acquiring, for example, an identifier (ID) indicating the type of themonitoring camera 101, a serial number, and an individual discrimination number. Then, the camerainformation acquisition unit 404 selects applicable gamma characteristic information from among the previously stored plurality of pieces of gamma characteristic information based on at least any one of such pieces of gamma characteristic information. Furthermore, a gamma characteristic which the camerainformation acquisition unit 404 acquires (a first input-output characteristic) can be acquired together with an exposure setting value for the monitoring camera to be acquired as described below. - The
exposure determination unit 407 acquires gamma characteristic information from the above-mentioned gamma table, and performs degamma processing on an input image coming from the monitoring camera in such a way as to return the input image to the state obtained before the gamma processing was applied in the monitoring camera, i.e., the state of the image captured by the image sensor 202. In other words, the exposure determination unit 407 performs processing for inversely transforming the image based on a degamma characteristic (a second input-output characteristic), which is an input-output characteristic inverse to the gamma characteristic (the first input-output characteristic) used for converting the luminance of the image on the monitoring camera side, and thus returns the image to the image obtained before being converted on the monitoring camera side. - Next, in step S905, the
exposure determination unit 407 performs calculation processing for an exposure correction amount that is based on a luminance value subjected to degamma processing. - Additionally, in step S906, the
exposure determination unit 407 acquires the current exposure setting value for the monitoring camera 101 acquired by the camera information acquisition unit 404 from the monitoring camera 101. - Then, in step S907, the
exposure determination unit 407 calculates an exposure correction amount for setting the subject region to a correct exposure in a way similar to that in the above-described first exemplary embodiment. - Next, in step S908, the
exposure determination unit 407 determines whether the exposure correction amount calculated in step S907 is less than or equal to a predetermined amount. Thus, in step S908, the exposure determination unit 407 performs processing for determining the magnitude of the exposure correction amount (the change amount for exposure). As mentioned above, since, in a case where the evaluation region differs between the monitoring camera side and the client apparatus side, the exposure correction amount calculated by the client apparatus is likely to differ greatly from the correction actually required, the determination processing in step S908 is performed. Even in the second exemplary embodiment, as with the first exemplary embodiment, in a case where the exposure correction amount is greater than the predetermined amount, the exposure determination unit 407 performs coarse adjustment to change the exposure at once up to a certain amount and, from that point onward, progressively performs fine adjustment. In this way, matching the brightness becomes easier, and luminance hunting, which could otherwise occur if only coarse adjustment were performed, can be prevented or reduced. If, in step S908, it is determined that the exposure correction amount is less than or equal to the predetermined amount (YES in step S908), the client CPU 301 advances the processing to step S909, and, on the other hand, if it is determined that the exposure correction amount exceeds the predetermined amount (NO in step S908), the client CPU 301 advances the processing to step S910. - In step S909, the
exposure determination unit 407 outputs, to the monitoring camera 101 via the communication control unit 402, an instruction for controlling the monitoring camera 101 according to the exposure correction amount calculated in step S907. The monitoring camera 101 performs exposure control based on the instruction corresponding to the exposure correction amount, thus becoming able to capture an image in which the subject has a correct brightness. - On the other hand, in step S910, to which the processing has proceeded after it is determined in step S908 that the exposure correction amount exceeds the predetermined amount, the
exposure determination unit 407 outputs, to the monitoring camera 101 via the communication control unit 402, an instruction for changing exposure for the monitoring camera 101 as indicated by the large-correction-amount case illustrated in FIG. 8 in the first exemplary embodiment. - Next, in step S911, the
exposure determination unit 407 measures the brightness of the subject region (face region), and then, in step S912, the exposure determination unit 407 determines whether exposure of the subject region is currently set to a correct value. Then, if, in step S912, it is determined that the exposure is currently set to the correct value (YES in step S912), the client CPU 301 ends the processing in the flowchart of FIG. 10. On the other hand, if, in step S912, it is determined that the exposure is not currently set to the correct value (NO in step S912), then in step S913, the exposure determination unit 407 additionally performs exposure adjustment for the monitoring camera 101 via the communication control unit 402, and then returns the processing to step S911. Then, in step S911, the exposure determination unit 407 measures the brightness of the subject region again, and performs the determination processing in step S912. Even in the second exemplary embodiment, the exposure adjustment in step S913 does not greatly change the exposure as mentioned above but progressively changes the exposure while performing fine adjustment. Moreover, even in the second exemplary embodiment, in a case where the exposure determination unit 407 returns the processing to step S911 and re-measures the brightness of the subject region, the exposure determination unit 407 performs the measurement after an interval of a predetermined time. - Performing the above-described processing enables obtaining an image with a correct brightness in the second exemplary embodiment. Furthermore, even in the second exemplary embodiment, the behavior of an exchange of processing between the monitoring camera and the client apparatus is similar to that described with reference to
FIG. 9 . - As described above, according to the second exemplary embodiment, it becomes possible to acquire a stable brightness which does not depend on an exposure setting condition of the
monitoring camera 101, so that themonitoring camera 101 becomes able to capture a correct image. - Furthermore, in the above-mentioned technique discussed in Japanese Patent Application Laid-Open No. 4-165876, while the brightness of an image sensor which does not depend on an image capturing condition is able to be obtained by performing degamma processing, there is no disclosure about a method for performing exposure control for an image capturing apparatus based on a result of degamma processing. Moreover, in the above-mentioned technique discussed in Japanese Patent Application Laid-Open No. 2007-102284, processing performed in a case where there is no information about a degamma curve is discussed, but camera control to be performed after degamma processing is not discussed. Therefore, in the techniques discussed in Japanese Patent Application Laid-Open No. 4-165876 and Japanese Patent Application Laid-Open No. 2007-102284, it is supposed that it is impossible to correctly perform camera control which brings about an appropriate brightness with respect to a target image. On the other hand, according to the first and second exemplary embodiments, exposure control which enables obtaining an image with a correct brightness is implemented.
- The present disclosure can also be implemented by performing processing for supplying a program for implementing one or more functions of the above-described exemplary embodiments to a system or apparatus via a network or a storage medium and causing one or more processors included in a computer of the system or apparatus to read and execute the program. Moreover, the present disclosure can also be implemented by a circuit which implements such one or more functions (for example, an application specific integrated circuit (ASIC)).
- Each of the above-described exemplary embodiments is merely a specific example for implementing the present disclosure, and should not be construed to limit the technical scope of the present disclosure. Thus, the present disclosure can be implemented in various forms without departing from the technical idea or the principal characteristics thereof.
- Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
- While the present disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Claims (12)
1. An apparatus capable of communicating with a capturing apparatus, the apparatus comprising:
at least one processor executing instructions that, when executed by the at least one processor, cause the at least one processor to:
acquire an image captured by the capturing apparatus;
acquire a first exposure setting value from the capturing apparatus;
determine a second exposure setting value based on a luminance of a specific region;
determine, based on the first exposure setting value and the second exposure setting value, an exposure correction value;
output, in a case where the exposure correction value is larger than a predetermined value, a new exposure correction value that is approximately equal to the predetermined value, to the capturing apparatus; and
output, in a case where the exposure correction value is equal to or smaller than the predetermined value, the exposure correction value, to the capturing apparatus.
2. The apparatus according to claim 1, wherein in the case where the exposure correction value is larger than the predetermined value, the first exposure setting value is changed to the exposure correction value gradually after the first exposure setting value is changed to the output new exposure correction value.
3. The apparatus according to claim 1, wherein the specific region includes a specific subject in the image.
4. The apparatus according to claim 1, wherein the instructions further cause the at least one processor to:
acquire a first input-output characteristic of the capturing apparatus concerning the image; and
convert a luminance of the image based on a second input-output characteristic which is an input-output characteristic inverse to the first input-output characteristic,
wherein the second exposure setting value is determined based on a luminance of the specific region in the image.
5. A method for controlling an apparatus capable of communicating with a capturing apparatus, the method comprising:
acquiring an image captured by the capturing apparatus;
acquiring a first exposure setting value determined by the capturing apparatus based on a luminance of a first region in the image;
determining a second exposure setting value based on a luminance of a specific region;
determining an exposure correction value based on the first exposure setting value and the second exposure setting value;
outputting, in a case where the exposure correction value is larger than a predetermined value, a new exposure correction value that is approximately equal to the predetermined value, to the capturing apparatus; and
outputting, in a case where the exposure correction value is equal to or smaller than the predetermined value, the exposure correction value, to the capturing apparatus.
6. The method according to claim 5, wherein in the case where the exposure correction value is larger than the predetermined value, the first exposure setting value is changed to the exposure correction value gradually after the first exposure setting value is changed to the output new exposure correction value.
7. The method according to claim 5, wherein the specific region includes a specific subject in the image.
8. The method according to claim 5, further comprising:
acquiring a first input-output characteristic of the capturing apparatus concerning the image;
converting a luminance of the image based on a second input-output characteristic which is an input-output characteristic inverse to the first input-output characteristic; and
determining the second exposure setting value based on a luminance of the specific region in the image.
9. A non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by a computer, cause the computer to perform a method for controlling an apparatus capable of communicating with a capturing apparatus, the method comprising:
acquiring an image captured by the capturing apparatus;
acquiring a first exposure setting value determined by the capturing apparatus based on a luminance of a first region in the image;
determining a second exposure setting value based on a luminance of a specific region;
determining an exposure correction value based on the first exposure setting value and the second exposure setting value;
outputting, in a case where the exposure correction value is larger than a predetermined value, a new exposure correction value that is approximately equal to the predetermined value, to the capturing apparatus; and
outputting, in a case where the exposure correction value is equal to or smaller than the predetermined value, the exposure correction value, to the capturing apparatus.
10. The non-transitory computer-readable storage medium according to claim 9, wherein in the case where the exposure correction value is larger than the predetermined value, the first exposure setting value is changed to the exposure correction value gradually after the first exposure setting value is changed to the output new exposure correction value.
11. The non-transitory computer-readable storage medium according to claim 9, wherein the specific region includes a specific subject in the image.
12. The non-transitory computer-readable storage medium according to claim 9, wherein the method further comprises:
acquiring a first input-output characteristic of the capturing apparatus concerning the image;
converting a luminance of the image based on a second input-output characteristic which is an input-output characteristic inverse to the first input-output characteristic; and
determining the second exposure setting value based on a luminance of the specific region in the image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/488,884 US20240048852A1 (en) | 2020-09-09 | 2023-10-17 | Apparatus, control method, and storage medium |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020151228A JP2022045567A (en) | 2020-09-09 | 2020-09-09 | Imaging control device, imaging control method, and program |
JP2020-151228 | 2020-09-09 | ||
US17/459,915 US11825204B2 (en) | 2020-09-09 | 2021-08-27 | Apparatus for controlling exposure of capturing apparatus, method for controlling the same, and storage medium |
US18/488,884 US20240048852A1 (en) | 2020-09-09 | 2023-10-17 | Apparatus, control method, and storage medium |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/459,915 Continuation US11825204B2 (en) | 2020-09-09 | 2021-08-27 | Apparatus for controlling exposure of capturing apparatus, method for controlling the same, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
US20240048852A1 true US20240048852A1 (en) | 2024-02-08 |
Family
ID=80470118
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/459,915 Active US11825204B2 (en) | 2020-09-09 | 2021-08-27 | Apparatus for controlling exposure of capturing apparatus, method for controlling the same, and storage medium |
US18/488,884 Pending US20240048852A1 (en) | 2020-09-09 | 2023-10-17 | Apparatus, control method, and storage medium |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/459,915 Active US11825204B2 (en) | 2020-09-09 | 2021-08-27 | Apparatus for controlling exposure of capturing apparatus, method for controlling the same, and storage medium |
Country Status (2)
Country | Link |
---|---|
US (2) | US11825204B2 (en) |
JP (1) | JP2022045567A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022045532A (en) * | 2020-09-09 | 2022-03-22 | キヤノン株式会社 | Imaging control device, imaging control method, and program |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007102284A (en) | 2005-09-30 | 2007-04-19 | Canon Inc | Image processing apparatus and method, and computer program |
JP4897593B2 (en) * | 2007-07-09 | 2012-03-14 | 富士フイルム株式会社 | Compound eye photographing apparatus and adjustment method thereof |
JP4841582B2 (en) * | 2008-03-18 | 2011-12-21 | 富士通株式会社 | Image correction program and image correction apparatus |
JP2015195477A (en) * | 2014-03-31 | 2015-11-05 | ブラザー工業株式会社 | Program, terminal device and method |
JP6444073B2 (en) * | 2014-06-25 | 2018-12-26 | キヤノン株式会社 | Image processing device |
US11089228B2 (en) * | 2018-10-25 | 2021-08-10 | Canon Kabushiki Kaisha | Information processing apparatus, control method of information processing apparatus, storage medium, and imaging system |
-
2020
- 2020-09-09 JP JP2020151228A patent/JP2022045567A/en active Pending
-
2021
- 2021-08-27 US US17/459,915 patent/US11825204B2/en active Active
-
2023
- 2023-10-17 US US18/488,884 patent/US20240048852A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US11825204B2 (en) | 2023-11-21 |
JP2022045567A (en) | 2022-03-22 |
US20220078325A1 (en) | 2022-03-10 |
Similar Documents
Publication | Title |
---|---|
US8879802B2 (en) | Image processing apparatus and image processing method |
US9489747B2 (en) | Image processing apparatus for performing object recognition focusing on object motion, and image processing method therefor |
CN101213828B (en) | Method and apparatus for incorporating iris color in red-eye correction |
US20240048852A1 (en) | Apparatus, control method, and storage medium |
US10362207B2 (en) | Image capturing apparatus capable of intermittent image capturing, and control method and storage medium thereof |
US10764550B2 (en) | Image processing apparatus, image processing method, and storage medium |
US11258956B2 (en) | Image capturing apparatus, image capturing method, and program |
CN110365897B (en) | Image correction method and device, electronic equipment and computer readable storage medium |
US11575841B2 (en) | Information processing apparatus, imaging apparatus, method, and storage medium |
US20120188437A1 (en) | Electronic camera |
US9215365B2 (en) | Imaging apparatus and imaging method |
JP2014007528A (en) | Image processing apparatus and control method thereof |
US11727716B2 (en) | Information processing apparatus, imaging apparatus, which determines exposure amount with respect to face detection and human body detection |
US10021314B2 (en) | Image processing apparatus, image capturing apparatus, method of controlling the same, and storage medium for changing shading using a virtual light source |
US20210306543A1 (en) | Information processing apparatus, image capturing apparatus, method, and storage medium |
US11711619B2 (en) | Controlling exposure based on inverse gamma characteristic |
US20190052803A1 (en) | Image processing system, imaging apparatus, image processing apparatus, control method, and storage medium |
US20230276134A1 (en) | Information processing apparatus, imaging apparatus, method, and storage medium |
WO2017208991A1 (en) | Image capturing and processing device, electronic instrument, image capturing and processing method, and image capturing and processing device control program |
US11716541B2 (en) | Image capturing apparatus, method of controlling image capturing apparatus, system, and non-transitory computer-readable storage medium |
US20220360701A1 (en) | Imaging control apparatus for controlling exposure of imaging apparatus based on subject authentication result, imaging control method, and storage medium |
US11849221B2 (en) | Apparatus, method, and storage medium |
JP5629562B2 (en) | Image processing apparatus, control method therefor, and program |
US9936158B2 (en) | Image processing apparatus, method and program |
JP2024015578A (en) | Control device, imaging device, control method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |