US20210357676A1 - Information processing apparatus, information processing method, and storage medium - Google Patents

Information processing apparatus, information processing method, and storage medium

Info

Publication number
US20210357676A1
Authority
US
United States
Prior art keywords
image capturing
processing
detection
blocks
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/318,893
Inventor
Eiichiro Kitagawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA reassignment CANON KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KITAGAWA, EIICHIRO
Publication of US20210357676A1

Classifications

    • G06K9/4671
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06K9/4604
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/11 - Region-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a storage medium.
  • Japanese Patent No. 5235718 discusses a technique that performs image analysis on a captured image to extract a feature amount of the image, and detects, based on a change of the feature amount, an action (camera tampering attempts) obstructing image capturing.
  • the present disclosure is directed to a technique capable of detecting a state where image capturing is obstructed depending on a situation.
  • an information processing apparatus determining whether image capturing by an image capturing apparatus is obstructed, includes a dividing unit configured to divide an input image captured by the image capturing apparatus into a plurality of blocks, a processing determination unit configured to determine whether to perform first detection processing using a reference image corresponding to the image capturing apparatus or second detection processing using a feature amount of the input image, on each of the blocks, and an obstruction determination unit configured to determine whether the image capturing by the image capturing apparatus is obstructed, based on a detection result of each of the blocks by the first detection processing or the second detection processing.
  • FIG. 1 is a block diagram illustrating an example of a configuration of a system.
  • FIG. 2 is a block diagram illustrating an example of a hardware configuration of an information processing apparatus.
  • FIG. 3 is a block diagram illustrating an example of a functional configuration of an image capturing apparatus.
  • FIG. 4 is a diagram illustrating an example of a method of dividing an image into a plurality of blocks.
  • FIG. 5 is a flowchart illustrating an example of processing performed by the image capturing apparatus.
  • FIG. 6 is a flowchart illustrating an example of processing performed by the image capturing apparatus.
  • FIGS. 7A to 7H are diagrams each illustrating an example of a setting screen for receiving an instruction from a user.
  • FIGS. 8A and 8B are flowcharts each illustrating an example of processing by the image capturing apparatus.
  • FIG. 9 is a block diagram illustrating another example of the functional configuration of the image capturing apparatus.
  • FIG. 10 is a diagram illustrating an example of an algorithm to determine detection processing.
  • a system 1 includes a plurality of image capturing apparatuses A 101 - 1 to A 101 - 3 and a management apparatus A 105 .
  • Each of the image capturing apparatuses A 101 - 1 to A 101 - 3 and the management apparatus A 105 are connected so as to transmit and receive information and data to and from each other through a predetermined network A 103 .
  • a type of the network A 103 is not particularly limited as long as the network A 103 can connect each of the image capturing apparatuses A 101 - 1 to A 101 - 3 with the management apparatus A 105 .
  • Specific examples of the network A 103 include the Internet, a local area network (LAN), a wide area network (WAN), and a public line (e.g., telephone line or mobile communication line).
  • other examples of the network A 103 include a dedicated line, an asynchronous transfer mode (ATM) line, a frame relay line, a cable television line, and a data broadcasting wireless communication line.
  • the network A 103 may be a wireless network or a wired network.
  • the network A 103 may include a plurality of different types of networks.
  • communication between each of the image capturing apparatuses A 101 - 1 to A 101 - 3 and the management apparatus A 105 may be relayed by a communication apparatus.
  • the different types of networks may be applied to the communication between the communication apparatus and each of the image capturing apparatuses A 101 - 1 to A 101 - 3 , and the communication between the communication apparatus and the management apparatus A 105 .
  • Each of the image capturing apparatuses A 101 - 1 to A 101 - 3 has a detection function to detect an action (e.g., camera tampering attempts) that shields at least a part of a viewing angle to obstruct the image capturing.
  • each of the image capturing apparatuses A 101 - 1 to A 101 - 3 is used as a monitoring camera.
  • each of the image capturing apparatuses A 101 - 1 to A 101 - 3 is also referred to as an “image capturing apparatus A 101 ”.
  • the management apparatus A 105 is an information processing apparatus that is used for monitoring operation based on images corresponding to image capturing results of the respective image capturing apparatuses A 101 - 1 to A 101 - 3 .
  • the management apparatus A 105 has functions of, for example, presentation of the image corresponding to the image capturing result of each image capturing apparatus A 101 , control of the above-described detection function of each image capturing apparatus A 101 , and reception of notification (e.g., alert) from each image capturing apparatus A 101 .
  • the management apparatus A 105 can be realized by, for example, a personal computer (PC).
  • the management apparatus A 105 includes, for example, a main body performing various kinds of calculations, an output device (e.g., display) presenting information to the user, and an input device (e.g., keyboard and pointing device) receiving an instruction from the user.
  • the management apparatus A 105 may receive, from the user, an instruction about setting of each image capturing apparatus A 101 through a user interface such as a web browser, and may update setting of the target image capturing apparatus A 101 based on the instruction. Further, the management apparatus A 105 may receive the image (e.g., moving image or still image) corresponding to the image capturing result from each image capturing apparatus A 101 , and may present the image to the user through the output device or record the image.
  • the management apparatus A 105 may receive notification of an alert and the like from each image capturing apparatus A 101 , and present information corresponding to the notification to the user through the output device.
  • the various kinds of functions described above may be implemented by, for example, applications installed in the management apparatus A 105 .
  • An example of a hardware configuration of an information processing apparatus 100 , which is adoptable as the parts of the image capturing apparatus A 101 relating to execution of the various kinds of calculations and as the management apparatus A 105 , is described with reference to FIG. 2 .
  • the information processing apparatus 100 includes a central processing unit (CPU) 101 , a read only memory (ROM) 102 , and a random access memory (RAM) 103 .
  • the information processing apparatus 100 further includes an auxiliary storage device 104 and a communication interface (I/F) 107 .
  • the information processing apparatus 100 may include at least any of an output device 105 and an input device 106 .
  • the CPU 101 , the ROM 102 , the RAM 103 , the auxiliary storage device 104 , the output device 105 , the input device 106 , and the communication I/F 107 are connected to one another through a bus 108 .
  • the CPU 101 controls various kinds of operation of the information processing apparatus 100 .
  • the CPU 101 may control operation of the entire information processing apparatus 100 .
  • the ROM 102 stores control programs, a boot program, and other programs executable by the CPU 101 .
  • the RAM 103 is a main storage memory of the CPU 101 , and is used as a work area or a temporary storage area for loading various kinds of programs.
  • the auxiliary storage device 104 stores various kinds of data and various kinds of programs.
  • the auxiliary storage device 104 is implemented by a storage device temporarily or persistently storing various kinds of data, such as a nonvolatile memory represented by a hard disk drive (HDD) and a solid state drive (SSD).
  • the output device 105 is a device outputting various kinds of information, and is used for presentation of the various kinds of information to the user.
  • the output device 105 is implemented by a display device such as a display.
  • the output device 105 presents the information to the user by displaying various kinds of display information.
  • the output device 105 may be implemented by a sound output device outputting sound such as voice and electronic sound.
  • the output device 105 presents the information to the user by outputting sound such as voice and electronic sound.
  • the device adopted as the output device 105 may be appropriately changed depending on a medium used for presentation of information to the user.
  • the input device 106 is used to receive various kinds of instructions from the user.
  • the input device 106 can be implemented by, for example, a mouse, a keyboard, and a touch panel. Further, as another example, the input device 106 may include a sound collection device such as a microphone, and may collect voice uttered by the user. In this case, when various kinds of analysis processing such as acoustic analysis and natural language processing is performed on the collected voice, contents represented by the voice are recognized as the instruction from the user. Further, a device adopted as the input device 106 may be appropriately changed depending on a method of recognizing the instruction from the user. In addition, a plurality of types of devices may be adopted as the input device 106 .
  • the communication I/F 107 is used for communication with an external apparatus through the network.
  • a device adopted as the communication I/F 107 may be appropriately changed depending on a type of a communication path and an adopted communication system.
  • the image capturing apparatus A 101 includes an image capturing unit A 201 , a compression unit A 202 , a format conversion unit A 203 , and a communication unit A 204 .
  • the image capturing apparatus A 101 further includes a block dividing unit A 205 , a detection processing switching unit A 206 , a first detection unit A 207 , a second detection unit A 208 , an obstruction determination unit A 209 , a notification unit A 210 , and a setting reception unit A 211 .
  • the image capturing unit A 201 guides light of an object incident through an optical system such as a lens, to an image capturing device, photoelectrically converts the light into an electric signal by the image capturing device, and generates image data based on the electric signal.
  • the compression unit A 202 applies encoding processing, compression processing, and other processing on the image data output from the image capturing unit A 201 , to reduce a data amount of the image data.
  • the format conversion unit A 203 converts the image data, the data amount of which has been reduced by compression, into other image data of a predetermined format.
  • the format conversion unit A 203 may convert the target image data into image data of a format more suitable for transmission through the network.
  • the format conversion unit A 203 outputs the format-converted image data to a predetermined output destination.
  • the format conversion unit A 203 may output the format-converted image data to the communication unit A 204 to transmit the image data to the other apparatus (e.g., management apparatus A 105 ) through the network.
  • the communication unit A 204 transmits and receives information and data to and from the other apparatus through a predetermined network. For example, the communication unit A 204 receives information corresponding to an instruction about various kinds of settings received by the management apparatus A 105 from the user. In addition, the communication unit A 204 transmits an image corresponding to the image capturing result of the image capturing unit A 201 and notifies the management apparatus A 105 of various kinds of notification information (e.g., alert information).
  • the block dividing unit A 205 divides the image of the image data output from the image capturing unit A 201 (i.e., image corresponding to image capturing result of image capturing unit A 201 ) into a plurality of blocks.
  • the block dividing unit A 205 may divide the image corresponding to the image capturing result of the image capturing unit A 201 , into a plurality of blocks each having a rectangular shape.
  • FIG. 4 illustrates an example of a method of dividing the image into the plurality of blocks.
  • the block dividing unit A 205 divides the entire image (i.e., entire viewing angle of image capturing unit A 201 ) into 12 blocks each having a uniform size by dividing the entire image into four blocks in a vertical direction and into three blocks in a lateral direction.
  • reference numerals A 301 to A 312 are added to the blocks in order from an upper-left block to a lower-right block, for convenience.
  • the example illustrated in FIG. 4 is illustrative, and does not limit the method of dividing the image.
  • the image may be divided into a plurality of blocks in such a manner that an area positioned at a center of the image has a size smaller than an area positioned at an end part of the image.
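The uniform division illustrated in FIG. 4 can be sketched as follows; the function name, image size, and tuple layout are illustrative assumptions, not taken from the patent:

```python
def divide_into_blocks(height, width, rows=4, cols=3):
    """Divide a frame of the given size into rows x cols rectangular
    blocks, returned as (top, left, bottom, right) tuples in order
    from the upper-left block to the lower-right block."""
    blocks = []
    for r in range(rows):
        for c in range(cols):
            top = r * height // rows
            bottom = (r + 1) * height // rows
            left = c * width // cols
            right = (c + 1) * width // cols
            blocks.append((top, left, bottom, right))
    return blocks

# A 4x3 division of a 640x480 frame yields the 12 blocks that FIG. 4
# labels A301 (upper left) through A312 (lower right).
blocks = divide_into_blocks(480, 640)
print(len(blocks))  # 12
```

Integer division keeps the block edges aligned to pixel boundaries even when the frame size is not an exact multiple of the grid.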
  • FIG. 3 is referred to again.
  • the detection processing switching unit A 206 selectively switches, based on a predetermined condition, whether to apply processing by the first detection unit A 207 described below or processing by the second detection unit A 208 described below to each of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201 .
  • the detection processing switching unit A 206 may acquire, from the setting reception unit A 211 described below, the information corresponding to the instruction received by the management apparatus A 105 from the user, and may determine processing to be applied to each of the blocks based on the information.
  • the first detection unit A 207 detects occurrence of a state where a partial area corresponding to an input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A 201 is shielded, based on a difference between the input image and a reference image.
  • the second detection unit A 208 detects occurrence of the state where the partial area corresponding to the input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A 201 is shielded, based on a feature amount representing a predetermined image feature extracted from the input image.
  • the second detection unit A 208 may extract edge power as the above-described feature amount by applying a Sobel filter to the input image.
  • the second detection unit A 208 may detect occurrence of the state where the partial area corresponding to the input image in the viewing angle of the image capturing unit A 201 is shielded, based on uniformity of the input image corresponding to the extracted edge power.
  • the state where the partial area corresponding to the input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A 201 is shielded is also referred to as a “shielded state”, for convenience.
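As a rough sketch of the second detection processing, a 3x3 Sobel filter can be applied to a grayscale block and the accumulated gradient magnitude used as the edge power; a nearly uniform (low-power) block is then treated as shielded. The kernel coefficients are the standard Sobel values, and the threshold and all names here are illustrative assumptions:

```python
# Standard 3x3 Sobel kernels; the patent names the filter but not the
# exact coefficients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def edge_power(block):
    """Total squared gradient magnitude over the interior of a
    grayscale block (a list of pixel rows)."""
    h, w = len(block), len(block[0])
    power = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    p = block[y + dy][x + dx]
                    gx += SOBEL_X[dy + 1][dx + 1] * p
                    gy += SOBEL_Y[dy + 1][dx + 1] * p
            power += gx * gx + gy * gy
    return power

def is_shielded_by_edges(block, threshold=1.0):
    """A nearly uniform block (e.g. a covered lens) has almost no edge
    power, so low power suggests the area is shielded."""
    return edge_power(block) < threshold

flat = [[128] * 8 for _ in range(8)]            # uniform: no edges
step = [[0] * 4 + [255] * 4 for _ in range(8)]  # strong vertical edge
print(is_shielded_by_edges(flat), is_shielded_by_edges(step))  # True False
```

A real implementation would typically use an optimized convolution (e.g. an image-processing library) rather than nested loops, but the decision rule is the same.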
  • the obstruction determination unit A 209 determines whether the image capturing by the image capturing unit A 201 is obstructed, based on a detection result of the shielded state of each of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201 , detected by the first detection unit A 207 or the second detection unit A 208 .
  • the obstruction determination unit A 209 may determine whether the image capturing by the image capturing unit A 201 is obstructed, based on a ratio of the blocks detected as being shielded to the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201 .
  • the notification unit A 210 notifies a predetermined notification destination (e.g., management apparatus A 105 illustrated in FIG. 1 ) of information corresponding to the determination result of the obstruction determination unit A 209 .
  • the notification unit A 210 may notify the management apparatus A 105 of information notifying alert (hereinafter, also referred to as alert information).
  • the setting reception unit A 211 receives, from the management apparatus A 105 , an instruction about various kinds of settings received by the management apparatus A 105 from the user, and controls various kinds of settings for operation of the image capturing apparatus A 101 in response to the instruction.
  • the setting reception unit A 211 may control the detection processing switching unit A 206 to switch the shielded state detection processing to be applied to each of the blocks, in response to the instruction from the user received from the management apparatus A 105 .
  • the setting reception unit A 211 may transmit, to the management apparatus A 105 , information to present, to the user, a user interface (UI) (e.g., a setting screen) for receiving instructions about control of the various kinds of settings, thereby causing the management apparatus A 105 to present the UI. Further, the setting reception unit A 211 may control the various kinds of settings for operation of the image capturing apparatus A 101 (e.g., setting about switching condition of detection processing switching unit A 206 ), in response to the instruction received by the management apparatus A 105 from the user through the above-described UI.
  • An example of processing by the image capturing apparatus A 101 according to the present exemplary embodiment is described with reference to FIG. 5 , with a particular focus on the processing to detect obstruction of the image capturing by the image capturing unit A 201 .
  • edge power is used as the feature amount extracted from the block.
  • step S 101 the block dividing unit A 205 divides the image corresponding to the image capturing result of the image capturing unit A 201 , into a predetermined number of blocks.
  • step S 102 the detection processing switching unit A 206 determines whether to apply the processing by the first detection unit A 207 or the processing by the second detection unit A 208 to each of the blocks, based on the user instruction notified from the setting reception unit A 211 .
  • the processing in step S 102 is separately described in detail below.
  • step S 103 the image capturing apparatus A 101 determines whether processing in steps S 104 to S 106 described below has been performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201 . In a case where the image capturing apparatus A 101 determines in step S 103 that the processing in steps S 104 to S 106 has not been performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201 (NO in step S 103 ), the processing proceeds to step S 104 .
  • step S 104 the detection processing switching unit A 206 confirms whether application of the processing by the first detection unit A 207 (shielded state detection processing based on background difference) to the target block is determined in step S 102 .
  • step S 104 in a case where the detection processing switching unit A 206 confirms that the processing by the first detection unit A 207 (shielded state detection processing based on background difference) is applied to the target block (YES in step S 104 ), the processing proceeds to step S 106 .
  • step S 106 the detection processing switching unit A 206 requests the first detection unit A 207 to perform the processing on the target block.
  • the first detection unit A 207 detects the shielded state of a partial area corresponding to the target block in the viewing angle of the image capturing unit A 201 by using a background difference based on comparison between a partial image corresponding to the target block and a reference image.
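A minimal sketch of this background-difference comparison, assuming 8-bit grayscale blocks; the thresholds and function names are illustrative and do not come from the patent:

```python
def is_shielded_by_background(block, reference,
                              diff_threshold=30, ratio_threshold=0.7):
    """Background-difference sketch: compare each pixel of the block
    with the corresponding pixel of the reference (background) image;
    if most pixels differ strongly, treat the area as shielded."""
    total = changed = 0
    for row_b, row_r in zip(block, reference):
        for pb, pr in zip(row_b, row_r):
            total += 1
            if abs(pb - pr) > diff_threshold:
                changed += 1
    return changed / total > ratio_threshold

reference = [[100] * 8 for _ in range(8)]  # stored background block
unchanged = [[100] * 8 for _ in range(8)]  # scene as expected
covered = [[0] * 8 for _ in range(8)]      # lens blocked by a dark object
print(is_shielded_by_background(unchanged, reference),
      is_shielded_by_background(covered, reference))  # False True
```

Using a per-pixel ratio rather than a summed difference makes the check robust to a few noisy pixels while still reacting to a large occluded area.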
  • step S 104 in a case where the detection processing switching unit A 206 confirms that the processing by the first detection unit A 207 (shielded state detection processing based on background difference) is not applied to the target block (NO in step S 104 ), the processing proceeds to step S 105 .
  • step S 105 the detection processing switching unit A 206 requests the second detection unit A 208 to perform the processing on the target block.
  • the second detection unit A 208 detects the shielded state of the partial area corresponding to the target block in the viewing angle of the image capturing unit A 201 by using edge power extracted from the partial image corresponding to the target block.
  • the image capturing apparatus A 101 performs the processing in steps S 104 to S 106 on all of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201 , in the above-described manner.
  • In a case where the image capturing apparatus A 101 determines in step S 103 that the processing in steps S 104 to S 106 has already been performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201 (YES in step S 103 ), the processing proceeds to step S 107 .
  • step S 107 the obstruction determination unit A 209 determines whether the image capturing by the image capturing unit A 201 is obstructed based on the number of blocks detected as being shielded among all of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201 . More specifically, the obstruction determination unit A 209 calculates a ratio of the blocks detected as being shielded to all of the blocks, and compares the ratio with a threshold. In a case where the calculated ratio exceeds the threshold, the obstruction determination unit A 209 determines that the image capturing by the image capturing unit A 201 is obstructed.
  • step S 108 the obstruction determination unit A 209 confirms whether it is determined in step S 107 that the image capturing by the image capturing unit A 201 is obstructed.
  • step S 108 in a case where the obstruction determination unit A 209 confirms that the image capturing by the image capturing unit A 201 is obstructed (YES in step S 108 ), the processing proceeds to step S 109 .
  • step S 109 the notification unit A 210 notifies the management apparatus A 105 of detection of the state where the image capturing by the image capturing unit A 201 is obstructed.
  • step S 108 in a case where the obstruction determination unit A 209 confirms that the image capturing by the image capturing unit A 201 is not obstructed (NO in step S 108 ), the series of processing illustrated in FIG. 5 ends. In this case, the processing in step S 109 is not performed.
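Taken together, the flow of steps S 102 to S 109 can be sketched as a per-block dispatch followed by a ratio test. The detector callbacks, the 0.5 threshold, and the stub data below are illustrative assumptions, not values from the patent:

```python
def detect_obstruction(blocks, methods, first_detect, second_detect,
                       ratio_threshold=0.5):
    """Sketch of the FIG. 5 flow: each block is checked with the
    detection processing chosen for it (steps S 102 to S 106), then
    the ratio of shielded blocks is compared with a threshold
    (step S 107).  A True result corresponds to notifying the
    management apparatus (step S 109)."""
    shielded = 0
    for block, method in zip(blocks, methods):
        if method == "background_difference":  # first detection unit A 207
            hit = first_detect(block)
        else:                                  # second detection unit A 208
            hit = second_detect(block)
        shielded += int(hit)
    ratio = shielded / len(blocks)
    return ratio > ratio_threshold

# Usage with stub detectors: 8 of the 12 blocks report shielding,
# so the 0.5 threshold is exceeded.
blocks = list(range(12))
methods = ["background_difference"] * 6 + ["edge_power"] * 6
result = detect_obstruction(blocks, methods,
                            first_detect=lambda b: b < 4,
                            second_detect=lambda b: b >= 8)
print(result)  # True
```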
  • FIG. 6 is a flowchart illustrating a flow of a series of processing.
  • FIGS. 7A to 7H each illustrate an example of a setting screen that presents information to the user and receives designation of various kinds of settings from the user.
  • step S 201 the setting reception unit A 211 transmits a screen that presents, for each of the blocks, the detection result of the shielded state based on the current setting, to the management apparatus A 105 through the communication unit A 204 , and causes the management apparatus A 105 to present the screen.
  • FIG. 7A illustrates an example of the above-described screen presented by the management apparatus A 105 based on the instruction from the setting reception unit A 211 .
  • the image corresponding to the image capturing result of the image capturing unit A 201 is displayed on an upper part of the screen.
  • An area A 401 illustrated on the image indicates an area corresponding to blocks detected as being shielded, based on the current setting.
  • hatching (mask) in a predetermined presentation form is superimposed on the area A 401 to highlight the target blocks.
  • a start button A 410 is a button for receiving an instruction to start setting about the shielded state detection, from the user.
  • An end button A 411 is a button for receiving an instruction to end the setting about the shielded state detection, from the user.
  • a close button A 412 is a button for receiving an instruction to close the setting screen, from the user.
  • Radio buttons A 413 and A 414 are interfaces for receiving selection of a method to detect the shielded state of each of the blocks, from the user. In a case where the radio button A 413 is selected, the shielded state detection processing based on the edge power by the second detection unit A 208 is applied to the target block. In a case where the radio button A 414 is selected, the shielded state detection processing based on the background difference by the first detection unit A 207 is applied to the target block.
  • the end button A 411 is invalid, and the start button A 410 , the close button A 412 , and the radio buttons A 413 and A 414 can receive operation from the user.
  • step S 202 the management apparatus A 105 determines whether an instruction to complete all setting processing has been received from the user.
  • As a specific example, in a case where the close button A 412 is pressed, the management apparatus A 105 may recognize that the instruction to complete all setting processing has been received from the user.
  • In a case where the management apparatus A 105 determines in step S 202 that the instruction to complete all setting processing has been received from the user (YES in step S 202 ), the series of processing illustrated in FIG. 6 ends.
  • step S 203 the management apparatus A 105 determines whether an instruction to start setting about the shielded state detection has been received from the user. As a specific example, in a case where the start button A 410 is pressed, the management apparatus A 105 may recognize that the instruction to start the setting about the shielded state detection has been received from the user.
  • In a case where the management apparatus A 105 determines in step S 203 that the instruction to start the setting about the shielded state detection has not been received (NO in step S 203 ), the processing proceeds to step S 201 , and the series of processing from step S 201 illustrated in FIG. 6 is performed again.
  • step S 204 the management apparatus A 105 determines whether, out of the method based on the edge power and the method based on the background difference, the method based on the edge power has been selected as the method to detect the shielded state.
  • As a specific example, in a case where the radio button A 413 is selected, the management apparatus A 105 may recognize that the method based on the edge power has been selected.
  • In a case where the management apparatus A 105 determines in step S 204 that the method based on the edge power has been selected as the method to detect the shielded state (e.g., in a case where the radio button A 413 is designated) (YES in step S 204 ), the processing proceeds to step S 205 .
  • step S 205 the management apparatus A 105 performs setting processing relating to the shielded state detection by the method based on the edge power. The processing is separately described in detail below with reference to FIG. 8A .
  • In a case where the management apparatus A 105 determines in step S 204 that the method based on the edge power has not been selected as the method to detect the shielded state (e.g., in a case where the radio button A 414 is designated) (NO in step S 204 ), the processing proceeds to step S 206 .
  • step S 206 the management apparatus A 105 performs setting processing relating to the shielded state detection by the method based on the background difference. The processing is separately described in detail below with reference to FIG. 8B .
  • the management apparatus A 105 performs the series of processing illustrated in FIG. 6 in the above-described manner until the management apparatus A 105 determines in step S 202 that the instruction to complete all setting processing has been received from the user.
  • FIG. 8A is a flowchart illustrating a flow of the series of processing.
  • step S 301 the setting reception unit A 211 transmits a screen that presents a detection result of the shielded state of each of the blocks based on the edge power, to the management apparatus A 105 through the communication unit A 204 , and causes the management apparatus A 105 to present the screen.
  • FIG. 7B illustrates an example of the above-described screen presented by the management apparatus A 105 based on the instruction from the setting reception unit A 211 .
  • An area A 402 illustrated on the image corresponding to the image capturing result of the image capturing unit A 201 indicates an area corresponding to blocks set as execution targets of the shielded state detection processing based on the edge power.
  • Further, an area A 403 illustrated on the above-described image indicates an area corresponding to blocks detected as being shielded, based on the edge power.
  • Hatching (mask) in a predetermined presentation form is superimposed on each of the areas A 402 and A 403 to highlight the target blocks.
  • Execution and inexecution of the shielded state detection processing based on the edge power can be selectively switched in response to an instruction designating each of the presented blocks (e.g., a designation operation using a pointing device).
  • In step S 302, the management apparatus A 105 determines whether an instruction to designate a block has been received from the user through the above-described screen.
  • In step S 303, the management apparatus A 105 requests the image capturing apparatus A 101 that has captured the image displayed on the screen to switch execution and inexecution of the shielded state detection processing based on the edge power on the designated block.
  • The setting reception unit A 211 of the image capturing apparatus A 101 instructs the detection processing switching unit A 206 to switch execution and inexecution of the shielded state detection processing based on the edge power on the block designated by the user, in response to the request from the management apparatus A 105.
  • The detection processing switching unit A 206 switches execution and inexecution of the shielded state detection processing based on the edge power on the target block, in response to the instruction from the setting reception unit A 211.
  • In a case where the shielded state detection processing based on the edge power on the target block is switched to inexecution, the shielded state detection processing based on the background difference is performed on the target block.
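  • The per-block switching behavior described above (each block runs exactly one of the two methods, so switching the edge-power processing to inexecution on a block means the background-difference processing runs there instead) can be sketched as follows. This is a minimal illustration; the class and attribute names are assumptions, not identifiers from the disclosure.

```python
EDGE_POWER = "edge_power"
BACKGROUND_DIFF = "background_difference"

class DetectionProcessingSwitcher:
    """Sketch of the switching unit: holds the detection method per block."""

    def __init__(self, num_blocks):
        # By default, every block uses the edge-power method.
        self.method = {b: EDGE_POWER for b in range(num_blocks)}

    def toggle(self, block):
        """Switch the designated block between the two detection methods."""
        self.method[block] = (
            BACKGROUND_DIFF if self.method[block] == EDGE_POWER else EDGE_POWER
        )

switcher = DetectionProcessingSwitcher(num_blocks=12)
switcher.toggle(5)            # user designates block 5 on the setting screen
print(switcher.method[5])     # background_difference
print(switcher.method[0])     # edge_power
```

Because the two methods are mutually exclusive per block, a single toggle is enough to express both "switch to inexecution of edge-power detection" and "switch to execution of background-difference detection".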
  • FIG. 7C illustrates an example of a screen presented based on a switching result of execution and inexecution of the shielded state detection processing based on the edge power on the block designated by the user.
  • An area A 404 illustrated on the image corresponding to the image capturing result of the image capturing unit A 201 indicates an area corresponding to blocks set as execution targets of the shielded state detection processing based on the edge power.
  • In a case where the management apparatus A 105 determines in step S 302 that the instruction to designate a block has not been received from the user (NO in step S 302), the processing proceeds to step S 304 without the processing in step S 303 being performed.
  • In step S 304, the management apparatus A 105 determines whether an instruction to end the setting about the shielded state detection has been received from the user. As a specific example, in a case where the end button A 411 is pressed, the management apparatus A 105 may recognize that the instruction to end the setting about the shielded state detection has been received from the user.
  • In a case where the management apparatus A 105 determines in step S 304 that the instruction to end the setting about the shielded state detection has been received from the user (YES in step S 304), the series of processing illustrated in FIG. 8A ends.
  • FIG. 7D illustrates an example of a screen presented after the series of processing illustrated in FIG. 8A is completed. As presented in the screen, the erroneous detection that occurred at the timing when the screen illustrated in FIG. 7B was presented has been eliminated at the timing when the screen illustrated in FIG. 7D is presented.
  • In a case where the management apparatus A 105 determines in step S 304 that the instruction to end the setting about the shielded state detection has not been received from the user (NO in step S 304), the processing returns to step S 301.
  • The management apparatus A 105 performs the series of processing illustrated in FIG. 8A in the above-described manner until the management apparatus A 105 determines in step S 304 that the instruction to end the setting about the shielded state detection has been received from the user.
  • FIG. 8B is a flowchart illustrating a flow of the series of setting processing in step S 206, which relates to the shielded state detection by the method based on the background difference.
  • FIG. 7E illustrates an example of a screen that receives, from the user, the instruction to switch execution and inexecution of the shielded state detection processing based on the background difference on each of the blocks. In the example illustrated in FIG. 7E, the shielded state detection processing based on the background difference on each of the blocks is set to inexecution.
  • FIG. 7F illustrates another example of the screen that receives, from the user, the instruction to switch execution and inexecution of the shielded state detection processing based on the background difference on each of the blocks. In the example illustrated in FIG. 7F, the shielded state detection processing based on the background difference on each of the blocks is set to execution.
  • In step S 305, the setting reception unit A 211 transmits a screen that presents a detection result of the shielded state of each of the blocks based on the background difference, to the management apparatus A 105 through the communication unit A 204, and causes the management apparatus A 105 to present the screen.
  • An area A 405 illustrated on the image corresponding to the image capturing result of the image capturing unit A 201 indicates an area corresponding to blocks set as execution targets of the shielded state detection processing based on the background difference.
  • Further, an area A 406 illustrated on the above-described image indicates an area corresponding to blocks detected as being shielded, based on the background difference.
  • Hatching (mask) in a predetermined presentation form is superimposed on each of the areas A 405 and A 406 to highlight the target blocks.
  • Execution and inexecution of the shielded state detection processing based on the background difference can be selectively switched by an instruction designating each of the presented blocks (e.g., a designation operation using a pointing device).
  • In step S 306, the management apparatus A 105 determines whether an instruction to designate a block has been received from the user through the above-described screen.
  • In a case where the management apparatus A 105 determines in step S 306 that the instruction has been received (YES in step S 306), the processing proceeds to step S 307.
  • In step S 307, the management apparatus A 105 requests the image capturing apparatus A 101 that has captured the image displayed on the screen to switch execution and inexecution of the shielded state detection processing based on the background difference on the designated block.
  • The setting reception unit A 211 of the image capturing apparatus A 101 instructs the detection processing switching unit A 206 to switch execution and inexecution of the shielded state detection processing based on the background difference on the block designated by the user, in response to the request from the management apparatus A 105.
  • The detection processing switching unit A 206 switches execution and inexecution of the shielded state detection processing based on the background difference on the target block, in response to the instruction from the setting reception unit A 211.
  • In a case where the shielded state detection processing based on the background difference on the target block is switched to inexecution, the shielded state detection processing based on the edge power is performed on the target block.
  • FIG. 7G illustrates an example of a screen presented based on a switching result of execution and inexecution of the shielded state detection processing based on the background difference on the block designated by the user.
  • An area A 407 illustrated on the image corresponding to the image capturing result of the image capturing unit A 201 indicates an area corresponding to blocks set as execution targets of the shielded state detection processing based on the background difference.
  • In a case where the management apparatus A 105 determines in step S 306 that the instruction to designate a block has not been received from the user (NO in step S 306), the processing proceeds to step S 308 without the processing in step S 307 being performed.
  • In step S 308, the management apparatus A 105 determines whether an instruction to end the setting about the shielded state detection has been received from the user. As a specific example, in a case where the end button A 411 is pressed, the management apparatus A 105 may recognize that the instruction to end the setting about the shielded state detection has been received from the user.
  • In a case where the management apparatus A 105 determines in step S 308 that the instruction to end the setting about the shielded state detection has been received from the user (YES in step S 308), the series of processing illustrated in FIG. 8B ends.
  • FIG. 7H illustrates an example of a screen presented after the series of processing illustrated in FIG. 8B is completed. As presented in the screen, the erroneous detection that occurred at the timing when the screen illustrated in FIG. 7E was presented has been eliminated at the timing when the screen illustrated in FIG. 7H is presented.
  • In a case where the management apparatus A 105 determines in step S 308 that the instruction to end the setting about the shielded state detection has not been received from the user (NO in step S 308), the processing returns to step S 305.
  • The management apparatus A 105 performs the series of processing illustrated in FIG. 8B in the above-described manner until the management apparatus A 105 determines in step S 308 that the instruction to end the setting about the shielded state detection has been received from the user.
  • Applying the above-described processing makes it possible to selectively switch the processing applied to the determination of whether each of the blocks is shielded, between the processing based on the feature amount (e.g., edge power) and the processing based on the background difference, depending on the situation of the time.
  • Such a mechanism makes it possible to improve detection accuracy of the state where the image capturing by the image capturing unit A 201 is obstructed, depending on the situation of the time (e.g., scene to be monitored).
  • In the example described above, the processing to be applied to each of the blocks is manually set by the user operation.
  • In the following, an example of a mechanism is described in which the image capturing apparatus A 101 automatically sets the processing to be applied to each of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A 201, by using a detection result of the shielded state of each of the blocks.
  • An example of a functional configuration of the image capturing apparatus A 101 according to the present modification is described with reference to FIG. 9.
  • The image capturing apparatus A 101 according to the present modification differs from the example illustrated in FIG. 3 in that the image capturing apparatus A 101 includes a detection processing determination unit A 212, and the detection processing switching unit A 206 switches the processing to be applied to the target block based on an instruction from the detection processing determination unit A 212.
  • In FIG. 9, reference numerals similar to the reference numerals in FIG. 3 indicate components similar to the components denoted by those reference numerals in FIG. 3.
  • In the following, the functional configuration of the image capturing apparatus A 101 according to the present modification is described while focusing on differences from the example illustrated in FIG. 3.
  • The detection processing determination unit A 212 receives feedback of the detection result of the shielded state of the block based on the background difference by the first detection unit A 207 and feedback of the detection result of the shielded state of the block based on the feature amount (e.g., edge power) by the second detection unit A 208.
  • The detection processing determination unit A 212 then determines whether to apply the detection processing based on the background difference or the detection processing based on the feature amount to the target block, based on the feedback (i.e., the detection results described above).
  • FIG. 10 illustrates an example of an algorithm for the detection processing determination unit A 212 to determine the detection processing to be applied to the target block. In this example, the second detection unit A 208 uses the edge power as the feature amount for detecting the shielded state of each of the blocks.
  • the detection processing determination unit A 212 determines the detection processing to be applied to each of the blocks based on whether the background difference acquired for each of the blocks is larger than or smaller than a threshold, and whether the feature amount extracted from each of the blocks is larger than or smaller than a threshold. More specifically, the detection processing determination unit A 212 basically determines the processing based on the edge power as the applied processing, and in a case where the edge power is smaller than the threshold and the background difference is smaller than the threshold, the detection processing determination unit A 212 determines the detection processing based on the background difference as the applied processing. Further, the detection processing determination unit A 212 controls the detection processing switching unit A 206 to switch the processing to be applied to the target block, based on the determination result of the detection processing applied to the target block.
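  • The decision rule described above can be expressed as a short sketch: the edge-power processing is the default, and the background-difference processing is chosen only when both quantities fall below their thresholds. The function name and the concrete threshold values are illustrative assumptions, not values from the disclosure.

```python
def choose_detection_processing(edge_power, background_diff,
                                edge_threshold, diff_threshold):
    """Sketch of the FIG. 10 style decision rule: default to the
    edge-power method, and fall back to the background-difference
    method only when the block has both low edge power and a low
    background difference (a flat, static region where edges carry
    little information)."""
    if edge_power < edge_threshold and background_diff < diff_threshold:
        return "background_difference"
    return "edge_power"

# A textured block keeps the default edge-power processing.
print(choose_detection_processing(120.0, 3.0, 50.0, 10.0))   # edge_power
# A flat, static block switches to the background-difference processing.
print(choose_detection_processing(8.0, 3.0, 50.0, 10.0))     # background_difference
```

Evaluating this rule per block and feeding the result to the switching unit reproduces the automatic switching behavior described above.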
  • Applying the above-described control makes it possible to automatically and selectively switch the processing to be applied for determination whether each of the blocks is shielded, between the processing based on the feature amount (e.g., edge power) and the processing based on the background difference, depending on the situation of the time.
  • Such a mechanism makes it possible to improve the detection accuracy of the state where the image capturing by the image capturing unit A 201 is obstructed, depending on the situation of the time (e.g., scene to be monitored).
  • the present disclosure can be realized by supplying programs implementing one or more functions of the above-described exemplary embodiment to a system or an apparatus through a network or a recording medium, and causing one or more processors of a computer in the system or the apparatus to read out and execute the programs. Further, the present disclosure can be realized by a circuit (e.g., application specific integrated circuit (ASIC)) implementing one or more functions of the above-described exemplary embodiment.
  • the configurations described with reference to FIG. 3 and FIG. 9 are merely examples, and are not intended to limit the functional configuration of the image capturing apparatus A 101 according to the present modification.
  • For example, some of the components may be provided outside the image capturing apparatus A 101.
  • As a specific example, the components A 205 to A 211 relating to detection of the state where the image capturing by the image capturing unit A 201 is obstructed may be provided outside the image capturing apparatus A 101.
  • In this case, an apparatus including the components A 205 to A 211 relating to detection of the state where the image capturing by the image capturing unit A 201 is obstructed corresponds to an example of the “information processing apparatus” according to the present exemplary embodiment.
  • Further, a load of the processing by at least some of the components may be distributed to a plurality of apparatuses.
  • Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.


Abstract

An information processing apparatus determining whether image capturing by an image capturing apparatus is obstructed, the information processing apparatus comprising: a dividing unit configured to divide an input image captured by the image capturing apparatus into a plurality of blocks; a processing determination unit configured to determine whether to perform first detection processing using a reference image corresponding to the image capturing apparatus or second detection processing using a feature of the input image, on each of the blocks; and an obstruction determination unit configured to determine whether the image capturing by the image capturing apparatus is obstructed, based on a detection result of each of the blocks by the first detection processing or the second detection processing.

Description

    BACKGROUND
    Field of the Disclosure
  • The present disclosure relates to an information processing apparatus, an information processing method, and a storage medium.
  • Description of the Related Art
  • Japanese Patent No. 5235718 discusses a technique that performs image analysis on a captured image to extract a feature amount of the image, and detects, based on a change of the feature amount, an action (camera tampering attempts) obstructing image capturing.
  • SUMMARY
  • The present disclosure is directed to a technique capable of detecting a state where image capturing is obstructed depending on a situation.
  • According to an aspect of the present disclosure, an information processing apparatus determining whether image capturing by an image capturing apparatus is obstructed, includes a dividing unit configured to divide an input image captured by the image capturing apparatus into a plurality of blocks, a processing determination unit configured to determine whether to perform first detection processing using a reference image corresponding to the image capturing apparatus or second detection processing using a feature amount of the input image, on each of the blocks, and an obstruction determination unit configured to determine whether the image capturing by the image capturing apparatus is obstructed, based on a detection result of each of the blocks by the first detection processing or the second detection processing.
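  • As a rough illustration of the two kinds of detection processing named above, the following sketch assumes that the first detection processing compares each block against the corresponding region of a reference (background) image, that the second detection processing measures the edge power of the block itself, and that the image capturing is declared obstructed when the fraction of shielded blocks exceeds a ratio. The concrete formulas, thresholds, and the aggregation rule are assumptions for illustration, not taken from the disclosure.

```python
import numpy as np

def first_detection(block, reference_block, diff_threshold=25.0):
    """First detection processing (sketch): a large mean absolute
    difference from the reference image suggests the block is shielded."""
    diff = np.abs(block.astype(float) - reference_block.astype(float))
    return float(np.mean(diff)) > diff_threshold

def second_detection(block, edge_threshold=5.0):
    """Second detection processing (sketch): shielding (e.g., a hand over
    the lens) tends to flatten the image, so low edge power suggests the
    block is shielded."""
    gy, gx = np.gradient(block.astype(float))
    edge_power = float(np.mean(gx ** 2 + gy ** 2))
    return edge_power < edge_threshold

def is_capture_obstructed(blocks, reference_blocks, use_first, ratio=0.5):
    """Declare the image capturing obstructed when the fraction of blocks
    detected as shielded exceeds `ratio` (an assumed aggregation rule).
    `use_first[i]` selects which detection processing block i uses."""
    shielded = [
        first_detection(b, r) if use_first[i] else second_detection(b)
        for i, (b, r) in enumerate(zip(blocks, reference_blocks))
    ]
    return sum(shielded) / len(shielded) > ratio
```

In this sketch, `use_first` plays the role of the per-block determination made by the processing determination unit; either detection path yields a boolean "shielded" result per block, which the obstruction determination then aggregates.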
  • Further features of the present disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating an example of a configuration of a system.
  • FIG. 2 is a block diagram illustrating an example of a hardware configuration of an information processing apparatus.
  • FIG. 3 is a block diagram illustrating an example of a functional configuration of an image capturing apparatus.
  • FIG. 4 is a diagram illustrating an example of a method of dividing an image into a plurality of blocks.
  • FIG. 5 is a flowchart illustrating an example of processing performed by the image capturing apparatus.
  • FIG. 6 is a flowchart illustrating an example of processing performed by the image capturing apparatus.
  • FIGS. 7A to 7H are diagrams each illustrating an example of a setting screen for receiving an instruction from a user.
  • FIGS. 8A and 8B are flowcharts each illustrating an example of processing by the image capturing apparatus.
  • FIG. 9 is a block diagram illustrating another example of the functional configuration of the image capturing apparatus.
  • FIG. 10 is a diagram illustrating an example of an algorithm to determine detection processing.
  • DESCRIPTION OF THE EMBODIMENTS
  • An exemplary embodiment of the present disclosure is described in detail below with reference to the accompanying drawings. In the present specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and repetitive descriptions of the components are omitted.
  • <System Configuration>
  • An example of a configuration of a system according to an exemplary embodiment of the present disclosure is described with reference to FIG. 1. A system 1 according to the present exemplary embodiment includes a plurality of image capturing apparatuses A101-1 to A101-3 and a management apparatus A105. Each of the image capturing apparatuses A101-1 to A101-3 and the management apparatus A105 are connected so as to transmit and receive information and data to and from each other through a predetermined network A103.
  • A type of the network A103 is not particularly limited as long as the network A103 can connect each of the image capturing apparatuses A101-1 to A101-3 with the management apparatus A105. Specific examples of the network A103 include the Internet, a local area network (LAN), a wide area network (WAN), a public line (e.g., telephone line or mobile communication line). Further, other examples of the network A103 include a dedicated line, an asynchronous transfer mode (ATM) line, a frame relay line, a cable television line, and a data broadcasting wireless communication line. Further, the network A103 may be a wireless network or a wired network. In addition, the network A103 may include a plurality of different types of networks. As a specific example, communication between each of the image capturing apparatuses A101-1 to A101-3 and the management apparatus A105 may be relayed by a communication apparatus. In this case, the different types of networks may be applied to the communication between the communication apparatus and each of the image capturing apparatuses A101-1 to A101-3, and the communication between the communication apparatus and the management apparatus A105.
  • Each of the image capturing apparatuses A101-1 to A101-3 has a detection function to detect an action (e.g., camera tampering attempts) that shields at least a part of a viewing angle to obstruct the image capturing. In the example illustrated in FIG. 1, each of the image capturing apparatuses A101-1 to A101-3 is used as a monitoring camera. In the following description, in a case where the image capturing apparatuses A101-1 to A101-3 are not particularly distinguished from one another, each of the image capturing apparatuses A101-1 to A101-3 is also referred to as an “image capturing apparatus A101”.
  • The management apparatus A105 is an information processing apparatus that is used for monitoring operation based on images corresponding to image capturing results of the respective image capturing apparatuses A101-1 to A101-3. The management apparatus A105 has functions of, for example, presentation of the image corresponding to the image capturing result of each image capturing apparatus A101, control of the above-described detection function of each image capturing apparatus A101, and reception of notification (e.g., alert) from each image capturing apparatus A101. The management apparatus A105 can be realized by, for example, a personal computer (PC).
  • The management apparatus A105 includes, for example, a main body performing various kinds of calculations, an output device (e.g., display) presenting information to the user, and an input device (e.g., keyboard and pointing device) receiving an instruction from the user. The management apparatus A105 may receive, from the user, an instruction about setting of each image capturing apparatus A101 through a user interface such as a web browser, and may update setting of the target image capturing apparatus A101 based on the instruction. Further, the management apparatus A105 may receive the image (e.g., moving image or still image) corresponding to the image capturing result from each image capturing apparatus A101, and may present the image to the user through the output device or record the image. Furthermore, the management apparatus A105 may receive notification of an alert and the like from each image capturing apparatus A101, and present information corresponding to the notification to the user through the output device. The various kinds of functions described above may be implemented by, for example, applications installed in the management apparatus A105.
  • <Hardware Configuration>
  • An example of a hardware configuration of an information processing apparatus 100 adoptable as parts relating to execution of the various kinds of calculations of the image capturing apparatus A101 and as the management apparatus A105 is described with reference to FIG. 2.
  • The information processing apparatus 100 includes a central processing unit (CPU) 101, a read only memory (ROM) 102, and a random access memory (RAM) 103. The information processing apparatus 100 further includes an auxiliary storage device 104 and a communication interface (I/F) 107. The information processing apparatus 100 may include at least any of an output device 105 and an input device 106. The CPU 101, the ROM 102, the RAM 103, the auxiliary storage device 104, the output device 105, the input device 106, and the communication I/F 107 are connected to one another through a bus 108.
  • The CPU 101 controls various kinds of operation of the information processing apparatus 100. For example, the CPU 101 may control operation of the entire information processing apparatus 100. The ROM 102 stores control programs, a boot program, and other programs executable by the CPU 101. The RAM 103 is a main storage memory of the CPU 101, and is used as a work area or a temporary storage area for loading various kinds of programs.
  • The auxiliary storage device 104 stores various kinds of data and various kinds of programs. The auxiliary storage device 104 is implemented by a storage device temporarily or persistently storing various kinds of data, such as a nonvolatile memory represented by a hard disk drive (HDD) and a solid state drive (SSD).
  • The output device 105 is a device outputting various kinds of information, and is used for presentation of the various kinds of information to the user. For example, the output device 105 is implemented by a display device such as a display. In this case, the output device 105 presents the information to the user by displaying various kinds of display information. As another example, the output device 105 may be implemented by a sound output device outputting sound such as voice and electronic sound. In this case, the output device 105 presents the information to the user by outputting sound such as voice and electronic sound. The device adopted as the output device 105 may be appropriately changed depending on a medium used for presentation of information to the user.
  • The input device 106 is used to receive various kinds of instructions from the user. The input device 106 can be implemented by, for example, a mouse, a keyboard, and a touch panel. Further, as another example, the input device 106 may include a sound collection device such as a microphone, and may collect voice uttered by the user. In this case, when various kinds of analysis processing such as acoustic analysis and natural language processing is performed on the collected voice, contents represented by the voice are recognized as the instruction from the user. Further, a device adopted as the input device 106 may be appropriately changed depending on a method of recognizing the instruction from the user. In addition, a plurality of types of devices may be adopted as the input device 106.
  • The communication I/F 107 is used for communication with an external apparatus through the network. A device adopted as the communication I/F 107 may be appropriately changed depending on a type of a communication path and an adopted communication system.
  • When the CPU 101 loads programs stored in the ROM 102 or the auxiliary storage device 104 into the RAM 103 and executes the programs, the functional configurations illustrated in FIG. 3 and FIG. 9 and the processing illustrated in FIG. 5, FIG. 6, FIGS. 7A to 7H, and FIGS. 8A and 8B are implemented.
  • <Functional Configuration>
  • An example of a functional configuration of the image capturing apparatus A101 according to the present exemplary embodiment is described with reference to FIG. 3. The image capturing apparatus A101 includes an image capturing unit A201, a compression unit A202, a format conversion unit A203, and a communication unit A204. The image capturing apparatus A101 further includes a block dividing unit A205, a detection processing switching unit A206, a first detection unit A207, a second detection unit A208, an obstruction determination unit A209, a notification unit A210, and a setting reception unit A211.
  • The image capturing unit A201 guides light of an object incident through an optical system such as a lens, to an image capturing device, photoelectrically converts the light into an electric signal by the image capturing device, and generates image data based on the electric signal.
  • The compression unit A202 applies encoding processing, compression processing, and other processing on the image data output from the image capturing unit A201, to reduce a data amount of the image data.
  • The format conversion unit A203 converts the image data, the data amount of which has been reduced by compression, into other image data of a predetermined format. As a specific example, the format conversion unit A203 may convert the target image data into image data of a format more suitable for transmission through the network.
  • The format conversion unit A203 outputs the format-converted image data to a predetermined output destination. As a specific example, the format conversion unit A203 may output the format-converted image data to the communication unit A204 to transmit the image data to the other apparatus (e.g., management apparatus A105) through the network.
  • The communication unit A204 transmits and receives information and data to and from the other apparatus through a predetermined network. For example, the communication unit A204 receives information corresponding to an instruction about various kinds of settings received by the management apparatus A105 from the user. In addition, the communication unit A204 transmits an image corresponding to the image capturing result of the image capturing unit A201 and notifies the management apparatus A105 of various kinds of notification information (e.g., alert information).
  • The block dividing unit A205 divides the image of the image data output from the image capturing unit A201 (i.e., image corresponding to image capturing result of image capturing unit A201) into a plurality of blocks. As a specific example, the block dividing unit A205 may divide the image corresponding to the image capturing result of the image capturing unit A201, into a plurality of blocks each having a rectangular shape.
  • For example, FIG. 4 illustrates an example of a method of dividing the image into the plurality of blocks. In the example illustrated in FIG. 4, the block dividing unit A205 divides the entire image (i.e., entire viewing angle of image capturing unit A201) into 12 blocks each having a uniform size by dividing the entire image into four blocks in a vertical direction and into three blocks in a lateral direction. Further, in the example illustrated in FIG. 4, reference numerals A301 to A312 are added to the blocks in order from an upper-left block to a lower-right block, for convenience.
  • Note that the example illustrated in FIG. 4 is illustrative, and does not limit the method of dividing the image. As a specific example, the image may be divided into a plurality of blocks in such a manner that an area positioned at a center of the image has a size smaller than an area positioned at an end part of the image.
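  • The grid division described above can be sketched as follows. This is a minimal illustration, not the embodiment's implementation; the function name `divide_into_blocks` and the use of NumPy are assumptions, and edge blocks simply absorb any remainder when the image size is not an exact multiple of the grid.

```python
import numpy as np

def divide_into_blocks(image, rows=4, cols=3):
    """Divide an image (H x W [x C] array) into rows * cols blocks.

    Boundaries are rounded so that every pixel belongs to exactly one
    block even when the image size is not divisible by the grid.
    """
    h, w = image.shape[:2]
    # Row/column boundaries, e.g. h=480, rows=4 -> [0, 120, 240, 360, 480]
    ys = [round(i * h / rows) for i in range(rows + 1)]
    xs = [round(j * w / cols) for j in range(cols + 1)]
    blocks = []
    for i in range(rows):
        for j in range(cols):
            blocks.append(image[ys[i]:ys[i + 1], xs[j]:xs[j + 1]])
    # Ordered from the upper-left block (A301) to the lower-right (A312)
    return blocks
```

With a 4-by-3 grid as in FIG. 4, a 480-by-360 frame yields 12 blocks of 120 by 120 pixels each.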
  • FIG. 3 is referred to again.
  • The detection processing switching unit A206 selectively switches, based on a predetermined condition, whether to apply processing by the first detection unit A207 described below or processing by the second detection unit A208 described below to each of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201. As a specific example, the detection processing switching unit A206 may acquire, from the setting reception unit A211 described below, the information corresponding to the instruction received by the management apparatus A105 from the user, and may determine processing to be applied to each of the blocks based on the information.
  • The first detection unit A207 detects occurrence of a state where a partial area corresponding to an input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A201 is shielded, based on a difference between the input image and a reference image.
  • The second detection unit A208 detects occurrence of the state where the partial area corresponding to the input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A201 is shielded, based on a feature amount representing a predetermined image feature extracted from the input image. As a specific example, the second detection unit A208 may extract edge power as the above-described feature amount by applying a Sobel filter to the input image. In this case, the second detection unit A208 may detect occurrence of the state where the partial area corresponding to the input image in the viewing angle of the image capturing unit A201 is shielded, based on uniformity of the input image corresponding to the extracted edge power.
  • In the following description, the state where the partial area corresponding to the input image (e.g., partial image corresponding to each of blocks) in the viewing angle of the image capturing unit A201 is shielded is also referred to as a “shielded state”, for convenience.
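  • The two detection approaches can be sketched as follows, assuming grayscale or color NumPy arrays. The function names, the mean-absolute-difference criterion for the background difference, and the threshold values are illustrative assumptions; the Sobel gradients follow the standard 3-by-3 kernels, and low mean edge power is treated as the uniformity that indicates shielding.

```python
import numpy as np

def detect_shielded_by_background_difference(block, reference, diff_threshold=40.0):
    """First detection unit (A207) sketch: flag the block as shielded when
    the mean absolute difference from the reference image is large."""
    diff = np.abs(block.astype(np.float64) - reference.astype(np.float64))
    return diff.mean() > diff_threshold

def detect_shielded_by_edge_power(block, power_threshold=5.0):
    """Second detection unit (A208) sketch: apply Sobel filters and flag
    the block as shielded when it is nearly uniform (low edge power)."""
    g = block.astype(np.float64)
    if g.ndim == 3:                       # collapse color channels
        g = g.mean(axis=2)
    # Sobel gradients over the interior, via shifted array differences
    gx = (g[:-2, 2:] + 2 * g[1:-1, 2:] + g[2:, 2:]
          - g[:-2, :-2] - 2 * g[1:-1, :-2] - g[2:, :-2])
    gy = (g[2:, :-2] + 2 * g[2:, 1:-1] + g[2:, 2:]
          - g[:-2, :-2] - 2 * g[:-2, 1:-1] - g[:-2, 2:])
    edge_power = np.sqrt(gx ** 2 + gy ** 2).mean()
    return edge_power < power_threshold
```

A uniformly dark block (e.g., a covered lens) yields near-zero edge power and is flagged by the second method, while the first method flags it only if it also differs from the stored reference.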
  • The obstruction determination unit A209 determines whether the image capturing by the image capturing unit A201 is obstructed, based on a detection result of the shielded state of each of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201, detected by the first detection unit A207 or the second detection unit A208. As a specific example, the obstruction determination unit A209 may determine whether the image capturing by the image capturing unit A201 is obstructed, based on a ratio of the blocks detected as being shielded to the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201.
  • The notification unit A210 notifies a predetermined notification destination (e.g., management apparatus A105 illustrated in FIG. 1) of information corresponding to the determination result of the obstruction determination unit A209. As a specific example, in a case where it is determined that the image capturing by the image capturing unit A201 is obstructed, the notification unit A210 may notify the management apparatus A105 of information notifying alert (hereinafter, also referred to as alert information).
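  • The ratio-based determination by the obstruction determination unit A209 can be sketched as follows; the 50% threshold and the function name are assumptions for illustration.

```python
def is_image_capturing_obstructed(shielded_flags, ratio_threshold=0.5):
    """Declare obstruction when the ratio of blocks detected as shielded
    to all blocks exceeds a threshold (A209 sketch)."""
    if not shielded_flags:
        return False
    ratio = sum(bool(f) for f in shielded_flags) / len(shielded_flags)
    return ratio > ratio_threshold
```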
  • The setting reception unit A211 receives, from the management apparatus A105, an instruction about various kinds of settings received by the management apparatus A105 from the user, and controls various kinds of settings for operation of the image capturing apparatus A101 in response to the instruction. As a specific example, the setting reception unit A211 may control the detection processing switching unit A206 to switch the shielded state detection processing to be applied to each of the blocks, in response to the instruction from the user received from the management apparatus A105.
  • Further, the setting reception unit A211 may transmit, to the management apparatus A105, information to present a user interface (UI) for receiving instructions about control of the various kinds of settings from the user (e.g., setting screen), thereby causing the management apparatus A105 to present the UI. Further, the setting reception unit A211 may control the various kinds of settings for operation of the image capturing apparatus A101 (e.g., setting about switching condition of detection processing switching unit A206), in response to the instruction received by the management apparatus A105 from the user through the above-described UI.
  • <Processing>
  • An example of processing by the image capturing apparatus A101 according to the present exemplary embodiment is described with reference to FIG. 5 while particularly focusing on processing to detect obstruction of the image capturing by the image capturing unit A201. In the example illustrated in FIG. 5, the second detection unit A208 uses edge power as the feature amount extracted from a block to detect the shielded state of the target block.
  • In step S101, the block dividing unit A205 divides the image corresponding to the image capturing result of the image capturing unit A201, into a predetermined number of blocks.
  • In step S102, the detection processing switching unit A206 determines whether to apply the processing by the first detection unit A207 or the processing by the second detection unit A208 to each of the blocks, based on the user instruction notified from the setting reception unit A211. The processing in step S102 is separately described in detail below.
  • In step S103, the image capturing apparatus A101 determines whether processing in steps S104 to S106 described below has been performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201. In a case where the image capturing apparatus A101 determines in step S103 that the processing in steps S104 to S106 has not been performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201 (NO in step S103), the processing proceeds to step S104.
  • In step S104, the detection processing switching unit A206 confirms whether application of the processing by the first detection unit A207 (shielded state detection processing based on background difference) to the target block is determined in step S102.
  • In step S104, in a case where the detection processing switching unit A206 confirms that the processing by the first detection unit A207 (shielded state detection processing based on background difference) is applied to the target block (YES in step S104), the processing proceeds to step S106. In step S106, the detection processing switching unit A206 requests the first detection unit A207 to perform the processing on the target block. The first detection unit A207 detects the shielded state of a partial area corresponding to the target block in the viewing angle of the image capturing unit A201 by using a background difference based on comparison between a partial image corresponding to the target block and a reference image.
  • On the other hand, in step S104, in a case where the detection processing switching unit A206 confirms that the processing by the first detection unit A207 (shielded state detection processing based on background difference) is not applied to the target block (NO in step S104), the processing proceeds to step S105. In step S105, the detection processing switching unit A206 requests the second detection unit A208 to perform the processing on the target block. The second detection unit A208 detects the shielded state of the partial area corresponding to the target block in the viewing angle of the image capturing unit A201 by using edge power extracted from the partial image corresponding to the target block.
  • The image capturing apparatus A101 performs the processing in steps S104 to S106 on all of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201, in the above-described manner.
  • In a case where the image capturing apparatus A101 determines in step S103 that the processing in steps S104 to S106 has been already performed on all of the plurality of blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201 (YES in step S103), the processing proceeds to step S107.
  • In step S107, the obstruction determination unit A209 determines whether the image capturing by the image capturing unit A201 is obstructed based on the number of blocks detected as being shielded among all of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201. More specifically, the obstruction determination unit A209 calculates a ratio of the blocks detected as being shielded to all of the blocks, and compares the ratio with a threshold. In a case where the calculated ratio exceeds the threshold, the obstruction determination unit A209 determines that the image capturing by the image capturing unit A201 is obstructed.
  • In step S108, the obstruction determination unit A209 confirms whether it is determined in step S107 that the image capturing by the image capturing unit A201 is obstructed.
  • In step S108, in a case where the obstruction determination unit A209 confirms that the image capturing by the image capturing unit A201 is obstructed (YES in step S108), the processing proceeds to step S109. In step S109, the notification unit A210 notifies the management apparatus A105 of detection of the state where the image capturing by the image capturing unit A201 is obstructed.
  • On the other hand, in step S108, in a case where the obstruction determination unit A209 confirms that the image capturing by the image capturing unit A201 is not obstructed (NO in step S108), the series of processing illustrated in FIG. 5 ends. In this case, the processing in step S109 is not performed.
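  • The flow of steps S101 to S109 can be sketched as the following per-frame loop, assuming hypothetical helpers `detect_bg(block, reference)` and `detect_edge(block)` that stand in for the first detection unit A207 and the second detection unit A208.

```python
def detect_obstruction_per_frame(blocks, references, use_background_difference,
                                 detect_bg, detect_edge, ratio_threshold=0.5):
    """Sketch of steps S101-S109: run the per-block detector selected in
    step S102, then compare the shielded-block ratio with a threshold.
    The helper names and the 50% threshold are illustrative assumptions.
    """
    shielded = []
    for i, block in enumerate(blocks):          # steps S103 to S106
        if use_background_difference[i]:        # step S104 -> step S106
            shielded.append(detect_bg(block, references[i]))
        else:                                   # step S104 -> step S105
            shielded.append(detect_edge(block))
    ratio = sum(map(bool, shielded)) / len(shielded)   # step S107
    obstructed = ratio > ratio_threshold
    if obstructed:                              # steps S108, S109
        print("alert: image capturing is obstructed")
    return obstructed
```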
  • Next, an example of the processing by the detection processing switching unit A206 to determine whether to apply the processing by the first detection unit A207 or the processing by the second detection unit A208 to the target block, illustrated in step S102 of FIG. 5 is described with reference to FIG. 6 and FIGS. 7A to 7H. FIG. 6 is a flowchart illustrating a flow of a series of processing. FIGS. 7A to 7H each illustrate an example of a setting screen that presents information to the user and receives designation of various kinds of settings from the user.
  • In step S201, the setting reception unit A211 transmits, to the management apparatus A105 through the communication unit A204, a screen that presents the detection result of the shielded state of each of the blocks based on the setting currently applied to each of the blocks, and causes the management apparatus A105 to present the screen.
  • FIG. 7A illustrates an example of the above-described screen presented by the management apparatus A105 based on the instruction from the setting reception unit A211. The image corresponding to the image capturing result of the image capturing unit A201 is displayed on an upper part of the screen. An area A401 illustrated on the image indicates an area corresponding to blocks detected as being shielded, based on the current setting. In the screen illustrated in FIG. 7A, hatching (mask) in a predetermined presentation form is superimposed on the area A401 to highlight the target blocks.
  • A start button A410 is a button for receiving an instruction to start setting about the shielded state detection, from the user. An end button A411 is a button for receiving an instruction to end the setting about the shielded state detection, from the user. A close button A412 is a button for receiving an instruction to close the setting screen, from the user. Radio buttons A413 and A414 are interfaces for receiving selection of a method to detect the shielded state of each of the blocks, from the user. In a case where the radio button A413 is selected, the shielded state detection processing based on the edge power by the second detection unit A208 is applied to the target block. In a case where the radio button A414 is selected, the shielded state detection processing based on the background difference by the first detection unit A207 is applied to the target block.
  • As illustrated in FIG. 7A, in a state where the detection result is presented, the end button A411 is invalid, and the start button A410, the close button A412, and the radio buttons A413 and A414 can receive operation from the user.
  • In step S202, the management apparatus A105 determines whether an instruction to complete all setting processing has been received from the user. As a specific example, in a case where the close button A412 is pressed, the management apparatus A105 may recognize that the instruction to complete all setting processing has been received from the user.
  • In a case where the management apparatus A105 determines in step S202 that the instruction to complete all setting processing has been received from the user (YES in step S202), the series of processing illustrated in FIG. 6 ends.
  • On the other hand, in a case where the management apparatus A105 determines in step S202 that the instruction to complete all setting processing has not been received from the user (NO in step S202), the processing proceeds to step S203. In step S203, the management apparatus A105 determines whether an instruction to start setting about the shielded state detection has been received from the user. As a specific example, in a case where the start button A410 is pressed, the management apparatus A105 may recognize that the instruction to start the setting about the shielded state detection has been received from the user.
  • In a case where the management apparatus A105 determines in step S203 that the instruction to start the setting about the shielded state detection has not been received (NO in step S203), the processing proceeds to step S201. In this case, the series of processing from step S201 illustrated in FIG. 6 is performed again.
  • On the other hand, in a case where the management apparatus A105 determines in step S203 that the instruction to start the setting about the shielded state detection has been received (YES in step S203), the processing proceeds to step S204. In step S204, the management apparatus A105 determines whether, out of the method based on the edge power and the method based on the background difference, the method based on the edge power has been selected as the method to detect the shielded state. As a specific example, in a case where the radio button A413 associated with the method based on the edge power is designated out of the radio buttons A413 and A414, the management apparatus A105 may recognize that the method based on the edge power has been selected.
  • In a case where the management apparatus A105 determines in step S204 that the method based on the edge power has been selected as the method to detect the shielded state (e.g., in a case where the radio button A413 is designated) (YES in step S204), the processing proceeds to step S205. In step S205, the management apparatus A105 performs setting processing relating to the shielded state detection by the method based on the edge power. The processing is separately described in detail below with reference to FIG. 8A.
  • On the other hand, in a case where the management apparatus A105 determines in step S204 that the method based on the edge power has not been selected as the method to detect the shielded state (e.g., in a case where the radio button A414 is designated) (NO in step S204), the processing proceeds to step S206. In step S206, the management apparatus A105 performs setting processing relating to the shielded state detection by the method based on the background difference. The processing is separately described in detail below with reference to FIG. 8B.
  • The management apparatus A105 performs the series of processing illustrated in FIG. 6 in the above-described manner until the management apparatus A105 determines in step S202 that the instruction to complete all setting processing has been received from the user.
  • Next, an example of the setting processing relating to the shielded state detection by the method based on the edge power, described as the processing in step S205 of FIG. 6 is described with reference to FIGS. 7A to 7D and FIG. 8A. FIG. 8A is a flowchart illustrating a flow of the series of processing.
  • In step S301, the setting reception unit A211 transmits a screen that presents a detection result of the shielded state of each of the blocks based on the edge power, to the management apparatus A105 through the communication unit A204, and causes the management apparatus A105 to present the screen.
  • FIG. 7B illustrates an example of the above-described screen presented by the management apparatus A105 based on the instruction from the setting reception unit A211. An area A402 illustrated on the image corresponding to the image capturing result of the image capturing unit A201 indicates an area corresponding to blocks set as execution targets of the shielded state detection processing based on the edge power. Further, an area A403 illustrated on the above-described image indicates an area corresponding to blocks detected as being shielded, based on the edge power. In the screen illustrated in FIG. 7B, hatching (mask) in a predetermined presentation form is superimposed on each of the areas A402 and A403 to highlight the target blocks.
  • It is found from the screen illustrated in FIG. 7B that erroneous detection of the shielded state detection based on the edge power has occurred in blocks corresponding to a vicinity of a ceiling.
  • In the screen illustrated in FIG. 7B, execution and inexecution of the shielded state detection processing based on the edge power can be selectively switched in response to an instruction to designate each of the presented blocks (e.g., designation operation using pointing device).
  • In step S302, the management apparatus A105 determines whether the instruction to designate a block has been received from the user through the above-described screen.
  • In a case where the management apparatus A105 determines in step S302 that the instruction to designate a block has been received from the user (YES in step S302), the processing proceeds to step S303. In step S303, the management apparatus A105 requests the image capturing apparatus A101 that has captured the image displayed on the screen, to switch execution and inexecution of the shielded state detection processing based on the edge power on the designated block. The setting reception unit A211 of the image capturing apparatus A101 instructs the detection processing switching unit A206 to switch execution and inexecution of the shielded state detection processing based on the edge power on the block designated by the user, in response to the request from the management apparatus A105. The detection processing switching unit A206 switches execution and inexecution of the shielded state detection processing based on the edge power on the target block, in response to the instruction from the setting reception unit A211.
  • In the present exemplary embodiment, in a case where the shielded state detection processing based on the edge power on the target block is switched to inexecution, the shielded state detection processing based on the background difference is performed on the target block.
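  • The per-block switching of steps S302 and S303 can be sketched as a simple toggle, assuming a hypothetical mapping from block index to the currently applied method; switching the edge-power method to inexecution applies the background-difference method, and vice versa.

```python
def toggle_block_detection_method(methods, block_index):
    """Sketch of steps S302-S303: designating a block toggles it between
    the edge-power method and the background-difference method.
    `methods` maps block index -> "edge" or "background" (names assumed).
    """
    methods[block_index] = ("background" if methods[block_index] == "edge"
                            else "edge")
    return methods
```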
  • For example, a screen illustrated in FIG. 7C illustrates an example of a screen presented based on a switching result of execution and inexecution of the shielded state detection processing based on the edge power on the block designated by the user. An area A404 illustrated on the image corresponding to the image capturing result of the image capturing unit A201 indicates an area corresponding to blocks set as execution targets of the shielded state detection processing based on the edge power.
  • As can be seen from comparison between the screen illustrated in FIG. 7C and the screen illustrated in FIG. 7B, the erroneous detection that occurred in the blocks corresponding to the vicinity of the ceiling in the screen illustrated in FIG. 7B is eliminated in the screen illustrated in FIG. 7C.
  • On the other hand, in a case where the management apparatus A105 determines in step S302 that the instruction to designate the block has not been received from the user (NO in step S302), the processing proceeds to step S304. In this case, the processing in step S303 is not performed.
  • In step S304, the management apparatus A105 determines whether an instruction to end setting about the shielded state detection has been received from the user. As a specific example, in a case where the end button A411 is pressed, the management apparatus A105 may recognize that the instruction to end the setting about the shielded state detection has been received from the user.
  • In a case where the management apparatus A105 determines in step S304 that the instruction to end the setting about the shielded state detection has been received from the user (YES in step S304), the series of processing illustrated in FIG. 8A ends.
  • For example, a screen illustrated in FIG. 7D illustrates an example of a screen presented after the series of processing illustrated in FIG. 8A is completed. As presented in the screen, it is found that the erroneous detection that occurred at the timing when the screen illustrated in FIG. 7B was presented is eliminated at the timing when the screen illustrated in FIG. 7D is presented.
  • On the other hand, in a case where the management apparatus A105 determines in step S304 that the instruction to end the setting about the shielded state detection has not been received from the user (NO in step S304), the processing proceeds to step S301.
  • The management apparatus A105 performs the series of processing illustrated in FIG. 8A in the above-described manner until the management apparatus A105 determines in step S304 that the instruction to end the setting about the shielded state detection has been received from the user.
  • Next, an example of the setting processing relating to the shielded state detection by the method based on the background difference, described as the processing in step S206 of FIG. 6 is described with reference to FIGS. 7E to 7H and FIG. 8B. FIG. 8B is a flowchart illustrating a flow of the series of processing.
  • For example, a screen illustrated in FIG. 7E illustrates an example of a screen that receives, from the user, the instruction to switch execution and inexecution of the shielded state detection processing based on the background difference on each of the blocks. In the screen illustrated in FIG. 7E, the shielded state detection processing based on the background difference on each of the blocks is set to inexecution.
  • On the other hand, a screen illustrated in FIG. 7F illustrates another example of the screen that receives, from the user, the instruction to switch execution and inexecution of the shielded state detection processing based on the background difference on each of the blocks. In the screen illustrated in FIG. 7F, the shielded state detection processing based on the background difference on each of the blocks is set to execution.
  • In step S305, the setting reception unit A211 transmits a screen that presents a detection result of the shielded state of each of the blocks based on the background difference, to the management apparatus A105 through the communication unit A204, and causes the management apparatus A105 to present the screen.
  • In FIG. 7F, an area A405 illustrated on the image corresponding to the image capturing result of the image capturing unit A201 indicates an area corresponding to blocks set as execution targets of the shielded state detection processing based on the background difference. Further, an area A406 illustrated on the above-described image indicates an area corresponding to blocks detected as being shielded, based on the background difference. In the screen illustrated in FIG. 7F, hatching (mask) in a predetermined presentation form is superimposed on each of the areas A405 and A406 to highlight the target blocks.
  • It is found from the screen illustrated in FIG. 7F that erroneous detection of the shielded state detection based on the background difference has occurred in the blocks (the nine blocks in the lower part) corresponding to the vicinity of the persons and the windows.
  • In the screen illustrated in FIG. 7F, execution and inexecution of the shielded state detection processing based on the background difference can be selectively switched by an instruction to designate each of the presented blocks (e.g., designation operation using pointing device).
  • In step S306, the management apparatus A105 determines whether an instruction to designate a block has been received from the user through the above-described screen.
  • In a case where the management apparatus A105 determines in step S306 that the instruction to designate a block has been received from the user (YES in step S306), the processing proceeds to step S307. In step S307, the management apparatus A105 requests the image capturing apparatus A101 that has captured the image displayed on the screen, to switch execution and inexecution of the shielded state detection processing based on the background difference on the designated block. The setting reception unit A211 of the image capturing apparatus A101 instructs the detection processing switching unit A206 to switch execution and inexecution of the shielded state detection processing based on the background difference on the block designated by the user, in response to the request from the management apparatus A105. The detection processing switching unit A206 switches execution and inexecution of the shielded state detection processing based on the background difference on the target block, in response to the instruction from the setting reception unit A211.
  • In the present exemplary embodiment, in a case where the shielded state detection processing based on the background difference on the target block is switched to inexecution, the shielded state detection processing based on the edge power is performed on the target block.
  • For example, a screen illustrated in FIG. 7G illustrates an example of a screen presented based on a switching result of execution and inexecution of the shielded state detection processing based on the background difference on the block designated by the user. An area A407 illustrated on the image corresponding to the image capturing result of the image capturing unit A201 indicates an area corresponding to blocks set as execution targets of the shielded state detection processing based on the background difference.
  • As can be seen from the comparison between the screen illustrated in FIG. 7G and the screen illustrated in FIG. 7F, the erroneous detection that occurred in the blocks corresponding to the vicinity of the persons and the windows in the screen illustrated in FIG. 7F is eliminated in the screen illustrated in FIG. 7G.
  • On the other hand, in a case where the management apparatus A105 determines in step S306 that the instruction to designate the block has not been received from the user (NO in step S306), the processing proceeds to step S308. In this case, the processing in step S307 is not performed.
  • In step S308, the management apparatus A105 determines whether an instruction to end the setting about the shielded state detection has been received from the user. As a specific example, in the case where the end button A411 is pressed, the management apparatus A105 may recognize that the instruction to end the setting about the shielded state detection has been received from the user.
  • In a case where the management apparatus A105 determines in step S308 that the instruction to end the setting about the shielded state detection has been received from the user (YES in step S308), the series of processing illustrated in FIG. 8B ends.
  • For example, a screen illustrated in FIG. 7H illustrates an example of a screen presented after the series of processing illustrated in FIG. 8B is completed. As presented in the screen, it is found that the erroneous detection that occurred at the timing when the screen illustrated in FIG. 7F was presented is eliminated at the timing when the screen illustrated in FIG. 7H is presented.
  • On the other hand, in a case where the management apparatus A105 determines in step S308 that the instruction to end the setting about the shielded state detection has not been received from the user (NO in step S308), the processing returns to step S305.
  • The management apparatus A105 performs the series of processing illustrated in FIG. 8B in the above-described manner until the management apparatus A105 determines in step S308 that the instruction to end the setting about the shielded state detection has been received from the user.
  • Applying the above-described processing makes it possible to selectively switch the processing to be applied to the determination whether each of the blocks is shielded, between the processing based on the feature amount (e.g., edge power) and the processing based on the background difference depending on the situation of the time. Such a mechanism makes it possible to improve detection accuracy of the state where the image capturing by the image capturing unit A201 is obstructed, depending on the situation of the time (e.g., scene to be monitored).
  • <Modification>
  • Subsequently, a modification of the present exemplary embodiment is described. In the above-described exemplary embodiment, the processing to be applied to each of the blocks is manually set by the user operation. In contrast, in the present modification, an example of a mechanism in which the image capturing apparatus A101 automatically sets the processing to be applied to each of the blocks obtained by dividing the image corresponding to the image capturing result of the image capturing unit A201, by using a detection result of the shielded state of each of the blocks, is described.
  • <Functional Configuration>
  • First, an example of a functional configuration of the image capturing apparatus A101 according to the present modification is described with reference to FIG. 9. The image capturing apparatus A101 according to the present modification is different from the example illustrated in FIG. 3 in that the image capturing apparatus A101 includes a detection processing determination unit A212, and the detection processing switching unit A206 switches the processing to be applied to the target block based on an instruction from the detection processing determination unit A212. In FIG. 9, reference numerals similar to the reference numerals in FIG. 3 indicate components similar to the components denoted by the reference numerals in FIG. 3. With this in mind, in the following description, the functional configuration of the image capturing apparatus A101 according to the present modification is described while focusing on differences from the example illustrated in FIG. 3.
• The detection processing determination unit A212 receives, as feedback, the detection result of the shielded state of each block based on the background difference from the first detection unit A207, and the detection result of the shielded state of each block based on the feature amount (e.g., edge power) from the second detection unit A208. Based on this feedback (i.e., the detection results described above), the detection processing determination unit A212 determines whether to apply the detection processing based on the background difference or the detection processing based on the feature amount to the target block.
• For example, FIG. 10 illustrates an example of an algorithm by which the detection processing determination unit A212 determines the detection processing to be applied to the target block. In the example illustrated in FIG. 10, the second detection unit A208 uses the edge power as the feature amount for detecting the shielded state of each of the blocks.
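• The patent text does not fix a formula for "edge power". As one illustrative assumption, the sketch below treats it as the sum of absolute horizontal and vertical gradient magnitudes over a block; the function name and the list-of-rows block representation are hypothetical, not taken from the disclosure:

```python
def edge_power(block):
    """Sum of absolute horizontal and vertical gradients over a block.

    `block` is a grayscale image block given as a list of rows of pixel
    values.  This is one plausible definition of "edge power"; the patent
    does not specify a particular formula.
    """
    h, w = len(block), len(block[0])
    total = 0.0
    for y in range(h):
        for x in range(w):
            gx = block[y][min(x + 1, w - 1)] - block[y][x]  # horizontal gradient
            gy = block[min(y + 1, h - 1)][x] - block[y][x]  # vertical gradient
            total += abs(gx) + abs(gy)
    return total
```

A block uniformly covered by an obstruction yields a value near zero, while a textured, unobstructed block yields a large value, which is why a small edge power can indicate shielding.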
• The detection processing determination unit A212 determines the detection processing to be applied to each of the blocks based on whether the background difference acquired for the block is larger or smaller than a threshold, and whether the feature amount extracted from the block is larger or smaller than a threshold. More specifically, the detection processing determination unit A212 selects the processing based on the edge power by default, and selects the detection processing based on the background difference only in a case where both the edge power and the background difference are smaller than their respective thresholds. The detection processing determination unit A212 then controls the detection processing switching unit A206 to switch the processing to be applied to the target block, based on this determination result.
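• The switching rule described above can be sketched as follows; the function name, the string labels, and the use of separate thresholds for edge power and background difference are illustrative assumptions, not details fixed by the patent:

```python
def choose_detection(edge_power, bg_diff, edge_threshold, diff_threshold):
    """Pick the detection processing for one block.

    Per the described algorithm: default to the feature-based (edge power)
    processing, and switch to background-difference processing only when
    both the edge power and the background difference fall below their
    thresholds.
    """
    if edge_power < edge_threshold and bg_diff < diff_threshold:
        return "background_difference"
    return "edge_power"
```

The intuition is that a block with little texture and little change from the reference image gives the feature-based detector nothing to work with, so the background-difference detector is the more reliable choice there.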
• Applying the above-described control makes it possible to automatically and selectively switch the processing used to determine whether each of the blocks is shielded between the processing based on the feature amount (e.g., edge power) and the processing based on the background difference, depending on the current situation. Such a mechanism improves the accuracy of detecting a state where the image capturing by the image capturing unit A201 is obstructed, in accordance with the situation at hand (e.g., the scene to be monitored).
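• For completeness, the per-block detection results must still be aggregated into the final determination of whether image capturing is obstructed. The shielded-block-ratio criterion below is an assumption for illustration; the patent leaves the exact aggregation rule open:

```python
def is_capturing_obstructed(shielded_flags, ratio=0.5):
    """Decide whether image capturing is obstructed from per-block results.

    `shielded_flags` holds one boolean per block (True = the block was
    judged shielded by whichever detection processing was applied to it).
    The majority-ratio criterion is an assumed aggregation rule.
    """
    shielded = sum(1 for flag in shielded_flags if flag)
    return shielded / len(shielded_flags) >= ratio
```

With `ratio=0.5`, an obstruction is reported once at least half of the blocks are judged shielded; a deployment could tighten or loosen this per scene.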
  • Other Exemplary Embodiments
  • The present disclosure can be realized by supplying programs implementing one or more functions of the above-described exemplary embodiment to a system or an apparatus through a network or a recording medium, and causing one or more processors of a computer in the system or the apparatus to read out and execute the programs. Further, the present disclosure can be realized by a circuit (e.g., application specific integrated circuit (ASIC)) implementing one or more functions of the above-described exemplary embodiment.
  • Further, the configurations described with reference to FIG. 3 and FIG. 9 are merely examples, and are not intended to limit the functional configuration of the image capturing apparatus A101 according to the present modification. For example, among the components of the image capturing apparatus A101, some of the components may be provided outside the image capturing apparatus A101.
  • As a specific example, the components A205 to A211 relating to detection of the state where the image capturing by the image capturing unit A201 is obstructed may be provided outside the image capturing apparatus A101. In this case, an apparatus including the components A205 to A211 relating to detection of the state where the image capturing by the image capturing unit A201 is obstructed corresponds to an example of the “information processing apparatus” according to the present exemplary embodiment.
  • Further, as another example, among the components of the image capturing apparatus A101, a load of the processing by at least some of the components may be distributed to a plurality of apparatuses.
  • Embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
• While the present disclosure has been described with reference to exemplary embodiments, the scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
  • This application claims the benefit of Japanese Patent Application No. 2020-086982, filed May 18, 2020, which is hereby incorporated by reference herein in its entirety.

Claims (8)

What is claimed is:
1. An information processing apparatus determining whether image capturing by an image capturing apparatus is obstructed, the information processing apparatus comprising:
a dividing unit configured to divide an input image captured by the image capturing apparatus into a plurality of blocks;
a processing determination unit configured to determine whether to perform first detection processing using a reference image corresponding to the image capturing apparatus or second detection processing using a feature of the input image, on each of the blocks; and
an obstruction determination unit configured to determine whether the image capturing by the image capturing apparatus is obstructed, based on a detection result of each of the blocks by the first detection processing or the second detection processing.
2. The information processing apparatus according to claim 1, wherein the processing determination unit determines whether to perform the first detection processing or the second detection processing on each of the blocks, based on a user instruction received for each of the blocks.
3. The information processing apparatus according to claim 1, wherein the processing determination unit determines whether to perform the first detection processing or the second detection processing on each of the blocks, based on the detection result by the first detection processing and the detection result by the second detection processing.
4. The information processing apparatus according to claim 3, wherein the processing determination unit determines whether to perform the first detection processing or the second detection processing on each of the blocks, based on the feature in the second detection processing and a difference between the input image and the reference image in the first detection processing.
5. The information processing apparatus according to claim 4, wherein, in a case where the feature extracted from the input image is smaller than a threshold and the difference between the input image and the reference image is smaller than a threshold, the first detection processing is determined.
6. The information processing apparatus according to claim 1, further comprising:
a first detection unit configured to perform the first detection processing that detects occurrence of a state where a partial area corresponding to the input image in a viewing angle of the image capturing apparatus is shielded, based on a difference between the input image and the reference image; and
a second detection unit configured to perform the second detection processing that detects occurrence of the state where the partial area corresponding to the input image in the viewing angle of the image capturing apparatus is shielded, based on the feature extracted from the input image.
7. An information processing method performed by an information processing apparatus to determine whether image capturing by an image capturing apparatus is obstructed, the information processing method comprising:
dividing an input image captured by the image capturing apparatus into a plurality of blocks;
determining whether to perform first detection processing using a reference image corresponding to the image capturing apparatus or second detection processing using a feature of the input image, on each of the blocks; and
determining whether the image capturing by the image capturing apparatus is obstructed, based on a detection result of each of the blocks by the first detection processing or the second detection processing.
8. A non-transitory storage medium storing a program causing a computer to execute an information processing method to determine whether image capturing by an image capturing apparatus is obstructed, the information processing method comprising:
dividing an input image captured by the image capturing apparatus into a plurality of blocks;
determining whether to perform first detection processing using a reference image corresponding to the image capturing apparatus or second detection processing using a feature of the input image, on each of the blocks; and
determining whether the image capturing by the image capturing apparatus is obstructed, based on a detection result of each of the blocks by the first detection processing or the second detection processing.
US17/318,893 2020-05-18 2021-05-12 Information processing apparatus, information processing method, and storage medium Abandoned US20210357676A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-086982 2020-05-18
JP2020086982A JP2021182216A (en) 2020-05-18 2020-05-18 Information processing apparatus, information processing method, and program

Publications (1)

Publication Number Publication Date
US20210357676A1 true US20210357676A1 (en) 2021-11-18

Family

ID=78512690

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/318,893 Abandoned US20210357676A1 (en) 2020-05-18 2021-05-12 Information processing apparatus, information processing method, and storage medium

Country Status (2)

Country Link
US (1) US20210357676A1 (en)
JP (1) JP2021182216A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000047511A1 (en) * 1999-02-11 2000-08-17 Tl Jones Limited Obstruction detection system
US20060157639A1 (en) * 2005-01-18 2006-07-20 Ford Motor Company Vehicle imaging processing system and method
US20080095399A1 (en) * 2006-10-23 2008-04-24 Samsung Electronics Co., Ltd. Device and method for detecting occlusion area
US20170078901A1 (en) * 2014-05-30 2017-03-16 Hitachi Kokusai Electric Inc. Wireless communication device and wireless communication system
US11328428B2 (en) * 2019-12-18 2022-05-10 Clarion Co., Ltd. Technologies for detection of occlusions on a camera


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
G. Ma, M. Dwivedi, R. Li, C. Sun and A. Kummert, "A real-time rear view camera based obstacle detection," 2009, 2009 12th International IEEE Conference on Intelligent Transportation Systems, pp. 1-6 (Year: 2009) *

Also Published As

Publication number Publication date
JP2021182216A (en) 2021-11-25


Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KITAGAWA, EIICHIRO;REEL/FRAME:056492/0636

Effective date: 20210420

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE