WO2016063595A1 - Image processing device, image processing method and program - Google Patents


Info

Publication number
WO2016063595A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
infrared
image processing
processing apparatus
infrared image
Prior art date
Application number
PCT/JP2015/071543
Other languages
French (fr)
Japanese (ja)
Inventor
Takuro Kawai
Toshinori Ihara
Masatoshi Yokokawa
Takahiro Nagano
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2016063595A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/61 - Control of cameras or camera modules based on recognised objects
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/63 - Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 - Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635 - Region indicators; Field of view indicators
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to an image processing apparatus, an image processing method, and a program.
  • Conventionally, it has been proposed to use images captured by an infrared camera for security and other purposes (see, for example, Patent Document 1). Infrared rays have a variety of uses distinct from visible light, depending on their wavelength. The technique proposed in Patent Document 1 uses infrared rays for night vision and can notify the user of the presence of a suspicious person detected in an infrared image.
  • Infrared cameras are used not only in security equipment such as surveillance cameras, but also in medical and diagnostic equipment, in-vehicle equipment, and inspection equipment. There are also infrared modules that can be connected to (or built into) a general-purpose portable device, such as a smartphone or a tablet PC (Personal Computer), to capture, display, or record infrared images.
  • Because infrared rays in a certain wavelength range pass through materials such as cloth or thin film, the use of infrared images may constitute an inappropriate act such as an infringement of privacy.
  • According to the present disclosure, there is provided an image processing apparatus including: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission is to be suppressed is reflected in the infrared image; and a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result by the determination unit.
  • There is also provided an image processing method in which a processor of the image processing apparatus acquires an infrared image, determines whether a subject whose transmission is to be suppressed is reflected in the infrared image, and, based on the determination result, at least partially suppresses display or recording of the infrared image.
  • There is also provided a program for causing a computer that controls the image processing apparatus to function as: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed is reflected in the infrared image; and a processing unit that at least partially suppresses display or recording of the infrared image based on the determination result by the determination unit.
  • FIG. 1 is an explanatory diagram for explaining various uses of an infrared (IR) image depending on a wavelength.
  • the horizontal direction in FIG. 1 corresponds to the wavelength of infrared rays, and the wavelength increases from left to right.
  • Light having a wavelength of 0.7 μm or less is visible light, and human vision senses this visible light.
  • Infrared rays having a wavelength in the range of 0.7 μm to 1.0 μm are classified as near-infrared rays (NIR).
  • Near-infrared light can be used, for example, for night vision, fluoroscopy, optical communication and ranging.
  • Infrared rays having a wavelength in the range of 1.0 μm to 2.5 μm are classified as short-wavelength infrared rays (SWIR). Short-wavelength infrared rays can also be used for night vision and fluoroscopy.
  • A night vision apparatus using near-infrared or short-wavelength infrared rays first irradiates its surroundings with infrared rays and captures the reflected light to generate an IR image.
  • Infrared rays having a wavelength in the range of 2.5 μm to 4.0 μm are classified as medium-wavelength infrared rays (MWIR). Since substance-specific absorption spectra appear in this wavelength range, medium-wavelength infrared rays can be used to identify substances.
  • Medium wavelength infrared can also be used for thermography.
  • Infrared rays having a wavelength of 4.0 μm or more are classified as far-infrared rays (FIR).
  • Far infrared can be utilized for night vision, thermography and heating.
  • Infrared rays emitted by black body radiation from an object correspond to far infrared rays. Therefore, a night vision apparatus using far infrared rays can generate an IR image by capturing black body radiation from an object without irradiating infrared rays.
  • the boundary values of the wavelength range shown in FIG. 1 are merely examples.
  • Various definitions exist for the boundary value of the infrared classification, and the advantages described below of the technology according to the present disclosure can be enjoyed under any definition.
  • Near-infrared rays and short-wavelength infrared rays can pass through materials such as cloth or thin film. Therefore, for example, when a person appears in an IR image based on these types of infrared rays, the person's clothes may be seen through, and underwear and other objects not meant to be seen by others may be exposed in the IR image. The use of IR images may therefore constitute an inappropriate act, such as a privacy infringement or a nuisance.
  • In recent years, many security cameras have been installed in public places due to heightened security awareness, and automobiles equipped with night vision cameras have been marketed from the viewpoint of accident prevention, so infrared cameras are used in a wide variety of situations. In view of this, this specification proposes a mechanism that can prevent inappropriate acts caused by infrared transparency while preserving opportunities for the appropriate use of infrared images.
  • FIG. 2 is a flowchart illustrating an example of a schematic flow of infrared (IR) image processing according to an embodiment.
  • IR image is acquired (step S10).
  • The IR image acquired here is an image generated by a camera that senses infrared rays capable of passing through materials.
  • the IR image may be a still image or one of a series of frames that make up a moving image.
  • subject recognition processing is executed (step S11).
  • the subject recognized here is the same as the subject whose transmission should be suppressed, or can be defined in advance in association with the subject whose transmission should be suppressed.
  • the subject recognized in step S11 may be the human body itself or a part of the human body such as a human face or hand.
  • the subject whose transmission should be suppressed may be any object different from the human body (for example, a container in which it is not preferable to visually recognize the contents).
  • the subject recognition process may be executed with an IR image as an input, or may be executed with another type of image (however, an image having an angle of view that can be calibrated with reference to the IR image) as an input.
  • In step S12, it is determined whether a subject whose transmission should be suppressed is reflected in the IR image. If so, a transmission suppression process is executed in order to at least partially suppress the display or recording of the IR image (step S13). In the technology according to the present disclosure, display or recording of IR images is thus allowed in principle, and is suppressed only when it is determined to be inappropriate.
  • In the first embodiment described later, a region corresponding to the whole or a part of the IR image is blurred in the transmission suppression process.
  • In the second embodiment, imaging, display, or recording of all or part of the IR image is invalidated in the transmission suppression process.
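  • As an illustration, the FIG. 2 flow maps onto a small per-frame routine. The following Python sketch is a minimal, hypothetical rendering of steps S10 to S13, not the patent's implementation; find_protected_subjects and suppress_transmission are placeholder names for the recognition and suppression steps detailed later.

```python
import numpy as np

def find_protected_subjects(ir_image: np.ndarray) -> list:
    """Step S11: subject recognition (placeholder; see the detection
    sketches later in this document)."""
    return []  # list of (x, y, w, h) boxes for subjects to protect

def suppress_transmission(ir_image: np.ndarray, boxes: list) -> np.ndarray:
    """Step S13: blur the regions (first embodiment) or invalidate the
    frame (second embodiment)."""
    return ir_image  # placeholder

def process_ir_frame(ir_image: np.ndarray) -> np.ndarray:
    boxes = find_protected_subjects(ir_image)          # step S11
    if boxes:                                          # step S12
        return suppress_transmission(ir_image, boxes)  # step S13
    return ir_image  # no protected subject: display/record as-is
```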
  • The IR image processing described above can be performed on any type of device that captures IR images or processes captured IR images. To give just a few examples, devices that capture IR images may include digital video cameras, digital still cameras, television broadcasting cameras, surveillance cameras, intercoms with monitors, in-vehicle cameras, smartphones, PCs (Personal Computers), HMD (Head Mounted Display) terminals, game terminals, medical devices, diagnostic devices, and inspection devices. In addition to the various imaging devices described above, devices that process IR images may include television receivers, content players, content recorders, authoring devices, and the like.
  • the image processing apparatus mentioned in the following sections may be a module mounted on or connected to the apparatus exemplified here.
  • FIGS. 3A to 3D are explanatory diagrams showing examples of various scenarios in which the IR image processing can be executed.
  • In the scenario of FIG. 3A, an IR image is input to the IR image processing from an infrared camera that captures the image. The IR image in which transmission has been suppressed is then output to a display, which displays the image.
  • In the scenario of FIG. 3B, an IR image is input to the IR image processing from an infrared camera that captures the image. The IR image in which transmission has been suppressed is then output to a storage, which records the image.
  • In the scenario of FIG. 3C, an IR image is read into the IR image processing from a storage that stores captured IR images. The IR image in which transmission has been suppressed is then output to a display, which displays the image.
  • In the scenario of FIG. 3D, an IR image is read into the IR image processing from a storage that stores captured IR images. The IR image in which transmission has been suppressed is then recorded again by the storage (that is, the image data is updated or converted).
  • FIG. 4 is a block diagram illustrating an example of a hardware configuration of the image processing apparatus 100 according to the first embodiment.
  • the image processing apparatus 100 includes an infrared camera 102, a sub camera 104, an input interface 106, a memory 108, a display 110, a communication interface 112, a storage 114, a bus 116, and a processor 118.
  • the infrared camera 102 is an imaging module that captures an infrared (IR) image.
  • The infrared camera 102 has an array of imaging elements that sense infrared rays mainly classified as near-infrared or short-wavelength infrared rays, and a light-emitting element that irradiates the vicinity of the apparatus with infrared rays.
  • In response to a trigger such as a user input, or periodically, the infrared camera 102 emits infrared rays from the light-emitting element and captures the infrared rays reflected by a subject or its background to generate an IR image.
  • a series of IR images generated by the infrared camera 102 may constitute a video.
  • The sub camera 104 captures an auxiliary image that is used in an auxiliary manner in the IR image processing, for recognizing a subject or for suppressing transmission.
  • the auxiliary image captured by the sub camera 104 may be, for example, one or more of a visible light image, an additional IR image (for example, a thermal image, that is, a MWIR image or an FIR image), and a depth map.
  • the input interface 106 is used for a user to operate the image processing apparatus 100 or input information to the image processing apparatus 100.
  • the input interface 106 may include an input device such as a touch sensor, a keyboard, a keypad, a button, or a switch, for example.
  • the input interface 106 may include a microphone for voice input and a voice recognition module.
  • the input interface 106 may also include a remote control module that receives commands selected by the user from the remote device.
  • the memory 108 is a storage medium that can include a RAM (Random Access Memory) and a ROM (Read Only Memory).
  • the memory 108 is coupled to the processor 118 and stores programs and data for processing executed by the processor 118.
  • the display 110 is a display module having a screen for displaying an image.
  • The display 110 may be, for example, an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, or a CRT (Cathode Ray Tube).
  • the communication interface 112 is a module that mediates communication between the image processing apparatus 100 and another apparatus.
  • the communication interface 112 establishes a communication connection according to any wireless communication protocol or wired communication protocol.
  • The storage 114 is a storage device that stores image data, which can include IR images and auxiliary images, and databases used in the IR image processing.
  • The storage 114 includes a storage medium such as a semiconductor memory or a hard disk. Note that the programs and data described in this specification may also be acquired from a data source external to the image processing apparatus 100 (for example, a data server, a network storage, or an external memory).
  • the bus 116 connects the infrared camera 102, the sub camera 104, the input interface 106, the memory 108, the display 110, the communication interface 112, the storage 114, and the processor 118 to each other.
  • the processor 118 is a processing module such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor).
  • The processor 118 realizes functions for suppressing undesired transparency in IR images by executing programs stored in the memory 108 or another storage medium.
  • FIG. 5 is a block diagram illustrating an example of a configuration of logical functions realized by linking the components of the image processing apparatus 100 illustrated in FIG. 4 to each other.
  • The image processing apparatus 100 includes an IR image acquisition unit 120, an auxiliary image acquisition unit 130, a determination unit 140, a recognition DB 145, an image processing unit 150, a blurring DB 155, a user interface unit 160, and a notification unit 170.
  • the IR image acquisition unit 120 acquires an infrared (IR) image and outputs the acquired IR image to the determination unit 140 and the image processing unit 150.
  • the IR image acquisition unit 120 may acquire an IR image captured by the infrared camera 102. Further, the IR image acquisition unit 120 may acquire an IR image stored in the storage 114.
  • the IR image acquisition unit 120 may acquire an IR image from another device via the communication interface 112.
  • the IR image acquired by the IR image acquisition unit 120 may be an image that has undergone preliminary processing such as signal amplification and noise removal.
  • the IR image acquisition unit 120 may decode the IR image from the compressed and encoded stream.
  • the auxiliary image acquisition unit 130 acquires an auxiliary image that may include a visible light image, an additional IR image, or a depth map. It is assumed that the angle of view of the auxiliary image is calibrated so as to overlap (ideally match) the angle of view of the IR image acquired by the IR image acquisition unit 120.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140 when the auxiliary image is used for subject recognition in IR image processing.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the image processing unit 150 when the auxiliary image is used for suppressing transmission.
  • The auxiliary image acquisition unit 130 may acquire an auxiliary image captured by the sub camera 104, stored in the storage 114, or received via the communication interface 112. When the auxiliary image is not used for any purpose, the auxiliary image acquisition unit 130 may be omitted from the configuration of the image processing apparatus 100.
  • the determination unit 140 determines whether a subject whose transmission should be suppressed is reflected in the IR image. More specifically, the determination unit 140 performs image recognition for recognizing a predetermined recognition target for an IR image or an auxiliary image having an angle of view overlapping with the IR image. Then, when the predetermined recognition target is recognized as a result of the image recognition, the determination unit 140 determines that the subject whose transmission should be suppressed appears in the IR image.
  • the predetermined recognition target may be a human face, body, or part of the body.
  • the recognition DB 145 stores data referred to in image recognition executed by the determination unit 140.
  • the recognition DB 145 stores in advance image feature quantities to be recognized that are acquired from a number of known IR images through a prior learning process.
  • The determination unit 140 collates the image feature amounts extracted from the IR image input from the IR image acquisition unit 120 with the feature amounts stored in the recognition DB 145, and can determine, according to the collation result, whether the recognition target appears in the IR image.
  • The determination unit 140 may perform the above-described image recognition according to an existing algorithm such as boosting or a support vector machine.
  • When an auxiliary image is used for the recognition, the prior learning process is also executed based on known images of the same type as the auxiliary image. On the assumption that the angle of view of the auxiliary image overlaps the angle of view of the IR image captured at the same timing, it can be estimated, from the detection of a predetermined recognition target in the auxiliary image, that a subject whose transmission should be suppressed is shown in the IR image.
  • For example, a visible light image can be used as the auxiliary image. In this case, whether a subject whose transmission should be suppressed is reflected in the IR image can be determined by utilizing accurate face recognition or person recognition techniques based on the visible light image.
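  • As a concrete illustration, the sketch below flags an IR frame when a face is found in the calibrated visible-light auxiliary image. It uses OpenCV's stock Haar cascade purely as a stand-in for the learned recognition DB 145; the patent itself leaves the actual recognizer open.

```python
import cv2

# OpenCV's bundled frontal-face Haar cascade (an assumption; any detector
# trained as described for the recognition DB 145 could be used instead).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def subject_reflected_in_ir(aux_visible_bgr) -> bool:
    gray = cv2.cvtColor(aux_visible_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Because the auxiliary image's angle of view overlaps the IR image,
    # a face here implies a subject whose transmission should be suppressed.
    return len(faces) > 0
```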
  • a plurality of recognition targets may be recognized for one IR image or auxiliary image.
  • Recognition target type: a code that identifies the type of the recognized target. Various type sets, such as face/human body, face/torso/arm/leg, face/upper body/lower body, or face/torso (upper body)/torso (lower body)/arm/leg, may be defined in advance.
  • Recognition position/orientation/size: the position, orientation, and size of the recognized target in the IR image (or the auxiliary image). Additionally or alternatively, information indicating the shape may be output.
  • Recognition likelihood: an index indicating how plausible the result of the image recognition is. The larger the value, the more likely the recognition result is correct. Also called reliability.
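  • One way to carry these three items from the determination unit 140 to the image processing unit 150 is a small record type, as sketched below; the field names are illustrative choices, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RecognitionResult:
    target_type: str                  # e.g. "face", "torso", "arm", "leg"
    box: Tuple[int, int, int, int]    # (x, y, width, height) in the image
    orientation_deg: float            # in-plane orientation of the target
    likelihood: float                 # recognition likelihood / reliability
```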
  • the image recognition performed by the determination unit 140 may be shared with image recognition for other purposes, such as pedestrian recognition performed on images from the in-vehicle camera for driving assistance.
  • a display object indicating the result of such image recognition (for example, a frame surrounding a pedestrian) may be superimposed on the IR image.
  • The image processing unit 150 at least partially suppresses the display or recording of the IR image based on the determination result input from the determination unit 140. More specifically, in the present embodiment, when the determination unit 140 determines that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 150 sets a blurring area in the IR image so as to include the estimated position of the subject. Then, the image processing unit 150 blurs the blurring area of the IR image according to the setting.
  • the blurring DB 155 stores data referred to in image processing executed by the image processing unit 150.
  • the blurred area may correspond to the entire IR image.
  • In this case, the image processing unit 150 can set the blurring area without knowing the specific position of the subject.
  • the blurred area may correspond to a portion of the IR image.
  • the image processing unit 150 sets a blurring region in the IR image based on the recognition position, orientation, and size information of the recognition target input from the determination unit 140. For example, when a human face is recognized as a recognition target, the blurring area can be set below the recognition position of the face with a size depending on the size of the face.
  • FIG. 6 is an explanatory diagram showing an example of the setting of the blurring area.
  • the IR image Im11 is shown on the left of FIG.
  • a rectangular frame 142 indicates that a recognition target corresponding to a human face is recognized in the IR image Im11.
  • The image processing unit 150 sets the blurring region 144 below the recognized face position.
  • the blurring region 144 is set so as to include an estimated position of the human body that is a subject whose transmission is to be suppressed (see the center of FIG. 6).
  • Such a positional relationship between the subject and the recognition target is defined in advance (for example, based on knowledge of the standard shape of the human body) and can be stored for each recognition target type by the blurring DB 155.
  • the right side of FIG. 6 shows a processed IR image Im12 in which a partial image of the blurred region 144 is blurred.
  • The image processing unit 150 may change the size, orientation, or shape of the blurring region 144 in accordance with the size, orientation, or shape of the recognition target. Further, when a human face is recognized, the image processing unit 150 may set, as the blurring region, an area whose pixel values differ from a pixel value of the face area (for example, the average or median over the face area) by less than a threshold.
  • Even when the positional relationship between the subject and the recognition target is not defined in advance, this makes it possible to set the blurring region on portions corresponding to human skin reflected in the IR image (which can be assumed to have gradations close to that of the face).
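  • A possible rendering of the FIG. 6 region setting is sketched below: the blurring region is placed under a recognized face box and scaled by the face size. The scale factors are illustrative assumptions; the patent only states that the region's position and size depend on the recognized face.

```python
def body_blur_region(face_box, image_shape, width_scale=3.0, height_scale=5.0):
    """Return a clipped (x, y, w, h) blurring region under the face box."""
    x, y, w, h = face_box
    img_h, img_w = image_shape[:2]
    bw = int(w * width_scale)            # a body is wider than the face
    bh = int(h * height_scale)           # and extends several face-heights down
    bx = max(0, x + w // 2 - bw // 2)    # centered under the face
    by = min(img_h - 1, y + h)           # starting just below the face box
    return bx, by, min(bw, img_w - bx), min(bh, img_h - by)
```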
  • The image processing unit 150 may blur the blurring region 144 by smoothing its pixel values.
  • The smoothing here can be performed by applying a smoothing filter, typified by a Gaussian filter, to each pixel belonging to the blurring region 144.
  • In this case, the blurring level depends on the filter scale (e.g., the variance σ² of the Gaussian filter).
  • When blurring is instead achieved by averaging pixel values over each sub-region, the blurring level depends on the sub-region size R_sub.
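  • Both smoothing variants are easy to express with OpenCV, as in the sketch below: the Gaussian variant's level is its sigma, and the sub-region variant's level is R_sub. Realizing the sub-region average by down- and up-scaling is an implementation choice of this sketch, not something the patent prescribes.

```python
import cv2

def blur_roi_gaussian(ir_image, region, sigma=5.0):
    x, y, w, h = region
    roi = ir_image[y:y + h, x:x + w]
    # ksize=(0, 0) lets OpenCV derive the kernel size from sigma.
    ir_image[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (0, 0), sigma)
    return ir_image

def blur_roi_subregions(ir_image, region, r_sub=16):
    x, y, w, h = region
    roi = ir_image[y:y + h, x:x + w]
    # INTER_AREA averages each r_sub x r_sub block; NEAREST scales it back.
    small = cv2.resize(roi, (max(1, w // r_sub), max(1, h // r_sub)),
                       interpolation=cv2.INTER_AREA)
    ir_image[y:y + h, x:x + w] = cv2.resize(small, (w, h),
                                            interpolation=cv2.INTER_NEAREST)
    return ir_image
```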
  • Alternatively, the image processing unit 150 may blur the blurring region 144 by filling it with a specific pixel value.
  • Alternatively, the image processing unit 150 may blur the blurring region 144 by mixing into it the pixel values of the corresponding portion of an auxiliary image.
  • The left part of FIG. 8 shows the partial image of the auxiliary image corresponding to the blurring region 144 and the partial image of the blurring region 144 of the IR image.
  • The image processing unit 150 may generate the blurred partial image of the blurring region 144 by mixing these two partial images with a mixing ratio α_mix according to the following equation (1) (see the right side of FIG. 8):

    IR_blurred(x, y) = (1 − α_mix) · IR(x, y) + α_mix · SI(x, y)   (1)

  • Here, IR(x, y) and IR_blurred(x, y) represent the pixel values of the IR image at pixel position (x, y) before and after blurring, respectively, and SI(x, y) represents the pixel value of the auxiliary image at pixel position (x, y).
  • A plurality of auxiliary images may also be mixed with the IR image using a plurality of mixing ratios. When two mixing ratios α1_mix and α2_mix are used, for example, the following equation (2) may be used instead of equation (1):

    IR_blurred(x, y) = (1 − α1_mix − α2_mix) · IR(x, y) + α1_mix · SI1(x, y) + α2_mix · SI2(x, y)   (2)

  • Here, SI1(x, y) and SI2(x, y) represent the pixel values of the first and second auxiliary images at pixel position (x, y), respectively.
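  • Equations (1) and (2) amount to per-pixel alpha blending, as in this sketch (the arrays are float images of identical shape; the function names are illustrative, not the patent's):

```python
import numpy as np

def mix_one(ir: np.ndarray, si: np.ndarray, alpha_mix: float) -> np.ndarray:
    # Equation (1): IR_blurred = (1 - a) * IR + a * SI
    return (1.0 - alpha_mix) * ir + alpha_mix * si

def mix_two(ir, si1, si2, a1: float, a2: float):
    # Equation (2): IR_blurred = (1 - a1 - a2) * IR + a1 * SI1 + a2 * SI2
    assert 0.0 <= a1 + a2 <= 1.0
    return (1.0 - a1 - a2) * ir + a1 * si1 + a2 * si2
```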
  • In this case, the blurring level corresponds to the mixing ratio.
  • As an auxiliary image for mixing, a fixed image pre-stored in the blurring DB 155 (for example, a CG (Computer Graphics) image or a monochrome image) may be used instead of a dynamically acquired image such as a visible light image, an additional IR image, or a depth map.
  • Alternatively, mixing may be performed using a dynamically determined pixel value P_mix, as in the following equation:

    IR_blurred(x, y) = (1 − α_mix) · IR(x, y) + α_mix · P_mix
  • The pixel value P_mix may be a representative value of the pixel values belonging to the IR image.
  • The representative value here may be, for example, the average or median of the pixel values of the entire IR image, of the inside of the blurring area, or of the pixel group on the boundary of the blurring area. By using such a representative value, the blurred region after mixing avoids unnatural tones and shows little change in gradation.
  • The pixel value P_mix selected in this way may also be used as a fill color for blurring instead of as a mixing color. Further, the pixel value P_mix may be set by the user via the user interface unit 160.
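  • A sketch of the P_mix variant follows, assuming the median over the blurring region as the representative value (the patent also allows the mean, or statistics over the whole image or the region boundary):

```python
import numpy as np

def blur_with_pmix(ir_image, region, alpha_mix=0.8, fill=False):
    x, y, w, h = region
    roi = ir_image[y:y + h, x:x + w].astype(np.float32)
    p_mix = float(np.median(roi))              # representative value P_mix
    if fill:
        roi[:] = p_mix                         # use P_mix as a fill color
    else:
        roi = (1.0 - alpha_mix) * roi + alpha_mix * p_mix  # mix as in eq. (1)
    ir_image[y:y + h, x:x + w] = roi.astype(ir_image.dtype)
    return ir_image
```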
  • the image processing unit 150 may dynamically determine the blurring level according to one or more conditions, and blur the blurring area according to the determined blurring level.
  • For example, the image processing unit 150 may determine the blurring level based on one or more of the infrared irradiation intensity and the ambient light intensity at the time the IR image is captured. The stronger the infrared rays irradiated onto the surroundings when an IR image is captured, the clearer the texture of the IR image becomes, and the more clearly the transmitted subject appears in the IR image. Therefore, the image processing unit 150 may acquire from the infrared camera 102 information indicating the infrared irradiation intensity at the time of capture and determine a higher blurring level for a higher irradiation intensity.
  • The blurring level may also be determined based on the infrared irradiation intensity relative to the ambient light intensity. The ambient light intensity can be measured by an illuminance sensor (not shown in FIG. 4).
  • Further, the image processing unit 150 may determine the blurring level based on the recognition likelihood of the image recognition input from the determination unit 140. For example, when there is a high possibility that an object reflected in the IR image is a human torso, a determination result in which a high recognition likelihood (or reliability) is associated with the recognition target type "torso" is input from the determination unit 140. In this case, the image processing unit 150 can set the blurring level of the torso region relatively high.
  • The image processing unit 150 may also determine the blurring level based on one or more of the distance from the camera to the subject at the time the IR image is captured and the size of the subject in the IR image.
  • The distance from the camera to the subject can be measured, for example, by an infrared-based depth sensor (e.g., from the round-trip time of near-infrared light reflected by the subject, or by a ranging method based on the distortion of a dot pattern projected onto the subject).
  • The size of the subject can be acquired through the image recognition executed by the determination unit 140. The smaller the distance from the camera to the subject, or the larger the subject, the more clearly the transmitted subject appears in the IR image. The image processing unit 150 can therefore determine a higher blurring level in these cases.
  • When blurring is achieved with a Gaussian filter, the blurring level corresponds to the scale of the filter. When blurring is achieved by averaging pixel values over sub-regions, the blurring level corresponds to the sub-region size. When blurring is achieved by mixing with an auxiliary image, the blurring level corresponds to the mixing ratio.
  • Regardless of the values of the parameters used to determine the blurring level (for example, the infrared irradiation intensity, ambient light intensity, recognition likelihood, distance to the subject, subject size, or any combination thereof), the image processing unit 150 may limit the blurring level so that it does not fall below a predetermined level.
  • FIG. 9A is a graph illustrating a first example of the relationship between the parameter for determining the blurring level and the blurring level.
  • the horizontal axis of the graph represents the parameter X for determining the blurring level, and the parameter X may correspond to, for example, infrared irradiation intensity, recognition likelihood, or subject size.
  • the vertical axis of the graph represents the level of smearing and may correspond to, for example, the scale of the filter, the size of the sub-region for averaging or the mixing ratio.
  • When the value of the parameter X is below a threshold Th11, the blurring level is constant at the minimum value L_min.
  • This lower limit on the blurring level prevents a subject that should not be visually recognized from becoming visible because the blurring level is insufficient.
  • When the value of the parameter X exceeds Th11, the blurring level increases together with the value of the parameter X.
  • When the value of the parameter X is at or above the threshold Th12, the blurring level reaches the maximum value L_max.
  • Information defining such a relationship between the parameter X and the blurring level may be defined in advance and stored by the blurring DB 155.
  • FIG. 9B is a graph showing a second example of the relationship between the parameter for determining the blurring level and the blurring level.
  • When the value of the parameter X is lower than the threshold Th21, the blurring level is constant at the minimum value L_min.
  • Otherwise, the blurring level takes the value associated with the sub-range to which the value of the parameter X belongs.
  • The sub-range boundaries are defined by the thresholds Th21, Th22, Th23, and Th24.
  • the blurring DB 155 may store information defining such a subrange boundary and a blurring level for each subrange.
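  • Both level mappings reduce to simple lookups, as sketched below: FIG. 9A as a clamped piecewise-linear ramp between Th11 and Th12, and FIG. 9B as a stepwise table over the sub-ranges bounded by Th21 to Th24. All threshold and level values here are illustrative placeholders for what the blurring DB 155 would store.

```python
import numpy as np

L_MIN, L_MAX = 2.0, 12.0      # the level never falls below L_min

def level_fig_9a(x: float, th11: float = 0.2, th12: float = 0.8) -> float:
    # Constant L_min below Th11, linear ramp up, constant L_max above Th12.
    return float(np.interp(x, [th11, th12], [L_MIN, L_MAX]))

def level_fig_9b(x: float,
                 bounds=(0.2, 0.4, 0.6, 0.8),            # Th21..Th24
                 levels=(L_MIN, 4.0, 7.0, 10.0, L_MAX)) -> float:
    # Each sub-range of the parameter X has its own blurring level.
    return float(levels[int(np.digitize(x, bounds))])
```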
  • the user interface unit 160 is additionally provided in the image processing apparatus 100 in order to provide a user interface for allowing the user to adjust the blurring level.
  • the user interface unit 160 acquires user input via the input interface 106.
  • the user interface unit 160 may display a graphical user interface as illustrated in FIG. 10 on the screen of the display 110, for example. Referring to FIG. 10, a setting window U10 is shown.
  • the setting window U10 includes a slider U11 and a button U12. The user can increase or decrease the degree of blurring by sliding the slider U11 along the slider axis and pressing (or tapping) the button U12.
  • FIG. 11 is a graph showing an example of the blurring level depending on the user setting. Referring to FIG. 11, three graphs G1, G2, and G3 having different maximum values and slopes are shown.
  • Graph G1 defines the relationship between the parameter X and the blurring level that is selected when the user desires a low blurring level.
  • Graph G2 defines the relationship that is selected when the user desires a medium blurring level.
  • Graph G3 defines the relationship that is selected when the user desires a high blurring level.
  • the blurring DB 155 may store information that defines the blurring level that depends on the user setting.
  • Note that the relationship between the parameter for determining the blurring level and the blurring level is not limited to the examples shown in FIG. 9A, FIG. 9B, and FIG. 11.
  • the graph indicating the relationship may draw any trajectory such as a straight line, a curved line, and a broken line.
  • the blurring level may be determined based on a plurality of parameters (for example, recognition likelihood and distance to the subject). For example, a relationship between a single intermediate parameter calculated as a function of a plurality of parameters and the blurring level may be defined in the blurring DB 155. Also, different blurring levels may be used depending on the imaging time zone such as day or night.
  • In the example in which the blurring region 144 is filled, the image processing unit 150 may dynamically change the fill color (for example, depending on information such as the recognition target type). In the example in which the blurring region 144 is blurred by mixing with a monochrome image, the image processing unit 150 may dynamically change the color of the monochrome image (that is, the mixing color). For example, the image processing unit 150 can set the mixing color to red when the blurring level is high, to blue when the blurring level is medium, and to gray when the blurring level is low. This makes it possible to inform the user who views the output image of how strongly blurring has been performed. The mixing color may also be changed dynamically depending on other information, such as the recognition target type.
  • When suppression is performed, the image processing unit 150 may display, on the screen, a sign indicating that the suppression is being performed.
  • For example, the image processing unit 150 can superimpose such a sign on the IR image displayed on the screen.
  • In the example of FIG. 12, the image processing apparatus 100 is a tablet PC.
  • a sign U21 and a sign U22 are displayed on the screen of the image processing apparatus 100.
  • The sign U21 is a text label informing the user that blurring has been performed because a human body was recognized in the IR image.
  • The sign U22 is a frame surrounding the blurred area. By looking at the sign U22, the user can grasp not only that blurring has been performed but also which part of the output image has been blurred.
  • The signs U21 and U22 illustrated in FIG. 12 are useful for letting a user who captures or uses the IR image know that blurring has been applied to it.
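  • Signs like U21 and U22 can be drawn with standard OpenCV primitives, as in this hypothetical sketch; the label text, color, and placement are illustrative choices, not specified by the patent.

```python
import cv2

def draw_suppression_signs(image_bgr, blur_region):
    x, y, w, h = blur_region
    # Sign U22: a frame surrounding the blurred area.
    cv2.rectangle(image_bgr, (x, y), (x + w, y + h), (0, 0, 255), 2)
    # Sign U21: a text label telling the user that blurring was applied.
    cv2.putText(image_bgr, "Human body recognized - blurred", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 0, 255), 2)
    return image_bgr
```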
  • the notification unit 170 can notify that an IR image is captured by means such as light emitted from a light emitting element such as an LED (Light Emitting Diode), or a sound effect or sound output from a speaker.
  • the notification unit 170 may perform notification with a different notification pattern depending on whether or not transmission suppression is performed by the image processing unit 150.
  • FIG. 13 is an explanatory diagram for explaining an example of a technique for notifying a nearby person that an IR image is captured.
  • In the example of FIG. 13, the image processing apparatus 100 is a digital video camera.
  • a light emitting element 172 is disposed on the back surface of the image processing apparatus 100.
  • the notification unit 170 turns on the light emitting element 172 when an IR image is captured.
  • the color of light emitted from the light emitting element 172 is set to, for example, blue when transmission is suppressed by the image processing unit 150, and red when transmission is not suppressed.
  • the person who is the subject can grasp whether or not the transmission is appropriately suppressed by looking at the color of the light from the light emitting element 172.
  • FIG. 14A is a flowchart illustrating a first example of the flow of IR image processing according to the first embodiment. The process shown in FIG. 14A is repeated for each of the one or more IR images to be processed.
  • the IR image acquisition unit 120 acquires an IR image, and outputs the acquired IR image to the determination unit 140 and the image processing unit 150 (step S102).
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S112). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image based on the result of image recognition (step S120).
  • the image processing unit 150 sets a blurring area in the IR image so as to include the estimated position of the subject (step S131). Next, the image processing unit 150 blurs the partial image of the IR image in the set blurring area (step S135). If it is not determined that the subject whose transmission is to be suppressed appears in the IR image, the processes in steps S131 and S135 are skipped.
  • the image processing unit 150 outputs, to the display 110, the communication interface 112, or the storage 114, the IR image updated in step S135 or the IR image that has not been updated because the subject whose transmission should be suppressed is not shown. (Step S138).
  • Then, the IR image processing shown in FIG. 14A ends (step S140).
  • FIG. 14B is a flowchart illustrating a second example of the flow of IR image processing according to the first embodiment. The process shown in FIG. 14B is repeated for each of the one or more IR images to be processed.
  • the IR image acquisition unit 120 acquires an IR image, and outputs the acquired IR image to the determination unit 140 and the image processing unit 150 (step S102).
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S112). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image based on the result of image recognition (step S120).
  • the image processing unit 150 sets a blurring area in the IR image so as to include the estimated position of the subject (step S131).
  • Next, the image processing unit 150 determines the blurring level based on one or more parameters among the infrared irradiation intensity at the time the IR image was captured, the ambient light intensity, the recognition likelihood of the image recognition, the distance to the subject, and the size of the subject (step S133).
  • the image processing unit 150 blurs the partial image of the IR image in the blurring area in accordance with the determined blurring level (step S136). If it is not determined that the subject whose transmission is to be suppressed appears in the IR image, the processes of steps S131, S133, and S136 are skipped.
  • the image processing unit 150 outputs, to the display 110, the communication interface 112, or the storage 114, the IR image updated in step S136 or the IR image that has not been updated because the subject whose transmission should be suppressed is not shown. (Step S138).
  • Then, the IR image processing shown in FIG. 14B ends (step S140).
  • FIG. 14C is a flowchart illustrating a third example of the IR image processing flow according to the first embodiment. The process shown in FIG. 14C is repeated for each of the one or more IR images to be processed.
  • an IR image is acquired by the IR image acquisition unit 120, and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S104).
  • the angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally match) each other.
  • the IR image acquisition unit 120 outputs the acquired IR image to the image processing unit 150.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140.
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the auxiliary image input from the auxiliary image acquisition unit 130 (step S114).
  • the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image based on the result of image recognition (step S120). For example, when a predetermined recognition target is recognized in the auxiliary image, the determination unit 140 can determine that the subject whose transmission should be suppressed appears in the IR image.
  • the image processing unit 150 sets a blurring area in the IR image so as to include the estimated position of the subject (step S131). Next, the image processing unit 150 blurs the partial image of the IR image in the set blurring area (step S135). If it is not determined that the subject whose transmission is to be suppressed appears in the IR image, the processes in steps S131 and S135 are skipped.
  • the image processing unit 150 outputs, to the display 110, the communication interface 112, or the storage 114, the IR image updated in step S135 or the IR image that has not been updated because the subject whose transmission should be suppressed is not shown. (Step S138).
  • Then, the IR image processing shown in FIG. 14C ends (step S140).
  • FIG. 14D is a flowchart illustrating a fourth example of the IR image processing flow according to the first embodiment. The process shown in FIG. 14D is repeated for each of the one or more IR images to be processed.
  • an IR image is acquired by the IR image acquisition unit 120, and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S104).
  • the angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally match) each other.
  • the IR image acquisition unit 120 outputs the acquired IR image to the determination unit 140 and the image processing unit 150.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the image processing unit 150.
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S112). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image based on the result of image recognition (step S120).
  • When it is determined that the subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 150 sets a blurring area in the IR image and sets a corresponding area in the auxiliary image (step S132).
  • the image processing unit 150 determines the blurring level based on the one or more parameters described above (step S133).
  • the blurring level here may correspond to, for example, the mixing ratio described above.
  • the image processing unit 150 blurs the partial image in the blurred area by mixing the auxiliary image with the IR image in the blurred area in accordance with the determined blur level (step S137). If it is not determined that the subject whose transmission is to be suppressed appears in the IR image, the processes of steps S132, S133, and S137 are skipped.
  • the image processing unit 150 outputs the IR image updated in step S137 or the IR image that has not been updated because the subject whose transmission is to be suppressed is not reflected, to the display 110, the communication interface 112, or the storage 114. (Step S138).
  • Then, the IR image processing shown in FIG. 14D ends (step S140).
  • the various processing steps described so far are not limited to the examples shown in the flowchart, and may be combined in any way.
  • the IR image and the auxiliary image may be mixed for blurring as in the fourth example.
  • Further, the image used for recognition may be switched depending on the time zone: for example, the subject may be recognized using a visible light image as the auxiliary image during the daytime, when visible light images are available (as in the third example), and using the IR image during the nighttime (as in the first example).
  • Second Embodiment: In the first embodiment described in the preceding section, a blurring region corresponding to the whole or a part of the IR image is blurred. In contrast, in the second embodiment, in order to prevent inappropriate acts caused by infrared transparency with a simpler implementation, imaging, display, or recording of all or part of the IR image is invalidated in the transmission suppression process.
  • FIG. 15 is a block diagram illustrating an example of a logical function configuration of the image processing apparatus 200 according to the second embodiment.
  • the image processing apparatus 200 includes an IR image acquisition unit 120, an auxiliary image acquisition unit 130, a determination unit 140, a recognition DB 145, an image processing unit 250, a user interface unit 260, and a notification unit 170.
  • As in the first embodiment, the determination unit 140 performs image recognition for recognizing a predetermined recognition target (for example, a human face, body, or part of a body) on the IR image or an auxiliary image, and thereby determines whether a subject whose transmission should be suppressed is shown in the IR image. The determination unit 140 then outputs the determination result to the image processing unit 250.
  • The image processing unit 250 at least partially suppresses the display or recording of the IR image based on the determination result input from the determination unit 140. More specifically, in the present embodiment, when the determination unit 140 determines that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 250 invalidates imaging, display, or recording of all or part of the IR image.
  • For example, the image processing unit 250 may stop the imaging of IR images by sending an invalidation signal to the IR image acquisition unit 120 (or the infrared camera 102).
  • The imaging of IR images can be resumed when a user input is detected by the user interface unit 260, or when the recognition target is no longer recognized in the auxiliary image.
  • Alternatively, when a predetermined recognition target is recognized in the IR image or the auxiliary image, the image processing unit 250 may stop outputting the IR image to the display 110 or stop recording the IR image in the storage 114.
  • Instead of invalidating imaging, display, or recording, the image processing unit 250 may replace the IR image with an auxiliary image (for example, for one or more frames in a series of frames determined to show a subject whose transmission should be suppressed, or for a part of one frame).
  • The auxiliary image here may be a visible light image, an additional IR image (for example, a thermal image with low transparency), a depth map, or a fixed image prepared in advance.
  • FIG. 16 is an explanatory diagram for explaining invalidation of imaging, display, or recording according to the present embodiment.
  • the left side of FIG. 16 shows a series of IR images Im21 to Im24 input to the image processing unit 250 along the time axis.
  • The earliest IR image Im21 does not show a subject whose transmission should be suppressed, but people appear in the subsequent IR images Im22 and Im23; rectangular frames 242a and 242b surrounding the faces recognized in those IR images are shown attached. The subsequent IR image Im24 again shows no subject whose transmission should be suppressed.
  • the image processing unit 250 sequentially receives these IR images from the IR image acquisition unit 120 and receives corresponding determination results from the determination unit 140.
  • The image processing unit 250 outputs the IR image Im21 and the IR image Im24 to an output destination such as the display 110 or the storage 114, but does not output the IR image Im22 and the IR image Im23, in which a subject whose transmission should be suppressed appears.
  • Instead of the IR images that are not output, an auxiliary image such as a visible light image, a thermal image, or a depth map may be output provisionally.
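  • The FIG. 16 gating logic condenses to a few lines, as in this simplified sketch of the second embodiment: per frame, pass the IR image through, substitute the auxiliary image, or drop the frame entirely.

```python
from typing import Optional
import numpy as np

def gate_frame(ir_image: np.ndarray,
               subject_detected: bool,
               aux_image: Optional[np.ndarray] = None) -> Optional[np.ndarray]:
    if not subject_detected:
        return ir_image        # like Im21/Im24: output the IR image as-is
    if aux_image is not None:
        return aux_image       # substitute a visible/thermal/depth frame
    return None                # like Im22/Im23: suppress the output entirely
```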
  • the image processing unit 250 may display a sign indicating that transmission is being suppressed on the screen.
  • the sign may be superimposed on an auxiliary image that is output instead of the IR image.
  • the notification unit 170 may notify a nearby person that an IR image is captured.
  • the notification unit 170 may perform notification with a different notification pattern depending on whether or not transmission suppression is performed by the image processing unit 250.
  • FIG. 17A is a flowchart illustrating a first example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17A is repeated for each of one or more IR images to be processed.
  • the IR image acquisition unit 120 acquires an IR image, and outputs the acquired IR image to the determination unit 140 and the image processing unit 250 (step S202).
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S212). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed is reflected in the IR image based on the image recognition result (step S220).
  • When it is determined that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 250 skips the output of the IR image to the display 110, the communication interface 112, or the storage 114 (step S232). On the other hand, when it is determined that no such subject is reflected in the IR image, the image processing unit 250 outputs the IR image to the appropriate output destination (step S236).
  • Then, the IR image processing shown in FIG. 17A ends (step S240).
  • FIG. 17B is a flowchart illustrating a second example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17B is repeated for each of one or more IR images to be processed.
  • an IR image is acquired by the IR image acquisition unit 120, and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S204).
  • the angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally match) each other.
  • The IR image acquisition unit 120 outputs the acquired IR image to the image processing unit 250.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140.
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the auxiliary image input from the auxiliary image acquisition unit 130 (step S214).
  • the determination unit 140 determines whether a subject whose transmission should be suppressed is reflected in the IR image based on the image recognition result (step S220). For example, when a predetermined recognition target is recognized in the auxiliary image, the determination unit 140 can determine that the subject whose transmission should be suppressed appears in the IR image.
  • When it is determined that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 250 skips the output of the IR image to the display 110, the communication interface 112, or the storage 114 (step S232). On the other hand, when it is determined that no such subject is reflected in the IR image, the image processing unit 250 outputs the IR image to the appropriate output destination (step S236).
  • Then, the IR image processing shown in FIG. 17B ends (step S240).
  • FIG. 17C is a flowchart illustrating a third example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17C is repeated for each of the one or more IR images to be processed.
  • an IR image is acquired by the IR image acquisition unit 120, and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S204).
  • The IR image acquisition unit 120 outputs the acquired IR image to the determination unit 140 and the image processing unit 250. Further, the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the image processing unit 250.
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S212). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed is reflected in the IR image based on the image recognition result (step S220).
  • When it is determined that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 250 replaces the IR image or its partial image with the auxiliary image input from the auxiliary image acquisition unit 130 (step S233). The image processing unit 250 then outputs the IR image after replacement to the display 110, the communication interface 112, or the storage 114 (step S234). On the other hand, if it is determined that no such subject is reflected in the IR image, the image processing unit 250 outputs the IR image input from the IR image acquisition unit 120 as-is to the output destination (step S236).
  • Then, the IR image processing shown in FIG. 17C ends (step S240).
  • In the embodiments described above, image recognition for recognizing a recognition target such as a human face, body, or part of a body is performed on an infrared image, or on an auxiliary image having an angle of view overlapping the infrared image, and when the recognition target is recognized, it is determined that a subject whose transmission should be suppressed is reflected in the infrared image. The mechanism described above can therefore be incorporated into an apparatus or system easily, by utilizing existing image recognition technology such as face recognition or person recognition.
  • In the first embodiment, a region corresponding to the whole or a part of the infrared image is blurred when it is determined that such a subject is reflected.
  • In this case, portions that are not blurred remain clearly visible to the user, and even in the blurred portion the subject can be identified to some extent (depending on the blurring level). A wider range of opportunities to use infrared images can therefore be secured than when the imaging of IR images is prohibited outright.
  • Further, the blurring level used when blurring the area in which the subject appears is determined dynamically. When the use of an infrared image is likely to lead to an inappropriate act, the inappropriate act caused by infrared transparency can therefore be prevented effectively by adaptively raising the blurring level.
  • the infrared image when it is determined that a subject whose transmission is to be suppressed is reflected in the infrared image, imaging, display, or recording of the whole or a part of the infrared image is invalidated.
  • a mechanism for suppressing the transmission of the subject since it is not necessary to implement a process for processing pixel values for blurring, a mechanism for suppressing the transmission of the subject can be realized at a lower cost or with a small processing delay.
  • The series of control processes performed by each device described in this specification may be realized using software, hardware, or a combination of software and hardware.
  • A program constituting such software is stored in advance in a storage medium (non-transitory medium) provided inside or outside each device. Each program is, for example, read into RAM at the time of execution and executed by a processor such as a CPU.
  • The processes described using flowcharts in this specification do not necessarily have to be executed in the order shown in the flowcharts. Some processing steps may be performed in parallel, additional processing steps may be employed, and some processing steps may be omitted.
  • (1) An image processing apparatus including: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed appears in the infrared image; and a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result by the determination unit.
  • (2) The image processing apparatus according to (1), wherein the determination unit determines that the subject appears in the infrared image when a predetermined recognition target is recognized by image recognition of the infrared image or of an auxiliary image having an angle of view overlapping the infrared image.
  • (3) The image processing apparatus according to (2), wherein the predetermined recognition target includes a human face, body, or part of a body.
  • (4) The image processing apparatus according to any one of (1) to (3), wherein the processing unit blurs a blurring area corresponding to the whole or a part of the infrared image when the determination unit determines that the subject appears in the infrared image.
  • (5) The image processing apparatus according to (4), wherein the processing unit smooths or fills the blurring area, or blurs the blurring area by mixing other pixel values into it.
  • (6) The image processing apparatus according to (4) or (5), wherein the processing unit sets the blurring area in the infrared image so that the blurring area includes an estimated position of the subject determined to appear in the infrared image.
  • (7) The image processing apparatus according to any one of (4) to (6), wherein the processing unit blurs the blurring area according to a dynamically determined blurring level.
  • (8) The image processing apparatus according to (7), wherein the processing unit determines the blurring level based on one or more of an infrared irradiation intensity and an ambient light intensity at the time the infrared image is captured.
  • The image processing apparatus according to (7), wherein the determination unit determines that the subject appears in the infrared image when a predetermined recognition target is recognized by image recognition of the infrared image or of an auxiliary image having an angle of view overlapping the infrared image, and the processing unit determines the blurring level based on a recognition likelihood of the image recognition.
  • The image processing apparatus according to (7), wherein the processing unit determines the blurring level based on one or more of a distance to the subject at the time the infrared image is captured and a size of the subject in the infrared image.
  • (13) The image processing apparatus according to (5), wherein the processing unit determines a fill color or a mixing color for the blurring based on a representative value of pixel values belonging to the infrared image.
  • The image processing apparatus according to any one of (1) to (3), wherein the processing unit disables imaging, display, or recording of the whole or a part of the infrared image when the determination unit determines that the subject appears in the infrared image.
  • The image processing apparatus, wherein the auxiliary image includes one or more of a visible light image, a thermal image, and a depth map.
  • The image processing apparatus, wherein the processing unit displays on a screen a sign indicating that the suppression is being performed.
  • The image processing apparatus according to any one of (1) to (17), further including: a camera that captures the infrared image; and a notification unit that notifies persons in the vicinity that the infrared image is being captured by the camera, wherein the notification unit performs the notification with a different notification pattern depending on whether or not the suppression is performed.
  • (19) An image processing method including: acquiring, by a processor of an image processing apparatus, an infrared image; determining whether a subject whose transmission should be suppressed appears in the infrared image; and at least partially suppressing display or recording of the infrared image based on a result of the determination.
  • (20) A program for causing a computer that controls an image processing apparatus to function as: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed appears in the infrared image; and a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result by the determination unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

[Problem] To prevent inappropriate acts arising from the transmissive properties of infrared light, while maintaining opportunities for infrared images to be used appropriately. [Solution] Provided is an image processing device equipped with the following: an infrared image acquisition unit that acquires infrared images; a determination unit that determines whether a subject, for which transmission is to be suppressed, is shown in an infrared image; and a processing unit that, on the basis of the determination results from the determination unit, at least partially prevents the display or recording of the infrared image.

Description

Image processing apparatus, image processing method, and program
 The present disclosure relates to an image processing apparatus, an image processing method, and a program.
 Conventionally, it has been proposed to use images captured by an infrared camera for security and other purposes (see, for example, Patent Document 1). Infrared rays have a variety of uses different from those of visible light, depending on their wavelength. The technique proposed in Patent Document 1 uses infrared rays for night vision and can thereby notify a user of the presence of a suspicious person detected through an infrared image.
 Infrared cameras are used not only in security equipment such as surveillance cameras but also in medical and diagnostic equipment, in-vehicle equipment, and inspection equipment. There are also infrared modules that can be connected to (or built into) general-purpose portable devices such as smartphones and tablet PCs (Personal Computers), making it possible to capture, display, or record infrared images.
Patent Document 1: JP 2013-171476 A
 However, since infrared rays in a certain range of wavelengths pass through materials such as cloth and thin films, the use of infrared images may constitute an inappropriate act such as an invasion of privacy.
 It is therefore desirable to provide a mechanism that prevents inappropriate acts caused by infrared transparency while maintaining opportunities for the appropriate use of infrared images.
 According to the present disclosure, there is provided an image processing apparatus including: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed appears in the infrared image; and a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result by the determination unit.
 According to the present disclosure, there is also provided an image processing method including: acquiring, by a processor of an image processing apparatus, an infrared image; determining whether a subject whose transmission should be suppressed appears in the infrared image; and at least partially suppressing display or recording of the infrared image based on a result of the determination.
 According to the present disclosure, there is further provided a program for causing a computer that controls an image processing apparatus to function as: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed appears in the infrared image; and a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result by the determination unit.
 According to the technology of the present disclosure, it is possible to prevent inappropriate acts resulting from infrared transparency while maintaining opportunities for the appropriate use of infrared images.
 Note that the above effects are not necessarily limiting; together with or in place of them, any of the other effects shown in this specification, or effects that can be grasped from this specification, may also be achieved.
  • An explanatory diagram for describing various wavelength-dependent uses of infrared (IR) images.
  • A flowchart showing an example of a schematic flow of IR image processing according to an embodiment.
  • Explanatory diagrams showing first to fourth examples of scenarios in which IR image processing may be executed.
  • A block diagram showing an example of the hardware configuration of an image processing apparatus according to the first embodiment.
  • A block diagram showing an example of the configuration of the logical functions of the image processing apparatus according to the first embodiment.
  • An explanatory diagram showing an example of setting a blurring area.
  • An explanatory diagram for describing an example of a method of blurring a subject.
  • An explanatory diagram for describing another example of a method of blurring a subject.
  • A graph showing a first example of the relationship between a parameter for determining the blurring level and the blurring level.
  • A graph showing a second example of that relationship.
  • An explanatory diagram showing an example of a user interface for letting a user adjust the blurring level.
  • A graph showing examples of blurring levels depending on user settings.
  • An explanatory diagram showing an example of a sign notifying a user that transmission is being suppressed.
  • An explanatory diagram for describing an example of a method of notifying nearby persons that an IR image is being captured.
  • Flowcharts showing first to fourth examples of the flow of IR image processing according to the first embodiment.
  • A block diagram showing an example of the configuration of the logical functions of an image processing apparatus according to the second embodiment.
  • An explanatory diagram for describing the disabling of imaging, display, or recording according to the second embodiment.
  • Flowcharts showing first to third examples of the flow of IR image processing according to the second embodiment.
 Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, structural elements having substantially the same function and configuration are denoted by the same reference numerals, and redundant description of them is omitted.
 The description will be given in the following order.
 1. Basic principle
  1-1. Various uses of infrared
  1-2. Schematic processing flow
  1-3. Examples of execution scenarios
 2. First embodiment
  2-1. Hardware configuration
  2-2. Functional configuration
  2-3. Processing flow
 3. Second embodiment
  3-1. Functional configuration
  3-2. Processing flow
 4. Summary
 <1. Basic Principle>
  [1-1. Various uses of infrared]
 FIG. 1 is an explanatory diagram for describing various wavelength-dependent uses of infrared (IR) images. The horizontal direction in FIG. 1 corresponds to the wavelength of infrared rays, with wavelength increasing from left to right. Light with a wavelength of 0.7 μm or less is visible light, which human vision senses. Infrared rays with wavelengths in the range of 0.7 μm to 1.0 μm are classified as near-infrared (NIR) and can be used, for example, for night vision, see-through imaging, optical communication, and ranging. Infrared rays with wavelengths in the range of 1.0 μm to 2.5 μm are classified as short-wavelength infrared (SWIR); these are also usable for night vision and see-through imaging. A night vision apparatus using near-infrared or short-wavelength infrared first irradiates its surroundings with infrared rays and generates an IR image by capturing the reflected light. Infrared rays with wavelengths in the range of 2.5 μm to 4.0 μm are classified as medium-wavelength infrared (MWIR). Because substance-specific absorption spectra appear in this wavelength range, medium-wavelength infrared can be used to identify substances; it is also usable for thermography. Infrared rays with wavelengths of 4.0 μm or more are classified as far-infrared (FIR) and can be used for night vision, thermography, and heating. The infrared rays emitted by black-body radiation from an object correspond to far-infrared, so a far-infrared night vision apparatus can generate an IR image by capturing black-body radiation from objects without irradiating any infrared rays itself. Note that the boundary values of the wavelength ranges shown in FIG. 1 are merely examples; various definitions exist for the boundaries between infrared classes, and the advantages of the technology according to the present disclosure described below can be enjoyed under any of them.
 As can be understood from FIG. 1, it is mainly near-infrared and short-wavelength infrared rays that pass through materials such as cloth and thin films. Therefore, when a person appears in an IR image based on these types of infrared rays, the person's clothing may be seen through, and items that should not be visible to others, such as underwear, may be exposed in the IR image. The use of IR images may thus constitute an inappropriate act such as an invasion of privacy or a nuisance. This concern is growing: heightened security awareness has led to many surveillance cameras being installed in public places, automobiles equipped with night vision cameras are commercially available for accident prevention, and devices capable of capturing IR images are used in many other situations. This specification therefore proposes a mechanism that can prevent inappropriate acts caused by infrared transparency while maintaining opportunities for the appropriate use of infrared images.
  [1-2. Schematic processing flow]
 FIG. 2 is a flowchart showing an example of a schematic flow of infrared (IR) image processing according to an embodiment. In the illustrated embodiment, first, an IR image is acquired (step S10). The IR image acquired here is an image generated by a camera that senses infrared rays with transmissive properties, and it may be a still image or one of a series of frames constituting a moving image.
 Next, subject recognition processing is executed (step S11). The subject recognized here is either identical to the subject whose transmission should be suppressed, or defined in advance in association with it. As one example, when the subject whose transmission should be suppressed is a human body, the subject recognized in step S11 may be the human body itself or a part of it, such as a face or hand. As another example, the subject whose transmission should be suppressed may be an object other than a human body (for example, a container whose contents should preferably not be visible). The subject recognition processing may take the IR image as input, or another type of image whose angle of view can be calibrated with reference to the IR image.
 Next, it is determined whether a subject whose transmission should be suppressed appears in the IR image (step S12). When it is determined that such a subject appears, transmission suppression processing is executed to at least partially suppress the display or recording of the IR image (step S13). In this way, the technology according to the present disclosure basically permits the display and recording of IR images, and suppresses those operations only when they are determined to be inappropriate.
 In the first embodiment described later, the transmission suppression processing blurs a region corresponding to the whole or a part of the IR image. In the second embodiment, the transmission suppression processing disables imaging, display, or recording of the whole or a part of the IR image. These embodiments and related variations are described concretely from the next section onward.
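Although the disclosure describes the flow only at the level of the FIG. 2 flowchart, the overall pipeline might be sketched as follows in Python, where detect_subjects and suppress are hypothetical stand-ins for the recognition stage (steps S11 and S12) and the transmission suppression stage (step S13).

```python
import numpy as np

def ir_pipeline(ir_image: np.ndarray, detect_subjects, suppress) -> np.ndarray:
    """Schematic flow of FIG. 2 (steps S10 to S13)."""
    # Step S10: the IR image has been acquired (from a camera, storage, or network).
    # Step S11: subject recognition, on the IR image itself or on an auxiliary
    # image whose angle of view is calibrated against the IR image.
    subjects = detect_subjects(ir_image)
    # Step S12: does a subject whose transmission should be suppressed appear?
    if subjects:
        # Step S13: transmission suppression (blurring in the first embodiment,
        # disabled output in the second embodiment).
        return suppress(ir_image, subjects)
    # Otherwise the IR image is displayed or recorded unmodified.
    return ir_image
```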
  [1-3. Examples of execution scenarios]
 The IR image processing described above can be executed on any type of device that captures IR images or processes captured IR images. To name only a few examples, devices that capture IR images may include digital video cameras, digital still cameras, television broadcasting cameras, surveillance cameras, intercoms with monitors, in-vehicle cameras, smartphones, PCs (Personal Computers), HMD (Head Mounted Display) terminals, game terminals, medical devices, diagnostic devices, and inspection devices. Devices that process IR images may include, in addition to the imaging devices above, television receivers, content players, content recorders, and authoring devices. The image processing apparatus referred to in the following sections may be a module mounted on, or connected to, one of the devices exemplified here.
 FIGS. 3A to 3D are explanatory diagrams showing examples of scenarios in which IR image processing may be executed. In FIG. 3A, an IR image is input to the IR image processing from the infrared camera that captured it, and the transmission-suppressed IR image is then output to and displayed on a display. In FIG. 3B, an IR image is input from an infrared camera, and the transmission-suppressed IR image is output to and recorded in storage. In FIG. 3C, an IR image is read into the IR image processing from storage holding captured IR images, and the transmission-suppressed IR image is output to and displayed on a display. In FIG. 3D, an IR image is read from storage, and the transmission-suppressed IR image is recorded again in storage (that is, the image data is updated or converted). The embodiments described in the following sections are applicable to any of these execution scenarios.
 <2. First Embodiment>
  [2-1. Hardware configuration]
 FIG. 4 is a block diagram showing an example of the hardware configuration of the image processing apparatus 100 according to the first embodiment. Referring to FIG. 4, the image processing apparatus 100 includes an infrared camera 102, a sub-camera 104, an input interface 106, a memory 108, a display 110, a communication interface 112, a storage 114, a bus 116, and a processor 118.
   (1) Infrared camera
 The infrared camera 102 is an imaging module that captures infrared (IR) images. It has an array of imaging elements that sense infrared rays classified mainly as near-infrared or short-wavelength infrared, and a light-emitting element that irradiates the vicinity of the apparatus with infrared rays. The infrared camera 102 emits infrared rays from the light-emitting element, for example in response to a trigger such as a user input or periodically, and generates an IR image by capturing the infrared rays reflected by a subject or its background. A series of IR images generated by the infrared camera 102 may constitute a video.
   (2) Sub-camera
 The sub-camera 104 captures auxiliary images used to assist subject recognition or transmission suppression in the IR image processing. An auxiliary image captured by the sub-camera 104 may be, for example, one or more of a visible light image, an additional IR image (for example, a thermal image, that is, an MWIR or FIR image), and a depth map.
   (3) Input interface
 The input interface 106 is used by a user to operate the image processing apparatus 100 or to input information into it. The input interface 106 may include input devices such as a touch sensor, keyboard, keypad, buttons, or switches. It may also include a microphone and a speech recognition module for voice input, as well as a remote control module that receives user-selected commands from a remote device.
   (4) Memory
 The memory 108 is a storage medium that may include RAM (Random Access Memory) and ROM (Read Only Memory). The memory 108 is coupled to the processor 118 and stores programs and data for the processing executed by the processor 118.
   (5) Display
 The display 110 is a display module having a screen on which images are displayed. The display 110 may be, for example, an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, or a CRT (Cathode Ray Tube).
   (6) Communication interface
 The communication interface 112 is a module that mediates communication between the image processing apparatus 100 and other apparatuses. The communication interface 112 establishes a communication connection according to an arbitrary wireless or wired communication protocol.
   (7) Storage
 The storage 114 is a storage device that accumulates image data, which may include IR images and auxiliary images, and stores databases used in the IR image processing. The storage 114 contains a storage medium such as semiconductor memory or a hard disk. Note that the programs and data described in this specification may also be acquired from data sources external to the image processing apparatus 100 (for example, a data server, network storage, or external memory).
   (8) Bus
 The bus 116 interconnects the infrared camera 102, the sub-camera 104, the input interface 106, the memory 108, the display 110, the communication interface 112, the storage 114, and the processor 118.
   (9) Processor
 The processor 118 is a processing module such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor). The processor 118 operates the functions for suppressing undesired transparency in IR images by executing programs stored in the memory 108 or another storage medium.
  [2-2. Functional configuration]
 FIG. 5 is a block diagram showing an example of the configuration of logical functions realized by the components of the image processing apparatus 100 shown in FIG. 4 operating in cooperation. Referring to FIG. 5, the image processing apparatus 100 includes an IR image acquisition unit 120, an auxiliary image acquisition unit 130, a determination unit 140, a recognition DB 145, an image processing unit 150, a blurring DB 155, a user interface unit 160, and a notification unit 170.
   (1) IR image acquisition unit
 The IR image acquisition unit 120 acquires an infrared (IR) image and outputs it to the determination unit 140 and the image processing unit 150. The IR image acquisition unit 120 may acquire an IR image captured by the infrared camera 102, an IR image stored in the storage 114, or an IR image received from another apparatus via the communication interface 112. The acquired IR image may be one that has undergone preliminary processing such as signal amplification and noise removal. The IR image acquisition unit 120 may also decode IR images from a compression-encoded stream.
   (2) Auxiliary image acquisition unit
 The auxiliary image acquisition unit 130 acquires an auxiliary image, which may be a visible light image, an additional IR image, or a depth map. The angle of view of the auxiliary image is assumed to be calibrated so as to overlap (ideally, to match) the angle of view of the IR image acquired by the IR image acquisition unit 120. When the auxiliary image is used for subject recognition in the IR image processing, the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140; when the auxiliary image is used for transmission suppression, it outputs the image to the image processing unit 150. The auxiliary image acquisition unit 130 may acquire an auxiliary image captured by the sub-camera 104, stored in the storage 114, or received via the communication interface 112. When no auxiliary image is used for either purpose, the auxiliary image acquisition unit 130 may be omitted from the configuration of the image processing apparatus 100.
   (3) Determination unit
 The determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image. More specifically, the determination unit 140 executes image recognition for recognizing a predetermined recognition target, either on the IR image or on an auxiliary image having an angle of view overlapping the IR image. When the predetermined recognition target is recognized as a result of the image recognition, the determination unit 140 determines that a subject whose transmission should be suppressed appears in the IR image. In this embodiment, the predetermined recognition target may be a human face, body, or part of a body. The recognition DB 145 stores the data referred to in the image recognition executed by the determination unit 140.
 For example, the recognition DB 145 stores in advance image feature quantities of the recognition target, acquired from a large number of known IR images through a prior learning process. The determination unit 140 matches image feature quantities extracted from the IR image input from the IR image acquisition unit 120 against the feature quantities stored in the recognition DB 145, and can determine from the matching result whether the recognition target appears in the IR image. The determination unit 140 may execute this image recognition according to an existing algorithm such as boosting or a support vector machine.
 When the determination unit 140 uses an auxiliary image instead of the IR image for image recognition, the prior learning process is likewise executed based on known images of the same type as the auxiliary image. On the premise that the angle of view of the auxiliary image overlaps that of the IR image captured at the same time, the detection of the predetermined recognition target in the auxiliary image allows it to be estimated that a subject whose transmission should be suppressed appears in the IR image. When the image processing apparatus 100 is used in an environment where a certain level of ambient illuminance is ensured, a visible light image can be used as the auxiliary image; in that case, highly accurate face recognition or person recognition techniques for visible light images can be leveraged to determine whether a subject whose transmission should be suppressed appears in the IR image.
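As a concrete illustration of recognition on a visible light auxiliary image, the sketch below applies OpenCV's bundled Haar cascade face detector; this detector is only a stand-in for whatever boosted classifier or support vector machine the recognition DB 145 would actually back.

```python
import cv2

def subject_appears(aux_visible_bgr):
    """Detect faces in the visible light auxiliary image; a non-empty result
    is taken to mean that a subject whose transmission should be suppressed
    appears in the IR image captured with the overlapping angle of view."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(aux_visible_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [tuple(int(v) for v in f) for f in faces]  # (x, y, w, h) boxes
```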
 When the determination unit 140 determines that a subject whose transmission should be suppressed appears in the IR image, it outputs one or more of the following pieces of information to the image processing unit 150. Note that a plurality of recognition targets may be recognized in a single IR image or auxiliary image.
  a) Recognition target type: a code identifying the type of the recognized target. Various possible types, such as face/human body, face/torso/arm/leg, face/upper body/lower body, or face/torso (upper body)/torso (lower body)/arm/leg, may be defined in advance.
  b) Recognition position, orientation, and size: the position, orientation, and size of the recognized target in the IR image (or the auxiliary image). Additionally or alternatively, information indicating its shape may be output.
  c) Recognition likelihood: an index indicating the plausibility of the image recognition result; the larger the value, the more likely the recognition result is correct. Also referred to as confidence.
 Note that the image recognition executed by the determination unit 140 may be shared with image recognition performed for other purposes, such as pedestrian recognition executed on images from an in-vehicle camera for driving assistance. A display object indicating the result of such image recognition (for example, a frame surrounding a pedestrian) may also be superimposed on the IR image.
   (4) Image processing unit
 The image processing unit 150 at least partially suppresses the display or recording of the IR image based on the determination result input from the determination unit 140. More specifically, in this embodiment, when the determination unit 140 determines that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 150 sets a blurring area in the IR image so that it includes the estimated position of that subject, and blurs the blurring area according to that setting. The blurring DB 155 stores the data referred to in the image processing executed by the image processing unit 150.
 In a simple approach, the blurring area may correspond to the entire IR image; the image processing unit 150 can then set the blurring area without knowing the specific position of the subject. In another approach, the blurring area may correspond to a part of the IR image; the image processing unit 150 then sets the blurring area based on the recognition position, orientation, and size of the recognition target input from the determination unit 140. For example, when a human face is recognized as the recognition target, the blurring area can be set below the recognized position of the face, with a size depending on the size of the face.
 FIG. 6 is an explanatory diagram showing an example of setting a blurring area. The left of FIG. 6 shows an IR image Im11, in which a rectangular frame 142 indicates that a recognition target corresponding to a human face has been recognized. From this determination result, the image processing unit 150 sets a blurring area 144 below the position of the recognized face. The blurring area 144 is set so as to include the estimated position of the human body, the subject whose transmission should be suppressed (see the center of FIG. 6). Such a positional relationship between the subject and the recognition target can be defined in advance (for example, based on knowledge of the standard shape of the human body) and stored in the blurring DB 155 for each recognition target type. The right of FIG. 6 shows the processed IR image Im12, in which the partial image in the blurring area 144 has been blurred.
 The configuration is not limited to the example of FIG. 6: a plurality of blurring areas may be set for one recognition target, and a blurring area need not be rectangular. For example, the image processing unit 150 may change the size, orientation, or shape of the blurring area 144 according to the size, orientation, or shape of the recognition target. Also, when a human face is recognized, the image processing unit 150 may set as the blurring area a region whose pixel values differ from a pixel value of the face region (for example, the average or median over the face region) by less than a threshold. In that way, a blurring area can be set over the parts of the IR image corresponding to human skin (assumed to have gradations close to those of the face) without defining the positional relationship between the subject and the recognition target in advance.
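The geometric rule illustrated in FIG. 6 (a blurring area placed below the recognized face, sized relative to the face) might be sketched as follows; the width and height multipliers are illustrative assumptions, since the disclosure does not specify exact proportions.

```python
def blurring_area_below_face(face, image_shape,
                             width_scale=3.0, height_scale=7.0):
    """Return a (left, top, right, bottom) rectangle below a recognized face
    box (x, y, w, h), assumed to contain the body, clipped to the image.
    The 3x / 7x proportions are illustrative assumptions."""
    x, y, w, h = face
    img_h, img_w = image_shape[:2]
    area_w = int(w * width_scale)
    area_h = int(h * height_scale)
    left = max(0, x + w // 2 - area_w // 2)
    top = min(img_h, y + h)            # start just below the face
    right = min(img_w, left + area_w)
    bottom = min(img_h, top + area_h)
    return left, top, right, bottom
```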
 As one example, the image processing unit 150 may blur the blurring area 144 by smoothing its pixel values. The smoothing here can be performed by applying a smoothing filter, typified by a Gaussian filter, to each pixel belonging to the blurring area 144; the blurring level then depends on the scale of the filter (for example, the variance σ). As a variant of smoothing, the image processing unit 150 may, as shown in FIG. 7, divide the blurring area 144 into a plurality of sub-areas and average the pixel values within each sub-area. In the example of FIG. 7, the blurring area 144 is divided into (5 × 7 =) 35 sub-areas, and the blurring level depends on the sub-area size Rsub. As another example, the image processing unit 150 may blur the blurring area 144 by filling it with a specific pixel value.
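A minimal sketch of these three blurring methods (Gaussian smoothing, per-sub-area averaging as in FIG. 7, and filling), assuming a grayscale IR image stored as a NumPy array and an axis-aligned rectangular blurring area:

```python
import cv2

def blur_region(ir, rect, method="gauss", sigma=5.0, sub=8, fill_value=128):
    """Blur the (left, top, right, bottom) rectangle of a grayscale IR image
    in place, using one of the three methods described above."""
    left, top, right, bottom = rect
    roi = ir[top:bottom, left:right]
    if method == "gauss":
        # Smoothing filter (Gaussian); the blurring level depends on sigma.
        roi[:] = cv2.GaussianBlur(roi, (0, 0), sigma)
    elif method == "average":
        # FIG. 7 variant: average per sub-area, implemented as downscale +
        # upscale; the blurring level depends on the sub-area size `sub`.
        h, w = roi.shape[:2]
        small = cv2.resize(roi, (max(1, w // sub), max(1, h // sub)),
                           interpolation=cv2.INTER_AREA)
        roi[:] = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)
    elif method == "fill":
        # Fill the area with a specific pixel value.
        roi[:] = fill_value
    return ir
```

For example, blur_region(ir, area, method="average", sub=12) would reproduce the FIG. 7 style averaging with larger sub-areas, corresponding to a higher blurring level.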
 As yet another example, the image processing unit 150 may, as shown in FIG. 8, blur the blurring area 144 by mixing into it the pixel values of the portion of an auxiliary image corresponding to the blurring area 144. The left of FIG. 8 shows the partial auxiliary image corresponding to the blurring area 144 and the partial IR image of the blurring area 144. The image processing unit 150 can generate the blurred partial image of the blurring area 144 by mixing these two partial images with a mixing ratio α_mix according to the following equation (see the right of FIG. 8).
    IR_blurred(x, y) = (1 - α_mix) · IR(x, y) + α_mix · SI(x, y)    (1)
 In equation (1), IR(x, y) and IR_blurred(x, y) represent the pixel values of the IR image at pixel position (x, y) before and after processing, respectively, and SI(x, y) represents the pixel value of the auxiliary image at pixel position (x, y). A plurality of auxiliary images may be mixed into the IR image using a plurality of mixing ratios; when two mixing ratios α_mix and β_mix are used, for example, the following equation (2) may be used instead of equation (1).
    IR_blurred(x, y) = (1 - α_mix - β_mix) · IR(x, y) + α_mix · SI1(x, y) + β_mix · SI2(x, y)    (2)
 In equation (2), SI1(x, y) and SI2(x, y) represent the pixel values of the first and second auxiliary images at pixel position (x, y), respectively. In these mixing examples, the blurring level depends on the mixing ratio. As the auxiliary image for mixing, a fixed image stored in advance in the blurring DB 155 (for example, a CG (Computer Graphics) image or a single-color image) may be used instead of a dynamically acquired image such as a visible light image, an additional IR image, or a depth map. Alternatively, instead of an auxiliary image, mixing may be performed with a dynamically determined pixel value P_mix, as in the following equation.
    IR_blurred(x, y) = (1 - α_mix) · IR(x, y) + α_mix · P_mix    (3)
 The pixel value P_mix may be a representative value of the pixel values belonging to the IR image, for example the average or median of the pixel values of the entire IR image, of the interior of the blurring area, or of the pixels on the boundary of the blurring area. Using such a representative value yields a mixed blurring area with little unnatural change in color or gradation. A pixel value P_mix selected in this way may also be used as a fill color for blurring rather than as a mixing color. The pixel value P_mix may also be set by the user via the user interface unit 160.
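Equations (1) and (3) amount to per-pixel alpha blending. The following NumPy sketch implements both, taking P_mix, as one of the options above, to be the median over the blurring area:

```python
import numpy as np

def mix_with_auxiliary(ir_roi, si_roi, alpha):
    """Equation (1): IR_blurred = (1 - alpha) * IR + alpha * SI."""
    out = (1.0 - alpha) * ir_roi.astype(np.float32) \
        + alpha * si_roi.astype(np.float32)
    return np.clip(out, 0, 255).astype(ir_roi.dtype)

def mix_with_representative(ir_roi, alpha):
    """Equation (3), with P_mix chosen here as the median over the area."""
    p_mix = float(np.median(ir_roi))
    out = (1.0 - alpha) * ir_roi.astype(np.float32) + alpha * p_mix
    return np.clip(out, 0, 255).astype(ir_roi.dtype)
```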
 The image processing unit 150 may dynamically determine a blurring level according to one or more conditions and blur the blurring area according to the determined level. In one approach, the image processing unit 150 may determine the blurring level based on one or more of the infrared irradiation intensity and the ambient light intensity at the time the IR image was captured. For example, the stronger the infrared rays irradiated toward the surroundings when the IR image is captured, the sharper the texture of the IR image becomes and the more clearly a see-through subject appears in it. The image processing unit 150 may therefore acquire from the infrared camera 102 information indicating the infrared irradiation intensity at capture time and determine the blurring level so that it becomes higher as the irradiation intensity becomes stronger. The blurring level may also be determined based on the infrared irradiation intensity relative to the ambient light intensity; the ambient light intensity can be measured by an illuminance sensor not shown in FIG. 4.
 In another approach, the image processing unit 150 may determine the blurring level based on the recognition likelihood of the image recognition, input from the determination unit 140. For example, when it is highly probable that an object recognized in the IR image is a human torso, a high recognition likelihood (or confidence) associated with the recognition target type "torso" is input from the determination unit 140 as the determination result. In this case, the image processing unit 150 can set the blurring level of the torso region relatively high.
 In yet another approach, the image processing unit 150 may determine the blurring level based on one or more of the distance from the camera to the subject at the time the IR image was captured and the size of the subject in the IR image. The distance from the camera to the subject can be measured, for example, by an infrared-based depth sensor (for example, by a ranging method based on the round-trip time of near-infrared rays reflected by the subject, or on the distortion of a dot pattern projected onto the subject). The size of the subject can be obtained through the image recognition executed by the determination unit 140. When the distance from the camera to the subject is smaller, or the size of the subject is larger, the see-through subject will appear more clearly in the IR image, so the image processing unit 150 can determine the blurring level so that it becomes higher in these cases.
 When blurring is realized by applying a Gaussian filter to the blurring area, the blurring level corresponds to the scale of the Gaussian filter. When blurring is realized by averaging pixel values within each sub-area, the blurring level corresponds to the size of the sub-areas. When blurring is realized by mixing with an auxiliary image, the blurring level corresponds to the mixing ratio. Regardless of the values of the parameters used to determine the blurring level (for example, the infrared irradiation intensity, the ambient light intensity, the recognition likelihood, the distance to the subject, the size of the subject, or any combination of these), the image processing unit 150 may limit the blurring level so that it does not fall below a predetermined level.
 図9Aは、不鮮明化レベルの決定のためのパラメータと不鮮明化レベルとの間の関係の第1の例を示すグラフである。グラフの横軸は、不鮮明化レベルの決定のためのパラメータXを表し、パラメータXは、例えば赤外線の照射強度、認識尤度又は被写体のサイズなどに相当し得る。グラフの縦軸は、不鮮明化レベルを表し、例えばフィルタのスケール、平均化のためのサブ領域のサイズ又はミキシング比率に相当し得る。図9Aから理解されるように、パラメータXの値が閾値Th11を下回る場合、パラメータXの値に関わらず、不鮮明化レベルは最小値Lminで一定である。このような不鮮明化レベルの制限によって、視認されることが望ましくない被写体が不鮮明化レベルの不足に起因して視認されてしまうことを防止することができる。パラメータXの値が閾値Th11を上回る場合、不鮮明化レベルはパラメータXの値と共に増加する。パラメータXの値が閾値Th12以上になると、不鮮明化レベルは最大値Lmaxに達する。このようなパラメータXと不鮮明化レベルとの間の関係を定義する情報(例えば、閾値及び対応する不鮮明化レベルの値など)は、予め定義され、不鮮明化DB155により記憶され得る。 FIG. 9A is a graph illustrating a first example of the relationship between the parameter for determining the blurring level and the blurring level. The horizontal axis of the graph represents the parameter X for determining the blurring level, and the parameter X may correspond to, for example, infrared irradiation intensity, recognition likelihood, or subject size. The vertical axis of the graph represents the level of smearing and may correspond to, for example, the scale of the filter, the size of the sub-region for averaging or the mixing ratio. As understood from FIG. 9A, when the value of the parameter X is lower than the threshold value Th11, the blurring level is constant at the minimum value L min regardless of the value of the parameter X. Due to such limitation of the blurring level, it is possible to prevent a subject that is not desired to be visually recognized from being visually recognized due to a lack of the blurring level. When the value of the parameter X exceeds the threshold value Th11, the blurring level increases with the value of the parameter X. When the value of the parameter X is the threshold value Th12 above, blurring level reaches a maximum value L max. Information defining such a relationship between the parameter X and the blurring level (for example, a threshold value and a corresponding blurring level value, etc.) may be defined in advance and stored by the blurring DB 155.
FIG. 9B is a graph showing a second example of the relationship between a parameter for determining the blurring level and the blurring level. In the example of FIG. 9B as well, when the value of the parameter X is below the threshold Th21, the blurring level is constant at the minimum value Lmin regardless of the value of the parameter X. When the value of the parameter X exceeds the threshold Th21, the blurring level takes the value associated with the subrange to which the value of the parameter X belongs. The subrange boundaries are defined by the thresholds Th21, Th22, Th23 and Th24. The blurring DB 155 may store information defining these subrange boundaries and the blurring level for each subrange.
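The two relationships of FIGS. 9A and 9B could be read out of the blurring DB 155 as simple lookup data; a minimal sketch follows. All threshold and level values here are placeholders standing in for Th11/Th12 and Th21 to Th24.

```python
def level_linear(x, th_low=0.3, th_high=0.8, l_min=0.2, l_max=1.0):
    """FIG. 9A style: constant l_min below th_low (Th11), linear growth,
    and saturation at l_max from th_high (Th12) upward."""
    if x <= th_low:
        return l_min
    if x >= th_high:
        return l_max
    return l_min + (l_max - l_min) * (x - th_low) / (th_high - th_low)

def level_stepped(x):
    """FIG. 9B style: l_min below the first threshold (Th21), then one
    fixed level per subrange bounded by Th21..Th24 (placeholder values)."""
    thresholds = [0.3, 0.5, 0.7, 0.9]   # Th21, Th22, Th23, Th24
    levels = [0.2, 0.4, 0.6, 0.8, 1.0]  # level for each subrange
    for threshold, level in zip(thresholds, levels):
        if x < threshold:
            return level
    return levels[-1]
```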
The user interface unit 160 is additionally provided in the image processing apparatus 100 in order to provide a user interface that allows the user to adjust the blurring level. The user interface unit 160 acquires user input via the input interface 106. For example, the user interface unit 160 may display a graphical user interface such as the one illustrated in FIG. 10 on the screen of the display 110. Referring to FIG. 10, a setting window U10 is shown. The setting window U10 includes a slider U11 and a button U12. The user can increase or decrease the degree of blurring by sliding the slider U11 along its axis and pressing (or tapping) the button U12.
FIG. 11 is a graph showing examples of blurring levels that depend on the user setting. Referring to FIG. 11, three curves G1, G2 and G3 having different maximum values and slopes are shown. The curve G1 defines the relationship between the parameter X and the blurring level that is selected when the user desires a low blurring level. The curve G2 defines the relationship selected when the user desires a medium blurring level. The curve G3 defines the relationship selected when the user desires a high blurring level. In the example of FIG. 11, when the value of the parameter X is below a certain threshold, the blurring level is constant at the minimum value Lmin regardless of the user setting and the value of the parameter X; that is, the IR image is blurred at least to some extent. This restriction guarantees that the goal of preventing the visibility of subjects that should not be seen is achieved, while still allowing the blurring level to be changed according to the user's needs. The blurring DB 155 may store information defining such user-setting-dependent blurring levels.
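One way to realize the three user-selectable curves G1 to G3 of FIG. 11 while still enforcing the shared floor Lmin is a small per-setting table; the maximum values, slopes and threshold below are invented for illustration.

```python
# (maximum level, slope) per user setting; G1 = "low", G2 = "medium", G3 = "high"
USER_CURVES = {"low": (0.5, 0.6), "medium": (0.75, 1.0), "high": (1.0, 1.5)}
L_MIN, X_THRESHOLD = 0.2, 0.3  # shared floor and threshold (placeholder values)

def level_for_user(x: float, setting: str) -> float:
    """Blurring level for parameter value x under the selected user setting."""
    l_max, slope = USER_CURVES[setting]
    if x <= X_THRESHOLD:
        return L_MIN  # the floor applies regardless of the user setting
    return min(l_max, L_MIN + slope * (x - X_THRESHOLD))
```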
Note that the relationship between a parameter for determining the blurring level and the blurring level is not limited to the examples shown in FIGS. 9A, 9B and 11. A graph expressing the relationship may follow any trajectory, such as a straight line, a curve or a polyline. The blurring level may also be determined based on a plurality of parameters (for example, the recognition likelihood and the distance to the subject). For instance, a relationship between the blurring level and a single intermediate parameter computed as a function of a plurality of parameters may be defined in the blurring DB 155. Different blurring levels may also be used depending on the image capture time period, such as day or night.
In an example in which the blurring region 144 is blurred by filling it with a specific pixel value, the image processing unit 150 may dynamically change the fill color used for blurring (for example, depending on information such as the type of recognition target). Likewise, in an example in which the blurring region 144 is blurred by mixing with a monochromatic image, the image processing unit 150 may dynamically change the color of the monochromatic image (that is, the mixing color). For example, the image processing unit 150 may set the mixing color to red when the blurring level is high, to blue when the blurring level is medium, and to gray when the blurring level is low. This lets a user viewing the output image know how much blurring has been applied. The image processing unit 150 may also change the mixing color dynamically depending on other information, such as the type of recognition target.
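The level-dependent mixing color described above could be as simple as a threshold lookup; the colors are given in BGR order as OpenCV expects, and the level thresholds are illustrative.

```python
def mixing_color(level: float) -> tuple:
    """Red for high, blue for medium, gray for low blurring levels (BGR)."""
    if level >= 0.7:
        return (0, 0, 255)      # red: heavy blurring applied
    if level >= 0.4:
        return (255, 0, 0)      # blue: moderate blurring applied
    return (128, 128, 128)      # gray: light blurring applied
```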
When at least partially suppressing the display or recording of an IR image, the image processing unit 150 may display an indication on the screen showing that the suppression is being performed. For example, as shown in FIG. 12, the image processing unit 150 can display such an indication on the display screen by superimposing it on the IR image. In the example of FIG. 12, the image processing apparatus 100 is a tablet PC. On its screen, an indication U21 and an indication U22 are displayed. The indication U21 is a text label informing the user that blurring was performed as a result of a human body being recognized in the IR image. The indication U22 is a frame surrounding the blurring region. By looking at the indication U22, the user can grasp not only that blurring was performed but also which part of the output image was blurred.
The indications U21 and U22 illustrated in FIG. 12 are useful for the user who captures or uses the IR image to learn that blurring has been applied to it. On the other hand, it is also important for the person at whom the infrared camera is pointed, that is, the subject, to know whether blurring is being performed appropriately, because if it is not, that person may refuse to be imaged. For this purpose, a notification unit 170 is additionally provided in the image processing apparatus 100. When an IR image is captured by the infrared camera 102, the notification unit 170 notifies nearby persons that an IR image is being captured. The notification unit 170 may give this notification by means such as light emitted from a light-emitting element such as an LED (Light Emitting Diode), or a sound effect or voice output from a speaker. The notification unit 170 may use different notification patterns depending on whether transmission suppression is being performed by the image processing unit 150.
FIG. 13 is an explanatory diagram illustrating an example of a technique for notifying nearby persons that an IR image is being captured. In the example of FIG. 13, the image processing apparatus 100 is a digital video camera. A light-emitting element 172 is arranged on the rear surface of the image processing apparatus 100. The notification unit 170 turns on the light-emitting element 172 when an IR image is being captured. The color of the light emitted from the light-emitting element 172 is set, for example, to blue when transmission suppression is being performed by the image processing unit 150 and to red when it is not. By looking at the color of the light from the light-emitting element 172, the person serving as the subject can tell whether transmission suppression is being performed appropriately.
[2-3. Process flow]
This section describes several examples of the flow of IR image processing according to the first embodiment described so far.
(1) First Example
FIG. 14A is a flowchart showing a first example of the flow of IR image processing according to the first embodiment. The process shown in FIG. 14A is repeated for each of one or more IR images to be processed.
First, the IR image acquisition unit 120 acquires an IR image and outputs the acquired IR image to the determination unit 140 and the image processing unit 150 (step S102).
Next, the determination unit 140 executes image recognition for recognizing a predetermined recognition target on the IR image input from the IR image acquisition unit 120 (step S112). Based on the result of the image recognition, the determination unit 140 then determines whether a subject whose transmission should be suppressed appears in the IR image (step S120).
When it is determined that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 150 sets a blurring region in the IR image so that it includes the estimated position of the subject (step S131). The image processing unit 150 then blurs the partial image of the IR image in the set blurring region (step S135). When it is not determined that such a subject appears in the IR image, steps S131 and S135 are skipped.
Next, the image processing unit 150 outputs to the display 110, the communication interface 112 or the storage 114 either the IR image updated in step S135 or the IR image that was not updated because no subject whose transmission should be suppressed appears in it (step S138).
Thereafter, if there is a next IR image to be processed, the process returns to step S102 (step S140). If there is no next IR image, the IR image processing shown in FIG. 14A ends.
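Putting the steps together, the per-frame loop of FIG. 14A might be sketched as follows. The callables `recognize`, `estimate_region`, `blur_region` and `output` stand in for the determination unit 140, the image processing unit 150 and the output destinations; they are assumptions for illustration, not the disclosed implementation.

```python
def process_stream(frames, recognize, estimate_region, blur_region, output):
    """Per-frame loop of FIG. 14A; step numbers refer to the flowchart."""
    for ir_image in frames:                           # S102: acquire the next IR image
        detected, result = recognize(ir_image)        # S112: image recognition on the IR image
        if detected:                                  # S120: subject whose transmission must be suppressed?
            region = estimate_region(result)          # S131: set the blurring region
            ir_image = blur_region(ir_image, region)  # S135: blur the partial image
        output(ir_image)                              # S138: display, transmit or record
```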
(2) Second Example
FIG. 14B is a flowchart showing a second example of the flow of IR image processing according to the first embodiment. The process shown in FIG. 14B is repeated for each of one or more IR images to be processed.
First, the IR image acquisition unit 120 acquires an IR image and outputs the acquired IR image to the determination unit 140 and the image processing unit 150 (step S102).
Next, the determination unit 140 executes image recognition for recognizing a predetermined recognition target on the IR image input from the IR image acquisition unit 120 (step S112). Based on the result of the image recognition, the determination unit 140 then determines whether a subject whose transmission should be suppressed appears in the IR image (step S120).
When it is determined that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 150 sets a blurring region in the IR image so that it includes the estimated position of the subject (step S131). Next, the image processing unit 150 determines the blurring level based on one or more parameters among the infrared irradiation intensity at the time the IR image was captured, the ambient light intensity, the recognition likelihood of the image recognition, the distance to the subject, and the size of the subject (step S133). The image processing unit 150 then blurs the partial image of the IR image in the blurring region in accordance with the determined blurring level (step S136). When it is not determined that such a subject appears in the IR image, steps S131, S133 and S136 are skipped.
Next, the image processing unit 150 outputs to the display 110, the communication interface 112 or the storage 114 either the IR image updated in step S136 or the IR image that was not updated because no subject whose transmission should be suppressed appears in it (step S138).
Thereafter, if there is a next IR image to be processed, the process returns to step S102 (step S140). If there is no next IR image, the IR image processing shown in FIG. 14B ends.
(3) Third Example
FIG. 14C is a flowchart showing a third example of the flow of IR image processing according to the first embodiment. The process shown in FIG. 14C is repeated for each of one or more IR images to be processed.
First, an IR image is acquired by the IR image acquisition unit 120 and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S104). The angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally, coincide with) each other. The IR image acquisition unit 120 outputs the acquired IR image to the image processing unit 150, and the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140.
Next, the determination unit 140 executes image recognition for recognizing a predetermined recognition target on the auxiliary image input from the auxiliary image acquisition unit 130 (step S114). Based on the result of the image recognition, the determination unit 140 then determines whether a subject whose transmission should be suppressed appears in the IR image (step S120). For example, when the predetermined recognition target is recognized in the auxiliary image, the determination unit 140 may determine that a subject whose transmission should be suppressed appears in the IR image.
When it is determined that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 150 sets a blurring region in the IR image so that it includes the estimated position of the subject (step S131). The image processing unit 150 then blurs the partial image of the IR image in the set blurring region (step S135). When it is not determined that such a subject appears in the IR image, steps S131 and S135 are skipped.
Next, the image processing unit 150 outputs to the display 110, the communication interface 112 or the storage 114 either the IR image updated in step S135 or the IR image that was not updated because no subject whose transmission should be suppressed appears in it (step S138).
Thereafter, if there is a next IR image to be processed, the process returns to step S104 (step S140). If there is no next IR image, the IR image processing shown in FIG. 14C ends.
(4) Fourth Example
FIG. 14D is a flowchart showing a fourth example of the flow of IR image processing according to the first embodiment. The process shown in FIG. 14D is repeated for each of one or more IR images to be processed.
First, an IR image is acquired by the IR image acquisition unit 120 and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S104). The angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally, coincide with) each other. The IR image acquisition unit 120 outputs the acquired IR image to the determination unit 140 and the image processing unit 150, and the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the image processing unit 150.
Next, the determination unit 140 executes image recognition for recognizing a predetermined recognition target on the IR image input from the IR image acquisition unit 120 (step S112). Based on the result of the image recognition, the determination unit 140 then determines whether a subject whose transmission should be suppressed appears in the IR image (step S120).
When it is determined that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 150 sets a blurring region in the IR image and also sets the corresponding region in the auxiliary image (step S132). Next, the image processing unit 150 determines the blurring level based on one or more of the parameters described above (step S133); here, the blurring level may correspond to the mixing ratio described above. The image processing unit 150 then blurs the partial image in the blurring region by mixing the auxiliary image into the IR image within that region in accordance with the determined blurring level (step S137), as sketched below. When it is not determined that such a subject appears in the IR image, steps S132, S133 and S137 are skipped.
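A minimal sketch of the mixing in step S137, blending the auxiliary image into the IR image only inside the blurring region at the decided mixing ratio (the region format and the names are assumptions):

```python
import numpy as np

def mix_in_region(ir, aux, region, ratio):
    """Blend `aux` into `ir` inside `region` = (x, y, w, h) at `ratio` (0..1).
    The two images are assumed to be calibrated to the same angle of view."""
    x, y, w, h = region
    out = ir.copy()
    ir_patch = ir[y:y + h, x:x + w].astype(np.float32)
    aux_patch = aux[y:y + h, x:x + w].astype(np.float32)
    out[y:y + h, x:x + w] = (ratio * aux_patch +
                             (1.0 - ratio) * ir_patch).astype(ir.dtype)
    return out
```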
Next, the image processing unit 150 outputs to the display 110, the communication interface 112 or the storage 114 either the IR image updated in step S137 or the IR image that was not updated because no subject whose transmission should be suppressed appears in it (step S138).
Thereafter, if there is a next IR image to be processed, the process returns to step S104 (step S140). If there is no next IR image, the IR image processing shown in FIG. 14D ends.
The various processing steps described so far are not limited to the examples shown in the flowcharts and may be combined with one another in any way. For example, after the recognition target is recognized in the auxiliary image as in the third example, the IR image and the auxiliary image may be mixed for blurring as in the fourth example. The image to be used may also be switched depending on the time of day: during daytime hours, when a visible light image is available, the subject may be recognized using the visible light image as the auxiliary image as in the third example, while at night the subject may be recognized using the IR image as in the first example.
<3. Second Embodiment>
In the first embodiment described in the previous section, a blurring region corresponding to the whole or a part of the IR image is blurred. In contrast, in the second embodiment, the capture, display or recording of the whole or a part of the IR image is disabled in the transmission suppression process, making it possible to prevent inappropriate acts arising from infrared transparency with a simpler implementation.
[3-1. Functional configuration]
The hardware configuration of the image processing apparatus 200 according to the second embodiment may be the same as that of the image processing apparatus 100 described with reference to FIG. 4. FIG. 15 is a block diagram showing an example of the configuration of the logical functions of the image processing apparatus 200 according to the second embodiment. Referring to FIG. 15, the image processing apparatus 200 includes an IR image acquisition unit 120, an auxiliary image acquisition unit 130, a determination unit 140, a recognition DB 145, an image processing unit 250, a user interface unit 260 and a notification unit 170.
In this embodiment as well, the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image by executing image recognition for recognizing a predetermined recognition target (for example, a human face, body or part of a body) on the IR image or the auxiliary image. The determination unit 140 then outputs the determination result to the image processing unit 250.
Based on the determination result input from the determination unit 140, the image processing unit 250 at least partially suppresses the display or recording of the IR image. More specifically, in this embodiment, when the determination unit 140 determines that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 250 disables the capture, display or recording of the whole or a part of the IR image.
For example, when the predetermined recognition target is recognized in the IR image or the auxiliary image, the image processing unit 250 may stop the capture of IR images by sending an invalidation signal to the IR image acquisition unit 120 (or the infrared camera 102). The capture of IR images can then be resumed when triggered, for example, by a user input detected by the user interface unit 260 or by the recognition target no longer being recognized in the auxiliary image. The image processing unit 250 may also stop the output of IR images to the display 110, or stop the recording of IR images to the storage 114, when the predetermined recognition target is recognized in the IR image or the auxiliary image.
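The invalidation logic can be sketched as a small state machine; the `camera` methods below are hypothetical, since the disclosure only specifies that an invalidation signal stops capture and that a user input or the disappearance of the recognition target resumes it.

```python
class CaptureGate:
    """Enables or disables IR capture based on recognition results (illustrative)."""

    def __init__(self, camera):
        self.camera = camera
        self.suppressed = False

    def on_recognition_result(self, target_detected: bool) -> None:
        if target_detected and not self.suppressed:
            self.camera.stop_capture()   # invalidation signal to the IR camera
            self.suppressed = True
        elif not target_detected and self.suppressed:
            self.camera.start_capture()  # resume once the target is no longer recognized
            self.suppressed = False
```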
The image processing unit 250 may replace the portion of the IR imagery whose capture, display or recording has been disabled in this way (for example, the one or more frames in a series in which a subject whose transmission should be suppressed is determined to appear, or a part of a single frame) with an auxiliary image. The auxiliary image here may be a visible light image, an additional IR image (for example, a thermal image with low transparency), a depth map, or a fixed image prepared in advance.
FIG. 16 is an explanatory diagram illustrating the disabling of capture, display or recording according to this embodiment. The left side of FIG. 16 shows a series of IR images Im21 to Im24 input to the image processing unit 250 along the time axis. The earliest IR image Im21 contains no subject whose transmission should be suppressed, but the subsequent IR images Im22 and Im23 show a person, and rectangular frames 242a and 242b are added to the face regions recognized in the respective IR images. The further subsequent IR image Im24 again contains no subject whose transmission should be suppressed. The image processing unit 250 sequentially receives these IR images from the IR image acquisition unit 120 and receives the corresponding determination results from the determination unit 140. The image processing unit 250 then outputs the IR images Im21 and Im24 to an output destination such as the display 110 or the storage 114, while not outputting the IR images Im22 and Im23, in which a subject whose transmission should be suppressed appears, to those destinations. In some applications, such as surveillance cameras, the original purpose of the apparatus is not achieved if no image of the subject is displayed or recorded at all. In such applications it is therefore beneficial to provisionally output an auxiliary image, such as a visible light image, a thermal image or a depth map, instead of the IR image that is not output.
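For such surveillance-style applications, the gating of FIG. 16 can substitute the time-aligned auxiliary frame for each suppressed IR frame, roughly as follows (the names are illustrative):

```python
def gate_frames(ir_frames, aux_frames, detections, output):
    """Output the IR frame when no suppression target was detected in it;
    otherwise output the corresponding auxiliary frame (FIG. 16)."""
    for ir, aux, detected in zip(ir_frames, aux_frames, detections):
        output(aux if detected else ir)
```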
The image processing unit 250 may display an indication on the screen showing that transmission suppression is being performed. This indication may be superimposed on the auxiliary image output in place of the IR image. Also in this embodiment, the notification unit 170 may notify nearby persons that an IR image is being captured, and may do so with different notification patterns depending on whether transmission suppression is being performed by the image processing unit 250.
[3-2. Process flow]
This section describes several examples of the flow of IR image processing according to the second embodiment described so far.
(1) First Example
FIG. 17A is a flowchart showing a first example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17A is repeated for each of one or more IR images to be processed.
First, the IR image acquisition unit 120 acquires an IR image and outputs the acquired IR image to the determination unit 140 and the image processing unit 250 (step S202).
Next, the determination unit 140 executes image recognition for recognizing a predetermined recognition target on the IR image input from the IR image acquisition unit 120 (step S212). Based on the result of the image recognition, the determination unit 140 then determines whether a subject whose transmission should be suppressed appears in the IR image (step S220).
When it is determined that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 250 skips the output of the IR image to the display 110, the communication interface 112 or the storage 114 (step S232). When it is determined that no such subject appears in the IR image, the image processing unit 250 outputs the IR image to one of those output destinations (step S236).
Thereafter, if there is a next IR image to be processed, the process returns to step S202 (step S240). If there is no next IR image, the IR image processing shown in FIG. 17A ends.
(2) Second Example
FIG. 17B is a flowchart showing a second example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17B is repeated for each of one or more IR images to be processed.
First, an IR image is acquired by the IR image acquisition unit 120 and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S204). The angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally, coincide with) each other. The IR image acquisition unit 120 outputs the acquired IR image to the image processing unit 250, and the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140.
Next, the determination unit 140 executes image recognition for recognizing a predetermined recognition target on the auxiliary image input from the auxiliary image acquisition unit 130 (step S214). Based on the result of the image recognition, the determination unit 140 then determines whether a subject whose transmission should be suppressed appears in the IR image (step S220). For example, when the predetermined recognition target is recognized in the auxiliary image, the determination unit 140 may determine that a subject whose transmission should be suppressed appears in the IR image.
When it is determined that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 250 skips the output of the IR image to the display 110, the communication interface 112 or the storage 114 (step S232). When it is determined that no such subject appears in the IR image, the image processing unit 250 outputs the IR image to one of those output destinations (step S236).
Thereafter, if there is a next IR image to be processed, the process returns to step S204 (step S240). If there is no next IR image, the IR image processing shown in FIG. 17B ends.
(3) Third Example
FIG. 17C is a flowchart showing a third example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17C is repeated for each of one or more IR images to be processed.
First, an IR image is acquired by the IR image acquisition unit 120 and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S204). The IR image acquisition unit 120 outputs the acquired IR image to the determination unit 140 and the image processing unit 250, and the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the image processing unit 250.
Next, the determination unit 140 executes image recognition for recognizing a predetermined recognition target on the IR image input from the IR image acquisition unit 120 (step S212). Based on the result of the image recognition, the determination unit 140 then determines whether a subject whose transmission should be suppressed appears in the IR image (step S220).
When it is determined that a subject whose transmission should be suppressed appears in the IR image, the image processing unit 250 replaces the IR image, or a partial image thereof, with the auxiliary image input from the auxiliary image acquisition unit 130 (step S233). The image processing unit 250 then outputs the IR image after replacement to the display 110, the communication interface 112 or the storage 114 (step S234). When it is determined that no such subject appears in the IR image, the image processing unit 250 outputs the IR image input from the IR image acquisition unit 120 as-is to one of those output destinations (step S236).
Thereafter, if there is a next IR image to be processed, the process returns to step S204 (step S240). If there is no next IR image, the IR image processing shown in FIG. 17C ends.
The various processing steps described in this section are likewise not limited to the examples shown in the flowcharts and may be combined with one another in any way.
<4. Summary>
So far, various embodiments of the technology according to the present disclosure have been described in detail with reference to FIGS. 1 to 17C. According to the embodiments described above, it is determined whether a subject whose transmission should be suppressed appears in an infrared image, and the display or recording of the infrared image is at least partially suppressed based on the result of the determination. Accordingly, when the subject in question does not appear in the infrared image, the opportunity to use that infrared image is preserved, while when the subject in question does appear, the use of inappropriate images arising from infrared transparency can be prevented.
Further, according to the embodiments described above, image recognition for recognizing a recognition target such as a human face, body or part of a body is executed on the infrared image or on an auxiliary image having an angle of view that overlaps the infrared image, and when the recognition target is recognized, it is determined that a subject whose transmission should be suppressed appears in the infrared image. The mechanism described above can therefore be incorporated easily into an apparatus or system by leveraging existing image recognition technologies such as face recognition or person recognition.
Further, according to a certain embodiment, when it is determined that a subject whose transmission should be suppressed appears in the infrared image, a region corresponding to the whole or a part of the infrared image is blurred. In this case, the portions that are not blurred remain clearly visible to the user, and even in the blurred portions some degree of subject identification is possible (depending on the blurring level). A wider range of opportunities to use infrared images can therefore be secured than in the case where the capture of IR images is prohibited.
Further, according to a certain embodiment, the blurring level used when blurring the region in which the subject whose transmission should be suppressed appears is determined dynamically. Accordingly, by adaptively raising the blurring level when the use of the infrared image is likely to lead to inappropriate acts, such acts arising from infrared transparency can be prevented effectively.
Further, according to a certain embodiment, when it is determined that a subject whose transmission should be suppressed appears in the infrared image, the capture, display or recording of the whole or a part of the infrared image is disabled. In this case, since there is no need to implement processing that manipulates pixel values for blurring, a mechanism for suppressing the transparency of subjects can be realized at lower cost or with a smaller processing delay.
Note that the series of control processes performed by each apparatus described in this specification may be realized using software, hardware, or a combination of software and hardware. Programs constituting the software are stored in advance, for example, on a storage medium (non-transitory medium) provided inside or outside each apparatus. Each program is then read into a RAM at execution time, for example, and executed by a processor such as a CPU.
The processes described in this specification with reference to flowcharts also need not necessarily be executed in the order shown in the flowcharts. Some processing steps may be executed in parallel, additional processing steps may be adopted, and some processing steps may be omitted.
The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is evident that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally belong to the technical scope of the present disclosure.
The effects described in this specification are merely explanatory or illustrative and are not limiting. That is, the technology according to the present disclosure may achieve, together with or instead of the above effects, other effects that are apparent to those skilled in the art from the description of this specification.
The following configurations also belong to the technical scope of the present disclosure.
(1)
An image processing apparatus comprising:
an infrared image acquisition unit that acquires an infrared image;
a determination unit that determines whether a subject whose transmission should be suppressed appears in the infrared image; and
a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result from the determination unit.
(2)
The image processing apparatus according to (1), wherein the determination unit determines that the subject appears in the infrared image when a predetermined recognition target is recognized through image recognition performed on the infrared image or on an auxiliary image having an angle of view overlapping the infrared image.
(3)
The image processing apparatus according to (2), wherein the predetermined recognition target includes a human face, body or part of a body.
(4)
The image processing apparatus according to any one of (1) to (3), wherein the processing unit blurs a blurring region corresponding to the whole or a part of the infrared image when the determination unit determines that the subject appears in the infrared image.
(5)
The image processing apparatus according to (4), wherein the processing unit blurs the blurring region by smoothing or filling the blurring region, or by mixing other pixel values into the blurring region.
(6)
The image processing apparatus according to (4) or (5), wherein the processing unit sets the blurring region in the infrared image such that the blurring region includes the estimated position of the subject determined to appear in the infrared image.
(7)
The image processing apparatus according to any one of (4) to (6), wherein the processing unit blurs the blurring region in accordance with a dynamically determined blurring level.
(8)
The image processing apparatus according to (7), wherein the processing unit determines the blurring level based on one or more of an infrared irradiation intensity and an ambient light intensity at the time the infrared image was captured.
(9)
The image processing apparatus according to (7), wherein:
the determination unit determines that the subject appears in the infrared image when a predetermined recognition target is recognized through image recognition performed on the infrared image or on an auxiliary image having an angle of view overlapping the infrared image; and
the processing unit determines the blurring level based on a recognition likelihood of the image recognition.
(10)
The image processing apparatus according to (7), wherein the processing unit determines the blurring level based on one or more of a distance to the subject at the time the infrared image was captured and a size of the subject in the infrared image.
(11)
The image processing apparatus according to any one of (7) to (10), further comprising a user interface unit that provides a user interface for allowing a user to adjust the blurring level.
(12)
The image processing apparatus according to (11), wherein adjustment of the blurring level by the user is restricted so that the blurring level does not fall below a predetermined level.
(13)
The image processing apparatus according to (5), wherein the processing unit determines a fill color or a mixing color for blurring based on a representative value of pixel values belonging to the infrared image.
(14)
The image processing apparatus according to any one of (1) to (3), wherein the processing unit disables capture, display or recording of the whole or a part of the infrared image when the determination unit determines that the subject appears in the infrared image.
(15)
The image processing apparatus according to (14), wherein the processing unit replaces the portion of the infrared image whose capture, display or recording has been disabled with an auxiliary image.
(16)
The image processing apparatus according to any one of (2), (9) and (15), wherein the auxiliary image includes one or more of a visible light image, a thermal image and a depth map.
(17)
The image processing apparatus according to any one of (1) to (16), wherein, when at least partially suppressing display or recording of the infrared image, the processing unit causes an indication showing that the suppression is being performed to be displayed on a screen.
(18)
The image processing apparatus according to any one of (1) to (17), further comprising:
a camera that captures the infrared image; and
a notification unit that notifies a nearby person that the infrared image is being captured by the camera,
wherein the notification unit performs the notification with different notification patterns depending on whether the suppression is being performed.
(19)
An image processing method comprising:
acquiring an infrared image by a processor of an image processing apparatus;
determining whether a subject whose transmission should be suppressed appears in the infrared image; and
at least partially suppressing display or recording of the infrared image based on a result of the determination.
(20)
A program for causing a computer that controls an image processing apparatus to function as:
an infrared image acquisition unit that acquires an infrared image;
a determination unit that determines whether a subject whose transmission should be suppressed appears in the infrared image; and
a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result from the determination unit.
DESCRIPTION OF SYMBOLS
100, 200  image processing apparatus
102  infrared camera
120  infrared image acquisition unit
130  auxiliary image acquisition unit
140  determination unit
150, 250  image processing unit
160, 260  user interface unit
170  notification unit

Claims (20)

1. An image processing apparatus comprising:
an infrared image acquisition unit that acquires an infrared image;
a determination unit that determines whether a subject whose transmission should be suppressed appears in the infrared image; and
a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result from the determination unit.
2. The image processing apparatus according to claim 1, wherein the determination unit determines that the subject appears in the infrared image when a predetermined recognition target is recognized through image recognition performed on the infrared image or on an auxiliary image having an angle of view overlapping the infrared image.
3. The image processing apparatus according to claim 2, wherein the predetermined recognition target includes a human face, body or part of a body.
4. The image processing apparatus according to claim 1, wherein the processing unit blurs a blurring region corresponding to the whole or a part of the infrared image when the determination unit determines that the subject appears in the infrared image.
5. The image processing apparatus according to claim 4, wherein the processing unit blurs the blurring region by smoothing or filling the blurring region, or by mixing other pixel values into the blurring region.
6. The image processing apparatus according to claim 4, wherein the processing unit sets the blurring region in the infrared image such that the blurring region includes the estimated position of the subject determined to appear in the infrared image.
7. The image processing apparatus according to claim 4, wherein the processing unit blurs the blurring region in accordance with a dynamically determined blurring level.
8. The image processing apparatus according to claim 7, wherein the processing unit determines the blurring level based on one or more of an infrared irradiation intensity and an ambient light intensity at the time the infrared image was captured.
9. The image processing apparatus according to claim 7, wherein:
the determination unit determines that the subject appears in the infrared image when a predetermined recognition target is recognized through image recognition performed on the infrared image or on an auxiliary image having an angle of view overlapping the infrared image; and
the processing unit determines the blurring level based on a recognition likelihood of the image recognition.
10. The image processing apparatus according to claim 7, wherein the processing unit determines the blurring level based on one or more of a distance to the subject at the time the infrared image was captured and a size of the subject in the infrared image.
11. The image processing apparatus according to claim 7, further comprising a user interface unit that provides a user interface for allowing a user to adjust the blurring level.
12. The image processing apparatus according to claim 11, wherein adjustment of the blurring level by the user is restricted so that the blurring level does not fall below a predetermined level.
13. The image processing apparatus according to claim 5, wherein the processing unit determines a fill color or a mixing color for blurring based on a representative value of pixel values belonging to the infrared image.
14. The image processing apparatus according to claim 1, wherein the processing unit disables capture, display or recording of the whole or a part of the infrared image when the determination unit determines that the subject appears in the infrared image.
  15.  The image processing apparatus according to claim 14, wherein the processing unit replaces the portion of the infrared image for which imaging, display, or recording has been disabled with an auxiliary image.
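  Claim 15's replacement step could be sketched as below, assuming the auxiliary image is co-registered with the infrared image (same size and channel layout); `replace_with_auxiliary` is a hypothetical name operating on NumPy arrays.

```python
def replace_with_auxiliary(ir_image, aux_image, region):
    """Swap the disabled portion of the infrared image for the corresponding
    patch of the auxiliary image (assumed co-registered and same shape)."""
    x, y, w, h = region
    ir_image[y:y + h, x:x + w] = aux_image[y:y + h, x:x + w]
    return ir_image
```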
  16.  The image processing apparatus according to claim 2, wherein the auxiliary image includes one or more of a visible light image, a thermal image, and a depth map.
  17.  The image processing apparatus according to claim 1, wherein, when at least partially suppressing display or recording of the infrared image, the processing unit causes a screen to display an indicator showing that the suppression is being performed.
  18.  The image processing apparatus according to claim 1, wherein the image processing apparatus includes:
     a camera that captures the infrared image; and
     a notification unit that notifies a nearby person that the infrared image is being captured by the camera,
     and wherein the notification unit performs the notification with a different notification pattern depending on whether or not the suppression is being performed.
  19.  An image processing method including:
     acquiring, by a processor of an image processing apparatus, an infrared image;
     determining whether a subject whose transmission should be suppressed appears in the infrared image; and
     at least partially suppressing display or recording of the infrared image based on a result of the determination.
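  Tying the three method steps of claim 19 together, and reusing the helper sketches above (`blur_region`, `level_from_likelihood`), an end-to-end frame handler might look like the following; `detect_protected_subject` stands in for whatever recognizer the determination step uses and is purely hypothetical.

```python
def process_frame(ir_image, detect_protected_subject):
    """Acquire -> determine -> suppress, following the steps of claim 19.

    detect_protected_subject is a hypothetical recognizer returning
    (found, (x, y, w, h), likelihood).
    """
    found, region, likelihood = detect_protected_subject(ir_image)
    if not found:
        return ir_image                           # nothing to suppress
    level = level_from_likelihood(likelihood)     # sketch shown earlier
    ksize = 2 * level + 1                         # odd Gaussian kernel, scaled by level
    return blur_region(ir_image, region, mode="smooth", ksize=ksize)
```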
  20.  A program for causing a computer that controls an image processing apparatus to function as:
     an infrared image acquisition unit that acquires an infrared image;
     a determination unit that determines whether a subject whose transmission should be suppressed appears in the infrared image; and
     a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result from the determination unit.
PCT/JP2015/071543 2014-10-24 2015-07-29 Image processing device, image processing method and program WO2016063595A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014217473 2014-10-24
JP2014-217473 2014-10-24

Publications (1)

Publication Number Publication Date
WO2016063595A1 true WO2016063595A1 (en) 2016-04-28

Family

ID=55760644

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/071543 WO2016063595A1 (en) 2014-10-24 2015-07-29 Image processing device, image processing method and program

Country Status (1)

Country Link
WO (1) WO2016063595A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002358527A (en) * 2001-05-31 2002-12-13 Hudson Soft Co Ltd Device and method for processing image, program for making computer perform image processing method, recording medium with the program recorded, device and method for displaying pickup image, program for making computer perform pickup image display method, and recording medium with the program recorded
JP2006041841A (en) * 2004-07-26 2006-02-09 Nippon Telegr & Teleph Corp <Ntt> Device, method, and program for deciding photography propriety, photography system, and recording medium
JP2009201064A (en) * 2008-02-25 2009-09-03 Pioneer Electronic Corp Method and apparatus for specifying related region, and method and apparatus for recognizing image
JP2012015834A (en) * 2010-07-01 2012-01-19 Konica Minolta Opto Inc Imaging device
JP2013131824A (en) * 2011-12-20 2013-07-04 Nikon Corp Electronic device
JP2013242408A (en) * 2012-05-18 2013-12-05 Canon Inc Imaging device and control method of the same

Similar Documents

Publication Publication Date Title
US11501535B2 (en) Image processing apparatus, image processing method, and storage medium for reducing a visibility of a specific image region
CN105323497B (en) The high dynamic range (cHDR) of constant encirclement operates
US9875530B2 (en) Gradient privacy masks
US20190199898A1 (en) Image capturing apparatus, image processing apparatus, control method, and storage medium
US11425298B2 (en) Imaging device, imaging method, and program
JP6293571B2 (en) Surveillance method and camera
US7574021B2 (en) Iris recognition for a secure facility
TWI689892B (en) Background blurred method and electronic apparatus based on foreground image
WO2019148978A1 (en) Image processing method and apparatus, storage medium and electronic device
JP6729394B2 (en) Image processing apparatus, image processing method, program and system
EP2813970A1 (en) Monitoring method and camera
US10255683B1 (en) Discontinuity detection in video data
JP5071198B2 (en) Signal recognition device, signal recognition method, and signal recognition program
US20180225522A1 (en) Ir or thermal image enhancement method based on background information for video analysis
DE102019107582A1 (en) Electronic device with image pickup source identification and corresponding methods
JP2017201745A (en) Image processing apparatus, image processing method, and program
WO2018233217A1 (en) Image processing method, device and augmented reality apparatus
WO2016063595A1 (en) Image processing device, image processing method and program
TWI542212B (en) Photographic system with visibility enhancement
US20130308829A1 (en) Still image extraction apparatus
KR101920740B1 (en) Real-time image processing system
KR102474697B1 (en) Image Pickup Apparatus and Method for Processing Images
JP2005173879A (en) Fused image display device
CN112926367A (en) Living body detection equipment and method
JP2021149691A (en) Image processing system and control program

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application
     Ref document number: 15852912
     Country of ref document: EP
     Kind code of ref document: A1
NENP Non-entry into the national phase
     Ref country code: DE
122  Ep: pct application non-entry in european phase
     Ref document number: 15852912
     Country of ref document: EP
     Kind code of ref document: A1
NENP Non-entry into the national phase
     Ref country code: JP