WO2016063595A1 - Image processing device, image processing method, and program


Info

Publication number
WO2016063595A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
infrared
image processing
processing apparatus
infrared image
Application number
PCT/JP2015/071543
Other languages
English (en)
Japanese (ja)
Inventor
拓郎 川合
利昇 井原
昌俊 横川
隆浩 永野
Original Assignee
ソニー株式会社 (Sony Corporation)
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2016063595A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635 Region indicators; Field of view indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present disclosure relates to an image processing apparatus, an image processing method, and a program.
  • Conventionally, it has been proposed to use images captured by an infrared camera for security and other purposes (see, for example, Patent Document 1). Infrared rays have a variety of uses, different from those of visible light, depending on their wavelength. The technique proposed in Patent Document 1 uses infrared rays for night-vision applications and can notify a user of the presence of a suspicious person detected in an infrared image.
  • Infrared cameras are used not only in security equipment such as surveillance cameras, but also in medical and diagnostic equipment, in-vehicle equipment, and inspection equipment. There are also infrared modules that can be connected to (or built into) a general-purpose portable device such as a smartphone or tablet PC (Personal Computer) to capture, display, or record infrared images.
  • Since infrared rays in a certain wavelength range pass through materials such as cloth or thin film, the use of infrared images may constitute an inappropriate act such as an infringement of privacy.
  • According to the present disclosure, there is provided an image processing apparatus including: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed is reflected in the infrared image; and a processing unit that, based on the determination result by the determination unit, at least partially suppresses display or recording of the infrared image.
  • There is also provided an image processing method in which a processor of an image processing apparatus acquires an infrared image, determines whether a subject whose transmission should be suppressed is reflected in the infrared image, and, based on the determination result, at least partially suppresses display or recording of the infrared image.
  • Further, there is provided a program for causing a computer that controls an image processing apparatus to function as: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed is reflected in the infrared image; and a processing unit that, based on the determination result by the determination unit, at least partially suppresses display or recording of the infrared image.
  • FIG. 1 is an explanatory diagram for explaining various uses of an infrared (IR) image depending on a wavelength.
  • the horizontal direction in FIG. 1 corresponds to the wavelength of infrared rays, and the wavelength increases from left to right.
  • Light having a wavelength of 0.7 ⁇ m or less is visible light, and human vision senses this visible light.
  • Infrared rays having a wavelength in the range of 0.7 ⁇ m to 1.0 ⁇ m are classified as near infrared rays (NIR).
  • Near-infrared light can be used, for example, for night vision, fluoroscopy, optical communication and ranging.
  • Infrared rays having a wavelength in the range of 1.0 ⁇ m to 2.5 ⁇ m are classified as short wavelength infrared rays (SWIR). Short wavelength infrared is also available for night vision and fluoroscopy.
  • Typically, a night vision apparatus using near-infrared or short-wavelength infrared rays first irradiates nearby objects with infrared rays and captures the reflected light to generate an IR image.
  • Infrared light having a wavelength in the range of 2.5 ⁇ m to 4.0 ⁇ m is classified as medium wavelength infrared (MWIR). Since a substance-specific absorption spectrum appears in the wavelength range of the medium wavelength infrared, the medium wavelength infrared can be used for identification of the substance.
  • Medium wavelength infrared can also be used for thermography.
  • Infrared rays having a wavelength of 4.0 ⁇ m or more are classified as far infrared rays (FIR).
  • Far infrared can be utilized for night vision, thermography and heating.
  • Infrared rays emitted by black body radiation from an object correspond to far infrared rays. Therefore, a night vision apparatus using far infrared rays can generate an IR image by capturing black body radiation from an object without irradiating infrared rays.
  • the boundary values of the wavelength range shown in FIG. 1 are merely examples.
  • Various definitions exist for the boundary value of the infrared classification, and the advantages described below of the technology according to the present disclosure can be enjoyed under any definition.
  • As described above, near-infrared and short-wavelength infrared rays mainly pass through materials such as cloth or thin film. Therefore, for example, when a person appears in an IR image based on these types of infrared rays, the person's clothes may be transmitted, and underwear or other objects that the person does not want seen by others may be exposed in the IR image. The use of IR images may therefore constitute an inappropriate act such as privacy infringement or a nuisance.
  • In recent years, many security cameras have been installed in public places due to heightened security awareness, and automobiles equipped with night-vision cameras have been marketed from the viewpoint of accident prevention, so infrared imaging is widely utilized. In view of this, the present specification proposes a mechanism that can prevent inappropriate acts caused by infrared transparency while preserving opportunities for appropriate use of infrared images.
  • FIG. 2 is a flowchart illustrating an example of a schematic flow of infrared (IR) image processing according to an embodiment.
  • First, an IR image is acquired (step S10).
  • The IR image acquired here is an image generated by a camera that detects infrared rays capable of passing through certain materials.
  • the IR image may be a still image or one of a series of frames that make up a moving image.
  • subject recognition processing is executed (step S11).
  • the subject recognized here is the same as the subject whose transmission should be suppressed, or can be defined in advance in association with the subject whose transmission should be suppressed.
  • the subject recognized in step S11 may be the human body itself or a part of the human body such as a human face or hand.
  • the subject whose transmission should be suppressed may be any object different from the human body (for example, a container in which it is not preferable to visually recognize the contents).
  • the subject recognition process may be executed with an IR image as an input, or may be executed with another type of image (however, an image having an angle of view that can be calibrated with reference to the IR image) as an input.
  • In step S12, it is determined whether a subject whose transmission should be suppressed is reflected in the IR image. If it is determined that such a subject is reflected in the IR image, a transmission suppression process is executed in order to at least partially suppress the display or recording of the IR image (step S13). In the technology according to the present disclosure, display or recording of the IR image is thus basically allowed, and the operation is suppressed only when it is determined to be inappropriate.
  • In one method, a region corresponding to the whole or a part of the IR image is blurred in the transmission suppression process.
  • In another method, imaging, display, or recording of all or part of the IR image is disabled in the transmission suppression process.
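The S10-S13 flow described above can be sketched in Python. All function names below are illustrative, and the brightness-threshold "recognizer" is only a stand-in for the real subject recognition of step S11; it is not part of the patent.

```python
# Sketch of the Fig. 2 flow (steps S10-S13). Names are illustrative.

def recognize_subjects(ir_image):
    """Step S11 placeholder: return a list of recognized targets.

    A real implementation would run face/body recognition on the IR
    image (or on a calibrated auxiliary image). Here we simply pretend
    that any frame containing a bright pixel shows a person."""
    return ["person"] if max(max(row) for row in ir_image) > 200 else []

def suppress_transmission(ir_image):
    """Step S13 placeholder: suppress display/recording by blanking."""
    return [[0 for _ in row] for row in ir_image]

def process_frame(ir_image):
    # S10: the frame has already been acquired; S11: recognize subjects
    targets = recognize_subjects(ir_image)
    # S12: is a subject whose transmission should be suppressed shown?
    if targets:
        return suppress_transmission(ir_image)  # S13
    # Otherwise the IR image is displayed/recorded as-is.
    return ir_image
```

Note that display or recording is allowed by default and suppressed only on a positive determination, mirroring the flow in the text.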
  • The IR image processing described above can be implemented in any type of device that captures IR images or processes captured IR images. To give just a few examples, devices that capture IR images may include digital video cameras, digital still cameras, television broadcasting cameras, surveillance cameras, intercoms with monitors, in-vehicle cameras, smartphones, PCs (Personal Computers), HMD (Head Mounted Display) terminals, game terminals, medical devices, diagnostic devices, and inspection devices. In addition to these imaging devices, devices that process IR images may include television receivers, content players, content recorders, authoring devices, and the like.
  • the image processing apparatus mentioned in the following sections may be a module mounted on or connected to the apparatus exemplified here.
  • FIGS. 3A to 3D are explanatory diagrams showing examples of various scenarios in which the IR image processing can be executed.
  • In the scenario of FIG. 3A, an IR image is input to the IR image processing from an infrared camera that captures it. The transmission-suppressed IR image is then output to a display, and the image is displayed on the display.
  • In the scenario of FIG. 3B, an IR image is input to the IR image processing from an infrared camera that captures it. The transmission-suppressed IR image is then output to a storage, and the image is recorded in the storage.
  • In the scenario of FIG. 3C, the IR image is read out from the storage storing the captured IR image and input to the IR image processing.
  • the IR image in which the transmission is suppressed is output to the display, and the image is displayed on the display.
  • In the scenario of FIG. 3D, the IR image is likewise read out from the storage storing the captured IR image and input to the IR image processing.
  • the IR image whose transmission is suppressed is recorded again by the storage (that is, the image data is updated or converted).
  • FIG. 4 is a block diagram illustrating an example of a hardware configuration of the image processing apparatus 100 according to the first embodiment.
  • the image processing apparatus 100 includes an infrared camera 102, a sub camera 104, an input interface 106, a memory 108, a display 110, a communication interface 112, a storage 114, a bus 116, and a processor 118.
  • the infrared camera 102 is an imaging module that captures an infrared (IR) image.
  • the infrared camera 102 has an array of imaging elements that sense infrared rays that are mainly classified as near infrared rays or short wavelength infrared rays, and a light emitting element that irradiates infrared rays in the vicinity of the apparatus.
  • the infrared camera 102 irradiates infrared rays from a light emitting element in response to a trigger such as a user input or periodically, and captures infrared rays reflected on a subject or its background to generate an IR image.
  • a series of IR images generated by the infrared camera 102 may constitute a video.
  • The sub-camera 104 captures an auxiliary image that is used to assist subject recognition or transmission suppression in the IR image processing.
  • the auxiliary image captured by the sub camera 104 may be, for example, one or more of a visible light image, an additional IR image (for example, a thermal image, that is, a MWIR image or an FIR image), and a depth map.
  • the input interface 106 is used for a user to operate the image processing apparatus 100 or input information to the image processing apparatus 100.
  • the input interface 106 may include an input device such as a touch sensor, a keyboard, a keypad, a button, or a switch, for example.
  • the input interface 106 may include a microphone for voice input and a voice recognition module.
  • the input interface 106 may also include a remote control module that receives commands selected by the user from the remote device.
  • the memory 108 is a storage medium that can include a RAM (Random Access Memory) and a ROM (Read Only Memory).
  • the memory 108 is coupled to the processor 118 and stores programs and data for processing executed by the processor 118.
  • the display 110 is a display module having a screen for displaying an image.
  • The display 110 may be, for example, an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, or a CRT (Cathode Ray Tube).
  • the communication interface 112 is a module that mediates communication between the image processing apparatus 100 and another apparatus.
  • the communication interface 112 establishes a communication connection according to any wireless communication protocol or wired communication protocol.
  • The storage 114 is a storage device that stores image data, which can include IR images and auxiliary images, and stores databases used in the IR image processing.
  • the storage 114 contains a storage medium such as a semiconductor memory or a hard disk. Note that the program and data described in this specification may be acquired from a data source external to the image processing apparatus 100 (for example, a data server, a network storage, or an external memory).
  • the bus 116 connects the infrared camera 102, the sub camera 104, the input interface 106, the memory 108, the display 110, the communication interface 112, the storage 114, and the processor 118 to each other.
  • the processor 118 is a processing module such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor).
  • The processor 118 realizes functions for suppressing undesired transparency in IR images by executing a program stored in the memory 108 or another storage medium.
  • FIG. 5 is a block diagram illustrating an example of a configuration of logical functions realized by linking the components of the image processing apparatus 100 illustrated in FIG. 4 to each other.
  • Referring to FIG. 5, the image processing apparatus 100 includes an IR image acquisition unit 120, an auxiliary image acquisition unit 130, a determination unit 140, a recognition DB 145, an image processing unit 150, a blurring DB 155, a user interface unit 160, and a notification unit 170.
  • the IR image acquisition unit 120 acquires an infrared (IR) image and outputs the acquired IR image to the determination unit 140 and the image processing unit 150.
  • the IR image acquisition unit 120 may acquire an IR image captured by the infrared camera 102. Further, the IR image acquisition unit 120 may acquire an IR image stored in the storage 114.
  • the IR image acquisition unit 120 may acquire an IR image from another device via the communication interface 112.
  • the IR image acquired by the IR image acquisition unit 120 may be an image that has undergone preliminary processing such as signal amplification and noise removal.
  • the IR image acquisition unit 120 may decode the IR image from the compressed and encoded stream.
  • the auxiliary image acquisition unit 130 acquires an auxiliary image that may include a visible light image, an additional IR image, or a depth map. It is assumed that the angle of view of the auxiliary image is calibrated so as to overlap (ideally match) the angle of view of the IR image acquired by the IR image acquisition unit 120.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140 when the auxiliary image is used for subject recognition in IR image processing.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the image processing unit 150 when the auxiliary image is used for suppressing transmission.
  • the auxiliary image acquisition unit 130 may acquire an auxiliary image captured by the sub camera 104 and stored in the storage 114 or received via the communication interface 112. When the auxiliary image is not used for any application, the auxiliary image acquisition unit 130 may be omitted from the configuration of the image processing apparatus 100.
  • the determination unit 140 determines whether a subject whose transmission should be suppressed is reflected in the IR image. More specifically, the determination unit 140 performs image recognition for recognizing a predetermined recognition target for an IR image or an auxiliary image having an angle of view overlapping with the IR image. Then, when the predetermined recognition target is recognized as a result of the image recognition, the determination unit 140 determines that the subject whose transmission should be suppressed appears in the IR image.
  • the predetermined recognition target may be a human face, body, or part of the body.
  • the recognition DB 145 stores data referred to in image recognition executed by the determination unit 140.
  • the recognition DB 145 stores in advance image feature quantities to be recognized that are acquired from a number of known IR images through a prior learning process.
  • The determination unit 140 collates image feature quantities extracted from the IR image input from the IR image acquisition unit 120 against the feature quantities stored in the recognition DB 145, and can determine, according to the collation result, whether the recognition target appears in the IR image.
  • the determination unit 140 may perform the above-described image recognition according to an existing algorithm such as boosting or support vector machine.
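The collation step can be sketched as follows. A production system would use a trained classifier such as boosting or a support vector machine, as the text notes; the cosine-similarity matcher, the feature vectors, and all names below are illustrative stand-ins, not from the patent.

```python
# Minimal sketch of the determination step: feature vectors extracted
# from the input image are collated against feature vectors held in a
# recognition DB. All data and names here are illustrative.

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

# Stand-in for the recognition DB 145: feature quantities acquired
# from known IR images through a prior learning process.
RECOGNITION_DB = {
    "face": [0.9, 0.1, 0.4],
    "torso": [0.2, 0.8, 0.5],
}

def match_target(feature, threshold=0.95):
    """Return the best-matching recognition target type, or None."""
    best_type, best_score = None, threshold
    for target_type, stored in RECOGNITION_DB.items():
        score = cosine_similarity(feature, stored)
        if score > best_score:
            best_type, best_score = target_type, score
    return best_type
```

A positive match (a non-None result) corresponds to the determination that a subject whose transmission should be suppressed appears in the IR image.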
  • When an auxiliary image is used for recognition, the prior learning process is likewise executed based on known images of the same type as the auxiliary image. On the assumption that the angle of view of the auxiliary image overlaps that of the IR image captured at the same timing, it can be estimated, from the detection of a predetermined recognition target in the auxiliary image, that a subject whose transmission should be suppressed is shown in the IR image.
  • a visible light image can be used as an auxiliary image. In this case, it is possible to determine whether the subject whose transmission should be suppressed is reflected in the IR image by utilizing a face recognition technique or a person recognition technique with good accuracy based on the visible light image.
  • a plurality of recognition targets may be recognized for one IR image or auxiliary image.
  • Recognition target type: a code that identifies the type of the recognized target. Various type sets may be defined in advance, such as face/human body; face/torso/arm/leg; face/upper body/lower body; or face/torso (upper body)/torso (lower body)/arm/leg.
  • Recognition position/orientation/size: the position, orientation, and size of the recognized target in the IR image (or auxiliary image). Additionally or alternatively, information indicating the shape may be output.
  • Recognition likelihood: an index indicating how likely the image recognition result is to be correct; the larger the value, the more likely the result is correct. Also called reliability.
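The recognition output enumerated above can be modeled as a small record. The field names and the degree-valued orientation are illustrative assumptions; the patent specifies only which quantities are output, not a data layout.

```python
# Illustrative record for one recognition result (names assumed).
from dataclasses import dataclass

@dataclass
class RecognitionResult:
    target_type: str   # e.g. "face", "torso", "arm", "leg"
    position: tuple    # (x, y) of the target in the IR image
    size: tuple        # (width, height) of the target
    orientation: float # rotation, here assumed to be in degrees
    likelihood: float  # recognition likelihood (reliability), 0..1

# Example: a face recognized at (120, 40) with high reliability.
face = RecognitionResult("face", (120, 40), (60, 60), 0.0, 0.92)
```

Several such records may be produced for one IR or auxiliary image, since multiple recognition targets can be recognized at once.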
  • the image recognition performed by the determination unit 140 may be shared with image recognition for other purposes, such as pedestrian recognition performed on images from the in-vehicle camera for driving assistance.
  • a display object indicating the result of such image recognition (for example, a frame surrounding a pedestrian) may be superimposed on the IR image.
  • The image processing unit 150 at least partially suppresses the display or recording of the IR image based on the determination result input from the determination unit 140. More specifically, in the present embodiment, when the determination unit 140 determines that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 150 sets a blurring area in the IR image so as to include the estimated position of that subject. The image processing unit 150 then blurs the blurring area of the IR image according to the setting.
  • the blurring DB 155 stores data referred to in image processing executed by the image processing unit 150.
  • the blurred area may correspond to the entire IR image.
  • In that case, the image processing unit 150 can set the blurring area without determining the specific position of the subject.
  • the blurred area may correspond to a portion of the IR image.
  • the image processing unit 150 sets a blurring region in the IR image based on the recognition position, orientation, and size information of the recognition target input from the determination unit 140. For example, when a human face is recognized as a recognition target, the blurring area can be set below the recognition position of the face with a size depending on the size of the face.
  • FIG. 6 is an explanatory diagram showing an example of the setting of the blurring area.
  • the IR image Im11 is shown on the left of FIG.
  • a rectangular frame 142 indicates that a recognition target corresponding to a human face is recognized in the IR image Im11.
  • The image processing unit 150 sets the blurring region 144 below the recognized face position.
  • the blurring region 144 is set so as to include an estimated position of the human body that is a subject whose transmission is to be suppressed (see the center of FIG. 6).
  • Such a positional relationship between the subject and the recognition target is defined in advance (for example, based on knowledge of the standard shape of the human body) and can be stored for each recognition target type by the blurring DB 155.
  • the right side of FIG. 6 shows a processed IR image Im12 in which a partial image of the blurred region 144 is blurred.
  • The image processing unit 150 may change the size, orientation, or shape of the blurring region 144 in accordance with the size, orientation, or shape of the recognition target. Further, when a human face is recognized, the image processing unit 150 may set, as the blurring area, the set of pixels whose difference from a pixel value of the face area (for example, the average or median over the face area) is below a threshold.
  • Thereby, even when the positional relationship between the subject and the recognition target is not defined in advance, a blurring area can be set on the portions of the IR image corresponding to human skin, which is assumed to have a gradation close to that of the face.
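The placement rule of FIG. 6 (a region below the recognized face, sized by the face) can be sketched as a simple geometric function. The scale factors below are illustrative assumptions; in the described design the actual positional relationship per recognition target type would come from the blurring DB 155.

```python
# Derive a body-covering blurring region from a recognized face,
# as in Fig. 6. Scale factors are illustrative assumptions.

def body_blur_region(face_x, face_y, face_w, face_h,
                     width_scale=3.0, height_scale=6.0):
    """Return (x, y, w, h) of a rectangle placed below the face,
    scaled from the face size to cover the estimated body position."""
    w = int(face_w * width_scale)
    h = int(face_h * height_scale)
    x = face_x + face_w // 2 - w // 2   # horizontally centered on the face
    y = face_y + face_h                 # starts just below the face
    return (x, y, w, h)
```

For a 40x40 face at (100, 50), this yields a 120x240 region starting just under the chin, matching the idea that the region's size depends on the face size.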
  • the image processing unit 150 may blur the blurred area 144 by smoothing the pixel value.
  • The smoothing here can be performed by applying a smoothing filter, typified by a Gaussian filter, to each pixel belonging to the blurring region 144.
  • In that case, the level of blurring depends on the filter scale (e.g., the variance σ).
  • Alternatively, when blurring is achieved by averaging pixel values over sub-regions, the level of blurring depends on the sub-region size R_sub.
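The sub-region-averaging variant can be sketched in a few lines of pure Python (a Gaussian filter would play the same role with σ as the level knob). The function name and grayscale list-of-lists image format are illustrative assumptions.

```python
# Blur a rectangular region by averaging pixels over r_sub x r_sub
# sub-regions (pixelation). Larger r_sub => stronger blurring.

def pixelate(image, x, y, w, h, r_sub):
    """Return a copy of `image` (list of rows of gray values) with the
    region at (x, y, w, h) averaged per sub-region of size r_sub."""
    out = [row[:] for row in image]
    for by in range(y, y + h, r_sub):
        for bx in range(x, x + w, r_sub):
            # Clip each sub-region to the blurring area and the image.
            block = [(j, i)
                     for j in range(by, min(by + r_sub, y + h, len(image)))
                     for i in range(bx, min(bx + r_sub, x + w, len(image[0])))]
            if not block:
                continue
            mean = sum(image[j][i] for j, i in block) // len(block)
            for j, i in block:
                out[j][i] = mean
    return out
```

With r_sub = 1 each sub-region is a single pixel and the image is unchanged, which is why the sub-region size acts as the blurring level.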
  • the image processing unit 150 may blur the blurred area 144 by filling the blurred area 144 with a specific pixel value.
  • As yet another technique, the image processing unit 150 may blur the blurring area 144 by mixing, into the blurring area 144, the pixel values of the portion of the auxiliary image corresponding to the blurring area 144.
  • the left part of FIG. 8 shows a partial image of the auxiliary image corresponding to the blurred area 144 and a partial image of the blurred area 144 of the IR image.
  • The image processing unit 150 may generate a blurred partial image of the blurring region 144 by mixing these two partial images using a mixing ratio α_mix according to the following equation (see the right side of FIG. 8):

    IR_blurred(x, y) = (1 − α_mix) · IR(x, y) + α_mix · SI(x, y)    (1)

  • Here, IR(x, y) and IR_blurred(x, y) represent the pixel values of the IR image before and after mixing at pixel position (x, y), respectively, and SI(x, y) represents the pixel value of the auxiliary image at pixel position (x, y).
  • A plurality of auxiliary images may be mixed with the IR image using a plurality of mixing ratios. When two mixing ratios α_mix and β_mix are used, for example, the following equation (2) may be used instead of equation (1):

    IR_blurred(x, y) = (1 − α_mix − β_mix) · IR(x, y) + α_mix · SI1(x, y) + β_mix · SI2(x, y)    (2)

  • Here, SI1(x, y) and SI2(x, y) represent the pixel values of the first and second auxiliary images at pixel position (x, y), respectively.
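The pixel-wise mixing can be written directly as code. The single-auxiliary form follows the α_mix blend described for equation (1); for the two-auxiliary case the (1 − α − β) weighting is an assumption, since the text describes equation (2) only as using two mixing ratios.

```python
# Pixel-wise mixing of auxiliary-image values into the IR image.

def mix_pixel(ir, si, alpha):
    """Eq. (1): blend one auxiliary pixel SI into the IR pixel with
    mixing ratio alpha."""
    return (1.0 - alpha) * ir + alpha * si

def mix_pixel2(ir, si1, si2, alpha, beta):
    """Two-auxiliary blend; the (1 - alpha - beta) weight on the IR
    pixel is an assumed convex-combination form of Eq. (2)."""
    return (1.0 - alpha - beta) * ir + alpha * si1 + beta * si2
```

Applying `mix_pixel` to every pixel in the blurring region, with the auxiliary pixel taken at the same (x, y), produces the blurred partial image of FIG. 8.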
  • In this case, the level of blurring depends on the mixing ratio.
  • As the auxiliary image for mixing, instead of a dynamically acquired image such as a visible light image, an additional IR image, or a depth map, a fixed image pre-stored in the blurring DB 155 (e.g., a CG (Computer Graphics) image or a monochrome image) may be used.
  • Alternatively, mixing may be performed using a pixel value P_mix that is determined dynamically, as in the following equation:

    IR_blurred(x, y) = (1 − α_mix) · IR(x, y) + α_mix · P_mix
  • The pixel value P_mix may be a representative value of pixel values belonging to the IR image.
  • The representative value here may be, for example, the average or median of the pixel values over the entire IR image, over the inside of the blurring area, or over the pixel group on the boundary of the blurring area. By using such a representative value, the blurring region after mixing becomes an image without unnatural color and with little change in gradation.
  • The pixel value P_mix selected in this way may also be used as a fill color for blurring, instead of as a mixing color. Further, the pixel value P_mix may be set by the user via the user interface unit 160.
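Choosing P_mix as a representative value and using it as a fill color can be sketched as follows; the function names and the list-of-rows image format are illustrative.

```python
# P_mix as a representative value (mean or median) of IR pixel values,
# then used as a fill color for the blurring region.
import statistics

def p_mix(pixels, mode="median"):
    """Representative value over a pixel group (e.g. the whole image,
    the blurring area, or the pixels on its boundary)."""
    if mode == "median":
        return statistics.median(pixels)
    return sum(pixels) / len(pixels)

def fill_region(image, x, y, w, h, value):
    """Fill the (x, y, w, h) region with a single pixel value."""
    out = [row[:] for row in image]
    for j in range(y, y + h):
        for i in range(x, x + w):
            out[j][i] = value
    return out
```

Because the fill value is drawn from the image itself, the filled region avoids the unnatural color that an arbitrary constant would introduce.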
  • the image processing unit 150 may dynamically determine the blurring level according to one or more conditions, and blur the blurring area according to the determined blurring level.
  • For example, the image processing unit 150 may determine the blurring level based on one or more of the infrared irradiation intensity and the ambient light intensity when the IR image is captured. As the intensity of the infrared rays irradiated toward the vicinity during capture increases, the texture of the IR image becomes clearer and the transmitted subject appears more clearly. Therefore, the image processing unit 150 may acquire, from the infrared camera 102, information indicating the infrared irradiation intensity at the time of capture, and determine a higher blurring level for a higher irradiation intensity.
  • The blurring level may also be determined based on the infrared irradiation intensity relative to the ambient light intensity. The ambient light intensity can be measured by an illuminance sensor (not shown in FIG. 4).
  • The image processing unit 150 may also determine the blurring level based on the recognition likelihood of the image recognition, input from the determination unit 140. For example, when it is highly likely that an object shown in the IR image is a human torso, a determination result in which a high recognition likelihood (reliability) is associated with the recognition target type "torso" is input from the determination unit 140. In this case, the image processing unit 150 can set the blurring level of the torso region relatively high.
  • The image processing unit 150 may also determine the blurring level based on one or more of the distance from the camera to the subject when the IR image is captured and the size of the subject in the IR image.
  • The distance from the camera to the subject can be measured, for example, by an infrared depth sensor (e.g., using the round-trip time of near-infrared light reflected by the subject, or a ranging method based on the distortion of a dot pattern projected onto the subject).
  • the size of the subject can be acquired through image recognition executed by the determination unit 140. For example, if the distance from the camera to the subject is smaller or the subject size is larger, the transmitted subject will appear more clearly in the IR image. Therefore, the image processing unit 150 can determine the blurring level so that the blurring level becomes higher in these cases.
  • When blurring is achieved by Gaussian filtering, the blurring level corresponds to the scale of the Gaussian filter. If blurring is achieved by averaging pixel values for each sub-region, the blurring level corresponds to the size of the sub-region. If blurring is achieved by mixing with the auxiliary image, the blurring level corresponds to the mixing ratio.
  • Regardless of the values of the parameters used to determine the blurring level (for example, infrared irradiation intensity, ambient light intensity, recognition likelihood, distance to the subject, size of the subject, or any combination thereof), the image processing unit 150 may limit the blurring level so that it does not fall below a predetermined level.
  • FIG. 9A is a graph illustrating a first example of the relationship between the parameter for determining the blurring level and the blurring level.
  • the horizontal axis of the graph represents the parameter X for determining the blurring level, and the parameter X may correspond to, for example, infrared irradiation intensity, recognition likelihood, or subject size.
  • the vertical axis of the graph represents the level of smearing and may correspond to, for example, the scale of the filter, the size of the sub-region for averaging or the mixing ratio.
  • the blurring level is constant at the minimum value L min regardless of the value of the parameter X.
  • the blurring level Due to such limitation of the blurring level, it is possible to prevent a subject that is not desired to be visually recognized from being visually recognized due to a lack of the blurring level.
  • the blurring level increases with the value of the parameter X.
  • When the value of the parameter X is above the threshold Th12, the blurring level reaches the maximum value Lmax.
  • Information defining such a relationship between the parameter X and the blurring level may be defined in advance and stored by the blurring DB 155.
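The FIG. 9A relationship can be sketched as a clamped linear mapping (the function and argument names are illustrative; `th1`, `th2`, `l_min`, and `l_max` stand in for the thresholds Th11 and Th12 and the levels Lmin and Lmax described above):

```python
def blurring_level(x, th1, th2, l_min, l_max):
    """Map the determination parameter X to a blurring level:
    constant at l_min below th1, rising linearly between th1 and th2,
    and saturated at l_max above th2 (the shape of FIG. 9A)."""
    if x <= th1:
        return l_min
    if x >= th2:
        return l_max
    # Linear interpolation between (th1, l_min) and (th2, l_max).
    return l_min + (l_max - l_min) * (x - th1) / (th2 - th1)
```

The floor at `l_min` realizes the limitation described above: no matter how small the parameter becomes, the blurring level never drops below the predetermined minimum.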
  • FIG. 9B is a graph showing a second example of the relationship between the parameter for determining the blurring level and the blurring level.
  • When the value of the parameter X is lower than the threshold Th21, the blurring level is constant at the minimum value Lmin regardless of the value of the parameter X.
  • Otherwise, the blurring level takes the value associated with the subrange to which the value of the parameter X belongs.
  • The subrange boundaries are defined by the threshold values Th21, Th22, Th23, and Th24.
  • the blurring DB 155 may store information defining such a subrange boundary and a blurring level for each subrange.
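The stepwise relationship of FIG. 9B can be sketched as a threshold lookup (a hypothetical implementation; the concrete threshold and level values below are placeholders, not values from this disclosure):

```python
from bisect import bisect_right

def stepwise_level(x, thresholds, levels):
    """Return the blurring level of the subrange into which x falls.
    thresholds are the ascending subrange boundaries (Th21..Th24 in
    FIG. 9B) and levels has one more entry than thresholds:
    levels[0] applies when x < thresholds[0], and a value equal to a
    boundary falls into the upper subrange."""
    return levels[bisect_right(thresholds, x)]
```

The blurring DB 155 would then only need to store the two lists, one pair per configuration.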
  • the user interface unit 160 is additionally provided in the image processing apparatus 100 in order to provide a user interface for allowing the user to adjust the blurring level.
  • the user interface unit 160 acquires user input via the input interface 106.
  • the user interface unit 160 may display a graphical user interface as illustrated in FIG. 10 on the screen of the display 110, for example. Referring to FIG. 10, a setting window U10 is shown.
  • the setting window U10 includes a slider U11 and a button U12. The user can increase or decrease the degree of blurring by sliding the slider U11 along the slider axis and pressing (or tapping) the button U12.
  • FIG. 11 is a graph showing an example of the blurring level depending on the user setting. Referring to FIG. 11, three graphs G1, G2, and G3 having different maximum values and slopes are shown.
  • Graph G1 defines the relationship between the parameter X for determining the blurring level and the blurring level that is selected when the user desires a low blurring level.
  • Graph G2 defines the relationship between the parameter X and the blurring level that is selected when the user desires a medium blurring level.
  • Graph G3 defines the relationship between the parameter X and the blurring level that is selected when the user desires a high blurring level. In the example of FIG. 11, the three relationships differ in their maximum values and slopes.
  • the blurring DB 155 may store information that defines the blurring level that depends on the user setting.
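One way to realize a family of user-selectable curves like G1 to G3 is a small factory parameterized by maximum value and slope (a sketch; the exact shape of each curve, including the zero level below `th_min`, is an assumption made only for illustration):

```python
def make_level_curve(l_max, slope, th_min):
    """Build one user-selectable blurring-level curve in the style of
    FIG. 11: flat below th_min, then rising with the given slope until
    it saturates at l_max. The three graphs G1, G2, and G3 would differ
    only in their l_max and slope arguments."""
    def curve(x):
        return min(l_max, slope * max(0, x - th_min))
    return curve
```

A slider position as in the setting window U10 could then simply select which pre-built curve is consulted when the blurring level is determined.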
  • The relationship between the parameter for determining the blurring level and the blurring level is not limited to the examples shown in FIGS. 9A, 9B, and 11.
  • the graph indicating the relationship may draw any trajectory such as a straight line, a curved line, and a broken line.
  • the blurring level may be determined based on a plurality of parameters (for example, recognition likelihood and distance to the subject). For example, a relationship between a single intermediate parameter calculated as a function of a plurality of parameters and the blurring level may be defined in the blurring DB 155. Also, different blurring levels may be used depending on the imaging time zone such as day or night.
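The single intermediate parameter mentioned above could be computed, for example, as a weighted combination of the individual parameters (the weights and the normalization below are purely illustrative assumptions):

```python
def intermediate_parameter(likelihood, distance, d_max=10.0, w=0.5):
    """Combine two determination parameters into one intermediate
    parameter: a higher recognition likelihood and a closer subject
    both push the intermediate parameter, and hence the blurring
    level, upward. distance is normalized against d_max."""
    proximity = max(0.0, 1.0 - distance / d_max)
    return w * likelihood + (1.0 - w) * proximity
```

The blurring DB 155 would then only need one relationship, from this intermediate parameter to the blurring level, instead of one per raw parameter.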
  • The image processing unit 150 may dynamically change the fill color used for blurring (for example, depending on information such as the type of recognition target). Further, the image processing unit 150 may dynamically change the color of the monochrome image (that is, the mixing color) in an example in which the blurring region 144 is blurred by mixing with a monochrome image. For example, the image processing unit 150 can set the mixing color to red when the blurring level is high, to blue when the blurring level is medium, and to gray when the blurring level is low. Thereby, a user who sees the output image can be informed of how much blurring has been performed. The image processing unit 150 may also dynamically change the mixing color depending on other information such as the type of recognition target.
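The level-to-color mapping described above amounts to a simple threshold rule (the numeric thresholds below are hypothetical; only the red/blue/gray assignment comes from the text):

```python
def mixing_color(level, high=7, medium=4):
    """Choose the monochrome mixing color from the blurring level:
    red for a high level, blue for a medium level, gray for a low
    level, so the viewer can tell how strongly the image was blurred."""
    if level >= high:
        return "red"
    if level >= medium:
        return "blue"
    return "gray"
```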
  • the image processing unit 150 may display a sign indicating that the suppression is performed on the screen.
  • The image processing unit 150 can display such a sign on the screen by superimposing, on the IR image, an indication that suppression is being performed.
  • the image processing apparatus 100 is a tablet PC.
  • a sign U21 and a sign U22 are displayed on the screen of the image processing apparatus 100.
  • The sign U21 is a text label informing the user that blurring has been performed as a result of a human body being recognized in the IR image.
  • The sign U22 is a frame that surrounds the blurred area. By looking at the sign U22, the user can grasp not only that blurring has been performed but also which part of the output image has been blurred.
  • the sign U21 and the sign U22 illustrated in FIG. 12 are useful for the user who captures or uses the IR image to know that the blurring has been performed on the IR image.
  • the notification unit 170 can notify that an IR image is captured by means such as light emitted from a light emitting element such as an LED (Light Emitting Diode), or a sound effect or sound output from a speaker.
  • the notification unit 170 may perform notification with a different notification pattern depending on whether or not transmission suppression is performed by the image processing unit 150.
  • FIG. 13 is an explanatory diagram for explaining an example of a technique for notifying a nearby person that an IR image is captured.
  • the image processing apparatus 100 is a digital video camera.
  • a light emitting element 172 is disposed on the back surface of the image processing apparatus 100.
  • the notification unit 170 turns on the light emitting element 172 when an IR image is captured.
  • the color of light emitted from the light emitting element 172 is set to, for example, blue when transmission is suppressed by the image processing unit 150, and red when transmission is not suppressed.
  • the person who is the subject can grasp whether or not the transmission is appropriately suppressed by looking at the color of the light from the light emitting element 172.
  • FIG. 14A is a flowchart illustrating a first example of the flow of IR image processing according to the first embodiment. The process shown in FIG. 14A is repeated for each of the one or more IR images to be processed.
  • the IR image acquisition unit 120 acquires an IR image, and outputs the acquired IR image to the determination unit 140 and the image processing unit 150 (step S102).
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S112). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image based on the result of image recognition (step S120).
  • the image processing unit 150 sets a blurring area in the IR image so as to include the estimated position of the subject (step S131). Next, the image processing unit 150 blurs the partial image of the IR image in the set blurring area (step S135). If it is not determined that the subject whose transmission is to be suppressed appears in the IR image, the processes in steps S131 and S135 are skipped.
  • In step S138, the image processing unit 150 outputs, to the display 110, the communication interface 112, or the storage 114, either the IR image updated in step S135 or the IR image left unchanged because no subject whose transmission should be suppressed is shown.
  • In step S140, the IR image processing shown in FIG. 14A ends.
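The per-frame flow of FIG. 14A (acquire, recognize, determine, blur, output) can be summarized in a few lines; the callable interfaces below are assumptions introduced only for illustration, not part of this disclosure:

```python
def process_ir_image(ir_image, recognize, set_blur_region, blur):
    """One iteration of the FIG. 14A flow. recognize() returns the
    estimated position of a subject whose transmission should be
    suppressed, or None (steps S112/S120). When a subject is found, a
    blurring area covering it is set and blurred (steps S131/S135);
    otherwise both steps are skipped and the image passes through
    unchanged, ready for output in step S138."""
    position = recognize(ir_image)
    if position is None:
        return ir_image
    region = set_blur_region(ir_image, position)
    return blur(ir_image, region)
```

The second and fourth flow examples differ only in inserting a blurring-level determination (S133) or an auxiliary-image mixing step (S137) before the blur call.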
  • FIG. 14B is a flowchart illustrating a second example of the flow of IR image processing according to the first embodiment. The process shown in FIG. 14B is repeated for each of the one or more IR images to be processed.
  • the IR image acquisition unit 120 acquires an IR image, and outputs the acquired IR image to the determination unit 140 and the image processing unit 150 (step S102).
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S112). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image based on the result of image recognition (step S120).
  • the image processing unit 150 sets a blurring area in the IR image so as to include the estimated position of the subject (step S131).
  • The image processing unit 150 determines the blurring level based on one or more parameters among the infrared irradiation intensity when the IR image is captured, the ambient light intensity, the recognition likelihood of the image recognition, the distance to the subject, and the size of the subject (step S133).
  • the image processing unit 150 blurs the partial image of the IR image in the blurring area in accordance with the determined blurring level (step S136). If it is not determined that the subject whose transmission is to be suppressed appears in the IR image, the processes of steps S131, S133, and S136 are skipped.
  • In step S138, the image processing unit 150 outputs, to the display 110, the communication interface 112, or the storage 114, either the IR image updated in step S136 or the IR image left unchanged because no subject whose transmission should be suppressed is shown.
  • In step S140, the IR image processing shown in FIG. 14B ends.
  • FIG. 14C is a flowchart illustrating a third example of the IR image processing flow according to the first embodiment. The process shown in FIG. 14C is repeated for each of the one or more IR images to be processed.
  • an IR image is acquired by the IR image acquisition unit 120, and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S104).
  • the angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally match) each other.
  • the IR image acquisition unit 120 outputs the acquired IR image to the image processing unit 150.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140.
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the auxiliary image input from the auxiliary image acquisition unit 130 (step S114).
  • the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image based on the result of image recognition (step S120). For example, when a predetermined recognition target is recognized in the auxiliary image, the determination unit 140 can determine that the subject whose transmission should be suppressed appears in the IR image.
  • the image processing unit 150 sets a blurring area in the IR image so as to include the estimated position of the subject (step S131). Next, the image processing unit 150 blurs the partial image of the IR image in the set blurring area (step S135). If it is not determined that the subject whose transmission is to be suppressed appears in the IR image, the processes in steps S131 and S135 are skipped.
  • In step S138, the image processing unit 150 outputs, to the display 110, the communication interface 112, or the storage 114, either the IR image updated in step S135 or the IR image left unchanged because no subject whose transmission should be suppressed is shown.
  • In step S140, the IR image processing shown in FIG. 14C ends.
  • FIG. 14D is a flowchart illustrating a fourth example of the IR image processing flow according to the first embodiment. The process shown in FIG. 14D is repeated for each of the one or more IR images to be processed.
  • an IR image is acquired by the IR image acquisition unit 120, and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S104).
  • the angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally match) each other.
  • the IR image acquisition unit 120 outputs the acquired IR image to the determination unit 140 and the image processing unit 150.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the image processing unit 150.
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S112). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed appears in the IR image based on the result of image recognition (step S120).
  • The image processing unit 150 sets a blurring area in the IR image and sets a corresponding area in the auxiliary image (step S132).
  • the image processing unit 150 determines the blurring level based on the one or more parameters described above (step S133).
  • the blurring level here may correspond to, for example, the mixing ratio described above.
  • the image processing unit 150 blurs the partial image in the blurred area by mixing the auxiliary image with the IR image in the blurred area in accordance with the determined blur level (step S137). If it is not determined that the subject whose transmission is to be suppressed appears in the IR image, the processes of steps S132, S133, and S137 are skipped.
  • In step S138, the image processing unit 150 outputs, to the display 110, the communication interface 112, or the storage 114, either the IR image updated in step S137 or the IR image left unchanged because no subject whose transmission should be suppressed is shown.
  • In step S140, the IR image processing shown in FIG. 14D ends.
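The mixing step of the fourth example (step S137) amounts to alpha-blending the auxiliary image into the IR image inside the blurring area, with the mixing ratio playing the role of the blurring level. A minimal sketch, assuming plain list-of-lists images and an exclusive (top, left, bottom, right) region:

```python
def mix_in_region(ir, aux, region, alpha):
    """Blur the blurring area of the IR image by mixing in the
    auxiliary image: inside the region each output pixel becomes
    alpha * aux + (1 - alpha) * ir, where alpha is the mixing ratio.
    Pixels outside the region are passed through unchanged."""
    top, left, bottom, right = region
    out = [row[:] for row in ir]
    for y in range(top, bottom):
        for x in range(left, right):
            out[y][x] = alpha * aux[y][x] + (1 - alpha) * ir[y][x]
    return out
```

With `alpha = 1.0` the region is fully replaced by the auxiliary image; with `alpha = 0.0` no suppression is applied.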
  • the various processing steps described so far are not limited to the examples shown in the flowchart, and may be combined in any way.
  • the IR image and the auxiliary image may be mixed for blurring as in the fourth example.
  • For example, the image to be used may be switched depending on the time zone, so that the subject is recognized using the visible light image as the auxiliary image in the daytime period, when the visible light image is available, as in the third example, and using the IR image in the nighttime period, as in the first example.
  • Second Embodiment> In the first embodiment described in the previous section, the blurring region corresponding to the whole or a part of the IR image is blurred. In the second embodiment, by contrast, in order to prevent inappropriate actions caused by infrared transparency with a simpler implementation, the transmission suppression process invalidates the imaging, display, or recording of the whole or a part of the IR image.
  • FIG. 15 is a block diagram illustrating an example of a logical function configuration of the image processing apparatus 200 according to the second embodiment.
  • the image processing apparatus 200 includes an IR image acquisition unit 120, an auxiliary image acquisition unit 130, a determination unit 140, a recognition DB 145, an image processing unit 250, a user interface unit 260, and a notification unit 170.
  • The determination unit 140 performs image recognition for recognizing a predetermined recognition target (for example, a human face, body, or part of a body) on an IR image or an auxiliary image, and thereby determines whether a subject whose transmission should be suppressed is shown in the IR image. The determination unit 140 then outputs the determination result to the image processing unit 250.
  • The image processing unit 250 at least partially suppresses the display or recording of the IR image based on the determination result input from the determination unit 140. More specifically, in the present embodiment, when the determination unit 140 determines that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 250 invalidates the imaging, display, or recording of the whole or a part of the IR image.
  • For example, the image processing unit 250 may stop the imaging of the IR image by sending an invalidation signal to the IR image acquisition unit 120 (or the infrared camera 102).
  • the imaging of the IR image can be resumed triggered by a user input detected by the user interface unit 260 or when the recognition target is no longer recognized in the auxiliary image.
  • When a predetermined recognition target is recognized in the IR image or the auxiliary image, the image processing unit 250 may stop outputting the IR image to the display 110 or stop recording the IR image in the storage 114.
  • When invalidating the imaging, display, or recording, the image processing unit 250 may replace the IR image with an auxiliary image, for example in one or more frames of a series of frames that are determined to show a subject whose transmission should be suppressed, or in a part of one frame.
  • The auxiliary image here may be a visible light image, an additional IR image (for example, a thermal image with low transparency), a depth map, or a fixed image prepared in advance.
  • FIG. 16 is an explanatory diagram for explaining invalidation of imaging, display, or recording according to the present embodiment.
  • the left side of FIG. 16 shows a series of IR images Im21 to Im24 input to the image processing unit 250 along the time axis.
  • The earliest IR image Im21 does not show a subject whose transmission should be suppressed, but the subsequent IR images Im22 and Im23 show a person, and a rectangular frame 242a and a face frame 242b recognized in each IR image are added. The subsequent IR image Im24 again shows no subject whose transmission should be suppressed.
  • the image processing unit 250 sequentially receives these IR images from the IR image acquisition unit 120 and receives corresponding determination results from the determination unit 140.
  • The image processing unit 250 outputs the IR image Im21 and the IR image Im24 to an output destination such as the display 110 or the storage 114, but does not output the IR image Im22 and the IR image Im23, in which the subject whose transmission should be suppressed appears, to that output destination.
  • An auxiliary image such as a visible light image, a thermal image, or a depth map may be output in place of an IR image that is not output.
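The frame gating of FIG. 16 can be sketched as a filter over the frame sequence (the callable `shows_subject` stands in for the determination result from the determination unit 140; it and the function name are assumptions for illustration):

```python
def gate_frames(frames, shows_subject, substitute=None):
    """Second-embodiment invalidation: frames in which a subject whose
    transmission should be suppressed is recognized are not output.
    Optionally, a substitute (auxiliary) frame is output in their
    place, so the output stream keeps its timing."""
    output = []
    for frame in frames:
        if shows_subject(frame):
            if substitute is not None:
                output.append(substitute)
        else:
            output.append(frame)
    return output
```

Applied to the sequence Im21 to Im24 of FIG. 16, only Im21 and Im24 survive, with Im22 and Im23 either dropped or replaced by the auxiliary image.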
  • the image processing unit 250 may display a sign indicating that transmission is being suppressed on the screen.
  • the sign may be superimposed on an auxiliary image that is output instead of the IR image.
  • the notification unit 170 may notify a nearby person that an IR image is captured.
  • the notification unit 170 may perform notification with a different notification pattern depending on whether or not transmission suppression is performed by the image processing unit 250.
  • FIG. 17A is a flowchart illustrating a first example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17A is repeated for each of one or more IR images to be processed.
  • the IR image acquisition unit 120 acquires an IR image, and outputs the acquired IR image to the determination unit 140 and the image processing unit 250 (step S202).
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S212). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed is reflected in the IR image based on the image recognition result (step S220).
  • When it is determined that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 250 skips the output of the IR image to the display 110, the communication interface 112, or the storage 114 (step S232). On the other hand, when it is determined that no such subject is reflected in the IR image, the image processing unit 250 outputs the IR image to one of these output destinations (step S236).
  • In step S240, the IR image processing shown in FIG. 17A ends.
  • FIG. 17B is a flowchart illustrating a second example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17B is repeated for each of one or more IR images to be processed.
  • an IR image is acquired by the IR image acquisition unit 120, and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S204).
  • the angles of view of the IR image and the auxiliary image are calibrated so as to overlap (ideally match) each other.
  • The IR image acquisition unit 120 outputs the acquired IR image to the image processing unit 250.
  • the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the determination unit 140.
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the auxiliary image input from the auxiliary image acquisition unit 130 (step S214).
  • the determination unit 140 determines whether a subject whose transmission should be suppressed is reflected in the IR image based on the image recognition result (step S220). For example, when a predetermined recognition target is recognized in the auxiliary image, the determination unit 140 can determine that the subject whose transmission should be suppressed appears in the IR image.
  • When it is determined that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 250 skips the output of the IR image to the display 110, the communication interface 112, or the storage 114 (step S232). On the other hand, when it is determined that no such subject is reflected in the IR image, the image processing unit 250 outputs the IR image to one of these output destinations (step S236).
  • In step S240, the IR image processing shown in FIG. 17B ends.
  • FIG. 17C is a flowchart illustrating a third example of the flow of IR image processing according to the second embodiment. The process shown in FIG. 17C is repeated for each of the one or more IR images to be processed.
  • an IR image is acquired by the IR image acquisition unit 120, and an auxiliary image is acquired by the auxiliary image acquisition unit 130 (step S204).
  • The IR image acquisition unit 120 outputs the acquired IR image to the determination unit 140 and the image processing unit 250. Further, the auxiliary image acquisition unit 130 outputs the acquired auxiliary image to the image processing unit 250.
  • the determination unit 140 performs image recognition for recognizing a predetermined recognition target for the IR image input from the IR image acquisition unit 120 (step S212). Next, the determination unit 140 determines whether a subject whose transmission should be suppressed is reflected in the IR image based on the image recognition result (step S220).
  • When it is determined that a subject whose transmission should be suppressed is reflected in the IR image, the image processing unit 250 replaces the IR image or its partial image with the auxiliary image input from the auxiliary image acquisition unit 130 (step S233). The image processing unit 250 then outputs the replaced IR image to the display 110, the communication interface 112, or the storage 114 (step S234). On the other hand, when it is determined that no such subject is reflected in the IR image, the image processing unit 250 outputs the IR image input from the IR image acquisition unit 120 as it is to one of these output destinations (step S236).
  • In step S240, the IR image processing shown in FIG. 17C ends.
  • In the embodiments described above, image recognition for recognizing a recognition target such as a human face, body, or part of a body is performed on an infrared image, or on an auxiliary image having an angle of view overlapping the infrared image, and when the recognition target is recognized, it is determined that a subject whose transmission should be suppressed is reflected in the infrared image. The above-described mechanism can therefore be easily incorporated into an apparatus or system by utilizing an existing image recognition technology such as face recognition or person recognition.
  • a region corresponding to the whole or a part of the infrared image is blurred.
  • the portion that is not blurred is still clearly visible by the user, and the subject can be identified to some extent (depending on the blurring level) even in the portion that is blurred. Therefore, it is possible to secure a wider opportunity to use the infrared image than in the case where the imaging of the IR image is prohibited.
  • Further, the blurring level used when blurring the area in which the subject whose transmission should be suppressed appears is determined dynamically. Therefore, when the use of an infrared image is likely to lead to inappropriate acts, such acts caused by infrared transparency can be effectively prevented by adaptively increasing the blurring level.
  • Further, when it is determined that a subject whose transmission should be suppressed is reflected in the infrared image, the imaging, display, or recording of the whole or a part of the infrared image may be invalidated.
  • In that case, since it is not necessary to implement pixel-value processing for blurring, the mechanism for suppressing the transmission of the subject can be realized at lower cost or with a smaller processing delay.
  • a series of control processing by each device described in this specification may be realized using any of software, hardware, and a combination of software and hardware.
  • the program constituting the software is stored in advance in a storage medium (non-transitory medium) provided inside or outside each device.
  • Each program is read into a RAM at the time of execution, for example, and executed by a processor such as a CPU.
  • processing described using the flowchart in this specification does not necessarily have to be executed in the order shown in the flowchart. Some processing steps may be performed in parallel. Further, additional processing steps may be employed, and some processing steps may be omitted.
  • (1) An image processing apparatus including: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed is reflected in the infrared image; and a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result by the determination unit. (2) The image processing apparatus according to (1), wherein the determination unit determines that the subject is reflected in the infrared image when a predetermined recognition target is recognized by image recognition of the infrared image or of an auxiliary image having an angle of view overlapping the infrared image.
  • (3) The image processing apparatus according to (2), wherein the predetermined recognition target includes a human face, a body, or a part of a body. (4) The image processing apparatus according to any one of (1) to (3), wherein the processing unit blurs a blurring area corresponding to the whole or a part of the infrared image when the determination unit determines that the subject is reflected in the infrared image. (5) The image processing apparatus according to (4), wherein the processing unit smoothes or fills the blurring area, or blurs the blurring area by mixing other pixel values into the blurring area. (6) The image processing apparatus according to (4) or (5), wherein the processing unit sets the blurring area in the infrared image so that the blurring area includes an estimated position of the subject determined to be reflected in the infrared image.
  • (7) The image processing apparatus according to any one of (4) to (6), wherein the processing unit blurs the blurring area in accordance with a dynamically determined blurring level.
  • (8) The image processing apparatus according to (7), wherein the processing unit determines the blurring level based on one or more of an infrared irradiation intensity and an ambient light intensity when the infrared image is captured.
  • The image processing apparatus according to (7), wherein the determination unit determines that the subject is reflected in the infrared image when a predetermined recognition target is recognized by image recognition of the infrared image or of an auxiliary image having an angle of view overlapping the infrared image, and the processing unit determines the blurring level based on a recognition likelihood of the image recognition.
  • The image processing apparatus according to (7), wherein the processing unit determines the blurring level based on one or more of a distance to the subject when the infrared image is captured and a size of the subject in the infrared image.
  • (13) The image processing apparatus according to (5), wherein the processing unit determines a fill color or a mixing color for blurring based on a representative value of pixel values belonging to the infrared image.
  • The image processing apparatus according to any one of (1) to (3), wherein the processing unit invalidates imaging, display, or recording of the whole or a part of the infrared image when the determination unit determines that the subject is reflected in the infrared image.
  • the auxiliary image includes one or more of a visible light image, a thermal image, and a depth map.
  • the processing unit displays a sign indicating that the suppression is performed on the screen.
  • The image processing apparatus further includes: a camera that captures the infrared image; and a notification unit that notifies a nearby person that the infrared image is captured by the camera, wherein the notification unit performs the notification with a different notification pattern depending on whether or not the suppression is performed.
  • the image processing apparatus according to any one of (1) to (17).
  • An image processing method including: (20) A program for causing a computer that controls an image processing apparatus to function as: an infrared image acquisition unit that acquires an infrared image; a determination unit that determines whether a subject whose transmission should be suppressed is reflected in the infrared image; and a processing unit that at least partially suppresses display or recording of the infrared image based on a determination result by the determination unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The purpose of the invention is to prevent inappropriate actions from occurring due to the properties that make infrared light transparent, while preserving the possibility for infrared images to be used appropriately. To this end, the present invention provides an image processing device equipped with the following elements: an infrared image acquisition unit that acquires infrared images; a determination unit that determines whether or not a subject, for which transparency should be suppressed, is shown in the infrared image; and a processing unit that, based on the determination results from the determination unit, at least partially suppresses the display or recording of the infrared image.
PCT/JP2015/071543 2014-10-24 2015-07-29 Image processing device, image processing method, and program WO2016063595A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014217473 2014-10-24
JP2014-217473 2014-10-24

Publications (1)

Publication Number Publication Date
WO2016063595A1 true WO2016063595A1 (fr) 2016-04-28

Family

ID=55760644

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/071543 WO2016063595A1 (fr) 2014-10-24 2015-07-29 Image processing device, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2016063595A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180110645A 2017-03-29 LegoChem Biosciences, Inc. Pyrrolobenzodiazepine dimer precursors and ligand-linker conjugate compounds thereof
KR20200084802A 2019-01-03 LegoChem Biosciences, Inc. Pyrrolobenzodiazepine dimer compounds with improved safety and use thereof
KR20220122590A 2019-01-03 LegoChem Biosciences, Inc. Pyrrolobenzodiazepine dimer compounds with improved safety and use thereof

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002358527A * 2001-05-31 2002-12-13 Hudson Soft Co Ltd Image processing device, image processing method, program for causing a computer to execute the image processing method and recording medium recording the program, as well as captured-image display device, captured-image display method, program for causing a computer to execute the captured-image display method and recording medium recording the program
JP2006041841A * 2004-07-26 2006-02-09 Nippon Telegraph & Telephone Corp (NTT) Imaging permission determination device, imaging system, imaging permission determination method, imaging permission determination program, and recording medium
JP2009201064A * 2008-02-25 2009-09-03 Pioneer Electronic Corp Related-region specifying device and method, and image recognition device and method
JP2012015834A * 2010-07-01 2012-01-19 Konica Minolta Opto Inc Imaging device
JP2013131824A * 2011-12-20 2013-07-04 Nikon Corp Electronic apparatus
JP2013242408A * 2012-05-18 2013-12-05 Canon Inc Imaging device and control method therefor


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180110645A 2017-03-29 LegoChem Biosciences, Inc. Pyrrolobenzodiazepine dimer precursors and ligand-linker conjugate compounds thereof
KR20210058795A 2017-03-29 LegoChem Biosciences, Inc. Pyrrolobenzodiazepine dimer precursors and ligand-linker conjugate compounds thereof
KR20220010048A 2017-03-29 LegoChem Biosciences, Inc. Pyrrolobenzodiazepine dimer precursors and ligand-linker conjugate compounds thereof
KR20200084802A 2019-01-03 LegoChem Biosciences, Inc. Pyrrolobenzodiazepine dimer compounds with improved safety and use thereof
KR20220122590A 2019-01-03 LegoChem Biosciences, Inc. Pyrrolobenzodiazepine dimer compounds with improved safety and use thereof

Similar Documents

Publication Publication Date Title
US11501535B2 Image processing apparatus, image processing method, and storage medium for reducing a visibility of a specific image region
CN105323497B Constant-bracketing high dynamic range (cHDR) operations
US9875530B2 Gradient privacy masks
US20190199898A1 Image capturing apparatus, image processing apparatus, control method, and storage medium
US11425298B2 Imaging device, imaging method, and program
JP6293571B2 Monitoring method and camera
US7574021B2 Iris recognition for a secure facility
WO2019148978A1 Image processing method and apparatus, storage medium, and electronic device
TWI689892B Background blurring method based on foreground image and electronic device
EP2813970A1 Camera and method for monitoring
US10255683B1 Discontinuity detection in video data
JP5071198B2 Traffic light recognition device, traffic light recognition method, and traffic light recognition program
IL256202A (en) A method for enhancing an ir or scam image based on video analysis information
JP2017201745A Image processing apparatus, image processing method, and program
WO2018233217A1 Image processing method and device, and augmented reality apparatus
WO2016063595A1 Image processing device, image processing method, and program
TWI542212B Photographic system with visibility enhancement
US20130308829A1 Still image extraction apparatus
CN112926367A Living-body detection device and method
JP2014178146A Image processing apparatus and method
KR101920740B1 Real-time image processing system
KR102474697B1 Imaging device and image processing method
JP2005173879A Fused image display device
JP2021149691A Image processing system and control program
JP5782870B2 Detection device and detection method

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 15852912; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 15852912; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)