WO2015166675A1 - Image Processing Device, Image Processing Method, and Program - Google Patents
- Publication number
- WO2015166675A1 (PCT/JP2015/051572)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- focus
- degree
- presentation
- determination
- Prior art date
Links
- 238000012545 processing Methods 0.000 title claims abstract description 503
- 238000003672 processing method Methods 0.000 title claims description 26
- 238000000034 method Methods 0.000 claims description 175
- 238000001514 detection method Methods 0.000 claims description 22
- 238000010801 machine learning Methods 0.000 claims description 3
- 230000008569 process Effects 0.000 description 137
- 238000003384 imaging method Methods 0.000 description 30
- 238000004891 communication Methods 0.000 description 29
- 230000006870 function Effects 0.000 description 29
- 238000011156 evaluation Methods 0.000 description 20
- 238000010586 diagram Methods 0.000 description 11
- 230000000694 effects Effects 0.000 description 8
- 239000003086 colorant Substances 0.000 description 7
- 230000005540 biological transmission Effects 0.000 description 6
- 230000008859 change Effects 0.000 description 3
- 238000012937 correction Methods 0.000 description 3
- 230000002708 enhancing effect Effects 0.000 description 3
- 239000000203 mixture Substances 0.000 description 3
- 238000004364 calculation method Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 238000001914 filtration Methods 0.000 description 2
- 230000001965 increasing effect Effects 0.000 description 2
- 239000004973 liquid crystal related substance Substances 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000001151 other effect Effects 0.000 description 2
- 230000002194 synthesizing effect Effects 0.000 description 2
- 238000006243 chemical reaction Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000005401 electroluminescence Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 238000012905 input function Methods 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 238000000926 separation method Methods 0.000 description 1
- 230000008719 thickening Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/70—Circuits for processing colour signals for colour killing
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/18—Signals indicating condition of a camera member or suitability of light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/74—Circuits for processing colour signals for obtaining special effects
- H04N9/76—Circuits for processing colour signals for obtaining special effects for mixing of colour signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
Definitions
- the present disclosure relates to an image processing device, an image processing method, and a program.
- the user can confirm the details of the location to be actually focused.
- however, an edge is not displayed for a portion of the image that has no high-frequency component, so the user may be unable to confirm whether that portion is in focus.
- This disclosure proposes a new and improved image processing apparatus, image processing method, and program capable of allowing a user to grasp the degree of focusing in an image.
- an image processing apparatus is provided that includes a presentation control unit that controls presentation of the degree of focus in a target image (the image to be processed) based on a first determination result, which is a determination result of the degree of focus in pixel units in the target image, and a second determination result, which is a determination result of the degree of focus in region units in the target image.
- an image processing apparatus is also provided that includes a first determination processing unit that determines the degree of focus in pixel units from the target image, which is the image to be processed, and a second determination processing unit that determines the degree of focus in region units from the target image.
- an image processing method executed by an image processing apparatus is provided that includes a step of controlling the presentation of the degree of focus in the target image based on the first determination result, which is the determination result of the degree of focus in pixel units in the target image, and the second determination result, which is the determination result of the degree of focus in region units in the target image.
- a program is provided for causing a computer to execute a step of controlling the presentation of the degree of focus in the target image based on the first determination result, which is the determination result of the degree of focus in pixel units in the target image, and the second determination result, which is the determination result of the degree of focus in region units in the target image.
- according to the present disclosure, the user can grasp the degree of focusing in the image.
- Image processing method according to the present embodiment: before describing the configuration of the image processing apparatus according to the present embodiment, the image processing method according to the present embodiment will first be described, taking as an example the case where the image processing apparatus according to the present embodiment performs the processing according to this method.
- the image processing apparatus controls the presentation of the degree of focus in an image to be processed (hereinafter referred to as “target image”) (presentation control process).
- the image processing apparatus controls the presentation of the degree of focus in the target image based on the determination result of the degree of focus in pixel units in the target image (hereinafter, the “first determination result”) and the determination result of the degree of focus in region units in the target image (hereinafter, the “second determination result”).
- the target image according to the present embodiment is, for example, a captured image generated by an imaging device having a plurality of imaging elements.
- examples of the captured image according to the present embodiment include a through image (an image generated by an imaging device and displayed on a display screen without being recorded on a recording medium, as used in so-called live view) and an image generated by an imaging device and read from a recording medium.
- hereinafter, a case where the target image according to the present embodiment is a through image will be described as an example.
- “control of the presentation of the degree of focus in the target image” includes, for example, generation control for generating an image based on the target image (hereinafter referred to as a “presentation image”).
- the presentation image according to the present embodiment is an image in which the degree of focus is represented by the manner of presentation, based on one or both of the first determination result and the second determination result.
- examples of the presentation image according to the present embodiment include the following:
- a presentation image representing the degree of focus in pixel units based on the first determination result
- a presentation image representing the degree of focus in region units based on the second determination result
- a presentation image representing both the degree of focus in pixel units based on the first determination result and the degree of focus in region units based on the second determination result
- the image processing apparatus generates, for example, the presentation image as the process related to generation control according to the present embodiment. Alternatively, the image processing apparatus according to the present embodiment may cause an external device to generate the presentation image as the process related to generation control.
- when the image processing apparatus according to the present embodiment performs the process related to generation control as the presentation control process, the user can grasp the degree of focus in the target image by, for example, looking at the generated presentation image. Thus, by performing the process related to generation control as the presentation control process, the image processing apparatus according to the present embodiment allows the user to grasp the degree of focusing in the image.
- control of presentation of the degree of focus in the target image is not limited to the above.
- the “control of presentation of the degree of focus in the target image” according to the present embodiment may further include display control for displaying the generated presentation image on a display screen.
- the display control process according to the present embodiment is performed, for example, when the display screen for displaying the presentation image is in a displayable state (for example, when the power is on).
- however, the presentation control process in this case is not limited to the above: even when the display screen for displaying the presentation image is not in a displayable state (for example, when the power is off), the presentation image can be displayed on the display screen once the display screen enters a displayable state.
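The display control described above can be sketched as a small guard: generate the presentation image, then display it only while the display screen is in a displayable state. The following is a minimal Python sketch; the callables `generate`, `display`, and `displayable`, and the function name `control_presentation`, are hypothetical stand-ins and do not come from the disclosure.

```python
def control_presentation(generate, display, displayable):
    """Presentation control with display control: show the generated
    presentation image only if the display screen is in a displayable
    state (e.g. powered on); otherwise defer the display.
    All three arguments are hypothetical callables."""
    presentation_image = generate()    # process related to generation control
    if displayable():                  # e.g. the display power is on
        display(presentation_image)    # process related to display control
        return True
    return False  # caller retries once the screen becomes displayable
```

A caller could poll `displayable()` and invoke `display` later, matching the behavior in which the presentation image is shown once the display screen enters a displayable state.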
- as the process related to display control according to the present embodiment, the image processing apparatus according to the present embodiment displays, for example, the generated presentation image on the display screen of a display unit (described later) included in the image processing apparatus.
- alternatively, the image processing apparatus according to the present embodiment causes an external display device to display the presentation image by transmitting data indicating the presentation image and a display command to the display device via a communication unit (described later) included in the image processing apparatus, or via a communication device connected to the image processing apparatus.
- the processing related to display control according to the present embodiment is not limited to the above.
- for example, the image processing apparatus according to the present embodiment may magnify and display a part of the generated presentation image (for example, a focused part in the presentation image), so-called enlarged focus display.
- the image processing apparatus according to the present embodiment can also display the presentation image on the display screen by various other methods, such as displaying the presentation image over a part of the target image as PinP (Picture in Picture).
- hereinafter, the case where the image processing apparatus according to the present embodiment performs both the process related to generation of the presentation image and the process related to display control as the presentation control process, that is, the case where the image processing apparatus itself displays the presentation image on the display screen, is taken as an example.
- even when the image processing apparatus according to the present embodiment performs only the process related to generation of the presentation image as the presentation control process, if the user displays the presentation image on the display screen manually by performing an operation, the same effect is obtained as when the image processing apparatus according to the present embodiment, described later, displays the presentation image on the display screen.
- the first determination result according to the present embodiment is a result of determining the degree of focus in the target image in pixel units, and indicates where the target image is in focus. Examples of the first determination result according to the present embodiment include an edge image representing edges detected from the target image.
- the process for determining the degree of focus in pixel units from the target image may be performed by the image processing apparatus according to the present embodiment, or may be performed by an external apparatus of the image processing apparatus according to the present embodiment.
- An example of the first determination result according to the present embodiment and an example of a process for determining the in-focus level for each pixel according to the present embodiment will be described later.
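As one concrete possibility for the pixel-unit determination, the numpy sketch below derives a binary edge image from a grayscale target image using a 3×3 Laplacian-style filter. The filter choice, the `threshold` value, and the function name `edge_image` are illustrative assumptions; the disclosure leaves the filter coefficients and filter type open.

```python
import numpy as np

def edge_image(target, threshold=0.1):
    """Pixel-unit focus determination (sketch): mark pixels whose
    high-frequency response exceeds `threshold` (assumed value).
    `target` is a 2-D grayscale array."""
    img = target.astype(float)
    # 3x3 Laplacian response via shifted differences; one possible
    # filter, since the coefficients/type are left open.
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    edges = np.zeros(img.shape, dtype=bool)
    edges[1:-1, 1:-1] = np.abs(lap) > threshold
    return edges
```

Pixels whose response exceeds the threshold are treated as in focus; superimposing `edges` on the target image yields the kind of pixel-unit presentation described above.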
- the second determination result according to the present embodiment is a result of determining the degree of focus in the target image in units of regions obtained by dividing the target image.
- the second determination result according to the present embodiment indicates which regions of the target image are in focus.
- Examples of the area according to the present embodiment include a rectangular area obtained by dividing the target image in the horizontal direction and the vertical direction.
- the area according to the present embodiment is not limited to the above, and may be an area having an arbitrary shape for dividing the target image.
- hereinafter, a case where the region according to the present embodiment is a rectangular region obtained by equally dividing the target image in the horizontal and vertical directions will be described.
- examples of the second determination result according to the present embodiment include a determination image generated based on a score map in which the degree of focusing is quantified for each region.
- the determination image according to the present embodiment is generated, for example, by representing only a region having a value equal to or greater than a predetermined threshold (or a value greater than the threshold) among the regions constituting the score map.
- the second determination result according to the present embodiment may include, for example, a score map.
- the second determination result according to the present embodiment may be a score map.
- the process for determining the degree of focusing in units of regions from the target image may be performed by the image processing apparatus according to the present embodiment, or may be performed by an external apparatus of the image processing apparatus according to the present embodiment.
- An example of the second determination result according to the present embodiment and an example of a process for determining the in-focus level for each region according to the present embodiment will be described later.
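The region-unit determination described above can be sketched as follows: divide a focus-response (e.g. edge) image into an equal grid of rectangular regions, quantify the degree of focusing per region into a score map, and keep only the regions at or above a threshold to form the determination image. The grid shape, the scoring rule (fraction of edge pixels per region), and the threshold are assumptions for illustration.

```python
import numpy as np

def region_score_map(edges, grid=(4, 4)):
    """Score map (sketch): quantify focus per region as the fraction
    of edge pixels in each cell of an equal grid. Grid shape and
    scoring rule are assumptions."""
    h, w = edges.shape
    gy, gx = grid
    scores = np.zeros(grid)
    for i in range(gy):
        for j in range(gx):
            block = edges[i * h // gy:(i + 1) * h // gy,
                          j * w // gx:(j + 1) * w // gx]
            scores[i, j] = block.mean()
    return scores

def determination_image(scores, threshold=0.1):
    """Keep only regions whose score is at or above the
    (assumed) threshold, per the description above."""
    return scores >= threshold
```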
- the image processing apparatus according to the present embodiment performs, for example, the following processes (A) to (C) as the presentation control process according to the present embodiment, and displays the presentation image on the display screen.
- (A) Example of processing for displaying a presentation image representing the degree of focus in pixel units: the image processing apparatus according to the present embodiment generates, as the presentation image, an image in which an edge image (the first determination result), or an image based on the edge image, is superimposed on the target image, so that the degree of focus in pixel units is represented.
- examples of the image based on the edge image according to the present embodiment include an image obtained by processing the edge image without changing the edges it indicates, such as an image in which the color, color density, or luminance of the edge image is changed.
- the presented image in which the degree of focus in pixel units according to the present embodiment is represented is an image in which a target image and an edge image (or an image based on an edge image; hereinafter the same) are combined.
- an image in which the target image and the edge image are superimposed on different layers may be used.
- the image processing apparatus according to the present embodiment displays, for example, the presentation image in which the degree of focus in pixel units is represented, that is, the presentation image in which the edge image is superimposed on the target image, on the display screen.
- the user who views the presentation image representing the degree of focus in pixel units on the display screen can grasp the degree of focus in the details of the target image from the edge image superimposed on the target image.
- (B) Example of processing for displaying a presentation image representing the degree of focus in region units: the image processing apparatus according to the present embodiment generates, as the presentation image, an image in which a determination image (the second determination result), or an image based on the determination image, is superimposed on the target image, so that the degree of focus in region units is represented.
- examples of the image based on the determination image according to the present embodiment include an image obtained by processing the determination image, such as an image in which the color, color density, or luminance of the determination image is changed.
- the presented image in which the degree of focus in units of regions according to the present embodiment is represented is an image in which a target image and a determination image (or an image based on the determination image; hereinafter the same) are combined.
- an image in which the target image and the determination image are superimposed on different layers may be used.
- the image processing apparatus displays, on the display screen, for example, a presentation image in which the degree of focusing in units of regions is represented, that is, a presentation image in which a determination image is superimposed on the target image.
- the user who sees the presented image showing the degree of focusing in units of areas displayed on the display screen can grasp the focused area from the determination image superimposed on the target image.
- since the focused area corresponds to, for example, the range of the depth of field, the user can grasp the range of the depth of field based on the determination image superimposed on the target image.
- (C) Example of processing for displaying a presentation image representing both the degree of focus in pixel units and the degree of focus in region units: when displaying a presentation image in which both the degree of focus in pixel units and the degree of focus in region units are represented, the image processing apparatus according to the present embodiment generates, as the presentation image, an image in which both the edge image and the determination image are superimposed on the target image, and displays that presentation image on the display screen.
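Process (C), superimposing both the edge image and the determination image on the target image, might look like the numpy sketch below. The overlay colors, the blending factor `region_alpha`, and the function name `presentation_image` are illustrative choices, not specified by the disclosure.

```python
import numpy as np

def presentation_image(target_rgb, edges, region_mask,
                       edge_color=(255, 0, 0), region_alpha=0.3):
    """Sketch of process (C): superimpose the edge image (pixel-unit
    result) and the determination image (region-unit result) on the
    target image. Colors and alpha are assumed values."""
    out = target_rgb.astype(float).copy()
    h, w = edges.shape
    gy, gx = region_mask.shape
    # Tint each focused region with a translucent green overlay.
    for i in range(gy):
        for j in range(gx):
            if region_mask[i, j]:
                ys = slice(i * h // gy, (i + 1) * h // gy)
                xs = slice(j * w // gx, (j + 1) * w // gx)
                out[ys, xs] = ((1 - region_alpha) * out[ys, xs]
                               + region_alpha * np.array([0, 255, 0]))
    # Draw the edges on top in a solid color.
    out[edges] = edge_color
    return out.astype(np.uint8)
```

Superimposing only one of the two overlays reduces this to processes (A) or (B); drawing both, as here, corresponds to the presentation image of process (C).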
- FIG. 1 is an explanatory diagram illustrating an example of a presentation image according to the present embodiment.
- FIG. 1 shows an example of a presentation image in which the degree of focus in pixel units and the degree of focus in region units are represented.
- A shown in FIG. 1 shows an example of the edges indicated by the edge image superimposed on the target image.
- B shown in FIG. 1 shows an example of the determination image superimposed on the target image.
- the user who views the presentation image shown in FIG. 1 on the display screen can grasp the degree of focus in the details of the target image from the edges (A in FIG. 1) indicated by the edge image superimposed on the target image. Since these edges allow the user to visually recognize the details of the in-focus area in the target image, confirming the focus state can be performed more easily.
- similarly, the user who views the presentation image shown in FIG. 1 can visually recognize the focused position in the target image from the determination image (B in FIG. 1) superimposed on the target image, and can thus grasp the focused area (the range of the depth of field).
- since the user can visually recognize the focused spot in the target image from the determination image (B in FIG. 1) superimposed on the target image, the focused area can be grasped more quickly.
- therefore, the user who views the presentation image displayed on the display screen can grasp both the degree of focus in the details of the target image and the focused area (the range of the depth of field) from the edge image and the determination image superimposed on the target image.
- since both the degree of focus in the details of the target image and the focused area (the range of the depth of field) can be grasped, the user can grasp the degree of focusing in the target image more easily and more intuitively.
- because the presentation image according to the present embodiment superimposes both the edge image and the determination image on the target image, the determination image allows the degree of focus to be grasped even for portions of the target image where no edge exists.
- the image processing apparatus according to the present embodiment generates the presentation image by performing, for example, the processes shown in (A) to (C) above as the presentation control process, and displays the generated presentation image on the display screen.
- the presentation image according to the present embodiment is an image in which one or both of the degree of focusing in pixel units and the degree of focusing in region units are represented.
- by performing, for example, the processes (A) to (C) described above as the presentation control process according to the present embodiment, the image processing apparatus can allow the user to grasp the degree of focus in the image.
- the image processing apparatus according to the present embodiment can also improve the operability of the imaging device by performing, for example, the processes shown in (A) to (C) above as the presentation control process.
- the image processing apparatus can cause the user to grasp the degree of focusing in the image by performing, for example, the presentation control process as a process related to the image processing method according to the present embodiment.
- note that the presentation control process above delimits the process related to the image processing method according to the present embodiment for convenience; in the process according to the image processing method, the presentation control process can therefore also be regarded as two or more processes (by an arbitrary way of dividing it).
- processing related to the image processing method according to the present embodiment is not limited to the above.
- as processes related to the image processing method according to the present embodiment, the image processing apparatus may perform one or both of the process for determining the degree of focus in pixel units from the target image and the process for determining the degree of focus in region units from the target image.
- when the image processing apparatus performs one or both of these determination processes, it performs the presentation control process using the result of the performed process (the first determination result and/or the second determination result).
- FIG. 2 is a block diagram illustrating an example of the configuration of the image processing apparatus 100 according to the present embodiment.
- FIGS. 3 to 8 are explanatory diagrams showing an example of an image related to processing in the image processing apparatus 100 according to the present embodiment.
- A shown in FIGS. 3 to 8 shows an example of the target image.
- B shown in FIGS. 3 to 8 shows an example of an edge image (an example of the first determination result) obtained from the target image shown in A of FIGS. 3 to 8, and C shown in FIGS. 3 to 8 shows another example of the edge image (another example of the first determination result).
- B and C shown in FIGS. 3 to 8 are edge images obtained with different filter coefficients and filter types for the filter that obtains the edge image from the target image; C shown in FIGS. 3 to 8 can be said to be an edge image representing only the portions determined to be in focus in the target image shown in A of FIGS. 3 to 8.
- further, D shown in FIGS. 3 to 8 shows an example of the determination image, and E shown in FIGS. 3 to 8 shows an example of the presentation image.
- the image processing apparatus 100 includes, for example, a first determination processing unit 102, a second determination processing unit 104, and a presentation control unit 106.
- the image processing apparatus 100 may further include, for example, a control unit (not shown) that controls the entire image processing apparatus 100, a ROM (Read Only Memory; not shown), a RAM (Random Access Memory; not shown), a communication unit (not shown) for communicating with an external device such as an external display device, a storage unit (not shown), an operation unit (not shown) that can be operated by the user, and a display unit (not shown) that displays various screens on a display screen.
- the control unit (not shown) is configured by, for example, a processor configured by an arithmetic circuit such as an MPU, various circuits, and the like, and controls the entire image processing apparatus 100.
- the control unit may serve as one or more of the first determination processing unit 102, the second determination processing unit 104, and the presentation control unit 106 in the image processing apparatus 100.
- needless to say, the first determination processing unit 102, the second determination processing unit 104, and the presentation control unit 106 may also be configured by dedicated (or general-purpose) circuits capable of realizing the processing of each unit.
- ROM (not shown) stores control data such as programs and calculation parameters used by a control unit (not shown), for example.
- a RAM (not shown) temporarily stores, for example, a program executed by a control unit (not shown).
- the communication unit (not shown) is a communication unit included in the image processing apparatus 100, and performs wireless or wired communication with an external device via a network (or directly). Communication of the communication unit (not shown) is controlled by, for example, the control unit (not shown). An example of the communication unit (not shown) is the communication interface described later.
- the storage unit (not shown) is a storage unit included in the image processing apparatus 100, and stores various data such as applications.
- data related to processing related to the image processing method according to the present embodiment such as image data indicating a captured image, may be stored.
- Examples of the storage unit (not shown) include a recording medium described later.
- examples of the operation unit (not shown) include an operation input device described later, and examples of the display unit (not shown) include a display device described later.
- FIG. 9 is an explanatory diagram illustrating an example of a hardware configuration of the image processing apparatus 100 according to the present embodiment.
- the image processing apparatus 100 includes, for example, an MPU 150, a ROM 152, a RAM 154, a recording medium 156, an input/output interface 158, an operation input device 160, a display device 162, a communication interface 164, an imaging device 166, and a sensor 168.
- the image processing apparatus 100 connects each component with a bus 170 as a data transmission path, for example.
- the MPU 150 is configured by, for example, a processor configured by an arithmetic circuit such as an MPU, various processing circuits, and the like, and functions as a control unit (not shown) that controls the entire image processing apparatus 100.
- the MPU 150 plays the role of one or more of the first determination processing unit 102, the second determination processing unit 104, and the presentation control unit 106 in the image processing apparatus 100, for example.
- Needless to say, in the image processing apparatus 100, the first determination processing unit 102, the second determination processing unit 104, and the presentation control unit 106 may also be configured by dedicated (or general-purpose) circuits capable of realizing the processing of each unit.
- the ROM 152 stores programs used by the MPU 150, control data such as calculation parameters, and the like.
- the RAM 154 temporarily stores a program executed by the MPU 150, for example.
- the recording medium 156 functions as a storage unit (not shown), and stores various data such as applications. Further, the recording medium 156 may store data related to processing related to the image processing method according to the present embodiment, such as image data indicating a captured image.
- Examples of the recording medium 156 include a magnetic recording medium such as a hard disk, a non-volatile memory such as a flash memory, and the like. Further, the storage unit (not shown) may be detachable from the image processing apparatus 100.
- the input / output interface 158 connects, for example, the operation input device 160 and the display device 162.
- the operation input device 160 functions as an operation unit (not shown)
- the display device 162 functions as a display unit (not shown).
- examples of the input / output interface 158 include a USB terminal, a DVI (Digital Visual Interface) terminal, an HDMI (High-Definition Multimedia Interface) (registered trademark) terminal, and various processing circuits.
- the operation input device 160 is provided on the image processing apparatus 100, for example, and is connected to the input / output interface 158 inside the image processing apparatus 100.
- Examples of the operation input device 160 include a button, a direction key, a rotary selector such as a jog dial, or a combination thereof.
- the display device 162 is provided on the image processing apparatus 100, for example, and is connected to the input / output interface 158 inside the image processing apparatus 100.
- Examples of the display device 162 include a liquid crystal display (Liquid Crystal Display) and an organic EL display (Organic Electro-Luminescence Display; also called an OLED display (Organic Light Emitting Diode Display)).
- the input / output interface 158 can also be connected to an external device of the image processing apparatus 100, such as an operation input device (for example, a keyboard or a mouse), a display device, or an imaging device.
- the display device 162 may be a device capable of display and user operation, such as a touch device.
- the communication interface 164 is a communication unit included in the image processing apparatus 100, and functions as a communication unit (not shown) for performing wireless or wired communication with an external device, such as an external imaging device or an external display device, via a network (or directly).
- Examples of the communication interface 164 include a communication antenna and an RF (Radio Frequency) circuit (wireless communication), an IEEE 802.15.1 port and a transmission / reception circuit (wireless communication), an IEEE 802.11 port and a transmission / reception circuit (wireless communication), and a LAN (Local Area Network) terminal and a transmission / reception circuit (wired communication).
- Note that the communication unit (not shown) may have a configuration corresponding to an arbitrary standard capable of communication, such as a USB (Universal Serial Bus) terminal and a transmission / reception circuit, or an arbitrary configuration capable of communicating with an external device via a network.
- Examples of the network according to the present embodiment include a wired network such as a LAN or a WAN (Wide Area Network), a wireless network such as a wireless LAN (WLAN: Wireless Local Area Network) or a wireless WAN (WWAN: Wireless Wide Area Network) via a base station, and the Internet using a communication protocol such as TCP / IP (Transmission Control Protocol / Internet Protocol).
- the imaging device 166 is an imaging unit included in the image processing apparatus 100, and generates an image (captured image) by imaging.
- the image processing apparatus 100 can perform processing according to the image processing method of the present embodiment using, for example, a captured image generated by imaging in the imaging device 166 as the target image.
- the imaging device 166 includes, for example, a lens / imaging device and a signal processing circuit.
- the lens / imaging device includes, for example, an optical lens and an image sensor using a plurality of imaging devices such as CMOS (Complementary Metal Oxide Semiconductor).
- the signal processing circuit includes, for example, an AGC (Automatic Gain Control) circuit and an ADC (Analog to Digital Converter), and converts an analog signal generated by the image sensor into a digital signal (image data).
- the signal processing circuit performs various processes related to, for example, RAW development.
- the signal processing circuit may perform various signal processing such as, for example, White Balance correction processing, color tone correction processing, gamma correction processing, YCbCr conversion processing, and edge enhancement processing.
- the sensor 168 is composed of an illuminance sensor, for example, and detects external light on the display screen.
- the sensor 168 is disposed at a position where external light can be detected on the display screen of the display device 162 or at a position where external light can be detected on the display screen of an external display device. An example of the process using the external light detection result on the display screen will be described later.
- the image processing apparatus 100 performs processing related to the image processing method according to the present embodiment, for example, with the configuration shown in FIG. Note that the hardware configuration of the image processing apparatus 100 according to the present embodiment is not limited to the configuration illustrated in FIG. 9.
- when processing a captured image generated by an external imaging device, the image processing apparatus 100 can be configured not to include the imaging device 166.
- the image processing apparatus 100 can be configured without the sensor 168.
- the image processing apparatus 100 may not include the communication interface 164 if the image processing apparatus 100 is configured to perform processing in a stand-alone manner, for example. Further, the image processing apparatus 100 may be configured not to include the recording medium 156 and the display device 162.
- the first determination processing unit 102 determines the degree of focus in pixel units from the target image. Then, the first determination processing unit 102 transmits the first determination result to the presentation control unit 106. The first determination processing unit 102 plays a leading role in performing a process of determining the degree of focus in pixel units from the target image according to the present embodiment.
- the first determination processing unit 102 detects an edge from the target image by filter processing using one or two or more filters realized by, for example, an analog filter circuit or a digital filter circuit. Then, the first determination processing unit 102 transmits an edge image indicating the detected edge to the presentation control unit 106 as a first determination result.
- examples of the filter according to the present embodiment include a high-pass filter, a band-pass filter, a first-order differential filter, a second-order differential filter, a third-order differential filter, a Sobel filter, a four-direction Laplacian filter, an eight-direction Laplacian filter, a DoG (Difference of Gaussian) filter, and a self-quotient filter. Note that the filter according to the present embodiment is not limited to the examples described above, and may be any filter capable of detecting an edge from an image.
- the first determination processing unit 102 obtains an edge image (first determination result) using, for example, a preset filter.
- the first determination processing unit 102 can use a filter corresponding to a display screen on which a presentation image is displayed.
- the first determination processing unit 102 specifies a filter corresponding to the display screen using, for example, information related to the display screen on which the presentation image is displayed (for example, an ID indicating a display device, data indicating the resolution of the display device, and data indicating the size of the display screen), transmitted from the presentation control unit 106 or a control unit (not shown).
- More specifically, the first determination processing unit 102 identifies the filter corresponding to the information about the display screen using a table (or database; hereinafter the same shall apply) in which information about display screens is associated with one or more filters.
- the table is stored in, for example, a storage unit (not shown), an external recording medium connected to the image processing apparatus 100, or the like.
- the first determination processing unit 102 obtains an edge image by performing a filter process using the specified filter on the target image.
- the first determination processing unit 102 may further perform, for example, a filtering process using a thick line filter on the edge image from which the edge is extracted by the filtering process.
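As a concrete illustration of the edge-detection step, the following sketch (not taken from this description; the kernel values, threshold, and test image are assumptions) applies a four-direction Laplacian filter to a small grayscale image and then thickens the detected edge, roughly as a thick line filter would:

```python
# Sketch: edge extraction with a 4-direction Laplacian filter followed by
# a "thick line" dilation, as the first determination processing unit 102
# might perform. All numeric values here are illustrative assumptions.

LAPLACIAN_4 = [[0, 1, 0],
               [1, -4, 1],
               [0, 1, 0]]

def convolve(image, kernel):
    """Apply a 3x3 kernel; border pixels are left at zero for simplicity."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    acc += kernel[ky][kx] * image[y + ky - 1][x + kx - 1]
            out[y][x] = abs(acc)
    return out

def thicken(edge, threshold=50):
    """'Thick line filter': dilate pixels whose edge response exceeds threshold."""
    h, w = len(edge), len(edge[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if edge[y][x] >= threshold:
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        if 0 <= y + dy < h and 0 <= x + dx < w:
                            out[y + dy][x + dx] = 255
    return out

# A 6x6 image with a vertical brightness step (an edge) in the middle.
image = [[0, 0, 0, 200, 200, 200] for _ in range(6)]
edges = convolve(image, LAPLACIAN_4)       # edge responses at the step
edge_image = thicken(edges)                # thickened binary edge image
```

The Laplacian responds only where brightness changes, so the flat left and right halves stay at zero and the step column produces the edge image.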
- the first determination processing unit 102 performs processing using a filter corresponding to the display screen on which the presentation image is displayed, so that the presentation control unit 106 (to be described later) can generate a presentation image in which the degree of focus is represented by the presentation method corresponding to the display screen, and can display the generated presentation image on the display screen.
- For example, the display screen on which the presentation image is displayed may be the display screen of a viewfinder included in the imaging device or the display screen of a monitor included in the imaging device, and the screen size and resolution of the display screen, the viewing distance of the user, and the like may vary depending on which display screen is used.
- By causing the presentation control unit 106 to display, on the display screen, a presentation image in which the degree of focus is represented by the presentation method corresponding to the display screen, the image processing apparatus 100 can display the degree of focus on the display screen by a presentation method more suitable for that display screen.
- Therefore, the image processing apparatus 100 allows the user to more easily grasp the degree of focus in the image.
- When the first determination processing unit 102 performs the above-described processing, for example, the edge images shown in B of FIGS. 3 to 8 and C of FIGS. 3 to 8 are obtained.
- the second determination processing unit 104 determines the degree of focus in region units from the target image. Then, the second determination processing unit 104 transmits the second determination result to the presentation control unit 106.
- the second determination processing unit 104 plays a leading role in performing processing for determining the degree of focusing in units of regions from the target image according to the present embodiment.
- the second determination processing unit 104 determines the degree of focusing in units of regions from the target image by performing the processes shown in (a) to (c) below, for example, and obtains a second determination result.
- the second determination processing unit 104 divides the target image into rectangular regions in which the target image is equally divided in the horizontal direction and the vertical direction, and determines the degree of focusing in units of regions.
- Hereinafter, each region is indicated as "region w(n)" (n is an integer from 1 to the number of divisions of the target image).
- the area according to the present embodiment is not limited to the rectangular area.
- the second determination processing unit 104 divides the target image into regions of a predetermined size, but the region setting method in the second determination processing unit 104 is not limited to the above.
- the second determination processing unit 104 can set an area corresponding to the display screen on which the presentation image is displayed.
- the second determination processing unit 104 specifies, for example, the size of the regions for determining the degree of focus corresponding to the display screen, using information about the display screen on which the presentation image is displayed, transmitted from the presentation control unit 106 or a control unit (not shown).
- the size of the area for determining the degree of focus in the area unit corresponds to the determination size for determining the degree of focus in the area unit.
- More specifically, the second determination processing unit 104 identifies information relating to the setting of regions corresponding to the information about the display screen on which the presentation image is displayed, using a table in which information about display screens is associated with information about region setting (for example, data defining the regions to be divided, such as data indicating the size of the regions and data indicating the number of divisions).
- the table is stored in, for example, a storage unit (not shown), an external recording medium connected to the image processing apparatus 100, or the like.
- Then, the second determination processing unit 104 sets regions according to the identified information relating to region setting.
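The table lookup described above can be sketched as follows; the table contents, keys, and division counts are hypothetical values for illustration, not values given in this description:

```python
# Sketch: selecting region settings from a table keyed by information
# about the display screen, as the second determination processing unit 104
# might do. The table below is entirely hypothetical.

# Hypothetical table associating a display-screen ID with region settings
# (here: the number of horizontal and vertical divisions of the target image).
REGION_TABLE = {
    "monitor":    {"cols": 4, "rows": 3},   # longer viewing distance -> larger regions
    "viewfinder": {"cols": 8, "rows": 6},   # shorter viewing distance -> smaller regions
}

def region_size(screen_id, image_width, image_height):
    """Return the (width, height) of one region w(n) for the given screen."""
    setting = REGION_TABLE[screen_id]
    return image_width // setting["cols"], image_height // setting["rows"]

# The monitor uses fewer divisions, so each region is larger.
print(region_size("monitor", 1920, 1080))     # -> (480, 360)
print(region_size("viewfinder", 1920, 1080))  # -> (240, 180)
```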
- the second determination processing unit 104 sets regions corresponding to the display screen on which the presentation image is displayed and performs the processes shown in (a) to (c) below, so that the presentation control unit 106 described later can generate a presentation image in which the degree of focus is represented by the presentation method corresponding to the display screen, and can display the generated presentation image on the display screen.
- FIG. 10 is an explanatory diagram showing an example of a presentation image in which the degree of focus is represented by the way of presentation corresponding to the display screen according to the present embodiment.
- A of FIG. 10 illustrates an example of the presentation image displayed when the display screen on which the presentation image is displayed is the display screen of a monitor included in the imaging apparatus.
- B of FIG. 10 shows an example of the presentation image displayed when the display screen on which the presentation image is displayed is the display screen of a viewfinder included in the imaging apparatus.
- When the display screen on which the presentation image is displayed is a monitor, the viewing distance is longer than when the display screen is a viewfinder. Therefore, in the case shown in A of FIG. 10, the regions are made larger than in the case shown in B of FIG. 10.
- When the display screen on which the presentation image is displayed is a monitor, the second determination processing unit 104 makes the regions larger than when the display screen is a viewfinder, so that the visibility of the presentation image displayed on the display screen can be improved.
- Further, by making the regions larger than when the display screen is a viewfinder, the second determination processing unit 104 can, for example, adjust the impression the user receives from the presentation image displayed on the display screen.
- By causing the presentation control unit 106 to display, on the display screen, a presentation image in which the degree of focus is represented by the presentation method corresponding to the display screen, the image processing apparatus 100 displays the degree of focus on the display screen by a presentation method more suitable for that display screen. Therefore, when the second determination processing unit 104 sets regions corresponding to the display screen on which the presentation image is displayed and performs the processes shown in (a) to (c) below, the image processing apparatus 100 allows the user to more easily grasp the degree of focus in the image.
- the second determination processing unit 104 acquires, for example, an evaluation value for each region using a filter, determines the degree of focus in region units in the target image based on the acquired evaluation values, and obtains the second determination result.
- the second determination processing unit 104 obtains an evaluation value for each region w(n) by performing, for example, the following processes on each region w(n):
  - Generate a luminance signal.
  - Extract high-frequency components from the generated luminance signal using a filter such as a high-pass filter.
  - Take the absolute value of the extracted high-frequency components.
  - Remove noise from the obtained absolute values of the high-frequency components. An upper limit may be imposed on the obtained absolute values of the high-frequency components.
  - Integrate the noise-removed values.
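The per-region steps listed above can be sketched as follows; the noise floor, upper limit, luma weights, and the use of a simple horizontal difference in place of a high-pass filter are assumptions for illustration:

```python
# Sketch: computing an evaluation value for one region w(n) by the steps
# above -- luminance, high-pass filtering, absolute value, noise removal
# with an upper limit, and integration. Parameter values are assumed.

def evaluation_value(region_rgb, noise_floor=4, upper_limit=255):
    """region_rgb: 2D list of (R, G, B) tuples for one region w(n)."""
    # Generate a luminance signal (ITU-R BT.601 weights).
    luma = [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in region_rgb]
    total = 0
    for row in luma:
        for x in range(1, len(row)):
            # Extract a high-frequency component (a simple horizontal
            # difference stands in for a high-pass filter here).
            hf = row[x] - row[x - 1]
            # Take the absolute value and impose an upper limit.
            v = min(abs(hf), upper_limit)
            # Remove noise below a small floor.
            if v < noise_floor:
                v = 0
            # Integrate the noise-removed values.
            total += v
    return total

flat_region = [[(100, 100, 100)] * 4 for _ in range(2)]   # no edges
edged_region = [[(0, 0, 0), (0, 0, 0), (255, 255, 255), (255, 255, 255)]
                for _ in range(2)]                        # strong edge
print(evaluation_value(flat_region))   # -> 0
print(evaluation_value(edged_region))  # -> 510
```

A flat region integrates to zero while the region containing an edge yields a large value, which is the property the score map relies on.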
- FIG. 11 is an explanatory diagram for explaining an example of the processing in the second determination processing unit 104 included in the image processing apparatus 100 according to the present embodiment, and shows an example of a score map in which the degree of focus is quantified for each region w(n).
- Although FIG. 11 shows an example of the score map when the target image is divided into 24 regions, it goes without saying that the score map according to the present embodiment is not limited to the example shown in FIG. 11.
- The more in focus a region is, the larger the edge component appears; thus, the higher the evaluation value, the higher the possibility that the region contains an edge.
- the second determination processing unit 104 performs threshold processing using the evaluation value of each region w(n) and a predetermined threshold, and determines, for example, a region whose evaluation value is equal to or greater than the threshold (or a region whose evaluation value is greater than the threshold; hereinafter the same shall apply) as an in-focus region.
- examples of the predetermined threshold according to the present embodiment include a preset fixed value and a variable value that can be changed based on a user operation.
- Hereinafter, a case where the second determination processing unit 104 sets the predetermined threshold to "128" will be described as an example, but it goes without saying that the predetermined threshold according to the present embodiment is not limited to "128".
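A minimal sketch of this threshold processing, using the threshold "128" mentioned above over a hypothetical score map (the score values below are invented, not those of FIG. 11):

```python
# Sketch: threshold processing over a score map like the one in FIG. 11.
# Regions whose score is equal to or greater than the threshold are
# determined to be in-focus regions.

def in_focus_regions(score_map, threshold=128):
    """Return a boolean map marking the in-focus regions w(n)."""
    return [[score >= threshold for score in row] for row in score_map]

# Hypothetical 6x4 score map (24 regions, as in FIG. 11); values invented.
score_map = [
    [ 10,  20,  40,  30,  20,  10],
    [ 15, 130, 200, 180,  90,  12],
    [ 18, 150, 220, 160,  70,  14],
    [ 11,  25,  60,  45,  22,  10],
]
mask = in_focus_regions(score_map)
print(mask[1][2])  # -> True  (score 200 >= 128)
print(mask[0][0])  # -> False (score 10 < 128)
```

The resulting boolean map corresponds to the determination image in which only the regions determined to be in focus are represented.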
- the second determination processing unit 104 generates, for example, an image in which only the regions determined to be in-focus regions are represented, as a determination image. Then, the second determination processing unit 104 transmits the determination image to the presentation control unit 106 as the second determination result.
- the second determination processing unit 104 may transmit, for example, the determination image and the score map to the presentation control unit 106 as the second determination result, or may transmit only the score map to the presentation control unit 106 as the second determination result.
- When the score map is transmitted to the presentation control unit 106 as the second determination result, the processing related to generation of the determination image is performed by, for example, the presentation control unit 106.
- the second determination processing unit 104 determines the degree of focus in region units using, for example, a determiner obtained by machine learning using in-focus learning images and out-of-focus learning images, and obtains the second determination result.
- The determiner according to the present embodiment quantifies the degree of defocus and blur based on statistical properties of images obtained by the learning.
- the second determination processing unit 104 acquires an evaluation value for each region w (n), for example, similarly to the processing according to the first example.
- Then, the second determination processing unit 104 performs a calculation by substituting the obtained evaluation value of each region w(n) into the determiner, and uses the resulting value as a new evaluation value indicating the degree of focus of each region w(n).
- By the processing described above, the second determination processing unit 104 obtains a score map as shown in FIG. 11, for example, as in the case where the processing according to the first example is performed.
- Then, the second determination processing unit 104 determines the in-focus region, for example, in the same manner as when the processing according to the first example is performed, and transmits a determination image according to the determination result to the presentation control unit 106 as the second determination result.
- Also in this case, the second determination processing unit 104 may transmit, for example, the determination image and the score map to the presentation control unit 106 as the second determination result, or may transmit only the score map to the presentation control unit 106 as the second determination result.
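The second example can be sketched as follows; the determiner is represented here by a logistic model with invented coefficients, standing in for whatever function the machine learning actually produces:

```python
# Sketch: applying a determiner to the evaluation value of each region w(n).
# The determiner's functional form and coefficients are hypothetical; a
# trained model would supply them. The output is used as a new evaluation
# value (here scaled to 0-255) indicating the degree of focus.

import math

# Hypothetical coefficients that would come from training on in-focus and
# out-of-focus learning images.
WEIGHT, BIAS = 0.05, -5.0

def determiner(evaluation_value):
    """Map a raw evaluation value to a new 0-255 focus score."""
    p = 1.0 / (1.0 + math.exp(-(WEIGHT * evaluation_value + BIAS)))
    return round(255 * p)

def score_map_from(values_per_region):
    """Build a score map (flat list here) from per-region evaluation values."""
    return [determiner(v) for v in values_per_region]

print(score_map_from([0, 100, 200, 400]))
```

The new scores can then go through the same threshold processing as in the first example to determine the in-focus regions.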
- In the processing according to the first example shown in (a) above and the processing according to the second example shown in (b) above, an example has been shown in which the in-focus region is determined by performing threshold processing using a predetermined threshold on the evaluation value of each region w(n). However, the processing in the second determination processing unit 104 according to the present embodiment is not limited to determining the in-focus region by performing threshold processing using a threshold common to all regions w(n) on the acquired evaluation values.
- For example, the second determination processing unit 104 may determine the degree of focus based on a first determination criterion serving as a determination criterion for regions that do not include an object in the target image, and may determine, for regions that include an object in the target image, the degree of focus based on a second determination criterion corresponding to the object.
- Here, examples of the determination of the degree of focus based on the first determination criterion include the determination of an in-focus region by threshold processing using the predetermined threshold, as in the processing according to the first example shown in (a) above and the processing according to the second example shown in (b) above.
- On the other hand, examples of the determination of the degree of focus based on the second determination criterion include the determination of an in-focus region by threshold processing using a threshold corresponding to an object detected from the target image.
- the object detection processing according to the present embodiment may be performed, for example, in an object detection unit (not shown) included in the image processing apparatus 100 or may be performed in an external device of the image processing apparatus 100.
- Examples of the object detection process according to the present embodiment include an arbitrary detection process that can detect a subject such as a face or a plant from the target image as an object, such as a face detection process.
- the second determination processing unit 104 specifies a threshold corresponding to the detected object using, for example, an object detection result and a table in which the object and the threshold are associated with each other. Then, the second determination processing unit 104 determines a focused region by performing threshold processing using the specified threshold for the region w (n) corresponding to the object detection result.
- Examples of the detection result of the object include data indicating the type of the detected object and data indicating a region including the detected object.
- Further, the second determination processing unit 104 can also specify a threshold corresponding to the detected object using, for example, a table in which the brightness of the region including the object, the dynamic range, the amount of high-frequency components, and thresholds are associated with each other, and determine the in-focus region by threshold processing using the specified threshold.
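The object-dependent threshold processing of the third example can be sketched as follows; the table contents and threshold values are hypothetical, not values given in this description:

```python
# Sketch: determining in-focus regions with a threshold that depends on the
# object detected in each region. Regions without a detected object use the
# first determination criterion (the reference threshold); regions with an
# object use the second criterion (an object-specific threshold).

REFERENCE_THRESHOLD = 128

# Hypothetical table associating object types with thresholds; a face gets
# a threshold lower than the reference, as described above for face regions.
OBJECT_THRESHOLDS = {
    "face":  96,
    "plant": 140,
}

def is_in_focus(score, detected_object=None):
    """Apply the second criterion for regions containing an object,
    and the first criterion otherwise."""
    threshold = OBJECT_THRESHOLDS.get(detected_object, REFERENCE_THRESHOLD)
    return score >= threshold

print(is_in_focus(110))           # -> False (first criterion: 110 < 128)
print(is_in_focus(110, "face"))   # -> True  (face threshold: 110 >= 96)
```

With the lower face threshold, a face region with the same score is determined to be in focus where a plain region would not be, which is what makes the face region easier to highlight.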
- the second determination processing unit 104 determines the degree of focus of each region w (n) based on the determination criterion for each region w (n), so that the presentation image with higher accuracy can be displayed to the user. Can be presented.
- Here, higher accuracy means, for example, improving situations in which an out-of-focus portion or region is highlighted, or in which an in-focus portion or region is not highlighted.
- Note that the second determination processing unit 104 may selectively determine the degree of focus based on the second determination criterion depending on the size of the detected object. For example, when the size of the detected object is equal to or smaller than a set threshold (or when the size of the object is smaller than the threshold), the second determination processing unit 104 determines the degree of focus based on the first determination criterion. For example, when the size of the detected object is larger than the set threshold (or when the size of the object is equal to or larger than the threshold), the second determination processing unit 104 determines the degree of focus based on the second determination criterion.
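The size-dependent selection between the two determination criteria can be sketched as follows; the size threshold is an assumed value for illustration:

```python
# Sketch: selectively applying the second determination criterion only when
# the detected object is large enough, as described above. The 5% size
# threshold is a hypothetical value.

SIZE_THRESHOLD = 0.05  # assumed: object must cover more than 5% of the image

def choose_criterion(object_area, image_area):
    """Return which determination criterion to use for the region."""
    if object_area / image_area > SIZE_THRESHOLD:
        return "second"  # threshold corresponding to the object
    return "first"       # reference threshold

print(choose_criterion(200_000, 1920 * 1080))  # large object -> "second"
print(choose_criterion(10_000, 1920 * 1080))   # small object -> "first"
```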
- For example, when a face is detected from the target image, the second determination processing unit 104 determines whether the region w(n) corresponding to the face region is an in-focus region using a threshold lower than a reference threshold. As described above, by determining the in-focus region for the region w(n) corresponding to the face region based on a threshold corresponding to the face region, the presentation control unit 106 described later can, for example, generate a presentation image in which the face region is highlighted, and display the generated presentation image on the display screen.
- Further, for example, by specifying the position and size of the subject, the brightness, dynamic range, and amount of high-frequency components of the region including the subject can be obtained.
- In this case, the second determination processing unit 104 determines whether the region w(n) corresponding to the region including the subject is an in-focus region using a threshold corresponding to the obtained amount instead of the reference threshold.
- Examples of the threshold corresponding to the obtained amount include a threshold higher than the reference threshold when there are many high-frequency components, and a threshold lower than the reference threshold when there are few high-frequency components.
- By determining the in-focus region using the threshold corresponding to the subject as described above, it is possible, for example, to generate a presentation image representing the degree of focus on the subject that the user wants to capture, and to display the generated presentation image on the display screen.
- the second determination processing unit 104 determines the in-focus region based on the detection result of the object, so that the presentation control unit 106 described later can generate a presentation image in which the degree of focus is represented by a presentation method corresponding to the object included in the target image, and can display the generated presentation image on the display screen.
- The processing in the second determination processing unit 104 is not limited to the processing according to the first example shown in (a) above, the processing according to the second example shown in (b) above, or the processing according to the third example shown in (c) above. The second determination processing unit 104 can perform any processing capable of determining the degree of focus of each region w(n), such as focus determination using a phase difference sensor.
- Further, the second determination result is not limited to the determination image, and may be the determination image and a score map, or a score map alone.
- Hereinafter, a case where the second determination result is a determination image and a score map will be described as an example.
- the presentation control unit 106 plays a role of leading the presentation control processing according to the present embodiment.
- the presentation control unit 106 controls the presentation of the degree of focus in the target image based on the first determination result transmitted from the first determination processing unit 102 and the second determination result transmitted from the second determination processing unit 104.
- the presentation control unit 106 displays, on the display screen, a presentation image in which the degree of focus is represented by a presentation method based on one or both of the first determination result and the second determination result, by performing any one of the processes shown in (A) to (C) below, for example.
- In the presentation control unit 106, the process shown in (C) below, that is, the "process of displaying on the display screen a presentation image in which the degree of focus is represented by a presentation method based on both the first determination result and the second determination result", is set as the reference process.
- However, the presentation control unit 106 can also use, as the reference process, the process shown in (A) below or the process shown in (B) below, that is, the "process of displaying on the display screen a presentation image in which the degree of focus is represented by a presentation method based on one of the first determination result and the second determination result".
- When performing the "process of displaying on the display screen a presentation image in which the degree of focus is represented by a presentation method based on one of the first determination result and the second determination result", the presentation control unit 106 performs, for example, a preset one of the processes shown in (A) and (B).
- Note that the process the presentation control unit 106 performs when displaying such a presentation image on the display screen is not limited to a preset process.
- For example, the presentation control unit 106 can also perform the process shown in (A) above or the process shown in (B) above based on information about the accuracy of the degree of focus corresponding to the first determination result and information about the accuracy of the degree of focus corresponding to the second determination result.
- Examples of the information on the accuracy of the degree of focus corresponding to the first determination result include data indicating the degree of focus in the edge image (first determination result), such as data indicating the amount of high-frequency components included in the target image or data indicating the amount of edges detected from the target image.
- Information about the accuracy of the degree of focus corresponding to the first determination result according to the present embodiment is generated based on the target image in the first determination processing unit 102 or the presentation control unit 106, for example.
- Examples of the information on the accuracy of the degree of focus corresponding to the second determination result include data indicating the degree of focus, such as data indicating the ratio of the focused area in the target image to the entire target image.
- Information about the accuracy of the degree of focus corresponding to the second determination result according to the present embodiment is generated based on the target image in the second determination processing unit 104 or the presentation control unit 106, for example.
- The presentation control unit 106 determines which of the first determination result and the second determination result has the better accuracy of the degree of focus, based on the information on the accuracy of the degree of focus corresponding to the first determination result and the information on the accuracy of the degree of focus corresponding to the second determination result. The presentation control unit 106 then displays, on the display screen, the presentation image in which the degree of focus is represented by the presentation method based on whichever of the first determination result and the second determination result has the better accuracy.
- Examples of the process of determining the determination result with the better accuracy include a process that uses the result of threshold processing on the value related to the degree of focus indicated by the accuracy information of each degree of focus, and a process that (directly or indirectly) compares the values related to the degree of focus indicated by the accuracy information of each degree of focus.
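As a hedged illustration of the selection just described, the sketch below combines the two example processes: threshold processing on each accuracy value, with a direct comparison as the fallback. The function name, the two accuracy metrics (edge amount, focused-area ratio), and the threshold values are assumptions made for this illustration and are not specified in the embodiment.

```python
# Illustrative sketch (assumed metrics and thresholds, not from this embodiment):
# decide which determination result to use for presentation.

def select_determination_result(edge_amount, focused_area_ratio,
                                edge_threshold=0.3, ratio_threshold=0.3):
    """Return "first" or "second" depending on which accuracy value
    clears its threshold, falling back to a direct comparison."""
    first_ok = edge_amount >= edge_threshold
    second_ok = focused_area_ratio >= ratio_threshold
    if first_ok and not second_ok:
        return "first"
    if second_ok and not first_ok:
        return "second"
    # Both (or neither) cleared the threshold: compare the values directly.
    return "first" if edge_amount >= focused_area_ratio else "second"
```

The thresholds keep a marginal accuracy value from winning merely because the other metric is also weak; only when the threshold test is inconclusive does the direct comparison decide.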
- The presentation control unit 106 may also switch among the processes shown in (A) to (C) based on a predetermined user operation related to switching of the presentation method, for example.
- In this case, the presentation control unit 106 displays, on the display screen, a presentation image in which the degree of focus is represented by the presentation method corresponding to the user operation.
- the predetermined user operation is a user operation for causing the presentation control unit 106 to perform any one of the processes shown in (A) to (C).
- Examples of the predetermined user operation according to the present embodiment include an operation on an operation unit (not shown) and an operation on an external operation device such as a remote controller.
- the presentation control unit 106 switches processing to be performed based on an operation signal transmitted from an operation unit (not shown) or the like.
- By performing the processes described above, for example, the presentation control unit 106 displays on the display screen a presentation image in which the degree of focus is represented by a presentation method based on one or both of the first determination result and the second determination result.
- the presentation control unit 106 includes, for example, a focus area presentation control unit 110 and a focus location presentation control unit 112, as shown in FIG.
- The in-focus area presentation control unit 110 generates an image in which the degree of focus is expressed in units of areas with respect to the target image, based on the second determination result.
- the in-focus location presentation control unit 112 generates an image in which the degree of in-focus in pixel units is expressed with respect to the target image based on the first determination result.
- When the presentation control unit 106 performs the process shown in (A) or the process shown in (B), that is, the process of displaying on the display screen a presentation image in which the degree of focus is represented by a presentation method based on one of the first determination result and the second determination result, one of the in-focus area presentation control unit 110 and the in-focus location presentation control unit 112 performs the processing. The image generated by the in-focus area presentation control unit 110 or the in-focus location presentation control unit 112 is then displayed on the display screen as the presentation image.
- When the presentation control unit 106 performs the process shown in (C) above, that is, the process of displaying on the display screen a presentation image in which the degree of focus is represented by a presentation method based on both the first determination result and the second determination result, both the in-focus area presentation control unit 110 and the in-focus location presentation control unit 112 perform the processing.
- When both the in-focus area presentation control unit 110 and the in-focus location presentation control unit 112 perform processing, one of them processes the target image, and the other then processes the image generated by the first. For example, the in-focus location presentation control unit 112 may perform processing after the in-focus area presentation control unit 110 performs processing, or the in-focus area presentation control unit 110 may perform processing after the in-focus location presentation control unit 112 performs processing. The order of processing in the presentation control unit 106 may also be switched based on a user operation.
- Hereinafter, the configuration of the presentation control unit 106 and the processing in the presentation control unit 106 will be described taking as an example the case where the presentation control unit 106 performs the process shown in (C) above, that is, the process of displaying on the display screen a presentation image in which the degree of focus is represented by a presentation method based on both the first determination result and the second determination result.
- Specifically, the description takes the case where the in-focus location presentation control unit 112 performs processing after the in-focus area presentation control unit 110 performs processing, that is, the case where the in-focus location presentation control unit 112 processes the image that the in-focus area presentation control unit 110 has generated by processing the target image.
- FIG. 12 is an explanatory diagram illustrating a first example of processing in the in-focus area presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment.
- the focused area presentation control unit 110 includes, for example, a background processing unit 120, a drawing color determination unit 122, and a mixing unit 124.
- the target image is input to the background processing unit 120, and the background processing unit 120 adjusts the color of the target image.
- Examples of the process related to the adjustment of the color of the target image in the background processing unit 120 include any process capable of changing the color of the target image, such as a process of converting the target image into a monochrome image.
- By having the background processing unit 120 adjust the color of the target image, the presentation control unit 106 can display on the display screen a presentation image generated based on the target image whose color has been adjusted.
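A minimal sketch of the monochrome conversion mentioned as an example of the background processing. The nested-list image representation and the Rec. 601 luma weights are assumptions for illustration; the embodiment does not specify how the conversion is performed.

```python
# Sketch: convert an RGB image (list of rows of (R, G, B) tuples) to a
# monochrome image using the common Rec. 601 luma weights (an assumption).

def to_monochrome(image):
    out = []
    for row in image:
        new_row = []
        for r, g, b in row:
            y = int(0.299 * r + 0.587 * g + 0.114 * b)  # weighted luma
            new_row.append((y, y, y))  # replicate luma into all channels
        out.append(new_row)
    return out
```

Muting the background this way makes the later overlay of the focused-area object stand out regardless of the original scene colors.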
- When the in-focus area presentation control unit 110 performs processing after the in-focus location presentation control unit 112 performs processing, that is, when the in-focus area presentation control unit 110 processes the image processed by the in-focus location presentation control unit 112, the background processing unit 120 included in the in-focus area presentation control unit 110 does not perform processing.
- the focus area presentation control unit 110 according to the first example may not include the background processing unit 120.
- The drawing color determination unit 122 determines the color of an object representing the focused area in the determination image (an example of the second determination result) (hereinafter sometimes referred to as the “first drawing color”).
- Examples of the object representing the focused area in the determination image include the object shown in B of FIG. 1 and the objects shown in D of FIGS. 3 to 8 (for example, an arbitrary shape such as a rectangle or a circle).
- the drawing color determination unit 122 determines the first drawing color based on the first color setting information indicating the first drawing color stored in a storage unit (not shown), for example.
- the first drawing color indicated by the first color setting information may be a fixed color set in advance or a variable color that can be changed by a user operation or the like.
- The drawing color determination unit 122 may also adjust the color indicated by the first color setting information for each focused area based on the second determination result and use the adjusted color as the first drawing color. For example, the drawing color determination unit 122 can change the luminance, hue, and the like based on the value corresponding to the focused area indicated by the score map (an example of the second determination result) (hereinafter referred to as the “evaluation value”).
- The first drawing color determined by the drawing color determination unit 122 is not limited to a single color; the first drawing color according to the present embodiment may be a plurality of colors.
- FIG. 13 is an explanatory diagram for explaining an example of processing in the in-focus area presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment, and shows an example of the first drawing color determined by the drawing color determination unit 122 constituting the in-focus area presentation control unit 110.
- FIG. 13 shows an example in which the object representing the focused area is a rectangle, and the first drawing color represents the rectangle with the two colors shown in FIG. 13.
- As shown in FIG. 13, by representing the object representing the focused area with a plurality of colors, the contrast between the focused area and the periphery of the area is maintained. Therefore, representing the object with a plurality of colors, for example as shown in FIG. 13, can improve the visibility of the presentation image.
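One way to read the two-color rectangle of FIG. 13 is as an inner frame in one color surrounded by an outer frame in a second color, so that at least one of the two colors contrasts with any background. The frame geometry, the default colors, and the grid representation below are assumptions for illustration only.

```python
# Sketch (assumed geometry): a rectangle drawn as a one-pixel outer frame in
# one color and a one-pixel inner frame in a second color; interior is None.

def two_color_frame(width, height, inner=(255, 255, 255), outer=(0, 0, 0)):
    """Return a height x width grid of colors for the two-color object."""
    grid = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            on_outer = x in (0, width - 1) or y in (0, height - 1)
            on_inner = not on_outer and (x in (1, width - 2) or y in (1, height - 2))
            if on_outer:
                grid[y][x] = outer
            elif on_inner:
                grid[y][x] = inner
    return grid
```

With a light inner frame and a dark outer frame, the object remains visible over both dark and bright regions of the target image.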
- The mixing unit 124 mixes the image transmitted from the background processing unit 120 and the determination image indicated by the second determination result. More specifically, the mixing unit 124 mixes the image transmitted from the background processing unit 120 and the determination image after setting the color of the object representing the focused area indicated by the determination image (an example of the second determination result) to the first drawing color determined by the drawing color determination unit 122.
- For example, the mixing unit 124 mixes the image transmitted from the background processing unit 120 and the determination image by combining the two images, or by superimposing them in different layers.
- When a score map is included in the second determination result, the mixing unit 124 may change the mixing ratio of the image transmitted from the background processing unit 120 and the determination image based on, for example, the evaluation value of the focused area indicated by the score map (an example of the second determination result). For example, the mixing unit 124 increases the mixing ratio of the object representing the focused area for a focused area having a higher evaluation value.
- examples of the mixing ratio according to the present embodiment include an alpha value in an alpha blend.
- By having the mixing unit 124 determine the mixing ratio between the image transmitted from the background processing unit 120 and the determination image, the presentation control unit 106 can generate a presentation image in which the degree of focus is represented by a presentation method corresponding to the second determination result, and display the generated presentation image on the display screen.
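Since the text names the alpha value of an alpha blend as an example of the mixing ratio, the evaluation-value-dependent mixing can be sketched per pixel as below. The linear mapping from evaluation value to alpha and the alpha range are assumptions for illustration.

```python
# Sketch: alpha-blend the object color over the background pixel, with the
# alpha (mixing ratio) rising linearly with the evaluation value (assumed
# to lie in [0.0, 1.0]; the alpha_min/alpha_max bounds are assumptions).

def blend_pixel(background, object_color, evaluation, alpha_min=0.3, alpha_max=0.9):
    """Blend one (R, G, B) pixel of the determination image over the background."""
    alpha = alpha_min + (alpha_max - alpha_min) * evaluation
    return tuple(round(alpha * o + (1.0 - alpha) * b)
                 for o, b in zip(object_color, background))
```

A focused area with a higher evaluation value thus shows the object more opaquely, which matches the behavior described above.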
- the in-focus area presentation control unit 110 can generate an image in which the in-focus level is expressed in units of areas based on the second determination result, for example, with the configuration illustrated in FIG.
- When the in-focus area presentation control unit 110 performs processing after the in-focus location presentation control unit 112 performs processing, the image generated in the in-focus area presentation control unit 110 according to the first example is an image in which both the degree of focus in pixel units and the degree of focus in area units are represented.
- the configuration of the focus area presentation control unit 110 according to the first example is not limited to the example shown above.
- For example, when the in-focus area presentation control unit 110 performs processing after the in-focus location presentation control unit 112 performs processing, the in-focus area presentation control unit 110 according to the first example may not include the background processing unit 120.
- FIG. 14 is an explanatory diagram illustrating a second example of processing in the in-focus area presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment.
- the focus area presentation control unit 110 includes, for example, a background processing unit 120, a drawing color determination unit 126, and a mixing unit 124.
- the background processing unit 120 has the same function as the background processing unit 120 included in the focusing area presentation control unit 110 according to the first example shown in FIG.
- the background processing unit 120 adjusts the color of the target image, for example.
- the drawing color determination unit 126 determines the first drawing color.
- In addition to the function of the drawing color determination unit 122 included in the in-focus area presentation control unit 110 according to the first example illustrated in FIG. 12, the drawing color determination unit 126 determines the first drawing color based on the image transmitted from the background processing unit 120 (the target image or an image obtained by processing the target image).
- For example, the drawing color determination unit 126 determines a color in the same manner as the drawing color determination unit 122 shown in FIG. 12, and further performs one of the following processes, or a combination of two or more of them, on the determined color.
- The drawing color determination unit 126 determines the first drawing color by adjusting the luminance of the color determined in the same manner as the drawing color determination unit 122 shown in FIG. 12, according to the average luminance of the pixels around the focused area in the image transmitted from the background processing unit 120.
- The drawing color determination unit 126 determines the first drawing color by inverting or rotating the hue of the color determined in the same manner as the drawing color determination unit 122 shown in FIG. 12, according to the average color of the pixels around the focused area in the image transmitted from the background processing unit 120.
- The drawing color determination unit 126 determines the mixing ratio (or an adjustment value for adjusting the mixing ratio) between the image transmitted from the background processing unit 120 and the determination image, according to the average color of the pixels around the focused area in the image transmitted from the background processing unit 120.
- Further, the drawing color determination unit 126 may determine, as the first drawing color, a color based on a color extracted from the target image (or from the image processed by the in-focus location presentation control unit 112).
- Examples of the color based on the color extracted from the target image related to the determination of the first drawing color include the color for each area extracted from each focused area in the target image, a representative color extracted from each focused area (for example, the most common color), and the average of the colors extracted from each focused area.
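Two of the options listed above, the representative (most common) color and the average color of a focused area, can be sketched as follows. The flat list of (R, G, B) tuples standing in for a region's pixels is an assumption for illustration.

```python
# Sketch of two listed color-extraction options for a focused region,
# represented here (an assumption) as a flat list of (R, G, B) tuples.
from collections import Counter

def representative_color(region_pixels):
    """Most common color in the region."""
    return Counter(region_pixels).most_common(1)[0][0]

def average_color(region_pixels):
    """Channel-wise integer average of the region's colors."""
    n = len(region_pixels)
    return tuple(sum(p[c] for p in region_pixels) // n for c in range(3))
```

Either result could then serve as the first drawing color for that area's object.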
- When the in-focus area presentation control unit 110 determines the first drawing color and the like based on the image transmitted from the background processing unit 120 in the drawing color determination unit 126, the presentation control unit 106 can generate a presentation image in which the degree of focus is represented by a presentation method corresponding to the target image, and display the generated presentation image on the display screen.
- The mixing unit 124 has the same function as the mixing unit 124 included in the in-focus area presentation control unit 110 according to the first example illustrated in FIG. 12, and mixes the image transmitted from the background processing unit 120 and the determination image.
- the in-focus area presentation control unit 110 can generate an image in which an in-focus degree is expressed in units of areas based on the second determination result, for example, with the configuration illustrated in FIG.
- the configuration of the focus area presentation control unit 110 according to the second example is not limited to the example shown above.
- Like the in-focus area presentation control unit 110 according to the first example, the in-focus area presentation control unit 110 according to the second example can have a configuration that does not include the background processing unit 120.
- FIG. 15 is an explanatory diagram illustrating a third example of processing in the in-focus area presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment.
- the focus area presentation control unit 110 includes, for example, a background processing unit 120, a drawing color determination unit 128, and a mixing unit 124.
- the background processing unit 120 has the same function as the background processing unit 120 included in the focusing area presentation control unit 110 according to the first example shown in FIG.
- the background processing unit 120 adjusts the color of the target image, for example.
- the drawing color determination unit 128 determines the first drawing color.
- In addition to the function of the drawing color determination unit 122 included in the in-focus area presentation control unit 110 according to the first example illustrated in FIG. 12, the drawing color determination unit 128 determines the first drawing color using the illuminance value detected by the sensor 168 or the like (an example of the result of detecting external light on the display screen).
- the drawing color determination unit 128 adjusts the luminance of the color determined in the same manner as the drawing color determination unit 122 illustrated in FIG. 12 according to the illuminance value, and determines the first drawing color.
- FIG. 16 is an explanatory diagram for explaining an example of processing in the in-focus area presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment, and shows an example of the first luminance adjustment information used for adjusting luminance in the drawing color determination unit 128 constituting the in-focus area presentation control unit 110.
- For example, the drawing color determination unit 128 specifies the luminance value corresponding to the detected illuminance value using the first luminance adjustment information in which the illuminance value and the luminance are associated, as illustrated in FIG. 16. The drawing color determination unit 128 then adjusts the luminance of the color determined in the same manner as the drawing color determination unit 122 illustrated in FIG. 12 so as to indicate the specified luminance value, and determines the first drawing color.
- Examples of the first luminance adjustment information according to the present embodiment include data indicating a function by which the luminance value is uniquely calculated based on the illuminance value.
- the first brightness adjustment information according to the present embodiment may be a table in which illuminance values and brightness values are associated with each other.
- For example, as shown in FIG. 16, the drawing color determination unit 128 determines the first drawing color such that, within the range between the upper limit value and the lower limit value, the higher the illuminance value, the higher the luminance of the object representing the focused area.
- The correspondence between the illuminance value and the luminance value indicated by the first luminance adjustment information according to the present embodiment is not limited to the example shown in FIG. 16.
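One plausible reading of the first luminance adjustment information, a function mapping illuminance to luminance with an upper and a lower limit, is a clamped linear mapping. The specific breakpoints and luminance bounds below are assumptions for illustration, not values taken from FIG. 16.

```python
# Sketch (assumed numbers): map a detected illuminance value (lux) to a
# luminance factor, clamped between a lower and an upper limit, rising
# linearly in between so that brighter ambient light gives a brighter object.

def adjust_luminance(illuminance, lux_lo=50, lux_hi=1000,
                     luma_lo=0.4, luma_hi=1.0):
    """Higher illuminance -> higher luminance, within [luma_lo, luma_hi]."""
    if illuminance <= lux_lo:
        return luma_lo
    if illuminance >= lux_hi:
        return luma_hi
    t = (illuminance - lux_lo) / (lux_hi - lux_lo)  # 0..1 inside the range
    return luma_lo + t * (luma_hi - luma_lo)
```

A table mapping discrete illuminance values to luminance values, also mentioned in the text, would simply replace the linear interpolation with a lookup.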
- When the in-focus area presentation control unit 110 determines the first drawing color using the detected illuminance value in the drawing color determination unit 128, the presentation control unit 106 can generate a presentation image in which the degree of focus is represented by a presentation method corresponding to the result of detecting external light on the display screen, and display the generated presentation image on the display screen.
- The mixing unit 124 has the same function as the mixing unit 124 included in the in-focus area presentation control unit 110 according to the first example illustrated in FIG. 12, and mixes the image transmitted from the background processing unit 120 and the determination image.
- the in-focus area presentation control unit 110 can generate an image in which the in-focus level is expressed in units of areas based on the second determination result, for example, with the configuration illustrated in FIG.
- the configuration of the focus area presentation control unit 110 according to the third example is not limited to the example shown above.
- Like the in-focus area presentation control unit 110 according to the first example, the in-focus area presentation control unit 110 according to the third example can have a configuration that does not include the background processing unit 120.
- The in-focus area presentation control unit 110 can also further adjust the luminance of the image transmitted from the background processing unit 120 (the target image or an image obtained by processing the target image), using the illuminance value detected by the sensor 168 or the like.
- FIG. 17 is an explanatory diagram for explaining an example of processing in the in-focus area presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment, and shows an example of the second luminance adjustment information used for adjusting the luminance of the image transmitted from the background processing unit 120 in the in-focus area presentation control unit 110.
- the focus area presentation control unit 110 specifies the luminance value corresponding to the detected illuminance value using second luminance adjustment information in which the illuminance value and the luminance are associated with each other. Then, the focus area presentation control unit 110 adjusts the brightness of the image transmitted from the background processing unit 120 so as to indicate the specified brightness value, for example.
- the second brightness adjustment information according to the present embodiment includes, for example, data indicating a function for uniquely calculating the brightness value based on the illuminance value.
- the second brightness adjustment information according to the present embodiment may be a table in which illuminance values and brightness values are associated with each other.
- As shown in FIG. 17, for example, the in-focus area presentation control unit 110 adjusts the luminance of the image transmitted from the background processing unit 120 so that, within the range between the upper limit value and the lower limit value, the higher the illuminance value, the higher the luminance of the transmitted image.
- The range between the upper limit value and the lower limit value indicated by the second luminance adjustment information shown in FIG. 17, that is, the range in which the luminance of the image transmitted from the background processing unit 120 is adjusted, is smaller than the range between the upper limit value and the lower limit value indicated by the first luminance adjustment information shown in FIG. 16.
- By making the range between the upper limit value and the lower limit value indicated by the second luminance adjustment information smaller than the range between the upper limit value and the lower limit value indicated by the first luminance adjustment information, it is possible, for example, to further emphasize the degree of focus in area units represented in the presentation image.
- The correspondence between the illuminance value and the luminance value indicated by the second luminance adjustment information according to the present embodiment is not limited to the example shown in FIG. 17.
- The luminance of the image transmitted from the background processing unit 120 is changed in conjunction with the first drawing color of the object representing the focused area. Therefore, by further adjusting the luminance of the image transmitted from the background processing unit 120 using the illuminance value detected by the sensor 168 or the like, the in-focus area presentation control unit 110 can keep the contrast between the first drawing color of the object representing the focused area and the background color of the image transmitted from the background processing unit 120 above a certain level.
- The process of adjusting the luminance of the image transmitted from the background processing unit 120 using the detected illuminance value may be performed by, for example, the background processing unit 120 or the mixing unit 124 included in the in-focus area presentation control unit 110.
- With the configuration according to the first example shown in (1-1) above, the configuration according to the second example shown in (1-2) above, or the configuration according to the third example shown in (1-3) above, the in-focus area presentation control unit 110 generates, based on the second determination result, an image in which the degree of focus in area units is expressed with respect to the target image.
- the configuration of the focus area presentation control unit 110 is not limited to the configuration according to the first example shown in (1-1) to the configuration according to the third example shown in (1-3).
- For example, the in-focus area presentation control unit 110 can also have a configuration in which the configuration according to the second example shown in (1-2) and the configuration according to the third example shown in (1-3) are combined.
- FIG. 18 is an explanatory diagram illustrating a first example of processing in the in-focus location presentation control unit 112 included in the image processing apparatus 100 according to the present embodiment.
- the in-focus location presentation control unit 112 includes, for example, a drawing color determination unit 130 and a mixing unit 132.
- the drawing color determining unit 130 determines an edge color (hereinafter, sometimes referred to as “second drawing color”) in the edge image (an example of the first determination result).
- the drawing color determination unit 130 determines the second drawing color based on the second color setting information indicating the second drawing color stored in a storage unit (not shown), for example.
- the second drawing color indicated by the second color setting information may be a fixed color set in advance or a variable color that can be changed by a user operation or the like.
- The mixing unit 132 mixes the input image and the edge image indicated by the first determination result. More specifically, the mixing unit 132 mixes the input image and the edge image after setting the edge color in the edge image to the second drawing color determined by the drawing color determination unit 130.
- When processing is performed in the in-focus location presentation control unit 112 after processing in the in-focus area presentation control unit 110, an example of the image input to the mixing unit 132 is the image generated by the in-focus area presentation control unit 110, in which the degree of focus in area units is represented with respect to the target image. Another example of the image input to the mixing unit 132 is the target image itself.
- the mixing unit 132 mixes the input image and the edge image, for example, by combining the input image and the edge image, or by superimposing the input image and the edge image on different layers.
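A minimal sketch of the simplest form of this mixing: wherever the edge image marks an edge, the input pixel is replaced with the second drawing color. Representing the edge image as a binary mask, and assuming both images have the same dimensions, are simplifications made for this illustration; the layered-superimposition variant would keep the two images separate instead.

```python
# Sketch: overlay edge pixels (first determination result, here a boolean
# mask - an assumption) onto the input image using the second drawing color.

def overlay_edges(input_image, edge_mask, drawing_color):
    out = []
    for img_row, mask_row in zip(input_image, edge_mask):
        out.append([drawing_color if is_edge else pixel
                    for pixel, is_edge in zip(img_row, mask_row)])
    return out
```

Applied to the area-unit image produced earlier, this yields a presentation image carrying both the area-unit and pixel-unit degrees of focus.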
- the in-focus location presentation control unit 112 according to the first example generates an image in which the in-focus degree is expressed in units of pixels based on the first determination result, for example, with the configuration illustrated in FIG.
- When the in-focus location presentation control unit 112 performs processing after the in-focus area presentation control unit 110 performs processing, the image generated in the in-focus location presentation control unit 112 according to the first example is an image in which both the degree of focus in pixel units and the degree of focus in area units are represented.
- the configuration of the in-focus location presentation control unit 112 according to the first example is not limited to the example shown above.
- For example, the in-focus location presentation control unit 112 according to the first example may further include a background processing unit having the same function as the background processing unit 120 included in the in-focus area presentation control unit 110 described above.
- When the in-focus location presentation control unit 112 according to the first example further includes a background processing unit, the background processing unit adjusts the color of the target image in the case where the in-focus area presentation control unit 110 performs processing after the in-focus location presentation control unit 112 performs processing, that is, in the case where the in-focus location presentation control unit 112 processes the target image.
- By having the in-focus location presentation control unit 112 adjust the color of the target image in the background processing unit, the presentation control unit 106 can generate a presentation image based on the target image whose color has been adjusted, and display the generated presentation image on the display screen.
- the configuration of the focus location presentation control unit 112 is not limited to the configuration according to the first example shown in (2-1) above.
- For example, in addition to the function of the drawing color determination unit 130 in the in-focus location presentation control unit 112 according to the first example illustrated in FIG. 18, the drawing color determination unit included in the in-focus location presentation control unit 112 can also determine the second drawing color and the like based on the input image.
- the drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example is the drawing color shown in FIG.
- One of the following processes or two or more processes that can be combined are further performed on the color determined in the same manner as the determination unit 130.
- the drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example determines the second drawing color by adjusting, according to the average luminance of the pixels surrounding the edge pixels in the input image, the luminance of a color determined in the same manner as the drawing color determination unit 130 illustrated in FIG. 18.
- the drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example determines the second drawing color by inverting, rotating the hue of, or otherwise modifying, according to the average color of the pixels surrounding the edge pixels in the input image, a color determined in the same manner as the drawing color determination unit 130 illustrated in FIG. 18.
- the drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example determines the mixing ratio between the input image and the edge image (or an adjustment value for adjusting the mixing ratio) according to the average color of the pixels surrounding the edge pixels in the input image.
- when the input image is a monochrome image, the drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example determines, as the second drawing color, a color based on a color extracted from the target image (or the input image). Examples of such a color include the color of each edge pixel in the target image, a representative color of the edge pixels (for example, the most frequent color), and the average color of the edge pixels.
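As a rough sketch of the luminance-based adjustment described above, the following pushes a drawing color's lightness away from the average luminance of the pixels surrounding an edge pixel. The 0.5 pivot and the ±0.3 lightness shift are illustrative assumptions; the text does not specify concrete values.

```python
import colorsys

def second_drawing_color(base_rgb, surround_avg_luma):
    """Adjust the lightness of a base drawing color according to the average
    luminance (0.0-1.0) of the pixels around an edge pixel, so the drawn
    edge remains visible against its surroundings."""
    r, g, b = (c / 255.0 for c in base_rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    # Push the color's lightness away from the surround: a dark surround
    # gets a lighter drawing color, a bright surround gets a darker one.
    if surround_avg_luma < 0.5:
        l = min(1.0, l + 0.3)
    else:
        l = max(0.0, l - 0.3)
    r, g, b = colorsys.hls_to_rgb(h, l, s)
    return tuple(round(c * 255) for c in (r, g, b))
```

A hue-rotation variant, as in the second process above, could likewise be obtained by shifting `h` before converting back to RGB.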
- by having the drawing color determination unit determine the second drawing color and the like based on the input image, the in-focus location presentation control unit 112 according to the second example enables the presentation control unit 106 to generate a presentation image in which the degree of focus is represented by a manner of presentation corresponding to the target image, and to display the generated presentation image on the display screen.
- the in-focus location presentation control unit 112 according to the second example also includes a mixing unit having the same function as the mixing unit 132 included in the in-focus location presentation control unit 112 according to the first example illustrated in FIG. 18, which mixes the input image and the edge image indicated by the first determination result. The in-focus location presentation control unit 112 according to the second example can thereby generate an image representing the degree of focus adjusted in pixel units based on the first determination result.
- the configuration of the in-focus location presentation control unit 112 according to the second example is not limited to the example shown above.
- the in-focus location presentation control unit 112 according to the second example may further include a background processing unit, similar to the in-focus location presentation control unit 112 according to the first example illustrated in FIG.
- the configuration of the in-focus location presentation control unit 112 is not limited to the configuration according to the first example shown in (2-1) above or the configuration according to the second example shown in (2-2) above. For example, in addition to the functions of the drawing color determination unit 130 in the in-focus location presentation control unit 112 according to the first example illustrated in FIG. 18, the drawing color determination unit included in the in-focus location presentation control unit 112 can also determine the second drawing color using an illuminance value detected by the sensor 168 or the like (an example of the result of detecting external light on the display screen).
- the drawing color determination unit included in the in-focus location presentation control unit 112 according to the third example determines the second drawing color by, for example, adjusting the luminance of a color determined in the same manner as the drawing color determination unit 130 illustrated in FIG. 18 according to the illuminance value.
- like the drawing color determination unit 128 constituting the in-focus region presentation control unit 110 illustrated in FIG. 15, the drawing color determination unit included in the in-focus location presentation control unit 112 according to the third example determines the second drawing color using first luminance adjustment information such as that illustrated in FIG. 16.
- because the in-focus location presentation control unit 112 determines the second drawing color using the detected illuminance value in its drawing color determination unit, the presentation control unit 106 can generate a presentation image in which the degree of focus is represented by a manner of presentation corresponding to the result of detecting external light on the display screen, and display the generated presentation image on the display screen.
- the in-focus location presentation control unit 112 according to the third example also includes a mixing unit having the same function as the mixing unit 132 included in the in-focus location presentation control unit 112 according to the first example illustrated in FIG. 18, which mixes the input image and the edge image indicated by the first determination result. The in-focus location presentation control unit 112 according to the third example can thereby generate an image representing the degree of focus adjusted in pixel units based on the first determination result.
- the configuration of the in-focus location presentation control unit 112 according to the third example is not limited to the example shown above.
- the in-focus location presentation control unit 112 according to the third example may further include a background processing unit, similar to the in-focus location presentation control unit 112 according to the first example illustrated in FIG.
- the in-focus location presentation control unit 112 according to the third example can further adjust the luminance of the input image using the illuminance value detected by the sensor 168 or the like. For example, like the in-focus region presentation control unit 110 illustrated in FIG. 15, the in-focus location presentation control unit 112 according to the third example adjusts the luminance of the input image using second luminance adjustment information such as that illustrated in FIG. 17.
- by further adjusting the luminance of the input image using the detected illuminance value, the in-focus location presentation control unit 112 according to the third example changes the luminance of the input image in conjunction with the second drawing color indicating the edges in the edge image. Therefore, by further adjusting the luminance of the input image using the detected illuminance value, the in-focus location presentation control unit 112 according to the third example can keep the contrast between, for example, the second drawing color indicating the edges in the edge image and the background color of the input image at or above a certain level.
- the process of adjusting the luminance of the input image using the illuminance value detected by the sensor 168 or the like is performed by, for example, the mixing unit included in the in-focus location presentation control unit 112 according to the third example.
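For illustration, the illuminance-driven luminance adjustment could be sketched as a per-bracket gain lookup. The breakpoints and gains below stand in for the second luminance adjustment information of FIG. 17, which the text describes only abstractly; all values are assumptions.

```python
import bisect

ILLUMINANCE_BREAKPOINTS = [0, 100, 1000, 10000]  # lux thresholds (assumed)
LUMINANCE_GAINS = [0.6, 0.8, 1.0, 1.2]           # gain per bracket (assumed)

def adjust_input_luminance(pixel_luma, illuminance_lux):
    """Scale a pixel's luminance (0-255) according to the ambient illuminance
    detected by the sensor, so the contrast against the second drawing
    color stays above a floor."""
    i = bisect.bisect_right(ILLUMINANCE_BREAKPOINTS, illuminance_lux) - 1
    return min(255, round(pixel_luma * LUMINANCE_GAINS[i]))
```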
- with the configuration according to the first example shown in (2-1) above, the configuration according to the second example shown in (2-2) above, or the configuration according to the third example shown in (2-3) above, the in-focus location presentation control unit 112 generates, based on the first determination result, an image in which the degree of focus in pixel units is represented with respect to the target image.
- the configuration of the in-focus location presentation control unit 112 is not limited to the configuration according to the first example shown in (2-1) to the configuration according to the third example shown in (2-3).
- it is also possible for the in-focus location presentation control unit 112 to have a configuration in which the configuration according to the second example shown in (2-2) and the configuration according to the third example shown in (2-3) are combined.
- the presentation control unit 106 includes, for example, the in-focus region presentation control unit 110 and the in-focus location presentation control unit 112 as described above, and thereby controls the presentation of the degree of focus in the target image based on the first determination result transmitted from the first determination processing unit 102 and the second determination result transmitted from the second determination processing unit 104. When the presentation control unit 106 performs the processing as described above, for example, a presentation image such as that illustrated in E of each of FIGS. 3 to 8 is obtained.
- processing in the presentation control unit 106 according to the present embodiment is not limited to the processing described above.
- for example, based on the result of object detection processing on the target image, the presentation control unit 106 may generate a presentation image in which the degree of focus is represented by a manner of presentation corresponding to an object included in the target image, and display the generated presentation image on the display screen.
- the presentation control unit 106 can display, for example, a presentation image with higher visibility on the display screen.
- the object detection process may be performed in an object detection unit (not shown) included in the image processing apparatus 100, or may be performed in an apparatus external to the image processing apparatus 100. Further, when the image processing apparatus 100 includes an object detection unit (not shown), the object detection unit may constitute part of the presentation control unit 106, or may be realized by a processing circuit separate from the presentation control unit 106.
- enhancing visibility refers to, for example, making the represented degree of focus easier to see by changing the luminance or color, or by improving situations in which too much or too little of the degree of focus is displayed.
- even when an object is detected, the presentation control unit 106 may perform the process for improving visibility selectively depending on the size of the detected object. For example, when the size of the detected object is equal to or smaller than a set threshold (or smaller than the threshold), the presentation control unit 106 does not perform the process for improving visibility. When the size of the detected object is larger than the set threshold (or equal to or larger than the threshold), the presentation control unit 106 performs the process for improving visibility.
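The size-gated decision described above reduces to a simple comparison; the pixel-area threshold below is a hypothetical value, since the text only says that a threshold is set.

```python
SIZE_THRESHOLD = 64 * 64  # assumed threshold on object area, in pixels

def should_enhance_visibility(obj_width, obj_height, threshold=SIZE_THRESHOLD):
    """Return True when the detected object is large enough for the
    visibility-improving processing to be applied."""
    return obj_width * obj_height > threshold
```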
- for example, the presentation control unit 106 improves visibility by thickening only the lines included in the detected face region, or by highlighting the region w(n) corresponding to the face region.
- the presentation control unit 106 changes the drawing as described below, for example, according to the detected feature of the subject.
- for portions with many high-frequency components in the region containing the detected subject, the presentation control unit 106 reduces the drawing amount of the edge display (improving situations in which too many contours are drawn, making details hard to check).
- for portions with few high-frequency components in the region containing the detected subject, the presentation control unit 106 emphasizes the in-focus regions more strongly (since edges are hard to display when high-frequency components are scarce, the region-unit degree of focus is emphasized instead).
- for dark portions in the region containing the detected subject, the presentation control unit 106 determines a brighter color as the first drawing color or the second drawing color.
- for bright portions in the region containing the detected subject, the presentation control unit 106 determines a darker color as the first drawing color or the second drawing color.
- the presentation control unit 106 determines the first drawing color and the second drawing color according to a representative color (for example, the most frequent color) in the region containing the detected subject.
- the presentation control unit 106 lowers the mixing ratio in the region containing the detected subject.
- by changing the drawing according to the detected feature of the subject as described above, the presentation control unit 106 can change the drawing according to the region the user is more likely to be imaging. Therefore, changing the drawing according to the detected feature of the subject allows the presentation control unit 106 to improve visibility more than when the drawing is adjusted uniformly over the entire image.
- the presentation control unit 106 can change the drawing according to the amount of movement of the object detected from the target image.
- the image processing apparatus 100 has, for example, the configuration shown in FIG. 2, and performs processing related to the image processing method according to the present embodiment (for example, the process of determining the degree of focus in pixel units from the target image, the process of determining the degree of focus in region units from the target image, and the presentation control process).
- the image processing apparatus 100 can allow the user to grasp the degree of focusing in the image, for example, with the configuration shown in FIG.
- the image processing apparatus 100 can exhibit the effects exhibited by performing the processing related to the image processing method according to the present embodiment as described above, for example.
- the configuration of the image processing apparatus according to the present embodiment is not limited to the configuration shown in FIG.
- for example, the image processing apparatus according to the present embodiment can be configured without the first determination processing unit 102 and the second determination processing unit 104 illustrated in FIG. 2.
- here, the determination processing apparatus according to the present embodiment is an apparatus external to the image processing apparatus according to the present embodiment, and is itself another image processing apparatus.
- the determination processing apparatus according to the present embodiment has a hardware configuration similar to that of the image processing apparatus 100 shown in FIG. 9 (including its modifications), and realizes functions similar to those of the first determination processing unit 102 and the second determination processing unit 104.
- when the first determination result is acquired from a first external apparatus having functions and a configuration similar to those of the first determination processing unit 102, and the second determination result is acquired from a second external apparatus having functions and a configuration similar to those of the second determination processing unit 104, the image processing apparatus according to the present embodiment may be configured without the determination processing unit related to the determination of the acquired determination result.
- the image processing apparatus according to the embodiment can perform a presentation control process as a process related to the image processing method according to the present embodiment.
- even when the image processing apparatus does not include the first determination processing unit 102 and the second determination processing unit 104 illustrated in FIG. 2, it can allow the user to grasp the degree of focus in the image.
- when the image processing apparatus according to the present embodiment acquires the first determination result and the second determination result from the determination processing apparatus according to the present embodiment, the determination processing apparatus and the image processing apparatus together realize an image processing system in which processing similar to that of the image processing apparatus 100 illustrated in FIG. 2 is performed. The determination processing apparatus can therefore realize an image processing system that allows the user to grasp the degree of focus in an image.
- in the above, an image processing apparatus has been described as the present embodiment, but the present embodiment is not limited to such a form.
- for example, the present embodiment can be applied to various devices, such as an imaging apparatus such as a digital still camera or a digital video camera; a communication apparatus such as a mobile phone or a smartphone; a tablet apparatus; a computer such as a PC (Personal Computer) or a server; a display apparatus; a video/music playback apparatus (or a video/music recording and playback apparatus); and a game machine.
- the present embodiment can be applied to, for example, a processing IC (Integrated Circuit) that can be incorporated in the above devices.
- the image processing apparatus according to the present embodiment may also be realized by a system including a plurality of apparatuses premised on connection to a network (or on communication between apparatuses), such as cloud computing. That is, the image processing apparatus according to the present embodiment described above can be realized as, for example, an image processing system including a plurality of apparatuses.
- [I] Program for causing a computer to function as the image processing apparatus according to the present embodiment: a program for causing a computer to function as the image processing apparatus according to the present embodiment (for example, a program capable of executing processing related to the image processing method according to the present embodiment, such as the "presentation control process"; the "process of determining the degree of focus in pixel units from the target image, and the presentation control process"; the "process of determining the degree of focus in region units from the target image, and the presentation control process"; or the "process of determining the degree of focus in pixel units from the target image, the process of determining the degree of focus in region units from the target image, and the presentation control process") is executed by a processor or the like in a computer, thereby allowing the user to grasp the degree of focus in an image.
- in addition, by having a processor or the like in a computer execute a program for causing the computer to function as the image processing apparatus according to the present embodiment, the effects produced by the processing related to the image processing method according to the present embodiment described above can be achieved.
- [II] Program for causing a computer to function as the determination processing apparatus according to the present embodiment: a program for causing a computer to function as the determination processing apparatus according to the present embodiment (for example, the "process of determining the degree of focus in pixel units from the target image, and the process of determining the degree of focus in region units from the target image") is executed by a processor or the like in a computer, thereby realizing an image processing system that allows the user to grasp the degree of focus in an image.
- a program for causing a computer to function as the image processing apparatus according to the present embodiment or the determination processing apparatus according to the present embodiment is provided.
- (1) An image processing apparatus including a presentation control unit that controls presentation of the degree of focus in a target image, which is an image to be processed, based on a first determination result, which is a determination result of the degree of focus in pixel units in the target image, and a second determination result, which is a determination result of the degree of focus in region units in the target image.
- (2) The image processing apparatus according to (1), wherein the presentation control unit causes a presentation image based on the target image, in which the degree of focus is represented by a manner of presentation based on one or both of the first determination result and the second determination result, to be displayed on a display screen.
- (3) The image processing apparatus according to (2), wherein the presentation control unit causes the presentation image representing both the degree of focus in pixel units based on the first determination result and the degree of focus in region units based on the second determination result to be displayed.
- (4) The image processing apparatus according to (2), wherein the presentation control unit causes the presentation image representing one of the degree of focus in pixel units based on the first determination result and the degree of focus in region units based on the second determination result to be displayed.
- (5) The image processing apparatus according to any one of (2) to (4), wherein, based on a user operation, the presentation control unit switches between displaying the presentation image representing one of the degree of focus in pixel units based on the first determination result and the degree of focus in region units based on the second determination result, and the presentation image representing both the degree of focus in pixel units and the degree of focus in region units.
- (6) The image processing apparatus according to (4), wherein the presentation control unit causes the presentation image representing one of the degree of focus in pixel units or the degree of focus in region units to be displayed, based on information on the accuracy of the degree of focus corresponding to the first determination result and information on the accuracy of the degree of focus corresponding to the second determination result.
- (7) The image processing apparatus according to any one of (2) to (6), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to an object included in the target image, to be displayed.
- (8) The image processing apparatus according to any one of (2) to (7), wherein the presentation control unit adjusts the color of the target image, and causes the presentation image based on the color-adjusted target image to be displayed.
- (9) The image processing apparatus according to any one of (2) to (8), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to the target image, to be displayed.
- (10) The image processing apparatus according to any one of (2) to (9), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to the display screen, to be displayed.
- (11) The image processing apparatus according to any one of (2) to (10), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to the result of detecting external light on the display screen, to be displayed.
- (12) The image processing apparatus according to any one of (2) to (11), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to the second determination result, to be displayed.
- (13) The image processing apparatus according to any one of (1) to (12), further including a first determination processing unit that determines the degree of focus in pixel units from the target image, wherein the presentation control unit controls presentation of the degree of focus in the target image based on the first determination result from the first determination processing unit.
- (14) The image processing apparatus according to (13), wherein the first determination processing unit determines the degree of focus in pixel units by a filter corresponding to a display screen on which the presentation image based on the target image, in which the degree of focus is represented by a manner of presentation based on one or both of the first determination result and the second determination result, is displayed.
- (15) The image processing apparatus according to any one of (1) to (14), further including a second determination processing unit that determines the degree of focus in region units from the target image, wherein the presentation control unit controls presentation of the degree of focus in the target image based on the second determination result from the second determination processing unit.
- (16) The image processing apparatus according to (15), wherein the second determination processing unit determines the degree of focus using a first determination criterion, which serves as a reference for determination, for regions of the target image that do not contain an object, and determines the degree of focus using a second determination criterion corresponding to the object for regions of the target image that contain the object.
- (17) An image processing apparatus including: a first determination processing unit that determines the degree of focus in pixel units from a target image, which is an image to be processed; and a second determination processing unit that determines the degree of focus in region units from the target image.
- (18) The image processing apparatus according to (17), wherein the second determination processing unit determines the degree of focus in region units using a determiner obtained by machine learning using learning images that are in focus and learning images that are not in focus.
- (19) An image processing method executed by an image processing apparatus, including a step of controlling presentation of the degree of focus in a target image, which is an image to be processed, based on a first determination result, which is a determination result of the degree of focus in pixel units in the target image, and a second determination result, which is a determination result of the degree of focus in region units in the target image.
- (20) A program for causing a computer to execute a step of controlling presentation of the degree of focus in a target image, which is an image to be processed, based on a first determination result, which is a determination result of the degree of focus in pixel units in the target image, and a second determination result, which is a determination result of the degree of focus in region units in the target image.
Abstract
Description
1. Image processing method according to the present embodiment
2. Image processing apparatus according to the present embodiment
3. Program according to the present embodiment
Before describing the configuration of the image processing apparatus according to the present embodiment, the image processing method according to the present embodiment will first be described. In the following, the image processing method according to the present embodiment is described taking as an example the case where the processing related to the image processing method is performed by the image processing apparatus according to the present embodiment.
- A presentation image representing the degree of focus in pixel units based on the first determination result
- A presentation image representing the degree of focus in region units based on the second determination result
- A presentation image representing the degree of focus in pixel units based on the first determination result and the degree of focus in region units based on the second determination result
When displaying a presentation image representing the degree of focus in pixel units based on the first determination result, the image processing apparatus according to the present embodiment generates, for example, an image in which an edge image (the first determination result), or an image based on the edge image, is superimposed on the target image, as the presentation image representing the degree of focus in pixel units. Examples of an image based on the edge image according to the present embodiment include images obtained by processing the edge image so that the edges it indicates are not changed, such as images in which the hue, color density, or luminance of the edge image has been changed.
When displaying a presentation image representing the degree of focus in region units based on the second determination result, the image processing apparatus according to the present embodiment generates, for example, an image in which a determination image (the second determination result), or an image based on the determination image, is superimposed on the target image, as the presentation image representing the degree of focus in region units. Examples of an image based on the determination image according to the present embodiment include images obtained by processing the determination image, such as images in which the hue, color density, or luminance of the determination image has been changed.
When displaying a presentation image representing both the degree of focus in pixel units and the degree of focus in region units, the image processing apparatus according to the present embodiment generates, for example, an image in which the edge image and the determination image are superimposed on the target image, as the presentation image.
FIG. 2 is a block diagram showing an example of the configuration of the image processing apparatus 100 according to the present embodiment.
FIG. 9 is an explanatory diagram showing an example of the hardware configuration of the image processing apparatus 100 according to the present embodiment. The image processing apparatus 100 includes, for example, an MPU 150, a ROM 152, a RAM 154, a recording medium 156, an input/output interface 158, an operation input device 160, a display device 162, a communication interface 164, an imaging device 166, and a sensor 168. The components of the image processing apparatus 100 are connected to one another by, for example, a bus 170 serving as a data transmission path.
The second determination processing unit 104, for example, acquires an evaluation value for each region using a filter, determines the degree of focus in region units in the target image based on the acquired evaluation values, and obtains the second determination result.
- Generate a luminance signal.
- Extract high-frequency components from the generated luminance signal using a filter such as a high-pass filter.
- Take the absolute value of the extracted high-frequency components.
- Remove noise from the absolute values of the high-frequency components. An upper limit may be applied to the absolute values of the high-frequency components.
- Integrate the noise-removed values.
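The evaluation-value pipeline above (luminance signal, high-pass filtering, absolute value, noise removal with an optional upper limit, integration) can be sketched as follows. The BT.601 luminance weights, the one-pixel difference used as a stand-in high-pass filter, and the noise/clip constants are illustrative assumptions, not values from the text.

```python
import numpy as np

def region_evaluation_value(region_rgb, noise_floor=0.02, upper_limit=1.0):
    """Compute a per-region sharpness evaluation value from an RGB region
    (float array in [0, 1]), following the five steps described above."""
    # 1. Generate a luminance signal (ITU-R BT.601 weights).
    y = (0.299 * region_rgb[..., 0]
         + 0.587 * region_rgb[..., 1]
         + 0.114 * region_rgb[..., 2])
    # 2. Extract high-frequency components with a simple horizontal
    #    high-pass filter (difference of neighboring pixels).
    hp = np.diff(y, axis=1)
    # 3. Take the absolute value of the high-frequency components.
    hp = np.abs(hp)
    # 4. Remove noise (zero out small responses) and apply an upper limit.
    hp[hp < noise_floor] = 0.0
    hp = np.minimum(hp, upper_limit)
    # 5. Integrate (sum) the remaining values to obtain the evaluation value.
    return float(hp.sum())
```

A sharp region (strong intensity transitions) yields a larger evaluation value than a flat, defocused one, which is what the subsequent threshold processing relies on.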
The second determination processing unit 104, for example, determines the degree of focus in region units using a determiner obtained by machine learning using learning images that are in focus and learning images that are not in focus, and obtains the second determination result. Here, the determiner according to the present embodiment is a process that quantifies the degree of blur or defocus based on statistical properties of images obtained through learning.
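As a toy stand-in for such a learned determiner, the following trains a logistic regression on a single sharpness feature, using in-focus and out-of-focus learning images. The feature, model, and hyperparameters are all assumptions; the text does not specify the learning method beyond the use of the two classes of learning images.

```python
import numpy as np

def sharpness_feature(img):
    """Mean absolute horizontal gradient of a grayscale image."""
    return np.abs(np.diff(img, axis=1)).mean()

def train_determiner(in_focus_imgs, out_of_focus_imgs, steps=500, lr=0.5):
    """Fit a 1-feature logistic regression separating in-focus (label 1)
    from out-of-focus (label 0) learning images, and return a determiner
    that maps an image to a focus score in [0, 1]."""
    x = np.array([sharpness_feature(i) for i in in_focus_imgs] +
                 [sharpness_feature(i) for i in out_of_focus_imgs])
    t = np.array([1.0] * len(in_focus_imgs) + [0.0] * len(out_of_focus_imgs))
    w, b = 0.0, 0.0
    for _ in range(steps):  # gradient descent on the cross-entropy loss
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))
        w -= lr * ((p - t) * x).mean()
        b -= lr * (p - t).mean()
    return lambda img: float(
        1.0 / (1.0 + np.exp(-(w * sharpness_feature(img) + b))))
```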
The processing according to the first example shown in (a) above and the processing according to the second example shown in (b) above illustrated determining the in-focus regions by applying threshold processing using a predetermined threshold to the evaluation value of each region w(n). However, the processing in the second determination processing unit 104 according to the present embodiment is not limited to determining the in-focus regions by applying threshold processing that uses a threshold common to all regions w(n) to the acquired evaluation values.
As described above, by determining whether a region w(n) corresponding to a face region is in focus using a threshold corresponding to the face region, it becomes possible, for example, to generate a presentation image in which in-focus locations and regions are highlighted, and to display the generated presentation image on the display screen.
The processing in the second determination processing unit 104 is not limited to the processing according to the first example shown in (a) above, the processing according to the second example shown in (b) above, and the processing according to the third example shown in (c) above; the second determination processing unit 104 can perform any processing capable of determining the degree of focus of each region w(n), such as focus determination using a phase-difference sensor.
(1-1) First example of processing in the in-focus region presentation control unit 110
FIG. 12 is an explanatory diagram showing a first example of processing in the in-focus region presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment.
FIG. 14 is an explanatory diagram showing a second example of processing in the in-focus region presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment.
- The drawing color determination unit 126 determines the first drawing color by adjusting, according to the average luminance of the pixels surrounding the in-focus regions in the image transmitted from the background processing unit 120, the luminance of a color determined in the same manner as the drawing color determination unit 122 shown in FIG. 12.
- The drawing color determination unit 126 determines the first drawing color by inverting, rotating the hue of, or otherwise modifying, according to the average color of the pixels surrounding the in-focus regions in the image transmitted from the background processing unit 120, a color determined in the same manner as the drawing color determination unit 122 shown in FIG. 12.
- The drawing color determination unit 126 determines the mixing ratio between the image transmitted from the background processing unit 120 and the determination image (or an adjustment value for adjusting the mixing ratio), according to the average color of the pixels surrounding the in-focus regions in the image transmitted from the background processing unit 120.
- When the image transmitted from the background processing unit 120 is a monochrome image, the drawing color determination unit 126 determines, as the first drawing color, a color based on a color extracted from the target image (or, when processing is performed in the in-focus region presentation control unit 110 after processing has been performed in the in-focus location presentation control unit 112, the image processed in the in-focus region presentation control unit 110). Examples of a color based on colors extracted from the target image in relation to the determination of the first drawing color include the color extracted from each in-focus region, a representative color of the colors extracted from the in-focus regions (for example, the most frequent color), and the average of the colors extracted from the in-focus regions.
FIG. 15 is an explanatory diagram showing a third example of processing in the in-focus region presentation control unit 110 included in the image processing apparatus 100 according to the present embodiment.
(2-1) First example of processing in the in-focus location presentation control unit 112
FIG. 18 is an explanatory diagram showing a first example of processing in the in-focus location presentation control unit 112 included in the image processing apparatus 100 according to the present embodiment.
The configuration of the in-focus location presentation control unit 112 is not limited to the configuration according to the first example shown in (2-1) above. For example, in addition to the functions of the drawing color determination unit 130 included in the in-focus location presentation control unit 112 according to the first example shown in FIG. 18, the drawing color determination unit included in the in-focus location presentation control unit 112 may further determine the second drawing color and the like based on the input image.
- The drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example determines the second drawing color by adjusting, according to the average luminance of the pixels surrounding the edge pixels in the input image, the luminance of a color determined in the same manner as the drawing color determination unit 130 shown in FIG. 18.
- The drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example determines the second drawing color by inverting, rotating the hue of, or otherwise modifying, according to the average color of the pixels surrounding the edge pixels in the input image, a color determined in the same manner as the drawing color determination unit 130 shown in FIG. 18.
- The drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example determines the mixing ratio between the input image and the edge image (or an adjustment value for adjusting the mixing ratio), according to the average color of the pixels surrounding the edge pixels in the input image.
- When the input image is a monochrome image, the drawing color determination unit included in the in-focus location presentation control unit 112 according to the second example determines, as the second drawing color, a color based on a color extracted from, for example, the target image (or the input image). Examples of a color based on colors extracted from the target image or the like in relation to the determination of the second drawing color include the color of each edge pixel in the target image, a representative color of the edge pixels (for example, the most frequent color), and the average color of the edge pixels.
The configuration of the in-focus location presentation control unit 112 is not limited to the configuration according to the first example shown in (2-1) above or the configuration according to the second example shown in (2-2) above. For example, in addition to the functions of the drawing color determination unit 130 included in the in-focus location presentation control unit 112 according to the first example shown in FIG. 18, the drawing color determination unit included in the in-focus location presentation control unit 112 may further determine the second drawing color using an illuminance value detected by the sensor 168 or the like (an example of the result of detecting external light on the display screen).
- For portions with many high-frequency components in the region containing the detected subject, the presentation control unit 106 reduces the drawing amount of the edge display (improving situations in which too many contours are drawn, making details hard to check).
- For portions with few high-frequency components in the region containing the detected subject, the presentation control unit 106 emphasizes the in-focus regions more strongly (since edges are hard to display when high-frequency components are scarce, the region-unit degree of focus is emphasized instead).
- For dark portions in the region containing the detected subject, the presentation control unit 106 determines a brighter color as the first drawing color or the second drawing color.
- For bright portions in the region containing the detected subject, the presentation control unit 106 determines a darker color as the first drawing color or the second drawing color.
- The presentation control unit 106 determines the first drawing color and the second drawing color according to a representative color (for example, the most frequent color) in the region containing the detected subject.
- The presentation control unit 106 lowers the mixing ratio in the region containing the detected subject.
[I] Program for causing a computer to function as the image processing apparatus according to the present embodiment
A program for causing a computer to function as the image processing apparatus according to the present embodiment (for example, a program capable of executing processing related to the image processing method according to the present embodiment, such as the "presentation control process"; the "process of determining the degree of focus in pixel units from the target image, and the presentation control process"; the "process of determining the degree of focus in region units from the target image, and the presentation control process"; or the "process of determining the degree of focus in pixel units from the target image, the process of determining the degree of focus in region units from the target image, and the presentation control process") is executed by a processor or the like in a computer, thereby allowing the user to grasp the degree of focus in an image.
A program for causing a computer to function as the determination processing apparatus according to the present embodiment (for example, the "process of determining the degree of focus in pixel units from the target image, and the process of determining the degree of focus in region units from the target image") is executed by a processor or the like in a computer, thereby realizing an image processing system that allows the user to grasp the degree of focus in an image.
(1)
An image processing apparatus including a presentation control unit that controls presentation of the degree of focus in a target image, which is an image to be processed, based on a first determination result, which is a determination result of the degree of focus in pixel units in the target image, and a second determination result, which is a determination result of the degree of focus in region units in the target image.
(2)
The image processing apparatus according to (1), wherein the presentation control unit causes a presentation image based on the target image, in which the degree of focus is represented by a manner of presentation based on one or both of the first determination result and the second determination result, to be displayed on a display screen.
(3)
The image processing apparatus according to (2), wherein the presentation control unit causes the presentation image representing both the degree of focus in pixel units based on the first determination result and the degree of focus in region units based on the second determination result to be displayed.
(4)
The image processing apparatus according to (2), wherein the presentation control unit causes the presentation image representing one of the degree of focus in pixel units based on the first determination result and the degree of focus in region units based on the second determination result to be displayed.
(5)
The image processing apparatus according to any one of (2) to (4), wherein, based on a user operation, the presentation control unit switches between displaying the presentation image representing one of the degree of focus in pixel units based on the first determination result and the degree of focus in region units based on the second determination result, and the presentation image representing both the degree of focus in pixel units and the degree of focus in region units.
(6)
The image processing apparatus according to (4), wherein the presentation control unit causes the presentation image representing one of the degree of focus in pixel units or the degree of focus in region units to be displayed, based on information on the accuracy of the degree of focus corresponding to the first determination result and information on the accuracy of the degree of focus corresponding to the second determination result.
(7)
The image processing apparatus according to any one of (2) to (6), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to an object included in the target image, to be displayed.
(8)
The image processing apparatus according to any one of (2) to (7), wherein the presentation control unit adjusts the color of the target image, and causes the presentation image based on the color-adjusted target image to be displayed.
(9)
The image processing apparatus according to any one of (2) to (8), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to the target image, to be displayed.
(10)
The image processing apparatus according to any one of (2) to (9), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to the display screen, to be displayed.
(11)
The image processing apparatus according to any one of (2) to (10), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to the result of detecting external light on the display screen, to be displayed.
(12)
The image processing apparatus according to any one of (2) to (11), wherein the presentation control unit causes the presentation image, in which the degree of focus is represented by a manner of presentation corresponding to the second determination result, to be displayed.
(13)
The image processing apparatus according to any one of (1) to (12), further including a first determination processing unit that determines the degree of focus in pixel units from the target image, wherein the presentation control unit controls presentation of the degree of focus in the target image based on the first determination result from the first determination processing unit.
(14)
The image processing apparatus according to (13), wherein the first determination processing unit determines the degree of focus in pixel units by a filter corresponding to a display screen on which the presentation image based on the target image, in which the degree of focus is represented by a manner of presentation based on one or both of the first determination result and the second determination result, is displayed.
(15)
The image processing apparatus according to any one of (1) to (14), further including a second determination processing unit that determines the degree of focus in region units from the target image, wherein the presentation control unit controls presentation of the degree of focus in the target image based on the second determination result from the second determination processing unit.
(16)
The image processing apparatus according to (15), wherein the second determination processing unit determines the degree of focus using a first determination criterion, which serves as a reference for determination, for regions of the target image that do not contain an object, and determines the degree of focus using a second determination criterion corresponding to the object for regions of the target image that contain the object.
(17)
An image processing apparatus including: a first determination processing unit that determines the degree of focus in pixel units from a target image, which is an image to be processed; and a second determination processing unit that determines the degree of focus in region units from the target image.
(18)
The image processing apparatus according to (17), wherein the second determination processing unit determines the degree of focus in region units using a determiner obtained by machine learning using learning images that are in focus and learning images that are not in focus.
(19)
An image processing method executed by an image processing apparatus, including a step of controlling presentation of the degree of focus in a target image, which is an image to be processed, based on a first determination result, which is a determination result of the degree of focus in pixel units in the target image, and a second determination result, which is a determination result of the degree of focus in region units in the target image.
(20)
A program for causing a computer to execute a step of controlling presentation of the degree of focus in a target image, which is an image to be processed, based on a first determination result, which is a determination result of the degree of focus in pixel units in the target image, and a second determination result, which is a determination result of the degree of focus in region units in the target image.
102 First determination processing unit
104 Second determination processing unit
106 Presentation control unit
110 In-focus region presentation control unit
112 In-focus portion presentation control unit
120 Background processing unit
122, 126, 128, 130 Drawing color determination unit
124, 132 Mixing unit
Claims (20)
- An image processing apparatus including a presentation control unit configured to control presentation of a degree of focus in a target image, which is an image to be processed, on the basis of a first determination result, which is a determination result of the degree of focus on a pixel basis in the target image, and a second determination result, which is a determination result of the degree of focus on a region basis in the target image.
- The image processing apparatus according to claim 1, wherein the presentation control unit causes a display screen to display a presentation image based on the target image in which the degree of focus is expressed by a presentation manner based on one or both of the first determination result and the second determination result.
- The image processing apparatus according to claim 2, wherein the presentation control unit causes the presentation image to be displayed in which both the degree of focus on the pixel basis based on the first determination result and the degree of focus on the region basis based on the second determination result are expressed.
- The image processing apparatus according to claim 2, wherein the presentation control unit causes the presentation image to be displayed in which one of the degree of focus on the pixel basis based on the first determination result and the degree of focus on the region basis based on the second determination result is expressed.
- The image processing apparatus according to claim 2, wherein, on the basis of a user operation, the presentation control unit switches between displaying the presentation image in which one of the degree of focus on the pixel basis based on the first determination result and the degree of focus on the region basis based on the second determination result is expressed and the presentation image in which both the degree of focus on the pixel basis and the degree of focus on the region basis are expressed.
- The image processing apparatus according to claim 4, wherein the presentation control unit causes the presentation image to be displayed in which one of the degree of focus on the pixel basis or the degree of focus on the region basis is expressed, on the basis of information on the accuracy of the degree of focus corresponding to the first determination result and information on the accuracy of the degree of focus corresponding to the second determination result.
- The image processing apparatus according to claim 2, wherein the presentation control unit causes the presentation image to be displayed in which the degree of focus is expressed by a presentation manner corresponding to an object included in the target image.
- The image processing apparatus according to claim 2, wherein the presentation control unit adjusts colors of the target image and causes the presentation image based on the color-adjusted target image to be displayed.
- The image processing apparatus according to claim 2, wherein the presentation control unit causes the presentation image to be displayed in which the degree of focus is expressed by a presentation manner corresponding to the target image.
- The image processing apparatus according to claim 2, wherein the presentation control unit causes the presentation image to be displayed in which the degree of focus is expressed by a presentation manner corresponding to the display screen.
- The image processing apparatus according to claim 2, wherein the presentation control unit causes the presentation image to be displayed in which the degree of focus is expressed by a presentation manner corresponding to a detection result of external light at the display screen.
- The image processing apparatus according to claim 2, wherein the presentation control unit causes the presentation image to be displayed in which the degree of focus is expressed by a presentation manner corresponding to the second determination result.
- The image processing apparatus according to claim 1, further including a first determination processing unit configured to determine the degree of focus on the pixel basis from the target image, wherein the presentation control unit controls the presentation of the degree of focus in the target image on the basis of the first determination result from the first determination processing unit.
- The image processing apparatus according to claim 13, wherein the first determination processing unit determines the degree of focus on the pixel basis using a filter corresponding to a display screen on which a presentation image based on the target image, in which the degree of focus is expressed by a presentation manner based on one or both of the first determination result and the second determination result, is displayed.
- The image processing apparatus according to claim 1, further including a second determination processing unit configured to determine the degree of focus on the region basis from the target image, wherein the presentation control unit controls the presentation of the degree of focus in the target image on the basis of the second determination result from the second determination processing unit.
- The image processing apparatus according to claim 15, wherein the second determination processing unit determines the degree of focus for a region of the target image that does not include an object using a first determination criterion serving as a reference for the determination, and determines the degree of focus for a region of the target image that includes an object using a second determination criterion corresponding to the object.
- An image processing apparatus including: a first determination processing unit configured to determine a degree of focus on a pixel basis from a target image, which is an image to be processed; and a second determination processing unit configured to determine the degree of focus on a region basis from the target image.
- The image processing apparatus according to claim 17, wherein the second determination processing unit determines the degree of focus on the region basis using a determiner obtained by machine learning that uses in-focus learning images and out-of-focus learning images.
- An image processing method executed by an image processing apparatus, the method including a step of controlling presentation of a degree of focus in a target image, which is an image to be processed, on the basis of a first determination result, which is a determination result of the degree of focus on a pixel basis in the target image, and a second determination result, which is a determination result of the degree of focus on a region basis in the target image.
- A program for causing a computer to execute a step of controlling presentation of a degree of focus in a target image, which is an image to be processed, on the basis of a first determination result, which is a determination result of the degree of focus on a pixel basis in the target image, and a second determination result, which is a determination result of the degree of focus on a region basis in the target image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/305,718 US10182184B2 (en) | 2014-05-02 | 2015-01-21 | Image processing apparatus and image processing method |
JP2016515866A JP6547739B2 (ja) | 2014-05-02 | 2015-01-21 | Image processing apparatus, image processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-095272 | 2014-05-02 | ||
JP2014095272 | 2014-05-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015166675A1 true WO2015166675A1 (ja) | 2015-11-05 |
Family
ID=54358416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/051572 WO2015166675A1 (ja) | 2014-05-02 | 2015-01-21 | Image processing apparatus, image processing method, and program |
Country Status (3)
Country | Link |
---|---|
US (1) | US10182184B2 (ja) |
JP (1) | JP6547739B2 (ja) |
WO (1) | WO2015166675A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180225820A1 (en) * | 2015-08-07 | 2018-08-09 | Arizona Board Of Regents On Behalf Of Arizona State University | Methods, systems, and media for simultaneously monitoring colonoscopic video quality and detecting polyps in colonoscopy |
WO2019181053A1 (ja) * | 2018-03-22 | 2019-09-26 | Fujifilm Corporation | Defocus amount measuring device, method, program, and discriminator |
JP2021005763A (ja) * | 2019-06-25 | 2021-01-14 | Canon Inc. | Image processing device, imaging device, control method of image processing device, and program |
JP2021015559A (ja) * | 2019-07-16 | 2021-02-12 | Toppan Printing Co., Ltd. | Three-dimensional shape model generation device, three-dimensional shape model generation method, and program |
JP7444163B2 (ja) | 2019-03-28 | 2024-03-06 | Sony Group Corporation | Imaging device, imaging method, and program |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0854557A (ja) * | 1994-08-09 | 1996-02-27 | Nikon Corp | Autofocus device for camera |
JP2004015597A (ja) * | 2002-06-10 | 2004-01-15 | Minolta Co Ltd | Electronic camera |
JP2007060328A (ja) * | 2005-08-25 | 2007-03-08 | Sony Corp | Imaging apparatus and display control method |
JP2009163220A (ja) * | 2007-12-14 | 2009-07-23 | Canon Inc | Imaging apparatus |
JP2010171769A (ja) * | 2009-01-23 | 2010-08-05 | Nikon Corp | Electronic camera |
WO2012164896A1 (ja) * | 2011-05-31 | 2012-12-06 | Panasonic Corporation | Image processing device, image processing method, and digital camera |
JP2013242407A (ja) * | 2012-05-18 | 2013-12-05 | Canon Inc | Imaging apparatus and control method therefor |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5732288A (en) | 1994-08-09 | 1998-03-24 | Nikon Corporation | Auto-focusing device for camera |
JP4389656B2 (ja) * | 2004-05-12 | 2009-12-24 | Sony Corp | Image processing apparatus and method, recording medium, and program |
JP4913163B2 (ja) * | 2007-10-16 | 2012-04-11 | Panasonic Corporation | Image display device and image display method |
US8279318B2 (en) | 2007-12-14 | 2012-10-02 | Canon Kabushiki Kaisha | Image pickup apparatus and display control method for the same |
JP2013242408A (ja) * | 2012-05-18 | 2013-12-05 | Canon Inc | Imaging apparatus and control method therefor |
JP5938268B2 (ja) * | 2012-05-18 | 2016-06-22 | Canon Inc. | Imaging apparatus and control method therefor |
US9277111B2 (en) | 2012-05-18 | 2016-03-01 | Canon Kabushiki Kaisha | Image capture apparatus and control method therefor |
EP3061063A4 (en) * | 2013-10-22 | 2017-10-11 | Eyenuk, Inc. | Systems and methods for automated analysis of retinal images |
- 2015-01-21 US US15/305,718 patent/US10182184B2/en active Active
- 2015-01-21 WO PCT/JP2015/051572 patent/WO2015166675A1/ja active Application Filing
- 2015-01-21 JP JP2016515866A patent/JP6547739B2/ja active Active
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180225820A1 (en) * | 2015-08-07 | 2018-08-09 | Arizona Board Of Regents On Behalf Of Arizona State University | Methods, systems, and media for simultaneously monitoring colonoscopic video quality and detecting polyps in colonoscopy |
US10861151B2 (en) * | 2015-08-07 | 2020-12-08 | The Arizona Board Of Regents On Behalf Of Arizona State University | Methods, systems, and media for simultaneously monitoring colonoscopic video quality and detecting polyps in colonoscopy |
WO2019181053A1 (ja) * | 2018-03-22 | 2019-09-26 | Fujifilm Corporation | Defocus amount measuring device, method, program, and discriminator |
JP7444163B2 (ja) | 2019-03-28 | 2024-03-06 | Sony Group Corporation | Imaging device, imaging method, and program |
JP2021005763A (ja) * | 2019-06-25 | 2021-01-14 | Canon Inc. | Image processing device, imaging device, control method of image processing device, and program |
JP7210388B2 (ja) | 2019-06-25 | 2023-01-23 | Canon Inc. | Image processing device, imaging device, control method of image processing device, and program |
JP2021015559A (ja) * | 2019-07-16 | 2021-02-12 | Toppan Printing Co., Ltd. | Three-dimensional shape model generation device, three-dimensional shape model generation method, and program |
JP7334516B2 (ja) | 2019-07-16 | 2023-08-29 | Toppan Printing Co., Ltd. | Three-dimensional shape model generation device, three-dimensional shape model generation method, and program |
Also Published As
Publication number | Publication date |
---|---|
US20170048447A1 (en) | 2017-02-16 |
US10182184B2 (en) | 2019-01-15 |
JPWO2015166675A1 (ja) | 2017-04-20 |
JP6547739B2 (ja) | 2019-07-24 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15785871; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2016515866; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 15305718; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 15785871; Country of ref document: EP; Kind code of ref document: A1 |