CN107018316B - Image processing apparatus, image processing method, and storage medium - Google Patents


Info

Publication number
CN107018316B
CN107018316B (application CN201611033980.0A)
Authority
CN
China
Prior art keywords
image
wide
angle
images
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611033980.0A
Other languages
Chinese (zh)
Other versions
CN107018316A (en)
Inventor
高山喜博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Casio Computer Co Ltd filed Critical Casio Computer Co Ltd
Publication of CN107018316A
Application granted
Publication of CN107018316B
Legal status: Active (anticipated expiration)

Classifications

    • G02B21/244 Devices for focusing using image analysis techniques
    • H04N23/80 Camera processing pipelines; Components thereof
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/73
    • H04N1/387 Composing, repositioning or otherwise geometrically modifying originals
    • G06T2207/10024 Color image
    • G06T2207/10056 Microscopic image
    • G06T2207/10148 Varying focus
    • G06T2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G06T2207/20064 Wavelet transform [DWT]
    • G06T2207/20076 Probabilistic image processing
    • G06T2207/20221 Image fusion; Image merging

Abstract

The invention provides an image processing apparatus, an image processing method, and a storage medium capable of clearly presenting to a user the correspondence between a fisheye image and a given subject captured in the fisheye image. The image processing apparatus includes: a detection unit that recognizes a given subject included in a wide-angle image captured at a wide angle and detects an image of the subject portion; a correction unit that corrects distortion of the image of the subject portion; and a processing unit that performs processing for associating an image of at least a part of the wide-angle image with the distortion-corrected image of the subject portion.

Description

Image processing apparatus, image processing method, and storage medium
The present application is based on Japanese Patent Application No. 2015-249410, filed December 22, 2015, and claims the benefit of priority under 35 U.S.C. § 119; the entire disclosure thereof, including the specification, claims, drawings, and abstract, is hereby incorporated by reference.
Technical Field
The present invention relates to an image processing apparatus, an image processing method, and a storage medium that perform processing for a wide-angle image captured at a wide angle.
Background
A fisheye lens used in an imaging device such as a digital still camera can capture a wide range with an angle of view of, for example, about 180°, but owing to the projection method it employs, an image captured by the fisheye lens (a fisheye image) becomes increasingly distorted from its center toward its ends (peripheral portion). As a technique for displaying a fisheye image captured with such a lens, Japanese Utility Model Registration No. 3066594, for example, discloses obtaining an image of a given region within the fisheye image and applying distortion correction to that region so as to provide the user with a distortion-free image (corrected image). Further, Japanese Patent Application Laid-Open No. 2015-19162 discloses a technique in which, when a distorted circular image (fisheye image) including the faces of conference participants is captured with a fisheye lens, the participants' faces are recognized and the images of the participants are cut out and displayed together with each participant's speaking time.
With the techniques of the above documents, even when a wide-range image is captured with a fisheye lens, a distortion-free image can be obtained for a region cut out from part of it. However, if only the cut-out image is displayed, it is difficult for the user to grasp how a person or other subject in the cut-out image corresponds to the fisheye image, or in what scene it was captured.
Disclosure of Invention
An object of the present invention is to clearly present to a user the correspondence between a fisheye image and a given subject captured in the fisheye image.
One aspect of the present invention relates to an image processing apparatus including: a detection unit that recognizes a given subject included in a wide-angle image captured at a wide angle and detects an image of a subject portion thereof; a correction unit that corrects distortion of an image of the object portion; and a processing unit that performs processing for associating an image of at least a part of the wide-angle image with an image of the subject portion whose distortion is corrected.
Another aspect of the present invention relates to an image processing method in an image processing apparatus, the method including: a process of recognizing a given subject included in a wide-angle image captured at a wide angle and detecting an image of the subject portion; a process of correcting distortion of the detected image of the subject portion; and a process of associating an image of at least a part of the wide-angle image with the corrected image of the subject portion.
Still another aspect of the present invention relates to a computer-readable storage medium storing a program for causing a computer of an image processing apparatus to execute: a process of recognizing a given subject included in a wide-angle image captured at a wide angle and detecting an image of the subject portion; a process of correcting distortion of the detected image of the subject portion; and a process of associating an image of at least a part of the wide-angle image with the corrected image of the subject portion.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the general description given above and the detailed description of the embodiments given below, serve to explain the principles of the invention.
Fig. 1(1) is an external view showing a state in which an imaging device 10 and a main body device 20 constituting a digital camera used as an image processing device are integrally combined, and fig. 1(2) is an external view showing a state in which the imaging device 10 and the main body device 20 are separated.
Fig. 2(1) is a block diagram showing the configuration of the imaging apparatus 10, and fig. 2(2) is a block diagram showing the configuration of the main apparatus 20.
Fig. 3(1) and (2) are diagrams for explaining various postures of the imaging apparatus 10 at the time of photographing.
Fig. 4(1) and (2) are diagrams showing examples of display of fisheye images captured by the fisheye lens 16B.
Fig. 5(1) to (4) are diagrams for explaining the case of reproducing a fisheye image (stored image).
Fig. 6 is a diagram for explaining the image management table 23C on the main apparatus 20 side.
Fig. 7 is a flowchart for explaining the operation of the main apparatus 20 side (the operation of the characteristic portion of embodiment 1) when performing 360 ° photography.
Fig. 8 is a flowchart showing the operation of the main apparatus 20 (the operation of the characteristic portion of embodiment 1) when reproducing a fisheye image (stored image).
Fig. 9 is a flowchart showing the operation continued from fig. 8.
Fig. 10 is a flowchart showing the operation of the main apparatus 20 (the operation of the characteristic portion of embodiment 2) when reproducing a fisheye image (stored image).
Fig. 11(1) to (5) are diagrams for explaining a case where a fisheye image (stored image) is reproduced in embodiment 2.
Fig. 12 is a flowchart showing the operation of the main apparatus 20 (the operation of the characteristic portion of embodiment 3) when reproducing a fisheye image (stored image).
Fig. 13(1) to (3) are diagrams for explaining a case where a fisheye image (stored image) is reproduced in embodiment 3.
Fig. 14(1) to (3) are diagrams for explaining another case where a fisheye image (stored image) is reproduced in embodiment 3.
Detailed Description
Hereinafter, embodiments of the present invention will be described based on the drawings.
(embodiment 1)
Embodiment 1 of the present invention will be described below with reference to fig. 1 to 9.
In the present embodiment, the present invention is applied, by way of example, to a digital camera serving as the image processing device: a separable digital camera that can be split into an imaging device 10 having an imaging unit described later and a main body device 20 having a display unit described later. Fig. 1(1) shows a state in which the imaging device 10 and the main body device 20 are combined, and fig. 1(2) shows a state in which they are separated. The imaging device 10 and the main body device 20 constituting the separable digital camera can be paired (wireless connection recognition) using available wireless communication, such as wireless LAN (Wi-Fi) or Bluetooth (registered trademark). The main body device 20 receives and acquires the image captured on the imaging device 10 side and displays it (live view image). In the present embodiment, the term "captured image" is not limited to images that have already been stored, but also refers to an image displayed on the live view screen (live view image: an image before storage).
Fig. 2(1) is a block diagram showing the configuration of the imaging apparatus 10, and fig. 2(2) is a block diagram showing the configuration of the main apparatus 20.
In fig. 2(1), the imaging apparatus 10 includes a control unit 11, a power supply unit 12, a storage unit 13, a communication unit 14, an operation unit 15, an imaging unit 16, and a posture detection unit 17. The control unit 11 operates on power supplied from the power supply unit (secondary battery) 12 and controls the overall operation of the imaging apparatus 10 according to various programs in the storage unit 13; the control unit 11 includes a CPU (central processing unit), a memory, and the like (not shown). The storage unit 13 includes, for example, a ROM and flash memory, and stores a program for realizing the present embodiment, various applications, and the like. The communication unit 14 transmits captured images to the main body device 20 and receives operation instruction signals and the like from it. The operation unit 15 includes basic operation keys (hardware keys) such as a power switch.
The image pickup section 16 constitutes a camera section capable of capturing a subject in high definition; a fisheye lens 16B, an image pickup device 16C, and the like are provided in the lens unit 16A of the image pickup section 16. In the camera of the present embodiment, a normal imaging lens (not shown) and the fisheye lens 16B are interchangeable, and the illustrated example shows the state with the fisheye lens 16B attached. The fisheye lens 16B is, for example, a circumferential (full-circle) fisheye lens composed of a three-lens system and capable of capturing a wide range with an angle of view of about 180°; the entire wide-angle image (fisheye image) captured by the fisheye lens 16B is a circular image. In this case, owing to the projection method employed, the wide-angle image (fisheye image) captured by the fisheye lens 16B becomes increasingly distorted from its center toward its ends.
That is, since the fisheye lens 16B is a circumferential fisheye lens capable of photographing a wide range with an angle of view of about 180°, the fisheye image as a whole is a circular image whose distortion grows from the center toward the end portion (peripheral portion); and since the peripheral portion is rendered smaller than the central portion of the fisheye image, it is extremely difficult for a user to visually confirm the content of the peripheral portion in detail. When the subject image (optical image) passing through the fisheye lens 16B is formed on the image pickup device (for example, a CMOS or CCD) 16C, the image signal (analog signal) photoelectrically converted by the image pickup device 16C is converted into a digital signal by an A/D conversion unit (not shown), subjected to predetermined image display processing, and then transmitted to the main body device 20 side for display on a monitor. The posture detection unit 17 is a triaxial acceleration sensor that detects acceleration applied to the imaging device 10, and supplies the acceleration components in the X, Y, and Z directions, detected according to the posture of the imaging device 10, to the control unit 11.
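As a concrete illustration of why the periphery is rendered smaller, consider the equidistant projection model (the patent does not name the exact projection; this model is a common assumption for circumferential fisheye lenses): the radial position of a subject in the image grows only linearly with its angle from the optical axis, so peripheral content is strongly compressed compared with an ordinary rectilinear rendering.

```python
import math

def fisheye_radius(theta_deg, f=1.0):
    """Equidistant fisheye projection: the image radius grows linearly
    with the angle theta from the optical axis (r = f * theta)."""
    return f * math.radians(theta_deg)

def rectilinear_radius(theta_deg, f=1.0):
    """Ordinary (rectilinear) lens for comparison: r = f * tan(theta)."""
    return f * math.tan(math.radians(theta_deg))

# Near the optical axis the two projections almost agree, but toward the
# 90-degree edge of a ~180-degree fisheye the subject occupies far less
# image radius than a rectilinear rendering would give it.
for theta in (10, 45, 80):
    print(theta, round(fisheye_radius(theta), 3), round(rectilinear_radius(theta), 3))
```

This linear-in-angle behaviour is exactly what makes the peripheral portion hard to inspect and what the distortion correction described below undoes.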
In fig. 2(2), the main apparatus 20 has a reproduction function of reproducing an image captured using a fisheye lens or the like, and includes: a control unit 21, a power supply unit 22, a storage unit 23, a communication unit 24, an operation unit 25, a touch display unit 26, and a posture detection unit 27. The control unit 21 operates by the supply of electric power from the power supply unit (secondary battery) 22, controls the overall operation of the main body device 20 in accordance with various programs in the storage unit 23, and the control unit 21 is provided with a CPU (central processing unit), a memory, and the like, which are not shown. The storage unit 23 has a configuration of, for example, a ROM, a flash memory, or the like, and includes a program memory 23A in which a program for realizing the present embodiment, various applications, and the like are stored, a work memory 23B in which various information (for example, flags and the like) necessary for operating the main apparatus 20 is temporarily stored, an image management table 23C described later, and the like.
The communication unit 24 transmits and receives various data to and from the imaging device 10. The operation unit 25 includes various push button type keys such as a power button, a release button, and a setting button for setting imaging conditions such as exposure and shutter speed, and the control unit 21 executes processing according to an input operation signal from the operation unit 25 or transmits the input operation signal to the imaging device 10. The touch display unit 26 has a structure in which a touch panel 26B is disposed in a stacked manner on a display 26A such as a high-definition liquid crystal display, and a display screen thereof is a monitor screen (live view screen) for displaying a captured image (fisheye image) in real time or a reproduction screen for reproducing a captured image. The posture detecting unit 27 is a triaxial acceleration sensor that detects acceleration applied to the main body device 20, and supplies each acceleration component in the X, Y, Z direction detected from the posture of the main body device 20 to the control unit 21.
On the main body device 20 side, when the release button for instructing photographing is operated, the control unit 21 performs development processing on the fisheye image to generate a captured image, performs image compression processing to convert it into a standard file format, and then records and stores it on the recording medium of the storage unit 23. At this time, the control unit 21 also generates a corrected image by correcting the distortion of the fisheye image, and records and stores the corrected image in association with the fisheye image. The correction uses a plane tangent to an arbitrary point on a virtual spherical model as the screen and converts the coordinates of points on the virtual spherical model into points on that plane screen; since this fisheye distortion correction process is a known technique commonly used in image processing, a detailed description is omitted.
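The tangent-plane correction described above can be sketched as an inverse mapping: for each pixel of the flat output image, compute the corresponding position in the fisheye image and sample it. The sketch below assumes an equidistant fisheye centered in the source image and a plane tangent at the optical axis; the function name and parameters are illustrative, not taken from the patent.

```python
import math

def plane_to_fisheye(u, v, f_plane, f_fish, cx, cy):
    """Map a point (u, v) on a plane tangent to the virtual sphere at the
    optical axis back to fisheye image coordinates (equidistant model).

    u, v    : output-plane coordinates relative to the plane centre
    f_plane : distance from the sphere centre to the tangent plane
    f_fish  : fisheye focal constant (r = f_fish * theta)
    cx, cy  : centre of the circular fisheye image
    """
    rho = math.hypot(u, v)                 # distance from the plane centre
    theta = math.atan2(rho, f_plane)       # incidence angle of the ray (u, v, f_plane)
    r = f_fish * theta                     # radius in the fisheye image
    if rho == 0.0:
        return cx, cy                      # the plane centre maps to the image centre
    return cx + r * u / rho, cy + r * v / rho
```

Building the corrected image then amounts to looping over all (u, v) output pixels and sampling the fisheye source (e.g. bilinearly) at the mapped coordinates.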
Fig. 3 is a diagram for explaining the posture of the imaging apparatus 10.
The entire imaging device 10 is, for example, box-shaped, and a fisheye lens 16B is disposed in the center of the front surface thereof. Fig. 3(1) shows a state in which the optical axis direction of the fisheye lens 16B is substantially orthogonal to the gravity direction, that is, a case in which shooting is performed such that the camera is substantially perpendicular to the ground (a case in which vertical shooting is performed), and a hemisphere indicated by a broken line in the drawing shows a shooting range of the fisheye lens 16B in which the angle of view is about 180 °. Fig. 3(2) shows a state in which the front direction of the camera (the optical axis direction of the fisheye lens 16B) is oriented skyward, that is, a case in which shooting is performed such that the optical axis direction of the fisheye lens 16B is substantially opposite to the gravity direction (a case in which horizontal shooting is performed).
Fig. 4 is a view illustrating a wide-angle image (fish-eye image) captured by using the fish-eye lens 16B, and shows a state in which the fish-eye image is displayed on the reproduction screen of the touch display unit 26 on the main apparatus 20 side.
The illustrated example shows fisheye images obtained by the vertical imaging of fig. 3(1): fig. 4(1) shows the fisheye image obtained when the front 180° is taken as the imaging range, and fig. 4(2) the fisheye image obtained when the rear 180° is taken as the imaging range. When 360° photography is performed in this manner, photography is carried out twice, divided into a front 180° shot and a rear 180° shot. In the illustrated example, the pictures were taken at an outdoor dinner party around a table: a mother and daughter are captured in the fisheye image of fig. 4(1) covering the front 180°, and another mother and daughter in the fisheye image of fig. 4(2) covering the rear 180°. The fisheye images captured by 360° photography in this manner are recorded and stored as a set (a fisheye image group).
The control unit 21 of the main body device 20 performs a process of acquiring the fisheye image captured by the imaging device 10 and correcting the distortion of the entire image, then recognizes the given subjects included in the corrected image, detects an image of the subject portion for each subject, and performs a process of associating a partial image of the corrected image with each detected subject-portion image. That is, the control unit 21 corrects the distortion of each fisheye image photographed at 360° in the manner described above, generates a corrected image, and records and stores it in association with each fisheye image. Each corrected image is then analyzed, and all of the given subjects (faces of persons) included in it are detected. Face detection, for example by detecting facial features or skin color, is a technique commonly used in cameras; since this known technique is used in the present embodiment, a detailed description is omitted.
When the faces of all the persons included in the corrected image have been detected by this face detection processing, the control section 21 cuts out the subject portion (face portion) of each person along the person's outline to generate a face image, and records and stores each face image in association with the fisheye image. A face image is an image centered on the face so that the person can be identified, but it may also include the upper body (head, neck, shoulders), and it need not be limited to faces: a feature region extracted by detecting feature points in the image may be used instead. The control unit 21 records and stores each face image after image processing that adjusts it to a predetermined shape and size.
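The adjustment of each cut-out face to a predetermined shape and size could, for example, start from a square crop around the detected face, padded and clamped to the image bounds. The helper below is a hypothetical sketch under that assumption, not the patent's concrete method.

```python
def square_face_crop(face_box, img_w, img_h, margin=0.3):
    """Given a detected face bounding box (x, y, w, h), return a square
    crop box (x0, y0, side) padded by `margin` and clamped to the image."""
    x, y, w, h = face_box
    side = int(max(w, h) * (1.0 + margin))    # square side with padding
    side = min(side, img_w, img_h)            # cannot exceed the image
    cx, cy = x + w // 2, y + h // 2           # face centre
    x0 = min(max(cx - side // 2, 0), img_w - side)
    y0 = min(max(cy - side // 2, 0), img_h - side)
    return x0, y0, side
```

Because every crop produced this way has the same aspect ratio, scaling each one to a fixed resolution yields uniformly shaped face thumbnails for the list display.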
Fig. 5 is a diagram for explaining a case where a fisheye image (stored image) is reproduced.
Fig. 5(1) shows a list screen in which thumbnails of the various fisheye images (stored images) are displayed on the touch display unit 26 of the main body device 20; the circles in the figure represent fisheye images, shown as thumbnails arranged in a 3 × 3 matrix. When the position of an arbitrary thumbnail is touched, the control unit 21 selects the touched thumbnail (fisheye image) as the reproduction target. Fig. 5(2) shows the fisheye image selected as the reproduction target: of the 360° set described above, the fisheye image photographed over the front 180° (see fig. 4(1)).
Fig. 5(3) shows an image in a predetermined range cut out from the corrected image stored in association with the fisheye image selected as the reproduction target. In this case, an image within a given range (for example, a rectangular cutout frame) is cut out of the corrected image. The cutout frame (a virtual frame) is similar in shape to the corrected image, for example a rectangular frame 1/4 the size of the corrected image, and is set at an initial position (the central portion) on the corrected image when the fisheye image is first selected, but it can be moved to an arbitrary position on the corrected image by a user operation. In the illustrated example, the image inside the rectangular cutout frame shown by the broken line (including the image of the mother and daughter) is cut out.
Fig. 5(4) shows the reproduction screen for the fisheye image selected as the reproduction target. On the reproduction screen, as described above, part of the corrected image (the image in the cutout frame) is displayed enlarged, and the face images of the persons corresponding to the fisheye image are displayed as a list. The list is not limited to the face images of persons included in the fisheye image selected by the touch operation (for example, the fisheye image of fig. 4(1)); the face images of persons included in the other fisheye image of the pair (for example, the fisheye image of fig. 4(2)) are also listed. The face images are arranged in the order in which the persons appear in the fisheye images: as shown in the figure, they are lined up in a row at the bottom of the reproduction screen. In the figure, the face images of the mother and daughter arranged on the left side belong to the persons included in the fisheye image of fig. 4(1), and those of the mother and daughter arranged on the right side to the persons included in the fisheye image of fig. 4(2).
In the figure, the dotted arrows around the enlarged image (the image in the cutout frame) indicate the directions in which the cutout frame can be moved by a touch operation; the cutout frame is moved according to the direction and amount of a slide operation on the touch display unit 26. Accordingly, the content of the image in the cutout frame (the enlarged image) displayed on the reproduction screen changes as the frame moves. The moving direction of the cutout frame is not limited to the four directions up, down, left, and right; it can also be moved diagonally. In addition to the slide operation on the reproduction screen, a double-tap operation (two taps in quick succession) is also effective: when an arbitrary face image is double-tapped, the control section 21 detects the position of that face in the corrected image and moves (jumps) the cutout frame to the detected position, so that the face image is included in the enlarged image (the image within the cutout frame) displayed on the reproduction screen.
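The frame behaviour just described, sliding by the drag amount with clamping at the image edges and jumping to a double-tapped face, reduces to simple rectangle arithmetic. A minimal sketch follows; the class and method names are illustrative, not from the patent.

```python
class CutoutFrame:
    """Virtual cutout frame: a rectangle similar to the corrected image
    at 1/4 its size, initially centred, movable but always kept inside."""

    def __init__(self, img_w, img_h):
        self.w, self.h = img_w // 2, img_h // 2   # half each side = 1/4 the area
        self.img_w, self.img_h = img_w, img_h
        self.x = (img_w - self.w) // 2            # initial position: centred
        self.y = (img_h - self.h) // 2

    def _clamp(self):
        self.x = min(max(self.x, 0), self.img_w - self.w)
        self.y = min(max(self.y, 0), self.img_h - self.h)

    def slide(self, dx, dy):
        """Move by the drag amount (any direction, including diagonal)."""
        self.x += dx
        self.y += dy
        self._clamp()

    def jump_to(self, fx, fy):
        """Centre the frame on a face position (double-tap behaviour)."""
        self.x = fx - self.w // 2
        self.y = fy - self.h // 2
        self._clamp()
```

The enlarged view on the reproduction screen is then simply the sub-image of the corrected image at (x, y, w, h), redrawn whenever the frame moves.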
Fig. 6 is a diagram for explaining the image management table 23C on the main apparatus 20 side.
The image management table 23C is a table for managing fisheye images (stored images) and the images related to them, and has the items "fisheye image ID", "shooting type", "link ID", "corrected image ID", "face image ID", and so on. The "fisheye image ID" item stores information (a serial number or the like) identifying a fisheye image file (not shown) in order to manage fisheye images; the illustrated example shows "A0001", "A0002", and so on. The "shooting type" item stores the type of shooting, indicating whether the image is a fisheye image shot at 180° or one shot at 360°; the illustrated example indicates that the images whose "fisheye image ID" is "A0002" or "A0003" form a group of fisheye images shot at 360°.
The "link ID" item stores the "fisheye image ID" of the other fisheye image of the group when the "shooting type" is 360° photography; in the illustrated example, "A0003" is stored in the "link ID" of the record whose "fisheye image ID" is "A0002", and "A0002" in the "link ID" of "A0003". The "corrected image ID" item stores information (a serial number or the like) identifying a corrected image file (not shown) so that the distortion-corrected image is managed in association with the fisheye image; the illustrated example shows "B0001", "B0002", and so on. The "face image ID" item stores information (a serial number or the like) identifying the face image file (not shown) of each person included in the fisheye image so that each face image is managed in association with the fisheye image; the illustrated example shows "C0001", "C0002", and so on.
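The table of fig. 6 can be modelled as records keyed by fisheye image ID, with the "link ID" resolving the partner image of a 360° pair; this also yields the combined face list shown on the reproduction screen of fig. 5(4). The sketch below uses the example IDs where the text gives them; "C0006" and "C0007" for the second image of the pair are hypothetical placeholders.

```python
# Illustrative in-memory model of the image management table 23C.
image_table = {
    "A0001": {"shooting_type": "180", "link_id": None,
              "corrected_id": "B0001", "face_ids": ["C0001"]},
    "A0002": {"shooting_type": "360", "link_id": "A0003",
              "corrected_id": "B0002", "face_ids": ["C0004", "C0005"]},
    "A0003": {"shooting_type": "360", "link_id": "A0002",
              "corrected_id": "B0003", "face_ids": ["C0006", "C0007"]},
}

def faces_for_display(fisheye_id):
    """Collect the face IDs of the selected image and, for a 360-degree
    pair, those of the linked partner image (selected image first)."""
    rec = image_table[fisheye_id]
    faces = list(rec["face_ids"])
    if rec["link_id"] is not None:
        faces += image_table[rec["link_id"]]["face_ids"]
    return faces
```

A single lookup plus one link hop is all the reproduction screen needs to assemble the list of faces across both 180° shots.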
Next, the operation of the image processing apparatus (digital camera) according to embodiment 1 will be outlined with reference to the flowcharts of fig. 7 to 9. Each function described in these flowcharts is stored in the form of readable program code, and the operations corresponding to that program code are executed sequentially. Operations corresponding to program code transmitted via a transmission medium such as a network can likewise be executed sequentially; that is, the operations unique to the present embodiment can be executed using a program or data supplied externally through a transmission medium, in addition to those on a recording medium. Fig. 7 to 9 are flowcharts outlining the operations of the characteristic portions of embodiment 1 within the overall operation of the digital camera; on exiting the flows of fig. 7 to 9, control returns to the main flow (not shown) of the overall operation.
Fig. 7 is a flowchart for explaining the operation on the main body device 20 side (the operation of the characteristic portion of the present embodiment) when performing 360° photography. 360° photography is a menu item arbitrarily selected by a user operation from a menu screen for selecting various photography modes; when the menu item for 360° photography is selected, the operation of fig. 7 starts.
First, in the standby state awaiting a release operation (step A1), when the release operation is performed (YES in step A1), the control unit 21 on the main body device 20 side performs development processing on the fisheye image received from the imaging device 10 to generate a captured image, performs image compression processing to convert it into a standard file format (step A2), and records and stores it on the recording medium of the storage unit 23 (step A3). The two fisheye images obtained by 360° photography are recorded and stored as a set; a new "fisheye image ID", for example "A0002", is generated and additionally registered in the image management table 23C.
Then, the control unit 21 registers "360° photography" as the "photography type" corresponding to the new "fisheye image ID" in the image management table 23C and performs a process of registering the corresponding "link ID" (step A4); the "link ID" is registered at the time of the second shot of the 360° photography. Next, the control unit 21 performs distortion correction on the fisheye image to generate a corrected image (step A5) and records the corrected image in the recording medium of the storage unit 23; at this time, for example, "B0002" is generated as a new "corrected image ID" and registered in the image management table 23C in association with "A0002" of the "fisheye image ID" (step A6).
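The distortion-correction algorithm of step A5 is not detailed in the text. As a non-authoritative sketch, one common approach remaps an equidistant-projection fisheye (assumed here: a square frame fully covered by a 180° image circle) to a perspective view along the lens axis; the function name, field of view, and nearest-neighbour sampling are illustrative assumptions, not the patented method:

```python
import numpy as np

def fisheye_to_perspective(fish, out_size=200, fov_deg=90.0):
    """Remap an equidistant-projection fisheye image (square, 180-degree
    image circle filling the frame) to a perspective view looking along
    the lens axis. Nearest-neighbour sampling for brevity."""
    h, w = fish.shape[:2]
    cx, cy, r = w / 2.0, h / 2.0, min(w, h) / 2.0
    # Pinhole focal length giving the requested field of view.
    f = (out_size / 2.0) / np.tan(np.radians(fov_deg) / 2.0)

    # Build a ray for every output pixel of the perspective view.
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    x = xs - out_size / 2.0
    y = ys - out_size / 2.0
    theta = np.arctan2(np.hypot(x, y), f)  # angle from the optical axis
    phi = np.arctan2(y, x)                 # azimuth around the axis

    # Equidistant model: radius in the image circle is proportional to theta.
    rho = r * theta / (np.pi / 2.0)
    src_x = np.clip((cx + rho * np.cos(phi)).astype(int), 0, w - 1)
    src_y = np.clip((cy + rho * np.sin(phi)).astype(int), 0, h - 1)
    return fish[src_y, src_x]
```

A calibrated lens model and bilinear interpolation would normally replace the idealized equidistant assumption and the nearest-neighbour lookup.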
Further, the control unit 21 analyzes the entire corrected image to detect faces of persons (step A7). It is then checked whether a face has been detected (step A8); if no face can be detected (no in step A8), the procedure proceeds to step A11, where it is checked whether the 360° photography is completed. Since only the first shot of the 360° photography is completed at this point (no in step A11), the process returns to step A1 described above and enters a standby state until a release operation instructing the second shot of the 360° photography is performed.
When faces of persons are detected in the corrected image (yes in step A8), a face image is cut out for each person along the outline of the person (step A9). In this case, face images are produced for all persons included in the corrected image; even when a face cannot be detected with high accuracy because its distortion is too large despite the distortion correction, such as for a person located in the peripheral portion of the fisheye image, a face image may still be generated if there is a possibility that a person is present. After each face image cut out per person in this manner is adjusted to a predetermined shape and size, it is recorded in association with the fisheye image and stored in the recording medium of the storage unit 23; in this case, for example, "C0004" and "C0005" are created as "face image IDs" and registered in the image management table 23C in association with "A0002" of the "fisheye image ID" (step A10). Then, the above-described operations (steps A1 to A11) are repeated until the 360° photography is completed, and when the 360° photography is completed (yes in step A11), the flow of fig. 7 is exited.
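Steps A9 and A10 can be sketched as below, assuming face bounding boxes are already available from some detector (the text does not specify one); the cropping, normalization to a predetermined size, and registration of new "face image IDs" against the "fisheye image ID" follow the "C0004"/"C0005" example, with all function names and parameters illustrative:

```python
import numpy as np

def cut_out_faces(corrected, boxes, out_hw=(64, 64)):
    """Crop each detected face region from the corrected image and
    normalize it to a fixed shape (step A9). `boxes` is a list of
    (x, y, w, h) tuples from a face detector (assumed given here)."""
    oh, ow = out_hw
    faces = []
    for x, y, w, h in boxes:
        crop = corrected[y:y + h, x:x + w]
        # Nearest-neighbour resize to the predetermined size.
        ry = np.arange(oh) * crop.shape[0] // oh
        rx = np.arange(ow) * crop.shape[1] // ow
        faces.append(crop[ry][:, rx])
    return faces

def register_faces(table, fisheye_id, faces, next_no):
    """Register each face image under a new 'face image ID' linked to the
    'fisheye image ID' (step A10), mimicking image management table 23C.
    The ID format follows the 'C0004', 'C0005' examples."""
    for i, face in enumerate(faces):
        face_id = "C%04d" % (next_no + i)
        table.setdefault(fisheye_id, []).append((face_id, face))
    return table
```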
Fig. 8 and 9 are flowcharts showing the operation of the main apparatus 20 (the operation of the characteristic portion of the present embodiment) when reproducing a fisheye image (stored image).
First, when a user operation instructs reproduction of fisheye images (stored images), the control unit 21 reads the plurality of fisheye images (stored images) stored in the recording medium of the storage unit 23 (step B1), converts them into thumbnail images, and displays the thumbnails in a list (step B2). In this state, it is checked whether a touch operation designating an arbitrary thumbnail image on the list screen as a reproduction target is performed (step B3); if no touch operation is performed (no in step B3), it is checked whether a predetermined key of the operation unit 25 instructing the end of reproduction is operated (step B4). Here, if the end of reproduction is not instructed (no in step B4), the process returns to step B3 described above; if the end of reproduction is instructed (yes in step B4), the flow of fig. 8 and 9 is exited.
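The thumbnail conversion and list display of step B2 might look as follows; the thumbnail size, three-column grid, and nearest-neighbour scaling are assumptions for illustration:

```python
import numpy as np

def make_thumbnail_sheet(images, thumb_hw=(32, 32), cols=3):
    """Downscale each stored fisheye image and lay the thumbnails out in
    a grid for the list screen (step B2). Nearest-neighbour scaling and
    a three-column grid are assumed."""
    th, tw = thumb_hw
    thumbs = []
    for img in images:
        ry = np.arange(th) * img.shape[0] // th
        rx = np.arange(tw) * img.shape[1] // tw
        thumbs.append(img[ry][:, rx])
    rows = (len(thumbs) + cols - 1) // cols
    sheet = np.zeros((rows * th, cols * tw) + thumbs[0].shape[2:],
                     thumbs[0].dtype)
    for i, t in enumerate(thumbs):
        r, c = divmod(i, cols)
        sheet[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = t
    return sheet
```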
When the position of any thumbnail image is touched while the plurality of fisheye images are displayed in a matrix as thumbnail images (yes in step B3), the control unit 21 designates that fisheye image as the reproduction target (step B5), reads the corrected image stored in association with the designated image (fisheye image) (step B6), and cuts out the image within a rectangular area (cutout frame) from the corrected image (step B7). At the point in time when the fisheye image is selected, the cutout frame is set at its initial position (the central portion) on the corrected image, so the image of the central portion of the corrected image is cut out. Then, the cut-out image is enlarged to a given size and displayed in the central portion of the reproduction screen (step B8).
Next, each face image stored in association with the above-described designated image (fisheye image) is read (step B9), and the face images are displayed together on the reproduction screen (step B10). In this case, as shown in fig. 5(4), the face images are arranged in a horizontal row in the lower part of the reproduction screen and displayed in a list, the arrangement order being the arrangement order of the persons in the fisheye image as described above. Then, it is checked whether another fisheye image grouped with the designated image, that is, the other fisheye image of the 360° photography, remains undesignated (step B11); if an undesignated fisheye image remains (yes in step B11), it is designated as the reproduction target (step B12), the process returns to step B9, and each face image stored in association with the newly designated image (fisheye image) is read out and added to the list on the reproduction screen (step B10). When all of the fisheye images captured by the 360° photography have been designated (no in step B11), the process proceeds to the flow of fig. 9.
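The ordering rule above (face images listed in the order the persons appear in the fisheye image) could be realized by sorting on a stored horizontal face position; the (face, x-position) entry format is an assumption:

```python
def order_faces_for_list(face_entries):
    """Arrange face images in the order the persons appear across the
    image (left to right), as used for the list row at the bottom of the
    reproduction screen (step B10). Each entry is (face_image, x_pos),
    where x_pos is an assumed stored horizontal face position."""
    return [face for face, _ in sorted(face_entries, key=lambda e: e[1])]
```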
When a touch operation is performed on the reproduction screen displayed on the touch display unit 26 (yes in step B13), it is checked whether the touch operation is a double-click operation on the face images displayed in the list (step B14) or a slide operation on the enlarged corrected image (image in the cutout frame) (step B15). When a double-click operation is performed on a face image (yes in step B14), the position including that face is specified within the corrected image, and the cutout frame is moved to the detected position (step B16). For example, as shown in fig. 5(4), when a double-click operation is performed on the face image of the girl, the girl's face is located in the corrected image, and the cutout frame is moved so that the face image comes to the center of the cutout frame. Then, the image in the cutout frame is cut out, enlarged, and displayed on the reproduction screen (step B17). Thereafter, the process returns to step B13.
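The jump of the cutout frame in step B16 amounts to re-centering the frame on the located face while keeping it inside the corrected image; a sketch with assumed pixel-coordinate conventions:

```python
def center_frame_on_face(face_xy, frame_wh, image_wh):
    """Move the cutout frame so the double-clicked face becomes the
    center of the frame (step B16), clamped so the frame stays within
    the corrected image. Coordinates are (x, y) pixel positions."""
    fx, fy = face_xy
    fw, fh = frame_wh
    iw, ih = image_wh
    x = min(max(fx - fw // 2, 0), iw - fw)
    y = min(max(fy - fh // 2, 0), ih - fh)
    return x, y
```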
Further, if the touch operation on the reproduction screen is a slide operation on the enlarged corrected image (image in the cutout frame) (yes in step B15), the slide direction and slide amount are detected (step B18), and the cutout frame is moved on the corrected image in accordance with them (step B19). Then, the image in the cutout frame is cut out from the corrected image and displayed on the reproduction screen in an enlarged manner (step B17), and the process returns to step B13 described above.
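The movement of steps B18 and B19 can be sketched as clamped translation of the cutout frame, so the frame never leaves the corrected image; the coordinate conventions are assumed:

```python
def move_cutout_frame(frame_xy, slide_dx, slide_dy, frame_wh, image_wh):
    """Move the cutout frame on the corrected image by the detected
    slide direction and amount (steps B18-B19), clamping so the frame
    never leaves the corrected image."""
    x, y = frame_xy
    fw, fh = frame_wh
    iw, ih = image_wh
    nx = min(max(x + slide_dx, 0), iw - fw)
    ny = min(max(y + slide_dy, 0), ih - fh)
    return nx, ny
```

The image inside the returned frame would then be cut out and enlarged for display (step B17).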
On the other hand, when, instead of a touch operation on the reproduction screen (no in step B13), a predetermined key of the operation unit 25 instructing switching to the other fisheye image of the 360° photography is operated (yes in step B20), the other of the two fisheye images of the 360° photography is designated as the reproduction target (step B21), the corrected image stored in correspondence with the designated image is read (step B22), the image in the cutout frame is cut out from the corrected image (step B23), and the cut-out image is enlarged and displayed on the reproduction screen (step B24). Thereafter, the process returns to step B13. If a predetermined key of the operation unit 25 instructing the end of reproduction is operated (yes in step B25), the flow of fig. 8 and 9 is exited.
As described above, in embodiment 1, the control unit 21 of the main body apparatus 20 performs the processing of recognizing a given subject included in the wide-angle image (fisheye image) captured by the image capturing apparatus 10, correcting the distortion of the image of the subject portion, and associating a part of the wide-angle image with the corrected image of the subject portion; the correspondence between the fisheye image and a given subject captured in it can therefore be clearly provided to the user.
The control unit 21 performs a process of correcting the distortion of the wide-angle image (fisheye image) captured by the imaging device 10, recognizing the given subjects (faces of persons) included in the corrected image, detecting an image (face image) of the subject portion for each subject (person), and associating a part of the corrected image with each detected subject-portion image (face image); a wide-angle image of a given subject can thereby be converted into a form in which the subject is easy to observe. For example, at the time of display, if the distortion-corrected image is displayed in association with the face image of each person, the image can be presented in a form that is easier to view than directly displaying the heavily distorted wide-angle image, and the user can easily confirm each person.
Since the control unit 21 enlarges and displays the image within a predetermined range (the cutout frame) of the corrected image and displays the detected images (face images) of the subject portions in a list, displaying a part of the corrected image at reproduction time presents the image in a form easier to view than the whole heavily distorted wide-angle image. That is, the user can confirm the subject by narrowing the wide-angle corrected image down to a portion of it, while still confirming all persons included in the wide-angle shot, including persons not shown in the displayed portion.
When displaying the detected images (face images) of the subject portions in a list, the control unit 21 arranges them in accordance with the arrangement order of the subjects (persons) on the wide-angle image, so the user can easily grasp who is located where merely by observing the list display of face images.
Since the control unit 21 moves the given range (cutout frame) arbitrarily in response to a user operation (slide operation) and displays the image cut out in accordance with the movement of the frame in an enlarged manner, the user can easily examine a desired portion of the corrected image down to its details.
Since the control unit 21 recognizes all given subjects (faces of persons) included in a wide-angle image group consisting of a plurality of wide-angle images captured while changing the shooting direction and detects an image (face image) of the subject portion for each subject, the faces of all persons present in the surroundings can be detected even when, for example, 360° imaging is divided into front 180° imaging and rear 180° imaging.
When a wide-angle image to be displayed is designated from a wide-angle image group consisting of a plurality of wide-angle images captured while changing the shooting direction, the control unit 21 displays the corrected image of the designated wide-angle image in association with the images (face images) of all the subject portions detected from the wide-angle image group; the user can therefore observe the plurality of images (corrected images) by switching their display.
Since the control unit 21 designates a wide-angle image arbitrarily selected by a user operation from the group of wide-angle images captured while changing the shooting direction as a display target, it is possible for the user to arbitrarily switch the designated image from among a plurality of images (corrected images) captured while changing the shooting direction to view the enlarged display of the image.
When an arbitrary wide-angle image is selected from the thumbnail images by a user operation in a state where the wide-angle images (fisheye images) are displayed in a list as thumbnail images, the control unit 21 enlarges and displays a corrected image of the selected wide-angle image and displays an image (face image) of each detected subject portion in a list, so that it is possible for the user to arbitrarily select a desired fisheye image from a large number of fisheye images and confirm the image in a form that is easy to observe.
Since the control unit 21 records and stores the corrected image and the image of each subject portion (face image) in association with each other, it is not necessary to perform distortion correction of the wide-angle image or a process of cutting out the face image from the corrected image every time image reproduction is performed.
(modification 1)
In embodiment 1 described above, a fisheye image arbitrarily selected by a user operation from the group of fisheye images captured while changing the shooting direction is designated as the display target, but the fisheye images may instead be selected sequentially, one by one in a predetermined order at a predetermined timing, with the corrected image display switched accordingly. This enables the user to observe each fisheye image (corrected image) in a predetermined order. In this case, for 360° photography, the face images of persons included in the currently displayed fisheye image (corrected image) and those of persons included in the undisplayed fisheye image (corrected image) can be displayed in the list so as to be distinguishable, which makes the correspondence between each fisheye image (corrected image) and the face images easy to confirm.
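The sequential selection of modification 1, together with the distinguishable listing of face images from the displayed versus undisplayed fisheye image, might be sketched as follows (the data shapes are illustrative assumptions):

```python
def slideshow_order(fisheye_ids, start=0):
    """Select each fisheye image of the group one by one in a
    predetermined order (modification 1); returns a picker that yields
    the id to display at each step, wrapping around the group."""
    def pick(step):
        return fisheye_ids[(start + step) % len(fisheye_ids)]
    return pick

def tag_face_list(face_table, displayed_id):
    """Flag which listed face images belong to the currently displayed
    fisheye image and which belong to the undisplayed one, so the list
    can render them distinguishably (e.g. dimmed vs. highlighted)."""
    return [(face_id, face, src_id == displayed_id)
            for src_id, faces in face_table.items()
            for face_id, face in faces]
```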
(modification 2)
In embodiment 1 described above, the face images of all persons included in the group of fisheye images captured while changing the shooting direction are displayed in a list, but a specific face image may be displayed distinguishably. For example, a person included in the enlarged part of the corrected image may be specified, and only that person's face image may be displayed distinguishably (for example, in a different color or enlarged). This makes it possible to distinguish, in the list, persons included in the currently enlarged part of the corrected image from persons included in the undisplayed part. Here, to confirm in the enlarged display a person included in the undisplayed part, the cutout frame may be moved so as to cut out that person.
(modification 3)
Although the circumferential fisheye lens (full-circumference fisheye lens) is exemplified as the fisheye lens 16B in embodiment 1 described above, a diagonal fisheye lens may be used. In the above-described embodiment, 360° photography is performed with one fisheye lens 16B divided into two shots, front 180° and rear 180°; if two fisheye lenses 16B are provided, for example on both the front surface and the rear surface of the camera housing, the front 180° shot and the rear 180° shot can be performed simultaneously in response to a 360° photography instruction. Further, 360° photography may also be performed by shooting a plurality of times with an ultra-wide-angle lens.
(modification 4)
In embodiment 1 described above, the control unit 21 of the main apparatus 20 receives the fisheye image captured by the imaging apparatus 10, corrects the distortion of the entire image, identifies the given subjects contained in the corrected image, detects an image of the subject portion for each subject, and then associates a part of the corrected image with each detected subject-portion image. However, the present invention is not limited to this; the control unit 21 of the main apparatus 20 may receive and acquire the fisheye image captured by the imaging apparatus 10 without correcting its distortion, identify the given subjects included in the fisheye image, detect an image of the subject portion for each subject, and then associate a part of the fisheye image with each detected subject-portion image.
In this case, since distortion remains in the detected images of the subject portions, distortion correction may be performed on each subject-portion image at a predetermined timing, for example, at the same time as or before the above-described association process, or at the same time as or before the subject portion is cut out for each person. As the process of associating a part of the still-distorted fisheye image with the corrected subject-portion images, for example, displaying the part of the fisheye image on the reproduction screen together with a list of the corrected subject-portion images can clearly provide the user with the correspondence between the fisheye image and the given subjects captured in it, as in the above-described embodiment.
(modification 5)
Although in embodiment 1 described above, an image within a predetermined range cut out from the corrected image recorded and stored in association with the fisheye image selected as the reproduction target is displayed, the present invention is not limited to this, and the fisheye image selected as the reproduction target itself may be displayed. That is, the distorted fisheye image and the detected images (face images) of the respective subject portions may be displayed in a list. In this case, as in the above-described embodiment, the correspondence between the fisheye image and a predetermined subject captured in the fisheye image can be clearly provided to the user.
(modification 6)
In embodiment 1 described above, when a wide-angle image to be displayed is designated by a user operation from a wide-angle image group consisting of a plurality of wide-angle images captured while changing the shooting direction, a corrected image of the designated wide-angle image is displayed in association with the images (face images) of all the subject portions detected from the wide-angle image group; however, the designated wide-angle image itself (with its distortion remaining) may instead be displayed in association with those images (face images). In this way, the user can observe the plurality of wide-angle images by switching their display.
In addition, although the above-described embodiment 1 employs a configuration in which images within a predetermined range (cut frame) of the corrected image are displayed in an enlarged manner and images (face images) of each detected subject portion are displayed in a list, the present invention is not limited to this configuration, and the entire corrected image stored in a manner corresponding to the fisheye image selected as the reproduction target and the images (face images) of each detected subject portion may be displayed in a list.
Although the fisheye image, the corrected image, and the face image of each person are recorded and stored in association with each other in embodiment 1 described above, they may instead be output to the outside via a removable memory (recording medium) such as an SD card or USB memory, or via a communication unit, and the corrected image and the face images may be displayed in association with each other on an external device.
Although in embodiment 1 described above, a part of the corrected image (image within the cutout frame) in which the distortion of the fisheye image is corrected is enlarged and displayed on the reproduction screen, the entire corrected image may be displayed on the reproduction screen.
In embodiment 1 described above, 360 ° photographing is performed by dividing two photographing operations of front 180 ° photographing and rear 180 ° photographing, and the plurality of fisheye images are set as a group, but the range of photographing using the fisheye lens 16B is not limited to 180 ° photographing, and is arbitrary. It is to be noted that the present invention is not limited to 360 ° imaging, and can be applied to an image captured in a 180 ° range.
In the above-described embodiment, when a double-click operation is performed on an arbitrary face image, the control unit 21 detects the position including the face image in the corrected image and moves (jumps) the cutout frame to the detected position, thereby causing the image displayed in an enlarged manner on the reproduction screen (the image in the cutout frame) to include the face image. Alternatively, a configuration may be adopted in which the position on the corrected image that includes the arbitrary face image is moved in accordance with a slide amount, and the image at the movement destination on the corrected image is displayed in place of the arbitrary face image. In this case, when a double-click operation is performed on the movement-destination image displayed in place of the face image, the control unit 21 detects the position including that image in the corrected image and moves the cutout frame there, thereby causing the movement-destination image to be included in the image displayed in an enlarged manner on the reproduction screen.
(embodiment 2)
Embodiment 2 of the present invention will be described below with reference to fig. 10 and 11.
In embodiment 1 described above, when a plurality of fisheye images (stored images) are reproduced and the display position of any thumbnail image is touched while the fisheye images are displayed on the thumbnail list screen, the display is switched to a part of the corrected image (the image in the cutout frame) corresponding to that fisheye image. In embodiment 2, by contrast, when a fisheye image (thumbnail image) is touched, the display is switched to a list screen of the face images of the persons corresponding to that fisheye image.
Further, in embodiment 1, when a face image displayed together with the corrected image is touched, the position including the face image is detected in the corrected image and the cutout frame is moved to the detected position, thereby causing the face image to be included in the image in the cutout frame; in embodiment 2, when an arbitrary face image is touched on the face-image list screen, the display is switched to the corrected image corresponding to the fisheye image containing that face image (the fisheye image selected on the thumbnail list screen). Here, portions that are substantially the same or identically named in the two embodiments are given the same reference numerals and their description is omitted; hereinafter, the description centers on the characteristic portions of embodiment 2.
In the operation at the time of photographing in embodiment 2, as in embodiment 1 described above, when a fisheye image is developed and recorded and stored in a recording medium of the storage unit 23, a correction image is generated by performing a process of correcting distortion for the fisheye image, and the correction image is recorded and stored in association with the fisheye image. Then, the face of each person included in the corrected image is detected, and a face image (corrected image) is generated by cutting out the face portion for each person, and is recorded and stored in association with the fisheye image.
Fig. 10 is a flowchart showing the operation of the main apparatus 20 (the operation of the characteristic portion of embodiment 2) when reproducing a fisheye image (stored image).
First, when a user operation instructs reproduction of fisheye images (stored images), the control unit 21 reads the plurality of fisheye images (stored images) stored in the recording medium of the storage unit 23 (step C1), converts them into thumbnail images, and displays the thumbnails in a list (step C2). Fig. 11(1) shows the thumbnail list screen of fisheye images; the circles arranged in a 3 × 3 matrix in the figure represent thumbnail images of fisheye images. Fig. 11(2) is an enlarged view of a part of the thumbnail list screen.
While such a thumbnail list screen is displayed, it is checked whether a predetermined touch operation (for example, a double-tap operation) is performed at the display position of any thumbnail image (step C3). If no double-tap operation (two consecutive taps) is performed on a thumbnail image (no in step C3), the process proceeds to step C9; if a double-tap operation is performed (yes in step C3), the stored face images (corrected images) of the persons recorded in association with the double-tapped fisheye image are read (step C4), and the display is switched to show these face images enlarged on the reproduction screen (step C5). Fig. 11(3) shows the screen on which the face images of the persons are displayed in a list; face images cut out in a rectangular shape, for example, are arranged in the same order as in embodiment 1 described above.
While the face-image list screen is displayed in this manner, it is checked whether a predetermined touch operation (for example, a double-tap operation) is performed at the display position of any face image (step C6). If no double-tap operation is performed on a face image (no in step C6), the flow shifts to step C9; when a double-tap operation is performed (yes in step C6), the corrected image recorded and stored in correspondence with the fisheye image containing the double-tapped face image is read out (step C7), and the display is switched to that corrected image (full-face display) (step C8).
In this case, the fisheye image selected on the thumbnail list screen is identified as the fisheye image containing the face image designated by the touch, and the corrected image corresponding to that fisheye image is read and displayed in a switched manner. Fig. 11(4) illustrates the case where, when the face image of the girl is double-tapped as shown in fig. 11(3), a part of the corrected image corresponding to the fisheye image containing that face image is cut out and enlarged, and an enlarged image centered on the girl's face is displayed.
Fig. 11(5) is a diagram explaining another display example, different from that of fig. 11(4). In fig. 11(4), the fisheye image selected on the thumbnail list screen is identified as the fisheye image containing the face image designated by the touch, whereas in fig. 11(5) another fisheye image containing that face image is identified instead, and a part of the corrected image corresponding to that fisheye image is cut out and displayed in an enlarged manner. In this case, the image cut out from the corrected image centered on the face image is enlarged and displayed in a switched manner (full-face display) so that the face image designated by the touch is positioned at the center.
Further, although the fisheye image selected on the thumbnail list screen is preferentially identified as the fisheye image containing the face image designated by the touch, whether to give priority to that fisheye image or to another fisheye image containing the touched face image may be set in advance by a user operation. The fisheye image and the other fisheye image may also be switched by a user operation, or switched automatically after a certain time. Further, if a plurality of other such fisheye images are stored, they may be designated in sequence by a switching operation, and the corrected image corresponding to the designated fisheye image may be read out and displayed in a switched manner.
In the state where the corrected image is switched and displayed in this manner, it is checked whether or not a return instruction operation is performed (step C9), and if the return instruction operation is not performed (no in step C9), it is checked whether or not another instruction operation is performed (step C10). Here, when another instruction operation (for example, an operation for instructing image editing) is performed (yes in step C10), the process proceeds to the processing corresponding to the other instruction, and when a return instruction operation is performed (yes in step C9), the process returns to step C3 described above.
As described above, in embodiment 2, when an arbitrary wide-angle image is selected by a user operation while the wide-angle images are displayed in a list as thumbnail images, the images of the given subject portions included in that wide-angle image are displayed in a list; therefore, even though a wide-angle image is distorted as a whole when displayed as a thumbnail, the user can easily confirm the subjects visually. That is, since the distortion of the entire wide-angle image (fisheye image) is large, it is difficult to accurately grasp who has been photographed by observing the thumbnail alone, and confirming the faces captured in the image one by one by designating and enlarging each thumbnail through user operations would be time-consuming and burdensome for the user.
In addition, when an arbitrary face image is selected by a user operation while the face images are displayed in a list, a corrected image in which the distortion of the wide-angle image containing that face image has been corrected is displayed, so the user can view a wide range of the image without distortion.
In embodiment 2 described above, a corrected image is generated by correcting the distortion of the captured fisheye image, the face image of each person is extracted from the corrected image, the corrected image and the face images are recorded and stored in association with the fisheye image, and the face images corresponding to a fisheye image are read and displayed in a list when that fisheye image is reproduced. However, the corrected image and the face images may instead be generated at the time of reproduction. That is, while thumbnails of fisheye images are displayed, distortion correction may be performed on an arbitrarily selected fisheye image to generate a corrected image, and the face image of each person may then be extracted from it. Further, rather than cutting out the face portions after generating the corrected image, the face image of each person may be cut out from the fisheye image first and its distortion corrected afterwards.
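The reproduction-time variant described above can be sketched as on-demand generation with a cache, so that distortion correction and face extraction run only once per fisheye image; the callables here stand in for the unspecified correction and extraction steps:

```python
def get_face_images(fisheye_id, cache, load_fisheye, correct, extract_faces):
    """Generate the corrected image and per-person face images lazily at
    reproduction time, caching the result so that repeated display of
    the same fisheye image does not repeat the processing. The three
    callables are assumed stand-ins for loading, distortion correction,
    and face extraction."""
    if fisheye_id not in cache:
        corrected = correct(load_fisheye(fisheye_id))
        cache[fisheye_id] = (corrected, extract_faces(corrected))
    return cache[fisheye_id]
```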
Although the fisheye image itself is displayed as the thumbnail image in embodiment 2 described above, a corrected image in which distortion of the fisheye image is corrected may be displayed as the thumbnail image instead. Even in this case, when a thumbnail image (corrected image) is selected by a user operation, the screen is switched to a screen on which the face images of the persons included in the image are displayed in a list, in the same manner as in embodiment 2.
In embodiment 2 described above, 360° photographing may be performed by dividing it into two photographing operations, front 180° photographing and rear 180° photographing, and the plurality of fisheye images may be handled as a group. However, the present invention is not limited to 360° photographing and may also be applied to a captured image photographed in a 180° range (180° photographing).
In the above-described embodiment, when a double-click operation is performed on an arbitrary face image, the control unit 21 detects the position including that face image in the corrected image and moves (jumps) the cutout frame to the detected position, thereby causing the enlarged image displayed on the reproduction screen (the image in the cutout frame) to include the face image. Alternatively, a configuration may be adopted in which the position within the corrected image that includes the arbitrary face image is moved in accordance with the amount of a slide operation, and the image at the movement-target position of the corrected image is displayed in place of the arbitrary face image. In this case, when a double-click operation is performed on the movement-target image displayed in place of the arbitrary face image, the control unit 21 detects the position including that movement-target image in the corrected image and moves the cutout frame to the detected position, thereby causing the movement-target image to be included in the image displayed in an enlarged manner on the reproduction screen.
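The jump of the cutout frame to a position containing the double-clicked face can be sketched as a centering-and-clamping computation. The function below is an illustrative assumption: the argument layout, the `face_box` representation, and the clamping policy are not specified by the embodiment, which only states that the frame is moved to a position containing the face image.

```python
def jump_cutout_frame(img_w, img_h, face_box, frame_w, frame_h):
    """Move the cutout frame so it is centered on the detected face,
    clamped so the frame stays inside the corrected image.

    face_box is (x, y, w, h) in corrected-image coordinates.
    Returns the (left, top) corner of the jumped cutout frame.
    """
    fx, fy, fw, fh = face_box
    # Center the frame on the center of the face box.
    left = fx + fw // 2 - frame_w // 2
    top = fy + fh // 2 - frame_h // 2
    # Clamp so the frame never extends past the image border.
    left = max(0, min(left, img_w - frame_w))
    top = max(0, min(top, img_h - frame_h))
    return (left, top)
```

The enlarged image shown on the reproduction screen would then be the region of the corrected image covered by the returned frame.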
(embodiment 3)
Embodiment 3 of the present invention will be described below with reference to fig. 12 to 14.
In embodiment 3, when a plurality of fisheye images are converted into thumbnail images and displayed in a list, the face images of the persons stored in association with each fisheye image are displayed in a list in the vicinity of its thumbnail image. That is, the face images of the persons associated with each thumbnail image are displayed together at the periphery of that image. Here, portions that are substantially the same as, or have the same names as, those of the above embodiments are given the same reference numerals and their description is omitted; hereinafter, the description will center on the characteristic portions of embodiment 3. In embodiment 3, for example, a fisheye image obtained when vertically photographing 180° forward as shown in fig. 4(1) and a fisheye image obtained when vertically photographing 180° rearward as shown in fig. 4(2), that is, two fisheye images obtained by two photographing operations (360° photographing) covering 180° forward and 180° rearward, are recorded and stored as a set of images.
Fig. 12 is a flowchart showing the operation of the main apparatus 20 (the operation of the characteristic portion of embodiment 3) when reproducing a fisheye image (stored image).
First, when reproduction of fisheye images is instructed by a user operation, the control section 21 reads out the plurality of stored fisheye images (step D1), converts them into thumbnail images, and displays the thumbnail images in a list (step D2). Then, for each fisheye image, the face images of the persons stored in association with that fisheye image are read out (step D3), and the corresponding face images are displayed in the vicinity (periphery) of that fisheye image (thumbnail image) (step D4).
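Steps D1 to D4 can be sketched as follows. The `store` mapping and the helper `make_thumbnail` are hypothetical stand-ins for the recording media and the thumbnail conversion performed by the device; they are not named in the embodiment.

```python
def make_thumbnail(image, size=(96, 96)):
    """Placeholder for thumbnail conversion (step D2); the device
    scales each fisheye image down for the list screen."""
    return {"source": image, "size": size}

def build_thumbnail_list(store):
    """Sketch of steps D1-D4: read every stored fisheye image,
    pair it with the face images recorded in association with it,
    and return the entries to be laid out on the list screen.

    `store` is a hypothetical mapping from a fisheye-image id to a
    record holding the image data and its associated face images.
    """
    entries = []
    for image_id, record in store.items():          # step D1: read images
        thumb = make_thumbnail(record["fisheye"])   # step D2: thumbnail
        faces = record.get("faces", [])             # step D3: read faces
        entries.append({"id": image_id,
                        "thumbnail": thumb,
                        "faces": faces})            # step D4: shown together
    return entries
```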
Fig. 13(1) is a diagram showing an example of a thumbnail list screen of fisheye images, in which the circles arranged in a 3 × 3 matrix represent the thumbnail images of the fisheye images, and the vertically long small circles (ellipses) arranged around each fisheye image (thumbnail image) represent the face images of the persons corresponding to that fisheye image. Fig. 13(2) is a diagram in which a part of the thumbnail list screen is enlarged; the illustrated example shows a state in which a thumbnail image captured at an angle other than 360°, that is, at 180°, is displayed, and the face images of the persons included in that fisheye image are displayed in a list in its periphery.
Fig. 14(1) is a diagram showing another example of a thumbnail list screen of fisheye images, in which fisheye image A is a fisheye image obtained when vertically photographing 180° forward as shown in fig. 4(1) and fisheye image B is a fisheye image obtained when vertically photographing 180° rearward as shown in fig. 4(2). In the case of a group of fisheye images obtained by dividing a 360° shot into two shots of 180° forward and 180° rearward, the face images of the persons included in a given fisheye image and the face images of the persons included in the other fisheye image of the group are arranged and displayed together in the periphery of its thumbnail image.
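The order in which the faces of a grouped pair are gathered for display around one thumbnail can be sketched as below. Representing the group as a list of (image id, faces) pairs is an illustrative assumption; the embodiment only states that the faces of the selected image and of the other grouped image are displayed together.

```python
def faces_for_thumbnail(group, member_id):
    """For a grouped pair of fisheye images (e.g. front 180° and
    rear 180° of a divided 360° shot), collect the face images of
    the selected member first, followed by the faces of every other
    member of the same group, so that the thumbnail of either image
    shows all the faces captured by the 360° shot.
    """
    own, others = [], []
    for image_id, faces in group:
        if image_id == member_id:
            own.extend(faces)        # faces of the thumbnailed image
        else:
            others.extend(faces)     # faces of the grouped partner(s)
    return own + others
```

With this ordering, the thumbnail of fisheye image A lists A's faces first and B's after them, and vice versa for B's thumbnail.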
Fig. 14(2) is a diagram in which a part of the thumbnail list screen is enlarged to show the thumbnail display of fisheye image A. In this case, in addition to the face images of the mother and daughter captured in fisheye image A, the face images of the mother and daughter captured in the other fisheye image B grouped with fisheye image A are also displayed in a list around the thumbnail image. Although not shown, fisheye image B is likewise displayed as a thumbnail image, and the face images of the mother and daughter captured in fisheye image B, together with those captured in the other fisheye image A grouped with it, are also displayed in a list in the vicinity of that thumbnail image.
When the face images are arranged and displayed in the vicinity of a thumbnail image, the size of each face image is adjusted so that the face images are arranged with good balance at equal intervals, as shown in fig. 13(2) and 14(2). In a state where the thumbnail list screen is displayed, it is checked whether or not the display position of a face image on the list screen is touched (double-clicked) (step D5). Here, if the display position of a face image is not double-clicked (NO in step D5), the process proceeds to step D8; if it is double-clicked (YES in step D5), the corresponding fisheye image is identified from the double-clicked face image, the corrected image recorded and stored in association with that fisheye image is read out (step D6), and the corrected image is displayed in place of the thumbnail list screen (step D7). In this case, as in embodiment 2 described above, the fisheye image containing the face image designated by the touch is specified, whether it is the fisheye image selected on the thumbnail list screen or another fisheye image grouped with it, and the corrected image corresponding to the specified fisheye image is read out and displayed in a switched manner.
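One way to realize the "equal intervals with good balance" arrangement is to place the uniformly resized face images at equal angular steps on a circle around the thumbnail. The circular geometry below is only an illustrative reading of the embodiment, which does not fix the exact layout.

```python
import math

def arrange_faces_around(cx, cy, radius, n_faces, face_size):
    """Place n face images at equal angular intervals on a circle of
    the given radius around the thumbnail centered at (cx, cy),
    scaling every face to a common size so the ring looks balanced.

    Returns a list of placement records with the face center and the
    adjusted display size.
    """
    positions = []
    for i in range(n_faces):
        angle = 2 * math.pi * i / n_faces    # equal angular interval
        x = cx + radius * math.cos(angle)
        y = cy + radius * math.sin(angle)
        positions.append({"center": (round(x, 3), round(y, 3)),
                          "size": face_size})
    return positions
```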
Fig. 13(3) shows the case where, when the face image of the child is double-clicked as shown in fig. 13(2), the corrected image including that face image is displayed in an enlarged manner. Fig. 14(3) likewise shows the corrected image including the double-clicked child's face image displayed in an enlarged manner; however, in fig. 13(3) the corrected image corresponds to the image displayed as the thumbnail, whereas in fig. 14(3) it corresponds to image B, which is grouped with the image A displayed as the thumbnail.
In the state where the corrected image is switched and displayed in this manner, it is checked whether or not a return instruction operation is performed (step D8), and if the return instruction operation is not performed (no in step D8), it is checked whether or not another instruction operation is performed (step D9). Here, when another instruction operation (for example, an operation to instruct image editing) is performed (yes in step D9), the process proceeds to the process corresponding to the other instruction, and when a return instruction operation is performed (yes in step D8), the process returns to step D2 described above, and the thumbnail list screen is displayed.
As described above, in embodiment 3, the wide-angle images are displayed in a list as thumbnail images, and the images of the subject portions cut out from each wide-angle image are displayed in correspondence with its thumbnail image. Therefore, even though a wide-angle image that is distorted as a whole is displayed as a thumbnail in its original state, the user can easily visually confirm the subjects.
Further, when a thumbnail image is displayed for any one of the wide-angle images in a wide-angle image group in which a plurality of wide-angle images captured while changing the shooting direction are grouped, the control unit 21 causes the thumbnail image to be displayed in association with the images of all the subject portions included in the wide-angle image group. The user can therefore confirm the images of all the subject portions included in the wide-angle image group merely by checking the thumbnail display of any one wide-angle image in the group. For example, when 360° photographing is performed by dividing it into front 180° photographing and rear 180° photographing, the thumbnail of the fisheye image captured at the front 180° displays not only the face images of the persons included in that fisheye image but also the face images of the persons included in the fisheye image captured at the rear 180°. Therefore, the face images of the persons included in the rear fisheye image can be confirmed without checking its thumbnail display.
In embodiment 3 described above, a corrected image is generated by performing a process of correcting distortion of the captured fisheye image, a face image of each person is extracted from the corrected image, the corrected image and the face images are recorded and stored in association with the fisheye image, the face images associated with each fisheye image are read out and displayed in a list when thumbnails of the fisheye images are displayed, and the corrected image including a face image is read out and displayed when an arbitrary face image is designated by the user from the list display. However, the corrected image and the face image of each person may instead be generated when the thumbnails of the fisheye images are displayed. That is, when displaying a thumbnail of a fisheye image, a corrected image may be generated by performing the process of correcting distortion of the fisheye image, and the face image of each person may then be cut out from that corrected image. Further, the processing is not limited to cutting out the face portion of each person after the corrected image is generated; the face image may also be generated by cutting out the face of each person directly from the fisheye image and correcting its distortion.
Although in embodiment 3 described above 360° photographing is performed with the single fisheye lens 16B by dividing it into two photographing operations, front 180° photographing and rear 180° photographing, if two fisheye lenses 16B are provided, for example on both the front surface and the rear surface of the camera housing, the front 180° photographing and the rear 180° photographing can be performed simultaneously in accordance with a 360° photographing instruction. Further, 360° photographing may be performed by photographing a plurality of times using an ultra-wide-angle lens.
In the above-described embodiment, when a double-click operation is performed on an arbitrary face image, the control unit 21 detects the position including that face image in the corrected image and moves (jumps) the cutout frame to the detected position, thereby causing the image displayed in an enlarged manner on the reproduction screen (the image in the cutout frame) to include the face image. Alternatively, a configuration may be adopted in which the position within the corrected image that includes the arbitrary face image is moved in accordance with the amount of a slide operation, and the image at the movement-target position of the corrected image is displayed in place of the arbitrary face image. In this case, when a double-click operation is performed on the movement-target image displayed in place of the arbitrary face image, the control unit 21 detects the position including that movement-target image in the corrected image and moves the cutout frame to the detected position, thereby causing the movement-target image to be included in the image displayed in an enlarged manner on the reproduction screen.
In the above-described embodiments, the case where the image processing apparatus is applied to a digital camera has been described, but the image processing apparatus is not limited to this, and may be, for example, a personal computer, a PDA (personal digital assistant), a tablet terminal apparatus, a mobile phone such as a smartphone, an electronic game machine, a music player, or the like.
The "device" or "section" shown in the above embodiments may be separated into a plurality of housings according to the function, and is not limited to a single housing. The steps described in the above-described flowcharts are not limited to time-series processing, and a plurality of steps may be processed in parallel or may be processed individually and independently.
In the above-described embodiments, the fisheye image or the wide-angle image is displayed as the thumbnail image, but an image corresponding to the fisheye image or the wide-angle image, such as an image in which distortion in a predetermined range of the fisheye image or the wide-angle image is corrected, may be displayed as the thumbnail image.
Although the embodiments of the present invention have been described above, the present invention is not limited to these embodiments, and the invention described in the claims and the equivalent scope thereof are also included.

Claims (21)

1. An image processing apparatus is characterized by comprising:
a detection unit that recognizes an object included in a wide-angle image captured at a wide angle and detects an image of an object portion thereof;
a correction unit that corrects distortion of an image of the object portion;
a display unit; and
a processing unit that performs a process of associating a file of an image of at least a part of the wide-angle image with a file of an image of a subject portion whose distortion is corrected by the correcting unit,
the processing unit performs the process of establishing the association by causing an image of at least a part of the wide-angle image to be displayed on the display unit and causing an image of the subject portion whose distortion is corrected to be displayed on the display unit.
2. The image processing apparatus according to claim 1,
the correction unit further corrects distortion of the wide-angle image,
the processing unit performs processing of associating a file of an image of at least a part of the wide-angle image whose distortion is corrected with a file of an image of the subject part whose distortion is corrected.
3. The image processing apparatus according to claim 1,
the processing unit performs a process of associating a file of an image of at least a part of the wide-angle image having distortion with a file of an image of the subject part after distortion correction.
4. The image processing apparatus according to claim 1,
the processing unit performs the process of establishing the association by simultaneously displaying an image of at least a part of the wide-angle image and an image of the subject portion whose distortion is corrected on the display unit.
5. The image processing apparatus according to claim 1,
the image processing apparatus further includes: an acquisition unit that acquires an image within a given range as a part of the wide-angle image,
the processing unit performs the process of establishing the association by causing the display unit to display the image in the given range acquired by the acquisition unit and causing the display unit to display the image of the subject portion with the distortion corrected.
6. The image processing apparatus according to claim 1,
the detection unit identifies a given subject contained in a wide-angle image captured at a wide angle, and detects an image of a subject portion thereof,
the processing unit performs the association processing by displaying an image of at least a part of the wide-angle image on the display unit and displaying a list of images of the subject portion detected by the detection unit on the display unit.
7. The image processing apparatus according to claim 6,
the image processing apparatus further includes: second selection means for selecting an image of an arbitrary subject portion from the subject portions in accordance with a user operation in a state where the images of the subject portions are displayed in a list on the display means,
the processing unit performs the process of establishing the association by displaying, on the display unit, an image of at least a part of the wide-angle image corresponding to the subject portion selected by the second selecting unit when the image of an arbitrary subject portion is selected by the second selecting unit.
8. The image processing apparatus according to claim 7,
the correction unit further corrects distortion of a given range of the wide-angle image including the image of the subject portion selected by the second selection unit,
the processing unit performs the process of establishing the association by displaying, on the display unit, a corrected image of the wide-angle image of the given range corrected by the correction unit when the image of the arbitrary subject portion is selected by the second selection unit.
9. The image processing apparatus according to any one of claims 6 to 8,
the processing unit performs the association establishing processing by switching display of the display unit so that a state in which images of the subject portion are displayed in a list is changed to a state in which an image of at least a part of the wide-angle image is displayed.
10. The image processing apparatus according to claim 7 or 8,
the processing unit performs the process of establishing the association by switching display of the display unit so that an image of at least a part of the wide-angle image corresponding to the subject portion selected by the second selection unit is displayed on the display unit from a state in which images of the subject portion are displayed in a list on the display unit.
11. The image processing apparatus according to claim 6,
the detection unit identifies a plurality of given subjects included in a wide-angle image captured at a wide angle, and detects an image of a subject portion thereof for each subject,
the processing means displays a list of images of the subject portions detected by the detecting means on the display means, and arranges and displays the images in a list according to the order of arrangement of the subjects on the wide-angle image.
12. The image processing apparatus according to claim 5,
the image processing apparatus further includes: a movement instruction unit that arbitrarily moves the given range by a user operation,
the acquisition unit acquires an image within a given range moved by the movement instruction unit.
13. The image processing apparatus according to claim 1,
the image processing apparatus further includes: a first selection unit configured to select an arbitrary wide-angle image or corresponding image from the thumbnail images by a user operation in a state where the wide-angle images or corresponding images corresponding to the wide-angle images are displayed in a list on the display unit as thumbnail images,
the detection unit identifies a given subject contained in the wide-angle image or the corresponding image selected by the first selection unit and detects an image of a subject portion thereof,
the processing unit displays a list of images of the subject portion detected by the detecting unit and corrected by the correcting unit on the display unit when an arbitrary wide-angle image or corresponding image is selected by the first selecting unit.
14. The image processing apparatus according to claim 1,
the image processing apparatus further includes: a first selection unit that selects an arbitrary wide-angle image or corresponding image from the thumbnail images by a user operation in a state where the wide-angle images or corresponding images corresponding to the wide-angle images are displayed in a list as thumbnail images,
the processing unit performs the process of establishing the association by displaying, on the display unit, the wide-angle image or the corresponding wide-angle image and the image of the subject portion whose distortion is corrected at the same time when any of the wide-angle image and the corresponding image is selected by the first selecting unit.
15. The image processing apparatus according to claim 1,
the detection unit identifies all given subjects included in a wide-angle image group consisting of a plurality of wide-angle images captured with the photographing direction changed, and detects an image of a subject portion thereof for each subject,
the processing unit displays, when thumbnail display is performed on any one of the wide-angle images in the wide-angle image group, images of all subject portions included in the wide-angle image group detected by the detecting unit and corrected by the correcting unit in association with the wide-angle image on which thumbnail display is performed.
16. The image processing apparatus according to claim 1,
the detection unit identifies all given subjects included in a wide-angle image group having a plurality of wide-angle images captured with the photographing direction changed as a group, and detects an image of a subject portion thereof for each subject.
17. The image processing apparatus according to claim 16,
the image processing apparatus further includes: a specifying unit that specifies one wide-angle image from a group of wide-angle images of a plurality of wide-angle images captured with a photographing direction changed,
the processing unit performs processing of associating the file of the wide-angle image specified by the specifying unit with files of images of all subject portions detected from the group of wide-angle images by the detecting unit.
18. The image processing apparatus according to claim 1,
the processing unit performs a process of recording and saving the wide-angle image in association with the image of the subject portion detected by the detection unit.
19. The image processing apparatus according to claim 1,
the wide-angle image is a fisheye image captured using a fisheye lens.
20. An image processing method in an image processing apparatus, characterized in that,
the image processing method comprises the following steps:
a process of recognizing a subject included in a wide-angle image captured at a wide angle and detecting an image of a subject portion thereof;
processing for correcting the detected distortion of the image of the subject portion; and
a process of associating a file of an image of at least a part of the wide-angle image with a file of an image of a subject portion whose distortion is corrected by the correction process,
the process of establishing the association is performed by displaying an image of at least a part of the wide-angle image on a display unit and displaying an image of the subject portion with the distortion corrected on the display unit.
21. A computer-readable storage medium storing a program, characterized in that,
the program is for causing a computer of an image processing apparatus to execute:
a process of recognizing a subject included in a wide-angle image captured at a wide angle and detecting an image of a subject portion thereof;
processing for correcting the detected distortion of the image of the subject portion; and
a process of associating a file of an image of at least a part of the wide-angle image with a file of an image of a subject portion whose distortion is corrected by the correction process,
the process of establishing the association is performed by displaying an image of at least a part of the wide-angle image on a display unit and displaying an image of the subject portion with the distortion corrected on the display unit.
CN201611033980.0A 2015-12-22 2016-11-15 Image processing apparatus, image processing method, and storage medium Active CN107018316B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015-249410 2015-12-22
JP2015249410 2015-12-22
JP2016-052641 2016-03-16
JP2016052641A JP6723512B2 (en) 2015-12-22 2016-03-16 Image processing apparatus, image processing method and program

Publications (2)

Publication Number Publication Date
CN107018316A CN107018316A (en) 2017-08-04
CN107018316B true CN107018316B (en) 2021-02-05

Family

ID=59235032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611033980.0A Active CN107018316B (en) 2015-12-22 2016-11-15 Image processing apparatus, image processing method, and storage medium

Country Status (3)

Country Link
JP (1) JP6723512B2 (en)
KR (1) KR20170074742A (en)
CN (1) CN107018316B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3438935A4 (en) 2016-03-31 2019-07-03 Sony Corporation Information processing device, information processing method, program
JP7086552B2 (en) * 2017-09-22 2022-06-20 キヤノン株式会社 Information processing equipment, imaging equipment, information processing methods and programs
JP6688277B2 (en) * 2017-12-27 2020-04-28 本田技研工業株式会社 Program, learning processing method, learning model, data structure, learning device, and object recognition device
JP7231643B2 (en) 2018-10-02 2023-03-01 マクセル株式会社 Information processing equipment
JP7267764B2 (en) * 2019-02-08 2023-05-02 キヤノン株式会社 ELECTRONIC DEVICE, ELECTRONIC DEVICE CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
JP7419723B2 (en) * 2019-09-25 2024-01-23 株式会社リコー Image processing system, image processing device and method
CN110611749A (en) * 2019-09-30 2019-12-24 深圳市大拿科技有限公司 Image processing method and device
CN110572578A (en) * 2019-09-30 2019-12-13 联想(北京)有限公司 Image processing method, apparatus, computing device, and medium
CN113496458A (en) * 2020-03-18 2021-10-12 杭州海康威视数字技术股份有限公司 Image processing method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101510956A (en) * 2008-02-15 2009-08-19 索尼株式会社 Image processing device, camera device, image processing method, and program
US7791668B2 (en) * 2005-01-18 2010-09-07 Nikon Corporation Digital camera
CN103247031A (en) * 2013-04-19 2013-08-14 华为技术有限公司 Method, terminal and system for correcting aberrant image
CN104994281A (en) * 2015-06-30 2015-10-21 广东欧珀移动通信有限公司 Method for correcting face distortion and terminal
CN105141827A (en) * 2015-06-30 2015-12-09 广东欧珀移动通信有限公司 Distortion correction method and terminal

Also Published As

Publication number Publication date
JP6723512B2 (en) 2020-07-15
CN107018316A (en) 2017-08-04
KR20170074742A (en) 2017-06-30
JP2017118472A (en) 2017-06-29

Similar Documents

Publication Publication Date Title
CN107018316B (en) Image processing apparatus, image processing method, and storage medium
CN107197137B (en) Image processing apparatus, image processing method, and recording medium
JP6627352B2 (en) Image display device, image display method, and program
US10440307B2 (en) Image processing device, image processing method and medium
CN109934931B (en) Method and device for collecting image and establishing target object recognition model
CN107911621A (en) A kind of image pickup method of panoramic picture, terminal device and storage medium
JP6497965B2 (en) Image processing apparatus and image processing method
KR20150058871A (en) Photographing device and stitching method of photographing image
JP2010183187A (en) Imaging apparatus and control method of the same, program
JP2017175507A (en) Image processing apparatus, image processing method, and program
CN106296789B (en) It is a kind of to be virtually implanted the method and terminal that object shuttles in outdoor scene
CN111654624B (en) Shooting prompting method and device and electronic equipment
JP6677900B2 (en) Image processing apparatus, image processing method, and program
JP2017059927A (en) User terminal, color correction system, and color correction method
JP7310123B2 (en) Imaging device and program
JP6720966B2 (en) Information processing apparatus, information processing method, and program
JP5872415B2 (en) Display terminal, operation reception method, and program
JP7350511B2 (en) Electronic equipment, electronic equipment control method, program, and storage medium
JP2014017665A (en) Display control unit, control method for display control unit, program, and recording medium
CN111242107B (en) Method and electronic device for setting virtual object in space
CN111953870B (en) Electronic device, control method of electronic device, and computer-readable medium
JP2015133549A (en) Image processing system, image processing method and computer program
JP2021129293A (en) Image processing apparatus, image processing system, image processing method, and program
CN108347596B (en) laser guide scanning system and method based on feedback
CN107925724A (en) The technology and its equipment of photography are supported in the equipment with camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant