US9973690B2 - Imaging device, imaging method, and computer-readable recording medium - Google Patents

Imaging device, imaging method, and computer-readable recording medium

Info

Publication number
US9973690B2
Authority
US
United States
Prior art keywords
image data
resolution
super
image
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/496,312
Other versions
US20170230572A1 (en
Inventor
Shinsuke HOMMA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OM Digital Solutions Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to US15/496,312 priority Critical patent/US9973690B2/en
Publication of US20170230572A1 publication Critical patent/US20170230572A1/en
Assigned to OLYMPUS CORPORATION reassignment OLYMPUS CORPORATION CHANGE OF ASSIGNEE ADDRESS Assignors: OLYMPUS CORPORATION
Assigned to OLYMPUS CORPORATION reassignment OLYMPUS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HOMMA, SHINSUKE
Application granted granted Critical
Publication of US9973690B2 publication Critical patent/US9973690B2/en
Assigned to OM DIGITAL SOLUTIONS CORPORATION reassignment OM DIGITAL SOLUTIONS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OLYMPUS CORPORATION
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • H04N5/23232
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems by using two or more images to influence resolution, frame rate or aspect ratio
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4053 Scaling based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H04N5/23216
    • H04N5/23245
    • H04N5/23293
    • H04N5/23296

Definitions

  • the disclosure relates to an imaging device and method for imaging an object to generate image data of the object, and a computer-readable recording medium.
  • an imaging device includes: an optical system configured to image an object; an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object; a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system; a display unit configured to display an image corresponding to the image data generated by the imaging element; a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position; a position controller configured to control the position changing unit based on a period of time during which the contact position is maintained, and thereby to change the relative positional relationship between the optical system and the imaging element per a predetermined distance; a trimming unit configured to generate a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element each time the relative positional relationship between the optical system and the imaging element is changed; and a super-resolution processer configured to perform pixel interpolation using the plurality of pieces of trimming image data to generate super-resolution image data having a higher resolution than a resolution of each trimming image data.
  • an imaging method is executed by an imaging device that includes: an optical system configured to image an object; an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object; a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system; a display unit configured to display an image corresponding to the image data generated by the imaging element; and a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position.
  • the method includes: controlling the position changing unit based on a period of time during which the contact position is maintained to change the relative positional relationship between the optical system and the imaging element per a predetermined distance; generating a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element each time the relative positional relationship between the optical system and the imaging element is changed; and performing pixel interpolation using the plurality of pieces of trimming image data to generate super-resolution image data having a higher resolution than a resolution of each trimming image data.
  • a non-transitory computer-readable recording medium with an executable program stored thereon instructs an imaging device including: an optical system configured to image an object; an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object; a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system; a display unit configured to display an image corresponding to the image data generated by the imaging element; and a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position, to execute: controlling the position changing unit based on a period of time during which the contact position is maintained to change the relative positional relationship between the optical system and the imaging element per a predetermined distance; generating a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element each time the relative positional relationship between the optical system and the imaging element is changed; and performing pixel interpolation using the plurality of pieces of trimming image data to generate super-resolution image data having a higher resolution than a resolution of each trimming image data.
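Taken together, the three claims describe one loop: while a touch is maintained, the sensor is shifted by sub-pixel amounts, the touched area is trimmed from each captured frame, and the trims are interpolated into a higher-resolution image. A minimal Python sketch of that loop follows; the function names, the shift list, and the toy frames are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def trim(frame, cy, cx, half):
    """Cut out a square area centered on the touch position (cy, cx)."""
    return frame[cy - half:cy + half, cx - half:cx + half]

def super_resolve(trims, shifts, scale=2):
    """Shift-and-add interpolation: each trimmed frame fills the sub-grid
    of a finer output grid matching the sensor shift it was captured at."""
    h, w = trims[0].shape
    out = np.zeros((h * scale, w * scale))
    for t, (dy, dx) in zip(trims, shifts):
        out[dy::scale, dx::scale] = t
    return out

# Four quarter-pixel grid positions correspond to 0.5-pixel sensor moves
# at scale 2 (one frame per position while the touch is maintained).
shifts = [(0, 0), (0, 1), (1, 0), (1, 1)]
frames = [np.full((8, 8), float(i)) for i in range(4)]
trims = [trim(f, 4, 4, 2) for f in frames]
sr = super_resolve(trims, shifts)
print(sr.shape)  # (8, 8): twice the 4x4 trimmed resolution per axis
```

Each low-resolution trim contributes exactly one sub-grid of the output, which is why the virtual pixel count grows with the number of distinct shifts.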
  • FIG. 1 is a block diagram illustrating a functional configuration of an imaging device according to one embodiment of the present invention
  • FIG. 2 is a flowchart illustrating an outline of processing executed by the imaging device according to one embodiment of the present invention
  • FIG. 3 is a view illustrating an exemplary live view image displayed on a display unit of the imaging device according to one embodiment of the present invention
  • FIG. 4 is a view illustrating an exemplary image displayed on the display unit of the imaging device according to one embodiment of the present invention.
  • FIG. 5 is a view schematically illustrating a structure of an image file generated by the imaging device according to one embodiment of the present invention
  • FIG. 6 is a view illustrating an exemplary super-resolution image displayed on the display unit of the imaging device according to one embodiment of the present invention.
  • FIG. 7 is a view illustrating an exemplary super-resolution image played back and displayed on the display unit of the imaging device according to one embodiment of the present invention.
  • FIG. 8 is a view illustrating another exemplary super-resolution image played back and displayed on the display unit of the imaging device according to one embodiment of the present invention.
  • FIG. 9 is a view illustrating another exemplary super-resolution image played back and displayed on the display unit of the imaging device according to one embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating a functional configuration of the imaging device according to one embodiment of the present invention.
  • An imaging device 1 illustrated in FIG. 1 includes an imaging unit 2 , a position changing unit 3 , a posture detection unit 4 , a display unit 5 , a touch panel 6 , an operational unit 7 , a recording unit 8 , and a controller 9 .
  • the imaging unit 2 includes an optical system 21 for forming an object image of an object, and an imaging element 22 for generating image data of the object image by imaging the object image that has been formed by the optical system 21 .
  • the optical system 21 has one or more lenses, and captures an image of the object.
  • the imaging element 22 has an imaging sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS), and receives light of the object image of the object that has been formed by the optical system 21 and continuously generates the image data of the object.
  • the position changing unit 3 holds the imaging element 22 . Under control of the controller 9 , the position changing unit 3 relatively changes a positional relationship between the optical system 21 and the imaging element 22 by moving the imaging element 22 relative to a surface orthogonal to an optical axis O (horizontal direction and vertical direction).
  • the position changing unit 3 has a voice coil motor, a piezoelectric element, and the like.
  • the position changing unit 3 functions as a hand shake prevention mechanism that compensates for hand shake occurring in the imaging device 1 .
  • the posture detection unit 4 detects a posture of the imaging device 1 and outputs a detected result to the controller 9 .
  • the posture detection unit 4 has a gyro sensor and the like.
  • the display unit 5 displays an image corresponding to the image data generated by the imaging unit 2 .
  • the display unit 5 has a display panel including a liquid crystal, an organic electro luminescence (EL), or the like.
  • the touch panel 6 is provided so as to overlap the display screen of the display unit 5 , detects a contact position (touch position) where an external body makes contact, and receives input of a signal according to the detected position.
  • the touch panel 6 periodically outputs, to the controller 9 , a signal according to the contact position where the external body made contact.
  • Any of a resistance film method, an electrostatic capacitance method, an optical method, and the like is applicable as a method for the touch panel 6 .
  • the operational unit 7 receives input of various user operations on the imaging device 1 .
  • the operational unit 7 has, for example, a power supply switch that turns a power supply of the imaging device 1 on and off, a zoom switch that changes a zoom magnification of the optical system 21 , a mode change switch that changes a mode of the imaging device 1 , a release switch that instructs the imaging device 1 to capture an image, and the like.
  • the recording unit 8 includes a program recording unit 81 that records various programs executed by the imaging device 1 and an image data recording unit 82 that records the image data generated by the imaging unit 2 .
  • the recording unit 8 has a flash memory, a synchronous dynamic random access memory (SDRAM), a memory card, and the like.
  • the controller 9 integrally controls each unit constituting the imaging device 1 .
  • the controller 9 has a central processing unit (CPU) and the like.
  • the controller 9 includes a touch detection unit 91 , a determination unit 92 , a position controller 93 , a trimming unit 94 , a super-resolution processer 95 , an imaging controller 96 , and a display controller 97 .
  • the touch detection unit 91 detects the touch position on the touch panel 6 based on the signal that has been input from the touch panel 6 . For example, based on the position signal periodically input from the touch panel 6 , the touch detection unit 91 detects the touch position on the touch panel 6 as well as detects a temporal change of the touch position.
  • the determination unit 92 determines whether or not blurring occurs in the imaging device 1 . Based on super-resolution image data generated by the super-resolution processer 95 , the determination unit 92 further determines whether or not the number of pixels of a super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer 95 exceeds a resolution of the optical system 21 . The determination unit 92 may determine that the number of pixels of the super-resolution image exceeds the resolution of the optical system 21 when data volume of the super-resolution image data generated by the super-resolution processer 95 exceeds a predetermined value.
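A minimal sketch of this dual stopping check, assuming a simple Nyquist-style bound on what the lens can resolve and the data-volume fallback mentioned above; the function name, the line-pairs-per-mm figure, and the thresholds are illustrative, not from the patent.

```python
def exceeds_optical_resolution(sr_shape, lens_lp_per_mm, sensor_mm,
                               sr_bytes=None, max_bytes=None):
    """Return True when further super-resolution gains nothing: the pixel
    count out-resolves the lens, or the data volume passes a threshold."""
    # Nyquist-style bound: at most 2 pixels per optical line pair.
    max_pixels = 2 * lens_lp_per_mm * sensor_mm
    if max(sr_shape) > max_pixels:
        return True
    if sr_bytes is not None and max_bytes is not None:
        return sr_bytes > max_bytes
    return False

print(exceeds_optical_resolution((6000, 4000), 100, 24))  # True: 6000 > 4800
print(exceeds_optical_resolution((4000, 3000), 100, 24))  # False
```

The data-volume branch stands in for the patent's "predetermined value" shortcut when the optical bound is not directly measurable.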
  • the position controller 93 drives the position changing unit 3 to move the imaging element 22 based on a period of time during which the contact position on the touch panel 6 is maintained (for example, the duration of touching, i.e., of maintaining the touch position). Specifically, while the contact position on the touch panel 6 is maintained, the position controller 93 drives the position changing unit 3 to change the relative positional relationship between the optical system 21 and the imaging element 22 by moving the imaging element 22 in predetermined very small steps, for example of 0.5 pixels, such that the number of pixels virtually increases.
  • based on the position signal input from the touch panel 6 , the trimming unit 94 sequentially cuts out an area including the touch position from the image displayed on the display unit 5 to generate a plurality of pieces of trimming image data. Specifically, the trimming unit 94 generates the trimming image data by sequentially cutting out, from the image displayed on the display unit 5 , a rectangular area that includes the touch position and is smaller than the display area of the display unit 5 .
  • the super-resolution processer 95 generates the super-resolution image data by using the plurality of pieces of trimming image data sequentially generated by the trimming unit 94 . Specifically, the super-resolution processer 95 generates the super-resolution image data by performing pixel interpolation using, for example, the plurality of pieces of low-resolution trimming image data.
  • a resolution of the super-resolution image corresponding to the super-resolution image data is higher than a resolution of the image corresponding to the image data generated by the imaging unit 2 .
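The pixel interpolation step can be illustrated with a toy fill rule: output-grid cells that no trimmed frame covered are averaged from covered neighbors. This is a hedged sketch of one possible interpolation, not the patent's exact algorithm.

```python
import numpy as np

def interpolate_missing(grid, filled):
    """Fill cells not covered by any trimmed frame with the average of
    their covered 4-neighbors (a deliberately simple interpolation)."""
    out = grid.copy()
    h, w = grid.shape
    for y in range(h):
        for x in range(w):
            if not filled[y, x]:
                vals = [grid[ny, nx]
                        for ny, nx in ((y - 1, x), (y + 1, x),
                                       (y, x - 1), (y, x + 1))
                        if 0 <= ny < h and 0 <= nx < w and filled[ny, nx]]
                if vals:
                    out[y, x] = sum(vals) / len(vals)
    return out

grid = np.array([[1.0, 0.0], [0.0, 3.0]])         # two cells sampled
filled = np.array([[True, False], [False, True]])  # which cells have data
dense = interpolate_missing(grid, filled)
print(dense)  # [[1. 2.] [2. 3.]]
```

In the device, the covered cells come from real sensor samples taken at different sub-pixel offsets, so the "interpolated" values carry more true detail than simple upscaling of a single frame.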
  • the imaging controller 96 controls imaging by the imaging unit 2 . Specifically, when a release signal for instructing image capturing is input from the operational unit 7 , the imaging controller 96 causes the imaging unit 2 to execute the image capturing.
  • the display controller 97 controls a display mode of the display unit 5 .
  • the display controller 97 causes various information related to the imaging device 1 to be displayed on the display unit 5 .
  • the display controller 97 causes the super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer 95 to be enlarged and displayed on the display unit 5 .
  • the display controller 97 enlarges and displays the super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer 95 on the display unit 5 according to the duration of the position signal output from the touch panel 6 .
  • the display controller 97 causes the super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer 95 to be enlarged and displayed on the display unit 5 in place of a live view image or a playback image displayed on the display unit 5 .
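One plausible way to map the duration of the position signal to the enlargement factor described above; the step sizes and the maximum magnification are illustrative assumptions, not values from the patent.

```python
def magnification_for_touch(duration_s, step_s=0.5, per_step=0.25, max_mag=4.0):
    """Map how long the touch position has been maintained to an
    enlargement factor for the displayed super-resolution image."""
    steps = int(duration_s // step_s)           # one step per half second
    return min(1.0 + steps * per_step, max_mag)

print(magnification_for_touch(0.3))   # 1.0  (too short to enlarge)
print(magnification_for_touch(2.0))   # 2.0
print(magnification_for_touch(10.0))  # 4.0  (clamped at max_mag)
```

A stepped, clamped mapping like this lets the user stop touching as soon as the preview reaches the desired resolution, matching the behavior described for step S115.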
  • FIG. 2 is a flowchart illustrating an outline of the processing executed by the imaging device 1 .
  • in step S101, the imaging device 1 is set to a shooting mode
  • the display controller 97 causes the live view image corresponding to the image data generated by the imaging unit 2 to be displayed on the display unit 5 (step S 102 ).
  • the display controller 97 causes a live view image LV 1 corresponding to the image data generated by the imaging unit 2 to be displayed on the display unit 5 .
  • the determination unit 92 starts determining whether or not blurring occurs in the imaging device 1 (step S103). Specifically, based on the posture signal input from the posture detection unit 4 , the determination unit 92 starts the blurring determination. The determination unit 92 may also make the blurring determination based on two temporally adjacent pieces of image data generated by the imaging unit 2 .
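The frame-difference variant mentioned above, as an alternative to the gyro-based posture signal, might look like this sketch; the threshold and frame values are arbitrary illustrative choices.

```python
import numpy as np

def blur_detected(prev, curr, threshold=8.0):
    """Flag device blur when the mean absolute difference between two
    temporally adjacent frames exceeds a threshold."""
    diff = np.abs(curr.astype(float) - prev.astype(float)).mean()
    return float(diff) > threshold

steady = np.full((4, 4), 100.0)
shaken = steady + 20.0          # a uniform 20-level jump between frames
print(blur_detected(steady, steady))  # False
print(blur_detected(steady, shaken))  # True
```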
  • the display controller 97 causes the display unit 5 to display information on the possibility of super-resolution on the live view image LV1 (step S104). Specifically, if the determination unit 92 determines that the blurring does not occur in the imaging device 1 , the display controller 97 displays an icon indicating that super-resolution is possible on the live view image LV1. On the other hand, if the determination unit 92 determines that the blurring occurs in the imaging device 1 , the display controller 97 causes information indicating that super-resolution is not possible, for example an icon or message urging the user to use a tripod because the blurring occurring in the imaging device 1 is large, to be displayed on the display unit 5 .
  • the determination unit 92 determines whether or not blurring occurs in the object (step S106). Specifically, based on two temporally consecutive pieces of image data generated by the imaging unit 2 , the determination unit 92 determines whether or not the blurring occurs in the object. The determination unit 92 may also determine whether or not the object is a moving body based on the two temporally consecutive pieces of image data and, if the object is a moving body, may determine that the blurring occurs in the object. If the determination unit 92 determines that the blurring occurs in the object (step S106: Yes), the imaging device 1 proceeds to step S112 described below. In contrast, if the determination unit 92 determines that the blurring does not occur in the object (step S106: No), the imaging device 1 proceeds to step S107 described below.
  • in step S107, the imaging controller 96 starts continuous shooting by the imaging unit 2 .
  • the position controller 93 moves the imaging element 22 in the horizontal and vertical directions along a surface orthogonal to the optical axis of the optical system 21 so as to move by a predetermined number of pixels centered on the touch position, which corresponds to the signal input from the touch panel 6 . Accordingly, only blurring in the specific area needs to be dealt with, which increases the success rate.
  • the position controller 93 moves the imaging element 22 such that the movement between pixels is gradually reduced (for example, from 0.5 pixels to 0.2 pixels) or in four or eight different directions (for example, horizontally, vertically, to the lower right, to the upper right, and the like).
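The shrinking, multi-direction movement pattern described above could be generated as follows; the linear step decay and the compass-direction cycling are assumptions chosen to match the 0.5-to-0.2-pixel example, not the patent's exact control law.

```python
import math

def shift_schedule(n_frames, directions=8, start=0.5, end=0.2):
    """Generate (dy, dx) sensor offsets in pixels: the step size shrinks
    linearly from `start` to `end` while cycling through `directions`
    evenly spaced compass directions."""
    dirs = [(math.sin(2 * math.pi * k / directions),
             math.cos(2 * math.pi * k / directions))
            for k in range(directions)]
    shifts = []
    for i in range(n_frames):
        step = start + (end - start) * i / max(n_frames - 1, 1)
        dy, dx = dirs[i % directions]
        shifts.append((round(step * dy, 3), round(step * dx, 3)))
    return shifts

sched = shift_schedule(8)
print(sched[0])    # (0.0, 0.5): first move, full 0.5-pixel step
print(len(sched))  # 8
```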
  • the display controller 97 causes the display unit 5 to display information indicating that the imaging device 1 is performing super-resolution processing such that the information is superimposed on the live view image LV 1 (step S 108 ). Specifically, as illustrated in FIG. 4 , the display controller 97 causes the display unit 5 to display an icon T 1 indicating that the imaging device 1 is performing the super-resolution processing such that the icon is superimposed on the live view image LV 1 . In place of the icon T 1 , the display controller 97 may also cause the display unit 5 to display a character, a figure, and the like to indicate that the imaging device 1 is performing the super-resolution processing.
  • the imaging device 1 generates a super-resolution image of the area including the touch position corresponding to the signal input from the touch panel 6 (step S 109 ).
  • the trimming unit 94 sequentially cuts out the area including the touch position corresponding to the signal input from the touch panel 6 and generates a plurality of trimming images.
  • the super-resolution processer 95 generates the super-resolution image data of the area including the touch position by interpolating pixels in areas of insufficient resolution using pixels of the trimming images.
  • a part (area) on which the super-resolution processing has been performed by the super-resolution processer 95 may be worth viewing in enlarged display later on. Furthermore, if trimming processing is performed later on, the part (area) on which the super-resolution processing has been performed by the super-resolution processer 95 may hold the resolution in the same way as a normal image.
  • the determination unit 92 determines whether or not the number of pixels of the super-resolution image generated by the super-resolution processer 95 exceeds the resolution of the optical system 21 (step S 110 ). If the determination unit 92 determines that the number of pixels of the super-resolution image generated by the super-resolution processer 95 exceeds the resolution of the optical system 21 (step S 110 : Yes), the imaging device 1 proceeds to step S 111 described below. In contrast, if the determination unit 92 determines that the number of pixels of the super-resolution image generated by the super-resolution processer 95 does not exceed the resolution of the optical system 21 (step S 110 : No), the imaging device 1 proceeds to step S 112 .
  • in step S111, the determination unit 92 determines whether or not the continuous shooting by the imaging unit 2 reaches the predetermined number of images. Specifically, the determination unit 92 determines whether or not the number of captured images in the continuous shooting by the imaging unit 2 reaches the number of captured images calculated by the imaging controller 96 in step S107 described above. If the determination unit 92 determines that the continuous shooting reaches the predetermined number of images (step S111: Yes), the imaging device 1 proceeds to step S112 described below. In contrast, if it does not (step S111: No), the imaging device 1 proceeds to step S105 described above.
  • in step S105, if the touch panel 6 is not touched (step S105: No), the imaging device 1 proceeds to step S112 described below.
  • the condition that the touch panel 6 is not touched also includes the case where the finger has been lifted off the panel.
  • the imaging controller 96 ends the continuous shooting by the imaging unit 2 (step S 113 ).
  • the display controller 97 may display, on the display unit 5 , a character, a figure, and the like to indicate that generation of the super-resolution image has been ended or to indicate that the number of pixels of the super-resolution image exceeds the resolution of the optical system 21 .
  • the imaging controller 96 generates an image file associating an existing header indicating various information, area information including the touch position on the touch panel 6 detected by the touch detection unit 91 , a normal image (non-super-resolution image) corresponding to the image data generated by the imaging unit 2 , a plurality of pieces of trimming image data of a partial super-resolution image generated by the trimming unit 94 , and the super-resolution image data generated by the super-resolution processer 95 , and records the file in the recording unit 8 (step S114). Specifically, as illustrated in FIG. 5 , the imaging controller 96 generates an image file D1 associating a header h1 indicating various information, area information h2 including the touch position on the touch panel 6 detected by the touch detection unit 91 , a normal image (non-super-resolution image) h3 corresponding to the image data generated by the imaging unit 2 , a plurality of pieces of trimming image data h4 of the partial super-resolution image generated by the trimming unit 94 , and super-resolution image data h5 generated by the super-resolution processer 95 , and records the file in the recording unit 8 .
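The layout of image file D1 (header h1 through super-resolution data h5) can be mirrored as a simple ordered record; the field names and sample values below are hypothetical placeholders, not the patent's on-disk format.

```python
def build_image_file(header, area_info, normal_image, trims, sr_image):
    """Assemble image file D1 as an ordered record h1-h5 (as in FIG. 5)."""
    return {
        "h1_header": header,              # various information
        "h2_area": area_info,             # touch position / SR area info
        "h3_normal": normal_image,        # non-super-resolution image
        "h4_trims": trims,                # trimming image data pieces
        "h5_super_resolution": sr_image,  # super-resolution image data
    }

f = build_image_file({"model": "camera"},
                     {"x": 120, "y": 80, "w": 64, "h": 64},
                     "<jpeg bytes>", ["<trim 0>", "<trim 1>"], "<sr bytes>")
print(list(f))  # keys h1_header ... h5_super_resolution in order
```

Keeping the normal image, the trims, and the super-resolution result in one associated file is what later lets the playback mode in steps S126 to S132 re-enlarge the marked area from the stored trims.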
  • the display controller 97 enlarges and displays the super-resolution image generated by the super-resolution processer 95 on a full-screen area of the display unit 5 (step S 115 ). Specifically, as illustrated in FIG. 6 , the display controller 97 enlarges and displays a super-resolution image T 2 generated by the super-resolution processer 95 on the full-screen area of the display unit 5 . Accordingly, a user can check the super-resolution image T 2 . While checking, the user can also select a desired resolution depending on length of time of touching (i.e., duration of maintaining the touch position).
  • the display controller 97 may enlarge and display the super-resolution image corresponding to the super-resolution image data each time the super-resolution image data is generated in the super-resolution processing by the super-resolution processer 95 . Accordingly, the user can stop touch operation while checking an effect of the super-resolution processing by the super-resolution processer 95 .
  • it is not necessary for the imaging device 1 to stop the continuous shooting by the imaging unit 2 at the end of the touch (stop of the position signal from the touch panel 6 ), and the continuous shooting may be continued as necessary even after the end of the touch.
  • the trimming unit 94 may gradually enlarge a trimming area to be cut out from the image corresponding to the image data generated by the imaging element 22 .
  • in step S116, when a predetermined time (for example, three seconds) has passed since the image was displayed on the display unit 5 (step S116: Yes), the imaging device 1 proceeds to step S117 described below. In contrast, if the predetermined time has not passed (step S116: No), the imaging device 1 continues this determination.
  • if step S117 results in Yes, the imaging device 1 ends this processing.
  • if step S117 results in No, the imaging device 1 returns to step S101 described above.
  • if the super-resolution image has not been generated by the super-resolution processer 95 (step S112: No), the imaging device 1 proceeds to step S118 described below.
  • if the touch panel 6 has been touched in step S118 (step S118: Yes), the display controller 97 displays a first image on the display unit 5 (step S119). Specifically, the display controller 97 displays an image corresponding to the image data generated by the imaging unit 2 as the first image on the display unit 5 . After step S119, the imaging device 1 proceeds to step S116.
  • in contrast, if the touch panel 6 has not been touched (step S118: No), the imaging device 1 proceeds to step S120.
  • if there is a release signal for instructing the image capturing from the operational unit 7 (step S120: Yes), the imaging controller 96 causes the imaging unit 2 to execute the capturing of the image (step S121).
  • in step S122, the display controller 97 causes the display unit 5 to display the image corresponding to the image data generated by the imaging unit 2 for confirmation.
  • after step S122, the imaging device 1 proceeds to step S116.
  • if there is no release signal for instructing the image capturing from the operational unit 7 (step S120: No), the imaging device 1 proceeds to step S117.
  • if the imaging device 1 is not set to the shooting mode in step S101 but is in a playback mode (step S123), playback processing is performed as follows.
  • the display controller 97 causes the display unit 5 to display a plurality of thumbnail images corresponding to a plurality of pieces of image data recorded in the image data recording unit 82 (step S 124 ).
  • in step S125, when an image is selected from the plurality of thumbnail images displayed on the display unit 5 through the operational unit 7 or the touch panel 6 (step S125: Yes), the imaging device 1 proceeds to step S126 described below. In contrast, if no image is selected (step S125: No), the imaging device 1 returns to step S124.
  • step S 126 if the image selected through the operational unit 7 or the touch panel 6 is a super-resolution captured image (step S 126 : Yes), the imaging device 1 proceeds to step S 128 described below. In contrast, if the image selected through the operational unit 7 or the touch panel 6 is not the super-resolution captured image (step S 126 : No), the imaging device 1 proceeds to step S 127 described below.
  • step S 127 the display controller 97 performs normal display in which the image selected through the operational unit 7 or the touch panel 6 is enlarged and displayed in the display area of the display unit 5 .
  • step S 127 the imaging device 1 proceeds to step S 134 described below.
  • step S 128 the display controller 97 causes the display unit 5 to display one base image of the super-resolution image. Specifically, as illustrated in FIG. 7 , the display controller 97 causes the display unit 5 to display, as the one base image, an image W 1 corresponding to normal image data included in an image file of the super-resolution image.
  • the display controller 97 causes the display unit 5 to display a super-resolution image mark indicating that the image displayed on the display unit 5 is the super-resolution image such that the mark is superimposed on the image (step S 129 ). Specifically, based on the area information included in the image file of the super-resolution image, the display controller 97 causes the display unit 5 to display a mark F 1 corresponding to an area of the super-resolution image so as to be superimposed on the image W 1 . Accordingly, the user can intuitively grasp the area of the super-resolution image. In place of the mark F 1 , the display controller 97 may also cause the display unit 5 to display a character, a figure, and the like to indicate the super-resolution image.
  • step S 130 when image enlargement for enlarging a part of the image displayed on the display unit 5 is made through the operational unit 7 (step S 130 : Yes) and if a super-resolution area exists within an angle of view (area) after the enlargement (step S 131 : Yes), the super-resolution processer 95 makes the enlargement by pixel interpolation using a plurality of super-resolution images stored in the image file (step S 132 ).
  • the display controller 97 causes the display unit 5 to display an enlarged image obtained by the pixel interpolation by the super-resolution processer 95 (step S 133 ).
  • step S 134 when an operation to end the playback is performed through the operational unit 7 (step S 134 : Yes), the imaging device 1 proceeds to step S 117 . In contrast, if the operation to end the playback is not performed through the operational unit 7 (step S 134 : No), the imaging device 1 returns to step S 124 described above.
  • step S 130 if the image enlargement for enlarging a part of the image displayed on the display unit 5 through the operational unit 7 is not made (step S 130 : No), the imaging device 1 proceeds to step S 134 .
  • step S 131 when no super-resolution area exists within the angle of view (area) after the enlargement (step S 131 : No), the super-resolution processer 95 makes the enlargement by pixel interpolation using a pixel in the angle of view (step S 135 ). After step S 135 , the imaging device 1 proceeds to step S 133 .
  • if the imaging device 1 is not set to the playback mode (step S123: No), the imaging device 1 proceeds to step S117.
  • while the touch position is maintained, the position controller 93 moves the imaging element 22, the trimming unit 94 generates the trimming image data by sequentially cutting out the area including the touch position from the image data sequentially generated by the imaging element 22, and the super-resolution processer 95 generates the super-resolution image data using the plurality of pieces of trimming image data sequentially generated by the trimming unit 94, whereby it is possible to obtain the high resolution image reflecting the intention of the user in real time.
  • while the touch position is maintained, the trimming unit 94 generates the trimming image data by sequentially cutting out only the area including the touch position from the image data sequentially generated by the imaging element 22; therefore, it is possible to reduce the amount of calculation compared with generating the super-resolution image data for the full area, whereby it is possible to obtain the high resolution image reflecting the intention of the user in real time.
  • the imaging controller 96 ends generation of the super-resolution image data by the super-resolution processer 95 , whereby it is possible to prevent useless image capturing.
  • the display controller 97 causes the display unit 5 to display the super-resolution image mark so as to be superimposed on the image; however, for example, as illustrated in FIG. 8, it is also possible to display the super-resolution image T2 on the display unit 5 so as to be superimposed on the image W1. Furthermore, as illustrated in FIG. 9, it is also possible to display the image W1 and the super-resolution image T2 separately on the display unit 5 at the same time.
  • the imaging device may also be applied to a digital video camera, an electronic device such as a cellular phone or a tablet type mobile device with an imaging function, a display device that displays an image corresponding to image data used in medical and industrial fields and imaged by an endoscope or a microscope, and the like. Since it is possible to enlarge, for confirmation, any part of the image that the user is concerned about, it may be a useful interface especially for an inspection device and the like. It is also possible to increase sensitivity by downsizing the device and by virtually increasing the area of a pixel without uselessly increasing the number of pixels.
  • the program to be executed by the imaging device is provided as file data in an installable format or an executable format and is recorded in a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, a digital versatile disk (DVD), a USB medium, a flash memory, and the like.
  • the program to be executed by the imaging device may be stored in a computer connected to a network such as the Internet and may be provided by downloading through the network.
  • the program to be executed by the imaging device may be provided or distributed through the network such as the Internet.
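The playback-time enlargement described in the steps above (steps S130 to S135) boils down to one decision: if the area to be enlarged overlaps the recorded super-resolution area, interpolate from the stored super-resolution frames; otherwise interpolate from the pixels already inside the angle of view. The following is only an illustrative sketch; the rectangle representation and the function names are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the enlargement-source decision (steps S130-S135).
# Rectangles are (x, y, width, height); names are illustrative only.

def rects_overlap(a, b):
    """True if two (x, y, width, height) rectangles overlap."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def choose_enlargement_source(view_rect, super_res_rect):
    """Return which pixel source to interpolate from when enlarging."""
    if rects_overlap(view_rect, super_res_rect):
        return "super_resolution_frames"   # corresponds to step S132
    return "pixels_in_view"                # corresponds to step S135
```

For instance, enlarging a view that covers the recorded super-resolution area would select the stored super-resolution frames, while enlarging a disjoint corner would fall back to in-view interpolation.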

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Studio Devices (AREA)
  • Editing Of Facsimile Originals (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Processing (AREA)

Abstract

An imaging device includes: an optical system; an imaging element that continuously generates image data of an object; a position changing unit that changes a relative positional relationship between the optical system and the imaging element; a touch panel that detects a contact position thereon, and periodically outputs a position signal according to the contact position; a position controller that controls the position changing unit based on duration of maintaining the contact position to change the positional relationship; a trimming unit that generates pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from an image corresponding to the image data each time the positional relationship is changed; and a super-resolution processer that performs pixel interpolation using the pieces of trimming image data to generate super-resolution image data having a higher resolution than that of each trimming image data.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation application of U.S. patent application Ser. No. 14/959,947 (referred to as “the '947 application” and incorporated herein by reference), filed on Dec. 4, 2015, titled “IMAGING DEVICE, IMAGING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM TO SHOW SUPER-RESOLUTION IMAGE ON LIVE VIEW IMAGE” and listing Shinsuke HOMMA as the inventor, the '947 application being based upon and claiming the benefit of priority from Japanese Patent Application No. 2014-250053, filed on Dec. 10, 2014, the entire contents of which are incorporated herein by reference.
BACKGROUND
1. Technical Field
The disclosure relates to an imaging device and method for imaging an object to generate image data of the object, and a computer-readable recording medium.
2. Related Art
In recent years, there has been known a technique of generating a high resolution image on an imaging device such as a digital camera by using a plurality of low resolution images that have been captured (see Japanese Laid-open Patent Publication No. 2008-92297). In this technique, by performing high resolution processing only on a predetermined area before performing the high resolution processing on a whole area of an image displayed on a display unit, it is possible to grasp a degree of high resolution of the image displayed on the display unit.
SUMMARY
In some embodiments, an imaging device includes: an optical system configured to image an object; an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object; a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system; a display unit configured to display an image corresponding to the image data generated by the imaging element; a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position; a position controller configured to control the position changing unit based on a period of time during which the contact position is maintained, and thereby to change the relative positional relationship between the optical system and the imaging element per a predetermined distance; a trimming unit configured to generate a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element each time the relative positional relationship between the optical system and the imaging element is changed; and a super-resolution processer configured to perform pixel interpolation using the plurality of pieces of trimming image data sequentially generated by the trimming unit, and thereby to generate super-resolution image data having a higher resolution than a resolution of each trimming image data.
In some embodiments, an imaging method executed by an imaging device is provided. The imaging device includes: an optical system configured to image an object; an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object; a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system; a display unit configured to display an image corresponding to the image data generated by the imaging element; and a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position. The method includes: controlling the position changing unit based on a period of time during which the contact position is maintained to change the relative positional relationship between the optical system and the imaging element per a predetermined distance; generating a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element each time the relative positional relationship between the optical system and the imaging element is changed; and performing pixel interpolation using the plurality of pieces of trimming image data to generate super-resolution image data having a higher resolution than a resolution of each trimming image data.
In some embodiments, a non-transitory computer-readable recording medium with an executable program stored thereon is provided. The program instructs an imaging device including: an optical system configured to image an object; an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object; a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system; a display unit configured to display an image corresponding to the image data generated by the imaging element; and a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position, to execute: controlling the position changing unit based on a period of time during which the contact position is maintained to change the relative positional relationship between the optical system and the imaging element per a predetermined distance; generating a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element each time the relative positional relationship between the optical system and the imaging element is changed; and performing pixel interpolation using the plurality of pieces of trimming image data to generate super-resolution image data having a higher resolution than a resolution of each trimming image data.
The above and other features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram illustrating a functional configuration of an imaging device according to one embodiment of the present invention;
FIG. 2 is a flowchart illustrating an outline of processing executed by the imaging device according to one embodiment of the present invention;
FIG. 3 is a view illustrating an exemplary live view image displayed on a display unit of the imaging device according to one embodiment of the present invention;
FIG. 4 is a view illustrating an exemplary image displayed on the display unit of the imaging device according to one embodiment of the present invention;
FIG. 5 is a view schematically illustrating a structure of an image file generated by the imaging device according to one embodiment of the present invention;
FIG. 6 is a view illustrating an exemplary super-resolution image displayed on the display unit of the imaging device according to one embodiment of the present invention;
FIG. 7 is a view illustrating an exemplary super-resolution image played back and displayed on the display unit of the imaging device according to one embodiment of the present invention;
FIG. 8 is a view illustrating another exemplary super-resolution image played back and displayed on the display unit of the imaging device according to one embodiment of the present invention; and
FIG. 9 is a view illustrating another exemplary super-resolution image played back and displayed on the display unit of the imaging device according to one embodiment of the present invention.
DETAILED DESCRIPTION
Hereinafter, modes for carrying out the invention (hereinafter, referred to as the “embodiment(s)”) will be described with reference to the drawings. The present invention is not to be limited by the embodiments below. Each of the drawings referenced in the description below only schematically illustrates a shape, a size, and a positional relationship to the extent that contents of the invention can be understood. That is, the present invention is not to be limited only to the shape, the size, and the positional relationship exemplified in each of the drawings.
Configuration of Imaging Device
FIG. 1 is a block diagram illustrating a functional configuration of the imaging device according to one embodiment of the present invention. An imaging device 1 illustrated in FIG. 1 includes an imaging unit 2, a position changing unit 3, a posture detection unit 4, a display unit 5, a touch panel 6, an operational unit 7, a recording unit 8, and a controller 9.
The imaging unit 2 includes an optical system 21 for forming an object image of an object, and an imaging element 22 for generating image data of the object image by imaging the object image that has been formed by the optical system 21. The optical system 21 has one or more lenses, and captures an image of the object. The imaging element 22 has an imaging sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS), and receives light of the object image of the object that has been formed by the optical system 21 and continuously generates the image data of the object.
The position changing unit 3 holds the imaging element 22. Under control of the controller 9, the position changing unit 3 relatively changes a positional relationship between the optical system 21 and the imaging element 22 by moving the imaging element 22 relative to a surface orthogonal to an optical axis O (in the horizontal direction and the vertical direction). The position changing unit 3 has a voice coil motor, a piezoelectric element, and the like. The position changing unit 3 functions as a hand shake prevention mechanism that prevents hand shake occurring in the imaging device 1.
The posture detection unit 4 detects a posture of the imaging device 1 and outputs a detected result to the controller 9. The posture detection unit 4 has a gyro sensor and the like.
Under the control of the controller 9, the display unit 5 displays an image corresponding to the image data generated by the imaging unit 2. The display unit 5 has a display panel including a liquid crystal, an organic electro luminescence (EL), or the like.
The touch panel 6 is provided so as to be overlapped on a display screen of the display unit 5, detects a contact position (touch position) where a body from outside has contacted, and receives input of a signal according to the detected position. The touch panel 6 periodically outputs a signal according to the contact position where the body from the outside has contacted to the controller 9. Any of a resistance film method, an electrostatic capacitance method, an optical method, and the like is applicable as a method for the touch panel 6.
The operational unit 7 receives input of various operations for the imaging device 1. The operational unit 7 has, for example, a power supply switch that turns a power supply of the imaging device 1 on and off, a zoom switch that changes a zoom magnification of the optical system 21, a mode change switch that changes a mode of the imaging device 1, a release switch that instructs the imaging device 1 to capture an image, and the like.
The recording unit 8 includes a program recording unit 81 that records various programs executed by the imaging device 1 and an image data recording unit 82 that records the image data generated by the imaging unit 2. The recording unit 8 has a flash memory, a synchronous dynamic random access memory (SDRAM), a memory card, and the like.
The controller 9 integrally controls each unit constituting the imaging device 1. The controller 9 has a central processing unit (CPU) and the like. The controller 9 includes a touch detection unit 91, a determination unit 92, a position controller 93, a trimming unit 94, a super-resolution processer 95, an imaging controller 96, and a display controller 97.
The touch detection unit 91 detects the touch position on the touch panel 6 based on the signal that has been input from the touch panel 6. For example, based on the position signal periodically input from the touch panel 6, the touch detection unit 91 detects the touch position on the touch panel 6 as well as detects a temporal change of the touch position.
Based on a posture signal indicating a posture of the imaging device 1 input from the posture detection unit 4, the determination unit 92 determines whether or not blurring occurs in the imaging device 1. Based on super-resolution image data generated by the super-resolution processer 95, the determination unit 92 further determines whether or not the number of pixels of a super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer 95 exceeds a resolution of the optical system 21. The determination unit 92 may determine that the number of pixels of the super-resolution image exceeds the resolution of the optical system 21 when data volume of the super-resolution image data generated by the super-resolution processer 95 exceeds a predetermined value.
The position controller 93 drives the position changing unit 3 to move the imaging element 22 based on a period of time during which the contact position on the touch panel 6 is maintained (for example, a period of time of touching or duration of maintaining the touch position). Specifically, while the contact position on the touch panel 6 is maintained, the position controller 93 drives the position changing unit 3 to change a relative positional relationship between the optical system 21 and the imaging element 22 by moving the imaging element 22 in steps of a predetermined very small amount, for example, 0.5 pixels, such that the number of pixels virtually increases.
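As an illustration of the per-0.5-pixel movement just described, a burst of four sensor positions offset by half a pixel doubles the effective sampling density in each direction. The four-position pattern and the function name below are assumptions for illustration, not the patent's actual control sequence.

```python
# Illustrative half-pixel shift pattern for one super-resolution burst.
# Offsets (dx, dy) are in pixel units; 0.5 is the sub-pixel step described
# in the text. The exact pattern used by the device is not specified here.

HALF_PIXEL = 0.5

def shift_sequence(step=HALF_PIXEL):
    """Sensor offsets (dx, dy) that double sampling density per axis."""
    return [(0.0, 0.0), (step, 0.0), (0.0, step), (step, step)]
```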
Based on the position signal input from the touch panel 6, the trimming unit 94 sequentially cuts out an area including the touch position from the image displayed on the display unit 5 to generate a plurality of pieces of trimming image data. Specifically, the trimming unit 94 generates the trimming image data by sequentially cutting out, from the image displayed on the display unit 5, the area including the touch position that is, for example, a rectangular area smaller than a display area displayed on the display unit 5 and including the area of the touch position.
The super-resolution processer 95 generates the super-resolution image data by using the plurality of pieces of trimming image data sequentially generated by the trimming unit 94. Specifically, the super-resolution processer 95 generates the super-resolution image data by performing pixel interpolation using, for example, the plurality of pieces of low resolution trimming image data. Here, a resolution of the super-resolution image corresponding to the super-resolution image data is higher than a resolution of the image corresponding to the image data generated by the imaging unit 2.
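The pixel interpolation above can be illustrated with a minimal shift-and-add sketch: low resolution frames captured at half-pixel offsets are slotted into a grid of twice the resolution. A practical super-resolution pipeline also performs registration and restoration; this sketch, including its data layout, is an assumption and not the patent's algorithm.

```python
# Minimal shift-and-add sketch: four equally sized low-resolution frames,
# keyed by their (dx, dy) offset in half-pixel units (each 0 or 1), are
# combined into a 2-D list with twice the resolution in each direction.

def shift_and_add(frames):
    """frames: dict {(dx, dy): 2-D list}. Returns a 2x-resolution 2-D list."""
    base = frames[(0, 0)]
    h, w = len(base), len(base[0])
    out = [[0.0] * (2 * w) for _ in range(2 * h)]
    for (dx, dy), frame in frames.items():
        for y in range(h):
            for x in range(w):
                out[2 * y + dy][2 * x + dx] = frame[y][x]
    return out
```

With four 1×1 frames, the result is a 2×2 image whose pixels come from the four shifted exposures.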
The imaging controller 96 controls imaging by the imaging unit 2. Specifically, when a release signal for instructing image capturing is input from the operational unit 7, the imaging controller 96 causes the imaging unit 2 to execute the image capturing.
The display controller 97 controls a display mode of the display unit 5. The display controller 97 causes various information related to the imaging device 1 to be displayed on the display unit 5. According to duration of the position signal output from the touch panel 6, the display controller 97 causes the super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer 95 to be enlarged and displayed on the display unit 5. Specifically, when the image displayed on the display unit 5 is touched and the super-resolution processer 95 generates the super-resolution image data, the display controller 97 enlarges and displays the super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer 95 on the display unit 5 according to the duration of the position signal output from the touch panel 6. For example, when the position signal output from the touch panel 6 stops or a predetermined time (for example, one second) has passed after the position signal stopped, the display controller 97 causes the super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer 95 to be enlarged and displayed on the display unit 5 in place of a live view image or a playback image displayed on the display unit 5.
Processing by Imaging Device
Next, reference will be made to processing executed by the imaging device 1. FIG. 2 is a flowchart illustrating an outline of the processing executed by the imaging device 1.
As illustrated in FIG. 2, first, reference will be made to a case in which the imaging device 1 is set to a shooting mode (step S101: Yes). In this case, the display controller 97 causes the live view image corresponding to the image data generated by the imaging unit 2 to be displayed on the display unit 5 (step S102). For example, as illustrated in FIG. 3, the display controller 97 causes a live view image LV1 corresponding to the image data generated by the imaging unit 2 to be displayed on the display unit 5.
Subsequently, the determination unit 92 starts determining whether or not blurring occurs in the imaging device 1 (step S103). Specifically, the determination unit 92 starts this determination based on the posture signal input from the posture detection unit 4. The determination unit 92 may also determine whether or not blurring occurs in the imaging device 1 based on two temporally adjacent pieces of image data generated by the imaging unit 2.
Then, based on a determination result by the determination unit 92, the display controller 97 causes the display unit 5 to display information on the possibility of super-resolution on the live view image LV1 (step S104). Specifically, if the determination unit 92 determines that the blurring does not occur in the imaging device 1, the display controller 97 displays an icon indicating that super-resolution is possible on the live view image LV1. On the other hand, if the determination unit 92 determines that the blurring occurs in the imaging device 1, the display controller 97 causes the display unit 5 to display information indicating that the super-resolution is not possible, for example, an icon or a message urging the user to use a tripod because the blurring occurring in the imaging device 1 is large.
Subsequently, when the touch panel 6 is touched (step S105: Yes), the determination unit 92 determines whether or not blurring occurs in the object (step S106). Specifically, based on two temporally consecutive pieces of image data generated by the imaging unit 2, the determination unit 92 determines whether or not the blurring occurs in the object. The determination unit 92 may also determine whether or not the object is a moving body based on the two temporally consecutive pieces of image data generated by the imaging unit 2 and, if the object is the moving body, the determination unit 92 may determine that the blurring occurs in the object. If the determination unit 92 determines that the blurring occurs in the object (step S106: Yes), the imaging device 1 proceeds to step S112 described below. In contrast, if the determination unit 92 determines that the blurring does not occur in the object (step S106: No), the imaging device 1 proceeds to step S107 described below.
In step S107, the imaging controller 96 starts continuous shooting by the imaging unit 2 (step S107). In this case, the position controller 93 moves the imaging element 22 in a horizontal direction and a vertical direction along a surface orthogonal to the optical axis of the optical system 21 so as to move by a predetermined number of pixels centering on the touch position, which corresponds to the signal input from the touch panel 6. Accordingly, only the blurring in the specific area needs to be taken care of, whereby a success rate is increased. For example, to interpolate between pixels in the super-resolution, each time the imaging unit 2 performs imaging, the position controller 93 moves the imaging element 22 such that the movement between the pixels is gradually reduced or in four or eight different directions (for example, from 0.5 pixels to 0.2 pixels, or horizontally, vertically, and to the lower right and the upper right, and the like). In addition, if the number of captured images is denoted by M and the pixel magnification is denoted by N, the imaging controller 96 determines the number of the images to be continuously shot by the imaging device 1 by the following formula (1).
M = (N^0.5 × 2)^2/2 = 2N  (1)
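As a numerical check of formula (1), the expression (N^0.5 × 2)^2/2 does simplify to 2N; the helper function below is illustrative only (its name is not from the patent).

```python
# Formula (1): number of continuously shot images M for pixel magnification N.
# (sqrt(N) * 2)^2 / 2 = 4N / 2 = 2N; round() guards against floating error.

def images_to_shoot(n):
    """M = (N^0.5 * 2)^2 / 2, which simplifies to 2N."""
    return round((n ** 0.5 * 2) ** 2 / 2)
```

For example, a pixel magnification of N = 2 calls for four images, and N = 4 calls for eight.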
Subsequently, the display controller 97 causes the display unit 5 to display information indicating that the imaging device 1 is performing super-resolution processing such that the information is superimposed on the live view image LV1 (step S108). Specifically, as illustrated in FIG. 4, the display controller 97 causes the display unit 5 to display an icon T1 indicating that the imaging device 1 is performing the super-resolution processing such that the icon is superimposed on the live view image LV1. In place of the icon T1, the display controller 97 may also cause the display unit 5 to display a character, a figure, and the like to indicate that the imaging device 1 is performing the super-resolution processing.
Subsequently, the imaging device 1 generates a super-resolution image of the area including the touch position corresponding to the signal input from the touch panel 6 (step S109). Specifically, from the image corresponding to the image data sequentially generated by the imaging unit 2, the trimming unit 94 sequentially cuts out the area including the touch position corresponding to the signal input from the touch panel 6 and generates a plurality of trimming images. At the same time, by using the plurality of trimming images sequentially generated by the trimming unit 94, the super-resolution processer 95 generates the super-resolution image data of the area including the touch position by interpolating pixels in an area with insufficient resolution using pixels of the trimming images. A part (area) on which the super-resolution processing has been performed by the super-resolution processer 95 may be worth viewing in enlarged display later on. Furthermore, if trimming processing is performed later on, the part (area) on which the super-resolution processing has been performed by the super-resolution processer 95 may hold the resolution in the same way as a normal image.
Subsequently, the determination unit 92 determines whether or not the number of pixels of the super-resolution image generated by the super-resolution processer 95 exceeds the resolution of the optical system 21 (step S110). If the determination unit 92 determines that the number of pixels of the super-resolution image generated by the super-resolution processer 95 exceeds the resolution of the optical system 21 (step S110: Yes), the imaging device 1 proceeds to step S111 described below. In contrast, if the determination unit 92 determines that the number of pixels of the super-resolution image generated by the super-resolution processer 95 does not exceed the resolution of the optical system 21 (step S110: No), the imaging device 1 proceeds to step S112.
In step S111, the determination unit 92 determines whether or not the continuous shooting by the imaging unit 2 has reached the predetermined number of images. Specifically, the determination unit 92 determines whether or not the number of captured images in the continuous shooting by the imaging unit 2 has reached the number of captured images calculated by the imaging controller 96 in step S107 described above. If the determination unit 92 determines that the continuous shooting by the imaging unit 2 has reached the predetermined number of images (step S111: Yes), the imaging device 1 proceeds to step S112 described below. In contrast, if the determination unit 92 determines that the continuous shooting by the imaging unit 2 has not reached the predetermined number of images (step S111: No), the imaging device 1 returns to step S105 described above.
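The two checks in steps S110 and S111 together form the loop's stop condition: stop once the super-resolution result can no longer gain detail beyond what the optics resolve, or once the planned shot count is reached. A minimal sketch, with hypothetical parameter names (the patent expresses the limit as the resolution of the optical system 21 and the count calculated in step S107):

```python
def should_stop(sr_pixels, optics_limit_pixels, shots_taken, planned_shots):
    """Stop condition for the continuous-shooting loop (steps S110-S111).

    sr_pixels: pixel count of the super-resolution image so far.
    optics_limit_pixels: pixel count beyond which the optical system
    resolves no further detail, so more frames would be useless.
    """
    return sr_pixels > optics_limit_pixels or shots_taken >= planned_shots
```

This is also the rationale stated later in the description: ending generation when the pixel count exceeds the optical resolution prevents useless image capturing.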
In step S105, if the touch panel 6 is not touched (step S105: No), the imaging device 1 proceeds to step S112 described below. The condition that the touch panel 6 is not touched also includes a case where the finger has been released from the panel.
Subsequently, if the super-resolution image has already been generated by the super-resolution processer 95 (step S112: Yes), the imaging controller 96 ends the continuous shooting by the imaging unit 2 (step S113). In this case, the display controller 97 may display, on the display unit 5, a character, a figure, and the like to indicate that generation of the super-resolution image has been ended or to indicate that the number of pixels of the super-resolution image exceeds the resolution of the optical system 21.
Subsequently, the imaging controller 96 generates an image file that associates an existing header indicating various information, area information including the touch position on the touch panel 6 detected by the touch detection unit 91, a normal image (non-super-resolution image) corresponding to the image data generated by the imaging unit 2, a plurality of pieces of trimming image data of a partial super-resolution image generated by the trimming unit 94, and the super-resolution image data generated by the super-resolution processer 95, and records the image file in the recording unit 8 (step S114). Specifically, as illustrated in FIG. 5, the imaging controller 96 generates an image file D1 that associates an existing header h1 indicating various information, area information h2 including the touch position on the touch panel 6 detected by the touch detection unit 91, a normal image (non-super-resolution image) h3 corresponding to the image data generated by the imaging unit 2, a plurality of pieces of trimming image data h4 of the partial super-resolution image generated by the trimming unit 94, and super-resolution image data h5 generated by the super-resolution processer 95, and records the image file D1 in the recording unit 8.
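The patent specifies which five parts (h1 through h5 in FIG. 5) the image file associates, but not a byte layout. As an illustrative assumption only, the sketch below packs them as a length-prefixed JSON index followed by the image blobs; the function names `pack_image_file`/`unpack_index` and the container format are hypothetical, not the recorded file format of the device.

```python
import json
import struct

def pack_image_file(header, touch_area, normal, trims, sr):
    """Pack the five associated parts (header h1, area info h2, normal
    image h3, trimming images h4, super-resolution image h5) into one
    blob: a big-endian length-prefixed JSON index, then the image bytes."""
    blobs = [normal, *trims, sr]
    index = {
        "header": header,            # h1: various information
        "area": touch_area,          # h2: touch-position area info
        "trim_count": len(trims),    # how many h4 entries follow h3
        "sizes": [len(b) for b in blobs],
    }
    meta = json.dumps(index).encode("utf-8")
    return struct.pack(">I", len(meta)) + meta + b"".join(blobs)

def unpack_index(data):
    """Read back only the JSON index from a packed file."""
    (n,) = struct.unpack(">I", data[:4])
    return json.loads(data[4:4 + n].decode("utf-8"))
```

Keeping the area information in the index is what later lets playback (steps S128-S129) draw the super-resolution mark F1 without decoding any image data.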
Then, the display controller 97 enlarges and displays the super-resolution image generated by the super-resolution processer 95 on a full-screen area of the display unit 5 (step S115). Specifically, as illustrated in FIG. 6, the display controller 97 enlarges and displays a super-resolution image T2 generated by the super-resolution processer 95 on the full-screen area of the display unit 5. Accordingly, a user can check the super-resolution image T2. While checking, the user can also select a desired resolution depending on the length of time of touching (i.e., the duration for which the touch position is maintained). The longer the touch is maintained, the higher the resolution of the image becomes, making it more suitable for enlarged display. Thus, the display controller 97 may enlarge and display the super-resolution image corresponding to the super-resolution image data each time the super-resolution image data is generated in the super-resolution processing by the super-resolution processer 95. Accordingly, the user can stop the touch operation while checking an effect of the super-resolution processing by the super-resolution processer 95. In this embodiment, it is not necessary for the imaging device 1 to stop the continuous shooting by the imaging unit 2 at an end of the touch (stop of the position signal from the touch panel 6), and the continuous shooting may be continued as necessary even after the end of the touch. While the touch position is maintained, the trimming unit 94 may gradually enlarge a trimming area to be cut out from the image corresponding to the image data generated by the imaging element 22.
Then, when a predetermined time (for example, three seconds) has passed since the image was displayed on the display unit 5 (step S116: Yes), the imaging device 1 proceeds to step S117 described below. In contrast, if the predetermined time has not passed since the image was displayed on the display unit 5 (step S116: No), the imaging device 1 continues this determination.
Subsequently, when the imaging device 1 is turned off through the operational unit 7 (step S117: Yes), the imaging device 1 ends this processing. In contrast, if the imaging device 1 is not turned off through the operational unit 7 (step S117: No), the imaging device 1 returns to step S101 described above.
If the super-resolution image has not been generated by the super-resolution processer 95 in step S112 (step S112: No), the imaging device 1 proceeds to step S118 described below.
Subsequently, if the touch panel 6 has been touched (step S118: Yes), the display controller 97 displays a first image on the display unit 5 (step S119). Specifically, the display controller 97 displays an image corresponding to the image data generated by the imaging unit 2 as the first image on the display unit 5. After step S119, the imaging device 1 proceeds to step S116.
In step S118, if the touch panel 6 has not been touched (step S118: No), the imaging device 1 proceeds to step S120.
Subsequently, if there is a release signal for instructing the image capturing from the operational unit 7 (step S120: Yes), the imaging controller 96 causes the imaging unit 2 to execute the capturing of the image (step S121).
Then, the display controller 97 causes the display unit 5 to display the image corresponding to the image data generated by the imaging unit 2 for confirmation (step S122). After step S122, the imaging device 1 proceeds to step S116.
In step S120, if there is no release signal for instructing the image capturing from the operational unit 7 (step S120: No), the imaging device 1 proceeds to step S117.
Next, reference will be made to the condition that the imaging device 1 is not in the shooting mode (step S101: No) and the imaging device 1 is in a playback mode (step S123: Yes). In this case, the display controller 97 causes the display unit 5 to display a plurality of thumbnail images corresponding to a plurality of pieces of image data recorded in the image data recording unit 82 (step S124).
Subsequently, when an image is selected from the plurality of thumbnail images displayed on the display unit 5 through the operational unit 7 or the touch panel 6 (step S125: Yes), the imaging device 1 proceeds to step S126 described below. In contrast, if the image is not selected from the plurality of thumbnail images displayed on the display unit 5 through the operational unit 7 or the touch panel 6 (step S125: No), the imaging device 1 returns to step S124.
In step S126, if the image selected through the operational unit 7 or the touch panel 6 is a super-resolution captured image (step S126: Yes), the imaging device 1 proceeds to step S128 described below. In contrast, if the image selected through the operational unit 7 or the touch panel 6 is not the super-resolution captured image (step S126: No), the imaging device 1 proceeds to step S127 described below.
In step S127, the display controller 97 performs normal display in which the image selected through the operational unit 7 or the touch panel 6 is enlarged and displayed in the display area of the display unit 5. After step S127, the imaging device 1 proceeds to step S134 described below.
In step S128, the display controller 97 causes the display unit 5 to display one base image of the super-resolution image. Specifically, as illustrated in FIG. 7, the display controller 97 causes the display unit 5 to display, as the one base image, an image W1 corresponding to normal image data included in an image file of the super-resolution image.
Subsequently, the display controller 97 causes the display unit 5 to display a super-resolution image mark indicating that the image displayed on the display unit 5 is the super-resolution image such that the mark is superimposed on the image (step S129). Specifically, based on the area information included in the image file of the super-resolution image, the display controller 97 causes the display unit 5 to display a mark F1 corresponding to an area of the super-resolution image so as to be superimposed on the image W1. Accordingly, the user can intuitively grasp the area of the super-resolution image. In place of the mark F1, the display controller 97 may also cause the display unit 5 to display a character, a figure, and the like to indicate the super-resolution image.
Then, when image enlargement for enlarging a part of the image displayed on the display unit 5 is made through the operational unit 7 (step S130: Yes) and if a super-resolution area exists within an angle of view (area) after the enlargement (step S131: Yes), the super-resolution processer 95 makes the enlargement by pixel interpolation using a plurality of super-resolution images stored in the image file (step S132).
Subsequently, the display controller 97 causes the display unit 5 to display an enlarged image obtained by the pixel interpolation by the super-resolution processer 95 (step S133).
Then, when an operation to end the playback is performed through the operational unit 7 (step S134: Yes), the imaging device 1 proceeds to step S117. In contrast, if the operation to end the playback is not performed through the operational unit 7 (step S134: No), the imaging device 1 returns to step S124 described above.
In step S130, if the image enlargement for enlarging a part of the image displayed on the display unit 5 through the operational unit 7 is not made (step S130: No), the imaging device 1 proceeds to step S134.
In step S131, when no super-resolution area exists within the angle of view (area) after the enlargement (step S131: No), the super-resolution processer 95 makes the enlargement by pixel interpolation using a pixel in the angle of view (step S135). After step S135, the imaging device 1 proceeds to step S133.
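The playback branch in steps S131, S132, and S135 reduces to one decision: if the requested enlargement area overlaps the stored super-resolution area, enlarge from the super-resolution pixels; otherwise interpolate from the base image. The sketch below is a simplified assumption: rectangles as `(x0, y0, x1, y1)` tuples, nearest-neighbour repetition standing in for the pixel interpolation, and both function names are hypothetical.

```python
import numpy as np

def rects_overlap(a, b):
    """Axis-aligned overlap test for (x0, y0, x1, y1) rectangles."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def enlarge(view, sr_area, sr_pixels, base_pixels, scale=2):
    """Steps S131/S132/S135: pick the pixel source for an enlarged view.

    view: rectangle of the angle of view after enlargement.
    sr_area: rectangle of the super-resolution area from the image file.
    When they overlap, enlarge from the stored super-resolution pixels;
    otherwise fall back to interpolating the base-image pixels.
    """
    src = sr_pixels if rects_overlap(view, sr_area) else base_pixels
    # nearest-neighbour repetition as a stand-in for pixel interpolation
    return np.repeat(np.repeat(src, scale, axis=0), scale, axis=1)
```

Because the decision is made per enlargement request, the same image file serves both ordinary zooming and the higher-detail zoom inside the marked super-resolution region.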
In step S123, if the imaging device 1 is not set to the shooting mode (step S123: No), the imaging device 1 proceeds to step S117.
According to the embodiment described above, while the touch position is maintained, the position controller 93 moves the imaging element 22, the trimming unit 94 generates the trimming image data by sequentially cutting out the area including the touch position from the image data sequentially generated by the imaging element 22, and the super-resolution processer 95 generates the super-resolution image data using the plurality of pieces of trimming image data sequentially generated by the trimming unit 94, whereby it is possible to obtain the high resolution image reflecting the intention of the user in real time.
Furthermore, according to the embodiment, while the touch position is maintained, the trimming unit 94 generates the trimming image data by sequentially cutting out only the area including the touch position from the image data sequentially generated by the imaging element 22; therefore, the amount of calculation can be reduced compared with generating super-resolution image data for the full area, whereby it is possible to obtain the high resolution image reflecting the intention of the user in real time.
According to the embodiment, if the determination unit 92 determines that the number of pixels of the super-resolution image exceeds the resolution of the optical system 21, the imaging controller 96 ends generation of the super-resolution image data by the super-resolution processer 95, whereby it is possible to prevent useless image capturing.
In this embodiment, if the image displayed on the display unit 5 is the super-resolution image, the display controller 97 causes the display unit 5 to display the super-resolution image mark so as to be superimposed on the image; however, for example, as illustrated in FIG. 8, it is also possible to display the super-resolution image T2 on the display unit 5 so as to be superimposed on the image W1. Furthermore, as illustrated in FIG. 9, it is also possible to display the image W1 and the super-resolution image T2 separately on the display unit 5 at the same time.
In addition to a digital still camera, the imaging device according to the embodiment may also be applied to a digital video camera, an electronic device such as a cellular phone or a tablet type mobile device with an imaging function, a display device that displays an image corresponding to image data captured by an endoscope or a microscope in the medical and industrial fields, and the like. Since it is possible to enlarge any part of the image that the user is concerned about for confirmation, it may be a useful interface especially for an inspection device and the like. It is also possible to increase sensitivity by downsizing the device and by virtually increasing the area of a pixel without uselessly increasing the number of pixels.
The program to be executed by the imaging device is provided as file data in an installable format or an executable format and is recorded in a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, a digital versatile disk (DVD), a USB medium, a flash memory, and the like.
The program to be executed by the imaging device may be stored in a computer connected to a network such as the Internet and may be provided by downloading through the network. The program to be executed by the imaging device may be provided or distributed through the network such as the Internet.
Note that in the description of the flowchart herein, expressions such as “first”, “then”, and “subsequently” have been used to specify the order of processing between the steps; however, the order of the processing required for carrying out the present invention is not to be uniquely determined by these expressions. That is, the order of the processing in the flowchart described herein may be changed as long as there is no contradiction.
According to some embodiments, it is possible to obtain a high resolution image reflecting the intention of the user in real time.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.

Claims (6)

What is claimed is:
1. An imaging device comprising:
an optical system configured to image an object;
an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object;
a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system;
a display unit configured to display an image corresponding to the image data generated by the imaging element;
a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position;
a trimming unit configured to generate a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element; and
a super-resolution processer configured to generate super-resolution image data having a higher resolution than a resolution of each trimming image data by using the plurality of pieces of trimming image data sequentially generated by the trimming unit.
2. The imaging device according to claim 1, further comprising a display controller configured to cause the display unit to display a mark corresponding to an area of a super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer so as to be superimposed on the image corresponding to the image data generated by the imaging element.
3. The imaging device according to claim 1, further comprising a display controller configured to cause the display unit to enlarge and display a super-resolution image corresponding to the super-resolution image data generated by the super-resolution processer.
4. The imaging device according to claim 1, further comprising an imaging controller configured to generate an image file including:
area information including the touch position;
the image corresponding to the image data generated by the imaging element;
the plurality of pieces of trimming image data generated by the trimming unit; and
the super-resolution image data generated by the super-resolution processer.
5. An imaging method executed by an imaging device, the imaging device comprising: an optical system configured to image an object; an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object; a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system; a display unit configured to display an image corresponding to the image data generated by the imaging element; and a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position,
the method comprising:
generating a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element; and
generating super-resolution image data having a higher resolution than a resolution of each trimming image data by using the plurality of pieces of trimming image data.
6. A non-transitory computer-readable recording medium with an executable program stored thereon, the program instructing an imaging device comprising: an optical system configured to image an object; an imaging element configured to receive light of an object image of the object formed by the optical system and to continuously generate image data of the object; a position changing unit configured to change a relative positional relationship between the optical system and the imaging element relative to a surface orthogonal to an optical axis of the optical system; a display unit configured to display an image corresponding to the image data generated by the imaging element; and a touch panel that is provided so as to overlap on a display area of the display unit and is configured to detect a contact position where an external body touches the touch panel and to periodically output a position signal according to the contact position, to execute:
generating a plurality of pieces of trimming image data by sequentially cutting out an area including a touch position corresponding to the position signal from the image corresponding to the image data sequentially generated by the imaging element; and
generating super-resolution image data having a higher resolution than a resolution of each trimming image data by using the plurality of pieces of trimming image data.
US15/496,312 2014-12-10 2017-04-25 Imaging device, imaging method, and computer-readable recording medium Active US9973690B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/496,312 US9973690B2 (en) 2014-12-10 2017-04-25 Imaging device, imaging method, and computer-readable recording medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2014250053A JP2016111652A (en) 2014-12-10 2014-12-10 Imaging apparatus, imaging method and program
JP2014-250053 2014-12-10
US14/959,947 US9681044B2 (en) 2014-12-10 2015-12-04 Imaging device, imaging method, and computer-readable recording medium to show super-resolution image on live view image
US15/496,312 US9973690B2 (en) 2014-12-10 2017-04-25 Imaging device, imaging method, and computer-readable recording medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/959,947 Continuation US9681044B2 (en) 2014-12-10 2015-12-04 Imaging device, imaging method, and computer-readable recording medium to show super-resolution image on live view image

Publications (2)

Publication Number Publication Date
US20170230572A1 US20170230572A1 (en) 2017-08-10
US9973690B2 US9973690B2 (en) 2018-05-15

Family

ID=56111636

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/959,947 Active US9681044B2 (en) 2014-12-10 2015-12-04 Imaging device, imaging method, and computer-readable recording medium to show super-resolution image on live view image
US15/496,312 Active US9973690B2 (en) 2014-12-10 2017-04-25 Imaging device, imaging method, and computer-readable recording medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/959,947 Active US9681044B2 (en) 2014-12-10 2015-12-04 Imaging device, imaging method, and computer-readable recording medium to show super-resolution image on live view image

Country Status (2)

Country Link
US (2) US9681044B2 (en)
JP (1) JP2016111652A (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9808710B2 (en) * 2015-03-31 2017-11-07 King.Com Ltd. User interface
CN108495029B (en) * 2018-03-15 2020-03-31 维沃移动通信有限公司 Photographing method and mobile terminal
CN111475097B (en) * 2020-04-07 2021-08-06 广州视源电子科技股份有限公司 Handwriting selection method and device, computer equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090189900A1 (en) * 2006-10-02 2009-07-30 Eiji Furukawa Image processing apparatus, image processing program, image production method, and recording medium
US20090201316A1 (en) * 2008-02-11 2009-08-13 Nikhil Bhatt Image Application Performance Optimization
US20100214449A1 (en) * 2009-02-17 2010-08-26 Olympus Corporation Imaging system, image processing method, and image processing program recording medium
US20110273471A1 (en) * 2009-01-19 2011-11-10 Sony Corporation Display control device, display control method and program
US20120262490A1 (en) * 2009-10-01 2012-10-18 Scalado Ab Method Relating To Digital Images
US8319883B2 (en) * 2009-05-15 2012-11-27 Canon Kabushiki Kaisha Image pickup apparatus and control method thereof
US20130093922A1 (en) * 2005-06-21 2013-04-18 Mediapod Llc System and apparatus for increasing quality and efficiency of film capture and methods of use thereof
US20130155308A1 (en) * 2011-12-20 2013-06-20 Qualcomm Incorporated Method and apparatus to enhance details in an image
US8711265B2 (en) * 2008-04-24 2014-04-29 Canon Kabushiki Kaisha Image processing apparatus, control method for the same, and storage medium
US9019413B2 (en) * 2011-08-02 2015-04-28 Stmicroelectronics (Grenoble 2) Sas Method of image preview in a digital image pickup apparatus

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3568974B2 (en) * 1993-03-31 2004-09-22 株式会社東芝 Imaging device
JPH09284626A (en) * 1996-04-19 1997-10-31 Canon Inc Image forming device
JP4759488B2 (en) * 2006-10-17 2011-08-31 キヤノン株式会社 Imaging device
JP2012242913A (en) * 2011-05-16 2012-12-10 Nikon Corp Electronic apparatus
JP6102602B2 (en) * 2013-07-23 2017-03-29 ソニー株式会社 Image processing apparatus, image processing method, image processing program, and imaging apparatus


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180253221A1 (en) * 2017-03-02 2018-09-06 Samsung Electronics Co., Ltd. Display device and user interface displaying method thereof
US11231785B2 (en) * 2017-03-02 2022-01-25 Samsung Electronics Co., Ltd. Display device and user interface displaying method thereof

Also Published As

Publication number Publication date
US9681044B2 (en) 2017-06-13
JP2016111652A (en) 2016-06-20
US20170230572A1 (en) 2017-08-10
US20160171655A1 (en) 2016-06-16


Legal Events

Date Code Title Description
AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HOMMA, SHINSUKE;REEL/FRAME:044344/0313

Effective date: 20151125

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: CHANGE OF ASSIGNEE ADDRESS;ASSIGNOR:OLYMPUS CORPORATION;REEL/FRAME:044778/0151

Effective date: 20160401

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: OM DIGITAL SOLUTIONS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OLYMPUS CORPORATION;REEL/FRAME:058294/0274

Effective date: 20210730