US8314844B2 - Image pickup apparatus, method and computer-readable storage medium for processing an image based on user manipulation on a display surface - Google Patents

Image pickup apparatus, method and computer-readable storage medium for processing an image based on user manipulation on a display surface Download PDF

Info

Publication number
US8314844B2
Authority
US
United States
Prior art keywords
image
manipulation
foreign substance
grit
image pickup
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US12/466,069
Other languages
English (en)
Other versions
US20090295944A1 (en)
Inventor
Kazuya Tashiro
Nobuyuki Mori
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MORI, NOBUYUKI, Tashiro, Kazuya
Publication of US20090295944A1 publication Critical patent/US20090295944A1/en
Application granted granted Critical
Publication of US8314844B2 publication Critical patent/US8314844B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/02Viewfinders
    • G03B13/04Viewfinders of direct vision type, e.g. frame, sighting mark
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H04N23/811Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation by dust removal, e.g. from surfaces of the image sensor or processing of the image signal output by the electronic image sensor

Definitions

  • the embodiments of the present invention relate to an image pickup apparatus including an image pickup device for capturing an image of a subject.
  • the floating grit and dust are negatively charged by a charging device disposed within a mirror box, and a positively charged dust correcting plate attracts the negatively charged grit and dust by coulombic force, thereby suppressing adhesion of the grit and dust to an optical component or part.
  • an air flow is generated in front of an image pickup element by a piezoelectric pump, thereby suppressing adhesion of the grit and dust to an optical component or part.
  • a white image is photographed in advance to create a dust profile, and an image of grit and dust appearing on a photographed image is removed by executing image processing based on the dust profile.
  • the present invention has been made in light of the problems described above, and it is therefore desirable to provide an image pickup apparatus in which an image of grit and dust appearing on a photographed image of a subject can be easily removed.
  • an image pickup apparatus including: an image pickup portion configured to capture an image; a detecting portion configured to detect a manipulation, made on a display surface, for the image displayed on a display portion; and a grit and dust removing portion configured to remove grit and dust whose image appears on the captured image by executing image processing based on the position of the manipulation detected by the detecting portion.
  • a computer-readable recording medium having a program recorded therein, the program being adapted to instruct a computer to execute the steps of: capturing an image; detecting a manipulation, made on a display surface, for the image displayed on a display portion; and removing grit and dust whose image appears on the image by executing image processing based on the position of the detected manipulation.
  • an image processing method including the steps of: capturing an image; detecting a manipulation, made on a display surface, for the image displayed on a display portion; and removing grit and dust whose image appears on the image by executing image processing based on the position of the detected manipulation.
  • the position of the image of the grit and dust appearing on the captured image is specified based on the position of the manipulation detected by the detecting portion, and the grit and dust whose image appears on the photographed image is removed by executing image processing based on the specified position of the grit and dust.
  • FIG. 1 is a perspective view showing a construction of a main portion of an image pickup apparatus according to an embodiment of the present invention
  • FIG. 2 is a block diagram showing a functional configuration of the image pickup apparatus shown in FIG. 1 ;
  • FIG. 3 is a view explaining grit and dust removal correction;
  • FIGS. 4A and 4B are views explaining grit and dust removal correction;
  • FIG. 5 is a flow chart explaining a basic operation of the image pickup apparatus shown in FIG. 1 ;
  • FIG. 6 is a view explaining an operation in the image pickup apparatus shown in FIG. 1 ;
  • FIG. 7 is a view explaining an operation in the image pickup apparatus shown in FIG. 1 ;
  • FIG. 8 is a view explaining an operation in the image pickup apparatus shown in FIG. 1 ;
  • FIG. 9 is a view explaining an operation in the image pickup apparatus shown in FIG. 1 ;
  • FIG. 10 is a view explaining an operation in the image pickup apparatus shown in FIG. 1 ;
  • FIG. 11 is a view explaining an operation in the image pickup apparatus shown in FIG. 1 ;
  • FIG. 12 is a view explaining a procedure for erasing a correction parameter from a flash memory
  • FIG. 13 is a view explaining a procedure for erasing the correction parameter from the flash memory
  • FIG. 14 is a flow chart explaining grit and dust removal correcting processing.
  • FIG. 15 is a view explaining setting of a correction area according to a modification of the embodiment of the present invention.
  • FIG. 1 is a perspective view showing a construction of a main portion of an image pickup apparatus 1 according to an embodiment of the present invention.
  • This image pickup apparatus 1 is constructed in the form of a single-lens reflex type digital camera with interchangeable lenses.
  • the image pickup apparatus 1 includes a camera main body portion (camera body) 2 , and an interchangeable lens (interchangeable photographing lens unit) 3 is detachably attached to the camera main body portion 2 .
  • by interchanging the interchangeable lens 3 with a different one, photographing can be carried out with a lens having the focal length and brightness the user desires.
  • grit and dust, such as dust floating in the atmosphere outside the camera main body portion 2, may invade the inside of the camera main body portion 2.
  • grit and dust that has invaded the inside of the camera main body portion 2 may adhere to the image pickup element 41 (refer to FIG. 2 ), with the result that an image of the grit and dust appears on a photographed image.
  • the interchangeable lens 3 is mainly composed of a lens barrel, and a lens group, a stop, and the like which are provided inside the lens barrel.
  • the lens group functioning as a photographing optical system includes a focus lens which is moved in the optical axis direction to change the position of the focal point, and the like.
  • the camera main body portion 2 includes a ring-shaped mount portion, to which the interchangeable lens 3 is attached, approximately at the center of its front side.
  • the camera main body portion 2 includes a mode setting dial 82 in a top right portion of a back surface thereof, and also includes a control value setting dial 86 in a top left portion of a front surface thereof.
  • by manipulating the mode setting dial 82, an operation for setting the various kinds of operation modes provided in the image pickup apparatus 1 can be carried out.
  • the various kinds of operation modes include various kinds of photographing modes (such as a person photographing mode, a scene photographing mode, and a full-automatic photographing mode), a reproduction mode for reproducing a photographed image, a communication mode for carrying out data communication with an external apparatus, and the like.
  • by manipulating the control value setting dial (hereinafter also referred to simply as "the setting dial") 86, control values can be set in the various kinds of photographing modes.
  • in addition, an enlargement factor of an image displayed on a display screen 12 f of a back surface display portion 12 can be changed in a grit and dust removal correcting mode which will be described later.
  • the camera main body portion 2 includes a grip portion 14 which the user is adapted to grip in a left-hand end portion of the front side.
  • a release button 11 with which exposure start is instructed is provided in an upper surface of the grip portion 14 .
  • a battery accommodating room and a card accommodating room are both provided inside the grip portion 14 .
  • a battery as a camera power source is accommodated in the battery accommodating room, and a memory card (refer to FIG. 2 ) 90 for recording therein image data on a photographed image is detachably accommodated in the card accommodating room.
  • the release button 11 is a two-stage detecting button with which two states of a semi-depression state (state S 1 ) and a full-depression state (state S 2 ) can be detected.
  • when the release button 11 is semi-depressed into the state S 1 , a preparing operation such as an AF (Auto Focus) control operation and an AE (Auto Exposure) control operation is carried out.
  • when the release button 11 is fully depressed into the state S 2 , a photographing operation for the regularly photographed image is carried out.
  • the photographing operation for the regularly photographed image corresponds to a series of operations in which an exposure operation relating to a subject image (an optical image of the subject) is carried out by using an image pickup element 41 which will be described later, and predetermined image processing is executed for the image signal obtained through the exposure operation.
  • the camera main body portion 2 includes a finder window (eye piece window) 10 approximately in a central upper portion of the back surface thereof.
  • by looking through the finder window 10, the user can visually recognize the optical image of the subject guided through the interchangeable lens 3 and thereby determine a composition.
  • the image pickup apparatus 1 can be fitted with an eye piece 91 for preventing an eye of the user or the like from directly contacting the finder window 10.
  • the camera main body portion 2 includes a built-in flash 40 in an upper portion thereof.
  • the user pops up the built-in flash 40 to cause it to emit flash light, thereby making suitable photographing possible.
  • the back surface display portion 12 having the display screen 12 f is provided approximately in a center of the back surface of the camera main body 2 .
  • a photographed image can be displayed on the back surface display portion 12 .
  • a menu picture for setting photographing conditions or the like can be displayed on the display screen 12 f of the back surface display portion 12, and the photographed image whose data is recorded in the memory card 90 can be reproduced and displayed on the display screen 12 f of the back surface display portion 12 in a reproduction mode.
  • a menu M 1 (refer to FIG. 6 ), used to select operation modes such as the grit and dust removal correcting mode described later, can also be displayed on the display screen 12 f of the back surface display portion 12.
  • a main switch 81 is provided in a top left portion of the back surface display portion 12 .
  • the main switch 81 is composed of a two-point slide switch. When a contact point is set in a left-hand side “OFF” position, a power source is turned OFF, while when the contact point is set in a right-hand side “ON” position, the power source is turned ON.
  • a direction selecting key 84 is provided on a right-hand side of the back surface display portion 12 .
  • the direction selecting key 84 has a circular manipulation button. Depression manipulations in the four directions of left, right, top, and bottom of the manipulation button, and depression manipulations in the four directions of top right, top left, bottom right, and bottom left, can be individually detected. It is noted that with the direction selecting key 84, a depression manipulation of a push button in the central portion can also be detected in addition to the depression manipulations for the eight directions described above.
  • Buttons 83 with which the menu display, the deletion of the image, and the like are carried out are provided in a top right portion of the back surface display portion 12 .
  • FIG. 2 is a block diagram showing a functional configuration of the image pickup apparatus 1 shown in FIG. 1 .
  • the same portions, etc. as those in FIG. 1 are designated with the same reference numerals or symbols, respectively.
  • the image pickup apparatus 1 includes a camera control portion 100 , an image processing engine 5 , and a flash memory 50 .
  • the camera control portion 100 takes charge of the control for the camera mechanism.
  • the image processing engine 5 executes image processing for an image signal generated in an image pickup portion 4 .
  • the flash memory 50 can be accessed by each of the camera control portion 100 and the image processing engine 5 , and functions as a non-volatile memory.
  • the camera control portion 100 has a Central Processing Unit (CPU) functioning as a microcomputer, and carries out control and the like in the phase of the photographing operation of the image pickup apparatus 1.
  • the various kinds of operations of the image pickup apparatus 1 are realized in response to an input manipulation made by the user by using a manipulating portion 80 composed of the various kinds of buttons including the release button 11 (refer to FIG. 1 ), a switch and the like.
  • the image pickup portion 4 includes an image pickup element 41 , and an A/D conversion portion 42 for converting an analog signal outputted from the image pickup element 41 into a digital signal. Also, the image pickup element 41 functions as an image pickup section for capturing an image of a subject.
  • the image pickup element 41 is disposed on the optical axis of the lens group included in the interchangeable lens 3, in a direction perpendicular to that optical axis, when the interchangeable lens 3 is attached to the camera main body portion 2.
  • a CMOS (complementary metal-oxide semiconductor) color area sensor (CMOS type image sensor) having a Bayer arrangement is used as the image pickup element 41.
  • in the CMOS color area sensor, a plurality of pixels, for example each having a photodiode, are two-dimensionally disposed in a matrix, and color filters of, for example, Red (R), Green (G) and Blue (B), which differ from one another in spectral characteristics, are disposed at a ratio of 1:2:1 on the light receiving surfaces of the pixels.
  • such an image pickup element 41 generates analog electrical signals (image signals) of the color components Red (R), Green (G) and Blue (B) for the optical image of the subject formed through the interchangeable lens 3, and outputs the resulting image signals of R, G and B.
  • the image processing engine 5 is configured, for example, in the form of an image processing circuit including a CPU functioning as a microcomputer, and a memory.
  • the image processing engine 5 executes various kinds of image processing, such as peripheral light falloff correction (shading correction), black level correction, and white balance correction, for the image data outputted from the image pickup portion 4.
  • the image processing engine 5 includes a grit and dust removal correcting portion 51 for removing an image of grit and dust which appears on the photographed image by executing image processing (grit and dust removal correction); a detailed description thereof will be given later.
  • the image data for which the image processing is executed in the image processing engine 5 is recorded in the memory card 90 as a recording medium, and an image corresponding to the image data is displayed on the display screen 12 f of the back surface display portion 12.
  • Confirmation display (after-view) for confirming the photographed image is realized by the image display.
  • the back surface display portion 12 includes a liquid crystal monitor 121 configured, for example, in the form of a color Liquid Crystal Display (LCD). Also, a transparent touch panel 122 which can detect the position where the user touches is disposed so as to cover the entire display screen 12 f of the liquid crystal monitor 121. With the back surface display portion 12 configured in this manner, the user can visually recognize the contents of the image or the like displayed on the liquid crystal monitor 121 through the touch panel 122.
  • information on the coordinates of the position which the user touches on the touch panel 122 provided on the display screen 12 f can be acquired through a touch panel controller 13.
  • the position of the image of the grit and dust in the photographed image can be specified based on the touch position detected by the touch panel 122 while the photographed image is displayed on the display screen 12 f of the back surface display portion 12, and suitable grit and dust removal can be carried out by the grit and dust removal correcting portion 51 in accordance with the grit and dust removal correction based on the position of the grit and dust.
  • a procedure of the grit and dust removal will be described hereinafter.
  • the photographed image obtained by the image pickup element 41 through the regular photographing is displayed on the display screen 12 f of the back surface display portion 12.
  • a portion in which the image of the grit and dust appears on the photographed image displayed on the display screen 12 f of the back surface display portion 12 is specified by the user manipulation made on the touch panel 122.
  • next, an image portion for which the grit and dust removal correction is to be carried out (a correction area Hs which will be described later (refer to FIG. 10 )) is set.
  • then, a photographed image in which the image of the grit and dust is suitably removed is obtained by the grit and dust removal correction, using a correction level which increases depending on the time period for which the user touches the touch panel 122.
  • the grit and dust removal correction using the correction level will be described in detail hereinafter.
  • the luminance value of the one pixel specified by the user as the position of the image of the grit and dust is used as a reference value R, and a pixel having a luminance value falling within the threshold range of (the reference value R ± the correction level α) is regarded as an abnormal pixel, and thus becomes a target of the grit and dust removal correction.
  • in other words, a pixel group belonging to the luminance range (R ± α) set depending on the time period of the touch on the touch panel 122 is detected as the pixel group composing the image of the grit and dust in the photographed image (hereinafter referred to as "a grit and dust composing pixel group").
  • the information on the specified position Pt of the image Do of the grit and dust, the correction area, and the correction level which are set by the user manipulation made on the touch panel 122 is stored in the flash memory 50 so that it can be utilized as correction parameters in the grit and dust removal correction for photographed images obtained in follow-on regular photographing.
  • horizontal scanning of the photographed image as shown in FIG. 4A and vertical scanning as shown in FIG. 4B can be switched over to each other by an input manipulation made by the user.
  • for example, when the direction along which the finger FG moves after the touch is a sideways direction (the finger is swept sideways), the horizontal scanning Qh is carried out as shown in FIG. 4A.
  • FIG. 5 is a flow chart explaining a basic operation of the image pickup apparatus 1, especially an operation concerned with the grit and dust removal correcting processing.
  • in Step ST 1, when the image of the subject is captured by the image pickup element 41 through the operation for the regular photographing, it is judged whether or not correction parameters (the correction level described above, the position of the image of the grit and dust, and a correction area which will be described later) which can be utilized in the grit and dust removal correction are stored in the flash memory 50.
  • when the correction parameters are stored, the operation proceeds to Step ST 12. In Step ST 12, in response to the acquisition of the photographed image by the image pickup element 41, the removal of the image Do of the grit and dust is automatically carried out for the acquired photographed image by carrying out the grit and dust removal correction based on the correction parameters (the information on the image Do of the grit and dust) stored in the flash memory 50.
  • as a result, the convenience is enhanced.
  • on the other hand, when it is judged in Step ST 1 that none of the correction parameters is stored in the flash memory 50, the operation proceeds to processing in Step ST 2.
  • in Step ST 2, the photographed image acquired by the image pickup element 41 is displayed on the display screen 12 f of the back surface display portion 12.
  • the photographed image displayed on the display screen 12 f of the back surface display portion 12 can be enlarged or reduced by a user manipulation of the setting dial 86.
  • in Step ST 3, it is judged whether or not the grit and dust removal correcting mode is selected. For example, as shown in FIG. 6, it is judged whether or not "DUST REMOVAL MODE", corresponding to the grit and dust removal correcting mode, is selected in the menu M 1 displayed on the display screen 12 f of the back surface display portion 12 by an input made to the touch panel 122 with the finger FG of the user.
  • when it is judged in Step ST 3 that the grit and dust removal correcting mode is selected, the operation proceeds to processing in Step ST 4, and the processing necessary for the grit and dust removal correction is started. It is noted that after the operation proceeds to the grit and dust removal correcting mode, the grit and dust removal correcting processing can be stopped, for example, by a user manipulation of the button 83, thereby exiting the grit and dust removal correcting mode.
  • on the other hand, when it is judged in Step ST 3 that the grit and dust removal correcting mode is not selected, the operation of the flow chart shown in FIG. 5 ends.
  • in Step ST 4, a guidance about the next user manipulation is displayed on the display screen 12 f of the back surface display portion 12.
  • a message M 2 of “TOUCH POSITION WHERE IMAGE OF DUST EXISTS” is displayed on the display screen 12 f of the back surface display portion 12 .
  • in FIG. 7, an image obtained by enlarging the photographed image Gs shown in FIG. 6 is displayed on the display screen 12 f of the back surface display portion 12. That is to say, in the image pickup apparatus 1, for the purpose of making the image of minute grit and dust appearing on the photographed image Gs easier for the user to identify, the photographed image Gs is automatically enlarged and displayed on the display screen 12 f of the back surface display portion 12 in response to the transition to the grit and dust removal correcting mode.
  • scroll bars Bh and Bv are displayed at the lower end and the right-hand end of the display screen 12 f along with the enlarged display of the photographed image Gs.
  • the image portion enlarged and displayed in the photographed image Gs can be moved in accordance with an input made to the touch panel 122 using the scroll bars Bh and/or Bv.
  • when an image Do of the grit and dust (represented by parallel slanted hatching) is displayed on the display screen 12 f of the back surface display portion 12 as shown in FIG. 8 as a result of the user manipulation of the scroll bars Bh and/or Bv, the image Do of the grit and dust can be exactly touched with the finger FG of the user.
  • in Step ST 5, it is judged whether or not a position of the image Do of the grit and dust appearing on the photographed image Gs is specified by a press manipulation made on the touch panel 122 with the finger FG of the user.
  • when a position of the image Do of the grit and dust is specified, for example, a specified position Pt (illustrated in the form of a black circle symbol) as shown in FIG. 9 is displayed, and the operation proceeds to processing in Step ST 6.
  • on the other hand, when it is judged in Step ST 5 that no position of the image Do of the grit and dust is specified, the operation returns to the processing in Step ST 4.
  • in Step ST 6, a guidance about the next user manipulation is displayed on the display screen 12 f of the back surface display portion 12.
  • for example, a message M 3 of "PLEASE SURROUND AREA TO BE CORRECTED" is displayed on the display screen 12 f of the back surface display portion 12.
  • in Step ST 7, it is judged whether or not a partial area of the photographed image Gs for which the grit and dust removal correction is to be carried out (hereinafter referred to as "a correction area" for short) is specified by an input manipulation of tracing a circuit around the image Do of the grit and dust while the finger FG of the user touches the touch panel 122.
  • the grit and dust removal correcting portion 51 can execute the grit and dust removal correcting processing limited to the correction area (partial area) set in the photographed image Gs.
  • an area whose outer periphery is the locus obtained by moving the touch position around the image Do on the touch panel 122 is set as the correction area, thereby making it possible to easily input the correction parameters necessary for the grit and dust removal correction.
  • when it is judged in Step ST 7 that the correction area is specified, for example, the correction area Hs (hatched portion) is displayed as shown in FIG. 10, and the operation proceeds to processing in Step ST 8. On the other hand, when no correction area is specified, the operation returns to the processing in Step ST 6.
  • in Step ST 8, it is judged whether or not the image Do of the grit and dust exists in the correction area Hs specified with the finger FG of the user. Specifically, it is judged in Step ST 8 whether or not the correction area Hs includes the specified position Pt of the image Do of the grit and dust specified with the finger FG of the user.
  • when it is judged in Step ST 8 that the image Do of the grit and dust exists in the correction area Hs, the operation proceeds to processing in Step ST 10.
  • on the other hand, when it is judged in Step ST 8 that no image Do of the grit and dust exists in the correction area Hs, the operation proceeds to processing in Step ST 9.
  • in Step ST 9, error display is carried out on the display screen 12 f of the back surface display portion 12 because no correction area Hs has been suitably specified. For example, a message of "PLEASE SELECT AREA TO BE CORRECTED ONCE AGAIN" is displayed on the display screen 12 f of the back surface display portion 12.
  • in Step ST 10, a guidance about the next user manipulation is displayed on the display screen 12 f of the back surface display portion 12.
  • for example, a message M 4 of "CORRECTION LEVEL CHANGES DEPENDING ON TOUCH TIME PERIOD" is displayed on the display screen 12 f of the back surface display portion 12.
  • in Step ST 11, it is judged whether or not the correction level is specified by a press on the touch panel 122 made with the finger FG of the user. For example, when the time period of the press on the touch panel 122 made with the finger FG of the user is longer than a predetermined threshold time period, this behavior is judged to be a specification of the correction level by the user. It is noted that, with regard to the press manipulation made on the touch panel 122 in order to specify the correction level, the press may be made against any portion of the surface of the touch panel 122.
  • when it is judged in Step ST 11 that the correction level is specified, the operation proceeds to processing in Step ST 12. On the other hand, when it is judged in Step ST 11 that no correction level is specified, the operation returns to the processing in Step ST 11.
  • in Step ST 12, the grit and dust removal correcting processing is executed for the image Do of the grit and dust appearing on the photographed image Gs based on the correction parameters (the specified position Pt of the image Do of the grit and dust, the correction area Hs, and the correction level) inputted by the user (details thereof will be described later).
  • in Step ST 13, the photographed image after completion of the correction, for which the grit and dust removal correcting processing was executed in Step ST 12, is displayed on the display screen 12 f of the back surface display portion 12.
  • in Step ST 14, it is judged whether or not the touch panel 122 is continuously pressed with the finger FG of the user.
  • when the touch panel 122 is continuously pressed, the operation proceeds to processing in Step ST 15.
  • on the other hand, when the touch panel 122 is no longer pressed, the operation proceeds to processing in Step ST 16.
  • in Step ST 15, the correction level is increased.
  • while the touch panel 122 continues to be pressed, the grit and dust removal correction (Step ST 12) can be carried out based on a correction level which is gradually increased at a given speed, and the photographed image for which the correction has been carried out can be displayed (Step ST 13). That is to say, photographed images for which the grit and dust removal correction has been carried out for the grit and dust composing pixel groups obtained from the luminance ranges set in conjunction with the correction levels successively set in accordance with the increasing touch time period on the touch panel 122 can be successively displayed on the display screen 12 f of the back surface display portion 12 (a minimal sketch of this touch-duration-to-correction-level loop is given after this description).
  • in Step ST 16, it is judged whether or not the photographed image after completion of the correction involves a problem.
  • for example, a message M 5 of "DO YOU DESIRE TO STORE DATA ON IMAGE AFTER CORRECTION?" is displayed on the display screen 12 f of the back surface display portion 12 as shown in FIG. 11.
  • when it is judged in Step ST 16 that the photographed image after completion of the correction involves no problem, the operation proceeds to processing in Step ST 17.
  • on the other hand, when it is judged in Step ST 16 that the photographed image after completion of the correction involves a problem, the operation returns to the processing in Step ST 10.
  • in Step ST 17, the image data on the photographed image after completion of the correction is stored in the memory card 90, and the correction parameters inputted by the user are stored in the flash memory 50. That is to say, the correction parameters (grit and dust information) containing the specified position Pt of the image Do of the grit and dust used in the grit and dust removal correcting processing in Step ST 12, which are newly obtained and differ from those already stored in the flash memory 50, are additionally stored in the flash memory (storage device) 50. As a result, the convenience is enhanced because these correction parameters can be utilized in the grit and dust removal correction for photographed images obtained in follow-on photographing (a minimal sketch of such a correction-parameter store is given after this description).
  • in this manner, the correction parameters are accumulated in the flash memory 50.
  • when the grit and dust are physically removed, the correction parameters stored in the flash memory 50 need to be erased in order to prevent the grit and dust removal correction from being automatically carried out for them. This erasing procedure will be described below.
  • FIGS. 12 and 13 are respectively views explaining a procedure for erasing the correction parameters from the flash memory 50 .
  • as a concrete example, the procedure for erasing a correction parameter 1 will be described for the case where correction parameters 1 and 2 exist for images of grit and dust located at positions D 1 and D 2 , respectively, in a grit and dust position display Gp representing the positions of images of grit and dust appearing on a photographed image.
  • a dust removal menu Ma is displayed on the display screen 12 f of the back surface display portion 12 as shown in FIG. 12, and a displayed portion of "CORRECTION 1" is touched with the finger FG of the user.
  • then, a message Mb of "DO YOU DESIRE TO ERASE CORRECTION PARAMETER 1 ?" is displayed in a lower portion of the display screen 12 f of the back surface display portion 12.
  • in this manner, the user specifies at least one correction parameter of the plurality of correction parameters (grit and dust information) stored in the flash memory 50 with his/her finger FG, and the specified correction parameter is erased from the flash memory 50.
  • this prevents unnecessary grit and dust removal correction from being carried out for an image of grit and dust which no longer appears on the photographed image because the grit and dust have been physically removed.
  • FIG. 14 is a flow chart explaining the processing corresponding to the processing in Step ST 12 shown in FIG. 5, that is, the grit and dust removal correcting processing.
  • in Step ST 20, the luminance value of the pixel located at the specified position Pt (refer to FIG. 9 ) specified by the user as the position of the image Do of the grit and dust is set as the reference value R described above.
  • in Step ST 21, the correction level inputted with the finger FG of the user is converted into the threshold described above. That is to say, the range of (the reference value R ± the correction level α), with R set in Step ST 20, is set as the threshold.
  • in Step ST 22, the scanning is started within the correction area Hs (refer to FIG. 10 ) inputted by the user.
  • at this time, the horizontal scanning shown in FIG. 4A or the vertical scanning shown in FIG. 4B can be selected by a user manipulation made on the touch panel 122.
  • in Step ST 23, it is judged whether or not pixels (grit and dust composing pixels) each having a luminance value falling within the threshold described above with respect to the reference value R are detected among the pixels in the scanning direction.
  • when the grit and dust composing pixels are detected, the operation proceeds to processing in Step ST 24.
  • on the other hand, when no grit and dust composing pixels are detected, the operation proceeds to processing in Step ST 25.
  • in Step ST 24, the luminance values of the grit and dust composing pixels are corrected based on the luminance values of the normal pixels in the periphery of the grit and dust composing pixels (a minimal sketch of this scan-and-interpolate correction is given after this description).
  • for example, when the horizontal scanning Qh is carried out in the manner shown in FIG. 4A, each of the luminance values of the abnormal pixels sandwiched between the pixels H 1 and H 2 is replaced with the average of the luminance values of the normal pixels H 1 and H 2 adjacent to the outer edge of the area Dp of the grit and dust composing pixels.
  • likewise, when the vertical scanning Qv is carried out in the manner shown in FIG. 4B, each of the luminance values of the abnormal pixels sandwiched between the pixels V 1 and V 2 is replaced with the average of the luminance values of the normal pixels V 1 and V 2 adjacent to the outer edge of the area Dp of the grit and dust composing pixels.
  • in Step ST 25, the scanning which was started in Step ST 22 ends, thereby completing the grit and dust removal correction. It should be noted that the scanning which is started in Step ST 22 and ends in Step ST 25 is repetitively carried out until the scanning is completed for all the pixels within the correction area Hs.
  • as described above, in the image pickup apparatus 1, suitable removal of the image Do of the grit and dust by the grit and dust removal correction can be easily carried out because the position of the image Do of the grit and dust which appears on the photographed image can be specified by using the touch panel 122 while the photographed image is displayed on the display screen 12 f of the back surface display portion 12.
  • as a modification, a figure having a predetermined shape and including the position of the image Do of the grit and dust specified by the user may be automatically set as the correction area (a minimal sketch of this variation is given after this description).
  • for example, as shown in FIG. 15, a circle having a predetermined radius ro is set as a correction area Ht (indicated by a broken line) with the specified position Pt specified by the user as its center.
  • the touch panel 122 may also be provided in a monitor which is detachably connected to the image pickup apparatus 1 through a connector or the like.
  • alternatively, the touch panel 122 may be provided on only a part of the display screen 12 f of the back surface display portion 12. Even in this case, the specification of the position of the image Do of the grit and dust, and the like, remains possible.
  • as another alternative, the touch position may be optically detected by detecting a portion in which light beams, such as infrared light, radiated in a lattice pattern so as to cover the display screen 12 f of the back surface display portion 12, or a light beam used to scan the display screen 12 f of the back surface display portion 12, is blocked.
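
The grit and dust removal correcting processing of Steps ST 20 to ST 25 (detection of abnormal pixels around the reference value R, followed by line-by-line replacement with the average of the adjacent normal pixels H 1 /H 2 or V 1 /V 2 ) can be summarized in code. The following is a minimal sketch under stated assumptions rather than the patented implementation: the function name remove_dust, the boolean correction-area mask, and the single-channel luminance array are illustrative choices only.

```python
import numpy as np

def remove_dust(luma, pt, correction_area, alpha, horizontal=True):
    """Minimal sketch of the grit and dust removal correction (Steps ST20-ST25).

    luma            -- 2D array of pixel luminance values (the photographed image Gs)
    pt              -- (row, col) of the specified position Pt touched by the user
    correction_area -- boolean mask of the same shape marking the correction area Hs
    alpha           -- correction level; luminances within R +/- alpha count as abnormal
    horizontal      -- True for horizontal scanning Qh (FIG. 4A), False for vertical Qv (FIG. 4B)
    """
    out = luma.astype(float).copy()
    r = float(out[pt])                      # Step ST20: reference value R from the touched pixel
    abnormal = (np.abs(out - r) <= alpha) & correction_area   # Step ST21: threshold R +/- alpha

    work = out if horizontal else out.T     # choose the scanning direction (FIG. 4A or 4B)
    bad = abnormal if horizontal else abnormal.T

    # Steps ST22-ST25: scan each line and replace every run of abnormal pixels with the
    # average of the two normal pixels adjacent to the run (H1/H2 or V1/V2 in the figures).
    for y in range(work.shape[0]):
        x = 0
        while x < work.shape[1]:
            if bad[y, x]:
                start = x
                while x < work.shape[1] and bad[y, x]:
                    x += 1
                left = work[y, start - 1] if start > 0 else None
                right = work[y, x] if x < work.shape[1] else None
                neighbors = [v for v in (left, right) if v is not None]
                if neighbors:               # at the image border only one neighbor may exist
                    work[y, start:x] = sum(neighbors) / len(neighbors)
            else:
                x += 1
    return out
```

In use, the touched pixel would supply pt, the outline traced with the finger FG (or the circular variation sketched below) would supply correction_area, and the touch duration would set alpha.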
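
Steps ST 14 and ST 15 describe the correction level rising gradually while the finger FG stays on the touch panel 122, with the corrected image recomputed and redisplayed at each step. A minimal sketch of that loop, assuming hypothetical is_pressed and redraw_preview callbacks and an arbitrary growth rate, is:

```python
import time

def adjust_level_while_pressed(is_pressed, redraw_preview,
                               start_level=1.0, step=0.5, interval_s=0.2):
    """Raise the correction level while the panel stays pressed (Steps ST14/ST15).

    is_pressed     -- callable returning True while the finger FG remains on the panel
    redraw_preview -- callable taking the current level; it re-runs the correction
                      (Step ST12) and displays the corrected image (Step ST13)
    """
    level = start_level
    redraw_preview(level)              # first corrected preview
    while is_pressed():                # Step ST14: is the panel still pressed?
        time.sleep(interval_s)         # the level increases at a given speed
        level += step                  # Step ST15: increase the correction level
        redraw_preview(level)          # Steps ST12/ST13: correct and display again
    return level                       # press released: proceed to Step ST16
```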
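
Step ST 17 and FIGS. 12 and 13 describe correction parameters (the specified position Pt, the correction area Hs, and the correction level) being accumulated in the flash memory 50, reused automatically in Steps ST 1 and ST 12, and individually erasable by the user. A minimal sketch of such a parameter store, with all names chosen purely for illustration, is:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class CorrectionParameter:
    position_pt: Tuple[int, int]   # specified position Pt of the dust image Do
    correction_area: object        # correction area Hs (e.g. a boolean mask)
    correction_level: float        # correction level alpha

@dataclass
class ParameterStore:
    """Stand-in for the correction parameters accumulated in the flash memory 50."""
    entries: List[CorrectionParameter] = field(default_factory=list)

    def add(self, param: CorrectionParameter) -> None:
        # Step ST17: newly obtained parameters are additionally stored.
        self.entries.append(param)

    def erase(self, index: int) -> None:
        # FIGS. 12 and 13: the user picks "CORRECTION n" and confirms its erasure.
        del self.entries[index]

    def apply_all(self, image, correct: Callable):
        # Steps ST1/ST12: when parameters exist, the correction runs automatically
        # on every newly captured image without further user input.
        for p in self.entries:
            image = correct(image, p)
        return image
```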
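
For the modification of FIG. 15, in which a circle of predetermined radius ro centered on the specified position Pt is automatically set as the correction area Ht, a minimal sketch of the mask generation (again with illustrative names) is:

```python
import numpy as np

def circular_correction_area(shape, pt, radius_ro):
    """Boolean mask for a circular correction area Ht of radius ro centered on Pt (FIG. 15)."""
    rows, cols = np.ogrid[:shape[0], :shape[1]]
    return (rows - pt[0]) ** 2 + (cols - pt[1]) ** 2 <= radius_ro ** 2
```

The resulting mask could be passed as correction_area to the removal sketch above, so that a single touch on the image Do is enough to define both the position and the area.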

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)
US12/466,069 2008-06-03 2009-05-14 Image pickup apparatus, method and computer-readable storage medium for processing an image based on user manipulation on a display surface Active 2030-06-03 US8314844B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-145699 2008-06-03
JP2008145699A JP5012673B2 (ja) 2008-06-03 2008-06-03 Display device

Publications (2)

Publication Number Publication Date
US20090295944A1 US20090295944A1 (en) 2009-12-03
US8314844B2 true US8314844B2 (en) 2012-11-20

Family

ID=41379303

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/466,069 Active 2030-06-03 US8314844B2 (en) 2008-06-03 2009-05-14 Image pickup apparatus, method and computer-readable storage medium for processing an image based on user manipulation on a display surface

Country Status (3)

Country Link
US (1) US8314844B2 (en)
JP (1) JP5012673B2 (en)
CN (1) CN101600053B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110234839A1 (en) * 2007-08-06 2011-09-29 Nikon Corporation Electronic camera
US20130176467A1 (en) * 2012-01-05 2013-07-11 Altek Corporation Image Capturing Device, Dust Removal System and Vibrating Dust Removal Method Thereof
US10282826B2 (en) 2016-10-10 2019-05-07 Carestream Health, Inc. Despeckling method for radiographic images

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103200353B (zh) * 2012-01-05 2016-06-08 华晶科技股份有限公司 Image capturing device, dust removal system and vibration dust removal method thereof
CN103376612B (zh) * 2012-04-18 2015-12-02 华晶科技股份有限公司 Dust removal system, imaging device and vibration dust removal method thereof
JP2014026049A (ja) 2012-07-25 2014-02-06 Sony Corp Cleaning device, cleaning method, and imaging apparatus
US9413966B2 (en) * 2013-08-06 2016-08-09 Htc Corporation Method of previewing processed image, device using the same, and storage medium having computer program stored thereon
CN104486546B (zh) * 2014-12-19 2017-11-10 广东欧珀移动通信有限公司 Photographing method and apparatus, and mobile terminal
JP6807343B2 (ja) 2018-04-16 2021-01-06 株式会社デンソーテン Deposit removal system and deposit removal method

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5596346A (en) * 1994-07-22 1997-01-21 Eastman Kodak Company Method and apparatus for applying a function to a localized area of a digital image using a window
US6160923A (en) * 1997-11-05 2000-12-12 Microsoft Corporation User directed dust and compact anomaly remover from digital images
JP2004184949A (ja) 2002-12-06 2004-07-02 Nikon Corp Camera
US6791608B1 (en) * 1999-02-24 2004-09-14 Olympus Optical Co., Ltd. Digital camera and dirt position detecting method for digital camera
US20070183771A1 (en) 2006-02-06 2007-08-09 Tatsuo Takanashi Imaging apparatus and imaging unit
US20070195185A1 (en) * 2006-02-17 2007-08-23 Ichiro Onuki Image capturing apparatus control method therefor, and program
JP2007241171A (ja) 2006-03-13 2007-09-20 Olympus Imaging Corp Imaging apparatus and imaging unit
JP2007243651A (ja) 2006-03-09 2007-09-20 Canon Inc Electronic imaging apparatus and captured image processing system
US20080240608A1 (en) * 2007-03-27 2008-10-02 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, program, storage medium, and image capturing apparatus
US20080304765A1 (en) * 2007-06-05 2008-12-11 Canon Kabushiki Kaisha Image processing apparatus, control method thereof, and program
US20090256947A1 (en) * 2008-04-15 2009-10-15 Sony Corporation Method and apparatus for performing touch-based adjustments within imaging devices
US20100074554A1 (en) * 2008-09-24 2010-03-25 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, and program

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11136568A (ja) * 1997-10-31 1999-05-21 Fuji Photo Film Co Ltd Touch-panel-operated camera
US6381357B1 (en) * 1999-02-26 2002-04-30 Intel Corporation Hi-speed deterministic approach in detecting defective pixels within an image sensor
JP4167401B2 (ja) * 2001-01-12 2008-10-15 富士フイルム株式会社 Digital camera and operation control method therefor
JP2006295844A (ja) * 2005-04-14 2006-10-26 Olympus Imaging Corp Optical device with dust-proof function
JP2007174183A (ja) * 2005-12-21 2007-07-05 Konica Minolta Photo Imaging Inc Image processing apparatus and image processing method
JP4590355B2 (ja) * 2006-01-12 2010-12-01 キヤノン株式会社 Image processing apparatus, image processing method, and program
JP4678860B2 (ja) * 2006-01-24 2011-04-27 キヤノン株式会社 Imaging apparatus, control method therefor, and program
JP2007240887A (ja) * 2006-03-08 2007-09-20 Make Softwear:Kk Automatic photographing apparatus and method therefor
JP4750630B2 (ja) * 2006-06-28 2011-08-17 キヤノン株式会社 Image processing apparatus, image processing method, program, storage medium, and image pickup apparatus
JP2008053845A (ja) * 2006-08-22 2008-03-06 Olympus Imaging Corp Interchangeable lens camera

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5596346A (en) * 1994-07-22 1997-01-21 Eastman Kodak Company Method and apparatus for applying a function to a localized area of a digital image using a window
US6160923A (en) * 1997-11-05 2000-12-12 Microsoft Corporation User directed dust and compact anomaly remover from digital images
US6791608B1 (en) * 1999-02-24 2004-09-14 Olympus Optical Co., Ltd. Digital camera and dirt position detecting method for digital camera
JP2004184949A (ja) 2002-12-06 2004-07-02 Nikon Corp Camera
US20070183771A1 (en) 2006-02-06 2007-08-09 Tatsuo Takanashi Imaging apparatus and imaging unit
US20070195185A1 (en) * 2006-02-17 2007-08-23 Ichiro Onuki Image capturing apparatus control method therefor, and program
JP2007243651A (ja) 2006-03-09 2007-09-20 Canon Inc Electronic imaging apparatus and captured image processing system
JP2007241171A (ja) 2006-03-13 2007-09-20 Olympus Imaging Corp Imaging apparatus and imaging unit
US20080240608A1 (en) * 2007-03-27 2008-10-02 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, program, storage medium, and image capturing apparatus
US20080304765A1 (en) * 2007-06-05 2008-12-11 Canon Kabushiki Kaisha Image processing apparatus, control method thereof, and program
US20090256947A1 (en) * 2008-04-15 2009-10-15 Sony Corporation Method and apparatus for performing touch-based adjustments within imaging devices
US20100074554A1 (en) * 2008-09-24 2010-03-25 Canon Kabushiki Kaisha Image processing apparatus, control method therefor, and program

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110234839A1 (en) * 2007-08-06 2011-09-29 Nikon Corporation Electronic camera
US8477209B2 (en) * 2007-08-06 2013-07-02 Nikon Corporation Electronic camera
US20130176467A1 (en) * 2012-01-05 2013-07-11 Altek Corporation Image Capturing Device, Dust Removal System and Vibrating Dust Removal Method Thereof
US8842204B2 (en) * 2012-01-05 2014-09-23 Altek Corporation Image capturing device, dust removal system and vibrating dust removal method thereof
US10282826B2 (en) 2016-10-10 2019-05-07 Carestream Health, Inc. Despeckling method for radiographic images

Also Published As

Publication number Publication date
US20090295944A1 (en) 2009-12-03
JP5012673B2 (ja) 2012-08-29
JP2009296127A (ja) 2009-12-17
CN101600053A (zh) 2009-12-09
CN101600053B (zh) 2012-06-06

Similar Documents

Publication Publication Date Title
US8314844B2 (en) Image pickup apparatus, method and computer-readable storage medium for processing an image based on user manipulation on a display surface
US8300135B2 (en) Digital image processing apparatus and method of controlling the same
JP5817131B2 (ja) Imaging apparatus, imaging method, and imaging program
US10122930B2 (en) Display control apparatus and control method thereof
US10477113B2 (en) Imaging device and control method therefor
US11245852B2 (en) Capturing apparatus for generating two types of images for display from an obtained captured image based on scene luminance and exposure
WO2019167482A1 (ja) Imaging device, focusing assistance method therefor, and focusing assistance program therefor
US20100026873A1 (en) Digital image processing apparatuses, methods of controlling the same, and computer-readable medium encoded with computer executable instructions for executing the method(s)
CN1731269B (zh) Image pickup apparatus
JP2005345833A (ja) Imaging apparatus and focus control method therefor
US10634976B2 (en) Imaging device
US20110228146A1 (en) Imaging apparatus
KR20130024021A (ko) Digital photographing apparatus and control method thereof
JP5318321B2 (ja) Imaging apparatus
KR20090059512A (ko) Image processing apparatus for correcting lens shading and control method thereof
KR101369752B1 (ko) Digital image processing apparatus and control method thereof
JP6320251B2 (ja) Imaging apparatus, imaging method, and program
CN106170975B (zh) Imaging device and control method therefor
JP2018113724A (ja) Imaging apparatus, imaging method, and program
JP2009071494A (ja) Imaging apparatus, image processing apparatus, and program
JP2005326506A (ja) Focus detection apparatus and focus detection method
CN108141514B (zh) Imaging device and control method therefor
KR20100003917A (ko) Method of collectively setting and executing a plurality of menus in a digital image processing apparatus
KR20100099565A (ko) Digital image processing apparatus and control method thereof
JP2012213173A (ja) Imaging apparatus and control method

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TASHIRO, KAZUYA;MORI, NOBUYUKI;REEL/FRAME:022692/0182

Effective date: 20090512

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12