EP2782328A1 - Imaging device and imaging method, and storage medium for storing tracking program processable by computer

Info

Publication number
EP2782328A1
Authority
EP
European Patent Office
Prior art keywords
frame
touch panel
focus
imaging
operation portion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12858021.4A
Other languages
German (de)
French (fr)
Other versions
EP2782328A4 (en)
Inventor
Akira Ugawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp
Publication of EP2782328A1
Publication of EP2782328A4

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635 Region indicators; Field of view indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/675 Focus control based on electronic image sensor signals comprising setting of focusing regions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Definitions

  • the present invention relates to an imaging apparatus in which a touch panel is provided on a display unit that displays an image acquired by imaging, and in which the image corresponding to the part of the touch panel touched by an operation portion, such as a finger of a user, is displayed, for example, in a magnified form.
  • Jpn. Pat. Appln. KOKAI Publication No. 2006-101186 is known as the above-mentioned imaging apparatus.
  • a particular subject to be tracked can be selected by an easy operation during photography.
  • the subject to be tracked is selected in the following manner according to this technique.
  • a touch panel is provided on a monitor. If this touch panel is touched, a particular part corresponding to the touched place is extracted from a displayed image. This particular part is compared with an image displayed by a video signal generated after the extraction of the particular part. An image portion corresponding to the particular part is detected from the video signal.
  • Patent Literature 1 Jpn. Pat. Appln. KOKAI Publication No. 2006-101186
  • the subject is selected by touching the touch panel provided on the monitor. Therefore, during a touch operation, the displayed image corresponding to the touched place on the monitor is hidden by the operation portion and is difficult or even impossible to see.
  • An object of the present invention is to provide an imaging apparatus and an imaging method of the same which enable imaging while an image corresponding to a place touched on a monitor is being checked without being hidden, and to provide a storage medium to store a computer-processible tracking program.
  • An imaging apparatus comprises a display unit which displays a moving image or a still image, a touch panel provided in the display unit, a frame display controller which displays a frame for focus on the display unit if an operation portion is approaching the touch panel, and moves the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel, and an imaging controller which performs focusing on the subject in the frame for focus and then performs imaging in response to a photography instruction.
  • An imaging method of an imaging apparatus comprises detecting whether an operation portion is approaching a touch panel provided in a display unit which displays a moving image or a still image, displaying a frame for focus on the display unit if an operation portion is approaching the touch panel, and moving the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel, and performing focusing on the subject in the frame for focus and then performing imaging for the subject in response to a photography instruction.
  • a non-transitory computer readable storage medium storing a tracking program comprises a detection function to detect whether an operation portion is approaching or has contacted a touch panel provided in a display unit which displays a moving image or a still image, a tracking function to display a frame for focus on the display unit if an operation portion is approaching the touch panel, and move the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel, and an imaging function to perform focusing on the subject in the frame for focus and then perform imaging for the subject in response to a photography instruction.
  • according to the present invention, it is possible to provide an imaging apparatus and an imaging method which enable imaging while the image corresponding to a place touched on a monitor is checked without being hidden, and to provide a storage medium storing a computer-processible tracking program.
  • FIG. 1 shows a configuration diagram of an imaging apparatus.
  • An imaging unit 100 images a subject, and outputs a relevant imaging signal.
  • the imaging unit 100 includes a lens 11, a diaphragm 12, an autofocus (AF) mechanism 13, a shutter 14, an imaging sensor 15, an imaging circuit 16, and an A/D converter 17.
  • a light flux from the subject enters the imaging sensor 15 from the lens 11 through the diaphragm 12 and the shutter 14.
  • the imaging sensor 15 converts the entered light flux to an electric signal.
  • the imaging circuit 16 outputs the electric signal from the imaging sensor 15 as an analog imaging signal.
  • the A/D converter 17 converts the analog imaging signal from the imaging circuit 16 to a digital imaging signal in a predetermined format, and then sends the digital imaging signal to an image processor 18 and a finder image generator 23.
  • the AF mechanism 13 moves the lens 11 in an optical axis direction, and performs AF for the subject.
  • for the digital imaging signal converted by the A/D converter 17, the image processor 18 performs predetermined image processing, for example, adjustments including a color correction, a gray scale correction, and a gamma (γ) correction of the image to be represented by the image data.
  • the image processor 18 temporarily stores the image signal after the above adjustments in a buffer memory 19.
  • a compressor/decompressor 21 compresses/decompresses the image signal temporarily stored in the buffer memory 19, forms the image signal into a format suitable for recording in a recording medium 22 to generate main image data, and records the main image data in the recording medium 22 via an interface 22a.
  • a first liquid crystal controller 20 comprises what is known as a live view function: it reads the image signal temporarily stored in the buffer memory 19, generates from the image signal a through-image in a format suitable for display on a liquid crystal display (hereinafter abbreviated as LCD) 9, and displays the through-image on the LCD 9.
  • the first liquid crystal controller 20 reads the image data recorded in the recording medium 22 through the buffer memory 19 and the compressor/decompressor 21, and displays this image data on the LCD 9.
  • the finder image generator 23 generates finder image data for moving images in a format suitable for display on a display device 3 such as an EVF from the digital imaging signal converted by the A/D converter 17.
  • a second liquid crystal controller 24 displays the finder image data for moving images generated by the finder image generator 23 on the display device 3, for example, the EVF.
  • a touch panel 31 is provided on a display screen of the LCD 9.
  • the touch panel 31 detects a touch (contact) by an operation portion such as a finger of a user which touches the touch panel 31, and outputs a coordinate signal corresponding to the touched part.
  • a touch panel controller 30 controls the driving of the touch panel 31, inputs a coordinate signal output from the touch panel 31, and judges the touched part on the touch panel 31.
  • the touch panel controller 30 judges whether, for example, the finger of the user has approached or touched the touch panel 31 in accordance with the change in capacitance between the touch panel 31 and, for example, the finger of the user.
  • FIG. 2 shows a schematic diagram of a sensor pattern of the touch panel 31.
  • sensors 31a are arranged at regular intervals in x-y directions.
  • each of the sensors 31a judges whether the finger of the user is gradually approaching to touch the touch panel 31 or the finger of the user has touched the touch panel 31.
  • since the touch panel 31 has a protective layer such as a protective film, protective glass, or a protective panel formed on its surface, a touch by the finger of the user includes a touch on the protective layer.
  • Each of the sensors 31a is of a projected capacitive type; either a self-capacitance method or a mutual-capacitance method may be used.
  • in a self-capacitance method, a capacitance value generated between each of the sensors 31a and, for example, the finger of the user is sensed.
  • in a mutual-capacitance method, a capacitance change between adjacent sensors 31a is sensed if, for example, the finger of the user approaches the sensors 31a.
  • FIG. 3 shows a situation in which, for example, a finger F of the user has approached the capacitance type touch panel 31.
  • FIG. 4 shows a capacitance change between the touch panel 31 and the finger F of the user if the finger F of the user has approached the capacitance type touch panel 31.
  • the finger F of the user is located, for example, at coordinates (x, y, z) above the surface of the touch panel 31.
  • the capacitance between the touch panel 31 and the finger F of the user increases quadratically as the finger F of the user approaches the touch panel 31.
  • each of the sensors 31a outputs a signal corresponding to the capacitance between the touch panel 31 and the finger F of the user, that is, a signal corresponding to a distance D (hereinafter referred to as an inter-touch distance) between the touch panel 31 and the finger F of the user.
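For illustration, the distance recovery just described can be sketched in a few lines of Python. The quadratic model C = K / D² and the calibration constant K are assumptions made for this sketch; the text only states that the capacitance grows quadratically as the finger approaches.

```python
import math

# Assumed model: capacitance grows quadratically as the finger nears the
# panel, C = K / D**2. K is a hypothetical calibration constant.
K = 1200.0  # arbitrary units

def inter_touch_distance(capacitance: float) -> float:
    """Estimate the inter-touch distance D (mm) from a sensed capacitance."""
    if capacitance <= 0.0:
        return float("inf")  # nothing sensed: finger out of sensing range
    return math.sqrt(K / capacitance)
```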
  • Touch approach ranges E1 to En and a just-before-touch range T are set to judge whether the finger F of the user has approached or touched the touch panel 31.
  • the first and second touch approach ranges E1 and E2 and the just-before-touch range T are set in the touch panel 31, as shown in FIG. 4 .
  • the first and second touch approach ranges E1 and E2 and the just-before-touch range T are set as follows: if the inter-touch distance D between the touch panel 31 and the finger F of the user is D3, D2, or D1 (D3>D2>D1), a range between the distances D3 and D2 is set as the first touch approach range E1, a range between the distances D2 and D1 is set as the second touch approach range E2, and a range within the distance D1, before the finger F of the user touches the touch panel 31, is set as the just-before-touch range T.
  • the first and second touch approach ranges E1 and E2 and the just-before-touch range T are divided by capacitance values Th1, Th2, and Th3 between the touch panel 31 and the finger F of the user.
  • the relation of magnitude between the capacitance values Th1, Th2, and Th3 is Th1 < Th2 < Th3. Accordingly, the first touch approach range E1 is between the capacitance values Th1 and Th2.
  • the second touch approach range E2 is between the capacitance values Th2 and Th3.
  • the just-before-touch range T is a small range including the capacitance value Th3.
  • the touch panel controller 30 monitors the operation in a touch AF mode, that is, a change in capacitance value detected by each of the sensors 31a of the touch panel 31, and detects a region in which the capacitance value changes, as a coordinate position to be touched.
  • the touch panel controller 30 monitors a change in capacitance value in the touch panel 31, compares the monitored capacitance value with each of the capacitance values Th1, Th2 and Th3, and judges whether the finger F of the user is present in the first touch approach range E1, in the second touch approach range E2, or in the just-before-touch range T relative to the touch panel 31, and whether the finger F of the user has touched the touch panel 31.
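A minimal sketch of this threshold comparison is shown below. The ordering Th1 < Th2 < Th3 comes from the description above; the numeric values and the extra contact threshold TH_TOUCH are placeholders for illustration.

```python
from enum import Enum, auto

class ApproachRange(Enum):
    OUT_OF_RANGE = auto()
    E1 = auto()     # first touch approach range (D3 >= D > D2)
    E2 = auto()     # second touch approach range (D2 >= D > D1)
    T = auto()      # just-before-touch range, around Th3
    TOUCH = auto()  # finger in contact with the panel

# Placeholder capacitance thresholds obeying Th1 < Th2 < Th3 < TH_TOUCH.
TH1, TH2, TH3, TH_TOUCH = 10.0, 40.0, 160.0, 400.0

def classify(capacitance: float) -> ApproachRange:
    """Map a monitored capacitance value onto the approach ranges."""
    if capacitance >= TH_TOUCH:
        return ApproachRange.TOUCH
    if capacitance >= TH3:
        return ApproachRange.T
    if capacitance >= TH2:
        return ApproachRange.E2
    if capacitance >= TH1:
        return ApproachRange.E1
    return ApproachRange.OUT_OF_RANGE
```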
  • FIG. 5 shows that the finger F of the user is about to touch the touch panel 31 in a direction inclined at the angle θ.
  • the inter-touch distance between the finger F of the user and a display surface of the touch panel 31 is D (including D1, D2, and D3), a horizontal distance between a point P to be touched and the finger F of the user is x, and an angle from the point P to be touched is θ (an angle with a line perpendicular to the display surface of the touch panel 31).
  • FIG. 6A, FIG. 6B, and FIG. 6C show the process of a change in capacitance value at regular time intervals if the finger F of the user is about to touch the touch panel 31.
  • numerical values "1", "2", ... "5" schematically indicating capacitance values are shown in the sensors 31a; the higher the numerical value, the higher the capacitance value.
  • the touch panel controller 30 determines, as a coordinate position ST1 of an origin, a place where a change in capacitance is detected on the touch panel 31; for example, a place where the numerical value "1" appears as shown in FIG. 6A .
  • the capacitance value increases on the touch panel 31.
  • in FIG. 6B , there are more places where the numerical values "3" and "1" appear.
  • the touch panel controller 30 detects a place where the greatest change in capacitance value is shown among the numerical values "3" and "1", for example, a coordinate position ST2 of the place having the numerical value "3", and finds the horizontal distance x and the inter-touch distance D from the coordinate position ST2 of the place having the numerical value "3" and from the coordinate position ST1 of the origin.
  • the capacitance value further increases on the touch panel 31.
  • in FIG. 6C , there are more places where the numerical values "5", "3", and "1" appear.
  • the touch panel controller 30 detects a place where the greatest change in capacitance value is shown among the numerical values "5", "3", and "1", for example, a coordinate position ST3 of the place having the numerical value "5", and finds the horizontal distance x and the inter-touch distance D from the coordinate position ST3 of the place having the numerical value "5" and from the coordinate position ST1 of the origin.
  • the touch panel controller 30 detects a coordinate position STm of the place having the greatest change in capacitance value at regular time intervals, and finds the horizontal movement distance x and the inter-touch distance D from the coordinate position STm and the coordinate position ST1 of the origin.
  • the touch panel controller 30 may find the horizontal movement distance x and the inter-touch distance D from the coordinate position STm of the place having the greatest change in capacitance value and a preceding coordinate position STm-1 at regular time intervals.
  • the touch panel controller 30 finds a movement direction M of the finger F of the user from each horizontal movement distance x and each inter-touch distance D that are found at regular time intervals, finds the angle θ from the movement direction M, and finds a coordinate position of a candidate point P to be touched by the finger F of the user from the horizontal movement distance x, the inter-touch distance D, and the angle θ.
  • the touch panel controller 30 finds the inter-touch distance D between the finger F of the user and the display surface of the touch panel 31 from the capacitance value detected by each of the sensors 31a of the touch panel 31.
  • the angle θ of the finger F of the user may be set at a preset angle, for example, 45°.
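The geometry of FIG. 5 suggests the following sketch for projecting the candidate point P: with the finger at height D and approaching at angle θ from the panel normal, the remaining horizontal travel is D·tan(θ). The function and parameter names are hypothetical.

```python
import math

def candidate_touch_point(proj_x: float, proj_y: float, distance_d: float,
                          dir_x: float, dir_y: float,
                          theta_deg: float = 45.0) -> tuple[float, float]:
    """Estimate the point P the finger is expected to touch.

    (proj_x, proj_y): finger position projected onto the panel.
    distance_d:       inter-touch distance D to the panel surface.
    (dir_x, dir_y):   unit vector of the horizontal movement direction M.
    theta_deg:        approach angle θ from the panel normal; the text
                      allows a preset value such as 45 degrees.
    """
    # Horizontal distance still to travel before contact: x = D * tan(θ).
    remaining = distance_d * math.tan(math.radians(theta_deg))
    return (proj_x + dir_x * remaining, proj_y + dir_y * remaining)
```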
  • a system controller 26 controls a series of imaging operations in which the subject is imaged.
  • the system controller 26 controls the imaging unit 100, the image processor 18, the buffer memory 19, the compressor/decompressor 21, the interface 22a, the finder image generator 23, the touch panel controller 30, the first liquid crystal controller 20, and the second liquid crystal controller 24.
  • An operation unit 27 and a strobe control circuit 25 are connected to the system controller 26.
  • the operation unit 27 has, for example, a release button 5, a mode dial 6, a cross key 7, and an enter key 8.
  • the operation unit 27 includes, for example, a power button and an end button.
  • the strobe control circuit 25 controls the operation of a strobe 4 which generates a flash light.
  • This system controller 26 includes a frame display controller 40, a focus controller 41, an imaging controller 42, a face detector 43, a list display unit 44, and an image reproducer 45.
  • the frame display controller 40 displays a frame W for focus in an image region corresponding to the part of the touch panel 31 to be touched on the image displayed on the display screen of the LCD 9 as shown in FIG. 7 .
  • the frame display controller 40 displays the frame W for focus around the coordinate position of the candidate point P found by the touch panel controller 30.
  • the frame display controller 40 displays the frame W for focus if the finger F of the user has entered the second touch approach range E2.
  • the frame W for focus may instead be displayed from the time when the finger F of the user enters the first touch approach range E1.
  • the part of the touch panel 31 to be touched by the finger F of the user can be checked on the display screen of the LCD 9 before the finger F of the user touches the touch panel 31. Since the part to be touched by the finger F of the user is located before the finger F of the user touches the touch panel 31, the frame W for focus can be checked without being hidden by the finger F of the user. If it is found out from this check that the region to be touched by the finger F of the user is different from the region desired by the user, the region to be touched by the finger F of the user can be changed.
  • the focus controller 41 described later can focus on the subject corresponding to the coordinate position of the candidate point P before the finger F of the user touches the touch panel 31.
  • the time required for focusing can be shorter than when focusing is performed after the finger F of the user has touched the touch panel 31.
  • the frame W for focus is set, by the user operation, on, for example, a face S of a person as the main subject.
  • the frame W for focus is formed into, for example, a square, but is not limited thereto.
  • the frame W for focus may be formed into a circle, a rectangle, or a double circle.
  • the frame display controller 40 moves the frame W for focus on the display screen of the LCD 9 in accordance with the movement of the finger F of the user. For example, if the person moves, the frame W for focus is set at the face S of the person who has moved in accordance with the movement of the finger F of the user.
  • the frame display controller 40 moves the frame W for focus at a speed corresponding to the speed at which the finger F of the user moves; for example, at a speed which increases or decreases in accordance with the increase or decrease in the movement speed of the finger F of the user. When moving the frame W for focus in accordance with the movement of the finger F of the user, the frame display controller 40 moves the frame W for focus in the same direction as the direction in which the finger F of the user moves.
  • the frame display controller 40 demagnifies or magnifies the size of the frame W for focus in accordance with the inter-touch distance D between the finger F of the user and the display surface of the touch panel 31. For example, the frame display controller 40 demagnifies the size of the frame W for focus in an arrow direction as shown in FIG. 9 as the inter-touch distance D decreases, and the frame display controller 40 magnifies the size of the frame W for focus as the inter-touch distance D increases. If the size of the frame W for focus is reduced, it is possible to perform display that is narrowed down to the candidate point P to be touched by the finger F of the user.
  • the frame display controller 40 may lock the display state of the frame W for focus in the display screen of the LCD 9. If the imaging apparatus moves, the frame display controller 40 unlocks the display state of the frame W for focus.
  • the focus controller 41 performs processing to focus on the candidate point P on the subject in the frame W for focus before the finger F of the user touches the touch panel 31.
  • the focus controller 41 does not perform focusing during the movement of the frame W for focus by the frame display controller 40, that is, during movement in the direction parallel with the display screen of the LCD 9. If the movement of the finger F of the user stops, the focus controller 41 performs focusing on the subject in the frame W for focus.
  • the focus controller 41 may perform focusing after the frame W for focus is displayed.
  • the focus controller 41 performs focusing on the subject in the frame W for focus.
  • the imaging controller 42 performs imaging by the imaging unit 100 at the time of the touch detection.
  • the face detector 43 detects whether the facial part S of the subject is present in the image data, or whether the facial part S of the subject is present in the frame W for focus displayed in the image data.
  • the frame display controller 40 may display the frame W for focus for the facial part S of the subject.
  • the display of the frame W for focus is started if the finger F of the user enters the second touch approach range E2. If the facial part S of the subject is photographed as the main subject, the display position of the frame W for focus can be corrected to the facial part S detected by the face detector 43 even if the position of the frame W for focus displayed by the frame display controller 40 is out of position relative to the main subject.
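The correction of the frame position toward a detected face could look like the following sketch; the snapping radius and every name here are assumptions for illustration, not taken from the patent.

```python
import math

SNAP_RADIUS = 15.0  # assumed tolerance (mm) for correcting the frame position

def correct_frame_to_face(frame_pos, face_positions):
    """Snap the focus frame W to the nearest facial part S reported by the
    face detector, if one lies within the snapping radius; otherwise keep
    the frame where the frame display controller put it."""
    best, best_d = None, SNAP_RADIUS
    for face in face_positions:  # (x, y) centers of detected faces
        d = math.hypot(face[0] - frame_pos[0], face[1] - frame_pos[1])
        if d < best_d:
            best, best_d = face, d
    return best if best is not None else frame_pos
```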
  • the list display unit 44 displays a list of images on the display screen of the LCD 9 during a reproduction mode.
  • the list display unit 44 reads image data recorded in, for example, the recording medium 22, and displays a list of the image data on the display screen of the LCD 9 via the first liquid crystal controller 20.
  • the image reproducer 45 displays the selected image data on the display screen of the LCD 9 in a magnified form.
  • the system controller 26 turns on the power of the touch panel 31 in step #1, and turns on the power of the LCD 9 in step #2.
  • in step #3, the system controller 26 receives a mode, such as a photography mode or the reproduction mode, selected by, for example, the user operation performed on the mode dial 6.
  • in step #4, the system controller 26 judges whether the selected mode is the photography mode or the reproduction mode.
  • if the photography mode is selected, the system controller 26 shifts to step #5, and sets the mode to the photography mode.
  • FIG. 12 shows an imaging operation timing chart.
  • FIG. 12 shows a period in which a capacitance value generated between the touch panel 31 and, for example, the finger F of the user can be sensed, a display period of the frame W for focus displayed on the display screen of the LCD 9, a period in which autofocusing (AF) is performed, and a period in which imaging is performed.
  • the system controller 26 controls the imaging unit 100, the image processor 18, the buffer memory 19, the compressor/decompressor 21, the interface 22a, the finder image generator 23, the touch panel controller 30, the first liquid crystal controller 20, and the second liquid crystal controller 24, and controls a series of imaging operations for imaging the subject.
  • the light flux from the subject enters the imaging sensor 15 from the lens 11 through the diaphragm 12 and the shutter 14.
  • the AF mechanism 13 moves the lens 11 in the optical axis direction, and performs AF for the subject.
  • the imaging sensor 15 converts the entered light flux to an electric signal.
  • the electric signal output from the imaging sensor 15 is output by the imaging circuit 16 as an analog imaging signal.
  • the analog imaging signal is converted to a digital imaging signal in a predetermined format by the A/D converter 17, and then sent to the image processor 18 and the finder image generator 23.
  • for the digital imaging signal converted by the A/D converter 17, the image processor 18 performs predetermined image processing, for example, adjustments including a color correction, a gray scale correction, and a gamma (γ) correction of the image to be represented by the image data.
  • the image processor 18 temporarily stores the image signal after the above adjustments in the buffer memory 19.
  • the first liquid crystal controller 20 reads the image signal temporarily stored in the buffer memory 19, generates a through-image in a format suitable for display on the LCD 9 for the image signal, and live-view-displays the through-image on the LCD 9.
  • the system controller 26 judges in step #10 whether the mode is set to a touch autofocus (AF) mode. If the mode is judged to be set to the touch AF mode, the touch panel controller 30 monitors a change in capacitance value in the touch panel 31, and detects the region in which the capacitance value changes, as a coordinate position to be touched. The touch panel controller 30 compares the monitored capacitance value with each of the capacitance values Th1, Th2 and Th3, and judges whether the finger F of the user is present in the first touch approach range E1, in the second touch approach range E2, or in the just-before-touch range T relative to the touch panel 31, and whether the finger F of the user has touched the touch panel 31.
  • in step #11, the touch panel controller 30 monitors a change in capacitance value in the touch panel 31, and judges whether the user is bringing the finger F close to the touch panel 31 in order to touch it.
  • the place to be touched by the finger F of the user is the place on the LCD 9 and on the image desired to be focused in the subject, for example, the part on the touch panel 31 corresponding to the facial part S of the person.
  • FIG. 13A and FIG. 13B show the situation in which the finger F of the user is about to touch the touch panel 31.
  • FIG. 13A shows the image displayed on the LCD 9 in the above situation.
  • FIG. 13B shows the inter-touch distance D between the touch panel 31 and the finger F of the user in the above situation.
  • in step #12, the touch panel controller 30 judges whether the finger F of the user is present in the first touch approach range E1 (D3 ≥ D > D2) as shown in FIG. 14B in accordance with the change in capacitance in the touch panel 31.
  • the first touch approach range E1 is within 30 to 20 mm from the surface of the touch panel 31.
  • the touch panel controller 30 judges that a capacitance value generated between the touch panel 31 and the finger F of the user can be sensed by each of the sensors 31a of the touch panel 31 and that the finger F of the user is about to touch the touch panel 31. Sensing is possible from the point where the finger F of the user has entered the first touch approach range E1, so that when the finger F of the user is out of the first touch approach range E1, sensing is not performed, and no wasteful electricity is consumed.
  • if the distance between the finger F of the user and the touch panel 31 increases, the touch panel controller 30 can inhibit sensing, or limit circuit operations and calculations in the system controller 26, so that electricity is not wastefully consumed.
  • this capacitance value can be differentiated from a capacitance value generated when a material different from the finger F of the user is about to touch the touch panel 31.
  • the touch panel controller 30 judges in step #13 whether the finger F of the user is present in the second touch approach range E2 (D2 ≥ D > D1) as shown in FIG. 15B in accordance with the change in capacitance value in the touch panel 31.
  • the touch panel controller 30 finds a coordinate position of the finger F of the user projected on the touch panel 31 in step #14. That is, if the finger F of the user approaches the display surface of the touch panel 31 and the inter-touch distance D decreases, the capacitance value increases on the touch panel 31 as shown in FIG. 6B , and, for example, there are more places where the numerical values "3" and "1" appear.
  • the touch panel controller 30 detects a place where the greatest change in capacitance value is shown among the numerical values "3" and "1", for example, a coordinate position of the place having the numerical value "3", and from this coordinate position, finds the coordinate position of the finger F of the user.
  • in step #15, the touch panel controller 30 judges from the capacitance value in the touch panel 31 that the finger F of the user has entered the second touch approach range E2, and sends the relevant message to the frame display controller 40.
  • the frame display controller 40 On receipt of the message that the finger F of the user is present in the second touch approach range E2, the frame display controller 40 displays the frame W for focus on the LCD 9 as shown in FIG. 15A . Since the frame W for focus is displayed before the finger F of the user touches the touch panel 31, the frame W for focus is not hidden by the finger F of the user, and the part of the touch panel 31 to be touched by the finger F of the user can be checked on the display screen of the LCD 9.
  • the frame display controller 40 does not display the frame W for focus when the finger F of the user is present in the first touch approach range E1, whereas the frame display controller 40 displays the frame W for focus if the finger F of the user is present in the second touch approach range E2. This is because if the finger F of the user is present in the first touch approach range E1, the distance between the finger F of the user and the touch panel 31 is great, and it is impossible to determine whether the user intends to touch the touch panel 31. This makes it possible to prevent unnecessary display of the frame W for focus.
  • the inter-touch distance D between the finger F of the user and the touch panel 31 is between the inter-touch distances D2 and D1 shown in FIG. 4 .
  • the second touch approach range E2 is within 20 to 10 mm from the surface of the touch panel 31.
  • in step #16, the touch panel controller 30 judges in accordance with the change in capacitance in the touch panel 31 whether the finger F of the user is present in the just-before-touch range T (D1 ≥ D > D0) as shown in FIG. 16B and has touched the touch panel 31.
  • the focus controller 41 starts autofocusing (AF) on the subject in the frame W for focus in step #17.
  • in step #18, the focus controller 41 judges whether the autofocusing (AF) on the subject in the frame W for focus has finished. If judging that the autofocusing (AF) has finished, the focus controller 41 locks the autofocusing (AF) in step #19.
  • in step #20, the touch panel controller 30 judges whether the finger F of the user has touched the part of the touch panel 31 corresponding to, for example, the facial part S of the person to be focused in the subject in the frame W for focus as shown in FIG. 16B .
  • the touch panel controller 30 judges from the capacitance value in the touch panel 31 that the finger F of the user has touched the touch panel 31, and sends the relevant message to the imaging controller 42.
  • the imaging controller 42 On receipt of the message from the touch panel controller 30 that the finger F of the user has touched the touch panel 31, the imaging controller 42 performs imaging for a still image by the imaging unit 100 in step #21 at the detection of the touch. That is, the light flux from the subject enters the imaging sensor 15 from the lens 11 through the diaphragm 12 and the shutter 14. The imaging sensor 15 converts the entered light flux to an electric signal. The electric signal output from the imaging sensor 15 is output by the imaging circuit 16 as an analog imaging signal. The analog imaging signal is converted to a digital imaging signal in a predetermined format by the A/D converter 17, and then sent to the image processor 18.
  • for the digital imaging signal converted by the A/D converter 17, the image processor 18 performs predetermined image processing, for example, adjustments including a color correction, a gray scale correction, and a gamma (γ) correction of the image to be represented by the image data.
  • the image processor 18 temporarily stores the image signal after the above adjustments in the buffer memory 19.
  • the compressor/decompressor 21 compresses/decompresses the image signal temporarily stored in the buffer memory 19, forms the image signal into a format suitable for recording in the recording medium 22 to generate main image data as a still image shown in FIG. 16A , and records the main image data in the recording medium 22 via the interface 22a.
  • the system controller 26 then finishes the photography mode in step #22.
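Taken together, steps #12 to #21 of the touch-AF branch behave like the sketch below. It reuses the classify helper from the earlier range sketch; panel, frame_ctrl, focus_ctrl, imaging_ctrl, and all of their method names are hypothetical stand-ins for the controllers described above, not an API from the patent.

```python
def touch_af_cycle(panel, frame_ctrl, focus_ctrl, imaging_ctrl) -> None:
    """One polling pass of the touch-AF mode, sketched from steps #12 to #21."""
    r = classify(panel.max_capacitance())   # range E1/E2/T/touch, see above
    if r is ApproachRange.E1:
        panel.enable_sensing()              # step #12: sensing only, no frame yet
    elif r is ApproachRange.E2:
        pos = panel.projected_position()    # step #14: finger projected on panel
        frame_ctrl.show_focus_frame(pos)    # step #15: display the frame W
    elif r is ApproachRange.T:
        focus_ctrl.start_af()               # step #17: AF on subject in frame W
        if focus_ctrl.af_finished():        # step #18: AF done?
            focus_ctrl.lock_af()            # step #19: lock the autofocus
    elif r is ApproachRange.TOUCH:
        imaging_ctrl.capture_still()        # steps #20-#21: imaging on touch
```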
  • if judging in step #10 that the mode is not set to the touch autofocus (AF) mode, the system controller 26 shifts to step #30, and judges whether to continue the autofocusing (AF). If the system controller 26 judges to continue focusing on the subject and the release button 5 is pressed halfway by the user in step #31, the state of the release button 5 of the operation unit 27 shifts from an off-state to an on-state of the 1st release switch.
  • in step #34, the system controller 26 finishes the autofocusing (AF), and judges in step #35 whether the state of the release button 5 shows the on-state of the 2nd release switch. If the state of the release button 5 is judged to be the on-state of the 2nd release switch, the imaging controller 42 causes the imaging unit 100 to perform an imaging operation for a still image of the subject in step #36. The system controller 26 displays the still image acquired by this imaging on the display screen of the LCD 9 in a magnified form.
  • if the autofocusing (AF) on the subject is judged to be finished as a result of the judgment of whether to continue it, the system controller 26 shifts to step #32. If the release button 5 of the operation unit 27 is pressed halfway by the user, the state of the release button 5 shifts from an off-state to an on-state of the 1st release switch. The system controller 26 starts AF in step #33, finishes the autofocusing (AF) in step #34, and shifts to steps #35, #36, and #23.
  • the system controller 26 reads the states of the mode dial 6 and the end button in the operation unit 27 in step #6. If the mode is continuously set to the photography mode, the system controller 26 returns to step #3. If the end button is operated, the system controller 26 shifts to step #7 and then turns off the power.
  • if judging in step #4 that the mode is set to the reproduction mode, the system controller 26 shifts to step #8, and performs an operation in the reproduction mode.
  • the list display unit 44 reads image data recorded in, for example, the recording medium 22, and displays a list of the image data on the display screen of the LCD 9 via the first liquid crystal controller 20.
  • in step #9, the image reproducer 45 displays the selected image data on the display screen of the LCD 9 in a magnified form.
  • the frame W for focus is displayed on the part of the display screen of the LCD 9 corresponding to the image part which the finger F of the user approaches, and autofocusing (AF) on the subject in the frame W for focus is performed. If the finger F of the user touches the touch panel 31 corresponding to the frame W for focus, an imaging operation is performed.
  • the frame W for focus is displayed in the image part on which autofocusing (AF) is to be performed, so that when the finger F of the user touches the touch panel 31 on the display screen of the LCD 9, the corresponding image to be touched by the finger F of the user is easily recognized, and imaging can be performed while the image in the part on which autofocusing (AF) is to be performed is checked.
  • the part of the touch panel 31 to be touched by the finger F of the user can be checked on the display screen of the LCD 9 without being hidden by the finger F of the user before the finger F of the user touches the touch panel 31. If it is found out from the check that the region to be touched by the finger F of the user is different from the region desired by the user, the region to be touched by the finger F of the user can be changed.
  • the subject corresponding to the coordinate position of the candidate point P can be focused before the finger F of the user touches the touch panel 31. Therefore, the time before focusing can be shorter than when focusing is performed after the finger F of the user has touched the touch panel 31.
  • FIG. 18 shows an imaging operation timing chart.
  • FIG. 18 shows a period in which a capacitance value generated between the touch panel 31 and, for example, the finger F of the user can be sensed, a display period of the frame W for focus displayed on the display screen of the LCD 9, a period in which autofocusing (AF) is performed, and a period in which imaging is performed.
  • the finger F of the user moves in a horizontal direction, that is, in a planar direction parallel with the display screen of the LCD 9, while remaining close to the touch panel 31, for example, within the second touch approach range E2 relative to the touch panel 31.
  • the touch panel controller 30 judges in step #20 whether the finger F of the user has touched the touch panel 31 corresponding to, for example, the facial part S of the person to be focused in the subject in the frame W for focus as shown in FIG. 16B .
  • the frame display controller 40 shifts to step #40, and judges whether the finger F of the user has moved in the planar direction parallel with the display surface of the LCD 9 as shown in FIG. 8 while remaining close to the touch panel 31, that is, within the second touch approach range E2 shown in FIG. 4 .
  • the touch panel controller 30 judges from the capacitance value in the touch panel 31 whether the finger F of the user is present in the second touch approach range E2, as described above.
  • the capacitance value increases on the touch panel 31, and there are more places where, for example, the numerical values "5", "3", and "1" appear, as shown in FIG. 6C .
  • the touch panel controller 30 finds, as the coordinate position of the finger F of the user, a place where the greatest change in capacitance value is shown among the numerical values "5", "3" and "1", for example, the place having the numerical value "5".
  • the touch panel controller 30 tracks the movement of the coordinate position of the finger F of the user where the greatest change in capacitance value is shown. As a result of this tracking, the touch panel controller 30 follows the movement of the finger F of the user, and sends, to the frame display controller 40, each of the coordinate positions to which the finger F of the user moves.
  • when the facial part S of the person in the image displayed on the display screen of the LCD 9 moves, the frame W for focus is moved by the movement of the finger F of the user so as to follow the facial part S.
  • the frame display controller 40 sequentially receives each of the coordinate positions of the movement of the finger F of the user found by the touch panel controller 30, and judges whether the finger F of the user has moved a predetermined movement distance M, for example, 10 mm or more from each of the coordinate positions.
  • the frame display controller 40 judges, in step #41, the direction in which the finger F of the user moves in accordance with each of the coordinate positions of the movement of the finger F of the user sequentially received from the touch panel controller 30.
  • the frame display controller 40 moves the frame W for focus on the display screen of the LCD 9, for example, as shown in FIG. 8 , in the movement direction of the finger F of the user by a movement distance 2M, such as 20 mm, which is twice the predetermined movement distance M, as shown in FIG. 19A and FIG. 19B .
  • the frame display controller 40 moves the frame W for focus at a speed proportionate to the speed at which the finger F of the user moves.
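The movement rule of steps #40 and #41 can be sketched as follows: ignore finger travel below the predetermined distance M, and otherwise move the frame in the same direction by twice that travel. The structure of frame and all names here are assumptions for illustration.

```python
import math

MOVE_THRESHOLD_M = 10.0  # predetermined movement distance M, e.g. 10 mm
FRAME_GAIN = 2.0         # frame travels 2M, twice the finger's movement

def update_focus_frame(frame, dx_mm: float, dy_mm: float) -> None:
    """Move the focus frame W in the finger's direction by twice its travel.

    dx_mm, dy_mm: finger displacement since the last update, taken from the
    tracked coordinate of greatest capacitance change.
    """
    if math.hypot(dx_mm, dy_mm) < MOVE_THRESHOLD_M:
        return  # finger moved less than M: leave the frame in place
    frame.x += dx_mm * FRAME_GAIN  # same direction as the finger,
    frame.y += dy_mm * FRAME_GAIN  # scaled to the movement distance 2M
```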
  • during this movement, the focus controller 41 does not perform focusing on the subject, and stops focusing.
  • in step #42, the frame display controller 40 judges whether the finger F of the user stops moving at one place as shown in FIG. 20A, FIG. 20B, and FIG. 20C and stays stopped for a given period of time. If judging that the finger F of the user is moving, the frame display controller 40 returns to step #41, and moves the frame W for focus on the display screen of the LCD 9 in accordance with each of the coordinate positions of the movement of the finger F of the user.
  • the frame display controller 40 sends, to the focus controller 41, a message that the movement of the finger F of the user has stopped.
  • the focus controller 41 starts autofocusing (AF) on the subject in the frame W for focus.
  • in step #20, the touch panel controller 30 again judges whether the finger F of the user has touched the touch panel 31 corresponding to, for example, the facial part S of the person to be focused in the subject in the frame W for focus as shown in FIG. 16B . If judging that the finger F of the user has touched the touch panel 31 in the subject in the frame W for focus, the touch panel controller 30 judges from the capacitance value in the touch panel 31 that the finger F of the user has touched the touch panel 31, and sends the relevant message to the imaging controller 42. On receipt of this message, the imaging controller 42 performs imaging for a still image by the imaging unit 100 in step #21 at the detection of the touch. The system controller 26 displays the still image acquired by the imaging unit 100 on the display screen of the LCD 9 in a magnified form.
  • the frame W for focus is moved in accordance with the movement of the finger F of the user. If the finger F of the user then stops moving for a given period of time, autofocusing (AF) on the subject in the frame W for focus is resumed, and imaging is performed when the finger F of the user touches the touch panel 31.
  • the frame W for focus can be moved in accordance with the movement of the finger F of the user.
  • the frame W for focus can be moved in accordance with the movement of the facial part S of the person by moving the finger F of the user to follow the movement of the facial part S.
  • Imaging can be then performed by the autofocusing on the facial part S of the person in the frame W for focus.
  • while the frame W for focus is moving, autofocusing (AF) on the facial part S of the person in the frame W for focus is not performed.
  • the autofocusing (AF) is not performed when focusing is unnecessary, so that no wasteful operation is performed.
  • the frame W for focus can be displayed in the image part to be touched if the finger F of the user is moved closer to touch the part of the touch panel 31 corresponding to the facial part S of the different person.
  • FIG. 22 shows an imaging operation timing chart.
  • FIG. 22 shows a period in which a capacitance value generated between the touch panel 31 and, for example, the finger F of the user can be sensed, a display period of the frame W for focus displayed on the display screen of the LCD 9, a period in which autofocusing (AF) is performed, and a period in which imaging is performed.
  • the finger F of the user approaches the touch panel 31 from an inclined direction.
  • the touch panel controller 30 finds the position of the finger F of the user approaching the touch panel 31 as shown in FIG. 5 in accordance with the inclination angle θ, and displays the frame W for focus on the display screen of the LCD 9 corresponding to the position of the finger F of the user.
  • the frame display controller 40 sequentially demagnifies the size of the frame W for focus as the finger F of the user approaches the touch panel 31.
  • the frame display controller 40 provides frame sizes such as an extra-large size WE, a large size WL, a middle size WM, and a small size WS.
  • the frame display controller 40 divides the second touch approach range E2 into, for example, four size ranges, and sets the size of the frame W for focus to the extra-large size WE, the large size WL, the middle size WM, and the small size WS in descending order of distance from the touch panel 31.
  • FIG. 23 shows a schematic diagram of the size ranges for changing the size of the frame for focus.
  • the second touch approach range E2 is separated into size ranges such as first to fourth size ranges E10 to E13.
  • the first to fourth size ranges E10 to E13 are separated by capacitance values Tha, Thb, and Thc between the touch panel 31 and the finger F of the user.
  • the relation of magnitude between the capacitance values Tha, Thb, and Thc is Tha < Thb < Thc.
  • the first size range E10 is between the capacitance values Th2 and Tha.
  • the second size range E11 is between the capacitance values Tha and Thb.
  • the third size range E12 is between the capacitance values Thb and Thc.
  • the fourth size range E13 is between the capacitance values Thc and Th3.
  • As the finger F of the user approaches the touch panel 31, the frame display controller 40 changes the size of the frame W for focus in the order of the extra-large size WE, the large size WL, the middle size WM, and the small size WS. As a result of this size change, the size of the frame W for focus is sequentially demagnified.
  • Conversely, as the finger F of the user recedes from the touch panel 31, the frame display controller 40 changes the size of the frame W for focus in the order of the small size WS, the middle size WM, the large size WL, and the extra-large size WE. As a result of this size change, the size of the frame W for focus is sequentially magnified.
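  • As an illustration of this size selection, the following sketch maps a sensed capacitance value to one of the four frame sizes. The threshold names follow the description above; the numeric values are hypothetical placeholders.

```python
# Sketch of the frame-size selection in steps #50-#57, assuming the
# capacitance thresholds Th2 < Tha < Thb < Thc < Th3 described above.
# The numeric threshold values are hypothetical placeholders.
TH2, THA, THB, THC, TH3 = 20, 40, 60, 80, 100

def frame_size_for_capacitance(c):
    """Map a capacitance value sensed inside the second touch
    approach range E2 to a size of the frame W for focus."""
    if TH2 <= c < THA:       # first size range E10 (farthest)
        return "WE"          # extra-large size
    if THA <= c < THB:       # second size range E11
        return "WL"          # large size
    if THB <= c < THC:       # third size range E12
        return "WM"          # middle size
    if THC <= c <= TH3:      # fourth size range E13 (closest)
        return "WS"          # small size
    return None              # finger outside the range E2
```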
  • the finger F of the user is approaching to touch the touch panel 31 in the inclined direction in the photography mode.
  • FIG. 24A and FIG. 24B show the situation in which the finger F of the user is about to touch the touch panel 31 in an inclined direction.
  • FIG. 24A shows the image displayed on the LCD 9 in the above situation.
  • FIG. 24B schematically shows the finger F of the user about to touch in the inclined direction in the above situation.
  • the touch panel controller 30 finds the coordinate position of the finger F of the user projected on the touch panel 31 as shown in FIG. 5 in step #14.
  • the touch panel controller 30 sends, to the frame display controller 40, a message that the finger F of the user has entered the second touch approach range E2.
  • the frame display controller 40 displays the frame W for focus on the LCD 9 as shown in FIG. 26A .
  • the frame display controller 40 sequentially demagnifies the size of the frame W for focus in accordance with the frame magnification/demagnification display flowchart in an imaging operation shown in FIG. 21 whenever the finger F of the user approaches the touch panel 31. That is, if the finger F of the user enters the second touch approach range E2 as shown in FIG. 23, the frame display controller 40 judges, in step #50, whether the finger F of the user is in the first size range E10. This judgment is made by whether the capacitance value between the touch panel 31 and the finger F of the user is between Th2 and Tha.
  • If so, the frame display controller 40 displays the frame W for focus having the extra-large size WE on the LCD 9 in step #51.
  • In step #52, the frame display controller 40 judges whether the finger F of the user is in the second size range E11. This judgment is made by whether the capacitance value between the touch panel 31 and the finger F of the user is between Tha and Thb.
  • If so, the frame display controller 40 displays the frame W for focus having the large size WL on the LCD 9 in step #53.
  • In step #54, the frame display controller 40 judges whether the finger F of the user is in the third size range E12. This judgment is made by whether the capacitance value between the touch panel 31 and the finger F of the user is between Thb and Thc.
  • If so, the frame display controller 40 displays the frame W for focus having the middle size WM on the LCD 9 in step #55.
  • In step #56, the frame display controller 40 judges whether the finger F of the user is in the fourth size range E13. This judgment is made by whether the capacitance value between the touch panel 31 and the finger F of the user is between Thc and Th3.
  • If so, the frame display controller 40 displays the frame W for focus having the small size WS on the LCD 9 in step #57.
  • In this way, the frame display controller 40 changes and demagnifies the size of the frame W for focus in the order of the extra-large size WE, the large size WL, the middle size WM, and the small size WS.
  • the touch panel controller 30 then shifts to step #17 in the same manner as described above.
  • the position of the finger F of the user approaching the touch panel 31 is found in accordance with the inclination angle ⁇ , and the frame W for focus is displayed on the display screen of the LCD 9 corresponding to the position of the finger F of the user.
  • the finger F of the user can approach the touch panel 31 from the inclined direction. Consequently, the frame W for focus is not hidden by the finger F of the user, and it is possible to touch within the frame W for focus while reliably recognizing the display position of the frame W for focus, and to perform imaging while checking the image in the part on which autofocusing (AF) is to be performed.
  • the size of the frame W for focus is demagnified in the order of, for example, the extra-large size WE, the large size WL, the middle size WM, and the small size WS whenever the finger F of the user approaches the touch panel 31. Therefore, as the finger F of the user approaches the touch panel 31, the part of the touch panel 31 to be touched by the finger F of the user can be checked on the display screen of the LCD 9 before the finger F of the user touches the touch panel 31. Moreover, operation is easier when the finger F of the user touches within the frame W for focus, and the timing in which the finger F of the user touches the touch panel 31 is more easily known, so that operability during imaging can be improved.
  • a first movement speed ⁇ P1 and a second movement speed ⁇ P2 are set in the touch panel controller 30 to judge a variation ⁇ P in the movement speed of the position where the finger F of the user touches the touch panel 31.
  • the first and second movement speeds ⁇ P1 and ⁇ P2 are set in the relation ⁇ P1> ⁇ P2.
  • the touch panel controller 30 judges whether the variation ΔP in the movement speed of the position where the finger F of the user touches the touch panel 31 is higher than the preset first movement speed ΔP1, between the preset movement speeds ΔP1 and ΔP2, or lower than the movement speed ΔP2.
  • the touch panel controller 30 not only judges the variation ΔP in the movement speed of the position where the finger F of the user touches the touch panel 31 but may also judge the variation ΔP in the movement speed of the finger F of the user while the finger F of the user is within the first and second touch approach ranges E1 and E2 as shown in FIG. 4 and FIG. 5.
  • the frame display controller 40 provides frames W for focus of sizes corresponding to the variation ΔP in the movement speed of the touch position of the finger F of the user.
  • the frames W for focus include a first size corresponding to one divisional region of 9 divisions of the display screen of the LCD 9, a second size corresponding to one divisional region of 18 divisions of the display screen of the LCD 9, and a third size corresponding to one divisional region of 27 divisions of the display screen of the LCD 9.
  • the first size is the largest, the second size is the next largest, and the third size is the smallest.
  • the sizes of the frames W for focus are not limited to the first to third sizes and may be changed. For example, other sizes may be set, or there may be additional kinds of sizes.
  • the frame display controller 40 displays the frame W for focus having the first size on the display screen of the LCD 9.
  • the frame display controller 40 displays the frame W for focus having the second size on the display screen of the LCD 9.
  • the frame display controller 40 displays the frame W for focus having the third size on the display screen of the LCD 9.
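  • A minimal sketch of this speed-dependent selection is given below. The threshold values ΔP1 and ΔP2 and their units are hypothetical placeholders; only the ordering ΔP1 > ΔP2 follows the description.

```python
# Sketch of the speed-dependent size selection, assuming the preset
# movement speeds DP1 > DP2; the numeric values and units (e.g.
# pixels per second) are hypothetical placeholders.
DP1, DP2 = 50.0, 10.0

def frame_size_for_speed(dp):
    """Map the variation dp in the movement speed of the touch
    position of the finger F to one of the three frame sizes."""
    if dp > DP1:
        return "first"   # largest: one region of 9 divisions
    if dp > DP2:
        return "second"  # one region of 18 divisions
    return "third"       # smallest: one region of 27 divisions
```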
  • In step #60, the system controller 26 judges whether the mode selected by, for example, the user operation on the mode dial 6 is the photography mode. If judging that the mode is the photography mode, the system controller 26 starts a photography operation in step #61, controls a series of imaging operations for imaging the subject, and live-view-displays a through-image on the LCD 9 in step #62.
  • In step #63, the touch panel controller 30 monitors a change in capacitance value in the touch panel 31.
  • the touch panel controller 30 judges whether the monitored capacitance value has reached the just-before-touch range T and the finger F of the user has touched the touch panel 31. If the finger F of the user touches the touch panel 31, the touch panel controller 30 also detects a coordinate position of the touched region, and judges whether this coordinate position is a peripheral part on the display screen of the LCD 9. The coordinate position of the peripheral part on the display screen of the LCD 9 is preset.
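  • The peripheral-part test itself is not detailed in the description; the following sketch assumes a simple preset margin around the display screen. The resolution and margin width are hypothetical placeholders.

```python
# Sketch of a preset peripheral-part test for the display screen of
# the LCD 9. The resolution and margin width are hypothetical; the
# description only states that the peripheral coordinates are preset.
WIDTH, HEIGHT, MARGIN = 640, 480, 48

def is_peripheral(x, y):
    """Return True if the touched coordinate (x, y) lies in the
    peripheral part of the display screen."""
    return (x < MARGIN or x >= WIDTH - MARGIN or
            y < MARGIN or y >= HEIGHT - MARGIN)
```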
  • the touch panel controller 30 judges in step #64 whether the coordinate position touched by the finger F of the user is a part other than the peripheral part on the display screen of the LCD 9.
  • If it is judged in step #64 that the coordinate position touched by the finger F of the user is a part other than the peripheral part on the display screen of the LCD 9, the focus controller 41 starts autofocusing (AF) on the subject on the image corresponding to the coordinate position touched by the finger F of the user.
  • the imaging controller 42 performs imaging for a still image by the imaging unit 100 in step #66.
  • the imaging controller 42 records main image data acquired by the imaging in the recording medium 22 via the interface 22a.
  • If it is judged in step #64 that the coordinate position touched by the finger F of the user is not a part other than the peripheral part on the display screen of the LCD 9, the imaging controller 42 judges in step #67 whether the release button 5 has shifted from the on-state of the 1st release switch to the on-state of the 2nd release switch and whether to perform an imaging operation. If judging that the imaging operation is to be performed, the imaging controller 42 sends a face detection instruction to the face detector 43 in step #68. The face detector 43 detects the facial part S of the subject present in the image data.
  • the imaging controller 42 performs imaging for a still image by the imaging unit 100, and records main image data acquired by the imaging in the recording medium 22 via the interface 22a in step #69.
  • If it is judged in step #63 that the coordinate position touched by the finger F of the user is the peripheral part on the display screen of the LCD 9, the touch panel controller 30 sends, to the frame display controller 40, a message to that effect together with the coordinate position touched by the finger F of the user.
  • In step #70, the frame display controller 40 displays the frame W for focus on the LCD 9 in the peripheral part on the display screen of the LCD 9 and at a position corresponding to the coordinate position touched by the finger F of the user, as shown in FIG. 29A.
  • the frame W for focus is displayed in the peripheral part on the display screen of the LCD 9 because when a person is imaged as a subject, the person is generally located in the center of the display screen of the LCD 9, so that the composition for imaging the subject is not affected by the peripheral part of the display screen if the frame W for focus is displayed in the peripheral part.
  • the user moves the frame W for focus on the display screen of the LCD 9 while touching the frame W for focus, and sets the frame W for focus on the facial part S of the subject.
  • In step #71, the frame display controller 40 judges whether the finger F of the user has moved as shown in FIG. 29B. If judging that the finger F of the user has moved, the frame display controller 40 moves the frame W for focus on the display screen of the LCD 9 in accordance with the movement of the finger F of the user in step #72. In moving the frame W for focus, the frame display controller 40 moves it so that the speed and direction of the movement correspond to the speed and direction of the movement of the finger F of the user.
  • In step #73, the frame display controller 40 judges whether the movement of the finger F of the user has finished. If the movement has not finished, the frame display controller 40 continues the display of the frame W for focus in accordance with the movement of the finger F of the user.
  • FIG. 30 shows a frame display movement flowchart of the frame W for focus.
  • In step #80, the touch panel controller 30 judges the change of the coordinate position touched by the finger F of the user. That is, the touch panel controller 30 detects the coordinate position of a place having a great change in capacitance value on the touch panel 31, sequentially detects the coordinate positions of such places as they move with the elapse of time, and from each of the coordinate positions, detects the movement of the coordinate position on the touch panel 31 touched by the finger F of the user.
  • the frame display controller 40 moves the frame W for focus so that the speed and direction of the movement correspond to the speed and direction of the movement of the touch position of the finger F of the user.
  • Here, the variation in the movement speed of the touch position of the finger F of the user is denoted by ΔP, and the direction of the movement by D.
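  • The quantities ΔP and D can be derived from successive coordinate samples, for example as in the following sketch; the (time, x, y) sample format is a hypothetical assumption.

```python
import math

# Sketch of deriving the variation dP in the movement speed and the
# movement direction D from successive touch-position samples; the
# (time, x, y) sample format is a hypothetical assumption.
def speed_and_direction(samples):
    """samples: time-ordered list of (t, x, y), where (x, y) is the
    place with the greatest change in capacitance value at time t.
    Yields (dP, D) for each pair of successive samples, where dP is
    the movement speed and D the movement direction in radians."""
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dx, dy = x1 - x0, y1 - y0
        dp = math.hypot(dx, dy) / (t1 - t0)  # movement speed
        d = math.atan2(dy, dx)               # movement direction
        yield dp, d
```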
  • In step #81, the touch panel controller 30 judges whether the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP1 (ΔP>ΔP1).
  • If not, the touch panel controller 30 judges in step #82 whether the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP2 (ΔP>ΔP2).
  • If it is judged that ΔP>ΔP1, the frame display controller 40 displays the frame W for focus having the first size on the display screen of the LCD 9 in step #83.
  • If it is judged that ΔP1≥ΔP>ΔP2, the frame display controller 40 displays the frame W for focus having the second size on the display screen of the LCD 9 in step #84.
  • If it is judged that ΔP≤ΔP2, the frame display controller 40 displays the frame W for focus having the third size on the display screen of the LCD 9 in step #85.
  • The movement speed of the finger F of the user is high at the start of the movement, and becomes lower as the finger approaches the facial part S.
  • the variation ⁇ P in the movement speed of the touch position of the finger F of the user is, for example, ⁇ P> ⁇ P1 at the start of the movement, and then becomes ⁇ P1> ⁇ P> ⁇ P2, and becomes ⁇ P2> ⁇ P when the frame W for focus is adjusted to the facial part S of the person.
  • the frame display controller 40 first displays the frame W for focus having the first size on the display screen of the LCD 9, and then displays the frame W for focus having the second size on the display screen of the LCD 9, and then displays the frame W for focus having the third size on the display screen of the LCD 9.
  • In step #73, the frame display controller 40 judges whether the variation ΔP in the movement speed of the touch position of the finger F of the user is no longer detected and the movement of the frame W for focus having the third size has finished.
  • the focus controller 41 starts autofocusing (AF) on the subject on the image corresponding to the frame W for focus having the third size.
  • the imaging controller 42 magnifies a through-image of the moving images of the subject corresponding to the frame W for focus having the third size, and live-view-displays a magnified through-image K on the LCD 9 in step #75.
  • The imaging controller 42 displays the magnified through-image K of the subject in the frame W for focus having the third size in a region of the display screen of the LCD 9 that does not overlap the person.
  • the imaging controller 42 performs imaging for a still image by the imaging unit 100 in step #76.
  • the imaging controller 42 records image data acquired by the imaging in the recording medium 22 via the interface 22a in step #77.
  • the system controller 26 displays the still image acquired by the imaging in the imaging unit 100 on the display screen of the LCD 9 in a magnified form.
  • If the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP1, the frame W for focus having the first size is displayed. If the variation ΔP in the movement speed of the touch position of the finger F of the user is between the preset movement speeds ΔP1 and ΔP2, the frame W for focus having the second size is displayed. If the variation ΔP in the movement speed of the touch position of the finger F of the user is lower than the preset movement speed ΔP2, the frame W for focus having the third size is displayed.
  • While the finger F of the user is moving fast, the frame W for focus having the first size is displayed, so that the frame is more easily positioned relative to the facial part S of the person which is the subject.
  • As the finger F of the user slows down, the display is switched from the frame W for focus having the second size to the frame W for focus having the third size, and the frame W for focus can be accurately positioned in accordance with the facial part S.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Details Of Cameras Including Film Mechanisms (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

An imaging apparatus and an imaging method of the same are provided which enable imaging while an image corresponding to a place touched on a monitor is being checked without being hidden.
The imaging apparatus includes a display unit, a touch panel, a frame display controller, and an imaging controller. The frame display controller displays a frame for focus on the display unit if an operation portion is approaching the touch panel, and moves the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel. The imaging controller performs focusing on the subject in the frame for focus and then performs imaging in response to a photography instruction.

Description

    Technical Field
  • The present invention relates to an imaging apparatus wherein a touch panel is provided in a display unit which displays an image acquired by imaging, and wherein an image corresponding to a part of the touch panel touched by an operation portion such as a finger of a user is, for example, displayed in a magnified form.
  • Background Art
  • For example, a technique disclosed in Jpn. Pat. Appln. KOKAI Publication No. 2006-101186 is known as the above-mentioned imaging apparatus. According to this technique, a particular subject to be tracked can be selected by an easy operation during photography. The subject to be tracked is selected in the following manner according to this technique. A touch panel is provided on a monitor. If this touch panel is touched, a particular part corresponding to the touched place is extracted from a displayed image. This particular part is compared with an image displayed by a video signal generated after the extraction of the particular part. An image portion corresponding to the particular part is detected from the video signal.
  • Citation List Patent Literature
  • Patent Literature 1: Jpn. Pat. Appln. KOKAI Publication No. 2006-101186
  • Summary of Invention Technical Problem
  • However, the subject is selected by touching the touch panel provided on the monitor. Therefore, at the time of a touch operation, for example, the displayed image corresponding to a place touched on the monitor is hidden and difficult to see, or may be invisible.
  • An object of the present invention is to provide an imaging apparatus and an imaging method of the same which enable imaging while an image corresponding to a place touched on a monitor is being checked without being hidden, and to provide a storage medium to store a computer-processible tracking program.
  • An imaging apparatus according to an aspect of the present invention comprises a display unit which displays a moving image or a still image, a touch panel provided in the display unit, a frame display controller which displays a frame for focus on the display unit if an operation portion is approaching the touch panel, and moves the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel, and an imaging controller which performs focusing on the subject in the frame for focus and then performs imaging in response to a photography instruction.
  • An imaging method of an imaging apparatus according to an aspect of the present invention comprises detecting whether an operation portion is approaching a touch panel provided in a display unit which displays a moving image or a still image, displaying a frame for focus on the display unit if the operation portion is approaching the touch panel, and moving the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel, and performing focusing on the subject in the frame for focus and then performing imaging for the subject in response to a photography instruction.
  • A non-transitory computer-readable storage medium according to an aspect of the present invention stores a tracking program comprising a detection function to detect whether an operation portion is approaching or has contacted a touch panel provided in a display unit which displays a moving image or a still image, a tracking function to display a frame for focus on the display unit if the operation portion is approaching the touch panel, and move the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel, and an imaging function to perform focusing on the subject in the frame for focus and then perform imaging for the subject in response to a photography instruction.
  • Advantageous Effects of Invention
  • According to the present invention, it is possible to provide an imaging apparatus and an imaging method of the same which enable imaging while an image corresponding to a place touched on a monitor is being checked without being hidden, and to provide a storage medium to store a computer-processible tracking program.
  • Brief Description of Drawings
    • FIG. 1 is a configuration diagram showing a first embodiment of an imaging apparatus according to the present invention;
    • FIG. 2 is a schematic diagram showing a sensor pattern of a touch panel in the same apparatus;
    • FIG. 3 is a graph showing a situation in which, for example, a finger of a user is brought close to the touch panel in the same apparatus;
    • FIG. 4 is a graph showing a capacitance change between the touch panel in the same apparatus and a finger F of a user;
    • FIG. 5 is a diagram showing that the finger of the user is about to touch the touch panel in the same apparatus at an angle;
    • FIG. 6 is a diagram showing the process of a change in capacitance value when the finger of the user is about to touch the touch panel in the same apparatus;
    • FIG. 7 is a diagram showing a frame for focus displayed on a display surface of a liquid crystal display in the same apparatus;
    • FIG. 8 is a diagram showing the movement of the frame for focus following the movement of the finger of the user in the same apparatus;
    • FIG. 9 is a diagram showing how the size of the frame for focus is demagnified or magnified in accordance with the movement speed of the finger of the user in the same apparatus;
    • FIG. 10 is an imaging operation flowchart in a first imaging operation by the same apparatus;
    • FIG. 11 is a photography mode flowchart in the first imaging operation by the same apparatus;
    • FIG. 12 is an imaging operation timing chart in the first imaging operation by the same apparatus;
    • FIG. 13 is a diagram showing a displayed image and the position of the finger of the user in the situation in which the finger of the user is about to touch the touch panel in the same apparatus;
    • FIG. 14 is a diagram showing a displayed image and the position of the finger of the user in the situation in which the finger of the user has entered a first approach range relative to the touch panel in the same apparatus;
    • FIG. 15 is a diagram showing a displayed image and the position of the finger of the user in the situation in which the finger of the user has entered a second approach range relative to the touch panel in the same apparatus;
    • FIG. 16 is a diagram showing a displayed image and the position of the finger of the user in the situation in which the finger of the user has touched the touch panel in the same apparatus;
    • FIG. 17 is a photography mode flowchart in a second imaging operation by the same apparatus;
    • FIG. 18 is an imaging operation timing chart in the second imaging operation by the same apparatus;
    • FIG. 19 is a diagram illustrating the tracking of the frame for focus during the movement of the finger of the user in the same apparatus;
    • FIG. 20 is a diagram showing that the movement of the finger of the user is stopped for a given period of time in the same apparatus;
    • FIG. 21 is a frame magnification/demagnification display flowchart in an imaging operation in the same apparatus;
    • FIG. 22 is an imaging operation timing chart in a third imaging operation in the same apparatus;
    • FIG. 23 is a schematic diagram showing size ranges for changing the size of the frame for focus in the same apparatus;
    • FIG. 24 is a diagram showing a displayed image and the position of the finger of the user in the situation in which the finger of the user is approaching to touch the touch panel in an inclined direction in the same apparatus;
    • FIG. 25 is a diagram showing a displayed image and the position of the finger of the user in the situation in which the finger of the user has approached and entered the first approach range in the inclined direction relative to the touch panel in the same apparatus;
    • FIG. 26 is a diagram showing a displayed image and the position of the finger of the user in the situation in which the finger of the user has approached and entered the second approach range in the inclined direction relative to the touch panel in the same apparatus;
    • FIG. 27 is a diagram showing a displayed image and the position of the finger of the user in the situation in which the finger of the user has touched the touch panel in the same apparatus;
    • FIG. 28 is an imaging operation flowchart in a fourth imaging operation by the same apparatus;
    • FIG. 29 is a diagram showing an example of how to display the frame for focus on a display screen of the liquid crystal display in the same apparatus; and
    • FIG. 30 is a movement flowchart of the frame for focus in the same apparatus.
    Description of Embodiment
  • A first embodiment of the present invention is described below with reference to the drawings.
  • FIG. 1 shows a configuration diagram of an imaging apparatus. An imaging unit 100 images a subject, and outputs a relevant imaging signal. The imaging unit 100 includes a lens 11, a diaphragm 12, an autofocus (AF) mechanism 13, a shutter 14, an imaging sensor 15, an imaging circuit 16, and an A/D converter 17. A light flux from the subject enters the imaging sensor 15 from the lens 11 through the diaphragm 12 and the shutter 14. The imaging sensor 15 converts the entered light flux to an electric signal. The imaging circuit 16 outputs the electric signal from the imaging sensor 15 as an analog imaging signal. The A/D converter 17 converts the analog imaging signal from the imaging circuit 16 to a digital imaging signal in a predetermined format, and then sends the digital imaging signal to an image processor 18 and a finder image generator 23. The AF mechanism 13 moves the lens 11 in an optical axis direction, and performs AF for the subject.
  • For the digital imaging signal converted by the A/D converter 17, the image processor 18 performs predetermined image processing, for example, adjustments including a color correction, a gray scale correction, and a gamma (γ) correction of an image to be represented by the image data. The image processor 18 temporarily stores the image signal after the above adjustments in a buffer memory 19.
  • A compressor/decompressor 21 compresses/decompresses the image signal temporarily stored in the buffer memory 19, forms the image signal into a format suitable for recording in a recording medium 22 to generate main image data, and records the main image data in the recording medium 22 via an interface 22a.
  • A first liquid crystal controller 20 comprises what is known as a live view function to read the image signal temporarily stored in the buffer memory 19, generate a through-image in a format suitable for display on a liquid crystal display (hereinafter abbreviated as an LCD) 9 for the image signal, and display the through-image on the LCD 9.
  • The first liquid crystal controller 20 reads the image data recorded in the recording medium 22 through the buffer memory 19 and the compressor/decompressor 21, and displays this image data on the LCD 9.
  • The finder image generator 23 generates finder image data for moving images in a format suitable for display on a display device 3 such as an EVF from the digital imaging signal converted by the A/D converter 17.
  • A second liquid crystal controller 24 displays, on the display device 3, for example, the EVF, the finder image data for moving images generated by the finder image generator 23.
  • A touch panel 31 is provided on a display screen of the LCD 9. The touch panel 31 detects a touch (contact) by an operation portion such as a finger of a user which touches the touch panel 31, and outputs a coordinate signal corresponding to the touched part.
  • A touch panel controller 30 controls the driving of the touch panel 31, inputs a coordinate signal output from the touch panel 31, and judges the touched part on the touch panel 31. The touch panel controller 30 judges whether, for example, the finger of the user has approached or touched the touch panel 31 in accordance with the change in capacitance between the touch panel 31 and, for example, the finger of the user.
  • FIG. 2 shows a schematic diagram of a sensor pattern of the touch panel 31. In the touch panel 31, sensors 31a are arranged at regular intervals in x-y directions. As described above, in accordance with the change in capacitance between the touch panel 31 and, for example, the finger of the user, each of the sensors 31a judges whether the finger of the user is gradually approaching to touch the touch panel 31 or the finger of the user has touched the touch panel 31. Since the touch panel 31 has a protective layer such as a protective film, protective glass, or a protective panel formed on its surface, a touch by the finger of the user includes a touch on the protective layer. Each of the sensors 31a is of a projected capacitive type; either a self-capacitance method or a mutual-capacitance method may be used. According to the self-capacitance method, a capacitance value generated between each of the sensors 31a and, for example, the finger of the user is sensed. According to the mutual-capacitance method, a capacitance change between the adjacent sensors 31a is sensed if, for example, the finger of the user approaches each of the sensors 31a.
  • FIG. 3 shows a situation in which, for example, a finger F of the user has approached the capacitance type touch panel 31. FIG. 4 shows a capacitance change between the touch panel 31 and the finger F of the user if the finger F of the user has approached the capacitance type touch panel 31. The finger F of the user is located, for example, above coordinates (x, y, z) on the surface of the touch panel 31. The capacitance between the touch panel 31 and the finger F of the user increases in a quadratic manner as the finger F of the user approaches the touch panel 31. Therefore, each of the sensors 31a outputs a signal corresponding to the capacitance between the touch panel 31 and the finger F of the user, that is, a signal corresponding to a distance D (hereinafter referred to as an inter-touch distance) between the touch panel 31 and the finger F of the user.
  • Touch approach ranges E1 to En and a just-before-touch range T are set to judge whether the finger F of the user has approached or touched the touch panel 31. For example, the first and second touch approach ranges E1 and E2 and the just-before-touch range T are set in the touch panel 31, as shown in FIG. 4. The first and second touch approach ranges E1 and E2 and the just-before-touch range T are set as follows: If the inter-touch distance D between the touch panel 31 and the finger F of the user is D3, D2, or D1 (D3>D2>D1), a range between the distances D3 and D2 is set as the first touch approach range E1, a range between the distances D2 and D1 is set as the second touch approach range E2, and a range within the distance D1 and before the finger F of the user touches the touch panel 31 is set as the just-before-touch range T.
  • The first and second touch approach ranges E1 and E2 and the just-before-touch range T are divided by capacitance values Th1, Th2, and Th3 between the touch panel 31 and the finger F of the user. The relation of magnitude between the capacitance values Th1, Th2, and Th3 is Th1<Th2<Th3. Accordingly, the first touch approach range E1 is between the capacitance values Th1 and Th2. The second touch approach range E2 is between the capacitance values Th2 and Th3. The just-before-touch range T is a small range including the capacitance value Th3.
  • Therefore, the touch panel controller 30 monitors the operation in a touch AF mode, that is, a change in capacitance value detected by each of the sensors 31a of the touch panel 31, and detects a region in which the capacitance value changes, as a coordinate position to be touched. The touch panel controller 30 monitors a change in capacitance value in the touch panel 31, compares the monitored capacitance value with each of the capacitance values Th1, Th2 and Th3, and judges whether the finger F of the user is present in the first touch approach range E1, in the second touch approach range E2, or in the just-before-touch range T relative to the touch panel 31, and whether the finger F of the user has touched the touch panel 31.
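  • This range judgment can be summarized by the following sketch, which classifies a monitored capacitance value against the thresholds Th1, Th2, and Th3; the numeric values are hypothetical placeholders.

```python
# Sketch of the range judgment against the capacitance thresholds
# Th1 < Th2 < Th3; the numeric values are hypothetical placeholders.
TH1, TH2, TH3 = 10, 20, 100

def approach_range(c):
    """Classify a monitored capacitance value c into the first touch
    approach range E1, the second touch approach range E2, or the
    just-before-touch range T."""
    if TH1 <= c < TH2:
        return "E1"   # inter-touch distance between D3 and D2
    if TH2 <= c < TH3:
        return "E2"   # inter-touch distance between D2 and D1
    if c >= TH3:
        return "T"    # just before (or at) the touch
    return None       # finger out of the sensing range
```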
  • If the finger F of the user is about to touch the touch panel 31, the finger F of the user touches the touch panel 31 at an angle θ in many cases. FIG. 5 shows that the finger F of the user is about to touch the touch panel 31 in a direction inclined at the angle θ. Here, the inter-touch distance between the finger F of the user and a display surface of the touch panel 31 is D (including D1, D2, and D3), a horizontal distance between a point P to be touched and the finger F of the user is x, and an angle from the point P to be touched is θ (an angle with a line perpendicular to the display surface of the touch panel 31).
  • FIG. 6A, FIG. 6B, and FIG. 6C show the process of a change in capacitance value at regular time intervals if the finger F of the user is about to touch the touch panel 31. In these drawings, numerical values "1", "2" ... "5" schematically indicating capacitance values are shown in the sensors 31a; a higher numerical value indicates a higher capacitance value.
  • As the finger F of the user approaches the touch panel 31, the capacitance value of the touch panel 31 increases. If the inter-touch distance D between the finger F of the user and the display surface of the touch panel 31 is great, the change amount of the capacitance in each of the sensors 31a is small. The touch panel controller 30 determines, as a coordinate position ST1 of an origin, a place where a change in capacitance is detected on the touch panel 31; for example, a place where the numerical value "1" appears as shown in FIG. 6A.
  • At a time t1 after a time t0 at which the coordinate position ST1 of the finger F of the user as the origin has been detected, if the finger F of the user approaches the display surface of the touch panel 31 and the inter-touch distance D decreases, the capacitance value increases on the touch panel 31. In FIG. 6B, there are more places where the numerical values "3" and "1" appear. The touch panel controller 30 detects a place where the greatest change in capacitance value is shown among the numerical values "3" and "1"; for example, a coordinate position ST2 of the place having the numerical value "3", and finds the horizontal distance x and the inter-touch distance D from the coordinate position ST2 of the place having the numerical value "3" and from the coordinate position ST1 of the origin.
  • Furthermore, at a time t2 after the time t1, if the finger F of the user approaches the display surface of the touch panel 31 and the inter-touch distance D decreases, the capacitance value further increases on the touch panel 31. In FIG. 6C, there are more places where the numerical values "5", "3" and "1" appear. The touch panel controller 30 detects a place where the greatest change in capacitance value is shown among the numerical values "5", "3" and "1"; for example, a coordinate position ST3 of the place having the numerical value "5", and finds the horizontal distance x and the inter-touch distance D from the coordinate position ST3 of the place having the numerical value "5" and from the coordinate position ST1 of the origin.
  • After this, the touch panel controller 30 detects a coordinate position STm of the place having the greatest change in capacitance value at regular time intervals, and finds the horizontal movement distance x and the inter-touch distance D from the coordinate position STm and the coordinate position ST1 of the origin. The touch panel controller 30 may find the horizontal movement distance x and the inter-touch distance D from the coordinate position STm of the place having the greatest change in capacitance value and a preceding coordinate position STm-1 at regular time intervals.
  • The touch panel controller 30 finds a movement direction M of the finger F of the user from each horizontal movement distance x and each inter-touch distance D that are found at regular time intervals, finds the angle θ from the movement direction M, and finds a coordinate position of a candidate point P to be touched by the finger F of the user from the horizontal movement distance x, the inter-touch distance D, and the angle θ.
  • Another way to calculate the coordinate position of the candidate point P is described. The touch panel controller 30 finds the inter-touch distance D between the finger F of the user and the display surface of the touch panel 31 from the capacitance value detected by each of the sensors 31a of the touch panel 31. The inter-touch distance D and the angle θ of the finger F of the user have the relation of the trigonometric functions D = r·cosθ and x = r·sinθ, where r is the straight-line distance between the finger F of the user and the point P to be touched. Therefore, the touch panel controller 30 calculates x = D·tanθ from the inter-touch distance D and the angle θ of the finger F of the user, and thereby finds the horizontal movement distance x. From the inter-touch distance D, the angle θ, and the horizontal movement distance x, the touch panel controller 30 finds the coordinate position of the candidate point P to be touched by the finger F of the user.
  • The angle θ of the finger F of the user may be set at a preset angle, for example, 45°. The angle θ (=45°) of the finger F of the user is an angle which the finger F of the user generally makes when touching the touch panel 31.
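  • Under these relations, the candidate point P can be computed as in the following sketch, which projects the finger position along the movement direction M by the horizontal distance x = D·tanθ. The geometry and the default 45° angle follow the description; the coordinate layout is an assumption.

```python
import math

# Sketch of the candidate-point calculation. From D = r*cos(theta) and
# x = r*sin(theta) it follows that x = D*tan(theta); the finger position
# projected on the panel is then shifted by x along the unit movement
# direction (mx, my). The coordinate layout is a hypothetical assumption;
# the default 45 degrees is the preset angle mentioned above.
def candidate_point(px, py, d, mx, my, theta_deg=45.0):
    """(px, py): finger position projected on the touch panel;
    d: inter-touch distance D; (mx, my): unit movement direction M.
    Returns the coordinate of the candidate point P to be touched."""
    x = d * math.tan(math.radians(theta_deg))  # horizontal distance
    return px + mx * x, py + my * x
```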
  • A system controller 26 controls a series of imaging operations in which the subject is imaged. The system controller 26 controls the imaging unit 100, the image processor 18, the buffer memory 19, the compressor/decompressor 21, the interface 22a, the finder image generator 23, the touch panel controller 30, the first liquid crystal controller 20, and the second liquid crystal controller 24. An operation unit 27 and a strobe control circuit 25 are connected to the system controller 26. The operation unit 27 has, for example, a release button 5, a mode dial 6, a cross key 7, and an enter key 8. The operation unit 27 includes, for example, a power button and an end button. The strobe control circuit 25 controls the operation of a strobe 4 which generates a flash light.
  • This system controller 26 includes a frame display controller 40, a focus controller 41, an imaging controller 42, a face detector 43, a list display unit 44, and an image reproducer 45.
  • If the finger F of the user approaches to touch the touch panel 31, the frame display controller 40 displays a frame W for focus in an image region corresponding to the part of the touch panel 31 to be touched on the image displayed on the display screen of the LCD 9 as shown in FIG. 7. In this case, the frame display controller 40 displays the frame W for focus around the coordinate position of the candidate point P found by the touch panel controller 30.
  • The frame display controller 40 displays the frame W for focus if the finger F of the user has entered the second touch approach range E2. The frame W for focus may instead be displayed from the time when the finger F of the user enters the first touch approach range E1.
  • If the frame W for focus is displayed when the finger F of the user is approaching the touch panel 31, the part of the touch panel 31 to be touched by the finger F of the user can be checked on the display screen of the LCD 9 before the finger F of the user touches the touch panel 31. Since the part to be touched by the finger F of the user is found before the finger F of the user touches the touch panel 31, the frame W for focus can be checked without being hidden by the finger F of the user. If it is found from this check that the region to be touched by the finger F of the user is different from the region desired by the user, the region to be touched by the finger F of the user can be changed.
  • Since the coordinate position of the candidate point P to be touched by the finger F of the user can be found in advance by the touch panel controller 30, the focus controller 41 described later can focus on the subject corresponding to the coordinate position of the candidate point P before the finger F of the user touches the touch panel 31. The time required for focusing can thus be shorter than when focusing is performed after the finger F of the user has touched the touch panel 31.
  • The frame W for focus is set on, for example, a face S of a person as a main subject by the user operation. The frame W for focus is formed into, for example, a square, but is not limited thereto. For example, the frame W for focus may be formed into a circle, a rectangle, or a double circle.
  • If the finger F of the user moves in a direction flush with the display screen of the LCD 9 as shown in FIG. 8 so that the finger F of the user is close to the touch panel 31 and, for example, the finger F of the user is within the second touch approach range E2, the frame display controller 40 moves the frame W for focus on the display screen of the LCD 9 in accordance with the movement of the finger F of the user. For example, if the person moves, the frame W for focus is set at the face S of the person who has moved in accordance with the movement of the finger F of the user.
  • If moving the frame W for focus in accordance with the movement of the finger F of the user, the frame display controller 40 moves the frame W for focus at a speed corresponding to the speed at which the finger F of the user moves; for example, at a speed which increases or decreases in accordance with the increase or decrease in the movement speed of the finger F of the user. If moving the frame W for focus in accordance with the movement of the finger F of the user, the frame display controller 40 moves the frame W for focus in the same direction as the direction in which the finger F of the user moves.
  • The frame display controller 40 demagnifies or magnifies the size of the frame W for focus in accordance with the inter-touch distance D between the finger F of the user and the display surface of the touch panel 31. For example, the frame display controller 40 demagnifies the size of the frame W for focus in an arrow direction as shown in FIG. 9 as the inter-touch distance D decreases, and the frame display controller 40 magnifies the size of the frame W for focus as the inter-touch distance D increases. If the size of the frame W for focus is reduced, it is possible to perform display that is narrowed down to the candidate point P to be touched by the finger F of the user.
  • If the imaging apparatus stops moving for a preset time without a change in, for example, the photography direction or the composition of the subject, the frame display controller 40 may lock the display state of the frame W for focus in the display screen of the LCD 9. If the imaging apparatus moves, the frame display controller 40 unlocks the display state of the frame W for focus.
  • The focus controller 41 performs processing to focus on the candidate point P on the subject in the frame W for focus before the finger F of the user touches the touch panel 31.
  • The focus controller 41 does not perform focusing during the movement of the frame W for focus by the frame display controller 40, that is, during the movement in the direction flush with the display screen of the LCD 9. If the movement of the finger F of the user stops, the focus controller 41 performs focusing on the subject in the frame W for focus.
  • The focus controller 41 may perform focusing after the frame W for focus is displayed.
  • If the finger F of the user approaches a predetermined approach range among the touch approach ranges E1 to En, for example, the second touch approach range E2, the focus controller 41 performs focusing on the subject in the frame W for focus.
  • If detecting that the finger F of the user is touching the touch panel 31, the imaging controller 42 performs imaging by the imaging unit 100 at the time of the touch detection.
  • The face detector 43 detects whether the facial part S of the subject is present in the image data, or whether the facial part S of the subject is present in the frame W for focus displayed in the image data.
  • If the facial part S of the person is detected by the face detector 43, the frame display controller 40 may display the frame W for focus for the facial part S of the subject. In the same manner as described above, the display of the frame W for focus is started if the finger F of the user enters the second touch approach range E2. If the facial part S of the subject is photographed as the main subject, the display position of the frame W for focus can be corrected to the facial part S detected by the face detector 43 even if the position of the frame W for focus displayed by the frame display controller 40 is out of position relative to the main subject.
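  • One way to realize the described correction is to snap the frame W for focus to the nearest facial part S reported by the face detector 43, as in the following sketch; the point-list data structures are hypothetical assumptions.

```python
import math

# Sketch of the display-position correction: the frame W for focus
# snaps to the nearest facial part S reported by the face detector 43.
# The point-list data structures are hypothetical assumptions.
def correct_frame_position(frame_center, face_centers):
    """frame_center: (x, y) of the frame W for focus as displayed;
    face_centers: (x, y) centers of the detected facial parts S.
    Returns the corrected center, or the original if no face exists."""
    if not face_centers:
        return frame_center
    fx, fy = frame_center
    return min(face_centers,
               key=lambda f: math.hypot(f[0] - fx, f[1] - fy))
```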
  • The list display unit 44 displays a list of images on the display screen of the LCD 9 during a reproduction mode. The list display unit 44 reads image data recorded in, for example, the recording medium 22, and displays a list of the image data on the display screen of the LCD 9 via the first liquid crystal controller 20.
  • If one item on the list of the image data displayed on the display screen of the LCD 9 by the list display unit 44 is selected, the image reproducer 45 displays the selected image data on the display screen of the LCD 9 in a magnified form.
  • [First imaging operation]
  • Now, a first imaging operation of the apparatus having the above configuration is described with reference to an imaging operation flowchart shown in FIG. 10.
  • The system controller 26 turns on the power of the touch panel 31 in step #1, and turns on the power of the LCD 9 in step #2.
  • In step #3, the system controller 26 receives a mode such as a photography mode or the reproduction mode selected by, for example, the user operation performed on the mode dial 6.
  • In step #4, the system controller 26 judges whether the selected mode is the photography mode or the reproduction mode. When judging that the selected mode is the photography mode, the system controller 26 shifts to step #5, and sets the mode to the photography mode.
  • An imaging operation in the photography mode is described below with reference to a photography mode flowchart shown in FIG. 11.
  • FIG. 12 shows an imaging operation timing chart. FIG. 12 shows a period in which a capacitance value generated between the touch panel 31 and, for example, the finger F of the user can be sensed, a display period of the frame W for focus displayed on the display screen of the LCD 9, a period in which autofocusing (AF) is performed, and a period in which imaging is performed.
  • In the photography mode, the system controller 26 controls the imaging unit 100, the image processor 18, the buffer memory 19, the compressor/decompressor 21, the interface 22a, the finder image generator 23, the touch panel controller 30, the first liquid crystal controller 20, and the second liquid crystal controller 24, and controls a series of imaging operations for imaging the subject.
  • That is, the light flux from the subject enters the imaging sensor 15 from the lens 11 through the diaphragm 12 and the shutter 14. At the same time, the AF mechanism 13 moves the lens 11 in the optical axis direction, and performs AF for the subject. The imaging sensor 15 converts the entered light flux to an electric signal. The electric signal output from the imaging sensor 15 is output by the imaging circuit 16 as an analog imaging signal. The analog imaging signal is converted to a digital imaging signal in a predetermined format by the A/D converter 17, and then sent to the image processor 18 and the finder image generator 23.
  • For the digital imaging signal converted by the A/D converter 17, the image processor 18 performs predetermined image processing, for example, adjustments including a color correction, a gray scale correction, and a gamma (γ) correction of an image to be represented by the image data. The image processor 18 temporarily stores the image signal after the above adjustments in the buffer memory 19.
  • The first liquid crystal controller 20 reads the image signal temporarily stored in the buffer memory 19, generates a through-image in a format suitable for display on the LCD 9 for the image signal, and live-view-displays the through-image on the LCD 9.
  • In this photography mode, the system controller 26 judges in step #10 whether the mode is set to a touch autofocus (AF) mode. If the mode is judged to be set to the touch AF mode, the touch panel controller 30 monitors a change in capacitance value in the touch panel 31, and detects the region in which the capacitance value changes, as a coordinate position to be touched. The touch panel controller 30 compares the monitored capacitance value with each of the capacitance values Th1, Th2 and Th3, and judges whether the finger F of the user is present in the first touch approach range E1, in the second touch approach range E2, or in the just-before-touch range T relative to the touch panel 31, and whether the finger F of the user has touched the touch panel 31.
  • More specifically, in step #11, the touch panel controller 30 monitors a change in capacitance value in the touch panel 31, and judges that the user is bringing the finger F close to the touch panel 31 to touch the touch panel 31. The place to be touched by the finger F of the user is the place on the LCD 9 and on the image desired to be focused in the subject, for example, the part on the touch panel 31 corresponding to the facial part S of the person.
  • FIG. 13A and FIG. 13B show the situation in which the finger F of the user is about to touch the touch panel 31. FIG. 13A shows the image displayed on the LCD 9 in the above situation. FIG. 13B shows the inter-touch distance D between the touch panel 31 and the finger F of the user in the above situation.
  • In step #12, the touch panel controller 30 judges whether the finger F of the user is present in the first touch approach range E1 (D3≥D>D2) as shown in FIG. 14B in accordance with the change in capacitance in the touch panel 31. The first touch approach range E1 is within 30 to 20 mm from the surface of the touch panel 31.
  • If judging that the finger F of the user is present in the first touch approach range E1, the touch panel controller 30 judges that a capacitance value generated between the touch panel 31 and the finger F of the user can be sensed by each of the sensors 31a of the touch panel 31 and that the finger F of the user is about to touch the touch panel 31. Sensing is possible from the point where the finger F of the user has entered the first touch approach range E1, so that when the finger F of the user is out of the first touch approach range E1, sensing is not performed, and no wasteful electricity is consumed. If the finger F of the user is out of the first touch approach range E1, the touch panel controller 30 can inhibit sensing, perform an operation to increase the distance between the finger F of the user and the touch panel 31, or limit circuit operations and calculations in the system controller 26 so that electricity is not wastefully consumed.
  • If the range of the capacitance value generated between the touch panel 31 and the finger F of the user is preset, this capacitance value can be differentiated from a capacitance value generated when a material different from the finger F of the user is about to touch the touch panel 31. Thus, it is possible to only judge that the finger F of the user is about to touch the touch panel 31, and prevent a wrong operation resulting from the different material which is about to touch the touch panel 31.
  • After the finger F of the user has entered the first touch approach range E1 as shown in FIG. 14B, the touch panel controller 30 judges in step #13 whether the finger F of the user is present in the second touch approach range E2 (D2≥D>D1) as shown in FIG. 15B in accordance with the change in capacitance value in the touch panel 31.
  • If judging that the finger F of the user is present in the second touch approach range E2, the touch panel controller 30 finds a coordinate position of the finger F of the user projected on the touch panel 31 in step #14. That is, if the finger F of the user approaches the display surface of the touch panel 31 and the inter-touch distance D decreases, the capacitance value increases on the touch panel 31 as shown in FIG. 6B, and, for example, there are more places where the numerical values "3" and "1" appear. The touch panel controller 30 detects a place where the greatest change in capacitance value is shown among the numerical values "3" and "1", for example, a coordinate position of the place having the numerical value "3", and from this coordinate position, finds the coordinate position of the finger F of the user.
  • At the same time, in step #15, the touch panel controller 30 judges from the capacitance value in the touch panel 31 that the finger F of the user has entered the second touch approach range E2, and the touch panel controller 30 sends the relevant message to the frame display controller 40.
  • On receipt of the message that the finger F of the user is present in the second touch approach range E2, the frame display controller 40 displays the frame W for focus on the LCD 9 as shown in FIG. 15A. Since the frame W for focus is displayed before the finger F of the user touches the touch panel 31, the frame W for focus is not hidden by the finger F of the user, and the part of the touch panel 31 to be touched by the finger F of the user can be checked on the display screen of the LCD 9.
  • The frame display controller 40 does not display the frame W for focus when the finger F of the user is present in the first touch approach range E1, whereas the frame display controller 40 displays the frame W for focus if the finger F of the user is present in the second touch approach range E2. This is because if the finger F of the user is present in the first touch approach range E1, the distance between the finger F of the user and the touch panel 31 is great, and it is impossible to determine whether the user intends to touch the touch panel 31. This makes it possible to prevent unnecessary display of the frame W for focus.
  • The inter-touch distance D between the finger F of the user and the touch panel 31 is between the inter-touch distances D2 and D1 shown in FIG. 4. The second touch approach range E2 is within 20 to 10 mm from the surface of the touch panel 31.
  • In step #16, the touch panel controller 30 judges in accordance with the change in capacitance in the touch panel 31 whether the finger F of the user is present in the just-before-touch range T (D1≥D>D0) as shown in FIG. 16B and has touched the touch panel 31.
  • If it is judged that the finger F of the user is present in the just-before-touch range T and has touched the touch panel 31, the focus controller 41 starts autofocusing (AF) on the subject in the frame W for focus in step #17.
  • In step #18, the focus controller 41 judges whether the autofocusing (AF) on the subject in the frame W for focus has finished. If judging that the autofocusing (AF) on the subject in the frame W for focus has finished, the focus controller 41 locks the autofocusing (AF) in step #19.
  • In step #20, the touch panel controller 30 judges whether the finger F of the user has touched the part of the touch panel 31 corresponding to, for example, the facial part S of the person to be focused in the subject in the frame W for focus as shown in FIG. 16B.
  • If judging that the finger F of the user has touched the part of the touch panel 31 corresponding to, for example, the facial part S of the person to be focused in the subject in the frame W for focus as shown in FIG. 16B, the touch panel controller 30 judges from the capacitance value in the touch panel 31 that the finger F of the user has touched the touch panel 31, and sends the relevant message to the imaging controller 42.
  • On receipt of the message from the touch panel controller 30 that the finger F of the user has touched the touch panel 31, the imaging controller 42 performs imaging for a still image by the imaging unit 100 in step #21 at the detection of the touch. That is, the light flux from the subject enters the imaging sensor 15 from the lens 11 through the diaphragm 12 and the shutter 14. The imaging sensor 15 converts the entered light flux to an electric signal. The electric signal output from the imaging sensor 15 is output by the imaging circuit 16 as an analog imaging signal. The analog imaging signal is converted to a digital imaging signal in a predetermined format by the A/D converter 17, and then sent to the image processor 18. For the digital imaging signal converted by the A/D converter 17, the image processor 18 performs predetermined image processing, for example, adjustments including a color correction, a gray scale correction, and a gamma (γ) correction of an image to be represented by the image data. The image processor 18 temporarily stores the image signal after the above adjustments in the buffer memory 19. The compressor/decompressor 21 compresses/decompresses the image signal temporarily stored in the buffer memory 19, forms the image signal into a format suitable for recording in the recording medium 22 to generate main image data as a still image shown in FIG. 16A, and records the main image data in the recording medium 22 via the interface 22a.
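  • Expressed as a sketch, the signal chain in step #21 is a straight pipeline. Every class and method name below is hypothetical; the text names only the hardware blocks (imaging sensor 15, A/D converter 17, image processor 18, compressor/decompressor 21, recording medium 22).

```python
def capture_still(sensor, adc, processor, codec, medium):
    """Hypothetical still-capture pipeline mirroring step #21."""
    analog = sensor.expose()              # light flux -> analog imaging signal
    digital = adc.convert(analog)         # analog -> digital imaging signal
    adjusted = processor.adjust(digital)  # color, gray scale, gamma corrections
    main_image = codec.compress(adjusted) # format suitable for the medium
    medium.record(main_image)             # write via the interface
    return main_image
```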
  • The system controller 26 then finishes the photography mode in step #22.
  • On the other hand, if judging in step #10 that the mode is not set to the touch autofocus (AF) mode, the system controller 26 shifts to step #30, and judges whether to continue the autofocusing (AF). If the system controller 26 judges to continue the focusing on the subject and the user presses the release button 5 of the operation unit 27 halfway in step #31, the state of the release button 5 shifts from an off-state to an on-state of a 1st release switch.
  • In step #34, the system controller 26 finishes the autofocusing (AF), and judges in step #35 whether the state of the release button 5 shows an on-state of a 2nd release switch. If the state of the release button 5 is judged to be the on-state of the 2nd release switch, the imaging controller 42 causes the imaging unit 100 to perform an imaging operation for a still image of the subject in step #36. The system controller 26 displays the still image acquired by the imaging in the imaging unit 100 on the display screen of the LCD 9 in a magnified form.
  • If, as a result of the judgment of whether to continue the autofocusing (AF) on the subject, the autofocusing on the subject is to be finished, the system controller 26 shifts to step #32. If the release button 5 of the operation unit 27 is pressed halfway by the user, the state of the release button 5 shifts from an off-state to an on-state of the 1st release switch. The system controller 26 starts AF in step #33, finishes the autofocusing (AF) in step #34, and shifts to steps #35, #36, and #23.
  • Returning to the imaging operation flowchart shown in FIG. 10, the system controller 26 reads the states of the mode dial 6 and the end button in the operation unit 27 in step #6. If the mode is continuously set to the photography mode, the system controller 26 returns to step #3. If the end button is operated, the system controller 26 shifts to step #7 and then turns off the power.
  • On the other hand, if judging in step #4 that the mode is set to the reproduction mode, the system controller 26 shifts to step #8, and performs an operation in the reproduction mode. For example, in the reproduction mode, the list display unit 44 reads image data recorded in, for example, the recording medium 22, and displays a list of the image data on the display screen of the LCD 9 via the first liquid crystal controller 20.
  • If one item on the list of the image data displayed on the display screen of the LCD 9 by the list display unit 44 is selected, the image reproducer 45 displays only the selected image data on the display screen of the LCD 9 in a magnified form in step #9.
  • Thus, according to the first imaging operation described above, if the finger F of the user approaches the touch panel 31, the frame W for focus is displayed on the part of the display screen of the LCD 9 corresponding to the image part which the finger F of the user approaches, and autofocusing (AF) on the subject in the frame W for focus is performed. If the finger F of the user touches the touch panel 31 corresponding to the frame W for focus, an imaging operation is performed.
  • According to the first imaging operation described above, the frame W for focus is displayed in the image part on which autofocusing (AF) is to be performed, so that when the finger F of the user touches the touch panel 31 on the display screen of the LCD 9, the corresponding image to be touched by the finger F of the user is easily recognized, and imaging can be performed while the image in the part on which autofocusing (AF) is to be performed is checked.
  • If the frame W for focus is displayed when the finger F of the user is approaching the touch panel 31, the part of the touch panel 31 to be touched by the finger F of the user can be checked on the display screen of the LCD 9 without being hidden by the finger F of the user before the finger F of the user touches the touch panel 31. If it is found out from the check that the region to be touched by the finger F of the user is different from the region desired by the user, the region to be touched by the finger F of the user can be changed.
  • The subject corresponding to the coordinate position of the candidate point P can be focused before the finger F of the user touches the touch panel 31. Therefore, the time before focusing can be shorter than when focusing is performed after the finger F of the user has touched the touch panel 31.
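  • Read as a whole, the first imaging operation is a small state machine: entering range E2 shows the frame, entering the just-before-touch range starts and locks autofocusing, and a touch triggers capture. The sketch below is one possible encoding; the state and event names are assumptions, not terms from this description.

```python
from enum import Enum, auto

class CamState(Enum):
    IDLE = auto()         # finger outside E2, no frame shown
    FRAME_SHOWN = auto()  # finger in E2, frame W displayed
    AF_LOCKED = auto()    # finger in just-before-touch range T, AF locked
    CAPTURED = auto()     # touch detected, still image recorded

TRANSITIONS = {
    (CamState.IDLE, "enter_E2"): CamState.FRAME_SHOWN,
    (CamState.FRAME_SHOWN, "enter_T"): CamState.AF_LOCKED,
    (CamState.AF_LOCKED, "touch"): CamState.CAPTURED,
}

def step(state: CamState, event: str) -> CamState:
    """Advance the first-imaging-operation state machine by one event."""
    return TRANSITIONS.get((state, event), state)
```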
  • [Second imaging operation]
  • Now, a second imaging operation is described with reference to an imaging operation flowchart shown in FIG. 17. FIG. 18 shows an imaging operation timing chart. In the same manner as FIG. 12, FIG. 18 shows a period in which a capacitance value generated between the touch panel 31 and, for example, the finger F of the user can be sensed, a display period of the frame W for focus displayed on the display screen of the LCD 9, a period in which autofocusing (AF) is performed, and a period in which imaging is performed. The same parts as those in the above first imaging operation are indicated by the same reference signs and are not described in detail.
  • In the case of the second imaging operation, the finger F of the user moves in a horizontal direction, that is, in a planar direction parallel with the display screen of the LCD 9, while remaining close to the touch panel 31, for example, within the second touch approach range E2 relative to the touch panel 31.
  • In this second imaging operation, the touch panel controller 30 judges in step #20 whether the finger F of the user has touched the touch panel 31 corresponding to, for example, the facial part S of the person to be focused in the subject in the frame W for focus as shown in FIG. 16B.
  • If it is judged that the finger F of the user has not touched the touch panel 31, the frame display controller 40 shifts to step #40, and judges whether the finger F of the user has moved in the planar direction parallel with the display surface of the LCD 9 as shown in FIG. 8 so that the finger F of the user is close to the touch panel 31, that is, the finger F of the user is present in the second touch approach range E2 shown in FIG. 4. The touch panel controller 30 judges from the capacitance value in the touch panel 31 whether the finger F of the user is present in the second touch approach range E2, as described above.
  • According to the judgment of the movement of the finger F of the user, if the finger F of the user approaches the display surface of the touch panel 31 and enters the second touch approach range E2, the capacitance value increases on the touch panel 31, and there are more places where, for example, the numerical values "5", "3" and "1" appear, as shown in FIG. 6C. The touch panel controller 30 finds, as the coordinate position of the finger F of the user, a place where the greatest change in capacitance value is shown among the numerical values "5", "3" and "1", for example, the place having the numerical value "5".
  • If the finger F of the user moves in the planar direction parallel with the display surface of the touch panel 31 above the touch panel 31 as shown in FIG. 19A, the part in which the capacitance value on the touch panel 31 increases moves in accordance with the movement of the finger F of the user. The touch panel controller 30 tracks the movement of the coordinate position of the finger F of the user where the greatest change in capacitance value is shown. As a result of this tracking, the touch panel controller 30 follows the movement of the finger F of the user, and sends, to the frame display controller 40, each of the coordinate positions where the finger F of the user moves. When the facial part S of the person in the image displayed on the display screen of the LCD 9 moves, the frame W for focus can be made to follow the facial part S by moving the finger F of the user accordingly.
  • The frame display controller 40 sequentially receives each of the coordinate positions of the movement of the finger F of the user found by the touch panel controller 30, and judges whether the finger F of the user has moved a predetermined movement distance M, for example, 10 mm or more from each of the coordinate positions.
  • If judging that the movement distance of the finger F of the user is, for example, 10 mm or more, the frame display controller 40 judges, in step #41, the direction in which the finger F of the user moves in accordance with each of the coordinate positions of the movement of the finger F of the user sequentially received from the touch panel controller 30. The frame display controller 40 then moves the frame W for focus on the display screen of the LCD 9, for example, as shown in FIG. 8, in the movement direction of the finger F of the user by a movement distance 2M (for example, 20 mm), which is twice the predetermined movement distance M, as shown in FIG. 19A and FIG. 19B.
  • If moving the frame W for focus on the display screen of the LCD 9 in accordance with the movement of the finger F of the user, the frame display controller 40 moves the frame W for focus at a speed proportionate to the speed at which the finger F of the user moves.
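  • The movement rule in steps #40 and #41 can be sketched as a gain applied to the finger's hover displacement: nothing happens below the 10 mm threshold M, and beyond it the frame travels twice the finger's distance in the same direction. The numeric values follow the text; the helper function itself is an illustrative assumption.

```python
import math

MOVE_THRESHOLD_MM = 10.0  # predetermined movement distance M from the text
GAIN = 2.0                # frame moves 2M for a finger movement of M

def frame_displacement(start, end):
    """start, end: (x, y) hover coordinates in mm. Returns the frame's
    displacement vector, or None while the finger is under the threshold."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if math.hypot(dx, dy) < MOVE_THRESHOLD_MM:
        return None
    return (GAIN * dx, GAIN * dy)
```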
  • During the movement of the frame W for focus, the focus controller 41 does not perform focusing on the subject, and stops focusing.
  • In step #42, the frame display controller 40 judges whether the finger F of the user stops moving at one place as shown in FIG. 20A, FIG. 20B, and FIG. 20C and stays stopped for a given period of time. If judging that the finger F of the user is moving, the frame display controller 40 returns to step #41, and moves the frame W for focus on the display screen of the LCD 9 in accordance with each of the coordinate positions of the movement of the finger F of the user.
  • If the finger F of the user stops moving for a given period of time, the frame display controller 40 sends, to the focus controller 41, a message that the movement of the finger F of the user has stopped. In response to the message that the movement of the finger F of the user has stopped, the focus controller 41 starts autofocusing (AF) on the subject in the frame W for focus.
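  • The condition "stops moving for a given period of time" in step #42 is, in effect, a dwell detector. The sketch below restarts a timer whenever the hover position leaves a small tolerance circle; the 0.5 s dwell and 2 mm jitter tolerance are assumed values, since this description does not quantify them.

```python
import time

DWELL_S = 0.5    # assumed period the finger must stay still
JITTER_MM = 2.0  # assumed tolerance for unintentional hand shake

class DwellDetector:
    """Reports True once the hover position has been still long enough."""

    def __init__(self):
        self.anchor = None  # position the finger is being held near
        self.since = 0.0    # when it first settled there

    def update(self, pos, now=None) -> bool:
        now = time.monotonic() if now is None else now
        if self.anchor is None or self._dist(pos, self.anchor) > JITTER_MM:
            self.anchor, self.since = pos, now  # moved: restart the clock
            return False
        return now - self.since >= DWELL_S

    @staticmethod
    def _dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
```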
  • In step #20, the touch panel controller 30 again judges whether the finger F of the user has touched the touch panel 31 corresponding to, for example, the facial part S of the person to be focused in the subject in the frame W for focus as shown in FIG. 16B. If judging that the finger F of the user has touched the touch panel 31 in the subject in the frame W for focus, the touch panel controller 30 judges from the capacitance value in the touch panel 31 that the finger F of the user has touched the touch panel 31, and sends the relevant message to the imaging controller 42. On receipt of the message from the touch panel controller 30 that the finger F of the user has touched the touch panel 31, the imaging controller 42 performs imaging for a still image by the imaging unit 100 in step #21 at the detection of the touch. The system controller 26 displays the still image acquired by the imaging in the imaging unit 100 on the display screen of the LCD 9 in a magnified form.
  • Thus, according to the second imaging operation described above, if the finger F of the user is moved in the horizontal direction so that the finger F of the user is close to the touch panel 31 and, for example, the finger F of the user is within the second touch approach range E2 relative to the touch panel 31, the frame W for focus is moved in accordance with the movement of the finger F of the user. If the finger F of the user then stops moving for a given period of time, autofocusing (AF) on the subject in the frame W for focus is resumed, and imaging is performed when the finger F of the user touches the touch panel 31.
  • Consequently, it is possible to provide advantageous effects similar to those of the first imaging operation. In addition, even if the finger F of the user moves, the frame W for focus can be moved in accordance with the movement of the finger F of the user. For example, even if the facial part S of the person has moved on the display screen of the LCD 9, the frame W for focus can be moved in accordance with the movement of the facial part S of the person by moving the finger F of the user to follow the movement of the facial part S. Imaging can then be performed by the autofocusing on the facial part S of the person in the frame W for focus. During the movement of the frame W for focus, autofocusing (AF) on the facial part S of the person in the frame W for focus is not performed; the autofocusing (AF) is thus not performed when focusing is unnecessary, so that no wasteful operation is performed. Even when the facial part S of the person which is the main subject is changed to a facial part S of a different person, the frame W for focus can be displayed in the image part to be touched if the finger F of the user is moved closer to touch the part of the touch panel 31 corresponding to the facial part S of the different person.
  • [Third imaging operation]
  • Now, a third imaging operation is described with reference to the imaging operation flowchart shown in FIG. 17 and with reference to a frame magnification/demagnification display flowchart in an imaging operation shown in FIG. 21. FIG. 22 shows an imaging operation timing chart. In the same manner as FIG. 12, FIG. 22 shows a period in which a capacitance value generated between the touch panel 31 and, for example, the finger F of the user can be sensed, a display period of the frame W for focus displayed on the display screen of the LCD 9, a period in which autofocusing (AF) is performed, and a period in which imaging is performed. The same parts as those in the above second imaging operation are indicated by the same reference signs and are not described in detail.
  • In the third imaging operation, the finger F of the user approaches the touch panel 31 from an inclined direction. The touch panel controller 30 finds the position of the finger F of the user approaching the touch panel 31 as shown in FIG. 5 in accordance with the inclination angle θ, and displays the frame W for focus on the display screen of the LCD 9 corresponding to the position of the finger F of the user.
  • The frame display controller 40 sequentially demagnifies the size of the frame W for focus whenever the finger F of the user approaches the touch panel 31. As the sizes of the frame W for focus, the frame display controller 40 includes sizes such as an extra-large size WE, a large size WL, a middle size WM, and a small size WS. The frame display controller 40 divides the second touch approach range E2 into, for example, four size ranges, and sets the size of the frame W for focus to the extra-large size WE, the large size WL, the middle size WM, and the small size WS in descending order of distance from the touch panel 31.
  • FIG. 23 shows a schematic diagram of the size ranges for changing the size of the frame for focus. The second touch approach range E2 is separated into size ranges such as first to fourth size ranges E10 to E13. The first to fourth size ranges E10 to E13 are separated by capacitance values Tha, Thb, and Thc between the touch panel 31 and the finger F of the user. The relation of magnitude between the capacitance values Tha, Thb, and Thc is Tha<Thb<Thc. The first size range E10 is between the capacitance values Th2 and Tha. The second size range E11 is between the capacitance values Tha and Thb. The third size range E12 is between the capacitance values Thb and Thc. The fourth size range E13 is between the capacitance values Thc and Th3.
  • Therefore, if the finger F of the user approaches the touch panel 31 and then sequentially enters the first to fourth size ranges E10 to E13 of the second touch approach range E2, the frame display controller 40 changes the size of the frame W for focus in the order of the extra-large size WE, the large size WL, the middle size WM, and the small size WS. As a result of this size change, the size of the frame W for focus is sequentially demagnified.
  • If the finger F of the user moves away from the touch panel 31 and enters, for example, the fourth to first size ranges E13 to E10, the frame display controller 40 changes the size of the frame W for focus in the order of the small size WS, the middle size WM, the large size WL, and the extra-large size WE. As a result of this size change, the size of the frame W for focus is sequentially magnified.
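  • The size selection of FIG. 23 reduces to bucketing the capacitance value by the thresholds Th2 < Tha < Thb < Thc < Th3, as in the sketch below. The concrete numbers are assumptions chosen only to preserve that ordering.

```python
TH2, THA, THB, THC, TH3 = 10, 20, 30, 40, 50  # assumed capacitance thresholds

def frame_size(cap_value: float) -> str:
    """Pick the frame size for a capacitance value inside range E2;
    capacitance grows as the finger nears the panel."""
    if TH2 <= cap_value < THA:
        return "extra-large WE"  # first size range E10, finger farthest
    if THA <= cap_value < THB:
        return "large WL"        # second size range E11
    if THB <= cap_value < THC:
        return "middle WM"       # third size range E12
    if THC <= cap_value < TH3:
        return "small WS"        # fourth size range E13, finger nearest
    raise ValueError("capacitance outside the second touch approach range E2")
```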
  • In this third imaging operation, the finger F of the user approaches and is about to touch the touch panel 31 from an inclined direction in the photography mode.
  • FIG. 24A and FIG. 24B show the situation in which the finger F of the user is about to touch the touch panel 31 in an inclined direction. FIG. 24A shows the image displayed on the LCD 9 in the above situation. FIG. 24B schematically shows the finger F of the user about to touch in the inclined direction in the above situation.
  • If the finger F of the user moves in the inclined direction as shown in FIG. 24B and FIG. 25B and enters the second touch approach range E2 through the first touch approach range E1, the touch panel controller 30 finds the coordinate position of the finger F of the user projected on the touch panel 31 as shown in FIG. 5 in step #14. At the same time, in step #15, the touch panel controller 30 sends, to the frame display controller 40, a message that the finger F of the user has entered the second touch approach range E2. On receipt of the message that the finger F of the user has entered the second touch approach range E2, the frame display controller 40 displays the frame W for focus on the LCD 9 as shown in FIG. 26A.
  • From this point, the frame display controller 40 sequentially demagnifies the size of the frame W for focus in accordance with the frame magnification/demagnification display flowchart in an imaging operation shown in FIG. 21 whenever the finger F of the user approaches the touch panel 31. That is, if the finger F of the user enters the second touch approach range E2 as shown in FIG. 23, the frame display controller 40 judges, in step #50, whether the finger F of the user is in the first size range E10. This judgment is made by whether the capacitance value between the touch panel 31 and the finger F of the user is between Th2 and Tha.
  • If judging that the capacitance value between the touch panel 31 and the finger F of the user is between Th2 and Tha, the frame display controller 40 displays the frame W for focus having the extra-large size WE on the LCD 9 in step #51.
  • In step #52, the frame display controller 40 judges whether the finger F of the user is in the second size range E11. This judgment is made by whether the capacitance value between the touch panel 31 and the finger F of the user is between Tha and Thb.
  • If judging that the capacitance value between the touch panel 31 and the finger F of the user is between Tha and Thb, the frame display controller 40 displays the frame W for focus having the large size WL on the LCD 9 in step #53.
  • In step #54, the frame display controller 40 judges whether the finger F of the user is in the third size range E12. This judgment is made by whether the capacitance value between the touch panel 31 and the finger F of the user is between Thb and Thc.
  • If judging that the capacitance value between the touch panel 31 and the finger F of the user is between Thb and Thc, the frame display controller 40 displays the frame W for focus having the middle size WM on the LCD 9 in step #55.
  • In step #56, the frame display controller 40 judges whether the finger F of the user is in the fourth size range E13. This judgment is made by whether the capacitance value between the touch panel 31 and the finger F of the user is between Thc and Th3.
  • If judging that the capacitance value between the touch panel 31 and the finger F of the user is between Thc and Th3, the frame display controller 40 displays the frame W for focus having the small size WS on the LCD 9 in step #57.
  • Therefore, if the finger F of the user approaches the touch panel 31 and then sequentially enters the first to fourth size ranges E10 to E13 of the second touch approach range E2, the frame display controller 40 changes and demagnifies the size of the frame W for focus in the order of the extra-large size WE, the large size WL, the middle size WM, and the small size WS.
  • The touch panel controller 30 then shifts to step #17 in the same manner as described above.
  • Thus, according to the third imaging operation described above, when the finger F of the user approaches the touch panel 31 from the inclined direction, the position of the finger F of the user approaching the touch panel 31 is found in accordance with the inclination angle θ, and the frame W for focus is displayed on the display screen of the LCD 9 corresponding to the position of the finger F of the user. As a result, the finger F of the user can approach the touch panel 31 from the inclined direction. Consequently, the frame W for focus is not hidden by the finger F of the user, and the user can touch within the frame W for focus while reliably recognizing its display position, and can perform imaging while checking the image in the part on which autofocusing (AF) is to be performed.
  • The size of the frame W for focus is demagnified in the order of, for example, the extra-large size WE, the large size WL, the middle size WM, and the small size WS as the finger F of the user approaches the touch panel 31. Therefore, as the finger F of the user approaches the touch panel 31, the part of the touch panel 31 to be touched by the finger F of the user can be checked on the display screen of the LCD 9 before the finger F of the user touches the touch panel 31. Moreover, touching within the frame W for focus becomes easier, and the timing at which the finger F of the user touches the touch panel 31 is more easily anticipated, so that operability during imaging can be improved.
  • [Fourth imaging operation]
  • In a fourth imaging operation, a first movement speed ΔP1 and a second movement speed ΔP2 are set in the touch panel controller 30 to judge a variation ΔP in the movement speed of the position where the finger F of the user touches the touch panel 31. The first and second movement speeds ΔP1 and ΔP2 are set in the relation ΔP1>ΔP2.
  • The touch panel controller 30 judges whether the variation ΔP in the movement speed of the position where the finger F of the user touches the touch panel 31 is higher than the preset first movement speed ΔP1, between the preset movement speeds ΔP1 and ΔP2, or lower than the movement speed ΔP2.
  • The touch panel controller 30 not only judges the variation ΔP in the movement speed of the position where the finger F of the user touches the touch panel 31 but may also judge the variation ΔP in the movement speed of the finger F of the user while the finger F of the user is close to the first and second touch approach ranges E1 and E2 as shown in FIG. 4 and FIG. 5.
  • The frame display controller 40 includes the frames W for focus of sizes corresponding to the variation ΔP in the movement speed of the touch position of the finger F of the user. For example, the frames W for focus include a first size corresponding to one divisional region of 9 divisions of the display screen of the LCD 9, a second size corresponding to one divisional region of 18 divisions of the display screen of the LCD 9, and a third size corresponding to one divisional region of 27 divisions of the display screen of the LCD 9. Regarding the relation between the sizes of the frames W for focus, the first size is the largest, the second size is the next largest, and the third size is the smallest. The sizes of the frames W for focus are not limited to the first to third sizes and may be changed. For example, other sizes may be set, or there may be additional kinds of sizes.
  • If the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP1, the frame display controller 40 displays the frame W for focus having the first size on the display screen of the LCD 9.
  • If the variation ΔP in the movement speed of the touch position of the finger F of the user is between the preset movement speeds ΔP1 and ΔP2, the frame display controller 40 displays the frame W for focus having the second size on the display screen of the LCD 9.
  • If the variation ΔP in the movement speed of the touch position of the finger F of the user is lower than the preset movement speed ΔP2, the frame display controller 40 displays the frame W for focus having the third size on the display screen of the LCD 9.
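  • The three-way speed rule above maps directly to a threshold cascade, as in the sketch below. Only the ordering ΔP1 > ΔP2 comes from the text; the numeric speeds and units are assumptions.

```python
DP1, DP2 = 50.0, 15.0  # assumed speeds in mm/s, with DP1 > DP2

def frame_size_for_speed(dp: float) -> str:
    """Fast movement gets a large frame for coarse aiming; slow movement
    gets a small frame for fine positioning."""
    if dp > DP1:
        return "first size"   # one region of a 9-division screen
    if dp > DP2:
        return "second size"  # one region of an 18-division screen
    return "third size"       # one region of a 27-division screen
```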
  • Now, the fourth imaging operation is described with reference to an imaging operation flowchart shown in FIG. 28.
  • In step #60, the system controller 26 judges whether the mode selected by, for example, the user operation on the mode dial 6 is the photography mode. If judging that the mode is the photography mode, the system controller 26 starts a photography operation in step #61, controls a series of imaging operations for imaging the subject, and live-view-displays a through-image on the LCD 9 in step #62.
  • In step #63, the touch panel controller 30 monitors a change in capacitance value in the touch panel 31. The touch panel controller 30 judges whether the monitored capacitance value has reached the just-before-touch range T and the finger F of the user has touched the touch panel 31. If the finger F of the user touches the touch panel 31, the touch panel controller 30 also detects a coordinate position of the touched region, and judges whether this coordinate position is a peripheral part on the display screen of the LCD 9. The coordinate position of the peripheral part on the display screen of the LCD 9 is preset.
  • If judging that the coordinate position touched by the finger F of the user is not the peripheral part on the display screen of the LCD 9, the touch panel controller 30 judges in step #64 whether the coordinate position touched by the finger F of the user is a part other than the peripheral part on the display screen of the LCD 9.
  • If judging that the coordinate position touched by the finger F of the user is a part other than the peripheral part on the display screen of the LCD 9, the focus controller 41 starts autofocusing (AF) on the subject on the image corresponding to the coordinate position touched by the finger F of the user.
  • If the autofocusing (AF) is finished and the release button 5 shifts from the on-state of the 1st release switch to the on-state of the 2nd release switch, the imaging controller 42 performs imaging for a still image by the imaging unit 100 in step #66. The imaging controller 42 records main image data acquired by the imaging in the recording medium 22 via the interface 22a.
  • If it is judged in step #64 that the coordinate position touched by the finger F of the user is not a part other than the peripheral part on the display screen of the LCD 9, the imaging controller 42 judges in step #67 whether the release button 5 has shifted from the on-state of the 1st release switch to the on-state of the 2nd release switch and whether to perform an imaging operation. If judging that the imaging operation is performed, the imaging controller 42 sends a face detection instruction to the face detector 43 in step #68. The face detector 43 detects the facial part S of the subject present in the image data.
  • If the facial part S of the subject is detected and the release button 5 shifts from the on-state of the 1st release switch to the on-state of the 2nd release switch, the imaging controller 42 performs imaging for a still image by the imaging unit 100, and records main image data acquired by the imaging in the recording medium 22 via the interface 22a in step #69.
  • On the other hand, if it is judged in step #63 that the coordinate position touched by the finger F of the user is the peripheral part on the display screen of the LCD 9, the touch panel controller 30 sends, to the frame display controller 40, a message that the coordinate position touched by the finger F of the user is the peripheral part on the display screen of the LCD 9, and the coordinate position touched by the finger F of the user.
  • In step #70, the frame display controller 40 displays the frame W for focus on the LCD 9 in the peripheral part of the display screen and at a position corresponding to the coordinate position touched by the finger F of the user as shown in FIG. 29A. The frame W for focus is displayed in the peripheral part because, when a person is imaged as a subject, the person is generally located in the center of the display screen of the LCD 9; displaying the frame W for focus in the peripheral part therefore does not affect the composition for imaging the subject.
  • Here, the user moves the frame W for focus on the display screen of the LCD 9 while touching the frame W for focus, and sets the frame W for focus on the facial part S of the subject.
  • In step #71, the frame display controller 40 judges whether the finger F of the user has moved as shown in FIG. 29B. If judging that the finger F of the user has moved, the frame display controller 40 moves the frame W for focus on the display screen of the LCD 9 in accordance with the movement of the finger F of the user in step #72. If moving the frame W for focus in accordance with the movement of the finger F of the user, the frame display controller 40 moves the frame W for focus so that the speed and direction of the movement correspond to the speed and direction of the movement of the finger F of the user.
  • In step #73, the frame display controller 40 judges whether the movement of the finger F of the user has finished. If the movement has not finished, the frame display controller 40 continues the display of the frame W for focus in accordance with the movement of the finger F of the user.
  • FIG. 30 shows a frame display movement flowchart of the frame W for focus.
  • In step #80, the touch panel controller 30 judges the change of the coordinate position touched by the finger F of the user. That is, the touch panel controller 30 detects a coordinate position of a place having a great change in capacitance value on the touch panel 31, sequentially detects coordinate positions of places having a great change in capacitance value which move with the elapse of time, and from each of the coordinate positions, detects the movement of the coordinate position on the touch panel 31 touched by the finger F of the user. When moving the frame W for focus in accordance with the movement on the touch panel 31 touched by the finger F of the user, the frame display controller 40 moves the frame W for focus so that the speed and direction of the movement correspond to the speed and direction of the movement of the touch position of the finger F of the user. Here, a variation in the movement speed of the touch position of the finger F of the user is ΔP, and the direction of the movement is D.
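  • The quantities used from step #81 onward, the speed variation ΔP and the movement direction D, can be derived from two successive touch coordinates and the time between them, as in this sketch; the sampling interval and function name are assumptions.

```python
import math

def motion(p0, p1, dt_s: float):
    """p0, p1: successive (x, y) touch coordinates in mm; dt_s: seconds
    between samples. Returns (speed in mm/s, direction in radians)."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt_s
    direction = math.atan2(dy, dx)
    return speed, direction
```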
  • In step #81, the touch panel controller 30 judges whether the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP1 (ΔP>ΔP1). In step #82, the touch panel controller 30 judges whether the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP2 (ΔP>ΔP2).
  • If it is found from these judgments that the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP1, the frame display controller 40 displays the frame W for focus having the first size on the display screen of the LCD 9 in step #83.
  • If the variation ΔP in the movement speed of the touch position of the finger F of the user is between the preset movement speeds ΔP1 and ΔP2, the frame display controller 40 displays the frame W for focus having the second size on the display screen of the LCD 9 in step #84.
  • If the variation ΔP in the movement speed of the touch position of the finger F of the user is lower than the preset movement speed ΔP2, the frame display controller 40 displays the frame W for focus having the third size on the display screen of the LCD 9 in step #85.
  • If the finger F of the user touches the peripheral part on the display screen of the LCD 9 to display the frame W for focus and adjust the frame W for focus to the facial part S of the person which is the subject as shown in FIG. 29A, FIG. 29B, and FIG. 29C, the movement speed of the finger F of the user is higher at the start of the movement, and becomes lower as the finger approaches the facial part S. The variation ΔP in the movement speed of the touch position of the finger F of the user is, for example, ΔP>ΔP1 at the start of the movement, then becomes ΔP1>ΔP>ΔP2, and becomes ΔP2>ΔP when the frame W for focus is adjusted to the facial part S of the person.
  • The frame display controller 40 first displays the frame W for focus having the first size on the display screen of the LCD 9, and then displays the frame W for focus having the second size on the display screen of the LCD 9, and then displays the frame W for focus having the third size on the display screen of the LCD 9.
  • In step #73, the frame display controller 40 judges whether the variation ΔP in the movement speed of the touch position of the finger F of the user is no longer found and the movement of the frame W for focus having the third size has finished.
  • If it is judged that the movement of the frame W for focus having the third size has finished, the focus controller 41 starts autofocusing (AF) on the subject on the image corresponding to the frame W for focus having the third size.
  • If the autofocusing (AF) is finished, the imaging controller 42 magnifies a through-image of the moving images of the subject corresponding to the frame W for focus having the third size, and live-view-displays the magnified through-image K on the LCD 9 in step #75. The imaging controller 42 displays the magnified through-image K of the subject in the frame W for focus having the third size, for example, in a region of the display screen of the LCD 9 that does not overlap the person.
  • If the release button 5 shifts from the on-state of the 1st release switch to the on-state of the 2nd release switch, the imaging controller 42 performs imaging for a still image by the imaging unit 100 in step #76. The imaging controller 42 records image data acquired by the imaging in the recording medium 22 via the interface 22a in step #77. The system controller 26 displays the still image acquired by the imaging in the imaging unit 100 on the display screen of the LCD 9 in a magnified form.
  • Thus, according to the fourth imaging operation described above, it is judged whether the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP1, or between the preset movement speeds ΔP1 and ΔP2, or lower than the movement speed ΔP2. If it is judged that the variation ΔP in the movement speed of the touch position of the finger F of the user is higher than the preset movement speed ΔP1, the frame W for focus having the first size is displayed. If the variation ΔP in the movement speed of the touch position of the finger F of the user is between the preset movement speeds ΔP1 and ΔP2, the frame W for focus having the second size is displayed. If the variation ΔP in the movement speed of the touch position of the finger F of the user is lower than the preset movement speed ΔP2, the frame W for focus having the third size is displayed.
  • Thus, at the beginning of the movement of the frame W for focus, the frame W for focus having the first size is displayed and is more easily positioned relative to the facial part S of the person which is the subject. As the frame W for focus approaches the facial part S, the movement speed of the finger F of the user becomes lower, the display is switched from the frame W for focus having the second size to the frame W for focus having the third size, and the frame W for focus can be accurately positioned in accordance with the facial part S.

Claims (25)

  1. An imaging apparatus characterized by comprising:
    a display unit which displays a moving image or a still image;
    a touch panel provided in the display unit;
    a frame display controller which displays a frame for focus on the display unit when an operation portion is approaching the touch panel, and moves the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel; and
    an imaging controller which performs focusing on the subject in the frame for focus and then performs imaging in response to a photography instruction.
  2. The imaging apparatus according to claim 1, characterized by further comprising:
    an imaging unit which images the subject;
    a touch panel controller which judges whether the operation portion is approaching the touch panel or has contacted the touch panel; and
    a focus controller which performs focusing on the subject in the frame for focus,
    wherein the display unit displays the moving image or the still image obtained by the imaging unit,
    the frame display controller displays the frame for focus on the display unit and moves the frame for focus in accordance with the movement of the operation portion in a situation in which the operation portion is approaching the touch panel, and
    the imaging controller performs imaging by use of the imaging unit.
  3. The imaging apparatus according to claim 1, characterized in that the display unit includes a display screen,
    the touch panel is provided on the display screen, and
    the frame display controller displays the frame for focus at a coordinate position corresponding to a part where the operation portion is approaching the touch panel on the display screen of the display unit.
  4. The imaging apparatus according to claim 3, characterized in that when the operation portion approaches and contacts the touch panel from an inclined direction,
    the touch panel controller finds the position of the part where the operation portion is about to touch the touch panel in accordance with the inclination angle and displays the frame for focus on the display screen corresponding to the position of this part.
  5. The imaging apparatus according to claim 1, characterized in that the frame display controller moves the frame for focus on the display screen of the display unit in accordance with the movement of the approaching operation portion.
  6. The imaging apparatus according to claim 5, characterized in that the frame display controller moves the frame for focus at a speed corresponding to the movement speed of the operation portion and in a direction corresponding to the movement direction of the operation portion.
  7. The imaging apparatus according to claim 6, characterized in that the frame display controller demagnifies or magnifies the size of the frame for focus in accordance with the distance between the operation portion and the touch panel.
  8. The imaging apparatus according to claim 7, characterized in that the frame display controller demagnifies the size of the frame for focus as the distance between the operation portion and the touch panel decreases.
  9. The imaging apparatus according to claim 1, characterized in that the focus controller does not perform the focusing while the frame for focus is being moved by the frame display controller, and the focus controller performs the focusing on the subject when the operation portion stops.
  10. The imaging apparatus according to claim 1, characterized in that when detecting that the operation portion has contacted the touch panel, the imaging controller performs an imaging operation for the subject by use of the imaging unit at the detection of the contact.
  11. The imaging apparatus according to claim 1, characterized by further comprising a face detector which detects whether a facial part of the subject is present in the frame for focus,
    wherein the frame display controller corrects the display position of the frame for focus to the facial part detected by the face detector.
  12. The imaging apparatus according to claim 1, characterized in that the touch panel controller presets touch approach ranges corresponding to the distances at which the operation portion approaches the touch panel, and judges which of the touch approach ranges the operation portion is in, and
    when the touch panel controller judges that the operation portion enters the touch approach range which is far from the touch panel among the touch approach ranges for the touch panel, the frame display controller displays the frame for focus on the display screen of the display unit.
  13. The imaging apparatus according to claim 1, characterized in that when the operation portion enters one of the touch approach ranges which is near the touch panel among the touch approach ranges for the touch panel, the focus controller performs the focusing on the subject.
  14. The imaging apparatus according to claim 12, characterized in that in the case where the operation portion approaches and contacts the touch panel from a perpendicular direction or an inclined direction,
    the touch panel controller sets first and second touch approach ranges and a just-before-touch range in descending order of distance of the operation portion from the touch panel,
    the frame display controller displays the frame for focus on the display screen of the display unit when the operation portion enters the second touch approach range,
    the focus controller performs the focusing on the subject when the operation portion enters the just-before-touch range, and
    the imaging controller performs the imaging operation by use of the imaging unit when the operation portion touches the touch panel.
  15. The imaging apparatus according to claim 14, characterized in that the frame display controller sequentially demagnifies the size of the frame for focus whenever the operation portion comes in the second touch approach range and the operation portion approaches the touch panel.
  16. The imaging apparatus according to claim 14, characterized in that during the focusing after the operation portion has entered the second touch approach range, the focus controller stops the focusing when the operation portion moves, and the focus controller resumes the focusing on the subject when the operation portion stops again.
  17. The imaging apparatus according to claim 14, characterized in that the imaging controller displays an image of the subject acquired by the imaging in the imaging unit on the display screen of the display unit in a magnified form.
  18. The imaging apparatus according to claim 1, characterized by further comprising:
    a list display unit which displays a list of images on the display screen of the display unit during a reproduction mode; and
    an image reproducer which displays one image on the display unit in a magnified form when this image is selected from the above images,
    wherein in a situation in which the touch panel controller judges that the operation portion is approaching or has contacted the touch panel, the frame display controller displays the frame for focus on the display unit and moves the frame for focus to the selected image to follow the movement of the operation portion.
  19. An imaging method of an imaging apparatus, characterized by comprising:
    detecting whether an operation portion is approaching a touch panel provided in a display unit which displays a moving image or a still image;
    displaying a frame for focus on the display unit when the operation portion is approaching the touch panel, and moving the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel; and
    performing focusing on the subject in the frame for focus and then performing imaging for the subject in response to a photography instruction.
  20. The imaging method of the imaging apparatus according to claim 19, characterized in that in the case where the operation portion is approaching the touch panel from an inclined direction,
    the position of the operation portion which is approaching the touch panel is found in accordance with the inclination, and the frame for focus is displayed on the display unit corresponding to the position of the operation portion.
  21. The imaging method of the imaging apparatus according to claim 19, characterized in that the movement of the frame for focus follows the movement of the approaching operation portion.
  22. The imaging method of the imaging apparatus according to claim 21, characterized in that the size of the frame for focus is demagnified or magnified in accordance with the distance between the operation portion and the touch panel.
  23. The imaging method of the imaging apparatus according to claim 22, characterized in that the size of the frame for focus is demagnified as the distance between the operation portion and the touch panel decreases.
  24. The imaging method of the imaging apparatus according to claim 19, characterized in that the focusing is not performed while the frame for focus is being moved, and the focusing on the subject is performed when the operation portion stops.
  25. A storage medium configured to store a computer-processible tracking program, the program comprising:
    a detection function to detect whether an operation portion is approaching or has contacted a touch panel provided in a display unit which displays a moving image or a still image;
    a tracking function to display a frame for focus on the display unit when the operation portion is approaching the touch panel, and move the frame for focus in accordance with the movement of the operation portion that is approaching the touch panel; and
    an imaging function to perform focusing on the subject in the frame for focus and then perform imaging for the subject in response to a photography instruction.
EP12858021.4A 2011-12-16 2012-12-13 Imaging device and imaging method, and storage medium for storing tracking program processable by computer Withdrawn EP2782328A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011275924 2011-12-16
PCT/JP2012/082364 WO2013089190A1 (en) 2011-12-16 2012-12-13 Imaging device and imaging method, and storage medium for storing tracking program processable by computer

Publications (2)

Publication Number Publication Date
EP2782328A1 true EP2782328A1 (en) 2014-09-24
EP2782328A4 EP2782328A4 (en) 2015-03-11

Family

ID=48612631

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12858021.4A Withdrawn EP2782328A4 (en) 2011-12-16 2012-12-13 Imaging device and imaging method, and storage medium for storing tracking program processable by computer

Country Status (6)

Country Link
US (1) US9113073B2 (en)
EP (1) EP2782328A4 (en)
JP (1) JP5373229B1 (en)
KR (1) KR101585488B1 (en)
CN (2) CN104012073B (en)
WO (1) WO2013089190A1 (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013175787A1 (en) * 2012-05-24 2013-11-28 パナソニック株式会社 Imaging device
JP6302215B2 (en) * 2012-11-23 2018-03-28 キヤノン株式会社 Imaging device
KR102056316B1 (en) * 2013-05-03 2020-01-22 삼성전자주식회사 Method of operating touch screen and electronic device thereof
JP6213076B2 (en) * 2013-09-05 2017-10-18 コニカミノルタ株式会社 Touch panel input device, touch panel input device control method, and touch panel input device control program
CN104065878B (en) * 2014-06-03 2016-02-24 小米科技有限责任公司 Filming control method, device and terminal
US9584725B2 (en) 2014-06-03 2017-02-28 Xiaomi Inc. Method and terminal device for shooting control
US9235278B1 (en) * 2014-07-24 2016-01-12 Amazon Technologies, Inc. Machine-learning based tap detection
CN104270573A (en) * 2014-10-27 2015-01-07 上海斐讯数据通信技术有限公司 Multi-touch focus imaging system and method, as well as applicable mobile terminal
CN107465877B (en) * 2014-11-20 2019-07-09 Oppo广东移动通信有限公司 Track focusing method and device and related media production
CN104506765B (en) * 2014-11-21 2018-05-11 惠州Tcl移动通信有限公司 A kind of mobile terminal and its focusing method based on touch gestures
JP6222148B2 (en) * 2015-03-19 2017-11-01 カシオ計算機株式会社 Imaging apparatus, image reproduction method, and program
KR102429427B1 (en) * 2015-07-20 2022-08-04 삼성전자주식회사 Image capturing apparatus and method for the same
US11221707B2 (en) * 2016-01-28 2022-01-11 Maxell, Ltd. Imaging device
EP4102827A1 (en) 2016-08-31 2022-12-14 Canon Kabushiki Kaisha Image capture control apparatus and control method therefor
US11106315B2 (en) * 2018-11-29 2021-08-31 International Business Machines Corporation Touch screen device facilitating estimation of entity orientation and identity
US11523060B2 (en) 2018-11-29 2022-12-06 Ricoh Company, Ltd. Display device, imaging device, object moving method, and recording medium
US10798292B1 (en) * 2019-05-31 2020-10-06 Microsoft Technology Licensing, Llc Techniques to set focus in camera in a mixed-reality environment with hand gesture interaction
CN110908558B (en) * 2019-10-30 2022-10-18 维沃移动通信(杭州)有限公司 Image display method and electronic equipment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0213215D0 (en) * 2002-06-08 2002-07-17 Lipman Robert M Computer navigation
JP2006101186A (en) 2004-09-29 2006-04-13 Nikon Corp Camera
JP2006319903A (en) 2005-05-16 2006-11-24 Fujifilm Holdings Corp Mobile apparatus provided with information display screen
JP4929630B2 (en) * 2005-07-06 2012-05-09 ソニー株式会社 Imaging apparatus, control method, and program
GB0617400D0 (en) * 2006-09-06 2006-10-18 Sharan Santosh Computer display magnification for efficient data entry
KR101505681B1 (en) * 2008-09-05 2015-03-30 엘지전자 주식회사 Mobile Terminal With Touch Screen And Method Of Photographing Image Using the Same
JP2011028345A (en) * 2009-07-22 2011-02-10 Olympus Imaging Corp Condition change device, camera, mobile apparatus and program
JP4701424B2 (en) * 2009-08-12 2011-06-15 島根県 Image recognition apparatus, operation determination method, and program
US8826184B2 (en) * 2010-04-05 2014-09-02 Lg Electronics Inc. Mobile terminal and image display controlling method thereof
JP5848561B2 (en) * 2011-09-20 2016-01-27 キヤノン株式会社 Imaging apparatus, control method therefor, program, and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090244357A1 (en) * 2008-03-27 2009-10-01 Sony Corporation Imaging apparatus, imaging method and program
US20110267530A1 (en) * 2008-09-05 2011-11-03 Chun Woo Chang Mobile terminal and method of photographing image using the same
EP2267716A2 (en) * 2009-06-23 2010-12-29 Sony Corporation Image processing device, image processing method and program
US20110018827A1 (en) * 2009-07-27 2011-01-27 Sony Corporation Information processing apparatus, display method, and display program
US20110084962A1 (en) * 2009-10-12 2011-04-14 Jong Hwan Kim Mobile terminal and image processing method therein

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2013089190A1 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3255523A4 (en) * 2015-03-13 2018-05-23 Huawei Technologies Co., Ltd. Electronic device, photographing method and photographing apparatus
CN105681657A (en) * 2016-01-15 2016-06-15 广东欧珀移动通信有限公司 Shooting focusing method and terminal device
CN105681657B (en) * 2016-01-15 2017-11-14 广东欧珀移动通信有限公司 A kind of method and terminal device for shooting focusing

Also Published As

Publication number Publication date
CN104012073B (en) 2017-06-09
KR101585488B1 (en) 2016-01-14
CN104012073A (en) 2014-08-27
KR20140101816A (en) 2014-08-20
EP2782328A4 (en) 2015-03-11
US20140293086A1 (en) 2014-10-02
WO2013089190A1 (en) 2013-06-20
JP5373229B1 (en) 2013-12-18
CN107197141A (en) 2017-09-22
US9113073B2 (en) 2015-08-18
CN107197141B (en) 2020-11-03
JPWO2013089190A1 (en) 2015-04-27

Similar Documents

Publication Publication Date Title
US9113073B2 (en) Imaging apparatus and imaging method of the same, and storage medium to store computer-processible tracking program
US20230367455A1 (en) Information processing apparatus for responding to finger and hand operation inputs
US10205869B2 (en) Video processing apparatus, control method, and recording medium
US8687103B2 (en) Electronic apparatus and method of operating electronic apparatus through touch sensor
EP2701051B1 (en) Electronic apparatus and control method thereof
US9075442B2 (en) Image processing apparatus, method, and computer-readable storage medium calculation size and position of one of an entire person and a part of a person in an image
JP5281838B2 (en) Imaging device
JP2013143578A (en) Image processing device and control method therefor
JP5229928B1 (en) Gaze position specifying device and gaze position specifying program
US11009991B2 (en) Display control apparatus and control method for the display control apparatus
JP6662441B2 (en) Image measuring program, image measuring machine, and image measuring method
JP2013117891A (en) User interface device
KR20100022634A (en) Apparatus for controlling the zoom lens of the camera module and method of the same
JP2015026943A (en) Display device and method for controlling the same

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140618

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

A4 Supplementary search report drawn up and despatched

Effective date: 20150210

RIC1 Information provided on ipc code assigned before grant

Ipc: G03B 17/18 20060101ALI20150204BHEP

Ipc: G06F 3/0482 20130101ALI20150204BHEP

Ipc: H04N 5/225 20060101AFI20150204BHEP

Ipc: G06F 3/0488 20130101ALI20150204BHEP

Ipc: G06F 3/041 20060101ALI20150204BHEP

Ipc: G06F 3/0486 20130101ALI20150204BHEP

Ipc: G03B 17/02 20060101ALI20150204BHEP

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: OLYMPUS CORPORATION

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180524

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20180829