CN102457661B - Camera

Camera

Info

Publication number
CN102457661B
CN102457661B (application CN201110314592.0A; publication CN102457661A)
Authority
CN
China
Prior art keywords
EVF
touch panel
picture
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201110314592.0A
Other languages
Chinese (zh)
Other versions
CN102457661A (en)
Inventor
朝仓康夫 (Asakura Yasuo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aozhixin Digital Technology Co., Ltd.
Original Assignee
Olympus Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp
Publication of CN102457661A
Application granted
Publication of CN102457661B


Landscapes

  • Studio Devices (AREA)
  • Viewfinders (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)

Abstract

The present invention provides a camera in which the user can look through the viewfinder while operating a touch panel to set conditions on the viewfinder screen. The camera comprises: a rear display unit (29) arranged on the back surface of the camera body; a touch panel unit (14) superposed on the rear display unit (29), which detects input operation instructions; a look-through type EVF (30) arranged on the upper part of the camera body; a display switching unit (11a) that switches the display destination of an image to either the rear display unit (29) or the EVF (30); and an operation input control unit (11d) that detects operations on the touch panel unit (14), controls the input of operation instructions to the screen of the rear display unit (29) or the EVF (30), and positions a cursor (84) on the display of the EVF (30) by a slide-amount designation operation on the touch panel unit (14).

Description

Camera
Technical field
The present invention relates to a camera having a rear display unit arranged on the back surface of the camera body and a look-through type EVF.
Background technology
Among interchangeable-lens cameras, there are cameras provided with both a rear display unit composed of an LCD (liquid crystal display) and a look-through type viewfinder. The look-through viewfinder is not limited to an optical finder; EVFs (Electronic View Finders) have also appeared. In addition, a touch panel, which allows simpler input than button operation, is often arranged on the rear display unit as an input device of the camera.
In a camera having both a rear display unit with a touch panel and a look-through viewfinder, the look-through viewfinder is often used to confirm the subject when photographing, because of advantages such as holding the camera stably. In that case the user observes the screen in the viewfinder while operating the touch panel; however, operating the touch panel while looking through the viewfinder must be done blind, so it is more difficult than operating while looking directly at the rear display unit.
Therefore, the following cameras have been proposed. In a camera whose LCD opens sideways, only one operation item is arranged at each of the four corners of the touch panel when the viewfinder is used (see Patent Document 1). In a type in which the viewfinder is arranged directly above the rear display unit, the head comes close to the touch panel and the usable area of the touch panel becomes small, so the number of items that can be input is reduced, and an operation in a specific direction is regarded as the selection of a specific item, thereby simplifying the operation (see Patent Document 2).
[Patent Document 1] Japanese Unexamined Patent Application Publication No. 2001-326843
[Patent Document 2] Japanese Unexamined Patent Application Publication No. 2008-268726
When photographing, it is very convenient if operations such as changing the photographing conditions displayed on the viewfinder screen can be performed on the touch panel. However, in a camera in which the look-through viewfinder is close to the touch panel arranged on the screen of the rear display unit, the face comes close to the touch panel, so there is a problem that finger operation is restricted.
Summary of the invention
The present invention has been made in view of such circumstances, and its object is to provide a camera in which the user can operate the touch panel while looking through the viewfinder and set conditions on the viewfinder screen.
In order to achieve the above object, the camera of the 1st invention has: a rear display unit arranged on the back surface of the camera body; a touch panel arranged so as to overlap the rear display unit, which detects input operation instructions; a look-through type EVF arranged on the upper part of the camera body; a display switching unit that switches the display destination of an image to either the rear display unit or the EVF; and an operation input control unit that detects operations on the touch panel and controls the input of operation instructions to the screen of the rear display unit or the EVF. The operation input control unit performs control as follows: when the display destination of the image is switched to the rear display unit by the display switching unit, the position on the screen directly corresponding to the position at which a touch input was made on the touch panel is designated as the input position; when the display destination of the image is switched to the EVF by the display switching unit, a slide input on the touch panel is used, and the designated position on the EVF screen is moved according to the slide direction and amount.
The camera of the 2nd invention is the camera of the 1st invention in which the operation input control unit performs control as follows: when the rear display unit is used, the region in which input to the touch panel is effective is set to the entire surface of the touch panel; when the EVF is used, the region in which input to the touch panel is effective is limited to a part of the surface of the touch panel.
The camera of the 3rd invention is the camera of the 2nd invention further having a posture detection unit that detects the posture of the camera body, wherein, when the EVF is used, the operation input control unit switches the effective region according to the posture of the camera body detected by the posture detection unit.
The camera of the 4th invention is the camera of the 1st invention in which the operation input control unit performs control as follows: when the display destination of the image is switched to the EVF by the display switching unit, a slide input on the touch panel is used and the designated position on the EVF screen is moved according to the direction and amount of the slide action, and the ratio of the movement amount on the screen to the slide amount of the input is set such that the entire length of the corresponding side of the EVF screen can be traversed with a slide amount smaller than the entire length of the touch panel.
The camera of the 5th invention is the camera of the 4th invention in which the operation input control unit performs control as follows: as the ratio of the movement amount when the EVF screen is used, the entire length of the corresponding side of the EVF screen can be traversed with a slide amount equal to or less than half the entire length of the touch panel.
The camera of the 6th invention is the camera of any one of the 2nd, 3rd and 4th inventions in which the operation input control unit performs control as follows: when a touch operation is detected, a cursor for designating a position is displayed on the screen of whichever of the rear display unit and the EVF is in use; when a slide operation of the cursor is detected, the cursor is moved on that screen according to the slide operation; and when the touch is released after the slide operation, or when a touch operation is performed again after the touch is released, the condition on the screen designated by the cursor is determined.
The camera of the 7th invention has: a rear display unit arranged on the back surface of the camera body; a touch panel unit arranged so as to overlap the rear display unit, which detects input operation instructions; a look-through type EVF arranged on the upper part of the camera body; a display switching unit that switches the display destination of an image to either the rear display unit or the EVF; and an operation input control unit that detects operations on the touch panel and controls the input of operation instructions to the screen of the rear display unit or the EVF. The operation input control unit performs control as follows: when the display destination of the image is switched to the EVF by the display switching unit, a region of the touch panel narrower than that used when the rear display unit is used is set as the input detection region, each position in this detection region is associated with a corresponding position on the EVF screen, and a touch input to this detection region is designated as an input to the corresponding position on the EVF screen.
The camera of the 8th invention is the camera of the 7th invention further having a posture detection unit that detects the posture of the camera body, wherein the operation input control unit performs control as follows: when the EVF is used, the detection region is switched according to the posture of the camera body detected by the posture detection unit.
The camera of the 9th invention has: a rear display unit arranged on the back surface of the camera body; a touch panel unit arranged so as to overlap the rear display unit, which detects input operation instructions; a look-through type EVF arranged on the upper part of the camera body; a display switching unit that switches the display destination of an image to either the rear display unit or the EVF; and an operation input control unit that detects operations on the touch panel and controls the input of operation instructions to the screen of the rear display unit or the EVF. The operation input control unit performs control as follows: on the screen of whichever of the rear display unit and the EVF is in use, the designated position is moved according to the direction and amount of an input slide operation, and when the EVF screen is used, the ratio of the movement amount on the screen to the slide amount of the input is set larger than when the rear display unit is used.
The camera of the 10th invention is the camera of the 9th invention in which the operation input control unit performs control as follows: as the ratio of the movement amount when the EVF screen is used, the entire length of the corresponding side of the EVF screen can be traversed with a slide amount equal to or less than half of the touch panel.
The camera of the 11th invention is the camera of the 9th invention in which the operation input control unit performs control as follows: when a touch operation is detected, a cursor for designating a position is displayed on the screen of whichever of the rear display unit and the EVF is in use; when a slide operation of the cursor is detected, the cursor is moved on that screen according to the slide operation; and when the touch is released after the slide operation, or when a click operation is performed after the touch is released, the condition on the screen designated by the cursor is determined.
According to the 1st, 7th and 9th inventions, a camera can be provided in which the user can operate the touch panel while looking through the viewfinder and set conditions on the viewfinder screen.
In addition, according to the 2nd invention, the unneeded region is made insensitive, so erroneous operation can be prevented. According to the 3rd invention, the effective region is switched according to the posture, so even if the orientation of the camera changes, an appropriate region is set in consideration of the movement of the gripping hand. According to the 4th invention, the entire length of the EVF screen can be traversed by a small movement on the touch panel. In addition, according to the 5th invention, the entire length of the EVF screen can be traversed by a movement equal to or less than half the entire length of the touch panel. According to the 6th invention, a cursor is displayed according to the input, so the input of information can be easily confirmed visually.
In addition, according to the 8th invention, the detection region is switched according to the posture, so even if the orientation of the camera changes, an appropriate region is set in consideration of the movement of the gripping hand.
In addition, according to the 9th invention, the sensitivity is increased when information is input by a slide operation on the touch panel, so only a small movement on the touch panel is required. According to the 11th invention, a cursor is displayed according to the input, so the input of information can be easily confirmed visually.
Accompanying drawing explanation
Fig. 1 is a block diagram showing the main electrical circuit structure of a camera according to an embodiment of the present invention.
Fig. 2 is a diagram showing a state in which the display of the rear display unit is being observed, in the camera of an embodiment of the present invention.
Fig. 3 is a diagram showing a state in which a subject is selected from the subject image displayed on the rear display unit, in the camera of an embodiment of the present invention.
Fig. 4 is a diagram showing a state in which the display of the EVF is being observed, in the camera of an embodiment of the present invention.
Fig. 5 is a diagram showing a state in which a subject is selected by operating the rear display unit while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 6 is a diagram showing a state (A type) in which the camera is held in the lateral position and a subject is selected by operating the rear display unit with the right hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 7 is a diagram showing a state (A type) in which the camera is held in the lateral position and a subject is selected by operating the rear display unit with the left hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 8 is a diagram showing a state (A type) in which the camera is held in the vertical position with the left side up and a subject is selected by operating the rear display unit with the left hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 9 is a diagram showing a state (A type) in which the camera is held in the vertical position with the right side up and a subject is selected by operating the rear display unit with the left hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 10 is a diagram showing a state (B type) in which the camera is held horizontally and a subject is selected by operating the rear display unit with the right hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 11 is a diagram showing a state (B type) in which the camera is held horizontally and a subject is selected by operating the rear display unit with the left hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 12 is a diagram showing a state (B type) in which the camera is held in the vertical position with the left side up and a subject is selected by operating the rear display unit with the left hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 13 is a diagram showing a state (B type) in which the camera is held in the vertical position with the right side up and a subject is selected by operating the rear display unit with the left hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 14 is a diagram showing a state (C type) in which the cursor is moved by operating the rear display unit with the right hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 15 is a diagram showing a state (D type) in which the cursor is moved by operating the rear display unit with the right hand while observing the subject image displayed on the EVF, in the camera of an embodiment of the present invention.
Fig. 16 is a diagram showing examples other than main-subject selection in the camera of an embodiment of the present invention.
Fig. 17 is a diagram showing, for the A to D types in the camera of an embodiment of the present invention, the relation between the input modes of the touch panel when the live view image is displayed on the rear display unit and when it is displayed on the EVF.
Fig. 18 is a flowchart showing the operation of the camera of an embodiment of the present invention.
Fig. 19 is a flowchart showing the operation of the camera of an embodiment of the present invention.
Fig. 20 is a flowchart showing the operation of the camera of an embodiment of the present invention.
Fig. 21 is a flowchart showing the operation of the camera of an embodiment of the present invention.
Description of reference numerals
1: camera; 10: camera body unit; 11: control unit; 11a: display switching unit; 11b: camera orientation judging unit; 11c: image information control unit; 11d: operation input control unit; 11e: button operation detection unit; 12: program/data storage unit; 13: operation unit; 13a: release button; 14: touch panel unit; 14a: effective range; 14b: invalid region; 15: detection unit; 16: tilt detection unit; 21: imaging element; 22: imaging processing unit; 23: SDRAM; 24: image processing unit; 25: compression/decompression unit; 26: recording/reproducing unit; 27: image storage unit; 28: display processing unit; 29: rear display unit; 30: EVF; 32: EVF LCD; 34: eyepiece; 50: interchangeable lens unit; 52: imaging lens; 71: user; 71R: right finger; 71L: left finger; 81a: subject; 81b: subject; 84: cursor; 85: photographing condition setting display; 86: AF selection frame; 87: selected AF frame; 88: slider bar.
Embodiment
A preferred embodiment of a camera to which the present invention is applied will be described below with reference to the drawings. The camera of a preferred embodiment of the present invention is a digital camera that has an imaging unit which converts a subject image into image data and, based on the converted image data, displays the subject image as a live view on the rear display unit arranged on the back surface of the body or on the look-through type EVF. The photographer determines the composition and the shutter timing by observing the live view display. Further, a touch panel is provided on the rear display unit, and by touching this touch panel, operations such as selection of a subject can be performed according to the touch position, slide direction, slide amount and the like. At the time of a release operation, image data of a still image or a moving image is recorded in the image storage unit. When the playback mode is selected, the still image or moving image data recorded in the image storage unit can be reproduced and displayed on the rear display unit.
Fig. 1 is a block diagram showing the structure of the camera 1 according to an embodiment of the present invention. The camera 1 is composed of a camera body unit 10 and an interchangeable lens unit 50. The interchangeable lens unit 50 has an imaging lens 52 (see Fig. 2) and a diaphragm inside, and adjusts the focus position, the focal length and the aperture. The interchangeable lens unit 50 is removably attached to the camera body unit 10; when it is attached to the camera body unit 10, it communicates with the control unit 11 via the lens control unit 17 in the camera body unit 10.
The lens control unit 17 outputs control signals to the interchangeable lens unit 50 according to instructions from the control unit 11, and outputs various lens information received from the interchangeable lens unit 50 to the control unit 11. The control unit (CPU: Central Processing Unit) 11 operates according to the control program stored in the program/data storage unit 12 and performs overall control of the camera 1.
Among the various processes performed by the control unit 11, the main ones performed in the present embodiment are the display switching unit 11a, the camera orientation judging unit 11b, the image information control unit 11c, the operation input control unit 11d and the button operation detection unit 11e. Each of these processes 11a to 11e is performed according to the control program, and is therefore described as a functional unit included in the control unit 11.
The display switching unit 11a switches the display destination of an image to either the rear display unit 29 or the EVF (Electronic View Finder) 30. In the present embodiment, as described later, there are two display units for images such as the live view display and the playback display, namely the rear display unit 29 and the EVF 30, and the image is displayed on one of them. The display switching unit 11a receives a detection result from the detection unit 15 and, according to this detection result, performs switching control of which of the two display units displays the image. If necessary, the display switching unit 11a may also display the image on both the rear display unit 29 and the EVF 30.
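As a minimal sketch of this switching logic (not the actual firmware; the function and argument names are assumptions), the display destination could be chosen from the eye-sensor output as follows:

```python
# Minimal sketch of display-destination switching from a boolean eye-sensor
# reading; names are hypothetical, not from the patent.
def select_display_destination(eye_detected: bool, force_both: bool = False) -> set:
    """Return the set of active displays: 'EVF', 'REAR', or both."""
    if force_both:
        return {"EVF", "REAR"}          # optional simultaneous display
    return {"EVF"} if eye_detected else {"REAR"}
```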
The camera orientation judging unit 11b receives the detection output from the tilt detection unit 16 described later and judges the posture of the camera body unit 10. That is, the state in which the longitudinal direction of the imaging element is perpendicular to the direction of gravity is defined as the "lateral position", and the state in which the longitudinal direction of the imaging element is along the direction of gravity is defined as the "vertical position", and it is judged which posture the camera is in. The "lateral position" is the usual photographing posture. In the case of the "vertical position", it is also judged which of the left and right sides of the camera is up. In the following, the left and right directions of the camera are expressed as seen from the photographer. The image information control unit 11c displays predetermined menus and information such as the cursor 84 (see Fig. 3(c)) when live view display is performed on the rear display unit 29 or the EVF 30.
The operation input control unit 11d is connected to the output of the touch panel unit 14, detects the user's operation on the touch panel unit 14, and controls the input of operation instructions for the screen of the rear display unit 29 or the EVF 30.
The operation input control unit 11d has two instruction detection modes and selects between them as appropriate. One is a mode in which the position on the screen directly corresponding to the position at which a touch input was made on the touch panel unit 14 is designated as the input position (hereinafter referred to as the position designation mode). This is the input mode of an ordinary touch panel. The other is a mode in which the designated position is moved according to the slide direction and amount of a slide input on the touch panel unit 14 (hereinafter referred to as the slide-amount designation mode). The slide-amount designation mode is the input mode used in the touch pad of a PC (personal computer).
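The two detection modes can be illustrated with the following minimal sketch, under assumed panel and screen coordinate conventions; the function names are hypothetical, not the patent's implementation:

```python
# Minimal sketch of the two instruction detection modes described above.
def position_designation(touch_xy, panel_size, screen_size):
    """Map a touch on the panel directly to the same relative screen position."""
    return (touch_xy[0] * screen_size[0] / panel_size[0],
            touch_xy[1] * screen_size[1] / panel_size[1])

def slide_amount_designation(cursor_xy, slide_dxdy, sensitivity=1.0):
    """Move the current designated position by the slide vector (touch-pad style)."""
    return (cursor_xy[0] + sensitivity * slide_dxdy[0],
            cursor_xy[1] + sensitivity * slide_dxdy[1])
```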
Further, as described later, the operation input control unit 11d uses the position designation mode and the slide-amount designation mode selectively, according to the display destination and to one of four types (A to D) that can be selected by the user.
Further, as described later, a touch can be detected on the entire surface of the touch panel unit 14, but the operation input control unit 11d limits the region (area) in which a touch is detected as effective, according to the set type and the posture of the camera 1.
The button operation detection unit 11e is connected to the operation unit 13 and detects the operation state of the operation unit 13.
The control unit 11 is connected to the program/data storage unit 12, the operation unit 13, the touch panel unit 14, the detection unit 15, the tilt detection unit 16 and the bus 31. The program/data storage unit 12 is a non-volatile memory, and stores the program executed in the control unit 11 and various data for display as mentioned above.
The operation unit 13 has various operation members with which the user gives instructions to the camera. The various operation members include a power button, a release button 13a (see Fig. 3(a)), a cross button, a confirm button, a playback button and the like, and the detection results of the operation states of these members are output to the button operation detection unit 11e of the control unit 11.
As shown in Fig. 2, the touch panel unit 14 is arranged on the front surface of the rear display unit 29. It may also be formed integrally with the rear display unit 29. When the user touches the screen of the rear display unit 29, the touch panel unit 14 detects the touch position and the touch direction, and outputs the detection result to the operation input control unit 11d.
As shown in Fig. 4, the detection unit 15 is arranged in the eyepiece portion of the EVF 30 of the camera body 10, and is composed of a light projecting portion that emits infrared light or the like and a light receiving portion that receives the reflected light 15R. When the user looks into the eyepiece portion, the infrared light projected from the detection unit 15 is reflected around the eyes of the user looking into the eyepiece portion, and the detection unit 15 receives this reflected light and outputs a detection signal. In the present embodiment a projection/reception detection method is adopted, but other methods such as detecting the user's pupil may of course be adopted instead.
The tilt detection unit 16 includes a gravity detection sensor for detecting the tilt state of the camera body 10, and outputs a detection signal corresponding to the angle of the camera 1 with respect to the direction of gravity to the camera orientation judging unit 11b.
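A minimal sketch of this posture judgment from a gravity-sensor reading follows; the axis convention and the sign used to distinguish left-side-up from right-side-up are assumptions, not values from the patent:

```python
# Minimal sketch of posture judgment from a gravity (accelerometer) reading.
def judge_posture(gx: float, gy: float) -> str:
    """gx: gravity component along the imaging element's long axis,
    gy: component along its short axis (both in the sensor plane)."""
    if abs(gy) >= abs(gx):
        return "LATERAL"                  # long axis roughly horizontal: usual posture
    # Long axis roughly along gravity: vertical hold; the sign tells which side is up.
    return "VERTICAL_LEFT_UP" if gx > 0 else "VERTICAL_RIGHT_UP"
```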
In addition to the control unit 11 described above, the imaging processing unit 22, an SDRAM (Synchronous Dynamic Random Access Memory) 23, the image processing unit 24, the compression/decompression unit 25, the recording/reproducing unit 26 and the display processing unit 28 are connected to the bus 31. The imaging processing unit 22 is connected to the imaging element 21. The imaging element 21 converts the subject image formed by the interchangeable lens unit 50 into an image signal. The imaging processing unit 22 reads the image signal from the imaging element 21, converts it into digital image data, and outputs it to the bus 31.
The SDRAM 23 is an electrically rewritable volatile memory for temporary storage, and is used for temporarily storing image data of still images and sequential images (including moving images) output from the imaging processing unit 22. The image processing unit 24 performs various kinds of image processing such as digital amplification of the digital image data (digital gain adjustment), white balance, color correction, gamma (γ) correction, contrast correction, live view display image generation, moving image generation and thumbnail (reduced image) generation.
The compression/decompression unit 25 is a circuit for compressing the image data of still images and sequential images temporarily stored in the SDRAM 23 using compression methods such as JPEG or TIFF, and for decompressing them for display and the like. The image compression is not limited to JPEG, TIFF and MPEG; other compression methods can also be applied.
The display processing unit 28 causes the rear display unit 29 or the EVF 30 to show the live view display when photographing, a record review display when the shutter is released, and a playback image when reproducing. The rear display unit 29 is connected to the display processing unit 28 and has a display such as an LCD monitor or an organic EL display arranged on the back surface of the camera body 10. Predetermined images are displayed on the screen of the rear display unit 29 under the control of the display processing unit 28.
As shown in Fig. 4, the EVF 30 connected to the display processing unit 28 is composed of an electronic viewfinder liquid crystal display (hereinafter, EVF LCD) 32 for image display arranged inside the camera body 10, and an eyepiece 34. By looking into the eyepiece 34, the user can observe images such as the live view display.
The recording/reproducing unit 26 stores the image data compressed by the compression/decompression unit 25 in the image storage unit 27, and reads the image data of photographed images stored in the image storage unit 27. The read image data is decompressed by the compression/decompression unit 25, and the image is reproduced and displayed on the rear display unit 29 or the EVF 30 according to the decompressed image data. The image storage unit 27 is a recording medium connected to the recording/reproducing unit 26, and may be built into the camera body 10 or loaded into it. Still images and sequential images are recorded in this image storage unit 27.
Next, the way the rear display unit 29 and the EVF 30 are used in the present embodiment will be described with reference to Figs. 2 to 16. Figs. 2 and 3 show the ordinary use of the rear display unit 29. Images such as the live view display are shown on the rear display unit 29, and the user 71 can observe these images.
Fig. 3 shows a method of selecting a subject for AF (Auto Focus), AE (Auto Exposure) and the like when live view display is performed on the rear display unit 29. First, Fig. 3(a) shows the state in which live view display is performed on the rear display unit 29. That is, the live view image is displayed on the rear display unit 29, while no live view image is displayed on the EVF 30. As described above, this is because the display switching unit 11a, based on the detection result of the detection unit 15, displays only on the rear display unit 29 when the user is not looking into the eyepiece 34.
As shown in Figs. 3(a) to (c), subjects 81a and 81b are displayed on the rear display unit 29. Here, as shown in Fig. 3(b), when the user wants to focus on the subject 81b and perform exposure control on it, the user touches the subject 81b. When the touch panel unit 14 outputs the touch signal for the subject 81b, the image information control unit 11c displays the cursor 84 superposed on the subject 81b on the rear display unit 29, as shown in Fig. 3(c). This is the position designation mode.
Figs. 4 and 5 show the use of the EVF 30. As shown in Fig. 4, when the user 71 looks into the eyepiece portion of the EVF 30, the detection unit 15 detects the user 71 and outputs the detection result to the display switching unit 11a. The display switching unit 11a sets the EVF 30 to the display state according to this detection result. As a result, the live view image and the like are displayed on the EVF 30, while the display on the rear display unit 29 is ended.
Fig. 5 shows a method of selecting a subject for AF (Auto Focus), AE (Auto Exposure) and the like when live view display is performed on the EVF 30. Fig. 5(a) shows the state in which live view display is performed on the EVF 30; that is, the live view image is displayed only on the EVF 30. Figs. 5(b) and (c) are enlarged views of the EVF 30.
As shown in Figs. 5(a) to (c), the subjects 81a and 81b are displayed on the EVF 30. Here, when the user wants to focus on the subject 81b and perform exposure control on it, the user performs a touch operation on the touch panel unit 14 arranged on the rear display unit 29, as shown in Fig. 5(a). When the touch panel unit 14 outputs a detection signal, the operation input control unit 11d judges the operation input according to the state of the camera 1. In the present embodiment, four types, A type to D type, are prepared as operation modes of the touch panel unit 14, and the operation input is judged according to the type preset by the user on the menu screen. Then, as shown in Fig. 5(c), by any one of the operations described below, the image information control unit 11c displays the cursor 84 superposed on the subject 81b on the EVF 30.
First, the A type among the operation methods of the touch panel unit 14 will be described with reference to Figs. 6 to 9.
A-1) Fig. 6 shows the case where, in the A type, the camera 1 is held in the "lateral position" (Horizontal) and operated with the right hand (Right) (this is called the A-HR type). In the state in which live view display is performed on the EVF 30, when the user wants to focus on the subject 81b and perform exposure control on it, a cursor is displayed when the user touches the touch panel unit 14 with the thumb of the right hand 71R, as shown in Fig. 6(a). The user then slides the thumb of the right hand 71R in directions such as P1 and P2 so that the cursor comes to the subject 81b. This is an operation using the slide-amount designation mode (hereinafter referred to as a slide-amount operation).
Further, at this time, as shown in Fig. 6(b), the effective range of the touch operation is limited to the right side of the touch panel unit 14 (effective range 14a). That is, the slide-amount designation operation with the right finger 71R is detected as effective only in the region of the effective range 14a; in the invalid region 14b outside it, a slide operation is not detected as effective by the touch panel unit 14.
When the right finger 71R first touches the effective range 14a, the cursor 84 is displayed at approximately the center of the screen of the EVF 30, as shown in Fig. 6(c). In this state, when the right finger 71R slides a predetermined amount in a predetermined direction (for example the P1 or P2 direction) on the rear display unit 29, the cursor 84 moves in that slide direction by an amount corresponding to the slide amount. When a larger movement is desired, the slide operation is repeated in that direction.
After the cursor 84 has been moved onto the subject targeted by the user (the subject 81b in the example shown in Fig. 6), the position of the cursor 84 is determined when the user performs a click operation (a single touch) with the right finger 71R (see Fig. 6(d)). After the position of the cursor 84 has been determined and a certain time has elapsed, the cursor 84 automatically disappears. These operations are not limited to the thumb of the right hand 71R; other fingers may be used.
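The following is a minimal sketch, under assumed event handlers and screen coordinates, of this slide-then-click cursor workflow; the class and method names are illustrative, not from the patent:

```python
# Minimal sketch of the A-type slide-then-click cursor workflow on the EVF.
class EvfCursor:
    def __init__(self, evf_size):
        self.evf_w, self.evf_h = evf_size
        self.pos = None                      # cursor hidden until the first touch

    def on_touch(self):
        # The first touch in the effective range shows the cursor at screen center.
        self.pos = (self.evf_w / 2, self.evf_h / 2)

    def on_slide(self, dx, dy, sensitivity=1.0):
        if self.pos is not None:
            x = min(max(self.pos[0] + sensitivity * dx, 0), self.evf_w)
            y = min(max(self.pos[1] + sensitivity * dy, 0), self.evf_h)
            self.pos = (x, y)

    def on_click(self):
        # A single click fixes the cursor position (e.g. as the AF/AE target).
        return self.pos
```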
A-2) Fig. 7 shows the case where, in the A type, the camera 1 is held in the lateral position (Horizontal) and operated with the left hand (Left) (this is called the A-HL type). In the state in which live view display is performed on the EVF 30, when the user wants to focus on the subject 81b and perform exposure control on it, the user touches the screen of the rear display unit 29 with the left finger 71L and then slides it, performing a slide-amount designation operation, as shown in Fig. 7(a).
In this type as well, as in the A-HR type, the left side of the touch panel unit 14 is set as the effective range 14a, and the entire surface of the touch panel unit 14 is not used as the detection region, as shown in Fig. 7(b). That is, the slide operation with the thumb of the left hand 71L or the like is detected as effective only in the region of the effective range 14a of the touch panel unit 14.
When the left finger 71L first touches the effective range 14a, the cursor 84 is displayed at approximately the center of the screen of the EVF 30, as in the A-HR type (see Fig. 7(c)). In this state, when the left finger 71L slides on the touch panel unit 14, the cursor 84 moves according to the direction and amount of the slide.
After the user has moved the cursor 84 onto the target subject, the position of the cursor 84 is determined when a click operation is performed with the left finger 71L. After the position of the cursor 84 has been determined and a certain time has elapsed, the cursor 84 automatically disappears.
A-3) Fig. 8 shows the case where, in the A type, the camera 1 is held in the vertical position (Vertical) with the left side up and operated with the left hand (Left) (this is called the A-VL type). Here, the "vertical position with the left side up" means the way of holding in which the longitudinal direction of the imaging surface of the imaging element 21 is along the direction of gravity and the left side of the camera is up. In the state in which live view display is performed on the EVF 30, when the user wants to focus on the subject 81b and perform exposure control on it, the user performs a single touch on the screen of the rear display unit 29 with the left finger 71L and then slides it, performing a slide-amount designation operation, as shown in Fig. 8(a).
In this type as well, as in the A-HR and A-HL types, the entire surface of the touch panel unit 14 is not used as the detection region when the slide-amount designation operation is performed on the screen of the rear display unit 29. That is, as shown in Fig. 8(b), the upper side of the touch panel unit 14 is set as the effective range 14a, and the slide operation with the left finger 71L is detected as effective only in this region. Here, the longitudinal direction of the effective range 14a is the same as the longitudinal direction of the EVF; it may of course be made square or landscape-shaped instead. In addition, the "position" of the set effective range 14a is expressed in the direction corresponding to the state of the gripped camera, and is expressed here as the "upper side" of the touch panel unit 14.
When the left finger 71L first touches the effective range 14a, the cursor 84 is displayed at approximately the center of the screen of the EVF 30, as in the A-HR and A-HL types. In this state, when the left finger 71L slides on the touch panel unit 14, the cursor 84 moves according to the direction and amount of the slide.
After the user has moved the cursor 84 onto the target subject, the position of the cursor 84 is determined when a click operation is performed with the left finger 71L. After the position of the cursor 84 has been determined and a certain time has elapsed, the cursor 84 automatically disappears.
A-4) Fig. 9 shows the case where, in the A type, the camera 1 is held in the vertical position (Vertical) with the right side up and operated with the left hand (Left) (this is called the A-VL(2) type). In the state in which live view display is performed on the EVF 30, when the user wants to focus on the subject 81b and perform exposure control on it, the user touches the screen of the rear display unit 29 with the left finger 71L and then slides it, performing a slide-amount designation operation, as shown in Fig. 9(a).
In this type as well, as in the preceding three types, the entire surface of the touch panel unit 14 is not used as the detection region when the slide-amount designation operation is performed on the screen of the rear display unit 29. That is, as shown in Fig. 9(b), the slide operation with the left finger 71L is detected as effective only in the region of the effective range 14a on the lower side of the touch panel unit 14.
When the left finger 71L first touches the effective range 14a, the cursor 84 is displayed at approximately the center of the screen of the EVF 30, as in the preceding three types. In this state, when the left finger 71L slides on the touch panel unit 14 of the rear display unit 29, the cursor 84 moves according to the direction and amount of the slide.
After the user has moved the cursor 84 onto the target subject, the position of the cursor 84 is determined when a click operation is performed with the left finger 71L. After the position of the cursor 84 has been determined and a certain time has elapsed, the cursor 84 automatically disappears.
In this way, when the A type is set, a live view image or the like is displayed on the EVF 30, and when a subject is selected, the cursor 84 can be moved by a slide-amount designation operation on the rear display unit 29 and its position determined by a click operation. At this time, a region (effective range 14a) in which the slide operation is detected is set, and the operation is treated as effective only within that range, so even if the invalid region 14b is touched by mistake, this does not become an erroneous operation.
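As an illustration of limiting detection to the effective range, here is a minimal sketch assuming rectangular panel coordinates; the function names and the rectangle representation are hypothetical:

```python
# Minimal sketch of restricting touch detection to the effective range 14a.
def in_effective_range(touch_xy, effective_rect):
    """effective_rect = (x0, y0, x1, y1) in touch-panel coordinates."""
    x, y = touch_xy
    x0, y0, x1, y1 = effective_rect
    return x0 <= x <= x1 and y0 <= y <= y1

def handle_touch(touch_xy, effective_rect, on_valid):
    # Touches that land in the invalid region 14b are simply ignored,
    # so accidental contact does not cause an erroneous operation.
    if in_effective_range(touch_xy, effective_rect):
        on_valid(touch_xy)
```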
Next, the B type in the subject selection method will be described with reference to Figs. 10 to 13. In the A type described above, the position of the cursor 84 is determined by a slide-amount designation operation on the rear display unit 29 while live view display is performed on the EVF 30. In contrast, in the B type, while live view display is performed on the EVF 30, the position is designated directly by the touch position on the rear display unit 29. This is input using the position designation mode.
B-1) Fig. 10 shows the case where, in the B type, the camera 1 is held in the lateral position and operated with the right hand (this is called the B-HR type). This is the standard photographing posture. In the state in which live view display is performed on the EVF 30, when the user wants to focus on the subject 81b and perform exposure control on it, the user touches the screen of the rear display unit 29 with the right finger 71R as shown in Fig. 10(a); the touch panel unit 14 detects the touched position, and the image information control unit 11c positions the cursor 84 according to that position.
However, when the touch operation is performed on the screen of the rear display unit 29, the entire surface of the touch panel unit 14 is not used as the detection region. That is, as shown in Fig. 10(b), roughly the right half of the touch panel unit 14 is set as the effective range 14a, and the touch operation with the right finger 71R is detected as effective only in the region of the effective range 14a. In this example, the effective range 14a, which is a part of the touch panel unit 14, corresponds to the entire screen range of the EVF 30. In Fig. 10(b), in order to explain the correspondence between the EVF screen and the touch panel unit 14, the positions 81Ma and 81Mb corresponding to the subjects are shown with broken lines on the touch panel unit 14, but they are not actually displayed; the live view image is displayed only on the EVF 30, and the actual rear display unit 29 does not display the live view image.
When the position 81Mb corresponding to the subject within the effective range 14a is touched with the right finger 71R, the cursor 84 is displayed superposed on the subject 81b of the EVF 30, as shown in Fig. 10(d). If the position of the cursor 84 is not the position targeted by the user, the cursor 84 can be immediately repositioned by touching the rear display unit 29 again. In the present embodiment positioning is done by a touch, but it may of course also be done by a double tap.
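A minimal sketch of this panel-to-EVF coordinate mapping, under assumed rectangle and screen-size parameters (the names are illustrative, not from the patent), might look like this:

```python
# Minimal sketch of the B-type mapping: a touch inside the effective range 14a
# is scaled to the corresponding position on the full EVF screen.
def map_to_evf(touch_xy, effective_rect, evf_size):
    x, y = touch_xy
    x0, y0, x1, y1 = effective_rect          # effective range in panel coordinates
    evf_w, evf_h = evf_size
    if not (x0 <= x <= x1 and y0 <= y <= y1):
        return None                           # outside 14a: ignored
    u = (x - x0) / (x1 - x0)                  # normalized position inside 14a
    v = (y - y0) / (y1 - y0)
    return (u * evf_w, v * evf_h)             # cursor position on the EVF screen
```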
B-2) Fig. 11 shows the case where, in the B type, the camera 1 is held in the lateral position and operated with the left hand (this is called the B-HL type). In the state in which live view display is performed on the EVF 30, when the user wants to focus on the subject 81b and perform exposure control on it, the user touches the screen of the rear display unit 29 with the left finger 71L, as shown in Fig. 11(a). The image information control unit 11c positions the cursor 84 according to the touched position.
When the touch operation is performed on the screen of the rear display unit 29, the entire surface of the touch panel unit 14 is not used as the detection region. That is, as shown in Fig. 11(b), roughly the left half of the touch panel unit 14 is set as the effective range 14a, and the touch operation with the left finger 71L is detected as effective only in this region. The effective range 14a is the same as in the case of Fig. 10, so its description is omitted.
When the position 81Mb corresponding to the subject within the effective range 14a is touched with the left finger 71L, a cursor (not shown) is displayed superposed on the subject 81b of the EVF 30.
B-3) Fig. 12 shows the case where, in the B type, the camera 1 is held in the vertical position with the left side up and operated with the left hand (this is called the B-VL type). In the state in which live view display is performed on the EVF 30, when the user wants to focus on the subject 81b and perform exposure control on it, the user touches the screen of the rear display unit 29 with the left finger 71L, as shown in Fig. 12(a). The image information control unit 11c positions the cursor 84 according to the touched position.
When the touch operation is performed on the screen of the rear display unit 29, the entire surface of the touch panel unit 14 is not used as the detection region. That is, as shown in Fig. 12(b), roughly the upper left half of the touch panel unit 14 is set as the effective range 14a. The effective range 14a is the same as in the case of Fig. 10, so its description is omitted.
When the position 81Mb corresponding to the subject within the effective range 14a is touched with the left finger 71L, the cursor 84 is displayed superposed on the subject 81b of the EVF 30.
B-4) Fig. 13 shows the case where, in the B type, the camera 1 is held in the vertical position with the right side up and operated with the left hand (this is called the B-VL(2) type). In the state in which live view display is performed on the EVF 30, when the user wants to focus on the subject 81b and perform exposure control on it, the user touches the screen of the rear display unit 29 with the left finger 71L, as shown in Fig. 13(a). The image information control unit 11c positions the cursor 84 according to the touched position.
When the touch operation is performed on the screen of the rear display unit 29, the entire surface of the touch panel unit 14 is not used as the detection region. That is, as shown in Fig. 13(b), roughly the lower left half of the touch panel unit 14 is set as the effective range 14a. The effective range 14a is the same as in the case of Fig. 10, so its description is omitted.
When the position 81Mb corresponding to the subject within the effective range 14a is touched with the left finger 71L, the cursor 84 is displayed superposed on the subject 81b of the EVF 30.
C) Next, the C type of the subject designation method will be described with reference to Fig. 14. When the C type is selected, the operation input control unit 11d adopts the slide-amount designation mode whether the rear display unit 29 or the EVF 30 is used. Further, in this mode, the operation input control unit 11d sets the sensitivity of slide-amount detection higher when the EVF is used.
Fig. 14(a) shows the case where live view is displayed on the rear display unit 29, and Figs. 14(b) and (c) show the case where live view is displayed on the EVF 30. In Fig. 14(a), the solid line (X0, Y0) represents the slide amount, and the broken line (X0, Y0) represents the movement amount of the cursor 84 corresponding to the slide amount. When display is performed on the rear display unit 29, the cursor 84 moves by (X0, Y0) in roughly one-to-one correspondence with the slide amount (X0, Y0) shown by the solid line.
On the other hand, when display is performed on the EVF, the sensitivity of slide-amount detection is doubled in the present embodiment. That is, as shown in Fig. 14(b), when a slide of (X/2, Y/2) is made on the rear display unit 29, the cursor 84 moves by (X, Y), as shown in Fig. 14(c).
In other words, when the screen size of the rear display unit 29 (touch panel unit 14) is L in width and H in height, the full width of the screen of the EVF 30 can be traversed with a slide amount of L/2, and the full height of the screen of the EVF 30 can be traversed with a slide amount of H/2. Therefore, in the C type, the cursor can be moved even within a narrow region.
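A minimal sketch of this sensitivity switch, assuming the doubled factor of the present embodiment (the function name is illustrative):

```python
# Minimal sketch of the C-type sensitivity switch: slide amounts are applied
# one-to-one on the rear display and doubled when the EVF is the display
# destination, so a slide of L/2 (or H/2) spans the whole EVF screen.
def cursor_delta(slide_dx, slide_dy, display_destination):
    sensitivity = 2.0 if display_destination == "EVF" else 1.0
    return (sensitivity * slide_dx, sensitivity * slide_dy)
```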
D) Next, the D type in the subject selection method will be described. In the D type, when an image is displayed on the rear display unit 29, the "position designation mode" is adopted, in which the position on the screen directly corresponding to the position at which a touch input was made on the touch panel unit 14 is designated as the input position; when an image is displayed on the EVF 30, the "slide-amount designation mode" is adopted, in which the designated position is moved according to the slide direction and amount of a slide input on the touch panel unit 14. As in the C type, the sensitivity of slide-amount detection is set higher.
Fig. 15 shows an example of a situation in which an image is displayed on the EVF 30 and a slide-amount designation operation is performed. The case where an image is displayed on the rear display unit 29 and the touch panel is operated is the same as Fig. 3, so its description is omitted. Fig. 15(a) shows the state in which live view display is performed on the EVF 30, and Fig. 15(b) shows the displayed image content. When the screen size of the rear display unit 29 (touch panel unit 14) is L in width and H in height, the screen size of the EVF 30 is L' in width and H' in height. The position of the cursor 84 is determined by the slide operation on the touch panel unit 14.
Further, the slide sensitivity is set so that the full screen width L' of the EVF 30 is traversed for a slide amount equal to or less than L/2 of the touch panel unit 14, and likewise the full screen height H' of the EVF 30 is traversed for a slide amount equal to or less than H/2. In the D type as well, the slide detection sensitivity of the slide-amount designation operation is set higher, so the cursor can be moved even within a narrow region, and the effective range 14a of the touch panel unit 14 does not have to be limited as in the A type or B type.
In the C and D types, an effective range may also be set in the same way as in the A type or B type in order to prevent erroneous operation. This effective range may be selected manually, or the region may be set automatically according to the orientation of the gripped camera. In addition, in the present embodiment the sensitivity for slide detection has been set to 2 times, but it is not limited to this and may be set to a different sensitivity (for example 3 times).
In the description of the A type and B type above, the sensitivity setting of the slide was not particularly mentioned, but the user may set it arbitrarily. In addition, in the A type and B type, the slide sensitivity may be set lower (for example 1 time) when the effective range is relatively large, and higher (for example 2 times) when the effective range is relatively small.
As described above, Figs. 3 to 15 describe examples of setting the main subject, but the present invention is not limited to setting the main subject and can be applied to various settings. For example, Figs. 16(a) and (b) show an example of setting photographing conditions.
Fig. 16(a) shows the state in which the photographing condition setting display 85 is shown on the rear display unit 29. Here, the user can set a photographing condition by touching the icon of the target photographing condition. Fig. 16(b) shows the case where the photographing condition setting display 85 is shown on the EVF 30. In this case, as in the subject selection described above, the user performs a slide-amount designation operation or a click operation (a double-click operation is also possible) on the rear display unit 29, and the image information control unit 11c sets the photographing condition according to the detection result of the touch panel unit 14. Of course, the setting can be made not only with the A type and B type but also with the C type and D type.
In addition, as shown in Fig. 16(c), the invention can also be applied to designation of the AF position. Fig. 16(c) is a screen display example on the EVF 30; using any of the A to D types described above, one of the eleven AF selection frames 86 is selected, and the selected AF frame 87 is displayed.
In addition, as shown in Fig. 16(d), the invention can also be applied to setting adjustment amounts such as color saturation and brightness. Fig. 16(d) is a screen display example on the EVF 30, in which a slider bar 88 for adjustment is displayed on the right side of the screen. The adjustment amount is changed by moving the slider bar 88 up and down using the A to D types described above.
As described above, in the present embodiment, the four operation modes A type to D type have been described for inputting various information from the touch panel unit 14. Fig. 17 summarizes these four operation modes. The table describes, for each of the types A to D, the input mode of the touch panel unit according to whether the display destination is the rear display unit 29 or the EVF 30. When the rear display unit 29 is selected as the display destination, the touch panel unit 14 is used in the position designation mode if the A, B or D type is set, and in the slide-amount designation mode if the C type is set.
When the EVF 30 is selected as the display destination, the touch panel unit 14 is used as follows. If the A type is set, input is in the slide-amount designation mode and the effective range 14a is limited, the region being set according to the posture of the camera 1. If the B type is set, input is in the position designation mode and the effective range 14a is limited, the region again being set according to the posture of the camera 1. If the C type is set, input is in the slide-amount designation mode, and in this case in particular the slide detection sensitivity is doubled. If the D type is set, input is in the slide-amount designation mode, and the slide detection sensitivity is doubled as in the C type.
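For illustration only, the behavior summarized in Fig. 17 could be held in a lookup table like the following sketch; the field layout and the 1.0 sensitivity entries for the A and B types are assumptions, since the description leaves their sensitivity to the user setting:

```python
# Minimal sketch of the mode table of Fig. 17 as a lookup; illustrative only.
MODE_TABLE = {
    # (type, display destination): (input mode, limit effective range, sensitivity)
    ("A", "REAR"): ("position",     False, 1.0),
    ("B", "REAR"): ("position",     False, 1.0),
    ("C", "REAR"): ("slide_amount", False, 1.0),
    ("D", "REAR"): ("position",     False, 1.0),
    ("A", "EVF"):  ("slide_amount", True,  1.0),   # range set by camera posture
    ("B", "EVF"):  ("position",     True,  1.0),   # range set by camera posture
    ("C", "EVF"):  ("slide_amount", False, 2.0),
    ("D", "EVF"):  ("slide_amount", False, 2.0),
}
```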
In the description so far, the effective range 14a has been shown as rectangular, but it may also be another polygon, a circle or an ellipse. In addition, in Fig. 6(b), the lower right side is excluded from the effective range 14a, but this part may also be included.
Next, the operation in the present embodiment will be described using the flowcharts shown in Figs. 18 to 21. These flowcharts are executed by the control unit 11 according to the program stored in the program/data storage unit 12. As described above, in the control unit 11, the following processing is performed in particular by the display switching unit 11a, the camera orientation judging unit 11b, the image information control unit 11c, the operation input control unit 11d and the button operation detection unit 11e.
When the flow shown in Fig. 18 is entered, it is first judged whether the mode is the photographing mode (S11). This camera 1 is set to the photographing mode as the default mode, and the playback mode is set when the playback button is operated.
If the result of the judgment in step S11 is that the mode is not the photographing mode, playback processing is performed (S13). Here, image data is read from the image storage unit 27, and the image is reproduced and displayed on the rear display unit 29 or the EVF 30. When the playback processing ends, the flow returns to step S11.
On the other hand, if the result of the judgment in step S11 is that the mode is the photographing mode, predetermined menus and information are displayed superposed on the live view (S15). Here, a live view image is generated from the image data from the imaging element 21, and various information such as the condition setting display 85 and the cursor 84 is superposed on this live view image according to the set mode and the like.
Next, whether the EVF 30 is used is judged (S17). Here, the display switching part 11a judges, based on the detection output of the detection part 15, whether to display on the EVF 30 or on the back display part 29. Besides the detection result of the detection part 15, use of the EVF 30 may also be set on a menu screen, with a dedicated button, a shared button or the like. After this judgment, whether the EVF 30 is used is determined according to the result (S19).
When the result of determination in step S19 is that the EVF 30 is used, the live view is displayed on the EVF 30 (S21). At this time, the back display part 29 does not display the live view image. It is then determined whether either the A type or the B type is set (S23). The setting of the A type to the D type is made on the menu screen, and in this step the type among A to D set on the menu screen is judged.
When the result of determination in step S23 is that the A type or the B type is set, the camera orientation is detected (S25). Here, the camera orientation is obtained from the judgment result of the camera orientation judging part 11b, which is based on the detection result of the tilt detection part 16. Then, based on the detection result of step S25, it is judged whether the camera is in the landscape orientation (normal posture) (S27).
When the result of determination in step S27 is that the camera 1 is in the landscape orientation, it is judged whether the effective range is to be switched to the left side (S29). The user can set the effective range to the left side or the right side on the menu screen, depending on whether the right hand or the left hand is more dexterous, and the judgment is made according to this setting.
When the result of determination in step S29 is that switching to the left side is not performed, that is, the right side is selected, the effective range is set to the right side of the touch panel part 14 (S31). In the case of the A type, as shown in Fig. 6(b), the operation input control part 11d sets the effective range 14a on the right side of the touch panel part 14; in the case of the B type, the setting is as shown in Figure 10(b).
On the other hand, when the result of determination in step S29 is that switching to the left side is performed, the effective range 14a is set to the left side (S33). In the case of the A type, the setting is as shown in Fig. 7(b); in the case of the B type, as shown in Figure 11(b).
When the result of determination in step S27 is that the camera 1 is not in the landscape orientation, that is, it is in the portrait orientation, it is judged whether the left side of the camera 1 is facing upward (S39). Here, the camera orientation judging part 11b judges, based on the detection result of the tilt detection part 16, whether the right side or the left side is up.
When the result of determination in step S39 is that the left side is facing upward, the effective range 14a is set to the upper side (S41). In the case of the A type, the setting is as shown in Fig. 8(b); in the case of the B type, as shown in Figure 12(b). As mentioned above, the position of the effective range 14a is expressed in the direction corresponding to the way the camera is gripped, and is here expressed as the upper side of the touch panel part 14. The same manner of expression applies in step S43.
On the other hand, when the result of determination in step S39 is that the left side is not facing upward, that is, the right side is up, the effective range is set to the lower side (S43). In the case of the A type, the setting is as shown in Fig. 9(b); in the case of the B type, as shown in Figure 13(b).
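The branching of steps S27 to S43 amounts to the following small sketch; the function and argument names are illustrative assumptions, not part of the patent.

def select_effective_range(is_landscape, prefer_left, left_side_up):
    """Sketch of steps S27-S43: choose where on the touch panel part 14
    the effective range 14a is placed."""
    if is_landscape:                                   # S27: landscape (normal) posture
        return "left" if prefer_left else "right"      # S33 / S31
    return "upper" if left_side_up else "lower"        # S41 / S43 (portrait posture)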
After the effective range is set in step S31, S33, S41 or S43, it is judged whether the cursor 84 is being displayed (S51, see Figure 19). As described later, the cursor is displayed in step S55 and erased in step S73; in this step it is judged whether the cursor is currently displayed.
When the result of determination in step S51 is that the cursor is not displayed, it is judged whether a touch operation has occurred (S53), according to the detection result of the touch panel part 14. When the result is that a touch operation has occurred, the cursor is displayed (S55). As mentioned above, in the A type the cursor 84 is displayed at approximately the center of the screen, while in the B type the cursor 84 is displayed at the touched position.
When the result of determination in step S51 is that the cursor 84 is displayed, it is judged whether a certain time has elapsed since the cursor was displayed (S71). When the certain time has elapsed, the cursor 84 is erased (S73).
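The cursor display and time-out handling of steps S51 to S55 and S71 to S73 can be sketched as follows; the timeout value and all names are assumptions, since the description only speaks of "a certain time".

import time

CURSOR_TIMEOUT_S = 3.0      # hypothetical value

def update_cursor(visible, shown_at, touched):
    """Sketch: show the cursor 84 on a touch, erase it after the timeout."""
    now = time.monotonic()
    if not visible:
        return (True, now) if touched else (False, shown_at)    # S53 / S55
    if now - shown_at >= CURSOR_TIMEOUT_S:                       # S71
        return False, shown_at                                   # S73: erase the cursor
    return True, shown_at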
On the other hand, when the result of determination in step S71 is that the certain time has not elapsed since the cursor was displayed, or after the cursor has been displayed in step S55, it is determined whether the A type is set (S57).
When the result of determination in step S57 is that the A type is set, it is judged whether a slide operation has occurred (S59). As mentioned above, in the A type information is input by a slide amount specifying operation. Here, whether a slide operation has been performed is determined according to the detection result of the touch panel part 14.
When the result of determination in step S59 is that a slide operation has occurred, the amount and direction of the slide are detected (S61) according to the detection result of the touch panel part 14. Then, the cursor or the like on the EVF screen is moved in accordance with the operation (S63). As explained with Fig. 6 to Fig. 9, the cursor 84 is moved on the screen of the EVF 30 according to the slide operation on the touch panel part 14. When other adjustment information such as the parameters or menu items explained with Figure 16 is input, the movement is made according to the respective screen.
After the cursor or the like has been moved on the EVF screen, it is determined whether a tap operation has been performed on the touch panel part 14 (S65). When the result is that a tap operation has been performed, the instruction is confirmed (S67), and the photographing condition is set according to the confirmed instruction (S69).
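A minimal sketch of the A-type handling of steps S59 to S69 follows; the event type and helper names are hypothetical.

from dataclasses import dataclass

@dataclass
class TouchEvent:            # hypothetical event type, for illustration only
    kind: str                # "slide" or "tap"
    dx: float = 0.0
    dy: float = 0.0

def a_type_step(event, cursor_xy, gain=1.0):
    """Sketch of S59-S69: move the cursor 84 by the slide amount and direction,
    confirm the designated position on a tap."""
    if event.kind == "slide":                                    # S59 / S61
        return (cursor_xy[0] + event.dx * gain,
                cursor_xy[1] + event.dy * gain), None            # S63: move cursor
    if event.kind == "tap":                                      # S65
        return cursor_xy, cursor_xy                              # S67 / S69: confirm
    return cursor_xy, None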
When the result of determination in step S57 is that the A type is not set, that is, the B type is set, it is determined whether a touch operation has occurred (S81). As mentioned above, in the B type information is input in the position specifying mode using touch operations.
When the result of determination in step S81 is that a touch operation has occurred, the touched position is detected (S83), and the cursor is displayed at that position (S85). As explained with Figure 10 to Figure 13, the cursor 84 is positioned on the screen of the EVF 30 according to the touched position on the touch panel part 14. When other adjustment information such as the parameters or menu items explained with Figure 16 is input, positioning is made according to the respective screen. Further, at this point, or when a tap operation is subsequently performed, the instruction is confirmed (S87). After the instruction is confirmed, the flow proceeds to the above-mentioned step S69.
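A minimal sketch of the B-type position specifying mode of steps S81 to S87 follows, assuming a simple linear mapping from the effective range 14a to the EVF screen; the names and the mapping itself are illustrative assumptions.

def b_type_map(touch_xy, effective_range, evf_size):
    """Map a touch inside the effective range 14a to the corresponding
    point on the EVF 30 screen."""
    x0, y0, w, h = effective_range            # restricted area of the touch panel 14
    evf_w, evf_h = evf_size
    u = (touch_xy[0] - x0) / w                # normalized position inside 14a
    v = (touch_xy[1] - y0) / h
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None                           # touches outside 14a are ignored
    return u * evf_w, v * evf_h               # cursor 84 position on the EVF screen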
When the result of determination in step S23 (see Figure 18) is that neither the A type nor the B type is set, that is, the C type or the D type is set, the flow proceeds to Figure 21 and the sensitivity is set to twice the default value (S101). As mentioned above, the C type and the D type are slide amount specifying operations, and in step S101 the sensitivity of slide amount detection is set to twice that of the A type.
After the sensitivity is doubled, it is judged, in the same way as in step S59, whether a slide operation has occurred (S105). Then, in the same way as in step S61, the amount and direction of the slide are detected according to the detection result of the touch panel part 14 (S107).
Then, the cursor or the like is moved on the EVF screen or the back screen in accordance with the operation (S109). When the C type or the D type is set during EVF display, the cursor 84 is moved on the screen of the EVF 30 according to the slide operation on the touch panel part 14. When step S109 is reached via step S103 described later, the display is not on the EVF but on the back display part 29, and the cursor or the like is then moved on the back screen.
After the cursor or the like has been moved in accordance with the operation, it is judged whether the finger has left the touch panel and a tap operation has been performed (S111). When a tap has been performed in step S111, the instruction is confirmed (S113) and the photographing condition is set according to the instruction (S115).
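The only difference between the A type and the C/D types during EVF display is the detection sensitivity; a small sketch of the scaling of steps S101 and S103 follows, with the function name and default gain assumed.

def scaled_slide(dx, dy, evf_displayed, op_type):
    """Convert the detected slide into on-screen movement; the doubled
    sensitivity for C/D types on the EVF follows the description above."""
    gain = 2.0 if (evf_displayed and op_type in ("C", "D")) else 1.0
    return dx * gain, dy * gain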
Returning to Figure 18, when the result of determination in step S19 is that the EVF is not used, live view display is performed on the back display part 29 (S35). At this time, the EVF 30 does not display the live view. It is then judged whether the A type, the B type or the D type is set (S37).
If the result of determination in step S37 is that any of the A type, the B type or the D type is set, the flow proceeds to the above-mentioned step S81. Here, as explained with Figure 17, information is input to the touch panel part 14 in the position specifying mode, which is in effect the same as using the B type during EVF display.
On the other hand, when the result of determination in step S37 is that none of the A type, the B type or the D type is set, that is, the C type is set, the flow proceeds to Figure 21, the sensitivity is kept at the default value (S103), and the flow proceeds to step S105. In this case the C type is set, but since the display is on the back display part 29, the sensitivity of the slide amount specifying operation is kept at the default value, and touch operations are detected in step S105 and after.
Returning to Figure 19, the flow proceeds to step S91 when no touch operation occurs in step S53, when no slide operation occurs in step S59, when no tap operation is performed in step S65, when the photographing condition is determined in step S69, when no touch operation occurs in step S81, or when the cursor has been erased in step S73. Likewise, the flow proceeds to step S91 when the photographing condition is set in step S115, when no tap operation is performed in step S111, or when no slide operation occurs in step S105.
Turning to Figure 20, it is determined in step S91 whether a release operation has occurred. Here, the button operation detection part 11e makes the judgment according to whether the release button 13a has been pressed. When the result is that a release operation has been performed, the cursor is erased (S93); the image information control part 11c erases the display of the cursor 84.
After the cursor is erased, shooting processing is executed (S95). Here, the imaging element 21 exposes the subject image for the duration of the shutter speed. After the exposure ends, the imaging processing part 22 reads the image signal, and after processing in the image processing part 24 and the compression/decompression processing part 25, the image data is recorded in the image storage part 27.
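The shooting processing of step S95 can be sketched roughly as follows; the attribute and method names are hypothetical and the pipeline merely mirrors the order described above.

def shooting_process(camera):
    """Sketch of step S95."""
    raw = camera.imaging_element.expose(camera.shutter_speed)   # expose for the shutter time
    signal = camera.imaging_processing.read(raw)                # read the image signal
    developed = camera.image_processing.run(signal)             # image processing part 24
    data = camera.codec.compress(developed)                     # compression/decompression part 25
    camera.image_storage.record(data)                           # record in image storage part 27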
After the shooting processing has been executed, or when the result of determination in step S91 is that no release operation has occurred, it is determined whether a mode switch has occurred (S97), that is, whether the mode has been switched to the playback mode. When the result is that a mode switch has occurred, the flow proceeds to step S13 and playback processing is performed.
On the other hand, when the result of determination in step S97 is that no mode switch has occurred, it is judged whether the power is to be turned off (S99). Here, the button operation detection part 11e detects the operation state of the power button in the operation part 13 and makes the judgment according to this detection result. When the result is not power-off, the flow returns to step S15. When the power is turned off, power-off processing is performed and the main flow ends. When the power button is operated again, the flow starts again from step S11.
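The top level of Figures 18 to 20 can be condensed into the following rough loop; all method names on the hypothetical camera object are assumptions.

def main_loop(camera):
    """Very rough sketch of the main flow."""
    while True:
        if not camera.in_shooting_mode():          # S11
            camera.playback()                      # S13
            continue
        camera.show_live_view_with_overlay()       # S15: menu / information on live view
        camera.handle_touch_input()                # Figures 19 and 21
        if camera.release_pressed():               # S91
            camera.erase_cursor()                  # S93
            camera.shoot_and_record()              # S95
        if not camera.mode_switched():             # S97
            if camera.power_off_requested():       # S99
                camera.power_off()
                break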
As described above, in an embodiment of the invention, when the display destination of the image is switched to the back display part 29 by the display switching part 11a, the operation input control part 11d designates as the input position the position on the screen directly corresponding to the position at which the touch input was made on the touch panel part 14. On the other hand, when the display destination of the image is switched to the EVF 30 by the display switching part 11a, the operation input control part 11d moves the designated position on the EVF screen according to the direction and amount of a slide input on the touch panel part 14 (slide amount specifying mode, A type). Therefore, during EVF display a touch-pad-like operation is possible without directly touching the position of the touch panel corresponding to the screen, so that positioning is easy even when the operation is performed blind (so-called blind touch).
In addition, in an embodiment of the invention, when the display destination of the image is switched to the EVF 30 by the display switching part 11a, the operation input control part 11d sets, as the input detection area, a region of the touch panel part 14 narrower than that used with the back display part 29, associates each position in this detection area with a position on the EVF screen, and treats a touch input in this detection area as designating the corresponding position on the screen of the EVF 30 (position specifying mode, B type). Because the region is narrowed, erroneous operation can be prevented even when the operation is performed blind.
In addition, in an embodiment of the invention, the operation input control part 11d moves the designated position, on whichever screen is selected from the back display part 29 and the EVF 30, according to the direction and amount of the input slide operation, and when the EVF screen is used, the ratio of the movement amount on the screen to the input slide amount is set larger than when the back display part is used (slide amount specifying mode, doubled sensitivity, C and D types). Therefore, the cursor can be moved even with a narrower operation region.
In addition, in an embodiment of the invention, the camera orientation judging part 11b judges the orientation and holding style of the camera, and the region of the effective range 14a on the touch panel part 14 is changed according to the judgment result. Therefore, even when the user changes the way the camera 1 is held, information can be input at an optimum position.
In addition, in an embodiment of the invention, an example in which the present invention is applied to an interchangeable lens camera has been described, but the invention is not limited thereto and may also be applied to a camera in which the photographing lens and the camera body are integrated. Likewise, a type in which the EVF 30 is fixed to the camera body 10 has been described, but the invention is not limited thereto; a detachable EVF may of course be used.
In addition, in an embodiment of the invention, switching between the EVF 30 and the back display part 29 is performed by the automatic detection of the detection part 15. However, the invention is not limited thereto; switching may also be performed manually on the menu screen, or with a dedicated switching button, a shared button or the like. In that case the detection part 15 is optional and may be omitted.
In addition, in an embodiment of the invention, examples of information input on the touch panel part 14 while looking into the EVF 30 have been described for main subject selection, photographing condition setting, AF position selection and adjustment amount setting. However, the invention is not limited thereto, and can also be applied to various other information inputs, such as designating items on a menu screen that does not display the live view, or designating and moving the enlarged playback position during electronic zoom.
In addition, in an embodiment of the invention, a digital camera has been used as the photographing device in the description, but the camera may be a digital single-lens reflex camera or a compact digital camera, a camera for moving images such as a video camera or camcorder, or of course a camera built into a mobile phone, a personal digital assistant (PDA: Personal Digital Assistant), a game machine or the like.
The present invention is not limited to the above-described embodiments as they stand; in the implementation stage, the structural elements may be modified and embodied within a scope not departing from the gist of the invention. Various inventions may also be formed by appropriately combining the plural structural elements disclosed in the above embodiments. For example, some structural elements may be deleted from all the structural elements shown in an embodiment, and structural elements of different embodiments may be combined as appropriate.

Claims (8)

1. A camera, characterized in that the camera comprises:
a back display part disposed on the back surface of a camera body;
a touch panel disposed so as to overlap the back display part and detecting an input operation instruction;
a look-into type EVF disposed on an upper portion of the camera body;
a display switching part that switches the display destination of an image to either the back display part or the EVF; and
an operation input control part that detects an operation on the touch panel and controls the input of an operation instruction to the screen of the back display part or the EVF,
wherein the operation input control part performs control such that, when the display destination of the image is switched to the back display part by the display switching part, the position on the screen directly corresponding to the position at which a touch input was made on the touch panel is designated as the input position, and, when the display destination of the image is switched to the EVF by the display switching part, a slide input on the touch panel is used and the designated position is moved on the EVF screen according to the direction and amount of the slide;
wherein the operation input control part performs control such that, when the display destination of the image is switched to the EVF by the display switching part, a slide input on the touch panel is used, the designated position is moved on the EVF screen according to the direction and amount of the slide action, and, as the ratio of the movement amount on the screen to the input slide amount, the entire length of the corresponding side of the EVF screen can be traversed with a slide amount smaller than the entire length of the touch panel;
and wherein the operation input control part performs control such that, when the back display part is used, the region in which input to the touch panel is effective is set to the entire surface of the touch panel, and, when the EVF is used, the region in which input to the touch panel is effective is limited to a partial surface of the touch panel.
2. The camera according to claim 1, characterized in that
the camera has a posture detection part that detects the posture of the camera body, and
the operation input control part, when the EVF is used, performs control to switch the effective region according to the posture of the camera body detected by the posture detection part.
3. The camera according to claim 1, characterized in that
the operation input control part performs control such that, as the ratio of the movement amount when the EVF screen is used, the entire length of the corresponding side of the EVF screen can be traversed with a slide amount equal to or less than half the entire length of the touch panel.
4. A camera, characterized in that the camera comprises:
a back display part disposed on the back surface of a camera body;
a touch panel part disposed so as to overlap the back display part and detecting an input operation instruction;
a look-into type EVF disposed on an upper portion of the camera body;
a display switching part that switches the display destination of an image to either the back display part or the EVF; and
an operation input control part that detects an operation on the touch panel part and controls the input of an operation instruction to the screen of the back display part or the EVF,
wherein the operation input control part performs control such that, when the display destination of the image is switched to the EVF by the display switching part, a region of the touch panel part narrower than that used when the back display part is used is set as an input detection area, each position in this detection area is associated with a position on the screen of the EVF, and a touch input in this detection area is treated as an input designating the corresponding position on the screen of the EVF;
and wherein the operation input control part performs control such that, when the display destination of the image is switched to the EVF by the display switching part, a slide input on the touch panel part is used, the designated position is moved on the EVF screen according to the direction and amount of the slide action, and, as the ratio of the movement amount on the screen to the input slide amount, the entire length of the corresponding side of the EVF screen can be traversed with a slide amount smaller than the entire length of the touch panel part.
5. The camera according to claim 4, characterized in that
the camera has a posture detection part that detects the posture of the camera body, and
the operation input control part performs control such that, when the EVF is used, the detection area is switched according to the posture of the camera body detected by the posture detection part.
6. A camera, characterized in that the camera comprises:
a back display part disposed on the back surface of a camera body;
a touch panel part disposed so as to overlap the back display part and detecting an input operation instruction;
a look-into type EVF disposed on an upper portion of the camera body;
a display switching part that switches the display destination of an image to either the back display part or the EVF; and
an operation input control part that detects an operation on the touch panel part and controls the input of an operation instruction to the screen of the back display part or the EVF,
wherein the operation input control part performs control such that the designated position is moved, on whichever screen is selected from the back display part and the EVF, according to the direction and amount of an input slide operation, and, when the EVF screen is used, the ratio of the movement amount on the screen to the input slide amount is set larger than when the back display part is used;
and wherein the operation input control part performs control such that, when the back display part is used, the region in which input to the touch panel part is effective is set to the entire surface of the touch panel part, and, when the EVF is used, the region in which input to the touch panel part is effective is limited to a partial surface of the touch panel part.
7. The camera according to claim 6, characterized in that
the operation input control part performs control such that, as the ratio of the movement amount when the EVF screen is used, the entire length of the corresponding side of the EVF screen can be traversed with a slide amount equal to or less than half of the touch panel part.
8. The camera according to claim 6, characterized in that
the operation input control part performs control such that, when a touch operation is detected, a cursor for position designation is displayed on whichever screen is selected from the back display part and the EVF, when a slide operation of this cursor is detected, the cursor is moved on that screen according to the slide, and, when the touch operation is released after this slide, or when a tap operation is performed after the touch operation is released, the condition on the screen designated by the cursor is determined.
CN201110314592.0A 2010-10-18 2011-10-17 Camera Active CN102457661B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010233326A JP5613005B2 (en) 2010-10-18 2010-10-18 camera
JP2010-233326 2010-10-18

Publications (2)

Publication Number Publication Date
CN102457661A CN102457661A (en) 2012-05-16
CN102457661B true CN102457661B (en) 2015-03-25

Family

ID=46040278

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110314592.0A Active CN102457661B (en) 2010-10-18 2011-10-17 Camera

Country Status (2)

Country Link
JP (1) JP5613005B2 (en)
CN (1) CN102457661B (en)

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105915771B (en) * 2012-06-29 2019-03-15 富士胶片株式会社 Camera and its method of controlling operation
CN109240633B (en) * 2013-03-13 2021-10-22 歌乐株式会社 Display device and information terminal operation method
JP2014178990A (en) * 2013-03-15 2014-09-25 Kyocera Corp Mobile device, control method, and control program
CN104156149B (en) * 2014-07-18 2016-04-13 小米科技有限责任公司 Acquisition parameters control method and device
WO2016172619A1 (en) 2015-04-23 2016-10-27 Apple Inc. Digital viewfinder user interface for multiple cameras
CN105306824B (en) * 2015-11-17 2019-05-28 小米科技有限责任公司 Acquisition parameters adjusting method and device
JP6143023B2 (en) * 2015-11-19 2017-06-07 カシオ計算機株式会社 Electronic device, touch operation control method, and program
JP6614943B2 (en) * 2015-11-30 2019-12-04 キヤノン株式会社 Imaging control apparatus and control method thereof
JP6590666B2 (en) * 2015-11-30 2019-10-16 キヤノン株式会社 Electronic device and control method thereof
JP6608966B2 (en) * 2016-01-28 2019-11-20 マクセル株式会社 Imaging device
JP6602721B2 (en) * 2016-04-15 2019-11-06 株式会社東海理化電機製作所 Vehicle visual recognition device
US9912860B2 (en) * 2016-06-12 2018-03-06 Apple Inc. User interface for camera effects
JP6829015B2 (en) 2016-06-23 2021-02-10 マクセル株式会社 Mobile terminal
DE112017003186B4 (en) * 2016-06-27 2020-06-25 Fujifilm Corporation CAMERA AND ADJUSTMENT METHOD FOR THE CAMERA
JP2018013745A (en) * 2016-07-23 2018-01-25 キヤノン株式会社 Electronic equipment and control method therefor
JP6708516B2 (en) * 2016-08-05 2020-06-10 キヤノン株式会社 Electronic device, control method thereof, and program
JP6701033B2 (en) * 2016-08-30 2020-05-27 キヤノン株式会社 Electronic device and control method thereof
CN112653837B (en) * 2016-08-31 2022-09-13 佳能株式会社 Image pickup control apparatus, control method therefor, and storage medium
JP6757268B2 (en) 2017-01-30 2020-09-16 キヤノン株式会社 Imaging device and its control method
JP6799475B2 (en) * 2017-02-10 2020-12-16 キヤノン株式会社 Imaging device and its control method
DE102017109254A1 (en) 2017-04-28 2018-10-31 Carl Zeiss Ag digital camera
JP6855317B2 (en) * 2017-05-10 2021-04-07 キヤノン株式会社 Imaging device, control method of imaging device, program, and recording medium
DK180859B1 (en) 2017-06-04 2022-05-23 Apple Inc USER INTERFACE CAMERA EFFECTS
JP7009096B2 (en) * 2017-07-06 2022-01-25 キヤノン株式会社 Electronic devices and their control methods
JP2019016299A (en) * 2017-07-10 2019-01-31 キヤノン株式会社 Electronic apparatus having operation member arranged on different surfaces, control method thereof, program and storage medium
US11112964B2 (en) 2018-02-09 2021-09-07 Apple Inc. Media capture lock affordance for graphical user interface
JP7071197B2 (en) * 2018-04-06 2022-05-18 キヤノン株式会社 Imaging device and its control method
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
CN112352418B (en) * 2018-06-27 2023-11-21 富士胶片株式会社 Image pickup apparatus and image pickup method
DK201870623A1 (en) 2018-09-11 2020-04-15 Apple Inc. User interfaces for simulated depth effects
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US10674072B1 (en) 2019-05-06 2020-06-02 Apple Inc. User interfaces for capturing and managing visual media
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
US11321857B2 (en) 2018-09-28 2022-05-03 Apple Inc. Displaying and editing images with depth information
JP7382031B2 (en) 2018-10-18 2023-11-16 日本エンヂニヤ株式会社 Upflow inclined plate sand settling tank
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
JP7492349B2 (en) 2020-03-10 2024-05-29 キヤノン株式会社 Imaging device, control method thereof, program, and storage medium
JP7465129B2 (en) 2020-03-24 2024-04-10 キヤノン株式会社 Imaging device, control method thereof, program, and storage medium
US11054973B1 (en) 2020-06-01 2021-07-06 Apple Inc. User interfaces for managing media
US11212449B1 (en) 2020-09-25 2021-12-28 Apple Inc. User interfaces for media capture and management
US11539876B2 (en) 2021-04-30 2022-12-27 Apple Inc. User interfaces for altering visual media
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04368081A (en) * 1991-06-14 1992-12-21 Fuji Photo Film Co Ltd Camcorder
JP4123608B2 (en) * 1998-12-14 2008-07-23 ソニー株式会社 Imaging device
JP4241007B2 (en) * 2002-11-12 2009-03-18 キヤノン株式会社 Imaging apparatus, control method therefor, program, and computer-readable storage medium
JP2008268726A (en) * 2007-04-24 2008-11-06 Canon Inc Photographing device
JP4991621B2 (en) * 2008-04-17 2012-08-01 キヤノン株式会社 Imaging device
JP5228858B2 (en) * 2008-12-03 2013-07-03 ソニー株式会社 Imaging device

Also Published As

Publication number Publication date
CN102457661A (en) 2012-05-16
JP2012089973A (en) 2012-05-10
JP5613005B2 (en) 2014-10-22

Similar Documents

Publication Publication Date Title
CN102457661B (en) Camera
CN103248813B (en) Photographic equipment and method of controlling operation thereof thereof
JP6765956B2 (en) Imaging control device and its control method
CN101883213B (en) Image pickup device and method for switching modes of the same
US9001051B2 (en) Information processing apparatus, display method, and display program
CN101651782B (en) Information processing apparatus
CN109076156B (en) Electronic device and control method thereof
CN106817536B (en) Video camera controller and its control method
CN104539849A (en) Image pickup apparatus and its control method
CN103200354A (en) Imaging apparatus and method for controlling the same
KR20080072547A (en) Mobile equipment with display function
CN108377329A (en) Photographic device and its control method
JP2014017665A (en) Display control unit, control method for display control unit, program, and recording medium
JP2009230036A (en) Setting device and program
JP2007235448A (en) Camera, control method of camera, program, and recording medium
JP6986918B2 (en) Electronic devices, control methods, programs, and storage media
JP2013009061A (en) Camera and camera operation method
CN1893560B (en) Electronic equipment and menu display method
CN103578515A (en) Movie processing apparatus and control method thereof
JP2008065851A (en) Information processing apparatus and recording medium
JP4159271B2 (en) Digital camera
JP6393296B2 (en) IMAGING DEVICE AND ITS CONTROL METHOD, IMAGING CONTROL DEVICE, PROGRAM, AND STORAGE MEDIUM
JP5863418B2 (en) Imaging apparatus and control method thereof
JP6069922B2 (en) Electronic apparatus, imaging apparatus, and program
JP5498564B2 (en) Image display device and image display method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20151125

Address after: Tokyo, Japan

Patentee after: Olympus Corporation

Address before: Tokyo, Japan

Patentee before: Olympus Imaging Corp.

TR01 Transfer of patent right

Effective date of registration: 20211207

Address after: Tokyo, Japan

Patentee after: Aozhixin Digital Technology Co.,Ltd.

Address before: Tokyo, Japan

Patentee before: OLYMPUS Corp.

TR01 Transfer of patent right