WO2017085982A1 - Control device, control method, and program - Google Patents
Control device, control method, and program
- Publication number
- WO2017085982A1 (PCT/JP2016/075730)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display unit
- control device
- processing
- operation display
- eye
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
- G03B17/12—Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
- G03B17/14—Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets interchangeably
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/26—Power supplies; Circuitry or arrangement to switch on the power source; Circuitry to check the power source voltage
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B3/00—Focusing arrangements of general interest for cameras, projectors or printers
- G03B3/10—Power-operated focusing
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B5/00—Adjustment of optical system relative to image or object surface other than for focusing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/18—Signals indicating condition of a camera member or suitability of light
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/18—Signals indicating condition of a camera member or suitability of light
- G03B17/20—Signals indicating condition of a camera member or suitability of light visible in viewfinder
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2205/00—Adjustment of optical system relative to image or object surface other than for focusing
- G03B2205/0046—Movement of one or more optical elements for zooming
Definitions
- the present disclosure relates to a control device, a control method, and a program.
- EVF: electronic viewfinder
- Patent Document 1 describes a technique in which a through image is displayed on the rear display unit when the user does not bring his eyes close to the viewfinder, and on the viewfinder display screen when the user does.
- Japanese Patent Application Laid-Open No. 2004-228561 describes a technique that allows a user to perform mode switching or the like by touching an icon displayed on the rear display while looking through the viewfinder.
- The present disclosure proposes a new and improved control device, control method, and program capable of executing processing adapted to the state of eye proximity to the viewfinder and the operation state of the operation display unit.
- A control device is provided that includes a processing control unit that executes processing related to shooting or image reproduction based on whether proximity of an eye to the finder is detected and on the detected operation state of the operation display unit.
- A control method is provided, comprising executing processing related to shooting or image reproduction based on whether proximity of an eye to the finder is detected and on the detected operation state of the operation display unit.
- A program is provided that causes a computer to function as a processing control unit that executes processing related to shooting or image reproduction based on whether proximity of an eye to the finder is detected and on the detected operation state of the operation display unit.
- FIG. 2 is a functional block diagram showing the internal configuration of the photographing apparatus 10-1 according to the embodiment.
- FIG. 5 is an explanatory diagram showing an example of the state of eye proximity to the EVF 122.
- FIG. 2 is a functional block diagram showing an internal configuration of a photographing apparatus 10-2 according to the first application example.
- An explanatory diagram shows an example of the association of processes according to application example 1 of the embodiment.
- An explanatory diagram shows an example of the external appearance of the imaging device 10-3 according to application example 2 of the embodiment.
- A functional block diagram shows the internal configuration of the imaging device 10-3 according to application example 2.
- In this specification and the drawings, a plurality of constituent elements having substantially the same functional configuration may be distinguished by appending different letters to the same reference numeral.
- For example, components having substantially the same functional configuration are distinguished as the operation button 132a and the operation button 132b as necessary.
- When it is not necessary to distinguish them, only the common reference numeral is given: the operation button 132a and the operation button 132b are simply referred to as the operation button 132.
- FIG. 1 is an explanatory diagram showing a situation where a user is photographing using the photographing apparatus 10-1.
- the imaging device 10-1 is an example of a control device in the present disclosure.
- The photographing apparatus 10-1 is an apparatus for photographing a video of the external environment or reproducing a photographed image. Here, "shooting" means actually recording an image or displaying a monitor image.
- the photographing apparatus 10-1 includes a viewfinder.
- The viewfinder is, for example, a viewing window that the user brings his or her eye close to (hereinafter sometimes referred to as "looking into") in order to decide a composition or adjust focus before photographing.
- In the present embodiment, the finder is the EVF 122.
- the EVF 122 displays image information acquired by an image sensor (not shown) included in the imaging device 10-1.
- the finder may be an optical view finder.
- In the following, the case where the finder provided in the photographing apparatus 10-1 is the EVF 122 will be mainly described.
- the photographing apparatus 10-1 includes an operation display unit 126 on the back side of the housing, for example.
- the operation display unit 126 has a function as a display unit that displays various types of information such as a captured image and an operation unit that detects an operation by a user.
- the function as the display unit is realized by, for example, a liquid crystal display (LCD) device or an organic light emitting diode (OLED) device.
- The function as an operation unit is realized by, for example, a touch panel that detects a touch operation by the user.
- the user performs shooting settings at the shooting site (S901).
- When the user wants to confirm the contents of the shooting settings (S903: Yes), the user presses a predetermined button on the digital camera, for example.
- the digital camera displays the set shooting information on the rear display of the digital camera (S905).
- the user performs framing by looking into the EVF (S909).
- The digital camera then performs focusing processing (S911). Thereafter, when the user desires to enlarge a partial area centered on the focused position in the subject (hereinafter sometimes referred to as the focus area) (S913: Yes), the user presses, for example, the custom button assigned to focus-area enlargement. The digital camera then enlarges and displays the focus area (S915).
- the digital camera performs shooting when the user fully presses the shutter button, for example (S917).
- the operation after S917 will be described with reference to FIG.
- When the user desires to check the captured image (S921: Yes), the user performs a predetermined operation for displaying the captured image as a preview, and the digital camera previews the captured image on the EVF (S923). If preview display has been set to automatic in advance, the captured image is automatically previewed on the EVF in S921 to S923.
- The digital camera may transfer the image being reproduced to another device, such as a connected PC (Personal Computer), based on, for example, a user input to that device. The user can then rate the transferred image using the other device.
- In a known digital camera, the buttons to be pressed differ, in principle, for each function. To execute a desired function, the user therefore needs to press a different button each time. For example, in the workflow described above, the user may need to press different buttons in S905, S915, S923, S927, S931, S935, and S937.
- Moreover, depending on the arrangement of the buttons, the user may be unable to press a desired button without breaking his or her posture. It is therefore difficult for the user to perform operations smoothly along the workflow described above.
- the present inventors have created the photographing apparatus 10-1 according to the present embodiment.
- In the photographing apparatus 10-1, the user can freely associate processing relating to shooting or image reproduction with each combination of whether the eye is close to the EVF 122 and the touch state on the operation display unit 126. The user can thus perform operations along the workflow described above more easily.
- FIG. 4 is a functional block diagram showing the configuration of the photographing apparatus 10-1 according to the present embodiment.
- the imaging apparatus 10-1 includes a control unit 100, an imaging unit 120, an EVF 122, a detection unit 124, an operation display unit 126, and a storage unit 128.
- A redundant description of components already described is omitted.
- The control unit 100 controls the overall operation of the image capturing apparatus 10-1 using, for example, hardware such as a CPU (Central Processing Unit), ROM (Read Only Memory), and RAM (Random Access Memory) built into the image capturing apparatus 10-1. As illustrated in FIG. 4, the control unit 100 includes an association unit 102, a detection result acquisition unit 104, and a processing control unit 106.
- The associating unit 102 associates predetermined processing relating to shooting or image reproduction with each combination of whether the eye is close to the EVF 122 and the user's operation state on the operation display unit 126, based on, for example, an input by the user. The user can thus associate a desired process with each such combination.
- the operation state is a state of a touch operation on the operation display unit 126 or a state of a proximity operation (operation based on determination of proximity to the operation display unit 126).
- Hereinafter, the case where the operation state is the state of a touch operation will be described.
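As an illustrative sketch of this association mechanism (the class and enum names below are assumptions for illustration, not terms from the disclosure), the associating unit can be modeled as a table keyed by the combination of eye-proximity state and touch state:

```python
from enum import Enum

class EyeState(Enum):
    NEAR = "eye close to EVF"
    AWAY = "eye away from EVF"

class TouchState(Enum):
    NONE = "no touch"
    RIGHT_HAND = "right hand only"
    LEFT_HAND = "left hand only"
    BOTH_HANDS = "both hands"

class AssociationRegistry:
    """Stores a user-defined mapping from (eye state, touch state) to a process."""
    def __init__(self):
        self._table = {}

    def associate(self, eye, touch, process_name):
        self._table[(eye, touch)] = process_name

    def lookup(self, eye, touch):
        # Returns None when no process has been associated with the combination.
        return self._table.get((eye, touch))

registry = AssociationRegistry()
registry.associate(EyeState.NEAR, TouchState.NONE, "normal shooting mode")
registry.associate(EyeState.NEAR, TouchState.RIGHT_HAND, "enlarge focus area while touching")
assert registry.lookup(EyeState.NEAR, TouchState.RIGHT_HAND) == "enlarge focus area while touching"
assert registry.lookup(EyeState.NEAR, TouchState.LEFT_HAND) is None
```

The table deliberately allows combinations with no associated process, matching the examples later in the text where some states trigger nothing.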
- FIG. 5 is an explanatory diagram showing an example of the state of proximity of the eyes to the EVF 122.
- FIG. 5A shows a case where the user brings his or her eye close to the EVF 122, and FIG. 5B shows a case where the user does not.
- the touch state can be distinguished according to the touch position of the user with respect to the operation display unit 126.
- the touch state may be distinguished depending on which region the user's touch position is included in a plurality of predetermined regions in the operation display unit 126.
- For example, the operation display unit 126 is divided into four areas, and the touch state can be distinguished according to which of the four areas contains the user's touch position.
- the touch state can be distinguished by a gesture operation on the operation display unit 126.
- the touch state may be distinguished by the type of gesture operation on the operation display unit 126 or may be distinguished by the presence or absence of a gesture operation on the operation display unit 126.
- the gesture operation includes, for example, a drag operation, flick, swipe, pinch, or tap on the operation display unit 126.
- the type of gesture operation may be further distinguished by the length of the operation duration (for example, long press), the number of operations, or the strength of the touch force.
- The type of gesture operation can be further distinguished by the number of fingers used for the operation. For example, as shown in FIG. 8, the operation of swiping the operation display unit 126 from top to bottom with only the thumb of the right hand (FIG. 8A) and the operation of swiping it from top to bottom with the thumbs of both hands (FIG. 8B) are distinguished as different types of operation.
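The two distinctions above (touch region and finger count) can be sketched as simple classifiers; the quadrant split and the function names are illustrative assumptions, not part of the disclosure:

```python
def classify_region(x, y, width, height):
    """Classify a touch point into one of four quadrants of the operation display.
    The four-region split is one example given in the text; coordinates are
    pixel positions with the origin at the top-left corner."""
    col = 0 if x < width / 2 else 1
    row = 0 if y < height / 2 else 1
    return row * 2 + col  # 0: top-left, 1: top-right, 2: bottom-left, 3: bottom-right

def classify_swipe(touch_count, start_y, end_y):
    """Distinguish a one-finger from a multi-finger top-to-bottom swipe,
    as in FIGS. 8A and 8B (y grows downward)."""
    if end_y <= start_y:
        return "not a downward swipe"
    return "one-finger downward swipe" if touch_count == 1 else "multi-finger downward swipe"

assert classify_region(10, 10, 100, 100) == 0
assert classify_region(90, 90, 100, 100) == 3
assert classify_swipe(2, 0, 50) == "multi-finger downward swipe"
```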
- When the eye is close to the EVF 122, the association unit 102 can associate processing related to shooting with the combination.
- For example, processing for changing a shooting parameter may be associated with this case. Such processing includes, for example, applying an ND filter, activating or switching a low-pass filter, activating the auto HDR function, switching between auto focus and manual focus, changing the shutter speed, adjusting the aperture, changing the ISO sensitivity, and adjusting the white balance.
- processing for displaying the histogram on the EVF 122 may be associated with this case.
- a process for displaying a guide display for determining the composition on the EVF 122 may be associated.
- the guide display is, for example, a grid line or a level. According to these association examples, the user can easily determine the composition while looking into the EVF 122.
- Processing for changing the display mode of the focus area may also be associated with this case: for example, activation of the peaking function (that is, highlighting the focus area with a specific color or the like) or enlargement of the focus area. According to these association examples, the user can more easily achieve precise focus when using manual focus (or auto focus).
- In this case, aperture preview (that is, displaying before shooting an image captured with the set aperture value) or processing for displaying setting information relating to shooting on the EVF 122 may also be associated.
- the setting information related to shooting includes information such as the presence / absence of auto flash setting, the setting information related to the self-timer, the presence / absence of auto focus setting, or the presence / absence of setting of the face recognition function. According to these association examples, even when the user is looking into the EVF 122, the user can check the setting information related to shooting without performing a special operation.
- a process of displaying a preview of the captured image on the EVF 122 immediately after shooting may be associated.
- Processing related to shooting can also be associated with the case where the eye is not close to the EVF 122. For example, processing for displaying setting information regarding shooting on the operation display unit 126, processing for displaying a shot image on the operation display unit 126, or live view shooting (that is, shooting without looking into the EVF 122) may be associated.
- processing related to image reproduction can be associated with the case where the eyes are not in proximity to the EVF 122.
- For example, a process of displaying one image on the operation display unit 126, a process of displaying an enlarged image, a process of rating the image being reproduced, a process of deleting the image being reproduced, or a process of transferring a captured image to another device such as a smartphone may be associated.
- a process for turning off the operation display unit 126 may be associated. According to this association example, for example, when the user stops using the photographing apparatus 10-1, power consumption can be suppressed. Further, it is not necessary for the user to perform a special operation for turning off the screen.
- Note that a predetermined process may be associated with the case where the eye is not close to the EVF 122 while no process (at all) is associated with the case where the eye is close, or vice versa. An example of this association will be described later.
- Conversely, the same processing may be associated regardless of whether the eye is close to the EVF 122. An example of this association will be described later.
- the multi-camera control function may be associated with a case where the operation display unit 126 is touched with, for example, one of the left and right hands.
- For example, control of photographing by another photographing apparatus 10-1b, control of transmission of data such as images to the other photographing apparatus 10-1b, or processing for displaying a captured image received from the other photographing apparatus 10-1b on the operation display unit 126 may be associated.
- For example, a process of deleting the image being played back may be associated with the operation of swiping the operation display unit 126 from top to bottom with only the right hand. According to this association example, the user can delete the image intuitively.
- As illustrated in FIG. 8B, a process of rating the image being reproduced may be associated with the operation of swiping the operation display unit 126 from top to bottom with both hands. The rating may be such that a higher score is given as the traced distance on the operation display unit 126 increases.
- For example, a process of enlarging the focus area when an upper position on the operation display unit 126 is pressed long, and a process of displaying the focus area when a lower position is pressed long, may be associated.
- Alternatively, a process of enlarging and displaying the focus area may be associated with a long press on the operation display unit 126, with the enlargement ratio of the focus area set to change according to the touch position. For example, the enlargement ratio may be set larger as the touch position is higher.
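The position-dependent enlargement ratio just described can be sketched as a linear mapping; the 1x to 8x range and the function name are assumed example values, not specified in the text:

```python
def focus_magnification(touch_y, height, min_mag=1.0, max_mag=8.0):
    """Map the touch position to a focus-area magnification: the higher the
    touch position on the display, the larger the magnification.
    touch_y is measured from the top, so a smaller y means a higher position."""
    frac = 1.0 - touch_y / height
    return min_mag + frac * (max_mag - min_mag)

assert focus_magnification(0, 100) == 8.0    # top of the display: maximum magnification
assert focus_magnification(100, 100) == 1.0  # bottom of the display: no magnification
```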
- In addition, processing for displaying a histogram on the operation display unit 126, or processing for enlarging and displaying a region near the touch position on the operation display unit 126, may be associated.
- Processing for ending the display of peaking may also be associated with the duration of a long press. For example, peaking (or grid lines, etc.) may be displayed while the long-press duration is within a predetermined time, and automatically hidden when the duration exceeds the predetermined time. According to this setting example, the user does not need to perform a special operation to hide peaking once it has been displayed; for example, peaking can be hidden without moving the hand.
- The association unit 102 can store the association result in the storage unit 128.
- For example, the associating unit 102 stores, in the storage unit 128, every combination of whether the eye is close to the EVF 122 and the touch state on the operation display unit 126, together with the identification information of the associated process.
- the detection result acquisition unit 104 acquires, from the detection unit 124, information detected by the detection unit 124 as to whether or not the eye is close to the EVF 122. Further, the detection result acquisition unit 104 acquires a detection result of a touch state on the operation display unit 126 from the operation display unit 126.
- The processing control unit 106 executes the process associated with the detection result acquired by the detection result acquisition unit 104 regarding whether the eye is close to the EVF 122 and the detection result of the touch state on the operation display unit 126. For example, the processing control unit 106 first extracts from the storage unit 128 the process identification information stored in association with the combination of the two detection results, and then executes the process indicated by the extracted identification information.
- The processing control unit 106 can also change the setting mode of the operation display unit 126 depending on whether the eye is close to the EVF 122. For example, when the eye is not close to the EVF 122, the processing control unit 106 sets the operation display unit 126 to the touch panel mode, in which a screen is displayed on the operation display unit 126 and position designation by a touch operation uses an absolute position. When the eye is close to the EVF 122, the processing control unit 106 sets the operation display unit 126 to the touch pad mode, in which the operation display unit 126 is basically turned off and position designation by a touch operation uses a relative position. As a modification, a screen may be displayed on the operation display unit 126 even in the touch pad mode.
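The difference between the two position-designation schemes can be sketched as follows; pixel coordinates, default cursor position, and function name are illustrative assumptions:

```python
def designate_position(mode, touch_px=None, cursor_px=(50, 50), delta_px=None):
    """Resolve a designated position in the two modes described in the text:
    the touched point itself (absolute) in touch-panel mode, and the current
    cursor moved by the drag delta (relative) in touch-pad mode."""
    if mode == "touch_panel":
        return touch_px
    dx, dy = delta_px
    return (cursor_px[0] + dx, cursor_px[1] + dy)

# Touch-panel mode: the designated position is exactly where the finger lands.
assert designate_position("touch_panel", touch_px=(20, 80)) == (20, 80)
# Touch-pad mode: the finger's movement shifts the cursor from its previous position.
assert designate_position("touch_pad", cursor_px=(50, 50), delta_px=(10, -20)) == (60, 30)
```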
- The imaging unit 120 captures an image by forming an external image, through a lens, on an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
- the detection unit 124 detects the usage state of the photographing apparatus 10-1 by the user. For example, the detection unit 124 detects whether or not the eye is close to the EVF 122 using infrared rays or the like. As an example, the detection unit 124 determines that the eye is close to the EVF 122 when an object is detected in the vicinity of the EVF 122 by the infrared sensor. That is, the detection unit 124 does not have to determine whether or not the object (close to the EVF 122) is an eye.
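The infrared-based detection just described amounts to a distance threshold; as the text notes, the detector need not verify that the nearby object is actually an eye. The 30 mm threshold and the function name are assumed example values:

```python
def eye_proximity_detected(ir_distance_mm, threshold_mm=30):
    """Report eye proximity when the infrared sensor sees any object within
    the threshold distance of the EVF, without checking that it is an eye."""
    return ir_distance_mm <= threshold_mm

assert eye_proximity_detected(10) is True    # object near the EVF: treated as an eye
assert eye_proximity_detected(100) is False  # nothing close to the EVF
```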
- the storage unit 128 stores various data and various software. For example, the storage unit 128 stores information indicating the result of association by the association unit 102.
- the configuration of the imaging apparatus 10-1 is not limited to the configuration described above.
- For example, the detection unit 124 does not have to be included in the imaging apparatus 10-1.
- In application example 1, when the user is looking into the EVF 122, the “normal shooting mode” is associated with the case where the user does not touch the operation display unit 126 at all.
- “a process of enlarging and displaying a focus area only while touching” is associated with a case where the user touches the operation display unit 126 with only the right hand.
- When the user is not looking into the EVF 122, the process for turning off the operation display unit 126 is associated with the case where the user does not touch the operation display unit 126 at all.
- The case where the user touches the operation display unit 126 with only the right hand is associated with “processing for displaying the setting information related to shooting on the operation display unit 126”, and the case where the user touches the operation display unit 126 with both hands is associated with “processing for enlarging and displaying the image being reproduced”.
- the user can easily perform an operation related to confirmation of a focus area at the time of shooting and an operation related to confirmation of an image being reproduced.
- the focus area can be enlarged and displayed simply by touching the operation display unit 126 with the right hand.
- the image being reproduced can be enlarged and displayed simply by touching the operation display unit 126 with both hands.
- the state 20-1c and the state 20-1d are not associated with any process.
- The reason is as follows: normally, when the user is looking into the EVF 122, the left hand is holding the photographing lens, so it is difficult for the user to touch the operation display unit 126 with the left hand without breaking his or her posture.
- However, the present disclosure is not limited to this example, and some processing may be associated with the state 20-1c or the state 20-1d.
- Next, application example 2 will be described with reference to FIG. 10.
- This application example 2 is an example in which processing related to reproduction is associated with all cases where the user does not look into the EVF 122.
- As shown in FIG. 10, the cases where the user is looking into the EVF 122 are associated in the same manner as in application example 1 (shown in FIG. 9), for example.
- “processing for reproducing one image” is associated with the case where the user does not touch the operation display unit 126 at all.
- In addition, a process for enlarging and displaying the image being reproduced is associated with the case where the user touches the operation display unit 126 with only the right hand. This is because, first, when the user is not touching the operation display unit 126 at all, it is effective to display the entire image (since the operation display unit 126 is not hidden by the user's hand); and second, when the user touches the operation display unit 126 with one hand, part of the operation display unit 126 is hidden by that hand, so it is effective to display only a part of the entire image centrally on the screen.
- In addition, a process of deleting the image being reproduced is associated with the case where the user touches the operation display unit 126 with only the left hand, and a process of rating the image being reproduced is associated with the case where the user touches the operation display unit 126 with both hands.
- As described above, in application example 2, processing related to reproduction is associated with all cases where the user is not looking into the EVF 122. The user can therefore easily perform many types of processing relating to reproduction, such as image reproduction, enlarged display, deletion, and rating.
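The four playback associations of application example 2 form a small lookup table; the touch-state labels below are informal paraphrases of the cases in the text, not terms from the disclosure:

```python
# Application example 2: every no-eye-proximity state maps to a playback process.
playback_table = {
    "no touch": "play back one image",
    "right hand only": "enlarge the image being played back",
    "left hand only": "delete the image being played back",
    "both hands": "rate the image being played back",
}

def playback_process(touch_state):
    """Return the playback process associated with a touch state
    when the user is not looking into the EVF."""
    return playback_table[touch_state]

assert playback_process("left hand only") == "delete the image being played back"
assert playback_process("both hands") == "rate the image being played back"
```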
- In application example 3, when the user is looking into the EVF 122, the “normal shooting mode” is associated with the case where the user does not touch the operation display unit 126 at all, and a process of displaying a preview of the photographed image is associated with the case where the user touches the operation display unit 126 with only the right hand.
- When the user is not looking into the EVF 122, the process of “playing back one image” is associated with the case where the user does not touch the operation display unit 126 at all.
- “Processing for displaying a preview of a captured image” is associated with the case where the user touches with only the right hand, “processing for enlarging the image display” is associated with the case where the user touches the operation display unit 126 with only the left hand, and “processing for rating the image being reproduced” is associated with the case where the user touches the operation display unit 126 with both hands.
- the user can quickly and easily confirm the captured image.
- Further, when the user feels that an image has been taken well, the user can easily enlarge the image or rate it.
- the correspondence information of the application examples 1 to 3 described above may be preinstalled in the photographing apparatus 10-1.
- the user can also select any one of the three types of application examples as appropriate.
- In operation, the detection unit 124 first detects whether the eye is close to the EVF 122, and the detection result acquisition unit 104 acquires the detection result (S203).
- The operation display unit 126 also detects the touch state on itself, and the detection result acquisition unit 104 acquires this detection result (S205).
- the processing control unit 106 stores the detection result of whether or not the eye is close to the EVF 122 acquired in S203 and the combination of the detection result of the touch state acquired in S205 in association with each other.
- the process identification information stored in the unit 128 is extracted (S207).
- the process control unit 106 executes the process indicated by the extracted identification information (S209).
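The extract-and-dispatch flow of S207 and S209 can be sketched as a table lookup keyed on the two detection results. The state names and process identifiers below are illustrative assumptions, not values from the patent:

```python
# Map (eye_near_evf, touch_state) combinations to process identifiers (S207),
# then "execute" the associated process (S209). All names are illustrative.
ASSOCIATIONS = {
    (True,  "no_touch"):   "normal_shooting",
    (True,  "right_hand"): "magnify_focus_area",
    (False, "no_touch"):   "play_back_one_image",
    (False, "right_hand"): "preview_captured_image",
    (False, "both_hands"): "rate_current_image",
}

def execute_for(eye_near_evf: bool, touch_state: str) -> str:
    """Extract the process identifier stored for this combination (S207)."""
    process_id = ASSOCIATIONS.get((eye_near_evf, touch_state), "no_op")
    # A real device would invoke the handler registered under process_id here (S209).
    return process_id
```

The same table could equally be indexed by finer-grained states (touch position, gesture, and so on), as the later notes describe.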
- the imaging device 10-1 executes the process associated in advance with the combination of the detection result of whether an eye is close to the EVF 122 and the detection result of the touch state on the operation display unit 126.
- by associating in advance, for each combination of eye proximity to the EVF 122 and touch state on the operation display unit 126, a process suited to them, each user can perform shooting- and playback-related operations more easily.
- because the operation display unit 126 is placed where it is easy to touch while the user holds the imaging device 10-1, it can be operated more easily than on a known digital camera.
- the user can smoothly enlarge the focus area during shooting, preview a captured image, enlarge the image being played back, or rate the image being played back.
- FIG. 13 is an explanatory diagram showing part of the exterior of the imaging device 10-2 according to application example 1.
- the touch operation unit 130 is installed at a predetermined position on the outer periphery of the taking lens.
- for example, the touch operation unit 130 can be installed at the lower part of the outer periphery of the taking lens.
- the imaging device 10-2 can execute the process associated in advance with the combination of whether an eye is close to the EVF 122, the touch state on the operation display unit 126, and the touch state on the touch operation unit 130.
- FIG. 14 is a functional block diagram showing the configuration of the imaging device 10-2 according to application example 1. As shown in FIG. 14, the imaging device 10-2 further includes a touch operation unit 130 in addition to the configuration of the imaging device 10-1 shown in FIG. 4.
- the touch operation unit 130 is an operation unit that detects a touch operation by the user.
- the touch operation unit 130 may also have the function of a display unit that displays various kinds of information; it may be, for example, a touch panel.
- based on, for example, user input, the associating unit 102 associates each combination of whether an eye is close to the EVF 122, the touch state on the operation display unit 126, and the touch state on the touch operation unit 130 with a predetermined process related to shooting or image playback.
- the processing control unit 106 according to application example 1 executes the process associated with the combination of the detection results acquired by the detection result acquisition unit 104: whether an eye is close to the EVF 122, the touch state on the operation display unit 126, and the touch state on the touch operation unit 130.
- the cases where the user is not touching the operation display unit 126 at all and where the user touches it with the right hand only are each further subdivided according to whether the user is touching the touch operation unit 130. Accordingly, four states (that is, the states 20-2 in which the user is touching the touch operation unit 130) are added compared with the application examples according to the present embodiment described above.
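Application example 1 in effect widens the lookup key with a third component for the lens-barrel touch unit. A minimal sketch, with illustrative state names and process identifiers (none of them from the patent):

```python
# Extended lookup for application example 1: the key now also carries whether
# the touch operation unit 130 on the lens barrel is being touched.
def lookup(eye_near_evf: bool, touch_state: str, lens_unit_touched: bool) -> str:
    table = {
        (False, "no_touch",   False): "play_back_one_image",
        (False, "no_touch",   True):  "enlarge_displayed_image",
        (False, "right_hand", False): "preview_captured_image",
        (False, "right_hand", True):  "rate_current_image",
    }
    # Combinations without an entry fall through to a no-op.
    return table.get((eye_near_evf, touch_state, lens_unit_touched), "no_op")
```

Splitting two of the earlier states on the new boolean is what yields the four additional states 20-2.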
- FIG. 16 is an explanatory diagram showing part of the exterior of the imaging device 10-3 according to application example 2.
- the imaging device 10-3 according to application example 2 has a plurality of operation buttons 132 on its housing.
- each operation button 132 is arranged at a position where the user can press it while touching the operation display unit 126.
- for example, the user can press the operation button 132a with the right index finger while touching the operation display unit 126 with the right thumb.
- the imaging device 10-3 can execute the process associated in advance with the combination of whether an eye is close to the EVF 122, the touch state on the operation display unit 126, and the pressed state of the operation buttons 132.
- FIG. 17 is a functional block diagram showing the configuration of the imaging device 10-3 according to application example 2. As shown in FIG. 17, the imaging device 10-3 further includes operation buttons 132 in addition to the configuration of the imaging device 10-1 shown in FIG. 4.
- based on, for example, user input, the associating unit 102 associates each combination of whether an eye is close to the EVF 122, the touch state on the operation display unit 126, and whether an operation button 132 is pressed with a predetermined process related to shooting or image playback.
- the processing control unit 106 according to application example 2 executes the process associated with the combination of the detection results acquired by the detection result acquisition unit 104: whether an eye is close to the EVF 122, the touch state on the operation display unit 126, and the pressed state of the operation buttons 132.
- for example, the processing control unit 106 executes the process associated with the detection result of pressing an operation button 132 while the user is touching the operation display unit 126.
- each state is further subdivided according to whether the user is pressing an operation button 132. Accordingly, six states (that is, the states 20-3 in which the user presses an operation button 132) are added compared with the application examples according to the present embodiment described above.
- the applied examples of the present embodiment are not limited to application examples 1 and 2 described above.
- for example, the detection result of a grip sensor or touch sensor that may be installed on the imaging device 10-1 may be used instead.
- alternatively, the touch or pressed state of a button, focus ring, or the like attached to the taking lens may be used.
- the processing control unit 106 can execute the process associated with the detection results of these sensors, or with the touch or pressed state of these parts. For example, when the grip sensor detects that the user has released the imaging device 10-1, the processing control unit 106 may turn off the operation display unit 126.
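The grip-sensor example above amounts to a simple power rule for the operation display unit 126. A sketch under the assumption that the display state is a boolean and that releasing the grip only ever turns the display off:

```python
def operation_display_power(grip_detected: bool, currently_on: bool) -> bool:
    """Turn the operation display unit off when the grip sensor no longer
    detects the user's hand; otherwise leave the display state unchanged."""
    if not grip_detected:
        return False
    return currently_on
```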
- FIG. 19 is an explanatory diagram showing a configuration example of the control system according to modification 1.
- the control system according to modification 1 includes a television receiver 30, a camera 32, and a portable terminal 40.
- the camera 32 is installed on the television receiver 30 and captures images of the area in front of the television receiver 30.
- the television receiver 30 or the portable terminal 40 can determine, based on the images captured by the camera 32, whether a person located in front of the television receiver 30 is watching it.
- the portable terminal 40 is a portable terminal having a touch display.
- the portable terminal 40 may be, for example, a smartphone or a tablet terminal.
- when it is detected that the user is watching the television receiver 30 and touching the touch display of the portable terminal 40, the portable terminal 40 does not display an operation screen on the touch display (that is, it is set to a touch-pad mode).
- when the user is looking at the touch display of the portable terminal 40, the portable terminal 40 displays an operation screen on the touch display (that is, it is set to a touch-panel mode). Whether the user is looking at the touch display of the portable terminal 40 may be determined, for example, based on video captured by a camera installed on the portable terminal 40, based on video captured by the camera 32, or based on both kinds of captured video.
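The mode selection described above can be sketched as a small decision function. The mode names and the default branch (touch-panel mode whenever the touch-pad condition does not hold) are assumptions for illustration:

```python
def select_display_mode(watching_tv: bool, touching_display: bool) -> str:
    """Hide the operation screen (touch-pad mode) while the user watches the
    television receiver 30 and touches the terminal's display; otherwise show
    the operation screen (touch-panel mode)."""
    if watching_tv and touching_display:
        return "touch_pad"    # display acts as a blank pointing surface
    return "touch_panel"      # operation screen shown on the touch display
```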
- the present disclosure is also applicable to medical uses, and the control device in the present disclosure may be a medical apparatus such as an advanced microscope.
- the present disclosure is applicable, for example, to a scene in which the user operates a touch display as a touch pad while keeping an eye close to a microscope or endoscope.
- the medical apparatus may change the focus, change the zoom, or display shooting-related setting information in response to touch operations on the touch display.
- examples in which the control device according to the present disclosure is one of the imaging devices 10-1 to 10-3 have been described.
- however, the present disclosure is not limited to such examples.
- the control device in the present disclosure may be a mobile phone such as a smartphone, a tablet terminal, a PC, or a game console.
- hardware such as a CPU, ROM, and RAM can provide functions equivalent to the configurations of the imaging devices 10-1 to 10-3 according to the embodiment described above.
- a computer program for that purpose can also be provided.
- a recording medium on which the computer program is recorded is also provided.
- (1) A control device including: a processing control unit that executes processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
- (2) The control device according to (1), wherein the operation state is distinguished by a combination of the numbers of touch operations performed simultaneously on the operation display unit.
- (3) The control device according to (1) or (2), wherein the operation state is distinguished according to the touch position on the operation display unit.
- (4) The control device according to (3), wherein the operation state is distinguished by which of a plurality of regions predetermined on the operation display unit contains the touch position.
- (14) The control device according to any one of (1) to (13), wherein, when a predetermined operation on the operation display unit is detected and proximity of an eye to the finder is not detected, the processing control unit executes the processing related to shooting or image playback.
- (15) The control device according to (14), wherein, when the predetermined operation on the operation display unit is detected and proximity of an eye to the finder is detected, the processing control unit does not execute the processing related to shooting or image playback.
- (16) The control device according to any one of (1) to (15), wherein, when a predetermined operation on the operation display unit is detected, the processing control unit executes the same processing whether or not proximity of an eye to the finder is detected.
- (17) The control device according to any one of (1) to (16), wherein the processing control unit further executes the processing related to shooting or image playback based on detection of an operation on a touch operation unit on the outer periphery of the taking lens.
- (18) The control device according to any one of (1) to (17), wherein the processing control unit further executes the processing related to shooting or image playback based on detection of pressing of an operation button provided on the imaging device.
- (19) A control method including: executing processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
- (20) A program for causing a computer to function as: a processing control unit that executes processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Details Of Cameras Including Film Mechanisms (AREA)
- Camera Bodies And Camera Details Or Accessories (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Automatic Focus Adjustment (AREA)
- Exposure Control For Cameras (AREA)
- Focusing (AREA)
- Viewfinders (AREA)
Abstract
Description
1. Basic configuration of the imaging device 10-1
2. Detailed description of the embodiment
3. Application examples
4. Modifications
<1-1. Basic configuration>
First, the basic configuration of the imaging device 10-1 according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram showing a user taking a picture with the imaging device 10-1.
Digital cameras equipped with an EVF have long been in widespread use. Here, to clearly show the features of the present disclosure, an example of a shooting and playback workflow using a known digital camera is described with reference to FIGS. 2 and 3.
As described above, a known digital camera in principle assigns a different button to each function, so to execute a desired function the user must press the corresponding button. For example, in the workflow described above, the user may have to press a separate button in each of S905, S915, S923, S927, S931, S935, and S937.
<2-1. Configuration>
Next, the configuration of the imaging device 10-1 according to the present embodiment will be described in detail. FIG. 4 is a functional block diagram showing the configuration of the imaging device 10-1 according to the present embodiment. As shown in FIG. 4, the imaging device 10-1 includes a control unit 100, an imaging unit 120, an EVF 122, a detection unit 124, an operation display unit 126, and a storage unit 128. Descriptions that overlap with the above are omitted below.
The control unit 100 controls the overall operation of the imaging device 10-1 using hardware built into the imaging device 10-1, such as a CPU (Central Processing Unit), ROM (Read Only Memory), and RAM (Random Access Memory). As shown in FIG. 4, the control unit 100 includes an associating unit 102, a detection result acquisition unit 104, and a processing control unit 106.
Based on, for example, user input, the associating unit 102 associates each combination of whether an eye is close to the EVF 122 and the user's operation state on the operation display unit 126 with a predetermined process related to shooting or image playback. This lets the user assign a desired process to each such combination. Here, the operation state is the state of a touch operation on the operation display unit 126, or of a proximity operation (an operation based on a proximity determination with respect to the operation display unit 126). In the following, an example in which the operation state is a touch operation state is described.
An example of the association performed by the associating unit 102 will now be given. For example, a process related to shooting may be associated with the case where an eye is close to the EVF 122 — for instance, a process that changes a shooting parameter. Processes that change a shooting parameter include, for example, applying an ND filter, activating or switching a low-pass filter, activating an auto-HDR function, switching between autofocus and manual focus, changing the shutter speed, adjusting the aperture, changing the ISO sensitivity, and adjusting the white balance. Alternatively, a process that displays a histogram on the EVF 122 may be associated with this case.
Alternatively, a multi-camera control function may be associated with the case where the user touches the operation display unit 126 with either the left or the right hand. For example, touching the operation display unit 126 with either hand may be associated with controlling shooting on another imaging device 10-1b, controlling transmission of data such as images to the other imaging device 10-1b, or displaying a captured image received from the other imaging device 10-1b on the operation display unit 126.
Alternatively, as shown in FIG. 8(A), a process that deletes the image being played back may be associated with a top-to-bottom swipe on the operation display unit 126 with the right hand only; this association lets the user delete an image intuitively. Likewise, as shown in FIG. 8(B), a process that rates the image being played back may be associated with a top-to-bottom swipe with both hands. This rating may be such that the longer the distance traced on the operation display unit 126, the higher the score given.
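The distance-dependent rating above can be sketched as a mapping from traced distance to score. The linear mapping, the 1-to-5 score range, and the parameter names are illustrative assumptions:

```python
def rating_from_swipe(trace_length_px: float,
                      display_height_px: float,
                      max_score: int = 5) -> int:
    """Map the distance traced on the operation display unit to a rating:
    longer swipes give higher scores, clamped to the range 1..max_score."""
    ratio = max(0.0, min(1.0, trace_length_px / display_height_px))
    return max(1, round(ratio * max_score))
```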
The associating unit 102 can store the association results in the storage unit 128. For example, the associating unit 102 stores in the storage unit 128 every pair consisting of a combination (whether an eye is close to the EVF 122, and the touch state on the operation display unit 126) and the identification information of the associated process.
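A minimal sketch of this record-keeping, using an in-memory dict in place of the storage unit 128 (the class and method names are assumptions, not from the patent):

```python
class AssociationStore:
    """Sketch of the associating unit 102: records each combination of eye
    proximity and touch state together with the identification information
    of the associated process."""

    def __init__(self) -> None:
        self._table: dict = {}  # stands in for the storage unit 128

    def associate(self, eye_near: bool, touch_state: str, process_id: str) -> None:
        self._table[(eye_near, touch_state)] = process_id

    def lookup(self, eye_near: bool, touch_state: str):
        return self._table.get((eye_near, touch_state))
```

A user-configurable device would call `associate` from its settings screen and `lookup` from the processing control path.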
The detection result acquisition unit 104 acquires, from the detection unit 124, information on whether an eye is close to the EVF 122 as detected by the detection unit 124. It also acquires, from the operation display unit 126, the detection result of the touch state on the operation display unit 126.
The processing control unit 106 executes the process associated with the combination of the detection results acquired by the detection result acquisition unit 104: whether an eye is close to the EVF 122, and the touch state on the operation display unit 126. For example, the processing control unit 106 first extracts from the storage unit 128 the identification information of the process stored in association with the eye-proximity detection result and the detected touch state, and then executes the process indicated by the extracted identification information.
The imaging unit 120 captures an image by forming an external scene, through a lens, onto an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
The detection unit 124 detects, among other things, how the user is using the imaging device 10-1. For example, it detects whether an eye is close to the EVF 122 using infrared light or the like. As one example, the detection unit 124 determines that an eye is close to the EVF 122 when an infrared sensor detects an object near the EVF 122; that is, it need not determine whether the object close to the EVF 122 is actually an eye.
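The object-only proximity test described above reduces to a threshold check on the sensor reading. The 30 mm threshold and the idea of a millimetre-valued reading are illustrative assumptions; the patent specifies neither:

```python
def eye_near_evf(ir_reading_mm: float, threshold_mm: float = 30.0) -> bool:
    """Detection unit 124 sketch: report eye proximity whenever the infrared
    sensor sees *any* object within range of the EVF. No attempt is made to
    verify that the detected object is an eye."""
    return ir_reading_mm <= threshold_mm
```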
The storage unit 128 stores various data and software. For example, it stores information indicating the results of the association performed by the associating unit 102.
The configuration according to the present embodiment has been described above. Next, application examples according to the present embodiment are described in "2-2-1. Application example 1" through "2-2-3. Application example 3". The following describes example associations for the case where the touch state is classified into four types according to whether each of the user's two hands is touching the operation display unit 126. The associations described below are merely examples; they are not limiting, and arbitrary associations are possible.
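The four-way touch-state classification can be sketched as follows. How each touch point is attributed to the left or right hand is device-specific and left open here; the state names are illustrative:

```python
def classify_touch_state(left_touching: bool, right_touching: bool) -> str:
    """Classify the touch state on the operation display unit into the four
    types used by the application examples, based on which of the user's two
    hands is touching."""
    if left_touching and right_touching:
        return "both_hands"
    if left_touching:
        return "left_hand_only"
    if right_touching:
        return "right_hand_only"
    return "no_touch"
```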
First, application example 1 is described with reference to FIG. 9. In application example 1, processes are associated with each state with emphasis on checking the focus area and checking the image being played back.
Next, application example 2 is described with reference to FIG. 10. In application example 2, processes related to playback are associated with all cases where the user is not looking into the EVF 122. As shown in FIG. 10, the cases where the user is looking into the EVF 122 are associated, for example, in the same way as in application example 1 (shown in FIG. 9).
Next, application example 3 is described with reference to FIG. 11. In application example 3, a process that displays a preview of the captured image is associated with the case where the user touches the operation display unit 126 with one hand (the right hand), regardless of whether an eye is close to the EVF 122.
Application examples 1 to 3 need not be applied separately and may be combined as appropriate. For example, FIG. 9 associates state 20-1f with "a process that displays shooting-related setting information on the operation display unit 126", but "a process that displays a preview of the captured image" could be associated instead. The user can also change, as appropriate, the type of process associated with each individual state 20-1 in application examples 1 to 3.
The application examples according to the present embodiment have been described above. Next, the operation according to the present embodiment is described with reference to FIG. 12. As shown in FIG. 12, the user first turns on the imaging device 10-1, for example by pressing the power button (S201).
As described above, the imaging device 10-1 according to the present embodiment executes the process associated in advance with the combination of the detection result of whether an eye is close to the EVF 122 and the detection result of the touch state on the operation display unit 126. Consequently, by associating in advance a process suited to them with each such combination, each user can perform shooting- and playback-related operations more easily.
Next, applied examples of the present embodiment are described in "3-1. Application example 1" and "3-2. Application example 2". Descriptions that overlap with the embodiment above are omitted below.
First, application example 1 is described. FIG. 13 is an explanatory diagram showing part of the exterior of the imaging device 10-2 according to application example 1. As shown in FIG. 13, in the imaging device 10-2 according to application example 1, a touch operation unit 130 is installed at a predetermined position on the outer periphery of the taking lens — for example, at the lower part of the lens's outer periphery.
FIG. 14 is a functional block diagram showing the configuration of the imaging device 10-2 according to application example 1. As shown in FIG. 14, the imaging device 10-2 further includes a touch operation unit 130 in addition to the configuration of the imaging device 10-1 shown in FIG. 4.
The configuration according to application example 1 has been described above. Next, an applied example of application example 1 is described with reference to FIG. 15.
Next, application example 2 is described. FIG. 16 is an explanatory diagram showing part of the exterior of the imaging device 10-3 according to application example 2. As shown in FIG. 16, the imaging device 10-3 according to application example 2 has a plurality of operation buttons 132 on its housing. Each operation button 132 is arranged at a position where the user can press it while touching the operation display unit 126. For example, in the case shown in FIG. 16(A), the user can press the operation button 132a with the right index finger while touching the operation display unit 126 with the right thumb. In the case shown in FIG. 16(B), the user can press the operation button 132b with a finger of the right hand while holding the imaging device 10-3 with the left hand and touching the operation display unit 126 with the left hand.
FIG. 17 is a functional block diagram showing the configuration of the imaging device 10-3 according to application example 2. As shown in FIG. 17, the imaging device 10-3 further includes operation buttons 132 in addition to the configuration of the imaging device 10-1 shown in FIG. 4.
The configuration according to application example 2 has been described above. Next, an applied example of application example 2 is described with reference to FIG. 18.
The applied examples of the present embodiment are not limited to application examples 1 and 2 described above. For example, instead of the touch state on the touch operation unit 130 (application example 1) or the pressed state of the operation buttons 132 (application example 2), the detection result of a grip sensor or touch sensor that may be installed on the imaging device 10-1 may be used. Alternatively, the touch or pressed state of a button, focus ring, or the like attached to the taking lens may be used.
Preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the present disclosure is not limited to these examples. It is clear that a person having ordinary knowledge in the technical field to which the present disclosure belongs can conceive of various changes and modifications within the scope of the technical ideas described in the claims, and these are naturally understood to belong to the technical scope of the present disclosure.
[4-1-1. Configuration]
For example, the embodiment above described applied examples relating to shooting or playback using the imaging devices 10-1 to 10-3, but the present disclosure is not limited to these examples. For instance, the present disclosure is also applicable to remote operation of a television receiver using a portable terminal. FIG. 19 is an explanatory diagram showing a configuration example of the control system according to modification 1. As shown in FIG. 19, the control system according to modification 1 includes a television receiver 30, a camera 32, and a portable terminal 40.
The description above used touch operations on the portable terminal 40 as the operation method of modification 1, but the method is not limited to this example. For example, it may be possible to operate the television receiver 30 remotely with gestures that move the portable terminal 40 itself through space, like a pointing device. In that case the gesture may be recognized, for example, based on the images captured by the camera 32, or by some other technique.
Alternatively, the present disclosure is also applicable to medical uses, and the control device in the present disclosure may be a medical apparatus such as an advanced microscope. For example, the present disclosure is applicable to a scene in which the user operates a touch display as a touch pad while keeping an eye close to a microscope or endoscope. As one example, the medical apparatus may change the focus, change the zoom, or display shooting-related setting information in response to touch operations on the touch display.
The embodiment above described examples in which the control device of the present disclosure is one of the imaging devices 10-1 to 10-3, but the present disclosure is not limited to these examples. For instance, the control device in the present disclosure may be a mobile phone such as a smartphone, a tablet terminal, a PC, or a game console.
(1)
A control device including:
a processing control unit that executes processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
(2)
The control device according to (1), wherein the operation state is distinguished by a combination of the numbers of touch operations performed simultaneously on the operation display unit.
(3)
The control device according to (1) or (2), wherein the operation state is distinguished according to the touch position on the operation display unit.
(4)
The control device according to (3), wherein the operation state is distinguished by which of a plurality of regions predetermined on the operation display unit contains the touch position.
(5)
The control device according to any one of (1) to (4), wherein the operation state is further distinguished by a gesture operation on the operation display unit.
(6)
The control device according to (5), wherein the gesture operation includes a tracing operation on the operation display unit.
(7)
The control device according to any one of (1) to (6), wherein, when proximity of an eye to the finder is detected, the processing control unit executes processing related to shooting in accordance with the operation state.
(8)
The control device according to (7), wherein the processing related to shooting is processing that changes a shooting parameter.
(9)
The control device according to (7), wherein the processing related to shooting is processing that causes the finder to display a guide for determining composition.
(10)
The control device according to (7), wherein the processing related to shooting is processing that changes the display mode of a partial region centered on the in-focus position on the subject.
(11)
The control device according to (7), wherein the processing related to shooting is processing that causes the finder to display an image immediately after it is captured.
(12)
The control device according to any one of (1) to (11), wherein, when proximity of an eye to the finder is not detected, the processing control unit controls execution of processing related to shooting in accordance with the operation state.
(13)
The control device according to any one of (1) to (11), wherein, when proximity of an eye to the finder is not detected, the processing control unit controls execution of processing related to image playback in accordance with the operation state.
(14)
The control device according to any one of (1) to (13), wherein, when a predetermined operation on the operation display unit is detected and proximity of an eye to the finder is not detected, the processing control unit executes the processing related to shooting or image playback.
(15)
The control device according to (14), wherein, when the predetermined operation on the operation display unit is detected and proximity of an eye to the finder is detected, the processing control unit does not execute the processing related to shooting or image playback.
(16)
The control device according to any one of (1) to (15), wherein, when a predetermined operation on the operation display unit is detected, the processing control unit executes the same processing whether or not proximity of an eye to the finder is detected.
(17)
The control device according to any one of (1) to (16), wherein the processing control unit further executes the processing related to shooting or image playback based on detection of an operation on a touch operation unit on the outer periphery of the taking lens.
(18)
The control device according to any one of (1) to (17), wherein the processing control unit further executes the processing related to shooting or image playback based on detection of pressing of an operation button provided on the imaging device.
(19)
A control method including:
executing processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
(20)
A program for causing a computer to function as:
a processing control unit that executes processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
30 Television receiver
32 Camera
40 Portable terminal
100 Control unit
102 Associating unit
104 Detection result acquisition unit
106 Processing control unit
120 Imaging unit
122 EVF
124 Detection unit
126 Operation display unit
128 Storage unit
130 Touch operation unit
132 Operation button
Claims (20)
- A control device including: a processing control unit that executes processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
- The control device according to claim 1, wherein the operation state is distinguished by a combination of the numbers of touch operations performed simultaneously on the operation display unit.
- The control device according to claim 1, wherein the operation state is distinguished according to the touch position on the operation display unit.
- The control device according to claim 3, wherein the operation state is distinguished by which of a plurality of regions predetermined on the operation display unit contains the touch position.
- The control device according to claim 1, wherein the operation state is further distinguished by a gesture operation on the operation display unit.
- The control device according to claim 5, wherein the gesture operation includes a tracing operation on the operation display unit.
- The control device according to claim 1, wherein, when proximity of an eye to the finder is detected, the processing control unit executes processing related to shooting in accordance with the operation state.
- The control device according to claim 7, wherein the processing related to shooting is processing that changes a shooting parameter.
- The control device according to claim 7, wherein the processing related to shooting is processing that causes the finder to display a guide for determining composition, or setting information related to shooting.
- The control device according to claim 7, wherein the processing related to shooting is processing that changes the display mode of a partial region centered on the in-focus position on the subject.
- The control device according to claim 7, wherein the processing related to shooting is processing that causes the finder to display an image immediately after it is captured.
- The control device according to claim 1, wherein, when proximity of an eye to the finder is not detected, the processing control unit controls execution of processing related to shooting in accordance with the operation state.
- The control device according to claim 1, wherein, when proximity of an eye to the finder is not detected, the processing control unit controls execution of processing related to image playback in accordance with the operation state.
- The control device according to claim 1, wherein, when a predetermined operation on the operation display unit is detected and proximity of an eye to the finder is not detected, the processing control unit executes the processing related to shooting or image playback.
- The control device according to claim 14, wherein, when the predetermined operation on the operation display unit is detected and proximity of an eye to the finder is detected, the processing control unit does not execute the processing related to shooting or image playback.
- The control device according to claim 1, wherein, when a predetermined operation on the operation display unit is detected, the processing control unit executes the same processing whether or not proximity of an eye to the finder is detected.
- The control device according to claim 1, wherein the processing control unit further executes the processing related to shooting or image playback based on detection of an operation on a touch operation unit on the outer periphery of the taking lens.
- The control device according to claim 1, wherein the processing control unit further executes the processing related to shooting or image playback based on detection of pressing of an operation button provided on the imaging device.
- A control method including: executing processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
- A program for causing a computer to function as: a processing control unit that executes processing related to shooting or image playback based on whether proximity of an eye to a finder is detected and on detection of an operation state on an operation display unit.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112016005248.0T DE112016005248T5 (de) | 2015-11-16 | 2016-09-01 | Steuervorrichtung, Steuerverfahren und Programm |
JP2017551556A JP6777091B2 (ja) | 2015-11-16 | 2016-09-01 | 制御装置、制御方法、およびプログラム |
US15/772,628 US11137666B2 (en) | 2015-11-16 | 2016-09-01 | Control device and control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015223816 | 2015-11-16 | ||
JP2015-223816 | 2015-11-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017085982A1 true WO2017085982A1 (ja) | 2017-05-26 |
Family
ID=58719190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/075730 WO2017085982A1 (ja) | 2015-11-16 | 2016-09-01 | 制御装置、制御方法、およびプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US11137666B2 (ja) |
JP (1) | JP6777091B2 (ja) |
DE (1) | DE112016005248T5 (ja) |
WO (1) | WO2017085982A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190014246A1 (en) * | 2017-07-10 | 2019-01-10 | Canon Kabushiki Kaisha | Image capturing apparatus with operation members provided on different sides, control method of the same, and storage medium |
JP2019200311A (ja) * | 2018-05-16 | 2019-11-21 | キヤノン株式会社 | 撮像装置及びその制御方法及びプログラム |
WO2022181055A1 (ja) * | 2021-02-26 | 2022-09-01 | 富士フイルム株式会社 | 撮像装置、情報処理方法、及びプログラム |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109690269B (zh) * | 2016-09-14 | 2022-04-29 | 索尼公司 | 传感器、输入装置和电子设备 |
JP2021086427A (ja) * | 2019-11-28 | 2021-06-03 | キヤノン株式会社 | 課金システム、課金方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008268726A (ja) * | 2007-04-24 | 2008-11-06 | Canon Inc | 撮影装置 |
JP2013057837A (ja) * | 2011-09-09 | 2013-03-28 | Olympus Imaging Corp | カメラシステムおよび交換レンズ |
JP2013078075A (ja) * | 2011-09-30 | 2013-04-25 | Olympus Imaging Corp | 撮像装置、撮像方法およびプログラム |
JP2014038195A (ja) * | 2012-08-15 | 2014-02-27 | Olympus Imaging Corp | 撮影機器 |
JP2015045764A (ja) * | 2013-08-28 | 2015-03-12 | オリンパス株式会社 | 撮像装置、撮像方法およびプログラム |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5058607B2 (ja) * | 2006-02-08 | 2012-10-24 | キヤノン株式会社 | 撮像装置、その制御方法、及びプログラム |
JP5326802B2 (ja) * | 2009-05-19 | 2013-10-30 | ソニー株式会社 | 情報処理装置、画像拡大縮小方法及びそのプログラム |
US8416333B2 (en) * | 2009-08-31 | 2013-04-09 | Panasonic Corporation | Imaging apparatus |
JP5457217B2 (ja) * | 2010-02-02 | 2014-04-02 | オリンパスイメージング株式会社 | カメラ |
US9001255B2 (en) | 2011-09-30 | 2015-04-07 | Olympus Imaging Corp. | Imaging apparatus, imaging method, and computer-readable storage medium for trimming and enlarging a portion of a subject image based on touch panel inputs |
JP5830564B2 (ja) | 2014-04-09 | 2015-12-09 | オリンパス株式会社 | 撮像装置および撮像装置におけるモード切換え方法 |
-
2016
- 2016-09-01 US US15/772,628 patent/US11137666B2/en active Active
- 2016-09-01 DE DE112016005248.0T patent/DE112016005248T5/de not_active Withdrawn
- 2016-09-01 JP JP2017551556A patent/JP6777091B2/ja active Active
- 2016-09-01 WO PCT/JP2016/075730 patent/WO2017085982A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008268726A (ja) * | 2007-04-24 | 2008-11-06 | Canon Inc | 撮影装置 |
JP2013057837A (ja) * | 2011-09-09 | 2013-03-28 | Olympus Imaging Corp | カメラシステムおよび交換レンズ |
JP2013078075A (ja) * | 2011-09-30 | 2013-04-25 | Olympus Imaging Corp | 撮像装置、撮像方法およびプログラム |
JP2014038195A (ja) * | 2012-08-15 | 2014-02-27 | Olympus Imaging Corp | 撮影機器 |
JP2015045764A (ja) * | 2013-08-28 | 2015-03-12 | オリンパス株式会社 | 撮像装置、撮像方法およびプログラム |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190014246A1 (en) * | 2017-07-10 | 2019-01-10 | Canon Kabushiki Kaisha | Image capturing apparatus with operation members provided on different sides, control method of the same, and storage medium |
JP2019016299A (ja) * | 2017-07-10 | 2019-01-31 | キヤノン株式会社 | 異なる面に配置された操作部材を有する電子機器、その制御方法、プログラム並びに記憶媒体 |
US10652442B2 (en) * | 2017-07-10 | 2020-05-12 | Canon Kabushiki Kaisha | Image capturing apparatus with operation members provided on different sides, control method of the same, and storage medium |
JP2019200311A (ja) * | 2018-05-16 | 2019-11-21 | キヤノン株式会社 | 撮像装置及びその制御方法及びプログラム |
JP7026572B2 (ja) | 2018-05-16 | 2022-02-28 | キヤノン株式会社 | 撮像装置及びその制御方法及びプログラム |
WO2022181055A1 (ja) * | 2021-02-26 | 2022-09-01 | 富士フイルム株式会社 | 撮像装置、情報処理方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
DE112016005248T5 (de) | 2018-08-16 |
US20190121220A1 (en) | 2019-04-25 |
US11137666B2 (en) | 2021-10-05 |
JPWO2017085982A1 (ja) | 2018-08-30 |
JP6777091B2 (ja) | 2020-10-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10165189B2 (en) | Electronic apparatus and a method for controlling the same | |
JP6777091B2 (ja) | 制御装置、制御方法、およびプログラム | |
JP5185150B2 (ja) | 携帯機器および操作制御方法 | |
US20170142328A1 (en) | Electronic apparatus and control method thereof | |
US10158800B2 (en) | Display control device and control method therefor | |
WO2017085983A1 (ja) | 制御装置、制御方法、およびプログラム | |
US11039073B2 (en) | Electronic apparatus and method for controlling the same | |
KR102125741B1 (ko) | 촬상 장치 및 그 제어 방법 | |
KR20110029000A (ko) | 디지털 촬영 장치 및 그 제어 방법 | |
US11082608B2 (en) | Electronic apparatus, method, and storage medium | |
US9671932B2 (en) | Display control apparatus and control method thereof | |
JP2017123515A (ja) | 電子機器およびその制御方法 | |
US9325902B2 (en) | Image capturing apparatus and control method therefor | |
JP2014228629A (ja) | 撮像装置、その制御方法、およびプログラム、並びに記憶媒体 | |
US20160381281A1 (en) | Image capture control apparatus and control method of the same | |
CN112015266A (zh) | 电子设备、电子设备的控制方法及计算机可读存储介质 | |
US9621809B2 (en) | Display control apparatus and method for controlling the same | |
JP2020204915A (ja) | 電子機器およびその制御方法 | |
US10437330B2 (en) | Gaze detection, identification and control method | |
US9277117B2 (en) | Electronic apparatus including a touch panel and control method thereof for reducing erroneous operation | |
JP6971696B2 (ja) | 電子機器およびその制御方法 | |
JP2013017088A (ja) | 撮像装置、その制御方法、および制御プログラム、並びに記録媒体 | |
US11252330B2 (en) | Display control apparatus and control method therefor | |
JP6393296B2 (ja) | 撮像装置及びその制御方法、撮像制御装置、プログラム、並びに記憶媒体 | |
US9088762B2 (en) | Image capturing apparatus and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16865985 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017551556 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112016005248 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16865985 Country of ref document: EP Kind code of ref document: A1 |