WO2015052974A1 - Information processing device, imaging device, imaging system, control method for an information processing device, control method for an imaging device, and program - Google Patents
- Publication number
- WO2015052974A1 (PCT/JP2014/069547, JP2014069547W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- imaging
- processing apparatus
- imaging device
- display
- Prior art date
Classifications
- G06F3/0487 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F1/1626 — Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
- G06F1/1656 — Details related to functional adaptations of the enclosure, e.g. to provide protection against EMI, shock, water, or to host detachable peripherals like a mouse or removable expansions units like PCMCIA cards, or to provide access to internal components for maintenance or to removable storage supports like CDs or DVDs, or to mechanically mount accessories
- G06F1/1686 — Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
- G06F1/1694 — Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
- G06F1/1698 — Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a sending/receiving arrangement to establish a cordless communication link, e.g. radio or infrared link, integrated cellular phone
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/0393 — Accessories for touch pads or touch screens, e.g. mechanical guides added to touch screens for drawing straight lines, hard keys overlaying touch screens or touch pads
- G06F3/04817 — Interaction techniques based on graphical user interfaces [GUI] using icons
- G06F3/0482 — Interaction with lists of selectable items, e.g. menus
- H04N23/62 — Control of parameters via user interfaces
- H04N23/631 — Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/66 — Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661 — Transmitting camera control signals through networks, e.g. control via the Internet
- H04N23/667 — Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
- G06F2200/1614 — Image rotation following screen orientation, e.g. switching from landscape to portrait mode
- G06F2203/0384 — Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
- G06F3/044 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by capacitive means
Description
- This technology relates to an information processing apparatus and an imaging apparatus. More specifically, it relates to an information processing apparatus, an imaging apparatus, an imaging system, control methods thereof, and a program for causing a computer to execute those methods.
- Imaging devices such as digital still cameras and digital video cameras (for example, camera-integrated recorders), which capture a subject, generate an image (image data), and record that image as content, have become widespread.
- In addition, wireless communication technologies for exchanging various types of data using wireless communication are in use.
- With such wireless communication technology, an imaging device can be operated from an electronic device. For example, even when the imaging device and the electronic device are separated from each other, the imaging device can be operated using the electronic device.
- It is also conceivable to attach the imaging apparatus to an information processing apparatus capable of operating it via wireless communication, and to perform the imaging operation in that state.
- In this case, the usage mode of the imaging apparatus and the information processing apparatus differs from the case where the imaging apparatus is installed at a location away from the information processing apparatus. For this reason, when performing an imaging operation using an imaging device and an information processing device, it is important to provide an appropriate user interface according to the usage mode.
- the present technology has been created in view of such a situation, and an object thereof is to provide an appropriate user interface corresponding to the usage mode of the apparatus.
- The present technology has been made to solve the above-described problems. A first aspect thereof is an information processing apparatus including a control unit that performs control for switching the display mode of a display screen for operating an imaging device based on a relative positional relationship with the imaging device, along with a control method thereof and a program that causes a computer to execute the method. This brings about the effect of switching the display mode of the display screen based on the relative positional relationship with the imaging device.
- In the first aspect, the control unit may perform control to switch the display mode of the display screen based on the distance between the information processing apparatus and the imaging apparatus. This brings about the effect of switching the display mode of the display screen based on that distance.
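The distance-based switching described above can be illustrated with a minimal sketch (not part of the publication; the threshold value and mode names are assumptions for illustration):

```python
# Illustrative sketch: choose a display mode from the measured distance
# between the information processing apparatus and the imaging apparatus.
# The threshold (hypothetical) separates a "near" UI, used when the devices
# are close or attached, from a "remote" UI with larger on-screen controls.

NEAR_THRESHOLD_M = 0.3  # hypothetical threshold, in meters


def select_display_mode(distance_m: float) -> str:
    """Return the display mode for the given apparatus-to-apparatus distance."""
    if distance_m < 0:
        raise ValueError("distance must be non-negative")
    return "near_ui" if distance_m <= NEAR_THRESHOLD_M else "remote_ui"
```

For example, a measured distance of 0.1 m would select the near UI, while 1.5 m would select the remote UI.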
- In the first aspect, the control unit may perform control to switch the display mode of the display screen based on whether or not the imaging device is attached to the information processing device. This brings about the effect of switching the display mode of the display screen based on whether or not the imaging device is attached.
- In the first aspect, the control unit may perform control to switch the display mode of the display screen based on whether or not the imaging device is mounted on the display surface of the information processing device. This brings about the effect of switching the display mode of the display screen based on whether or not the imaging device is mounted on the display surface.
- In the first aspect, when the imaging device is mounted on the display surface of the information processing device, the control unit may perform control to switch the display mode of the display screen based on the position of the imaging device on the display surface. Thereby, when the imaging device is mounted on the display surface, the display mode of the display screen is switched based on the position of the imaging device on that surface.
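As an illustrative sketch of such position-based switching (not from the publication; the half-screen layout rule and all names are assumptions), the operation controls might be laid out in the screen region not occupied by the mounted camera:

```python
# Illustrative sketch: when the imaging device sits on the display surface,
# draw the operation controls in the half of the screen farther from it.
# The split into screen halves is purely an assumption for illustration.

def controls_region(screen_width: int, camera_x: int) -> tuple[int, int]:
    """Return the (start_x, end_x) range in which controls should be drawn,
    i.e. the half of the screen farther from the mounted camera."""
    mid = screen_width // 2
    if camera_x < mid:
        return (mid, screen_width)  # camera on the left: controls on the right
    return (0, mid)                 # camera on the right: controls on the left
```

With a 1000-pixel-wide screen, a camera mounted near x = 100 would push the controls into the right half.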
- In the first aspect, the control unit may display the display screen including an operation target for operating the imaging device, and may perform control to change the display mode of the operation target based on the relative positional relationship. Thereby, a display screen including an operation target for operating the imaging device is displayed, and the display mode of the operation target is changed based on the relative positional relationship.
- In the first aspect, the control unit may display the display screen including an operation target for operating the imaging device, change the display mode of the operation target based on a change in the attitude of the information processing device while the imaging device is not attached to the information processing device, and refrain from changing the display mode of the operation target based on attitude changes while the imaging device is attached. Thereby, the display mode of the operation target follows attitude changes of the information processing device only while the imaging device is detached.
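This attachment-gated behavior can be sketched as follows (a hypothetical illustration; the angle-based operation target and the function name are assumptions, not taken from the publication):

```python
# Illustrative sketch: rotate an on-screen operation target to follow the
# device posture only while the imaging device is detached; when it is
# attached, posture changes are ignored and the target stays put.

def update_operation_target_angle(current_angle: float,
                                  posture_delta: float,
                                  device_attached: bool) -> float:
    """Return the operation target's new display angle in degrees."""
    if device_attached:
        return current_angle  # attached: do not follow posture changes
    return (current_angle + posture_delta) % 360.0
```

So a 20-degree posture change moves the target only in the detached state; in the attached state the angle is unchanged.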
- A second aspect of the present technology is an imaging apparatus including a control unit that performs control related to an imaging operation based on an operation input performed on an information processing apparatus displaying a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus, along with a control method thereof and a program that causes a computer to execute the method. This brings about the effect that control related to the imaging operation is performed based on an operation input made on such a display screen.
- In the second aspect, the display mode of the display screen may be switched based on the distance between the information processing apparatus and the imaging apparatus. This brings about the effect of switching the display mode of the display screen based on that distance.
- the display mode of the display screen may be switched based on whether or not the imaging device is attached to the information processing device. This brings about the effect that the display mode of the display screen is switched based on whether or not the imaging device is attached to the information processing device.
- the display mode of the display screen may be switched based on whether or not the imaging device is mounted on the display surface of the information processing device. This brings about the effect that the display mode of the display screen is switched based on whether or not the imaging device is mounted on the display surface of the information processing device.
- In the second aspect, when the imaging device is mounted on the display surface of the information processing device, the display mode of the display screen may be switched based on the position of the imaging device on the display surface. Thereby, when the imaging device is mounted on the display surface, the display mode of the display screen is switched based on the position of the imaging device on that surface.
- In the second aspect, the information processing apparatus may display the display screen including an operation target for operating the imaging device, and may change the display mode of the operation target based on the relative positional relationship. As a result, the information processing apparatus displays a display screen including an operation target for operating the imaging apparatus and changes the display mode of the operation target based on the relative positional relationship.
- In the second aspect, the information processing device may display the display screen including an operation target for operating the imaging device, change the display mode of the operation target based on a change in the posture of the information processing device while the imaging device is not attached to it, and refrain from changing the display mode of the operation target based on posture changes while the imaging device is attached. Thereby, the display mode of the operation target follows posture changes of the information processing device only while the imaging device is detached.
- A third aspect of the present technology is an imaging system including an imaging apparatus that is connected to an information processing apparatus using wireless communication and performs control related to an imaging operation based on an operation input performed on the information processing apparatus, and an information processing apparatus that performs control to switch the display mode of a display screen for operating the imaging apparatus based on a relative positional relationship with the imaging apparatus, along with a control method thereof and a program that causes a computer to execute the method.
- Thereby, the imaging apparatus is connected to the information processing apparatus using wireless communication and performs control related to the imaging operation based on an operation input made on the information processing apparatus, while the information processing apparatus switches the display mode of the display screen for operating the imaging apparatus based on the relative positional relationship with the imaging apparatus.
- FIG. 1 is a diagram illustrating an external configuration when an imaging device 100 according to a first embodiment of the present technology is attached to an information processing device 200.
- FIG. 2 is a block diagram illustrating a functional configuration example of the imaging device 100 and the information processing device 200 according to the first embodiment of the present technology.
- FIG. 6 is a flowchart illustrating an example of a processing procedure of display control processing by the information processing device 200 according to the first embodiment of the present technology.
- A block diagram illustrating a functional configuration example of the imaging device 101 and the information processing device 200 according to the third embodiment of the present technology.
- A diagram illustrating an example of the relationship between the attitude of the information processing device 200 and the display mode according to the third embodiment of the present technology.
- A diagram illustrating a transition example of the live view image and the operation target displayed on the input / output unit 240 according to the third embodiment of the present technology.
- A diagram illustrating a display example of the display screen displayed on the input / output unit 240 according to the third embodiment of the present technology.
- FIG. 16 is a flowchart illustrating an example of a processing procedure of display control processing by the information processing device 200 according to the third embodiment of the present technology.
- First embodiment: example of switching the display mode of the display screen based on the distance between the imaging device and the information processing device
- Second embodiment: example of switching the display mode of the display screen based on the mounting position of the imaging device
- Third embodiment: example of controlling whether display mode switching follows posture changes of the information processing device, based on whether or not the imaging device is attached to the information processing device
- FIG. 1 is a diagram illustrating an external configuration of the imaging apparatus 100 according to the first embodiment of the present technology.
- FIG. 1A shows a front view of the imaging apparatus 100.
- FIG. 1B shows a side view of the imaging apparatus 100 (as viewed from the direction of arrow A).
- In the first embodiment, the imaging device 100 is a cylindrical (columnar) imaging device; that is, its shape resembles only the lens portion of a general imaging device (for example, an integrated camera), extracted on its own.
- The imaging apparatus 100 also includes operation members such as a zoom lever and a shutter button, but these are not shown in FIG. 1.
- the imaging apparatus 100 is realized by, for example, a digital still camera or a digital video camera (for example, a camera-integrated recorder).
- the imaging apparatus 100 includes a lens barrel 160 and mounting members 171 and 172.
- the lens barrel 160 accommodates members such as an optical system and an imaging system.
- the attachment members 171 and 172 are attachments used when the imaging apparatus 100 is attached to another apparatus (for example, the information processing apparatus 200 shown in FIG. 2).
- the imaging device 100 can be attached to the device by moving the attachment member 171 in the direction of the arrow 173 and moving the attachment member 172 in the direction of the arrow 174 according to the shape and size of another device. That is, the attachment members 171 and 172 are attachments for fixing the imaging device 100 to other devices.
- The attachment surface used when the imaging device 100 is mounted on another device is shown as the mounting surface 175 (the surface opposite to the lens-side surface shown in FIG. 1A).
- The imaging apparatus 100 can perform a normal imaging operation on its own, and can also be used while mounted on another apparatus (for example, a smartphone). When used with another apparatus, the imaging apparatus 100 can be operated remotely from that apparatus.
- FIG. 2 and 3 are diagrams illustrating an external configuration when the imaging device 100 according to the first embodiment of the present technology is attached to the information processing device 200.
- FIG. 2A shows an example in which the imaging device 100 is attached to one surface of the information processing device 200 (the surface on which the input / output unit 240 is provided).
- FIG. 2B illustrates an example in which the imaging device 100 is attached to the other surface of the information processing device 200 (the surface opposite to the surface on which the input / output unit 240 is provided).
- FIG. 3 shows another example when the imaging device 100 is attached to the other surface of the information processing device 200 (the surface opposite to the surface on which the input / output unit 240 is provided).
- The information processing apparatus 200 includes operation members 221 to 223, an input / output unit 240, an audio output unit 280, a light emitting unit 291, and an imaging unit 292.
- the information processing apparatus 200 is realized by an information processing apparatus such as a smartphone or a tablet terminal, for example.
- The operation members 221 to 223 are used when performing various operation inputs.
- the input / output unit 240 displays various images and receives an operation input from the user based on a detection state of an object that is close to or in contact with the display surface of the input / output unit 240.
- the audio output unit 280 outputs various audio information.
- the light emitting unit 291 is a light emitting device that emits light to the subject.
- the light emitting unit 291 is used when performing an imaging operation using the information processing apparatus 200 in an environment where sufficient brightness cannot be expected, such as at night or indoors.
- the imaging unit 292 captures a subject and generates an image (image data).
- the imaging apparatus 100 can be fixed to the information processing apparatus 200 by sandwiching the main body of the information processing apparatus 200 with the mounting members 171 and 172 of the imaging apparatus 100.
- FIG. 4 is a block diagram illustrating a functional configuration example of the imaging device 100 and the information processing device 200 according to the first embodiment of the present technology.
- the imaging system configured by the imaging device 100 and the information processing device 200 is an example of the imaging system described in the claims.
- the imaging apparatus 100 includes an imaging unit 110, an image processing unit 120, a storage unit 130, a control unit 140, and a wireless communication unit 150.
- the imaging unit 110 captures a subject and generates an image (image data), and outputs the generated image to the image processing unit 120.
- the imaging unit 110 includes, for example, an optical system (a plurality of lenses) and an imaging element.
- Each unit of the optical system (for example, a zoom lens, a focus lens, and a diaphragm) is driven based on the control of the control unit 140.
- Note that the term "image" herein includes both the image itself and the image data for displaying that image.
- The image processing unit 120 performs predetermined image processing (for example, demosaic processing) on the image output from the imaging unit 110 based on the control of the control unit 140, and stores the processed image in the storage unit 130. Note that the image subjected to the image processing by the image processing unit 120 may instead be transmitted to the information processing apparatus 200 using wireless communication and stored in the storage unit 270.
- the storage unit 130 is a recording medium that stores an image subjected to image processing by the image processing unit 120 as content (for example, a still image file or a moving image file). Note that the storage unit 130 may be built in the imaging apparatus 100 or detachable from the imaging apparatus 100.
- the control unit 140 controls each unit in the imaging apparatus 100 based on a control program. For example, the control unit 140 controls each unit based on an operation input received by an operation member (not shown) such as a zoom lever or a shutter button provided in the imaging apparatus 100. The control unit 140 controls each unit based on control information from the information processing apparatus 200 received via the wireless communication unit 150. That is, the imaging apparatus 100 can be remotely operated using the information processing apparatus 200.
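The control unit's single point of control for both local operation members and control information received over wireless communication might be sketched like this (all class and command names are hypothetical, not from the publication):

```python
# Illustrative sketch: the control unit reacts the same way whether an
# operation arrives from a local operation member (zoom lever, shutter
# button) or as control information received via wireless communication.

class CameraControl:
    def __init__(self) -> None:
        self.zoom = 1.0   # current zoom factor
        self.shots = 0    # number of shutter releases

    def handle(self, command: str) -> None:
        """Single dispatch point for local and remote operation inputs."""
        if command == "zoom_in":
            self.zoom = min(self.zoom * 1.2, 10.0)
        elif command == "zoom_out":
            self.zoom = max(self.zoom / 1.2, 1.0)
        elif command == "shutter":
            self.shots += 1
        else:
            raise ValueError(f"unknown command: {command}")
```

A remote operation from the information processing apparatus would simply arrive as the same command strings over the wireless link.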
- the wireless communication unit 150 transmits and receives information (for example, control data and image data) to and from other information processing apparatuses (for example, the information processing apparatus 200) using wireless communication.
- As the wireless communication, for example, a wireless LAN (Local Area Network) such as Wi-Fi (Wireless Fidelity) can be used. Alternatively, wireless communication such as NFC (Near Field Communication), Bluetooth (registered trademark), infrared, or portable radio waves can be used.
- A plurality of wireless communication methods may also be used in combination. For example, at the start of wireless communication, only power-on information and Wi-Fi-related data (for example, the SSID (Service Set Identifier)) are exchanged via NFC, and the subsequent data exchange is performed via Wi-Fi.
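The NFC-to-Wi-Fi handoff described above might look roughly like the following sketch (all class and field names are assumptions; real NFC and Wi-Fi stacks expose very different APIs):

```python
# Illustrative sketch: NFC is used only to exchange power-on and Wi-Fi
# parameters (e.g. the SSID); all subsequent data flows over Wi-Fi.

from dataclasses import dataclass


@dataclass
class WifiParams:
    """Hypothetical container for the parameters exchanged over NFC."""
    ssid: str
    passphrase: str


def establish_link(nfc_read) -> str:
    """Read Wi-Fi parameters over the short-range NFC link, then report
    the Wi-Fi connection that carries the subsequent data exchange."""
    params: WifiParams = nfc_read()          # low-rate, touch-range exchange
    return f"wifi-connected:{params.ssid}"   # bulk data then goes over Wi-Fi
```

Here `nfc_read` stands in for whatever tag-reading primitive the platform provides.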
- The information processing apparatus 200 includes an attitude detection unit 210, an operation reception unit 220, a wireless communication unit 230, an input / output unit 240, a control unit 250, an image processing unit 260, a storage unit 270, and an audio output unit 280.
- the posture detection unit 210 detects a change in posture of the information processing device 200 by detecting acceleration, movement, inclination, and the like of the information processing device 200, and controls the posture information regarding the detected posture change to the control unit 250. Output to.
- As the posture detection unit 210, various sensors such as a gyro sensor and an acceleration sensor can be used, for example.
- The operation accepting unit 220 accepts operations performed by the user and outputs control information (operation information) corresponding to the accepted operation content to the control unit 250.
- the operation receiving unit 220 corresponds to, for example, the operation members 221 to 223 illustrated in FIGS.
- The wireless communication unit 230 transmits and receives information (for example, control data and image data) to and from another information processing apparatus (for example, the imaging apparatus 100) using wireless communication, based on the control of the control unit 250.
- As the wireless communication, for example, the above-described wireless LAN (for example, Wi-Fi), NFC, Bluetooth, infrared rays, portable radio waves, and the like can be used.
- a plurality of wireless communication schemes may be used.
- The input/output unit 240 is configured such that the input unit 241 and the display unit 242 are integrated. The input/output unit 240 displays various images on the display unit 242 based on the control of the control unit 250, and the input unit 241 receives operation input from the user based on the detection state of an object that approaches or touches the display surface of the display unit 242. The input unit 241 outputs control information corresponding to the accepted operation input to the control unit 250.
- As the input unit 241, for example, an electrostatic (capacitive) touch panel that detects contact or proximity of a conductive object (for example, a person's finger) based on a change in capacitance can be used.
- As the display unit 242, a display panel such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) panel can be used.
- the input / output unit 240 is configured by, for example, overlaying a transparent touch panel on the display surface of the display panel.
- The user can operate the information processing apparatus 200 and the imaging apparatus 100 by performing a contact operation (or a proximity operation) on the operation target displayed on the display unit 242.
- The operation target is displayed on the input/output unit 240 and is, for example, an operation button (GUI button) for performing operation input, such as the operation objects 301 to 306 shown in FIG. 7b.
- The control unit 250 controls each unit in the information processing apparatus 200 based on a control program. For example, the control unit 250 determines the attitude of the information processing apparatus 200 based on the attitude information from the attitude detection unit 210, and switches the display mode of the display screen displayed on the input/output unit 240 based on the determination result. For example, the control unit 250 determines the vertical direction of the posture of the information processing apparatus 200 based on the posture information from the posture detection unit 210, and switches the vertical direction of the display screen displayed on the input/output unit 240 based on the determination result.
- the control unit 250 displays a display screen for operating the imaging device 100 on the input / output unit 240.
- the control unit 250 performs control to switch the display mode of the display screen for operating the imaging device 100 based on the relative positional relationship with the imaging device 100.
- the control unit 250 performs control to switch the display mode of the display screen based on the distance between the information processing device 200 and the imaging device 100.
- The control unit 250 displays a display screen including operation objects (for example, the operation objects 301 to 306 shown in FIG. 7) for operating the imaging apparatus 100, and performs control to change the display mode of the operation objects based on the relative positional relationship.
- control unit 250 performs control to switch the display mode of the display screen based on whether or not the imaging device 100 is attached to the information processing device 200. In this case, the control unit 250 performs control to switch the display mode of the display screen based on whether or not the imaging device 100 is mounted on the display surface of the input / output unit 240. In addition, when the imaging device 100 is mounted on the display surface of the input / output unit 240, the control unit 250 switches the display mode of the display screen based on the position of the imaging device 100 on the display surface of the input / output unit 240. Take control. Note that control for switching the display mode of the display screen based on whether or not the imaging apparatus 100 is mounted on the display surface of the input / output unit 240 is described in a second embodiment of the present technology.
- the display screen is switched based on the relative positional relationship between the imaging apparatus 100 and the information processing apparatus 200 that are connected using wireless communication. Further, the control unit 140 of the imaging device 100 performs control related to the imaging operation based on an operation input using the display screen performed in the information processing device 200 on which the display screen is displayed.
- the image processing unit 260 performs predetermined image processing on the image generated by the imaging device 100 and the image generated by the imaging unit 292 (shown in FIG. 2 b) based on the control of the control unit 250. Then, the image subjected to the image processing is displayed on the display unit 242. Further, the image processing unit 260 stores these images in the storage unit 270 based on the control of the control unit 250. Further, the image processing unit 260 causes the display unit 242 to display a display screen that is used when an imaging operation using the imaging apparatus 100 is performed based on the control of the control unit 250.
- the storage unit 270 is a recording medium that stores information based on the control of the control unit 250.
- the storage unit 270 stores an image generated by the imaging device 100 and an image generated by the imaging unit 292 (shown in b of FIG. 2) as content (for example, a still image file or a moving image file).
- the storage unit 270 may be built in the information processing apparatus 200 or detachable from the information processing apparatus 200.
- the audio output unit 280 outputs audio information based on the control of the control unit 250.
- the audio output unit 280 can be realized by a speaker, for example.
- FIG. 5 is a diagram illustrating a usage example of the imaging device 100 and the information processing device 200 according to the first embodiment of the present technology.
- FIG. 5a shows an example of the case where the information processing apparatus 200 is used with the imaging apparatus 100 mounted.
- the imaging device 100 can be mounted on one surface of the information processing device 200 (the surface opposite to the surface on which the input / output unit 240 is provided).
- The user 400 can use the information processing apparatus 200 on which the imaging apparatus 100 is mounted to perform imaging in substantially the same manner as imaging with a general imaging apparatus (for example, an integrated camera).
- FIG. 5 b shows an example of the case where the information processing apparatus 200 is used without being mounted with the imaging apparatus 100.
- the imaging device 100 can be installed at a location away from the information processing device 200. Even in this case, since the user 400 can remotely control the imaging apparatus 100 using the information processing apparatus 200, the user 400 can perform imaging using the imaging apparatus 100 and the information processing apparatus 200.
- the usage mode (for example, how to hold) of the information processing apparatus 200 is different between the case where the imaging apparatus 100 is attached to the information processing apparatus 200 and the other cases. Therefore, in the first embodiment of the present technology, an example in which the display mode of the display screen is switched based on the distance between the imaging device 100 and the information processing device 200 will be described. In the second embodiment of the present technology, an example in which the display mode of the display screen is switched based on the mounting position of the imaging device 100 is shown.
- the distance between the imaging device 100 and the information processing device 200 can be estimated using the received radio wave intensity.
- a table indicating the relationship between the received radio wave intensity and the distance is created in advance and stored in the storage unit 270. Then, the control unit 250 of the information processing device 200 acquires a distance corresponding to the received radio wave intensity acquired by the wireless communication unit 230 from the table, and uses this distance as the distance between the imaging device 100 and the information processing device 200. be able to.
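The table lookup described above can be sketched as follows. This is an illustrative example, not part of the original specification; the RSSI thresholds and distances are hypothetical values, and a real table would be calibrated in advance for the actual devices.

```python
# Hypothetical table of (received intensity in dBm, distance in cm) pairs,
# strongest signal first, standing in for the table stored in the storage unit 270.
RSSI_DISTANCE_TABLE = [
    (-30, 1),     # device effectively touching the information processing apparatus
    (-45, 50),
    (-60, 200),
    (-75, 1000),
]

def estimate_distance_cm(rssi_dbm):
    """Return the distance for the first table entry whose RSSI threshold
    the measured intensity reaches; beyond the table, treat as far away."""
    for threshold, distance in RSSI_DISTANCE_TABLE:
        if rssi_dbm >= threshold:
            return distance
    return 10000  # weaker than every entry: far away

print(estimate_distance_cm(-28))  # very strong signal -> 1
print(estimate_distance_cm(-70))  # weak signal -> 1000
```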
- the distance between the imaging device 100 and the information processing device 200 can be estimated using the input / output unit 240.
- a case where a projected capacitive touch panel is used as the input / output unit 240 is assumed.
- a material that reacts to the projected capacitive touch panel is used for the mounting surface 175 (shown in FIG. 1 b) of the imaging device 100 to the information processing device 200.
- This material is, for example, conductive silicone rubber.
- When the imaging apparatus 100 is mounted on the display surface of the input/output unit 240, the input/output unit 240 can detect the imaging apparatus 100, because the above-described material (for example, conductive silicone rubber) is adopted for the mounting surface 175 of the imaging apparatus 100. In this case, the control unit 250 can determine whether or not the imaging apparatus 100 is mounted on the display surface of the input/output unit 240 based on the detected size of the object. For example, the size (surface area) of the mounting surface 175 of the imaging apparatus 100 is stored in the storage unit 270. The control unit 250 then compares the size of the detected object with the size stored in the storage unit 270 and determines whether or not they match.
- Here, "substantially match" means, for example, that the difference between the size of the detected object and the size stored in the storage unit 270 is smaller than a threshold value.
- If the sizes match or substantially match, the control unit 250 can determine that the imaging apparatus 100 is mounted on the display surface of the input/output unit 240. On the other hand, if the size of the detected object neither matches nor substantially matches the size stored in the storage unit 270, the control unit 250 can determine that the imaging apparatus 100 is not mounted on the display surface of the input/output unit 240.
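The substantially-match determination described above can be sketched as follows. This is illustrative only; the stored mounting-surface area and the tolerance are hypothetical values, not figures from the specification.

```python
# Hypothetical values standing in for the size stored in the storage unit 270
# and the "substantially match" threshold.
MOUNTING_SURFACE_AREA_MM2 = 600.0
TOLERANCE_MM2 = 50.0

def is_imaging_device_mounted(detected_area_mm2):
    """True if the detected conductive object's area matches the stored
    mounting-surface area within the tolerance."""
    return abs(detected_area_mm2 - MOUNTING_SURFACE_AREA_MM2) <= TOLERANCE_MM2

print(is_imaging_device_mounted(620.0))  # within tolerance -> True
print(is_imaging_device_mounted(150.0))  # a fingertip, much smaller -> False
```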
- Further, the input/output unit 240 can detect the position of the imaging apparatus 100 on the display surface. For this reason, the control unit 250 can switch the display mode of the display screen based on the position of the imaging apparatus 100 on the display surface. An example of switching the display screen will be described in the second embodiment of the present technology.
- the mounting of the imaging apparatus 100 can be detected using a member for detecting that the imaging apparatus 100 is mounted on the information processing apparatus 200.
- As such a member, for example, a switch can be provided in at least one of the imaging apparatus 100 and the information processing apparatus 200.
- the imaging apparatus 100 is provided with a switch.
- the attachment is detected by the switch, and information indicating that the attachment is detected is output to the control unit 140.
- the control unit 140 transmits the information to the information processing apparatus 200 via the wireless communication unit 150.
- the control unit 250 of the information processing device 200 can detect that the imaging device 100 is attached to the information processing device 200.
- the information processing apparatus 200 is provided with a switch.
- the attachment is detected by the switch, and information indicating that the attachment is detected is output to the control unit 250.
- the control unit 250 of the information processing device 200 can detect that the imaging device 100 is attached to the information processing device 200.
- the distance between the imaging device 100 and the information processing device 200 may be detected using another sensor.
- For example, a distance sensor (for example, a sensor that detects a distance using infrared rays or ultrasonic waves) can be used.
- Further, the respective positions of the imaging apparatus 100 and the information processing apparatus 200 can be acquired using GPS (Global Positioning System), and the distance between the imaging apparatus 100 and the information processing apparatus 200 can be calculated based on these positions.
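The specification only states that the distance can be calculated from the two GPS positions; as one possible sketch (an assumption, not the patent's stated method), the great-circle distance can be computed with the haversine formula. The coordinates below are hypothetical fixes:

```python
import math

def gps_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points
    (haversine formula, mean Earth radius)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Two hypothetical fixes in Tokyo roughly 100 m apart.
print(round(gps_distance_m(35.6586, 139.7454, 35.6595, 139.7454)))
```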
- FIG. 6 is a diagram illustrating a switching example of the display screen displayed on the input / output unit 240 according to the first embodiment of the present technology.
- FIG. 6 illustrates an example of a case where the distance between the imaging device 100 and the information processing device 200 is equal to or greater than the threshold value and a case where the distance is less than the threshold value.
- the operation objects displayed on the input / output unit 240 are limited to a minimum and simplified.
- a shutter key, a moving image shooting key, a zoom key, a menu key, and the like can be displayed on a portion of the input / output unit 240 that is easily operated by the user.
- various icons can be displayed in other portions.
- In the case of a right-handed user, the portion that is easy for the user to operate means the right region of the input/output unit 240; in the case of a left-handed user, it means the left region of the input/output unit 240.
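The handedness-dependent placement described above can be sketched as follows. This is an illustrative example; the key and icon names, and the two-region split, are hypothetical simplifications of the display layout.

```python
# Hypothetical operation objects: the primary keys go into the region that is
# easy for the user to operate, the remaining icons into the other region.
PRIMARY_KEYS = ["shutter", "movie", "zoom", "menu"]
OTHER_ICONS = ["flash", "timer", "scene"]

def layout_for(handedness):
    """Return a region -> operation objects mapping for the given handedness."""
    easy = "right" if handedness == "right" else "left"
    other = "left" if easy == "right" else "right"
    return {easy: PRIMARY_KEYS, other: OTHER_ICONS}

print(layout_for("right")["right"])  # primary keys land in the right region
print(layout_for("left")["left"])    # and in the left region for a left-handed user
```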
- In this case, it is assumed that the frequency with which settings are changed while shooting is relatively large.
- In this case, it is envisaged that the user supports the lens barrel 160 of the imaging apparatus 100 with the left hand, or performs a ring operation or a zoom operation with it, and operates the main body with the right hand.
- In this case, operation objects such as various setting buttons can be displayed on the input/output unit 240.
- a shutter key, various operation keys, and the like can be displayed on a portion of the input / output unit 240 that is easy for the user to operate.
- a zoom lever, various icons, and the like can be displayed in other portions.
- [Display screen display example] FIGS. 7 to 9 are diagrams illustrating display examples of the display screen displayed on the input/output unit 240 according to the first embodiment of the present technology.
- FIGS. 7A to 9A show display examples when the distance between the imaging apparatus 100 and the information processing apparatus 200 is equal to or greater than the threshold value.
- FIGS. 7B to 9B show display examples when the distance between the imaging apparatus 100 and the information processing apparatus 200 is less than the threshold value.
- FIGS. 7 to 9 show examples in which images of a mountain and a dog walking in front of the mountain are displayed as live view images 300 and 310.
- This operation target is displayed on the input / output unit 240, for example, around the live view image or superimposed on the live view image.
- FIG. 7 illustrates an example in which the display mode of the operation target 301 for turning on and off the light emitting unit 291 of the information processing device 200 is switched based on the distance between the imaging device 100 and the information processing device 200.
- FIG. 7 shows an example in which a live view image 300 is displayed and a plurality of operation objects 301 to 306 are displayed on both sides of the live view image 300.
- the operation target 301 is an operation target that is pressed when the on / off operation of the light emitting unit 291 of the information processing apparatus 200 is performed.
- the operation target object 302 is an operation target object that is pressed when switching the shooting mode of the imaging apparatus 100.
- the operation object 303 is an operation object that is pressed when the shutter operation of the imaging apparatus 100 is performed.
- the operation target 304 is an operation target that is pressed when various setting operations of the imaging apparatus 100 are performed.
- the operation target 305 is an operation target that is pressed when the exposure mode of the imaging apparatus 100 is switched.
- the operation target 306 is an operation target for displaying scene information set in the imaging apparatus 100.
- the operation target 301 is not displayed when the distance between the imaging device 100 and the information processing device 200 is equal to or greater than the threshold.
- On the other hand, when the distance between the imaging apparatus 100 and the information processing apparatus 200 is less than the threshold, the light from the light emitting unit 291 of the information processing apparatus 200 is expected to reach the subject that is the imaging target of the imaging apparatus 100. For this reason, the light output from the light emitting unit 291 of the information processing apparatus 200 can be used for imaging, and the operation target 301 is displayed.
- FIG. 8 illustrates an example in which the display mode of the operation target 317 for performing the zoom operation of the imaging apparatus 100 is switched based on the distance between the imaging apparatus 100 and the information processing apparatus 200.
- FIG. 8 shows an example in which a live view image 310 is displayed, a plurality of operation objects 311 to 314 are displayed on both sides of the live view image 310, and a plurality of operation objects 315 to 317 are displayed over the live view image 310. In FIG. 8, description will be made assuming that the user is right-handed.
- the operation objects 311 to 313 correspond to the operation objects 301 to 303 shown in FIG.
- the operation object 314 is an operation object that is pressed when an image is reproduced.
- the operation objects 315 and 316 are operation objects for displaying various setting information of the imaging apparatus 100 and changing each setting.
- the operation target 317 is an operation target for performing a zoom operation of the imaging apparatus 100.
- a W (wide) button (wide side button) and a T (tele) button (tele side button) are displayed.
- When the operation target 317 is operated, the control unit 250 acquires control information according to the user operation, and the control information is transmitted to the control unit 140 of the imaging apparatus 100 via the wireless communication units 230 and 150.
- the control unit 140 of the imaging apparatus 100 controls the driving of the zoom lens of the imaging unit 110 based on the received control information.
- the main body of the imaging apparatus 100 is provided with an operation member (zoom lever) for performing a zoom operation.
- When the distance between the imaging apparatus 100 and the information processing apparatus 200 is equal to or greater than the threshold, it is assumed that the user's hand does not reach the operation member (zoom lever) provided on the main body of the imaging apparatus 100. In this case, the user cannot operate that operation member. Therefore, as shown in FIG. 8a, when the distance between the imaging apparatus 100 and the information processing apparatus 200 is equal to or greater than the threshold value, the operation target 317 is displayed in a portion that is easy for the user to operate (the right portion, which is easy for a right-handed user to operate).
- By arranging the operation target 317 in this way, the user can easily perform a zoom operation with the thumb of the right hand.
- On the other hand, when the distance is less than the threshold, the operation target 317 is displayed in a portion different from the portion that is easy for the user to operate. For example, the display positions of the operation objects 316 and 317 can be switched. By arranging the operation target 316 in this way, the user can easily perform the setting operation of the imaging apparatus 100 with the thumb of the right hand.
- FIG. 9 shows an example in which the display mode of the operation objects 321 to 327 for performing various operations of the imaging device 100 is switched based on the distance between the imaging device 100 and the information processing device 200.
- FIG. 9a is the same as FIG. 8a. The operation objects 321 to 326 shown in FIG. 9b are operation objects corresponding to, for example, the operation members provided on the back side (the side opposite to the lens) of a general imaging device (for example, an integrated camera).
- the operation target 321 is an operation target that is pressed when the shutter operation of the imaging apparatus 100 is performed.
- the operation target 322 is an operation target used when performing a tracing operation.
- The operation target 322 corresponds to, for example, a mode dial, and various operations can be performed by moving the operation target 322 in the left-right direction (a tracing operation), as with a mode dial.
- the operation target 323 is an operation target that is pressed when performing an operation of starting the moving image recording of the imaging apparatus 100.
- the operation target 323 corresponds to, for example, a moving image REC button.
- the operation target 324 is an operation target that is pressed when displaying a menu screen for performing various settings related to the imaging apparatus 100.
- the operation target 324 corresponds to, for example, a menu button.
- the operation target 325 is an operation target used when performing a tracing operation or a determination operation.
- the operation target 325 corresponds to, for example, a control wheel, and can perform a moving operation of a cursor displayed on the input / output unit 240 and various determination operations.
- the operation object 326 is an operation object that is pressed when an image is reproduced.
- FIG. 9 b shows an example in which various operation objects (for example, icons) are displayed in the display area 327.
- FIG. 9b shows an example in which an arrangement of operation members substantially the same as that of a general imaging device is realized by displaying each operation object. Specifically, as shown in FIG. 9b, operation objects 321 to 326 corresponding to the operation members provided on the back side (the side opposite to the lens) of a general imaging device (for example, a digital still camera) are displayed on the input/output unit 240.
- FIG. 10 is a flowchart illustrating an example of a processing procedure of display control processing by the information processing device 200 according to the first embodiment of the present technology.
- FIG. 10 illustrates an example in which the display mode of the input / output unit 240 is switched based on the distance between the imaging device 100 and the information processing device 200.
- control unit 250 detects the distance between the imaging device 100 and the information processing device 200 (step S901). Any of the above-described methods can be used as the distance detection method.
- the control unit 250 compares the distance between the imaging device 100 and the information processing device 200 with a threshold value (step S902).
- As the threshold value, for example, several centimeters (for example, 1 cm) can be used. In this case, for example, if the distance between the imaging apparatus 100 and the information processing apparatus 200 is less than the threshold value, it can be considered that the imaging apparatus 100 is attached to the information processing apparatus 200.
- control unit 250 determines whether or not a change has occurred in the comparison result between the distance between the imaging device 100 and the information processing device 200 and the threshold value (step S903).
- A change in the comparison result means, for example, that the distance changes from a state below the threshold to a state at or above it, or from a state at or above the threshold to a state below it. If the comparison result has not changed (step S903), the process proceeds to step S907.
- If the comparison result has changed (step S903), the control unit 250 determines whether the state after the change is a state in which the distance between the imaging apparatus 100 and the information processing apparatus 200 is equal to or greater than the threshold value (step S904). If so (step S904), the control unit 250 causes the input/output unit 240 to display the display screen for the case where the distance is equal to or greater than the threshold (step S905). For example, the display screens shown in FIGS. 7a, 8a, and 9a are displayed on the input/output unit 240.
- On the other hand, if the state after the change is a state in which the distance is less than the threshold value (step S904), the control unit 250 causes the input/output unit 240 to display the display screen for the case where the distance is less than the threshold (step S906). For example, the display screens shown in FIGS. 7b, 8b, and 9b are displayed on the input/output unit 240.
- The control unit 250 determines whether or not there has been an instruction to end the imaging operation (step S907). If there is no such instruction, the process returns to step S901; if there is, the operation of the display control process ends.
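The display control loop of steps S901 to S907 can be sketched as follows. This is an illustrative simplification: the distance readings and threshold are hypothetical, screen drawing is stubbed out as a string, and the end instruction is modeled as the readings running out.

```python
THRESHOLD_CM = 1.0  # hypothetical threshold (the text suggests about 1 cm)

def run_display_control(distance_readings):
    """Return the sequence of screens shown for a series of distance readings,
    redrawing only when the threshold comparison result changes."""
    shown = []
    previous = None
    for distance in distance_readings:        # S901: detect the distance
        is_far = distance >= THRESHOLD_CM     # S902: compare with the threshold
        if is_far != previous:                # S903: comparison result changed?
            # S904-S906: display the screen for the new state
            shown.append("far_screen" if is_far else "near_screen")
            previous = is_far
        # S907: loop until an end instruction (here: until readings run out)
    return shown

# Mounted (0.5 cm), carried away (120 cm), then re-mounted (0.3 cm).
print(run_display_control([0.5, 0.4, 120.0, 150.0, 0.3]))
# -> ['near_screen', 'far_screen', 'near_screen']
```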
- <Second Embodiment> In the first embodiment of the present technology, an example in which the display mode of the display screen is switched based on the distance between the imaging apparatus and the information processing apparatus was shown.
- a case is assumed where the distance between the imaging device and the information processing device is less than the threshold and the imaging device is mounted on the display surface of the input / output unit of the information processing device. In this case, the user cannot see the portion of the display screen displayed on the input / output unit where the imaging device is attached. Therefore, in such a case, it is preferable to switch the display mode of the display screen based on the mounting position.
- the imaging device and the information processing device according to the second embodiment of the present technology are the same as the imaging device 100 and the information processing device 200 illustrated in FIGS. 1 to 4. For this reason, each device in the second embodiment of the present technology is denoted by the same reference numeral as that of the first embodiment of the present technology, and a part of the description thereof is omitted.
- a detection member for detecting attachment of the imaging device 100 is provided in the information processing device 200.
- As the detection member, for example, a switch or a sensor for detecting the mounting of the imaging apparatus 100 can be used. Using the detection member, it is possible to detect on which of the front surface and the back surface of the information processing apparatus 200 the imaging apparatus 100 is mounted.
- the position and size of the imaging device 100 mounted on the display surface of the input / output unit 240 can be detected.
- For example, the control unit 250 determines whether or not the distance between the imaging apparatus 100 and the information processing apparatus 200 is less than a threshold (for example, several centimeters); if the distance is less than the threshold, it can determine that the imaging apparatus 100 is attached to the information processing apparatus 200. In this case, the control unit 250 can determine the mounting position of the imaging apparatus 100 on the display surface of the input/output unit 240 based on the detection state of the display surface of the input/output unit 240. An example of this determination process is shown in FIG. 11.
- FIG. 11 is a flowchart illustrating an example of a processing procedure of attachment position determination processing by the information processing device 200 according to the second embodiment of the present technology.
- FIG. 11 illustrates an example in which it is determined that the imaging device 100 is attached to the information processing device 200 when the distance between the imaging device 100 and the information processing device 200 is less than a threshold (for example, several centimeters).
- FIG. 11 shows an example in which it is determined whether the mounting position on the display surface of the input / output unit 240 is the left side or the right side of the display surface of the input / output unit 240.
- control unit 250 detects the distance between the imaging device 100 and the information processing device 200 (step S911). Any of the above-described methods can be used as the distance detection method.
- the control unit 250 compares the distance between the imaging apparatus 100 and the information processing apparatus 200 with a threshold value, and determines whether the distance is less than the threshold value (step S912).
- As the threshold value, for example, several centimeters (for example, 1 cm) can be used. If the distance is greater than or equal to the threshold (step S912), the control unit 250 determines that the imaging apparatus 100 is not attached to the information processing apparatus 200 (step S913).
- If the distance is less than the threshold (step S912), the control unit 250 determines that the imaging apparatus 100 is attached to the information processing apparatus 200. In this case, the control unit 250 determines the mounting position of the imaging apparatus 100 in the information processing apparatus 200.
- The control unit 250 acquires the position and size of the object that is in contact with the display surface of the input/output unit 240 (step S914). Subsequently, the control unit 250 determines whether or not the size of the acquired object is the same (or substantially the same) as the set value (step S915). For this substantially-the-same determination, an acceptable range (tolerance) can be set.
- If the size of the acquired object is the same (or substantially the same) as the set value (step S915), the control unit 250 determines that the imaging apparatus 100 is mounted on the display surface of the input/output unit 240. In this case, the control unit 250 determines the mounting position of the imaging apparatus 100 on the display surface of the input/output unit 240.
- The control unit 250 determines whether or not the acquired position of the object (the position of the object in contact with the display surface of the input/output unit 240) is on the left side of the display surface of the input/output unit 240 (step S917).
- If the acquired position of the object is on the left side of the display surface of the input/output unit 240 (step S917), the control unit 250 determines that the mounting position of the imaging apparatus 100 is the left side of the display surface of the input/output unit 240 (step S918).
- If the acquired position of the object is not on the left side of the display surface of the input/output unit 240 (that is, it is on the right side) (step S917), the control unit 250 determines that the mounting position of the imaging apparatus 100 is the right side of the display surface of the input/output unit 240 (step S919).
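The determination flow of steps S911 to S919 can be sketched as follows. This is illustrative only; the set value, tolerance, display width, and distance threshold are hypothetical values.

```python
# Hypothetical constants standing in for the stored set value and thresholds.
SET_AREA_MM2 = 600.0
TOLERANCE_MM2 = 50.0
DISPLAY_WIDTH_MM = 120.0
DISTANCE_THRESHOLD_CM = 1.0

def determine_mounting(distance_cm, object_x_mm=None, object_area_mm2=None):
    """Classify the mounting state of the imaging device from the measured
    distance and the object detected on the display surface."""
    if distance_cm >= DISTANCE_THRESHOLD_CM:          # S912/S913: not attached
        return "not_attached"
    if object_area_mm2 is None or abs(object_area_mm2 - SET_AREA_MM2) > TOLERANCE_MM2:
        return "attached_not_on_display"              # S915: size does not match
    # S917-S919: left or right half of the display surface
    return "left" if object_x_mm < DISPLAY_WIDTH_MM / 2 else "right"

print(determine_mounting(50.0))                                          # not_attached
print(determine_mounting(0.5, object_x_mm=20.0, object_area_mm2=610.0))  # left
print(determine_mounting(0.5, object_x_mm=100.0, object_area_mm2=590.0)) # right
```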
- [Display screen display example] FIGS. 12 to 14 are diagrams illustrating display examples of the display screen displayed on the input/output unit 240 according to the second embodiment of the present technology.
- FIGS. 12 to 14 show an example in which, when an imaging operation is performed using the imaging apparatus 100 and the information processing apparatus 200, an image generated by the imaging unit 110 of the imaging apparatus 100 is displayed on the input/output unit 240 as a live view image. Further, an example is shown in which an operation target for operating the imaging apparatus 100 is displayed on the input/output unit 240.
- FIG. 12 illustrates an example in which the display mode of the operation target is switched based on the surface on which the imaging apparatus 100 is mounted in the information processing apparatus 200.
- FIG. 12A shows a display example when the imaging apparatus 100 is mounted on the display surface of the input / output unit 240.
- FIG. 12 a illustrates an example in which a live view image 330 is displayed and a plurality of operation objects 301 to 306 are displayed around the live view image 330. Since the operation objects 301 to 306 are the same as the operation objects 301 to 306 shown in FIG. 7, they are denoted by the same reference numerals.
- FIG. 12B shows a display example when the imaging apparatus 100 is mounted on the surface opposite to the surface on which the input / output unit 240 is provided.
- FIG. 12 b is the same as the example shown in FIG. 7 b except that the mounting members 171 and 172 of the imaging apparatus 100 are added; therefore, the same reference numerals as those in FIG. 7 b are used.
- When the imaging device 100 is mounted on the display surface of the input / output unit 240, an area covered by the imaging device 100 is generated on the display surface of the input / output unit 240. For this reason, each piece of information to be displayed is displayed in an area other than the area where the imaging device 100 is mounted, that is, in an area not covered by the imaging device 100.
- For example, the control unit 250 displays the live view image 330 in a reduced size.
- Also, the control unit 250 changes the arrangement and size of the plurality of operation objects 301 to 306 and displays them.
- FIGS. 13 and 14 show an example in which the display mode of the operation target is switched based on the position where the imaging device 100 is mounted on the display surface of the input / output unit 240. FIG. 14 shows the entire display surface of the input / output unit 240 corresponding to FIG. 13.
- FIG. 13 a shows a display example when the imaging device 100 is mounted at the left position on the display surface of the input / output unit 240.
- Note that the example shown in FIG. 13 a is the same as FIG. 12 a.
- FIG. 14 a shows a display example corresponding to FIG. 13 a.
- FIG. 13 b shows a display example when the imaging device 100 is mounted at the right position on the display surface of the input / output unit 240. Note that the example shown in FIG. 13 b is the same as FIG. 13 a except that the display positions of the live view image 330 and the plurality of operation objects 301 to 306 are changed. For this reason, the same reference numerals as those in FIG. 13 a are used. FIG. 14 b shows a display example corresponding to FIG. 13 b.
- FIG. 15 is a flowchart illustrating an example of a processing procedure of display control processing by the information processing device 200 according to the second embodiment of the present technology.
- FIG. 15 illustrates an example in which the display mode of the input / output unit 240 is switched based on the position where the imaging device 100 is mounted in the information processing device 200.
- FIG. 15 shows an example in the case where it is determined that the imaging device 100 is attached to the information processing device 200.
- First, the control unit 250 detects the mounting position of the imaging device 100 in the information processing device 200 (step S921). Any of the above-described methods can be used as a method for detecting the mounting position.
- the control unit 250 determines whether or not a change has occurred in the mounting position of the imaging device 100 in the information processing device 200 (step S922).
- The attachment position changes, for example, when the attachment position moves on the display surface of the input / output unit 240, or when the attachment position changes to another surface (for example, when it moves from the surface on which the input / output unit 240 is provided to the opposite surface). When no change has occurred in the attachment position (step S922), the process proceeds to step S928.
- When a change has occurred in the attachment position (step S922), the control unit 250 determines whether or not the attachment position of the imaging device 100 is the display surface of the information processing device 200 (the surface of the input / output unit 240) (step S923). When the mounting position of the imaging device 100 is not the display surface of the information processing device 200 (step S923), the control unit 250 causes the input / output unit 240 to display the display screen for the case where the mounting position of the imaging device 100 is the back side of the information processing device 200 (step S924). For example, the display screen shown in FIG. 12 b is displayed on the input / output unit 240.
- When the mounting position of the imaging device 100 is the display surface of the information processing device 200 (step S923), the control unit 250 determines whether or not the mounting position of the imaging device 100 is the left side of the display surface of the input / output unit 240 (step S925).
- When the mounting position of the imaging device 100 is on the left side of the display surface of the input / output unit 240 (step S925), the control unit 250 causes the input / output unit 240 to display the corresponding display screen (step S926). For example, the display screen shown in FIG. 14 a is displayed on the input / output unit 240.
- When the mounting position of the imaging device 100 is on the right side of the display surface of the input / output unit 240 (step S925), the control unit 250 causes the input / output unit 240 to display the corresponding display screen (step S927). For example, the display screen shown in FIG. 14 b is displayed on the input / output unit 240.
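The branch structure of FIG. 15 (steps S922 to S927) described above can be summarized in a short sketch; the layout names and the function itself are hypothetical and only illustrate the selection logic.

```python
# Illustrative sketch of the FIG. 15 flow: the display screen is selected
# from the detected mounting position (steps S923-S927); if the position
# has not changed, nothing is redrawn (step S922). Names are assumptions.

def select_display_screen(mount_position, previous_position):
    """mount_position: 'back', 'display-left', or 'display-right'."""
    if mount_position == previous_position:
        return "unchanged"              # step S922: proceed to step S928
    if mount_position == "back":
        return "back-side screen"       # step S924 (cf. FIG. 12 b)
    if mount_position == "display-left":
        return "left-mount screen"      # step S926 (cf. FIG. 14 a)
    return "right-mount screen"         # step S927 (cf. FIG. 14 b)
```

The same function can be extended to three or more display-surface areas, as the following paragraph of the original text suggests.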
- In the above, the example in which it is determined whether the mounting position of the imaging device 100 is on the left side or the right side of the display surface of the input / output unit 240 has been described. However, for example, it may be determined in which of three or more areas on the display surface of the input / output unit 240 the imaging device 100 is attached. In this case as well, the live view image and the operation target are displayed so as to avoid the mounting position of the imaging device 100.
- Here, the information processing apparatuses shown in the first and second embodiments of the present technology can switch the display mode of the display screen based on the attitude of the information processing apparatus. For example, the display mode of the display screen can be switched depending on whether the information processing device is in a horizontally long state (a state in which the longitudinal direction of the information processing device is substantially parallel to the horizontal direction) or a vertically long state (a state in which the longitudinal direction of the information processing device is substantially parallel to the direction of gravity).
- Here, when the imaging operation is performed with the imaging device attached to the information processing device, it is assumed that the usage mode of the imaging device and the information processing device differs from the case where the imaging operation is performed with the imaging device and the information processing device separated from each other. For this reason, when performing an imaging operation using an imaging device and an information processing device, it is important to provide an appropriate user interface according to the usage mode.
- FIG. 16 is a block diagram illustrating a functional configuration example of the imaging device 101 and the information processing device 200 according to the third embodiment of the present technology.
- The information processing apparatus 200 is the same as the information processing apparatus 200 shown in FIG. 4, and the imaging device 101 is a modification of the imaging device 100 illustrated in FIG. 4. For this reason, portions common to the imaging device 100 and the information processing device 200 illustrated in FIG. 4 are denoted by the same reference numerals, and a part of these descriptions is omitted.
- the imaging apparatus 101 includes a posture detection unit 180.
- the posture detection unit 180 detects a change in posture of the imaging device 101 by detecting acceleration, movement, tilt, and the like of the imaging device 101, and outputs posture information regarding the detected posture change to the control unit 140.
- For example, the attitude detection unit 180 can detect, as a change in the attitude of the imaging apparatus 101, the rotation angle about the optical axis direction of the imaging apparatus 101 as the rotation axis.
- As the posture detection unit 180, various sensors such as a gyro sensor and an acceleration sensor can be used, for example.
- the control unit 140 transmits the posture information output from the posture detection unit 180 to the information processing device 200 via the wireless communication unit 150.
- the control unit 250 of the information processing device 200 can grasp the posture of the imaging device 101.
- For example, the control unit 250 can change the display mode based on the orientation of the imaging device 101. Examples of changing the display mode are shown in FIGS. 18 to 21.
- For example, an NFC tag discrimination command defined in the NFC standard can be used (see, for example, NFC Forum Type 3 Tag Operation Specification, NFC Forum-TS-Type-3-Tag_1.1).
- the information processing apparatus 200 can determine that there is an adjacent apparatus.
- the distance at which data communication using NFC is possible is about 1 to 10 cm. Therefore, when there is a polling response, the control unit 250 of the information processing apparatus 200 can determine that the imaging apparatus 101 is attached to the information processing apparatus 200.
- the information processing apparatus 200 can determine that there is no adjacent apparatus. In this case, the control unit 250 of the information processing device 200 can determine that the imaging device 101 is not attached to the information processing device 200.
- Here, NFC is widely used and is often mounted on information processing apparatuses such as smartphones. For this reason, in the case of an information processing apparatus equipped with NFC, performing attachment detection using NFC makes it unnecessary to newly provide hardware for this detection. Thereby, the manufacturing cost of the information processing apparatus can be reduced.
- Detection example using NFC short-range wireless communication
- the specific information is information (identification information) for specifying the imaging device 101.
- check command and check response are commands for reading the contents of the NFC tag. This command is defined in NFC Forum Type3 Tag Operation Specification.
- a polling command is issued and a response to the polling command (polling response) is exchanged.
- the control unit 250 of the information processing apparatus 200 transmits a check command.
- the control unit 140 of the imaging apparatus 101 transmits a response (Check Response) to the Check Command.
- the control unit 140 of the imaging device 101 transmits the specific information (information for identifying the imaging device 101 (identification information)) included in the check response.
- information indicating “ABC DSC / Lens-Style Camera” can be included in the Check Response as specific information and transmitted.
- For example, "ABC" is information indicating the name of the company that manufactures the imaging apparatus 101, "DSC" is information indicating that the imaging apparatus 101 is an imaging apparatus, and "Lens-Style Camera" is information indicating that the imaging apparatus 101 is a lens-style camera.
- In this way, the control unit 140 of the imaging apparatus 101 transmits the specific information included in the check response. Accordingly, the information processing apparatus 200 that has received the check response can recognize, based on the specific information included in the check response, that the apparatus that transmitted the check response is the imaging apparatus 101. That is, it can be understood that the device that transmitted the check response is a lens-style camera (imaging device 101) manufactured by the "ABC" company.
- the control unit 250 of the information processing apparatus 200 acquires the content of the Check Response. Subsequently, the control unit 250 of the information processing device 200 determines whether or not specific information is included in the check response. Then, the control unit 250 of the information processing device 200 determines that the imaging device 101 is attached when the specific information is included in the check response. On the other hand, when the specific information is not included in the check response, the control unit 250 of the information processing device 200 determines that the imaging device 101 is not attached.
- Note that this detection method is an example; the detection method is not limited to these, and other detection methods may be used.
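As a rough sketch, the attachment judgment based on the Check Response described above might look like the following. The payload handling is an assumption made for illustration, not the actual NFC implementation.

```python
# Hedged sketch: the device is judged to be attached when a polling response
# was received and the Check Response carries the camera's specific
# information. The plain string comparison is an illustrative simplification.

SPECIFIC_INFO = "ABC DSC / Lens-Style Camera"

def judge_attachment(check_response):
    """check_response: payload text read from the tag, or None when no
    polling response was received (no adjacent device)."""
    if check_response is None:
        return False                      # no adjacent device: not attached
    # Attached only when the specific information is included in the response.
    return SPECIFIC_INFO in check_response
```

A response carrying "ABC DSC / Lens-Style Camera" would thus be judged as the mounted lens-style camera, while any other payload, or no polling response at all, would not.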
- FIG. 17 is a diagram illustrating a relationship example between the posture of the imaging apparatus 101 and a captured image generated by the imaging unit 110 according to the third embodiment of the present technology.
- FIG. 17 a shows a captured image 501 generated by the imaging unit 110 when the imaging apparatus 101 is in a normal posture.
- Here, the normal posture means, for example, a posture in which the attachment member 171 is on the upper side in the vertical direction and the attachment member 172 is on the lower side in the vertical direction (as shown in FIG. 17 a and in a and b of FIG. 1).
- In this case, a captured image 501 having a horizontally long shape (a shape that is long in the horizontal direction) is generated.
- FIG. 17 b shows a captured image 502 generated by the imaging unit 110 when the imaging apparatus 101 in the posture shown in FIG. 17 a is rotated 90 degrees with the optical axis direction of the imaging apparatus 101 as the rotation axis. That is, the captured image 502 generated by the imaging unit 110 when the imaging device 101 in the posture illustrated in FIG. 17 a is rotated 90 degrees in the direction of the arrow 511 is illustrated. In this case, a captured image 502 having a vertically long shape (a shape that is long in the vertical direction) is generated.
- FIG. 17 c illustrates a captured image 503 generated by the imaging unit 110 when the imaging apparatus 101 in the posture illustrated in FIG. 17 a is rotated 180 degrees with the optical axis direction of the imaging apparatus 101 as the rotation axis. That is, the captured image 503 generated by the imaging unit 110 when the imaging device 101 in the posture illustrated in FIG. 17 a is rotated 180 degrees in the direction of the arrow 512 is illustrated. In this case, a captured image 503 having a horizontally long shape (a shape that is long in the horizontal direction) is generated. However, the subject included in the captured image 503 corresponds to the subject included in the captured image 501 shown in FIG. 17 a rotated by 180 degrees.
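The relationship of FIG. 17 between the rotation about the optical axis and the shape of the generated captured image can be expressed as a small helper; this is only a sketch of the stated relationship, and the function name is an assumption.

```python
# Sketch of FIG. 17: rotating the imaging device about its optical axis
# changes the orientation of the generated captured image. At 0 or 180
# degrees the image is horizontally long (at 180 degrees the subject is
# upside down); at 90 degrees it is vertically long.

def captured_image_orientation(rotation_deg):
    if rotation_deg % 180 == 0:
        return "horizontally long"   # FIG. 17 a (0 deg) and FIG. 17 c (180 deg)
    return "vertically long"         # FIG. 17 b (90 deg)
```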
- FIG. 18 is a diagram illustrating a transition example of the live view image and the operation target displayed on the input / output unit 240 according to the third embodiment of the present technology.
- FIG. 18 illustrates an example in which a captured image generated by the imaging unit 110 is displayed as a live view image when the imaging apparatus 101 is installed so as to have the posture illustrated in FIG. 17 a.
- FIG. 18 a shows a display example of the live view image 520 and the operation objects 521 to 527 when the longitudinal direction of the information processing apparatus 200 is substantially the same as the horizontal direction (the direction perpendicular to the gravity direction).
- FIG. 18 b shows a display example of the live view image 530 and the operation objects 531 to 537 when the longitudinal direction of the information processing apparatus 200 is substantially the same as the vertical direction (the direction parallel to the gravity direction).
- The operation objects 521 to 524, 526, and 527 shown in FIG. 18 a and the operation objects 531 to 534, 536, and 537 shown in FIG. 18 b correspond to the operation objects 311 to 315 and 317 described above. Further, the operation object 525 shown in FIG. 18 a and the operation object 535 shown in FIG. 18 b correspond to the operation objects 305 and 306 shown in FIG. 7.
- Specifically, the control unit 250 determines the posture of the information processing apparatus 200 based on the posture information output from the posture detection unit 210. Then, as illustrated in FIG. 18 a, the control unit 250 displays the live view image 520 and the operation objects 521 to 527 when the longitudinal direction of the information processing apparatus 200 is substantially the same as the horizontal direction. On the other hand, as illustrated in FIG. 18 b, the control unit 250 displays the live view image 530 and the operation objects 531 to 537 when the longitudinal direction of the information processing apparatus 200 is substantially the same as the vertical direction. In this case, the control unit 250 reduces the live view image 520 shown in FIG. 18 a in its horizontal width (size in the horizontal direction) and displays it as the live view image 530.
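The reduction described above (fitting the live view image 520 of FIG. 18 a into the narrower portrait layout of FIG. 18 b) can be sketched as follows; the panel width, dimensions, and helper name are assumptions used only for illustration.

```python
# Illustrative sketch: in the portrait posture the live view image is
# reduced so that its horizontal width fits the narrower panel, preserving
# the aspect ratio. Dimensions are hypothetical example values.

def live_view_size(longitudinal_is_horizontal, image_w, image_h, portrait_panel_w):
    if longitudinal_is_horizontal:
        return (image_w, image_h)            # FIG. 18 a: full-width layout
    scale = portrait_panel_w / image_w       # FIG. 18 b: shrink to panel width
    return (portrait_panel_w, round(image_h * scale))
```

For example, a 1600 x 900 live view image displayed on an 800-wide portrait panel would be reduced to 800 x 450.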
- A display example when the information processing apparatus 200 shown in FIG. 18 a is rotated 180 degrees about an axis orthogonal to the display surface of the input / output unit 240 is shown in FIG. 19. Similarly, a display example when the information processing apparatus 200 illustrated in FIG. 18 b is rotated 180 degrees about an axis orthogonal to the display surface of the input / output unit 240 is illustrated in FIG. 19 c.
- Display screen display example: FIGS. 19 and 20 are diagrams illustrating display examples of the display screen displayed on the input / output unit 240 according to the third embodiment of the present technology. FIGS. 19 and 20 show display examples of the display screen when the imaging apparatus 101 is not attached to the information processing apparatus 200.
- FIGS. 19 a to 19 d show transition examples of display screens that display the captured image generated by the imaging unit 110 as a live view image when the imaging apparatus 101 is installed so as to have the posture shown in FIG. 17 a.
- In FIGS. 19 a to 19 d, the live view image is displayed so that the vertical direction of the imaging device 101 is the same as the vertical direction of the live view image displayed on the input / output unit 240 of the information processing device 200.
- Here, the vertical direction of the imaging apparatus 101 means, for example, the direction connecting the attachment members 171 and 172, and the horizontal direction of the imaging apparatus 101 means the direction orthogonal to that vertical direction. Note that a in FIG. 19 corresponds to b in FIG. 18, and b in FIG. 19 corresponds to a in FIG. 18.
- In FIGS. 19 a to 19 d, the posture of the information processing apparatus 200 is changed by rotating the information processing apparatus 200 about an axis orthogonal to the display surface of the input / output unit 240 of the information processing apparatus 200.
- When the posture of the information processing apparatus 200 is changed, the display mode of the live view image displayed on the input / output unit 240 is changed based on the change in posture so that the vertical direction of the imaging device 101 and the vertical direction of the live view image remain the same. Further, the display mode of each operation target is changed based on the change in the attitude of the information processing apparatus 200.
- the display shown in FIGS. 19a to 19d is also performed when the picked-up image generated by the image pickup unit 110 is displayed as a live view image.
- In this case, the live view image is displayed so that the upper side in the vertical direction of the imaging device 101 is opposite to the upper side in the vertical direction of the live view image displayed on the input / output unit 240 of the information processing device 200. Thereby, the live view image can be displayed on the input / output unit 240 of the information processing device 200 with the subject included in the live view image in the correct vertical orientation.
- The live view image is displayed so that the horizontal direction of the imaging device 101 is the same as the vertical direction of the live view image displayed on the input / output unit 240 of the information processing device 200. When the posture of the information processing device 200 is changed, the display mode of the live view image is changed based on the change in posture so that the horizontal direction of the imaging device 101 and the vertical direction of the live view image remain the same. Further, the display mode of each operation target is changed based on the change in the attitude of the information processing apparatus 200.
- In this way, a captured image generated by the imaging unit 110 is displayed as a live view image. This is the same as the transition example of the display screen shown in FIG. 19. That is, even when the imaging apparatus 101 is installed rotated 90 degrees (for example, when the imaging apparatus 101 is installed in the posture shown in FIG. 17 b), the live view image can be displayed with the subject included in the live view image in the correct vertical orientation.
- the display mode of the live view image and the operation target can be changed based on the change in the posture of the information processing device 200.
- On the other hand, when the imaging apparatus 101 is mounted on the information processing apparatus 200, it is assumed that the user holds the imaging apparatus 101 and the information processing apparatus 200 in the same way regardless of the orientation of the imaging apparatus 101 and the information processing apparatus 200. For this reason, if the display mode of the live view image and the operation target is changed based on a change in the attitude of the information processing apparatus 200, it may become difficult for the user to operate. Therefore, in the third embodiment of the present technology, when the imaging apparatus 101 is attached to the information processing apparatus 200, the display mode of the live view image and the operation target is not changed based on a change in the posture of the information processing apparatus 200.
- FIG. 21 is a diagram illustrating a display example of the display screen displayed on the input / output unit 240 according to the third embodiment of the present technology.
- FIG. 21 shows a display example of the display screen when the imaging apparatus 101 is attached to the information processing apparatus 200.
- FIG. 21 illustrates a display example of the display screen when the user performs each operation of the imaging apparatus 101 using the operation target displayed on the input / output unit 240.
- FIG. 21 a is the same as FIG. 18 a except that the attachment members 171 and 172 are attached to the information processing apparatus 200. For this reason, the same reference numerals are used.
- FIG. 21 b shows a display example when the information processing apparatus 200 shown in FIG. 21 a is rotated 90 degrees (in the direction of the arrow 550) about an axis orthogonal to the display surface of the input / output unit 240 as the rotation axis. FIG. 21 b is the same as FIG. 21 a except that the live view image 540 is displayed instead of the live view image 520. For this reason, the same reference numerals are used.
- As shown in FIGS. 21 a and 21 b, the display mode of the operation target displayed on the input / output unit 240 is not changed even when the attitude of the information processing apparatus 200 changes. Thereby, the operation target is displayed in the same display mode regardless of the orientations of the imaging device 101 and the information processing device 200.
- the display mode of the live view image displayed on the input / output unit 240 is not changed even when the attitude of the imaging apparatus 101 changes. Thereby, a live view image can be confirmed in the same display mode regardless of the orientations of the imaging apparatus 101 and the information processing apparatus 200.
- For example, in the state illustrated in FIG. 21 a, the user holding the imaging apparatus 101 and the information processing apparatus 200 can operate the operation target 527 (zoom operation) with the thumb of the right hand. Similarly, in the state illustrated in FIG. 21 b, the user holding the imaging device 101 and the information processing device 200 can operate the operation target 527 (zoom operation) with the thumb of the right hand. Accordingly, the user can easily perform a zoom operation with the thumb of the right hand regardless of the posture of the imaging device 101 and the information processing device 200.
- Similarly, in the state illustrated in FIG. 21 a, the user holding the imaging apparatus 101 and the information processing apparatus 200 can operate the operation target 522 (shutter operation) with the thumb of the right hand, and in the state illustrated in FIG. 21 b, the user can likewise operate the operation target 522 (shutter operation) with the thumb of the right hand. Accordingly, the user can easily perform the shutter operation with the thumb of the right hand regardless of the posture of the imaging apparatus 101 and the information processing apparatus 200.
- As described above, when the imaging apparatus 101 is attached, the display mode of the operation target and the live view image displayed on the input / output unit 240 is not changed. Accordingly, it is possible to prevent the user from finding the device difficult to operate because the position of the operation target for performing the zoom operation or the shutter operation changes every time the attitude of the information processing apparatus 200 changes. Further, since the live view image does not change each time the posture of the information processing apparatus 200 changes, the user can be prevented from finding it difficult to check the subject. That is, operability during the imaging operation can be improved.
- In the above, the example in which the display mode of all the operation objects is not changed when the imaging apparatus 101 is mounted on the information processing apparatus 200 has been described. However, the display mode of some operation objects (for example, operation objects that are rarely used during the imaging operation) may be changed, while the display mode of the other operation objects (for example, operation objects frequently used during the imaging operation) is not changed. Also, at least one operation object may be changed only in its orientation without changing its position. For example, the orientation of the operation target 522 may be changed based on a change in the posture of the information processing apparatus 200 without changing its position on the display surface of the input / output unit 240. In this case, the camera mark of the operation target 522 can be changed to face upward based on a change in the attitude of the information processing apparatus 200.
- Further, the user may set, for each operation target, whether or not to change its display mode, and whether or not to change may be determined based on the setting content.
- FIG. 22 is a flowchart illustrating an example of a processing procedure of display control processing by the information processing device 200 according to the third embodiment of the present technology.
- FIG. 22 shows an example of controlling whether to switch the display mode based on a change in the attitude of the information processing device 200, depending on whether or not the imaging device 101 is attached to the information processing device 200.
- First, the control unit 250 causes the input / output unit 240 to display a display screen including the live view image and the operation target (step S931). For example, the display screen shown in FIG. 18 is displayed on the input / output unit 240.
- Subsequently, the control unit 250 detects a change in the posture of the information processing apparatus 200 based on the posture information from the posture detection unit 210 (step S932). Then, the control unit 250 determines whether or not the change is equal to or greater than a threshold value (step S933). For example, as shown in FIGS. 18 to 20, when the information processing apparatus 200 is rotated by a predetermined angle or more about an axis orthogonal to the display surface of the input / output unit 240, the change is determined to be equal to or greater than the threshold value. When the change is less than the threshold value (step S933), the process proceeds to step S936.
- When the change is equal to or greater than the threshold value (step S933), the control unit 250 determines whether or not the imaging device 101 is attached to the information processing device 200 (step S934). If the imaging device 101 is mounted on the information processing device 200 (step S934), the process proceeds to step S936.
- If the imaging device 101 is not mounted on the information processing device 200 (step S934), the control unit 250 changes the live view image and the operation target based on the attitude of the information processing apparatus 200 and displays them on the input / output unit 240 (step S935). For example, as shown in FIGS. 18 to 20, the live view image and the operation target are changed and displayed on the input / output unit 240.
- Then, the control unit 250 determines whether or not there has been an instruction to end the imaging operation (step S936). If there is no instruction to end the imaging operation, the process returns to step S931; if an instruction to end the imaging operation is received, the operation of the display control process ends.
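The decision logic of FIG. 22 (steps S933 to S935) can be condensed into a sketch; the threshold value and function name are assumptions used only for illustration.

```python
# Hedged sketch of FIG. 22: a posture change triggers a redraw only when it
# is at least the threshold (step S933) AND the imaging device is not
# attached (step S934); otherwise the display mode is kept (step S936).

THRESHOLD_DEG = 45  # assumed value for the "predetermined angle"

def should_change_display(rotation_deg, camera_attached):
    if abs(rotation_deg) < THRESHOLD_DEG:
        return False          # change below threshold: proceed to step S936
    if camera_attached:
        return False          # attached: display mode intentionally unchanged
    return True               # step S935: redraw live view and operation targets
```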
- As described above, the control unit 250 causes the input / output unit 240 to display a display screen including an operation target for operating the imaging apparatus 101. When the imaging apparatus 101 is not attached to the information processing apparatus 200, the control unit 250 performs control to change the display mode of the live view image and the operation target based on the change in the attitude of the information processing apparatus 200. On the other hand, when the imaging apparatus 101 is attached to the information processing apparatus 200, the control unit 250 performs control so as not to change the display mode of the live view image and the operation target based on the change in the attitude of the information processing apparatus 200.
- Note that the switching of the display mode of the display screen based on the distance between the imaging device and the information processing device or the mounting position of the imaging device, shown in the first and second embodiments of the present technology, may be combined with the display control processing shown in FIG. 22.
- In the embodiments of the present technology, the cylindrical (columnar) imaging devices 100 and 101 have been described as an example. However, the embodiments of the present technology can also be applied to imaging devices of other shapes that can be attached to other devices.
- Also, the information processing apparatus 200 such as a smartphone or a tablet terminal has been described as an example. However, the embodiments of the present technology can also be applied to other apparatuses that can be connected to an imaging apparatus using wireless communication.
- Further, the relationship between the imaging device and the information processing device has been described as an example, but the embodiments of the present technology can also be applied to other devices.
- the embodiments of the present technology can also be applied to a light emitting device and an imaging device (for example, a digital still camera) used in still image shooting.
- the embodiment of the present technology can also be applied to a sound collection device (for example, a microphone) and an imaging device (for example, a camera-integrated recorder) used in moving image shooting.
- the embodiment of the present technology can also be applied to an audio output device (for example, a speaker) and an imaging device (for example, a camera-integrated recorder) used in moving image reproduction.
- the embodiments of the present technology can be applied to a sound collection device (for example, a microphone) and an information processing device.
- Similarly, the embodiment of the present technology can also be applied to an audio output device (for example, a speaker) and an information processing device used in moving image reproduction. That is, the embodiment of the present technology can be applied to an information processing apparatus (an example of an electronic device) including a control unit that performs control to switch a display mode of a display screen for operating another apparatus based on a relative positional relationship with the other apparatus.
- the processing procedures described in the above embodiments may be regarded as a method comprising this series of procedures, as a program for causing a computer to execute the series of procedures, or as a recording medium storing that program.
- as the recording medium, for example, a CD (Compact Disc), an MD (MiniDisc), a DVD (Digital Versatile Disc), a memory card, or a Blu-ray (registered trademark) Disc can be used.
- An information processing apparatus including a control unit that performs control to switch a display mode of a display screen for operating the imaging apparatus based on a relative positional relationship with the imaging apparatus.
- the control unit performs control to switch a display mode of the display screen based on a distance between the information processing device and the imaging device.
- the control unit performs control to switch a display mode of the display screen based on whether or not the imaging apparatus is attached to the information processing apparatus.
- the control unit performs control to switch the display mode of the display screen based on whether or not the imaging apparatus is mounted on the display surface of the information processing apparatus.
- the control unit switches the display mode of the display screen based on the position of the imaging device on the display surface of the information processing device when the imaging device is mounted on the display surface (the information processing apparatus according to (4)).
- the control unit displays the display screen including an operation target for operating the imaging apparatus, and performs control to change the display mode of the operation target based on the relative positional relationship (the information processing apparatus according to any one of (1) to (5)).
- the control unit displays the display screen including an operation target for operating the imaging apparatus, changes the display mode of the operation target based on a change in the attitude of the information processing apparatus when the imaging apparatus is not attached to the information processing apparatus, and performs control so as not to change the display mode of the operation target based on a change in the attitude of the information processing apparatus when the imaging apparatus is attached (the information processing apparatus according to any one of (1) to (6)).
- An imaging apparatus including a control unit that performs control related to an imaging operation based on an operation input performed on the information processing apparatus on which is displayed a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus.
- the imaging apparatus according to (8) wherein a display mode of the display screen is switched based on a distance between the information processing apparatus and the imaging apparatus.
- the imaging apparatus according to (8) wherein a display mode of the display screen is switched based on whether the imaging apparatus is attached to the information processing apparatus.
- the imaging device according to (10) wherein a display mode of the display screen is switched based on whether or not the imaging device is mounted on a display surface of the information processing device.
- when the imaging device is mounted on the display surface of the information processing device, the display mode of the display screen is switched based on the position of the imaging device on the display surface (the imaging device according to (11)).
- the information processing apparatus displays the display screen including an operation target for operating the imaging device, and changes the display mode of the operation target based on the relative positional relationship (the imaging device according to any one of (8) to (12)).
- the information processing apparatus displays the display screen including an operation target for operating the imaging apparatus, changes the display mode of the operation target based on a change in the attitude of the information processing apparatus when the imaging apparatus is not attached to the information processing apparatus, and does not change the display mode of the operation target based on a change in the attitude of the information processing apparatus when the imaging apparatus is attached (the imaging apparatus according to any one of (8) to (13)).
- An imaging apparatus that is connected to the information processing apparatus using wireless communication and performs control related to an imaging operation based on an operation input performed in the information processing apparatus;
- An imaging system comprising: an information processing apparatus that performs control to switch a display mode of a display screen for operating the imaging apparatus based on a relative positional relationship with the imaging apparatus.
- a control method for an information processing apparatus that performs control for switching a display mode of a display screen for operating the imaging apparatus based on a relative positional relationship with the imaging apparatus.
- a control method for an imaging apparatus that performs control related to an imaging operation based on an operation input performed on the information processing apparatus on which a display screen whose display mode is switched based on a relative positional relationship between the imaging apparatus and the information processing apparatus is displayed.
- a program that causes a computer to execute control for switching the display mode of a display screen for operating the imaging device based on the relative positional relationship with the imaging device.
- a program that causes a computer to execute control related to an imaging operation based on an operation input performed on the information processing apparatus on which is displayed a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Accessories Of Cameras (AREA)
Abstract
Description
1. First embodiment (example of switching the display mode of the display screen based on the distance between the imaging device and the information processing device)
2. Second embodiment (example of switching the display mode of the display screen based on the attachment position of the imaging device)
3. Third embodiment (example of controlling whether the display mode is switched in response to changes in the attitude of the information processing device, depending on whether the imaging device is attached to the information processing device)
[External configuration example of the imaging device]
Fig. 1 shows the external configuration of the imaging device 100 according to the first embodiment of the present technology. Part a of Fig. 1 is a front view of the imaging device 100, and part b of Fig. 1 is a side view (as seen from the direction of arrow A).
Figs. 2 and 3 show the external configuration when the imaging device 100 according to the first embodiment of the present technology is attached to the information processing device 200.
Fig. 4 is a block diagram showing a functional configuration example of the imaging device 100 and the information processing device 200 according to the first embodiment of the present technology. The imaging system formed by the imaging device 100 and the information processing device 200 is an example of the imaging system recited in the claims.
Fig. 5 shows a usage example of the imaging device 100 and the information processing device 200 according to the first embodiment of the present technology.
Here, a method of obtaining the distance between the imaging device 100 and the information processing device 200 will be described.
For example, the distance between the imaging device 100 and the information processing device 200 can be estimated from the received radio signal strength. A table indicating the relationship between received signal strength and distance is created in advance and stored in the storage unit 270. The control unit 250 of the information processing device 200 then retrieves from this table the distance corresponding to the received signal strength obtained by the wireless communication unit 230, and uses that value as the distance between the imaging device 100 and the information processing device 200.
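As a rough illustration, the table lookup described above might be sketched as follows. The table values and the linear interpolation between entries are invented for illustration; a real table would be calibrated per device in advance and stored in the storage unit 270.

```python
# Sketch of estimating device-to-device distance from received radio
# signal strength (RSSI) via a pre-built lookup table, as described
# above. All values are hypothetical.

# (rssi_dbm, distance_cm) pairs, strongest signal first
RSSI_TO_DISTANCE = [
    (-30, 2),
    (-45, 10),
    (-60, 50),
    (-75, 200),
    (-90, 1000),
]

def estimate_distance_cm(rssi_dbm):
    """Return the distance (cm) for a measured RSSI, interpolating
    linearly between neighbouring table entries."""
    if rssi_dbm >= RSSI_TO_DISTANCE[0][0]:
        return RSSI_TO_DISTANCE[0][1]
    for (r1, d1), (r2, d2) in zip(RSSI_TO_DISTANCE, RSSI_TO_DISTANCE[1:]):
        if rssi_dbm >= r2:
            # linear interpolation between the two surrounding entries
            t = (r1 - rssi_dbm) / (r1 - r2)
            return d1 + t * (d2 - d1)
    # weaker than the weakest tabulated signal: clamp to the last entry
    return RSSI_TO_DISTANCE[-1][1]
```

The estimated value would then stand in for the distance between the imaging device 100 and the information processing device 200 in the display control described below.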
For example, the distance between the imaging device 100 and the information processing device 200 can also be estimated using the input/output unit 240. Assume here that a projected capacitive touch panel is used as the input/output unit 240. In this case, the mounting surface 175 of the imaging device 100 (shown in part b of Fig. 1) is made of a material that a projected capacitive touch panel responds to, for example conductive silicone rubber.
For example, attachment of the imaging device 100 can also be detected using a member provided for detecting that the imaging device 100 has been attached to the information processing device 200. As such a member, for example, a switch can be provided on at least one of the imaging device 100 and the information processing device 200.
Fig. 6 shows an example of switching the display screen shown on the input/output unit 240 in the first embodiment of the present technology. Fig. 6 covers both the case where the distance between the imaging device 100 and the information processing device 200 is equal to or greater than a threshold and the case where the distance is less than the threshold.
Figs. 7 to 9 show display examples of the screens shown on the input/output unit 240 in the first embodiment of the present technology.
Fig. 7 shows an example of switching the display mode of an operation object 301 for turning the light emitting unit 291 of the information processing device 200 on and off, based on the distance between the imaging device 100 and the information processing device 200.
Fig. 8 shows an example of switching the display mode of an operation object 317 for performing a zoom operation of the imaging device 100, based on the distance between the imaging device 100 and the information processing device 200.
Fig. 9 shows an example of switching the display mode of operation objects 321 to 327 for performing various operations of the imaging device 100, based on the distance between the imaging device 100 and the information processing device 200.
Fig. 10 is a flowchart showing an example of the display control procedure performed by the information processing device 200 in the first embodiment of the present technology. Fig. 10 shows an example of switching the display mode of the input/output unit 240 based on the distance between the imaging device 100 and the information processing device 200.
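In outline, the distance-based switching of Fig. 10 reduces to a threshold comparison. A minimal sketch, where the threshold value and mode names are hypothetical placeholders for the screen layouts of Figs. 6 to 9:

```python
# Rough sketch of the display control of Fig. 10: the display mode of
# the input/output unit is switched depending on whether the distance
# between the imaging device and the information processing device is
# below a threshold. The threshold and the mode names are illustrative.

ATTACH_THRESHOLD_CM = 5  # hypothetical "devices close together" threshold

def select_display_mode(distance_cm):
    """Return the display mode for the operation screen."""
    if distance_cm < ATTACH_THRESHOLD_CM:
        # devices close together or attached: compact operation objects
        return "attached_ui"
    # devices apart (remote operation): full operation objects
    return "remote_ui"
```

The control unit would re-evaluate this whenever a new distance estimate arrives and redraw the screen only when the returned mode changes.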
The first embodiment of the present technology showed an example of switching the display mode of the display screen based on the distance between the imaging device and the information processing device. Now assume that this distance is less than the threshold and that the imaging device is mounted on the display surface of the input/output unit of the information processing device. In this case, the user cannot see the portion of the display screen covered by the imaging device. In such a case, it is therefore preferable to switch the display mode of the display screen based on the attachment position.
First, a method of determining the attachment position of the imaging device 100 will be described.
For example, detection members for detecting attachment of the imaging device 100 are provided on the information processing device 200, for example at both ends (in the longitudinal direction of the information processing device 200) of the front surface (on which the input/output unit 240 is provided) and of the rear surface (on which the imaging unit 292 is provided). Switches or sensors for detecting attachment of the imaging device 100 can be used as the detection members. Using these members, it can be detected on which part of the front or rear surface of the information processing device 200 the imaging device 100 has been mounted.
For example, when the input/output unit 240 is an optical-sensor touch panel, the position and size of the imaging device 100 mounted on the display surface of the input/output unit 240 can be detected. Even when the input/output unit 240 is a projected capacitive touch panel, the position and size of the mounted imaging device 100 can likewise be detected, as described above, by making the mounting surface 175 of the imaging device 100 (shown in part b of Fig. 1) of a material that a projected capacitive touch panel responds to.
Fig. 11 is a flowchart showing an example of the attachment-position determination procedure performed by the information processing device 200 in the second embodiment of the present technology. Fig. 11 shows an example in which the imaging device 100 is determined to be attached to the information processing device 200 when the distance between them is less than a threshold (for example, a few centimeters), and in which the attachment position on the display surface of the input/output unit 240 is determined to be either the left side or the right side of the display surface.
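The left/right determination of Fig. 11 might look like this in outline. The centre-line comparison against the detected contact position, the threshold, and the display width are assumptions made for illustration, not details taken from the flowchart itself:

```python
# Sketch of the attachment-position determination of Fig. 11. The
# device is treated as attached when the estimated distance is below a
# threshold; if a touch-panel contact is detected, its centre is
# compared with the middle of the display surface to classify the
# attachment side. All numeric values are illustrative.

ATTACH_THRESHOLD_CM = 5      # hypothetical attachment threshold
DISPLAY_WIDTH_PX = 1280      # hypothetical display-surface width

def determine_attachment(distance_cm, contact_center_x=None):
    """Return 'not_attached', 'rear', 'left', or 'right'."""
    if distance_cm >= ATTACH_THRESHOLD_CM:
        return "not_attached"
    if contact_center_x is None:
        # attached, but no panel contact: mounted on the rear surface
        return "rear"
    return "left" if contact_center_x < DISPLAY_WIDTH_PX / 2 else "right"
```

The result would then drive the layout switching of Figs. 12 to 14, moving operation objects away from the covered region.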
Figs. 12 to 14 show display examples of the screens shown on the input/output unit 240 in the second embodiment of the present technology.
Fig. 12 shows an example of switching the display mode of the operation objects based on which surface of the information processing device 200 the imaging device 100 is attached to.
Figs. 13 and 14 show examples of switching the display mode of the operation objects based on the position at which the imaging device 100 is mounted on the display surface of the input/output unit 240. Fig. 14 shows the entire display surface of the input/output unit 240 corresponding to Fig. 13.
Fig. 15 is a flowchart showing an example of the display control procedure performed by the information processing device 200 in the second embodiment of the present technology. Fig. 15 shows an example of switching the display mode of the input/output unit 240 based on the position at which the imaging device 100 is attached to the information processing device 200, for the case where the imaging device 100 has been determined to be attached.
The first and second embodiments of the present technology showed examples of switching the display mode of the display screen based on the distance between the imaging device and the information processing device, or on the attachment position of the imaging device. As described above, the information processing devices of the first and second embodiments can also switch the display mode of the display screen based on the attitude of the information processing device, for example depending on whether it is in a landscape orientation (its longitudinal direction roughly parallel to the horizontal) or a portrait orientation (its longitudinal direction roughly parallel to the direction of gravity). However, when an imaging operation is performed with the imaging device attached to the information processing device, the devices are expected to be used differently than when the imaging operation is performed with the devices apart. When performing imaging operations with the imaging device and the information processing device, it is therefore important to provide a user interface appropriate to the mode of use.
Fig. 16 is a block diagram showing a functional configuration example of the imaging device 101 and the information processing device 200 in the third embodiment of the present technology. The information processing device 200 is the same as that shown in Fig. 4, and the imaging device 101 is a modification of the imaging device 100 shown in Fig. 4. Parts common to the imaging device 100 and the information processing device 200 shown in Fig. 4 are therefore given the same reference numerals, and part of their description is omitted.
Here, detection methods other than the attachment detection methods described above will be described. For example, attachment of the imaging device 101 to the information processing device 200 can be detected using short-range wireless communication. An example using NFC (Near Field Communication) as the short-range wireless communication is shown here.
The above showed an example of detecting attachment of the imaging device using NFC. When NFC is used to detect attachment in this way, the attachment may be detected using a Check Command and a Check Response.
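One way this NFC-based detection could work is periodic polling: a Check Command is issued at intervals, and receipt of a Check Response means the counterpart device is within NFC range, i.e. attached. A schematic sketch, with the actual NFC exchange abstracted behind a hypothetical `send_check_command` callable (not a real NFC library API):

```python
# Schematic sketch of attachment detection via NFC polling, as
# described above: a Check Command is issued repeatedly, and receipt
# of a Check Response is taken to mean the imaging device is within
# NFC range, i.e. attached. `send_check_command` is a hypothetical
# stand-in for the real NFC exchange; it returns the response bytes,
# or None when no device answers.

def detect_attachment(send_check_command, attempts=3):
    """Return True if any polling attempt receives a Check Response."""
    for _ in range(attempts):
        response = send_check_command()
        if response is not None:
            return True  # Check Response received: device attached
    return False  # no response within the polling window: not attached
```

Because NFC range is only a few centimeters, a successful exchange doubles as a coarse proximity test, which is what makes it usable as an attachment signal here.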
Fig. 17 shows an example of the relationship between the attitude of the imaging device 101 and the captured image generated by the imaging unit 110 in the third embodiment of the present technology.
Fig. 18 shows a transition example of the live view image and the operation objects displayed on the input/output unit 240 in the third embodiment of the present technology. Fig. 18 shows an example in which the captured image generated by the imaging unit 110, with the imaging device 101 set in the attitude shown in part a of Fig. 17, is displayed as a live view image.
Figs. 19 and 20 show display examples of the screens shown on the input/output unit 240 in the third embodiment of the present technology, for the case where the imaging device 101 is not attached to the information processing device 200.
Fig. 21 shows a display example of the screen shown on the input/output unit 240 in the third embodiment of the present technology, for the case where the imaging device 101 is attached to the information processing device 200 and the user performs each operation of the imaging device 101 using the operation objects displayed on the input/output unit 240.
Fig. 22 is a flowchart showing an example of the display control procedure performed by the information processing device 200 in the third embodiment of the present technology. Fig. 22 shows an example of controlling whether the display mode is switched in response to changes in the attitude of the information processing device 200, depending on whether the imaging device 101 is attached to the information processing device 200.
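The decision of Fig. 22 amounts to gating orientation-driven UI updates on the attachment state. A minimal sketch, where the orientation names are illustrative labels rather than values defined anywhere in the description:

```python
# Sketch of the control of Fig. 22: the operation screen follows
# changes in the attitude (portrait/landscape) of the information
# processing device only while the imaging device is NOT attached;
# while it is attached, attitude changes are ignored and the current
# screen layout is kept. Orientation names are illustrative.

def next_ui_orientation(current_ui, device_orientation, camera_attached):
    """Return the UI orientation after an attitude-change event."""
    if camera_attached:
        return current_ui          # suppress rotation while attached
    return device_orientation      # follow the device attitude otherwise
```

This matches the behaviour described for the operation objects: their display mode changes with the device attitude only when the imaging device 101 is detached.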
(1)
An information processing apparatus comprising a control unit that performs control to switch the display mode of a display screen for operating an imaging apparatus, based on the relative positional relationship with the imaging apparatus.
(2)
The information processing apparatus according to (1), wherein the control unit performs control to switch the display mode of the display screen based on the distance between the information processing apparatus and the imaging apparatus.
(3)
The information processing apparatus according to (1), wherein the control unit performs control to switch the display mode of the display screen based on whether or not the imaging apparatus is attached to the information processing apparatus.
(4)
The information processing apparatus according to (3), wherein the control unit performs control to switch the display mode of the display screen based on whether or not the imaging apparatus is attached to the display surface of the information processing apparatus.
(5)
The information processing apparatus according to (4), wherein, when the imaging apparatus is attached to the display surface of the information processing apparatus, the control unit performs control to switch the display mode of the display screen based on the position of the imaging apparatus on the display surface.
(6)
The information processing apparatus according to any one of (1) to (5), wherein the control unit displays the display screen including an operation object for operating the imaging apparatus, and performs control to change the display mode of the operation object based on the relative positional relationship.
(7)
The information processing apparatus according to any one of (1) to (6), wherein the control unit displays the display screen including an operation object for operating the imaging apparatus, changes the display mode of the operation object based on a change in the attitude of the information processing apparatus when the imaging apparatus is not attached to the information processing apparatus, and performs control so as not to change the display mode of the operation object based on a change in the attitude of the information processing apparatus when the imaging apparatus is attached.
(8)
An imaging apparatus comprising a control unit that performs control related to an imaging operation based on an operation input performed on an information processing apparatus on which is displayed a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus.
(9)
The imaging apparatus according to (8), wherein the display mode of the display screen is switched based on the distance between the information processing apparatus and the imaging apparatus.
(10)
The imaging apparatus according to (8), wherein the display mode of the display screen is switched based on whether or not the imaging apparatus is attached to the information processing apparatus.
(11)
The imaging apparatus according to (10), wherein the display mode of the display screen is switched based on whether or not the imaging apparatus is attached to the display surface of the information processing apparatus.
(12)
The imaging apparatus according to (11), wherein, when the imaging apparatus is attached to the display surface of the information processing apparatus, the display mode of the display screen is switched based on the position of the imaging apparatus on the display surface.
(13)
The imaging apparatus according to any one of (8) to (12), wherein the information processing apparatus displays the display screen including an operation object for operating the imaging apparatus and changes the display mode of the operation object based on the relative positional relationship.
(14)
The imaging apparatus according to any one of (8) to (13), wherein the information processing apparatus displays the display screen including an operation object for operating the imaging apparatus, changes the display mode of the operation object based on a change in the attitude of the information processing apparatus when the imaging apparatus is not attached, and does not change the display mode of the operation object based on a change in the attitude of the information processing apparatus when the imaging apparatus is attached.
(15)
An imaging system comprising:
an imaging apparatus that is connected to an information processing apparatus using wireless communication and whose imaging operation is controlled based on operation inputs performed on the information processing apparatus; and
an information processing apparatus that performs control to switch the display mode of a display screen for operating the imaging apparatus based on the relative positional relationship with the imaging apparatus.
(16)
A control method for an information processing apparatus, in which control is performed to switch the display mode of a display screen for operating an imaging apparatus based on the relative positional relationship with the imaging apparatus.
(17)
A control method for an imaging apparatus, in which control related to an imaging operation is performed based on an operation input performed on an information processing apparatus on which is displayed a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus.
(18)
A program that causes a computer to execute control for switching the display mode of a display screen for operating an imaging apparatus based on the relative positional relationship with the imaging apparatus.
(19)
A program that causes a computer to execute control related to an imaging operation based on an operation input performed on an information processing apparatus on which is displayed a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus.
110 Imaging unit
120 Image processing unit
130 Storage unit
140 Control unit
150 Wireless communication unit
160 Lens barrel
171 Attachment member
172 Attachment member
175 Mounting surface
180 Attitude detection unit
200 Information processing device
210 Attitude detection unit
220 Operation reception unit
230 Wireless communication unit
240 Input/output unit
241 Input unit
242 Display unit
250 Control unit
260 Image processing unit
270 Storage unit
280 Audio output unit
291 Light emitting unit
292 Imaging unit
Claims (19)
- An information processing apparatus comprising a control unit that performs control to switch the display mode of a display screen for operating an imaging apparatus, based on the relative positional relationship with the imaging apparatus.
- The information processing apparatus according to claim 1, wherein the control unit performs control to switch the display mode of the display screen based on the distance between the information processing apparatus and the imaging apparatus.
- The information processing apparatus according to claim 1, wherein the control unit performs control to switch the display mode of the display screen based on whether or not the imaging apparatus is attached to the information processing apparatus.
- The information processing apparatus according to claim 3, wherein the control unit performs control to switch the display mode of the display screen based on whether or not the imaging apparatus is attached to the display surface of the information processing apparatus.
- The information processing apparatus according to claim 4, wherein, when the imaging apparatus is attached to the display surface of the information processing apparatus, the control unit performs control to switch the display mode of the display screen based on the position of the imaging apparatus on the display surface.
- The information processing apparatus according to claim 1, wherein the control unit displays the display screen including an operation object for operating the imaging apparatus, and performs control to change the display mode of the operation object based on the relative positional relationship.
- The information processing apparatus according to claim 1, wherein the control unit displays the display screen including an operation object for operating the imaging apparatus, changes the display mode of the operation object based on a change in the attitude of the information processing apparatus when the imaging apparatus is not attached to the information processing apparatus, and performs control so as not to change the display mode of the operation object based on a change in the attitude of the information processing apparatus when the imaging apparatus is attached.
- An imaging apparatus comprising a control unit that performs control related to an imaging operation based on an operation input performed on an information processing apparatus on which is displayed a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus.
- The imaging apparatus according to claim 8, wherein the display mode of the display screen is switched based on the distance between the information processing apparatus and the imaging apparatus.
- The imaging apparatus according to claim 8, wherein the display mode of the display screen is switched based on whether or not the imaging apparatus is attached to the information processing apparatus.
- The imaging apparatus according to claim 10, wherein the display mode of the display screen is switched based on whether or not the imaging apparatus is attached to the display surface of the information processing apparatus.
- The imaging apparatus according to claim 11, wherein, when the imaging apparatus is attached to the display surface of the information processing apparatus, the display mode of the display screen is switched based on the position of the imaging apparatus on the display surface.
- The imaging apparatus according to claim 8, wherein the information processing apparatus displays the display screen including an operation object for operating the imaging apparatus and changes the display mode of the operation object based on the relative positional relationship.
- The imaging apparatus according to claim 8, wherein the information processing apparatus displays the display screen including an operation object for operating the imaging apparatus, changes the display mode of the operation object based on a change in the attitude of the information processing apparatus when the imaging apparatus is not attached, and does not change the display mode of the operation object based on a change in the attitude of the information processing apparatus when the imaging apparatus is attached.
- An imaging system comprising: an imaging apparatus that is connected to an information processing apparatus using wireless communication and whose imaging operation is controlled based on operation inputs performed on the information processing apparatus; and an information processing apparatus that performs control to switch the display mode of a display screen for operating the imaging apparatus based on the relative positional relationship with the imaging apparatus.
- A control method for an information processing apparatus, in which control is performed to switch the display mode of a display screen for operating an imaging apparatus based on the relative positional relationship with the imaging apparatus.
- A control method for an imaging apparatus, in which control related to an imaging operation is performed based on an operation input performed on an information processing apparatus on which is displayed a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus.
- A program that causes a computer to execute control for switching the display mode of a display screen for operating an imaging apparatus based on the relative positional relationship with the imaging apparatus.
- A program that causes a computer to execute control related to an imaging operation based on an operation input performed on an information processing apparatus on which is displayed a display screen whose display mode is switched based on the relative positional relationship between the imaging apparatus and the information processing apparatus.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/911,126 US20160198093A1 (en) | 2013-10-07 | 2014-07-24 | Information processing apparatus, imaging apparatus, imaging system, control method of information processing apparatus, control method of imaging apparatus, and program |
CN201480054226.8A CN105637853A (zh) | 2013-10-07 | 2014-07-24 | 信息处理装置、成像装置、成像系统、信息处理装置的控制方法、成像装置的控制方法以及程序 |
EP14852189.1A EP3038343B1 (en) | 2013-10-07 | 2014-07-24 | Information processing device, imaging system, method for controlling information processing device and program |
JP2015541460A JP6447505B2 (ja) | 2013-10-07 | 2014-07-24 | 情報処理装置、撮像装置、撮像システム、情報処理装置の制御方法、撮像装置の制御方法およびプログラム |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-210118 | 2013-10-07 | ||
JP2013210118 | 2013-10-07 | ||
JP2013237988 | 2013-11-18 | ||
JP2013-237988 | 2013-11-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015052974A1 true WO2015052974A1 (ja) | 2015-04-16 |
Family
ID=52812795
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/069547 WO2015052974A1 (ja) | 2013-10-07 | 2014-07-24 | 情報処理装置、撮像装置、撮像システム、情報処理装置の制御方法、撮像装置の制御方法およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20160198093A1 (ja) |
EP (1) | EP3038343B1 (ja) |
JP (1) | JP6447505B2 (ja) |
CN (1) | CN105637853A (ja) |
WO (1) | WO2015052974A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017081914A1 (ja) * | 2015-11-10 | 2017-05-18 | ソニー株式会社 | アダプター装置及び撮像装置 |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5909033B2 (ja) * | 2014-03-19 | 2016-04-27 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 情報端末の制御方法及びプログラム |
US11284003B2 (en) | 2015-07-29 | 2022-03-22 | Samsung Electronics Co., Ltd. | User terminal apparatus and control method thereof |
US9936138B2 (en) | 2015-07-29 | 2018-04-03 | Samsung Electronics Co., Ltd. | User terminal apparatus and control method thereof |
JP6579899B2 (ja) * | 2015-10-09 | 2019-09-25 | キヤノン株式会社 | 撮像装置、撮像装置の制御方法、プログラム及び記憶媒体 |
US10498964B2 (en) | 2017-12-28 | 2019-12-03 | Gopro, Inc. | Adaptive modes of operation based on user intention or activity |
US11426055B2 (en) | 2020-02-21 | 2022-08-30 | Ambu A/S | Medical visualisation system including a monitor and a graphical user interface therefore |
WO2021165357A1 (en) * | 2020-02-21 | 2021-08-26 | Ambu A/S | Rotational user interface for a medical visualisation system |
JP2021136666A (ja) * | 2020-02-28 | 2021-09-13 | キヤノン株式会社 | 撮像装置、デバイス、制御方法、およびプログラム |
JP2021136664A (ja) | 2020-02-28 | 2021-09-13 | キヤノン株式会社 | デバイス、制御方法、およびプログラム |
CN114422711B (zh) * | 2022-03-29 | 2022-08-12 | 深圳市猿人创新科技有限公司 | 拍摄设备及其软件功能和拍摄参数自适应方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003032520A (ja) * | 2001-07-13 | 2003-01-31 | Yokogawa Electric Corp | カメラ操作システム |
JP2004080519A (ja) * | 2002-08-20 | 2004-03-11 | Nikon Corp | 電子カメラ |
JP2009094591A (ja) | 2007-10-04 | 2009-04-30 | Sony Corp | 電子機器、撮像装置、撮像システム、および撮像遠隔操作方法 |
JP2013013063A (ja) * | 2011-06-03 | 2013-01-17 | Panasonic Corp | 撮像装置及び撮像システム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005080195A (ja) * | 2003-09-03 | 2005-03-24 | Fuji Photo Film Co Ltd | カメラ |
JP4160920B2 (ja) * | 2004-03-29 | 2008-10-08 | 富士フイルム株式会社 | カメラシステムおよびカメラ本体 |
JP4850400B2 (ja) * | 2004-09-17 | 2012-01-11 | キヤノン株式会社 | 撮像装置 |
US7957765B1 (en) * | 2007-05-25 | 2011-06-07 | At&T Mobility Ii Llc | Mobile phone with integrated wireless camera |
KR101555509B1 (ko) * | 2009-02-18 | 2015-09-25 | 삼성전자주식회사 | 탈부착 가능한 서브 모듈을 구비하는 휴대 단말기 |
KR20130061511A (ko) * | 2011-12-01 | 2013-06-11 | 삼성전자주식회사 | 디지털 촬영 시스템 및 디지털 촬영 시스템의 동작 방법 |
US8587711B2 (en) * | 2012-02-23 | 2013-11-19 | Paul S Anderson | Smartphone user interface viewfinder system |
JP6257336B2 (ja) * | 2014-01-14 | 2018-01-10 | キヤノン株式会社 | 撮像装置及び撮像装置の制御方法 |
-
2014
- 2014-07-24 JP JP2015541460A patent/JP6447505B2/ja active Active
- 2014-07-24 CN CN201480054226.8A patent/CN105637853A/zh active Pending
- 2014-07-24 EP EP14852189.1A patent/EP3038343B1/en active Active
- 2014-07-24 WO PCT/JP2014/069547 patent/WO2015052974A1/ja active Application Filing
- 2014-07-24 US US14/911,126 patent/US20160198093A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003032520A (ja) * | 2001-07-13 | 2003-01-31 | Yokogawa Electric Corp | カメラ操作システム |
JP2004080519A (ja) * | 2002-08-20 | 2004-03-11 | Nikon Corp | 電子カメラ |
JP2009094591A (ja) | 2007-10-04 | 2009-04-30 | Sony Corp | 電子機器、撮像装置、撮像システム、および撮像遠隔操作方法 |
JP2013013063A (ja) * | 2011-06-03 | 2013-01-17 | Panasonic Corp | 撮像装置及び撮像システム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3038343A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017081914A1 (ja) * | 2015-11-10 | 2017-05-18 | ソニー株式会社 | アダプター装置及び撮像装置 |
US10401708B2 (en) | 2015-11-10 | 2019-09-03 | Sony Corporation | Adapter device and imaging apparatus |
Also Published As
Publication number | Publication date |
---|---|
JPWO2015052974A1 (ja) | 2017-03-09 |
US20160198093A1 (en) | 2016-07-07 |
EP3038343B1 (en) | 2021-03-17 |
EP3038343A1 (en) | 2016-06-29 |
CN105637853A (zh) | 2016-06-01 |
JP6447505B2 (ja) | 2019-01-09 |
EP3038343A4 (en) | 2017-06-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6447505B2 (ja) | 情報処理装置、撮像装置、撮像システム、情報処理装置の制御方法、撮像装置の制御方法およびプログラム | |
JP6209952B2 (ja) | 情報処理装置、撮像装置、撮像システム、情報処理方法およびプログラム | |
JP6658519B2 (ja) | 情報処理装置、情報処理システム、情報処理装置の制御方法およびプログラム | |
JP6201641B2 (ja) | 情報処理装置、撮像装置、撮像システム、情報処理装置の制御方法、撮像装置の制御方法およびプログラム | |
JP6451644B2 (ja) | 情報処理装置、撮像装置、撮像システム、情報処理装置の制御方法、撮像装置の制御方法およびプログラム | |
JP6171877B2 (ja) | 情報処理装置、撮像装置、撮像システム、情報処理方法およびプログラム | |
JP6624235B2 (ja) | 情報処理装置、撮像装置、撮像システム、情報処理装置の制御方法およびプログラム | |
JP6387700B2 (ja) | 情報処理装置、情報処理システム、情報処理装置の制御方法およびプログラム | |
JP6465029B2 (ja) | 情報処理装置、撮像装置、撮像システム、情報処理装置の制御方法、撮像装置の制御方法およびプログラム | |
US10218892B2 (en) | Information processing device, imaging device, imaging system, and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14852189 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015541460 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14911126 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014852189 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |