WO2016136838A1 - Wearable device, control method, and control program
- Publication number
- WO2016136838A1 (PCT/JP2016/055522)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- predetermined
- wearable device
- imaging
- upper limb
- user
- Prior art date
Classifications
- G02B27/017—Head-up displays; head mounted
- G02B27/0172—Head mounted characterised by optical features
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B27/02—Viewing or reading apparatus
- G06F1/163—Wearable computers, e.g. on a belt
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/04845—Interaction techniques based on GUIs for image manipulation, e.g. dragging, rotation, expansion or change of colour
- H04N23/45—Cameras or camera modules for generating image signals from two or more image sensors of different type or operating in different modes
- H04N23/611—Control of cameras or camera modules based on recognised objects, where the recognised objects include parts of the human body
- G02B2027/0138—Head-up displays comprising image capture systems, e.g. camera
- G02B2027/014—Head-up displays comprising information/image processing systems
- G02B2027/0178—Head mounted, eyeglass type
- G02B2027/0187—Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- the present invention relates to a wearable device that can be worn on a user's head, a control method, and a control program.
- A head-mounted display device that includes a display arranged in front of the eyes and an infrared detection unit capable of recognizing finger movement, and that is operated by gestures of both hands, has been disclosed (Patent Literature 1).
- An object of the present invention is to provide a wearable device that is easier to use.
- A wearable device according to one aspect is a wearable device that can be worn on the head, and includes an imaging unit that captures the landscape in front of the user and a control unit that detects a shielding object in the captured image captured by the imaging unit. The control unit executes acquisition of the captured image when the shielding object moves from within the imaging range of the imaging unit to outside the imaging range.
- A control method according to one aspect is a control method executed by a wearable device that can be worn on the head, and includes a step of capturing the landscape in front of the user with an imaging unit, a step of detecting a shielding object in the captured image captured by the imaging unit, and a step of executing acquisition of the captured image triggered by the shielding object moving from within the imaging range of the imaging unit to outside the imaging range.
- A control program according to one aspect causes a wearable device that can be worn on the head to execute a step of capturing the landscape in front of the user with an imaging unit, a step of detecting a shielding object in the captured image captured by the imaging unit, and a step of executing acquisition of the captured image triggered by the shielding object moving from within the imaging range of the imaging unit to outside the imaging range.
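As a rough illustration of this trigger, the following Python sketch acquires an image on the frame where a previously detected shielding object disappears from the imaging range. All names (camera, detect_shield, get_frame) are hypothetical placeholders, not an API defined by this publication.

```python
# Minimal sketch of the exit-triggered acquisition described above.

def capture_on_exit(camera, detect_shield, get_frame):
    """Return a still image once the shielding object leaves the imaging range."""
    was_inside = False
    while True:
        frame = get_frame()                # current frame from the imaging unit
        inside = detect_shield(frame)      # True if a shield (e.g. a hand) is visible
        if was_inside and not inside:      # inside -> outside transition
            return camera.capture()        # the exit itself triggers acquisition
        was_inside = inside
```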
- A wearable device according to one aspect is a wearable device that can be worn on the head, and includes an imaging unit that captures the scene in front of the user and a control unit that detects a predetermined object in the captured image captured by the imaging unit and, upon detecting that the size of a predetermined region of the object in the captured image has changed, performs a predetermined change process according to the change in size.
- A control method according to one aspect is a control method executed by a wearable device that can be worn on the head, and includes a step of capturing the scene in front of the user with an imaging unit, a step of detecting a predetermined object in the captured image captured by the imaging unit, and a step of performing, upon detecting that the size of a predetermined region of the object in the captured image has changed, a predetermined change process according to the change in size.
- A control program according to one aspect causes a wearable device that can be worn on the head to execute a step of capturing the scene in front of the user with an imaging unit, a step of detecting a predetermined object in the captured image captured by the imaging unit, and a step of performing, upon detecting that the size of a predetermined region of the object in the captured image has changed, a predetermined change process according to the change in size.
- A wearable device according to one aspect is a wearable device that can be worn on the head, and includes a distance measuring unit that detects the distance between the device itself and a predetermined object, and a control unit that, upon detecting a change in the distance, performs a predetermined change process according to the change.
- A control method according to one aspect is a control method executed by a wearable device that can be worn on the head, and includes a step of detecting the distance between the device itself and a predetermined object in the front-rear direction of the user, and a step of performing, upon detecting a change in the distance, a predetermined change process according to the change.
- A control program according to one aspect causes a wearable device that can be worn on the head to execute a step of detecting a change in the distance between the device itself and a predetermined object in the front-rear direction of the user, and a step of performing a predetermined change process according to the change in the distance.
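A minimal sketch of this distance-driven change process, assuming (for illustration only) that the changed parameter is a zoom factor updated in proportion to the distance change; the publication does not prescribe this mapping.

```python
# Sketch only: moving the object closer to the device is assumed to zoom in.

def on_distance_change(prev_dist_m, new_dist_m, zoom, gain=2.0):
    delta = prev_dist_m - new_dist_m       # positive when the object moved closer
    return zoom + gain * delta             # assumed proportional mapping
```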
- A wearable device according to one aspect is a wearable device that can be worn on the head, and includes a display unit disposed in front of the user's eyes, an imaging unit that captures the scene in front of the user, and a control unit that detects a predetermined object in the captured image captured by the imaging unit, detects a predetermined operation of the object, and performs a predetermined change process according to the displacement of the object accompanying the operation. The change rate per unit displacement of the object in the change process is changed according to the size of the object in the captured image.
- A control method according to one aspect is a control method executed by a wearable device that can be worn on the head and that includes a display unit disposed in front of the user's eyes. The method includes a step of capturing the scene in front of the user with an imaging unit, and a step of performing, upon detecting a predetermined object in the captured image captured by the imaging unit and detecting a predetermined operation of the object, a predetermined change process according to the displacement of the object accompanying the operation. The change rate per unit displacement of the object in the change process is changed according to the size of the object in the captured image.
- A control program according to one aspect causes a wearable device that can be worn on the head and that includes a display unit disposed in front of the user's eyes to execute a step of capturing the scene in front of the user with an imaging unit, and a step of performing, upon detecting a predetermined object in the captured image captured by the imaging unit and detecting a predetermined operation of the object, a predetermined change process according to the displacement of the object accompanying the operation. The change rate per unit displacement of the object in the change process is changed according to the size of the object in the captured image.
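One plausible reading of this size-dependent change rate is sketched below: a larger apparent object (closer to the camera) gets a smaller per-pixel rate, so the same physical hand motion produces a consistent amount of change. The inverse-proportional mapping is an assumption, not taken from the claims.

```python
# Sketch, not the claimed implementation: the per-pixel change rate shrinks
# as the object's apparent size grows, since a near (large) hand sweeps many
# pixels for a small real movement.

def change_amount(displacement_px, object_size_px, base_rate=1.0):
    rate = base_rate / max(object_size_px, 1)   # change rate per unit displacement
    return displacement_px * rate               # total amount of the change process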
- A wearable device according to one aspect is a wearable device that can be worn on the head, and includes a display unit disposed in front of the user's eyes, a detection unit that detects the distance between the device itself and a predetermined object in the front-rear direction of the user, and a control unit that performs a predetermined change process. The change rate per unit displacement of the object in the change process is changed according to the distance between the device and the object.
- A control method according to one aspect is a control method executed by a wearable device that can be worn on the head and that includes a display unit disposed in front of the user's eyes. The method includes a step of detecting, with a detection unit, the distance between the device itself and a predetermined object in the front-rear direction of the user, a step of detecting an actually existing object with the detection unit, and a step of performing, upon detecting a predetermined operation of the actually existing object from the detection result of the detection unit, a predetermined change process according to the displacement of the object accompanying the operation. The change rate per unit displacement of the object in the change process is changed according to the distance between the device and the object.
- A control program according to one aspect causes a wearable device that can be worn on the head and that includes a display unit disposed in front of the user's eyes to execute a step of detecting the distance between the device itself and a predetermined object in the front-rear direction of the user, and a step of performing a predetermined change process in which the change rate per unit displacement of the object is changed according to the distance between the device and the object.
- A wearable device according to one aspect is a wearable device that can be worn on the head, and includes a display unit that is disposed in front of the user's eyes and displays an object, and a distance measuring unit that detects the distance between the device itself and a predetermined object in the front-rear direction. The device takes a first state and a second state in which the detected distances differ from each other; a predetermined process on the object can be executed in the first state but cannot be executed in the second state.
- A wearable device according to one aspect is a wearable device that can be worn on the head, and includes a detection unit that detects an upper limb in real space. The device detects, from the detection result of the detection unit, the number of fingers of the upper limb that are in a predetermined state, and executes a change process relating to a predetermined function based on the number of fingers.
- A control method according to one aspect is a control method executed by a wearable device that can be worn on the head, and includes a step of detecting an upper limb in real space with a detection unit, a step of detecting, from the detection result of the detection unit, the number of fingers of the upper limb that are in a predetermined state, and a step of executing a change process relating to a predetermined function based on the number of fingers.
- A control program according to one aspect causes a wearable device that can be worn on the head to execute a step of detecting an upper limb in real space with a detection unit, a step of detecting, from the detection result of the detection unit, the number of fingers of the upper limb that are in a predetermined state, and a step of executing a change process relating to a predetermined function based on the number of fingers.
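A compact sketch of the finger-count dispatch: count_extended_fingers is a hypothetical detector, and the mapping of counts to functions is illustrative only.

```python
# Sketch of dispatching a change process on the number of extended fingers.

def on_upper_limb_detected(detection_result, count_extended_fingers, actions):
    n = count_extended_fingers(detection_result)   # e.g. 0 to 5 extended fingers
    action = actions.get(n)                        # e.g. {1: zoom_in, 2: zoom_out}
    if action is not None:
        action()                                   # change process for the function
```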
- FIG. 1 is a perspective view of a wearable device.
- FIG. 2 is a front view of the wearable device worn by the user.
- FIG. 3 is a block diagram of the wearable device.
- FIG. 4 is a diagram illustrating an example of an imaging function by the wearable device.
- FIG. 5 is a diagram illustrating another example of the imaging function by the wearable device.
- FIG. 6 is a flowchart illustrating a processing procedure related to imaging control by the wearable device.
- FIG. 7 is a diagram illustrating an example of control related to still image capturing by the wearable device.
- FIG. 8 is a diagram illustrating an example of control related to imaging of a moving image by the wearable device.
- FIG. 9 is a flowchart illustrating a processing procedure related to imaging control of still images and moving images by the wearable device.
- FIG. 10 is a diagram illustrating an example of control related to zoom imaging by the wearable device.
- FIG. 11 is a diagram illustrating another example of control related to zoom imaging performed by the wearable device.
- FIG. 12 is a flowchart illustrating a processing procedure related to zoom imaging control by the wearable device.
- FIG. 13 is a diagram illustrating an example of control related to imaging using a preview image by the wearable device.
- FIG. 14 is a flowchart illustrating a control processing procedure related to imaging using a preview image by the wearable device.
- FIG. 15 is a diagram illustrating an example of control related to zoom imaging using a preview image by the wearable device.
- FIG. 16 is a flowchart illustrating a control processing procedure related to zoom imaging using a preview image by the wearable device.
- FIG. 17 is a diagram for describing an example of control provided by the control program of the wearable device.
- FIG. 18 is a flowchart illustrating a processing procedure related to imaging control by the wearable device.
- FIG. 19 is a diagram illustrating an example in which various change processes are performed.
- FIG. 20 is a diagram for describing an example of control provided by the control program of the wearable device.
- FIG. 21A is a diagram for describing an example in which the size of the display image is changed based on the motion of the upper limb.
- FIG. 21B is a diagram for describing an example in which the size of the display image is changed based on the motion of the upper limb.
- FIG. 21C is a diagram for describing an example in which the size of the display image is changed based on the motion of the upper limb.
- FIG. 21D is a diagram for describing an example in which the size of the display image is changed based on the motion of the upper limb.
- FIG. 22 is a diagram for describing another example of the control provided by the control program of the wearable device.
- FIG. 23 is a diagram illustrating an example of a conversion table in which the size of the upper limb in the captured image is associated with the converted value.
- FIG. 24 is a flowchart illustrating a processing procedure related to imaging control by the wearable device.
- FIG. 25 is a diagram for explaining another example of the control provided by the control program of the wearable device.
- FIG. 26 is a diagram for describing another example of the control provided by the control program of the wearable device.
- FIG. 27 is a diagram for describing another example of the control provided by the control program of the wearable device.
- FIG. 28 is a diagram for explaining another example of the control provided by the control program of the wearable device.
- FIG. 29 is a diagram for explaining another example of the control provided by the control program of the wearable device.
- FIG. 30 is a flowchart illustrating a processing procedure related to imaging control by the wearable device.
- FIG. 1 is a perspective view of the wearable device 1.
- FIG. 2 is a front view of the wearable device 1 worn by the user.
- The wearable device 1 is a head-mounted device that is worn on the user's head.
- the wearable device 1 has a front surface portion 1a, a side surface portion 1b, and a side surface portion 1c.
- The front surface portion 1a is arranged in front of the user so as to cover both of the user's eyes when worn.
- the side surface portion 1b is connected to one end portion of the front surface portion 1a
- the side surface portion 1c is connected to the other end portion of the front surface portion 1a.
- The side surface portion 1b and the side surface portion 1c are supported by the user's ears like the temples of eyeglasses when worn, and stabilize the wearable device 1.
- the side surface portion 1b and the side surface portion 1c may be configured to be connected to the back surface of the user's head when worn.
- the front part 1a is provided with a display part 32a and a display part 32b on the surface facing the user's eyes when worn.
- the display unit 32a is disposed at a position facing the user's right eye when worn, and the display unit 32b is disposed at a position facing the user's left eye when worn.
- the display unit 32a displays a right-eye image, and the display unit 32b displays a left-eye image.
- the display unit 32a and the display unit 32b are a pair of transflective displays, but are not limited thereto.
- The display unit 32a and the display unit 32b may be provided with lenses such as eyeglass lenses, sunglass lenses, or ultraviolet-cut lenses, or may be provided separately from such lenses.
- the display unit 32a and the display unit 32b may be configured by one display device as long as different images can be independently provided to the right eye and the left eye of the user.
- the front surface portion 1a includes an imaging unit 40 and an imaging unit 42 on a surface opposite to the surface on which the display unit 32a and the display unit 32b are provided.
- The imaging unit 40 is disposed in the vicinity of one end (the right-eye side when worn) of the front surface portion 1a, and the imaging unit 42 is disposed in the vicinity of the other end (the left-eye side when worn) of the front surface portion 1a.
- the imaging unit 40 acquires an image in a range corresponding to the field of view of the user's right eye.
- the imaging unit 42 acquires an image in a range corresponding to the field of view of the user's left eye.
- the field of view means, for example, the field of view when the user is looking at the front.
- The wearable device 1 has a function of allowing the user to visually recognize various kinds of information superimposed on the foreground the user is viewing.
- the foreground means a landscape in front of the user.
- the wearable device 1 allows the user to visually recognize the foreground through the display unit 32a and the display unit 32b.
- the wearable device 1 allows the user to visually recognize the foreground through the display unit 32a and the display unit 32b and the display contents of the display unit 32a and the display unit 32b.
- the wearable device 1 may display three-dimensional information on the display unit 32a and the display unit 32b.
- FIGS. 1 and 2 show an example in which the wearable device 1 has an eyeglass-like shape, but the shape of the wearable device 1 is not limited to this.
- the wearable device 1 may have a goggle shape.
- the wearable device 1 may be configured to be connected to an external device such as an information processing device or a battery device in a wired or wireless manner, for example.
- FIG. 3 is a block diagram of the wearable device 1.
- The wearable device 1 includes an operation unit 13, a control unit 22, a storage unit 24, display units 32a and 32b, imaging units 40 and 42, a detection unit 44, and a distance measuring unit 46.
- the operation unit 13 accepts basic operations such as starting and stopping the wearable device 1 and changing the operation mode.
- The display units 32a and 32b include a transflective display device such as a liquid crystal display or an organic EL (Organic Electro-Luminescence) panel.
- The display units 32a and 32b display various types of information according to control signals input from the control unit 22.
- the display units 32a and 32b may be projection devices that project an image onto a user's retina using a light source such as a laser beam.
- the imaging units 40 and 42 electronically capture an image using an image sensor such as a CCD (Charge Coupled Device Image Sensor) or a CMOS (Complementary Metal Oxide Semiconductor).
- the imaging units 40 and 42 convert the captured images into signals and output the signals to the control unit 22.
- The detection unit 44 detects an actual object (a predetermined object) existing in the imaging range of the imaging units 40 and 42 or in the vicinity of the imaging range (i.e., in the foreground). For example, the detection unit 44 matches real objects existing in the imaging range (a shielding object that shields the foreground, a predetermined object, or the like) against shapes registered in advance, such as the shape of a human hand or finger. The detection unit 44 may also be configured to detect the range (shape and size) of an actual object in the image based on the brightness, saturation, and hue edges of pixels, even for an object whose shape is not registered in advance.
- the detection unit 44 includes a sensor that detects a real object existing in the imaging range.
- the sensor is, for example, a sensor that detects an actual object existing in the imaging range using at least one of visible light, infrared light, ultraviolet light, radio waves, sound waves, magnetism, and capacitance.
- the detection unit 44 detects, for example, the movement of an object from the imaging range to the outside of the imaging range based on a plurality of detection results.
- the detection unit 44 may be able to detect an actual object that exists in a wider range than the imaging range.
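As a rough sketch of such pixel-based detection, OpenCV can segment and bound a hand-like region; the HSV skin range below is a common heuristic, not a value from this publication.

```python
# Sketch of colour-based hand-region detection consistent with the
# description above. The HSV skin range is an assumed heuristic.
import cv2
import numpy as np

def detect_hand_region(bgr_frame):
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None                              # no hand-like region found
    hand = max(contours, key=cv2.contourArea)    # largest skin-coloured blob
    return cv2.boundingRect(hand)                # (x, y, w, h) of the region
```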
- the distance measuring unit 46 measures the distance to the actual object.
- The distance to the actual object is measured for each eye, based on the position of each eye of the user wearing the wearable device 1. For this reason, when the reference position at which the distance measuring unit 46 measures the distance deviates from the position of each eye, the measured value of the distance measuring unit 46 is corrected according to the deviation so that it represents the distance from the eye position.
- the distance measuring unit 46 includes a sensor that detects a distance to an actual object.
- the sensor is, for example, a sensor that detects the distance to an actual object using at least one of visible light, infrared light, ultraviolet light, radio waves, sound waves, magnetism, and capacitance.
- the control unit 22 described later can calculate the distance in the front-rear direction of the user between the wearable device 1 and a predetermined object (for example, the user's upper limb) based on the detection result of the distance measuring unit 46.
- In the present embodiment, the imaging units 40 and 42 also serve as the detection unit 44 and the distance measuring unit 46. That is, the imaging units 40 and 42 detect an object within the imaging range by analyzing the captured image. The imaging units 40 and 42 also estimate changes in the distance between an object in the imaging range and the device itself by analyzing the captured image, or detect whether an object within the imaging range has moved closer to or farther from the device. Further, the imaging units 40 and 42, serving as the distance measuring unit 46, may measure (calculate) the distance by comparing an object included in the image captured by the imaging unit 40 with the same object included in the image captured by the imaging unit 42.
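Since the two imaging units form a stereo pair, one way to realize this comparison is the standard disparity formula; the device constants below (focal length in pixels, baseline in metres) are assumptions for illustration, not values from this publication.

```python
# Sketch of stereo distance estimation from the disparity between the images
# of the imaging units 40 and 42.

def stereo_distance(x_left_px, x_right_px, focal_length_px, baseline_m):
    disparity = abs(x_left_px - x_right_px)           # horizontal shift of the object
    if disparity == 0:
        return float('inf')                           # effectively at infinity
    return focal_length_px * baseline_m / disparity   # distance in metres
```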
- the control unit 22 includes a CPU (Central Processing Unit) that is a calculation means and a memory that is a storage means, and implements various functions by executing programs using these hardware resources. Specifically, the control unit 22 reads a program or data stored in the storage unit 24 and expands it in a memory, and causes the CPU to execute instructions included in the program expanded in the memory. The control unit 22 reads / writes data from / to the memory and the storage unit 24 and controls operations of the display units 32a, 32b and the like according to the execution result of the instruction by the CPU. When the CPU executes an instruction, the data developed in the memory and the operation detected via the detection unit 44 are used as a part of parameters and determination conditions.
- the storage unit 24 includes a non-volatile storage device such as a flash memory, and stores various programs and data.
- the program stored in the storage unit 24 includes a control program 24a.
- the data stored in the storage unit 24 includes imaging data 24b.
- the storage unit 24 may be configured by a combination of a portable storage medium such as a memory card and a read / write device that reads from and writes to the storage medium.
- the control program 24a and the imaging data 24b may be stored in a storage medium.
- the control program 24a may be acquired from another device such as a server device by wireless communication or wired communication.
- the control program 24 a provides functions related to various controls for operating the wearable device 1.
- the functions provided by the control program 24a include a function for controlling the imaging of the imaging units 40 and 42, a function for controlling the display of the display units 32a and 32b, and the like.
- the control program 24a includes a detection processing unit 25 and a display control unit 26.
- the detection processing unit 25 provides a function for detecting an actual object existing in the imaging range of the imaging units 40 and 42.
- the function provided by the detection processing unit 25 includes a function of measuring the distance to each detected object.
- the display control unit 26 provides a function for managing the correspondence between the display states of the display units 32a and 32b and the imaging range.
- the control program 24a includes a function of detecting the presence / absence of the action of the actual object or the type of action from the captured images taken by the imaging units 40 and 42, the detection result of the detection unit 44, and the like.
- the imaging data 24b is data indicating an image obtained by imaging the imaging range by the imaging units 40 and 42.
- the imaging data 24b may be data obtained by combining images captured by the imaging units 40 and 42, or data indicating the images of the imaging units 40 and 42, respectively.
- the imaging data 24b includes still image data and moving image data.
- FIG. 4 is a diagram illustrating an example of an imaging function by the wearable device 1.
- FIG. 5 is a diagram illustrating another example of the imaging function of the wearable device 1.
- the same components are denoted by the same reference numerals, and redundant descriptions may be omitted.
- the visual recognition content indicates the foreground 100 in the imaging range R viewed through the display units 32a and 32b in a state in which the user wears the wearable device 1 on the head.
- the states of the imaging units 40 and 42 indicate the states of the imaging units 40 and 42 corresponding to the visually recognized content.
- the imaging units 40 and 42 have states such as an unstarted state, an imaging standby state, and an imaging state, for example.
- the unstarted state is a state in which the imaging units 40 and 42 are not started.
- the imaging standby state is a state in which acquisition of captured images by the imaging units 40 and 42 is on standby.
- the imaging state is a state in which images captured by the imaging units 40 and 42 are acquired.
- the imaging range R is a range obtained by combining the imaging ranges of the imaging units 40 and 42.
- the user views the foreground 100 in the imaging range R of the imaging units 40 and 42 through the display units 32a and 32b.
- Although the foreground 100 outside the imaging range R is omitted in the drawings, the user also visually recognizes the foreground 100 between the front surface portion 1a (frame) of the wearable device 1 and the imaging range R.
- the wearable device 1 does not display a frame of the imaging range R. For this reason, the user does not visually recognize the frame in the imaging range R.
- the wearable device 1 detects an object by the detection unit 44 while maintaining the unactivated state of the imaging units 40 and 42.
- In step S11, the user moves the right hand H, with the index finger raised, from the front right side of the wearable device 1 toward the center.
- the user views the right hand H moving in front of the wearable device 1 and the foreground 100 within the imaging range R through the display units 32a and 32b.
- “Viewing within the imaging range R” means that the user is viewing the space in front of the user corresponding to the imaging range R.
- The wearable device 1 determines whether the right hand H has moved from outside the imaging range R into the imaging range R based on the detected change in the position of the right hand H.
- When the wearable device 1 detects movement into the imaging range R, it activates the imaging units 40 and 42 to shift them from the unactivated state to the imaging standby state.
- the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- In step S12, after moving the right hand H to the vicinity of the center in front of the wearable device 1, the user turns the right hand H back near the center and moves it in the reverse direction toward the right side of the wearable device 1. In this case, the user visually recognizes, through the display units 32a and 32b, the right hand H moving in front of the wearable device 1 toward the outside of the imaging range R and the foreground 100 within the imaging range R.
- The wearable device 1 determines whether the right hand H has moved from within the imaging range R to outside the imaging range R based on the detection result of the detection unit 44 in step S12. When the wearable device 1 determines that the right hand H has moved outside the imaging range R, it shifts the imaging units 40 and 42 to the imaging state and causes them to image the foreground 100.
- the imaging units 40 and 42 image the foreground 100 with a preset focus.
- the wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- As described above, the wearable device 1 acquires the captured image 71 captured by the imaging units 40 and 42 when the user's upper limb moves from within the imaging range R of the imaging units 40 and 42 to outside the imaging range R.
- The user can capture a forward image with the imaging units 40 and 42 simply by performing a gesture of moving the upper limb in front of the wearable device 1. For example, even if a photo opportunity arises suddenly, the user can capture it by performing the above gesture.
- the wearable device 1 can improve the operability and convenience of the imaging units 40 and 42.
- “upper limb” means a person's arm or hand, and is defined as including at least one of the upper arm, forearm, hand, or finger.
- The wearable device 1 activates the imaging units 40 and 42 when the detection unit 44 detects the user's upper limb within the imaging range R. Since the wearable device 1 thereby activates the imaging units 40 and 42 only at the timing when the user wants to capture an image, power consumption can be reduced. This is particularly effective for glasses-type and head-mounted devices, on which it is difficult to mount a large-capacity battery.
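Putting the camera states together (unstarted, imaging standby, imaging), the gesture flow above reduces to a small state machine. This is a sketch with hypothetical camera and detector objects, not code from the publication.

```python
# Sketch of the unstarted -> standby -> capture flow driven by the hand
# entering and leaving the imaging range R.

def run_imaging_state_machine(camera, hand_in_range):
    started = False
    while True:
        inside = hand_in_range()          # detection unit 44 result for range R
        if not started and inside:
            camera.start()                # activate only when the user gestures
            started = True
        elif started and not inside:
            image = camera.capture()      # leaving the range triggers acquisition
            camera.stop()                 # return to the unstarted state
            return image
```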
- the example shown in FIG. 5 is a modification of the example shown in FIG. 4 and shows an example of control in the case where the wearable device 1 detects the focal point to be imaged based on the movement trajectory of the user's upper limb.
- In step S11 shown in FIG. 5, the user moves the right hand H, with the index finger raised, from the front right side of the wearable device 1 toward the center. In this case, the user visually recognizes, through the display units 32a and 32b, the right hand H moving in front of the wearable device 1 and the foreground 100 within the imaging range R.
- The wearable device 1 determines whether the right hand H has moved from outside the imaging range R into the imaging range R based on the change in the position of the right hand H when the detection unit 44 detects the right hand H in step S11. When the wearable device 1 detects movement into the imaging range R, it activates the imaging units 40 and 42 to shift them from the unactivated state to the imaging standby state. The wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- In step S12, after moving the right hand H to the focal position in the foreground 100 to be imaged, the user turns the right hand H back at the focal position and moves it in the reverse direction toward the right side of the wearable device 1. In this case, the user visually recognizes, through the display units 32a and 32b, the right hand H moving in front of the wearable device 1 from the focal position toward the outside of the imaging range R and the foreground 100 within the imaging range R.
- The wearable device 1 determines whether the right hand H has moved from within the imaging range R to outside the imaging range R based on the detection result of the detection unit 44 in step S12. When the wearable device 1 determines that the right hand H has moved outside the imaging range R, it estimates the movement locus L of the right hand H within the imaging range R based on the stored video data 70.
- the wearable device 1 estimates the movement locus L that reaches outside the imaging range R, focusing on the fingertip of the right hand H or the like.
- the wearable device 1 estimates a start position where the user moves the right hand H out of the imaging range R based on the estimated movement locus L.
- the estimation method of the start position includes, for example, a method of estimating based on a change point of the movement locus L before moving outside the imaging range R.
- a position where the right hand H resumes movement after stopping for a predetermined time may be regarded as a change point of the movement locus L.
- Alternatively, the position at which the direction of the movement locus L first changes, tracing the locus backward from the position where the right hand H moved outside the imaging range R, may be regarded as the change point of the locus L and used to estimate the start position.
- In step S13, the wearable device 1 estimates the focal position Lc based on the movement locus L of the right hand H, sets the focus of the imaging units 40 and 42 to the focal position Lc, shifts the imaging units 40 and 42 to the imaging state, and causes them to image the foreground 100.
- the imaging units 40 and 42 capture the foreground 100 with the set focus.
- the focal position Lc is, for example, a changing point of the movement locus L described above.
- the wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- As the method of determining the focal position Lc, a method other than estimation based on the change point of the movement locus L may be adopted. For example, when, after the movement of the right hand H within the imaging range R in the forward direction (or the direction opposite to the forward direction) is detected based on the detection results of the detection unit 44 and the distance measuring unit 46, the right hand H is detected to have moved out of the imaging range R, the position at which the forward movement of the right hand H was detected may be regarded as the focal position Lc.
- As described above, when the user's upper limb moves from within the imaging range R of the imaging units 40 and 42 to outside the imaging range R, the wearable device 1 acquires the captured image 71 captured by the imaging units 40 and 42 at the focal point corresponding to the movement locus L of the upper limb. Thereby, the user can capture a forward image with the designated focus using the imaging units 40 and 42 simply by performing a gesture of moving the upper limb in front of the wearable device 1. As a result, the wearable device 1 can improve the operability and convenience of the imaging units 40 and 42.
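One way to find the change point of the movement locus L (and hence the focal position Lc) is to walk the fingertip trajectory backwards and look for the most recent direction reversal. The dot-product test below is an illustrative choice; the publication does not prescribe a formula.

```python
# Sketch: estimate Lc as the most recent point where the fingertip's motion
# direction reversed before the hand left the imaging range.

def estimate_focal_position(trajectory):
    """trajectory: list of (x, y) fingertip positions, oldest first."""
    for i in range(len(trajectory) - 2, 0, -1):
        ax = trajectory[i][0] - trajectory[i - 1][0]
        ay = trajectory[i][1] - trajectory[i - 1][1]
        bx = trajectory[i + 1][0] - trajectory[i][0]
        by = trajectory[i + 1][1] - trajectory[i][1]
        if ax * bx + ay * by < 0:          # incoming and outgoing motion oppose
            return trajectory[i]           # treat this turn as the change point
    return None                            # no reversal: fall back to the default focus
```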
- FIG. 6 is a flowchart illustrating a processing procedure related to imaging control by the wearable device 1.
- the processing procedure shown in FIG. 6 is realized by the control unit 22 executing the control program 24a.
- the processing procedure shown in FIG. 6 is repeatedly executed when the wearable device 1 is worn on the head, when the wearable device 1 is operating, or the like.
- In step S101, the control unit 22 of the wearable device 1 determines whether an upper limb has been detected within the imaging range R. Specifically, the control unit 22 determines that an upper limb has been detected when the detection unit 44 detects the upper limb. When the upper limb is not detected (No at step S102), the process at step S101 is performed again.
- When the upper limb is detected (Yes at step S102), the control unit 22 advances the process to step S103.
- In step S103, the control unit 22 activates the imaging units 40 and 42.
- In step S104, the control unit 22 detects the movement of the upper limb within the imaging range R based on the video data 70 of the imaging units 40 and 42.
- In step S105, the control unit 22 determines whether the upper limb has moved from within the imaging range R to outside the imaging range R based on the detection result of step S104. Specifically, the control unit 22 determines that the upper limb has moved out of the imaging range R when it detects that the whole of the upper limb has moved out of the imaging range R so that the upper limb no longer appears in the captured image 71. If the upper limb has not moved outside the imaging range R (No at step S106), the control unit 22 re-executes the processing from step S104.
- When the upper limb has moved outside the imaging range R (Yes at step S106), the control unit 22 advances the process to step S107.
- In step S107, the control unit 22 sets the in-focus position. Specifically, the control unit 22 sets a focus position that is a preset initial value. Alternatively, the control unit 22 estimates the focal position Lc based on the movement locus L of the upper limb and sets the focal position Lc.
- In step S108, the control unit 22 acquires the captured image 71. Specifically, the control unit 22 causes the imaging units 40 and 42 to image the foreground 100 with the focal point set in step S107, acquires the captured image 71, and stores the acquired captured image 71 as imaging data 24b. Then, in step S109, the control unit 22 stops the imaging units 40 and 42. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 6.
- FIG. 7 is a diagram illustrating an example of control related to still image capturing by the wearable device 1.
- When the wearable device 1 detects, through the detection unit 44, that the right hand H has entered the imaging range R in step S20 illustrated in FIG. 7, it shifts the imaging units 40 and 42 to the imaging standby state.
- The wearable device 1 displays, on the display units 32a and 32b, icons 51 and 52 that allow the user to select between a still image and a moving image, positioned so as to appear within the imaging range R.
- the wearable device 1 displays an icon 51 indicating a still image on the left side of the imaging range R and an icon 52 indicating a moving image on the right side.
- In this example, the icons 51 and 52 are displayed within the imaging range R, but they may be displayed outside the imaging range R.
- the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- In step S20, when the user moves the right hand H from outside the imaging range R of the imaging units 40 and 42 into the imaging range R, the user visually recognizes the icons 51 and 52 on the display units 32a and 32b, and visually recognizes the foreground 100 and the right hand H through the display units 32a and 32b. By visually recognizing the icons 51 and 52, the user can recognize that the wearable device 1 is ready to capture an image.
- In step S21, when the user wants to capture a still image, the user moves the right hand H so that the right hand H within the imaging range R passes over the icon 51 and moves outside the imaging range R.
- the user views the right hand H moving in front of the wearable device 1 from the icon 51 side to the outside of the imaging range R and the foreground 100 in the imaging range R through the display units 32a and 32b.
- The wearable device 1 determines whether the right hand H has moved from within the imaging range R to outside the imaging range R based on the detection result of the detection unit 44 in step S21. When the wearable device 1 determines that the right hand H has moved out of the imaging range R and detects that the right hand H passed over the icon 51 (or that the movement direction of the right hand H was toward the still image icon 51), it determines that the user has selected still image capturing. The wearable device 1 sets the imaging units 40 and 42 to the imaging state and images the foreground 100 as a still image. The imaging units 40 and 42 capture a still image of the foreground 100 with a preset focus.
- the wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- FIG. 8 is a diagram illustrating an example of control related to capturing a moving image by the wearable device 1.
- the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state when the detection unit 44 detects that the right hand H has entered the imaging range R in step S20 shown in FIG.
- The wearable device 1 displays, on the display units 32a and 32b, icons 51 and 52 that allow the user to select between a still image and a moving image, positioned so as to appear within the imaging range R.
- the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- In step S20, when the user moves the right hand H from outside the imaging range R of the imaging units 40 and 42 into the imaging range R, the user visually recognizes the icons 51 and 52 on the display units 32a and 32b, and visually recognizes the foreground 100 and the right hand H through the display units 32a and 32b. By visually recognizing the icons 51 and 52, the user can recognize that the wearable device 1 is ready to capture an image.
- In step S22, when the user wants to capture a moving image, the user moves the right hand H so that the right hand H positioned within the imaging range R passes over the icon 52 and then moves outside the imaging range R. In this case, the user visually recognizes, through the display units 32a and 32b, the right hand H moving in front of the wearable device 1 from the moving image icon 52 side toward the outside of the imaging range R and the foreground 100 within the imaging range R.
- The wearable device 1 determines whether the right hand H has moved from within the imaging range R to outside the imaging range R based on the detection result of the detection unit 44 in step S22. When the wearable device 1 determines that the right hand H has moved outside the imaging range R and detects that the right hand H passed over the icon 52 (or that the movement direction of the right hand H was toward the moving image icon 52), it determines that the user has selected the start of moving image capturing. The wearable device 1 sets the imaging units 40 and 42 to the moving image capturing state and starts imaging the foreground 100 as a moving image. The imaging units 40 and 42 start imaging the foreground 100 as a moving image with a preset focus.
- In step S23, when the user wants to stop the moving image being captured, the user moves the right hand H so that the right hand H outside the imaging range R passes over the icon 52 and moves into the imaging range R.
- In this case, the user visually recognizes, through the display units 32a and 32b, the right hand H moving from outside the imaging range R on the icon 52 side into the imaging range R and the foreground 100 within the imaging range R.
- Here, the case where the user moves the right hand H into the imaging range R through the icon 52 is described, but the present invention is not limited to this.
- the user may move the right hand H into the imaging range R without passing through the icon 52.
- the wearable device 1 determines whether the right hand H has moved from outside the imaging range R into the imaging range R based on the detection result of the detection unit 44 in step S23. When the wearable device 1 determines that the right hand H has moved into the imaging range R, the wearable device 1 determines that the end of imaging of the moving image has been selected by the user. The wearable device 1 places the imaging units 40 and 42 in the imaging end state and stops the imaging of the moving image. The wearable device 1 acquires a captured image 72 of a moving image captured by the imaging units 40 and 42 and stores the captured image 72 in the storage unit 24 as imaging data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- As described above, when the wearable device 1 detects that the user's upper limb has moved from within the imaging range R of the imaging units 40 and 42 to outside the imaging range R, it switches the imaging mode between moving image capturing and still image capturing based on the movement direction of the upper limb. Thereby, the user can switch the imaging mode of the imaging units 40 and 42 simply by changing the direction in which the upper limb moves out of the imaging range R. As a result, the wearable device 1 does not need a switch, interface, or the like for switching the shooting mode, and can improve the operability and convenience of the imaging units 40 and 42.
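The mode decision itself can be as small as comparing the exit position against the icon regions; the left/right split below is an illustrative stand-in for real icon hit-testing, not the publication's method.

```python
# Sketch: icon 51 (still image) on the left of the range, icon 52 (moving
# image) on the right, so the exit side selects the mode.

def select_imaging_mode(exit_x_px, range_width_px):
    return "still" if exit_x_px < range_width_px / 2 else "movie"
```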
- During moving image capturing, zoom processing may be performed by performing a predetermined operation.
- Similarly, the brightness of the moving image being acquired may be changed by a predetermined operation.
- When the wearable device 1 includes a microphone and can collect the user's voice or the like during the acquisition of a moving image, the amplification degree of the audio signal (the collected sound volume) may be changed by performing a predetermined operation during the capturing of the moving image.
- The predetermined operation may be, for example, an operation of sliding a finger (slide operation) along a touch sensor that can detect the user's contact and that is arranged on the temple-shaped side surface (1b or 1c). For example, an aspect may be employed in which a zoom-in process is performed when the user slides a finger from the rear toward the front, and a zoom-out process is performed when the user slides the finger from the front toward the rear.
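- A minimal sketch of such a slide-to-zoom mapping is shown below; the millimeter thresholds and zoom limits are invented for illustration, and positive travel is assumed to mean a rear-to-front slide:

```python
# Minimal sketch (invented constants): mapping finger travel on a temple-side
# touch sensor to the zoom factor. Positive delta = rear-to-front slide.

def apply_slide_zoom(current_zoom: float, slide_delta_mm: float,
                     mm_per_step: float = 5.0, step: float = 0.1) -> float:
    """Slide toward the front -> zoom in; toward the rear -> zoom out."""
    steps = slide_delta_mm / mm_per_step
    zoom = current_zoom + steps * step
    return max(1.0, min(zoom, 8.0))  # clamp to a plausible zoom range

print(apply_slide_zoom(2.0, +10.0))  # slide 10 mm forward  -> 2.2x
print(apply_slide_zoom(2.0, -10.0))  # slide 10 mm rearward -> 1.8x
```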
- Alternatively, it may be determined that a predetermined operation has been performed based on the detection that the user's upper limb is within a predetermined range in the detection range of the detection unit 44 during moving image shooting. At this time, if a predetermined association is made between the detection range of the detection unit 44 and the display region of the display unit so that a position in the display region of the display unit can be specified by the position of the upper limb within the detection range of the detection unit 44, it may be determined that a predetermined operation has been performed based on the fact that a predetermined position in the display region of the display unit has been designated by the user.
- For example, an icon indicating an operation related to moving image shooting may be displayed in a predetermined area in the display regions of the display units 32a and 32b, and the operation based on the icon may be performed when a position within the icon display area is designated by the user's upper limb.
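- The following sketch illustrates one way such an association could work, assuming a simple linear calibration between the detection range and the display area; all names and dimensions are illustrative:

```python
# Minimal sketch, assuming the detection range of the detection unit 44 has
# been linearly calibrated to the display area: an upper-limb position in
# detector coordinates is mapped to display coordinates, and an operation is
# recognized when it designates an icon's display region.

def detector_to_display(px: float, py: float,
                        det_w: float, det_h: float,
                        disp_w: int, disp_h: int) -> tuple:
    """Linearly map a detector-space point onto the display area."""
    return (px / det_w * disp_w, py / det_h * disp_h)

def hits_icon(display_point: tuple, icon_region: tuple) -> bool:
    x, y = display_point
    left, top, right, bottom = icon_region
    return left <= x <= right and top <= y <= bottom

point = detector_to_display(0.3, 0.2, det_w=1.0, det_h=1.0,
                            disp_w=640, disp_h=400)
print(hits_icon(point, (160, 40, 240, 120)))  # True: icon region designated
```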
- the wearable device 1 has described the case where the icons 51 and 52 that guide the switching of the imaging mode are displayed on the display units 32a and 32b.
- the present invention is not limited to this.
- The wearable device 1 may be configured to switch the imaging mode based on the direction in which the upper limb moves out of the imaging range R, without displaying the icons 51 and 52.
- The wearable device 1 may also be configured to stop imaging when the upper limb is moved out of the imaging range R through the upper or lower portion of the imaging range R where the icons 51 and 52 are not provided.
- FIG. 9 is a flowchart showing a processing procedure related to imaging control of still images and moving images by the wearable device 1.
- the processing procedure shown in FIG. 9 is realized by the control unit 22 executing the control program 24a.
- the processing procedure shown in FIG. 9 is repeatedly executed when the wearable device 1 is worn on the head, when the wearable device 1 is operating, or the like.
- The control unit 22 of the wearable device 1 determines whether an upper limb has been detected within the imaging range R as step S201.
- When the upper limb is not detected (step S202, No), the process at step S201 is re-executed.
- step S202 When the upper limb is detected (step S202, Yes), the control unit 22 advances the process to step S203.
- the control part 22 starts the imaging parts 40 and 42 as step S203.
- step S204 the control unit 22 displays the icons 51 and 52 on the display units 32a and 32b.
- the control unit 22 detects the movement of the upper limb in the imaging range R based on the video data 70 of the imaging units 40 and 42 as step S205.
- In step S206, the control unit 22 determines whether the upper limb has moved from the imaging range R to the outside of the imaging range R based on the detection result in step S205. Specifically, the control unit 22 determines that the upper limb has moved out of the imaging range R when it detects that the whole of the upper limb has moved out of the imaging range R so that the upper limb is not reflected in the captured image 71. If the upper limb has not moved outside the imaging range R (step S207, No), the control unit 22 re-executes the processing after step S205.
- step S207 When moving outside the imaging range R (step S207, Yes), the control unit 22 advances the process to step S208.
- In step S208, the control unit 22 determines whether the moving direction of the upper limb is the direction of the still image icon 51. Specifically, when the control unit 22 detects, based on the video data 70 of the imaging units 40 and 42, that the upper limb has passed the icon 51 and moved out of the imaging range R, it determines that the moving direction is the direction of the still image icon 51.
- When the moving direction is the direction of the still image icon 51, the control unit 22 advances the process to step S210.
- the control part 22 sets a focus position as step S210. Specifically, the control unit 22 sets a focus position that is a preset initial value.
- The control unit 22 acquires the captured image 71, which is a still image, in step S211. Specifically, the control unit 22 causes the imaging units 40 and 42 to capture the foreground 100 as a still image with the focus set in step S210, acquires the captured image 71, and stores the acquired captured image 71 in the storage unit 24 as the imaging data 24b. Thereafter, the control unit 22 stops the imaging units 40 and 42 in step S217. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 9.
- When the moving direction is not the direction of the still image icon 51, the control unit 22 causes the imaging units 40 and 42 to start capturing a moving image in step S212.
- the control part 22 determines whether the upper limb was detected within the imaging range R as step S213. When the upper limb has not been detected (No at Step S214), the process at Step S213 is executed again.
- When the upper limb is detected (step S214, Yes), the control unit 22 ends the moving image capturing by the imaging units 40 and 42 in step S215.
- The control unit 22 acquires the captured image 72, which is a moving image, in step S216. Specifically, the control unit 22 acquires the captured image 72 of the moving image captured by the imaging units 40 and 42, and stores the captured image 72 in the storage unit 24 as the imaging data 24b. Thereafter, the control unit 22 stops the imaging units 40 and 42 in step S217. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 9.
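- The branch structure of FIG. 9 can be summarized as a small state machine; the sketch below models it over a stream of hypothetical detector events rather than real hardware:

```python
# Minimal sketch of the FIG. 9 branch logic as a pure function over detector
# events, with hypothetical event names ("enter", "exit_still", "exit_movie").
# It returns which acquisition the wearable device would perform.

def run_capture(events):
    state = "unstarted"
    for ev in events:
        if state == "unstarted" and ev == "enter":       # steps S201-S204
            state = "standby"                            # camera started, icons shown
        elif state == "standby" and ev == "exit_still":  # limb left via icon 51
            return "still image acquired"                # steps S210-S211, S217
        elif state == "standby" and ev == "exit_movie":  # limb left via icon 52
            state = "recording"                          # step S212
        elif state == "recording" and ev == "enter":     # steps S213-S214
            return "moving image acquired"               # steps S215-S217
    return "nothing acquired"

print(run_capture(["enter", "exit_still"]))           # still image acquired
print(run_capture(["enter", "exit_movie", "enter"]))  # moving image acquired
```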
- FIG. 10 is a diagram illustrating an example of control related to zoom imaging performed by the wearable device 1.
- When the wearable device 1 detects that the right hand H has entered the imaging range R by the detection unit 44 in step S31 illustrated in FIG. 10, the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state. The wearable device 1 temporarily stores, in the storage unit 24, the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state.
- step S31 the user moves the right hand H from the outside of the imaging range R of the imaging units 40 and 42 into the imaging range R while being close to the wearable device 1.
- At this time, the user visually recognizes, through the display units 32a and 32b, the index finger F of the right hand H and the foreground 100 moving from the outside of the imaging range R into the imaging range R.
- the user moves the right hand H away from the vicinity of the wearable device 1 in the forward direction.
- The wearable device 1 detects the distance to the right hand H when detecting the forward movement of the right hand H within the imaging range R based on the detection results of the detection unit 44 and the distance measuring unit 46 in step S32, and executes a zoom process based on the detected distance.
- the zoom process is a process of specifying a zoom range R1 to be zoomed in or zoomed out in the imaging range R based on the detected distance and the conversion table, for example.
- the wearable device 1 displays the specified zoom range R1 on the display units 32a and 32b so as to be within the imaging range R.
- step S32 the user visually recognizes in the imaging range R the right hand H that moves in front of the wearable device 1 in the forward direction within the imaging range R and the foreground 100 through the display units 32a and 32b.
- the user visually recognizes the zoom range R1 displayed on the display units 32a and 32b.
- the user moves the right hand H moved forward in the imaging range R toward the outside of the imaging range R.
- the wearable device 1 determines whether the right hand H has moved out of the imaging range R from the imaging range R based on the detection result of the detection unit 44 in step S33. When the wearable device 1 determines that the right hand H has moved outside the imaging range R, the wearable device 1 sets the imaging units 40 and 42 to the imaging state, sets the focal position in the specified zoom region R1, and images the foreground 100. The imaging units 40 and 42 capture the foreground 100 with the set focus. The wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- step S33 the user visually recognizes in the imaging range R the right hand H that moves in front of the wearable device 1 in the forward direction within the imaging range R and the foreground 100 through the display units 32a and 32b. In this case, the user visually recognizes that the zoom range R1 displayed on the display units 32a and 32b is erased.
- the wearable device 1 has been described with respect to the case where the imaging range R is enlarged according to the distance from which the upper limb is separated from the device itself, but is not limited thereto.
- the wearable device 1 may reduce the imaging range R in accordance with the distance that the upper limb approaches the device.
- That is, the wearable device 1 estimates the change in the distance between the own device and the upper limb in the imaging range R, changes the enlargement/reduction ratio of the imaging range R according to the change in the distance, and, when the upper limb moves out of the imaging range R, acquires the captured image 71 captured by the imaging units 40 and 42 with the changed enlargement/reduction ratio.
- the user can change the enlargement / reduction ratio of the imaging range R only by changing the distance between the upper limb and the user's own device within the imaging range R.
- the wearable device 1 can improve the operability and convenience of the imaging units 40 and 42.
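- One plausible realization of the conversion table mentioned above is a monotone distance-to-ratio table with linear interpolation between rows; the values below are invented for illustration:

```python
# Minimal sketch of a distance-to-zoom conversion table (all values invented):
# the enlargement/reduction ratio of the imaging range R is interpolated from
# the measured hand-to-device distance, so moving the hand away zooms in and
# bringing it closer zooms out.

import bisect

# (distance in cm, zoom ratio); monotone in distance
CONVERSION_TABLE = [(10, 0.5), (25, 1.0), (40, 2.0), (60, 4.0)]

def zoom_from_distance(distance_cm: float) -> float:
    dists = [d for d, _ in CONVERSION_TABLE]
    if distance_cm <= dists[0]:
        return CONVERSION_TABLE[0][1]
    if distance_cm >= dists[-1]:
        return CONVERSION_TABLE[-1][1]
    i = bisect.bisect_right(dists, distance_cm)
    (d0, z0), (d1, z1) = CONVERSION_TABLE[i - 1], CONVERSION_TABLE[i]
    t = (distance_cm - d0) / (d1 - d0)
    return z0 + t * (z1 - z0)  # linear interpolation between table rows

print(zoom_from_distance(32.5))  # -> 1.5
```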
- FIG. 11 is a diagram illustrating another example of control related to zoom imaging performed by the wearable device 1.
- the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state when the detection unit 44 detects that the left hand H1 has entered the imaging range R in step S41 shown in FIG.
- the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- Although this embodiment describes the case where the user operates with the left hand H1, the user may operate with the right hand H.
- The wearable device 1 may be configured to regard the zoom range R1 as an imaging range and to acquire the captured image 71 captured by the imaging units 40 and 42 when the upper limb moves from the zoom range R1 to the outside of the zoom range R1.
- step S41 the user moves the left hand H1 from outside the imaging range R of the imaging units 40 and 42 into the imaging range R.
- the user visually recognizes in the imaging range R the left hand H1 and the foreground 100 that move the front of the wearable device 1 from the outside of the imaging range R into the imaging range R through the display units 32a and 32b.
- the user changes the shape of the left hand H1 within the imaging range R to a predetermined shape corresponding to the zoom function.
- the predetermined shape is a shape in which the left hand H1 is opened, but is not limited thereto.
- The wearable device 1 detects the operation of opening the left hand H1 based on the detection result of the detection unit 44 in step S42. When the wearable device 1 detects the operation of opening the left hand H1, the wearable device 1 executes zoom processing.
- the wearable device 1 displays the current zoom range R1 on the display units 32a and 32b so as to be within the imaging range R.
- step S42 the user visually recognizes the left hand H1 and the foreground 100 performing the opening operation through the display units 32a and 32b in the imaging range R.
- the user visually recognizes the zoom range R1 displayed on the display units 32a and 32b.
- the user performs a predetermined operation of rotating the left hand H1 in the open state within the imaging range R.
- The wearable device 1 detects the predetermined operation of rotating the left hand H1 based on the detection result of the detection unit 44 in step S43.
- the wearable device 1 executes zoom processing based on the rotation amount.
- the zoom process is a process of specifying a zoom range R1 to zoom in on the imaging range R, for example, based on the detected rotation amount of the left hand H1 and the conversion table.
- the wearable device 1 displays the specified zoom range R1 on the display units 32a and 32b so as to be within the imaging range R.
- step S43 the user visually recognizes the left hand H1 and the foreground 100 performing the rotation operation in the imaging range R through the display units 32a and 32b.
- the user visually recognizes the zoom range R1 displayed on the display units 32a and 32b and changing according to the rotation amount. Thereafter, when the zoom range R1 changes to a desired size, the user moves the left hand H1 in an open state toward the outside of the imaging range R.
- the wearable device 1 determines whether the left hand H1 has moved out of the imaging range R from the imaging range R based on the detection result of the detection unit 44 in step S44. When the wearable device 1 determines that the left hand H1 has moved outside the imaging range R, the wearable device 1 sets the imaging units 40 and 42 to the imaging state, sets the focal position in the specified zoom region R1, and images the foreground 100. The imaging units 40 and 42 capture the foreground 100 with the set focus. The wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- step S44 the user visually recognizes in the imaging range R the left hand H1 that moves in front of the wearable device 1 in the forward direction within the imaging range R and the foreground 100 through the display units 32a and 32b. In this case, the user visually recognizes that the zoom range R1 displayed on the display units 32a and 32b is erased.
- the wearable device 1 has been described with respect to the case where the imaging range R is enlarged based on the amount of rotation of the left hand H1 in the opened state, but the present invention is not limited to this.
- the wearable device 1 may be configured to switch enlargement / reduction based on the rotation direction of the left hand H1 in the opened state.
- The wearable device 1 may be configured to switch between enlargement and reduction based on whether the hand performing the operation is the left hand H1 or the right hand H.
- As described above, the wearable device 1 detects the predetermined motion of the upper limb in the imaging range R, changes the enlargement/reduction ratio of the imaging range R according to the detected predetermined motion, and, when detecting that the upper limb has moved out of the imaging range R, acquires the captured image 71 captured by the imaging units 40 and 42 with the changed enlargement/reduction ratio.
- the wearable device 1 can improve the operability and convenience of the imaging units 40 and 42.
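- A minimal sketch of such a rotation-driven zoom follows; the ratio per degree, the clamping limits, and the sign convention (positive rotation enlarges) are all assumptions:

```python
# Minimal sketch (invented constants): the rotation of the opened hand drives
# the enlargement/reduction ratio, with the rotation direction selecting
# enlargement or reduction as in the variant described above.

def zoom_from_rotation(rotation_deg: float,
                       ratio_per_degree: float = 0.01) -> float:
    """Positive rotation enlarges, negative rotation reduces."""
    zoom = 1.0 + rotation_deg * ratio_per_degree
    return max(0.5, min(zoom, 4.0))  # clamp to a plausible range

print(zoom_from_rotation(+90))  # quarter turn one way  -> 1.9x (zoom in)
print(zoom_from_rotation(-30))  # small opposite turn   -> 0.7x (zoom out)
```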
- FIG. 12 is a flowchart illustrating a processing procedure related to zoom imaging control by the wearable device 1.
- the processing procedure shown in FIG. 12 is realized by the control unit 22 executing the control program 24a.
- the processing procedure shown in FIG. 12 is repeatedly executed when the wearable device 1 is worn on the head, when the wearable device 1 is operating, or the like.
- the control unit 22 of the wearable device 1 determines whether the upper limb has been detected within the imaging range R as step S301. When the upper limb is not detected (No at Step S302), the process at Step S301 is performed again.
- step S302 When the upper limb is detected (step S302, Yes), the control unit 22 advances the process to step S303.
- the control part 22 starts the imaging parts 40 and 42 as step S303.
- The control unit 22 detects a predetermined motion of the upper limb within the imaging range R in step S304. Specifically, the control unit 22 determines that a predetermined operation has been detected based on the detection result of the detection unit 44, for example, when detecting an operation in which the upper limb moves back and forth, an operation of changing the shape of the upper limb, or the like.
- step S305 When the predetermined operation is detected (step S305, Yes), the control unit 22 advances the process to step S306.
- The control unit 22 executes zoom processing by the imaging units 40 and 42 in step S306. Specifically, the control unit 22 specifies the zoom range R1 in the imaging range R in which the imaging units 40 and 42 zoom in or out based on the detected predetermined operation and the conversion table, and stores the specified result in the storage unit 24.
- step S307 the control unit 22 displays the zoom range R1 specified in step S306 on the display units 32a and 32b.
- the control unit 22 detects the movement of the upper limb in the imaging range R based on the video data 70 of the imaging units 40 and 42 as step S308.
- The control unit 22 determines whether the upper limb has moved from the imaging range R to the outside of the imaging range R based on the detection result in step S308. Specifically, the control unit 22 determines that the upper limb has moved out of the imaging range R when it detects that the whole of the upper limb has moved out of the imaging range R so that the upper limb is not reflected in the captured image 71.
- When the upper limb has not moved outside the imaging range R (step S310, No), the control unit 22 re-executes the processing after step S304.
- step S310 When moving outside the imaging range R (step S310, Yes), the control unit 22 advances the process to step S311.
- the control unit 22 sets a focus position as step S311. Specifically, the control unit 22 sets a focus position that is a preset initial value. Alternatively, the control unit 22 estimates the focal position Lc corresponding to the zoom range R1, and sets the focal position Lc.
- The control unit 22 acquires the captured image 71 in step S312. Specifically, the control unit 22 causes the imaging units 40 and 42 to capture the foreground 100 with the focal point set in step S311, acquires the captured image 71, and stores the acquired captured image 71 in the storage unit 24 as the imaging data 24b. Then, the control unit 22 stops the imaging units 40 and 42 in step S313. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 12.
- When the predetermined operation is not detected (step S305, No), the control unit 22 executes the processes after step S308 already described. Thereafter, the control unit 22 ends the processing procedure shown in FIG. 12.
- FIG. 13 is a diagram illustrating an example of control related to imaging using a preview image by the wearable device 1.
- step S51 shown in FIG. 13 the user visually recognizes the foreground 100 in the imaging range R of the imaging units 40 and 42 through the display units 32a and 32b. The user does not visually recognize the frame in the imaging range R.
- the wearable device 1 detects an object by the detection unit 44 while the imaging units 40 and 42 are not activated.
- step S52 the user moves the index finger F from the front right side toward the center of the wearable device 1 so that only the index finger F enters the imaging range R.
- the user views the index finger F moving in front of the wearable device 1 and the foreground 100 in the imaging range R through the display units 32a and 32b.
- When the index finger F is detected by the detection unit 44 in step S52, the wearable device 1 determines whether the index finger F has moved from outside the imaging range R into the imaging range R based on the position of the index finger F. When the wearable device 1 detects the movement into the imaging range R, the wearable device 1 activates the imaging units 40 and 42 to shift the imaging units 40 and 42 from the unactivated state to the imaging standby state. The wearable device 1 temporarily stores, in the storage unit 24, the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state.
- the wearable device 1 displays the video data 70 on the display units 32a and 32b as a preview image 60 so as to be within the imaging range R in step S53.
- the range of the preview image 60 may be the same size as the imaging range R or may be a range narrower than the imaging range R.
- the wearable device 1 describes a case where the preview image 60 is displayed on the upper right in the imaging range R, but is not limited thereto.
- The preview image 60 may be displayed at a position in the foreground 100 that does not overlap with the index finger F. In this way, the wearable device 1 can make it easy to grasp the position of the index finger F in the foreground 100 when, for example, the user performs zoom processing or the like based on the operation of the index finger F.
- step S53 the user visually recognizes the index finger F moving in front of the wearable device 1 and the foreground 100 in the imaging range R through the display units 32a and 32b.
- the user visually recognizes the preview image 60 displayed on the display units 32a and 32b.
- the user recognizes that his / her index finger F is displayed on the preview image 60.
- step S54 the user moves the index finger F toward the outside of the imaging range R so that the index finger F disappears from the preview image 60.
- the wearable device 1 determines whether the index finger F has moved out of the range (imaging range R) of the preview image 60 based on the detection result of the detection unit 44 in step S54. When the wearable device 1 determines that the index finger F has moved out of the range, the forefinger F disappears from the preview image 60, and thus the foreground 100 is imaged with the imaging units 40 and 42 in the imaging state. The imaging units 40 and 42 image the foreground 100 with a preset focus. The wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- the wearable device 1 displays the preview image 60 in the imaging range R, and acquires the captured image 71 captured by the imaging units 40 and 42 when detecting that the upper limb has disappeared from the preview image 60.
- Thereby, the user can cause the imaging units 40 and 42 to capture the image in front simply by moving the upper limb so that it disappears from the preview image 60 while visually recognizing the preview image 60.
- the wearable device 1 can improve the operability and convenience of the imaging units 40 and 42.
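- The trigger described here (acquire the image the moment the fingertip leaves the imaging range R and thus disappears from the preview image 60) can be sketched as a simple edge detector over tracked finger positions; the coordinate stream below is hypothetical:

```python
# Minimal sketch: mirroring steps S52-S54, the still image is acquired the
# moment the tracked fingertip leaves the imaging range R (and therefore
# disappears from the preview image 60). `finger_positions` is a hypothetical
# stream of normalized coordinates, None once the finger is lost.

IMAGING_RANGE = (0.0, 0.0, 1.0, 1.0)  # normalized imaging range R

def inside(rect, pos):
    if pos is None:
        return False
    left, top, right, bottom = rect
    x, y = pos
    return left <= x <= right and top <= y <= bottom

def watch_preview(finger_positions):
    was_inside = False
    for pos in finger_positions:
        now_inside = inside(IMAGING_RANGE, pos)
        if was_inside and not now_inside:
            return "capture"  # finger disappeared from the preview image 60
        was_inside = now_inside
    return "standby"

print(watch_preview([(0.7, 0.2), (0.9, 0.1), None]))  # -> "capture"
```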
- FIG. 14 is a flowchart illustrating a control processing procedure related to imaging using a preview image by the wearable device 1.
- the processing procedure shown in FIG. 14 is realized by the control unit 22 executing the control program 24a.
- the processing procedure shown in FIG. 14 is repeatedly executed when the wearable device 1 is worn on the head, when the wearable device 1 is operating, or the like.
- the control unit 22 of the wearable device 1 determines whether an upper limb has been detected within the imaging range R as step S401. If the upper limb is not detected (No at Step S402), the control unit 22 re-executes the process at Step S401.
- step S402 When the upper limb is detected (step S402, Yes), the control unit 22 advances the process to step S403.
- the control unit 22 activates the imaging units 40 and 42 as step S403.
- step S404 the control unit 22 controls the display units 32a and 32b to display the preview screen 60 within the imaging range R.
- the control unit 22 detects the movement of the upper limb in the imaging range R based on the video data 70 of the imaging units 40 and 42 as step S405.
- The control unit 22 determines whether the upper limb has moved out of the range of the preview image 60 based on the detection result in step S405. Specifically, the control unit 22 determines that the upper limb has moved out of the range of the preview image 60 when it detects that the whole of the upper limb has moved out of the range of the preview image 60 so that the upper limb is not reflected in the captured image 71.
- step S407 If it has not moved out of the range of the preview image 60 (step S407, No), the control unit 22 advances the process to step S408.
- step S408 the control unit 22 causes the display units 32a and 32b to update the display of the preview image 60. Thereafter, the control unit 22 re-executes the processes after step S405.
- step S407 When moving outside the range of the preview image 60 (step S407, Yes), the control unit 22 advances the process to step S409.
- The control unit 22 acquires the captured image 71 in step S409. Specifically, the control unit 22 causes the imaging units 40 and 42 to capture the foreground 100 with a preset focus, acquires the captured image 71, and stores the acquired captured image 71 in the storage unit 24 as the imaging data 24b. Then, the control unit 22 stops the imaging units 40 and 42 in step S410. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 14.
- FIG. 15 is a diagram illustrating an example of control related to zoom imaging using a preview image by the wearable device 1.
- the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state when the detection unit 44 detects that the index finger F has entered the imaging range R in step S61 shown in FIG.
- the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- the wearable device 1 displays the video data 70 on the display units 32a and 32b as a preview image 60 so as to be within the imaging range R.
- step S61 the user sees the foreground 100 in the imaging range R of the imaging units 40 and 42 through the display units 32a and 32b. The user visually recognizes the preview image 60 displayed on the display units 32a and 32b.
- step S61 the user moves the index finger F in the forward direction within the imaging range R to a position corresponding to the zoom range R2 where zooming is planned.
- step S62 the user moves the index finger F to a position corresponding to the planned zoom range R2, so that the index finger F is located within the imaging range R and is out of the planned zoom range R2.
- The wearable device 1 detects the distance to the index finger F when detecting the forward movement of the index finger F based on the detection results of the detection unit 44 and the distance measuring unit 46 in step S62, and executes zoom processing based on the detected distance.
- the zoom process is a process of specifying a zoom range R2 to zoom in on the imaging range R based on the detected distance and the conversion table, for example.
- the wearable device 1 changes the focus of the imaging units 40 and 42 when the zoom process is executed.
- the wearable device 1 displays the zoomed video data 70 as a preview image 60 on the display units 32a and 32b.
- In step S62, the index finger F has disappeared from the preview image 60 due to the zoom process. Note that the wearable device 1 does not acquire the captured image 71 by the imaging units 40 and 42 in response to this disappearance of the index finger F; this will be described later.
- In step S63, when the preview image 60 is at the desired zoom, the user moves the index finger F that has disappeared from the preview image 60 so that it is displayed on the preview image 60 again. Thereafter, when the index finger F is displayed on the preview image 60, the user moves the index finger F out of the preview image 60 again.
- In step S63, the wearable device 1 detects, based on the detection results of the detection unit 44 and the distance measuring unit 46, that the index finger F has moved from outside the zoom range R2 into the zoom range R2 and then moved outside the zoom range R2 again.
- the wearable device 1 causes the foreground 100 to be imaged with the imaging units 40 and 42 in the imaging state.
- the imaging units 40 and 42 capture the foreground 100 with the focus already set.
- the wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- As described above, the wearable device 1 does not acquire the captured image 71 of the imaging range R when the index finger F disappears from the preview image 60 (zoom range) due to the zoom process. Thereafter, the wearable device 1 acquires the captured image 71 captured by the imaging units 40 and 42 at the changed enlargement/reduction ratio when the upper limb moves from within the preview image 60 to the outside of the preview image 60. As a result, even if the upper limb disappears from the imaging range R because the user changes the distance between the upper limb and the own device within the imaging range R, the wearable device 1 does not acquire the captured image 71 of the imaging range R. Consequently, the wearable device 1 can improve the detection accuracy of the trigger for acquiring the captured image 71.
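- The rule that a zoom-induced disappearance must not trigger acquisition, while a deliberate re-entry followed by an exit must, can be sketched as a small state machine; the event names are illustrative:

```python
# Minimal sketch of the FIG. 15 rule: a disappearance caused by the zoom
# process itself does not trigger acquisition; the finger must re-enter the
# zoom range R2 and then leave it again. Event names are hypothetical.

def zoom_preview_capture(events):
    armed = False                     # True once the finger re-entered R2
    for ev in events:
        if ev == "zoomed_out_of_r2":  # finger lost only because of zooming
            armed = False             # ignored: no capture on this exit
        elif ev == "entered_r2":
            armed = True              # step S63: finger shown in preview again
        elif ev == "left_r2" and armed:
            return "capture"          # deliberate exit -> acquire image
    return "standby"

print(zoom_preview_capture(["zoomed_out_of_r2"]))                           # standby
print(zoom_preview_capture(["zoomed_out_of_r2", "entered_r2", "left_r2"]))  # capture
```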
- FIG. 16 is a flowchart showing a control processing procedure related to zoom imaging using a preview image by the wearable device 1.
- the processing procedure illustrated in FIG. 16 is realized by the control unit 22 executing the control program 24a.
- the processing procedure shown in FIG. 16 is repeatedly executed when the wearable device 1 is worn on the head, when the wearable device 1 is operating, or the like.
- the control unit 22 of the wearable device 1 determines whether an upper limb has been detected within the imaging range R as step S501. When the upper limb is not detected (No at Step S502), the control unit 22 re-executes the process at Step S501.
- When the upper limb is detected (step S502, Yes), the control unit 22 advances the process to step S503.
- the control part 22 starts the imaging parts 40 and 42 as step S503.
- step S504 the control unit 22 causes the display units 32a and 32b to display the preview screen 60 within the imaging range R.
- the control unit 22 detects a predetermined motion of the upper limb within the imaging range R as step S505. Specifically, based on the detection result of the detection unit 44, the control unit 22 determines that a predetermined operation has been detected, for example, when an operation in which the upper limb moves back and forth in the forward direction is detected.
- step S506 If the predetermined operation is detected (step S506, Yes), the control unit 22 advances the process to step S507.
- The control unit 22 executes zoom processing by the imaging units 40 and 42 in step S507. Specifically, the control unit 22 specifies the zoom range R2 in the imaging range R in which the imaging units 40 and 42 zoom in or out based on the detected predetermined operation and the conversion table, and stores the specified result in the storage unit 24.
- In step S508, the control unit 22 displays the preview image 60 zoomed to the zoom range R2 within the imaging range R. Specifically, the control unit 22 displays the preview image 60 on the display units 32a and 32b so that the video data 70 of the imaging units 40 and 42 after zooming is within the imaging range R.
- the control unit 22 detects the movement of the upper limb in the imaging range R based on the video data 70 of the imaging units 40 and 42 as step S509.
- In step S510, the control unit 22 determines whether the upper limb has moved out of the range of the preview image 60 based on the detection result in step S509. Specifically, the control unit 22 determines that the upper limb has moved out of the range of the preview image 60 when it detects that the whole of the upper limb has moved out of the range of the preview image 60 so that the upper limb is not reflected in the captured image 71.
- If the upper limb has not moved outside the range of the preview image 60 (step S511, No), the control unit 22 re-executes the processing after step S505.
- If the upper limb has moved outside the range of the preview image 60 (step S511, Yes), the control unit 22 advances the process to step S512.
- The control unit 22 acquires the captured image 71 in step S512. Specifically, the control unit 22 causes the imaging units 40 and 42 to capture the foreground 100 with the focus already set, acquires the captured image 71, and stores the acquired captured image 71 in the storage unit 24 as the imaging data 24b. Then, the control unit 22 stops the imaging units 40 and 42 in step S513. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 16.
- When the predetermined operation is not detected (step S506, No), the control unit 22 advances the process to step S514.
- step S514 the control unit 22 causes the display units 32a and 32b to update the display of the preview image 60. Thereafter, the control unit 22 re-executes the processes after step S509 described above.
- each program shown in FIG. 3 may be divided into a plurality of modules, or may be combined with other programs.
- the wearable device 1 has been described with respect to the case where the imaging units 40 and 42 are stopped when the captured images 71 of the imaging units 40 and 42 are acquired.
- the wearable device 1 may be configured to stop the imaging units 40 and 42 when the user requests the imaging units 40 and 42 to stop.
- The wearable device 1 may be configured to notify the user of the acquisition of the captured image 71 when the captured image 71 of the imaging units 40 and 42 is acquired.
- The wearable device 1 may be configured to display activation icons of the imaging units 40 and 42 on the display units 32a and 32b, and to activate the imaging units 40 and 42 when a predetermined gesture for the activation icon is detected. The wearable device 1 may also be configured so that the imaging units 40 and 42 are always in the activation standby state.
- The wearable device 1 may include the operation unit 13 on the side surface 1b or the side surface 1c, and may be configured to control activation and stopping of the imaging units 40 and 42 according to a predetermined operation on the operation unit 13.
- In the above embodiment, the configuration is shown in which, when the index finger F disappears from the preview image 60 (zoom range R1) due to the zoom process in the imaging range R, the captured image 71 is not acquired, and the captured image 71 is acquired when the upper limb thereafter moves from within the preview image 60 to the outside of the preview image 60; however, the present invention is not limited thereto.
- the wearable device 1 may be configured to perform zoom processing around the index finger F. With this configuration, the wearable device 1 does not lose the index finger F from the preview image 60 (zoom range R1) due to zoom processing.
- the shielding object existing in the imaging range R has been described as the user's upper limb.
- the shielding object is not limited to the user's upper limb.
- the shielding object may be a predetermined object that the user has.
- the predetermined object possessed by the user is, for example, a rod-like object such as a pen.
- the predetermined item that the user has may be a portable electronic device such as a mobile phone held by the user, or an electronic device such as a wristwatch-type terminal worn on the upper limb of the user.
- For example, when the predetermined object is a mobile phone, the imaging content (brightness and the like) captured by the imaging units 40 and 42 of the wearable device 1 may be changed by a predetermined operation on the mobile phone, and the captured image may be acquired with the changed imaging content.
- FIG. 17 is a diagram for describing an example of control provided by the control program 24a of the wearable device 1.
- the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state when the detection unit 44 detects that the right hand H has entered the imaging range R in step S130 shown in FIG.
- the wearable device 1 displays the icons 53 to 55 on the display units 32a and 32b so as to be within the imaging range R.
- the icon 53 is, for example, an image indicating that a still image is captured by the imaging units 40 and 42.
- the icon 54 is, for example, an image indicating that a moving image is captured by the imaging units 40 and 42.
- the icon 55 is, for example, an image for canceling the imaging standby state of the imaging units 40 and 42 (stopping imaging).
- the icons 53 to 55 are displayed in parallel in the region along the right side of the imaging range R.
- the wearable device 1 describes a case where the icons 53 to 55 are displayed within the imaging range R, but may be displayed outside the imaging range R.
- the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- In step S130, when the user moves the right hand H from outside the imaging range R of the imaging units 40 and 42 into the imaging range R, the user visually recognizes the icons 53 to 55 on the display units 32a and 32b, and visually recognizes the foreground 100 and the right hand H through the display units 32a and 32b. By visually recognizing the icons 53 to 55, the user can recognize that the wearable device 1 can capture an image.
- the wearable device 1 determines that the preset part of the right hand H has moved through the icon 53 and moved out of the imaging range R based on the detection result of the detection unit 44. Then, the wearable device 1 sets the imaging units 40 and 42 to the imaging state, and images the foreground 100 with a still image.
- the wearable device 1 determines that the preset part of the right hand H has moved through the icon 54 and moved out of the imaging range R based on the detection result of the detection unit 44.
- the wearable device 1 sets the imaging units 40 and 42 to the imaging state, and starts imaging with a moving image.
- the wearable device 1 determines that the preset portion of the right hand H has passed the icon 55 and moved out of the imaging range R based on the detection result of the detection unit 44. Then, the wearable device 1 cancels the imaging standby state of the imaging units 40 and 42.
- each of the icons 53 to 55 is displayed on the display units 32a and 32b as an image larger than a preset portion of the right hand H as shown in step S130.
- the preset part of the right hand H includes, for example, a part closer to the fingertip than the first joint of the index finger of the right hand H.
- the preset part of the right hand H may be, for example, a part from the base of the finger (third joint) to the fingertip.
- Alternatively, any two of the fingertip, the first joint, the second joint, and the third joint may be selected, and the part between the two may be used as the preset part.
- a circular (or elliptical or rectangular) region having a predetermined size based on the size of the finger or hand may be set as a preset portion.
- The preset part may be displayed on the display units 32a and 32b as a predetermined image.
- In step S131, when the user moves the right hand H to a position closer to the wearable device 1, only the index finger F of the right hand H is visually recognized in the imaging range R. At this time, the index finger F of the right hand H is visually recognized as larger than the index finger in step S130. That is, the size of the index finger F in the captured images (the video data 70 temporarily stored in the storage unit 24) sequentially transmitted from the imaging units 40 and 42 in the imaging standby state becomes larger than that of the index finger in the case of step S130. At this time, as shown in step S131, the wearable device 1 displays the images of the icons 53 to 55 on the display units 32a and 32b as images larger than the icons 53 to 55 displayed in the case of step S130.
- In step S131, the wearable device 1 changes the display positions of the icons 53 to 55 from the display positions in step S130 in order to display the icons 53 to 55 as images larger than the icons 53 to 55 in step S130.
- Specifically, the icon 53 is displayed in an area along the right side of the imaging range R, the icon 54 in an area along the upper side of the imaging range R, and the icon 55 in an area along the left side of the imaging range R.
- Each of the icons 53 to 55 in step S131 is also displayed on the display units 32a and 32b as an image larger than the preset part of the right hand H (for example, the part closer to the fingertip than the first joint of the index finger F).
- Thereby, the wearable device 1 can make it difficult for the preset part of the right hand H to pass through a plurality of the icons 53 to 55 and thereby select a plurality of icons. That is, the wearable device 1 can make it easy to select the one icon desired by the user.
- the process of changing the size of the icons 53 to 55 as described above is referred to as a change process.
- As described above, the wearable device 1 detects the upper limb (predetermined object) from the captured images captured by the imaging units 40 and 42 and, when detecting that the size of the predetermined region of the upper limb in the captured image has changed, performs a predetermined change process according to the change in the size. Accordingly, in a configuration in which an icon is selected by moving the user's upper limb (or a predetermined region in the upper limb), the wearable device 1 can change the size of the icon so that the one icon desired by the user can easily be selected with the upper limb (or the predetermined region in the upper limb), which improves usability.
- The wearable device 1 may display the icon on the display units 32a and 32b so that the size of the icon is larger than that of the upper limb (or a predetermined region in the upper limb). That is, the control unit 22 changes the size of the icon so that the icon (the predetermined image displayed on the display unit) that the user visually recognizes on the display units 32a and 32b is larger than the upper limb (or the predetermined region in the upper limb) that the user visually recognizes via the display units 32a and 32b.
- Thereby, in a configuration in which a function based on an icon is executed by superimposing the upper limb (or a predetermined region in the upper limb) on the icon (a predetermined image displayed on the display unit), the wearable device 1 can make it difficult for a plurality of icons to be selected by the predetermined portion of the upper limb being superimposed on a plurality of images.
- The size that the control unit 22 compares with the icon may be the size of the upper limb (or a predetermined region in the upper limb) that the user visually recognizes via the display units 32a and 32b.
- The "size of the icon (image) that the user visually recognizes on the display units 32a and 32b" in the above configuration may be virtually defined as the ratio of the display area of the icon (image) to the entire display area (the entire imaging range R).
- Similarly, "the size of the upper limb (or a predetermined area in the upper limb) that the user visually recognizes via the display units 32a and 32b" may be virtually defined as the ratio of the upper limb (or the predetermined area in the upper limb) in the captured image to the entire captured image (the entire imaging range R).
- The size of the icon (image) may then be changed based on "the size of the icon (image) that the user visually recognizes on the display units 32a and 32b" and "the size of the upper limb (or the predetermined region in the upper limb) that the user visually recognizes via the display units 32a and 32b".
- the wearable device 1 may display the icon on the display units 32a and 32b so that the size of the icon is larger than the upper limb (or a predetermined region in the upper limb) in the captured image.
- Here, "the size of the icon is larger than the upper limb (or a predetermined region in the upper limb)" means that the icon is larger than the upper limb (or the predetermined region in the upper limb) in at least one of two directions orthogonal to each other (for example, the vertical direction and the horizontal direction).
- When an icon is displayed in the end region in the horizontal direction of the imaging range R, the upper limb is moved in the horizontal direction in order to move it out of the imaging range R while passing through the icon. Therefore, if the size of the icon in the direction perpendicular to the moving direction of the upper limb (the vertical direction) is larger than that of the upper limb (or the predetermined region in the upper limb), the wearable device 1 can make it difficult for a plurality of icons to be selected by the upper limb (or the predetermined region) being superimposed on a plurality of images. Similarly, when the wearable device 1 displays an icon in the end region in the vertical direction of the imaging range R (in the case of the icon 54), the size of the icon in the horizontal direction may be larger than that of the upper limb.
- The wearable device 1 may have a configuration in which the control unit 22 changes the display positions of the icons 53 to 55 (display images) as the sizes of the icons 53 to 55 (display images) change.
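- One way to realize the change process is to derive the icon edge length from the area ratio of the detected hand region, keeping the icon a fixed margin larger than the limb region; all constants below are invented:

```python
# Minimal sketch of the change process: the icon size tracks the apparent
# size of the hand region in the captured image so that each icon stays
# larger than the preset part of the upper limb. Thresholds are invented.

def icon_size_for(limb_ratio: float, base: int = 48,
                  margin: float = 1.5, max_px: int = 160) -> int:
    """limb_ratio: area of the detected hand region / area of the whole
    captured image (0.0-1.0). Returns the icon edge length in pixels."""
    # Convert the area ratio to an edge length on a 640-px-wide frame
    # (hypothetical resolution), then keep the icon `margin` times larger.
    limb_edge_px = (limb_ratio ** 0.5) * 640
    return min(max_px, max(base, int(limb_edge_px * margin)))

print(icon_size_for(0.01))  # small, distant hand  -> 96 px
print(icon_size_for(0.06))  # hand close to device -> 160 px (clamped)
```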
- FIG. 18 is a flowchart illustrating a processing procedure related to imaging control by the wearable device 1.
- the processing procedure shown in FIG. 18 is realized by the control unit 22 executing the control program 24a.
- the processing procedure shown in FIG. 18 is repeatedly executed when the wearable device 1 is worn on the head, when the wearable device 1 is operating, or the like.
- the control unit 22 of the wearable device 1 determines whether an upper limb has been detected within the imaging range R as step S1101. Specifically, the control unit 22 determines that the upper limb has been detected when the detection unit 44 detects the upper limb. When the upper limb is not detected (No at Step S1102), the process at Step S1101 is re-executed.
- step S1102 When the upper limb is detected (step S1102, Yes), the control unit 22 advances the process to step S1103.
- the control part 22 starts the imaging parts 40 and 42 as step S1103.
- step S1104 the control unit 22 displays icons 53 to 55 on the display units 32a and 32b.
- In step S1105, the control unit 22 detects the size of the upper limb (or a predetermined region in the upper limb) in the captured images (the video data 70 temporarily stored in the storage unit 24) sequentially transmitted from the imaging units 40 and 42. The size of the upper limb (or the predetermined region in the upper limb) may be the ratio of the upper limb (or the predetermined region in the upper limb) to the entire captured image.
- The control unit 22 determines whether the size of the upper limb (or the predetermined region in the upper limb) has changed in step S1106. When the size has changed (step S1106, Yes), the control unit 22 changes the size and position of the icons 53 to 55 based on the changed size of the upper limb (or the predetermined region in the upper limb) in step S1107. After finishing the process in step S1107, the control unit 22 proceeds to the process in step S1108.
- the control unit 22 detects the movement of the upper limb in the imaging range R based on the video data 70 of the imaging units 40 and 42 as step S1108.
- step S1109 the control unit 22 determines whether the upper limb has moved out of the imaging range R from the imaging range R based on the detection result of step S1108. Specifically, the control unit 22 determines that the upper limb has moved out of the imaging range R when detecting that the entire upper limb has moved out of the imaging range R so that the upper limb is not reflected in the captured image.
- When the upper limb has not moved outside the imaging range R (step S1110, No), the control unit 22 re-executes the processing after step S1108. Note that, in this case, the control unit 22 may re-execute the processing after step S1105 instead of the processing after step S1108.
- step S1110 When moving outside the imaging range R (step S1110, Yes), the control unit 22 advances the process to step S1111.
- step S1111 the control unit 22 determines the type of icon that the upper limb (or a predetermined portion of the upper limb) has passed.
- When the control unit 22 determines in step S1112 that the upper limb (or the predetermined region in the upper limb) has passed the still image icon 53 (the image indicating that a still image is to be captured by the imaging units 40 and 42) (step S1112, Yes), the process proceeds to step S1113.
- the control part 22 acquires the captured image which is a still image as step S1113.
- Specifically, the control unit 22 causes the imaging units 40 and 42 to capture the foreground 100 with a preset focus, acquires the captured image, and stores the acquired captured image in the storage unit 24 as the imaging data 24b. Then, the control unit 22 stops the imaging units 40 and 42 in step S1121. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 18.
- When the control unit 22 determines that the upper limb (or the predetermined region in the upper limb) has not passed the still image icon 53 (step S1112, No), the process proceeds to step S1114.
- When the control unit 22 determines in step S1114 that the upper limb (or the predetermined region in the upper limb) has passed the moving image icon 54 (the image indicating that a moving image is to be captured by the imaging units 40 and 42) (step S1114, Yes), the control unit 22 causes the imaging units 40 and 42 to start capturing a moving image in step S1115.
- In step S1116, the control unit 22 determines whether the upper limb has been detected within the imaging range R. When the control unit 22 determines that the upper limb has not been detected (step S1117, No), it re-executes the process of step S1116.
- step S1117 When it is determined that the upper limb has been detected (step S1117, Yes), the control unit 22 advances the process to step S1118.
- the control unit 22 ends the moving image capturing by the image capturing units 40 and 42 in step S1118.
- The control unit 22 acquires the captured image, which is a moving image, in step S1119. Specifically, the control unit 22 acquires the captured image of the moving image captured by the imaging units 40 and 42 and stores the captured image in the storage unit 24 as the imaging data 24b. Then, the control unit 22 stops the imaging units 40 and 42 in step S1121. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 18.
- When the control unit 22 determines in step S1114 that the upper limb has not passed the moving image icon 54 (step S1114, No), and then determines in step S1120 that the upper limb has passed the imaging stop icon 55 (the image for canceling the imaging standby state of the imaging units 40 and 42) (step S1120, Yes), the control unit 22 advances the process to step S1121.
- The control unit 22 stops the imaging units 40 and 42 in step S1121. When the imaging units 40 and 42 are stopped, the control unit 22 ends the processing procedure illustrated in FIG. 18.
- step S1120 when it is determined that the upper limb does not pass the imaging stop icon 55 (No in step S1120), the control unit 22 re-executes the process in step S1108. That is, when the upper limb moves outside the imaging range R and the upper limb does not pass any of the icons 53 to 55, the control unit 22 does not acquire a captured image and enters the imaging standby state. maintain. Thereby, the wearable device 1 can omit the process of starting the imaging units 40 and 42 again when the upper limb that has moved outside the imaging range R moves into the imaging range R.
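- The dispatch at the end of FIG. 18 reduces to a lookup on which icon (if any) the preset part passed while leaving the imaging range R; the sketch below uses hypothetical icon identifiers:

```python
# Minimal sketch of the FIG. 18 dispatch: the icon (if any) that the preset
# part of the limb passed while leaving the imaging range decides the action,
# and leaving without passing any icon keeps the standby state.

def dispatch(passed_icon):
    return {
        "still_53": "capture still image",         # steps S1112-S1113
        "movie_54": "start moving image capture",  # steps S1114-S1115
        "stop_55": "stop imaging units",           # steps S1120-S1121
    }.get(passed_icon, "stay in imaging standby")  # no icon passed

for icon in ("still_53", "movie_54", "stop_55", None):
    print(icon, "->", dispatch(icon))
```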
- the wearable device 1 has a configuration in which an imaging stop icon 55 (an image for canceling the imaging standby state of the imaging units 40 and 42) is displayed. It is not limited to this. Other configurations in which the imaging stop icon 55 is not displayed may be employed. For example, when the upper limb moves out of the imaging range R and the upper limb does not pass a predetermined icon (the above icons 53, 54, etc.), the wearable device 1 stops the imaging units 40 and 42. It is good also as a structure.
- the wearable device 1 detects the distance between the wearable device 1 (self device) and the upper limb (predetermined object) in the front-rear direction of the user by the distance measuring unit 46, and the distance changes. It may be configured to perform change processing according to the change in the distance when it is detected.
- Note that, when the size of the upper limb (or the predetermined region in the upper limb) satisfies a predetermined condition, the control unit 22 of the wearable device 1 may be configured not to perform the process of changing the size of the display image according to the distance.
- Likewise, the control unit 22 of the wearable device 1 may be configured not to perform the process of changing the size of the display image according to the size of the predetermined area. For example, when the upper limb is detected in the captured image with an unreasonably small size, such as when the upper limb of another person in front of the user is detected, the size of the display image need not be changed.
- The control unit 22 may determine whether the upper limb detected in the captured image is the right upper limb or the left upper limb, and may change the display positions of the plurality of icons (images) according to the determination result.
- When the control unit 22 determines that the upper limb is the right upper limb, the plurality of icons are displayed in parallel along the right side edge in the display area (or the imaging range R) of the display units 32a and 32b. When the control unit 22 determines that the upper limb is the left upper limb, the icons 53 to 55 may be displayed in parallel along the left side edge in the display area (or the imaging range R) of the display units 32a and 32b.
- The control unit 22 may also, for example, display the plurality of icons divided between a region along the left edge and a region along the lower edge in the display areas of the display units 32a and 32b.
- The wearable device 1 may display the icons 53 to 55 at any appropriate position, regardless of whether the size of the icons 53 to 55 is changed or whether the upper limb detected in the captured image is the right upper limb or the left upper limb.
- For example, the control unit 22 may display the icons 53 to 55 on the lower side in the display area (or the imaging range R). Since the wearable device 1 according to this embodiment is worn on the head, the upper limb of the user is usually positioned below the imaging range R of the wearable device 1. In this case, the user needs to raise the upper limb in order to position it in the imaging range R, while lowering the upper limb in order to move it out of the imaging range R is easy. Therefore, the configuration in which the icons 53 to 55 (display images) are displayed below the display area (or the imaging range R) is convenient for the user.
- When the upper limb enters the imaging range R, the wearable device 1 may determine the edge of the imaging range R on the side where the upper limb entered, and may display the icons 53 to 55 (images) on the determined edge side in the display area (or imaging range R) of the display units 32a and 32b. For example, when the upper limb enters from the right side of the imaging range R, the control unit 22 displays the icons 53 to 55 on the right side in the display area (or the imaging range R) of the display units 32a and 32b.
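- A minimal sketch of such entry-edge-dependent placement, in normalized coordinates with illustrative anchor positions:

```python
# Minimal sketch: placing the icons along the edge of the imaging range R
# through which the upper limb entered. Coordinates are normalized and the
# anchor positions are invented for illustration.

def entry_edge(first_pos):
    """Classify the entry point (x, y) on the border of the imaging range."""
    x, y = first_pos
    distances = {"left": x, "right": 1.0 - x, "top": y, "bottom": 1.0 - y}
    return min(distances, key=distances.get)

def icon_anchor(edge):
    return {"left": (0.05, 0.5), "right": (0.95, 0.5),
            "top": (0.5, 0.05), "bottom": (0.5, 0.95)}[edge]

edge = entry_edge((0.98, 0.4))  # limb entered from the right side
print(edge, icon_anchor(edge))  # right (0.95, 0.5)
```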
- the icon may also be displayed at a position suited to its content. For example, when the icon to be displayed is the icon 55 (imaging stop icon), the icon 55 may be displayed in a relatively small area such as the upper-right (or upper-left) corner of the imaging range R. In this way, the upper limb hardly passes over the icon 55 unless the user moves it intentionally.
- such a configuration may be applied when, for example, a work procedure in farm work or the like is recorded as a moving image. The user wearing the wearable device 1 can acquire the work as a moving image with the worn device while performing the actual work. In such applications, since the upper limb hardly passes over the icon 55 (imaging stop icon) unintentionally, the recording of the work is not ended at an unintended timing.
- the wearable device 1 may suspend the acquisition of the moving image when the upper limb moves out of the imaging range R without passing over any of the icons, and resume it when the upper limb enters the imaging range R again. As a result, the wearable device 1 can omit the recording of moving images in time periods unrelated to the work to be recorded, and need not create a redundant moving image.
- the wearable device 1 may also newly provide an icon (suspend icon) for suspending the moving image. In this case, the wearable device 1 suspends the recording of the moving image when the upper limb moves from the imaging range R to the outside of the imaging range R while passing over the suspend icon, and resumes the recording of the moving image when the upper limb passes over the suspend icon again.
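- the start/stop/suspend behavior described above can be pictured as a small state machine. The sketch below is an assumption-laden illustration; the event methods and the icon labels ("stop", "suspend") are invented for clarity and are not the patent's code:

```python
# Hypothetical sketch of the moving-image recording control described above.
# Events are the upper limb leaving or entering the imaging range R, together
# with the icon (if any) it passed over; the icon names are invented labels.

class RecordingController:
    def __init__(self):
        self.recording = False
        self.suspended = False

    def limb_left_range(self, passed_icon=None):
        if passed_icon == "stop":        # icon 55: end the recording entirely
            self.recording = False
            self.suspended = False
        elif passed_icon == "suspend":   # suspend icon: pause the moving image
            self.suspended = True
        elif self.recording:             # left without passing any icon
            self.suspended = True

    def limb_entered_range(self, passed_icon=None):
        if self.recording and (self.suspended or passed_icon == "suspend"):
            self.suspended = False       # re-entry resumes the recording

ctrl = RecordingController()
ctrl.recording = True                    # assume recording was started earlier
ctrl.limb_left_range()                   # no icon passed -> suspended
ctrl.limb_entered_range()                # limb re-enters -> resumed
print(ctrl.recording, ctrl.suspended)    # True False
```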
- in the above, the configuration in which the control unit 22 of the wearable device 1 executes a function based on the icon, among the plurality of icons 53 to 55, that the upper limb (or a predetermined region of the upper limb) passes over has been exemplified. Alternatively, the control unit 22 may be configured to regard an icon on which the upper limb (or the predetermined region of the upper limb) is superimposed for a predetermined time or more as selected from among the plurality of icons 53 to 55, and to execute the function based on that icon.
- when the icon is displayed outside the imaging range R, the upper limb does not appear in the captured image at the time the captured image is acquired.
- the function based on the icon is, for example, a function related to the method of acquiring the captured image (acquisition of a still image or acquisition of a moving image from the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state).
- the function based on the icon may be a function related to a processing method for the captured image 71 acquired by the imaging units 40 and 42.
- examples include an icon indicating a function of saving the acquired captured image in a predetermined folder (an icon imitating a folder), an icon indicating a function of shifting to an e-mail creation screen (with the captured image as data attached to the e-mail), and icons indicating functions related to electronic bulletin boards, blogs, SNSs (Social Networking Services), and the like (functions for using the captured image on an electronic bulletin board, blog, SNS, or the like).
- in the above, the wearable device 1 changes the size of a display image (icon) as the predetermined change process according to a change in the size of the upper limb in the captured image or a change in the distance between the own device and the upper limb; however, the change process is not limited to this.
- FIG. 19 is a diagram illustrating examples in which various types of change processing are performed in accordance with a change in the size of the upper limb in the captured image or a change in the distance between the own apparatus and the upper limb.
- three patterns of change processing are shown in the table of FIG. 19.
- the upper part of the table in FIG. 19 shows the content of the captured image 71 captured by the imaging units 40 and 42.
- the left side of FIG. 19 shows a case where the upper limb in the captured image 71 is large (the distance between the own device and the upper limb is small).
- the right side of FIG. 19 shows a case where the upper limb in the captured image 71 is small (the distance between the own device and the upper limb is large).
- the lower part of the table in FIG. 19 shows the display image 56 displayed on the display units 32a and 32b.
- the display image 56 is, for example, an image including character information and a photograph (or an illustration).
- Pattern 1 shows an example in which the display image 56 is enlarged or reduced regardless of the position of the upper limb as the change process.
- the display image 56 changes from the state (1-1) to the state (1-2).
- enlargement is performed with the center of the display image 56 as the center of enlargement/reduction.
- the display image 56 is enlarged and displayed at an enlargement ratio proportional to the amount of change in the size of the upper limb in the captured image 71.
- FIG. 19 illustrates a configuration in which the display image 56 is enlarged when the size of the upper limb in the captured image 71 becomes smaller (or when the distance between the own device and the upper limb becomes larger); conversely, the display image 56 may be reduced when the size of the upper limb in the captured image 71 becomes smaller (or when the distance between the own apparatus and the upper limb becomes larger).
- the portion of the display image 56 that is enlarged and displayed in state (1-2) is indicated by a virtual broken line in state (1-1).
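- for pattern 1, an enlargement ratio proportional to the change in the upper-limb size might look like the following sketch; the proportionality constant k, the sign convention, and the lower bound are assumptions for illustration only:

```python
# Minimal sketch of pattern 1: the display image 56 is enlarged about its
# center at a ratio proportional to the change in the upper-limb size P in
# the captured image 71. The constant k is an assumed tuning parameter.

def pattern1_scale(p_before: float, p_after: float, k: float = 2.0) -> float:
    """Return a scale factor that exceeds 1 when the limb gets smaller in the
    image (the configuration FIG. 19 illustrates); the text notes that the
    inverse mapping (reduce when the limb shrinks) is equally possible."""
    delta = p_before - p_after           # positive when the limb shrank
    return max(0.1, 1.0 + k * delta)

print(pattern1_scale(0.30, 0.20))        # limb shrank by 0.10 -> scale 1.2
```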
- Pattern 2 shows an example in which the display image 56 is enlarged or reduced depending on the position of the upper limb as the change process.
- the display image 56 changes from the state (2-1) to the state (2-2).
- enlargement is performed with the position in the display image 56 designated by a predetermined region of the upper limb (for example, the fingertip position) as the center of enlargement/reduction.
- the position designated by the predetermined region of the upper limb (for example, the fingertip position) in the display image 56 is indicated by a virtual broken line; this position may instead be displayed as an image that the user can visually recognize.
- as with pattern 1, FIG. 19 illustrates a configuration in which the display image 56 is enlarged when the size of the upper limb in the captured image 71 becomes smaller (or when the distance between the own device and the upper limb becomes larger); conversely, the display image 56 may be reduced in that case.
- Pattern 3 shows an example in which only the characters included in the predetermined area 57 determined by the position of the upper limb are enlarged or reduced as the changing process.
- the display image 56 changes from the state (3-1) to the state (3-2).
- the characters included in the region 57 are enlarged and displayed.
- FIG. 19 illustrates a configuration in which the region 57 is enlarged when the size of the upper limb in the captured image 71 becomes smaller (or when the distance between the own apparatus and the upper limb becomes larger); however, the present invention is not limited to this, and the region 57 may conversely be reduced in that case.
- the change process may take various forms other than patterns 1 to 3.
- the change process may be configured to enlarge only the character information without enlarging the photograph, or conversely to enlarge only the photograph without enlarging the character information.
- a slider (for example, a knob-shaped operation portion) displayed when an application is started may also be enlarged or reduced.
- the slider includes a change bar for changing the volume at the time of music playback when the music playback application is activated.
- the slider includes a seek bar for displaying the data playback location and changing the playback location by a predetermined operation when the moving image playback application is activated.
- in the change process, coordinate positions in the imaging range R of the imaging units 40 and 42 (and in the detection range of the detection unit 44) may be associated with coordinate positions in the display areas of the display units 32a and 32b.
- the slider may then be regarded as selected based on the presence of the upper limb at the position in the imaging range R (detection range) corresponding to the coordinate position at which the slider is displayed in the display area.
- when the upper limb moves with the slider selected, the display position of the slider in the display area is moved based on the movement of the position coordinates of the upper limb in the imaging range R (detection range), and the volume or the playback position is changed based on the movement of the display position of the slider.
- when the slider is enlarged, that is, when the area of coordinates regarded as the display position of the slider is expanded, the range within the imaging range R (detection range) in which the slider can be selected by the upper limb is expanded accordingly.
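- the coordinate association and the enlarged hit area can be illustrated as follows; the mapping function, the Slider class, and all numeric values are hypothetical and only mirror the behavior described above:

```python
# Illustrative sketch: the imaging range R (detection range) is mapped onto
# the display area, the slider counts as selected while the limb sits inside
# its (possibly enlarged) hit area, and moving the limb moves the slider.

def to_display(x_img: float, img_w: float, disp_w: float) -> float:
    """Map an x coordinate in the imaging range to the display area."""
    return x_img / img_w * disp_w

class Slider:
    def __init__(self, pos: float, half_width: float):
        self.pos = pos                  # display-area x of the knob
        self.half_width = half_width    # half of the selectable region

    def enlarge(self, factor: float) -> None:
        # Enlarging the slider widens the coordinate area treated as its
        # display position, so it becomes selectable over a wider limb range.
        self.half_width *= factor

    def selected_by(self, limb_x_disp: float) -> bool:
        return abs(limb_x_disp - self.pos) <= self.half_width

s = Slider(pos=120.0, half_width=5.0)
s.enlarge(2.0)
print(s.selected_by(to_display(0.26, img_w=1.0, disp_w=480.0)))  # ~124.8 -> True
```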
- the change process is not limited to this.
- for example, the playback volume of music, the playback speed, the tone, and the like may each be changed to any one of a plurality of patterns as the predetermined change process according to a change in the size of the upper limb in the captured image or a change in the distance between the own device and the upper limb.
- the change process may also change the ISO sensitivity, white balance, shutter speed, aperture value, depth of field, focal length, and the like used when the imaging units 40 and 42 of the wearable device 1 acquire the captured image.
- the wearable device 1 may have a configuration in which a predetermined image is enlarged or reduced by a predetermined operation of the upper limb in the captured images captured by the imaging units 40 and 42.
- the points described below are different from the above-described embodiment.
- FIG. 20 is a diagram for describing an example of control provided by the control program 24a of the wearable device 1.
- the example illustrated in FIG. 20 is an example of control related to imaging using a preview image by the wearable device 1.
- the wearable device 1 When the wearable device 1 detects that the right hand H has entered the imaging range R by the detection unit 44 in step S140 illustrated in FIG. 20, the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state. The wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- in step S140, the wearable device 1 displays the video data 70 on the display units 32a and 32b as the preview image 60 so as to fit within the imaging range R.
- the range of the preview image 60 may be the same size as the imaging range R or narrower than the imaging range R. In the example illustrated in FIG. 20, a case where the preview image 60 is displayed at the upper right in the imaging range R will be described.
- in step S140, the user views the foreground 100 and the preview image 60 within the imaging range R of the imaging units 40 and 42 through the display units 32a and 32b.
- in step S141, the user moves the right hand H to a position where it is superimposed on the preview image 60 as viewed by the user within the imaging range R seen through the display units 32a and 32b. At this time, the pads of the index finger and thumb of the user's right hand H are in contact with each other (a picking posture).
- in step S142, the user performs an operation of separating the index finger and the thumb from each other with the right hand H superimposed on the preview image 60.
- the wearable device 1 enlarges the preview image 60 by detecting an operation in which the index finger and the thumb are separated from each other. That is, the operation of separating the index finger and the thumb from each other is an operation for executing the enlargement process of the preview image 60.
- the wearable device 1 enlarges the image at an enlargement ratio based on the distance between the index finger and the thumb.
- not only the operation of separating the index finger and the thumb from a state in which they are in contact, but also the operation of further separating them from a state in which they are already apart, corresponds to the operation for executing the enlargement process.
- the wearable device 1 may execute a reduction process of the preview image 60 when detecting an operation in which the index finger and the thumb approach each other.
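- the pinch-driven enlargement and reduction just described can be sketched as below; the fingertip coordinates, the ratio-based scale rule, and the handling of the contact state are assumptions chosen to mirror the text, not the device's actual algorithm:

```python
# Hedged sketch of the pinch detection described for FIG. 20: the fingertip
# distance D between index finger and thumb is tracked in image coordinates;
# a growing D enlarges the preview image 60 and a shrinking D reduces it.

import math

def fingertip_distance(index_tip, thumb_tip):
    return math.dist(index_tip, thumb_tip)

def preview_scale(d_prev: float, d_now: float) -> float:
    """Enlarge when the fingers separate, reduce when they approach. This
    works both from a contact state and from an already-open state."""
    if d_prev <= 0.0:
        return 1.0                       # contact state: no ratio defined yet
    return d_now / d_prev

d0 = fingertip_distance((0.40, 0.50), (0.45, 0.50))   # fingers slightly apart
d1 = fingertip_distance((0.38, 0.50), (0.48, 0.50))   # fingers separated further
print(preview_scale(d0, d1))                           # 2.0 -> enlarge
```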
- the wearable device 1 detects the upper limb, or a predetermined region of the upper limb (the right hand H or a pre-registered portion of the right hand H), from the captured images captured by the imaging units 40 and 42.
- an enlargement/reduction process (change process) of an image displayed on the display units 32a and 32b can thus be executed by a predetermined operation of the upper limb, or of the predetermined region of the upper limb, in the captured image.
- the wearable device 1 can change the image to a size desired by the user, which improves usability.
- in step S142, the user sees the right hand H moving in front of the wearable device 1 and the foreground 100 within the imaging range R through the display units 32a and 32b.
- the user visually recognizes the preview image 60 displayed on the display units 32a and 32b.
- the user recognizes that his / her right hand H is displayed on the preview image 60.
- in step S143, the user moves the right hand H toward the outside of the imaging range R so that the right hand H disappears from the preview image 60.
- in step S143, the wearable device 1 determines whether the right hand H has moved out of the range of the preview image 60 (the imaging range R) based on the detection result of the detection unit 44. When the wearable device 1 determines that the right hand H has moved out of the range, the right hand H has disappeared from the preview image 60, and the wearable device 1 therefore shifts the imaging units 40 and 42 to the imaging state and images the foreground 100.
- the imaging units 40 and 42 image the foreground 100 with a preset focus.
- the wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- the wearable device 1 can improve the operability and convenience of the imaging units 40 and 42.
- FIGS. 21A to 21D are diagrams for describing an example in which the size of the display image is changed based on the motion of the upper limb.
- FIGS. 21A and 21B and FIGS. 21C and 21D show cases where the distance between the wearable device 1 and the upper limb (hand) differs from each other.
- FIGS. 21A and 21C each show how the preview image 60 is enlarged by a predetermined motion of the upper limb.
- FIGS. 21B and 21D are schematic diagrams illustrating the distance L between the wearable device 1 and the upper limb (right hand H) in the front-rear direction of the user.
- the wearable device 1 detects the upper limb (predetermined object) from the captured images captured by the imaging units 40 and 42 and detects a predetermined operation of the upper limb (an operation in which the index finger and the thumb move apart from each other). The wearable device 1 executes the enlargement/reduction process of the preview image 60 based on this operation.
- the enlargement / reduction process changes the size of the image with a predetermined enlargement / reduction ratio, and is therefore referred to as a change process.
- the enlargement / reduction rate M in the enlargement / reduction processing (change processing) is referred to as a change rate.
- the distance D is, for example, a linear distance between a predetermined position of the tip of the index finger and a predetermined position of the tip of the thumb in the captured images captured by the imaging units 40 and 42. Therefore, if the user separates the index finger and the thumb by a larger distance, the preview image 60 is enlarged with a larger enlargement ratio.
- the enlargement/reduction rate M (change rate) in the enlargement/reduction process (change process) is set based on a conversion value P′ obtained from the size P of the upper limb (or a predetermined region of the upper limb) in the captured image. The conversion value P′ is defined such that the larger the size P of the upper limb (or the predetermined region of the upper limb), the smaller the value.
- the wearable device 1 stores a conversion table between the size P of the upper limb (or the predetermined region of the upper limb) and the conversion value P′ in the storage unit 24, and acquires the conversion value P′ by referring to the conversion table based on the detected size P.
- in the case of FIGS. 21C and 21D, the upper limb (right hand H) is closer to the wearable device 1 than in the case of FIGS. 21A and 21B (see FIG. 21D), and the upper limb (right hand H) is therefore visually recognized in a larger state (see FIG. 21C).
- the wearable device 1 refers to the conversion table based on the size P of the upper limb (right hand H) in the captured image, and acquires the conversion value Pa′ for the case of FIGS. 21A and 21B and the conversion value Pb′ for the case of FIGS. 21C and 21D. Note that the conversion value Pb′ is smaller than the conversion value Pa′ for the case of FIGS. 21A and 21B.
- in this case, Equation 2 (a calculation that depends on the conversion value P′) may be applied. With Equation 2, in the case of FIGS. 21C and 21D, the preview image 60 is enlarged at an enlargement ratio Mb obtained by multiplying by the conversion value Pb′, which is smaller than the conversion value Pa′ used in the case of FIGS. 21A and 21B. Therefore, the size of the enlarged preview image 61 can be made closer to that of the enlarged preview image 61 in the case of FIGS. 21A and 21B.
- FIGS. 21A to 21D illustrate a case where the conversion values Pa′ and Pb′ are set so that Ma and Mb are the same.
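- the section above does not reproduce Equation 2 itself; one plausible reading, assumed in the sketch below, is M = D × P′, where D is the fingertip separation in the captured image and P′ comes from the conversion table. The table entries are invented for illustration, with only the monotonic trend (larger P, smaller P′) taken from the text:

```python
# Sketch of the size-compensated enlargement ratio M. The form M = D * P'
# is an assumed reading of "Equation 2"; only the trend of P' is sourced.

CONVERSION_TABLE = [   # (minimum limb size P as image ratio, P') - assumed values
    (0.00, 1.6),
    (0.10, 1.2),
    (0.20, 0.8),
    (0.40, 0.5),
]

def conversion_value(p: float) -> float:
    """Look up P' for the detected limb size P (larger P -> smaller P')."""
    result = CONVERSION_TABLE[0][1]
    for threshold, value in CONVERSION_TABLE:
        if p >= threshold:
            result = value
    return result

def enlargement_ratio(d: float, p: float) -> float:
    return d * conversion_value(p)

# A near hand (P=0.45) shows a larger D for the same physical pinch, but the
# smaller P' keeps M comparable to a far hand (P=0.12) with a smaller D.
print(enlargement_ratio(d=3.0, p=0.45), enlargement_ratio(d=1.3, p=0.12))
```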
- as described above, the wearable device 1 includes the control unit 22 that detects a predetermined object (the upper limb, or a predetermined region of the upper limb) from the captured images captured by the imaging units 40 and 42 and, upon detecting a predetermined operation of the predetermined object (an operation in which the index finger and the thumb are separated), performs a predetermined change process (image enlargement/reduction process) according to the displacement (distance D) of the predetermined object accompanying the predetermined operation. The wearable device 1 changes the change rate per unit displacement (enlargement/reduction rate M) in the change process according to the size of the predetermined object in the captured image.
- the wearable device 1 described above may instead change the change rate based on the distance L between the wearable device 1 (own device) and the upper limb (predetermined object) in the front-rear direction of the user, detected by the distance measuring unit 46.
- in this case, the wearable device 1 uses a conversion value L′ converted based on the distance L between the wearable device 1 (own device) and the upper limb (predetermined object) (a conversion for obtaining the conversion value L′ from the distance L). The conversion value L′ is defined such that the larger the distance L between the wearable device 1 (own device) and the upper limb (predetermined object), the larger the value.
- in the case of FIGS. 21C and 21D, the upper limb (right hand H) is closer to the wearable device 1 than in the case of FIGS. 21A and 21B (see FIG. 21D), and the upper limb (right hand H) is therefore visually recognized in a larger state (see FIG. 21C).
- the wearable device 1 refers to the conversion table based on the distance L between the own device and the upper limb (right hand H), and acquires the conversion value La′ for the case of FIGS. 21A and 21B and the conversion value Lb′ for the case of FIGS. 21C and 21D. Note that the conversion value Lb′ is smaller than the conversion value La′ for the case of FIGS. 21A and 21B.
- in this way as well, the wearable device 1 can make the size of the enlarged preview image 61 in the case of FIGS. 21C and 21D closer to that of the enlarged preview image 61 in the case of FIGS. 21A and 21B.
- FIG. 22 is a diagram for explaining another example of the control provided by the control program 24a of the wearable device 1.
- with reference to FIG. 20, the process of enlarging/reducing the preview image by a predetermined action of the upper limb has been described; with reference to FIG. 22, an example of performing the zoom process by a predetermined action of the upper limb will now be described.
- the wearable device 1 When the wearable device 1 detects that the right hand H has entered the imaging range R by the detection unit 44 in step S150 illustrated in FIG. 22, the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state. The wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- the wearable device 1 displays the video data 70 on the display units 32a and 32b as a preview image 60 so as to be within the imaging range R in step S150.
- wearable device 1 displays preview image 60 on the upper right in imaging range R.
- in step S151, the pads of the index finger and thumb of the user's right hand H are in contact with each other (a picking posture).
- in step S152, the user performs an operation of separating the index finger and the thumb from each other at a position where the right hand H does not overlap the preview image 60 in the imaging range R. The wearable device 1 then executes the zoom process (change process) upon detecting the operation in which the index finger and the thumb are separated from each other. The wearable device 1 detects the distance by which the index finger and the thumb are separated and performs the zoom process based on the detected distance.
- the zoom process is a process of specifying a zoom range to zoom in or zoom out in the imaging range R based on the detected distance and the conversion table, for example.
- the wearable device 1 may display the specified zoom range on the display units 32a and 32b so as to fit within the imaging range R when the zoom process is executed. Note that not only the operation of separating the index finger and the thumb from a state in which they are in contact, but also the operation of further separating them from a state in which they are already apart, corresponds to the operation for executing the zoom process.
- the wearable device 1 may execute a reduction process of the preview image 60 when detecting an operation in which the index finger and the thumb approach each other.
- the wearable device 1 detects the upper limb, or a predetermined part of the upper limb (the right hand H or a pre-registered part of the right hand H), from the captured images captured by the imaging units 40 and 42.
- the zoom process can thus be executed by a predetermined motion of the upper limb, or of the predetermined part of the upper limb, in the captured image.
- the wearable device 1 can acquire the video data 70 in a size desired by the user, which improves usability.
- in step S152, the user sees the right hand H moving in front of the wearable device 1 and the foreground 100 within the imaging range R through the display units 32a and 32b.
- the user visually recognizes the preview image 60 displayed on the display units 32a and 32b.
- the user recognizes that his / her right hand H is displayed on the preview image 60.
- in step S153, the user moves the right hand H toward the outside of the imaging range R so that the right hand H disappears from the preview image 60.
- in step S153, the wearable device 1 determines whether the right hand H has moved out of the range of the preview image 60 (the imaging range R) based on the detection result of the detection unit 44. When the wearable device 1 determines that the right hand H has moved out of the range (or that the right hand H has disappeared from the preview image 60), the wearable device 1 causes the imaging units 40 and 42 to image the foreground 100. The imaging units 40 and 42 image the foreground 100 with a preset focus. The wearable device 1 acquires the captured image 71 captured by the imaging units 40 and 42 and stores it in the storage unit 24 as the captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts them to the unstarted state.
- in this way, the wearable device 1 displays the preview image 60 within the imaging range R and, upon detecting that the right hand H (upper limb) has disappeared from the preview image 60, acquires the captured image 71 captured by the imaging units 40 and 42.
- the user can cause the imaging units 40 and 42 to capture the image ahead simply by moving the upper limb out of the preview image 60 while visually checking it.
- the wearable device 1 can improve the operability and convenience of the imaging units 40 and 42.
- the wearable device 1 detects a predetermined object (upper limb (or a predetermined region in the upper limb)) from the captured images captured by the imaging units 40 and 42, as in the example illustrated in FIG. 20.
- the wearable device 1 includes a control unit 22 that detects a predetermined action of a predetermined object (an action in which the index finger and the thumb separate) and executes a change process (zoom process) based on the predetermined action.
- the wearable device 1 may change the change rate (zoom rate) in the change process according to the size of the predetermined object in the captured image.
- as described above, the wearable device 1 may change the change rate (zoom rate) based on the distance L between the wearable device 1 (own device) and the upper limb (predetermined object) in the front-rear direction of the user, detected by the distance measuring unit 46.
- FIG. 23 shows an example of a conversion table in which the size P of the upper limb in the captured image (or the distance L between the upper limb and the own device) is associated with the conversion value P′ (or L′). Hereinafter, an example of the change process based on the size P of the upper limb in the captured image will be described.
- FIG. 23 shows two patterns.
- pattern P1 shows an example in which the conversion value changes stepwise as the size P of the upper limb in the captured image changes, so that the enlargement/reduction ratio (change rate) per unit movement distance of the upper limb (for example, per 1 cm of separation between the index finger and the thumb) changes stepwise. That is, pattern P1 performs the change at a change rate to which the conversion value is applied.
- the present embodiment is not limited to the pattern P1.
- pattern P2 is an example in which, while the size P of the upper limb in the captured image is within a predetermined range, the conversion value is not applied, that is, the change process based on the change rate is not performed, and another change process is performed instead. The other change process is a process of changing the size of the display image so that two opposite corners of the predetermined image follow (are attached to) the position of the fingertip of the index finger and the position of the fingertip of the thumb. Note that, in pattern P2, when the size P of the upper limb in the captured image is 20% or less, the change process based on the change rate is executed in the same manner as in pattern P1 described above.
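- the two patterns of FIG. 23 can be sketched as follows; only the 20% boundary of pattern P2 comes from the text, while the stepwise thresholds and rate values are invented for illustration:

```python
# Sketch of the FIG. 23 conversion-table patterns. Thresholds and rates are
# assumed; only the 20% boundary for pattern P2 is taken from the text.

def change_rate_p1(p: float) -> float:
    """Pattern P1: the conversion value, hence the change rate per unit
    finger movement, changes stepwise with the limb size P."""
    if p <= 0.20:
        return 1.5
    if p <= 0.40:
        return 1.0
    return 0.6

def apply_change(p: float, fingertips=None):
    """Pattern P2: within the predetermined size range (assumed here to be
    P > 20%) the rate-based change is skipped and the image corners are
    attached to the two fingertips instead."""
    if p > 0.20:
        return ("attach-corners", fingertips)
    return ("rate-based", change_rate_p1(p))

print(apply_change(0.15))                                       # rate-based, as in P1
print(apply_change(0.35, fingertips=((0.3, 0.4), (0.6, 0.7))))  # corner attach
```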
- FIG. 24 is a flowchart illustrating a processing procedure related to imaging control by the wearable device 1.
- the processing procedure shown in FIG. 24 is realized by the control unit 22 executing the control program 24a.
- the processing procedure shown in FIG. 24 is repeatedly executed when the wearable device 1 is worn on the head, when the wearable device 1 is operating, or the like.
- the control unit 22 of the wearable device 1 determines, as step S1201, whether an upper limb has been detected within the imaging range R. Specifically, the control unit 22 determines that the upper limb has been detected when the detection unit 44 detects the upper limb. When the upper limb is not detected (step S1202, No), the control unit 22 re-executes the process of step S1201. When the upper limb is detected (step S1202, Yes), the control unit 22 advances the process to step S1203.
- the control unit 22 activates the imaging units 40 and 42 as step S1203.
- in step S1204, the control unit 22 displays the preview image 60 on the display units 32a and 32b.
- in step S1205, the control unit 22 detects the size of the upper limb (or a predetermined region of the upper limb) in the captured images (the video data 70 temporarily stored in the storage unit 24) sequentially transmitted from the imaging units 40 and 42. The size of the upper limb (or the predetermined region of the upper limb) may be the ratio of the upper limb (or the predetermined region of the upper limb) to the entire captured image.
- in step S1206, the control unit 22 determines whether the size of the upper limb has changed. When it is determined that the size of the upper limb has changed (step S1206, Yes), the control unit 22 sets, as step S1207, a change rate based on the changed size of the upper limb. After finishing the process of step S1207, the control unit 22 proceeds to step S1208.
- in step S1208, the control unit 22 determines whether a predetermined motion of the upper limb within the imaging range R has been detected based on the video data 70 of the imaging units 40 and 42. The predetermined motion is, for example, an operation of separating (or bringing together) the index finger and the thumb, and does not correspond to the movement of the upper limb determined in step S1213 described later. If it is determined that the predetermined motion of the upper limb has not been detected (step S1208, No), the control unit 22 advances the process to step S1213 described later.
- if it is determined in step S1208 that the predetermined motion of the upper limb has been detected (step S1208, Yes), the control unit 22 detects, as step S1209, the position of the upper limb within the imaging range R based on the video data 70 of the imaging units 40 and 42.
- in step S1210, the control unit 22 determines whether the position of the upper limb detected in step S1209 overlaps the preview image 60 in the imaging range R. When it is determined that the position of the upper limb overlaps the preview image 60 (step S1210, Yes), the control unit 22 advances the process to step S1211.
- in step S1211, the control unit 22 changes the size of the preview image 60 (performs the enlargement/reduction process). The enlargement/reduction process of the preview image 60 is executed based on the change rate (enlargement/reduction rate) set in step S1207. After finishing the process of step S1211, the control unit 22 advances the process to step S1213 described later.
- when it is not determined in step S1210 that the position of the upper limb overlaps the preview image 60 (step S1210, No), the control unit 22 advances the process to step S1212.
- in step S1212, the control unit 22 executes the zoom process by the imaging units 40 and 42. The zoom process is executed based on the change rate (zoom rate) set in step S1207. After finishing the process of step S1212, the control unit 22 advances the process to step S1213.
- in step S1213, the control unit 22 detects the movement of the upper limb within the imaging range R based on the video data 70 of the imaging units 40 and 42.
- the control unit 22 then determines whether the upper limb has moved from within the imaging range R to outside the imaging range R based on the detection result of step S1213. Specifically, the control unit 22 determines that the upper limb has moved out of the imaging range R when it detects that the entire upper limb has moved out of the imaging range R so that the upper limb no longer appears in the captured image. When it is not determined that the upper limb has moved out of the imaging range R, the control unit 22 re-executes the processes from step S1213.
- in step S1215, the control unit 22 acquires a captured image. Specifically, the control unit 22 causes the imaging units 40 and 42 to image the foreground 100 with a preset focus, acquires the captured image 71, and stores the acquired captured image 71 in the storage unit 24 as the captured data 24b. Then, as step S1216, the control unit 22 stops the imaging units 40 and 42. When the imaging units 40 and 42 stop, the control unit 22 ends the processing procedure shown in FIG. 24.
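- the FIG. 24 procedure can be condensed into the following control-flow skeleton; the `device` object and all of its method names are placeholders standing in for the control unit 22, the imaging units 40 and 42, and the detection unit 44, so this is a sketch of the flow rather than the actual implementation:

```python
# Compact sketch of the FIG. 24 processing procedure (steps S1201-S1216).
# All device calls are placeholders; only the control flow follows the text.

def imaging_control_loop(device):
    while not device.limb_in_range():            # S1201/S1202
        pass
    device.start_imaging_units()                 # S1203
    device.show_preview()                        # S1204
    rate = 1.0
    while True:
        size = device.limb_size_in_image()       # S1205 (ratio of whole image)
        if device.limb_size_changed():           # S1206
            rate = device.rate_for_size(size)    # S1207
        if device.pinch_detected():              # S1208
            pos = device.limb_position()         # S1209
            if device.overlaps_preview(pos):     # S1210
                device.resize_preview(rate)      # S1211 (enlarge/reduce)
            else:
                device.zoom(rate)                # S1212
        device.track_limb_motion()               # S1213
        if device.limb_left_range():             # limb fully out of R
            break
    device.capture_and_store()                   # S1215
    device.stop_imaging_units()                  # S1216
```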
- FIG. 25 is a diagram for explaining another example of the control provided by the control program 24a of the wearable device 1.
- FIG. 25 is an example of control for performing the enlargement/reduction process of the preview image by the wearable device 1, as in the case of FIG. 20.
- in step S160 illustrated in FIG. 25, the user moves the right hand H to a position where it is superimposed on the preview image 60 as viewed by the user within the imaging range R seen through the display units 32a and 32b. At this time, the pads of the index finger and thumb of the user's right hand H are in contact with each other (a picking posture).
- the detection unit 44 detects that the right hand H has entered the imaging range R, and the imaging units 40 and 42 are in an imaging standby state. At this time, the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- in step S161, the user performs an operation of separating the index finger and the thumb from each other with the right hand H superimposed on the preview image 60.
- the wearable device 1 enlarges the preview image 60 by detecting an operation in which the index finger and the thumb are separated from each other.
- the wearable device 1 detects a direction (first direction; T1) in which the index finger and the thumb are separated from each other, and temporarily stores the direction T1.
- the first direction T1 may be a direction in which the index finger is separated from the thumb, or may be a direction in which the thumb is separated from the index finger.
- in step S162, the user moves the upper limb in a second direction T2 that intersects the first direction T1 at a predetermined angle or more (in step S162, the second direction T2 is substantially orthogonal to the first direction T1).
- the wearable device 1 detects that the upper limb has moved in the second direction T2.
- the wearable device 1 determines that the operation of separating the index finger and the thumb from each other has been completed with the detection of the movement of the upper limb in the second direction T2.
- in this way, the wearable device 1 completes the enlargement/reduction operation upon the movement of the upper limb in the second direction, which is unlikely to be confused with the operation involved in the image enlargement/reduction (the operation of separating the index finger and the thumb in the first direction). Therefore, the operation is not troublesome, and usability is improved.
- here, the operation for performing the image enlargement/reduction and the operation for completing it are distinguished from each other as operations in different directions; however, the present invention is not limited to this.
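- the "intersects at a predetermined angle or more" test can be sketched as an angle comparison between the stored direction T1 and the subsequent motion T2; the 60-degree threshold below is an assumption, since the text only requires a predetermined angle:

```python
# Sketch of completing the enlargement gesture by movement in a second
# direction T2 that intersects the first direction T1 steeply enough.

import math

def angle_between(v1, v2) -> float:
    """Angle in degrees between two 2D direction vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return 0.0
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))

def completes_gesture(t1, t2, min_angle: float = 60.0) -> bool:
    """True when limb motion t2 crosses the pinch direction t1 at the
    predetermined angle or more, read as 'finish enlargement/reduction'."""
    return angle_between(t1, t2) >= min_angle

print(completes_gesture((1.0, 0.0), (0.05, 1.0)))  # near-orthogonal -> True
```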
- FIG. 26 is a diagram for describing another example of the control provided by the control program 24a of the wearable device 1.
- FIG. 26 is an example of control for performing the enlargement/reduction process of the preview image by the wearable device 1, as in the case of FIG. 20.
- FIG. 26 shows the display contents of the display units 32a and 32b (the contents of the area of the display region overlapping the imaging range R) and the distance L between the wearable device 1 and the upper limb, respectively.
- the distance L between the wearable device 1 and the upper limb in the front-rear direction of the user is detected by the distance measuring unit 46.
- in step S170, the user moves the right hand H to a position where it is superimposed on the preview image 60 as viewed by the user within the imaging range R seen through the display units 32a and 32b. At this time, the pads of the index finger and thumb of the user's right hand H are in contact with each other (a picking posture). Also in step S170, the wearable device 1 detects by the detection unit 44 that the right hand H has entered the imaging range R, and places the imaging units 40 and 42 in the imaging standby state. At this time, the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- the distance between wearable device 1 and the upper limb (right hand H) is distance L1.
- in step S171, the user performs an operation of separating the index finger and the thumb from each other with the right hand H superimposed on the preview image 60.
- the wearable device 1 enlarges the preview image 60 to the preview image 61 by detecting the movement of the index finger and the thumb away from each other in step S171.
- the distance between the wearable device 1 and the upper limb (right hand H) is the distance L1 as in step S170.
- in step S172, the user brings the right hand H closer to the wearable device 1 than in the case of step S171.
- the distance between the wearable device 1 and the upper limb (right hand H) is a distance L2 that is smaller than the distance L1.
- in this state, the wearable device 1 does not execute the enlargement/reduction process of the preview image 60 even when it detects that the predetermined operation (the operation of separating the index finger and the thumb) has been performed.
- in this way, the wearable device 1 detects and distinguishes a first state and a second state in which the distance L between the own device and the upper limb (predetermined object) in the front-rear direction of the user differs. The wearable device 1 enables the predetermined process on the preview image 60 (object), that is, the process of enlarging/reducing the preview image 60 by the operation of separating (or bringing together) the index finger and the thumb, in the first state (the state at the distance L1). On the other hand, the wearable device 1 disables the process in the second state (the state at the distance L2).
- since the wearable device 1 completes the enlargement/reduction operation by a motion different from the finger motion involved in the image enlargement/reduction operation (namely, a motion of moving the upper limb in a direction toward or away from the wearable device 1), the operation is not troublesome and usability is improved.
- instead of using the distance measuring unit 46, a configuration may be used in which the distance between the own device and the upper limb (predetermined object) is estimated based on the size of the upper limb in the captured images captured by the imaging units 40 and 42.
- the image enlargement / reduction process is exemplified as the predetermined process, but the present invention is not limited to this and may be applied to other processes.
- the predetermined process for the object may be a process for performing a drag operation on the object. An example of this case will be described with reference to the schematic diagram of the distance between the wearable device 1 and the upper limb in FIG. 26. In a state where the distance L between the wearable device 1 and the upper limb is the predetermined distance L2, the wearable device 1 is in a state where the drag operation cannot be performed.
- when the user performs an operation of moving the upper limb in a direction away from the wearable device 1 (an operation of pushing the upper limb away) so that the distance becomes the distance L1, which is larger than the distance L2, the wearable device 1 may enter a state in which the drag operation can be performed. In this way, the drag operation can be performed while the user pushes the upper limb away, and the user can drop the object (cancel the selected state) by performing an operation of pulling the upper limb back at a desired position.
- the drag operation includes an operation in which the object is regarded as selected when the upper limb is superimposed on the display position of the object for a predetermined time or more and, in the selected state, the display position of the object is moved so as to follow the movement of the position of the upper limb.
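- the distance-gated drag just described can be pictured as below; the dwell time, the enable distance, and the class structure are assumptions chosen to mirror the L1/L2 behavior in the text:

```python
# Sketch of the distance-gated drag: pushing the limb away (distance grows
# from L2 toward L1) enables dragging; pulling it forward drops the object.

class DragController:
    DWELL_TIME = 1.0        # seconds over the object to count as selected
    ENABLE_DISTANCE = 0.50  # metres, standing in for L1; below this, drag is off

    def __init__(self):
        self.selected = False

    def update(self, hover_seconds, limb_distance, limb_pos, obj_pos):
        if limb_distance >= self.ENABLE_DISTANCE:
            if hover_seconds >= self.DWELL_TIME:
                self.selected = True
            if self.selected:
                obj_pos[0], obj_pos[1] = limb_pos   # object follows the limb
        else:
            self.selected = False                   # pulled forward -> drop

obj = [0.2, 0.2]
ctrl = DragController()
ctrl.update(1.2, 0.60, (0.5, 0.4), obj)   # pushed away: drag follows the limb
ctrl.update(0.0, 0.30, (0.9, 0.9), obj)   # pulled forward: object dropped
print(obj, ctrl.selected)                 # [0.5, 0.4] False
```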
- in the above, the configurations in which the wearable device 1 changes the size of the icon to be displayed and performs the image enlargement/reduction process (change process) based on the size of the upper limb in the captured image (or the distance between the wearable device 1 and the upper limb) have been described.
- FIG. 27 is a diagram for explaining another example of the control provided by the control program 24a of the wearable device 1.
- FIG. 27 is an example of control for changing the display area of the wearable device 1.
- FIG. 27 shows the display contents of the display units 32a and 32b (the contents of the area of the display region overlapping the imaging range R) and the distance L between the wearable device 1 and the upper limb, respectively.
- the distance L between the wearable device 1 and the upper limb in the front-rear direction of the user is detected by the distance measuring unit 46.
- in step S180, the upper limb (right hand H) is located at the distance L1 in front of the wearable device 1.
- the wearable device 1 sets the display application area of the display units 32a and 32b as a region 81 in accordance with the distance L1 between the wearable device 1 and the upper limb.
- the display application area is an area that includes a position determined based on the position of the upper limb in the imaging range R.
- in the display application area, an icon group 58 on which a predetermined process can be executed by superimposing the upper limb (right hand H) is displayed.
- in step S181, the user has moved the upper limb (right hand H) closer to the own device than in the case of step S180.
- the distance between the wearable device 1 and the upper limb (right hand H) is a distance L2 that is smaller than the distance L1.
- the size of the upper limb (right hand H) that the user visually recognizes through the display units 32a and 32b becomes larger than in the case of step S180.
- accordingly, the wearable device 1 sets the display application area to a region larger than the region 81 in accordance with the distance L2 between the wearable device 1 and the upper limb (in accordance with the larger appearance of the visually recognized upper limb (right hand H)).
- in this enlarged display application area as well, the icon group 58 on which a predetermined process can be executed by superimposing the upper limb (right hand H) is displayed.
- in this way, the wearable device 1 estimates the change in the size of the upper limb (right hand H) that the user visually recognizes through the display units 32a and 32b based on the change in the distance L between the wearable device 1 and the upper limb, and changes the size of the display application area based on the estimated size.
- instead of detecting the distance L by the distance measuring unit 46, the wearable device 1 may be configured to change the size of the display application area based on the size of the upper limb in the captured images captured by the imaging units 40 and 42.
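- a minimal sketch of the area sizing follows; the inverse-proportional model and the reference values are assumptions, since the text only states that a smaller distance L yields a larger display application area:

```python
# Sketch of sizing the display application area from the limb distance L:
# the closer the limb (smaller L), the larger it appears to the user, so
# the applied area grows. The inverse-proportional model is an assumption.

def display_area_size(distance_l: float,
                      base_size: float = 100.0,
                      reference_l: float = 0.5) -> float:
    """Return a side length for the display application area (such as the
    region 81) that grows as the limb approaches (L2 < L1 -> larger area)."""
    distance_l = max(distance_l, 1e-6)   # guard against division by zero
    return base_size * reference_l / distance_l

print(display_area_size(0.5), display_area_size(0.25))  # 100.0 vs 200.0
```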
- the change process may be a process using a change bar having a bar-shaped operation area. The change bar includes a bar for changing the volume during music playback when a music playback application is activated, and a seek bar that displays the data playback location and changes the playback location by a predetermined operation when a video playback application is activated. For example, the change process may display a change bar having a first length on the display units 32a and 32b when the size of the upper limb detected in the captured image (the ratio of the upper limb region to the entire captured image) is greater than or equal to a predetermined value, and display a change bar having a second length smaller than the first length when the size is less than the predetermined value.
- with this configuration, the slider (for example, a knob-shaped operation portion) can be slid within a range suited to the movable range of the upper limb, and an inappropriate slide of the slider due to a slight movement of the upper limb (a slight shift in the position of the upper limb) is unlikely to occur.
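- the length selection itself reduces to a threshold test, as in the sketch below; the 30% threshold and the two lengths are illustrative assumptions, since the text only names a "predetermined value" and a first and second length:

```python
# Sketch of choosing the change-bar length from the detected limb size
# (the ratio of the upper-limb region to the whole captured image).

def change_bar_length(limb_ratio: float,
                      first_length: float = 300.0,
                      second_length: float = 150.0,
                      threshold: float = 0.30) -> float:
    """Longer bar for a large (near) limb, shorter bar for a small (far)
    limb, so the slider range matches the limb's movable range."""
    return first_length if limb_ratio >= threshold else second_length

print(change_bar_length(0.45), change_bar_length(0.10))  # 300.0 150.0
```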
- the “change process” in the above embodiments refers to, for example, a process for changing a physical quantity, a setting value, a parameter, or a sensory quantity perceived by the user, which changes continuously or stepwise in relation to the execution of a predetermined function.
- targets of the change process include a correction value in exposure correction related to the imaging function, the ISO sensitivity, white balance, shutter speed, aperture value, depth of field, focal length, zoom rate, the volume and playback location in music playback, video playback, and the like, the microphone volume, the display image size, the display position, the display image enlargement/reduction ratio, and so on. Note that the process of selecting one of a plurality of selection targets (for example, a plurality of display images) based on the number of fingers having a predetermined state is not included in the “change process”.
- FIG. 28 is a diagram for explaining another example of the control provided by the control program 24a of the wearable device 1.
- the wearable device 1 When the wearable device 1 detects that the right hand H has entered the imaging range R by the detection unit 44 in step S230 illustrated in FIG. 28, the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state.
- the wearable device 1 displays the icons 65 and 66 on the display units 32a and 32b so as to be within the imaging range R.
- the icon 65 is, for example, a display image indicating one of the imaging functions: a function for changing the ISO sensitivity. The icon 66 is, for example, a display image indicating another imaging function: a function for changing the correction value in exposure correction.
- a case where the wearable device 1 displays the icons 65 and 66 within the imaging range R will be described; however, they may be displayed outside the imaging range R.
- the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- in step S230, when the user moves the right hand H from outside the imaging range R of the imaging units 40 and 42 into the imaging range R, the user visually recognizes the icons 65 and 66 on the display units 32a and 32b and visually recognizes the foreground 100 and the right hand H through the display units 32a and 32b. By visually recognizing the icons 65 and 66, the user can recognize that the wearable device 1 is in a state capable of capturing an image.
- the user extends the index finger of the right hand H.
- here, an extended finger is regarded as a finger having a predetermined state.
- the predetermined area of the index finger may function as a pointer that can specify the predetermined position of the display area of the display units 32a and 32b.
- the predetermined region of the index finger may be defined as, for example, a portion closer to the fingertip than the first joint of the index finger, a nail of the index finger of the right hand H, or the like.
- in step S230, the user superimposes the predetermined area of the index finger on the icon 65 in the display area of the display units 32a and 32b.
- the user visually recognizes that a predetermined area of the index finger is superimposed on the icon 65 in the imaging range R.
- in the wearable device 1, the imaging range R substantially matches the display areas of the display units 32a and 32b; therefore, the wearable device 1 can recognize that the user sees the predetermined area of the index finger superimposed on the icon 65.
- the wearable device 1 places the icon 65 in a selected state based on the predetermined area of the index finger being superimposed on the icon 65.
- the icon 65 may be selected based on the fact that the predetermined area of the index finger is superimposed on the icon 65 for a predetermined time or longer.
- the wearable device 1 then proceeds to step S231.
- in step S231, the wearable device 1 displays the list 67, which is a new display image, based on the selection of the icon 65.
- the list 67 is a list of setting values based on the display content of the icon 65 (ISO sensitivity changing function).
- the types of setting values are limited to five or fewer so that each can be selected according to the number of extended fingers. In this example, four types of setting values are defined, and each setting value is numbered.
- the list 67 does not necessarily have a configuration in which each setting value is numbered.
- in step S231, based on the number of fingers having the predetermined state (extended state) being one (only the index finger), the first setting value (400, the top row) of the four types of setting values is set.
- note that the list 67 is displayed at a position superimposed on the icon 66, but it may be displayed at another position.
- in step S232, it is assumed that the user extends the middle finger and the ring finger in addition to the index finger. Then, based on the number of fingers having the predetermined state (extended state) having changed to three, the third setting value (1600) of the four types of setting values is set.
- in step S233, the user moves the right hand H from within the imaging range R to outside the imaging range R. At this time, the user moves the right hand H without changing the number of extended fingers.
- when the wearable device 1 determines, based on the detection result of the detection unit 44, that the right hand H has moved from within the imaging range R to outside the imaging range R, the wearable device 1 shifts the imaging units 40 and 42 to the imaging state and causes them to image the foreground 100.
- the imaging units 40 and 42 image the foreground 100 with the ISO sensitivity (1600) set in step S232.
- the wearable device 1 acquires a captured image 71 captured by the imaging units 40 and 42 and stores the captured image 71 in the storage unit 24 as captured data 24b. After that, the wearable device 1 stops the imaging units 40 and 42 and shifts the imaging units 40 and 42 to an unstarted state.
- as described above, the control unit 22 of the wearable device 1 detects the number of fingers having a predetermined state (extended fingers) in the upper limb from the detection results of the imaging units 40 and 42 serving as the detection unit 44, and executes the change process related to a predetermined function (in the above example, changing the ISO sensitivity in the imaging function) based on that number. Thereby, the wearable device 1 can be operated by a simple and intuitive operation.
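- the finger-count selection reduces to an index into the displayed list, as sketched below; the text fixes only the first value (400) and the third value (1600), so the second and fourth ISO values are assumptions:

```python
# Sketch of the finger-count selection in FIG. 28: the number of extended
# fingers (1 to 5) picks the corresponding row in the displayed list 67.

ISO_LIST = [400, 800, 1600, 3200]   # 400 and 1600 from the text; rest assumed

def select_setting(extended_fingers: int, values=ISO_LIST):
    """Return the value chosen by the finger count, or None if out of range.
    The list is kept to five or fewer entries so every row is reachable."""
    if 1 <= extended_fingers <= len(values):
        return values[extended_fingers - 1]
    return None

print(select_setting(1))   # 400  (index finger only, step S231)
print(select_setting(3))   # 1600 (index + middle + ring, step S232)
```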
- the changing process may be any process capable of changing a predetermined setting value related to the execution content of the predetermined function, in addition to the above-described change in ISO sensitivity.
- the predetermined position of the display area is designated based on the position of the predetermined area of the index finger within the imaging range R.
- the present invention is not limited to this.
- a configuration in which a predetermined position of the display region is designated based on the position of the predetermined region of the index finger within the detection range of the detection unit 44 other than the imaging units 40 and 42 may be used.
- FIG. 29 is a diagram for explaining another example of the control provided by the control program 24a of the wearable device 1.
- the wearable device 1 When the wearable device 1 detects that the right hand H has entered the imaging range R by the detection unit 44 in step S240 shown in FIG. 29, the wearable device 1 shifts the imaging units 40 and 42 to the imaging standby state.
- the wearable device 1 displays the icons 65 and 66 on the display units 32a and 32b in the same manner as in the example shown in FIG. 28.
- the wearable device 1 temporarily stores the video data 70 sequentially transmitted from the imaging units 40 and 42 in the imaging standby state in the storage unit 24.
- the user visually recognizes the icons 65 and 66 on the display portions 32a and 32b, and visually recognizes the foreground 100 and the right hand H through the display portions 32a and 32b.
- in step S240, the user extends the index finger of the right hand H.
- a predetermined region of the index finger (a portion closer to the fingertip than the first joint in the index finger) is assumed to function as a pointer that can specify a predetermined position in the display region of the display units 32a and 32b.
- in step S240, the user superimposes the predetermined area of the index finger on the display area of the icon 65 on the display units 32a and 32b.
- the wearable device 1 recognizes that the user visually perceives the predetermined area of the index finger as superimposed on the icon 65, and uses this as a trigger to enable execution of the change process related to the function based on the icon 65.
- in step S241, the wearable device 1 displays the list 67 of setting values based on the display content of the icon 65 (the ISO sensitivity changing function), as in the example shown in FIG. 28.
- the setting values in the list 67 are limited to five or fewer types so that each can be selected according to the number of extended fingers.
- with the predetermined area of the index finger superimposed on the display area of the icon 65 on the display units 32a and 32b (the state in which the change process based on the icon 65 can be executed), the user extends the middle finger and the ring finger. The wearable device 1 then sets the third setting value (1600) of the four types of setting values based on the number of fingers having the predetermined state (extended state) having changed to three.
- the control unit 22 of the wearable device 1 determines, from the captured images captured by the imaging units 40 and 42, whether the predetermined area of the index finger is included in a first predetermined space in real space. The first predetermined space is the space that the user can visually recognize as superimposed on the icon 65 through the display units 32a and 32b.
- in other words, the control unit 22 of the wearable device 1 determines whether the user visually recognizes the predetermined area of the index finger as superimposed on the display area of the icon 65. When the wearable device 1 determines that the predetermined area of the index finger is visually recognized by the user as overlapping the display area of the icon 65, the change process can be executed as in the example shown in FIG. 28.
- In step S242, the user superimposes the predetermined region of the index finger on the display area of the icon 66. The control unit 22 in the wearable device 1 determines that the predetermined region of the index finger is in the space (the first predetermined space) that the user can visually perceive as superimposed on the icon 66 via the display units 32a and 32b, and then enables execution of the change process. The wearable device 1 displays a list 68 of setting values based on the display content of the icon 66 (the exposure correction function). Based on the number of fingers in the predetermined state (the extended state) being four, the control unit 22 sets the fourth setting value (+1) among the five types of exposure correction setting values in the list 68.
- As described above, the control unit 22 in the wearable device 1 determines whether the predetermined portion of the upper limb is included in the real first predetermined space, and permits execution of the change process when it determines that the predetermined portion is included. Since the change process is therefore not performed unless the user intentionally positions the predetermined portion of the upper limb in the first predetermined space, erroneous operations can be reduced.
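- One plausible way to model the "first predetermined space" is an axis-aligned region in front of the device, with the change process gated on the fingertip lying inside it. The following sketch is an assumption for illustration; the patent does not define coordinates, units, or this API:

```python
# Hypothetical sketch: permit the change process only while the fingertip lies
# inside the first predetermined space, modeled here as an axis-aligned box.
from dataclasses import dataclass

@dataclass
class Box:
    xmin: float; xmax: float
    ymin: float; ymax: float
    zmin: float; zmax: float

    def contains(self, point):
        x, y, z = point
        return (self.xmin <= x <= self.xmax
                and self.ymin <= y <= self.ymax
                and self.zmin <= z <= self.zmax)

# Assumed bounds (metres) for the space seen superimposed on icon 65.
ICON_65_SPACE = Box(-0.10, 0.00, 0.00, 0.08, 0.20, 0.60)

def change_process_permitted(fingertip_xyz):
    return ICON_65_SPACE.contains(fingertip_xyz)

print(change_process_permitted((-0.05, 0.04, 0.40)))  # inside -> True
```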
- The wearable device 1 is configured to detect the position of the upper limb (or of a predetermined region of the upper limb) in real space from the captured images captured by the imaging units 40 and 42, but the present invention is not limited to this. The wearable device 1 may instead detect the position of the upper limb in real space with the detection unit 44 rather than the imaging units 40 and 42. For example, the space in front of the user may be defined by a three-dimensional coordinate system whose z axis points forward; the position (x, y) of the upper limb in that space is then detected, and a predetermined position in the display area of the display units 32a and 32b is designated according to the position of the upper limb.
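- Mapping the detected (x, y) position to a display position can be as simple as a clamped linear rescaling. The ranges and resolution in this sketch are assumptions, not values from the patent:

```python
# Hypothetical sketch: map an upper-limb position (x, y) in the detection range
# to a pixel position in the display area of the display units 32a and 32b.
DETECT_X = (-0.3, 0.3)            # metres, assumed horizontal detection range
DETECT_Y = (-0.2, 0.2)            # metres, assumed vertical detection range
DISPLAY_W, DISPLAY_H = 1280, 720  # assumed display resolution

def to_display(x, y):
    """Linearly rescale a detection-space point to display coordinates."""
    u = (x - DETECT_X[0]) / (DETECT_X[1] - DETECT_X[0])
    v = (y - DETECT_Y[0]) / (DETECT_Y[1] - DETECT_Y[0])
    u = min(max(u, 0.0), 1.0)     # clamp to the display area
    v = min(max(v, 0.0), 1.0)
    return round(u * (DISPLAY_W - 1)), round((1.0 - v) * (DISPLAY_H - 1))

print(to_display(0.0, 0.0))       # centre of the range -> roughly mid-screen
```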
- FIG. 30 is a flowchart illustrating a processing procedure related to imaging control by the wearable device 1. The processing procedure shown in FIG. 30 is realized by the control unit 22 executing the control program 24a, and is repeatedly executed while the wearable device 1 is worn on the head and operating, for example.
- As step S2101, the control unit 22 of the wearable device 1 determines whether an upper limb has been detected within the imaging range R. Specifically, the control unit 22 determines that an upper limb has been detected when the detection unit 44 detects one. If it determines that no upper limb has been detected (step S2102, No), the control unit 22 re-executes the process of step S2101. If it determines that an upper limb has been detected (step S2102, Yes), the control unit 22 advances the process to step S2103.
- As step S2103, the control unit 22 activates the imaging units 40 and 42. As step S2104, the control unit 22 displays the icons 65 and 66 on the display units 32a and 32b. As step S2105, the control unit 22 estimates the position of the predetermined region of the upper limb in real space from the video data 70 sequentially transmitted from the imaging units 40 and 42.
- As step S2106, based on the position estimated in step S2105, the control unit 22 determines whether the predetermined region of the upper limb is included in the real first predetermined space; in other words, it determines whether the user visually perceives the predetermined region of the index finger as overlapping the display area of the icon 65. If it determines that the upper limb is included in the first predetermined space (step S2106, Yes), the control unit 22 advances the process to step S2107. If it determines that the upper limb is not included in the first predetermined space (step S2106, No), it performs the process of step S2105 again.
- As step S2107, the control unit 22 enters the state in which the change process can be executed. In this case, the control unit 22 displays the list 67 of setting values on the display units 32a and 32b.
- As step S2108, the control unit 22 detects the number of fingers in the predetermined state. As step S2109, the control unit 22 executes the change process based on that number; specifically, it changes the setting value related to the imaging function to the value corresponding to the number of fingers in the predetermined state.
- As step S2110, the control unit 22 detects the movement of the upper limb in the imaging range R based on the video data 70 of the imaging units 40 and 42. As step S2111, the control unit 22 determines, based on the detection result of step S2110, whether the upper limb has moved from inside the imaging range R to outside it. Specifically, the control unit 22 determines that the upper limb has moved out of the imaging range R when it detects that the entire upper limb has left the imaging range R so that it no longer appears in the captured image. If it determines that the upper limb has not moved out of the imaging range R (step S2111, No), the control unit 22 re-executes the processes from step S2110. If it determines that the upper limb has moved out of the imaging range R (step S2111, Yes), the control unit 22 advances the process to step S2112.
- As step S2112, the control unit 22 acquires a captured image. Specifically, the control unit 22 causes the imaging units 40 and 42 to image the foreground 100 with the setting value set in step S2109, obtains the captured image 71, and stores it in the storage unit 24 as the imaging data 24b. Thereafter, as step S2113, the control unit 22 stops the imaging units 40 and 42. When the imaging units 40 and 42 have stopped, the control unit 22 ends the processing procedure shown in FIG. 30.
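- Read end to end, steps S2101 to S2113 form a simple wait/arm/apply/capture loop. The sketch below is an assumed rendering of that flow; the detector, cameras, display, and storage objects and all their methods are stand-ins, since the patent specifies behavior rather than an API:

```python
# Hypothetical sketch of the S2101-S2113 flow of FIG. 30.
import time

def in_first_predetermined_space(pos):
    # Placeholder bounds for the space seen superimposed on icon 65 (assumed).
    x, y, z = pos
    return -0.10 <= x <= 0.0 and 0.0 <= y <= 0.08 and 0.2 <= z <= 0.6

def imaging_control_loop(detector, cameras, display, storage):
    while not detector.upper_limb_detected():      # S2101/S2102
        time.sleep(0.03)
    cameras.start()                                # S2103: activate units 40/42
    display.show_icons(["icon65", "icon66"])       # S2104
    pos = cameras.estimate_limb_position()         # S2105
    while not in_first_predetermined_space(pos):   # S2106: "No" -> redo S2105
        pos = cameras.estimate_limb_position()
    display.show_list("list67")                    # S2107: change process armed
    n = detector.count_extended_fingers()          # S2108
    cameras.apply_setting(n)                       # S2109: count-based value
    while detector.upper_limb_in_range():          # S2110/S2111: wait for exit
        time.sleep(0.03)
    storage.save(cameras.capture())                # S2112: captured image 71
    cameras.stop()                                 # S2113
```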
- In the above embodiment, the wearable device 1 is configured so that the change process becomes executable when the predetermined portion of the upper limb is superimposed on the icon 65 (or 66) related to the imaging function. Alternatively, when the predetermined portion of the upper limb enters the imaging range R (or the detection range of the detection unit 44), the wearable device 1 may execute the change process based on the number of fingers in the predetermined state at the time of entry.
- The change process is not limited to processes related to the imaging function.
- For example, when the wearable device 1 displays a predetermined display image, the change process may enlarge or reduce the display image based on the number of fingers in the predetermined state.
- When the display image is composed of text and a photograph (or an illustrative diagram), only the text may be enlarged based on the number of fingers in the predetermined state, or only the photograph may be enlarged based on that number.
- When the extended finger functions as a pointer and a position in the display area of the display units 32a and 32b can be designated according to the position of the predetermined region of the extended finger in real space (or its position in the captured image), only a predetermined display region including the designated position may be enlarged or reduced, as sketched below.
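- As a minimal sketch of that partial enlargement, assuming pixel coordinates and a fixed region size that the patent does not specify:

```python
# Hypothetical sketch: compute the display rectangle around the designated
# position that alone should be enlarged or reduced.
def zoom_region(center, region_size=(200, 150), display=(1280, 720)):
    """Return (left, top, right, bottom) of the region to magnify."""
    cx, cy = center
    w, h = region_size
    left = min(max(0, cx - w // 2), display[0] - w)   # keep the box on screen
    top = min(max(0, cy - h // 2), display[1] - h)
    return (left, top, left + w, top + h)

print(zoom_region((640, 360)))  # region centred on the pointed-at position
```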
- The change process may also be, for example, a process of changing the playback volume based on the number of extended fingers while a music playback application is active, or a process of changing the playback position in the data based on the number of extended fingers while a video playback application is active.
- Conversely, the state in which a finger is bent may be set as the predetermined state. A finger may be regarded as bent when any of its joints (the first to third joints) is bent by a predetermined angle or more (conversely, a finger none of whose joints is bent that far can be regarded as extended). A finger may be regarded as being in the predetermined state when it has remained bent for more than a certain period of time, or upon a transition from the extended state to the bent state.
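- A bent-state classifier of this kind could combine a joint-angle threshold with a minimum hold time. Both thresholds below are assumptions for illustration:

```python
# Hypothetical sketch: a finger counts as "bent" when any joint angle exceeds
# a threshold, and enters the predetermined state only after the bend persists.
import time

BEND_ANGLE_DEG = 40.0   # assumed "predetermined angle"
HOLD_SECONDS = 0.5      # assumed "certain period of time"

class FingerState:
    def __init__(self):
        self._bent_since = None

    def update(self, joint_angles_deg):
        """joint_angles_deg: first-to-third joint angles, 0 = fully straight.
        Returns True while the finger is in the predetermined (bent) state."""
        bent = max(joint_angles_deg) >= BEND_ANGLE_DEG
        now = time.monotonic()
        if bent:
            if self._bent_since is None:
                self._bent_since = now  # transition: extended -> bent
            return (now - self._bent_since) >= HOLD_SECONDS
        self._bent_since = None
        return False
```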
- When the wearable device 1 detects that the predetermined portion of the upper limb has entered the imaging range R (or the detection range of the detection unit 44), it need not execute the change process based on the number of fingers in the predetermined state at the time of that detection (that is, at entry).
- In other words, the control unit 22 may determine whether the predetermined portion of the upper limb has entered the real second predetermined space (the imaging range R) and, when it determines that the predetermined portion has entered, refrain from executing the change process based on the number of fingers in the predetermined state at the time of entry. In this way, an unintended change process is not triggered merely by the user moving the upper limb into the imaging range R.
- When the extended fingers are two or more of the five fingers, the wearable device 1 may determine whether the combination of those fingers is a predetermined combination and, if it is, execute another process instead of the change process based on the number of fingers in the predetermined state.
- Alternatively, when the extended fingers are two or more of the five fingers, the wearable device 1 may determine whether the combination of those fingers is a predetermined combination; when it is, the wearable device 1 determines whether the extended fingers have performed a predetermined operation within a first predetermined time and, when the predetermined operation has been performed, executes a process based on that operation instead of the change process based on the number of fingers in the predetermined state.
- For example, when the extended fingers are the index finger and the thumb, an operation of moving the index finger and the thumb apart or bringing them closer together may, as the other process, enlarge or reduce the display image or the like.
- Specifically, when it is detected that the extended fingers are the index finger and the thumb, it may be determined whether a predetermined portion of the index finger and a predetermined portion of the thumb have moved apart or closer together as the predetermined operation within the first predetermined time from that detection; if it is determined that the operation has been performed, another process based on the operation (such as enlarging or reducing the display image) may be executed. If the predetermined portions of the index finger and the thumb have neither moved apart nor approached each other within the first predetermined time from the detection, the change process based on the number of extended fingers may be executed.
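- The decision between the pinch-based process and the count-based change process can be framed as watching the index-thumb tip distance for the first predetermined time. The thresholds and sample format in this sketch are assumptions:

```python
# Hypothetical sketch: classify index+thumb motion within the first
# predetermined time as zoom-in, zoom-out, or fall back to the count-based
# change process.
import math

FIRST_PREDETERMINED_TIME = 1.0   # seconds, assumed
PINCH_DELTA_M = 0.02             # change in tip distance (metres), assumed

def classify(samples):
    """samples: list of (timestamp, index_tip_xyz, thumb_tip_xyz) tuples."""
    t0, i0, th0 = samples[0]
    d0 = math.dist(i0, th0)
    for t, i, th in samples[1:]:
        if t - t0 > FIRST_PREDETERMINED_TIME:
            break
        delta = math.dist(i, th) - d0
        if delta > PINCH_DELTA_M:
            return "zoom_in"     # tips moved apart -> e.g. enlarge image
        if delta < -PINCH_DELTA_M:
            return "zoom_out"    # tips moved together -> e.g. reduce image
    return "count_based_change"  # no pinch within the time window

print(classify([(0.0, (0, 0, 0.3), (0.05, 0, 0.3)),
                (0.4, (0, 0, 0.3), (0.10, 0, 0.3))]))  # -> zoom_in
```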
- The control unit 22 in the wearable device 1 may execute the change process only when the number of extended fingers and the position of the upper limb in real space have not changed for a predetermined time or more. Since a change process then requires both a stable finger count and a stationary upper limb, a momentary, unintentional finger extension or pause of the upper limb does not trigger it, and the possibility of erroneous operation can be reduced.
- The control unit 22 in the wearable device 1 may also make the change process executable in response to a finger extension operation of the upper limb, and execute the change process when the finger extension operation is performed again while the change process is executable. In this case, the first extension operation only makes the change process executable and the second extension operation actually executes it, so the change process is not performed unexpectedly by the first extension operation alone.
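- This two-step behavior is essentially an arm-then-fire latch, as in the following assumed sketch:

```python
# Hypothetical sketch: the first extension operation only arms the change
# process; a second extension while armed actually executes it.
class TwoStepTrigger:
    def __init__(self, execute):
        self._armed = False
        self._execute = execute     # callback performing the change process

    def on_finger_extension(self):
        if self._armed:
            self._armed = False
            self._execute()         # second extension -> run the change process
        else:
            self._armed = True      # first extension -> become executable only

trigger = TwoStepTrigger(lambda: print("change process executed"))
trigger.on_finger_extension()       # arms; nothing visible happens
trigger.on_finger_extension()       # prints "change process executed"
```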
- In the above embodiment, the configuration in which the captured-image acquisition process is executed when the upper limb moves from inside the imaging range R to outside it has been described. Here, when the number of extended fingers changes because the upper limb moves out of the imaging range R, the control unit 22 may refrain from executing the change process based on the changed number of fingers. That is, while the upper limb moves from inside to outside the imaging range R and part of the upper limb (the right hand H or the like) is outside the imaging range R, the change process is not executed even if the number of extended fingers changes. Instead, the number of fingers extended while the upper limb (or the right hand H) was within the imaging range R is stored, and the stored number is applied to the change process after the upper limb (or the right hand H) has moved out of the imaging range R, as sketched below.
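- That store-and-apply behavior can be modeled as a latch that tracks the count while the hand is inside R and releases it once on exit, ignoring whatever count is seen during the exit itself. A sketch, with assumed per-frame inputs:

```python
# Hypothetical sketch: remember the finger count seen inside the imaging range
# R and apply it only after the hand has left, ignoring counts seen on exit.
class ExitLatch:
    def __init__(self):
        self._latched = None

    def update(self, in_range, extended_count):
        """Call once per frame; returns a count to apply, or None."""
        if in_range:
            self._latched = extended_count   # track while fully inside R
            return None
        count, self._latched = self._latched, None
        return count                         # released once, after the exit

latch = ExitLatch()
applied = None
for in_r, n in [(True, 3), (True, 3), (False, 1)]:   # count drops during exit
    result = latch.update(in_r, n)
    if result is not None:
        applied = result
print(applied)  # -> 3: the count seen inside R, not the 1 seen on the way out
```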
- In the above embodiment, the configuration in which the setting value is divided into five stages (five types), based on the number of fingers that can take the predetermined state with one hand (the right hand H) being one to five, has been shown. Alternatively, the setting value may be divided into ten stages (or six to nine stages), and the change process may be performed based on the number of fingers in the predetermined state across both hands.
- The present invention is also not limited to assigning all five one-hand counts to setting values. For example, the setting value may be divided into four stages; when the number of fingers in the predetermined state is any of two to five, the setting value is changed to the corresponding one of the four stages, while when the number of fingers in the predetermined state is one, that finger may function as a pointer for designating a predetermined position in the display area of the display units 32a and 32b based on its movement (a sketch of this dispatch follows).
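- The dispatch between pointer mode and stage selection might then look like this sketch, where the stage labels are placeholders (the patent does not name them):

```python
# Hypothetical sketch: one extended finger acts as a pointer; two to five
# extended fingers select one of four setting stages.
STAGES = ["stage-1", "stage-2", "stage-3", "stage-4"]  # assumed labels

def dispatch(extended_count, fingertip_xy):
    if extended_count == 1:
        return ("pointer", fingertip_xy)            # designate a display position
    if 2 <= extended_count <= 5:
        return ("set", STAGES[extended_count - 2])  # 2 fingers -> first stage
    return ("none", None)

print(dispatch(1, (640, 360)))  # -> ('pointer', (640, 360))
print(dispatch(4, None))        # -> ('set', 'stage-3')
```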
- In the above example, a larger setting value is selected as the number of fingers in the predetermined state increases, but this is not limiting: a smaller setting value may be selected as the number of fingers in the predetermined state increases, or the magnitude of the finger count and the magnitude of the selected setting value need not be correlated at all and may be associated with each other arbitrarily.
- The wearable device 1 may disable the change process based on the number of fingers in the predetermined state when the distance in the user's front-rear direction between the device itself and the predetermined object (the upper limb) existing in real space, calculated from the detection result of the distance measuring unit 46, is equal to or less than a predetermined distance.
- Similarly, the control unit 22 of the wearable device 1 may disable the change process based on the number of fingers in the predetermined state when the size of the predetermined region of the predetermined object (the upper limb) in the captured image is less than a predetermined size. For example, when an upper limb is detected in the captured image at an implausibly small size, such as when another person's upper limb in front of the user is captured, the change process is not performed, so the user is not bothered by unintended operations.
- Likewise, the wearable device 1 may disable the change process based on the number of fingers in the predetermined state when the distance in the user's front-rear direction between the device itself and the predetermined object (the upper limb) existing in real space, calculated from the detection result of the distance measuring unit 46, is equal to or greater than a predetermined distance.
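- Taken together, these conditions gate the change process on a plausible working distance and a plausible apparent hand size. All thresholds in this sketch are assumptions:

```python
# Hypothetical sketch: disable the count-based change process when the hand is
# too close, too far, or appears implausibly small in the captured image
# (for example, another person's hand further ahead of the user).
NEAR_LIMIT_M = 0.15    # assumed lower "predetermined distance"
FAR_LIMIT_M = 0.80     # assumed upper "predetermined distance"
MIN_REGION_PX = 2500   # assumed minimum hand-region area in pixels

def change_process_enabled(distance_m, hand_region_area_px):
    if distance_m <= NEAR_LIMIT_M or distance_m >= FAR_LIMIT_M:
        return False
    if hand_region_area_px < MIN_REGION_PX:
        return False
    return True

print(change_process_enabled(0.40, 9000))  # typical case -> True
print(change_process_enabled(0.40, 1200))  # too small in the image -> False
```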
- In the above embodiments, the wearable device 1 has a glasses shape, but the shape of the wearable device 1 is not limited to this. For example, the wearable device 1 may have a helmet-type shape that covers substantially the upper half of the user's head, or a mask-type shape that covers substantially the entire face of the user.
- In the above embodiments, the configuration in which a pair of display units 32a and 32b is provided in front of the user's left and right eyes has been exemplified; however, a configuration with a single display unit provided in front of one of the user's eyes may be used instead.
- Likewise, the configuration in which the edge of the front portion surrounds the entire periphery of the display area of the display units 32a and 32b has been exemplified, but a configuration in which only a part of the edge of the display area is enclosed may be employed.
- In the above embodiments, a configuration is shown in which the user's hand or fingers are detected as the upper limb by the imaging unit (or the detection unit); the hand or fingers can be detected in the same manner even when a glove or the like is worn.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Studio Devices (AREA)
- Camera Bodies And Camera Details Or Accessories (AREA)
- Controls And Circuits For Display Device (AREA)
- Position Input By Displaying (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Details Of Cameras Including Film Mechanisms (AREA)
- Image Analysis (AREA)
Abstract
Description
First, the overall configuration of the wearable device 1 will be described with reference to FIGS. 1 and 2. FIG. 1 is a perspective view of the wearable device 1. FIG. 2 is a front view of the wearable device 1 as worn by a user. As shown in FIGS. 1 and 2, the wearable device 1 is a head-mounted device worn on the user's head.
1a Front portion
1b Side portion
1c Side portion
13 Operation unit
22 Control unit
24 Storage unit
24a Control program
24b Imaging data
25 Detection processing unit
26 Display control unit
32a, 32b Display units
40, 42 Imaging units
44 Detection unit
46 Distance measuring unit
71 Captured image
R Imaging range
Claims (69)
- 1. A wearable device wearable on the head, comprising: an imaging unit configured to image a landscape in front of a user; and a control unit configured to detect a shielding object from a captured image captured by the imaging unit, wherein the control unit executes acquisition of the captured image, triggered by the shielding object moving from inside the imaging range of the imaging unit to outside the imaging range.
- 2. The wearable device according to claim 1, further comprising a detection unit capable of detecting whether the shielding object is present in front of the user, wherein the imaging unit starts the imaging operation of the landscape, triggered by the shielding object being present in front of the user.
- 3. The wearable device according to claim 1 or 2, wherein the control unit estimates a trajectory along which the shielding object moves from inside the imaging range to outside the imaging range, and sets the start point of the trajectory as the focus position for the acquisition of the captured image.
- 4. The wearable device according to any one of claims 1 to 3, wherein the control unit changes the acquisition content of the captured image, or the processing content applied to the acquired captured image, based on the moving direction of the shielding object.
- 5. The wearable device according to claim 4, wherein, when acquisition of a moving image has been started based on the moving direction of the shielding object, the control unit stops or suspends the acquisition of the moving image, triggered by the shielding object moving from outside the imaging range into the imaging range.
- 6. The wearable device according to any one of claims 1 to 5, wherein the control unit causes the imaging unit to execute zoom processing based on a predetermined motion of the shielding object within the imaging range.
- 7. The wearable device according to claim 6, wherein the control unit estimates a change in the distance between the device itself and the shielding object, and changes the enlargement/reduction ratio of the imaging range according to the change in the distance.
- 8. The wearable device according to claim 6 or 7, wherein the shielding object is an upper limb of the user, and the control unit brings the zoom processing into an executable state, triggered by the upper limb forming a predetermined shape.
- 9. The wearable device according to claim 8, wherein, in the state in which the zoom processing is executable, upon detecting that a predetermined gesture is made while the upper limb keeps forming the predetermined shape, the control unit changes the enlargement/reduction ratio of the imaging range according to the gesture.
- 10. The wearable device according to any one of claims 1 to 7, wherein the shielding object is either an upper limb of the user or a predetermined object held by the user.
- 11. The wearable device according to claim 10, wherein the predetermined object held by the user is another electronic device worn on the user's upper limb.
- 12. The wearable device according to any one of claims 6 to 11, further comprising a display unit arranged in front of the user's eyes and configured to display the captured image, wherein the control unit determines, triggered by the shielding object disappearing from the captured image, that the shielding object has moved from inside the imaging range of the imaging unit to outside the imaging range and executes the acquisition of the captured image, and does not execute the acquisition of the captured image when the shielding object has disappeared from the captured image as a result of the zoom processing.
- 13. A control method executed by a wearable device wearable on the head, comprising: imaging a landscape in front of a user with an imaging unit; detecting a shielding object from a captured image captured by the imaging unit; and executing acquisition of the captured image, triggered by the shielding object moving from inside the imaging range of the imaging unit to outside the imaging range.
- 14. A control program causing a wearable device wearable on the head to execute: imaging a landscape in front of a user with an imaging unit; detecting a shielding object from a captured image captured by the imaging unit; and executing acquisition of the captured image, triggered by the shielding object moving from inside the imaging range of the imaging unit to outside the imaging range.
- 15. A wearable device wearable on the head, comprising: an imaging unit configured to image scenery in front of a user; and a control unit configured to detect a predetermined object from a captured image captured by the imaging unit and, upon detecting that the size of a predetermined region of the predetermined object in the captured image has changed, perform a predetermined change process according to the change in size.
- 16. The wearable device according to claim 15, comprising a display unit arranged in front of the user's eyes, wherein the control unit performs, as the change process, a process of changing the size of a display image displayed by the display unit, and changes the size of the display image according to the change in the size of the predetermined region of the predetermined object in the captured image.
- 17. The wearable device according to claim 16, wherein the control unit changes the size of the display image such that the display image the user views on the display unit becomes larger than the size of the predetermined region of the predetermined object the user views through the display unit.
- 18. The wearable device according to claim 17, wherein the control unit estimates the size of the predetermined region of the predetermined object the user views through the display unit, based on a preset imaging range of the imaging unit and the size of the predetermined region of the predetermined object in the captured image.
- 19. The wearable device according to any one of claims 16 to 18, wherein the control unit does not perform the process of changing the size of the display image according to the size of the predetermined region when the size of the predetermined region of the predetermined object in the captured image is equal to or larger than a first size.
- 20. The wearable device according to any one of claims 16 to 19, wherein the control unit does not perform the process of changing the size of the display image according to the size of the predetermined region when the size of the predetermined region of the predetermined object in the captured image is equal to or smaller than a second size.
- 21. A wearable device wearable on the head, comprising: a distance measuring unit configured to detect the distance between the device itself and a predetermined object in the front-rear direction of a user; and a control unit configured to perform, upon detecting that the distance has changed, a predetermined change process according to the change in the distance.
- 22. The wearable device according to claim 21, comprising a display unit arranged in front of the user's eyes, wherein the control unit performs, as the change process, a process of changing the size of a display image displayed by the display unit, and changes the size of the display image according to the change in the distance.
- 23. The wearable device according to claim 22, wherein the control unit does not perform the process of changing the size of the display image according to the distance when the distance is equal to or greater than a first distance.
- 24. The wearable device according to claim 22 or 23, wherein the control unit does not perform the process of changing the size of the display image according to the distance when the distance is equal to or less than a second distance.
- 25. The wearable device according to claim 16 or 22, wherein the control unit changes the display position of the display image in accordance with the change in the size of the display image.
- 26. The wearable device according to claim 25, wherein, when the display unit displays a plurality of the display images, the control unit changes the display positions such that each of the plurality of display images lies along a different edge of the display area of the display unit.
- 27. The wearable device according to claim 25, wherein the predetermined object is an upper limb of the user, and the control unit determines whether the upper limb is a right upper limb or a left upper limb, and changes the display position of the display image according to the result of the determination.
- 28. The wearable device according to claim 27, wherein the control unit displays the display image on the right side of the display area of the display unit, or on the right side of the imaging range, when determining that the upper limb is a right upper limb, and displays the display image on the left side of the display area of the display unit, or on the left side of the imaging range, when determining that the upper limb is a left upper limb.
- 29. The wearable device according to any one of claims 15 to 28, further comprising a detection unit configured to detect a real object, wherein the control unit starts the imaging of the scenery by the imaging unit, triggered by detecting, from the detection result of the detection unit, that the predetermined object has entered the imaging range of the imaging unit from outside the imaging range.
- 30. The wearable device according to claim 29, comprising a display unit arranged in front of the user's eyes, wherein the control unit determines the edge of the imaging range on the side from which the predetermined object entered, and displays the display image on that edge side of the display area of the display unit.
- 31. The wearable device according to claim 30, wherein the control unit displays the display image, triggered by detecting, from the detection result of the detection unit, that the predetermined object has entered the imaging range of the imaging unit from outside the imaging range.
- 32. The wearable device according to any one of claims 15 to 31, wherein the control unit executes acquisition of the captured image, triggered by the predetermined object moving from inside the imaging range of the imaging unit to outside the imaging range.
- 33. The wearable device according to claim 32, wherein the control unit executes a function based on the display image, among the plurality of display images, through which the predetermined object has passed.
- 34. The wearable device according to claim 33, wherein the function is a function related to an acquisition processing method of the captured image.
- 35. The wearable device according to claim 33, wherein the function is a function related to a processing method applied to the captured image acquired by the imaging unit.
- 36. A wearable device wearable on the head, comprising: a display unit arranged in front of a user's eyes; an imaging unit configured to image scenery in front of the user; and a control unit configured to, upon detecting a predetermined object from a captured image captured by the imaging unit and detecting a predetermined motion of the predetermined object, perform a predetermined change process according to the displacement of the predetermined object accompanying the predetermined motion, wherein the change rate per unit displacement of the predetermined object in the change process is changed according to the size of the predetermined object in the captured image.
- 37. The wearable device according to claim 36, wherein the change rate becomes smaller as the size of the predetermined object in the captured image becomes larger.
- 38. The wearable device according to claim 36 or 37, wherein the predetermined object is an upper limb of the user, and the control unit detects, as the predetermined motion, a motion in which one finger and another finger of the upper limb move apart, or a motion in which one finger and another finger of the upper limb approach each other.
- 39. The wearable device according to claim 38, wherein the control unit performs, as the change process, a process of changing the size of a display image displayed by the display unit, and, when the size of the predetermined object in the captured image is within a predetermined range, does not execute the change process based on the change rate but performs another change process based on the position of the one finger and the position of the other finger of the upper limb.
- 40. The wearable device according to claim 39, wherein the size of the display image is changed such that two mutually opposing corners of the display image correspond respectively to the position of the one finger and the position of the other finger.
- 41. The wearable device according to claim 38, wherein the control unit detects a first direction in which the one finger and the other finger move apart, or in which the one finger and the other finger approach each other, and determines that the predetermined motion has been completed, triggered by detecting, after the predetermined motion is made, that the upper limb has moved in a direction intersecting the first direction at a predetermined angle or more.
- 42. A wearable device wearable on the head, comprising: a display unit arranged in front of a user's eyes; a distance measuring unit configured to detect the distance between the device itself and a predetermined object in the front-rear direction of the user; a detection unit configured to detect a real object; and a control unit configured to, upon detecting a predetermined motion of a real predetermined object from the detection result of the detection unit, perform a predetermined change process according to the displacement of the predetermined object accompanying the predetermined motion, wherein the change rate per unit displacement of the predetermined object in the change process is changed according to the distance between the device itself and the predetermined object.
- 43. The wearable device according to claim 42, wherein the change rate becomes larger as the distance between the device itself and the predetermined object becomes larger.
- 44. A wearable device wearable on the head, comprising: a display unit arranged in front of a user's eyes and configured to display an object; a distance measuring unit configured to detect the distance between the device itself and a predetermined object in the front-rear direction of the user; and a control unit configured to distinguish and detect a first state and a second state that differ in the distance, wherein a predetermined process on the object is made executable in the first state and made non-executable in the second state.
- 45. The wearable device according to claim 44, wherein the predetermined process on the object is a process of performing a drag operation on the object.
- 46. A control method executed by a wearable device wearable on the head, comprising: imaging scenery in front of a user with an imaging unit; detecting a predetermined object from a captured image captured by the imaging unit; and, upon detecting that the size of a predetermined region of the predetermined object in the captured image has changed, performing a predetermined change process according to the change in size.
- 47. A control program causing a wearable device wearable on the head to execute: imaging scenery in front of a user with an imaging unit; detecting a predetermined object from a captured image captured by the imaging unit; and, upon detecting that the size of a predetermined region of the predetermined object in the captured image has changed, performing a predetermined change process according to the change in size.
- 48. A control method executed by a wearable device wearable on the head, comprising: detecting the distance between the device itself and a predetermined object in the front-rear direction of a user; and, upon detecting that the distance has changed, performing a predetermined change process according to the change in the distance.
- 49. A control program causing a wearable device wearable on the head to execute: detecting the distance between the device itself and a predetermined object in the front-rear direction of a user; and, upon detecting that the distance has changed, performing a predetermined change process according to the change in the distance.
- 50. A control method executed by a wearable device wearable on the head and comprising a display unit arranged in front of a user's eyes, the method comprising: imaging scenery in front of the user with an imaging unit; and, upon detecting a predetermined object from a captured image captured by the imaging unit and detecting a predetermined motion of the predetermined object, performing a predetermined change process according to the displacement of the predetermined object accompanying the predetermined motion, wherein the change rate per unit displacement of the predetermined object in the change process is changed according to the size of the predetermined object in the captured image.
- 51. A control program causing a wearable device wearable on the head and comprising a display unit arranged in front of a user's eyes to execute: imaging scenery in front of the user with an imaging unit; and, upon detecting a predetermined object from a captured image captured by the imaging unit and detecting a predetermined motion of the predetermined object, performing a predetermined change process according to the displacement of the predetermined object accompanying the predetermined motion, wherein the change rate per unit displacement of the predetermined object in the change process is changed according to the size of the predetermined object in the captured image.
- 52. A control method executed by a wearable device wearable on the head and comprising a display unit arranged in front of a user's eyes, the method comprising: detecting the distance between the device itself and a predetermined object in the front-rear direction of the user; detecting a real object with a detection unit; and, upon detecting a predetermined motion of a real predetermined object from the detection result of the detection unit, performing a predetermined change process according to the displacement of the predetermined object accompanying the predetermined motion, wherein the change rate per unit displacement of the predetermined object in the change process is changed according to the distance between the device itself and the predetermined object.
- 53. A control program causing a wearable device wearable on the head and comprising a display unit arranged in front of a user's eyes to execute: detecting the distance between the device itself and a predetermined object in the front-rear direction of the user; detecting a real object with a detection unit; and, upon detecting a predetermined motion of a real predetermined object from the detection result of the detection unit, performing a predetermined change process according to the displacement of the predetermined object accompanying the predetermined motion, wherein the change rate per unit displacement of the predetermined object in the change process is changed according to the distance between the device itself and the predetermined object.
- 54. A wearable device wearable on the head, comprising a detection unit configured to detect an upper limb present in real space, wherein the number of fingers of the upper limb in a predetermined state is detected from the detection result of the detection unit, and a change process related to a predetermined function is executed based on the number of fingers.
- 55. The wearable device according to claim 54, comprising a control unit, wherein the control unit detects the number of fingers of the upper limb in the predetermined state from the detection result of the detection unit, and executes the change process related to the predetermined function based on the number of fingers.
- 56. The wearable device according to claim 54 or 55, wherein the change process is a process of changing a predetermined setting value related to the execution content of the predetermined function to a value based on the number of fingers.
- 57. The wearable device according to any one of claims 54 to 56, wherein the predetermined state of a finger is a state in which the finger is extended.
- 58. The wearable device according to any one of claims 55 to 57, wherein the control unit determines whether a predetermined portion of the upper limb is included in a real first predetermined space, and permits execution of the change process when determining that the predetermined portion of the upper limb is included in the real first predetermined space.
- 59. The wearable device according to any one of claims 55 to 57, wherein the control unit determines whether a predetermined portion of the upper limb has entered a real second predetermined space and, when determining that the predetermined portion of the upper limb has entered the real second predetermined space, does not execute the change process based on the number of fingers in the predetermined state at the time of the entry.
- 60. The wearable device according to any one of claims 55 to 59, wherein, when the extended fingers are a plurality of the five fingers, the control unit determines whether the combination of the plurality of fingers is a predetermined combination and, when the combination is the predetermined combination, executes another process instead of the change process based on the number of fingers in the predetermined state.
- 61. The wearable device according to any one of claims 57 to 59, wherein, when the extended fingers are a plurality of the five fingers, the control unit determines whether the combination of the plurality of fingers is a predetermined combination and, when the combination is the predetermined combination, determines whether the extended fingers have performed a predetermined operation within a first predetermined time, and, when the predetermined operation has been performed, executes a process based on the predetermined operation instead of the change process based on the number of fingers in the predetermined state.
- 62. The wearable device according to claim 61, wherein the predetermined operation is an operation in which predetermined portions of the plurality of extended fingers move apart from, or approach, each other.
- 63. The wearable device according to any one of claims 57 to 62, wherein the control unit executes the change process when the number of extended fingers and the position of the upper limb in real space have not changed for a predetermined time or more.
- 64. The wearable device according to claim 57, wherein the control unit brings the change process into an executable state, triggered by a finger extension operation of the upper limb, and executes the change process, triggered by the finger extension operation of the upper limb being performed again while the change process is in the executable state.
- 65. The wearable device according to any one of claims 55 to 64, comprising, as the detection unit, an imaging unit configured to image a landscape in front of the user, wherein the upper limb is detected in a captured image captured by the imaging unit, the number of fingers of the upper limb in the predetermined state is detected, and the change process is executed based on the number of fingers.
- 66. The wearable device according to claim 65, wherein the control unit executes acquisition processing of the captured image, triggered by the upper limb moving from inside the imaging range of the imaging unit to outside the imaging range.
- 67. The wearable device according to claim 66, wherein, when the number of extended fingers has changed as a result of the upper limb moving from inside the imaging range to outside the imaging range, the control unit does not execute the change process based on the changed number of fingers.
- 68. A control method executed by a wearable device wearable on the head, comprising: detecting an upper limb present in real space with a detection unit; and detecting the number of fingers of the upper limb in a predetermined state from the detection result of the detection unit, and executing a change process related to a predetermined function based on the number of fingers.
- 69. A control program causing a wearable device wearable on the head to execute: detecting an upper limb present in real space with a detection unit; and detecting the number of fingers of the upper limb in a predetermined state from the detection result of the detection unit, and executing a change process related to a predetermined function based on the number of fingers.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017502443A JP6409118B2 (ja) | 2015-02-25 | 2016-02-24 | ウェアラブル装置、制御方法及び制御プログラム |
US15/553,236 US10477090B2 (en) | 2015-02-25 | 2016-02-24 | Wearable device, control method and non-transitory storage medium |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-035830 | 2015-02-25 | ||
JP2015035830 | 2015-02-25 | ||
JP2015-068613 | 2015-03-30 | ||
JP2015068614 | 2015-03-30 | ||
JP2015068613 | 2015-03-30 | ||
JP2015-068614 | 2015-03-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016136838A1 true WO2016136838A1 (ja) | 2016-09-01 |
Family
ID=56789310
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/055522 WO2016136838A1 (ja) | 2015-02-25 | 2016-02-24 | ウェアラブル装置、制御方法及び制御プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US10477090B2 (ja) |
JP (5) | JP6409118B2 (ja) |
WO (1) | WO2016136838A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018150831A1 (ja) * | 2017-02-16 | 2018-08-23 | ソニー株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JP2019179387A (ja) * | 2018-03-30 | 2019-10-17 | Necソリューションイノベータ株式会社 | モーション判定装置、モーション判定方法、及びプログラム |
JPWO2018185830A1 (ja) * | 2017-04-04 | 2019-12-26 | 株式会社オプティム | 情報処理システム、情報処理方法、ウェアラブル端末、及びプログラム |
WO2020145212A1 (en) * | 2019-01-11 | 2020-07-16 | Ricoh Company, Ltd. | Image capturing apparatus, image capturing method, and recording medium |
JP2020113974A (ja) * | 2019-01-11 | 2020-07-27 | 株式会社リコー | 撮像装置、方法およびプログラム |
JP2020198108A (ja) * | 2017-04-28 | 2020-12-10 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理装置の制御方法、及びプログラム |
JP2021519996A (ja) * | 2019-02-01 | 2021-08-12 | ベイジン センスタイム テクノロジー デベロップメント カンパニー, リミテッド | ユーザの姿勢変化に基づく仮想対象の制御 |
JP2022140636A (ja) * | 2015-02-25 | 2022-09-26 | 京セラ株式会社 | ウェアラブル装置、方法及びプログラム |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10846825B2 (en) * | 2015-03-31 | 2020-11-24 | Pioneer Corporation | Display control apparatus, information processing apparatus, display control method, program for display control, and recording medium |
WO2016194844A1 (ja) * | 2015-05-29 | 2016-12-08 | 京セラ株式会社 | ウェアラブル装置 |
US10043487B2 (en) * | 2015-06-24 | 2018-08-07 | Samsung Electronics Co., Ltd. | Apparatus and method for split screen display on mobile device |
US10578869B2 (en) | 2017-07-24 | 2020-03-03 | Mentor Acquisition One, Llc | See-through computer display systems with adjustable zoom cameras |
JP7562282B2 (ja) | 2019-06-28 | 2024-10-07 | キヤノン株式会社 | 撮像表示装置、およびウェアラブルデバイス |
JP7478291B1 (ja) | 2023-08-08 | 2024-05-02 | ユーソナー株式会社 | 情報処理システム、情報処理装置、プログラム、アプリケーションソフトウェア、端末装置及び情報処理方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000138858A (ja) * | 1998-11-02 | 2000-05-16 | Fuji Photo Film Co Ltd | 電子カメラシステム |
JP2010146481A (ja) * | 2008-12-22 | 2010-07-01 | Brother Ind Ltd | ヘッドマウントディスプレイ |
JP2010213057A (ja) * | 2009-03-11 | 2010-09-24 | Canon Inc | 撮像装置、その制御方法、プログラム及び記録媒体 |
JP2013190925A (ja) * | 2012-03-13 | 2013-09-26 | Nikon Corp | 入力装置、及び表示装置 |
JP2013210643A (ja) * | 2013-04-26 | 2013-10-10 | Sony Corp | 表示装置、表示方法 |
JP2014241099A (ja) * | 2013-06-12 | 2014-12-25 | 株式会社ニコン | 撮像装置 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3867039B2 (ja) * | 2002-10-25 | 2007-01-10 | 学校法人慶應義塾 | ハンドパターンスイッチ装置 |
JP2005047331A (ja) * | 2003-07-31 | 2005-02-24 | Nissan Motor Co Ltd | 制御装置 |
JP2006279793A (ja) | 2005-03-30 | 2006-10-12 | Fuji Photo Film Co Ltd | カメラ用クレードル及び撮影システム |
US8217895B2 (en) * | 2006-04-28 | 2012-07-10 | Mtekvision Co., Ltd. | Non-contact selection device |
JP4203863B2 (ja) | 2007-07-27 | 2009-01-07 | 富士フイルム株式会社 | 電子カメラ |
JP5299127B2 (ja) * | 2009-07-01 | 2013-09-25 | カシオ計算機株式会社 | 撮像装置及びプログラム |
JP2012079138A (ja) * | 2010-10-04 | 2012-04-19 | Olympus Corp | ジェスチャ認識装置 |
JP5641970B2 (ja) * | 2011-02-18 | 2014-12-17 | シャープ株式会社 | 操作装置、再生装置及びテレビ受信装置 |
CN103608761B (zh) * | 2011-04-27 | 2018-07-27 | 日本电气方案创新株式会社 | 输入设备、输入方法以及记录介质 |
JP5885395B2 (ja) | 2011-04-28 | 2016-03-15 | オリンパス株式会社 | 撮影機器及び画像データの記録方法 |
JP2013190941A (ja) * | 2012-03-13 | 2013-09-26 | Nikon Corp | 情報入出力装置、及び頭部装着表示装置 |
US20130293454A1 (en) * | 2012-05-04 | 2013-11-07 | Samsung Electronics Co. Ltd. | Terminal and method for controlling the same based on spatial interaction |
JP5944287B2 (ja) * | 2012-09-19 | 2016-07-05 | アルプス電気株式会社 | 動作予測装置及びそれを用いた入力装置 |
US9076033B1 (en) * | 2012-09-28 | 2015-07-07 | Google Inc. | Hand-triggered head-mounted photography |
JP5962403B2 (ja) * | 2012-10-01 | 2016-08-03 | ソニー株式会社 | 情報処理装置、表示制御方法及びプログラム |
JP6030430B2 (ja) * | 2012-12-14 | 2016-11-24 | クラリオン株式会社 | 制御装置、車両及び携帯端末 |
JP6195893B2 (ja) * | 2013-02-19 | 2017-09-13 | ミラマ サービス インク | 形状認識装置、形状認識プログラム、および形状認識方法 |
WO2014128751A1 (ja) | 2013-02-19 | 2014-08-28 | 株式会社ブリリアントサービス | ヘッドマウントディスプレイ装置、ヘッドマウントディスプレイ用プログラム、およびヘッドマウントディスプレイ方法 |
US20140282275A1 (en) * | 2013-03-15 | 2014-09-18 | Qualcomm Incorporated | Detection of a zooming gesture |
US9367169B2 (en) * | 2014-03-31 | 2016-06-14 | Stmicroelectronics Asia Pacific Pte Ltd | Method, circuit, and system for hover and gesture detection with a touch screen |
US10477090B2 (en) * | 2015-02-25 | 2019-11-12 | Kyocera Corporation | Wearable device, control method and non-transitory storage medium |
US20200387286A1 (en) * | 2019-06-07 | 2020-12-10 | Facebook Technologies, Llc | Arm gaze-driven user interface element gating for artificial reality systems |
2016
- 2016-02-24 US US15/553,236 patent/US10477090B2/en active Active
- 2016-02-24 JP JP2017502443A patent/JP6409118B2/ja active Active
- 2016-02-24 WO PCT/JP2016/055522 patent/WO2016136838A1/ja active Application Filing
2018
- 2018-09-21 JP JP2018178108A patent/JP6652613B2/ja active Active
2020
- 2020-01-23 JP JP2020008939A patent/JP2020080543A/ja active Pending
2021
- 2021-01-18 JP JP2021005989A patent/JP2021073579A/ja active Pending
2022
- 2022-08-02 JP JP2022123447A patent/JP2022140636A/ja active Pending
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022140636A (ja) * | 2015-02-25 | 2022-09-26 | 京セラ株式会社 | ウェアラブル装置、方法及びプログラム |
US11170580B2 (en) | 2017-02-16 | 2021-11-09 | Sony Corporation | Information processing device, information processing method, and recording medium |
JP7095602B2 (ja) | 2017-02-16 | 2022-07-05 | ソニーグループ株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
CN110506249A (zh) * | 2017-02-16 | 2019-11-26 | 索尼公司 | 信息处理设备、信息处理方法和记录介质 |
JPWO2018150831A1 (ja) * | 2017-02-16 | 2019-12-12 | ソニー株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
WO2018150831A1 (ja) * | 2017-02-16 | 2018-08-23 | ソニー株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JPWO2018185830A1 (ja) * | 2017-04-04 | 2019-12-26 | 株式会社オプティム | 情報処理システム、情報処理方法、ウェアラブル端末、及びプログラム |
US11896893B2 (en) | 2017-04-28 | 2024-02-13 | Sony Interactive Entertainment Inc. | Information processing device, control method of information processing device, and program |
US11077360B2 (en) | 2017-04-28 | 2021-08-03 | Sony Interactive Entertainment Inc. | Information processing device, control method of information processing device, and program |
US11617942B2 (en) | 2017-04-28 | 2023-04-04 | Sony Interactive Entertainment Inc. | Information processing device, control method of information processing device, and program |
JP2020198108A (ja) * | 2017-04-28 | 2020-12-10 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理装置の制御方法、及びプログラム |
US11260287B2 (en) | 2017-04-28 | 2022-03-01 | Sony Interactive Entertainment Inc. | Information processing device, control method of information processing device, and program |
JP2019179387A (ja) * | 2018-03-30 | 2019-10-17 | Necソリューションイノベータ株式会社 | モーション判定装置、モーション判定方法、及びプログラム |
JP7239132B2 (ja) | 2018-03-30 | 2023-03-14 | Necソリューションイノベータ株式会社 | モーション判定装置、モーション判定方法、及びプログラム |
WO2020145212A1 (en) * | 2019-01-11 | 2020-07-16 | Ricoh Company, Ltd. | Image capturing apparatus, image capturing method, and recording medium |
US11388332B2 (en) | 2019-01-11 | 2022-07-12 | Ricoh Company, Ltd. | Image capturing apparatus, image capturing method, and recording medium |
JP2020113974A (ja) * | 2019-01-11 | 2020-07-27 | 株式会社リコー | 撮像装置、方法およびプログラム |
JP7516755B2 (ja) | 2019-01-11 | 2024-07-17 | 株式会社リコー | 撮像装置、方法およびプログラム |
US11429193B2 (en) | 2019-02-01 | 2022-08-30 | Beijing Sensetime Technology Development Co., Ltd. | Control of virtual objects based on gesture changes of users |
JP7104179B2 (ja) | 2019-02-01 | 2022-07-20 | ベイジン・センスタイム・テクノロジー・デベロップメント・カンパニー・リミテッド | ユーザの姿勢変化に基づく仮想対象の制御 |
JP2021519996A (ja) * | 2019-02-01 | 2021-08-12 | ベイジン センスタイム テクノロジー デベロップメント カンパニー, リミテッド | ユーザの姿勢変化に基づく仮想対象の制御 |
Also Published As
Publication number | Publication date |
---|---|
JP6409118B2 (ja) | 2018-10-17 |
US20180063397A1 (en) | 2018-03-01 |
JP2020080543A (ja) | 2020-05-28 |
JP2021073579A (ja) | 2021-05-13 |
JP2019036974A (ja) | 2019-03-07 |
US10477090B2 (en) | 2019-11-12 |
JPWO2016136838A1 (ja) | 2017-09-14 |
JP2022140636A (ja) | 2022-09-26 |
JP6652613B2 (ja) | 2020-02-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6409118B2 (ja) | ウェアラブル装置、制御方法及び制御プログラム | |
JP6400197B2 (ja) | ウェアラブル装置 | |
JP6393367B2 (ja) | 追従表示システム、追従表示プログラム、および追従表示方法、ならびにそれらを用いたウェアラブル機器、ウェアラブル機器用の追従表示プログラム、およびウェアラブル機器の操作方法 | |
JP7459798B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JPWO2018150831A1 (ja) | 情報処理装置、情報処理方法及び記録媒体 | |
CN113709410A (zh) | 基于mr眼镜的人眼视觉能力增强的方法、系统及设备 | |
CN109644235B (zh) | 用于提供混合现实图像的方法、设备和计算机可读介质 | |
JP5766957B2 (ja) | ジェスチャ入力装置 | |
JP7009096B2 (ja) | 電子機器およびその制御方法 | |
WO2013190906A1 (ja) | 表示制御装置および撮像装置ならびに表示制御方法 | |
JP7059934B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
US20230336865A1 (en) | Device, methods, and graphical user interfaces for capturing and displaying media | |
JP2015225374A (ja) | 情報処理装置、方法及びプログラム並びに記録媒体 | |
KR20180004112A (ko) | 안경형 단말기 및 이의 제어방법 | |
US20240028177A1 (en) | Devices, methods, and graphical user interfaces for interacting with media and three-dimensional environments | |
JP6483514B2 (ja) | ウェアラブル装置、制御方法及び制御プログラム | |
JP2019179080A (ja) | 情報処理装置、情報処理方法およびプログラム | |
CN117043722A (zh) | 用于地图的设备、方法和图形用户界面 | |
CN104427226B (zh) | 图像采集方法和电子设备 | |
CN113853569A (zh) | 头戴式显示器 | |
JP6817350B2 (ja) | ウェアラブル装置、制御方法及び制御プログラム | |
JP2015056825A (ja) | 撮像装置、その制御方法及びプログラム | |
JP6269692B2 (ja) | 表示装置、電子機器、およびプログラム | |
JP5879856B2 (ja) | 表示装置、表示方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16755582 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017502443 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15553236 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16755582 Country of ref document: EP Kind code of ref document: A1 |