WO2015136952A1 - Gesture recognition device, head-mounted display, and portable terminal - Google Patents

Gesture recognition device, head-mounted display, and portable terminal

Info

Publication number
WO2015136952A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture recognition
notification
user
hmd device
gesture
Prior art date
Application number
PCT/JP2015/050539
Other languages
French (fr)
Japanese (ja)
Inventor
義朗 平原
片桐 哲也
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc. (コニカミノルタ株式会社)
Publication of WO2015136952A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/453Help systems

Definitions

  • the present invention relates to a gesture recognition device, a head mounted display, and a mobile terminal.
  • a head mounted display is a display device that is used by being mounted on a user's head. The user can view various contents such as image information displayed in front of the user with the head mounted display mounted on the head.
  • Some of such head mounted displays can capture a part of the user's body using a small camera and recognize the movement (gesture) of the body as an input operation to the head mounted display.
  • Although Patent Document 1 is not an example using a head-mounted display, it describes a technique in which a small camera is installed on the ceiling, a user's gesture made under the camera is captured and recognized, and the command to be input to a computer is determined.
  • However, when a head-mounted display is used, the small camera is mounted on the head-mounted display itself, so if the user's head shakes, the small camera shakes as well. As a result, when the user's body moves out of the small camera's field of view, it cannot be distinguished whether the camera moved or the user's body moved. This is especially hard to tell apart while the user is wearing the head-mounted display and gazing at the display. Therefore, merely prompting the user to correct the position of the gesture, without showing a specific way to correct it, leaves the user unable to judge how the position should be corrected.
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide a gesture recognition device, a head-mounted display, and a mobile terminal that guide a user to intuitively correct the position of a gesture.
  • A gesture recognition device that is worn or held by a user, comprising: an imaging unit that continuously captures images; a control unit that detects changes in the orientation of the imaging unit, tracks a predetermined target using the images captured by the imaging unit, determines whether the tracking is possible, and, when the target cannot be tracked, calculates the amount of change in the detected orientation and identifies the cause of the tracking failure; and a notification unit that performs a notification based on the identified cause.
  • The notification by the notification unit includes at least one of notification by display, notification by sound, and notification by vibration.
  • The control unit determines a command according to the tracked movement of the target, and the notification unit performs a notification when the target can no longer be tracked while the command is being determined by the control unit (the gesture recognition device according to any one of (1) to (5)).
  • When the predetermined target can no longer be tracked, the cause is identified and an appropriate notification based on the identified cause is performed.
  • This notification allows the user to easily determine in which direction the head should be moved and in which direction the hand should be moved so that the head-mounted display can correctly recognize the gesture. In this way, the user can intuitively correct the position of the gesture.
  • FIG. 6(A) shows an example state when the target moves to the right; FIGS. 6(B), 7(A) to 7(D), and 8(A) to 8(C) show notification examples (notification A) when the target leaves the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides of the gesture recognition area, respectively.
  • FIG. 9(A) shows an example state when the user's head turns to the left; FIGS. 9(B), 10(A) to 10(D), and 11(A) to 11(C) show notification examples (notification B) for the same eight directions.
  • FIG. 1 is a diagram showing an example of the appearance of a head mounted display (HMD) device according to this embodiment.
  • FIG. 2 is a block diagram illustrating a hardware configuration example of the HMD device.
  • the HMD device 1 is an image display device that is used by being mounted on the head of a user (hereinafter also referred to as “observer”).
  • the HMD device 1 has a structure imitating eyeglasses for correcting vision. That is, the HMD device 1 includes a pair of left and right temples L1 and L2, a bridge B, and a pair of left and right transparent members G1 and G2.
  • Temples L1 and L2 are, for example, long bar-shaped members made of an elastic material. Each of the temples L1 and L2 is provided with an ear hook portion that is hung on the user's ear, and the transparent members G1 and G2 are fixed to the other end portion. In addition, a control unit U is mounted in the vicinity of the ear hook portion that is hung on one ear of the user.
  • the bridge B is a short bar-like member for connecting the pair of left and right transparent members G1 and G2 to each other.
  • Transparent members G1 and G2 are fixed to both ends of the bridge B.
  • the pair of left and right transparent members G1 and G2 are held at a predetermined interval.
  • The transparent members G1 and G2 are formed of a transparent material (glass, plastic, film, etc.) that transmits light from the outside world to the user's eyes, so that the user wearing the HMD device 1 can observe the outside world.
  • the imaging device 11 is provided above the transparent member G1 corresponding to the user's right eye, and the display device 12 is provided so as to overlap the transparent member G1.
  • the imaging device 11 images the outside of the camera lens in the optical axis direction (hereinafter also referred to as “imaging direction”).
  • the imaging device 11 is fixedly held with respect to the HMD device 1 so that the optical axis of the camera lens substantially coincides with the user's line-of-sight direction.
  • the imaging device 11 can take an image of the user's front visual field.
  • the imaging device 11 is a digital camera or a video camera including an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • Imaging by the imaging device 11 generates a captured image in a format such as JPEG or MPEG-2/4 (or a moving image captured at a predetermined frame rate), which is transmitted to the control unit U over a wired connection.
  • a personal-use web camera (with an angle of view of about 72 degrees) is used as the imaging device 11.
  • the display device 12 displays various contents such as a predetermined image (video) on the transparent members G1 and G2.
  • the display device 12 may display an image (AR image) that is additionally superimposed on light from the outside.
  • For example, the display device 12 can display a pop-up image containing information such as a name and workplace near a person present in the direction of the user's line of sight, as if the image actually existed there.
  • the display device 12 may be a binocular type instead of the monocular type as shown in FIG.
  • an inertial sensor 13 is provided in the vicinity of the imaging device 11.
  • the inertial sensor 13 is a sensor that detects shaking of the HMD device 1 and provides data necessary for calculating the shaking amount of the imaging device 11 at that time.
  • the inertial sensor 13 includes an acceleration sensor, a gyro sensor, or the like.
  • the inertial sensor 13 may have any form, name, or structure as long as it can provide data necessary for calculating the amount of shaking of the imaging device 11.
  • the control unit U has a configuration necessary for controlling the imaging device 11, the display device 12, and the inertial sensor 13 to function as a head mounted display as a whole. For example, the control unit U recognizes the user's gesture from the captured image (moving image) transmitted from the imaging device 11, executes a command corresponding to the recognized gesture, and displays the result on the display device 12. Perform the process. A specific configuration of the control unit U will be described later.
  • control unit U includes a CPU (Central Processing Unit) 31, a memory 32, a storage 33, an input device 34, and a display controller 35.
  • the CPU 31 is a control circuit composed of a processor or the like that executes control of each unit and various arithmetic processes according to a program, and each function of the HMD device 1 is exhibited by the CPU 31 executing a corresponding program.
  • the memory 32 is a high-speed accessible main storage device that temporarily stores programs and data as a work area.
  • For example, the memory 32 is composed of DRAM (Dynamic Random Access Memory), SDRAM (Synchronous Dynamic Random Access Memory), SRAM (Static Random Access Memory), or the like.
  • the storage 33 is a large-capacity auxiliary storage device that stores various programs including an operating system and various data.
  • a flash memory or the like is employed as the storage 33.
  • The input device 34 is composed of physical keys and buttons for inputting instructions such as powering on the HMD device 1. An external controller (for example, a smartphone or a remote controller) may also be used as the input device 34.
  • the display controller 35 causes the display device 12 to display various contents such as a predetermined image (video), an AR image, and the like.
  • The display controller 35 reads a predetermined image from the memory 32 every predetermined display cycle, converts it into a signal for input to the display device 12, and generates pulse signals such as horizontal and vertical synchronizing signals.
  • the HMD device 1 having the hardware configuration as described above has the following functional configuration.
  • FIG. 3 is a block diagram illustrating a functional configuration example of the HMD device.
  • the HMD device 1 includes, as a functional configuration, a control unit 41, an imaging control unit 42, a shake detection unit 43, a tracking unit 44, a tracking availability determination unit 45, a cause identification unit 46, A notification unit 47, a command determination unit 48, a display control unit 49, and an application unit 50 are included.
  • the function units 41 to 48 and 50 are realized by the CPU 31 reading the program installed in the storage 33 into the memory 32 and executing it, and the display control unit 49 is realized by the display controller 35.
  • the functional units 41 to 48 and 50 are not limited to this, and may be realized by hardware such as an ASIC.
  • the control unit 41 controls the entire HMD device 1.
  • the imaging control unit 42 controls the imaging device 11 to continuously image the outside world. Thereby, a captured image (moving image) is generated.
  • the shake detection unit 43 detects a change in the direction of the imaging device 11 (for example, the optical axis direction of the camera lens) when the user wearing the HMD device 1 shakes. For example, the shake detection unit 43 acquires the sensor value of the inertial sensor 13 and detects a change in the orientation of the imaging device 11.
  • the tracking unit 44 tracks a predetermined target (for example, the user's hand, fingertip, etc.) in order to recognize the user's gesture.
  • the tracking unit 44 sets an area for recognizing a user's gesture in advance (hereinafter referred to as “gesture recognition area”), and tracks a predetermined target in the gesture recognition area.
  • the gesture recognition area is set within the field of view of the imaging device 11.
  • the tracking availability determination unit 45 determines whether tracking by the tracking unit 44 is possible.
  • The cause identification unit 46 calculates the amount of change in the orientation of the imaging device 11 detected by the shake detection unit 43 when, based on the determination result of the tracking availability determination unit 45, the predetermined target cannot be tracked, and identifies the cause of the tracking failure. For example, when the amount of change in the orientation of the imaging device 11 is greater than or equal to a threshold value, the cause identification unit 46 determines that tracking failed because the user's body shook and the imaging device 11 moved. On the other hand, if the amount of change is less than the threshold value, it determines that tracking failed because the predetermined target itself moved (that is, the gesture moved).
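  • As a minimal illustration of this threshold test, the logic might be sketched in Python as follows; the function and constant names, the threshold value, and the enum are assumptions for illustration, not taken from the patent.

```python
from enum import Enum

class TrackingLossCause(Enum):
    CAMERA_MOVED = "camera_moved"   # the user's head (and imaging device 11) shook
    TARGET_MOVED = "target_moved"   # the tracked target itself moved away

# Hypothetical threshold; the text only states that some threshold is used.
ORIENTATION_CHANGE_THRESHOLD = 0.2  # illustrative value

def identify_cause(orientation_change: float) -> TrackingLossCause:
    """Mimic cause identification unit 46: a large orientation change
    implicates the camera, otherwise the target's own movement."""
    if orientation_change >= ORIENTATION_CHANGE_THRESHOLD:
        return TrackingLossCause.CAMERA_MOVED
    return TrackingLossCause.TARGET_MOVED
```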
  • The notification unit 47 performs a notification based on the cause specified by the cause identification unit 46. For example, the notification unit 47 performs a different notification depending on the identified cause.
  • The notification performed by the notification unit 47 includes at least one of notification by display (the display device 12), notification by voice (a speaker, not shown), and notification by vibration (a small vibrator, not shown).
  • the command determination unit 48 determines the user's instruction content, that is, the command to the HMD device 1 according to the movement (gesture) of the predetermined target tracked by the tracking unit 44.
  • the display control unit 49 causes the display device 12 to display various contents such as a predetermined image (video), an AR image, and the like.
  • the application unit 50 executes processing corresponding to the command determined by the command determination unit 48.
  • For example, the application unit 50 performs processing such as enlarging or reducing the display screen and playing, pausing, or stopping content.
  • FIG. 4 is a flowchart showing the procedure of the gesture recognition process.
  • FIG. 5 is a flowchart showing the procedure of the warning process.
  • FIG. 6A is a diagram illustrating a state example when the object moves in the right direction.
  • FIGS. 6(B), 7(A) to 7(D), and 8(A) to 8(C) are diagrams showing notification examples (notification A) when the target leaves the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides of the gesture recognition area, respectively.
  • FIG. 9A is a diagram illustrating an example of a state when the user's head is directed leftward.
  • FIGS. 9(B), 10(A) to 10(D), and 11(A) to 11(C) are diagrams showing notification examples (notification B) when the target leaves the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides of the gesture recognition area, respectively.
  • the HMD device 1 starts the gesture recognition process shown in FIG. 4 at the timing when the power is turned on.
  • the timing of starting the gesture recognition process is not limited to this, and the gesture recognition process may be started when an operation for starting the gesture recognition process is performed on the input device 34.
  • the gesture recognition process may be started when it is detected that the HMD device 1 is mounted on the user's head.
  • the control unit 41 calibrates various parameters necessary for controlling the HMD device 1. For example, registration of the reference value of the inertial sensor 13 and setting of a gesture recognition area are performed. In the present embodiment, the control unit 41 acquires a sensor value from the inertial sensor 13 and registers it in the memory 32 as a reference value for the orientation of the imaging device 11. Further, the control unit 41 sets the entire field of view of the imaging device 11 (the entire region of the captured image) as a gesture recognition area.
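  • As a minimal sketch of this calibration step, assuming a simple IMU accessor and representing the gesture recognition area as the full frame rectangle (the function and attribute names are illustrative, not from the patent):

```python
def calibrate(imu, frame_width: int, frame_height: int) -> dict:
    """Register the current sensor reading as the reference for the
    orientation of the imaging device, and set the gesture recognition
    area to the entire captured image, as in this embodiment."""
    reference = imu.read()  # assumed accessor returning one sensor sample
    gesture_area = (0, 0, frame_width, frame_height)  # x, y, width, height
    return {"reference": reference, "gesture_area": gesture_area}
```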
  • the HMD device 1 functions as the imaging control unit 42 and continuously images the outside world at a predetermined frame rate (a predetermined number of frames). Specifically, the HMD device 1 generates a captured image (moving image) in the optical axis direction of the camera lens by controlling the imaging device 11 and starting imaging. Then, the HMD device 1 stores the generated captured image (moving image) in the memory 32 or the storage 33. In addition, while the imaging device 11 is imaging, the user can perform a gesture operation by moving a part of the body (for example, a hand) so that the HMD device 1 can recognize it.
  • a predetermined frame rate a predetermined number of frames
  • Gesture movements include a pinch-out, which spreads the thumb and index finger apart from a closed state; a pinch-in, which closes the thumb and index finger from a spread state; a slide, which moves the hand or an extended index finger; a tap with a spread hand or a finger; a double tap, which taps twice in succession; and a swipe, which moves the hand left and right or up and down.
  • The HMD device 1 functions as the shake detection unit 43 and acquires sensor values from the inertial sensor 13. For example, the HMD device 1 acquires acceleration along the three axes from the inertial sensor 13. In step S102, the HMD device 1 acquires a sensor value every time a captured image for one frame is generated in step S101, and stores the captured image and the sensor value in the memory 32 or the like in association with each other for each frame.
  • The HMD device 1 functions as the tracking unit 44 and tracks a predetermined target (for example, the user's hand) by a general pattern matching method. At this time, the HMD device 1 functions as the tracking availability determination unit 45 and determines whether the predetermined target can or cannot be tracked. Specifically, the HMD device 1 finds, in each frame of the captured images generated in step S101, the region whose features (color and shape) are the same as or similar to a template image of a "hand" prepared in advance. Then, the HMD device 1 calculates the movement amount of the "hand" between each pair of successive frames.
  • The HMD device 1 determines that the "hand" can be tracked when the movement amount of the "hand" between every pair of successive frames is less than a predetermined amount. On the other hand, if the movement amount between at least one pair of successive frames is equal to or greater than the predetermined amount, or if there is a frame in which no "hand" exists so that the movement amount cannot be calculated, the HMD device 1 determines that the "hand" cannot be tracked.
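  • The per-frame check might be sketched as below; hand positions are assumed to come from the template matching above, with None for frames where no "hand" region was found (names and structure are illustrative):

```python
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def can_track(hand_positions: List[Optional[Point]], max_step: float) -> bool:
    """Tracking-availability check in the spirit of step S103: trackable
    only if the 'hand' was found in every frame and its displacement
    between every pair of successive frames is below a predetermined amount."""
    if any(p is None for p in hand_positions):  # a frame with no 'hand'
        return False
    for (x0, y0), (x1, y1) in zip(hand_positions, hand_positions[1:]):
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 >= max_step:
            return False
    return True
```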
  • If the HMD device 1 determines that the predetermined target can be tracked (step S103: YES), the process proceeds to step S105. If it determines that the target cannot be tracked (step S103: NO), the process proceeds to step S104.
  • Step S104 The HMD device 1 executes a warning process for prompting the user to redo the gesture. Details of the warning process will be described later.
  • The HMD device 1 functions as the shake detection unit 43 and detects a change in the orientation of the imaging device 11 (for example, the optical axis direction of the camera lens). For example, the shake detection unit 43 obtains the difference between the sensor values of the first and last frames among the sensor values of all the frames acquired in step S102, and judges that there is no change in the orientation of the imaging device 11 if the difference is less than a predetermined amount. On the other hand, if the difference between the sensor values of the first and last frames is equal to or greater than the predetermined amount, the HMD device 1 judges that the orientation of the imaging device 11 has changed.
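  • A sketch of this first-versus-last comparison, assuming each per-frame sensor value is a 3-axis tuple (the threshold and names are illustrative):

```python
def orientation_changed(sensor_values, tolerance: float) -> bool:
    """Shake-detection test: compare the sensor samples of the first and
    last frames; a difference at or above the predetermined amount means
    the orientation of the imaging device changed."""
    first, last = sensor_values[0], sensor_values[-1]
    diff = sum((b - a) ** 2 for a, b in zip(first, last)) ** 0.5
    return diff >= tolerance
```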
  • When the HMD device 1 determines that there is no change in the orientation of the imaging device 11 (step S105: NO), the process of step S106 is skipped and the process proceeds to step S107. On the other hand, when the HMD device 1 determines that there is a change in the orientation of the imaging device 11 (step S105: YES), the process proceeds to step S106.
  • The HMD device 1 functions as the shake detection unit 43 and corrects the trajectory of the predetermined target's movement, taking into account the amount of change in the orientation of the imaging device 11. That is, the HMD device 1 converts the movement amount of the "hand" between each pair of successive frames into a movement amount corresponding to the actual movement of the "hand". Specifically, the HMD device 1 may combine, for each pair of successive frames, the movement vector of the "hand" in the captured image with the change vector of the orientation of the imaging device 11.
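  • The vector composition might look like the following sketch, assuming the camera-orientation change has already been converted into per-frame pixel offsets (an assumption; the patent does not specify the coordinate conversion):

```python
def correct_trajectory(hand_deltas, camera_deltas):
    """Combine, for each pair of successive frames, the 'hand' movement
    vector observed in the image with the camera-orientation change
    vector, yielding the hand's actual movement (step S106)."""
    return [(hx + cx, hy + cy)
            for (hx, hy), (cx, cy) in zip(hand_deltas, camera_deltas)]
```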
  • The HMD device 1 functions as the command determination unit 48 and recognizes the user's gesture from the movement amount of the "hand" calculated in step S103 or corrected in step S106. Then, the HMD device 1 determines the user's instruction content, that is, the command to the HMD device 1, according to the recognized gesture. Note that gestures and commands are associated with each other one-to-one and registered in advance in a predetermined table; the HMD device 1 may read the command associated with the recognized gesture from this table.
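  • A sketch of the one-to-one lookup table; the pairings below are assembled from the examples given later in the text (pinch gestures for enlargement/reduction, swipe for page turning), and the exact gesture-to-command pairing is an assumption:

```python
# Hypothetical gesture-to-command table registered in advance (step S107).
GESTURE_COMMANDS = {
    "pinch_out": "enlarge_display",
    "pinch_in": "reduce_display",
    "slide": "play_content",
    "tap": "pause_content",
    "double_tap": "stop_content",
    "swipe": "turn_page",
}

def determine_command(gesture: str):
    """Read the command associated with the recognized gesture;
    None means there is no corresponding (valid) command."""
    return GESTURE_COMMANDS.get(gesture)
```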
  • The HMD device 1 functions as the command determination unit 48 and determines whether or not the command determined in step S107 is a valid command. Specifically, the HMD device 1 judges that the command is not valid when there is no command corresponding to the recognized gesture, or when the command instructs control that is restricted at that time. On the other hand, if the command corresponding to the recognized gesture instructs control that can be executed at that time, the HMD device 1 judges that it is a valid command.
  • If the HMD device 1 determines that the command is a valid command (step S108: YES), the process proceeds to step S109. If it determines that the command is not valid (step S108: NO), the process proceeds to step S110.
  • the HMD device 1 functions as the application unit 50 and executes processing corresponding to the command determined by the command determination unit 48. For example, if the command determined in step S107 is a command corresponding to a gesture such as pinch-out or pinch-in, the HMD device 1 executes enlargement processing, reduction processing, and the like. In addition, when the command determined in step S107 is a command corresponding to a gesture such as slide, tap, or double tap, the HMD device 1 executes processing such as content playback, pause, and stop.
  • the HMD device 1 functions as the notification unit 47 and the display control unit 49 and notifies an error indicating that processing corresponding to the user's gesture cannot be executed. For example, the HMD device 1 causes the display device 12 to display a message such as “Please retry the gesture at the same position”.
  • the HMD device 1 functions as the control unit 41 and determines whether or not an instruction to end the gesture recognition process has been given by the user. Specifically, the HMD device 1 may determine whether or not the input device 34 has been operated to end the gesture recognition process.
  • the HMD device 1 returns the process to step S101 when the instruction to end the gesture recognition process has not been given (step S111: NO), and continues the gesture recognition process. On the other hand, if an instruction to end the gesture recognition process is given (step S111: YES), the HMD device 1 ends the gesture recognition process.
  • In step S104 of the gesture recognition process described above, the HMD device 1 starts the warning process.
  • The HMD device 1 functions as the shake detection unit 43 and detects a change in the orientation of the imaging device 11 (for example, the optical axis direction of the camera lens). As in step S105, the shake detection unit 43 obtains the difference between the sensor values of the first and last frames acquired in step S102, judges that there is no change in the orientation of the imaging device 11 if the difference is less than a predetermined amount, and judges that the orientation has changed if the difference is equal to or greater than the predetermined amount.
  • If the HMD device 1 determines that there is no change in the orientation of the imaging device 11 (step S201: NO), the process proceeds to step S202. On the other hand, if it determines that there is a change in the orientation of the imaging device 11 (step S201: YES), the process proceeds to step S206.
  • The HMD device 1 functions as the command determination unit 48 and recognizes the user's gesture from the movement amount of the "hand" calculated in step S103, then determines the user's instruction content, that is, the command to the HMD device 1, according to the recognized gesture. However, if the movement amount of the "hand" could not be calculated in step S103, the gesture likely included a movement in which the predetermined target (for example, the user's hand) went out of the gesture recognition area. In that case, the HMD device 1 recognizes the user's gesture from the movement amount of the "hand" between the frames captured while the "hand" was moving within the gesture recognition area, and determines the corresponding command.
  • The HMD device 1 functions as the command determination unit 48 and determines whether or not the command determined in step S202 is a valid command. Specifically, the HMD device 1 judges that the command is not valid when there is no command corresponding to the recognized gesture, or when the command instructs control that is restricted at that time. On the other hand, if the command instructs control that can be executed at that time, the HMD device 1 judges that it is a valid command.
  • If the HMD device 1 determines that the command is a valid command (step S203: YES), the process proceeds to step S204. If it determines that the command is not valid (step S203: NO), the process proceeds to step S205.
  • Step S204 The HMD device 1 functions as the application unit 50 and executes processing corresponding to the command determined by the command determination unit 48. For example, when the command determined in step S202 is a command corresponding to a gesture such as a swipe, the HMD device 1 performs a process of turning the displayed page.
  • The HMD device 1 functions as the notification unit 47 and the display control unit 49, and issues notification A to inform the user that the imaging device 11 has not moved but the predetermined target (for example, the user's hand) has moved out of the gesture recognition area (the state shown in FIG. 6(A)).
  • For example, when the target leaves the right side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating a hand on the right side of the display screen (FIG. 6(B)).
  • Similarly, when the target leaves the left side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating a hand on the left side of the display screen (FIG. 7(A)).
  • This allows the user to easily grasp that the hand has moved out of the left side of the gesture recognition area and to intuitively move the gesture position back to the right.
  • When the target leaves the upper side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating a hand on the upper side of the display screen (FIG. 7(B)).
  • This allows the user to easily grasp that the hand has moved out of the upper side of the gesture recognition area and to intuitively move the gesture position downward.
  • Likewise, when the target leaves the lower, upper-left, upper-right, lower-left, or lower-right side of the gesture recognition area, the HMD device 1 displays the image imitating a hand on the corresponding side of the display screen (FIGS. 7(C), 7(D), 8(A), 8(B), and 8(C)). In this way, the HMD device 1 displays the warning image on the side where the target left the gesture recognition area, allowing the user to intuitively see the direction in which to bring the gesture back.
  • the HMD device 1 functions as the control unit 41 and returns the process to the gesture recognition process.
  • The HMD device 1 functions as the shake detection unit 43 and, as in step S106, corrects the trajectory of the predetermined target's movement, taking into account the amount of change in the orientation of the imaging device 11. That is, the HMD device 1 converts the movement amount of the "hand" between each pair of successive frames into a movement amount corresponding to the actual movement of the "hand", for example by combining the movement vector of the "hand" in the captured image with the change vector of the orientation of the imaging device 11.
  • Note that the HMD device 1 converts only the movement amount of the "hand" between the frames captured while the "hand" was moving within the gesture recognition area into the amount the "hand" actually moved.
  • the HMD device 1 functions as the command determination unit 48 and recognizes the user's gesture from the movement amount of the “hand” corrected in step S206. Then, the HMD device 1 determines a user instruction content, that is, a command to the HMD device 1 according to the recognized gesture.
  • The HMD device 1 functions as the command determination unit 48 and determines whether or not the command determined in step S207 is a valid command. Specifically, the HMD device 1 judges that the command is not valid when there is no command corresponding to the recognized gesture, or when the command instructs control that is restricted at that time. On the other hand, if the command instructs control that can be executed at that time, the HMD device 1 judges that it is a valid command.
  • If the HMD device 1 determines that the command is a valid command (step S208: YES), the process proceeds to step S209. If it determines that the command is not valid (step S208: NO), the process proceeds to step S210.
  • Step S209 The HMD device 1 functions as the application unit 50 and executes processing corresponding to the command determined by the command determination unit 48. For example, if the command determined in step S207 is a command corresponding to a gesture such as swipe, the HMD device 1 performs a process of turning a displayed page.
  • The HMD device 1 functions as the notification unit 47 and the display control unit 49, and issues notification B to inform the user that the predetermined target (for example, the user's hand) has moved out of the gesture recognition area because the imaging device 11 moved (the state shown in FIG. 9(A)).
  • For example, when the target leaves the right side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating an arrow on the right side of the display screen (FIG. 9(B)).
  • When the target leaves the left side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating an arrow on the left side of the display screen (FIG. 10(A)). This allows the user to easily grasp that the imaging device 11 is facing to the right and to intuitively turn the head to the left.
  • When the target leaves the upper side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating an arrow on the upper side of the display screen (FIG. 10(B)). This allows the user to easily grasp that the imaging device 11 is facing downward and to intuitively turn the head upward.
  • Likewise, when the target leaves the lower, upper-left, upper-right, lower-left, or lower-right side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating an arrow on the corresponding side of the display screen (FIGS. 10(C), 10(D), 11(A), 11(B), and 11(C)). As described above, when the target has left the gesture recognition area because the user turned to face a certain direction, the image is displayed on the side opposite to the direction the user is facing, so the user can intuitively see the direction in which to turn the head.
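  • Putting the two notification behaviors together, the placement logic might be sketched as follows; the side names and icon labels are illustrative. For notification A the icon marks where the hand left, so the user pulls the hand back the opposite way; for notification B the arrow also sits on the exit side, which is opposite to the direction the head turned, so the user turns the head toward the arrow.

```python
OPPOSITE = {"right": "left", "left": "right", "upper": "lower",
            "lower": "upper", "upper_left": "lower_right",
            "lower_right": "upper_left", "upper_right": "lower_left",
            "lower_left": "upper_right"}

def notification(exit_side: str, camera_moved: bool) -> dict:
    """Choose the warning image and the corrective move for the user."""
    if camera_moved:  # notification B: arrow on the exit side
        return {"icon": "arrow", "display_side": exit_side,
                "user_action": "turn head toward " + exit_side}
    # notification A: hand image on the exit side
    return {"icon": "hand", "display_side": exit_side,
            "user_action": "move hand toward " + OPPOSITE[exit_side]}
```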
  • the HMD device 1 functions as the control unit 41 and returns the process to the gesture recognition process.
  • As described above, when the predetermined target can no longer be tracked, the cause is identified and the appropriate notification A or B is issued based on the identified cause.
  • This allows the user to easily determine in which direction to move the head or the hand so that the HMD device 1 can correctly recognize the gesture. In this way, the user can intuitively correct the position of the gesture.
  • Each processing unit in each flowchart described above is divided according to main processing contents in order to facilitate understanding of the HMD device 1.
  • the invention of the present application is not limited by the method of classification of the processing steps and the names thereof.
  • the processing performed in the HMD device 1 can be divided into more processing steps.
  • One processing step may execute more processes.
  • the entire field of view of the imaging device 11 (the entire area of the captured image) is set as the gesture recognition area.
  • the present invention is not limited to this, and a partial region in the field of view of the imaging device 11 may be set as the gesture recognition area.
  • FIG. 12 is a diagram showing a modification of the gesture recognition area.
  • a region on the right side of the visual field of the imaging device 11 may be set as a gesture recognition area.
  • the gesture recognition area may or may not be displayed on the transparent members G1 and G2.
  • the present invention is applicable to any device as long as it is a device worn or held by the user and has a function capable of recognizing the user's gesture. Therefore, the present invention is not limited to the example applied to the HMD device 1 of the above embodiment.
  • FIG. 13 is a diagram for explaining an example in which the present invention is applied to a mobile terminal.
  • the present invention can also be applied to a portable terminal 60 that is used while being held by a user's hand.
  • the mobile terminal includes a mobile information terminal device having a display screen capable of displaying various types of information and contents and an imaging device, such as a smartphone, a tablet PC, and a PDA (Personal Digital Assistant).
  • When the present invention is applied to the mobile terminal 60, the mobile terminal 60 has the same configuration as in the above embodiment and recognizes the movement of the user's face as a gesture. When the face leaves the gesture recognition area, the user is notified in a way that distinguishes whether the hand holding the mobile terminal 60 (that is, the imaging device 11) moved or the user's face (the gesture) moved.
  • The notification unit 47 may output a different warning sound depending on the cause specified by the cause identification unit 46, or may generate a different vibration pattern with a motor or the like.
  • the HMD device 1 is a see-through image display device.
  • the present invention is not limited to this, and the HMD device 1 may be a non-transmissive image display device.
  • The configuration of the HMD device 1 described above covers only the main components needed to explain the features of the above embodiments and modifications; the HMD device 1 is not limited to this configuration, and configurations of a general HMD device are not excluded.
  • each functional configuration of the HMD device 1 described above is classified according to main processing contents in order to facilitate understanding of each functional configuration.
  • the present invention is not limited by the way of classification and names of the constituent elements.
  • Each functional configuration can be divided into more components according to the processing content, and conversely, multiple components can be combined into one.
  • Each functional configuration of the HMD device 1 described above can also be realized by a dedicated hardware circuit, in which case it may be executed by a single piece of hardware or by multiple pieces of hardware.
  • The program for operating the HMD device 1 may be provided on a computer-readable recording medium such as a USB memory, a flexible disk, or a CD-ROM, or may be provided online via a network such as the Internet.
  • the program recorded on the computer-readable recording medium is usually transferred and stored in the memory 32, the storage 33, or the like.
  • This program may be provided as, for example, stand-alone application software, or may be incorporated into the device's software as one function of the HMD device 1.
  • 1 HMD device, 11 imaging device, 12 display device, 13 inertial sensor, 31 CPU, 32 memory, 33 storage, 34 input device, 35 display controller, 41 control unit, 42 imaging control unit, 43 shake detection unit, 44 tracking unit, 45 tracking availability determination unit, 46 cause identification unit, 47 notification unit, 48 command determination unit, 49 display control unit, 50 application unit, 60 mobile terminal, U control unit

Abstract

[Problem] To provide a gesture recognition device, a head-mounted display, and a portable terminal, which guide a user so as to correct the position of a gesture intuitively. [Solution] A gesture recognition device (1) which is used when worn or held by a user, said gesture recognition device being equipped with: an imaging unit (42) that continuously captures images; a control unit (43-46) that detects a change in the orientation of the imaging unit (42), tracks a prescribed object using the images captured by the imaging unit (42), determines whether the tracking operation is possible, calculates an amount of change for the detected orientation when the result of the determination is that tracking of the object is not possible, and identifies the cause when tracking is not possible; and a notification unit (47) that provides a notification based on the cause that has been identified.

Description

Gesture recognition device, head mounted display, and portable terminal
The present invention relates to a gesture recognition device, a head mounted display, and a mobile terminal.
In recent years, head mounted displays (HMDs) have been developed as one type of wearable computer. A head mounted display is a display device that is worn on a user's head. With the head mounted display mounted on the head, the user can view various contents, such as image information, displayed in front of the eyes.
Some such head mounted displays can capture a part of the user's body using a small camera and recognize the movement (gesture) of that body part as an input operation to the head mounted display.
Although Patent Document 1 is not an example using a head-mounted display, it describes a technique in which a small camera is installed on the ceiling, a user's gesture made under the camera is captured and recognized, and the command to be input to a computer is determined.
When a small camera is used as a fixed camera, as in Patent Document 1, the user can be prompted to correct the position of the gesture when the user's body moves out of the camera's field of view.
JP 2013-196482 A
However, when a head-mounted display is used, the small camera is mounted on the display itself, so if the user's head shakes, the small camera shakes as well. As a result, when the user's body moves out of the small camera's field of view, it cannot be distinguished whether the camera moved or the user's body moved. This is especially hard to tell apart while the user is wearing the head-mounted display and gazing at the display. Therefore, merely prompting the user to correct the position of the gesture, without showing a specific way to correct it, leaves the user unable to judge how the position should be corrected.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a gesture recognition device, a head-mounted display, and a mobile terminal that guide the user to intuitively correct the position of a gesture.
(1) A gesture recognition device that is worn or held by a user, comprising: an imaging unit that continuously captures images; a control unit that detects changes in the orientation of the imaging unit, tracks a predetermined target using the images captured by the imaging unit, determines whether the tracking is possible, and, when the target cannot be tracked, calculates the amount of change in the detected orientation and identifies the cause of the tracking failure; and a notification unit that performs a notification based on the identified cause.
(2) The gesture recognition device according to (1), wherein the predetermined target is a part of the user's body.
(3) The gesture recognition device according to (1) or (2), wherein the notification unit performs a different notification depending on the identified cause.
(4) The gesture recognition device according to any one of (1) to (3), wherein the notification by the notification unit includes at least one of notification by display, notification by sound, and notification by vibration.
(5) The gesture recognition device according to any one of (1) to (4), wherein the tracking by the control unit is performed within a recognition area set in the field of view of the imaging unit.
(6) The gesture recognition device according to any one of (1) to (5), wherein the control unit determines a command according to the tracked movement of the target, and the notification unit performs a notification when the target can no longer be tracked while the command is being determined by the control unit.
(7) A head-mounted display having the function of the gesture recognition device according to any one of (1) to (6).
(8) A mobile terminal having the function of the gesture recognition device according to any one of (1) to (6).
According to the present invention, when the predetermined target (a part of the user's body) can no longer be tracked, the cause is identified and an appropriate notification based on the identified cause is performed. This notification allows the user to easily determine in which direction to move the head or the hand so that the head-mounted display can correctly recognize the gesture. In this way, the user can intuitively correct the position of the gesture.
FIG. 1 is a diagram showing an example of the appearance of the head mounted display (HMD) device according to this embodiment. FIG. 2 is a block diagram showing a hardware configuration example of the HMD device. FIG. 3 is a block diagram showing a functional configuration example of the HMD device. FIG. 4 is a flowchart showing the procedure of the gesture recognition process. FIG. 5 is a flowchart showing the procedure of the warning process. FIG. 6(A) shows an example state when the target moves to the right; FIG. 6(B) shows a notification example (notification A) when the target leaves the right side of the gesture recognition area. FIGS. 7(A) to 7(D) show notification examples (notification A) when the target leaves the left, upper, lower, and upper-left sides of the gesture recognition area. FIGS. 8(A) to 8(C) show notification examples (notification A) when the target leaves the upper-right, lower-left, and lower-right sides of the gesture recognition area. FIG. 9(A) shows an example state when the user's head turns to the left; FIG. 9(B) shows a notification example (notification B) when the target leaves the right side of the gesture recognition area. FIGS. 10(A) to 10(D) show notification examples (notification B) when the target leaves the left, upper, lower, and upper-left sides of the gesture recognition area. FIGS. 11(A) to 11(C) show notification examples (notification B) when the target leaves the upper-right, lower-left, and lower-right sides of the gesture recognition area. FIG. 12 is a diagram showing a modification of the gesture recognition area. FIG. 13 is a diagram for explaining an example in which the present invention is applied to a mobile terminal.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. In the description of the drawings, the same elements are denoted by the same reference numerals, and redundant description is omitted. In addition, the dimensional ratios in the drawings are exaggerated for convenience of explanation and may differ from the actual ratios.
FIG. 1 is a diagram showing an example of the appearance of the head mounted display (HMD) device according to this embodiment. FIG. 2 is a block diagram showing a hardware configuration example of the HMD device.
Hereinafter, the schematic configuration of the HMD device 1, particularly its hardware configuration, will be described with reference to FIGS. 1 and 2.
<HMD device 1 (hardware configuration)>
The HMD device 1 according to the present embodiment is an image display device that is used while mounted on the head of a user (hereinafter also referred to as an "observer"). For example, the HMD device 1 has a structure imitating eyeglasses for correcting vision. That is, the HMD device 1 includes a pair of left and right temples L1 and L2, a bridge B, and a pair of left and right transparent members G1 and G2.
The temples L1 and L2 are, for example, long bar-shaped members made of an elastic material. One end of each of the temples L1 and L2 has an ear hook portion that is hung on the user's ear, and the transparent members G1 and G2 are fixed to the other end. A control unit U is mounted near the ear hook portion hung on one of the user's ears.
The bridge B is a short bar-shaped member that connects the pair of left and right transparent members G1 and G2 to each other. The transparent members G1 and G2 are fixed to both ends of the bridge B and held at a fixed interval.
The transparent members G1 and G2 are formed of a transparent material (glass, plastic, film, or the like) that transmits light from the outside world to the user's eyes, so that the user wearing the HMD device 1 can observe the outside world.
In the present embodiment, an imaging device 11 is provided above the transparent member G1 corresponding to the user's right eye, and a display device 12 is provided so as to overlap the transparent member G1.
The imaging device 11 images the outside world along the optical-axis direction of its camera lens (hereinafter also referred to as the "imaging direction"). The imaging device 11 is held fixed with respect to the HMD device 1 so that the optical axis of the camera lens substantially coincides with the user's line of sight, which allows it to capture the user's forward field of view. The imaging device 11 is, for example, a digital camera or video camera equipped with an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor. Imaging produces, for example, captured images in JPEG or MPEG-2/4 format (or a moving image captured at a predetermined frame rate), which are transmitted to the control unit U over a wired connection. In the present embodiment, a consumer web camera with an angle of view of about 72 degrees is used as the imaging device 11.
The display device 12 displays various contents, such as predetermined images (video), on the transparent members G1 and G2. The display device 12 may also display an image (AR image) superimposed on light from the outside world. For example, the display device 12 can display a pop-up image containing information such as a name and workplace as if it existed next to a person in the user's line of sight. The display device 12 may be binocular rather than the monocular type shown in FIG. 1.
Although not shown in FIG. 1, an inertial sensor 13 is provided in the vicinity of the imaging device 11. The inertial sensor 13 detects shaking of the HMD device 1 and provides the data needed to calculate the amount by which the imaging device 11 shakes at that time. The inertial sensor 13 is composed of, for example, an acceleration sensor, a gyro sensor, or the like; it may take any form, name, or structure as long as it can provide the data needed to calculate the shake amount of the imaging device 11.
The control unit U controls the imaging device 11, the display device 12, and the inertial sensor 13, and has the configuration needed to make them function together as a head-mounted display. For example, the control unit U recognizes the user's gestures from the captured images (moving image) transmitted from the imaging device 11, executes commands corresponding to the recognized gestures, and displays the results on the display device 12. A specific configuration of the control unit U is described later.
(Specific configuration of the control unit U)
Next, a specific configuration of the control unit U is described.
As shown in FIG. 2, the control unit U includes a CPU (Central Processing Unit) 31, a memory 32, a storage 33, an input device 34, and a display controller 35.
The CPU 31 is a control circuit composed of a processor or the like that controls each of the above units and executes various arithmetic processes according to programs; each function of the HMD device 1 is realized by the CPU 31 executing the corresponding program.
The memory 32 is a high-speed-accessible main storage device that temporarily stores programs and data as a working area. The memory 32 is, for example, a DRAM (Dynamic Random Access Memory), SDRAM (Synchronous Dynamic Random Access Memory), or SRAM (Static Random Access Memory).
The storage 33 is a large-capacity auxiliary storage device that stores various programs, including the operating system, and various data. The storage 33 is, for example, a flash memory.
The input device 34 consists of physical keys and buttons for entering instructions such as powering on the HMD device 1. An external controller (for example, a smartphone or a remote controller) may also be used as the input device 34.
The display controller 35 causes the display device 12 to display various contents such as predetermined images (video) and AR images. For example, the display controller 35 reads a predetermined image from the memory 32 at every display cycle, converts it into a signal for input to the display device 12, and generates pulse signals such as horizontal and vertical synchronization signals.
The HMD device 1 with the hardware configuration described above has the following functional configuration.
FIG. 3 is a block diagram showing an example functional configuration of the HMD device.
<HMD device 1 (functional configuration)>
As shown in FIG. 3, the HMD device 1 has, as its functional configuration, a control unit 41, an imaging control unit 42, a shake detection unit 43, a tracking unit 44, a tracking availability determination unit 45, a cause identification unit 46, a notification unit 47, a command determination unit 48, a display control unit 49, and an application unit 50.
Each of the functional units 41 to 50 is described in detail below. The functional units 41 to 48 and 50 are realized by the CPU 31 reading programs installed in the storage 33 into the memory 32 and executing them; the display control unit 49 is realized by the display controller 35. Alternatively, the functional units 41 to 48 and 50 may be realized by hardware such as an ASIC.
The control unit 41 controls the HMD device 1 as a whole.
The imaging control unit 42 controls the imaging device 11 to continuously image the outside world, thereby generating captured images (a moving image).
The shake detection unit 43 detects changes in the orientation of the imaging device 11 (for example, the optical-axis direction of the camera lens) when the user wearing the HMD device 1 shakes. For example, the shake detection unit 43 acquires sensor values from the inertial sensor 13 and detects changes in the orientation of the imaging device 11.
The tracking unit 44 tracks a predetermined target (for example, the user's hand or fingertip) in order to recognize the user's gestures. For example, the tracking unit 44 sets in advance a region in which the user's gestures are recognized (hereinafter, the "gesture recognition area") and tracks the predetermined target within that area. In the present embodiment, the gesture recognition area is set within the field of view of the imaging device 11.
The tracking availability determination unit 45 determines whether tracking by the tracking unit 44 is possible.
When the determination result of the tracking availability determination unit 45 indicates that the predetermined target cannot be tracked, the cause identification unit 46 calculates the amount of change in the orientation of the imaging device 11 detected by the shake detection unit 43 and identifies the cause of the tracking failure. For example, when the amount of change in the orientation of the imaging device 11 is at least a threshold, the cause identification unit 46 concludes that tracking failed because the user's body shook and the imaging device 11 moved. Conversely, when the amount of change is below the threshold, it concludes that the movement (gesture) of the predetermined target itself caused the tracking failure.
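A minimal sketch of this decision rule, in Python; the names (identify_cause, the cause labels) and the idea that a caller supplies the orientation-change magnitude are illustrative assumptions, not details given by the embodiment:

    # Illustrative cause-identification rule: a large camera-orientation
    # change blames the camera; otherwise the gesture itself is blamed.
    CAMERA_MOVED = "camera_moved"    # user's body shook and the camera turned
    GESTURE_MOVED = "gesture_moved"  # camera was steady; the hand left the area

    def identify_cause(orientation_change: float, threshold: float) -> str:
        """Classify why tracking failed from the orientation-change magnitude."""
        if orientation_change >= threshold:
            return CAMERA_MOVED
        return GESTURE_MOVED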
The notification unit 47 issues a notification based on the cause identified by the cause identification unit 46; for example, it issues a different notification depending on the identified cause. The notifications issued by the notification unit 47 include at least one of a notification by display (the display device 12), a notification by sound (a speaker, not shown), and a notification by vibration (a small vibrator, not shown).
The command determination unit 48 determines the content of the user's instruction to the HMD device 1, that is, a command, according to the movement (gesture) of the predetermined target tracked by the tracking unit 44.
The display control unit 49 causes the display device 12 to display various contents such as predetermined images (video) and AR images.
The application unit 50 executes the process corresponding to the command determined by the command determination unit 48, for example enlarging or reducing the display screen, or playing, pausing, and stopping content.
The detailed operation of each of the functional units 41 to 50 is described below.
<Operation of the HMD device 1>
FIG. 4 is a flowchart showing the procedure of the gesture recognition process. FIG. 5 is a flowchart showing the procedure of the warning process. FIG. 6(A) shows an example state in which the target has moved to the right. FIGS. 6(B), 7(A)-(D), and 8(A)-(C) show example notifications (notification A) when the target leaves the gesture recognition area on the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides, respectively. FIG. 9(A) shows an example state in which the user's head has turned to the left. FIGS. 9(B), 10(A)-(D), and 11(A)-(C) show example notifications (notification B) when the target leaves the gesture recognition area on the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides, respectively.
The procedures of the gesture recognition process and the warning process are described below with reference to FIGS. 4 to 11.
(Gesture recognition process)
The HMD device 1 starts the gesture recognition process shown in FIG. 4 when the power is turned on. The timing at which the process starts is not limited to this: the gesture recognition process may instead be started when an operation to start it is performed on the input device 34, or when the HMD device 1 detects that it has been mounted on the user's head.
Immediately after the gesture recognition process starts, the control unit 41 calibrates the various parameters needed to control the HMD device 1, for example registering the reference value of the inertial sensor 13 and setting the gesture recognition area. In the present embodiment, the control unit 41 acquires a sensor value from the inertial sensor 13 and registers it in the memory 32 as the reference value for the orientation of the imaging device 11, and sets the entire field of view of the imaging device 11 (the entire region of the captured image) as the gesture recognition area.
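A minimal sketch of this calibration step; the injected sensor-reading callable and the rectangle representation of the recognition area are assumptions for illustration:

    # Illustrative calibration: store the inertial-sensor baseline and use
    # the whole captured frame as the gesture recognition area.
    def calibrate(read_sensor, frame_width: int, frame_height: int):
        baseline = read_sensor()  # e.g. a 3-axis acceleration tuple
        # Recognition area as (x, y, width, height), covering the full frame.
        recognition_area = (0, 0, frame_width, frame_height)
        return baseline, recognition_area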
[Step S101]
After calibration, the HMD device 1 functions as the imaging control unit 42 and continuously images the outside world at a predetermined frame rate (for a predetermined number of frames). Specifically, the HMD device 1 controls the imaging device 11 to start imaging, thereby generating captured images (a moving image) in the optical-axis direction of the camera lens, and stores them in the memory 32 or the storage 33. While the imaging device 11 is capturing, the user can perform a gesture by moving a part of the body (for example, a hand) for the HMD device 1 to recognize. Examples of such gestures include a pinch-out, which spreads the thumb and index finger apart from a closed state; a pinch-in, which conversely closes them from a spread state; a slide, which shifts the hand or an extended index finger; a tap with the spread hand or a finger; a double tap, which taps twice in succession; and a swipe, which moves the hand widely left and right or up and down.
[Step S102]
The HMD device 1 functions as the shake detection unit 43 and acquires sensor values from the inertial sensor 13, for example the acceleration along three axes. In step S102, the HMD device 1 acquires a sensor value each time one frame of the captured image is generated in step S101, and stores each frame in the memory 32 or the like in association with its sensor value.
[Step S103]
The HMD device 1 functions as the tracking unit 44 and tracks the predetermined target (for example, the user's hand) by a common pattern-matching technique. At this time, the HMD device 1 functions as the tracking availability determination unit 45 and determines whether the predetermined target can be tracked. Specifically, the HMD device 1 searches each frame of the captured images generated in step S101 for a region with features (color and shape) identical or similar to a previously prepared template image of a "hand", and calculates the movement of the "hand" between every pair of consecutive frames. If the movement of the "hand" between all consecutive frames is less than a predetermined amount, the HMD device 1 judges that the "hand" can be tracked. Conversely, if the movement between at least one pair of consecutive frames is the predetermined amount or more, or if some frame contains no "hand" so that the inter-frame movement cannot be calculated, it judges that the "hand" cannot be tracked.
If the HMD device 1 judges that the "hand" can be tracked, it determines that the predetermined target is trackable (step S103: YES) and advances to step S105. If it judges that the "hand" cannot be tracked, it determines that the predetermined target is untrackable (step S103: NO) and advances to step S104.
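A sketch of this tracking test using OpenCV template matching as one possible realization of "a common pattern-matching technique"; the match-score cutoff and the movement threshold are illustrative values, not taken from the embodiment:

    # Illustrative step S103: find the hand in each frame by template
    # matching and declare tracking failed on a missing hand or a large jump.
    import cv2
    import numpy as np

    def track_hand(frames, hand_template, max_move=50.0, min_score=0.6):
        """Return per-frame hand positions, or None when tracking fails."""
        positions = []
        for frame in frames:
            scores = cv2.matchTemplate(frame, hand_template, cv2.TM_CCOEFF_NORMED)
            _, best, _, best_loc = cv2.minMaxLoc(scores)
            if best < min_score:
                return None  # no hand-like region in this frame
            positions.append(np.array(best_loc, dtype=float))
        for prev, cur in zip(positions, positions[1:]):
            if np.linalg.norm(cur - prev) >= max_move:
                return None  # moved a predetermined amount or more
        return positions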
[Step S104]
The HMD device 1 executes a warning process that prompts the user to redo the gesture. The warning process is described in detail later.
[Step S105]
The HMD device 1 functions as the shake detection unit 43 and detects changes in the orientation of the imaging device 11 (for example, the optical-axis direction of the camera lens). For example, the shake detection unit 43 computes the difference between the sensor values of the first and last of the frames acquired in step S102: if the difference is less than a predetermined amount, it judges that the orientation of the imaging device 11 has not changed, and if the difference is the predetermined amount or more, it judges that the orientation has changed.
If the HMD device 1 judges that the orientation of the imaging device 11 has not changed (step S105: NO), it skips step S106 and advances to step S107. If it judges that the orientation has changed (step S105: YES), it advances to step S106.
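A sketch of this first-versus-last comparison, assuming per-frame sensor values stored as 3-vectors and an illustrative threshold:

    # Illustrative step S105: the orientation is considered changed when the
    # first and last frames' sensor values differ by at least `threshold`.
    import numpy as np

    def orientation_changed(sensor_values, threshold: float) -> bool:
        delta = np.asarray(sensor_values[-1], float) - np.asarray(sensor_values[0], float)
        return float(np.linalg.norm(delta)) >= threshold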
[Step S106]
The HMD device 1 functions as the shake detection unit 43 and corrects the trajectory along which the predetermined target moved, taking the amount of change in the orientation of the imaging device 11 into account. That is, the HMD device 1 converts the movement of the "hand" between all consecutive frames into the amount by which the "hand" actually moved. Specifically, for each pair of consecutive frames, the HMD device 1 combines the movement (vector) of the "hand" within the captured image with the change (vector) in the orientation of the imaging device 11.
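A sketch of this per-frame-pair vector combination; it assumes the camera's orientation change has already been mapped into 2-D image coordinates elsewhere, and the sign convention of the camera term depends on how that mapping is defined:

    # Illustrative step S106: combine, per consecutive frame pair, the hand's
    # apparent movement in the image with the camera's own motion (vector sum).
    import numpy as np

    def correct_trajectory(hand_moves, camera_moves):
        """hand_moves, camera_moves: sequences of 2-D per-frame-pair vectors."""
        return [np.asarray(h, float) + np.asarray(c, float)
                for h, c in zip(hand_moves, camera_moves)]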
[Step S107]
The HMD device 1 functions as the command determination unit 48 and recognizes the user's gesture from the movement of the "hand" calculated in step S103 or corrected in step S106. The HMD device 1 then determines the content of the user's instruction to it, that is, a command, according to the recognized gesture. Gestures and commands are associated one-to-one and registered in advance in a predetermined table, so the HMD device 1 simply reads from that table the command associated with the recognized gesture.
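A sketch of the one-to-one lookup table; the gesture and command names are placeholders, since the embodiment does not enumerate the table's contents:

    # Illustrative step S107 table: gestures map one-to-one to commands.
    GESTURE_TO_COMMAND = {
        "pinch_out":  "zoom_in",
        "pinch_in":   "zoom_out",
        "tap":        "play",
        "double_tap": "pause",
        "swipe":      "turn_page",
    }

    def decide_command(gesture: str):
        # None lets step S108 see that no command corresponds to the gesture.
        return GESTURE_TO_COMMAND.get(gesture)

Step S108 below would then treat a missing entry, or a command whose control is restricted at that time, as not valid.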
[Step S108]
The HMD device 1 functions as the command determination unit 48 and determines whether the command determined in step S107 is a valid command. Specifically, the HMD device 1 judges that the command is not valid when no command corresponds to the gesture recognized in step S107, or when the command directs a control that is restricted at that time. Conversely, when the command corresponding to the recognized gesture directs a control that is executable at that time, the HMD device 1 judges it to be a valid command.
If the HMD device 1 judges the command to be valid (step S108: YES), it advances to step S109; if it judges the command not to be valid (step S108: NO), it advances to step S110.
[Step S109]
The HMD device 1 functions as the application unit 50 and executes the process corresponding to the command determined by the command determination unit 48. For example, if the command determined in step S107 corresponds to a pinch-out or pinch-in gesture, the HMD device 1 executes an enlargement or reduction process; if it corresponds to a slide, tap, or double-tap gesture, the HMD device 1 plays, pauses, or stops content.
[Step S110]
The HMD device 1 functions as the notification unit 47 and the display control unit 49 and issues an error notification that the process corresponding to the user's gesture cannot be executed. For example, the HMD device 1 causes the display device 12 to display a message such as "Please redo the gesture in the same position".
[Step S111]
The HMD device 1 functions as the control unit 41 and determines whether the user has given an instruction to end the gesture recognition process; specifically, it determines whether an operation to end the process has been performed on the input device 34.
If no instruction to end the gesture recognition process has been given (step S111: NO), the HMD device 1 returns to step S101 and continues the process. If such an instruction has been given (step S111: YES), the HMD device 1 ends the gesture recognition process.
(Warning process S104)
Next, the warning process (step S104) is described in detail.
When the gesture recognition process described above advances to step S104, the HMD device 1 starts the warning process.
[Step S201]
The HMD device 1 functions as the shake detection unit 43 and detects changes in the orientation of the imaging device 11 (for example, the optical-axis direction of the camera lens). As in step S105, the shake detection unit 43 computes the difference between the sensor values of the first and last of the frames acquired in step S102: if the difference is less than a predetermined amount, it judges that the orientation of the imaging device 11 has not changed, and if the difference is the predetermined amount or more, it judges that the orientation has changed.
If the HMD device 1 judges that the orientation of the imaging device 11 has not changed (step S201: NO), it advances to step S202. If it judges that the orientation has changed (step S201: YES), it advances to step S206.
[Step S202]
The HMD device 1 functions as the command determination unit 48, recognizes the user's gesture from the movement of the "hand" calculated in step S103, and determines the content of the user's instruction to it, that is, a command, according to the recognized gesture. However, if the movement of the "hand" could not be calculated in step S103, the gesture quite likely includes a movement in which the predetermined target (for example, the user's hand) goes outside the gesture recognition area. In that case, the HMD device 1 recognizes the user's gesture from the movement of the "hand" between the frames captured while the "hand" was moving within the gesture recognition area, and determines the corresponding command.
[Step S203]
The HMD device 1 functions as the command determination unit 48 and determines whether the command determined in step S202 is a valid command, in the same way as in step S108: the command is not valid when no command corresponds to the gesture recognized in step S202 or when it directs a control that is restricted at that time, and valid when it directs a control that is executable at that time.
If the HMD device 1 judges the command to be valid (step S203: YES), it advances to step S204; if it judges the command not to be valid (step S203: NO), it advances to step S205.
[Step S204]
The HMD device 1 functions as the application unit 50 and executes the process corresponding to the command determined by the command determination unit 48. For example, if the command determined in step S202 corresponds to a gesture such as a swipe, the HMD device 1 turns the displayed page.
[Step S205]
The HMD device 1 functions as the notification unit 47 and the display control unit 49 and issues notification A, which informs the user that the imaging device 11 did not move but the predetermined target (for example, the user's hand) left the gesture recognition area (the state shown in FIG. 6(A)).
For example, when the target leaves the gesture recognition area on the right side as seen from the user, as shown in FIG. 6(A), the HMD device 1 controls the display device 12 to display an image modeled on a hand on the right side of the display screen (FIG. 6(B)). The user can thus easily grasp that the hand strayed off the right side of the gesture recognition area and intuitively sees that the gesture should be moved to the left.
Likewise, when the target leaves the gesture recognition area on the left side as seen from the user, the HMD device 1 displays the hand image on the left side of the display screen (FIG. 7(A)), and when it leaves on the upper side, on the upper side of the screen (FIG. 7(B)), so the user immediately sees that the gesture should be moved to the right or downward, respectively.
Similarly, when the target leaves the gesture recognition area on the lower, upper-left, upper-right, lower-left, or lower-right side as seen from the user, the HMD device 1 displays the hand image on the corresponding side of the display screen (FIGS. 7(C), 7(D), 8(A), 8(B), 8(C)). In this way, the HMD device 1 displays the warning image on the side on which the target left the gesture recognition area, so the user can intuitively confirm the direction in which to bring the gesture back.
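A sketch of this placement rule for notification A; the direction labels and screen-anchor names are illustrative, not from the embodiment:

    # Illustrative notification A: the hand image is drawn on the same side
    # of the screen as the side on which the hand left the recognition area.
    HAND_ICON_ANCHOR = {
        "right": "right edge",       "left": "left edge",
        "up": "top edge",            "down": "bottom edge",
        "upper_left": "top-left",    "upper_right": "top-right",
        "lower_left": "bottom-left", "lower_right": "bottom-right",
    }

    def notification_a_anchor(exit_side: str) -> str:
        """Where to draw the hand image for a given exit side."""
        return HAND_ICON_ANCHOR[exit_side]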
After notification A as described above, the HMD device 1 functions as the control unit 41 and returns to the gesture recognition process.
[Step S206]
The HMD device 1 functions as the shake detection unit 43 and, as in step S106, corrects the trajectory along which the predetermined target moved, taking the amount of change in the orientation of the imaging device 11 into account: for each pair of consecutive frames, it combines the movement (vector) of the "hand" within the captured image with the change (vector) in the orientation of the imaging device 11.
However, if the movement of the "hand" could not be calculated in step S103, the gesture quite likely includes a movement in which the predetermined target (for example, the user's hand) goes outside the gesture recognition area. In that case, the HMD device 1 converts only the movement of the "hand" between the frames captured while the "hand" was moving within the gesture recognition area into the amount by which the "hand" actually moved.
[Step S207]
The HMD device 1 functions as the command determination unit 48, recognizes the user's gesture from the movement of the "hand" corrected in step S206, and determines the content of the user's instruction to it, that is, a command, according to the recognized gesture.
[Step S208]
The HMD device 1 functions as the command determination unit 48 and determines whether the command determined in step S207 is a valid command, in the same way as in step S108: the command is not valid when no command corresponds to the gesture recognized in step S207 or when it directs a control that is restricted at that time, and valid when it directs a control that is executable at that time.
If the HMD device 1 judges the command to be valid (step S208: YES), it advances to step S209; if it judges the command not to be valid (step S208: NO), it advances to step S210.
[Step S209]
The HMD device 1 functions as the application unit 50 and executes the process corresponding to the command determined by the command determination unit 48. For example, if the command determined in step S207 corresponds to a gesture such as a swipe, the HMD device 1 turns the displayed page.
[Step S210]
The HMD device 1 functions as the notification unit 47 and the display control unit 49 and issues notification B, which informs the user that the predetermined target (for example, the user's hand) left the gesture recognition area because the imaging device 11 moved (the state shown in FIG. 9(A)).
For example, when the user turns to the left and the target consequently leaves the gesture recognition area on the right side, as shown in FIG. 9(A), the HMD device 1 controls the display device 12 to display an image modeled on an arrow on the right side of the display screen (FIG. 9(B)). The user can thus easily grasp that the imaging device 11 is facing left and intuitively sees that the head should be turned to the right.
Likewise, when the user turns to the right and the target leaves the area on the left side, the arrow image is displayed on the left side of the display screen (FIG. 10(A)), and when the user turns downward and the target leaves on the upper side, the arrow is displayed on the upper side of the screen (FIG. 10(B)), so the user immediately sees that the head should be turned to the left or upward, respectively.
Similarly, when the target leaves the gesture recognition area because the user turned upward, to the lower right, to the lower left, to the upper right, or to the upper left, the HMD device 1 controls the display device 12 to display the arrow image on the lower, upper-left, upper-right, lower-left, or lower-right side of the display screen, respectively (FIGS. 10(C), 10(D), 11(A), 11(B), 11(C)). Thus, when the target leaves the gesture recognition area because the user turned in some direction, the image is displayed on the side opposite that direction, and the user can intuitively confirm the direction in which to turn the head.
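A sketch of the corresponding rule for notification B; since the arrow is shown on the side opposite the head-turn direction, a small opposite-direction table suffices (labels are illustrative):

    # Illustrative notification B: the arrow is drawn opposite the direction
    # the head turned, i.e. it points the way the user should turn back.
    OPPOSITE_SIDE = {
        "left": "right", "right": "left", "up": "down", "down": "up",
        "upper_left": "lower_right", "lower_right": "upper_left",
        "upper_right": "lower_left", "lower_left": "upper_right",
    }

    def notification_b_anchor(head_turn_direction: str) -> str:
        """Screen side on which to draw the corrective arrow."""
        return OPPOSITE_SIDE[head_turn_direction]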
After notification B as described above, the HMD device 1 functions as the control unit 41 and returns to the gesture recognition process.
By executing the gesture recognition process and the warning process described above, the HMD device 1 identifies the cause whenever the predetermined target (a part of the user's body) can no longer be tracked, and issues the appropriate notification A or B based on that cause. From these notifications, the user can easily judge in which direction to move the head, or in which direction to move the hand, so that the HMD device 1 can correctly recognize the gesture. In this way, the user can intuitively correct the position of the gesture.
The processing units in the flowcharts above are divided according to their main processing content to make the HMD device 1 easier to understand; the present invention is not limited by how the processing steps are classified or named. The processing performed by the HMD device 1 can be divided into more processing steps, and a single processing step may execute more processes.
<Modifications>
The embodiment above is intended to illustrate the gist of the present invention and does not limit it. Many alternatives, modifications, and variations will be apparent to those skilled in the art.
For example, in the embodiment above, the entire field of view of the imaging device 11 (the entire region of the captured image) is set as the gesture recognition area. The present invention is not limited to this; a partial region within the field of view of the imaging device 11 may be set as the gesture recognition area.
FIG. 12 shows a modification of the gesture recognition area. As in the region enclosed by the broken line in FIG. 12, the right-hand region within the field of view of the imaging device 11 may be set as the gesture recognition area to make gestures by the user's right hand easier to recognize. The gesture recognition area may or may not be displayed on the transparent members G1 and G2.
The present invention is also applicable to any device that the user wears or holds, as long as it has a function capable of recognizing the user's gestures; it is therefore not limited to application to the HMD device 1 of the embodiment above.
FIG. 13 illustrates an example in which the present invention is applied to a portable terminal. As shown in FIG. 13, the present invention can also be applied to a portable terminal 60 that the user holds in the hand. Here, portable terminals include portable information terminal devices, such as smartphones, tablet PCs, and PDAs (Personal Digital Assistants), that have an imaging device and a display screen capable of displaying various information and contents.
When the present invention is applied to the portable terminal 60, the portable terminal 60 has the same configuration as in the embodiment above and recognizes the movement of the user's face as a gesture. When the face leaves the gesture recognition area, the portable terminal 60 notifies the user in a way that distinguishes whether the hand holding the portable terminal 60 (that is, the imaging device 11) moved or the user's face (the gesture) moved.
The notification unit 47 may also output a different warning sound depending on the cause identified by the cause identification unit 46, or generate a different vibration pattern with a motor or the like depending on that cause.
In the embodiment and modifications above, the HMD device 1 is a see-through image display device. The present invention is not limited to this; the HMD device 1 may be a non-transmissive image display device.
The configuration of the HMD device 1 described above covers the main components needed to explain the features of the embodiment and modifications above; the device is not limited to that configuration, nor is the configuration of a typical HMD device 1 excluded.
The functional configurations of the HMD device 1 described above are classified according to their main processing content to make them easier to understand; the present invention is not limited by how the components are classified or named. Each functional configuration can be divided into more components according to its processing content, and a single component can be organized so as to execute more processes.
The processing of each functional configuration of the HMD device 1 can also be realized by dedicated hardware circuits, in which case it may be executed by a single piece of hardware or by multiple pieces of hardware.
The program that operates the HMD device 1 may be provided on a computer-readable recording medium such as a USB memory, flexible disk, or CD-ROM, or provided online via a network such as the Internet. In that case, the program recorded on the computer-readable recording medium is typically transferred to and stored in the memory 32, the storage 33, or the like. The program may be provided, for example, as standalone application software, or may be incorporated into the software of the device as one function of the HMD device 1.
1 HMD device,
11 imaging device,
12 display device,
13 inertial sensor,
31 CPU,
32 memory,
33 storage,
34 input device,
35 display controller,
41 control unit,
42 imaging control unit,
43 shake detection unit,
44 tracking unit,
45 tracking availability determination unit,
46 cause identification unit,
47 notification unit,
48 command determination unit,
49 display control unit,
50 application unit,
60 portable terminal,
U control unit.

Claims (8)

1.  A gesture recognition device worn or held by a user, comprising:
    an imaging unit that continuously captures images;
    a control unit that detects a change in the orientation of the imaging unit, tracks a predetermined target using the images captured by the imaging unit, determines whether the tracking is possible, and, when the determination result indicates that the target cannot be tracked, calculates the detected amount of change in the orientation and identifies the cause of the tracking failure; and
    a notification unit that issues a notification based on the identified cause.
2.  The gesture recognition device according to claim 1, wherein the predetermined target is a part of the user's body.
3.  The gesture recognition device according to claim 1 or claim 2, wherein the notification unit issues a different notification depending on the cause identified by the identification unit.
4.  The gesture recognition device according to any one of claims 1 to 3, wherein the notification by the notification unit includes at least one of a notification by display, a notification by sound, and a notification by vibration.
5.  The gesture recognition device according to any one of claims 1 to 4, wherein the tracking by the control unit is performed within a recognition area set within the field of view of the imaging unit.
6.  The gesture recognition device according to any one of claims 1 to 5, wherein the control unit determines a command according to the tracked movement of the target, and the notification unit issues a notification when the target becomes untrackable while a command is being determined by the control unit.
7.  A head-mounted display having the function of the gesture recognition device according to any one of claims 1 to 6.
8.  A portable terminal having the function of the gesture recognition device according to any one of claims 1 to 6.
PCT/JP2015/050539 2014-03-12 2015-01-09 Gesture recognition device, head-mounted display, and portable terminal WO2015136952A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014049096A JP2017083916A (en) 2014-03-12 2014-03-12 Gesture recognition apparatus, head-mounted display, and mobile terminal
JP2014-049096 2014-03-12

Publications (1)

Publication Number Publication Date
WO2015136952A1 true WO2015136952A1 (en) 2015-09-17

Family

ID=54071406

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/050539 WO2015136952A1 (en) 2014-03-12 2015-01-09 Gesture recognition device, head-mounted display, and portable terminal

Country Status (2)

Country Link
JP (1) JP2017083916A (en)
WO (1) WO2015136952A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10860088B2 (en) * 2018-05-03 2020-12-08 Microsoft Technology Licensing, Llc Method and system for initiating application and system modal control based on hand locations
JP2022087989A (en) 2020-12-02 2022-06-14 株式会社Jvcケンウッド Video display device, method for controlling video display device, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010098354A (en) * 2008-10-14 2010-04-30 Panasonic Corp Imaging apparatus
JP2012010162A (en) * 2010-06-25 2012-01-12 Kyocera Corp Camera device
JP2012146236A (en) * 2011-01-14 2012-08-02 Olympus Corp Gesture input apparatus
JP2013065112A (en) * 2011-09-15 2013-04-11 Omron Corp Gesture recognition device, electronic apparatus, control method of gesture recognition device, control program, and recording medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017098775A1 (en) * 2015-12-11 2017-06-15 ソニー株式会社 Information processing device, information processing method, and program
US11087775B2 (en) 2015-12-11 2021-08-10 Sony Corporation Device and method of noise suppression based on noise source positions
JP2017220032A (en) * 2016-06-07 2017-12-14 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus, information processing method, and computer program

Also Published As

Publication number Publication date
JP2017083916A (en) 2017-05-18

Similar Documents

Publication Publication Date Title
US10417496B2 (en) Visibility enhancement devices, systems, and methods
EP2634727B1 (en) Method and portable terminal for correcting gaze direction of user in image
US9900498B2 (en) Glass-type terminal and method for controlling the same
US10477090B2 (en) Wearable device, control method and non-transitory storage medium
EP3062286B1 (en) Optical distortion compensation
EP3144775B1 (en) Information processing system and information processing method
JP7092028B2 (en) Information processing equipment, information processing methods, and programs
CN110546601B (en) Information processing device, information processing method, and program
CN106066537B (en) Head-mounted display and control method of head-mounted display
CN104115100A (en) Head-mounted display, program for controlling head-mounted display, and method of controlling head-mounted display
US8996333B2 (en) Information processing apparatus which executes specific processing based on a specific condition and a detected specific vibration, and method for controlling the same
JP6341755B2 (en) Information processing apparatus, method, program, and recording medium
JP2015114818A (en) Information processing device, information processing method, and program
WO2015136952A1 (en) Gesture recognition device, head-mounted display, and portable terminal
US9148537B1 (en) Facial cues as commands
US20180004288A1 (en) Electronic device
US10389947B2 (en) Omnidirectional camera display image changing system, omnidirectional camera display image changing method, and program
CN106371552B (en) Control method and device for media display at mobile terminal
WO2016157951A1 (en) Display control device, display control method, and recording medium
JP6686319B2 (en) Image projection device and image display system
JP6155893B2 (en) Image processing apparatus and program
WO2017168622A1 (en) Captured image sharing system, captured image sharing method, and program
JP6079418B2 (en) Input device and input program
US11733789B1 (en) Selectively activating a handheld device to control a user interface displayed by a wearable device
US20220350997A1 (en) Pointer-based content recognition using a head-mounted device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15761429

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15761429

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP