WO2015136952A1 - Gesture recognition device, head-mounted display, and portable terminal

Gesture recognition device, head-mounted display, and portable terminal

Info

Publication number
WO2015136952A1
WO2015136952A1 (PCT/JP2015/050539)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture recognition
notification
user
hmd device
gesture
Prior art date
Application number
PCT/JP2015/050539
Other languages
English (en)
Japanese (ja)
Inventor
Yoshiro Hirahara (義朗 平原)
Tetsuya Katagiri (片桐 哲也)
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc. (コニカミノルタ株式会社)
Publication of WO2015136952A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20: Movements or behaviour, e.g. gesture recognition
    • G06V 40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 9/453: Help systems

Definitions

  • The present invention relates to a gesture recognition device, a head-mounted display, and a mobile terminal.
  • A head-mounted display is a display device worn on the user's head. With the head-mounted display mounted, the user can view various contents, such as image information, displayed in front of the user's eyes.
  • Some such head-mounted displays can capture part of the user's body with a small camera and recognize the movement (gesture) of the body as an input operation to the head-mounted display.
  • Patent Document 1, although not an example using a head-mounted display, describes a technique in which a small camera is installed on the ceiling, a user's gesture made under the camera is captured and recognized, and an input command to a computer is determined from it.
  • In a head-mounted display, by contrast, the small camera is mounted on the display itself, so if the user's head shakes, the camera shakes with it. When the user's body then moves out of the camera's field of view, it cannot be distinguished whether the camera moved or the user's body moved. This distinction is particularly difficult while the user is wearing the head-mounted display and gazing at its screen. Consequently, if the user is merely prompted to correct the position of a gesture, without being shown a specific way to do so, the user cannot determine how to correct it.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide a gesture recognition device, a head-mounted display, and a mobile terminal that guide the user to intuitively correct the position of a gesture.
  • One aspect of the invention is a gesture recognition device that is worn or held by a user, comprising: an imaging unit that continuously captures images; a control unit that detects changes in the orientation of the imaging unit, tracks a predetermined target in the images captured by the imaging unit, determines whether tracking is possible, calculates the amount of change in the detected orientation when the determination result indicates that the target cannot be tracked, and identifies the cause of the tracking failure; and a notification unit that performs notification based on the identified cause.
  • The notification by the notification unit includes at least one of notification by display, notification by sound, and notification by vibration.
  • In the gesture recognition device according to any one of (1) to (5), the control unit determines a command according to the tracked movement of the target, and the notification unit performs notification when the target cannot be tracked at the time the command is determined by the control unit.
  • With the above configuration, when the target cannot be tracked, the cause is identified and an appropriate notification based on the identified cause is performed.
  • This notification allows the user to easily determine in which direction to move the head, or the hand, so that the head-mounted display can correctly recognize the gesture. In this way, the user can intuitively correct the position of the gesture.
  • HMD: head-mounted display
  • FIG. 6(A) is a diagram showing an example of the state when the target moves in the right direction.
  • FIGS. 6(B), 7(A) to 7(D), and 8(A) to 8(C) are diagrams showing examples of notification (notification A) when the target moves out of the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides of the gesture recognition area, respectively.
  • FIG. 9(A) is a diagram showing an example of the state when the user's head turns in the left direction.
  • FIGS. 9(B), 10(A) to 10(D), and 11(A) to 11(C) are diagrams showing examples of notification (notification B) when the target moves out of the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides of the gesture recognition area, respectively.
  • FIG. 1 is a diagram showing an example of the appearance of a head mounted display (HMD) device according to this embodiment.
  • FIG. 2 is a block diagram illustrating a hardware configuration example of the HMD device.
  • The HMD device 1 is an image display device that is used while mounted on the head of a user (hereinafter also referred to as the "observer").
  • The HMD device 1 has a structure resembling vision-correcting eyeglasses. That is, the HMD device 1 includes a pair of left and right temples L1 and L2, a bridge B, and a pair of left and right transparent members G1 and G2.
  • The temples L1 and L2 are, for example, long bar-shaped members made of an elastic material. Each is provided at one end with an ear-hook portion that hangs on the user's ear, and the transparent members G1 and G2 are fixed to the other end. A control unit U is mounted near the ear-hook portion that hangs on one of the user's ears.
  • The bridge B is a short bar-shaped member that connects the pair of left and right transparent members G1 and G2 to each other.
  • The transparent members G1 and G2 are fixed to both ends of the bridge B and are thereby held at a predetermined interval from each other.
  • The transparent members G1 and G2 are formed of a transparent material (glass, plastic, film, etc.) that transmits light from the outside world to the user's eyes, so that a user wearing the HMD device 1 can observe the outside world.
  • The imaging device 11 is provided above the transparent member G1, which corresponds to the user's right eye, and the display device 12 is provided so as to overlap the transparent member G1.
  • The imaging device 11 images the outside world in the optical axis direction of the camera lens (hereinafter also referred to as the "imaging direction").
  • The imaging device 11 is held fixed with respect to the HMD device 1 so that the optical axis of the camera lens substantially coincides with the user's line-of-sight direction. The imaging device 11 can therefore image the user's front visual field.
  • The imaging device 11 is a digital camera or video camera including an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor.
  • Through imaging by the imaging device 11, a captured image in a format such as JPEG (or a moving image in a format such as MPEG-2/4, captured at a predetermined frame rate) is generated and transmitted to the control unit U over a wired connection.
  • In the present embodiment, a consumer web camera (with an angle of view of about 72 degrees) is used as the imaging device 11.
  • The display device 12 displays various contents, such as a predetermined image (video), on the transparent members G1 and G2.
  • The display device 12 may display an image (an AR image) superimposed on the light arriving from the outside world.
  • For example, the display device 12 can display a pop-up image containing information such as a name and workplace next to a person located in the direction of the user's line of sight.
  • The display device 12 may be a binocular type instead of the monocular type shown in FIG. 1.
  • An inertial sensor 13 is provided in the vicinity of the imaging device 11.
  • The inertial sensor 13 detects shaking of the HMD device 1 and provides the data necessary for calculating the amount by which the imaging device 11 shakes at that time.
  • The inertial sensor 13 includes an acceleration sensor, a gyro sensor, or the like.
  • The inertial sensor 13 may take any form, name, or structure as long as it can provide the data necessary for calculating the amount of shaking of the imaging device 11.
  • The control unit U has the configuration necessary to control the imaging device 11, the display device 12, and the inertial sensor 13 so that the whole functions as a head-mounted display. For example, the control unit U recognizes the user's gesture from the captured images (moving image) transmitted from the imaging device 11, executes the command corresponding to the recognized gesture, and displays the result on the display device 12. A specific configuration of the control unit U will be described later.
  • The control unit U includes a CPU (Central Processing Unit) 31, a memory 32, a storage 33, an input device 34, and a display controller 35.
  • The CPU 31 is a control circuit, such as a processor, that controls each unit and executes various arithmetic processes according to programs; each function of the HMD device 1 is realized by the CPU 31 executing the corresponding program.
  • The memory 32 is a main storage device with high-speed access that temporarily stores programs and data and serves as a work area. For example, a DRAM (Dynamic Random Access Memory), SDRAM (Synchronous Dynamic Random Access Memory), or SRAM (Static Random Access Memory) is employed as the memory 32.
  • The storage 33 is a large-capacity auxiliary storage device that stores various programs, including an operating system, and various data. For example, a flash memory or the like is employed as the storage 33.
  • The input device 34 consists of physical keys and buttons for inputting instructions such as powering on the HMD device 1. An external controller (for example, a smartphone or a remote controller) may also be used as the input device 34.
  • The display controller 35 causes the display device 12 to display various contents, such as a predetermined image (video) or an AR image. Specifically, the display controller 35 reads a predetermined image from the memory 32 at every predetermined display cycle, converts it into a signal for input to the display device 12, and generates pulse signals such as horizontal and vertical synchronizing signals.
  • The HMD device 1 with the hardware configuration described above has the following functional configuration.
  • FIG. 3 is a block diagram illustrating a functional configuration example of the HMD device.
  • The HMD device 1 includes, as its functional configuration, a control unit 41, an imaging control unit 42, a shake detection unit 43, a tracking unit 44, a tracking availability determination unit 45, a cause identification unit 46, a notification unit 47, a command determination unit 48, a display control unit 49, and an application unit 50.
  • The functional units 41 to 48 and 50 are realized by the CPU 31 reading the programs installed in the storage 33 into the memory 32 and executing them; the display control unit 49 is realized by the display controller 35. However, the functional units 41 to 48 and 50 are not limited to this and may instead be realized by hardware such as an ASIC.
  • The control unit 41 controls the entire HMD device 1.
  • The imaging control unit 42 controls the imaging device 11 to continuously image the outside world, thereby generating a captured image (moving image).
  • The shake detection unit 43 detects a change in the orientation of the imaging device 11 (for example, the optical axis direction of the camera lens) when the user wearing the HMD device 1 shakes. For example, the shake detection unit 43 acquires the sensor value of the inertial sensor 13 and detects the change in the orientation of the imaging device 11 from it.
  • The tracking unit 44 tracks a predetermined target (for example, the user's hand or fingertip) in order to recognize the user's gesture.
  • Specifically, the tracking unit 44 sets in advance an area for recognizing the user's gesture (hereinafter, the "gesture recognition area") and tracks the predetermined target within that area. The gesture recognition area is set within the field of view of the imaging device 11.
  • The tracking availability determination unit 45 determines whether tracking by the tracking unit 44 is possible.
  • When the predetermined target cannot be tracked, based on the determination result of the tracking availability determination unit 45, the cause identification unit 46 calculates the amount of change in the orientation of the imaging device 11 detected by the shake detection unit 43 and identifies the cause of the tracking failure. For example, when the amount of change in the orientation of the imaging device 11 is greater than or equal to a threshold value, the cause identification unit 46 identifies that tracking became impossible because the user's body shook and the imaging device 11 moved. On the other hand, when the amount of change is less than the threshold value, it identifies that the predetermined target could not be tracked because the target itself moved (the gesture). A sketch of this logic follows.
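  • A minimal Python sketch of this cause-identification logic. The threshold value and all names are illustrative assumptions; the publication does not give concrete values.

```python
# Illustrative sketch of the cause identification described above.
# The threshold and names are assumptions, not taken from the publication.

ORIENTATION_CHANGE_THRESHOLD = 0.1  # assumed units (e.g. radians)

def identify_cause(orientation_change: float) -> str:
    """Classify why tracking failed, given the camera's orientation change.

    At or above the threshold, the failure is attributed to the user's
    body (and thus the imaging device) moving; below it, to the tracked
    target itself moving out of the gesture recognition area.
    """
    if orientation_change >= ORIENTATION_CHANGE_THRESHOLD:
        return "camera_moved"  # leads to notification B
    return "target_moved"      # leads to notification A
```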
  • The notification unit 47 performs notification based on the cause identified by the cause identification unit 46; for example, it performs a different notification depending on the identified cause.
  • The notification performed by the notification unit 47 includes at least one of notification by display (the display device 12), notification by sound (a speaker, not shown), and notification by vibration (a small vibrator, not shown).
  • The command determination unit 48 determines the user's instruction content, that is, the command to the HMD device 1, according to the movement (gesture) of the predetermined target tracked by the tracking unit 44.
  • The display control unit 49 causes the display device 12 to display various contents, such as a predetermined image (video) or an AR image.
  • The application unit 50 executes the processing corresponding to the command determined by the command determination unit 48. For example, the application unit 50 performs processing such as enlarging or reducing the display screen and playing, pausing, or stopping content.
  • FIG. 4 is a flowchart showing the procedure of the gesture recognition process.
  • FIG. 5 is a flowchart showing the procedure of the warning process.
  • FIG. 6(A) is a diagram illustrating an example of the state when the target moves in the right direction.
  • FIGS. 6(B), 7(A) to 7(D), and 8(A) to 8(C) are diagrams showing examples of notification (notification A) when the target moves out of the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides of the gesture recognition area, respectively.
  • FIG. 9A is a diagram illustrating an example of a state when the user's head is directed leftward.
  • FIGS. 9(B), 10(A) to 10(D), and 11(A) to 11(C) are diagrams showing examples of notification (notification B) when the target moves out of the right, left, upper, lower, upper-left, upper-right, lower-left, and lower-right sides of the gesture recognition area, respectively.
  • The HMD device 1 starts the gesture recognition process shown in FIG. 4 when the power is turned on.
  • However, the timing for starting the gesture recognition process is not limited to this; the process may instead be started when an operation for starting it is performed on the input device 34.
  • Alternatively, the gesture recognition process may be started when it is detected that the HMD device 1 has been mounted on the user's head.
  • First, the control unit 41 calibrates the various parameters necessary for controlling the HMD device 1, for example registering the reference value of the inertial sensor 13 and setting the gesture recognition area. In the present embodiment, the control unit 41 acquires a sensor value from the inertial sensor 13 and registers it in the memory 32 as the reference value for the orientation of the imaging device 11. Further, the control unit 41 sets the entire field of view of the imaging device 11 (the entire region of the captured image) as the gesture recognition area, as sketched below.
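  • A minimal Python sketch of this calibration step; the function and variable names are illustrative assumptions, not taken from the publication.

```python
from typing import Tuple

def calibrate(sensor_value: Tuple[float, float, float],
              frame_size: Tuple[int, int]):
    """Sketch of the calibration described above: register the inertial
    sensor's current reading as the orientation reference, and set the
    whole captured frame as the gesture recognition area."""
    reference = sensor_value                   # stored, e.g., in the memory 32
    width, height = frame_size
    recognition_area = (0, 0, width, height)   # x, y, w, h: the entire image
    return reference, recognition_area

# Example: a 640x480 camera with the sensor currently reading (0.0, 0.0, 9.8).
ref, area = calibrate((0.0, 0.0, 9.8), (640, 480))
```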
  • (Step S101) Next, the HMD device 1 functions as the imaging control unit 42 and continuously images the outside world at a predetermined frame rate (a predetermined number of frames). Specifically, the HMD device 1 controls the imaging device 11 to start imaging, generating a captured image (moving image) in the optical axis direction of the camera lens, and stores the generated image in the memory 32 or the storage 33. While the imaging device 11 is imaging, the user can perform a gesture operation by moving a part of the body (for example, a hand) where the HMD device 1 can recognize it.
  • Gesture movements include: pinch-out, spreading the thumb and index finger apart from a closed state; pinch-in, closing the thumb and index finger from a spread state; slide, moving the hand or an extended index finger; tap, tapping with the spread hand or a finger; double tap, tapping twice in succession; and swipe, moving the hand left and right or up and down. These could be represented as a simple enumeration, as sketched below.
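  • A hedged sketch of the gesture vocabulary as a Python enumeration; the names are illustrative only.

```python
from enum import Enum, auto

class Gesture(Enum):
    """Gesture vocabulary described above; the names are illustrative."""
    PINCH_OUT = auto()   # spread thumb and index finger from a closed state
    PINCH_IN = auto()    # close thumb and index finger from a spread state
    SLIDE = auto()       # move the hand or an extended index finger
    TAP = auto()         # tap with the spread hand or a finger
    DOUBLE_TAP = auto()  # tap twice in quick succession
    SWIPE = auto()       # move the hand left/right or up/down
```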
  • (Step S102) Next, the HMD device 1 functions as the shake detection unit 43 and acquires sensor values from the inertial sensor 13; for example, it acquires the accelerations in the three axial directions. A sensor value is acquired each time a captured image for one frame is generated in step S101, and the captured image and sensor value for each frame are associated with each other and stored in the memory 32 or the like.
  • (Step S103) Next, the HMD device 1 functions as the tracking unit 44 and tracks the predetermined target (for example, the user's hand) by a general pattern-matching method. At this time, the HMD device 1 also functions as the tracking availability determination unit 45 and determines whether the predetermined target can be tracked. Specifically, the HMD device 1 searches each captured frame generated in step S101 for a region whose features (color and shape) are the same as or similar to those of a "hand" template image prepared in advance, and then calculates the movement amount of the "hand" between every pair of consecutive frames.
  • The HMD device 1 determines that the "hand" can be tracked when the movement amount of the "hand" between every pair of consecutive frames is less than a predetermined amount. On the other hand, when the movement amount between at least one pair of consecutive frames is equal to or greater than the predetermined amount, or when there is a frame in which no "hand" is found so that the movement amount cannot be calculated, the HMD device 1 determines that the "hand" cannot be tracked. A sketch of this test follows.
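  • A minimal Python sketch of the trackability test; the per-frame movement threshold is an assumed value, and the names are illustrative.

```python
import math
from typing import Optional, Sequence, Tuple

MAX_STEP = 50.0  # assumed maximum plausible hand movement, in pixels/frame

def is_trackable(positions: Sequence[Optional[Tuple[float, float]]]) -> bool:
    """Return True when the "hand" can be tracked across all frames.

    `positions` holds the detected hand centre for each frame, with None
    for frames in which no region matched the hand template.
    """
    for prev, cur in zip(positions, positions[1:]):
        if prev is None or cur is None:
            return False  # a frame without a hand: movement is undefined
        if math.dist(prev, cur) >= MAX_STEP:
            return False  # implausibly large jump between consecutive frames
    return True

# Example: a hand drifting slowly to the right is trackable.
assert is_trackable([(10, 10), (14, 10), (18, 11)])
# Example: the hand vanishing from a frame is not.
assert not is_trackable([(10, 10), None, (18, 11)])
```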
  • If the HMD device 1 determines that the predetermined target can be tracked (step S103: YES), the process proceeds to step S105.
  • If the HMD device 1 determines that the predetermined target cannot be tracked (step S103: NO), the process proceeds to step S104.
  • (Step S104) The HMD device 1 executes a warning process that prompts the user to redo the gesture. Details of the warning process are described later.
  • (Step S105) Next, the HMD device 1 functions as the shake detection unit 43 and detects whether there is a change in the orientation of the imaging device 11 (for example, the optical axis direction of the camera lens). For example, the shake detection unit 43 takes the difference between the sensor values of the first and last frames among the sensor values acquired in step S102; if the difference is less than a predetermined amount, it judges that there is no change in the orientation of the imaging device 11, and if the difference is equal to or greater than the predetermined amount, it judges that there is a change.
  • If the HMD device 1 determines that there is no change in the orientation of the imaging device 11 (step S105: NO), the process of step S106 is skipped and the process proceeds to step S107. If it determines that there is a change (step S105: YES), the process proceeds to step S106. A sketch of the orientation-change check follows.
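  • A minimal Python sketch of this orientation-change check; the threshold is an assumed value.

```python
from typing import Sequence, Tuple

ORIENTATION_DELTA = 0.05  # assumed per-axis threshold on the sensor values

def orientation_changed(
        sensor_values: Sequence[Tuple[float, float, float]]) -> bool:
    """Detect a change in camera orientation over the captured frames.

    Compares the inertial-sensor reading associated with the first frame
    against the one associated with the last frame, as described above.
    """
    first, last = sensor_values[0], sensor_values[-1]
    diff = max(abs(a - b) for a, b in zip(first, last))
    return diff >= ORIENTATION_DELTA
```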
  • (Step S106) The HMD device 1 functions as the shake detection unit 43 and corrects the locus of movement of the predetermined target, taking into account the amount of change in the orientation of the imaging device 11. That is, the HMD device 1 converts the movement amount of the "hand" between consecutive frames into a movement amount corresponding to how the "hand" actually moved. Specifically, the HMD device 1 may compose, for each pair of consecutive frames, the movement vector of the "hand" in the captured image with the vector of the change in the orientation of the imaging device 11, as sketched below.
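  • A minimal Python sketch of this correction, assuming vector addition as the composition and image-plane coordinates for both vectors; the names are illustrative.

```python
from typing import List, Tuple

Vec = Tuple[float, float]

def correct_trajectory(hand_moves: List[Vec],
                       camera_moves: List[Vec]) -> List[Vec]:
    """Convert apparent (in-image) hand movement into actual hand movement.

    For each pair of consecutive frames, the hand's movement vector in the
    captured image is composed with the camera's own movement contribution,
    projected into the same image coordinates, by vector addition.
    """
    return [(hx + cx, hy + cy)
            for (hx, hy), (cx, cy) in zip(hand_moves, camera_moves)]

# Example: an apparent per-frame hand movement of (+5, 0) px combined with
# a camera movement contribution of (-5, 0) px yields no actual movement.
assert correct_trajectory([(5.0, 0.0)], [(-5.0, 0.0)]) == [(0.0, 0.0)]
```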
  • (Step S107) Next, the HMD device 1 functions as the command determination unit 48 and recognizes the user's gesture from the movement amount of the "hand" calculated in step S103 or corrected in step S106. The HMD device 1 then determines the user's instruction content, that is, the command to the HMD device 1, according to the recognized gesture. Gestures and commands are associated with each other one-to-one and registered in advance in a predetermined table, and the HMD device 1 may read the command associated with the recognized gesture from this table, as sketched below.
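  • A minimal Python sketch of the one-to-one gesture-to-command table; the gesture names and command identifiers are illustrative assumptions based on the examples given for steps S109 and S204.

```python
from typing import Optional

# Illustrative one-to-one gesture-to-command table.
GESTURE_COMMANDS = {
    "pinch_out": "enlarge_display",
    "pinch_in": "reduce_display",
    "slide": "play_content",
    "tap": "pause_content",
    "double_tap": "stop_content",
    "swipe": "turn_page",
}

def lookup_command(gesture: str) -> Optional[str]:
    """Return the registered command for a recognized gesture, or None
    when no command corresponds to the gesture (see step S108)."""
    return GESTURE_COMMANDS.get(gesture)
```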
  • (Step S108) Next, the HMD device 1 functions as the command determination unit 48 and determines whether the command determined in step S107 is a legitimate command. Specifically, the HMD device 1 determines that the command is not legitimate when there is no command corresponding to the gesture recognized in step S107, or when the command instructs control that is restricted at that time. On the other hand, when the command corresponding to the recognized gesture instructs control that can be executed at that time, the HMD device 1 determines that it is a legitimate command.
  • If the HMD device 1 determines that the command is legitimate (step S108: YES), the process proceeds to step S109; if it determines that the command is not legitimate (step S108: NO), the process proceeds to step S110.
  • (Step S109) The HMD device 1 functions as the application unit 50 and executes the processing corresponding to the command determined by the command determination unit 48. For example, if the command determined in step S107 corresponds to a gesture such as pinch-out or pinch-in, the HMD device 1 executes enlargement or reduction processing; if it corresponds to a gesture such as slide, tap, or double tap, the HMD device 1 executes processing such as playing, pausing, or stopping content.
  • (Step S110) The HMD device 1 functions as the notification unit 47 and the display control unit 49 and notifies the user of an error indicating that the processing corresponding to the gesture cannot be executed. For example, the HMD device 1 causes the display device 12 to display a message such as "Please retry the gesture at the same position."
  • (Step S111) The HMD device 1 functions as the control unit 41 and determines whether the user has instructed it to end the gesture recognition process, for example by determining whether the input device 34 has been operated to end the process.
  • If no end instruction has been given (step S111: NO), the HMD device 1 returns the process to step S101 and continues the gesture recognition process. If an end instruction has been given (step S111: YES), the HMD device 1 ends the gesture recognition process.
  • When step S104 is reached in the gesture recognition process described above, the HMD device 1 starts the warning process shown in FIG. 5.
  • (Step S201) The HMD device 1 functions as the shake detection unit 43 and detects whether there is a change in the orientation of the imaging device 11 (for example, the optical axis direction of the camera lens). As in step S105, the shake detection unit 43 takes the difference between the sensor values of the first and last frames among the sensor values acquired in step S102; if the difference is less than a predetermined amount, it judges that there is no change in the orientation of the imaging device 11, and if the difference is equal to or greater than the predetermined amount, it judges that there is a change.
  • If the HMD device 1 determines that there is no change in the orientation of the imaging device 11 (step S201: NO), the process proceeds to step S202. If it determines that there is a change (step S201: YES), the process proceeds to step S206.
  • (Step S202) The HMD device 1 functions as the command determination unit 48 and recognizes the user's gesture from the movement amount of the "hand" calculated in step S103, and then determines the user's instruction content, that is, the command to the HMD device 1, according to the recognized gesture. However, if the movement amount of the "hand" could not be calculated in step S103, the gesture is likely to have included a movement in which the predetermined target (for example, the user's hand) went outside the gesture recognition area. In that case, the HMD device 1 recognizes the user's gesture from the movement amount of the "hand" between the frames captured while the "hand" was still moving within the gesture recognition area, and determines the corresponding command.
  • (Step S203) The HMD device 1 functions as the command determination unit 48 and determines whether the command determined in step S202 is a legitimate command. Specifically, the HMD device 1 determines that the command is not legitimate when there is no command corresponding to the gesture recognized in step S202, or when the command instructs control that is restricted at that time. On the other hand, when the command corresponding to the recognized gesture instructs control that can be executed at that time, the HMD device 1 determines that it is a legitimate command.
  • If the HMD device 1 determines that the command is legitimate (step S203: YES), the process proceeds to step S204; if it determines that the command is not legitimate (step S203: NO), the process proceeds to step S205.
  • (Step S204) The HMD device 1 functions as the application unit 50 and executes the processing corresponding to the command determined by the command determination unit 48. For example, when the command determined in step S202 corresponds to a gesture such as swipe, the HMD device 1 performs the process of turning the displayed page.
  • (Step S205) The HMD device 1 functions as the notification unit 47 and the display control unit 49 and performs notification A, which informs the user that the predetermined target (for example, the user's hand) has moved out of the gesture recognition area even though the imaging device 11 has not moved (the state shown in FIG. 6(A)).
  • For example, when the hand moves out of the right side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating a hand on the right side of the display screen (FIG. 6(B)).
  • Similarly, when the hand moves out of the left side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating a hand on the left side of the display screen (FIG. 7(A)).
  • This allows the user to easily grasp that the hand has moved out of the left side of the gesture recognition area and to intuitively move the gesture position to the right.
  • When the hand moves out of the upper side of the gesture recognition area, the HMD device 1 controls the display device 12 to display an image imitating a hand on the upper side of the display screen (FIG. 7(B)).
  • This allows the user to easily grasp that the hand has moved out of the upper side of the gesture recognition area and to intuitively move the gesture position downward.
  • Likewise, when the hand moves out of the lower, upper-left, upper-right, lower-left, or lower-right side of the gesture recognition area, the HMD device 1 displays an image imitating a hand on the corresponding side of the display screen (FIGS. 7(C), 7(D), 8(A), 8(B), and 8(C)). In this way, the HMD device 1 displays the warning image on the side on which the target left the gesture recognition area, allowing the user to intuitively confirm the direction in which to bring the gesture back; a sketch of this side selection follows.
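  • A minimal Python sketch of the side selection for notification A; the side names are illustrative.

```python
# Notification A: the hand-shaped warning image is drawn on the same side
# of the screen from which the hand left the gesture recognition area.
EXIT_SIDES = ("right", "left", "upper", "lower",
              "upper_left", "upper_right", "lower_left", "lower_right")

def notification_a_image_side(exit_side: str) -> str:
    """Return the display side for the hand image: the exit side itself,
    which prompts the user to bring the gesture back the opposite way."""
    if exit_side not in EXIT_SIDES:
        raise ValueError(f"unknown side: {exit_side}")
    return exit_side

# Example: the hand left through the right edge -> image on the right.
assert notification_a_image_side("right") == "right"
```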
  • The HMD device 1 then functions as the control unit 41 and returns the process to the gesture recognition process.
  • (Step S206) The HMD device 1 functions as the shake detection unit 43 and corrects the locus of movement of the predetermined target, taking into account the amount of change in the orientation of the imaging device 11. That is, the HMD device 1 converts the movement amount of the "hand" between consecutive frames into a movement amount corresponding to how the "hand" actually moved; specifically, it may compose the movement vector of the "hand" in the captured image with the vector of the change in the orientation of the imaging device 11 between consecutive frames.
  • Here, the HMD device 1 converts only the movement amount of the "hand" between the frames captured while the "hand" was moving within the gesture recognition area into the amount by which the "hand" actually moved.
  • (Step S207) Next, the HMD device 1 functions as the command determination unit 48, recognizes the user's gesture from the movement amount of the "hand" corrected in step S206, and determines the user's instruction content, that is, the command to the HMD device 1, according to the recognized gesture.
  • (Step S208) Next, the HMD device 1 functions as the command determination unit 48 and determines whether the command determined in step S207 is a legitimate command. Specifically, the HMD device 1 determines that the command is not legitimate when there is no command corresponding to the gesture recognized in step S207, or when the command instructs control that is restricted at that time. On the other hand, when the command corresponding to the recognized gesture instructs control that can be executed at that time, the HMD device 1 determines that it is a legitimate command.
  • If the HMD device 1 determines that the command is legitimate (step S208: YES), the process proceeds to step S209; if it determines that the command is not legitimate (step S208: NO), the process proceeds to step S210.
  • (Step S209) The HMD device 1 functions as the application unit 50 and executes the processing corresponding to the command determined by the command determination unit 48. For example, if the command determined in step S207 corresponds to a gesture such as swipe, the HMD device 1 performs the process of turning the displayed page.
  • (Step S210) The HMD device 1 functions as the notification unit 47 and the display control unit 49 and performs notification B, which informs the user that the predetermined target (for example, the user's hand) has moved out of the gesture recognition area because the imaging device 11 moved (the state shown in FIG. 9(A)).
  • For example, when the user's head turns to the left, the HMD device 1 controls the display device 12 to display an image imitating an arrow on the right side of the display screen (FIG. 9(B)).
  • Similarly, when the user's head turns to the right, the HMD device 1 controls the display device 12 to display an image imitating an arrow on the left side of the display screen (FIG. 10(A)).
  • This allows the user to easily grasp that the imaging device 11 is facing to the right and to intuitively turn the head back to the left.
  • When the user's head turns downward, the HMD device 1 controls the display device 12 to display an image imitating an arrow on the upper side of the display screen (FIG. 10(B)).
  • This allows the user to easily grasp that the imaging device 11 is facing downward and to understand intuitively that the head need only be turned upward.
  • Likewise, the HMD device 1 controls the display device 12 to display images imitating arrows on the lower, upper-left, upper-right, lower-left, and lower-right sides of the display screen as appropriate (FIGS. 10(C), 10(D), 11(A), 11(B), and 11(C)).
  • As described above, when the target has left the gesture recognition area because the user turned to face a certain direction, an image is displayed on the side opposite to the direction in which the user is facing, so the user can intuitively confirm the direction in which to turn the head; a sketch of this side selection follows.
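  • A minimal Python sketch of the side selection for notification B; the direction names are illustrative.

```python
# Notification B: the arrow image is drawn on the side opposite the
# direction the user's head is now facing, i.e. on the side toward which
# the head should be turned back.
OPPOSITE_SIDE = {
    "left": "right", "right": "left", "up": "down", "down": "up",
    "upper_left": "lower_right", "upper_right": "lower_left",
    "lower_left": "upper_right", "lower_right": "upper_left",
}

def notification_b_arrow_side(head_direction: str) -> str:
    """Return the display side on which to draw the arrow image."""
    return OPPOSITE_SIDE[head_direction]

# Example: the head turned left (FIG. 9(A)) -> arrow on the right (FIG. 9(B)).
assert notification_b_arrow_side("left") == "right"
```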
  • The HMD device 1 then functions as the control unit 41 and returns the process to the gesture recognition process.
  • As described above, when the target cannot be tracked, the cause is identified and the appropriate notification, A or B, is made based on the identified cause.
  • These notifications allow the user to easily determine in which direction to move the head, or the hand, so that the HMD device 1 can correctly recognize the gesture. In this way, the user can intuitively correct the position of the gesture.
  • The processing units in each of the flowcharts described above are divided according to their main processing contents in order to facilitate understanding of the HMD device 1; the invention of the present application is not limited by how the processing steps are divided or by their names.
  • The processing performed in the HMD device 1 can be divided into more processing steps, and one processing step may execute more processes.
  • In the above embodiment, the entire field of view of the imaging device 11 (the entire region of the captured image) is set as the gesture recognition area. However, the present invention is not limited to this; a partial region of the field of view of the imaging device 11 may be set as the gesture recognition area.
  • FIG. 12 is a diagram showing a modification of the gesture recognition area.
  • For example, as shown in FIG. 12, a region on the right side of the field of view of the imaging device 11 may be set as the gesture recognition area.
  • In this case, the gesture recognition area may or may not be displayed on the transparent members G1 and G2.
  • The present invention is applicable to any device that is worn or held by the user and has a function capable of recognizing the user's gestures; it is therefore not limited to the HMD device 1 of the above embodiment.
  • FIG. 13 is a diagram for explaining an example in which the present invention is applied to a mobile terminal.
  • For example, the present invention can also be applied to a portable terminal 60 that is used while held in the user's hand.
  • Such mobile terminals include mobile information terminal devices that have both a display screen capable of displaying various types of information and contents and an imaging device, such as smartphones, tablet PCs, and PDAs (Personal Digital Assistants).
  • When the present invention is applied to the mobile terminal 60, the mobile terminal 60 has the same configuration as in the above embodiment and recognizes the movement of the user's face as a gesture. When the face moves out of the gesture recognition area, the mobile terminal 60 notifies the user in a way that distinguishes whether the user's hand holding the mobile terminal 60 (that is, the imaging device 11) moved or the user's face (the gesture) moved.
  • In addition, the notification unit 47 may output a different warning sound depending on the cause identified by the cause identification unit 46, or may generate a different vibration pattern with a motor or the like depending on the identified cause.
  • In the above embodiment, the HMD device 1 is a see-through image display device. However, the present invention is not limited to this, and the HMD device 1 may be a non-transmissive image display device.
  • The configuration of the HMD device 1 described above covers the main components used to explain the features of the above embodiment and modifications; the HMD device 1 is not limited to this configuration, and configurations of a general HMD device are not excluded.
  • The functional configurations of the HMD device 1 described above are classified according to their main processing contents in order to facilitate understanding; the present invention is not limited by how the constituent elements are classified or by their names.
  • Each functional configuration can be divided into more components according to its processing content, or, conversely, one component can be configured to execute more processes.
  • Each functional configuration of the HMD device 1 described above can also be realized by a dedicated hardware circuit; in this case, it may be executed by a single piece of hardware or by a plurality of pieces of hardware.
  • The program for operating the HMD device 1 may be provided on a computer-readable recording medium such as a USB memory, a flexible disk, or a CD-ROM, or may be provided online via a network such as the Internet.
  • A program recorded on a computer-readable recording medium is usually transferred to and stored in the memory 32, the storage 33, or the like.
  • The program may be provided as stand-alone application software, or it may be incorporated into the device's software as one function of the HMD device 1.
  • Reference numerals: 1 HMD device; 11 imaging device; 12 display device; 13 inertial sensor; 31 CPU; 32 memory; 33 storage; 34 input device; 35 display controller; 41 control unit; 42 imaging control unit; 43 shake detection unit; 44 tracking unit; 45 tracking availability determination unit; 46 cause identification unit; 47 notification unit; 48 command determination unit; 49 display control unit; 50 application unit; 60 mobile terminal; U control unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Image Analysis (AREA)

Abstract

The problem addressed by the present invention is to provide a gesture recognition device, a head-mounted display, and a portable terminal that guide a user to correct the position of a gesture intuitively. The solution according to the invention is a gesture recognition device (1) that is used while worn or held by a user, the gesture recognition device comprising: an imaging unit (42) that continuously captures images; a control unit (43-46) that detects a change in the orientation of the imaging unit (42), tracks a prescribed target using the images captured by the imaging unit (42), determines whether the tracking operation is possible, calculates an amount of change for the detected orientation when the determination result is that tracking of the target is not possible, and identifies the cause of the tracking failure; and a notification unit (47) that provides notification based on the cause that has been identified.
PCT/JP2015/050539 2014-03-12 2015-01-09 Gesture recognition device, head-mounted display, and portable terminal WO2015136952A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-049096 2014-03-12
JP2014049096A JP2017083916A (ja) 2014-03-12 Gesture recognition device, head mounted display, and portable terminal

Publications (1)

Publication Number Publication Date
WO2015136952A1 (fr) 2015-09-17

Family

ID=54071406

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/050539 WO2015136952A1 (fr) 2014-03-12 2015-01-09 Gesture recognition device, head-mounted display, and portable terminal

Country Status (2)

Country Link
JP (1) JP2017083916A (fr)
WO (1) WO2015136952A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10860088B2 (en) * 2018-05-03 2020-12-08 Microsoft Technology Licensing, Llc Method and system for initiating application and system modal control based on hand locations
JP2022087989A (ja) 2020-12-02 2022-06-14 株式会社Jvcケンウッド Video display device, control method for video display device, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010098354A (ja) * 2008-10-14 2010-04-30 Panasonic Corp Imaging device
JP2012010162A (ja) * 2010-06-25 2012-01-12 Kyocera Corp Camera device
JP2012146236A (ja) * 2011-01-14 2012-08-02 Olympus Corp Gesture input device
JP2013065112A (ja) * 2011-09-15 2013-04-11 Omron Corp Gesture recognition device, electronic apparatus, gesture recognition device control method, control program, and recording medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017098775A1 (fr) * 2015-12-11 2017-06-15 ソニー株式会社 Information processing device, information processing method, and program
US11087775B2 (en) 2015-12-11 2021-08-10 Sony Corporation Device and method of noise suppression based on noise source positions
JP2017220032A (ja) * 2016-06-07 2017-12-14 株式会社ソニー・インタラクティブエンタテインメント Information processing device, information processing method, and computer program
CN112912824A (zh) * 2018-10-24 2021-06-04 Smart terminal connected to a head-mounted display and control method therefor

Also Published As

Publication number Publication date
JP2017083916A (ja) 2017-05-18

Similar Documents

Publication Publication Date Title
US10417496B2 (en) Visibility enhancement devices, systems, and methods
EP2634727B1 (fr) Terminal portable et procédé permettant de corriger la direction du regard d'un utilisateur dans une image
KR20220088660A (ko) 디바이스를 이용한 화면 처리 방법 및 장치
US10477090B2 (en) Wearable device, control method and non-transitory storage medium
EP3062286B1 (fr) Compensation de distorsion optique
CN106066537B (zh) 头戴式显示器和头戴式显示器的控制方法
EP3144775B1 (fr) Système de traitement d'informations et procédé de traitement d'informations
WO2015136952A1 (fr) 2015-09-17 Gesture recognition device, head-mounted display, and portable terminal
US20150381885A1 (en) Glass-type terminal and method for controlling the same
CN110546601B (zh) 信息处理装置、信息处理方法和程序
JP6341755B2 (ja) 情報処理装置、方法及びプログラム並びに記録媒体
CN104115100A (zh) 头戴式显示器、用于控制头戴式显示器的程序及控制头戴式显示器的方法
US8996333B2 (en) Information processing apparatus which executes specific processing based on a specific condition and a detected specific vibration, and method for controlling the same
JP2015114818A (ja) 情報処理装置、情報処理方法及びプログラム
US10389947B2 (en) Omnidirectional camera display image changing system, omnidirectional camera display image changing method, and program
US9148537B1 (en) Facial cues as commands
US20180004288A1 (en) Electronic device
CN106371552B (zh) 一种在移动终端进行媒体展示的控制方法及装置
WO2016157951A1 (fr) Dispositif de commande d'affichage, procédé de commande d'affichage et support d'enregistrement
JP6686319B2 (ja) 画像投影装置及び画像表示システム
JP6155893B2 (ja) 画像処理装置、及びプログラム
WO2017168622A1 (fr) Système de partage d'image capturée, procédé de partage d'image capturée, et programme
JP6079418B2 (ja) 入力装置および入力プログラム
US11995899B2 (en) Pointer-based content recognition using a head-mounted device
US11733789B1 (en) Selectively activating a handheld device to control a user interface displayed by a wearable device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15761429

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15761429

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP