WO2011148607A1 - Gesture recognition device and gesture recognition method - Google Patents
Gesture recognition device and gesture recognition method
- Publication number
- WO2011148607A1 (PCT/JP2011/002847)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- screen
- image
- gesture recognition
- camera
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6045—Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
Definitions
- the present invention relates to a gesture recognition apparatus that recognizes the movement of the whole or part of a user's body as a gesture operation and controls a display device.
- A conventional gesture recognition device recognizes a user's movement (hereinafter referred to as a "gesture operation" as appropriate) captured by a camera through contour extraction, object extraction, shape-change identification, and position detection, and controls the device based on the recognition result (see, for example, Patent Document 1).
- FIG. 9 is a block diagram showing the functional configuration of the conventional gesture recognition device described in Patent Document 1.
- a screen 401 displays a graphical user interface (GUI: Graphical User Interface) reflecting various information or gesture operation results.
- the camera 402 images a user's gesture operation.
- The frame memory 411 sequentially stores the images of the user captured by the camera 402.
- the motion recognition unit 412 periodically reads an image from the frame memory 411 and recognizes a gesture operation by performing two-dimensional contour extraction, object extraction, shape change identification, and position detection on the read image.
- the display control unit 413 and the icon generation unit 414 create a GUI reflecting the recognition result of the gesture operation by the motion recognition unit 412 and display it on the screen 401.
- However, the device of Patent Document 1 has a problem in that operability is lowered depending on the position at which the user performs the gesture operation.
- For example, when the user is at a position far from the front of the screen, the user views the screen from an oblique angle and may therefore feel uncomfortable performing a gesture operation in a specific direction (for example, the horizontal direction). As a result, operability for gesture operations in that specific direction is lowered.
- The present invention solves the above-described conventional problems, and aims to provide a gesture recognition device capable of realizing high operability regardless of the position of the user when a display device is operated by gesture operations.
- A gesture recognition device according to one aspect of the present invention controls a display device based on a gesture operation performed by a user located in the vicinity of a screen. The device includes: an image acquisition unit that acquires an image capturing the vicinity of the screen; a direction determination unit that determines an operation direction, that is, the direction of movement to be recognized as a gesture operation, based on position information indicating the direction of the user as viewed from the screen or the camera; a gesture recognition unit that, using the image acquired by the image acquisition unit, recognizes as a gesture operation a movement of part or all of the user's body in the operation direction determined by the direction determination unit; and a display control unit that controls the display device based on the gesture operation recognized by the gesture recognition unit.
- Preferably, the image is captured by a camera, and the gesture recognition device further includes a position acquisition unit that calculates, and acquires as the position information, the degree of deviation of the direction of the user as viewed from the screen or the camera with respect to the normal direction of the screen or the optical axis direction of the camera; the direction determination unit then determines either the horizontal direction or the vertical direction as the operation direction based on whether the degree of deviation exceeds a threshold value.
- With this configuration, the horizontal or vertical direction can be determined as the operation direction based on whether the degree of deviation exceeds a threshold value, so an appropriate operation direction can be determined even for a user away from the front of the screen or the camera, and operability can be improved.
- Preferably, the degree of deviation includes a horizontal deviation degree indicating the degree of deviation on a horizontal plane, and the direction determination unit determines the vertical direction as the operation direction when the horizontal deviation degree exceeds a threshold value.
- With this configuration, when the horizontal deviation degree exceeds the threshold value, the vertical direction is determined as the operation direction, so a user viewing the screen obliquely from a horizontally distant position can perform gesture operations without discomfort. Furthermore, for a user horizontally away from the front of the camera, the vertical direction can be determined as the operation direction instead of the horizontal direction, in which gesture recognition accuracy is reduced, and operability can thus be improved.
- Preferably, the degree of deviation includes a vertical deviation degree indicating the degree of deviation on a vertical plane orthogonal to the horizontal plane and to the screen or the imaging surface of the camera, and the direction determination unit determines the horizontal direction as the operation direction when the vertical deviation degree exceeds a threshold value.
- With this configuration, when the vertical deviation degree exceeds the threshold value, the horizontal direction is determined as the operation direction, so a user viewing the screen obliquely from a vertically distant position can perform gesture operations without discomfort. Further, for a user vertically away from the front of the camera, the horizontal direction can be determined as the operation direction instead of the vertical direction, in which gesture recognition accuracy is reduced, and operability can thus be improved.
- the direction determination unit determines the threshold according to the recognition accuracy of the gesture operation.
- This configuration makes it possible to improve user operability. For example, when operability is higher in the horizontal direction than in the vertical direction, the threshold value can be set so that the horizontal direction is determined as the operation direction whenever it can still be recognized.
- Preferably, the camera includes an optical system with a fisheye lens, and the position acquisition unit identifies the user's image within the image acquired by the image acquisition unit and acquires the position information based on the distance between the position of the identified user image and the center position of the image.
- This configuration makes it possible to share an image for recognizing a gesture operation and an image for acquiring a user's direction viewed from the camera or screen. That is, since it is not necessary to newly install a camera or a position sensor in order to acquire position information, the configuration can be simplified.
- the display control unit moves the object displayed on the screen in the operation direction when a gesture operation is recognized by the gesture recognition unit.
- This configuration allows the operation direction of the gesture operation and the movement direction of the object displayed on the screen to match, further improving the operability for the user.
- the display control unit scrolls the plurality of objects in the operation direction when the plurality of objects are scroll-displayed on the screen.
- scroll display refers to sliding and displaying objects that do not fit on the screen.
- An object is a display element displayed on the screen or a set thereof.
- the display control unit displays the position information on the display device.
- With this configuration, since the position information is displayed, the user can anticipate the timing at which the operation direction switches, which reduces the sense of abruptness when the operation direction changes.
- A display device according to another aspect of the present invention includes the above gesture recognition device.
- A gesture recognition method according to one aspect of the present invention controls a display device based on a gesture operation performed by a user located in the vicinity of a screen, and includes: an image acquisition step of acquiring an image capturing the vicinity of the screen; a direction determination step of determining an operation direction, that is, the direction of movement to be recognized as a gesture operation, based on position information indicating the direction of the user as viewed from the screen or the imaging position of the image; a gesture recognition step of recognizing, as a gesture operation, a movement of part or all of the user's body in the determined operation direction using the acquired image; and a display control step of controlling the display device based on the gesture operation recognized in the gesture recognition step.
- According to the present invention, since the operation direction is determined based on the direction of the user as viewed from the screen or the camera, high operability can be realized regardless of the position of the user.
- FIG. 1 is an external view showing a configuration of a gesture recognition system according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing a functional configuration of the gesture recognition device according to the embodiment of the present invention.
- FIG. 3 is a flowchart showing the operation of the gesture recognition apparatus according to the embodiment of the present invention.
- FIG. 4 is a diagram for explaining the operation of the gesture recognition device according to the embodiment of the present invention.
- FIG. 5A is a diagram for explaining a GUI displayed on the screen according to the embodiment of the present invention.
- FIG. 5B is a diagram for explaining a GUI displayed on the screen according to the embodiment of the present invention.
- FIG. 6 is a diagram for explaining another example of the direction determination process in the embodiment of the present invention.
- FIG. 7 is a diagram illustrating the relationship between the position of the user who performs the gesture operation and the camera recognition axis.
- FIG. 8 is a diagram showing a display example of position information in the second modification of the embodiment of the present invention.
- FIG. 9 is a block diagram illustrating a functional configuration of a conventional gesture recognition device.
- FIG. 1 is an external view showing a configuration of a gesture recognition system according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing a functional configuration of the gesture recognition apparatus according to the embodiment of the present invention.
- the gesture recognition system 100 includes a display device 101, a camera 102, and a gesture recognition device 200.
- the display device 101 is, for example, a plasma display or a liquid crystal display, and displays an image on the screen 101a. That is, the display device 101 is a device that displays a GUI, a picture, a video, or the like.
- the camera 102 includes an optical system and an image sensor, and is installed on the screen 101a.
- the camera 102 is installed on the horizontal plane so that the optical axis direction of the optical system and the normal direction of the screen 101a coincide.
- the camera 102 includes an optical system including a fisheye lens.
- the camera 102 images the user 300 located in the vicinity of the screen 101a using a fisheye lens.
- In the captured image, the distance from the image center is proportional to the angle between the direction of the subject as viewed from the camera 102 and the optical axis direction.
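- As a rough illustration of this property, the viewing angle can be recovered from the pixel distance under the equidistant fisheye model; the function name and the constant `k` below are illustrative, not taken from the patent:

```python
import math

def angle_from_optical_axis(px, py, cx, cy, k):
    """Equidistant fisheye model r = k * theta: the pixel distance r from
    the image center (cx, cy) is proportional to the angle theta between
    the viewing direction and the optical axis. k is a proportionality
    constant determined by the lens (pixels per radian).
    Returns theta in radians."""
    r = math.hypot(px - cx, py - cy)
    return r / k
```

- For example, with k = 200 px/rad, a subject imaged 100 px from the center lies 0.5 rad (about 28.6 degrees) off the optical axis.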
- the gesture recognition device 200 controls the display device 101 based on the gesture operation of the user 300 located in the vicinity of the screen 101a.
- the gesture recognition apparatus 200 includes an image acquisition unit 201, a frame memory 202, a position acquisition unit 203, a direction determination unit 204, a gesture recognition unit 205, and a display control unit 206.
- the image acquisition unit 201 acquires image information indicating an image generated by the camera 102. Specifically, for example, the image acquisition unit 201 acquires image information from the camera 102 every time an image is periodically generated by the camera 102. Then, the image acquisition unit 201 stores the acquired image information in the frame memory 202.
- the frame memory 202 holds the image information periodically acquired by the image acquisition unit 201 until the next image information is acquired.
- the position acquisition unit 203 acquires position information indicating the direction of the user 300 as viewed from the screen 101a or the camera 102.
- the position acquisition unit 203 acquires position information as follows.
- the position acquisition unit 203 identifies the image of the user 300 included in the image acquired by the image acquisition unit 201. Specifically, the position acquisition unit 203 identifies the image of the user 300 by, for example, comparing with a template image held in advance. Then, the position acquisition unit 203 acquires position information based on the distance between the identified position of the image of the user 300 and the center position of the image.
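- A minimal sketch of such template-based identification, assuming a naive sum-of-absolute-differences search (a real system would use a more robust matcher and a library routine):

```python
import numpy as np

def locate_user(image, template):
    """Naive template matching: slide the template over a grayscale image
    and return the top-left (x, y) position with the smallest sum of
    absolute differences (SAD)."""
    ih, iw = image.shape
    th, tw = template.shape
    best_sad, best_pos = float("inf"), (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            sad = np.abs(image[y:y + th, x:x + tw] - template).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (x, y)
    return best_pos
```

- The returned position can then be compared with the image center to obtain the distance used for the position information.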
- the direction of the user 300 viewed from the screen 101a is a direction of a straight line connecting the representative position of the screen 101a and the position where the user 300 is located.
- the representative position of the screen 101a is the center position or the center of gravity position of the screen 101a.
- the direction of the user 300 viewed from the camera 102 is a direction of a straight line connecting the representative position of the camera 102 and the position where the user 300 is located.
- the representative position of the camera 102 is the position of the optical center of the camera 102 or the position of the imaging center.
- Here, the degree of deviation, which indicates how far the direction of the user 300 as viewed from the screen 101a or the camera 102 deviates from the front of the screen 101a or the camera 102, corresponds to the position information. That is, the degree of deviation indicates how far the direction of the user 300 as viewed from the screen 101a or the camera 102 deviates from the normal direction of the screen 101a or the optical axis direction of the camera 102.
- the direction determination unit 204 determines an operation direction that is a direction of movement to be recognized as a gesture operation, based on the position information acquired by the position acquisition unit 203. Specifically, the direction determination unit 204 determines the horizontal direction or the vertical direction as the operation direction based on whether or not the degree of deviation exceeds a threshold value.
- the horizontal direction is the horizontal direction of the screen 101a.
- the vertical direction is the vertical direction of the screen 101a.
- Specifically, the direction determination unit 204 determines the vertical direction as the operation direction when the horizontal deviation degree exceeds a threshold value, and the horizontal direction as the operation direction when it does not.
- Here, the horizontal deviation degree is one of the deviation degrees, and indicates, on the horizontal plane, the degree of deviation of the direction of the user as viewed from the screen 101a or the camera 102 with respect to the front of the screen 101a or the camera 102. That is, it indicates the degree of deviation, on the horizontal plane, of that direction with respect to the normal direction of the screen 101a or the optical axis direction of the camera 102.
- For example, the horizontal deviation degree is the angle formed on the horizontal plane between the direction of the user as viewed from the screen 101a or the camera 102 and the normal direction of the screen 101a or the optical axis direction of the camera 102.
- The gesture recognition unit 205 uses the image information acquired by the image acquisition unit 201 to recognize, as a gesture operation, a movement of part or all of the body of the user 300 in the operation direction determined by the direction determination unit 204. For example, the gesture recognition unit 205 recognizes a movement of the palm of the user 300 in the operation direction as a gesture operation.
- the gesture recognition unit 205 includes a recognition target identification unit 205a and an operation amount measurement unit 205b.
- The recognition target identification unit 205a reads the image information held in the frame memory 202. Then, the recognition target identification unit 205a performs contour extraction, object extraction, position detection, and the like on the image indicated by the read image information, thereby identifying the part to be recognized as performing the gesture operation (the whole body, a part of the body, or an object held in the hand; hereinafter simply referred to as the "gesture recognition part") and its position.
- The operation amount measuring unit 205b acquires the position of the gesture recognition part from the recognition target identification unit 205a. Then, the operation amount measuring unit 205b measures the amount of the gesture operation in the operation direction determined by the direction determination unit 204 from the difference from the position of the gesture recognition part recognized in the immediately preceding image. Further, the operation amount measuring unit 205b calculates the scroll amount to be reflected in the GUI from the gesture operation amount.
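- The measurement described above can be sketched as a frame-to-frame difference along the active axis; the linear gain mapping gesture displacement to scroll distance is a hypothetical parameter, not specified in the patent:

```python
def operation_amount(prev_pos, cur_pos, operation_direction):
    """Displacement of the gesture recognition part along the active axis
    between the previous frame and the current frame."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    return dx if operation_direction == "horizontal" else dy

def scroll_amount(amount, gain=1.5):
    # Hypothetical linear mapping from gesture displacement to GUI scroll.
    return amount * gain
```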
- the display control unit 206 controls the display device 101 based on the gesture operation recognized by the gesture recognition unit 205.
- the display control unit 206 acquires the scroll amount from the operation amount measurement unit 205b. Then, the display control unit 206 creates GUI information by combining the information from the running application and other information. Further, the display control unit 206 transmits the created GUI information to the display device 101 as a control signal.
- FIG. 3 is a flowchart showing the operation of the gesture recognition apparatus in the embodiment of the present invention.
- FIG. 4 is a diagram for explaining the operation of the gesture recognition apparatus according to the embodiment of the present invention.
- FIGS. 5A and 5B are diagrams for explaining the GUI displayed on the screen according to the embodiment of the present invention.
- the image acquisition unit 201 acquires image information from the camera 102 (S301).
- the acquired image information is stored in the frame memory 202.
- the position acquisition unit 203 reads the image information from the frame memory 202, and identifies the image of the user 300 from the image indicated by the read image information (S302).
- the position acquisition unit 203 measures the angle from the front of the camera 102 on the horizontal plane as the lateral deviation degree based on the position of the identified image of the user 300 on the image (S303).
- This measurement method utilizes the fact that the image captured by the camera 102 is an image captured using a fisheye lens (hereinafter referred to as “fisheye image”).
- In a fisheye image, the distance between the center point of the image and the measurement target is proportional to the angle from the front of the camera 102. Therefore, the position acquisition unit 203 can measure the angle from the front of the camera 102 using a proportionality constant calculated from the characteristics of the fisheye lens.
- Next, the direction determination unit 204 determines whether or not the obtained angle exceeds a preset threshold value (S304). That is, as illustrated in FIG. 4, the direction determination unit 204 determines whether or not the angle measured by the position acquisition unit 203 exceeds the angle θth (threshold) from the front of the camera 102.
- When the angle does not exceed the threshold value (No in S304), the direction determination unit 204 determines the operation mode to be the horizontal mode (S305).
- the horizontal mode indicates a horizontal operation direction.
- the horizontal mode is a mode in which a scroll application (such as a photo browsing application) displayed on the screen is operated by a gesture operation in the horizontal direction.
- the GUI displayed on the screen 101a in accordance with the operation direction of the gesture operation is also a GUI based on the horizontal operation.
- That is, the direction determination unit 204 determines the horizontal direction as the operation direction when the horizontal deviation degree does not exceed the threshold value. For example, for the user 301 shown in FIG. 4, the direction determination unit 204 determines the horizontal direction as the operation direction because the angle θ1, the horizontal deviation degree of the user 301, does not exceed the angle θth (threshold).
- the operation amount measuring unit 205b measures the operation amount of the gesture operation in the horizontal direction (S306). Further, the operation amount measuring unit 205b calculates a horizontal scroll amount from the obtained operation amount (S307).
- When the angle exceeds the threshold value (Yes in S304), the direction determination unit 204 determines the operation mode to be the vertical mode (S308).
- the vertical mode indicates a vertical operation direction.
- the vertical mode is a mode in which a scroll application (such as a photo browsing application) displayed on the screen is operated by a vertical gesture operation.
- the GUI displayed on the screen 101a in accordance with the operation direction of the gesture operation is also a GUI based on the vertical operation.
- That is, the direction determination unit 204 determines the vertical direction as the operation direction when the horizontal deviation degree exceeds the threshold value. For example, for the user 302 shown in FIG. 4, the direction determination unit 204 determines the vertical direction as the operation direction because the angle θ2, the horizontal deviation degree of the user 302, exceeds the angle θth (threshold).
- the operation amount measuring unit 205b measures the operation amount of the gesture operation in the vertical direction (S309). Further, the operation amount measuring unit 205b calculates the scroll amount in the vertical direction from the obtained operation amount (S310).
- Finally, the display control unit 206 generates a GUI reflecting the horizontal/vertical mode determined in step S305 or step S308 and the scroll amount calculated in step S307 or step S310, and causes the display device 101 to display it (S311).
- In the horizontal mode, the display control unit 206 causes the display device 101 to display a GUI for scrolling the plurality of objects 103 in the horizontal direction, as illustrated in FIG. 5A. Further, the display control unit 206 moves the knob 104 of the horizontal scroll bar by the calculated scroll amount, and displays the objects 103 corresponding to the scroll position of the knob 104 on the screen 101a. That is, the display control unit 206 scrolls the plurality of objects 103 in the horizontal direction according to the scroll amount.
- In the vertical mode, the display control unit 206 causes the display device 101 to display a GUI for scrolling the plurality of objects 103 in the vertical direction, as illustrated in FIG. 5B. Further, the display control unit 206 moves the knob 104 of the vertical scroll bar by the calculated scroll amount, and displays the objects 103 corresponding to the scroll position of the knob 104 on the screen 101a. That is, the display control unit 206 scrolls the plurality of objects 103 in the vertical direction according to the scroll amount.
- As described above, the gesture recognition apparatus 200 repeats the above processing as one cycle in synchronization with the frame rate of the camera 102, and can thereby determine the operation direction based on the direction of the user as viewed from the screen 101a or the camera 102. Therefore, the gesture recognition device 200 can realize high operability regardless of the position of the user.
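- The steps S303 to S310 of one such cycle can be sketched as follows, with image capture and user detection abstracted into the pixel positions passed in; all names and the assumed scroll gain of 1.0 are illustrative:

```python
def process_cycle(user_px, center_px, k_deg_per_px, threshold_deg,
                  prev_pos, cur_pos):
    """One cycle of the flow in FIG. 3, given the user's horizontal pixel
    offset in the fisheye image and the gesture part's positions in the
    previous and current frames."""
    # S303: horizontal deviation angle via the fisheye proportionality
    angle = abs(user_px - center_px) * k_deg_per_px
    # S304: compare the angle against the threshold
    mode = "vertical" if angle > threshold_deg else "horizontal"
    # S306/S309: gesture displacement along the active axis
    if mode == "horizontal":
        delta = cur_pos[0] - prev_pos[0]
    else:
        delta = cur_pos[1] - prev_pos[1]
    # S307/S310: map the displacement to a scroll amount (gain 1.0 assumed)
    return mode, delta
```

- A user near the optical axis thus scrolls horizontally, while the same hand movement by a far-off-axis user is interpreted along the vertical axis.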
- Since the gesture recognition device 200 can determine the horizontal or vertical direction as the operation direction based on whether the degree of deviation exceeds a threshold value, it can determine an appropriate operation direction for a user away from the front of the screen 101a or the camera 102, and operability can be improved.
- Since the gesture recognition apparatus 200 determines the vertical direction as the operation direction when the horizontal deviation degree exceeds the threshold value, a user viewing the screen obliquely from a horizontally distant position can perform gesture operations without discomfort. Furthermore, for a user horizontally away from the front of the camera 102, the vertical direction can be determined as the operation direction instead of the horizontal direction, in which gesture recognition accuracy is reduced, improving operability.
- In addition, by acquiring images captured with a fisheye lens, the gesture recognition device 200 can use a single image both for recognizing the gesture operation and for acquiring the direction of the user as viewed from the camera or the screen. That is, since the gesture recognition system 100 does not need an additional camera or position sensor to acquire position information, the configuration can be simplified.
- Since the gesture recognition device 200 matches the operation direction of the gesture operation with the scroll direction, operability when scrolling by gesture operation can be improved.
- In the above embodiment, the direction determination unit 204 determines the operation direction based on the horizontal deviation degree, but it may instead determine the operation direction based on the vertical deviation degree. That is, the direction determination unit 204 may determine the horizontal direction as the operation direction when the vertical deviation degree exceeds a threshold value.
- With this configuration, since the gesture recognition apparatus 200 can determine the horizontal direction as the operation direction when the vertical deviation degree exceeds the threshold value, a user viewing the screen 101a obliquely from a vertically distant position can perform gesture operations without discomfort.
- Further, for a user vertically away from the front of the camera 102, the gesture recognition device 200 can determine the horizontal direction as the operation direction instead of the vertical direction, in which gesture recognition accuracy is reduced, improving operability.
- Note that the vertical deviation degree is one of the deviation degrees, and indicates, on a vertical plane orthogonal to the horizontal plane and to the screen 101a or the imaging surface of the camera 102, the degree of deviation of the direction of the user as viewed from the screen 101a or the camera 102 with respect to the front of the screen 101a or the camera 102. That is, the vertical deviation degree indicates the degree of deviation, on that vertical plane, of the direction of the user as viewed from the screen 101a or the camera 102 with respect to the normal direction of the screen 101a or the optical axis direction of the camera 102.
- For example, the vertical deviation degree is the angle formed on the vertical plane between the direction of the user as viewed from the screen 101a or the camera 102 and the normal direction of the screen 101a or the optical axis direction of the camera 102.
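As a minimal sketch, the angle described above could be computed from the user's position expressed in a coordinate frame aligned with the screen normal. The function name and coordinate convention below are assumptions for illustration, not part of the patent text:

```python
import math

def vertical_deviation_deg(user_pos):
    """Angle, on the vertical plane, between the user's direction and the
    screen normal (or camera optical axis).

    user_pos = (x, y, z): x is the horizontal offset, y the vertical offset,
    z the distance along the screen normal, all in the same length unit.
    The horizontal component x is ignored because the deviation is measured
    on the vertical plane only.
    """
    _, y, z = user_pos
    return math.degrees(math.atan2(abs(y), z))
```

A user standing 1 m above the optical axis at 1 m distance would thus have a vertical deviation of 45 degrees.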
- the direction determining unit 204 may determine the operation direction based on both the horizontal direction deviation degree and the vertical direction deviation degree.
- FIG. 6 is a diagram for explaining another example of the direction determination process in the embodiment of the present invention.
- The direction determination unit 204 determines the vertical direction as the operation direction when the horizontal deviation degree exceeds the threshold value, and determines the horizontal direction as the operation direction when the vertical deviation degree exceeds the threshold value. Note that when neither the horizontal deviation degree nor the vertical deviation degree exceeds the threshold value, the direction determination unit 204 may determine either the horizontal or the vertical direction as the operation direction. In such a case, the direction determination unit 204 may determine the operation direction based on, for example, the shape of the screen 101a: when the screen 101a is wider than it is tall, the horizontal direction may be determined as the operation direction.
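The decision rule described above can be sketched as follows. This is a hypothetical illustration: the function name, the threshold units, and the screen-shape tiebreak encoding are assumptions; the oblique case reflects the variation mentioned later in which both deviation degrees exceed the threshold:

```python
def determine_operation_direction(h_dev, v_dev, threshold, screen_w, screen_h):
    """Pick the movement direction to recognize as a gesture operation.

    h_dev / v_dev: horizontal / vertical deviation degrees of the user's
    direction from the screen normal (e.g. angles in degrees).
    screen_w / screen_h: screen dimensions, used as a tiebreak when
    neither deviation degree exceeds the threshold.
    """
    if h_dev > threshold and v_dev > threshold:
        return "oblique"      # both exceeded: an oblique direction may be used
    if h_dev > threshold:
        return "vertical"     # user far off to the side: vertical gestures
    if v_dev > threshold:
        return "horizontal"   # user far above/below the axis: horizontal gestures
    # Neither exceeded: decide from the screen shape.
    return "horizontal" if screen_w >= screen_h else "vertical"
```

For a user in front of a landscape screen with both deviations small, this sketch falls back to the horizontal direction.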
- The camera 102 in this modification is a three-dimensional camera, which enables recognition of a wider variety of gesture operations and differs in this respect from the camera 102 in the above embodiment.
- A three-dimensional camera can acquire depth information in addition to the two-dimensional image captured by an ordinary camera. There are various methods for acquiring depth information, and the three-dimensional camera in this modification may use any of them. For example, the camera emits infrared light from a built-in light-emitting diode, calculates the distance in the depth direction from the round-trip time or phase shift of the infrared light reflected back from the imaged object, and expresses the calculated distance as the shading of the image.
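The round-trip-time principle mentioned here can be illustrated with a small sketch. The patent does not specify a formula; the function names and the shading mapping below are assumptions, and only the standard time-of-flight relation (one-way distance = c x t / 2) is used:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def depth_from_round_trip(t_seconds):
    """Distance to the imaged object from the round-trip time of the
    emitted infrared light: the light travels out and back, so the
    one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

def depth_to_gray(distance_m, max_range_m=5.0):
    """Express a distance as image shading: nearer objects brighter (255),
    objects at or beyond max_range_m dark (0)."""
    clipped = min(max(distance_m, 0.0), max_range_m)
    return round(255 * (1.0 - clipped / max_range_m))
```

A round trip of 20 ns thus corresponds to an object roughly 3 m away.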
- When a three-dimensional camera is used, the gesture recognition unit 205 needs to recognize gesture operations in three-dimensional space. However, when a movement in a direction parallel to the screen is to be detected as a gesture operation, there is no detection method that combines a small processing load with high recognition accuracy of the operation amount.
- FIG. 7 is a diagram illustrating the relationship between the position of the user who performs the gesture operation and the camera recognition axis, and is a diagram of the screen, the camera, and the user as viewed from above.
- The directions in which the gesture recognition device 200 measures the distance from the camera 102 to the users 303 and 304 are each set as the z-axis. Therefore, for the user 304, who is located off the front of the camera 102, the x-axis orthogonal to the z-axis is set in a direction intersecting the screen 101a, as shown in FIG. 7.
- In this case, the gesture recognition apparatus 200 must use all three axes (x-axis, y-axis, z-axis) and handle a three-dimensional image, so the processing load increases compared with processing a two-dimensional image.
- However, as in the above embodiment, the gesture recognition device 200 determines the vertical direction as the operation direction when the horizontal deviation degree of the user 304 exceeds the threshold value, so a decrease in recognition accuracy can be suppressed. Furthermore, when the vertical direction is determined as the operation direction, the gesture recognition apparatus 200 needs to recognize the gesture operation only in the y-axis direction, and can therefore suppress an increase in processing load.
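The load reduction described above can be sketched as follows: once the vertical direction is chosen as the operation direction, only the y component of the tracked hand positions needs to be examined. The function name and the track representation are hypothetical, not from the patent:

```python
def scroll_amount_1d(track, axis=1):
    """Measure the operation amount of a gesture along a single axis.

    track: a list of 3-D hand positions (x, y, z), one per frame.
    axis: 1 selects the y-axis (vertical), so the other two coordinates
    are ignored and no full 3-D processing is needed.
    Returns the signed displacement from the first to the last frame.
    """
    if len(track) < 2:
        return 0.0
    return track[-1][axis] - track[0][axis]
```

An upward hand movement then yields a positive scroll amount directly, regardless of the x- and z-axis drift of the user's hand.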
- (Modification 2) Next, a second modification of the above embodiment will be described.
- the display control unit 206 in the present modification causes the display device 101 to display position information.
- FIG. 8 is a diagram showing a display example of position information in the second modification of the embodiment of the present invention.
- the position information 105 indicates the direction of the user as viewed from the screen 101a or the camera 102.
- the non-hatched area indicates that the operation direction is the horizontal direction.
- a hatched area indicates that the operation direction is the vertical direction.
- In FIG. 8, the user has moved from a position where the operation direction is the horizontal direction (FIG. 8A) to a position where the operation direction is the vertical direction (FIG. 8B).
- As a result, the user can predict the timing at which the operation direction switches, so a feeling of abruptness at a change in the operation direction can be suppressed.
- In addition, the display control unit 206 preferably performs control so that the GUI switches gradually by animation. This also makes it possible to suppress a feeling of abruptness in the GUI switching.
- Although the gesture recognition apparatus 200 according to one aspect of the present invention has been described based on the embodiment and its modifications, the present invention is not limited to them. Unless departing from the spirit of the present invention, forms obtained by applying various modifications conceived by those skilled in the art to the embodiment, and forms constructed by combining constituent elements of the embodiment and the modifications, are also included within the scope of the present invention.
- In the above embodiment, the camera 102 includes a fisheye lens, but it does not necessarily have to. Even when the camera 102 does not include a fisheye lens, the position acquisition unit 203 can acquire the direction of the user as viewed from the camera 102 based on the position of the user's image within the captured image. For example, the position acquisition unit 203 acquires the horizontal component of the distance of the user's image from the center of the image as the horizontal deviation degree.
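This per-image measure can be sketched in a few lines. The normalization to [-1, 1] and the function name are assumptions made here for illustration; the patent only requires the horizontal offset from the image center:

```python
def lateral_divergence(user_px_x, image_width):
    """Signed horizontal offset of the user's image from the image center,
    normalized to [-1, 1]. Its magnitude can serve as the horizontal
    deviation degree; the sign indicates left/right of center."""
    half = image_width / 2.0
    return (user_px_x - half) / half
```

A user detected at pixel column 960 in a 1280-pixel-wide image would thus have a normalized divergence of 0.5.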
- the position acquisition unit 203 does not necessarily need to acquire position information from an image captured by the camera 102.
- the position acquisition unit 203 may acquire position information from a position sensor attached to the user, a seat pressure sensor installed on the floor, or a camera installed on the ceiling.
- the camera 102 is installed so that the optical axis direction of the optical system included in the camera 102 and the normal direction of the screen 101a coincide on the horizontal plane.
- the camera 102 is not necessarily installed in this way.
- Although the camera 102 is attached to the screen 101a in the above embodiment, the installation place is not limited to this; an installation form independent of the screen 101a is also conceivable. Even if the optical axis direction of the optical system of the camera 102 does not coincide with the normal direction of the screen 101a, the gesture recognition device 200 can improve operability for the user by determining the operation direction based on the direction of the user as viewed from either the screen 101a or the camera 102.
- the gesture recognition apparatus 200 includes the frame memory 202.
- the gesture recognition apparatus 200 does not necessarily need to include the frame memory 202.
- the gesture recognition device 200 determines the vertical direction or the horizontal direction as the operation direction, but the operation direction is not necessarily limited to these directions.
- the gesture recognition device 200 may determine the diagonal direction as the operation direction when both the vertical direction deviation degree and the horizontal direction deviation degree exceed a threshold value.
- the display control unit 206 scrolls a plurality of objects.
- However, the display control unit 206 does not necessarily need to scroll-display a plurality of objects. For example, it may scroll-display one large object that cannot fit entirely on the screen 101a.
- the display control unit 206 does not necessarily perform scroll display, and may simply move the object in the operation direction.
- the display control unit 206 may move an object indicating the current volume in the operation direction.
- In this case, the display control unit 206 changes the volume according to the position of the moved object.
- the gesture recognition unit 205 does not need to measure the scroll amount, and may simply recognize the gesture operation.
- the display control unit 206 does not necessarily display an object on the screen 101a.
- the display control unit 206 may only change the volume or image quality (such as brightness or contrast) of the display device 101. That is, the display control unit 206 may control the display device 101 based on the gesture operation.
- It is preferable that the direction determination unit 204 determines the threshold of the deviation degree according to the recognition accuracy of the gesture operation. That is, the direction determination unit 204 preferably determines the threshold so that it changes dynamically with the recognition accuracy. Specifically, the direction determination unit 204 preferably increases the threshold of the deviation degree as the recognition accuracy of the gesture operation increases.
- The recognition accuracy of the gesture operation depends on the image. For example, when a gesture operation is recognized using an image captured by a two-dimensional camera, the recognition accuracy depends on the brightness of the image or on the difference between the color of the operator's operating part and the background color.
- the direction determination unit 204 may determine the threshold value of the divergence degree according to the brightness of the image or the difference between the color of the operation part of the operator and the background color.
- the recognition accuracy of the gesture operation depends on the amount of noise during distance measurement.
- This amount of noise depends on the amount of environmental light at the wavelength of light used for distance measurement.
- the amount of noise also depends on the background. For example, when the background is ground glass, the amount of noise increases. That is, the direction determining unit 204 may determine the threshold value of the divergence degree according to the amount of noise during distance measurement.
- As described above, determining the threshold of the deviation degree according to the recognition accuracy of the gesture operation can improve operability for the user. For example, when operability is higher in the horizontal direction than in the vertical direction, the threshold can be determined so that the horizontal direction is determined as the operation direction as long as it can still be recognized, thereby improving operability for the user.
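One way such a dynamic threshold could be realized is a simple linear scaling with an estimated recognition accuracy. This is a sketch under assumptions: the accuracy score in [0, 1], the base and maximum angles, and the function name are all illustrative, not from the patent:

```python
def deviation_threshold(accuracy, base_deg=20.0, max_deg=60.0):
    """Scale the deviation-degree threshold with the estimated recognition
    accuracy (0.0..1.0): the more reliably gestures are recognized, the
    larger the deviation tolerated before switching the operation direction."""
    accuracy = min(max(accuracy, 0.0), 1.0)  # clamp out-of-range estimates
    return base_deg + (max_deg - base_deg) * accuracy
```

With these assumed constants, perfect recognition keeps the preferred direction for users up to 60 degrees off-axis, while poor recognition switches direction already at 20 degrees.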
- The gesture recognition device 200 may include a system LSI (Large Scale Integration) that includes the image acquisition unit 201, the direction determination unit 204, the gesture recognition unit 205, and the display control unit 206.
- The system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on a single chip; specifically, it is a computer system including a microprocessor, a ROM (Read Only Memory), a RAM (Random Access Memory), and the like. A computer program is stored in the RAM. The system LSI achieves its functions when the microprocessor operates according to the computer program.
- The system LSI may be called an IC, an LSI, a super LSI, or an ultra LSI depending on the degree of integration.
- The method of circuit integration is not limited to LSI; implementation using a dedicated circuit or a general-purpose processor is also possible.
- An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- the present invention may be realized as a display device including the gesture recognition device 200 in the above embodiment.
- the present invention may be realized as a gesture recognition system including the gesture recognition device 200, the display device 101, and the camera 102 in the above embodiment.
- the display device 101 may be a projector that projects an image on the screen 101a. That is, the display device 101 does not necessarily need to include the screen 101a.
- The present invention can be realized not only as a gesture recognition device including such characteristic processing units, but also as a gesture recognition method having the characteristic processing units of the gesture recognition device as steps, and as a computer program causing a computer to execute the characteristic steps included in the gesture recognition method. Needless to say, such a computer program can be distributed via a computer-readable recording medium such as a CD-ROM (Compact Disc Read Only Memory) or via a communication network such as the Internet.
- the gesture recognition device is useful as a technique for operating the display device by recognizing the movement of the whole or part of the user's body.
- DESCRIPTION OF SYMBOLS
100 Gesture recognition system
101 Display apparatus
101a Screen
102 Camera
200 Gesture recognition apparatus
201 Image acquisition unit
202 Frame memory
203 Position acquisition unit
204 Direction determination unit
205 Gesture recognition unit
205a Recognition target identification unit
205b Operation amount measurement unit
206 Display control unit
300, 301, 302, 303, 304 User
Abstract
Description
FIG. 1 is an external view showing the configuration of the gesture recognition system in the embodiment of the present invention. FIG. 2 is a block diagram showing the functional configuration of the gesture recognition device in the embodiment of the present invention.
Next, Modification 1 of the above embodiment will be described.
Next, Modification 2 of the above embodiment will be described. The display control unit 206 in this modification causes the display device 101 to display the position information.
Claims (11)
- A gesture recognition device that controls a display device based on a gesture operation performed by a user located in the vicinity of a screen, the gesture recognition device comprising:
an image acquisition unit that acquires an image in which the vicinity of the screen is captured;
a direction determination unit that determines, based on position information indicating the direction of the user as viewed from the screen or the image acquisition unit, an operation direction that is a direction of movement to be recognized as a gesture operation;
a gesture recognition unit that recognizes, as a gesture operation, a movement of part or all of the user's body in the operation direction determined by the direction determination unit, using the image acquired by the image acquisition unit; and
a display control unit that controls the display device based on the gesture operation recognized by the gesture recognition unit.
- The gesture recognition device according to claim 1, wherein the image is captured by a camera,
the gesture recognition device further comprises a position acquisition unit that acquires, as the position information, a deviation degree indicating how far the direction of the user as viewed from the screen or the camera deviates from the normal direction of the screen or the optical axis direction of the camera, and
the direction determination unit determines the horizontal direction or the vertical direction as the operation direction based on whether the deviation degree exceeds a threshold value.
- The gesture recognition device according to claim 2, wherein the deviation degree includes a horizontal deviation degree indicating a degree of deviation on a horizontal plane, and
the direction determination unit determines the vertical direction as the operation direction when the horizontal deviation degree exceeds the threshold value.
- The gesture recognition device according to claim 2 or 3, wherein the deviation degree includes a vertical deviation degree indicating a degree of deviation on a vertical plane orthogonal to the horizontal plane and orthogonal to the imaging surface of the screen or the camera, and
the direction determination unit determines the horizontal direction as the operation direction when the vertical deviation degree exceeds the threshold value.
- The gesture recognition device according to any one of claims 2 to 4, wherein the direction determination unit determines the threshold value according to the recognition accuracy of the gesture operation.
- The gesture recognition device according to any one of claims 2 to 5, wherein the camera includes an optical system including a fisheye lens, and
the position acquisition unit identifies the user's image contained in the image acquired by the image acquisition unit and acquires the position information based on the distance between the position of the identified user's image and the center position of the image.
- The gesture recognition device according to any one of claims 1 to 6, wherein the display control unit moves an object displayed on the screen in the operation direction when a gesture operation is recognized by the gesture recognition unit.
- The gesture recognition device according to any one of claims 1 to 6, wherein the display control unit scrolls a plurality of objects in the operation direction when the plurality of objects are scroll-displayed on the screen.
- The gesture recognition device according to any one of claims 1 to 8, wherein the display control unit causes the display device to display the position information.
- A display device comprising the gesture recognition device according to any one of claims 1 to 9.
- A gesture recognition method for controlling a display device based on a gesture operation performed by a user located in the vicinity of a screen, the gesture recognition method comprising:
an image acquisition step of acquiring an image in which the vicinity of the screen is captured;
a direction determination step of determining, based on position information indicating the direction of the user as viewed from the screen or the imaging position of the image, an operation direction that is a direction of movement to be recognized as a gesture operation;
a gesture recognition step of recognizing, as a gesture operation, a movement of part or all of the user's body in the operation direction determined in the direction determination step, using the image acquired in the image acquisition step; and
a display control step of controlling the display device based on the gesture operation recognized in the gesture recognition step.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011529404A JP4897939B2 (ja) | 2010-05-28 | 2011-05-23 | ジェスチャ認識装置及びジェスチャ認識方法 |
US13/265,911 US8730164B2 (en) | 2010-05-28 | 2011-05-23 | Gesture recognition apparatus and method of gesture recognition |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010123556 | 2010-05-28 | ||
JP2010-123556 | 2010-05-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011148607A1 true WO2011148607A1 (ja) | 2011-12-01 |
Family
ID=45003607
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/002847 WO2011148607A1 (ja) | 2010-05-28 | 2011-05-23 | ジェスチャ認識装置及びジェスチャ認識方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8730164B2 (ja) |
JP (1) | JP4897939B2 (ja) |
WO (1) | WO2011148607A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014048936A (ja) * | 2012-08-31 | 2014-03-17 | Omron Corp | ジェスチャ認識装置、その制御方法、表示機器、および制御プログラム |
KR101685138B1 (ko) * | 2015-10-14 | 2016-12-09 | 연세대학교 산학협력단 | 실공간 사용자 인터페이스 장치 및 그 방법 |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5715977B2 (ja) * | 2012-03-05 | 2015-05-13 | 東芝テック株式会社 | 操作支援表示装置およびプログラム |
EP2650754A3 (en) * | 2012-03-15 | 2014-09-24 | Omron Corporation | Gesture recognition apparatus, electronic device, gesture recognition method, control program, and recording medium |
US9081413B2 (en) * | 2012-11-20 | 2015-07-14 | 3M Innovative Properties Company | Human interaction system based upon real-time intention detection |
CN105229582B (zh) * | 2013-03-14 | 2020-04-28 | 视力移动科技公司 | 基于近距离传感器和图像传感器的手势检测 |
US9342671B2 (en) | 2013-07-01 | 2016-05-17 | Blackberry Limited | Password by touch-less gesture |
US9489051B2 (en) | 2013-07-01 | 2016-11-08 | Blackberry Limited | Display navigation using touch-less gestures |
US9367137B2 (en) | 2013-07-01 | 2016-06-14 | Blackberry Limited | Alarm operation by touch-less gesture |
US9323336B2 (en) | 2013-07-01 | 2016-04-26 | Blackberry Limited | Gesture detection using ambient light sensors |
US9423913B2 (en) | 2013-07-01 | 2016-08-23 | Blackberry Limited | Performance control of ambient light sensors |
US9398221B2 (en) | 2013-07-01 | 2016-07-19 | Blackberry Limited | Camera control using ambient light sensors |
US9256290B2 (en) | 2013-07-01 | 2016-02-09 | Blackberry Limited | Gesture detection using ambient light sensors |
US9405461B2 (en) | 2013-07-09 | 2016-08-02 | Blackberry Limited | Operating a device using touchless and touchscreen gestures |
US9304596B2 (en) | 2013-07-24 | 2016-04-05 | Blackberry Limited | Backlight for touchless gesture detection |
US9465448B2 (en) | 2013-07-24 | 2016-10-11 | Blackberry Limited | Backlight for touchless gesture detection |
US9194741B2 (en) | 2013-09-06 | 2015-11-24 | Blackberry Limited | Device having light intensity measurement in presence of shadows |
EP2908219A1 (en) * | 2014-02-14 | 2015-08-19 | Omron Corporation | Gesture recognition apparatus and control method of gesture recognition apparatus |
JP6287382B2 (ja) * | 2014-03-12 | 2018-03-07 | オムロン株式会社 | ジェスチャ認識装置およびジェスチャ認識装置の制御方法 |
JP6349800B2 (ja) * | 2014-03-12 | 2018-07-04 | オムロン株式会社 | ジェスチャ認識装置およびジェスチャ認識装置の制御方法 |
JP2015176253A (ja) * | 2014-03-13 | 2015-10-05 | オムロン株式会社 | ジェスチャ認識装置およびジェスチャ認識装置の制御方法 |
US9778792B2 (en) * | 2015-11-12 | 2017-10-03 | Dell Products L.P. | Information handling system desktop surface display touch input compensation |
CN108227914B (zh) | 2016-12-12 | 2021-03-05 | 财团法人工业技术研究院 | 透明显示装置、使用其的控制方法及其控制器 |
TWI659334B (zh) * | 2016-12-12 | 2019-05-11 | Industrial Technology Research Institute | 透明顯示裝置、使用其之控制方法以及其之控制器 |
FR3065545B1 (fr) * | 2017-04-25 | 2019-06-28 | Thales | Procede de detection d'un signal d'un utilisateur pour generer au moins une instruction de commande d'un equipement avionique d'un aeronef, programme d'ordinateur et dispositif electronique associes |
US10845954B2 (en) * | 2017-07-11 | 2020-11-24 | Sony Corporation | Presenting audio video display options as list or matrix |
CN109388233B (zh) | 2017-08-14 | 2022-07-29 | 财团法人工业技术研究院 | 透明显示装置及其控制方法 |
CN113269075A (zh) * | 2021-05-19 | 2021-08-17 | 广州繁星互娱信息科技有限公司 | 手势轨迹识别方法和装置、存储介质及电子设备 |
CN115480643A (zh) * | 2021-06-17 | 2022-12-16 | 深圳市瑞立视多媒体科技有限公司 | 一种应用于基于ue4全息沙盘的交互方法及装置 |
WO2022241328A1 (en) * | 2022-05-20 | 2022-11-17 | Innopeak Technology, Inc. | Hand gesture detection methods and systems with hand shape calibration |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0844490A (ja) * | 1994-07-28 | 1996-02-16 | Matsushita Electric Ind Co Ltd | インターフェイス装置 |
WO2007088942A1 (ja) * | 2006-02-03 | 2007-08-09 | Matsushita Electric Industrial Co., Ltd. | 入力装置、及びその方法 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0981309A (ja) | 1995-09-13 | 1997-03-28 | Toshiba Corp | 入力装置 |
US7665041B2 (en) | 2003-03-25 | 2010-02-16 | Microsoft Corporation | Architecture for controlling a computer using hand gestures |
JP4569613B2 (ja) | 2007-09-19 | 2010-10-27 | ソニー株式会社 | 画像処理装置および画像処理方法、並びにプログラム |
US8514251B2 (en) | 2008-06-23 | 2013-08-20 | Qualcomm Incorporated | Enhanced character input using recognized gestures |
JP5177075B2 (ja) * | 2009-02-12 | 2013-04-03 | ソニー株式会社 | 動作認識装置、動作認識方法、プログラム |
US8396252B2 (en) * | 2010-05-20 | 2013-03-12 | Edge 3 Technologies | Systems and related methods for three dimensional gesture recognition in vehicles |
-
2011
- 2011-05-23 WO PCT/JP2011/002847 patent/WO2011148607A1/ja active Application Filing
- 2011-05-23 JP JP2011529404A patent/JP4897939B2/ja not_active Expired - Fee Related
- 2011-05-23 US US13/265,911 patent/US8730164B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20120176303A1 (en) | 2012-07-12 |
JPWO2011148607A1 (ja) | 2013-07-25 |
US8730164B2 (en) | 2014-05-20 |
JP4897939B2 (ja) | 2012-03-14 |
Legal Events

Code | Title | Description
---|---|---
WWE | WIPO information: entry into national phase | Ref document number: 2011529404; Country of ref document: JP
WWE | WIPO information: entry into national phase | Ref document number: 13265911; Country of ref document: US
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 11786310; Country of ref document: EP; Kind code of ref document: A1
NENP | Non-entry into the national phase | Ref country code: DE
122 | EP: PCT application non-entry in European phase | Ref document number: 11786310; Country of ref document: EP; Kind code of ref document: A1