WO2017164584A1 - HMD device capable of performing gesture-based user authentication and gesture-based user authentication method for an HMD device - Google Patents

HMD device capable of performing gesture-based user authentication and gesture-based user authentication method for an HMD device Download PDF

Info

Publication number
WO2017164584A1
WO2017164584A1 (PCT/KR2017/002930)
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
pattern recognition
hand
gesture pattern
Prior art date
Application number
PCT/KR2017/002930
Other languages
English (en)
Korean (ko)
Inventor
박지만
예정민
Original Assignee
주식회사 다날
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 다날
Publication of WO2017164584A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • the present invention relates to a technology that enables a user to easily perform user authentication in a head mounted display (HMD) device capable of implementing virtual reality (VR).
  • HMD head mounted display
  • VR virtual reality
  • A representative device for implementing VR is the head mounted display (HMD) device, which is worn on the head like glasses and lets the user view a game screen on a display positioned directly in front of his or her eyes.
  • As HMD devices become widespread, users are expected in the near future to perform various activities while wearing them, such as purchasing game items or shopping online.
  • The HMD (Head Mounted Display) device capable of gesture-based user authentication and the gesture-based user authentication method of the HMD device according to the present invention store predetermined first figure information designated in advance for user authentication. A camera attached to the front of the HMD device photographs the movement of the user's hand together with the external environment, and object tracking is performed on the hand movement from the captured image to generate figure information corresponding to the movement trajectory of the user's hand. The generated figure information is then compared with the predetermined first figure information, and user authentication is completed when both are determined to match, so that the user can easily complete the authentication process even though the HMD device has no separate input device.
  • An HMD (Head Mounted Display) device capable of gesture-based user authentication includes: a camera attached to the front of the HMD device and capable of photographing the external environment; a figure information storage unit storing n pieces of first figure information (n being a natural number) designated in advance for user authentication; a figure information generation unit that, when a user authentication command is applied under the control of an application installed in the HMD device, drives the camera to obtain a captured image of the external environment and, as the user moves his or her hand and the external environment and the hand are photographed together, performs object tracking on the hand movement from the captured image to generate figure information corresponding to the movement trajectory of the user's hand; and a user authentication unit that, when n pieces of figure information corresponding to the movement trajectory have been generated, compares them with the n pieces of first figure information and completes user authentication when both sets are determined to match.
  • A gesture-based user authentication method for an HMD device having a camera attached to its front surface includes: maintaining a figure information storage unit that stores n pieces of first figure information (n being a natural number) designated in advance for user authentication; driving the camera to obtain a captured image of the external environment when a user authentication command is applied under the control of an application installed in the HMD device; performing object tracking on the movement of the user's hand from the captured image, as the user moves his or her hand and the external environment and the hand are photographed together, to generate figure information corresponding to the movement trajectory of the hand; and, when n pieces of figure information corresponding to the movement trajectory have been generated, comparing them with the n pieces of first figure information and completing user authentication when both sets are determined to match.
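The comparison step at the heart of the claim can be sketched as follows. This is a minimal illustrative reconstruction, not the patent's implementation: it assumes the n generated figures must match the n stored figures in order, and the function name `authenticate_user` is hypothetical.

```python
# Minimal sketch of the claimed comparison step: authentication completes
# only when the n figures generated from the hand trajectory match the n
# stored pieces of first figure information. Ordered matching is assumed.

def authenticate_user(first_figures, generated_figures):
    """Return True when both sets of figure information match."""
    if len(generated_figures) != len(first_figures):
        return False  # fewer or more figures drawn than registered
    return generated_figures == first_figures

# n = 3 stored figures; the hand trajectory produced the same three shapes.
stored = ["3", "4", "5"]
print(authenticate_user(stored, ["3", "4", "5"]))  # True
print(authenticate_user(stored, ["3", "4", "7"]))  # False
```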
  • FIGS. 2 and 3 are views for explaining the operation of the HMD device capable of gesture-based user authentication according to an embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating a gesture-based user authentication method of an HMD device according to an embodiment of the present invention.
  • The HMD device 110 capable of gesture-based user authentication includes a camera 111, a figure information storage unit 112, a figure information generation unit 113, and a user authentication unit 114.
  • The camera 111 is attached to the front of the HMD device 110 and is configured to photograph the external environment.
  • the figure information storage unit 112 stores n pieces of first figure information (n is a natural number) previously designated for user authentication.
  • When a user authentication command is applied, the figure information generation unit 113 drives the camera 111 to obtain a captured image of the external environment. As the user moves his or her hand, the figure information generation unit 113 performs object tracking on the hand movement from the captured image photographed by the camera 111 and generates figure information corresponding to the movement trajectory of the user's hand.
  • When n pieces of figure information corresponding to the movement trajectory of the user's hand have been generated, the user authentication unit 114 compares them with the n pieces of first figure information and completes user authentication when both sets are determined to match.
  • the HMD device 110 capable of gesture-based user authentication may further include an image display control unit 115 and a gesture pattern recognition area display unit 116.
  • the image display controller 115 displays a captured image of the external environment captured by the camera 111 in real time on the display.
  • the gesture pattern recognition area display unit 116 displays a plurality of predetermined gesture pattern recognition areas at a predetermined point on the display.
  • the figure information generating unit 113 may include a figure information table holding unit 117, a gesture pattern recognition region extracting unit 118, and a figure information extracting unit 119.
  • The figure information table holding unit 117 stores and maintains a figure information table in which predetermined different figure information is recorded for each combination of different gesture pattern recognition areas.
  • The gesture pattern recognition area extraction unit 118 performs object tracking on the movement of the user's hand from the captured image photographed by the camera 111 and extracts, from among the plurality of predetermined gesture pattern recognition areas, at least one gesture pattern recognition area overlapping the point where the user's hand is located on the captured image.
  • When the at least one gesture pattern recognition area is extracted, the figure information extraction unit 119 combines the at least one gesture pattern recognition area, extracts from the figure information table the figure information recorded for that combination, and generates the extracted figure information as the figure information corresponding to the movement trajectory of the user's hand.
  • different index values may be pre-assigned to the plurality of predetermined gesture pattern recognition regions.
  • The predetermined different figure information may be recorded in the figure information table in correspondence with combinations of the index values of the different gesture pattern recognition areas. In this case, when the at least one gesture pattern recognition area is extracted, the figure information extraction unit 119 checks the index value assigned to each extracted area, combines the identified index values, extracts from the figure information table the figure information recorded for the combination of the identified index values, and generates the extracted figure information as the figure information corresponding to the movement trajectory of the user's hand.
  • the gesture pattern recognition region extractor 118 may include a hand region detector 120 and a hand region position tracker 121.
  • The hand region detection unit 120 checks the color values of the pixels constituting each of the image frames of the captured image photographed by the camera 111 and detects, as the user's hand region, the set of at least one pixel whose color value falls within a predetermined first color value range associated with skin color.
  • The hand region position tracking unit 121 performs object tracking on the movement of the user's hand by tracking, across the image frames, the change in position of the set of at least one pixel detected as the user's hand region.
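The detection and tracking steps above can be sketched in a few lines. This is an illustrative reconstruction under simplifying assumptions (frames as small RGB grids, an assumed skin-color range, and the hand position reduced to the centroid of the detected pixel set), not the patent's implementation:

```python
# Minimal sketch of skin-color hand detection and position tracking.
# The RGB range below stands in for the patent's "first color value
# range associated with a skin color" and is an assumed placeholder.

SKIN_MIN, SKIN_MAX = (120, 60, 40), (255, 180, 140)  # assumed RGB range

def is_skin(pixel):
    """True if every channel falls inside the assumed skin-color range."""
    return all(lo <= c <= hi for c, lo, hi in zip(pixel, SKIN_MIN, SKIN_MAX))

def detect_hand_region(frame):
    """Return the set of (row, col) coordinates whose color matches skin."""
    return {(r, c)
            for r, row in enumerate(frame)
            for c, px in enumerate(row)
            if is_skin(px)}

def hand_centroid(frame):
    """Centroid of the detected hand region, or None if no skin pixels."""
    region = detect_hand_region(frame)
    if not region:
        return None
    rs, cs = zip(*region)
    return (sum(rs) / len(region), sum(cs) / len(region))

def track_hand(frames):
    """Object tracking: hand position per image frame, in time order."""
    return [hand_centroid(f) for f in frames]

# Two 2x2 frames: a skin-colored pixel moves from (0, 0) to (1, 1).
bg, skin = (0, 0, 255), (200, 120, 90)
frame1 = [[skin, bg], [bg, bg]]
frame2 = [[bg, bg], [bg, skin]]
print(track_hand([frame1, frame2]))  # [(0.0, 0.0), (1.0, 1.0)]
```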
  • When extracting the at least one gesture pattern recognition area, the gesture pattern recognition area extraction unit 118 may extract a gesture pattern recognition area as one of the at least one gesture pattern recognition area only when the portion overlapping the point where the user's hand is located on the captured image exceeds a predetermined area size ratio relative to the size of that gesture pattern recognition area.
  • FIGS. 2 and 3 are views for explaining the operation of the HMD device 110 capable of gesture-based user authentication according to an embodiment of the present invention.
  • As shown in FIG. 2, the HMD device 110 capable of gesture-based user authentication according to the present invention is worn on the user's head so that a display is positioned in front of both eyes, and the device shows the screen to the user through this display.
  • the HMD device 110 may include a camera 111 attached to the front of the HMD device 110 and capable of photographing an external environment.
  • the figure information storage unit 112 may store n pieces of first figure information specified in advance for user authentication.
  • Assume that the figure information designated in advance for user authentication consists of figure information for the shape "3", figure information for the shape "4", figure information for the shape "5", and figure information for two further predetermined shapes, and that this information is stored in the figure information storage unit 112 as shown in Table 1 below.
  • A user authentication command may be applied to the HMD device 110 under the control of a predetermined application installed in the HMD device 110, such as a payment-related application.
  • the figure information generator 113 may drive the camera 111 to acquire a captured image of an external environment.
  • At this time, the image display control unit 115 may display the captured image of the external environment photographed by the camera 111 in real time on the display, so that the user can view the external environment through the display.
  • The gesture pattern recognition area display unit 116 displays a plurality of predetermined gesture pattern recognition areas 211, 212, 213, 214, 215, 216, and 217 at predetermined points on the display. For example, while the captured image of the external environment photographed by the camera 111 is being displayed, the gesture pattern recognition area display unit 116 may display the gesture pattern recognition areas 211, 212, 213, 214, 215, 216, and 217 in the center of the display, as shown at 210.
  • The user then looks at the gesture pattern recognition areas 211, 212, 213, 214, 215, 216, and 217 displayed on the display and performs the authentication process by gesturing with his or her hand over these areas so as to draw the same figures as the five pieces of first figure information stored in the figure information storage unit 112.
  • Since the user's hand is photographed and displayed together with the captured image of the external environment photographed by the camera 111, the user can watch his or her hand on the display and control it to move over the gesture pattern recognition areas 211, 212, 213, 214, 215, 216, and 217.
  • When the external environment and the user's hand are photographed together through the camera 111, the figure information generation unit 113 performs object tracking on the movement of the user's hand from the captured image and generates figure information corresponding to the movement trajectory of the hand.
  • the figure information generating unit 113 may include a figure information table holding unit 117, a gesture pattern recognition region extracting unit 118, and a figure information extracting unit 119.
  • The figure information table holding unit 117 may store and maintain a figure information table in which predetermined different figure information is recorded for each combination of different gesture pattern recognition areas.
  • Here, different index values are pre-assigned to the predetermined gesture pattern recognition areas 211, 212, 213, 214, 215, 216, and 217, and the predetermined different figure information may be recorded in the figure information table in correspondence with combinations of these index values.
  • For example, among the gesture pattern recognition areas 211, 212, 213, 214, 215, 216, and 217, an index value of "1" may be assigned to gesture pattern recognition area 1 (211), "2" to gesture pattern recognition area 2 (212), "3" to gesture pattern recognition area 3 (213), "4" to gesture pattern recognition area 4 (214), "5" to gesture pattern recognition area 5 (215), "6" to gesture pattern recognition area 6 (216), and "7" to gesture pattern recognition area 7 (217).
  • The gesture pattern recognition area extraction unit 118 performs object tracking on the movement of the user's hand from the captured image photographed by the camera 111 and may extract, from among the predetermined gesture pattern recognition areas 211, 212, 213, 214, 215, 216, and 217, at least one gesture pattern recognition area overlapping the point where the user's hand is located on the captured image.
  • the gesture pattern recognition region extraction unit 118 may include a hand region detection unit 120 and a hand region position tracking unit 121 to perform object tracking on the movement of the user's hand.
  • Specifically, the hand region detection unit 120 checks the color values of the pixels constituting each of the image frames of the captured image photographed by the camera 111 and may detect, as the user's hand region, the set of at least one pixel whose color value falls within the predetermined first color value range associated with skin color.
  • The hand region position tracking unit 121 may then perform object tracking on the movement of the user's hand by tracking, across the image frames, the change in position of the set of at least one pixel detected as the user's hand region.
  • For example, the hand region detection unit 120 checks the color values of the pixels constituting "image frame 1" and detects, as the user's hand region in "image frame 1", the set of at least one pixel whose color value falls within the predetermined first color value range associated with skin color. The hand region detection unit 120 performs the same detection for "image frame 2" and "image frame 3".
  • The hand region position tracking unit 121 then performs object tracking on the movement of the user's hand by tracking the position change of the detected hand region across "image frame 1", "image frame 2", and "image frame 3" in time order.
  • In this way, the gesture pattern recognition area extraction unit 118 performs object tracking on the movement of the user's hand and may extract, from among the gesture pattern recognition areas 211, 212, 213, 214, 215, 216, and 217, at least one gesture pattern recognition area overlapping the point where the user's hand is located on the captured image photographed by the camera 111.
  • For example, assume that the user moves his or her hand over gesture pattern recognition area 1 (211), gesture pattern recognition area 2 (212), gesture pattern recognition area 3 (213), gesture pattern recognition area 4 (214), and gesture pattern recognition area 5 (215) among the predetermined gesture pattern recognition areas. In this case, the gesture pattern recognition area extraction unit 118 performs object tracking on the movement of the user's hand from the captured image photographed by the camera 111 and extracts gesture pattern recognition areas 1 (211), 2 (212), 3 (213), 4 (214), and 5 (215) as the areas overlapping the points where the user's hand was located on the captured image.
  • When extracting the at least one gesture pattern recognition area, the gesture pattern recognition area extraction unit 118 may extract a gesture pattern recognition area from among the areas 211, 212, 213, 214, 215, 216, and 217 only when the portion overlapping the point where the user's hand is located on the captured image photographed by the camera 111 exceeds a predetermined area size ratio relative to the size of that area. For example, the gesture pattern recognition area extraction unit 118 may extract a gesture pattern recognition area as one of the at least one gesture pattern recognition area only when the overlapping portion exceeds 60% of that area's size.
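The area-size-ratio rule can be sketched with axis-aligned rectangles. The rectangle model and the names below are assumptions for illustration; only the 60% threshold is taken from the embodiment:

```python
# Sketch of the area-size-ratio rule: a recognition area counts as
# "overlapped" only if the hand covers more than a threshold (60% here,
# per the embodiment) of that area's own size. Regions and the hand are
# modeled as axis-aligned rectangles (x1, y1, x2, y2) -- an assumption.

RATIO_THRESHOLD = 0.6  # 60% of each recognition area's size

def rect_area(r):
    x1, y1, x2, y2 = r
    return max(0, x2 - x1) * max(0, y2 - y1)

def overlap_area(a, b):
    """Area of the intersection of two rectangles (0 if disjoint)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    return rect_area((x1, y1, x2, y2))

def extract_regions(hand_rect, regions):
    """Keep only the areas whose overlap with the hand exceeds the ratio."""
    return [name for name, r in regions.items()
            if overlap_area(hand_rect, r) > RATIO_THRESHOLD * rect_area(r)]

regions = {"area1": (0, 0, 10, 10), "area2": (20, 0, 30, 10)}
hand = (0, 0, 9, 9)                    # covers 81% of area1, none of area2
print(extract_regions(hand, regions))  # ['area1']
```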
  • When the at least one gesture pattern recognition area is extracted, the figure information extraction unit 119 checks the index values assigned to the extracted areas, combines the identified index values, extracts from the figure information table the figure information recorded for that combination, and generates the extracted figure information as the figure information corresponding to the movement trajectory of the user's hand.
  • For example, if the gesture pattern recognition area extraction unit 118 extracts gesture pattern recognition areas 1 (211), 2 (212), 3 (213), 4 (214), and 5 (215), the figure information extraction unit 119 checks the index values "1", "2", "3", "4", and "5" assigned to those areas, combines the identified index values, and extracts from the figure information table the figure information recorded for that combination of index values.
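The index-combination lookup can be sketched as a small table. The index assignments follow the embodiment, but the table entries, including the mapping of the combination (1, 2, 3, 4, 5) to the shape "3", are assumptions for illustration; the patent's actual Table 1 is not reproduced here:

```python
# Sketch of the figure-information lookup: each recognition area has a
# pre-assigned index value, and a combination of index values keys into
# the figure information table.

# Index values assigned to the seven recognition areas (per the embodiment).
INDEX = {"area1": 1, "area2": 2, "area3": 3, "area4": 4,
         "area5": 5, "area6": 6, "area7": 7}

# Figure information table: combination of index values -> figure.
# Both entries are illustrative stand-ins for the patent's Table 1.
FIGURE_TABLE = {
    (1, 2, 3, 4, 5): "3",   # assumed: areas 1-5 together trace the shape "3"
    (1, 2, 4, 6, 7): "4",   # assumed combination
}

def figure_for(extracted_areas):
    """Combine the index values of extracted areas and look up the figure."""
    combo = tuple(sorted(INDEX[a] for a in extracted_areas))
    return FIGURE_TABLE.get(combo)  # None if the combination is unregistered

print(figure_for(["area5", "area1", "area3", "area2", "area4"]))  # prints 3
```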
  • In FIG. 3, when the user applies the hand movements indicated by reference numerals 321, 322, 323, 324, and 325, the figure information generation unit 113 generates five pieces of figure information corresponding to the movement trajectories of the user's hand: figure information for the shape "3", figure information for the shape "4", figure information for the shape "5", and figure information for the two further predetermined shapes.
  • When the five pieces of figure information corresponding to the movement trajectory of the user's hand have been generated, the user authentication unit 114 compares them with the five pieces of first figure information stored as shown in Table 1 and completes user authentication when both sets are determined to match.
  • As described above, the HMD device 110 capable of gesture-based user authentication stores predetermined first figure information designated in advance for user authentication, photographs the movement of the user's hand together with the external environment through the camera 111 attached to its front, performs object tracking on the hand movement from the captured image to generate figure information corresponding to the movement trajectory of the user's hand, and compares the generated figure information with the predetermined first figure information. By completing user authentication when both are determined to match, the HMD device 110 allows the user to easily proceed with the authentication process even without a separate input device.
  • FIG. 4 is a flowchart illustrating a gesture-based user authentication method of an HMD device having a camera attached to a front surface according to an embodiment of the present invention.
  • In step S410, a figure information storage unit storing n pieces of first figure information (n being a natural number) designated in advance for user authentication is maintained.
  • In step S430, when n pieces of figure information corresponding to the movement trajectory of the user's hand have been generated, they are compared with the n pieces of first figure information, and user authentication is completed when both sets are determined to match.
  • According to an embodiment of the present invention, the gesture-based user authentication method of the HMD device may further include displaying the captured image of the external environment photographed by the camera in real time on a display, and displaying a plurality of predetermined gesture pattern recognition areas at predetermined points on the display.
  • Step S420 may include: storing and maintaining a figure information table in which predetermined different figure information is recorded for each combination of different gesture pattern recognition areas; performing object tracking on the movement of the user's hand from the captured image photographed by the camera to extract, from among the plurality of predetermined gesture pattern recognition areas, at least one gesture pattern recognition area overlapping the point where the user's hand is located on the captured image; and, when the at least one gesture pattern recognition area is extracted, combining the at least one gesture pattern recognition area, extracting from the figure information table the figure information recorded for that combination, and generating the extracted figure information as the figure information corresponding to the movement trajectory of the user's hand.
  • Here, different index values are pre-assigned to the plurality of predetermined gesture pattern recognition areas, and the predetermined different figure information may be recorded in the figure information table in correspondence with combinations of the index values of the different gesture pattern recognition areas.
  • In this case, the generating of the extracted figure information as figure information corresponding to the movement trajectory of the user's hand may include: when the at least one gesture pattern recognition area is extracted, checking the index value assigned to each extracted area, combining the checked index values, extracting from the figure information table the figure information recorded for the combination of the identified index values, and generating the extracted figure information as the figure information corresponding to the movement trajectory of the user's hand.
  • According to an embodiment of the present invention, the extracting of the at least one gesture pattern recognition area may include: checking the color values of the pixels constituting each of the image frames of the captured image photographed by the camera; detecting, as the user's hand region, the set of at least one pixel whose color value falls within a predetermined first color value range associated with skin color; and performing object tracking on the movement of the user's hand by tracking the change in position of the detected set of pixels across the image frames.
  • The extracting of the at least one gesture pattern recognition area may extract a gesture pattern recognition area as one of the at least one gesture pattern recognition area only when the portion overlapping the point where the user's hand is located on the captured image photographed by the camera exceeds a predetermined area size ratio relative to the size of that area.
  • the gesture-based user authentication method of the HMD device according to the embodiment of the present invention has been described above with reference to FIG. 4.
  • Since the gesture-based user authentication method of the HMD device according to an embodiment of the present invention corresponds to the operation of the HMD device 110 capable of gesture-based user authentication described with reference to FIGS. 1 to 3, a detailed description thereof is omitted.
  • The gesture-based user authentication method of the HMD device may be implemented as a computer program stored in a storage medium and executed in combination with a computer.
  • the gesture-based user authentication method of the HMD device may be implemented in the form of program instructions that can be executed by various computer means and recorded in a computer-readable medium.
  • the computer readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • Program instructions recorded on the media may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device described above may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.
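The index-combination lookup described in the first bullet above can be illustrated with a minimal sketch. The region indices, the index combinations, and the figure names below are illustrative assumptions only; the patent does not disclose the contents of the figure information table.

```python
# Hypothetical figure information table: each predefined gesture pattern
# recognition region carries an allocated index, and a combination of
# indices maps to recorded figure information. All values are invented.
FIGURE_INFO_TABLE = {
    (1, 2, 5, 8): "L-shape",
    (1, 4, 7, 8, 9): "U-shape",
    (2, 5, 8): "vertical line",
}

def figure_from_regions(visited_region_indices):
    """Combine the index values of the extracted recognition regions and
    look up the figure information recorded for that combination."""
    key = tuple(visited_region_indices)
    # None means no figure information is recorded for this combination.
    return FIGURE_INFO_TABLE.get(key)
```

Under these assumptions, a hand trajectory passing through regions 1, 2, 5, and 8 in that order would yield the recorded figure information "L-shape", and an unregistered combination would yield no figure (authentication would not proceed).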
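The per-frame skin-color segmentation and hand tracking described above (identifying pixel color values in each frame, detecting the set of pixels within a first color value range as the hand region, and tracking its position across frames) can be sketched roughly as follows. The RGB bounds, the frame representation as nested lists of RGB tuples, and the use of a centroid as the hand position are assumptions for illustration, not the patent's actual first color value range or tracking method.

```python
# Hypothetical bounds of the "first color value range" associated with skin
# color; the patent does not specify numeric values.
SKIN_MIN = (90, 40, 20)
SKIN_MAX = (255, 180, 150)

def is_skin(pixel):
    """True if an (R, G, B) pixel falls inside the assumed skin-color range."""
    return all(lo <= c <= hi for c, lo, hi in zip(pixel, SKIN_MIN, SKIN_MAX))

def hand_centroid(frame):
    """Detect the set of skin-colored pixels as the hand region and
    return its centroid (x, y), or None if no skin pixels are found."""
    pts = [(x, y) for y, row in enumerate(frame)
                  for x, px in enumerate(row) if is_skin(px)]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))

def track_hand(frames):
    """Track the hand's movement as the sequence of per-frame centroids."""
    return [c for c in (hand_centroid(f) for f in frames) if c is not None]
```

A production implementation would more likely threshold in a color space such as HSV or YCrCb and smooth the trajectory, but the structure, per-frame segmentation followed by cross-frame position tracking, matches the step described above.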
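The overlap test described above (extracting only those recognition regions whose overlap with the hand's position exceeds a predetermined ratio of the region's own area) can be sketched as follows, assuming axis-aligned rectangles for both the hand region and the recognition regions and an illustrative 50% threshold; neither assumption comes from the patent.

```python
def overlap_area(a, b):
    """Intersection area of two axis-aligned rectangles (x1, y1, x2, y2)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def regions_hit(hand_box, regions, ratio=0.5):
    """Return the indices of recognition regions whose overlap with the
    hand bounding box exceeds `ratio` of the region's own area."""
    hit = []
    for idx, r in regions.items():
        area = (r[2] - r[0]) * (r[3] - r[1])
        if area and overlap_area(hand_box, r) / area > ratio:
            hit.append(idx)
    return hit
```

Filtering on the region's own area (rather than the hand's) matches the claim: a region counts only when the hand covers more than the predetermined fraction of that region, which suppresses regions the hand merely grazes.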

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a technology in which: predefined pieces of first figure information are stored in order to perform user authentication; a camera capable of photographing the external environment is mounted on the front side of an HMD device so as to photograph the external environment and a movement of the user's hand by means of the camera; object tracking of the user's hand movement is performed on the image captured by the camera so as to generate figure information corresponding to the movement trajectory of the user's hand; the figure information corresponding to the movement trajectory of the user's hand is compared with the predefined pieces of first figure information; and user authentication can be completed when the comparison determines that the figure information matches the predefined pieces of first figure information, whereby a user can easily carry out an authentication process even on an HMD device that has no separate input unit.
PCT/KR2017/002930 2016-03-23 2017-03-20 HMD device capable of performing gesture-based user authentication, and gesture-based user authentication method for the HMD device WO2017164584A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160034388A KR101700569B1 (ko) 2016-03-23 2016-03-23 HMD device capable of gesture-based user authentication, and gesture-based user authentication method of the HMD device
KR10-2016-0034388 2016-03-23

Publications (1)

Publication Number Publication Date
WO2017164584A1 true WO2017164584A1 (fr) 2017-09-28

Family

ID=57992882

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/002930 WO2017164584A1 (fr) HMD device capable of performing gesture-based user authentication, and gesture-based user authentication method for the HMD device

Country Status (2)

Country Link
KR (1) KR101700569B1 (fr)
WO (1) WO2017164584A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101977897B1 (ko) 2017-10-23 2019-08-28 Dongseo University Industry-Academic Cooperation Foundation User authentication system using mixed reality, virtual reality, and augmented reality
WO2023146196A1 (fr) * 2022-01-25 2023-08-03 Samsung Electronics Co., Ltd. Method and electronic device for determining a user's hand in a video
WO2024090826A1 (fr) * 2022-10-27 2024-05-02 Samsung Electronics Co., Ltd. Electronic device and method for performing authentication using a user's gesture

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014078814A (ja) * 2012-10-10 2014-05-01 Olympus Corp Head-mounted display device, unlock processing system, program, and control method for an unlock processing system
KR20140124209A (ko) * 2013-04-16 2014-10-24 Koo Tae-Eon Security-enhanced head-mounted display device and method for accessing encrypted information through the device
KR20150017893A (ko) * 2013-08-08 2015-02-23 Samsung Electronics Co., Ltd. Method and apparatus for processing a lock screen of an electronic device
KR20150041453A (ko) * 2013-10-08 2015-04-16 LG Electronics Inc. Glasses-type image display device and control method therefor
KR20160007342A (ko) * 2014-07-11 2016-01-20 Nexsys Co., Ltd. Method and program for unlocking a glasses-type wearable device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11050752B2 (en) 2018-06-07 2021-06-29 Ebay Inc. Virtual reality authentication
US11736491B2 (en) 2018-06-07 2023-08-22 Ebay Inc. Virtual reality authentication

Also Published As

Publication number Publication date
KR101700569B1 (ko) 2017-01-26

Similar Documents

Publication Publication Date Title
WO2017164584A1 (fr) HMD device capable of performing gesture-based user authentication, and gesture-based user authentication method for the HMD device
WO2016114610A1 (fr) Virtual input device and method for receiving user input using the same
CN101379528B (zh) Face authentication device and face authentication method
US11586336B2 Private control interfaces for extended reality
US11449131B2 Obfuscated control interfaces for extended reality
US8682031B2 Image processing device, camera, image processing method, and program
WO2012093811A1 (fr) Method for supporting the grouping of objects included in an input image, and recording medium readable by terminal devices and computers
WO2020171621A1 (fr) Method for controlling avatar display, and electronic device therefor
WO2015064925A1 (fr) Apparatus and method for inputting a pattern, and recording medium using the same
CN104298910B (zh) Portable electronic device and interactive face login method
KR102330637B1 (ko) System, server, method, and recording medium for providing augmented reality photo cards
WO2016182149A1 (fr) Wearable display device for displaying the progress of a payment process associated with billing information on a display unit, and control method therefor
CN106033539A (zh) Conference guidance method and system based on video face recognition
CN109697014A (zh) Multi-screen synchronous touch control method, apparatus, and system
CN112637665A (zh) Display method and apparatus in an augmented reality scene, electronic device, and storage medium
WO2014148691A1 (fr) Mobile device and method for controlling the same
WO2020116960A1 (fr) Electronic device for generating a video comprising characters, and method therefor
JP4789658B2 (ja) Slide show playback method, slide show playback program, and slide show playback device
WO2019004754A1 (fr) Augmented reality advertisements on objects
WO2015182846A1 (fr) Apparatus and method for providing advertisements using pupil tracking
WO2017026834A1 (fr) Responsive video generation method and generation program
WO2011078430A1 (fr) Sequential search method for recognizing a plurality of feature-point-based markers, and augmented reality implementation method using the same
KR102178396B1 (ko) Method and apparatus for producing image output products based on augmented reality
WO2018169110A1 (fr) Markerless augmented reality apparatus and method for expressing three-dimensional objects
WO2020111844A2 (fr) Method and apparatus for improving image feature points in visual SLAM using object labels

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17770554

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17770554

Country of ref document: EP

Kind code of ref document: A1