WO2023219254A1 - Method and device for estimating hand distance for augmented reality glasses

Method and device for estimating hand distance for augmented reality glasses

Info

Publication number
WO2023219254A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
distance
augmented reality
coordinates
palm area
Prior art date
Application number
PCT/KR2023/003632
Other languages
English (en)
Korean (ko)
Inventor
최치원
김정환
Original Assignee
주식회사 피앤씨솔루션
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 피앤씨솔루션
Publication of WO2023219254A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm
    • G06V40/11Hand-related biometrics; Hand pose recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/40OLEDs integrated with touch screens

Definitions

  • The present invention relates to a method and device for estimating hand distance in an augmented reality glasses device and, more specifically, to a method and device for estimating the distance to the hand through hand joint recognition from a single camera image of the augmented reality glasses device.
  • A head-mounted display (HMD), a type of wearable device, refers to any of various devices worn on the user's head to receive multimedia content.
  • The HMD is worn on the user's body and provides images to the user in various environments as the user moves.
  • Head-mounted displays are divided into a see-through type and a closed type: the see-through type is mainly used for augmented reality (AR), while the closed type is mainly used for virtual reality (VR).
  • Input methods usable in the augmented reality of a head-mounted display device include buttons provided on the HMD, a separate input device connected to the HMD, and gesture recognition. Among these, gesture recognition is a particularly suitable input method for augmented reality on head-mounted displays.
  • depth cameras are sometimes used to recognize 3D hand gestures in augmented reality glasses.
  • Depth cameras include infrared (IR) and time-of-flight (ToF) cameras, whose laser light sources consume considerable power.
  • Because augmented reality glasses are worn on the head, keeping them lightweight is important; this limits the battery capacity, making it difficult to use a depth camera continuously for hand gesture recognition.
  • As related prior art, Registered Patent No. 10-2286018 (title of the invention: Wearable augmented reality device that inputs mouse events using hand movements and mouse event input method of a wearable augmented reality device using hand movements; registration date: July 29, 2021) has been disclosed.
  • The present invention was proposed to solve the above-mentioned problems of previously proposed methods. By extracting the coordinates of hand joints from hand images captured by a camera, calculating the palm area using the coordinates of the joints constituting the palm, and estimating the distance to the hand through changes in the palm area, depth information can be acquired using images from a general visible-light camera, 3D hand gestures can be recognized, and user interaction with content becomes possible through the recognized 3D hand gestures.
  • The purpose is thus to provide a hand distance estimation method and device for an augmented reality glasses device that minimizes power consumption while enabling smooth user interaction.
  • According to a feature of the present invention to achieve the above object, a hand distance estimation method for an augmented reality glasses device, which estimates the distance to the hand through hand joint recognition from a single camera image of the augmented reality glasses device, comprises the steps of:
  • (1) extracting coordinates of hand joints in real time from a hand image captured by a camera of the augmented reality glasses device;
  • (2) recognizing a reference hand gesture requiring distance information using the hand joint coordinates extracted in step (1);
  • (3) setting the frame in which the reference hand gesture was recognized in step (2) as a reference time;
  • (4) calculating the palm area using the coordinates of the joints constituting the palm among the hand joint coordinates from the reference time; and
  • (5) estimating the distance to the hand through a change in the palm area relative to the palm area at the reference time.
  • Preferably, in step (4), the palm area can be calculated using the coordinates of the joint where the wrist connects to the back of the hand and the joints where each finger connects to the back of the hand.
  • More preferably, in step (1), the coordinates of 21 hand joints are extracted, and in step (4) the palm area can be calculated using the seven coordinates that constitute the palm among the 21 hand joint coordinates.
  • Preferably, in step (5), the ratio of the palm area at the current time to the palm area at the reference time is calculated, and the distance to the hand can be estimated from the calculated ratio using a deep learning model that estimates the distance to the hand based on the palm-area ratio.
  • More preferably, in step (5), the distance to the hand can be estimated when the calculated ratio is in the range of greater than 0 and less than or equal to 1.
  • A hand distance estimation device for an augmented reality glasses device according to a feature of the present invention to achieve the above object, which estimates the distance to the hand through hand joint recognition from a single camera image of the augmented reality glasses device, comprises:
  • a coordinate extraction unit that extracts coordinates of hand joints in real time from a hand image captured by a camera of the augmented reality glasses device;
  • a hand gesture recognition unit that recognizes a reference hand gesture requiring distance information using the hand joint coordinates extracted by the coordinate extraction unit;
  • a reference setting unit that sets the frame in which the reference hand gesture is recognized by the hand gesture recognition unit as a reference time;
  • an area calculation unit that calculates the palm area using the coordinates of the joints constituting the palm among the hand joint coordinates from the reference time; and
  • a distance estimation unit that estimates the distance to the hand through a change in the palm area relative to the palm area at the reference time.
  • Preferably, the distance estimation unit calculates the ratio of the palm area at the current time to the palm area at the reference time, and can estimate the distance to the hand from the calculated ratio using a deep learning model that estimates the distance to the hand based on the palm-area ratio.
  • More preferably, the distance estimation unit can estimate the distance to the hand when the calculated ratio is in the range of greater than 0 and less than or equal to 1.
  • According to the hand distance estimation method and device for an augmented reality glasses device proposed in the present invention, the coordinates of the hand joints are extracted from the hand image captured by the camera, the palm area is calculated using the coordinates of the joints constituting the palm, and the distance to the hand is estimated through changes in the palm area.
  • In this way, 3D hand gestures can be recognized by acquiring depth information using images from a general visible-light camera, and user interaction with content becomes possible through the recognized 3D hand gestures, allowing smooth user interaction while minimizing power consumption.
  • Figure 1 is a diagram showing the configuration of a hand distance estimation device for an augmented reality glasses device according to an embodiment of the present invention.
  • Figure 2 is a diagram showing the overall configuration of an augmented reality glasses device including a hand distance estimation device of the augmented reality glasses device according to an embodiment of the present invention.
  • Figure 3 is a diagram illustrating the flow of a hand distance estimation method for an augmented reality glasses device according to an embodiment of the present invention.
  • Figure 4 is a diagram showing the algorithm structure of a hand distance estimation method for an augmented reality glasses device according to an embodiment of the present invention.
  • Figure 5 is a diagram showing hand joints for extracting coordinates in step S110 of the hand distance estimation method for the augmented reality glasses device according to an embodiment of the present invention.
  • Figure 6 is a diagram illustrating, as an example, the palm area calculated in step S140 of the hand distance estimation method for the augmented reality glasses device according to an embodiment of the present invention.
  • Figure 7 is a diagram illustrating the change in palm area as an example in step S150 of the hand distance estimation method for the augmented reality glasses device according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating control of the augmented reality glasses device using hand gestures in the method for estimating the hand distance of the augmented reality glasses device according to an embodiment of the present invention.
  • Figure 1 is a diagram showing the configuration of a hand distance estimation device 100 of an augmented reality glasses device according to an embodiment of the present invention.
  • the hand distance estimation device 100 of the augmented reality glasses device according to an embodiment of the present invention estimates the distance to the hand through hand joint recognition from the single camera image of the augmented reality glasses device.
  • More specifically, the hand distance estimation device 100 for the augmented reality glasses device may comprise: a coordinate extraction unit 110 that extracts coordinates of hand joints in real time from a hand image captured by a camera of the augmented reality glasses device; a hand gesture recognition unit 120 that recognizes a reference hand gesture requiring distance information using the hand joint coordinates extracted by the coordinate extraction unit 110; a reference setting unit 130 that sets the frame in which the reference hand gesture is recognized by the hand gesture recognition unit 120 as the reference time; an area calculation unit 140 that calculates the palm area using the coordinates of the joints constituting the palm among the hand joint coordinates from the reference time; and a distance estimation unit 150 that estimates the distance to the hand through a change in the palm area relative to the palm area at the reference time.
  • At this time, the distance estimation unit 150 calculates the ratio of the palm area at the current time to the palm area at the reference time, and can estimate the distance to the hand from the calculated ratio using a deep learning model that estimates the distance to the hand based on the palm-area ratio.
  • the distance estimation unit 150 can estimate the distance to the hand when the calculated ratio is in the range of greater than 0 and less than or equal to 1.
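Taken together, the five units form a simple per-frame pipeline. The sketch below illustrates that flow under stated assumptions: the trained models for joint extraction (unit 110), reference-gesture classification (unit 120), and ratio-to-distance regression (unit 150) are supplied as callables, and all names here are illustrative rather than taken from the patent.

```python
from typing import Callable, Optional
import numpy as np

def _palm_area(joints: np.ndarray) -> float:
    # Shoelace formula over the seven palm joints (0, 1, 2, 5, 9, 13, 17);
    # expanded in the step S140 sketch further below.
    p = joints[[0, 1, 2, 5, 9, 13, 17]]
    x, y = p[:, 0], p[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

class HandDistancePipeline:
    """Per-frame flow of FIG. 1: units 110 (coordinates), 120 (gesture),
    130 (reference time), 140 (palm area), and 150 (distance)."""

    def __init__(self,
                 extract_joints: Callable[[np.ndarray], np.ndarray],
                 is_reference_gesture: Callable[[np.ndarray], bool],
                 ratio_to_distance: Callable[[float], float]) -> None:
        self.extract_joints = extract_joints
        self.is_reference_gesture = is_reference_gesture
        self.ratio_to_distance = ratio_to_distance
        self.ref_area: Optional[float] = None  # palm area at the reference time

    def process_frame(self, frame: np.ndarray) -> Optional[float]:
        joints = self.extract_joints(frame)           # unit 110
        if self.ref_area is None:
            if self.is_reference_gesture(joints):     # unit 120
                self.ref_area = _palm_area(joints)    # unit 130: reference set
            return None                               # no distance estimate yet
        ratio = _palm_area(joints) / self.ref_area    # unit 140 plus the ratio
        if not 0.0 < ratio <= 1.0:                    # valid range only
            return None
        return self.ratio_to_distance(ratio)          # unit 150
```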
  • FIG. 2 is a diagram showing the overall configuration of the augmented reality glasses device 10 including the hand distance estimation device 100 of the augmented reality glasses device 10 according to an embodiment of the present invention.
  • As shown in FIG. 2, the augmented reality glasses device 10 may be configured to include the hand distance estimation device 100, a camera unit 200, and an optical display unit 300.
  • the camera unit 200 may be configured as a general visible light camera provided to face the wearer's viewing direction. More specifically, the camera unit 200 can acquire a hand image by photographing the wearer's hand from the front of the augmented reality glass device 10.
  • the hand image is a general two-dimensional image and may be in the form of a video composed of a plurality of frames.
  • The hand image acquired by the camera unit 200 is transmitted to the hand distance estimation device 100, which estimates the distance to the hand captured in the image. Based on this depth information (the distance from the augmented reality glasses device 10 to the hand), 3D hand gestures can be recognized and applied to the user's interaction with objects.
  • The optical display unit 300 is disposed in front of both eyes of the wearer and can provide augmented reality to the wearer by transmitting at least a portion of the image light output from the display 310 toward the wearer's eyes. That is, the optical display unit 300 is the component corresponding to the AR or XR (eXtended Reality) glasses of the augmented reality glasses device 10 and, as shown in FIG. 2, may be configured to include the display 310 and the optical system 320.
  • The display 310 may output image light so that image information can be provided to the wearer. More specifically, the display 310 is coupled to the optical system 320, described in detail below, so that image information can be provided to the wearer; it outputs the image light that the optical system 320 transmits toward the wearer's eyes, and may be composed of a pair of displays 310 for both eyes.
  • the display 310 may be configured in various ways, such as OLED or Liquid Crystal on Silicon (LCoS).
  • The optical system 320 is disposed in front of both eyes of the wearer wearing the augmented reality glasses device 10 and can provide augmented reality by combining real-world light and image light. More specifically, the optical system 320 can provide augmented reality by transmitting at least a portion of the real-world light in the wearer's field of view while directing the image light output from the display 310 toward the wearer's eyes. That is, the optical system 320 may be configured so that a wearer wearing the augmented reality glasses device 10 can experience augmented reality.
  • the optical system 320 is composed of a plurality of lenses and mirrors and can be implemented in various ways, for example, an optical diffraction method, a beam splitter method, a pin mirror method, etc.
  • FIG. 3 is a diagram illustrating the flow of a hand distance estimation method for the augmented reality glasses device 10 according to an embodiment of the present invention.
  • As shown in FIG. 3, each step of the hand distance estimation method for the augmented reality glasses device 10 according to an embodiment of the present invention is performed in the augmented reality glasses device 10, which estimates the distance to the hand through hand joint recognition from a single camera image.
  • The method may be implemented to include extracting coordinates of hand joints in real time from a hand image (S110), recognizing a reference hand gesture requiring distance information (S120), setting the frame in which the reference hand gesture was recognized as the reference time (S130), calculating the palm area using the coordinates of the joints constituting the palm (S140), and estimating the distance to the hand through changes in the palm area (S150).
  • FIG. 4 is a diagram showing the algorithm structure of a hand distance estimation method for the augmented reality glasses device 10 according to an embodiment of the present invention.
  • each step of the hand distance estimation method of the augmented reality glasses device 10 according to an embodiment of the present invention will be described in detail with reference to FIGS. 3 and 4.
  • In step S110, the coordinates of the hand joints can be extracted in real time from the hand image captured by the camera of the augmented reality glasses device 10. More specifically, as shown in FIG. 4, in step S110 a feature map is extracted from the hand image captured by the camera unit 200 using Base Net, an artificial intelligence model that extracts feature maps, and the coordinates of the hand joints can then be extracted by inputting the extracted feature map into Keypoints Net, an artificial intelligence model that extracts key points from the feature map.
  • ResNet, Inception, MobileNet, etc. can be used as the Base Net.
  • Keypoints Net can be composed of a regression model that derives the 21 (x,y) coordinates that make up the hand through a combination of a convolution layer and a dense layer.
  • the Base Net and Keypoints Net can be integrated and learned to form a single hand joint extraction model.
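As a concrete illustration, the following is a minimal PyTorch sketch of such an integrated hand joint extraction model. The MobileNetV2 backbone and the layer widths are assumptions made for the example; the patent specifies only that a Base Net (e.g., ResNet, Inception, MobileNet) produces a feature map and that a Keypoints Net regresses the 21 (x, y) coordinates through convolution and dense layers.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class KeypointsNet(nn.Module):
    """Regresses 21 (x, y) hand joint coordinates from a feature map."""

    def __init__(self, in_channels: int) -> None:
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),       # collapse the spatial dimensions
        )
        self.dense = nn.Sequential(
            nn.Flatten(),
            nn.Linear(256, 128),
            nn.ReLU(inplace=True),
            nn.Linear(128, 21 * 2),        # 21 joints x (x, y)
        )

    def forward(self, feature_map: torch.Tensor) -> torch.Tensor:
        return self.dense(self.conv(feature_map)).view(-1, 21, 2)

class HandJointExtractor(nn.Module):
    """Base Net (here MobileNetV2 features) + Keypoints Net, trainable end to end."""

    def __init__(self) -> None:
        super().__init__()
        self.base_net = models.mobilenet_v2(weights=None).features
        self.keypoints_net = KeypointsNet(in_channels=1280)  # MobileNetV2 output

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.keypoints_net(self.base_net(image))

# Usage: a batch of 224x224 RGB frames yields an (N, 21, 2) coordinate tensor.
joints = HandJointExtractor()(torch.randn(1, 3, 224, 224))
```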
  • Figure 5 is a diagram showing hand joints for which coordinates are extracted in step S110 of the hand distance estimation method of the augmented reality glasses device 10 according to an embodiment of the present invention. As shown in FIG. 5, in step S110 of the hand distance estimation method for the augmented reality glasses device 10 according to an embodiment of the present invention, the coordinates of 21 hand joints can be extracted.
  • In step S120, a reference hand gesture requiring distance information can be recognized using the hand joint coordinates extracted in step S110. That is, in step S120, the 21 hand joint coordinates extracted in step S110 are input into the Classification Net to classify the hand gesture, thereby determining whether the hand captured in the hand image is making a preset reference hand gesture.
  • the hand distance can be estimated when the wearer makes a specific reference hand gesture.
  • The reference hand gesture is a gesture that starts an interaction requiring depth information (distance information to the hand); it may be a starting motion in which the wearer moves the hand toward an object for interaction with that object in augmented reality or extended reality content.
  • the specific hand gestures can be set in advance. For example, the wearer can make a reference hand gesture by spreading the hand so that the back of the hand is visible, holding it in front of the eye, and looking at it for a predetermined period of time.
  • The reference hand gesture may vary depending on the type of interaction, user settings, and so on.
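A minimal sketch of this classification step is shown below. The MLP structure, the number of gesture classes, and the class index of the reference gesture are illustrative assumptions; the patent states only that the 21 joint coordinates are input into a Classification Net to determine whether the preset reference hand gesture is being made.

```python
import torch
import torch.nn as nn

class ClassificationNet(nn.Module):
    """Classifies a hand gesture from the 21 (x, y) joint coordinates."""

    def __init__(self, num_gestures: int = 8) -> None:
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Flatten(),                  # (N, 21, 2) -> (N, 42)
            nn.Linear(42, 64),
            nn.ReLU(inplace=True),
            nn.Linear(64, num_gestures),   # per-class scores
        )

    def forward(self, joints: torch.Tensor) -> torch.Tensor:
        return self.mlp(joints)

REFERENCE_GESTURE = 0  # hypothetical class index of the reference gesture

def is_reference_frame(joints: torch.Tensor, net: ClassificationNet) -> bool:
    """True when the frame shows the reference gesture, triggering step S130."""
    return net(joints).argmax(dim=-1).item() == REFERENCE_GESTURE
```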
  • In step S130, the frame in which the reference hand gesture was recognized in step S120 can be set as the reference time. That is, since recognition of the reference hand gesture in step S120 indicates a gesture that requires estimating the distance to the hand, in step S130 the moment the reference hand gesture is recognized can be set as the reference for distance estimation.
  • In step S140, the palm area can be calculated using the coordinates of the joints constituting the palm among the hand joint coordinates from the reference time. That is, once the reference time has been set in step S130, the palm area can be calculated as the hand moves in the depth direction.
  • the hand closer to the augmented reality glass device 10 is photographed larger, and the farther away the hand is from the augmented reality glass device 10, the smaller the hand is photographed.
  • In addition, the finger portions change significantly with hand movement, whereas the points corresponding to the palm are less sensitive to such movement changes.
  • Accordingly, a deep learning model that estimates the distance from the augmented reality glasses device 10 to the hand based on the palm area can be created through training.
  • the palm area can be calculated using the coordinates of the joint where the wrist is connected to the back of the hand and the joint where each finger is connected to the back of the hand. That is, in step S140, the area of the palm can be calculated using the coordinates of 7 joints 0, 1, 2, 5, 9, 13, and 17 that make up the palm, among 21 joints as shown in FIG. 5.
  • FIG. 6 is a diagram illustrating, as an example, the palm area calculated in step S140 of the hand distance estimation method of the augmented reality glasses device 10 according to an embodiment of the present invention.
  • As shown in FIG. 6, a polygon (the red polygon in the figure) is constructed from the coordinates of the seven joints constituting the palm for the reference hand gesture, and its area can be derived using image processing techniques.
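One standard technique for the area of such a polygon is the shoelace formula; the sketch below applies it to the seven palm joints, assuming the (x, y) coordinates from step S110 are in pixel units and the vertices are ordered so that the polygon does not self-intersect.

```python
import numpy as np

PALM_JOINTS = [0, 1, 2, 5, 9, 13, 17]  # palm joint indices from FIG. 5

def palm_area(joints: np.ndarray) -> float:
    """Area of the palm polygon from a (21, 2) array of joint coordinates."""
    poly = joints[PALM_JOINTS]           # the red polygon of FIG. 6
    x, y = poly[:, 0], poly[:, 1]
    # Shoelace formula: area of a simple polygon from its ordered vertices.
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
```

Equivalently, OpenCV's cv2.contourArea could be applied to the same seven vertices.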
  • In step S150, the distance to the hand can be estimated through changes in the palm area relative to the palm area at the reference time. More specifically, in step S150, the ratio of the palm area at the current time to the palm area at the reference time is calculated, and the distance to the hand can be estimated from the calculated ratio using a deep learning model that estimates the distance based on the palm-area ratio.
  • That is, in step S150, with the palm area at the reference time at which the reference hand gesture was recognized (Area of Ref. Palm) as the denominator, and the current palm area as the hand moves in the depth direction after the reference time (Area of Cur. Palm) as the numerator, the ratio can be calculated as in Equation 1 below:
  • ratio = (Area of Cur. Palm) / (Area of Ref. Palm)   (Equation 1)
  • FIG. 7 is a diagram illustrating the change in palm area as an example in step S150 of the hand distance estimation method for the augmented reality glasses device 10 according to an embodiment of the present invention.
  • As shown in FIG. 7, the wearer makes the reference hand gesture in front of the eyes and then moves the hand away in the depth direction.
  • In this case, the ratio between the palm area at the reference time, shown on the left side of FIG. 7, and the palm area at the current time after the movement, shown on the right side of FIG. 7, can be calculated. At this time, the calculated ratio falls in the range of greater than 0 and less than or equal to 1.
  • FIG. 8 is a diagram illustrating the control of the augmented reality glasses device 10 using hand gestures in the method for estimating the hand distance of the augmented reality glasses device 10 according to an embodiment of the present invention.
  • As shown in FIG. 8, for the wearer to interact with an object in augmented reality, the wearer's hand must move closer to the object. That is, FIG. 8 shows an object and a hand gradually getting closer from the top picture to the bottom picture over time, and most user interactions occur in this situation.
  • In step S150, the ratio calculated from the palm areas obtained in step S140 is input into a distance estimation model trained by deep learning, and the distance to the hand output by the distance estimation model can be obtained.
  • Here, the estimated distance to the hand is depth information and can be combined with the hand joint coordinates extracted in step S110 to recognize 3D hand gestures.
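The following sketch shows how such a ratio-to-distance regressor could be wired up. The two-layer MLP and its dimensions are assumptions; the patent specifies only a deep learning model that maps the palm-area ratio to a distance, applied when the ratio lies in the range greater than 0 and less than or equal to 1.

```python
import torch
import torch.nn as nn

class DistanceEstimationModel(nn.Module):
    """Regresses the distance to the hand from the palm-area ratio."""

    def __init__(self) -> None:
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(1, 32),
            nn.ReLU(inplace=True),
            nn.Linear(32, 1),              # estimated distance to the hand
        )

    def forward(self, ratio: torch.Tensor) -> torch.Tensor:
        return self.mlp(ratio)

def estimate_distance(area_cur: float, area_ref: float,
                      model: DistanceEstimationModel):
    """Equation 1 plus the range check of the preferred embodiment."""
    ratio = area_cur / area_ref            # Equation 1
    if not 0.0 < ratio <= 1.0:             # estimate only within (0, 1]
        return None
    with torch.no_grad():
        return model(torch.tensor([[ratio]])).item()
```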
  • As described above, according to the hand distance estimation method and device for an augmented reality glasses device proposed in the present invention, the coordinates of the hand joints are extracted from the hand image captured by the camera, the palm area is calculated using the coordinates of the joints constituting the palm, and the distance to the hand is estimated through changes in the palm area.
  • In this way, 3D hand gestures can be recognized by acquiring depth information using images from a general visible-light camera, and the recognized 3D hand gestures enable user interaction with content, ensuring smooth user interaction while minimizing power consumption.
  • the present invention may include a computer-readable medium containing program instructions for performing operations implemented in various communication terminals.
  • Computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM and DVD; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Such computer-readable media may include program instructions, data files, data structures, etc., singly or in combination.
  • program instructions recorded on a computer-readable medium may be specially designed and configured to implement the present invention, or may be known and available to those skilled in the computer software art.
  • Program instructions may include not only machine code such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Databases & Information Systems (AREA)
  • Optics & Photonics (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the method and device for estimating hand distance for augmented reality glasses presented in the present invention, the coordinates of hand joints are extracted from a hand image captured by a camera, and a palm area is calculated using the coordinates of the joints constituting the palm so that the distance to the hand is estimated through a change in the palm area. Three-dimensional hand gestures can thus be recognized using the image of a general visible-light camera so as to acquire depth information, and user interactions can be applied in content through the recognized three-dimensional hand gestures, so that the user can interact smoothly while power consumption is minimized.
PCT/KR2023/003632 2022-05-09 2023-03-19 Method and device for estimating hand distance for augmented reality glasses WO2023219254A1

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0056799 2022-05-09
KR1020220056799A KR102667189B1 (ko) 2022-05-09 2022-05-09 Method and device for estimating hand distance of an augmented reality glasses device

Publications (1)

Publication Number Publication Date
WO2023219254A1 2023-11-16

Family

ID=88730496

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/003632 WO2023219254A1 (fr) Method and device for estimating hand distance for augmented reality glasses

Country Status (2)

Country Link
KR (1) KR102667189B1 (fr)
WO (1) WO2023219254A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118298342A (zh) * 2024-06-03 2024-07-05 中国人民解放军国防科技大学 AR glasses target enhancement method combining an external depth camera

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005050177A (ja) * 2003-07-30 2005-02-24 Nissan Motor Co Ltd Non-contact information input device
CN108171133A (zh) * 2017-12-20 2018-06-15 华南理工大学 Dynamic gesture recognition method based on a feature covariance matrix
CN109359514A (zh) * 2018-08-30 2019-02-19 浙江工业大学 Combined gesture tracking and recognition strategy method for deskVR
KR102286018B1 (ko) * 2020-09-09 2021-08-05 주식회사 피앤씨솔루션 Wearable augmented reality device that inputs mouse events using hand gestures and mouse event input method of a wearable augmented reality device using hand gestures
CN113536931A (zh) * 2021-06-16 2021-10-22 海信视像科技股份有限公司 Hand pose estimation method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011142317A1 (fr) * 2010-05-11 2011-11-17 日本システムウエア株式会社 Gesture recognition device, method, program, and computer-readable medium storing the program

Also Published As

Publication number Publication date
KR20230157156A (ko) 2023-11-16
KR102667189B1 (ko) 2024-05-21

Similar Documents

Publication Publication Date Title
WO2019050360A1 (fr) Electronic device and method for automatic human segmentation in an image
WO2020036343A1 (fr) Electronic device and control method therefor
WO2020162673A1 (fr) Electronic device for generating an avatar animation and method therefor
WO2013129792A1 (fr) Method and portable terminal for correcting the gaze direction of a user in an image
WO2017082539A1 (fr) Apparatus and method for providing augmented reality for a user's aesthetics
WO2023219254A1 (fr) Method and device for estimating hand distance for augmented reality glasses
WO2016200102A1 (fr) Method and device for changing the focal point of a camera
WO2020180134A1 (fr) Image correction system and image correction method thereof
WO2016002986A1 (fr) Gaze tracking device and method, and recording medium for implementing the same
WO2019208851A1 (fr) Virtual reality interface method and apparatus allowing fusion with a real space
WO2022050668A1 (fr) Method for detecting hand motion of a wearable augmented reality device using a depth image, and wearable augmented reality device capable of detecting hand motion using a depth image
WO2018097632A1 (fr) Method and device for providing an image
KR20130034125A (ko) Glasses-type monitor with augmented reality function
WO2016064073A1 (fr) Smart glasses on which a display and a camera are mounted, and spatial touch input and correction method using the same
EP3167610A1 Display device having a scope of accreditation related to the depth of a virtual object, and control method therefor
WO2019112114A1 (fr) Glasses-type terminal and method for using the same
WO2019039870A1 (fr) Electronic device capable of controlling an image display effect, and image display method
WO2020145517A1 (fr) User authentication method and electronic device therefor
WO2022255641A1 (fr) Method and apparatus for improving hand motion recognition and voice command performance for the input interface of an augmented reality glasses device
WO2018164316A1 (fr) Omnidirectional image capturing method and device for performing the method
EP3545387A1 Method and device for providing an image
WO2021162353A1 (fr) Electronic device including a camera and operating method thereof
WO2018076454A1 (fr) Data processing method and corresponding device
WO2021221341A1 (fr) Augmented reality device and control method thereof
WO2024122801A1 (fr) Electronic device for displaying a visual object based on the position of an external electronic device, and method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23803682

Country of ref document: EP

Kind code of ref document: A1