US20180321776A1 - Method for acting on augmented reality virtual objects - Google Patents

Method for acting on augmented reality virtual objects

Info

Publication number
US20180321776A1
Authority
US
United States
Prior art keywords
augmented reality
camera
virtual
create
view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/773,248
Other languages
English (en)
Inventor
Vitaly Vitalyevich AVERYANOV
Andrey KOMISSAROV
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Devar Entertainment Ltd
Original Assignee
Devar Entertainment Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Devar Entertainment Ltd filed Critical Devar Entertainment Ltd
Publication of US20180321776A1
Legal status: Abandoned

Classifications

    • G Physics
    • G06 Computing; Calculating or Counting
    • G06F Electric Digital Data Processing
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06T Image Data Processing or Generation, in General
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10024 Color image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G06T2207/30241 Trajectory
    • G06T2207/30244 Camera pose
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2016 Rotation, translation, scaling

Definitions

  • The present invention relates to methods of influencing augmented reality virtual objects in which markers of a real three-dimensional space are determined from images obtained from the video camera of a device adapted to create and view augmented reality; a physical base coordinate system tied to the spatial position of the markers of the real three-dimensional space is formed; the coordinates of the device are determined relative to the base coordinate system; the coordinates of the three-dimensional virtual objects of the augmented reality are specified in the base coordinate system; and the specified actions for modifying all or part of the objects from the generated set of augmented reality virtual objects are performed by means of the user's motion.
  • A virtual object is an object created by technical means and conveyed to a person through the senses: sight, hearing, and others.
  • A point of interest (characteristic point) is an image point with high local information content.
  • Various formal criteria for selecting such points have been proposed, called interest operators.
  • An interest operator must ensure sufficiently accurate positioning of the point in the image plane. The position of the point of interest must also be sufficiently robust to photometric and geometric distortions of the image, including uneven brightness changes, shift, rotation, change of scale, and angular distortions.
  • The Kalman filter is an efficient recursive filter that estimates the state vector of a dynamic system from a series of incomplete and noisy measurements.
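By way of illustration (not code from the patent disclosure), a minimal one-dimensional Kalman filter sketch; the process-noise and measurement-noise parameters q and r are illustrative assumptions:

```python
# Minimal 1D Kalman filter: estimate a scalar state from noisy measurements.
# q: process-noise variance, r: measurement-noise variance (both assumed).

def kalman_1d(measurements, q=1e-4, r=0.1 ** 2):
    x, p = 0.0, 1.0          # initial state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + q            # predict: variance grows by process noise
        k = p / (p + r)      # Kalman gain: weight given to the measurement
        x = x + k * (z - x)  # update estimate toward the measurement
        p = (1 - k) * p      # updated (reduced) estimate variance
        estimates.append(x)
    return estimates
```

The same predict/update structure generalizes to the multi-dimensional state vectors (position, orientation) used in camera tracking.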
  • Image pyramids are a collection of images obtained from the original image by successive downscaling until a stopping point is reached (the final level may be a single pixel).
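A minimal sketch of such a pyramid (illustrative, not from the patent; it assumes a grayscale image with power-of-two dimensions, stored as a list of rows):

```python
# Build an image pyramid by repeated 2x2 averaging until one pixel remains.

def halve(img):
    # Downscale by a factor of two, averaging each 2x2 block of pixels.
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2 * y][2 * x] + img[2 * y][2 * x + 1] +
              img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
             for x in range(w)] for y in range(h)]

def build_pyramid(img):
    levels = [img]
    while len(levels[-1]) > 1 and len(levels[-1][0]) > 1:
        levels.append(halve(levels[-1]))
    return levels
```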
  • A smartphone (English "smartphone": smart phone) is a phone supplemented by the functionality of a pocket personal computer.
  • In the known art, markers of real three-dimensional space are determined from the images obtained from the video camera of a device adapted to create and view augmented reality; a physical base coordinate system tied to the spatial position of the markers is formed; the coordinates of the device are determined relative to the base coordinate system; the coordinates of the three-dimensional virtual objects of the augmented reality are specified in the base coordinate system; and the specified actions for modifying all or part of the objects from the generated set of augmented reality virtual objects are performed (see the description of Russian patent for invention No. 2451982 of May 27, 2012).
  • This method is the closest in technical essence and achieved technical result, and is chosen as the prototype of the proposed invention.
  • The present invention is mainly aimed at proposing a method for influencing augmented reality virtual objects in which markers of real three-dimensional space are determined from images obtained from the video camera of a device adapted to create and view augmented reality; a physical base coordinate system tied to the spatial position of the markers is formed; the coordinates of the device are determined relative to the base coordinate system; the coordinates of the three-dimensional virtual objects of the augmented reality are specified in the base coordinate system; and the said actions for modifying all or part of the objects from the generated set of augmented reality virtual objects are performed by means of user motion. This allows at least one of the above-mentioned shortcomings of the prior art to be smoothed out, namely by achieving additional interaction with virtual objects: changing the position of the device used to create and view the augmented reality is associated with additional reactions of the virtual object, beyond simply changing the orientation of the virtual object on the device's display, thereby achieving the technical objective.
  • The coordinates of the device adapted to create and view augmented reality are determined relative to the actual physical marker by analyzing the image from the device camera. A virtual camera is placed at the calculated coordinates of the device relative to the physical base coordinate system, so that the marker located in its field of view is visible in the same way as the physical marker located in the field of view of the physical camera of the device. The vector corresponding to the direction from the marker to the virtual camera is calculated in real time, and information about all movements of the camera relative to the marker (turning, approaching, and tilting) is generated by successive iteration in real time.
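The recovery of the device camera's position relative to the marker can be sketched as follows (an illustrative computation, not the patent's implementation; it assumes a pose estimator has already produced the marker's rotation matrix r_cm and translation t_cm in the camera frame):

```python
# Given the marker pose in the camera frame (x_cam = R @ x_marker + t),
# the camera center in the marker's base coordinate system is -R^T @ t.
# That position is also the vector from the marker origin to the camera.

def transpose(m):
    return [[m[r][c] for r in range(3)] for c in range(3)]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def camera_in_marker_frame(r_cm, t_cm):
    # Invert the rigid transform: solve 0 = R @ x + t for x.
    rt = transpose(r_cm)
    return [-c for c in mat_vec(rt, t_cm)]
```

With an identity rotation and the marker 5 units in front of the camera, the camera sits at -5 along the marker's Z axis, as expected.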
  • The vector can be specified in any convenient way: not only by direction, but also by three coordinates, by one or more coordinates together with one or more angles, by polar coordinates, by Euler angles, or by quaternions.
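As an example of moving between two of the listed representations (an illustrative sketch, not part of the patent), the standard conversion from Euler angles to a unit quaternion, assuming the common ZYX (yaw-pitch-roll) convention:

```python
import math

def euler_to_quaternion(roll, pitch, yaw):
    # Angles in radians; returns (w, x, y, z) of a unit quaternion.
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (cr * cp * cy + sr * sp * sy,   # w
            sr * cp * cy - cr * sp * sy,   # x
            cr * sp * cy + sr * cp * sy,   # y
            cr * cp * sy - sr * sp * cy)   # z
```

Quaternions avoid the gimbal-lock ambiguity of Euler angles, which is why they are commonly preferred for interpolating camera orientation.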
  • The SIFT (Scale-Invariant Feature Transform) method detects and describes local features of an image.
  • The features obtained by means of it are invariant with respect to scale and rotation, and are robust to a number of affine transformations and to noise.
  • PCA-SIFT (Principal Component Analysis SIFT) is a modification of SIFT.
  • SURF (Speeded Up Robust Features) is another descriptor of this family.
  • Integral images are used to accelerate the search for points of interest.
  • The value at each point of the integral image is calculated as the sum of the value at the given point and the values of all points above and to the left of it.
  • Using integral images, so-called rectangular filters, consisting of several rectangular regions, are computed.
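The integral-image construction and a constant-time rectangular sum can be sketched as follows (illustrative code, not from the patent):

```python
# Each cell of the integral image holds the sum of all pixels above and to
# the left (inclusive); any rectangular sum then needs only four lookups.

def integral_image(img):
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]   # zero border simplifies edges
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = (img[y][x] + ii[y][x + 1]
                                + ii[y + 1][x] - ii[y][x])
    return ii

def box_sum(ii, x0, y0, x1, y1):
    # Sum of img rows y0..y1-1, columns x0..x1-1 via four corner lookups.
    return ii[y1][x1] - ii[y0][x1] - ii[y1][x0] + ii[y0][x0]
```

This is the mechanism that lets rectangular filters be evaluated at any scale in constant time per region.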
  • The MSER and LLD methods are the most invariant to affine transformations and changes of scale; both normalize the six parameters of affine distortion. Considering MSER in more detail: the name ("maximally stable extremal regions") derives from sorting the characteristic points by intensity (at the lower and upper levels). A pyramid is constructed in which the first image, corresponding to the minimum intensity value, is entirely white, and the last level, corresponding to the maximum intensity value, is entirely black.
  • Harris-Affine normalizes the parameters of affine transformations. Harris uses corners as characteristic regions and identifies key points in scale space, using the approach proposed by Lindeberg. Affine normalization is carried out by an iterative procedure that evaluates the parameters of an elliptical region and normalizes them. At each iteration of the elliptical-region estimation, the difference between the eigenvalues of the second-moment matrix of the selected region is minimized, the elliptical region is normalized to a circular one, and the key point and its scale in scale space are re-estimated.
  • Hessian-Affine uses blobs instead of corners as characteristic regions.
  • The local maxima of the determinant of the Hessian matrix are used as the base points.
  • Otherwise the method is the same as Harris-Affine.
  • The SIFT detector normalizes rotation and translation, and simulates all views of the images being matched (both the search image and the query image).
  • GLOH (Gradient Location-Orientation Histogram).
  • DAISY was originally introduced to solve the problem of matching images under significant external changes; in contrast to the previously discussed descriptors, it operates on a dense set of pixels of the entire image.
  • The BRIEF descriptor (Binary Robust Independent Elementary Features) provides recognition of identical parts of an image taken from different points of view, while minimizing the number of computations performed.
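A toy sketch of the BRIEF idea (the pair layout and patch size are illustrative assumptions, not the published sampling pattern): each bit records an intensity comparison between two pre-chosen pixels, and descriptors are matched by Hamming distance:

```python
# Build a binary descriptor from pairwise intensity tests on a patch.

def brief_descriptor(patch, pairs):
    # One bit per pair: is pixel A darker than pixel B?
    bits = 0
    for (y1, x1), (y2, x2) in pairs:
        bits = (bits << 1) | (1 if patch[y1][x1] < patch[y2][x2] else 0)
    return bits

def hamming(a, b):
    # Number of differing bits; small distance means similar patches.
    return bin(a ^ b).count("1")
```

Because matching reduces to XOR and a popcount, BRIEF is far cheaper to compare than floating-point descriptors such as SIFT's 128-vector.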
  • The recognition algorithm reduces to the construction of a random forest (randomized classification trees) or a naive Bayesian classifier on some training set of images, followed by classification of test image regions.
  • image analysis from the device camera is performed by an image classifier algorithm.
  • image analysis from the device camera is performed by the Kalman Filter algorithm.
  • FIG. 1 shows a diagram of an apparatus for interacting with virtual objects according to the invention.
  • FIG. 2 schematically shows the steps of a method of interacting with virtual objects according to the invention.
  • The object marker is designated as 1.
  • The device adapted for creating and viewing the augmented reality is designated as 2; the figure further shows its video camera 21 and display 22.
  • Devices such as a smartphone, a computer tablet, or augmented reality glasses can be used as the device adapted for creating and viewing augmented reality.
  • the image obtained from the video camera of the device adapted to create and view augmented reality is shown as 23 .
  • The physical base coordinate system associated with the marker is designated as OmXmYmZm.
  • The coordinates of the device 2 adapted for creating and viewing the augmented reality are determined relative to the base coordinate system, while the device 2 itself has its own coordinate system OnXnYnZn.
  • a vector corresponding to the direction from marker 1 to virtual camera 21 is designated as R.
  • Step A1. Identify the markers of real three-dimensional space from the images obtained from the video camera of the device adapted to create and view augmented reality.
  • A marker can be any figure or object, but in practice the choice is limited by the resolution of the web camera (or phone camera), color rendering, lighting, and the processing power of the equipment, since everything happens in real time and must therefore be processed quickly; for this reason a black-and-white marker of simple form is usually selected.
  • Step A2. Form a physical base coordinate system tied to the spatial position of the markers of the real three-dimensional space.
  • Step A3. Specify the coordinates of the three-dimensional virtual objects of augmented reality in the base coordinate system.
  • Step A4. Determine the coordinates of the device adapted to create and view the augmented reality relative to the base coordinate system by analyzing the image from the camera of the device.
  • Step A5. Repeat the above actions at each iteration of the computing module of the device adapted to create and view augmented reality. Aggregating the directions obtained at each iteration forms information about all the camera movements relative to the marker: turning, approaching, tilting, and so on.
  • Step A6. Perform, by means of user motion, the specified actions for modifying virtual objects, for all or part of the objects of the formed set of augmented reality virtual objects.
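The per-iteration aggregation described in step A5 can be sketched as follows (illustrative only; it assumes steps A1 to A4 already yield the camera position in the base coordinate system for each frame):

```python
# Accumulate camera motion relative to the marker from per-frame positions.

def track_motion(camera_positions):
    # camera_positions: per-frame (x, y, z) of the device camera in the
    # marker's base coordinate system. Returns per-frame displacement
    # vectors; their aggregation describes all camera movement relative
    # to the marker (approaching, shifting, and so on).
    deltas = []
    for prev, cur in zip(camera_positions, camera_positions[1:]):
        deltas.append(tuple(c - p for p, c in zip(prev, cur)))
    return deltas
```

A negative Z component in a delta, for instance, can be interpreted as the camera approaching the marker, which is the kind of movement the virtual object may react to.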
  • The sequence of steps is exemplary: operations may be rearranged, omitted, added, or performed simultaneously without losing the ability to interact with virtual objects. Examples of such operations are:
  • A character created as an augmented reality object (a person or an animal) can follow with its eyes the direction of the device adapted to create and view the augmented reality, creating for the user the illusion that this person or animal is watching him the way a real person or animal would.
  • The character can react accordingly, turning its body towards the user.
  • An interactive game in which the marker, in the role of augmented reality content, acts as a notional opponent shooting missiles toward the user at low speed. To win the game, the user must dodge the missiles by shifting the camera of the device adapted to create and view the augmented reality out of their trajectory.
  • The proposed method of interaction with virtual objects can be carried out by a person skilled in the art and, when implemented, ensures the achievement of the claimed purpose, which allows the conclusion that the "industrial applicability" criterion for the invention is met.
  • a prototype of a device for interacting with virtual objects is made in the form of a computer tablet having a display and a video camera.
  • The present invention achieves the stated objective of providing an additional way of interacting with virtual objects: changing the position of the device used to create and view the augmented reality is associated with additional reactions of the virtual object, beyond simply changing the orientation of the virtual object on the device display.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Multimedia (AREA)
  • Architecture (AREA)
  • Processing Or Creating Images (AREA)
US15/773,248 2015-11-18 2016-11-17 Method for acting on augmented reality virtual objects Abandoned US20180321776A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
RU2015149499 2015-11-18
RU2015149499A RU2617557C1 (ru) 2015-11-18 2015-11-18 Способ воздействия на виртуальные объекты дополненной реальности [Method for acting on augmented reality virtual objects]
PCT/RU2016/050070 WO2017086841A1 (ru) 2015-11-18 2016-11-17 Способ воздействия на виртуальные объекты дополненной реальности [Method for acting on augmented reality virtual objects]

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/RU2016/050070 A-371-Of-International WO2017086841A1 (ru) 2015-11-18 2016-11-17 Способ воздействия на виртуальные объекты дополненной реальности [Method for acting on augmented reality virtual objects]

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/876,789 Continuation-In-Part US11288837B2 (en) 2015-11-18 2020-05-18 Method of influencing virtual objects of augmented reality

Publications (1)

Publication Number Publication Date
US20180321776A1 true US20180321776A1 (en) 2018-11-08

Family

ID=58643243

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/773,248 Abandoned US20180321776A1 (en) 2015-11-18 2016-11-17 Method for acting on augmented reality virtual objects
US16/876,789 Active US11288837B2 (en) 2015-11-18 2020-05-18 Method of influencing virtual objects of augmented reality

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/876,789 Active US11288837B2 (en) 2015-11-18 2020-05-18 Method of influencing virtual objects of augmented reality

Country Status (6)

Country Link
US (2) US20180321776A1 (ru)
EP (1) EP3379396A4 (ru)
KR (1) KR20180107085A (ru)
CN (1) CN108369473A (ru)
RU (1) RU2617557C1 (ru)
WO (1) WO2017086841A1 (ru)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190050697A1 (en) * 2018-06-27 2019-02-14 Intel Corporation Localizing a vehicle's charging or fueling port - methods and apparatuses
US20200185242A1 (en) * 2017-05-31 2020-06-11 Fujikin Incorporated Management System, Method, and Computer Program for Semiconductor Fabrication Apparatus
US10692299B2 (en) * 2018-07-31 2020-06-23 Splunk Inc. Precise manipulation of virtual object position in an extended reality environment
CN112017297A (zh) * 2019-05-28 2020-12-01 中国商用飞机有限责任公司 一种增强现实的定位方法、装置、设备及介质 [An augmented reality positioning method, apparatus, device, and medium]
US10909772B2 (en) 2018-07-31 2021-02-02 Splunk Inc. Precise scaling of virtual objects in an extended reality environment
US10922892B1 (en) * 2019-04-30 2021-02-16 Splunk Inc. Manipulation of virtual object position within a plane of an extended reality environment
US11049082B2 (en) * 2018-04-06 2021-06-29 Robert A. Rice Systems and methods for item acquisition by selection of a virtual object placed in a digital environment
US11189000B2 (en) * 2019-06-24 2021-11-30 Intel Corporation Architecture to generate binary descriptor for image feature point
US11270456B2 (en) * 2018-05-31 2022-03-08 Beijing Boe Optoelectronics Technology Co., Ltd. Spatial positioning method, spatial positioning device, spatial positioning system and computer readable medium
WO2022080869A1 (ko) * 2020-10-14 2022-04-21 삼성전자 주식회사 이미지를 이용한 3차원 지도의 업데이트 방법 및 이를 지원하는 전자 장치 [Method for updating a three-dimensional map using images, and electronic device supporting the same]
US11341676B2 (en) * 2019-02-05 2022-05-24 Google Llc Calibration-free instant motion tracking for augmented reality
US11463619B2 (en) * 2019-03-13 2022-10-04 Canon Kabushiki Kaisha Image processing apparatus that retouches and displays picked-up image, image processing method, and storage medium

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2686029C2 (ru) * 2017-07-19 2019-04-23 Автономная некоммерческая образовательная организация высшего образования "Сколковский институт науки и технологий" Система виртуальной реальности на основе смартфона и наклонного зеркала [Virtual reality system based on a smartphone and an inclined mirror]
CN108917751B (zh) * 2018-03-30 2021-11-02 北京凌宇智控科技有限公司 一种免标定的定位方法及系统 [A calibration-free positioning method and system]
JP7227377B2 (ja) * 2018-12-12 2023-02-21 ホウメディカ・オステオニクス・コーポレイション 骨密度モデリング及び整形外科手術計画システム [Bone density modeling and orthopedic surgical planning system]
CN110956065B (zh) * 2019-05-11 2022-06-10 魔门塔(苏州)科技有限公司 一种用于模型训练的人脸图像处理方法及装置 [A face image processing method and apparatus for model training]
KR102358950B1 (ko) * 2020-10-05 2022-02-07 홍준표 모바일 스캔 객체 모델 스케일링을 통한 증강현실 구현 장치 및 방법 [Apparatus and method for implementing augmented reality by scaling a mobile-scanned object model]

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130141461A1 (en) * 2011-12-06 2013-06-06 Tom Salter Augmented reality camera registration
US20160055671A1 (en) * 2014-08-22 2016-02-25 Applied Research Associates, Inc. Techniques for Enhanced Accurate Pose Estimation
US20170083084A1 (en) * 2014-06-30 2017-03-23 Sony Corporation Information processing apparatus, information processing method, computer program, and image processing system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100927009B1 (ko) * 2008-02-04 2009-11-16 광주과학기술원 증강 현실에서의 햅틱 상호 작용 방법 및 그 시스템 [Haptic interaction method in augmented reality and system therefor]
RU2451982C1 (ru) * 2008-06-24 2012-05-27 Олег Станиславович Рурин Способ воздействия на виртуальные объекты [Method for acting on virtual objects]
US20100045701A1 (en) * 2008-08-22 2010-02-25 Cybernet Systems Corporation Automatic mapping of augmented reality fiducials
CN102254345A (zh) * 2011-06-30 2011-11-23 上海大学 基于云计算的自然特征注册方法 [Natural feature registration method based on cloud computing]
US9514570B2 (en) * 2012-07-26 2016-12-06 Qualcomm Incorporated Augmentation of tangible objects as user interface controller
US10410419B2 (en) * 2015-03-02 2019-09-10 Virtek Vision International Ulc Laser projection system with video overlay

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130141461A1 (en) * 2011-12-06 2013-06-06 Tom Salter Augmented reality camera registration
US20170083084A1 (en) * 2014-06-30 2017-03-23 Sony Corporation Information processing apparatus, information processing method, computer program, and image processing system
US20160055671A1 (en) * 2014-08-22 2016-02-25 Applied Research Associates, Inc. Techniques for Enhanced Accurate Pose Estimation

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10998211B2 (en) * 2017-05-31 2021-05-04 Fujikin Inc. Management system, method, and computer program for semiconductor fabrication apparatus
US20200185242A1 (en) * 2017-05-31 2020-06-11 Fujikin Incorporated Management System, Method, and Computer Program for Semiconductor Fabrication Apparatus
US11049082B2 (en) * 2018-04-06 2021-06-29 Robert A. Rice Systems and methods for item acquisition by selection of a virtual object placed in a digital environment
US11270456B2 (en) * 2018-05-31 2022-03-08 Beijing Boe Optoelectronics Technology Co., Ltd. Spatial positioning method, spatial positioning device, spatial positioning system and computer readable medium
US20190050697A1 (en) * 2018-06-27 2019-02-14 Intel Corporation Localizing a vehicle's charging or fueling port - methods and apparatuses
US11003972B2 (en) * 2018-06-27 2021-05-11 Intel Corporation Localizing a vehicle's charging or fueling port—methods and apparatuses
US10909772B2 (en) 2018-07-31 2021-02-02 Splunk Inc. Precise scaling of virtual objects in an extended reality environment
US11893703B1 (en) 2018-07-31 2024-02-06 Splunk Inc. Precise manipulation of virtual object position in an extended reality environment
US11410403B1 (en) 2018-07-31 2022-08-09 Splunk Inc. Precise scaling of virtual objects in an extended reality environment
US10692299B2 (en) * 2018-07-31 2020-06-23 Splunk Inc. Precise manipulation of virtual object position in an extended reality environment
US11430196B2 (en) 2018-07-31 2022-08-30 Splunk Inc. Precise manipulation of virtual object position in an extended reality environment
US11721039B2 (en) 2019-02-05 2023-08-08 Google Llc Calibration-free instant motion tracking for augmented reality
US11341676B2 (en) * 2019-02-05 2022-05-24 Google Llc Calibration-free instant motion tracking for augmented reality
US11463619B2 (en) * 2019-03-13 2022-10-04 Canon Kabushiki Kaisha Image processing apparatus that retouches and displays picked-up image, image processing method, and storage medium
US11696040B2 (en) 2019-03-13 2023-07-04 Canon Kabushiki Kaisha Image processing apparatus that retouches and displays picked-up image, image processing method, and storage medium
US11544911B1 (en) 2019-04-30 2023-01-03 Splunk Inc. Manipulation of virtual object position within a plane of an extended reality environment
US10922892B1 (en) * 2019-04-30 2021-02-16 Splunk Inc. Manipulation of virtual object position within a plane of an extended reality environment
US11790623B1 (en) * 2019-04-30 2023-10-17 Splunk Inc. Manipulation of virtual object position within a plane of an extended reality environment
CN112017297A (zh) * 2019-05-28 2020-12-01 中国商用飞机有限责任公司 一种增强现实的定位方法、装置、设备及介质 [An augmented reality positioning method, apparatus, device, and medium]
US11189000B2 (en) * 2019-06-24 2021-11-30 Intel Corporation Architecture to generate binary descriptor for image feature point
WO2022080869A1 (ko) * 2020-10-14 2022-04-21 삼성전자 주식회사 이미지를 이용한 3차원 지도의 업데이트 방법 및 이를 지원하는 전자 장치 [Method for updating a three-dimensional map using images, and electronic device supporting the same]

Also Published As

Publication number Publication date
KR20180107085A (ko) 2018-10-01
US20200279396A1 (en) 2020-09-03
CN108369473A (zh) 2018-08-03
WO2017086841A1 (ru) 2017-05-26
EP3379396A1 (en) 2018-09-26
EP3379396A4 (en) 2019-06-12
US11288837B2 (en) 2022-03-29
RU2617557C1 (ru) 2017-04-25

Similar Documents

Publication Publication Date Title
US11288837B2 (en) Method of influencing virtual objects of augmented reality
Chen et al. Sports camera calibration via synthetic data
CN108717531B (zh) 基于Faster R-CNN的人体姿态估计方法 [Human pose estimation method based on Faster R-CNN]
CN108875524B (zh) 视线估计方法、装置、系统和存储介质 [Gaze estimation method, apparatus, system, and storage medium]
CN108230383B (zh) 手部三维数据确定方法、装置及电子设备 [Hand three-dimensional data determination method, apparatus, and electronic device]
CN108292362A (zh) 用于光标控制的手势识别 [Gesture recognition for cursor control]
CN112070782B (zh) 识别场景轮廓的方法、装置、计算机可读介质及电子设备 [Method, apparatus, computer-readable medium, and electronic device for recognizing scene contours]
US10984610B2 (en) Method for influencing virtual objects of augmented reality
JP2020008972A (ja) 情報処理装置、情報処理方法及びプログラム [Information processing apparatus, information processing method, and program]
CN107292299B (zh) 基于内核规范相关分析的侧面人脸识别方法 [Profile face recognition method based on kernel canonical correlation analysis]
WO2014180108A1 (en) Systems and methods for matching face shapes
JP2012113438A (ja) 姿勢推定装置および姿勢推定プログラム [Posture estimation apparatus and posture estimation program]
US10713847B2 (en) Method and device for interacting with virtual objects
CN111353325A (zh) 关键点检测模型训练方法及装置 [Keypoint detection model training method and apparatus]
Xu et al. A novel method for hand posture recognition based on depth information descriptor
CN110288714B (zh) 一种虚拟仿真实验系统 [A virtual simulation experiment system]
Adam et al. Implementation of Object Tracking Augmented Reality Markerless using FAST Corner Detection on User Defined-Extended Target Tracking in Multivarious Intensities
WO2022266878A1 (zh) 景别确定方法、装置及计算机可读存储介质 [Shot type determination method, apparatus, and computer-readable storage medium]
CN115019396A (zh) 一种学习状态监测方法、装置、设备及介质 [A learning-state monitoring method, apparatus, device, and medium]
JPWO2019053790A1 (ja) 位置座標算出方法及び位置座標算出装置 [Position coordinate calculation method and position coordinate calculation apparatus]
KR20130081126A (ko) 손 제스처 인식 방법 및 그 장치 [Hand gesture recognition method and apparatus]
CN111222448A (zh) 图像转换方法及相关产品 [Image conversion method and related products]
Verma et al. Design of an Augmented Reality Based Platform with Hand Gesture Interfacing
Bermudez et al. Comparison of natural feature descriptors for rigid-object tracking for real-time augmented reality
CN111428665B (zh) 一种信息确定方法、设备及计算机可读存储介质 [An information determination method, device, and computer-readable storage medium]

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION