WO2015030623A1 - Methods and systems for locating substantially planar surfaces of a 3D scene - Google Patents

Methods and systems for locating substantially planar surfaces of a 3D scene

Info

Publication number
WO2015030623A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
coordinates
space
candidate
processor
Prior art date
Application number
PCT/RU2013/000761
Other languages
English (en)
Inventor
Evgeny Sergeevich SOLOGUB
Original Assignee
3Divi Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3Divi Company filed Critical 3Divi Company
Priority to PCT/RU2013/000761
Publication of WO2015030623A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/12 Edge-based segmentation
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/10 Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20112 Image segmentation details
    • G06T 2207/20164 Salient point detection; Corner detection
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Definitions

  • The 2D coordinates may include a first coordinate associated with an abscissa axis value (i.e., X) and a second coordinate associated with an ordinate axis value (i.e., Y).
  • The pixel represented as {X, Y, Z} would then include a depth value related to the X and Y coordinates (see the sketch after this list).
  • The present technology provides multiple benefits, including improved and more accurate virtual reality simulation as well as a better gaming experience, with enhanced representation of 3D virtual reality and of the user avatar.
  • Other features, aspects, examples, and embodiments are described below.
  • FIG. 1 shows an example 3D scene suitable for implementation of a real-time human-computer interface employing the present technology for detecting at least one substantially planar surface of the 3D scene.
  • FIG. 6 shows an example 3D coordinate system and one of the candidate planes created by the RANSAC process.
  • The techniques of the embodiments disclosed herein may be implemented using a variety of technologies.
  • The methods described herein may be implemented in software executing on a computer system, or in hardware utilizing microprocessors, controllers, specially designed application-specific integrated circuits (ASICs), programmable logic devices, or various combinations thereof.
  • The methods described herein may be implemented by a series of computer-executable instructions residing on a storage medium such as a disk drive or solid-state drive, or on another computer-readable medium.
  • FIG. 1 shows an example 3D scene 100 (e.g., a room) suitable for implementation of a real-time human-computer interface employing the present technology for locating a floor of the 3D scene.
  • A control system 110 employs one or more depth-sensing devices and/or one or more video cameras configured to generate depth maps of at least a part of the scene 100.
  • The control system 110 may implement the floor-locating technology based on the depth maps, as described herein.
  • A detailed description of the control system 110 and its components is given below with reference to FIG. 9.
  • The control system 110 may optionally determine 3D coordinates of the selected candidate plane associated with the floor 120. As will be appreciated by those skilled in the art, this may be accomplished by a number of methods; perhaps the simplest is the use of an equation derived from Equation No. 2 (see the sketch after this list).
  • The control system 110 may optionally include a color video camera 920 to capture a series of 2D images in addition to the 3D imagery already created by the depth sensor 910.
  • The series of 2D images captured by the color video camera 920 may be used to facilitate identification of the user, user gestures or motions, and user emotions, to facilitate identification of the floor 120, and so forth.
  • Alternatively, only the color video camera 920, and not the depth sensor 910, may be used to generate depth maps. It should also be noted that the depth sensor 910 and the color video camera 920 can be either stand-alone devices or encased within a single housing together with the remaining components of the control system 110.
  • The color video camera 920 may include internal motion sensor(s) 940.
  • The disk drive unit 1114 includes a computer-readable medium 1120 that stores one or more sets of instructions and data structures (e.g., instructions 1122) embodying or utilized by any one or more of the methodologies or functions described herein.
  • The instructions 1122 can also reside, completely or at least partially, within the main memory 1104 and/or within the processors 1102 during execution by the computer system 1100.
  • The main memory 1104 and the processors 1102 also constitute machine-readable media.
  • The instructions 1122 can further be transmitted or received over the network 1124 via the network interface device 1118 utilizing any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP), CAN, Serial, and Modbus).
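The excerpts above combine two calculations: back-projecting a depth-map pixel {X, Y, Z} into 3D space, and evaluating the selected floor plane via an equation derived from Equation No. 2 (which this record does not reproduce). The sketch below shows one common form of both steps under stated assumptions: a pinhole camera model with hypothetical intrinsics fx, fy, cx, cy, and the general plane form a·X + b·Y + c·Z + d = 0 solved for Z. The function names, defaults, and the equation itself are illustrative assumptions, not values from the patent.

```python
import numpy as np

def pixel_to_3d(x, y, z, fx=575.8, fy=575.8, cx=320.0, cy=240.0):
    """Back-project pixel (x, y) with depth z to 3D camera-space coordinates.

    Assumes a pinhole camera model; the intrinsics defaults are hypothetical
    Kinect-like placeholders, not values taken from the patent."""
    return np.array([(x - cx) * z / fx, (y - cy) * z / fy, z])

def plane_depth(a, b, c, d, x, y):
    """Depth of the plane a*X + b*Y + c*Z + d = 0 at position (x, y).

    Solving the plane equation for Z gives Z = -(a*x + b*y + d) / c, the kind
    of closed form the excerpt's 'equation derived from Equation No. 2'
    plausibly refers to (an assumption; the equation is not in this record)."""
    if abs(c) < 1e-9:
        raise ValueError("plane is parallel to the Z axis; cannot solve for Z")
    return -(a * x + b * y + d) / c
```

For example, `pixel_to_3d(320, 240, 1.5)` returns the 3D point 1.5 m straight ahead of the assumed camera center, and `plane_depth(0, 0, 1, -2, 0.5, 0.3)` returns 2.0 for a plane two meters from the sensor.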

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a technology for locating the surfaces of a room, such as a floor, walls, and a ceiling, based on depth maps obtained by a depth-sensing device. The room surfaces can be located through a multi-step process comprising, for example, the following steps: obtaining a depth map, selecting characteristic pixels among a plurality of pixels belonging to the depth map, computing 3D coordinates for the selected characteristic pixels, determining clusters of the 3D coordinates, generating a plurality of candidate planes based at least in part on the clusters, selecting planes associated with the floor, walls, and ceiling from the plurality of candidate planes, and determining the 3D coordinates of the selected planes. The 3D coordinates, or associated data, of the located room surfaces can be used in virtual reality simulation or 3D image rendering.
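To make the abstract's pipeline concrete, here is a minimal sketch of generating candidate planes from 3D points with RANSAC, the process the record itself names in connection with FIG. 6. All function names and parameter values (`iterations`, `threshold`) are chosen for illustration; the patent does not fix them, and its exact criterion for selecting the floor, wall, and ceiling planes is not reproduced in this record.

```python
import numpy as np

def fit_plane(p1, p2, p3):
    """Plane (unit normal n, offset d) through three 3D points: n . p + d = 0."""
    n = np.cross(p2 - p1, p3 - p1)
    norm = np.linalg.norm(n)
    if norm < 1e-9:
        return None  # degenerate (collinear) sample
    n /= norm
    return n, -np.dot(n, p1)

def ransac_candidate_planes(points, iterations=200, threshold=0.02):
    """Produce candidate planes from an (N, 3) point array, best-supported first.

    RANSAC loop: sample three points, fit a plane, count inliers within a
    point-to-plane distance threshold (hypothetical values, not the patent's)."""
    rng = np.random.default_rng(0)
    candidates = []
    for _ in range(iterations):
        idx = rng.choice(len(points), size=3, replace=False)
        plane = fit_plane(*points[idx])
        if plane is None:
            continue
        n, d = plane
        support = int(np.count_nonzero(np.abs(points @ n + d) < threshold))
        candidates.append((support, n, d))
    candidates.sort(key=lambda c: c[0], reverse=True)
    return candidates

# Example: a synthetic floor at Z = 0 among random clutter.
pts = np.vstack([
    np.column_stack([np.random.rand(500, 2) * 4.0, np.zeros(500)]),  # floor
    np.random.rand(100, 3) * 4.0,                                    # clutter
])
support, n, d = ransac_candidate_planes(pts)[0]  # best-supported candidate
```

A natural selection heuristic for the floor, for instance, would prefer the best-supported candidate whose normal is near-vertical; the record does not state which heuristic the patent actually uses.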
PCT/RU2013/000761 2013-09-02 2013-09-02 Methods and systems for locating substantially planar surfaces of a 3D scene WO2015030623A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/RU2013/000761 WO2015030623A1 (fr) 2013-09-02 2013-09-02 Methods and systems for locating substantially planar surfaces of a 3D scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/RU2013/000761 WO2015030623A1 (fr) 2013-09-02 2013-09-02 Methods and systems for locating substantially planar surfaces of a 3D scene

Publications (1)

Publication Number Publication Date
WO2015030623A1 (fr) 2015-03-05

Family

ID=52587026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/RU2013/000761 WO2015030623A1 (fr) 2013-09-02 2013-09-02 Methods and systems for locating substantially planar surfaces of a 3D scene

Country Status (1)

Country Link
WO (1) WO2015030623A1 (fr)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4947347A (en) * 1987-09-18 1990-08-07 Kabushiki Kaisha Toshiba Depth map generating method and apparatus
JPH07174538A (ja) * 1993-12-20 1995-07-14 Minolta Co Ltd Image input camera
US20100289817A1 (en) * 2007-09-25 2010-11-18 Metaio Gmbh Method and device for illustrating a virtual object in a real environment
US20110102550A1 (en) * 2008-04-02 2011-05-05 Eykona Technologies Ltd. 3d imaging system
US20130093852A1 (en) * 2011-10-12 2013-04-18 Board Of Trustees Of The University Of Arkansas Portable robotic device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018219754A1 (de) * 2018-11-19 2020-05-20 BSH Hausgeräte GmbH Interaction device for controlling a household appliance
CN114666804A (zh) * 2022-03-28 2022-06-24 北京四维图新科技股份有限公司 Method, apparatus and device for selecting base station installation coordinates based on different environment scenarios
CN114666804B (zh) * 2022-03-28 2023-06-23 北京四维图新科技股份有限公司 Method, apparatus and device for selecting base station installation coordinates based on different environment scenarios

Similar Documents

Publication Publication Date Title
US10761612B2 (en) Gesture recognition techniques
US11237625B2 (en) Interaction engine for creating a realistic experience in virtual reality/augmented reality environments
US10960298B2 (en) Boolean/float controller and gesture recognition system
US20130010071A1 (en) Methods and systems for mapping pointing device on depth map
US9766713B2 (en) System and method for providing user interface tools
US20190018567A1 (en) Input device for vr/ar applications
EP3072033B1 (fr) Motion control of a virtual environment
US20150070274A1 (en) Methods and systems for determining 6dof location and orientation of head-mounted display and associated user movements
JP7008730B2 (ja) Shadow generation for image content inserted into an image
US20150187108A1 (en) Augmented reality content adapted to changes in real world space geometry
US20140009384A1 (en) Methods and systems for determining location of handheld device within 3d environment
EP2814000A1 (fr) Image processing apparatus, image processing method, and image processing program
CN108431734A (zh) Haptic feedback for non-touch surface interaction
CN112154405A (zh) Three-dimensional push notifications
CN115335894A (zh) Systems and methods for virtual and augmented reality
CN108776544A (zh) Interaction method and apparatus in augmented reality, storage medium, and electronic device
WO2015030623A1 (fr) Methods and systems for locating substantially planar surfaces of a 3D scene
CN117716322A (zh) Augmented reality (AR) pen/hand tracking
Coleca et al. Real-time skeleton tracking for embedded systems
US11948237B2 (en) System and method for mimicking user handwriting or other user input using an avatar
WO2013176574A1 (fr) Methods and systems for mapping pointing device on depth map
CN117899456A (zh) Display processing method, apparatus, device, and medium for two-dimensional components
CN115937284A (zh) Image generation method, device, storage medium, and program product
CN116934959A (zh) Particle image generation method and apparatus based on gesture recognition, electronic device, and medium

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 13892510

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 13892510

Country of ref document: EP

Kind code of ref document: A1