WO2022192067A1 - Augmented reality system for viewing an event with mode based on crowd sourced images - Google Patents

Augmented reality system for viewing an event with mode based on crowd sourced images

Info

Publication number
WO2022192067A1
WO2022192067A1 (PCT/US2022/018674)
Authority
WO
WIPO (PCT)
Prior art keywords
venue
coordinate system
mobile device
world coordinate
real world
Prior art date
Application number
PCT/US2022/018674
Other languages
English (en)
Inventor
Sankar Jayaram
Wayne O. COCHRAN
John Harrison
Timothy P. Heidmann
John Buddy SCOTT
Original Assignee
Quintar, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 17/242,275 (US11645819B2)
Application filed by Quintar, Inc. filed Critical Quintar, Inc.
Priority to EP22712149.8A (EP4305597A1)
Publication of WO2022192067A1

Classifications

    • G06T19/006 Mixed reality (manipulating 3D models or images for computer graphics)
    • G06F3/012 Head tracking input arrangements
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06T7/12 Edge-based segmentation
    • G06T7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T7/579 Depth or shape recovery from multiple images from motion
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Feature-based methods involving reference images or patches
    • G06V20/20 Scene-specific elements in augmented reality scenes
    • G06V20/647 Three-dimensional objects by matching two-dimensional images to three-dimensional objects
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • G06T2207/10016 Video; Image sequence
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30204 Marker
    • G06T2207/30221 Sports video; Sports image
    • G06T2207/30228 Playing field
    • G06T2207/30244 Camera pose

Definitions

  • Figures 1 and 2 illustrate some of the examples of the presentation of AR graphics and added AR content at an outdoor venue and an indoor venue, respectively.
  • Figure 1 illustrates a golf course venue during an event, where the green 120 (extending out from an isthmus into a lake) and an island 110 are marked out for later reference.
  • Figure 1 shows the venue during play with spectators present and a user viewing the scene with enhanced content such as 3D AR graphics on the display of a mobile device 121, where the depicted mobile device is a smartphone but could also be an AR headset, tablet, or other mobile device.
  • the microprocessor 410 may be configured to implement registration processing using any one or combination of elements described in the embodiments.
  • the memory 420 may comprise any type of system memory such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), read-only memory (ROM), a combination thereof, or the like.
  • the memory 420 may include ROM for use at boot-up, and DRAM for program and data storage for use while executing programs.
  • The mobile device 321 can track the images, maintaining a model (such as a Kalman-filtered model) of the orientation of the mobile device's camera, where this can be driven by the IMU of the mobile device 321 and the tracking results from previous frames. This can be used by the mobile device 321 to estimate the camera parameters for the current frame.
  • The mobile device can match the current set of simple features at their predicted locations within the current image, such as by simple template matching, to refine the estimate.
  • A mobile device 321 may have its orientation changed frequently, but its location will change to a lesser extent, so that the orientation of the mobile device 321 is the more important value for keeping graphics and other content locked onto the imagery in the real world coordinate system.
  • The transformation between the mobile device's coordinate system and the real world coordinate system can be expressed as a set of matrices combining a rotation, a translation, and a scale dilation to transform between the coordinate system of the mobile device 321 and the real world coordinates.
  • the calculated transformation between the mobile device’s coordinate system and the real world coordinate system and tracking points/template images are respectively sent from the registration server 311 over the network interfaces 450 to the mobile device 321 at steps 1363 and 1365.
  • Figure 14A is a more detailed flowchart of an embodiment of the operation of the registration server 311.
  • the registration server 311 retrieves the output of the three columns from registration processing 307 from the feature database 309 and combines these with the image data and metadata from a mobile device 321 to determine the transformation between the mobile device’s coordinate system and the real world coordinate system.
  • The inputs are the image data and image metadata from the mobile devices 321 and the point features, large scale features, and shape features from the feature database 309.
  • The outputs are the coordinate transformations and the tracking points and template images.
  • shape features extracted from the 3D survey data are combined with the image data and image metadata from the mobile device 321.
  • The mobile device's image data and image metadata undergo image segmentation 1421 to generate 2D contours 1423 for the 2D images as output data.
  • the image segmentation can be implemented on the registration server 311 as a convolutional neural network, for example.
  • the 2D contour data 1423 can then be combined with the 3D contour data from the feature database 309 in processing to render the 3D contours to match the 2D contours within the images from the mobile device 321.
  • Figure 21 is a flowchart for the operation of a tabletop embodiment.
  • a model of the venue is built prior to an event.
  • the venue is prepared for survey, with the survey images collected at step 2103.
  • Steps 2101 and 2103 can be as described above with respect to steps 601 and 603, and can be the same as those steps, with the process for in-venue enhanced viewing and the process for remote viewing being the same process.
  • A tabletop model of the venue is built in much the same way as described with respect to step 605, but additionally the model of the venue is built for a tabletop display.
  • The system also includes one or more servers configured to exchange data with the mobile device and to: receive the image data of the venue and the image metadata; generate the transformation between the mobile device's coordinate system and the real world coordinate system from the image data and image metadata; provide the transformation between the mobile device's coordinate system and the real world coordinate system; and provide the graphics to be displayed over the view of the venue specified by location and orientation in the real world coordinate system.
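  The Kalman-filtered orientation tracking described above can be sketched as a minimal numpy example: a constant-velocity filter on a single yaw angle, where each "measurement" stands in for a yaw value recovered by template matching against the tracked features. This is an illustrative sketch, not the patent's implementation; all variable names and noise values are assumptions.

  ```python
  import numpy as np

  # State: [yaw, yaw_rate]; constant-velocity model, as in a simple
  # camera-orientation tracker refined by per-frame feature matches.
  def kalman_step(x, P, z, dt, q=1e-4, r=1e-2):
      F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
      H = np.array([[1.0, 0.0]])              # we observe yaw only
      Q = q * np.eye(2)                       # process noise (assumed)
      R = np.array([[r]])                     # measurement noise (assumed)
      # Predict the camera orientation for the current frame
      x = F @ x
      P = F @ P @ F.T + Q
      # Update with the yaw measured from the current image
      y = z - H @ x
      S = H @ P @ H.T + R
      K = P @ H.T @ np.linalg.inv(S)
      x = x + (K @ y).ravel()
      P = (np.eye(2) - K @ H) @ P
      return x, P

  x, P = np.array([0.0, 0.0]), np.eye(2)
  for k in range(50):
      z = np.array([0.1 * (k + 1)])           # camera panning 0.1 rad/frame
      x, P = kalman_step(x, P, z, dt=1.0)
  ```

  Because the pan is exactly a constant-velocity motion, the filter converges to the true yaw and yaw rate; in practice the prediction would also be driven by IMU readings between frames.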
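  The transformation between the device and real world coordinate systems, described above as a combination of rotation, translation, and scale dilation, can be illustrated as a single 4x4 similarity transform. This is a generic sketch of such a transform, not code from the patent; the function and variable names are invented for illustration.

  ```python
  import numpy as np

  def similarity_transform(R, t, s):
      """Build a 4x4 homogeneous matrix combining rotation R (3x3),
      translation t (3,), and uniform scale s, so that
      x_world = s * R @ x_device + t."""
      T = np.eye(4)
      T[:3, :3] = s * R
      T[:3, 3] = t
      return T

  # Example: 90-degree rotation about z, scale 2, shift 10 units along x.
  theta = np.pi / 2
  R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0,            0.0,           1.0]])
  T = similarity_transform(R, t=np.array([10.0, 0.0, 0.0]), s=2.0)

  p_device = np.array([1.0, 0.0, 0.0, 1.0])  # homogeneous device-space point
  p_world = T @ p_device                     # -> [10., 2., 0., 1.]
  ```

  A server can send such a matrix once per registration, after which the device applies it locally to place graphics specified in real world coordinates.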
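  The step of rendering 3D contours to match the 2D contours extracted from the device's images amounts to projecting 3D points under a candidate camera pose and scoring the fit against the observed 2D contour. The sketch below assumes a simple pinhole camera model and a nearest-point residual; it is illustrative only, and all names and parameters are assumptions rather than the patent's registration algorithm.

  ```python
  import numpy as np

  def project_points(points_3d, R, t, f, cx, cy):
      """Pinhole projection of world-space 3D points to pixel coordinates,
      given camera rotation R, translation t, focal length f, and
      principal point (cx, cy)."""
      cam = (R @ points_3d.T).T + t            # world -> camera frame
      uv = np.empty((len(points_3d), 2))
      uv[:, 0] = f * cam[:, 0] / cam[:, 2] + cx
      uv[:, 1] = f * cam[:, 1] / cam[:, 2] + cy
      return uv

  def contour_alignment_error(proj_2d, contour_2d):
      """Mean distance from each projected 3D-contour point to the nearest
      point of the observed 2D contour -- the quantity a registration
      solver would minimize over the camera pose."""
      d = np.linalg.norm(proj_2d[:, None, :] - contour_2d[None, :, :], axis=2)
      return d.min(axis=1).mean()

  # Two points on a 3D contour, seen by a camera 5 units away.
  pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
  uv = project_points(pts, np.eye(3), np.array([0.0, 0.0, 5.0]),
                      f=500.0, cx=320.0, cy=320.0)
  err = contour_alignment_error(uv, uv)        # perfect alignment -> 0.0
  ```

  In a full registration pipeline this error would be minimized over rotation, translation, and intrinsics to obtain the transformation between the device and real world coordinate systems.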

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Augmented reality systems provide graphics over views from a mobile device for both in-venue and remote viewing of a sporting or other event. A server system can provide a transformation between the coordinate system of a mobile device (smartphone, tablet computer, head-mounted display) and a real world coordinate system. Graphics requested for the event are displayed over a view of the event.
PCT/US2022/018674 2021-03-11 2022-03-03 Augmented reality system for viewing an event with mode based on crowd sourced images WO2022192067A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22712149.8A EP4305597A1 (fr) 2021-03-11 2022-03-03 Augmented reality system for viewing an event with mode based on crowd sourced images

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163159870P 2021-03-11 2021-03-11
US63/159,870 2021-03-11
US17/242,275 2021-04-27
US17/242,275 US11645819B2 (en) 2021-03-11 2021-04-27 Augmented reality system for viewing an event with mode based on crowd sourced images

Publications (1)

Publication Number Publication Date
WO2022192067A1 (fr) 2022-09-15

Family

ID=80930200

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/018674 WO2022192067A1 (fr) 2022-03-03 Augmented reality system for viewing an event with mode based on crowd sourced images

Country Status (1)

Country Link
WO (1) WO2022192067A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100257252A1 (en) * 2009-04-01 2010-10-07 Microsoft Corporation Augmented Reality Cloud Computing
US20140247279A1 (en) * 2013-03-01 2014-09-04 Apple Inc. Registration between actual mobile device position and environmental model
WO2015192117A1 (fr) * 2014-06-14 2015-12-17 Magic Leap, Inc. Procédés et systèmes de création d'une réalité virtuelle et d'une réalité augmentée
US20170365102A1 (en) * 2012-02-23 2017-12-21 Charles D. Huston System And Method For Creating And Sharing A 3D Virtual Model Of An Event
US20200302510A1 (en) * 2019-03-24 2020-09-24 We.R Augmented Reality Cloud Ltd. System, Device, and Method of Augmented Reality based Mapping of a Venue and Navigation within a Venue


Similar Documents

Publication Publication Date Title
US11657578B2 (en) Registration for augmented reality system for viewing an event
US20220295040A1 (en) Augmented reality system with remote presentation including 3d graphics extending beyond frame
US11189077B2 (en) View point representation for 3-D scenes
US20230118280A1 (en) Use of multiple registrations for augmented reality system for viewing an event
US7239760B2 (en) System and method for creating, storing, and utilizing composite images of a geographic location
US20140247345A1 (en) System and method for photographing moving subject by means of multiple cameras, and acquiring actual movement trajectory of subject based on photographed images
CN106169184A (zh) Method and system for determining spatial characteristics of a camera
CN102726051A (zh) Virtual inserts in 3D video
US20230237748A1 (en) Augmented reality system for viewing an event with mode based on crowd sourced images
US11880953B2 (en) Augmented reality system for viewing an event with distributed computing
JP2019114147A (ja) Information processing apparatus, control method for information processing apparatus, and program
US20220295141A1 (en) Remote presentation with augmented reality content synchronized with separately displayed video content
US20220295032A1 (en) Augmented reality system for remote presentation for viewing an event
US12003806B2 (en) Augmented reality system for viewing an event with multiple coordinate systems and automatically generated model
US20220295139A1 (en) Augmented reality system for viewing an event with multiple coordinate systems and automatically generated model
US20230306682A1 (en) 3d reference point detection for survey for venue model construction
WO2022192067A1 (fr) Augmented reality system for viewing an event with mode based on crowd sourced images
US20230260240A1 (en) Alignment of 3d graphics extending beyond frame in augmented reality system with remote presentation
WO2023205393A1 (fr) Alignment of 3D graphics extending beyond frame in augmented reality system with remote presentation
CN116363333A (zh) Method and system for real-time AR display of sports event information
KR20150066941A (ko) Apparatus for providing player information and method for providing player information using same
Kraft Real time baseball augmented reality

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22712149

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022712149

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2022712149

Country of ref document: EP

Effective date: 20231011