WO2022192067A1 - Augmented reality system for viewing an event with mode based on crowd sourced images - Google Patents
- Publication number
- WO2022192067A1 (PCT/US2022/018674)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- venue
- coordinate system
- mobile device
- world coordinate
- real world
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/579—Depth or shape recovery from multiple images from motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G06V20/647—Three-dimensional objects by matching two-dimensional images to three-dimensional objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30221—Sports video; Sports image
- G06T2207/30228—Playing field
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- Figures 1 and 2 illustrate some of the examples of the presentation of AR graphics and added AR content at an outdoor venue and an indoor venue, respectively.
- Figure 1 illustrates a golf course venue during an event, where the green 120 (extending out from an isthmus into a lake) and an island 110 are marked out for later reference.
- Figure 1 shows the venue during play with spectators present and a user viewing the scene with enhanced content, such as 3D AR graphics, on the display of a mobile device 121, where the depicted mobile device is a smartphone but could also be an AR headset, tablet, or other mobile device.
- the microprocessor 410 may be configured to implement registration processing using any one or combination of elements described in the embodiments.
- the memory 420 may comprise any type of system memory such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), read-only memory (ROM), a combination thereof, or the like.
- the memory 420 may include ROM for use at boot-up, and DRAM for program and data storage for use while executing programs.
- the mobile device 321 can track the images, maintaining a model (such as a Kalman-filtered model) of the orientation of the mobile device's camera, where this model can be driven by the IMU of the mobile device 321 and by tracking results from previous frames. The mobile device 321 can use this model to estimate the camera parameters for the current frame.
- the mobile device can match the current set of simple features at their predicted locations within the current image, such as by simple template matching, to refine the estimate.
- a mobile device 321 may have its orientation changed frequently, while its location typically changes to a lesser degree, so that the orientation of the mobile device 321 is the more important value for keeping graphics and other content locked onto the imagery in the real world coordinate system.
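The tracking loop described in the points above can be sketched as an IMU-driven prediction followed by a vision-based correction. The sketch below is a deliberately simplified stand-in for the Kalman-filtered model (a single yaw angle and a fixed blend gain); all function names and gains are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch: IMU-driven prediction of the camera yaw, corrected
# by a vision measurement (e.g. from template matching). A fixed-gain blend
# stands in for the full Kalman update.

def predict_yaw(yaw_deg, gyro_rate_dps, dt):
    """Propagate the camera yaw using the IMU gyro rate (prediction step)."""
    return (yaw_deg + gyro_rate_dps * dt) % 360.0

def correct_yaw(predicted_deg, measured_deg, gain=0.2):
    """Blend in a vision-based yaw measurement (correction step)."""
    # Wrap the innovation into [-180, 180) so that 359 deg -> 1 deg is
    # treated as a small +2 deg move, not a -358 deg jump.
    innovation = (measured_deg - predicted_deg + 180.0) % 360.0 - 180.0
    return (predicted_deg + gain * innovation) % 360.0

yaw = 10.0
yaw = predict_yaw(yaw, gyro_rate_dps=5.0, dt=0.1)  # IMU: +0.5 deg this frame
yaw = correct_yaw(yaw, measured_deg=11.0)          # vision nudges the estimate
```

In a fuller implementation the gain would come from the filter's covariance, and the state would include the full 3D orientation rather than a single angle.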
- the transformation between the mobile device’s coordinate system and the real world coordinate system can be in the form of a set of matrices for a combination of a rotation, translation, and scale dilation to transform between the coordinate system of the mobile device 321 and the real world coordinates.
- the calculated transformation between the mobile device’s coordinate system and the real world coordinate system and tracking points/template images are respectively sent from the registration server 311 over the network interfaces 450 to the mobile device 321 at steps 1363 and 1365.
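The transformation described above, a rotation, translation, and scale dilation, can be illustrated as a single homogeneous matrix. This is a minimal sketch under the assumption of a rotation about the vertical axis only, for brevity; the patent's transformation is not limited to this form, and the function names are hypothetical.

```python
import math

# Illustrative device-to-world similarity transform: rotation (about z),
# uniform scale, and translation composed into one 4x4 matrix.

def device_to_world_matrix(yaw_rad, tx, ty, tz, scale):
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [
        [scale * c, -scale * s, 0.0,   tx],
        [scale * s,  scale * c, 0.0,   ty],
        [0.0,        0.0,       scale, tz],
        [0.0,        0.0,       0.0,   1.0],
    ]

def apply(m, p):
    """Apply a 4x4 matrix to a 3D point in homogeneous coordinates."""
    x, y, z = p
    v = [x, y, z, 1.0]
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(3)]

# A point on the device x-axis, rotated 90 degrees, scaled by 2, shifted +10 in x.
M = device_to_world_matrix(math.pi / 2, 10.0, 0.0, 0.0, 2.0)
world_point = apply(M, (1.0, 0.0, 0.0))
```

Sending the matrix (rather than raw image correspondences) lets the mobile device place graphics in real world coordinates with a single multiply per vertex.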
- FIG 14A is a more detailed flowchart of an embodiment for the operation of registration server 311.
- the registration server 311 retrieves the output of the three columns from registration processing 307 from the feature database 309 and combines these with the image data and metadata from a mobile device 321 to determine the transformation between the mobile device’s coordinate system and the real world coordinate system.
- the inputs are the image data and image metadata from the mobile devices 321, and the point features, large scale features, and shape features from the feature database 309
- the outputs are the coordinate transformations, tracking points, and template images
- shape features extracted from the 3D survey data are combined with the image data and image metadata from the mobile device 321.
- the mobile device's image data and image metadata undergo image segmentation 1421 to generate 2D contours 1423 for the 2D images as output data.
- the image segmentation can be implemented on the registration server 311 as a convolutional neural network, for example.
- the 2D contour data 1423 can then be combined with the 3D contour data from the feature database 309 in processing to render the 3D contours to match the 2D contours within the images from the mobile device 321.
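One way to score how well rendered 3D contours match the 2D contours from segmentation is a chamfer-style distance: project each 3D contour point under a candidate camera pose and measure how close it lands to the nearest 2D contour point; registration would then favor the pose minimizing this score. The sketch below assumes a trivial pinhole projection and hypothetical names; it is not the patent's actual matching procedure.

```python
# Hypothetical chamfer-style scoring of a candidate pose: project the 3D
# contour and accumulate the nearest-neighbor distance to the 2D contour.

def project(point3d, focal, cam_z):
    """Trivial pinhole projection of a 3D point onto the image plane."""
    x, y, z = point3d
    depth = z + cam_z  # camera offset along the optical axis
    return (focal * x / depth, focal * y / depth)

def chamfer_score(contour3d, contour2d, focal, cam_z):
    """Mean squared distance from each projected 3D point to its nearest
    2D contour point; lower means a better contour match."""
    total = 0.0
    for p3 in contour3d:
        u, v = project(p3, focal, cam_z)
        total += min((u - a) ** 2 + (v - b) ** 2 for a, b in contour2d)
    return total / len(contour3d)
```

A real system would evaluate this over many candidate poses (or use it as a residual in nonlinear optimization) and typically precompute a distance transform of the 2D contours instead of the brute-force nearest-neighbor search shown here.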
- FIG. 21 is a flowchart for the operation of a tabletop embodiment.
- a model of the venue is built prior to an event.
- the venue is prepared for survey, with the survey images collected at step 2103.
- Steps 2101 and 2103 can be as described above with respect to steps 601 and 603 and can be the same as these steps, with the process for in-venue enhanced viewing and the process for remote viewing being the same process.
- a tabletop model of the venue is built in much the same way as described with respect to step 605, but additionally the model of the venue is built for a tabletop display.
- the system also includes one or more servers configured to exchange data with the mobile device and to: receive the image data of the venue and the image metadata; generate the transformation between the mobile device's coordinate system and the real world coordinate system from the image data and image metadata; provide the transformation between the mobile device's coordinate system and the real world coordinate system; and provide the graphics to be displayed over the view of the venue specified by location and orientation in the real world coordinate system.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Signal Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
Augmented reality systems provide graphics over views from a mobile device for both in-venue and remote viewing of a sporting or other event. A server system can provide a transformation between the coordinate system of a mobile device (smartphone, tablet, head-mounted display) and a real world coordinate system. Requested graphics for the event are displayed over a view of the event.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22712149.8A EP4305597A1 (fr) | 2021-03-11 | 2022-03-03 | Augmented reality system for viewing an event with mode based on crowd sourced images |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163159870P | 2021-03-11 | 2021-03-11 | |
US63/159,870 | 2021-03-11 | ||
US17/242,275 | 2021-04-27 | ||
US17/242,275 US11645819B2 (en) | 2021-03-11 | 2021-04-27 | Augmented reality system for viewing an event with mode based on crowd sourced images |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022192067A1 true WO2022192067A1 (fr) | 2022-09-15 |
Family
ID=80930200
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2022/018674 WO2022192067A1 (fr) | 2021-03-11 | 2022-03-03 | Augmented reality system for viewing an event with mode based on crowd sourced images |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022192067A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100257252A1 (en) * | 2009-04-01 | 2010-10-07 | Microsoft Corporation | Augmented Reality Cloud Computing |
US20140247279A1 (en) * | 2013-03-01 | 2014-09-04 | Apple Inc. | Registration between actual mobile device position and environmental model |
WO2015192117A1 (fr) * | 2014-06-14 | 2015-12-17 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US20170365102A1 (en) * | 2012-02-23 | 2017-12-21 | Charles D. Huston | System And Method For Creating And Sharing A 3D Virtual Model Of An Event |
US20200302510A1 (en) * | 2019-03-24 | 2020-09-24 | We.R Augmented Reality Cloud Ltd. | System, Device, and Method of Augmented Reality based Mapping of a Venue and Navigation within a Venue |
-
2022
- 2022-03-03 WO PCT/US2022/018674 patent/WO2022192067A1/fr active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100257252A1 (en) * | 2009-04-01 | 2010-10-07 | Microsoft Corporation | Augmented Reality Cloud Computing |
US20170365102A1 (en) * | 2012-02-23 | 2017-12-21 | Charles D. Huston | System And Method For Creating And Sharing A 3D Virtual Model Of An Event |
US20140247279A1 (en) * | 2013-03-01 | 2014-09-04 | Apple Inc. | Registration between actual mobile device position and environmental model |
WO2015192117A1 (fr) * | 2014-06-14 | 2015-12-17 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US20200302510A1 (en) * | 2019-03-24 | 2020-09-24 | We.R Augmented Reality Cloud Ltd. | System, Device, and Method of Augmented Reality based Mapping of a Venue and Navigation within a Venue |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11657578B2 (en) | Registration for augmented reality system for viewing an event | |
US20220295040A1 (en) | Augmented reality system with remote presentation including 3d graphics extending beyond frame | |
US11189077B2 (en) | View point representation for 3-D scenes | |
US20230118280A1 (en) | Use of multiple registrations for augmented reality system for viewing an event | |
US7239760B2 (en) | System and method for creating, storing, and utilizing composite images of a geographic location | |
US20140247345A1 (en) | System and method for photographing moving subject by means of multiple cameras, and acquiring actual movement trajectory of subject based on photographed images | |
CN106169184A (zh) | Method and system for determining spatial characteristics of a camera | |
CN102726051A (zh) | Virtual insertions in 3D video | |
US20230237748A1 (en) | Augmented reality system for viewing an event with mode based on crowd sourced images | |
US11880953B2 (en) | Augmented reality system for viewing an event with distributed computing | |
JP2019114147A (ja) | Information processing apparatus, control method for information processing apparatus, and program | |
US20220295141A1 (en) | Remote presentation with augmented reality content synchronized with separately displayed video content | |
US20220295032A1 (en) | Augmented reality system for remote presentation for viewing an event | |
US12003806B2 (en) | Augmented reality system for viewing an event with multiple coordinate systems and automatically generated model | |
US20220295139A1 (en) | Augmented reality system for viewing an event with multiple coordinate systems and automatically generated model | |
US20230306682A1 (en) | 3d reference point detection for survey for venue model construction | |
WO2022192067A1 (fr) | Augmented reality system for viewing an event with mode based on crowd sourced images | |
US20230260240A1 (en) | Alignment of 3d graphics extending beyond frame in augmented reality system with remote presentation | |
WO2023205393A1 (fr) | Alignment of 3D graphics extending beyond frame in augmented reality system with remote presentation | |
CN116363333A (zh) | Method and system for real-time AR display of sports event information | |
KR20150066941A (ko) | Apparatus for providing player information and method for providing player information using the same | |
Kraft | Real time baseball augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22712149 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022712149 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022712149 Country of ref document: EP Effective date: 20231011 |