EP3500911A4 - Augmented reality display device with deep learning sensors - Google Patents

Augmented reality display device with deep learning sensors

Info

Publication number
EP3500911A4
Authority
EP
European Patent Office
Prior art keywords
display device
augmented reality
deep learning
reality display
learning sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP17844303.2A
Other languages
German (de)
French (fr)
Other versions
EP3500911B1 (en)
EP3500911A1 (en)
Inventor
Andrew Rabinovich
Tomasz Jan MALISIEWICZ
Daniel DETONE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magic Leap Inc
Original Assignee
Magic Leap Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Leap Inc
Publication of EP3500911A1
Publication of EP3500911A4
Application granted
Publication of EP3500911B1
Legal status: Active (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/004Artificial life, i.e. computing arrangements simulating life
    • G06N3/006Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V10/451Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V10/454Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00Computing arrangements using knowledge-based models
    • G06N5/01Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00Computing arrangements based on specific mathematical models
    • G06N7/01Probabilistic graphical models, e.g. probabilistic networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Cardiology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Computer Graphics (AREA)
  • Optics & Photonics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Algebra (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
EP17844303.2A 2016-08-22 2017-08-22 Augmented reality display device with deep learning sensors Active EP3500911B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662377835P 2016-08-22 2016-08-22
PCT/US2017/048068 WO2018039269A1 (en) 2016-08-22 2017-08-22 Augmented reality display device with deep learning sensors

Publications (3)

Publication Number Publication Date
EP3500911A1 (en) 2019-06-26
EP3500911A4 (en) 2020-04-29
EP3500911B1 (en) 2023-09-27

Family

ID: 61190853

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17844303.2A Active EP3500911B1 (en) 2016-08-22 2017-08-22 Augmented reality display device with deep learning sensors

Country Status (9)

Country Link
US (4) US10402649B2 (en)
EP (1) EP3500911B1 (en)
JP (3) JP7002536B2 (en)
KR (2) KR102529137B1 (en)
CN (2) CN114253400A (en)
AU (2) AU2017317599B2 (en)
CA (1) CA3034644A1 (en)
IL (3) IL294129B2 (en)
WO (1) WO2018039269A1 (en)

Families Citing this family (142)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11258366B2 (en) 2015-11-20 2022-02-22 Galvion Soldier Power, Llc Power manager with reconfigurable power converting circuits
WO2018013200A1 (en) 2016-07-14 2018-01-18 Magic Leap, Inc. Deep neural network for iris identification
US11138784B2 (en) * 2016-07-29 2021-10-05 Sony Corporation Image processing apparatus and image processing method
CA3034644A1 (en) 2016-08-22 2018-03-01 Magic Leap, Inc. Augmented reality display device with deep learning sensors
JP6948387B2 (en) 2016-09-26 2021-10-13 マジック リープ, インコーポレイテッドMagic Leap,Inc. Calibration of magnetic and optical sensors in virtual reality or augmented reality display systems
JP6854344B2 (en) 2016-11-15 2021-04-07 マジック リープ, インコーポレイテッドMagic Leap,Inc. Deep machine learning system for rectangular parallelepiped detection
KR20230070318A (en) 2016-12-05 2023-05-22 매직 립, 인코포레이티드 Virual user input controls in a mixed reality environment
US10746815B2 (en) * 2016-12-22 2020-08-18 Microsoft Technology Licensing, Llc Magnetic interference detection and correction
US11347054B2 (en) 2017-02-16 2022-05-31 Magic Leap, Inc. Systems and methods for augmented reality
US11308391B2 (en) * 2017-03-06 2022-04-19 Baidu Usa Llc Offline combination of convolutional/deconvolutional and batch-norm layers of convolutional neural network models for autonomous driving vehicles
EP3596659A4 (en) 2017-03-17 2021-01-27 Magic Leap, Inc. Room layout estimation methods and techniques
US10762635B2 (en) 2017-06-14 2020-09-01 Tusimple, Inc. System and method for actively selecting and labeling images for semantic segmentation
US10908680B1 (en) 2017-07-12 2021-02-02 Magic Leap, Inc. Pose estimation using electromagnetic tracking
CN110914790A (en) 2017-07-26 2020-03-24 奇跃公司 Training neural networks using representations of user interface devices
US10268205B2 (en) * 2017-09-13 2019-04-23 TuSimple Training and testing of a neural network method for deep odometry assisted by static scene optical flow
US10671083B2 (en) * 2017-09-13 2020-06-02 Tusimple, Inc. Neural network architecture system for deep odometry assisted by static scene optical flow
US10552979B2 (en) 2017-09-13 2020-02-04 TuSimple Output of a neural network method for deep odometry assisted by static scene optical flow
AU2018337653A1 (en) 2017-09-20 2020-01-16 Magic Leap, Inc. Personalized neural network for eye tracking
FR3072469B1 (en) * 2017-10-13 2019-11-01 Commissariat A L'energie Atomique Et Aux Energies Alternatives METHOD FOR SYNCHRONIZING A MAGNETIC LOCATION SYSTEM
US10614574B2 (en) * 2017-10-16 2020-04-07 Adobe Inc. Generating image segmentation data using a multi-branch neural network
US10996742B2 (en) * 2017-10-17 2021-05-04 Logitech Europe S.A. Input device for AR/VR applications
KR102602117B1 (en) 2017-10-26 2023-11-13 매직 립, 인코포레이티드 Gradient regularization systems and methods for adaptive loss balancing in deep multitask networks
CN111344716B (en) * 2017-11-14 2024-07-19 奇跃公司 Full convolution point of interest detection and description via homography transform adaptation
US10514545B2 (en) * 2017-12-08 2019-12-24 Facebook Technologies, Llc Selective tracking of a head-mounted display
GB2569603B (en) * 2017-12-21 2020-04-01 Sony Interactive Entertainment Inc Position tracking apparatus and method
KR102684302B1 (en) * 2018-01-05 2024-07-12 삼성전자주식회사 Method and apparatus for navigating virtual content displayed by a virtual reality (VR) device
US10740876B1 (en) * 2018-01-23 2020-08-11 Facebook Technologies, Llc Systems and methods for generating defocus blur effects
GB201804400D0 (en) * 2018-03-20 2018-05-02 Univ Of Essex Enterprise Limited Localisation, mapping and network training
KR102395445B1 (en) * 2018-03-26 2022-05-11 한국전자통신연구원 Electronic device for estimating position of sound source
KR102094953B1 (en) * 2018-03-28 2020-03-30 주식회사 비주얼캠프 Method for eye-tracking and terminal for executing the same
US10534982B2 (en) * 2018-03-30 2020-01-14 Tobii Ab Neural network training for three dimensional (3D) gaze prediction with calibration parameters
US11119624B2 (en) * 2018-04-17 2021-09-14 Apple Inc. Dynamic image stabilization using motion sensors
CN108921893B (en) * 2018-04-24 2022-03-25 华南理工大学 Image cloud computing method and system based on online deep learning SLAM
CN112534475B (en) * 2018-05-17 2023-01-10 奈安蒂克公司 Self-supervised training of depth estimation systems
GB2574372B (en) * 2018-05-21 2021-08-11 Imagination Tech Ltd Implementing Traditional Computer Vision Algorithms As Neural Networks
US20190362235A1 (en) 2018-05-23 2019-11-28 Xiaofan Xu Hybrid neural network pruning
US10636190B2 (en) * 2018-05-31 2020-04-28 Robert Bosch Gmbh Methods and systems for exploiting per-pixel motion conflicts to extract primary and secondary motions in augmented reality systems
US11677796B2 (en) 2018-06-20 2023-06-13 Logitech Europe S.A. System and method for video encoding optimization and broadcasting
JP2021527888A (en) * 2018-06-22 2021-10-14 マジック リープ, インコーポレイテッドMagic Leap,Inc. Methods and systems for performing eye tracking using off-axis cameras
CN108958716A (en) * 2018-06-27 2018-12-07 森汉智能科技(深圳)有限公司 Realize that mobile terminal programs the method in conjunction with ar system on intelligent robot
US10867164B2 (en) * 2018-06-29 2020-12-15 Intel Corporation Methods and apparatus for real-time interactive anamorphosis projection via face detection and tracking
US10948297B2 (en) 2018-07-09 2021-03-16 Samsung Electronics Co., Ltd. Simultaneous location and mapping (SLAM) using dual event cameras
US10754419B2 (en) * 2018-07-12 2020-08-25 Google Llc Hybrid pose tracking system with electromagnetic position tracking
WO2020023399A1 (en) * 2018-07-23 2020-01-30 Magic Leap, Inc. Deep predictor recurrent neural network for head pose prediction
US10943115B2 (en) * 2018-07-24 2021-03-09 Apical Ltd. Processing image data to perform object detection
US10783713B2 (en) * 2018-09-05 2020-09-22 International Business Machines Corporation Transmutation of virtual entity sketch using extracted features and relationships of real and virtual objects in mixed reality scene
WO2020051298A1 (en) 2018-09-05 2020-03-12 Magic Leap, Inc. Directed emitter/sensor for electromagnetic tracking in augmented reality systems
KR102559203B1 (en) * 2018-10-01 2023-07-25 삼성전자주식회사 Method and apparatus of outputting pose information
CN109240510B (en) * 2018-10-30 2023-12-26 东北大学 Augmented reality man-machine interaction equipment based on sight tracking and control method
US10915793B2 (en) * 2018-11-08 2021-02-09 Huawei Technologies Co., Ltd. Method and system for converting point cloud data for use with 2D convolutional neural networks
JP7357676B2 (en) * 2018-11-15 2023-10-06 マジック リープ, インコーポレイテッド System and method for performing self-improving visual odometry
US10854006B2 (en) * 2018-11-15 2020-12-01 Palo Alto Research Center Incorporated AR-enabled labeling using aligned CAD models
US10867081B2 (en) 2018-11-21 2020-12-15 Best Apps, Llc Computer aided systems and methods for creating custom products
CN109685802B (en) * 2018-12-13 2023-09-15 泸州禾苗通信科技有限公司 Low-delay video segmentation real-time preview method
CN113632030A (en) 2018-12-27 2021-11-09 奇跃公司 System and method for virtual reality and augmented reality
CN109858524B (en) * 2019-01-04 2020-10-16 北京达佳互联信息技术有限公司 Gesture recognition method and device, electronic equipment and storage medium
FR3092426B1 (en) 2019-02-01 2021-09-24 Olivier Querbes Dynamic three-dimensional imaging process
US10424048B1 (en) * 2019-02-15 2019-09-24 Shotspotter, Inc. Systems and methods involving creation and/or utilization of image mosaic in classification of acoustic events
US11009698B2 (en) * 2019-03-13 2021-05-18 Nick Cherukuri Gaze-based user interface for augmented and mixed reality device
CN109919128B (en) * 2019-03-20 2021-04-13 联想(北京)有限公司 Control instruction acquisition method and device and electronic equipment
US11205112B2 (en) * 2019-04-01 2021-12-21 Honeywell International Inc. Deep neural network-based inertial measurement unit (IMU) sensor compensation method
DE102020109121A1 (en) * 2019-04-02 2020-10-08 Ascension Technology Corporation Correction of distortions
WO2020204939A1 (en) * 2019-04-05 2020-10-08 Google Llc Distributed machine-learned models for inference generation using wearable devices
KR102148382B1 (en) * 2019-04-25 2020-08-26 경희대학교 산학협력단 The meghod and device for conversion from signal of inertial sensor to image
US11044462B2 (en) 2019-05-02 2021-06-22 Niantic, Inc. Self-supervised training of a depth estimation model using depth hints
CN110223351B (en) * 2019-05-30 2021-02-19 杭州蓝芯科技有限公司 Depth camera positioning method based on convolutional neural network
US11719850B2 (en) * 2019-06-20 2023-08-08 Sony Interactive Entertainment Inc. Detecting and compensating for magnetic interference in electromagnetic (EM) positional tracking
US11099641B2 (en) 2019-06-27 2021-08-24 Disney Enterprises, Inc. Calibration, customization, and improved user experience for bionic lenses
US11107291B2 (en) * 2019-07-11 2021-08-31 Google Llc Traversing photo-augmented information through depth using gesture and UI controlled occlusion planes
TWI773907B (en) * 2019-07-11 2022-08-11 緯創資通股份有限公司 Data capturing apparatus and data calculation system and method
US10989916B2 (en) * 2019-08-20 2021-04-27 Google Llc Pose prediction with recurrent neural networks
US11763191B2 (en) * 2019-08-20 2023-09-19 The Calany Holding S. À R.L. Virtual intelligence and optimization through multi-source, real-time, and context-aware real-world data
CN110717866B (en) * 2019-09-03 2022-10-18 北京爱博同心医学科技有限公司 Image sharpening method based on augmented reality and augmented reality glasses
CN110530356B (en) * 2019-09-04 2021-11-23 海信视像科技股份有限公司 Pose information processing method, device, equipment and storage medium
CN110602709B (en) * 2019-09-16 2022-01-04 腾讯科技(深圳)有限公司 Network data security method and device of wearable device and storage medium
US11175729B2 (en) * 2019-09-19 2021-11-16 Finch Technologies Ltd. Orientation determination based on both images and inertial measurement units
CN110686906B (en) * 2019-10-09 2021-03-26 清华大学 Automatic driving test method and device for vehicle
JP7150894B2 (en) * 2019-10-15 2022-10-11 ベイジン・センスタイム・テクノロジー・デベロップメント・カンパニー・リミテッド AR scene image processing method and device, electronic device and storage medium
JP2022505999A (en) * 2019-10-15 2022-01-17 ベイジン センスタイム テクノロジー デベロップメント カンパニー, リミテッド Augmented reality data presentation methods, devices, equipment and storage media
JP2022553202A (en) 2019-10-18 2022-12-22 マジック リープ, インコーポレイテッド Gravity Estimation and Bundle Adjustment for Visual Inertial Odometry
CN110737339B (en) * 2019-10-28 2021-11-02 福州大学 Visual-tactile interaction model construction method based on deep learning
SG10201910949PA (en) * 2019-11-21 2020-11-27 Lian Wang Artificial Intelligence Brain
US20210157394A1 (en) 2019-11-24 2021-05-27 XRSpace CO., LTD. Motion tracking system and method
KR102260393B1 (en) * 2019-11-27 2021-06-03 주식회사 피앤씨솔루션 A head mounted display apparatus with automatic screen illumination intensity adjustment according to the user
CN113031753A (en) * 2019-12-09 2021-06-25 未来市股份有限公司 Motion sensing data generation method and motion sensing data generation system
CN111209915B (en) * 2019-12-25 2023-09-15 上海航天控制技术研究所 Three-dimensional image synchronous recognition and segmentation method based on deep learning
CN111222468A (en) * 2020-01-08 2020-06-02 浙江光珀智能科技有限公司 People stream detection method and system based on deep learning
CN111330255B (en) * 2020-01-16 2021-06-08 北京理工大学 Amazon chess-calling generation method based on deep convolutional neural network
CN111325097B (en) * 2020-01-22 2023-04-07 陕西师范大学 Enhanced single-stage decoupled time sequence action positioning method
CN111273777A (en) * 2020-02-11 2020-06-12 Oppo广东移动通信有限公司 Virtual content control method and device, electronic equipment and storage medium
US20210256174A1 (en) * 2020-02-13 2021-08-19 Best Apps, Llc Computer aided systems and methods for creating custom products
GB2588470B (en) * 2020-02-19 2022-01-12 Envisics Ltd Pupil expansion
US11263818B2 (en) * 2020-02-24 2022-03-01 Palo Alto Research Center Incorporated Augmented reality system using visual object recognition and stored geometry to create and render virtual objects
WO2021169766A1 (en) * 2020-02-25 2021-09-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. System and method for visualizing light rays in a scene
WO2021190280A1 (en) * 2020-03-24 2021-09-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd. System and method for augmented tele-cooperation
US11314950B2 (en) 2020-03-25 2022-04-26 International Business Machines Corporation Text style transfer using reinforcement learning
WO2021194487A1 (en) * 2020-03-25 2021-09-30 Hewlett-Packard Development Company, L.P. Head-related transfer functions with antropometric measurements
KR102331172B1 (en) * 2020-03-31 2021-11-26 엑스퍼트아이엔씨 주식회사 System for remote service and supporting of maintenance-repairment based in cloud employing smart glass
WO2021207162A1 (en) * 2020-04-06 2021-10-14 Pike Enterprises, Llc Virtual reality tracking system
US10996753B1 (en) 2020-04-07 2021-05-04 Eyetech Digital Systems, Inc. Multi-mode eye-tracking with independently operable illuminators
US11921917B2 (en) 2020-04-07 2024-03-05 Eyetech Digital Systems, Inc. Compact eye-tracking camera systems and methods
CN111461251A (en) * 2020-04-10 2020-07-28 桂林电子科技大学 Indoor positioning method of WiFi fingerprint based on random forest and self-encoder
KR20210128269A (en) 2020-04-16 2021-10-26 삼성전자주식회사 Augmented Reality (AR) device and method for predicting pose thereof
EP4143739A4 (en) 2020-05-01 2023-09-27 Magic Leap, Inc. Image descriptor network with imposed hierarchical normalization
CN111643809B (en) * 2020-05-29 2023-12-05 广州大学 Electromagnetic pulse control method and system based on potential intervention instrument
EP4162343A4 (en) * 2020-06-05 2024-06-12 Magic Leap, Inc. Enhanced eye tracking techniques based on neural network analysis of images
CN111881735B (en) * 2020-06-17 2022-07-29 武汉光庭信息技术股份有限公司 Event classification extraction method and device for automatic driving video data
WO2021260694A1 (en) * 2020-06-22 2021-12-30 Alon Melchner System and method for rendering virtual interactions of an immersive reality-virtuality continuum-based object and a real environment
US20210405739A1 (en) * 2020-06-26 2021-12-30 Sony Interactive Entertainment Inc. Motion matching for vr full body reconstruction
CN111935396A (en) * 2020-07-01 2020-11-13 青岛小鸟看看科技有限公司 6DoF data processing method and device of VR (virtual reality) all-in-one machine
US20230290132A1 (en) * 2020-07-29 2023-09-14 Magic Leap, Inc. Object recognition neural network training using multiple data sources
CN111896221B (en) * 2020-07-30 2021-08-17 四川大学 Alignment method of rotating optical measurement system for virtual coordinate system auxiliary camera calibration
CN111914753A (en) * 2020-08-03 2020-11-10 西安杰邦科技股份有限公司 Low-power-consumption intelligent gun aiming image processing system and method based on deep learning
US11320896B2 (en) * 2020-08-03 2022-05-03 Facebook Technologies, Llc. Systems and methods for object tracking using fused data
US11727719B2 (en) * 2020-08-28 2023-08-15 Stmicroelectronics, Inc. System and method for detecting human presence based on depth sensing and inertial measurement
US11113894B1 (en) * 2020-09-11 2021-09-07 Microsoft Technology Licensing, Llc Systems and methods for GPS-based and sensor-based relocalization
US11353700B2 (en) 2020-10-07 2022-06-07 Industrial Technology Research Institute Orientation predicting method, virtual reality headset and non-transitory computer-readable medium
US20230408830A1 (en) * 2020-10-30 2023-12-21 Hewlett-Packard Development Company, L.P. Head mounted display assembly
CN112365604B (en) * 2020-11-05 2024-08-23 深圳市中科先见医疗科技有限公司 AR equipment depth information application method based on semantic segmentation and SLAM
CN112433193B (en) * 2020-11-06 2023-04-07 山东产研信息与人工智能融合研究院有限公司 Multi-sensor-based mold position positioning method and system
CN112308015A (en) * 2020-11-18 2021-02-02 盐城鸿石智能科技有限公司 Novel depth recovery scheme based on 3D structured light
US20240106998A1 (en) * 2020-12-04 2024-03-28 Magic Leap, Inc. Miscalibration detection for virtual reality and augmented reality systems
CN112464958A (en) * 2020-12-11 2021-03-09 沈阳芯魂科技有限公司 Multi-modal neural network information processing method and device, electronic equipment and medium
CN112509006A (en) * 2020-12-11 2021-03-16 北京华捷艾米科技有限公司 Sub-map recovery fusion method and device
CN112561978B (en) * 2020-12-18 2023-11-17 北京百度网讯科技有限公司 Training method of depth estimation network, depth estimation method of image and equipment
CN112702522B (en) * 2020-12-25 2022-07-12 李灯 Self-adaptive control playing method based on VR live broadcast system
US11500463B2 (en) * 2020-12-30 2022-11-15 Imagine Technologies, Inc. Wearable electroencephalography sensor and device control methods using same
US11854280B2 (en) 2021-04-27 2023-12-26 Toyota Research Institute, Inc. Learning monocular 3D object detection from 2D semantic keypoint detection
CN113469041A (en) * 2021-06-30 2021-10-01 北京市商汤科技开发有限公司 Image processing method and device, computer equipment and storage medium
CN113793389B (en) * 2021-08-24 2024-01-26 国网甘肃省电力公司 Virtual-real fusion calibration method and device for augmented reality system
US12033264B2 (en) 2021-09-20 2024-07-09 Idoru, Inc. Systems and methods for authoring and managing extended reality (XR) avatars
US11763496B2 (en) 2021-09-30 2023-09-19 Lemon Inc. Social networking based on asset items
US11417069B1 (en) * 2021-10-05 2022-08-16 Awe Company Limited Object and camera localization system and localization method for mapping of the real world
US11797127B1 (en) * 2021-11-16 2023-10-24 Alken Inc. Hybrid tracking with auto-correction
WO2023096916A1 (en) * 2021-11-23 2023-06-01 Compass Pathfinder Limited Apparatuses, systems, and methods for a real time bioadaptive stimulus environment
USD1037314S1 (en) * 2021-11-24 2024-07-30 Nike, Inc. Display screen with headwear icon
US20230241491A1 (en) * 2022-01-31 2023-08-03 Sony Interactive Entertainment Inc. Systems and methods for determining a type of material of an object in a real-world environment
US12002290B2 (en) 2022-02-25 2024-06-04 Eyetech Digital Systems, Inc. Systems and methods for hybrid edge/cloud processing of eye-tracking image data
US11822736B1 (en) 2022-05-18 2023-11-21 Google Llc Passive-accessory mediated gesture interaction with a head-mounted device
US20240012238A1 (en) * 2022-07-06 2024-01-11 Htc Corporation Tracking apparatus, method, and non-transitory computer readable storage medium thereof
US11776206B1 (en) 2022-12-23 2023-10-03 Awe Company Limited Extended reality system and extended reality method with two-way digital interactive digital twins
KR102585261B1 (en) * 2023-04-26 2023-10-06 주식회사 케이유전자 An optimized multi-camera calibration system based on an adaptive image augmentation method using a single image of a 3D calibration object


Family Cites Families (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5291560A (en) 1991-07-15 1994-03-01 Iri Scan Incorporated Biometric personal identification system based on iris analysis
US6222525B1 (en) 1992-03-05 2001-04-24 Brad A. Armstrong Image controllers with sheet connected sensors
JPH08212184A (en) * 1995-02-01 1996-08-20 Fujitsu Ltd Recognition device and deficiency value estimating and learning method
US5583795A (en) 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US5670988A (en) 1995-09-05 1997-09-23 Interlink Electronics, Inc. Trigger operated electronic device
JPH10334071A (en) * 1997-06-02 1998-12-18 Nippon Telegr & Teleph Corp <Ntt> Method and device for parallel generalized learning for neural circuit network model
FR2767943B1 (en) * 1997-09-04 1999-11-26 Alpha Mos Sa CLASSIFICATION APPARATUS USING A COMBINATION OF STATISTICAL METHODS AND NEURAL NETWORKS, IN PARTICULAR FOR THE RECOGNITION OF ODORS
CN1650622B (en) 2002-03-13 2012-09-05 图象公司 Systems and methods for digitally re-mastering or otherwise modifying motion pictures or other image sequences data
EP1553872B1 (en) * 2002-10-15 2010-01-13 Volvo Technology Corporation Method for interpreting a subjects head and eye activity
US8098901B2 (en) 2005-01-26 2012-01-17 Honeywell International Inc. Standoff iris recognition system
KR20050025927A (en) 2003-09-08 2005-03-14 유웅덕 The pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its
US7248720B2 (en) 2004-10-21 2007-07-24 Retica Systems, Inc. Method and system for generating a combined retina/iris pattern biometric
WO2006078015A1 (en) * 2005-01-24 2006-07-27 National University Corporation Yokohama National University Categorical color perception system
US11428937B2 (en) 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US8696113B2 (en) 2005-10-07 2014-04-15 Percept Technologies Inc. Enhanced optical and perceptual digital eyewear
US20070081123A1 (en) 2005-10-07 2007-04-12 Lewis Scott W Digital eyewear
JP2007122362A (en) 2005-10-27 2007-05-17 Toyota Motor Corp State estimation method using neural network and state estimation apparatus using neural network
JP4824420B2 (en) 2006-02-07 2011-11-30 アイテック株式会社 Gaze vector detection method and apparatus
US7970179B2 (en) 2006-09-25 2011-06-28 Identix Incorporated Iris data extraction
US8363783B2 (en) 2007-06-04 2013-01-29 Oraya Therapeutics, Inc. Method and device for ocular alignment and coupling of ocular structures
US8098891B2 (en) 2007-11-29 2012-01-17 Nec Laboratories America, Inc. Efficient multi-hypothesis multi-human 3D tracking in crowded scenes
EP2257636A4 (en) 2008-07-03 2014-10-15 Nec Lab America Inc Epithelial layer detector and related methods
US8768014B2 (en) 2009-01-14 2014-07-01 Indiana University Research And Technology Corp. System and method for identifying a person with reference to a sclera image
CN102811684B (en) 2010-01-22 2015-09-09 眼科医疗公司 For automatically placing the device of scanning laser capsulorhexis otch
US8345984B2 (en) 2010-01-28 2013-01-01 Nec Laboratories America, Inc. 3D convolutional neural networks for automatic human action recognition
US11935281B2 (en) * 2010-06-07 2024-03-19 Affectiva, Inc. Vehicular in-cabin facial tracking using machine learning
US9304319B2 (en) 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
US10156722B2 (en) 2010-12-24 2018-12-18 Magic Leap, Inc. Methods and systems for displaying stereoscopy with a freeform optical system with addressable focus for virtual and augmented reality
WO2012088478A1 (en) 2010-12-24 2012-06-28 Chunyu Gao An ergonomic head mounted display device and optical system
RU2017118159A (en) 2011-05-06 2018-10-30 Мэджик Лип, Инк. WORLD OF MASS SIMULTANEOUS REMOTE DIGITAL PRESENCE
CN102306088A (en) * 2011-06-23 2012-01-04 北京北方卓立科技有限公司 Solid projection false or true registration device and method
WO2013049861A1 (en) 2011-09-29 2013-04-04 Magic Leap, Inc. Tactile glove for human-computer interaction
RU2621633C2 (en) 2011-10-28 2017-06-06 Мэджик Лип, Инк. System and method for augmented and virtual reality
RU2628164C2 (en) 2011-11-23 2017-08-15 Мэджик Лип, Инк. System of display of three-dimensional virtual and additional reality
AU2013243380B2 (en) 2012-04-05 2017-04-20 Magic Leap, Inc. Wide-field of view (FOV) imaging devices with active foveation capability
CN104737061B (en) 2012-06-11 2018-01-16 奇跃公司 Use more depth plane three dimensional displays of the waveguided reflector arrays projector
US9671566B2 (en) 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US9141916B1 (en) 2012-06-29 2015-09-22 Google Inc. Using embedding functions with a deep network
CN104244807B (en) 2012-07-31 2016-10-19 国立研究开发法人科学技术振兴机构 Watch point detection device and method for viewing points detecting attentively
US8369595B1 (en) 2012-08-10 2013-02-05 EyeVerify LLC Texture features for biometric authentication
KR20150054967A (en) 2012-09-11 2015-05-20 매직 립, 인코포레이티드 Ergonomic head mounted display device and optical system
AU2014207545B2 (en) 2013-01-15 2018-03-15 Magic Leap, Inc. Ultra-high resolution scanning fiber display
JP6083251B2 (en) 2013-02-18 2017-02-22 応用地質株式会社 Distributed exploration system for obtaining electrical characteristics of underground and distributed exploration method using the same
US10295826B2 (en) * 2013-02-19 2019-05-21 Mirama Service Inc. Shape recognition device, shape recognition program, and shape recognition method
IL308285B1 (en) 2013-03-11 2024-07-01 Magic Leap Inc System and method for augmented and virtual reality
US9147154B2 (en) 2013-03-13 2015-09-29 Google Inc. Classifying resources using a deep network
EP4027222A1 (en) 2013-03-15 2022-07-13 Magic Leap, Inc. Display system and method
WO2014182769A1 (en) 2013-05-07 2014-11-13 The Johns Hopkins University Automated and non-mydriatic fundus-perimetry camera for irreversible eye diseases
US9275308B2 (en) 2013-05-31 2016-03-01 Google Inc. Object detection using deep neural networks
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US9874749B2 (en) 2013-11-27 2018-01-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
US20140380249A1 (en) 2013-06-25 2014-12-25 Apple Inc. Visual recognition of gestures
CN103431840B (en) 2013-07-31 2016-01-20 北京智谷睿拓技术服务有限公司 Eye optical parameter detecting system and method
AU2014337171B2 (en) 2013-10-16 2018-11-15 Magic Leap, Inc. Virtual or augmented reality headsets having adjustable interpupillary distance
US9202144B2 (en) 2013-10-30 2015-12-01 Nec Laboratories America, Inc. Regionlets with shift invariant neural patterns for object detection
US10095917B2 (en) * 2013-11-04 2018-10-09 Facebook, Inc. Systems and methods for facial representation
US20150130799A1 (en) * 2013-11-12 2015-05-14 Fyusion, Inc. Analysis and manipulation of images and video for generation of surround views
JP6236296B2 (en) * 2013-11-14 2017-11-22 株式会社デンソーアイティーラボラトリ Learning device, learning program, and learning method
US9857591B2 (en) 2014-05-30 2018-01-02 Magic Leap, Inc. Methods and system for creating focal planes in virtual and augmented reality
WO2015081313A2 (en) 2013-11-27 2015-06-04 Magic Leap, Inc. Virtual and augmented reality systems and methods
US9430829B2 (en) 2014-01-30 2016-08-30 Case Western Reserve University Automatic detection of mitosis using handcrafted and convolutional neural network features
WO2015117039A1 (en) 2014-01-31 2015-08-06 Magic Leap, Inc. Multi-focal display system and method
EP3100099B1 (en) 2014-01-31 2020-07-01 Magic Leap, Inc. Multi-focal display system and method
US10203762B2 (en) 2014-03-11 2019-02-12 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
US10417824B2 (en) 2014-03-25 2019-09-17 Apple Inc. Method and system for representing a virtual object in a view of a real environment
IL231862A (en) * 2014-04-01 2015-04-30 Superfish Ltd Neural network image representation
WO2015164807A1 (en) 2014-04-25 2015-10-29 Texas State University Detection of brain injury and subject state with eye movement biometrics
WO2016018488A2 (en) 2014-05-09 2016-02-04 Eyefluence, Inc. Systems and methods for discerning eye signals and continuous biometric identification
EP3149539A4 (en) 2014-05-30 2018-02-07 Magic Leap, Inc. Methods and systems for generating virtual content display with a virtual or augmented reality apparatus
WO2015192117A1 (en) * 2014-06-14 2015-12-17 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
US9536293B2 (en) 2014-07-30 2017-01-03 Adobe Systems Incorporated Image assessment using deep convolutional neural networks
US20160034811A1 (en) * 2014-07-31 2016-02-04 Apple Inc. Efficient generation of complementary acoustic models for performing automatic speech recognition system combination
US9659384B2 (en) 2014-10-03 2017-05-23 EyeEm Mobile GmbH. Systems, methods, and computer program products for searching and sorting images by aesthetic quality
CN105917354A (en) 2014-10-09 2016-08-31 微软技术许可有限责任公司 Spatial pyramid pooling networks for image processing
EP3161728B1 (en) 2014-10-10 2023-05-17 Beijing Kuangshi Technology Co., Ltd. Hierarchical interlinked multi-scale convolutional network for image parsing
KR102276339B1 (en) 2014-12-09 2021-07-12 삼성전자주식회사 Apparatus and method for training convolutional neural network for approximation of convolutional neural network
US9418458B2 (en) 2015-01-05 2016-08-16 Superfish Ltd. Graph image representation from convolutional neural networks
EP3251059B1 (en) 2015-01-28 2018-12-05 Google LLC Batch normalization layers
WO2016145379A1 (en) * 2015-03-12 2016-09-15 William Marsh Rice University Automated Compilation of Probabilistic Task Description into Executable Neural Network Specification
USD758367S1 (en) 2015-05-14 2016-06-07 Magic Leap, Inc. Virtual reality headset
KR20230150397A (en) 2015-08-21 2023-10-30 매직 립, 인코포레이티드 Eyelid shape estimation using eye pose measurement
CN105654037B (en) * 2015-12-21 2019-05-21 浙江大学 A kind of electromyography signal gesture identification method based on deep learning and characteristic image
WO2018013200A1 (en) 2016-07-14 2018-01-18 Magic Leap, Inc. Deep neural network for iris identification
CN109661194B (en) 2016-07-14 2022-02-25 奇跃公司 Iris boundary estimation using corneal curvature
CA3034644A1 (en) 2016-08-22 2018-03-01 Magic Leap, Inc. Augmented reality display device with deep learning sensors

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013049248A2 (en) * 2011-09-26 2013-04-04 Osterhout Group, Inc. Video display modification based on sensor input for a see-through near-to-eye display
US20130083018A1 (en) * 2011-09-30 2013-04-04 Kevin A. Geisner Personal audio/visual system with holographic objects

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI SIJIN ET AL: "Heterogeneous Multi-task Learning for Human Pose Estimation with Deep Convolutional Neural Network", 2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, IEEE, 23 June 2014 (2014-06-23), pages 488 - 495, XP032649680, DOI: 10.1109/CVPRW.2014.78 *
ZHANG CHA ET AL: "Improving multiview face detection with multi-task deep convolutional neural networks", IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION, IEEE, 24 March 2014 (2014-03-24), pages 1036 - 1041, XP032609868, DOI: 10.1109/WACV.2014.6835990 *

Also Published As

Publication number Publication date
KR102529137B1 (en) 2023-05-03
US10733447B2 (en) 2020-08-04
US11120266B2 (en) 2021-09-14
EP3500911B1 (en) 2023-09-27
AU2017317599B2 (en) 2021-12-23
US20190340435A1 (en) 2019-11-07
IL294129A (en) 2022-08-01
WO2018039269A1 (en) 2018-03-01
CN109923500A (en) 2019-06-21
JP2022058427A (en) 2022-04-12
IL264820B (en) 2021-03-25
US11797078B2 (en) 2023-10-24
KR20190041504A (en) 2019-04-22
IL294129B2 (en) 2024-01-01
US20200334461A1 (en) 2020-10-22
JP7254895B2 (en) 2023-04-10
AU2022201949A1 (en) 2022-04-14
US20220067378A1 (en) 2022-03-03
EP3500911A1 (en) 2019-06-26
JP7002536B2 (en) 2022-01-20
IL294129B1 (en) 2023-09-01
US10402649B2 (en) 2019-09-03
CN109923500B (en) 2022-01-04
CN114253400A (en) 2022-03-29
IL281241A (en) 2021-04-29
AU2017317599A1 (en) 2019-03-21
IL281241B (en) 2022-08-01
US20180053056A1 (en) 2018-02-22
JP2019532392A (en) 2019-11-07
CA3034644A1 (en) 2018-03-01
KR102439771B1 (en) 2022-09-02
JP2023093489A (en) 2023-07-04
JP7476387B2 (en) 2024-04-30
KR20220123565A (en) 2022-09-07

Similar Documents

Publication Publication Date Title
IL294129A (en) Augmented reality display device with deep learning sensors
EP3688526A4 (en) Augmented reality display
EP3549122A4 (en) Display device
EP3511985A4 (en) Display device
EP3507789A4 (en) Display device
EP3519885A4 (en) Display device
EP3226063A4 (en) Optical device and display device
EP3265978A4 (en) Authentication-activated augmented reality display device
EP3252742A4 (en) Transparent display device
EP3415971A4 (en) Information display device
EP3565275A4 (en) Display device
EP3561585A4 (en) Display device
EP3401898A4 (en) Display device
EP3508907A4 (en) Display device
EP3370109A4 (en) Display device
EP3523575A4 (en) Transparent display and method
EP3400481A4 (en) Display device
EP3504872A4 (en) Display device
EP3416154A4 (en) Display device
EP3510767A4 (en) Display device
EP3488604A4 (en) Display device
EP3297046A4 (en) Light-emitting instrument and image display device
EP3304192A4 (en) Color display device and driving methods therefor
EP3539287A4 (en) Display device
EP3422330A4 (en) Display device

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190218

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20200327

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/01 20060101AFI20200324BHEP

Ipc: G06T 19/00 20110101ALI20200324BHEP

Ipc: G02B 27/01 20060101ALI20200324BHEP

Ipc: G06N 3/02 20060101ALI20200324BHEP

Ipc: G06N 3/04 20060101ALI20200324BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210825

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20230330

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230608

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602017074741

Country of ref document: DE

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231227

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20231228

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1616049

Country of ref document: AT

Kind code of ref document: T

Effective date: 20230927

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240127

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240127

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20240129

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602017074741

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20230927

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20240723

Year of fee payment: 8

26N No opposition filed

Effective date: 20240628