WO2020042589A1 - User distance estimation method and apparatus, device, and storage medium - Google Patents

User distance estimation method and apparatus, device, and storage medium Download PDF

Info

Publication number
WO2020042589A1
WO2020042589A1 (PCT/CN2019/078025; CN2019078025W)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
user
gaze tracking
tracking device
eye image
Prior art date
Application number
PCT/CN2019/078025
Other languages
English (en)
Chinese (zh)
Inventor
付阳
王云飞
黄通兵
Original Assignee
北京七鑫易维信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京七鑫易维信息技术有限公司 filed Critical 北京七鑫易维信息技术有限公司
Publication of WO2020042589A1 publication Critical patent/WO2020042589A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Definitions

  • the embodiments of the present application relate to the technical field of gaze tracking, and in particular to a user distance estimation method, apparatus, device, and storage medium.
  • FIG. 1 is a schematic structural diagram of a gaze tracking device provided in the prior art.
  • the gaze tracking device is formed by connecting the image acquisition device 10 and the infrared light source 20 to the gaze tracking device body 30 via data lines.
  • infrared light is invisible to the human eye, so an infrared light source can replace an ordinary light source to illuminate the eyes at close range without affecting a person's normal vision.
  • This patent proposes a method for estimating the distance from a user to a gaze tracking device.
  • the embodiments of the present application provide a user distance estimation method, apparatus, device, and storage medium, so as to accurately estimate the distance between the user and the gaze tracking device.
  • an embodiment of the present application provides a user distance estimation method.
  • the user distance estimation method includes:
  • a distance between the user and the gaze tracking device is determined by an image of the eye of the user.
  • the acquiring an eye image of a user includes:
  • the eye image of the user, collected by a user terminal, is received through a wired and/or wireless communication module.
  • determining the distance between the user and the gaze tracking device through the eye image of the user includes:
  • a distance between the user and the gaze tracking device is determined according to the spot distance and / or the binocular distance.
  • determining the distance between the user and the gaze tracking device according to the light spot distance and / or the binocular distance includes:
  • the distance between the user and the gaze tracking device is determined by a functional relationship, or a mapping table, between the spot distance and/or binocular distance and that distance.
  • an embodiment of the present application further provides a user distance estimation device including a target screen, a gaze tracking device main body, an image acquisition device, and an infrared light source, and further includes:
  • a wireless communication module configured to receive a user's eye image collected by a current user terminal
  • the image processing module is connected to the wireless communication module and is configured to determine a distance between the user and the gaze tracking device through an image of the eye of the user.
  • the wireless communication module is configured to:
  • receive, through a wired and/or wireless communication module, the eye image of the user collected by the user terminal.
  • the image processing module includes:
  • a spot distance or binocular distance determination unit configured to determine, based on the eye image, a spot distance or binocular distance of an eye in the eye image
  • the distance determining unit is configured to determine a distance between the user and the gaze tracking device according to the spot distance or the binocular distance.
  • the distance determining unit is configured to:
  • determine the distance between the user and the gaze tracking device by a functional relationship, or a mapping table, between the spot distance and/or binocular distance and that distance.
  • an embodiment of the present application further provides a device, where the device includes:
  • one or more processors;
  • a storage device configured to store one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the user distance estimation method according to any one of the embodiments of the present application.
  • an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored.
  • when the program is executed by a processor, the user distance estimation method according to any one of the embodiments of the present application is implemented.
  • the distance between the user and the gaze tracking device is determined by acquiring an eye image of the user, and the position of the gaze tracking device body is adjusted according to that distance, so as to accurately estimate the distance between the user and the gaze tracking device.
  • FIG. 1 is a schematic structural diagram of a gaze tracking device provided in the prior art
  • FIG. 3 is a flowchart of a user distance estimation method provided in Embodiment 2 of the present application.
  • FIG. 4 is a structural diagram of a user distance estimation device provided in Embodiment 3 of the present application.
  • FIG. 5 is a schematic structural diagram of a device provided in Embodiment 4 of the present application.
  • FIG. 2 is a flowchart of a user distance estimation method provided in Embodiment 1 of the present application.
  • This embodiment is applicable to a case where a distance between a user and a gaze tracking device is estimated.
  • the method may be carried out by the user distance estimation device.
  • the user distance is a general notion: the distance from a user feature point to the gaze tracking device, or the average distance from multiple user feature points to the gaze tracking device.
  • the user feature point may be the user's eyebrow, left eye, right eye, nose tip, or forehead, and the corresponding position on the gaze tracking device may be the optical center of the image acquisition device of the gaze tracking device, its sensor, or the plane of the display of the gaze tracking device.
  • the user distance estimation method provided in the embodiment of the present application specifically includes the following execution steps:
  • the eye image may be captured by directing a light source at the eye and photographing the eye with an eye image acquisition device.
  • the eye image acquisition device may include at least one image acquisition device, which may be an infrared camera or an infrared video camera, etc.
  • the light spot on the cornea, that is, the glint (also referred to as a Purkinje image), is captured, and the eye image thus obtained is the user's eye image with a light spot.
  • as the user's gaze moves, the relative positional relationship between the pupil center and the light spot changes, the collected eye image of the user with the light spot reflects the corresponding positional change, and the gaze direction or gaze point is estimated from this positional change for gaze tracking.
  • when the distance between the user and the gaze tracking device is unsuitable (for example, too close), the light spot formed on the cornea when the light source shines on the eye may not be properly captured by the image acquisition device; at this time, the distance between the user and the gaze tracking device needs to be estimated from the obtained eye image of the user with light spots.
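To make the glint-based quantities concrete, here is a minimal, purely illustrative NumPy sketch of locating two corneal light spots in a synthetic infrared eye image and measuring their on-image distance. The brightness threshold, the assumption of exactly two horizontally separated glints, and the crude left/right clustering are my assumptions for illustration, not details taken from this application.

```python
import numpy as np

def glint_centroids(eye_img, thresh=200):
    """Return the centroids of the two bright corneal glints, found by
    thresholding and splitting the bright pixels into a left and a right
    cluster (assumes two horizontally separated infrared light sources)."""
    ys, xs = np.nonzero(eye_img > thresh)
    if len(xs) < 2:
        return None
    left = xs < xs.mean()                  # crude left/right split
    c1 = (xs[left].mean(), ys[left].mean())
    c2 = (xs[~left].mean(), ys[~left].mean())
    return c1, c2

def inter_glint_distance(eye_img, thresh=200):
    """igd: Euclidean distance in pixels between the two glint centroids."""
    (x1, y1), (x2, y2) = glint_centroids(eye_img, thresh)
    return float(np.hypot(x1 - x2, y1 - y2))

# Synthetic 100x100 "eye image" with two 4x4 glints 20 px apart.
img = np.zeros((100, 100), dtype=np.uint8)
img[48:52, 38:42] = 255   # left glint, centroid near (39.5, 49.5)
img[48:52, 58:62] = 255   # right glint, centroid near (59.5, 49.5)
print(inter_glint_distance(img))  # → 20.0
```

In a real system the glints would come from the infrared light source described above, and a robust implementation would use connected-component labeling rather than this single-axis split.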
  • the embodiment of the present application determines the distance between the user and the gaze tracking device by acquiring an eye image of the user, so as to accurately estimate the distance between the user and the gaze tracking device.
  • FIG. 3 is a flowchart of a user distance estimation method provided in Embodiment 2 of the present application.
  • This embodiment is optimized based on the foregoing embodiment.
  • the step of obtaining the eye image of the user is further optimized as: receiving, through a wired and/or wireless communication module, the eye image of the user collected by a user terminal.
  • the step of determining the distance between the user and the gaze tracking device through the eye image of the user is further optimized as follows: determining, from the eye image, the light spot distance or the binocular distance of the eyes in the eye image; and determining the distance between the user and the gaze tracking device according to the spot distance and/or the binocular distance.
  • the method in this embodiment specifically includes the following execution steps:
  • S210 Receive a user's eye image obtained by a user terminal through a wired and / or wireless communication module.
  • the user terminal may be any of a variety of user-operated electronic devices with a display platform, and the user terminal may be provided with an image acquisition device for acquiring a screen image of the electronic device's display platform.
  • for example, a user may wear a smart head-mounted device, such as smart glasses, that has an image acquisition device capable of capturing screen images of the display platform of an electronic device; the image acquisition device may be a camera or a video camera, etc.
  • the device for capturing the target screen image is only described by way of example in the embodiment of the present application and is not limited thereto.
  • the collected eye image is sent to the wireless communication module of the gaze tracking device through the wireless communication module on the user side; that is, the user-side electronic device and the display-platform electronic device may each have a built-in wireless communication module for wireless data transmission, so that the display-platform electronic device obtains the user's eye image collected by the user-side image acquisition device.
  • S220 Determine the spot distance or binocular distance of the eyes in the eye image through the eye image.
  • the distance from the user to the gaze tracking device is denoted d.
  • the distance on the image plane between the light spots in the same eye is defined as igd (inter-glint distance), and the binocular distance on the image plane is defined as ipd (inter-pupil distance).
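A simple pinhole-camera argument shows why d and the on-image ipd (or igd) are inversely related, which motivates the mapping functions discussed next. The focal length in pixels and the assumed real-world interpupillary distance below are illustrative values of my choosing, not parameters from this application.

```python
# Under a pinhole camera model, a real interpupillary distance IPD_MM at
# distance d_mm projects to ipd_px = F_PX * IPD_MM / d_mm pixels on the
# image plane, so d can be recovered by inverting that relation.
F_PX = 800.0    # camera focal length in pixels (assumed)
IPD_MM = 63.0   # average adult interpupillary distance in mm (assumed)

def distance_from_ipd(ipd_px):
    """Estimate user distance in mm from the measured on-image ipd in px."""
    return F_PX * IPD_MM / ipd_px

print(distance_from_ipd(100.8))  # → 500.0 (mm)
```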
  • determining the distance between the user and the gaze tracking device according to the spot distance and / or the binocular distance includes:
  • the distance between the user and the gaze tracking device is determined by a functional relationship, or a mapping table, between the spot distance and/or binocular distance and that distance.
  • a mapping relationship between d and igd, between d and ipd, or between d and both igd and ipd can be established to estimate d.
  • this mapping relationship may be a function or a mapping table.
  • in the function, c is a constant, and n is the degree of the estimation equation, which can be 2, 3, 4, and so on.
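As a sketch of the mapping-table alternative, calibrated (ipd, d) pairs can be stored and linearly interpolated between entries. The table values below are synthetic, not calibration data from this application.

```python
import numpy as np

# Hypothetical calibration table: measured on-image ipd (px, ascending)
# and the corresponding true user distance (mm) recorded at calibration.
table_ipd = np.array([50.0, 63.0, 84.0, 126.0])
table_d   = np.array([1008.0, 800.0, 600.0, 400.0])

def lookup_d(ipd_px):
    """Linearly interpolate the distance for a measured ipd."""
    return float(np.interp(ipd_px, table_ipd, table_d))

print(lookup_d(63.0))  # → 800.0, an exact table entry
```

A denser table trades memory for accuracy; interpolation smooths over the gaps between entries.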
  • k and c can be determined through empirical statistics over a large number of test subjects, or derived from the positional relationship between the gaze tracking device and a spatial model of the user's eyeball.
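The empirical-statistics route can be sketched as an ordinary least-squares fit. Assuming, purely for illustration, the model d = k/ipd + c and a few synthetic calibration samples (known distances paired with measured ipd values), k and c fall out of a linear solve:

```python
import numpy as np

# Synthetic calibration samples: measured ipd in pixels and the true
# distance in mm at which each measurement was taken (illustrative).
ipd = np.array([126.0, 84.0, 63.0, 50.4])
d   = np.array([400.0, 600.0, 800.0, 1000.0])

# The model d = k * (1/ipd) + c is linear in (k, c), so ordinary least
# squares applies directly.
A = np.column_stack([1.0 / ipd, np.ones_like(ipd)])
(k, c), *_ = np.linalg.lstsq(A, d, rcond=None)
print(round(float(k)), round(float(c)))  # → 50400 0 (samples satisfy d = 50400/ipd)
```

With real measurements the residual would be nonzero, and higher-degree terms (the n above) could be added as extra columns of A.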
  • the interval of ipd or igd, that is, the domain of the function, can be partitioned, and different functions used over the resulting intervals to obtain more accurate results.
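Partitioning the domain as just described might look like the following, where each igd interval gets its own fitted (k, c) pair. The breakpoints and coefficients are hypothetical calibration results, not values from this application.

```python
# Hypothetical per-interval fits for d = k / igd + c; a real system
# would obtain these from calibration on each sub-interval.
INTERVALS = [
    # (igd_low, igd_high, k, c)
    (10.0,  30.0, 5200.0, 12.0),
    (30.0,  60.0, 5050.0, 17.0),
    (60.0, 120.0, 4900.0, 26.0),
]

def estimate_d(igd):
    """Pick the function for the interval containing igd and evaluate it."""
    for lo, hi, k, c in INTERVALS:
        if lo <= igd < hi:
            return k / igd + c
    raise ValueError(f"igd={igd} outside the calibrated range")

print(estimate_d(40.0))  # → 143.25  (5050/40 + 17)
```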
  • the method of determining the spot distance or the binocular distance can be selected by the user according to the actual situation; the embodiment of the present application merely explains this and imposes no restriction on it.
  • an eye image of a user is obtained, the distance between the user and the gaze tracking device is determined, and the position of the gaze tracking device body is adjusted according to that distance, where the gaze tracking device body includes an image acquisition device and a light source.
  • FIG. 4 is a structural diagram of a user distance estimation device provided in Embodiment 3 of the present application. This embodiment is applicable to a case where a distance between a user and a gaze tracking device is estimated.
  • the user distance estimation device includes a wireless communication module 410 and an image processing module 420, where:
  • the wireless communication module 410 is configured to receive a user's eye image collected by a current user terminal;
  • the image processing module 420 is connected to the wireless communication module and is configured to determine a distance between the user and the gaze tracking device through an image of the eye of the user.
  • a user distance estimation device of this embodiment determines the distance between the user and the gaze tracking device by acquiring an eye image of the user, so as to accurately estimate the distance between the user and the gaze tracking device.
  • the wireless communication module 410 is configured to:
  • receive, through a wired and/or wireless communication module, the eye image of the user collected by the user terminal.
  • the image processing module 420 includes:
  • a spot distance or binocular distance determination unit configured to determine, based on the eye image, a spot distance or binocular distance of an eye in the eye image
  • the distance determining unit is configured to determine a distance between the user and the gaze tracking device according to the spot distance and / or the binocular distance.
  • the distance determining unit is configured to:
  • determine the distance between the user and the gaze tracking device by a functional relationship, or a mapping table, between the spot distance and/or binocular distance and that distance.
  • the user distance estimation device provided by each of the foregoing embodiments can execute the user distance estimation method provided by any embodiment of the present application, and has the corresponding functional modules and beneficial effects of executing the user distance estimation method.
  • FIG. 5 is a schematic structural diagram of a device provided in Embodiment 4 of the present application.
  • FIG. 5 shows a block diagram of an exemplary device 512 suitable for use in implementing embodiments of the present application.
  • the device 512 shown in FIG. 5 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
  • the device 512 is represented in the form of a general-purpose computing device.
  • the components of the device 512 may include, but are not limited to, one or more processors 516, a system memory 528, and a bus 518 connecting different system components (including the system memory 528 and the processor 516).
  • the bus 518 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
  • these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
  • the device 512 typically includes a variety of computer system-readable media. These media can be any available media that can be accessed by the device 512, including volatile and non-volatile media, removable and non-removable media.
  • System memory 528 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 530 and / or cache memory 532.
  • the device 512 may further include other removable / non-removable, volatile / nonvolatile computer system storage media.
  • the storage device 534 may be used to read from and write to a non-removable, non-volatile magnetic medium (not shown in FIG. 5, commonly referred to as a "hard drive").
  • a disk drive for reading from and writing to a removable non-volatile magnetic disk (such as a "floppy disk"), and an optical disk drive for reading from and writing to a removable non-volatile optical disk (such as a CD-ROM, DVD-ROM, or other optical media), may also be provided.
  • each drive may be connected to the bus 518 through one or more data medium interfaces.
  • the memory 528 may include at least one program product having a set (eg, at least one) of program modules configured to perform the functions of the embodiments of the present application.
  • a program/utility 540 having a set (at least one) of program modules 542 may be stored, for example, in the memory 528.
  • such program modules 542 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
  • Program module 542 generally performs functions and / or methods in the embodiments described herein.
  • the device 512 may also communicate with one or more external devices 514 (e.g., a keyboard, a pointing device, a display 524), with one or more devices that enable a user to interact with the device 512, and/or with any device (e.g., a network card or a modem) that enables the device 512 to communicate with one or more other computing devices.
  • such communication can be performed through an input/output (I/O) interface 522.
  • the device 512 may also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and / or a public network, such as the Internet) through the network adapter 520.
  • the network adapter 520 communicates with the other modules of the device 512 through the bus 518. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the device 512, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
  • the processor 516 executes various functional applications and data processing by running a program stored in the system memory 528, for example, implementing a user distance estimation method provided in the embodiment of the present application.
  • the method includes:
  • a distance between the user and the gaze tracking device is determined by an image of the eye of the user.
  • the processor may also implement the technical solution of the user distance estimation method provided by any embodiment of the present application.
  • Embodiment 5 of the present application further provides a computer-readable storage medium on which a computer program is stored.
  • when the program is executed by a processor, the user distance estimation method provided by the embodiments of the present application is implemented.
  • the method includes:
  • a distance between the user and the gaze tracking device is determined by an image of the eye of the user.
  • the computer-readable storage medium provided in the embodiments of the present application is not limited to the method operations described above; the computer program stored on it can also perform related operations in the user distance estimation method provided by any embodiment of the present application.
  • the computer storage medium in the embodiments of the present application may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in connection with an instruction execution system, apparatus, or device.
  • the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, which carries computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and the computer-readable medium may send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • computer program code for performing the operations of this application may be written in one or more programming languages, or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar.
  • the program code can be executed entirely on the user's computer, partly on the user's computer, as an independent software package, partly on the user's computer, partly on a remote computer, or entirely on a remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Eye Examination Apparatus (AREA)
  • Position Input By Displaying (AREA)

Abstract

The embodiments of the present invention relate to a user distance estimation method and apparatus, a device, and a storage medium. The method comprises the steps of: acquiring a user eye image; and, by means of the user eye image, determining the distance between the user and a gaze tracking device. By acquiring a user eye image and determining the distance between the user and a gaze tracking device, the embodiments of the present invention accurately estimate the distance between the user and the gaze tracking device.
PCT/CN2019/078025 2018-08-29 2019-03-13 Procédé et appareil d'estimation de distance d'utilisateur, dispositif, et support de stockage WO2020042589A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810998616.0A CN109284002B (zh) 2018-08-29 2018-08-29 一种用户距离估算方法、装置、设备及存储介质
CN201810998616.0 2018-08-29

Publications (1)

Publication Number Publication Date
WO2020042589A1 (fr) 2020-03-05

Family

ID=65184191

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/078025 WO2020042589A1 (fr) 2018-08-29 2019-03-13 Procédé et appareil d'estimation de distance d'utilisateur, dispositif, et support de stockage

Country Status (2)

Country Link
CN (1) CN109284002B (fr)
WO (1) WO2020042589A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109284002B (zh) * 2018-08-29 2020-12-29 北京七鑫易维信息技术有限公司 一种用户距离估算方法、装置、设备及存储介质

Citations (8)

Publication number Priority date Publication date Assignee Title
CN103366157A (zh) * 2013-05-03 2013-10-23 马建 一种人眼视线距离的判断方法
US20140098198A1 (en) * 2012-10-09 2014-04-10 Electronics And Telecommunications Research Institute Apparatus and method for eye tracking
CN104850842A (zh) * 2015-05-21 2015-08-19 北京中科虹霸科技有限公司 移动终端虹膜识别的人机交互方法
CN104921697A (zh) * 2015-05-18 2015-09-23 华南师范大学 一种人眼视线纵向距离的快速测量方法
CN105205438A (zh) * 2014-09-05 2015-12-30 北京七鑫易维信息技术有限公司 一种利用红外眼球追踪控制眼睛与屏幕距离的方法及系统
CN106662917A (zh) * 2014-04-11 2017-05-10 眼球控制技术有限公司 眼睛跟踪校准系统和方法
US20170205876A1 (en) * 2016-01-20 2017-07-20 Thalmic Labs Inc. Systems, devices, and methods for proximity-based eye tracking
CN109284002A (zh) * 2018-08-29 2019-01-29 北京七鑫易维信息技术有限公司 一种用户距离估算方法、装置、设备及存储介质

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
TWI432172B (zh) * 2008-10-27 2014-04-01 Utechzone Co Ltd Pupil location method, pupil positioning system and storage media
CN204087228U (zh) * 2014-08-11 2015-01-07 北京天诚盛业科技有限公司 虹膜图像的采集装置
EP4224424A1 (fr) * 2014-11-21 2023-08-09 Apple Inc. Procédé et système de détermination de coordonnées spatiales d'une reconstruction 3d d'au moins une partie d'un objet réel à une échelle spatiale absolue
CN107562190A (zh) * 2017-08-03 2018-01-09 广东小天才科技有限公司 一种护眼模式的开启方法及设备
CN108227925A (zh) * 2018-01-08 2018-06-29 广州视源电子科技股份有限公司 一种坐姿调整的方法、装置、设备及存储介质

Also Published As

Publication number Publication date
CN109284002B (zh) 2020-12-29
CN109284002A (zh) 2019-01-29

Similar Documents

Publication Publication Date Title
US11403757B2 (en) Sight line detection method and sight line detection device
US11004223B2 (en) Method and device for obtaining image, and recording medium thereof
CN110457414B (zh) 离线地图处理、虚拟对象显示方法、装置、介质和设备
KR102334139B1 (ko) 적응적 호모그래피 매핑에 기초한 눈 시선 추적
WO2019242262A1 (fr) Procédé et dispositif de guidage à distance basé sur la réalité augmentée, terminal et support de stockage
EP3656302A1 (fr) Système et procédé d'analyse de la démarche humaine
JP6853188B2 (ja) 瞳孔間距離を決定する装置、システム、および方法
WO2020228643A1 (fr) Procédé et appareil de commande interactive, dispositif électronique et support de stockage
WO2020015468A1 (fr) Procédé et appareil de transmission d'image, dispositif terminal et support de stockage
US10955245B2 (en) System and method for low latency, high performance pose fusion
JP2018505457A (ja) アイトラッキングシステムのための改良されたキャリブレーション
KR20180013277A (ko) 그래픽 객체를 표시하기 위한 전자 장치 및 컴퓨터 판독 가능한 기록 매체
JP2017129904A (ja) 情報処理装置、情報処理方法、及び記録媒体
CN110555426A (zh) 视线检测方法、装置、设备及存储介质
US10672191B1 (en) Technologies for anchoring computer generated objects within augmented reality
KR20170097884A (ko) 이미지를 처리하기 위한 방법 및 그 전자 장치
US11195259B2 (en) Apparatus and method for dynamic multi-camera rectification using depth camera
US20200042777A1 (en) Method, apparatus and device for determining an object, and storage medium for the same
US11789528B1 (en) On-the-fly calibration for improved on-device eye tracking
CN110051319A (zh) 眼球追踪传感器的调节方法、装置、设备及存储介质
US10185399B2 (en) Image processing apparatus, non-transitory computer-readable recording medium, and image processing method
WO2020042589A1 (fr) Procédé et appareil d'estimation de distance d'utilisateur, dispositif, et support de stockage
CN113741682A (zh) 注视点的映射方法、装置、设备及存储介质
CN112749414A (zh) 数据的存储方法、装置、设备及存储介质
WO2021118560A1 (fr) Mode de verrouillage de scène pour capturer des images de caméra

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19855966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09.06.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19855966

Country of ref document: EP

Kind code of ref document: A1