WO2021227163A1 - A Spatial Positioning Method, Apparatus, System, and Head-Mounted Device - Google Patents

A Spatial Positioning Method, Apparatus, System, and Head-Mounted Device

Info

Publication number
WO2021227163A1
Authority
WO
WIPO (PCT)
Prior art keywords
tracking exposure
head
environment
handheld device
image data
Prior art date
Application number
PCT/CN2020/094766
Other languages
English (en)
French (fr)
Inventor
周琨
李乐
Original Assignee
深圳市欢创科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市欢创科技有限公司
Publication of WO2021227163A1 publication Critical patent/WO2021227163A1/zh

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/50 - Control of the SSIS exposure
    • H04N 25/53 - Control of the integration time
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/71 - Circuitry for evaluating the brightness variation
    • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time

Definitions

  • This application relates to the VR/AR field, for example to a spatial positioning method, apparatus, system, and head-mounted device.
  • For the handle, a light source (such as an LED) is still arranged on it, and the 6DOF parameters of the handle relative to the helmet are computed from the LEDs on the handle captured by the camera.
  • Frames with the environment-tracking exposure time and frames with the handheld-device-tracking exposure time alternate in a certain ratio: for example, one frame is captured at the environment-tracking exposure time, then one frame at the handheld-device-tracking exposure time; or the two alternate at a 1:2 ratio.
  • The advantage of this scheme is that it resolves the conflict between exposure times without introducing additional cameras that would add cost, heat, and power consumption.
  • The disadvantage is that time-division multiplexing of a single camera requires it to run at a higher frame rate (e.g., 60 FPS, of which 30 FPS is environment-tracking exposure for head positioning and 30 FPS is handheld-device-tracking exposure for handle positioning), raising the frame-rate requirement on the camera.
  • The handle's effective frame rate is lower: its LEDs light at only 30 FPS, which causes flicker.
  • When users use the device, they see the LEDs on the handle flashing, causing visual discomfort.
  • In the course of implementing this application, the inventors found at least the following problem in the related art: when a single camera is time-division multiplexed and frames with the environment-tracking exposure time alternate in a certain ratio with frames with the handheld-device-tracking exposure time, users see the LEDs on the handle flashing, which causes visual discomfort.
  • This application provides a spatial positioning method, apparatus, system, and head-mounted device to resolve the visual discomfort caused to the user by a light source in a flickering state.
  • An embodiment of this application provides a head-mounted device, including:
  • a memory storing instructions executable by the at least one processor, the instructions being executed by the at least one processor to perform the following method:
  • the image data of the surroundings includes image data of environment-tracking exposures and of handheld-device-tracking exposures;
  • the integrated luminous intensity of the handheld device's light source during the environment-tracking exposure is not lower than its integrated luminous intensity during the handheld-device-tracking exposure, and the environment-tracking exposure time is longer than the handheld-device-tracking exposure time;
  • the instructions are further executed to: acquire the previous frame of handheld-device-tracking exposure image data and calculate the position and boundary of the light source's luminous spot;
  • the position of the head-mounted device relative to the surroundings is determined.
  • An embodiment of this application further provides a spatial positioning method applied to a head-mounted device, the head-mounted device including an image sensor, a processor, and a memory storing instructions executable by the at least one processor, the method including:
  • the image data of the surroundings includes image data of environment-tracking exposures and of handheld-device-tracking exposures;
  • the integrated luminous intensity of the handheld device's light source during the environment-tracking exposure is not lower than its integrated luminous intensity during the handheld-device-tracking exposure, and the environment-tracking exposure time is longer than the handheld-device-tracking exposure time;
  • acquiring the position of the head-mounted device relative to the surroundings from the environment-tracking exposure image data includes:
  • the position of the head-mounted device relative to the surroundings is determined.
  • predicting the position and boundary of the light source's luminous spot in the current environment-tracking exposure frame includes:
  • acquiring motion information of the head-mounted device and motion information of the handheld device, determining the position of the light source's luminous spot in the environment-tracking exposure, and calculating halo information of the luminous spot in the environment-tracking exposure.
  • An embodiment of this application further provides a spatial positioning apparatus, including:
  • a second acquisition module configured to acquire the position of the handheld device relative to the head-mounted device from the handheld-device-tracking exposure image data.
  • The first acquisition module is configured to:
  • determine the position of the head-mounted device relative to the surroundings.
  • The first acquisition module is further configured to:
  • acquire motion information of the head-mounted device and motion information of the handheld device, determine the position of the light source's luminous spot in the environment-tracking exposure, and calculate halo information of the luminous spot in the environment-tracking exposure.
  • An embodiment of this application further provides a spatial positioning system, including the above head-mounted device and a handheld device provided with a light source.
  • The embodiments of this application further provide a non-volatile computer-readable storage medium storing computer-executable instructions that, when executed by a robot, cause the robot to perform the method described above.
  • While the camera is working, it can dynamically adjust the exposure time to form an image sequence in which environment-tracking exposures and handheld-device-tracking exposures alternate in a certain ratio.
  • The LEDs on the handheld device receive instructions and emit pulsed light at a certain frequency, brightness, and emission moment.
  • The LEDs' intensity during the environment-tracking exposure and the handheld-device-tracking exposure is controlled within a certain range. This approach does not increase the number of cameras, which simplifies inside-out tracking on the head-mounted device and avoids perceptible flicker.
  • FIG. 1 is an example application scenario of spatial positioning provided by an embodiment of this application;
  • FIG. 2 is a schematic flowchart of an embodiment of the head-mounted device positioning method of this application;
  • FIG. 3 shows the time relationship between the camera exposures of the head-mounted device and the LED emission of the handheld device of this application;
  • FIG. 4 is a schematic image of a handheld-device-tracking exposure of the head-mounted device of this application;
  • FIG. 5 is a schematic image of an environment-tracking exposure of the head-mounted device of this application;
  • FIG. 6 shows an image-processing method for environment-tracking exposures of the head-mounted device of this application;
  • FIG. 7 is a schematic structural diagram of a spatial positioning apparatus for the head-mounted device of this application;
  • FIG. 8 is a schematic structural diagram of an embodiment of the head-mounted device of this application;
  • FIG. 9 is a schematic diagram of the case in which the LED emission peak of this application is unchanged;
  • FIG. 10 is a schematic diagram of the case in which the LED emission peak of this application changes.
  • FIG. 1 is an example of an application scenario of spatial positioning provided by an embodiment of the application.
  • The application scenario includes a virtual reality (VR) environment 10, a head-mounted device 11, and a handheld device 12.
  • The image sensor 13 on the head-mounted device 11 can perform both environment-tracking exposures and handheld-device-tracking exposures.
  • The image sensor 13 may be any suitable device with image-capture capability, such as a camera module or a still camera.
  • At least one LED lamp 14 is mounted on the handheld device 12.
  • The LED lamp 14 can emit pulsed light and is a visible-light source and/or an infrared source.
  • When the light source is a visible-light source, an infrared-cut filter may be provided in the image sensor 13.
  • the head-mounted device 11 determines the position of the head-mounted device 11 relative to the environment according to the image data of the environment tracking exposure.
  • the head-mounted device 11 determines the position of the handheld device 12 relative to the head-mounted device 11 according to the image data of the handheld device tracking exposure.
  • FIG. 2 is a schematic flowchart of a spatial positioning method provided by an embodiment of the application. The method may be executed by the head-mounted device shown in FIG. 1. As shown in FIG. 2, the method includes:
  • The image data of the surroundings includes image data of environment-tracking exposures and of handheld-device-tracking exposures.
  • The integrated luminous intensity of the handheld device's light source during the environment-tracking exposure is not lower than its integrated luminous intensity during the handheld-device-tracking exposure, and the environment-tracking exposure time is longer than the handheld-device-tracking exposure time.
  • The environment-tracking exposure can be defined as the long exposure,
  • and the handheld-device-tracking exposure as the short exposure.
  • The environment exposure time depends on the strength of the ambient light and generally ranges from 1 to 10 ms.
  • Detecting the handheld device's LEDs requires a very short exposure to ensure a strong signal-to-noise ratio.
  • A handheld-device-tracking exposure time in the range of 20-200 µs is generally suitable.
  • The exposure instruction controls the exposure mode of the image sensor (camera), so that the handheld-device-tracking exposures are performed on a certain schedule.
  • The head-mounted device may send a pulse instruction to the handheld device's light source to control its brightness during the environment-tracking and handheld-device-tracking exposures, so that the light source's integrated luminous intensity during the environment-tracking exposure is not lower than that during the handheld-device-tracking exposure.
  • Alternatively, instead of sending pulse instructions to the handheld device's light source, the head-mounted device may let the handheld device adaptively/self-adjust the integrated luminous intensity of its own light source.
  • The LEDs' emission mode can be changed to pulsed emission,
  • with the integrated luminous intensity during the environment-tracking exposure not lower than that during the handheld-device-tracking exposure.
  • The time relationship between the camera exposures and the LED emission of the handheld device is shown in FIG. 3 (in this example the environment-tracking and handheld-device-tracking exposures alternate at a 1:1 ratio; 1:2 or other ratios are also possible).
  • In the handheld-device-tracking exposure, the bright spots of the LED lamps are visible, and the handheld device can be located from these bright spots (blobs) in the image data.
  • In the environment-tracking exposure frame, the surrounding background is relatively bright, and the several lit LEDs on the handheld device interfere with feature extraction of the surrounding environment.
  • To reduce the LEDs' interference with feature extraction in the environment-tracking exposure frame, the image in this frame must be processed.
  • The image processing for the environment-tracking exposure is as follows:
  • the position of the head-mounted device relative to the surroundings is determined.
  • Predicting the position and boundary of the light source's luminous spot in the current environment-tracking exposure frame is specifically:
  • acquiring motion information of the head-mounted device and motion information of the handheld device, determining the position of the light source's luminous spot in the environment-tracking exposure, and calculating halo information of the luminous spot in the environment-tracking exposure.
  • The motion information includes data from an inertial sensor provided in the head-mounted device, or additionally data from an inertial sensor provided in the handheld device.
  • The processing algorithm flow is shown in FIG. 6.
  • Specifically: in the first step, in the Nth frame captured by the image sensor, a handheld-device-tracking exposure frame IMAGE(N), the positions and boundaries of the LED BLOBs are extracted by computational algorithms.
  • These algorithms may include binarization, connected-domain extraction, morphological detection, and the like.
  • They yield BLOB_GROUP(N), which contains the information of all LED BLOBs in the Nth frame; from BLOB_GROUP(N), the 6DOF parameters of the handheld device can be computed.
  • The LED BLOB area will increase.
  • The predicted positions and boundaries of the LED BLOBs form BLOB_GROUP(N+1).
  • The image IMAGE(N+1) captured by the image sensor in the (N+1)th frame cannot be used directly because of LED BLOB interference; a processing operation is required.
  • The processing algorithm may be a simple subtraction operation, an image transformation such as convolution, or a deep-learning algorithm that removes the target. Historical image information may also be used to replace or interpolate the image at the positions to be removed. After BLOB_GROUP(N+1) is removed, feature extraction is performed on the remaining image, and finally the 6DOF parameters of the head-mounted device are computed.
  • The exposure time can be dynamically adjusted to form an image sequence in which environment-tracking exposures and handheld-device-tracking exposures alternate in a certain ratio.
  • The LEDs on the handheld device receive instructions and emit pulsed light at a certain frequency, brightness, and emission moment.
  • The LEDs' intensity during the environment-tracking exposure and the handheld-device-tracking exposure is controlled within a certain range. This approach does not increase the number of cameras, which simplifies inside-out tracking on the head-mounted device and avoids perceptible flicker.
  • An embodiment of this application further provides a spatial positioning apparatus, which can be used in the head-mounted device shown in FIG. 1. The head-mounted device positioning apparatus 700 includes:
  • a sending module 701 configured to send an exposure instruction to the image sensor so that the image sensor acquires image data of the surroundings of the head-mounted device;
  • the image data of the surroundings includes image data of environment-tracking exposures and of handheld-device-tracking exposures, wherein the integrated luminous intensity of the handheld device's light source during the environment-tracking exposure is not lower than its integrated luminous intensity during the handheld-device-tracking exposure, and the environment-tracking exposure time is longer than the handheld-device-tracking exposure time;
  • a first acquisition module 702 configured to acquire the position of the head-mounted device relative to the surroundings from the environment-tracking exposure image data;
  • a second acquisition module 703 configured to acquire the position of the handheld device relative to the head-mounted device from the handheld-device-tracking exposure image data.
  • The first acquisition module 702 is configured to:
  • determine the position of the head-mounted device relative to the surroundings.
  • The first acquisition module 702 is further configured to:
  • acquire motion information of the head-mounted device and motion information of the handheld device, determine the position of the light source's luminous spot in the environment-tracking exposure, and calculate halo information of the luminous spot in the environment-tracking exposure.
  • The head-mounted device can send a pulse instruction to the handheld device's light source to control its brightness during the environment-tracking and handheld-device-tracking exposures, so that the light source's integrated luminous intensity during the environment-tracking exposure is not lower than that during the handheld-device-tracking exposure.
  • Alternatively, instead of sending pulse instructions to the handheld device's light source, the head-mounted device may let the handheld device adaptively/self-adjust the integrated luminous intensity of its own light source.
  • The handheld device can be located from the bright spots (blobs) in the image data.
  • The surrounding background is relatively bright, and the several lit LEDs on the handheld device interfere with feature extraction of the surrounding environment.
  • The specific processing is as follows:
  • In the handheld-device-tracking exposure frame IMAGE(N), the positions and boundaries of the LED BLOBs are extracted by computational algorithms.
  • These algorithms may include binarization, connected-domain extraction, morphological detection, and the like.
  • They yield BLOB_GROUP(N), which contains the information of all LED BLOBs in the Nth frame; from BLOB_GROUP(N), the 6DOF parameters of the handheld device can be computed.
  • The LED BLOB area will increase.
  • The predicted positions and boundaries of the LED BLOBs form BLOB_GROUP(N+1).
  • The image IMAGE(N+1) captured by the image sensor in the (N+1)th frame cannot be used directly because of LED BLOB interference; a processing operation is required.
  • The processing algorithm may be a simple subtraction operation, an image transformation such as convolution, or a deep-learning algorithm that removes the target. Historical image information may also be used to replace or interpolate the image at the positions to be removed. After BLOB_GROUP(N+1) is removed, feature extraction is performed on the remaining image, and finally the 6DOF parameters of the head-mounted device are computed.
  • While the camera is working, it can dynamically adjust the exposure time to form an image sequence in which environment-tracking exposures and handheld-device-tracking exposures alternate in a certain ratio.
  • The LEDs on the handheld device receive instructions and emit pulsed light at a certain frequency, brightness, and emission moment.
  • The LEDs' intensity during the environment-tracking exposure and the handheld-device-tracking exposure is controlled within a certain range. This approach does not increase the number of cameras, which simplifies inside-out tracking on the head-mounted device and avoids perceptible flicker.
  • the above-mentioned device can execute the method provided in the embodiment of the present application, and has corresponding functional modules and beneficial effects for executing the method.
  • For technical details not described exhaustively in this embodiment, refer to the methods provided in the embodiments of this application.
  • the head-mounted device 80 includes an image sensor 81, a processor 82, and a memory 83.
  • One or more image sensors 81 are installed on the head-mounted device 80.
  • the exposure time can be dynamically adjusted to form an image sequence in which a certain proportion of the environment tracking exposure and the handheld device tracking exposure are alternately performed.
  • the light source (LED) on the handheld device receives instructions from the processor 82, and pulses light according to a certain frequency, brightness and light-emitting time.
  • The pulsed emission of the LED varies with time, and its intensity may change.
  • The total luminous output over an exposure window is determined by the LED's cumulative emission duration and intensity during that period; this way of computing the total is called the integral (integrated) luminous intensity. An example is shown in FIG. 9, where the LED's emission peak does not change; in FIG. 10, the peak brightness of the LED varies over time.
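The notion of integral luminous intensity described above can be expressed as a short numerical sketch. The tuple format and the sample pulse values below are illustrative assumptions, not from the patent; this is the constant-peak case of FIG. 9, while the varying-peak case of FIG. 10 would integrate a time-varying intensity profile instead.

```python
def integrated_intensity(pulses, window_start, window_end):
    """Integral luminous intensity over an exposure window.

    pulses: list of (t_on, t_off, peak) tuples describing LED pulses with a
    flat peak.  The integral is peak * (overlap of the pulse with the
    exposure window), summed over all pulses.  Units are arbitrary.
    """
    total = 0.0
    for t_on, t_off, peak in pulses:
        overlap = max(0.0, min(t_off, window_end) - max(t_on, window_start))
        total += peak * overlap
    return total

# Long environment-tracking exposure (0-5 ms): dim pulse, long overlap.
env = integrated_intensity([(0.0, 5.0, 0.2)], 0.0, 5.0)    # 0.2 * 5 = 1.0
# Short handheld-device-tracking exposure (0-0.1 ms): bright, short pulse.
hand = integrated_intensity([(0.0, 0.1, 10.0)], 0.0, 0.1)  # 10 * 0.1 = 1.0
# The flicker-avoidance condition from the text: env integral >= handheld.
assert env >= hand
```

With a dim-but-long pulse in the long frame and a bright-but-short pulse in the short frame, the two integrals match, which is exactly the condition that keeps the LED from appearing to flash.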
  • the light source is a visible light source and/or an infrared light source, and when the light source is a visible light source, an infrared light cut filter may be provided in the image sensor.
  • an inertial sensor can be provided.
  • the processor 82 and the memory 83 may be connected by a bus or other methods.
  • the memory 83 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, as in the head-mounted device positioning method in the embodiment of the present application.
  • the processor 82 executes various functional applications and data processing of the controller by running non-volatile software programs, instructions, and modules stored in the memory 83, that is, implements the head-mounted device positioning method of the foregoing method embodiment.
  • the memory 83 may include a storage program area and a storage data area.
  • The storage program area may store an operating system and an application program required by at least one function; the storage data area may store data created according to the use of the head-mounted device positioning apparatus, and the like.
  • the memory 83 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the memory 83 may optionally include a memory remotely provided with respect to the processor 82, and these remote memories may be connected to the head-mounted device via a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • The one or more modules are stored in the memory 83 and, when executed by the one or more processors 82, perform the head-mounted device positioning method in any of the foregoing method embodiments, for example, performing method steps 101 to 103 in FIG. 2 described above and realizing the functions of modules 701-703 in FIG. 7.
  • An embodiment of this application provides a non-volatile computer-readable storage medium storing computer-executable instructions, which are executed by one or more processors (for example, one processor 82 in FIG. 8) to perform the head-mounted device positioning method in any of the above method embodiments, for example, performing steps 101 to 103 of the method in FIG. 2 described above and realizing the functions of modules 701-703 in FIG. 7.
  • the device embodiments described above are merely illustrative.
  • The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each embodiment can be implemented by means of software plus a general hardware platform, and of course, it can also be implemented by hardware.
  • a person of ordinary skill in the art can understand that all or part of the processes in the methods of the foregoing embodiments can be implemented by instructing relevant hardware through a computer program.
  • The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A spatial positioning method, apparatus, system, and head-mounted device (11). While the camera is working, it can dynamically adjust the exposure time to form an image sequence in which environment-tracking exposures and handheld-device (12) tracking exposures alternate in a certain ratio. During this process, the LEDs on the handheld device (12) receive instructions and emit pulsed light at a certain frequency, brightness, and emission moment. To avoid a flickering sensation for the human eye, the LEDs' intensity during the environment-tracking exposure and the handheld-device (12) tracking exposure is controlled within a certain range. The above method does not increase the number of cameras, which simplifies inside-out tracking on the head-mounted device (11) and avoids perceptible flicker.

Description

A Spatial Positioning Method, Apparatus, System, and Head-Mounted Device
CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to Chinese patent application No. 202010402917.X, filed with the Chinese Patent Office on May 13, 2020 and entitled "A Spatial Positioning Method, Apparatus, System, and Head-Mounted Device", the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
This application relates to the VR/AR field, for example to a spatial positioning method, apparatus, system, and head-mounted device.
BACKGROUND
With the arrival of 5G, the wave of VR/AR has surged again. Spatial positioning technology is a core key technology of VR/AR. Since 2017, spatial positioning solutions have gradually shifted to inside-out tracking, building the camera directly into the helmet or glasses worn by the user. By photographing the surrounding environment and extracting its feature points (such as patterns on the room's walls or corner points of the ceiling), the camera inversely computes the six-degrees-of-freedom (6DOF) spatial position (X, Y, Z coordinates) and attitude (yaw, roll, pitch angles) of the helmet or glasses. For the handle, a light source (such as an LED) is still arranged on it, and the 6DOF parameters of the handle relative to the helmet are solved from the LEDs on the handle captured by the camera. Such a solution achieves 6DOF spatial positioning and real-time position tracking of the user's head and hands, and thus nicely solves the problem of convenience of use.
When the camera captures images continuously, frames with the environment-tracking exposure time and frames with the handheld-device-tracking exposure time alternate in a certain ratio: for example, one frame is captured at the environment-tracking exposure time, then one frame at the handheld-device-tracking exposure time; or the two alternate at a 1:2 ratio. The advantage of this scheme is that it resolves the conflict between exposure times without introducing additional cameras that would add cost, heat, and power consumption. The disadvantage is that time-division multiplexing of a single camera requires it to run at a higher frame rate (e.g., 60 FPS, of which 30 FPS is environment-tracking exposure for head positioning and 30 FPS is handheld-device-tracking exposure for handle positioning), raising the frame-rate requirement on the camera. More importantly, although the camera runs at a higher frame rate, the handle's effective frame rate is lower, and its LEDs light at only 30 FPS; this causes flicker, and users see the LEDs on the handle flashing, which causes visual discomfort.
In the course of implementing this application, the inventors found at least the following problem in the related art: when a single camera is time-division multiplexed and frames with the environment-tracking exposure time alternate in a certain ratio with frames with the handheld-device-tracking exposure time, users see the LEDs on the handle flashing, which causes visual discomfort.
SUMMARY
This application provides a spatial positioning method, apparatus, system, and head-mounted device to resolve the visual discomfort caused to the user by a light source in a flickering state.
In a first aspect, an embodiment of this application provides a head-mounted device, including:
an image sensor,
a processor, and
a memory,
wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to perform the following method:
sending an exposure instruction to the image sensor so that the image sensor acquires image data of the surroundings of the head-mounted device, the image data of the surroundings including image data of environment-tracking exposures and of handheld-device-tracking exposures, wherein the integrated luminous intensity of the handheld device's light source during the environment-tracking exposure is not lower than its integrated luminous intensity during the handheld-device-tracking exposure, and the environment-tracking exposure time is longer than the handheld-device-tracking exposure time;
acquiring the position of the head-mounted device relative to the surroundings from the environment-tracking exposure image data; and
acquiring the position of the handheld device relative to the head-mounted device from the handheld-device-tracking exposure image data.
In some embodiments, the instructions are further executed to: acquire the previous frame of handheld-device-tracking exposure image data and calculate the position and boundary of the light source's luminous spot;
predict the position and boundary of the light source's luminous spot in the current environment-tracking exposure frame;
remove the predicted luminous spot from the current frame's environment-tracking exposure image data; and
determine the position of the head-mounted device relative to the surroundings from the environment-tracking exposure image data after the subtraction.
In a second aspect, an embodiment of this application further provides a spatial positioning method applied to a head-mounted device, the head-mounted device including an image sensor, a processor, and a memory storing instructions executable by the at least one processor, the method including:
sending an exposure instruction to the image sensor so that the image sensor acquires image data of the surroundings of the head-mounted device, the image data of the surroundings including image data of environment-tracking exposures and of handheld-device-tracking exposures, wherein the integrated luminous intensity of the handheld device's light source during the environment-tracking exposure is not lower than its integrated luminous intensity during the handheld-device-tracking exposure, and the environment-tracking exposure time is longer than the handheld-device-tracking exposure time;
acquiring the position of the head-mounted device relative to the surroundings from the environment-tracking exposure image data; and
acquiring the position of the handheld device relative to the head-mounted device from the handheld-device-tracking exposure image data.
In some embodiments, acquiring the position of the head-mounted device relative to the surroundings from the environment-tracking exposure image data includes:
acquiring the previous frame of handheld-device-tracking exposure image data and calculating the position and boundary of the light source's luminous spot;
predicting the position and boundary of the light source's luminous spot in the current environment-tracking exposure frame;
removing the predicted luminous spot from the current frame's environment-tracking exposure image data; and
determining the position of the head-mounted device relative to the surroundings from the environment-tracking exposure image data after the subtraction.
In some embodiments, predicting the position and boundary of the light source's luminous spot in the current environment-tracking exposure frame includes:
acquiring motion information of the head-mounted device and motion information of the handheld device, determining the position of the light source's luminous spot in the environment-tracking exposure, and calculating halo information of the luminous spot in the environment-tracking exposure.
In a third aspect, an embodiment of this application further provides a spatial positioning apparatus, including:
a second acquisition module configured to acquire the position of the handheld device relative to the head-mounted device from the handheld-device-tracking exposure image data.
In some embodiments, the first acquisition module is configured to:
acquire the previous frame of handheld-device-tracking exposure image data and calculate the position and boundary of the light source's luminous spot;
predict the position and boundary of the light source's luminous spot in the current environment-tracking exposure frame;
remove the predicted luminous spot from the current frame's environment-tracking exposure image data; and
determine the position of the head-mounted device relative to the surroundings from the environment-tracking exposure image data after the subtraction.
In some embodiments, the first acquisition module is further configured to:
acquire motion information of the head-mounted device and motion information of the handheld device, determine the position of the light source's luminous spot in the environment-tracking exposure, and calculate halo information of the luminous spot in the environment-tracking exposure.
In a fourth aspect, an embodiment of this application further provides a spatial positioning system, including the head-mounted device above and a handheld device provided with a light source.
In a fifth aspect, the embodiments of this application further provide a non-volatile computer-readable storage medium storing computer-executable instructions that, when executed by a robot, cause the robot to perform the method described above.
With the spatial positioning method, apparatus, system, and head-mounted device provided by the embodiments of this application, the camera can dynamically adjust the exposure time while working, forming an image sequence in which environment-tracking exposures and handheld-device-tracking exposures alternate in a certain ratio. During this process, the LEDs on the handheld device receive instructions and emit pulsed light at a certain frequency, brightness, and emission moment. To avoid a flickering sensation for the human eye, the LEDs' intensity during the environment-tracking exposure and the handheld-device-tracking exposure is controlled within a certain range. This approach does not increase the number of cameras, which simplifies inside-out tracking on the head-mounted device and avoids perceptible flicker.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments are exemplarily illustrated by the figures in the corresponding drawings; these exemplary illustrations do not limit the embodiments. Elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures are not drawn to scale.
FIG. 1 is an example application scenario of spatial positioning provided by an embodiment of this application;
FIG. 2 is a schematic flowchart of an embodiment of the head-mounted device positioning method of this application;
FIG. 3 shows the time relationship between the camera exposures of the head-mounted device and the LED emission of the handheld device of this application;
FIG. 4 is a schematic image of a handheld-device-tracking exposure of the head-mounted device of this application;
FIG. 5 is a schematic image of an environment-tracking exposure of the head-mounted device of this application;
FIG. 6 shows an image-processing method for environment-tracking exposures of the head-mounted device of this application;
FIG. 7 is a schematic structural diagram of a spatial positioning apparatus for the head-mounted device of this application;
FIG. 8 is a schematic structural diagram of an embodiment of the head-mounted device of this application;
FIG. 9 is a schematic diagram of the case in which the LED emission peak of this application is unchanged;
FIG. 10 is a schematic diagram of the case in which the LED emission peak of this application changes.
DETAILED DESCRIPTION
To make the purposes, technical solutions, and advantages of the embodiments of this application clearer, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings. Plainly, the described embodiments are only some, not all, of the embodiments of this application. Based on the embodiments in this application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of this application.
FIG. 1 is an example application scenario of spatial positioning provided by an embodiment of this application. The scenario includes a virtual reality (VR) environment 10, a head-mounted device 11, and a handheld device 12. The image sensor 13 on the head-mounted device 11 can perform both environment-tracking exposures and handheld-device-tracking exposures. The image sensor 13 may be any suitable device with image-capture capability, such as a camera module or a still camera.
At least one LED lamp 14 is mounted on the handheld device 12. The LED lamp 14 can emit pulsed light and is a visible-light source and/or an infrared source; when the light source is a visible-light source, an infrared-cut filter may be provided in the image sensor 13.
The head-mounted device 11 determines its position relative to the environment from the environment-tracking exposure image data, and determines the position of the handheld device 12 relative to the head-mounted device 11 from the handheld-device-tracking exposure image data.
Fig. 2 is a schematic flowchart of a spatial positioning method provided by an embodiment of the present application. The method may be executed by the head-mounted device shown in Fig. 1. As shown in Fig. 2, the method includes:
101: sending an exposure instruction to the image sensor, so that the image sensor captures image data of the surroundings of the head-mounted device, the image data including environment-tracking exposure and handheld-device-tracking exposure image data, the integral luminous intensity of the handheld device's light source during the environment-tracking exposure being not lower than its integral luminous intensity during the handheld-device-tracking exposure, wherein the environment-tracking exposure duration is longer than the handheld-device-tracking exposure duration.
For convenience of description, the environment-tracking exposure may be defined as the long exposure and the handheld-device-tracking exposure as the short exposure.
The environment-tracking exposure time depends on the strength of the ambient light and generally ranges from 1 ms to 10 ms. Detecting the LED on the handheld device requires an extremely short exposure to obtain a high signal-to-noise ratio; an exposure time in the range of 20-200 µs is generally suitable for the handheld-device-tracking exposure.
The exposure instruction controls how the image sensor (camera) exposes, so that the handheld-device-tracking exposures follow a given timing.
Specifically, the head-mounted device may send a pulse instruction to the light source of the handheld device. The pulse instruction controls the brightness of the light source during the environment-tracking and handheld-device-tracking exposures, so that the integral luminous intensity of the light source during the environment-tracking exposure is not lower than that during the handheld-device-tracking exposure. Alternatively, instead of the head-mounted device sending a pulse instruction, the handheld device may adaptively regulate the integral luminous intensity of its own light source.
In a practicable implementation, to prevent the LED on the handheld device from appearing to flicker during handheld-device-tracking exposures, the LED's emission mode can be set to pulsed emission, with the integral luminous intensity during the environment-tracking exposure not lower than that during the handheld-device-tracking exposure. The timing relationship between the camera exposures and the LED emission of the handheld device is shown in Fig. 3 (in this example the environment-tracking and handheld-device-tracking exposures alternate at a 1:1 ratio; a 1:2 or other ratio is also possible).
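As a rough illustration, the interleaved exposure schedule and the flicker constraint above can be sketched in Python. This is a minimal sketch, not the patented implementation; the concrete frame count, durations (5 ms environment exposure, 100 µs handheld exposure) and pulse values are assumptions chosen only to fall within the ranges stated in this description:

```python
import numpy as np

def integral_intensity(peak, duration_us):
    """Integral luminous intensity of one pulse: peak brightness x on-time."""
    return peak * duration_us

def build_schedule(n_frames, ratio=(1, 1), env_us=5000, hand_us=100):
    """Alternate environment-tracking (long) and handheld-device-tracking
    (short) exposures at the given ratio, e.g. (1, 1) or (1, 2).
    Returns a list of (exposure kind, duration in microseconds)."""
    pattern = ["env"] * ratio[0] + ["hand"] * ratio[1]
    return [(pattern[i % len(pattern)],
             env_us if pattern[i % len(pattern)] == "env" else hand_us)
            for i in range(n_frames)]

# To avoid visible flicker, the LED pulse during the long exposure is chosen
# so that its integral intensity is not lower than during the short exposure:
# here a dim, long pulse matches a bright, short one.
env_pulse = integral_intensity(peak=0.2, duration_us=500)
hand_pulse = integral_intensity(peak=1.0, duration_us=100)
assert env_pulse >= hand_pulse
```

A 1:2 ratio simply changes `ratio=(1, 2)`, yielding one long exposure followed by two short ones per cycle.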
102: obtaining the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data;
103: obtaining the position of the handheld device relative to the head-mounted device according to the handheld-device-tracking exposure image data.
Specifically, as shown in Fig. 4, the bright spots of the LEDs are visible in a handheld-device-tracking exposure, and the handheld device can be located from these bright spots (light spots) in the image data.
Specifically, as shown in Fig. 5, in an environment-tracking exposure frame the surrounding background is relatively bright, and the several light-emitting LEDs on the handheld device interfere with feature extraction from the surrounding environment. To reduce this interference in the environment-tracking exposure frame, the image in this frame needs to be processed.
In some embodiments, the environment-tracking exposure image is processed as follows:
acquiring the handheld-device-tracking exposure image data of the previous frame, and computing the position and boundary of the light spot emitted by the light source;
predicting the position and boundary of the light-source spot in the current environment-tracking exposure frame;
removing the predicted light-source spot from the environment-tracking exposure image data of the current frame; and
determining the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data after the subtraction.
In some embodiments, predicting the position and boundary of the light-source spot in the current environment-tracking exposure frame specifically includes:
acquiring motion information of the head-mounted device and motion information of the handheld device, determining the position of the light-source spot in the environment-tracking exposure, and computing halo information of the light-source spot in the environment-tracking exposure.
The motion information includes data from an inertial sensor arranged in the head-mounted device, and may further include data from an inertial sensor arranged in the handheld device.
In other embodiments, the processing algorithm follows the flow shown in Fig. 6. In the first step, in the N-th frame captured by the image sensor, a handheld-device-tracking exposure frame IMAGE(N), the positions, boundaries and other information of the LED light spots (LED BLOBs) are extracted by computational algorithms, which may include binarization, connected-component extraction, morphological detection and the like. These algorithms yield BLOB_GROUP(N), which contains the information of all LED BLOBs in the N-th frame; from BLOB_GROUP(N), the 6DOF parameters of the handheld device can be computed.
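The first step above, binarization followed by connected-component extraction, can be illustrated with a minimal Python sketch using numpy only. The threshold value, the 4-connectivity choice and the blob dictionary layout are illustrative assumptions; a real implementation would typically also apply the morphological filtering mentioned above:

```python
import numpy as np
from collections import deque

def extract_blobs(image, threshold=128):
    """Binarize a handheld-device-tracking (short-exposure) frame and extract
    LED blobs via connected-component labeling (4-connectivity).
    Returns one dict per blob with its centroid and bounding box."""
    binary = image >= threshold
    visited = np.zeros_like(binary, dtype=bool)
    h, w = binary.shape
    blobs = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not visited[y, x]:
                # Flood-fill one connected component.
                q, pixels = deque([(y, x)]), []
                visited[y, x] = True
                while q:
                    cy, cx = q.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            q.append((ny, nx))
                ys, xs = zip(*pixels)
                blobs.append({
                    "centroid": (sum(ys) / len(ys), sum(xs) / len(xs)),
                    "bbox": (min(ys), min(xs), max(ys), max(xs)),
                })
    return blobs
```

In a short exposure the LED spots are by far the brightest pixels, which is why a fixed threshold is often sufficient here.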
In the second step, based on the LED BLOB motion information of the N-th frame and (if available) data from the IMU (inertial sensor) built into the handheld device, and taking into account that the halo effect of the longer environment-tracking exposure enlarges the LED BLOB area, the positions and boundaries of the LED BLOBs in frame N+1 (an environment-tracking exposure frame) are predicted, forming BLOB_GROUP(N+1).
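The second step can be sketched as a simple constant-velocity prediction with a halo margin. The image-plane velocity input and the fixed `halo_px` margin are assumptions standing in for the IMU-based motion model and the halo computation described above:

```python
def predict_blobs(blobs, velocity_px, dt, halo_px=3):
    """Predict each LED blob's location in the next (environment-tracking)
    frame: translate by the estimated image-plane motion and grow the
    bounding box by a halo margin to account for the longer exposure's
    bloom. `velocity_px` is (vy, vx) in pixels per second, `dt` the frame
    interval in seconds."""
    vy, vx = velocity_px
    dy, dx = vy * dt, vx * dt
    predicted = []
    for b in blobs:
        y0, x0, y1, x1 = b["bbox"]
        predicted.append({
            "centroid": (b["centroid"][0] + dy, b["centroid"][1] + dx),
            "bbox": (int(y0 + dy) - halo_px, int(x0 + dx) - halo_px,
                     int(y1 + dy) + halo_px, int(x1 + dx) + halo_px),
        })
    return predicted
```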
In the third step, the image IMAGE(N+1) captured by the image sensor in frame N+1 cannot be used directly because of the LED BLOB interference; it must first be processed. The processing algorithm may be a simple subtraction operation, an image transform such as convolution, or a deep-learning algorithm that removes the targets. Historical image information may also be used to replace or interpolate the image at the positions to be removed. After BLOB_GROUP(N+1) has been removed, features are extracted from the remaining image, and the 6DOF parameters of the head-mounted device are finally computed.
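The third step, removing BLOB_GROUP(N+1) before feature extraction, can be sketched as follows. The median fill and the optional historical-frame replacement are simplified stand-ins for the subtraction, convolution or learning-based removal mentioned above:

```python
import numpy as np

def remove_blobs(image, predicted_blobs, history=None):
    """Mask predicted LED blobs out of an environment-tracking frame before
    feature extraction. Pixels inside each predicted bounding box are
    replaced with the same region from a historical frame if one is given,
    otherwise with the frame's median intensity (a crude in-fill)."""
    out = image.astype(np.float32).copy()
    fill = np.median(out)
    h, w = out.shape
    for b in predicted_blobs:
        y0, x0, y1, x1 = b["bbox"]
        # Clamp the predicted box to the image bounds.
        y0, x0 = max(y0, 0), max(x0, 0)
        y1, x1 = min(y1, h - 1), min(x1, w - 1)
        if history is not None:
            out[y0:y1 + 1, x0:x1 + 1] = history[y0:y1 + 1, x0:x1 + 1]
        else:
            out[y0:y1 + 1, x0:x1 + 1] = fill
    return out
```

Feature extraction (e.g. corner detection for environment tracking) would then run on the returned image.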
With the spatial positioning method of the embodiments of the present application, the camera can dynamically adjust its exposure time during operation, forming an image sequence in which environment-tracking exposures and handheld-device-tracking exposures alternate at a certain ratio. During this process, the LED on the handheld device receives instructions and emits pulsed light at a specified frequency, brightness and timing. To avoid a perceptible flicker to the human eye, the LED intensity during the environment-tracking and handheld-device-tracking exposures is kept within a certain range. Because this approach adds no extra camera, inside-out tracking of the head-mounted device becomes simpler while flicker is avoided.
Correspondingly, as shown in Fig. 7, an embodiment of the present application further provides a spatial positioning apparatus that can be used in the head-mounted device shown in Fig. 1. The head-mounted device positioning apparatus 700 includes:
a sending module 701, configured to send an exposure instruction to the image sensor, so that the image sensor captures image data of the surroundings of the head-mounted device, the image data including environment-tracking exposure and handheld-device-tracking exposure image data, the integral luminous intensity of the handheld device's light source during the environment-tracking exposure being not lower than its integral luminous intensity during the handheld-device-tracking exposure, wherein the environment-tracking exposure duration is longer than the handheld-device-tracking exposure duration;
a first acquisition module 702, configured to obtain the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data;
a second acquisition module 703, configured to obtain the position of the handheld device relative to the head-mounted device according to the handheld-device-tracking exposure image data.
In some embodiments, the first acquisition module 702 is configured to:
acquire the handheld-device-tracking exposure image data of the previous frame, and compute the position and boundary of the light spot emitted by the light source;
predict the position and boundary of the light-source spot in the current environment-tracking exposure frame;
remove the predicted light-source spot from the environment-tracking exposure image data of the current frame; and
determine the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data after the subtraction.
In other embodiments, the first acquisition module 702 is further configured to:
acquire motion information of the head-mounted device and motion information of the handheld device, determine the position of the light-source spot in the environment-tracking exposure, and compute halo information of the light-source spot in the environment-tracking exposure.
In a practicable implementation, the head-mounted device may send a pulse instruction to the light source of the handheld device. The pulse instruction controls the brightness of the light source during the environment-tracking and handheld-device-tracking exposures, so that the integral luminous intensity of the light source during the environment-tracking exposure is not lower than that during the handheld-device-tracking exposure. Alternatively, instead of the head-mounted device sending a pulse instruction, the handheld device may adaptively regulate the integral luminous intensity of its own light source.
In a handheld-device-tracking exposure, the handheld device can be located from the bright spots (light spots) in the image data.
In an environment-tracking exposure frame, the surrounding background is relatively bright, and the several light-emitting LEDs on the handheld device interfere with feature extraction from the surrounding environment. To reduce this interference in the environment-tracking exposure frame, the image in this frame needs to be processed.
The specific processing is as follows.
In the first step, in the N-th frame captured by the image sensor, a handheld-device-tracking exposure frame IMAGE(N), the positions, boundaries and other information of the LED light spots (LED BLOBs) are extracted by computational algorithms, which may include binarization, connected-component extraction, morphological detection and the like. These algorithms yield BLOB_GROUP(N), which contains the information of all LED BLOBs in the N-th frame; from BLOB_GROUP(N), the 6DOF parameters of the handheld device can be computed.
In the second step, based on the LED BLOB motion information of the N-th frame and (if available) data from the IMU (inertial sensor) built into the handheld device, and taking into account that the halo effect of the longer environment-tracking exposure enlarges the LED BLOB area, the positions and boundaries of the LED BLOBs in frame N+1 (an environment-tracking exposure frame) are predicted, forming BLOB_GROUP(N+1).
In the third step, the image IMAGE(N+1) captured in frame N+1 cannot be used directly because of the LED BLOB interference; it must first be processed. The processing algorithm may be a simple subtraction operation, an image transform such as convolution, or a deep-learning algorithm that removes the targets. Historical image information may also be used to replace or interpolate the image at the positions to be removed. After BLOB_GROUP(N+1) has been removed, features are extracted from the remaining image, and the 6DOF parameters of the head-mounted device are finally computed.
With the spatial positioning apparatus of the embodiments of the present application, the camera can dynamically adjust its exposure time during operation, forming an image sequence in which environment-tracking exposures and handheld-device-tracking exposures alternate at a certain ratio. During this process, the LED on the handheld device receives instructions and emits pulsed light at a specified frequency, brightness and timing. To avoid a perceptible flicker to the human eye, the LED intensity during the environment-tracking and handheld-device-tracking exposures is kept within a certain range. Because this approach adds no extra camera, inside-out tracking of the head-mounted device becomes simpler while flicker is avoided.
It should be noted that the above apparatus can execute the method provided by the embodiments of the present application and has the functional modules and beneficial effects corresponding to the execution of the method. For technical details not exhaustively described in the apparatus embodiments, refer to the method provided by the embodiments of the present application.
In some embodiments, referring to Fig. 8, the head-mounted device 80 includes an image sensor 81, a processor 82 and a memory 83.
One or more image sensors 81 (such as cameras) are mounted on the head-mounted device 80.
During operation, the image sensor 81 can dynamically adjust its exposure time, thereby forming an image sequence in which environment-tracking exposures and handheld-device-tracking exposures alternate at a certain ratio. During this process, the light source (LED) on the handheld device receives instructions from the processor 82 and emits pulsed light at a specified frequency, brightness and timing. The intensity of the LED's pulsed emission may vary over time; its total luminous output is jointly determined by the LED's cumulative emission duration and intensity over the period. This way of computing the total luminous output is called the integral luminous intensity; an example is shown in Fig. 9, in which the LED emission peak is constant. In Fig. 10, the peak brightness of the LED emission varies over time.
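The integral luminous intensity defined here can be expressed numerically as the area under the brightness-versus-time curve of the pulse train. The pulse durations and peak values below are illustrative assumptions, not values taken from Figs. 9 and 10:

```python
import numpy as np

def integral_intensity(durations_us, peaks):
    """Integral luminous intensity of a pulse train: the area under the
    brightness-time curve, i.e. the sum over pulses of
    (pulse duration x peak brightness)."""
    return float(np.dot(durations_us, peaks))

# Fig. 9 style: the peak is constant across pulses.
constant_peak = integral_intensity([100, 100, 100], [1.0, 1.0, 1.0])
# Fig. 10 style: the peak varies over time; the integral still accumulates
# each pulse's duration-weighted brightness.
varying_peak = integral_intensity([100, 150, 50], [1.0, 0.5, 2.0])
```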
The light source is a visible-light source and/or an infrared source; when the light source is a visible-light source, an infrared cut-off filter may be arranged in the image sensor.
An inertial sensor may be arranged in the head-mounted device.
In the embodiments of the present application, the processor 82 and the memory 83 may be connected by a bus or in another manner.
As a non-volatile computer-readable storage medium, the memory 83 can store non-volatile software programs, non-volatile computer-executable programs and modules, such as the program instructions/modules corresponding to the head-mounted device positioning method in the embodiments of the present application. By running the non-volatile software programs, instructions and modules stored in the memory 83, the processor 82 executes the various functional applications and data processing of the controller, i.e., implements the head-mounted device positioning method of the above method embodiments.
The memory 83 may include a program storage area and a data storage area, where the program storage area can store the operating system and the application programs required by at least one function, and the data storage area can store data created from use of the head-mounted device positioning apparatus, and the like. In addition, the memory 83 may include high-speed random-access memory, and may further include non-volatile memory, such as at least one magnetic disk storage device, flash memory device or other non-volatile solid-state storage device. In some embodiments, the memory 83 may optionally include memory arranged remotely from the processor 82; such remote memory may be connected to the head-mounted device through a network. Examples of the network include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks and combinations thereof.
The one or more modules are stored in the memory 83 and, when executed by the one or more processors 82, perform the head-mounted device positioning method of any of the above method embodiments, e.g., performing method steps 101 to 103 in Fig. 2 described above and implementing the functions of modules 701-703 in Fig. 7.
The above product can execute the method provided by the embodiments of the present application and has the functional modules and beneficial effects corresponding to the execution of the method. For technical details not exhaustively described in this embodiment, refer to the method provided by the embodiments of the present application.
An embodiment of the present application provides a non-volatile computer-readable storage medium storing computer-executable instructions that are executed by one or more processors, e.g., a processor 82 in Fig. 8, causing the one or more processors to perform the head-mounted device positioning method of any of the above method embodiments, e.g., performing method steps 101 to 103 in Fig. 2 described above and implementing the functions of modules 701-703 in Fig. 7.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
Through the description of the above embodiments, a person of ordinary skill in the art can clearly understand that the embodiments can be implemented by software plus a general-purpose hardware platform, or, of course, by hardware. A person of ordinary skill in the art can understand that all or part of the processes of the above embodiment methods can be completed by a computer program instructing the relevant hardware; the program can be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Under the concept of the present application, the technical features in the above embodiments or in different embodiments may also be combined, the steps may be implemented in any order, and many other variations of the different aspects of the present application as described above exist, which, for brevity, are not provided in detail. Although the present application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some of their technical features can be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

  1. A head-mounted device, comprising:
    an image sensor,
    a processor, and
    a memory,
    wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to perform the following method:
    sending an exposure instruction to the image sensor, so that the image sensor captures image data of the surroundings of the head-mounted device, the image data including environment-tracking exposure and handheld-device-tracking exposure image data, the integral luminous intensity of a handheld device's light source during the environment-tracking exposure being not lower than its integral luminous intensity during the handheld-device-tracking exposure, wherein the environment-tracking exposure duration is longer than the handheld-device-tracking exposure duration;
    obtaining the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data; and
    obtaining the position of the handheld device relative to the head-mounted device according to the handheld-device-tracking exposure image data.
  2. The head-mounted device according to claim 1, wherein the instructions are further configured to: acquire the handheld-device-tracking exposure image data of the previous frame, and compute the position and boundary of the light spot emitted by the light source;
    predict the position and boundary of the light-source spot in the current environment-tracking exposure frame;
    remove the predicted light-source spot from the environment-tracking exposure image data of the current frame; and
    determine the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data after the subtraction.
  3. A spatial positioning method, applied to a head-mounted device, the head-mounted device comprising an image sensor, a processor and a memory, the memory storing instructions executable by the at least one processor, the method comprising:
    sending an exposure instruction to the image sensor, so that the image sensor captures image data of the surroundings of the head-mounted device, the image data including environment-tracking exposure and handheld-device-tracking exposure image data, the integral luminous intensity of a handheld device's light source during the environment-tracking exposure being not lower than its integral luminous intensity during the handheld-device-tracking exposure, wherein the environment-tracking exposure duration is longer than the handheld-device-tracking exposure duration;
    obtaining the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data; and
    obtaining the position of the handheld device relative to the head-mounted device according to the handheld-device-tracking exposure image data.
  4. The method according to claim 3, wherein obtaining the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data comprises:
    acquiring the handheld-device-tracking exposure image data of the previous frame, and computing the position and boundary of the light spot emitted by the light source;
    predicting the position and boundary of the light-source spot in the current environment-tracking exposure frame;
    removing the predicted light-source spot from the environment-tracking exposure image data of the current frame; and
    determining the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data after the subtraction.
  5. The method according to claim 4, wherein predicting the position and boundary of the light-source spot in the current environment-tracking exposure frame comprises:
    acquiring motion information of the head-mounted device and motion information of the handheld device, determining the position of the light-source spot in the environment-tracking exposure, and computing halo information of the light-source spot in the environment-tracking exposure.
  6. A spatial positioning apparatus, comprising:
    a sending module, configured to send an exposure instruction to the image sensor, so that the image sensor captures image data of the surroundings of the head-mounted device, the image data including environment-tracking exposure and handheld-device-tracking exposure image data, the integral luminous intensity of a handheld device's light source during the environment-tracking exposure being not lower than its integral luminous intensity during the handheld-device-tracking exposure, wherein the environment-tracking exposure duration is longer than the handheld-device-tracking exposure duration;
    a first acquisition module, configured to obtain the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data; and
    a second acquisition module, configured to obtain the position of the handheld device relative to the head-mounted device according to the handheld-device-tracking exposure image data.
  7. The apparatus according to claim 6, wherein the first acquisition module is configured to:
    acquire the handheld-device-tracking exposure image data of the previous frame, and compute the position and boundary of the light spot emitted by the light source;
    predict the position and boundary of the light-source spot in the current environment-tracking exposure frame;
    remove the predicted light-source spot from the environment-tracking exposure image data of the current frame; and
    determine the position of the head-mounted device relative to the surrounding environment according to the environment-tracking exposure image data after the subtraction.
  8. The apparatus according to claim 7, wherein the first acquisition module is further configured to:
    acquire motion information of the head-mounted device and motion information of the handheld device, determine the position of the light-source spot in the environment-tracking exposure, and compute halo information of the light-source spot in the environment-tracking exposure.
  9. A spatial positioning system, comprising the head-mounted device according to claim 1 or 2, and a handheld device, wherein a light source is provided on the handheld device.
  10. A non-volatile computer-readable storage medium storing computer-executable instructions which, when executed by a head-mounted device, cause the head-mounted device to perform the method according to any one of claims 3 to 5.
PCT/CN2020/094766 2020-05-13 2020-06-05 Spatial positioning method, apparatus and system, and head-mounted device WO2021227163A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010402917.X 2020-05-13
CN202010402917.XA CN111614915B (zh) 2020-05-13 2020-05-13 Spatial positioning method, apparatus and system, and head-mounted device

Publications (1)

Publication Number Publication Date
WO2021227163A1 true WO2021227163A1 (zh) 2021-11-18

Family

ID=72201293

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/094766 WO2021227163A1 (zh) 2020-05-13 2020-06-05 一种空间定位方法、装置、系统和头戴式设备

Country Status (2)

Country Link
CN (1) CN111614915B (zh)
WO (1) WO2021227163A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112286343A (zh) * 2020-09-16 2021-01-29 青岛小鸟看看科技有限公司 定位追踪方法、平台及头戴显示系统
CN112437213A (zh) * 2020-10-28 2021-03-02 青岛小鸟看看科技有限公司 图像采集方法、手柄设备、头戴设备及头戴系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101937505A (zh) * 2009-07-03 2011-01-05 深圳泰山在线科技有限公司 一种目标检测方法和设备及其使用的图像采集装置
US20120106799A1 (en) * 2009-07-03 2012-05-03 Shenzhen Taishan Online Technology Co., Ltd. Target detection method and apparatus and image acquisition device
CN104436643A (zh) * 2014-11-17 2015-03-25 深圳市欢创科技有限公司 在显示屏上输出光枪瞄准准心的方法、装置及系统
CN110622107A (zh) * 2017-05-09 2019-12-27 微软技术许可有限责任公司 跟踪可穿戴设备和手持对象姿势

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN206400472U (zh) * 2016-08-24 2017-08-11 王忠民 一种虚拟现实设备及其定位系统
CN109313483A (zh) * 2017-01-22 2019-02-05 广东虚拟现实科技有限公司 一种与虚拟现实环境进行交互的装置
CN107037880A (zh) * 2017-03-02 2017-08-11 深圳前海极客船长网络科技有限公司 基于虚拟现实技术的空间定位定姿系统及其方法
WO2020086356A2 (en) * 2018-10-26 2020-04-30 Magic Leap, Inc. Ambient electromagnetic distortion correction for electromagnetic tracking


Also Published As

Publication number Publication date
CN111614915B (zh) 2021-07-30
CN111614915A (zh) 2020-09-01

