WO2022057834A1 - Method and system for aligning the exposure center points of multiple cameras in a VR system - Google Patents

Method and system for aligning the exposure center points of multiple cameras in a VR system

Info

Publication number
WO2022057834A1
WO2022057834A1 · PCT/CN2021/118546 · CN2021118546W
Authority
WO
WIPO (PCT)
Prior art keywords
frame
type
exposure
data
vts
Prior art date
Application number
PCT/CN2021/118546
Other languages
English (en)
French (fr)
Inventor
张秀志
周宏伟
柳光辉
郭衡江
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司 filed Critical 青岛小鸟看看科技有限公司
Priority to EP21868656.6A priority Critical patent/EP4199510A4/en
Publication of WO2022057834A1 publication Critical patent/WO2022057834A1/zh
Priority to US17/819,500 priority patent/US11962749B2/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/296 Synchronisation of image signal generators; Control thereof
    • H04N 13/139 Format conversion, e.g. of frame-rate or size
    • H04N 13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N 13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/662 Transmitting camera control signals through networks by using master/slave camera arrangements for affecting the control of camera image capture
    • H04N 23/665 Control of cameras involving internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals
    • H04N 23/71 Circuitry for evaluating the brightness variation
    • H04N 23/72 Combination of two or more compensation controls
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • The invention relates to the field of computer vision, and more particularly to a method and a system for aligning the exposure center points of multiple cameras in a VR system.
  • Existing VR systems with optical tracking requirements are equipped with multiple cameras placed at the top, bottom, left and right, so each camera may face different ambient lighting and needs different exposure parameters, and with different exposure parameters it is difficult to align the exposure center points of the cameras.
  • The embodiments of the present invention provide a method and a system for aligning the exposure center points of multiple cameras in a VR system, so as to solve this problem.
  • The image data of the first type of frame is collected at a preset frame rate; the first type of frame is used to track external objects, and the exposure parameters of each camera change dynamically during tracking according to changes in the external objects;
  • the VTS data of the first type of frame is adjusted so that it varies with the exposure parameters, to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system;
  • the image data of the second type of frame is collected at the preset frame rate; the second type of frame is used to track the optical handle, and the exposure parameters of each camera are kept consistent during tracking;
  • the preset frame rate may be 60 Hz.
  • the external object is an external environment or a human body part.
  • Before adjusting the VTS data of the first type of frame, the method further includes: calculating the scan time of each row of data according to the HTS setting of the data image and the clock frequency in the VR system, and obtaining the default VTS value of the image sensor in the VR system.
  • The process of adjusting the VTS data of the first type of frame includes:
  • obtaining the exposure parameters of the first type of frame; calculating the VTS data of the first type of frame from the exposure parameters, and writing the VTS data into the register of the image sensor.
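The calculation above can be sketched in a few lines. This is an illustrative timing model, not the patent's exact register arithmetic: it assumes a rolling-shutter sensor in which exposure ends as readout begins, so the frame length (VTS, counted in rows) must grow by half the exposure to hold the exposure center at a fixed offset from FSIN. The default VTS and all numbers are hypothetical placeholders; real values come from the sensor datasheet.

```python
def row_time_us(hts: int, pclk_hz: float) -> float:
    """Scan time of one data row: HTS pixel-clock cycles divided by the pixel clock."""
    return hts / pclk_hz * 1e6

def vts_for_fixed_center(vts_default: int, exposure_rows: int) -> int:
    """Recompute VTS (frame length in rows) from the current exposure.

    Illustrative model: exposure ends when readout starts, so the exposure
    center sits exposure_rows / 2 rows before readout.  Lengthening the
    frame by half the exposure keeps that center at a fixed offset from
    the FSIN pulse: a larger exposure gives a larger VTS and a smaller
    exposure a smaller VTS, matching the behaviour described above.
    """
    return vts_default + exposure_rows // 2

# With a 1500-cycle HTS and a 75 MHz pixel clock, one row takes 20 us.
print(row_time_us(1500, 75e6))          # 20.0
print(vts_for_fixed_center(1000, 200))  # 1100
```

The recomputed value is what would then be written into the sensor's VTS register for the next first-type frame.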
  • The darker the environment of the external object, the larger the exposure parameter; the brighter the environment, the smaller the exposure parameter.
  • If the exposure parameter increases, the value of the VTS data is increased; if it decreases, the value of the VTS data is decreased.
  • The VTS data of the image of the second type of frame is adjusted according to the VTS data of the first type of frame:
  • the sum of the VTS value of the second type of frame and the VTS value of the first type of frame is kept at a fixed value.
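The fixed-sum rule is simple enough to show directly. The constant 2250 rows is a hypothetical value chosen only so that a pair of frames fills one output period; the real constant follows from the preset frame rate and the sensor's row time.

```python
VTS_SUM = 2250  # hypothetical constant: total rows in one frame pair

def second_frame_vts(first_frame_vts: int) -> int:
    # Keeping VTS1 + VTS2 constant keeps the duration of every
    # (first-type, second-type) frame pair constant, so the camera still
    # outputs at the preset frame rate even though VTS1 changes with
    # the auto-exposure result.
    return VTS_SUM - first_frame_vts

print(second_frame_vts(1100))  # 1150
```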
  • The center-point alignment of the first type of frame and the second type of frame is completed after the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal in the VR system is fixed;
  • the center-point alignment of the first and second types of frame is then repeated in turn, so as to align the camera exposure center points over the entire VR tracking process.
  • An embodiment of the present invention also provides a system for aligning the exposure center points of multiple cameras in a VR system, used to realize the aforementioned method; the system includes a type-frame division module, cameras, a first-type frame processing module and a second-type frame processing module, wherein:
  • the type-frame division module is configured to instruct the cameras to acquire the first type of frame and the second type of frame in turn;
  • the cameras are configured to collect the image data of the first and second types of frame in turn according to the instructions of the type-frame division module, at the preset frame rate;
  • the first-type frame processing module is configured to adjust the VTS data of the first type of frame so that it changes with the exposure parameters, to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system;
  • the second-type frame processing module is configured to adjust the VTS data of the second type of frame according to the VTS data of the first type of frame, and to fix the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal, to complete the alignment of the camera exposure center points.
  • The collected image data is divided into frames of the first type and frames of the second type, which are then processed accordingly.
  • The first type of frame is used to track external objects.
  • The exposure parameters of the cameras change dynamically with the external objects, and this property is used to adjust the VTS data of the first type of frame.
  • The VTS data changes with the exposure parameters, fixing the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system.
  • For the second type of frame, the exposure parameters of the cameras are consistent during tracking.
  • This gives the exposure center point of the second type of frame a fixed time interval to the FSIN synchronization signal in the VR system, completing the center-point alignment of the first and second types of frame; the alignment is repeated in turn when acquiring the remaining image data, thereby aligning the camera exposure center points over the entire VR tracking process.
  • In this way, even if each camera is set with different exposure parameters, the alignment of the exposure center points can still be completed, stabilizing the output of the entire VR system and improving the user's comfort and immersion.
  • FIG. 1 is a flowchart of a method for aligning exposure center points of multiple cameras in a VR system according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of an alignment system of a multi-camera exposure center point in a VR system according to an embodiment of the present invention.
  • Existing VR systems with optical tracking requirements are equipped with multiple cameras placed at the top, bottom, left and right, so each camera faces different ambient lighting and needs different exposure parameters, and with different exposure parameters it is difficult to align the exposure center points of multiple cameras.
  • Embodiments of the present invention provide a method for aligning the exposure center points of multiple cameras in a VR system. Specific embodiments are described in detail below with reference to the accompanying drawings.
  • FIG. 1 exemplarily illustrates the method for aligning the exposure center points of multiple cameras in a VR system according to an embodiment of the present invention.
  • FIG. 2 exemplarily illustrates the corresponding alignment system.
  • The method for aligning the exposure center points of multiple cameras in a VR system provided by an embodiment of the present invention includes:
  • S110: Collect image data of a first type of frame at a preset frame rate; the first type of frame is used to track external objects, and the exposure parameters of each camera change dynamically during tracking according to changes in the external objects;
  • S120: Adjust the VTS data of the first type of frame; the VTS data changes with the exposure parameters, so as to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system;
  • S130: Collect image data of a second type of frame at the preset frame rate; the second type of frame is used to track the optical handle, and the exposure parameters of each camera are kept consistent during tracking;
  • S140: Adjust the VTS data of the second type of frame according to the VTS data of the first type of frame, and fix the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal in the VR system, so as to complete the alignment of the camera exposure center points.
  • The preset frame rate is set in advance and must be combined with the frame rate of the camera itself; the camera frame rate is an integer multiple of the synchronization frame rate. For example, the synchronization frame rate may be 30 Hz and the camera frame rate 60 Hz, in which case the captured images can be a 30 Hz head image stream and a 30 Hz hand image stream.
  • The synchronization frame rate can be 30 Hz or 15 Hz; if image data is collected at 90 Hz, it can consist of a 30 Hz head image and a 60 Hz hand image, collected in the order head image, hand image, hand image.
  • The specific value of the preset frame rate is not limited. In this embodiment, image data is collected at 60 Hz, comprising the first type of frame at 30 Hz and the second type of frame at 30 Hz.
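The interleaving described above can be sketched as a repeating capture pattern. The labels "env", "head" and "hand" are illustrative names, not identifiers from the patent:

```python
def frame_schedule(pattern: list[str], n_frames: int) -> list[str]:
    """Repeat a per-cycle capture pattern over n_frames consecutive frames."""
    return [pattern[i % len(pattern)] for i in range(n_frames)]

# 60 Hz capture: alternate environment (first type) and handle (second type),
# giving 30 Hz of each stream.
print(frame_schedule(["env", "handle"], 4))
# ['env', 'handle', 'env', 'handle']

# 90 Hz capture: 30 Hz head image plus 60 Hz hand image.
print(frame_schedule(["head", "hand", "hand"], 6))
# ['head', 'hand', 'hand', 'head', 'hand', 'hand']
```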
  • The first type of frame is used to track external objects, where an external object is the external environment or a human body part; that is, the first type of frame is an image of the external environment or of a human body part. If only the external environment and the handle are collected, the external object is the external environment, i.e. the first type of frame is an external-environment image.
  • In one embodiment, a 60 Hz frame rate is used to collect image data: the first frame captures the external environment to form a 30 Hz
  • external-environment image stream, which realizes the Head 6DOF function of the whole VR system.
  • The second frame captures a 30 Hz handle image to track the optical handle and realize the Hand 6DOF function.
  • Here the first type of frame refers to the external-environment image formed by capturing the external environment.
  • In another embodiment, a 30 Hz frame rate is used to collect image data to achieve 15 Hz synchronization:
  • the first and second frames capture the head image,
  • and the third and fourth frames capture the handle image.
  • Here the first type of frame refers to the collected head image; no matter how many frames of external objects are collected before the handle images, the first type of frame is always the images of external objects in the leading frames.
  • In other words, the first type of frame is any non-handle image collected apart from handle tracking, which can be an external-environment image or a human-body-part image.
  • In step S110, the exposure parameters of each camera change dynamically during tracking according to changes in the external objects, and the multiple cameras keep working synchronously.
  • Since each camera is mounted in a different position, the external environment each camera faces is also different.
  • To guarantee tracking accuracy and consistent output across environments, the exposure parameters of each camera also need to differ.
  • The darker the environment of the external objects, the larger the exposure parameter;
  • the brighter the environment of the external objects, the smaller the exposure parameter; that is, a camera in a darker environment is given a longer exposure time and a camera in a brighter environment a shorter one.
  • In this case, only by aligning the exposure center points can it be guaranteed that the multiple cameras capture data at the same instant and therefore carry the same timestamp.
  • Before step S120, it is necessary to calculate the scan time of each row of data according to the HTS setting of the data image and the clock frequency in the VR system, and to obtain the default VTS value of the image sensor in the VR system, after which the VTS data of the first type of frame is adjusted. The adjustment process includes:
  • S121: Obtain the exposure parameters of the first type of frame;
  • S122: Calculate the VTS data of the first type of frame from the exposure parameters, and write the VTS data into the register of the image sensor.
  • The VTS data changes with the exposure parameters: if the exposure parameter increases, the value of the VTS data is increased; if it decreases, the value is decreased, so as to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system, ensuring that the center points of the multiple cameras stay aligned when acquiring the image data of the first type of frame.
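Under one illustrative timing model (FSIN marks the frame start, readout begins after VTS rows, exposure ends at readout), it can be checked numerically that two cameras given different exposures still share one exposure center once each recomputes its own VTS. All numbers here are hypothetical and stand in for values derived from the HTS setting and clock frequency:

```python
def exposure_center_offset_us(vts_rows: int, exposure_rows: int, row_us: float) -> float:
    # Exposure center relative to FSIN: readout starts after vts_rows rows,
    # and the exposure midpoint sits exposure_rows / 2 rows earlier.
    return (vts_rows - exposure_rows / 2) * row_us

ROW_US = 20.0        # hypothetical row scan time
VTS_DEFAULT = 1000   # hypothetical default VTS of the sensor

# Two cameras in different lighting get different exposures from AE, but
# each recomputes VTS = default + exposure / 2, so the centers coincide.
for exp_rows in (100, 400):
    vts = VTS_DEFAULT + exp_rows // 2
    print(exposure_center_offset_us(vts, exp_rows, ROW_US))  # 20000.0 both times
```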
  • In step S130, the second type of frame is collected at the preset frame rate, and its image data is used to track the optical handle.
  • In one embodiment, a 60 Hz frame rate is used:
  • the first frame captures the external environment to form a 30 Hz external-environment image realizing the Head 6DOF function of the entire VR system, and the second frame captures the handle to form a 30 Hz handle image to track the optical handle and realize the Hand 6DOF function.
  • Here the second type of frame refers to the handle image formed by tracking the handle; in another embodiment, a 30 Hz frame rate is used to collect image data to achieve 15 Hz synchronization:
  • the first and second frames capture the head image,
  • and the third and fourth frames capture the handle image.
  • Here the second type of frame refers to the collected handle image; no matter how many frames of external objects are collected before the handle images,
  • the second type of frame is always the collected handle image.
  • In step S130, since the handle is brighter than the environment, and to reduce image smear when the handle moves quickly, the exposure parameter
  • (exposure time) of each camera is set very small during optical handle tracking (tens to hundreds of microseconds), and the exposure times of the multiple cameras are set to the same value, so the exposure parameters (exposure times) of the cameras remain consistent while tracking the optical handle.
  • By the nature of the cameras, the exposure center points of the second type of frame of the multiple cameras are then aligned, i.e. the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal is fixed.
  • Therefore, in step S140, the VTS data of the second type of frame is adjusted according to the VTS data of the first type of frame, and the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal in the VR system is fixed, so as to complete the alignment of the camera exposure center points.
  • In step S140, adjusting the VTS data of the image of the second type of frame according to the VTS data of the first type of frame means making the sum of the VTS value of the second type of frame and the VTS value of the first type of frame a fixed value.
  • When this sum is always the same value, the image data of the two frame types can be output stably at a specific frame rate, and the camera can output stably at the preset frame rate, achieving the center-point alignment of the first and second types of frame.
  • Throughout the entire image acquisition process of the VR system, frames are collected in the alternating order first type, second type, first type, second type, and so on;
  • the center-point alignment process is repeated in turn for each pair, so that the camera exposure center points are aligned over the entire VR tracking process.
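Putting the two rules together, one iteration of the tracking loop can be sketched as below. `ae_exposure_rows` stands in for whatever value auto-exposure produced for the first-type frame; the default VTS and the fixed sum are hypothetical constants:

```python
def align_pair(ae_exposure_rows: int, vts_default: int, vts_sum: int) -> tuple[int, int]:
    """VTS values for one (first-type, second-type) frame pair."""
    vts_first = vts_default + ae_exposure_rows // 2  # tracks the AE result
    vts_second = vts_sum - vts_first                 # fixed-sum rule
    return vts_first, vts_second

# Auto-exposure drifts from pair to pair, yet every pair spans the same
# number of rows, so the stream stays locked to the preset frame rate.
for ae in (80, 120, 200):
    v1, v2 = align_pair(ae, 1000, 2250)
    assert v1 + v2 == 2250

print(align_pair(80, 1000, 2250))  # (1040, 1210)
```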
  • The method for aligning the exposure center points of multiple cameras in a VR system divides the collected image data into frames of the first type and frames of the second type,
  • which are processed separately.
  • The first type of frame is used to track external objects.
  • The exposure parameters of the cameras change dynamically with the external objects during tracking.
  • The VTS data of the first type of frame is adjusted so that it changes with the exposure parameters, fixing the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system.
  • For the second type of frame, the exposure parameters of the cameras are consistent during tracking, which makes it easy to keep
  • a fixed time interval between the exposure center point of the second type of frame and the FSIN synchronization signal in the VR system, completing the center-point alignment of the two frame types. The alignment is repeated in turn when acquiring the remaining image data, thereby aligning the camera exposure center points over the entire VR tracking process. In this way, even if more cameras are added to meet optical handle tracking requirements and each camera is set with different exposure parameters, the exposure center points can still be aligned, so that the entire VR system outputs stably and the user's comfort and immersion are improved.
  • An embodiment of the present invention further provides a system 100 for aligning the exposure center points of multiple cameras in a VR system, used to implement the aforementioned method, including a type-frame division module 110, a camera 120, a first-type frame processing module 130 and a second-type frame processing module 140.
  • The type-frame division module 110 is used to instruct the camera 120 to acquire the image data of the first type of frame and the second type of frame in turn, i.e. to instruct the camera 120 to capture external objects and the light-emitting handle in turn, and divides the image data collected by the camera 120 into the first type of frame and the second type of frame. The camera 120 is used to collect, according to the instructions of the type-frame division module 110,
  • the image data of external objects or of the light-emitting handle, at the preset frame rate;
  • the first-type frame processing module 130 is used to adjust the VTS data of the first type of frame so that it changes with the exposure parameters, to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system;
  • the second-type frame processing module 140 is used to adjust the VTS data of the second type of frame according to the VTS data of the first type of frame, and to fix the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal in the VR system, so as to complete the alignment of the camera exposure center points.
  • In this system, the type-frame division module first instructs the cameras to acquire the first and second types of frame in turn, and the cameras collect
  • the image data of both frame types at the preset frame rate; the first-type and second-type frame processing modules then process the VTS data of the two frame types so that their exposure center points are aligned.
  • Thus each camera can be set with different exposure parameters while the alignment of the exposure center points is still completed, stabilizing the output of the entire VR system and improving the user's comfort and immersion.
  • Embodiments of the present invention further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, wherein the computer program is configured to execute the steps in any of the above method embodiments when running.
  • An embodiment of the present invention also provides an electronic device, comprising a memory and a processor, where a computer program is stored in the memory, and the processor is configured to run the computer program to execute the steps in any of the above method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Provided are a method and a system for aligning the exposure center points of multiple cameras in a VR system. The method includes: collecting image data of a first type of frame at a preset frame rate, where the first type of frame is used to track external objects and the exposure parameters of each camera change dynamically during tracking according to changes in the external objects (S110); adjusting the VTS data of the first type of frame, where the VTS data changes with the exposure parameters, so as to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system (S120); collecting image data of a second type of frame at the preset frame rate, where the second type of frame is used to track an optical handle and the exposure parameters of each camera are kept consistent during tracking (S130); and adjusting the VTS data of the second type of frame according to the VTS data of the first type of frame, and fixing the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal, so as to complete the alignment of the camera exposure center points (S140).

Description

Method and system for aligning the exposure center points of multiple cameras in a VR system
Cross-reference to related applications
This application is based on, and claims priority to, Chinese patent application CN202010973542.2, filed on September 16, 2020 and entitled "Method and system for aligning the exposure center points of multiple cameras in a VR system" ("VR系统中多摄像头曝光中心点的对齐方法、系统"), the entire disclosure of which is incorporated herein by reference.
Technical field
The present invention relates to the field of computer vision, and more particularly to a method and a system for aligning the exposure center points of multiple cameras in a VR system.
Background
In existing all-in-one VR 6DOF headset designs, optical tracking on the headset is mostly implemented with two cameras that share the same configuration parameters. In the auto-exposure design the two cameras also share the same exposure parameters, so the exposure center points of the two cameras can be aligned. However, the two cameras leave blind spots in the FOV, and when optical handle tracking is required the handle blind-spot problem becomes more pronounced, so more cameras must be added to the tracking system. These cameras may be placed at the top, bottom, left and right, so each camera may face different ambient lighting and needs different exposure parameters, and with different exposure parameters it is difficult to align the exposure center points of multiple cameras.
Therefore, a method and a system that can align the exposure center points of multiple cameras in a VR system even under different exposure parameters are urgently needed.
Summary
In view of the above problems, embodiments of the present invention provide a method and a system for aligning the exposure center points of multiple cameras in a VR system, to solve the problem that existing VR systems with optical tracking requirements are equipped with multiple cameras placed at the top, bottom, left and right, so each camera faces different ambient lighting and needs different exposure parameters, which makes it difficult to align the exposure center points of the cameras.
A method for aligning the exposure center points of multiple cameras in a VR system according to an embodiment of the present invention includes:
collecting image data of a first type of frame at a preset frame rate, where the first type of frame is used to track external objects, and the exposure parameters of each camera change dynamically during tracking according to changes in the external objects;
adjusting the VTS data of the first type of frame, where the VTS data changes with the exposure parameters, so as to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system;
collecting image data of a second type of frame at the preset frame rate, where the second type of frame is used to track an optical handle, and the exposure parameters of each camera are kept consistent during tracking;
adjusting the VTS data of the second type of frame according to the VTS data of the first type of frame, and fixing the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal, so as to complete the alignment of the camera exposure center points.
In an embodiment, the preset frame rate may be 60 Hz.
In an embodiment, the external object is the external environment or a human body part.
In an embodiment, before adjusting the VTS data of the first type of frame, the method further includes:
calculating the scan time of each row of data according to the HTS setting of the data image and the clock frequency in the VR system, and obtaining the default VTS value of the image sensor in the VR system.
In an embodiment, the process of adjusting the VTS data of the first type of frame includes:
obtaining the exposure parameters of the first type of frame;
calculating the VTS data of the first type of frame from the exposure parameters, and writing the VTS data into the register of the image sensor.
In an embodiment, while the exposure parameters of each camera change dynamically during tracking according to changes in the external objects,
the darker the environment of the external object, the larger the exposure parameter;
the brighter the environment of the external object, the smaller the exposure parameter.
In an embodiment, while the VTS data changes with the exposure parameters,
if the exposure parameter increases, the value of the VTS data is increased;
if the exposure parameter decreases, the value of the VTS data is decreased.
In an embodiment, adjusting the VTS data of the image of the second type of frame according to the VTS data of the first type of frame means:
making the sum of the VTS value of the second type of frame and the VTS value of the first type of frame a fixed value.
In an embodiment, the center-point alignment of the first type of frame and the second type of frame is completed after the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal in the VR system is fixed;
throughout the entire image acquisition process of the VR system, the center-point alignment of the first type of frame and the second type of frame is repeated in turn, so as to align the camera exposure center points over the entire VR tracking process.
An embodiment of the present invention further provides a system for aligning the exposure center points of multiple cameras in a VR system, used to implement the foregoing method; the system includes a type-frame division module, cameras, a first-type frame processing module and a second-type frame processing module, where
the type-frame division module is configured to instruct the cameras to acquire the first type of frame and the second type of frame in turn;
the cameras are configured to collect the image data of the first and second types of frame in turn according to the instructions of the type-frame division module, at the preset frame rate;
the first-type frame processing module is configured to adjust the VTS data of the first type of frame so that it changes with the exposure parameters, so as to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system;
the second-type frame processing module is configured to adjust the VTS data of the second type of frame according to the VTS data of the first type of frame, and to fix the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal, so as to complete the alignment of the camera exposure center points.
As can be seen from the above technical solutions, in the method and system provided by the embodiments of the present invention, the collected image data is divided into frames of the first type and frames of the second type, which are processed separately. The first type of frame is used to track external objects, and during tracking the exposure parameters of the cameras change dynamically with the external objects; this property is used to adjust the VTS data of the first type of frame so that it changes with the exposure parameters, fixing the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system. For the second type of frame the exposure parameters of the cameras are consistent during tracking, so the exposure center point of the second type of frame has a fixed time interval to the FSIN synchronization signal, completing the center-point alignment of the two frame types. The alignment is repeated in turn when acquiring the remaining image data, thereby aligning the camera exposure center points over the entire VR tracking process. In this way, even if more cameras are added to meet optical handle tracking requirements and each camera is set with different exposure parameters, the exposure center points can still be aligned, so that the entire VR system outputs stably and the user's comfort and immersion are improved.
Brief description of the drawings
Other objects and results of the present invention will become clearer and easier to understand from the following description taken in conjunction with the accompanying drawings. In the drawings:
FIG. 1 is a flowchart of a method for aligning the exposure center points of multiple cameras in a VR system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a system for aligning the exposure center points of multiple cameras in a VR system according to an embodiment of the present invention.
Detailed description
Existing VR systems with optical tracking requirements are equipped with multiple cameras placed at the top, bottom, left and right, so each camera faces different ambient lighting and needs different exposure parameters, and with different exposure parameters it is difficult to align the exposure center points of multiple cameras.
In view of the above problems, embodiments of the present invention provide a method for aligning the exposure center points of multiple cameras in a VR system. Specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
To illustrate the method provided by the embodiments of the present invention, FIG. 1 exemplarily illustrates the method for aligning the exposure center points of multiple cameras in a VR system, and FIG. 2 exemplarily illustrates the corresponding alignment system.
The following description of the exemplary embodiments is merely illustrative and in no way limits the present invention or its application or use. Techniques and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate they should be regarded as part of the specification.
As shown in FIG. 1, the method for aligning the exposure center points of multiple cameras in a VR system provided by an embodiment of the present invention includes:
S110: collecting image data of a first type of frame at a preset frame rate, where the first type of frame is used to track external objects, and the exposure parameters of each camera change dynamically during tracking according to changes in the external objects;
S120: adjusting the VTS data of the first type of frame, where the VTS data changes with the exposure parameters, so as to fix the time interval between the exposure center point of the first type of frame and the FSIN synchronization signal in the VR system;
S130: collecting image data of a second type of frame at the preset frame rate, where the second type of frame is used to track an optical handle, and the exposure parameters of each camera are kept consistent during tracking;
S140: adjusting the VTS data of the second type of frame according to the VTS data of the first type of frame, and fixing the time interval between the exposure center point of the second type of frame and the FSIN synchronization signal in the VR system, so as to complete the alignment of the camera exposure center points.
In the embodiment shown in FIG. 1, in step S110, the preset frame rate is set in advance and must be combined with the frame rate of the camera itself; the camera frame rate is an integer multiple of the synchronization frame rate. For example, the synchronization frame rate may be 30 Hz and the camera frame rate 60 Hz; in other words, with a 60 Hz camera frame rate the captured images can be a 30 Hz head image stream and a 30 Hz hand image stream. It should be noted that the synchronization frame rate may be 30 Hz or 15 Hz. If image data is collected at 90 Hz, it can consist of a 30 Hz head image and a 60 Hz hand image, collected in the order head image, hand image, hand image. The specific value of the preset frame rate is not limited; in this embodiment image data is collected at 60 Hz, comprising the first type of frame at 30 Hz and the second type of frame at 30 Hz.
As shown in FIG. 1, in step S110 the first type of frame is used to track external objects, where an external object is the external environment or a human body part; that is, the first type of frame is an image of the external environment or of a human body part. If the VR system collects only the external environment and the handle, the external object is the external environment and the first type of frame is an external-environment image. In one specific embodiment, image data is collected at a 60 Hz frame rate: the first frame captures the external environment to form a 30 Hz external-environment image stream that realizes the Head 6DOF function of the whole VR system, and the second frame captures the handle to form a 30 Hz handle image stream that tracks the optical handle and realizes the Hand 6DOF function; here the first type of frame refers to the external-environment image. In another specific embodiment, image data is collected at a 30 Hz frame rate to achieve 15 Hz synchronization: the first and second frames capture the head image, and the third and fourth frames capture the handle image; here the first type of frame refers to the collected head image. No matter how many frames of external objects are collected before the handle images, the first type of frame is always the images of external objects in the leading frames; in other words, the first type of frame is any non-handle image collected apart from handle tracking, which may be an external-environment image or a human-body-part image.
As shown in FIG. 1, in step S110 the exposure parameters of each camera change dynamically during tracking according to changes in the external objects, while the multiple cameras keep working synchronously. Since each camera is mounted in a different position, each camera faces a different external environment; to guarantee tracking accuracy and consistent image output across environments, the exposure parameters of each camera must also differ. The darker the environment of the external object, the larger the exposure parameter; the brighter the environment, the smaller the exposure parameter. That is, a camera in a darker environment is given a longer exposure time and a camera in a brighter environment a shorter one. In this case, only by aligning the exposure center points can it be guaranteed that the multiple cameras capture data at the same instant and therefore carry the same timestamp.
如图1所示的实施例,在步骤S120之前,需要根据数据图像的HTS设置和VR系统中的时钟频率计算每行数据的扫描时间,并获取VR系统中图像传感器默认的VTS数值,而后调整第一类帧的VTS数据,其中,调整所述第一类帧的VTS数据的过程包括:
S121:获取第一类帧的曝光参数;
S122:通过曝光参数计算第一类帧的VTS数据,并将VTS数据写入图像传感器的寄存器中;
该VTS数据随着曝光参数的变化而变化,具体的,若曝光参数增大,则增大VTS数据的数值;若曝光参数减小,则减小VTS数据的数值,以固定第一类帧的曝光中心点与VR系统中的FSIN同步信号的时间间隔,从而在多个摄像头在获取第一类帧的图像数据时保证多个摄像头的中心点对齐。
如图1所示的实施例,在步骤S130中,按照预设帧率采集第二类帧,该第二类帧的图像数据用于追踪光学手柄,在一个具体实施例中,采用60Hz的帧率采集图像数据,第一帧先采集外部环境形成30Hz的外部环境图像以实现整个VR系统的Head 6DOF功能,第二帧采集手柄形成30Hz的手柄的图像以追踪光学手柄实现Hand 6DOF功能,此时该第二类帧指的是追踪手柄形成的手柄的图像;在另一具体实施例中,采用30Hz的帧率采集图像数据,实现15Hz同步,具体的,第一帧和第二帧先采集头部图像,第三帧第四帧采集手 柄的图像,在本实施例中,该第二类帧指的是采集的手柄的图像;无论先采集几帧外部事物的图像再采集手柄图像,该第二类帧均为所采集的手柄的图像。
如图1所示的实施例,在步骤S130中,由于手柄亮度与环境相比会较亮,同时为了在手柄快速移动时能够减小图像余辉,在追踪光学手柄的过程中各个摄像头的曝光参数(曝光时间)设置很小(几十到几百微秒),并且多个摄像头曝光时间设置为同样参数,从而在追踪光学手柄的过程中各个摄像头的曝光参数(曝光时间)保持一致,根据摄像头的特性,多个摄像头的第二类帧的图像曝光中心点对齐,即实现第二类帧的图像的曝光中心点与FSIN同步信号时间间隔固定,因此,在步骤S140中,首先根据第一类帧的VTS数据调整第二类帧的VTS数据,且固定第二类帧的曝光中心点与VR系统中的FSIN同步信号的时间间隔,从而完成摄像头曝光中心点的对齐。
如图1所示的实施例,在步骤S140中,根据第一类帧的VTS数据调整第二类帧的图像的VTS数据为:使第二类帧的VTS数据的数值与第一类帧的VTS数据的数值之和为固定数值,当第二类帧的VTS数据的数值与第一类帧的VTS数据的数值之和始终为相同的值,如此第一类帧与第二类帧两帧的图像数据方能以特定帧率稳定的进行图像输出,实现摄像头以预设帧率稳定输出,从而实现第一类帧与第二类帧的中心点对齐,在VR系统采集图像数据的整个采集过程中,依次进行采集第一类帧、第二类帧、第一类帧、第二类帧……如此循环往复以实现整个VR追踪过程,故只需按照上述第一类帧与第二类帧进行中心点对齐的过程循环往复,依次重复完成第一类帧与第二类帧的中心点对齐,即可实现对齐整个VR追踪过程中的摄像头曝光中心点。
It can be seen from the above embodiments that the method for aligning the exposure center points of multiple cameras in a VR system provided by the present invention divides the captured image data into first-type frames and second-type frames and processes them separately. The first-type frames track an external object, and the exposure parameter of the cameras changes dynamically with the object during tracking; exploiting this, the VTS data of the first-type frames is adjusted to follow the exposure parameter, fixing the time interval between the exposure center point of the first-type frames and the FSIN synchronization signal of the VR system. The cameras use identical exposure parameters while capturing second-type frames, so a fixed time interval between the exposure center point of the second-type frames and the FSIN synchronization signal is easily achieved, completing the center-point alignment of the first-type and second-type frames. Repeating this alignment in turn while acquiring the remaining image data aligns the camera exposure center points across the entire VR tracking process. In this way, even if more cameras with different exposure parameters are added to meet optical controller tracking requirements, the exposure center points can still be aligned, giving the whole VR system stable output and improving user comfort and immersion.
As shown in Fig. 2, an embodiment of the present invention further provides a system 100 for aligning the exposure center points of multiple cameras in a VR system, used to implement the foregoing alignment method. It comprises a frame classification module 110, cameras 120, a first-type frame processing module 130, and a second-type frame processing module 140. The frame classification module 110 instructs the cameras 120 to acquire first-type and second-type frame image data in turn, i.e. to alternately capture the external object and the light-emitting controller, and divides the image data captured by the cameras 120 into first-type and second-type frames. The cameras 120 capture image data of the external object or of the light-emitting controller as instructed by the frame classification module 110, at the preset frame rate. The first-type frame processing module 130 adjusts the VTS data of the first-type frames so that it changes with the exposure parameter, fixing the time interval between the exposure center point of the first-type frames and the FSIN synchronization signal of the VR system. The second-type frame processing module 140 adjusts the VTS data of the second-type frames according to the VTS data of the first-type frames and fixes the time interval between the exposure center point of the second-type frames and the FSIN synchronization signal, completing the alignment of the camera exposure center points.
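A minimal sketch of how the modules of Fig. 2 might cooperate frame by frame is given below; the class, method names, and numeric values are illustrative assumptions, since the patent specifies modules, not an API:

```python
class AlignmentSystem:
    """Schematic pipeline for Fig. 2: alternate first-type and
    second-type frame slots and keep VTS(first) + VTS(second)
    constant so exposure centers stay fixed relative to FSIN."""

    def __init__(self, vts_sum, default_vts):
        self.vts_sum = vts_sum        # fixed two-frame row budget
        self.vts_first = default_vts  # VTS of the latest first-type frame
        self.slot = 0                 # frame classification counter

    def next_vts(self, exposure_delta_rows=0):
        """Return the VTS to program for the next frame slot.
        exposure_delta_rows: change in first-frame exposure, in rows."""
        if self.slot % 2 == 0:                  # first-type frame
            self.vts_first += exposure_delta_rows
            vts = self.vts_first
        else:                                   # second-type frame
            vts = self.vts_sum - self.vts_first
        self.slot += 1
        return vts
```

The frame classification role is reduced here to the even/odd slot counter; the two processing modules correspond to the two branches of `next_vts`.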
It can be seen from the above embodiments that, in the alignment system provided by the embodiments of the present invention, the frame classification module first instructs the cameras to acquire first-type and second-type frames in turn, the cameras capture the corresponding image data at the preset frame rate, and the first-type and second-type frame processing modules then process the VTS data of the two frame types so that their exposure center points are aligned. Repeating this alignment in turn while acquiring the remaining image data aligns the camera exposure center points across the entire VR tracking process. In this way, even if more cameras with different exposure parameters are added to meet optical controller tracking requirements, the exposure center points can still be aligned, giving the whole VR system stable output and improving user comfort and immersion.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program, the computer program being configured to perform, when run, the steps of any of the above method embodiments.
An embodiment of the present invention further provides an electronic device comprising a memory and a processor, the memory storing a computer program and the processor being configured to run the computer program to perform the steps of any of the above method embodiments.
The method and system for aligning the exposure center points of multiple cameras in a VR system proposed by the present invention have been described above by way of example with reference to the accompanying drawings. Those skilled in the art will appreciate, however, that various improvements can be made to the method and system without departing from the content of the present invention. The scope of protection of the present invention is therefore determined by the content of the appended claims.

Claims (12)

  1. A method for aligning exposure center points of multiple cameras in a VR system, comprising:
    capturing image data of first-type frames at a preset frame rate, the first-type frames being used to track an external object, with the exposure parameter of each camera changing dynamically with the external object during tracking;
    adjusting VTS data of the first-type frames, the VTS data changing with the exposure parameter, so as to fix the time interval between the exposure center point of the first-type frames and an FSIN synchronization signal in the VR system;
    capturing image data of second-type frames at the preset frame rate, the second-type frames being used to track an optical controller, with the exposure parameters of the cameras being identical during tracking; and
    adjusting the VTS data of the second-type frames according to the VTS data of the first-type frames, and fixing the time interval between the exposure center point of the second-type frames and the FSIN synchronization signal, so as to complete the alignment of the exposure center points of the cameras.
  2. The method for aligning exposure center points of multiple cameras in a VR system according to claim 1, wherein
    the preset frame rate is 60 Hz.
  3. The method for aligning exposure center points of multiple cameras in a VR system according to claim 2, wherein
    the external object is the external environment or a human body part.
  4. The method for aligning exposure center points of multiple cameras in a VR system according to claim 1, further comprising, before adjusting the VTS data of the first-type frames:
    calculating the scan time of each row of data according to the HTS setting of the data image and the clock frequency in the VR system, and obtaining the default VTS value of the image sensor in the VR system.
  5. The method for aligning exposure center points of multiple cameras in a VR system according to claim 4, wherein adjusting the VTS data of the first-type frames comprises:
    obtaining the exposure parameter of the first-type frames;
    calculating the VTS data of the first-type frames from the exposure parameter, and writing the VTS data into a register of the image sensor.
  6. The method for aligning exposure center points of multiple cameras in a VR system according to claim 1, wherein, as the exposure parameter of each camera changes dynamically with the external object during tracking,
    the darker the environment of the external object, the larger the exposure parameter; and
    the brighter the environment of the external object, the smaller the exposure parameter.
  7. The method for aligning exposure center points of multiple cameras in a VR system according to claim 6, wherein, as the VTS data changes with the exposure parameter,
    if the exposure parameter increases, the value of the VTS data is increased; and
    if the exposure parameter decreases, the value of the VTS data is decreased.
  8. The method for aligning exposure center points of multiple cameras in a VR system according to claim 1, wherein adjusting the VTS data of the second-type frames according to the VTS data of the first-type frames comprises:
    making the sum of the value of the VTS data of the second-type frames and the value of the VTS data of the first-type frames a fixed number.
  9. The method for aligning exposure center points of multiple cameras in a VR system according to claim 1, wherein
    the center-point alignment of the first-type frames and the second-type frames is completed after the time interval between the exposure center point of the second-type frames and the FSIN synchronization signal in the VR system is fixed; and
    throughout the acquisition of image data by the VR system, the center-point alignment of the first-type frames and the second-type frames is completed repeatedly in turn, so as to align the camera exposure center points over the entire VR tracking process.
  10. A system for aligning exposure center points of multiple cameras in a VR system, the alignment system being used to implement the method for aligning exposure center points of multiple cameras in a VR system according to any one of claims 1-9, and comprising a frame classification module, cameras, a first-type frame processing module, and a second-type frame processing module, wherein
    the frame classification module is configured to instruct the cameras to acquire first-type frames and second-type frames in turn;
    the cameras are configured to capture image data of the first-type frames and the second-type frames in turn as instructed by the frame classification module, and to capture the image data at a preset frame rate;
    the first-type frame processing module is configured to adjust the VTS data of the first-type frames so that the VTS data changes with the exposure parameter, so as to fix the time interval between the exposure center point of the first-type frames and an FSIN synchronization signal in the VR system; and
    the second-type frame processing module is configured to adjust the VTS data of the second-type frames according to the VTS data of the first-type frames, and to fix the time interval between the exposure center point of the second-type frames and the FSIN synchronization signal, so as to complete the alignment of the camera exposure center points.
  11. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 9.
  12. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 9.