WO2018098867A1 - Photographing apparatus and image processing method therefor, and virtual reality device - Google Patents

Photographing apparatus and image processing method therefor, and virtual reality device Download PDF

Info

Publication number
WO2018098867A1
WO2018098867A1 (PCT application PCT/CN2016/111544)
Authority
WO
WIPO (PCT)
Prior art keywords
processor unit
pickup apparatus
image pickup
communication module
processor
Prior art date
Application number
PCT/CN2016/111544
Other languages
French (fr)
Chinese (zh)
Inventor
刘鑫 (Liu Xin)
Original Assignee
歌尔科技有限公司 (Goertek Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 歌尔科技有限公司 (Goertek Technology Co., Ltd.)
Publication of WO2018098867A1 publication Critical patent/WO2018098867A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance

Definitions

  • the present invention relates to the field of image acquisition technology, and more particularly to an image pickup apparatus, an image processing method of the image pickup apparatus, and a virtual reality apparatus having the same.
  • VR technology uses computer simulation to generate a virtual world in three-dimensional space, providing the user with simulations of the visual, auditory, tactile and other senses, so that the user feels immersed in it and can observe things within the three-dimensional space in real time and without restriction.
  • the spatial positioning system in virtual reality mainly recognizes the position and posture of the participants by tracking the spatial position of peripherals such as helmets and handles.
  • the infrared signals emitted by the infrared emitters installed on these peripherals are mainly collected by the camera device for positioning and tracking of peripherals.
  • an image pickup apparatus comprising a processor unit, a communication module, and at least one group of binocular cameras, the processor unit including at least one processor; the at least one group of binocular cameras transmits the collected image data to the processor unit for processing, and the processor unit transmits the corresponding processing result to the communication module for transmission.
  • a virtual reality device comprising a host and a camera device according to the first aspect of the present invention, wherein the host establishes a communication connection with the communication module of the camera device, and the processor unit of the camera device sends the processing results obtained by processing various types of data to the host through the communication connection.
  • the location information is transmitted to the communication module of the imaging device as a processing result corresponding to the image data for transmission.
  • FIG. 1 is a block schematic diagram of an embodiment of an image pickup apparatus according to the present invention.
  • FIG. 2 is a block schematic diagram of an embodiment of a virtual reality device in accordance with the present invention.
  • FIG. 3 is a flow chart showing an embodiment of an image processing method of an image pickup apparatus according to the present invention.
  • U101 - application processor; U102 - coprocessor;
  • U3 - storage unit; U4 - inertial measurement unit;
  • U5 - communication module; U6 - power management chip;
  • U7 - audio codec chip; B1 - battery;
  • J1 - USB socket; S1 - speaker module;
  • M1 - microphone module; C1 - first group of binocular cameras;
  • C2 - second group of binocular cameras; U1 - processor unit.
  • Figure 1 is a block schematic diagram of an embodiment of an image pickup apparatus according to the present invention.
  • the camera device includes a processor unit U1, a communication module U5, and two sets of binocular cameras C1, C2.
  • the image data collected by each group of binocular cameras C1 and C2 is transmitted to the processor unit U1 for processing.
  • the processor unit U1 can configure each group of binocular cameras C1, C2, for example, via an I2C bus.
  • the processor unit U1 transmits the results of its various processing, including the results obtained by processing the image data, to the communication module U5 for transmission. In this way, after the camera device establishes a communication connection with the host through the communication module U5, the processing results obtained by the processor unit U1 can be sent to the host for the host's use, thereby reducing the processing burden on the host.
  • the communication module U5 is a wireless communication module, so that various types of processed data can be transmitted by establishing a wireless communication connection with the host.
  • the communication module U5 may also be a wired communication module, such as a USB communication module or the like.
  • each group of binocular cameras includes two cameras of the same type, which act like the left and right eyes to capture information in three-dimensional space.
  • in the embodiment shown in FIG. 1, the first group of binocular cameras C1 uses standard cameras, for example with the OV9281 sensor, for gesture capture; the second group of binocular cameras C2 uses fisheye lenses, for example with the OV7251 sensor, for obtaining depth of field.
  • taking as an example an application scenario in which a user holds a wireless handle fitted with infrared lights to experience a VR game, the two groups of binocular cameras C1 and C2 capture, at a set frequency, the infrared light emitted by the handle's infrared lights: the first group C1 performs gesture capture, the second group C2 obtains the depth of field, and together they locate the handle's motion.
  • the processor unit U1 processes the image data captured by the two groups of binocular cameras C1 and C2 to obtain the position of the handle in space, and sends the resulting position information through the communication module U5 to the host of the virtual reality device; after the host obtains the handle's position information, it can map it into the VR scene to implement the corresponding VR operation.
  • the camera device may further include an inertial measurement unit (IMU) U4, which likewise transmits the motion data it collects to the processor unit U1 for processing; the processor unit U1 likewise transmits the corresponding processing result to the communication module U5 for transmission.
  • the inertial measurement unit (IMU) U4 is a device for measuring the three-axis attitude angle (or angular rate) of the object and the acceleration.
  • the gyroscope, the accelerometer, and the geomagnetic sensor are the main components of the IMU.
  • An IMU may, for example, comprise three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the object's acceleration signals along three independent axes of the carrier coordinate system, while the gyroscopes detect the carrier's angular velocity signals relative to the navigation coordinate system; the angular velocity and acceleration of the object in three-dimensional space are thus measured, and from them the object's attitude is computed.
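As a rough illustration of that last step (a sketch under assumed conventions, not taken from the patent): roll and pitch can be recovered from a static accelerometer reading using gravity as a reference, and fused with integrated gyro rates through a simple complementary filter. The function names and the filter coefficient here are hypothetical.

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate roll and pitch (radians) from a static 3-axis
    accelerometer reading, using gravity as the reference vector."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse the gyro-integrated angle with the accelerometer-derived angle:
    the gyro term dominates short-term motion, the accelerometer term
    corrects long-term drift."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

With the device lying flat (acceleration only along +Z), both roll and pitch come out as zero, and the filter slowly pulls the fused angle toward the accelerometer estimate.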
  • the processor unit U1 includes at least one processor for processing data collected by each group of binocular cameras C1, C2 and inertial measurement unit U4.
  • in this embodiment, the processor unit U1 includes an application processor U101 and a coprocessor U102, so that certain specific tasks, such as monitoring each group of binocular cameras C1, C2 and the inertial measurement unit U4, are performed by the coprocessor U102, thereby reducing the burden on the application processor U101.
  • the coprocessor U102 is communicatively coupled to the application processor U101, for example, via a USB 3.0 bus, and the communication module U5 is communicatively coupled to the application processor U101, for example, via an I2C bus.
  • in this way, the coprocessor U102 can transmit the data it has preprocessed to the application processor U101 for further processing, and the application processor U101 then transmits the resulting processing results to the communication module U5 for transmission.
  • at least one group of binocular cameras C1, C2 can be connected to the coprocessor U102, for example through a MIPI interface, so that the coprocessor U102 preprocesses the image data collected by the binocular cameras C1, C2.
  • in embodiments where either end lacks a MIPI interface, the connection between the two ends can be made through a MIPI bridge chip.
  • the inertial measurement unit U4 can be connected to the coprocessor U102 via the SPI interface, for example, to preprocess the motion data collected by the inertial measurement unit U4 by the coprocessor U102.
  • the camera device may further include an audio codec chip U7, a microphone module M1, and a speaker module S1. The microphone module M1 is connected to the processor unit U1 through the audio encoding channel of the audio codec chip U7, and the processor unit U1 is connected to the speaker module S1 through the audio decoding channel of the audio codec chip U7. In this way, voice communication can be performed through the camera device.
  • the audio codec chip U7 can be connected, for example, to the application processor U101 of the processor unit U1 via an I2S bus.
  • the camera device can also be powered by its own battery, to improve the reliability and convenience of the camera device.
  • the camera device may further include a battery B1, a power management chip (PMIC/PMU) U6, and a USB socket J1.
  • the battery B1 supplies power to each powered component of the camera device via the power management chip U6; the USB socket J1 is connected to the power management chip U6 and the processor unit U1 for data transmission; and the power management chip U6 is communicatively connected to the processor unit U1 to transmit control commands and/or status information.
  • the working mode between the power management chip U6 and the processor unit U1 is as follows:
  • after detecting that the VBUS pin of the USB socket J1 is powered, the power management chip U6 negotiates the charging protocol with the external device inserted into the USB socket J1. After the negotiation, if the inserted external device is judged to be a charger, the charging channel is opened so that the charger charges the battery; if the inserted external device is judged to be a USB host, the power management chip notifies the processor unit U1 through their communication channel, so that the processor unit U1 establishes a USB connection with the external device.
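The charger/host dispatch described above can be sketched as a small decision routine. This is a hypothetical illustration of the control flow only; the function and parameter names are invented and the actual logic lives in the PMIC firmware and hardware:

```python
def on_vbus_powered(device_type, enable_charging, notify_processor):
    """Dispatch after the PMIC detects VBUS power and finishes protocol
    negotiation with the attached device (hypothetical API).

    device_type      -- result of negotiation: "charger" or "usb_host"
    enable_charging  -- callback that opens the charging channel
    notify_processor -- callback that tells U1 to set up a USB connection
    """
    if device_type == "charger":
        enable_charging()        # let the charger charge the battery
    elif device_type == "usb_host":
        notify_processor()       # U1 establishes the USB connection
    else:
        raise ValueError("unknown device type: %r" % device_type)
```

The two callbacks stand in for the PMIC's charging path and its communication channel to the processor unit U1.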
  • the power management chip U6 can be communicatively coupled to application processor U101, for example, via an I2C bus.
  • the USB socket J1 can be connected only to the application processor U101.
  • the USB socket J1 can also be connected to the application processor U101 and the coprocessor U102 at the same time, where the connection selection can be made by communication between the application processor U101 and the coprocessor U102.
  • the camera device may further include an isolation circuit through which the power management chip U6 is connected to the USB socket J1, thereby isolating the circuit between the power management chip U6 and the USB socket J1 from the high-speed communication circuit between the processor unit U1 and the USB socket J1.
  • the image capturing apparatus may further include a storage unit U3, and the storage unit U3 includes at least one memory for expanding the storage space of the processor unit U1.
  • the storage unit U3 includes, for example, at least one of double data rate synchronous dynamic random access memory (DDR) and FLASH memory.
  • in embodiments where the processor unit U1 includes an application processor U101 and a coprocessor U102, the storage unit U3 can be directly coupled to the read/write pins of the coprocessor U102.
  • FIG. 2 is a block schematic diagram of one embodiment of a virtual reality device in accordance with the present invention.
  • the virtual reality device comprises a host 210 and an imaging device according to the invention, which is labeled 220 in this embodiment.
  • the imaging device 220 establishes a communication connection with the host 210 through its communication module U5, and further transmits the processing result obtained by processing the various types of data by the processor unit U1 of the imaging device 220 to the host 210 for use by the host 210.
  • in the virtual reality device of the present invention, since the imaging device 220 itself undertakes the main computation tasks, the burden on the host 210 can be greatly reduced, which effectively alleviates the host's heat generation. The virtual reality device of the present invention can therefore adopt a design in which the host 210 is placed on the worn portion of the device, without causing user discomfort from severe heat generation.
  • the host 210 can also be disposed in the mobile handle in communication with the camera 220 and the headset.
  • the host 210 can also be a fixed PC in communication with the camera 220 and the headset.
  • FIG. 3 shows one embodiment of an image processing method performed by the processor unit U1 of the image pickup apparatus according to the present invention. The image processing method may include the following steps:
  • Step S301: receive image data collected by at least one group of binocular cameras C1, C2.
  • the processor unit U1 receives the image data acquired by at least one group of binocular cameras C1, C2, for example via a MIPI bus.
  • Step S302: preprocess the received image data to improve image quality.
  • the pre-processing may include at least one of grayscale processing, enhancement processing, filtering processing, binarization processing, white balance processing, demosaic processing, gamma correction processing, and the like.
  • this pre-processing can be performed, for example, by an image acquisition engine (IAE) integrated in the processor unit.
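As a loose illustration of two of the listed pre-processing steps (grayscale conversion and binarization), the sketch below operates on plain Python lists. The function names, the BT.601 luma weights, and the default threshold are illustrative assumptions, not details from the patent:

```python
def to_grayscale(rgb_pixels):
    """Per-pixel luma conversion using ITU-R BT.601 weights."""
    return [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in rgb_pixels]

def binarize(gray, threshold=128):
    """Map pixels above the threshold to 1 and the rest to 0.
    This helps isolate bright infrared marker spots from the background."""
    return [1 if v > threshold else 0 for v in gray]
```

In an IR-tracking pipeline like the one described here, binarization is what turns a raw frame into a mask of candidate marker pixels for the later positioning steps.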
  • Step S303: generate a depth image based on the preprocessed image data.
  • Each pixel value in the depth image is used to represent the distance of a point in the scene relative to the camera.
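The patent does not specify how the depth image is computed; for binocular cameras a common basis is the pinhole stereo relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity between matched left/right pixels. The sketch below illustrates that relation only and is not the patent's method:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic pinhole stereo relation: Z = f * B / d.

    disparity_px -- horizontal offset (pixels) between matched
                    left-image and right-image points
    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two camera centers (meters)
    """
    if disparity_px <= 0:
        return float("inf")  # zero disparity -> point at infinity
    return focal_px * baseline_m / disparity_px
```

For example, with a 500-pixel focal length and a 6 cm baseline, a 10-pixel disparity corresponds to a point about 3 m from the cameras; smaller disparities mean greater depth.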
  • Step S304: obtain position information of the positioned object based on the depth image.
  • step S304 can be performed, for example, by a computer vision engine (CVE) integrated in the processor unit; the computer vision engine processes the depth image using a DSP and computer vision algorithms to obtain the position information of the positioned object.
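The patent leaves the computer-vision algorithm unspecified; one plausible ingredient for locating infrared-marked objects is detecting bright blobs in a binarized frame and taking their centroids as candidate marker positions. The flood-fill sketch below is an illustrative assumption, not the CVE's actual algorithm:

```python
def marker_centroids(binary_image):
    """Find connected bright regions (4-connectivity flood fill) in a
    binarized IR frame and return each blob's centroid as (row, col)."""
    rows, cols = len(binary_image), len(binary_image[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if binary_image[r][c] and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    # visit the 4-connected neighbours still inside the frame
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary_image[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids
```

Matching such centroids between the left and right images would then yield the disparities needed for the depth computation of step S303.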
  • the object to be positioned is, for example, a helmet, a handle, or the like that is marked with infrared light.
  • Step S305: transmit the position information, as the processing result corresponding to the image data, to the communication module U5 of the imaging device for transmission.
  • in this way, the camera device can send the processing results that the processor unit U1 obtains from the received image data to the host for the host's use, so the host need not expend resources on processing the image data collected by the at least one group of binocular cameras, thereby reducing the burden on the host.
  • in embodiments where the processor unit U1 includes an application processor U101 and a coprocessor U102, the above steps S301 to S304 may all be completed by the coprocessor U102, with the application processor U101 responsible only for integrating the processing results and transmitting them through the communication module U5; alternatively, the coprocessor U102 may perform steps S301 and S302, or steps S301 to S303, with the remaining steps performed by the application processor U101.
  • as for the processing of the motion data collected by the inertial measurement unit (IMU), the processor unit U1 may, for example, receive the data through a fast interrupt request (FIQ) and pass it to a general-purpose processor (GPP) integrated in the processor unit, which processes it to generate a quaternion; the quaternion is then transmitted, as the processing result corresponding to the motion data, to the communication module U5 for transmission.
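The patent does not say how the quaternion is generated from the motion data; a standard building block is first-order integration of body-frame gyro rates, q' = q + 0.5·q⊗(0, ω)·dt, followed by renormalization. The sketch below illustrates that idea under an assumed (w, x, y, z) quaternion convention; it is not the GPP's actual code:

```python
import math

def integrate_gyro(q, wx, wy, wz, dt):
    """One first-order step of quaternion attitude integration from
    body-frame angular rates (rad/s): q' = q + 0.5 * q (x) (0, w) * dt,
    renormalized to keep it a unit quaternion."""
    w, x, y, z = q
    # quaternion product q (x) (0, wx, wy, wz), scaled by 0.5
    dq = (
        0.5 * (-x * wx - y * wy - z * wz),
        0.5 * ( w * wx + y * wz - z * wy),
        0.5 * ( w * wy - x * wz + z * wx),
        0.5 * ( w * wz + x * wy - y * wx),
    )
    q = tuple(a + b * dt for a, b in zip(q, dq))
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)
```

With zero rates the quaternion is unchanged; a positive yaw rate nudges the z component positive while the renormalization keeps the result on the unit sphere.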

Abstract

A photographing apparatus, an image processing method therefor, and a virtual reality device. The photographing apparatus comprises a processor unit, a communication module, and at least one group of binocular cameras, wherein the processor unit comprises at least one processor, the at least one group of binocular cameras transmits collected image data to the processor unit for processing, and the processor unit transmits a corresponding processing result to the communication module for sending. The virtual reality device comprises the photographing apparatus and a host, wherein the processor unit of the photographing apparatus transmits processing results obtained by processing various types of data to the host via the communication module.

Description

Camera device and image processing method thereof, and virtual reality device
Technical field
The present invention relates to the field of image acquisition technology, and more particularly to a camera device, an image processing method of the camera device, and a virtual reality device having the camera device.
Background
Virtual reality technology, VR for short, uses computer simulation to generate a virtual world in three-dimensional space, providing the user with simulations of the visual, auditory, tactile and other senses, so that the user feels immersed in it and can observe things within the three-dimensional space in real time and without restriction.
The spatial positioning system in virtual reality mainly recognizes the position and posture of the participants by tracking the spatial position of peripherals such as helmets and handles.
In existing spatial positioning systems, the camera device mainly collects the infrared signals emitted by infrared emitters mounted on these peripherals in order to position and track the peripherals.
Summary of the invention
According to a first aspect of the present invention, there is provided a camera device comprising a processor unit, a communication module, and at least one group of binocular cameras; the processor unit includes at least one processor, the at least one group of binocular cameras transmits the collected image data to the processor unit for processing, and the processor unit transmits the corresponding processing result to the communication module for transmission.
According to a second aspect of the present invention, there is also provided a virtual reality device comprising a host and the camera device according to the first aspect of the present invention; the host establishes a communication connection with the communication module of the camera device, and the processor unit of the camera device sends the processing results obtained by processing various types of data to the host through the communication connection.
According to a third aspect of the present invention, there is also provided an image processing method of the camera device according to the first aspect of the present invention, wherein the processor unit:
receives image data collected by at least one group of binocular cameras;
preprocesses the image data;
generates a depth image based on the preprocessed image data;
obtains position information of the positioned object based on the depth image; and
transmits the position information, as the processing result corresponding to the image data, to the communication module of the camera device for transmission.
Other features and advantages of the present invention will become apparent from the following detailed description of exemplary embodiments of the present invention with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
FIG. 1 is a block schematic diagram of an embodiment of a camera device according to the present invention;
FIG. 2 is a block schematic diagram of an embodiment of a virtual reality device according to the present invention;
FIG. 3 is a schematic flow chart of an embodiment of an image processing method of a camera device according to the present invention.
Description of the reference signs:
U101 - application processor;        U102 - coprocessor;
U3 - storage unit;                   U4 - inertial measurement unit;
U5 - communication module;           U6 - power management chip;
U7 - audio codec chip;               B1 - battery;
J1 - USB socket;                     S1 - speaker module;
M1 - microphone module;              C1 - first group of binocular cameras;
C2 - second group of binocular cameras;  U1 - processor unit.
Detailed description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the invention.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention, its application, or its uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, those techniques, methods, and apparatus should be considered part of the specification.
In all of the examples shown and discussed herein, any specific value should be interpreted as merely illustrative and not as a limitation. Other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following figures; therefore, once an item is defined in one figure, it need not be further discussed in subsequent figures.
FIG. 1 is a block schematic diagram of an embodiment of a camera device according to the present invention.
As shown in FIG. 1, the camera device includes a processor unit U1, a communication module U5, and two groups of binocular cameras C1 and C2.
The image data collected by each group of binocular cameras C1 and C2 is transmitted to the processor unit U1 for processing.
The processor unit U1 can configure each group of binocular cameras C1, C2, for example via an I2C bus.
The processor unit U1 transmits the results of its various processing, including the results obtained by processing the image data, to the communication module U5 for transmission. In this way, after the camera device establishes a communication connection with the host through the communication module U5, the processing results obtained by the processor unit U1 can be sent to the host for the host's use, thereby reducing the processing burden on the host.
To improve convenience of use, in this embodiment the communication module U5 is a wireless communication module, so that the processed data of various types can be sent by establishing a wireless communication connection with the host.
In other embodiments, the communication module U5 may also be a wired communication module, for example a USB communication module.
Each group of binocular cameras includes two cameras of the same type, which act like the left and right eyes to capture information in three-dimensional space.
In the embodiment shown in FIG. 1, the first group of binocular cameras C1 uses standard cameras, for example with the OV9281 sensor, for gesture capture; the second group of binocular cameras C2 uses fisheye lenses, for example with the OV7251 sensor, for obtaining depth of field.
Taking as an example an application scenario in which a user holds a wireless handle fitted with infrared lights to experience a VR game, the two groups of binocular cameras C1 and C2 capture, at a set frequency, the infrared light emitted by the handle's infrared lights: the first group C1 performs gesture capture, the second group C2 obtains the depth of field, and together they locate the handle's motion. The processor unit U1 processes the image data captured by the two groups of binocular cameras C1 and C2 to obtain the position of the handle in space, and sends the resulting position information through the communication module U5 to the host of the virtual reality device; after the host obtains the handle's position information, it can map it into the VR scene to implement the corresponding VR operation.
In this embodiment, the camera device may further include an inertial measurement unit (IMU) U4, which likewise transmits the motion data it collects to the processor unit U1 for processing; the processor unit U1 likewise transmits the corresponding processing result to the communication module U5 for transmission.
The inertial measurement unit (IMU) U4 is a device that measures an object's three-axis attitude angles (or angular rates) and acceleration; gyroscopes, accelerometers, and geomagnetic sensors are the main components of an IMU. An IMU may, for example, comprise three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the object's acceleration signals along three independent axes of the carrier coordinate system, while the gyroscopes detect the carrier's angular velocity signals relative to the navigation coordinate system; the angular velocity and acceleration of the object in three-dimensional space are thus measured, and from them the object's attitude is computed.
The above processor unit U1 includes at least one processor to process the data collected by each group of binocular cameras C1, C2, the inertial measurement unit U4, and so on.
In this embodiment, the processor unit U1 includes an application processor U101 and a coprocessor U102, so that certain specific tasks, such as monitoring each group of binocular cameras C1, C2 and the inertial measurement unit U4, are performed by the coprocessor U102, thereby reducing the burden on the application processor U101.
The coprocessor U102 is communicatively connected to the application processor U101, for example via a USB 3.0 bus, while the communication module U5 is communicatively connected to the application processor U101, for example via an I2C bus. In this way, the coprocessor U102 can transmit the data it has preprocessed to the application processor U101 for further processing, and the application processor U101 then transmits the resulting processing results to the communication module U5 for transmission.
Therefore, in this embodiment, at least one group of binocular cameras C1, C2 can be connected to the coprocessor U102, for example through a MIPI interface, so that the coprocessor U102 preprocesses the image data collected by the binocular cameras C1, C2.
In embodiments where either end lacks a MIPI interface, the connection between the two ends can be made through a MIPI bridge chip.
Likewise, in this embodiment, the inertial measurement unit U4 can be connected to the coprocessor U102, for example through an SPI interface, so that the coprocessor U102 preprocesses the motion data collected by the inertial measurement unit U4.
该摄像装置还可以包括音频编解码芯片U7、麦克风模组M1及扬声器模组S1,该麦克风模组M1通过音频编解码芯片U7的音频编码通道与处理器单元U1连接,处理器单元U1通过音频编解码芯片U7的音频解码通道与扬声器模组S1连接。这样,便可通过本发明的摄像装置进行语音通信。The camera device may further include an audio codec chip U7, a microphone module M1, and a speaker module S1. The microphone module M1 is connected to the processor unit U1 through an audio coding channel of the audio codec chip U7, and the processor unit U1 passes the audio. The audio decoding channel of the codec chip U7 is connected to the speaker module S1. Thus, voice communication can be performed by the image pickup apparatus of the present invention.
在处理器单元U1具有应用处理器U101和协处理器U102的实施例中,该音频编解码芯片U7例如可以通过I2S总线与处理器单元U1的应用处理器U101连接。In an embodiment in which the processor unit U1 has an application processor U101 and a coprocessor U102, the audio codec chip U7 can be connected, for example, to the application processor U101 of the processor unit U1 via an I2S bus.
The image pickup apparatus can also carry its own battery for power supply, improving the reliability and convenience of the apparatus.
Therefore, in this embodiment, the image pickup apparatus may further include a battery B1, a power management chip (PMIC/PMU) U6, and a USB socket J1. The battery B1 supplies power to the electrical components of the image pickup apparatus via the power management chip U6; the USB socket J1 is connected to the power management chip U6 and to the processor unit U1 to enable data transmission; and the power management chip U6 is communicatively connected to the processor unit U1 to transmit control commands and/or status information.
The power management chip U6 and the processor unit U1 may, for example, operate as follows:
after detecting that the VBUS pin of the USB socket J1 is powered, the power management chip U6 performs a charging-protocol handshake with the external device plugged into the USB socket J1. After the handshake, if it determines that the plugged-in external device is a charger, it opens the charging channel so that the charger charges the battery; if it determines that the plugged-in external device is a PC (USB host), it notifies the processor unit U1 through the communication channel between them, so that the processor unit U1 establishes a USB connection with the external device.
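The handshake decision just described can be sketched as follows. This is an illustrative sketch only, not part of the original disclosure: the device-type constants and the three callback names are hypothetical stand-ins for the PMIC's detection hardware, charge-path control, and the communication channel to the processor unit U1.

```python
# Illustrative sketch of the PMIC decision flow described above.
# CHARGER/USB_HOST and the callback names are hypothetical, not from the disclosure.

CHARGER, USB_HOST = "charger", "usb_host"

def on_vbus_powered(detect_device_type, open_charge_channel, notify_processor):
    """Run the charging-protocol handshake once VBUS power is detected.

    detect_device_type: callable returning CHARGER or USB_HOST
    open_charge_channel: callable enabling the battery charge path
    notify_processor: callable telling U1 to establish the USB connection
    """
    device = detect_device_type()
    if device == CHARGER:
        open_charge_channel()   # charger charges the battery directly
        return "charging"
    elif device == USB_HOST:
        notify_processor()      # U1 then enumerates with the external host
        return "usb_connected"
    return "idle"
```

In either branch, exactly one action is taken; an unrecognized device leaves both the charge path and the USB link untouched.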
In embodiments where the processor unit U1 includes an application processor U101 and a coprocessor U102, the power management chip U6 can be communicatively connected to the application processor U101, for example via an I2C bus. The USB socket J1 may be connected only to the application processor U101, or it may be connected to both the application processor U101 and the coprocessor U102, in which case the connection can be selected through communication between the application processor U101 and the coprocessor U102.
The image pickup apparatus may further include an isolation circuit, so that the power management chip U6 is connected to the USB socket J1 through the isolation circuit, thereby isolating the circuitry between the power management chip U6 and the USB socket J1 from the high-speed communication circuitry between the processor unit U1 and the USB socket J1.
In this embodiment, the image pickup apparatus may further include a storage unit U3, which includes at least one memory for expanding the storage space of the processor unit U1.
The storage unit U3 includes, for example, at least one of a double data rate synchronous dynamic random access memory (DDR) and a FLASH memory.
In embodiments where the processor unit U1 includes an application processor U101 and a coprocessor U102, the storage unit U3 can be directly connected to the read/write pins of the coprocessor U102.
FIG. 2 is a schematic block diagram of an embodiment of a virtual reality device according to the present invention.
As shown in FIG. 2, the virtual reality device includes a host 210 and an image pickup apparatus according to the present invention, which is labeled 220 in this embodiment.
The image pickup apparatus 220 establishes a communication connection with the host 210 through its communication module U5, and then sends the processing results obtained by its processor unit U1 processing various types of data to the host 210 for use by the host 210.
In the virtual reality device of the present invention, since the image pickup apparatus 220 itself undertakes the main computing tasks, the burden on the host 210 can be greatly reduced, effectively solving the problem of host heating. Therefore, the virtual reality device of the present invention can adopt a design in which the host 210 is arranged on the head-mounted portion of the virtual reality device, without severe heating causing user discomfort.
In other embodiments, the host 210 may also be arranged in a motion handle and communicatively connected to the image pickup apparatus 220 and the head-mounted portion.
In still other embodiments, the host 210 may be a stationary PC communicatively connected to the image pickup apparatus 220 and the head-mounted portion.
FIG. 3 is a schematic flowchart of one implementation of the image processing performed by the processor unit U1 of the image pickup apparatus according to the present invention.
As shown in FIG. 3, the image processing method may include the following steps:
Step S301: receiving image data collected by at least one set of binocular cameras C1, C2.
The processor unit U1 receives the image data collected by the at least one set of binocular cameras C1, C2, for example via a MIPI bus.
Step S302: preprocessing the received image data to improve image quality.
The preprocessing may include at least one of grayscale conversion, enhancement, filtering, binarization, white balance, demosaicing, gamma correction, and the like.
The preprocessing can be performed, for example, by an Image Acquisition Engine (IAE) integrated in the processor unit.
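To make two of the preprocessing operations named above concrete, the following minimal sketch performs grayscale conversion and binarization on a tiny raster. It is an illustration under assumed conventions (BT.601 luma weights, a fixed threshold of 128), not a description of the IAE's actual hardware pipeline.

```python
# Minimal sketch of grayscale conversion and binarization, two of the
# preprocessing steps listed above. Pure Python, for illustration only;
# the real pipeline runs on the processor unit's image engine.

def to_gray(rgb_image):
    """ITU-R BT.601 luma approximation applied per pixel."""
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

def binarize(gray_image, threshold=128):
    """Map each gray value to 0 or 255 around a fixed threshold."""
    return [[255 if v >= threshold else 0 for v in row] for row in gray_image]

rgb = [[(255, 255, 255), (0, 0, 0)],
       [(200, 10, 10), (10, 200, 10)]]
gray = to_gray(rgb)
binary = binarize(gray)
```

Binarization of this kind is a plausible precursor to locating bright infrared markers, since it separates marker pixels from the background in a single pass.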
Step S303: generating a depth image based on the preprocessed image data.
Each pixel value in the depth image represents the distance of a point in the scene from the image pickup apparatus.
The depth image can be generated from the preprocessed image data, for example, through intelligent disparity mapping and refinement algorithms.
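The relation underlying such disparity mapping for a rectified binocular pair is Z = f·B/d, where Z is depth, f the focal length in pixels, B the baseline between the two cameras, and d the disparity. The sketch below applies this relation to a disparity map; the focal length and baseline values are illustrative assumptions, not parameters from the disclosure.

```python
# Sketch of the disparity-to-depth conversion behind stereo matching:
# depth Z = f * B / d for each disparity value d of a rectified pair.

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (metres).

    Zero disparity corresponds to a point at infinity.
    """
    return [[(focal_px * baseline_m / d) if d > 0 else float("inf")
             for d in row]
            for row in disparity]

disp = [[40.0, 20.0],
        [10.0, 0.0]]
# Assumed illustrative intrinsics: f = 800 px, baseline = 6 cm.
depth = disparity_to_depth(disp, focal_px=800.0, baseline_m=0.06)
```

Note the inverse relationship: halving the disparity doubles the computed depth, which is why near objects (large disparity) are resolved much more precisely than far ones.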
Step S304: obtaining position information of the object to be located based on the depth image.
Step S304 can be performed, for example, by a Computer Vision Engine (CVE) integrated in the processor unit, which processes the depth image using a DSP and computer vision algorithms to obtain the position information of the object to be located. The object to be located is, for example, a helmet or a handle marked with infrared light.
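The disclosure does not detail the CVE's algorithms, but one simple way such a step could work is to threshold the depth image and take the centroid of the nearest blob as the object position. The sketch below is a hypothetical stand-in for those DSP routines, not the patented method.

```python
# Hypothetical sketch of locating an object in a depth image: report the
# centroid (row, col) and mean depth of all pixels nearer than max_depth.
# A stand-in for the CVE's actual DSP algorithms, which are not disclosed.

def locate_nearest_object(depth_image, max_depth):
    """Return (row, col, depth) of the near-pixel centroid, or None."""
    hits = [(r, c, v)
            for r, row in enumerate(depth_image)
            for c, v in enumerate(row)
            if v < max_depth]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _, _ in hits) / n,
            sum(c for _, c, _ in hits) / n,
            sum(v for _, _, v in hits) / n)

depth = [[9.0, 9.0, 9.0],
         [9.0, 1.2, 1.4],
         [9.0, 1.0, 9.0]]
pos = locate_nearest_object(depth, max_depth=2.0)
```

Returning the centroid in image coordinates together with depth gives the 3D position information that step S305 then forwards to the communication module.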
Step S305: transmitting the position information, as the processing result corresponding to the image data, to the communication module U5 of the image pickup apparatus for transmission.
Through this step, the image pickup apparatus can send the processing results obtained by the processor unit U1 from the received image data to the host for use by the host. The host then no longer needs to spend its own resources processing the image data collected by the at least one set of binocular cameras, which reduces the burden on the host.
In the embodiment shown in FIG. 1, the processor unit U1 includes an application processor U101 and a coprocessor U102. Steps S301 to S304 above may all be performed by the coprocessor U102, with the application processor U101 responsible only for integrating the processing results and sending them through the communication module U5; alternatively, the coprocessor U102 may perform steps S301 and S302, or steps S301 to S303, with the remaining steps performed by the application processor U101.
In addition, the processor unit U1 may, for example, process the motion data collected by the inertial measurement unit (IMU) as follows: the motion data is transmitted via a fast interrupt request (FIQ) to a general purpose processing module (General Purpose Processor, GPP) integrated in the processor unit — more specifically, in the coprocessor of the processor unit — which processes the data to generate a quaternion, and the quaternion is transmitted as the processing result corresponding to the motion data to the communication module U5 for transmission.
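One standard way such a quaternion can be produced from IMU data is by integrating the gyroscope's angular rates over each sample period. The sketch below shows this under assumed conventions (body rates in rad/s, unit quaternion in (w, x, y, z) order, simple Euler integration with renormalization); the disclosure does not specify the GPP's actual filter.

```python
# Sketch of generating an orientation quaternion from gyroscope rates,
# as a GPP might do for each IMU sample. Conventions are assumptions:
# q = (w, x, y, z) unit quaternion, rates in rad/s, Euler integration.
import math

def integrate_gyro(q, wx, wy, wz, dt):
    """Advance unit quaternion q by body angular rates over time step dt."""
    w, x, y, z = q
    # quaternion derivative: q' = 0.5 * q (x) (0, wx, wy, wz)
    dw = 0.5 * (-x * wx - y * wy - z * wz)
    dx = 0.5 * ( w * wx + y * wz - z * wy)
    dy = 0.5 * ( w * wy - x * wz + z * wx)
    dz = 0.5 * ( w * wz + x * wy - y * wx)
    w, x, y, z = w + dw * dt, x + dx * dt, y + dy * dt, z + dz * dt
    norm = math.sqrt(w * w + x * x + y * y + z * z)  # keep it a unit quaternion
    return (w / norm, x / norm, y / norm, z / norm)

# Rotate about x at pi/2 rad/s for 1 s (100 steps of 10 ms):
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(100):
    q = integrate_gyro(q, math.pi / 2, 0.0, 0.0, 0.01)
# q approaches (cos(pi/4), sin(pi/4), 0, 0), i.e. a 90-degree x rotation.
```

Sending the quaternion rather than raw rate samples is consistent with the document's division of labor: the host receives a ready-to-use orientation and performs no IMU fusion itself.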
The embodiments in this specification are described in a progressive manner; for identical or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the others. The embodiments may be used individually or in combination with one another as needed.
Although some specific embodiments of the present invention have been described in detail by way of example, those skilled in the art should understand that the above examples are provided for illustration only and are not intended to limit the scope of the present invention. Those skilled in the art should understand that the above embodiments can be modified without departing from the scope and spirit of the present invention. The scope of the present invention is defined by the appended claims.

Claims (10)

  1. An image pickup apparatus, characterized by comprising a processor unit (U1), a communication module (U5), and at least one set of binocular cameras (C1, C2), the processor unit (U1) comprising at least one processor (U101, U102), wherein the at least one set of binocular cameras (C1, C2) transmits collected image data to the processor unit (U1) for processing, and the processor unit (U1) transmits the corresponding processing result to the communication module (U5) for transmission.
  2. The image pickup apparatus according to claim 1, wherein the image pickup apparatus comprises at least two sets of binocular cameras (C1, C2), and one set of binocular cameras (C2) employs fisheye lenses.
  3. The image pickup apparatus according to claim 1 or 2, wherein the processor unit (U1) comprises an application processor (U101) and a coprocessor (U102), the at least one set of binocular cameras (C1, C2) is connected to the coprocessor (U102), the coprocessor (U102) is communicatively connected to the application processor (U101), and the communication module (U5) is communicatively connected to the application processor (U101).
  4. The image pickup apparatus according to claim 3, characterized in that the coprocessor (U102) and the application processor (U101) are communicatively connected via a USB bus.
  5. The image pickup apparatus according to any one of claims 1 to 4, characterized in that the image pickup apparatus further comprises an inertial measurement unit (U4), the inertial measurement unit (U4) transmits collected motion data to the processor unit (U1) for processing, and the processor unit (U1) transmits the corresponding processing result to the communication module (U5) for transmission.
  6. The image pickup apparatus according to any one of claims 1 to 5, characterized in that the image pickup apparatus further comprises an audio codec chip (U7), a microphone module (M1), and a speaker module (S1), the microphone module (M1) is connected to the processor unit (U1) through an audio encoding channel of the audio codec chip (U7), and the processor unit (U1) is connected to the speaker module (S1) through an audio decoding channel of the audio codec chip (U7).
  7. The image pickup apparatus according to any one of claims 1 to 6, characterized in that the communication module (U5) is a wireless communication module.
  8. The image pickup apparatus according to any one of claims 1 to 7, characterized in that the image pickup apparatus further comprises a battery (B1), a power management chip (U6), and a USB socket (J1), the battery (B1) supplies power to the electrical components of the image pickup apparatus via the power management chip (U6), the USB socket (J1) is connected to the power management chip (U6) and to the processor unit (U1) respectively, and the power management chip (U6) is communicatively connected to the processor unit (U1).
  9. A virtual reality device, characterized by comprising a host (210) and the image pickup apparatus (220) according to any one of claims 1 to 8, wherein the host (210) establishes a communication connection with the communication module (U5) of the image pickup apparatus (220), and the processor unit (U1) of the image pickup apparatus (220) sends the processing results obtained by processing various types of data to the host (210) through the communication connection.
  10. An image processing method for the image pickup apparatus according to any one of claims 1 to 8, characterized in that the processor unit:
    receives image data collected by at least one set of binocular cameras (C1, C2);
    preprocesses the image data;
    generates a depth image based on the preprocessed image data;
    obtains position information of an object to be located based on the depth image; and
    transmits the position information, as the processing result corresponding to the image data, to the communication module of the image pickup apparatus for transmission.
PCT/CN2016/111544 2016-11-29 2016-12-22 Photographing apparatus and image processing method therefor, and virtual reality device WO2018098867A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611073683.9 2016-11-29
CN201611073683.9A CN106507092A (en) 2016-11-29 2016-11-29 Camera head and its image processing method, virtual reality device

Publications (1)

Publication Number Publication Date
WO2018098867A1 true WO2018098867A1 (en) 2018-06-07

Family

ID=58328949

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/111544 WO2018098867A1 (en) 2016-11-29 2016-12-22 Photographing apparatus and image processing method therefor, and virtual reality device

Country Status (2)

Country Link
CN (1) CN106507092A (en)
WO (1) WO2018098867A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106960473B (en) * 2017-03-27 2019-12-10 北京交通大学 behavior perception system and method
CN107168515A (en) * 2017-03-31 2017-09-15 北京奇艺世纪科技有限公司 The localization method and device of handle in a kind of VR all-in-ones
WO2019015261A1 (en) * 2017-07-17 2019-01-24 Chengdu Topplusvision Technology Co., Ltd. Devices and methods for determining scene
CN109672876A (en) * 2017-10-17 2019-04-23 福州瑞芯微电子股份有限公司 Depth map processing unit and depth map processing unit
CN107707840A (en) * 2017-10-27 2018-02-16 信利光电股份有限公司 A kind of method of camera module and multilevel image data transmission
CN108427479B (en) * 2018-02-13 2021-01-29 腾讯科技(深圳)有限公司 Wearable device, environment image data processing system, method and readable medium
CN108983982B (en) * 2018-05-30 2022-06-21 太若科技(北京)有限公司 AR head display equipment and terminal equipment combined system
CN109710056A (en) * 2018-11-13 2019-05-03 宁波视睿迪光电有限公司 The display methods and device of virtual reality interactive device
CN109587451A (en) * 2018-12-25 2019-04-05 青岛小鸟看看科技有限公司 A kind of video capture device and its control method showing equipment for virtual reality
CN111420391A (en) * 2020-03-04 2020-07-17 青岛小鸟看看科技有限公司 Head-mounted display system and space positioning method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203984550U (en) * 2014-08-06 2014-12-03 杭州戈虎达科技有限公司 A kind of control circuit of 3D camera
CN104835163A (en) * 2015-05-11 2015-08-12 华中科技大学 Embedded real-time high-speed binocular vision system for moving target detection
US20160267884A1 (en) * 2015-03-12 2016-09-15 Oculus Vr, Llc Non-uniform rescaling of input data for displaying on display device
CN106020456A (en) * 2016-05-11 2016-10-12 北京暴风魔镜科技有限公司 Method, device and system for acquiring head posture of user
CN205726125U (en) * 2016-03-30 2016-11-23 重庆邮电大学 A kind of novel robot Long-Range Surveillance System

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101419339A (en) * 2008-11-24 2009-04-29 电子科技大学 Head-mounted display
TW201215144A (en) * 2010-09-16 2012-04-01 Hon Hai Prec Ind Co Ltd Image correcting system for cameras and correcting method using same
CN204423304U (en) * 2014-12-15 2015-06-24 上海乐相科技有限公司 A kind of device realizing virtual reality technology
CN204258990U (en) * 2014-12-24 2015-04-08 何军 Intelligence head-wearing display device
CN104618712A (en) * 2015-02-13 2015-05-13 北京维阿时代科技有限公司 Head wearing type virtual reality equipment and virtual reality system comprising equipment
CN205302186U (en) * 2015-12-28 2016-06-08 青岛歌尔声学科技有限公司 Virtual reality control system based on external input
CN205490840U (en) * 2016-02-24 2016-08-17 厦门北卡信息科技有限公司 Wireless camera device of portable image processing

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203984550U (en) * 2014-08-06 2014-12-03 杭州戈虎达科技有限公司 A kind of control circuit of 3D camera
US20160267884A1 (en) * 2015-03-12 2016-09-15 Oculus Vr, Llc Non-uniform rescaling of input data for displaying on display device
CN104835163A (en) * 2015-05-11 2015-08-12 华中科技大学 Embedded real-time high-speed binocular vision system for moving target detection
CN205726125U (en) * 2016-03-30 2016-11-23 重庆邮电大学 A kind of novel robot Long-Range Surveillance System
CN106020456A (en) * 2016-05-11 2016-10-12 北京暴风魔镜科技有限公司 Method, device and system for acquiring head posture of user

Also Published As

Publication number Publication date
CN106507092A (en) 2017-03-15

Similar Documents

Publication Publication Date Title
WO2018098867A1 (en) Photographing apparatus and image processing method therefor, and virtual reality device
CN109040600B (en) Mobile device, system and method for shooting and browsing panoramic scene
WO2019176308A1 (en) Information processing device, information processing method and program
WO2021018070A1 (en) Image display method and electronic device
TWI642903B (en) Locating method, locator, and locating system for head-mounted display
JP6452440B2 (en) Image display system, image display apparatus, image display method, and program
WO2021098358A1 (en) Virtual reality system
US9916004B2 (en) Display device
CN109276895B (en) Building block system, and method, device and system for identifying topological structure
KR20200028771A (en) Electronic device and method for recognizing user gestures based on user intention
KR102190743B1 (en) AUGMENTED REALITY SERVICE PROVIDING APPARATUS INTERACTING WITH ROBOT and METHOD OF THEREOF
WO2020110659A1 (en) Information processing device, information processing method, and program
JP2018067773A (en) Imaging device, control method thereof, program, and storage medium
CN111479148B (en) Wearable device, glasses terminal, processing terminal, data interaction method and medium
JP2015118442A (en) Information processor, information processing method, and program
CN110956571B (en) SLAM-based virtual-real fusion method and electronic equipment
CN112351188A (en) Apparatus and method for displaying graphic elements according to objects
CN114332423A (en) Virtual reality handle tracking method, terminal and computer-readable storage medium
KR102402457B1 (en) Method for processing contents and electronic device implementing the same
WO2018196221A1 (en) Interaction method, device and system
CN212181167U (en) Split type AR intelligence glasses of function enhancement mode
WO2019021573A1 (en) Information processing device, information processing method, and program
US11240482B2 (en) Information processing device, information processing method, and computer program
CN206350093U (en) Camera device and virtual reality device
EP4141710A1 (en) Device enabling method and apparatus, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16922650

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16922650

Country of ref document: EP

Kind code of ref document: A1