CN110192391B - Processing method and equipment - Google Patents


Info

Publication number: CN110192391B
Application number: CN201780083728.7A
Authority: CN (China)
Prior art keywords: image data, image, eye image, warping, eye
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN110192391A (en)
Inventors: 罗毅, 康俊腾, 郑方舟
Current assignee: Huawei Technologies Co Ltd
Original assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Publication of CN110192391A (application) and CN110192391B (granted patent)


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Provided herein is a processing method for an electronic device with VR or AR functionality, the electronic device comprising an image processing device and a head-mounted display. The method comprises: the image processing device receives image data transmitted by the head-mounted display, the image data comprising left-eye image data and right-eye image data; performs rendering on the image data; divides the rendered left-eye and right-eye image data into n groups of image data combinations in time order, where each image data combination comprises a left-eye image data block and a right-eye image data block, and n is an integer greater than 1; performs warping on the n groups of image data combinations respectively; performs image compression on each image data combination after the warping; and transmits each image-compressed data combination to the head-mounted display.

Description

Processing method and equipment
The present application claims priority from the Chinese patent application entitled "a method and apparatus for image depth information assisted image compression and transmission", filed with the Chinese Patent Office on January 19, 2017 with application number 201710045188.5, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of wireless communications technologies, and in particular, to a processing method and device.
Background
Virtual Reality (VR) technology and Augmented Reality (AR) technology are multimedia technologies that have emerged in recent years. Virtual reality technology is a computer simulation system capable of creating, and letting users experience, a virtual world; augmented reality technology superimposes virtual content on the real world and allows the two to interact. In VR or AR scenes, a user typically wears a Head Mounted Display (HMD), whose integrated graphics system, optical system, and pose tracking system give the wearer an interactive immersive experience.
In VR or AR scenarios, the processing capability of existing HMD processors is limited, so some large gaming or video applications cannot complete the virtual reality processing with the HMD's built-in processing unit. The computation-intensive portion of the virtual reality data is therefore processed by another device (e.g., an external host, mobile phone, or game console) connected to the HMD by wired or wireless transmission. A wired connection between the other device and the HMD is inconvenient to use, so the other device is typically connected to the HMD wirelessly. Because the amount of image data output by the other device is large, an efficient image processing method is required.
Disclosure of Invention
A processing method and apparatus are described herein that aim to increase the speed of image processing and thereby reduce the time delay that image processing introduces.
A first aspect provides a method for an electronic device with virtual reality, VR, or augmented reality, AR, functionality, the electronic device comprising an image processing device and a head-mounted display, the method comprising: the image processing device receives image data transmitted by the head-mounted display, wherein the image data comprises left-eye image data and right-eye image data; the image processing device performs rendering on the image data; the image processing device divides the left-eye image data and right-eye image data obtained after rendering into n groups of image data combinations in time order, each image data combination comprising a left-eye image data block and a right-eye image data block, where n is an integer greater than 1; the image processing device performs warping on the n groups of image data combinations respectively; the image processing device performs image compression on each image data combination after the warping; and the image processing device transmits each image-compressed data combination to the head-mounted display. With this method, because the data is divided into n groups of image data combinations, image compression of a combination can begin as soon as that combination has been warped; there is no need to warp all left-eye image data, then warp all right-eye image data, and only then begin image compression. The image processing time is therefore greatly shortened and the image processing speed improved.
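The divide-warp-compress pipeline of the first aspect can be sketched as follows. This is an illustrative Python sketch only: `Combo`, `split_into_combos`, the trivial byte-reversal `warp`, and zlib as the compressor are stand-ins, not anything specified by the patent.

```python
import zlib
from dataclasses import dataclass
from typing import Iterator, List

@dataclass
class Combo:
    left: bytes   # left-eye image data block
    right: bytes  # right-eye image data block

def split_into_combos(left: bytes, right: bytes, n: int) -> List[Combo]:
    """Divide rendered left/right-eye data into n time-ordered combinations."""
    assert n > 1 and len(left) % n == 0 and len(right) % n == 0
    sl, sr = len(left) // n, len(right) // n
    return [Combo(left[i * sl:(i + 1) * sl], right[i * sr:(i + 1) * sr])
            for i in range(n)]

def warp(block: bytes) -> bytes:
    """Stand-in for lens-matching warping; a real HMD would resample here."""
    return block[::-1]

def process(combos: List[Combo]) -> Iterator[bytes]:
    """Warp then jointly compress each combination: compression of
    combination i does not wait for combinations i+1..n to be warped."""
    for c in combos:
        yield zlib.compress(warp(c.left) + warp(c.right))

left = bytes(range(64)) * 4        # 256 bytes of stand-in left-eye data
right = bytes(range(64, 128)) * 4  # 256 bytes of stand-in right-eye data
packets = list(process(split_into_combos(left, right, n=4)))
```

Because `process` is a generator, each compressed packet is available as soon as its combination has been warped, which is the overlap the method relies on.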
In a first possible implementation manner of the first aspect, the image processing apparatus performing image compression on each image data combination after performing the warping includes: the image processing apparatus performs image compression on each image data combination after the warping based on image depth information acquired by the image processing apparatus when performing rendering on that image data combination. Because the image depth information acquired during rendering is added to the image compression process, compression can be faster and the image processing time further shortened.
In a second possible implementation manner of the first aspect, the time information of the left-eye image data block and the time information of the right-eye image data block in the n groups of image data combinations are the same, and the time information includes a start time of the image data block and an end time of the image data block. It should be understood that the time information may take other forms and is not limited to start and end times; its purpose is to ensure that the left-eye and right-eye image data blocks in each image data combination are synchronized in time.
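The synchronization constraint above can be modeled as a simple check that both eyes' blocks cover the same time interval; `TimedBlock` and `synchronized` are hypothetical names for illustration only, not structures defined by the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TimedBlock:
    data: bytes
    start_time: float  # start time of the image data block (e.g., ms)
    end_time: float    # end time of the image data block

def synchronized(left: TimedBlock, right: TimedBlock) -> bool:
    """A combination is valid when the left-eye and right-eye blocks
    carry identical time information."""
    return (left.start_time, left.end_time) == (right.start_time, right.end_time)
```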
A second aspect provides an electronic device. The device comprises: a head-mounted display; one or more processors; a sensor unit; a memory; a plurality of application programs; and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors, and include instructions for: receiving image data transmitted by the head-mounted display, wherein the image data comprises left-eye image data and right-eye image data; performing rendering on the image data; dividing the left-eye image data and right-eye image data obtained after rendering into n groups of image data combinations in time order, where each image data combination comprises a left-eye image data block and a right-eye image data block and n is an integer greater than 1; performing warping on the n groups of image data combinations respectively; performing image compression on each image data combination after the warping; and transmitting each image-compressed data combination to the head-mounted display. With this electronic device, because the data is divided into n groups of image data combinations, image compression of a combination can begin as soon as that combination has been warped; unlike the prior art, there is no need to warp all left-eye image data, then warp all right-eye image data, and only then perform image compression. Compared with the prior art, this greatly shortens the image processing time and improves the image processing speed.
In a first possible implementation manner of the second aspect, the instructions are further configured for: performing image compression on each image data combination after the warping, based on image depth information acquired when rendering is performed on that image data combination. Because the image depth information acquired during rendering is added to the image compression process, compression can be faster and the image processing time further shortened.
In a second possible implementation manner of the second aspect, the time information of the left-eye image data block and the time information of the right-eye image data block in the n groups of image data combinations are the same, and the time information includes a start time of the image data block and an end time of the image data block. It should be understood that the time information may take other forms and is not limited to start and end times; its purpose is to ensure that the left-eye and right-eye image data blocks in each image data combination are synchronized in time.
In a third aspect, an electronic device is provided, which includes means or modules for performing the method provided in the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, an electronic device is provided, the electronic device comprising: a processor, a memory, and a bus system; the processor and the memory are connected through the bus system; the memory is configured to store one or more programs, the one or more programs comprising instructions which, when executed by the electronic device, cause the electronic device to perform the method provided by the first aspect or any possible implementation of the first aspect.
In a fifth aspect, there is provided a computer readable storage medium storing one or more programs which, when executed by the electronic device, cause the electronic device to perform the method of the first aspect or any possible implementation of the first aspect.
In a sixth aspect, an embodiment of the present invention provides an electronic device, including: a head mounted display; a sensor unit; one or more processors; a memory; and one or more programs. Where the one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for performing the methods provided in accordance with the first aspect or any one of the possible implementations of the first aspect.
In other aspects, an embodiment of the present invention provides an electronic device, including: a head mounted display, and an apparatus for performing the method according to the first aspect or any one of its possible implementations.
Based on the above technical solutions, the electronic device divides the data into n groups of image data combinations; after one combination has been warped, its image compression can be performed, without first warping all left-eye image data and then all right-eye image data. The image processing time is therefore greatly shortened and the image processing speed improved.
Drawings
For a better understanding of the foregoing embodiments of the invention, as well as additional embodiments thereof, reference should be made to the following description of the embodiments taken in conjunction with the following drawings, wherein like reference numerals designate corresponding parts throughout the figures.
FIG. 1 illustrates a schematic diagram of an electronic device, according to some embodiments.
FIG. 2 illustrates a schematic diagram of an electronic device, according to some embodiments.
FIG. 3 illustrates a processing method schematic according to some embodiments.
FIG. 4 illustrates a processing method schematic according to some embodiments.
FIG. 5 illustrates a processing method schematic according to some embodiments.
FIG. 6 illustrates a processing method schematic according to some embodiments.
FIG. 7 illustrates a method flow diagram according to some embodiments.
Detailed Description
A method for processing image data by an electronic device having a virtual reality (VR) function or an augmented reality (AR) function is introduced. The electronic device has the structure shown in fig. 1 and includes an image processing device 10 and a head-mounted display 20. The head-mounted display 20 includes, but is not limited to, a sensor unit 201 and an image display unit 202, where the sensor unit 201 acquires data in real time once the electronic device is started; the data includes, but is not limited to, image data, angle data, orientation data, and the like. It should be understood that sensor units include, but are not limited to: image sensors, cameras, gyroscopes, accelerometers, magnetometers, distance sensors, light sensors, temperature sensors, humidity sensors, heart rate sensors, pedometers, microphones, radiation sensors, fingerprint sensors, and so on. The technical solution is described taking image data collected by a camera as an example, but this does not limit the present invention; in a specific design, data acquired by any one or more sensors may be added to the image data as input to the image processing device 10. The head-mounted display 20 further includes an input unit, an audio unit, a processing unit, and so on. The image processing device 10 further includes a VR or AR calculation unit.
The image processing device 10 and the head-mounted display 20 are connected through wireless transmission; alternatively, the image processing apparatus 10 and the head-mounted display 20 are connected by a wired transmission manner or an internal circuit.
As one design, the electronic device has the structure shown in fig. 2, where the image processing device 10 includes but is not limited to: a receiving unit 101, a rendering unit 102, a dividing unit 103, a warping unit 104, an image compression unit 105, and a transmission unit 106. The head-mounted display 20 includes, but is not limited to, a sensor unit 201 and an image display unit 202. It should be understood that the image compression unit 105 need not be provided inside the image processing device 10 but may exist independently, in which case it remains connected to both the image processing device 10 and the head-mounted display 20; the image compression unit 105 may be an image compressor. It should also be understood that the image processing device 10 may include a Graphics Processing Unit (GPU) that internally contains the receiving unit 101, the rendering unit 102, the dividing unit 103, the warping unit 104, the image compression unit 105, and the transmission unit 106; the image compression unit 105 may be disposed inside or outside the GPU, which is not limited herein. The image processing device 10 may alternatively include a Central Processing Unit (CPU) containing these units, and the image compression unit 105 may likewise be disposed inside or outside the CPU, which is not limited herein. Optionally, the CPU may further include a GPU, or the GPU may be disposed outside the CPU.
In general, as shown in fig. 3, the image processing apparatus 10 receives image data collected by the sensor unit of the head-mounted display 20 and performs rendering on it. It then performs warping on the left-eye image data in the rendered data and, after that, performs warping on the right-eye image data. Next, image compression is performed on the warped left-eye image data, followed by image compression of the warped right-eye image data. Finally, the compressed left-eye and right-eye image data are transmitted to the head-mounted display 20 by wireless transmission. Because this method warps the right-eye image data only after warping the left-eye image data, the image processing time is long. If warping were instead performed in parallel, the required processing capacity of the image processing apparatus 10 and the buffer space would both grow, increasing hardware cost. Furthermore, the left-eye and right-eye image data are compressed independently, so the correlation between them cannot be exploited to compress the images further, and the amount of data after compression remains large. Meanwhile, existing VR/AR systems output left-eye and right-eye data in a time-shared manner; to jointly compress left-eye and right-eye image data, the eye's data that is output first must be buffered until the other eye's data begins to be output, which increases the buffering requirements and cost of the VR/AR system and delays image transmission.
Meanwhile, when data of left and right eye images are compressed, due to the difference between the left and right eye images, operations such as searching and comparing image blocks are required in the conventional compression method, so that complexity and time delay are increased, and power consumption of image processing equipment is increased.
It should be noted that the term rendering (render) as used herein refers to constructing a three-dimensional scene by 3D modeling and then adding lighting, colors, and so on to finally generate an image. Warping (warp) generates an image suited to viewing by the human eye; the image seen by the human eye is usually spherical, so the rendered image must be transformed to match it.
In order to improve the image compression efficiency of the electronic device and reduce the complexity, power consumption, cost, and time delay of image compression, the present application provides a processing method with lower time delay and higher image compression efficiency.
As shown in fig. 4, the improvement over the flow of fig. 3 is that the separate compression of left-eye data and right-eye data is replaced by joint compression of both. Joint compression can fully exploit the correlation between the left-eye and right-eye image data, improving compression efficiency and reducing the amount of data transmitted wirelessly. However, compared with compressing the two eyes' data in parallel, joint compression may increase time delay. To reduce this delay, the left-eye and right-eye data are warped alternately, allowing warping and image compression to proceed in parallel and mitigating the delay introduced by joint compression of the left-eye and right-eye image data.
As shown in fig. 5, the division is performed on the left-eye image data and the right-eye image data: each is divided into n parts, forming n image data combinations in chronological order, on which warping is performed separately. That is, as shown in fig. 5, warping alternates between the left-eye and right-eye image data, each divided into n pieces. It should be understood that when warping is performed on the 2n pieces of image data (corresponding to the n image data combinations), it may be performed serially in time order or in parallel. For example, the warping may proceed sequentially from left to right as shown in fig. 5, or be performed in parallel on all 2n pieces. Alternatively, the image data inside each combination may be warped serially while the n combinations are processed in parallel; or the n combinations may be processed in parallel with the data inside each combination also warped in parallel. It should be understood that the notion of n image data combinations is only a convenience for describing the technical scheme; a concrete implementation need not materialize the combinations, and may instead fetch the corresponding image data according to the numbering of the divided left-eye and right-eye data and perform the warping serially or in parallel.
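The serial and parallel warping orders just described can be sketched as follows. The byte-reversal `warp` is a trivial stand-in, and the thread-pool arrangement is one possible scheduling, not the patent's specified implementation.

```python
from concurrent.futures import ThreadPoolExecutor
from typing import List, Tuple

Block = bytes

def warp(block: Block) -> Block:
    """Stand-in for the real lens-matching warp (illustrative only)."""
    return block[::-1]

def warp_serial(combos: List[Tuple[Block, Block]]) -> List[Tuple[Block, Block]]:
    """Warp the 2n blocks one after another in time order
    (left-to-right in fig. 5)."""
    return [(warp(l), warp(r)) for l, r in combos]

def warp_parallel(combos: List[Tuple[Block, Block]],
                  workers: int = 4) -> List[Tuple[Block, Block]]:
    """Warp all 2n blocks concurrently; executor.map preserves input order,
    so the chronological ordering of the combinations is kept."""
    flat = [b for pair in combos for b in pair]  # l0, r0, l1, r1, ...
    with ThreadPoolExecutor(max_workers=workers) as ex:
        warped = list(ex.map(warp, flat))
    return list(zip(warped[0::2], warped[1::2]))

combos = [(b"left-0", b"right-0"), (b"left-1", b"right-1")]
```

Either order yields identical warped data; only the elapsed time and required processing capacity differ.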
Next, image compression is performed in series or in parallel on the n sets of image data on which the warping has been performed, respectively.
It should be understood that if both the warping and the image compression are performed in parallel, the total processing time is minimal, although the processing-power requirements and power consumption of the image processing apparatus 10 are correspondingly higher. Even if the processing is serial, the time delay is still shorter than in the processing flow of fig. 4.
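The latency benefit of dividing into n combinations can be made concrete with a toy two-stage pipeline model. The formula below is a standard pipeline estimate under the assumption of evenly divided warp time W and compression time C; the numbers are illustrative, not figures from the patent.

```python
def latency_serial(W: float, C: float) -> float:
    """Warp everything, then compress everything (flow of fig. 4)."""
    return W + C

def latency_pipelined(W: float, C: float, n: int) -> float:
    """Two-stage pipeline over n combinations: combination i is compressed
    while combination i+1 is being warped."""
    t_warp, t_comp = W / n, C / n  # per-combination stage times
    return t_warp + (n - 1) * max(t_warp, t_comp) + t_comp
```

With W = C = 10 ms and n = 4, the pipelined latency is 12.5 ms versus 20 ms serial; as n grows, the total approaches max(W, C), the time of the slower stage.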
To further shorten the image processing time, the image depth information acquired when the image processing apparatus 10 performs rendering on the image data may also be used; as shown in fig. 6, image depth information is added relative to fig. 5. The left-right compression in fig. 5 operates directly on the left-eye and right-eye image data, which requires complicated search and matching. In the flow of fig. 6, when the image processing apparatus 10 renders the image data, it renders according to the image depth information of the objects in the field of view; after rendering, this depth information is normally not output. To reduce the complexity of compressing the left-eye and right-eye images, the depth information obtained during rendering is used as an input to image compression, so that the image compression unit can quickly generate a disparity map, increasing the speed of left-right compression and reducing its complexity.
When the method of fig. 4 is used to compress the left and right eyes, a disparity map for one eye must be derived from the other eye's image. The displacement of the same object between the left-eye and right-eye images is strongly related to the image depth; when the depth of an image object/block is known, obtaining the disparity map is simple and fast. The method of fig. 4 lacks image depth information, making this derivation highly complex and therefore costly and power-hungry. Since the rendering performed by the image processing device 10 already produces image depth information, left-right compression that uses this information can acquire the disparity map quickly.
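The depth-disparity relationship that makes this shortcut possible is the standard stereo geometry disparity = focal x baseline / depth. A minimal sketch, with illustrative focal-length and baseline values (roughly an interpupillary distance), neither taken from the patent:

```python
def disparity_px(depth_m: float, focal_px: float = 1000.0,
                 baseline_m: float = 0.064) -> float:
    """Horizontal disparity, in pixels, of a point at depth_m metres.

    focal_px and baseline_m are illustrative assumptions; with known
    depth the disparity follows directly, with no search or matching.
    """
    return focal_px * baseline_m / depth_m
```

This is why near objects shift far more between the two eye images than distant ones, and why known depth lets the compressor skip block search entirely.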
Fig. 7 depicts a flowchart of this solution for an electronic device with virtual reality VR functionality or augmented reality AR functionality, the electronic device comprising an image processing device and a head mounted display, as follows:
s101, the image processing device receives image data transmitted by the head-mounted display, wherein the image data comprises left-eye image data and right-eye image data;
s102, the image processing device performs rendering on the image data;
s103, dividing left eye image data and right eye image data obtained after rendering into n groups of image data combinations according to a time sequence by the image processing equipment, wherein each image data combination comprises a left eye image data block and a right eye image data block, and n is an integer greater than 1;
s104, the image processing equipment respectively performs warping on the n groups of image data combinations;
s105, the image processing device performs image compression on each image data combination after the distortion is performed;
s106, the image processing device transmits each image data combination which is subjected to image compression to the head-mounted display.
It should be understood that in s106 the image processing apparatus may transmit all n image-compressed data combinations to the head-mounted display at once, or may transmit them individually, that is, send each image data combination to the head-mounted display as soon as its image compression completes.
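The per-combination transmission option can be sketched as a small producer/consumer, where transmission of combination i overlaps compression of combination i+1. The names `stream_combos` and `transmit`, the queue-based handoff, and zlib as the compressor are illustrative assumptions, not the patent's implementation.

```python
import queue
import threading
import zlib

def stream_combos(combos, transmit):
    """Compress each combination on the caller's thread and hand finished
    packets to a transmit thread, so each combination is sent as soon as
    its compression completes (the per-combination option for s106)."""
    q = queue.Queue()

    def tx_loop():
        while True:
            pkt = q.get()
            if pkt is None:  # sentinel: compression finished
                break
            transmit(pkt)

    t = threading.Thread(target=tx_loop)
    t.start()
    for left, right in combos:
        q.put(zlib.compress(left + right))  # joint left/right compression
    q.put(None)
    t.join()
```

Usage: `stream_combos([(b"L0", b"R0"), (b"L1", b"R1")], radio.send)` would push two packets to a hypothetical `radio.send` in order, without waiting for both to be compressed first.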
It should be noted that for s105, one possible implementation does not use image depth information, and another possible implementation does use it.
For the case of using image depth information:
the image processing apparatus performs image compression on each image data combination after performing the warping, including:
the image processing apparatus performs image compression on each image data combination after performing the warping based on image depth information acquired by the image processing apparatus when performing rendering on each image data combination.
It should be understood that: the time information of the left-eye image data block and the time information of the right-eye image data block in the n groups of image data combinations are the same, and the time information includes: a start time of the image data block and an end time of the image data block.
For the aforementioned electronic device, comprising: a head mounted display; one or more processors; a sensor unit; a memory; one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for: receiving image data transmitted by the head-mounted display, wherein the image data comprises left-eye image data and right-eye image data; performing rendering on the image data; dividing left eye image data and right eye image data obtained after rendering into n groups of image data combinations according to a time sequence, wherein each image data combination comprises a left eye image data block and a right eye image data block, and n is an integer greater than 1; performing warping on the n groups of image data combinations respectively; performing image compression on each image data combination after performing the warping; and transmitting each image data combination subjected to image compression to the head-mounted display respectively.
Optionally, the instructions are further for: performing image compression on each image data combination after performing the warping based on image depth information acquired by the image processing apparatus when performing rendering on each image data combination.
Optionally, the time information of the left-eye image data block and the time information of the right-eye image data block in the n groups of image data combinations are the same, and the time information includes: a start time of the image data block and an end time of the image data block.
As a possible design, an electronic device includes:
a head mounted display; a sensor unit; one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of steps s101 to s106.
As another possible design, as shown in fig. 2, the image processing apparatus 10 includes:
a receiving unit 101, configured to receive image data transmitted by the head-mounted display, where the image data includes left-eye image data and right-eye image data;
a rendering unit 102 for performing rendering on the image data;
a dividing unit 103, configured to divide left-eye image data and right-eye image data obtained after rendering into n groups of image data combinations according to a time sequence, where each image data combination includes a left-eye image data block and a right-eye image data block, and n is an integer greater than 1;
a warping unit 104 for performing warping on the n sets of image data combinations, respectively;
an image compression unit 105 for performing image compression on each image data combination after the warping is performed;
a transmission unit 106, configured to transmit each image data combination subjected to image compression to the head-mounted display respectively.
Optionally, the rendering unit 102 is further configured to acquire image depth information when performing rendering on the image data block.
The image compression unit 105 performs image compression on each image data combination after performing the warping, including:
the image compression unit 105 performs image compression on each image data combination after performing the warping based on image depth information acquired by the image processing apparatus when performing rendering on each image data combination.
Optionally, the time information of the left-eye image data block and the time information of the right-eye image data block in the n groups of image data combinations are the same, and the time information includes: a start time of the image data block and an end time of the image data block.
It should be noted that: for the operations of s101 to s106, reference may be made to the foregoing description and fig. 1 to 6; details are not repeated here.
The present application further provides a computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of s101 to s106.
Those of ordinary skill in the art will appreciate that the functions of the various embodiments of the electronic device, apparatus, method, image processing device (e.g., a processor chip or a set of processor chips), and computer-readable storage medium described herein may be combined and/or integrated with one another, in any combination that would directly and unambiguously occur to a person skilled in the art after reading this disclosure.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (7)

1. A processing method for an electronic device with virtual reality, VR, or augmented reality, AR, functionality, the electronic device comprising an image processing device and a head mounted display, the method comprising:
the image processing device receives image data transmitted by the head-mounted display, wherein the image data comprises left-eye image data and right-eye image data;
the image processing device performs rendering on the image data and acquires image depth information;
the image processing device divides left-eye image data and right-eye image data obtained after rendering into n groups of image data combinations according to a time sequence, wherein each image data combination comprises a left-eye image data block and a right-eye image data block, and n is an integer greater than 1;
the image processing device performs warping on the n groups of image data combinations, respectively;
the image processing apparatus performs image compression on each image data combination after performing the warping, the image compression being performed on each image data combination after performing the warping based on the image depth information;
the image processing apparatus transmits each image data combination subjected to image compression to the head-mounted display.
2. The method of claim 1, wherein the time information of the left-eye image data block and the time information of the right-eye image data block in the n sets of image data combinations are the same, the time information including: a start time of the image data block and an end time of the image data block.
3. An electronic device, comprising:
a head mounted display;
one or more processors;
a sensor unit;
a memory;
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for:
receiving image data transmitted by the head-mounted display, wherein the image data comprises left-eye image data and right-eye image data;
rendering the image data and acquiring image depth information;
dividing left-eye image data and right-eye image data obtained after rendering into n groups of image data combinations according to a time sequence, wherein each image data combination comprises a left-eye image data block and a right-eye image data block, and n is an integer greater than 1;
performing warping on the n groups of image data combinations respectively;
performing image compression on each image data combination after performing the warping, the image compression being performed on each image data combination after performing the warping based on the image depth information;
and transmitting each image data combination subjected to image compression to the head-mounted display respectively.
4. The electronic device of claim 3,
the time information of the left-eye image data block and the time information of the right-eye image data block in the n groups of image data combinations are the same, and the time information includes: a start time of the image data block and an end time of the image data block.
5. An image processing apparatus characterized by comprising:
a receiving unit for receiving image data transmitted by the head mounted display, the image data including left eye image data and right eye image data;
the rendering unit is used for rendering the image data and acquiring image depth information;
the image processing device comprises a dividing unit, a processing unit and a processing unit, wherein the dividing unit is used for dividing left-eye image data and right-eye image data obtained after rendering into n groups of image data combinations according to a time sequence, each image data combination comprises a left-eye image data block and a right-eye image data block, and n is an integer greater than 1;
a warping unit for performing warping on the n sets of image data combinations, respectively;
an image compression unit configured to perform image compression on each of the image data combinations after the warping is performed, the image compression being performed on each of the image data combinations after the warping is performed based on the image depth information;
and the transmission unit is used for respectively transmitting each image data combination subjected to image compression to the head-mounted display.
6. The image processing apparatus according to claim 5,
the time information of the left-eye image data block and the time information of the right-eye image data block in the n groups of image data combinations are the same, and the time information includes: a start time of the image data block and an end time of the image data block.
7. An electronic device, comprising:
a head mounted display; a sensor unit; one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of claims 1-2.
CN201780083728.7A 2017-01-19 2017-06-14 Processing method and equipment Active CN110192391B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN2017100451885 2017-01-19
CN201710045188 2017-01-19
PCT/CN2017/088261 WO2018133312A1 (en) 2017-01-19 2017-06-14 Processing method and device

Publications (2)

Publication Number Publication Date
CN110192391A CN110192391A (en) 2019-08-30
CN110192391B true CN110192391B (en) 2020-11-06

Family

ID=62907628

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780083728.7A Active CN110192391B (en) 2017-01-19 2017-06-14 Processing method and equipment

Country Status (2)

Country Link
CN (1) CN110192391B (en)
WO (1) WO2018133312A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113473106A (en) * 2021-06-18 2021-10-01 青岛小鸟看看科技有限公司 Image transmission method, image display and processing device, and image transmission system
US11758108B2 (en) 2021-06-18 2023-09-12 Qingdao Pico Technology Co., Ltd. Image transmission method, image display device, image processing device, image transmission system, and image transmission system with high-transmission efficiency
CN114268779B (en) * 2021-12-08 2023-09-08 北京字跳网络技术有限公司 Image data processing method, device, equipment and computer readable storage medium
CN117834829A (en) * 2022-09-27 2024-04-05 万有引力(宁波)电子科技有限公司 Image processor, processing method, storage medium, and augmented reality display device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102215405B (en) * 2011-06-01 2013-10-23 深圳创维-Rgb电子有限公司 3D (three-dimensional) video signal compression coding-decoding method, device and system
ITTO20120208A1 (en) * 2012-03-09 2013-09-10 Sisvel Technology Srl METHOD OF GENERATION, TRANSPORT AND RECONSTRUCTION OF A STEREOSCOPIC VIDEO FLOW
CN105072433B (en) * 2015-08-21 2017-03-22 山东师范大学 Depth perception mapping method applied to head track virtual reality system
CN105608666A (en) * 2015-12-25 2016-05-25 普瑞福克斯(北京)数字媒体科技有限公司 Method and system for generating three-dimensional image by two-dimensional graph
CN105791977B (en) * 2016-02-26 2019-05-07 北京视博云科技有限公司 Virtual reality data processing method, equipment and system based on cloud service
CN106170081B (en) * 2016-06-28 2017-12-12 上海米影信息科技有限公司 A kind of wireless dummy reality server, system and its data compression transmission method

Also Published As

Publication number Publication date
CN110192391A (en) 2019-08-30
WO2018133312A1 (en) 2018-07-26

Similar Documents

Publication Publication Date Title
US11978159B2 (en) Cross reality system
CN110192391B (en) Processing method and equipment
US11790619B2 (en) Cross reality system with accurate shared maps
KR101979564B1 (en) Systems and methods for reducing motion-to-photon latency and memory bandwidth in a virtual reality system
US10855909B2 (en) Method and apparatus for obtaining binocular panoramic image, and storage medium
WO2017113681A1 (en) Video image processing method and apparatus based on virtual reality technology
KR20160135660A (en) Method and apparatus for providing 3-dimension image to head mount display
US20180063512A1 (en) Image streaming method and electronic device for supporting the same
US20160261841A1 (en) Method and device for synthesizing three-dimensional background content
WO2018000609A1 (en) Method for sharing 3d image in virtual reality system, and electronic device
CN112153306B (en) Image acquisition system, method and device, electronic equipment and wearable equipment
US11496758B2 (en) Priority-based video encoding and transmission
KR20160106338A (en) Apparatus and Method of tile based rendering for binocular disparity image
CN107318008A (en) Panoramic video player method and playing device
CN106201259A (en) A kind of method and apparatus sharing full-view image in virtual reality system
CN107065164B (en) Image presentation method and device
KR102459850B1 (en) Method and apparatus for processing 3-dimension image, and graphic processing unit
US9225968B2 (en) Image producing apparatus, system and method for producing planar and stereoscopic images
CN114742703A (en) Method, device and equipment for generating binocular stereoscopic panoramic image and storage medium
US20200294209A1 (en) Camera feature removal from stereoscopic content
TW201902218A (en) Systems and methods for reducing memory bandwidth via multiview compression/decompression
TWM630947U (en) Stereoscopic image playback apparatus
US20230052104A1 (en) Virtual content experience system and control method for same
CN114187173A (en) Model training method, image processing method and device, electronic device and medium
WO2018000610A1 (en) Automatic playing method based on determination of image type, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant