CN109814710A - Data processing method and device and virtual reality equipment - Google Patents


Info

Publication number
CN109814710A
Authority
CN
China
Prior art keywords
target image
acquisition time
image
virtual reality
acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811613471.4A
Other languages
Chinese (zh)
Other versions
CN109814710B (en)
Inventor
李立纲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Xiaoniao Kankan Technology Co Ltd
Original Assignee
Qingdao Xiaoniao Kankan Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Xiaoniao Kankan Technology Co Ltd
Priority to CN201811613471.4A
Publication of CN109814710A
Application granted
Publication of CN109814710B
Legal status: Active


Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a data processing method, a data processing device and virtual reality equipment. The virtual reality equipment comprises a camera device and a positioning and tracking device, and the method comprises the following steps: acquiring an image of the environment where the virtual reality equipment is located, which is captured by the camera device, and taking the image as a target image; acquiring the acquisition time of the target image; acquiring attitude data provided by the positioning and tracking device at the acquisition time; and associating the target image with the attitude data according to the acquisition time to generate a display image of the virtual reality equipment. The method can improve the realism of viewing the external real environment through the virtual reality equipment.

Description

Data processing method, device and virtual reality device
Technical field
The present invention relates to the technical field of virtual reality, and more particularly to a data processing method for a virtual reality device, a data processing apparatus for a virtual reality device, and a virtual reality device.
Background art
Virtual reality (VR) equipment creates a virtual world in which the user is immersed. The head-worn part of VR equipment is therefore a fully enclosed package, and the user cannot see the external real environment while using the VR equipment.
As VR technology matures, related technologies built around virtual reality are also developing rapidly, including augmented reality (AR) and mixed reality (MR). AR superimposes virtual information onto the real environment to enhance it, while MR mixes the real world and the virtual world together to generate a new visible environment that contains both physical entities and virtual information.
To realize the above AR or MR functions, or to meet the user's need to see the real environment while using VR equipment, a camera device can be used to capture images of the real environment, which are then displayed on the screen of the VR equipment. In this kind of application, because the capture frame rate of the camera device is difficult to keep consistent with the refresh rate of the screen, the user perceives stuttering in the picture while moving. Moreover, there is a considerable delay from the moment the camera captures an image, through transmitting the image to the application, to finally displaying it on the screen, which can cause dizziness and similar discomfort.
To solve the above problems, frame interpolation and position correction can be performed by optimization algorithms such as asynchronous timewarp and asynchronous reprojection, so that the display frame rate of the images stays consistent with the screen refresh rate and the user does not perceive the delay. However, these algorithms require not only the images themselves but also the posture information at the moment each frame was generated, and the images captured by the camera do not carry the posture information of the corresponding moment, which hinders the application of these algorithms. It is therefore highly desirable to provide a scheme for obtaining optimized data, so that the optimization algorithms can solve problems such as stuttering and dizziness that occur when the VR equipment displays images of the real environment.
Summary of the invention
One purpose of the embodiments of the present invention is to provide a new technical solution for data processing in a virtual reality device, so as to obtain optimized data.
According to the first aspect of the invention, a data processing method for a virtual reality device is provided, wherein the virtual reality device includes a camera device and a location tracking device, and the method includes:
obtaining an image, captured by the camera device, of the environment where the virtual reality device is located, and taking the image as a target image;
obtaining the acquisition time of the target image;
obtaining the attitude data provided by the location tracking device at the acquisition time; and
associating the target image with the attitude data according to the acquisition time, to generate a display image of the virtual reality device.
Optionally, the step of obtaining the acquisition time of the target image includes:
after the camera device captures the target image, obtaining the acquisition time of the target image from the camera device.
Optionally, the step of obtaining the acquisition time of the target image includes:
before the camera device captures the target image, calculating the acquisition time of the target image, so that
when the acquisition time arrives, the attitude data at the acquisition time is obtained from the location tracking device.
Optionally, the step of calculating the acquisition time of the target image includes:
obtaining an image, captured by the camera device, of the environment where the virtual reality device is located as a reference image of the target image, wherein the reference image is an image preceding the target image;
obtaining the actual acquisition time of the reference image;
calculating the acquisition time of the target image according to the actual acquisition time of the reference image and the capture frame rate of the camera device.
Optionally, the step of obtaining an image, captured by the camera device, of the environment where the virtual reality device is located as the reference image of the target image includes:
before the actual acquisition time of the target image arrives, selecting the latest image whose actual acquisition time has been obtained as the reference image of the target image.
Optionally, the step of associating the target image with the attitude data according to the acquisition time includes:
after the camera device captures the target image, obtaining the actual acquisition time of the target image; obtaining a calculation error according to the calculated acquisition time of the target image and the actual acquisition time of the target image;
associating the target image with the attitude data when the calculation error is within a set tolerance range.
Optionally, the method further includes: adjusting and updating the tolerance range according to the calculation error.
Optionally, generating the display image of the virtual reality device includes:
obtaining target images of consecutive frames, wherein each target image is associated with corresponding attitude data;
performing frame interpolation and position correction processing on the target images of the consecutive frames according to the target images of the consecutive frames and the associated attitude data, to generate the display image, wherein the processing makes the display frame rate of the display image consistent with the screen refresh rate of the head-mounted device of the virtual reality device.
According to the second aspect of the invention, a data processing apparatus for a virtual reality device is also provided, the virtual reality device including a camera device and a location tracking device, and the data processing apparatus including:
an image data acquisition module, configured to obtain an image, captured by the camera device, of the environment where the virtual reality device is located, and take the image as a target image;
a time acquisition module, configured to obtain the acquisition time of the target image;
an attitude data acquisition module, configured to obtain the attitude data provided by the location tracking device at the acquisition time; and
an association module, configured to associate the target image with the attitude data according to the acquisition time, to generate a display image of the virtual reality device; alternatively,
the data processing apparatus includes a memory and a processor, the memory storing instructions which control the processor to operate so as to execute the method according to the first aspect of the invention.
According to the third aspect of the invention, a virtual reality device is also provided, which includes a camera device, a location tracking device and the data processing apparatus according to the second aspect of the invention, wherein the camera device is used to capture images of the environment where the virtual reality device is located, and the location tracking device is used to obtain the attitude data generated when the virtual reality device is used.
A beneficial effect of the invention is that, according to the method of the embodiments of the present invention, the images of the real environment captured by the camera device can be associated with time-synchronized attitude data. The virtual reality device can then use the optimized data obtained by associating the images with the attitude data to optimize the images of the real environment captured by the camera device, thereby improving the realism of viewing the external real environment through the virtual reality device. For example, frame interpolation and position correction processing can be performed on the captured images of the real environment based on the optimized data, so that stuttering and dizziness are reduced when the user views the real environment through the virtual reality device.
Other features and advantages of the present invention will become apparent from the following detailed description of exemplary embodiments of the present invention with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the present invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic structural diagram of a virtual reality device according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the hardware structure of a data processing apparatus according to an embodiment of the present invention;
Fig. 3 is a schematic flowchart of a data processing method according to an embodiment of the present invention;
Fig. 4 is a schematic flowchart of a data processing method according to another embodiment of the present invention;
Fig. 5 is a functional block diagram of a data processing apparatus according to an embodiment of the present invention;
Fig. 6 is a functional block diagram of a data processing apparatus according to another embodiment of the present invention.
Detailed description of the embodiments
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the invention.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention, its application, or uses.
Techniques, methods and apparatus known to a person of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques, methods and apparatus should be considered part of the specification.
In all the examples shown and discussed herein, any specific value should be construed as merely illustrative and not as a limitation. Therefore, other examples of the exemplary embodiments may have different values.
It should also be noted that similar reference numerals and letters denote similar items in the following figures, so once an item is defined in one figure, it need not be further discussed in subsequent figures.
<virtual reality device>
Fig. 1 is a schematic structural diagram of a virtual reality device according to an embodiment of the present invention.
As shown in Fig. 1, the virtual reality device of the embodiment of the present invention may include a camera device 1000, a location tracking device 2000, a head-mounted device 3000 and a data processing apparatus 4000.
The camera device 1000 is used to capture images of the real environment.
The camera device 1000 can be connected to the data processing apparatus 4000 in a wired or wireless manner, where the wired connection is, for example, a USB connection or a MIPI bus connection, and the wireless connection is, for example, a WiFi connection or a Bluetooth connection.
The camera device 1000 may be fixed on the head-mounted device 3000. The camera device 1000 may also be fixedly arranged in the external environment. The camera device 1000 may also be mounted in the external environment through a gimbal, so that the camera device can, for example, move synchronously with the head-mounted device 3000 under the control of the gimbal.
The location tracking device 2000 is used to obtain the attitude data generated when the user uses the virtual reality device.
The attitude data may be the attitude data of a peripheral of the virtual reality device carried by the user, or may be the attitude data of the user obtained by tracking the location of the peripheral.
The location tracking device 2000 can also be connected to the data processing apparatus 4000 in a wired or wireless manner.
In an example of the invention, the location tracking device 2000 can track the spatial position of at least one peripheral that the user needs to carry, such as the head-mounted device 3000, a control handle, feedback shoes or a data glove, so as to identify the position and posture of the user and thereby obtain the attitude data generated when the user uses the virtual reality device.
For example, the location tracking device 2000 can obtain the position and posture of the peripheral based on infrared signals.
As another example, the location tracking device 2000 can also obtain the position and posture of the peripheral based on a binocular camera or the like.
In an example of the invention, the location tracking device 2000 may also include an inertial measurement unit (IMU) mounted on the head-mounted device 3000, so that the position, posture and the like of the head-mounted device are identified by the inertial measurement unit and the attitude data is thereby obtained.
The data processing apparatus 4000 can, according to the method of the present invention, optimize the images of the real environment captured by the camera device 1000 and display the processed images through the screen of the head-mounted device 3000.
Fig. 2 is a schematic diagram of the hardware structure of the data processing apparatus 4000 according to an embodiment of the present invention.
As shown in Fig. 2, the data processing apparatus 4000 of the embodiment of the present invention includes one or more memories 4010 and one or more processors 4020.
The processor 4020 can be a desktop processor, a server processor, a mobile processor, or the like.
The data processing apparatus 4000 may include at least one dedicated graphics processor.
The memory 4010 includes, for example, a ROM (read-only memory), a RAM (random access memory), a non-volatile memory such as a hard disk, and the like.
The memory 4010 is used to store instructions, which control the processor 4020 to operate so as to execute the data processing method according to the embodiment of the present invention. A technician can design the instructions according to the solutions disclosed by the present invention. How instructions control a processor to operate is well known in the art and is therefore not described in detail here.
The data processing apparatus 4000 of the embodiment of the present invention may also include an interface device, a communication device, an input device, a speaker, a microphone, and the like.
The interface device includes, for example, a USB interface. The communication device is capable of, for example, wired or wireless communication, and specifically may include WiFi communication, Bluetooth communication, and the like. The input device may include, for example, a touch screen, a keyboard, somatosensory input, and the like. The data processing apparatus 4000 can receive voice information through the microphone and output voice information through the speaker.
The data processing apparatus 4000 can be a fixed host of the virtual reality device, can be a hand-held control handle of the virtual reality device, or can be integrated with the head-mounted device 3000.
<embodiment of the method>
Fig. 3 is a schematic flowchart of the data processing method for a virtual reality device according to an embodiment of the present invention.
As shown in Fig. 3, the method of the embodiment of the present invention may include the following steps:
Step S3100: the data processing apparatus 4000 obtains an image, captured by the camera device 1000, of the environment where the virtual reality device is located, and takes the image as the target image.
The environment where the virtual reality device is located is the external real environment.
In an example of the invention, every frame of the images, captured by the camera device 1000, of the environment where the virtual reality device is located can be taken as a target image, wherein a target image is an image to be associated with attitude data so as to form the optimized data.
In an example of the invention, only part of the images, captured by the camera device 1000, of the environment where the virtual reality device is located may be taken as target images.
Taking the Android system as an example, in step S3100 the camera device can be accessed through the Camera2 interface to obtain the image.
Step S3200: the data processing apparatus 4000 obtains the acquisition time of the target image.
In step S3200, the acquisition time of the target image is the exposure time of the target image.
In one example, the data processing apparatus 4000 determines the acquisition time of the target image after getting the target image from the camera device 1000; this acquisition time is the actual exposure time of the target image. It can be understood that, in this example, the data processing apparatus 4000 gets the acquisition time of the target image from the camera device 1000 after getting the target image, i.e., the acquisition time of the target image is also provided by the camera device 1000.
Again taking the Android system as an example, the data that the data processing apparatus 4000 accesses from the camera device 1000 through the Camera2 interface includes not only the image data itself but also the acquisition time of the image. The data processing apparatus 4000 can obtain the value of CaptureResult.SENSOR_TIMESTAMP through the CaptureResult parameter of the onCaptureCompleted function in CameraCaptureSession.CaptureCallback; this value is the acquisition time of the image.
In this example, since there may already be some delay relative to the actual exposure time by the time the data processing apparatus 4000 gets that exposure time, this example is suitable for the case where the location tracking device 2000 can record and provide attitude data of past moments.
In an example of the invention, the data processing apparatus 4000 may also, in step S3200, calculate the acquisition time of the target image in advance, before the camera device captures the target image, i.e., the acquisition time is a calculated acquisition time, so that when that time arrives, the attitude data at the calculated acquisition time is obtained directly from the location tracking device 2000. This example is suitable for the case where the location tracking device 2000 can only provide the attitude data of the current moment.
In this example, obtaining the acquisition time of the target image in step S3200 may further include: the data processing apparatus 4000 calculates the acquisition time of the target image before the camera device 1000 captures the target image, and, when the acquisition time arrives, obtains the attitude data at the acquisition time from the location tracking device 2000.
In one example, calculating the acquisition time of the target image before the camera device 1000 captures the target image in step S3200 may further include:
Step S3211: obtaining an image, captured by the camera device 1000, of the environment where the virtual reality device is located as the reference image of the target image, wherein the reference image is an image preceding the target image.
For example, the reference image can be the first frame image captured by the camera device 1000. Further, the data processing apparatus 4000 can set this first frame image as the reference image of all subsequent target images.
As another example, obtaining an image, captured by the camera device 1000, of the environment where the virtual reality device is located as the reference image of the target image in step S3211 may also further include: before the actual acquisition time of the target image arrives, selecting the latest image whose actual acquisition time has been obtained as the reference image of the target image.
In this example, the data processing apparatus 4000 can select the reference image according to the delay in obtaining the actual acquisition time and the capture frame rate of the camera device 1000, so that the selected reference image is separated from the corresponding target image by as few image frames as possible, which effectively improves the accuracy of calculating the acquisition time of the target image.
For example, the delay time can be the average of the delay times of the images preceding the target image.
The average can be, as needed, an arithmetic mean, a geometric mean, a root-mean-square mean, or the like.
As another example, the delay time may also be a fixed value preset according to the data processing speed.
Step S3212: obtaining the actual acquisition time of the reference image.
Step S3213: calculating the acquisition time of the target image according to the actual acquisition time of the reference image and the capture frame rate of the camera device.
According to step S3213, the time interval between two adjacent frames can first be calculated according to the capture frame rate of the camera device 1000, then the time interval between the target image and the reference image can be calculated from the interval between adjacent frames, and finally the acquisition time of the target image can be derived from the actual acquisition time of the reference image and the time interval between the target image and the reference image.
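A minimal sketch of this reckoning, assuming the number of frames between the reference image and the target image is known; the class, method and unit choices (nanoseconds) are illustrative assumptions.

```java
// Sketch of step S3213: predict the exposure time of an upcoming target image
// from the actual exposure time of a reference image and the capture frame rate.
final class AcquisitionTimePredictor {

    static long predictAcquisitionTimeNs(long referenceTimeNs,    // actual exposure time of the reference image
                                         double captureFrameRate, // frames per second of the camera device
                                         int framesAhead) {       // frames between reference and target image
        // time interval between two adjacent frames
        long frameIntervalNs = Math.round(1_000_000_000.0 / captureFrameRate);
        // interval between target and reference image, added to the reference time
        return referenceTimeNs + (long) framesAhead * frameIntervalNs;
    }
}
```

For instance, at a capture frame rate of 30 fps the frame interval is about 33.3 ms, so a target image expected two frames after a reference image exposed at time t would be predicted at roughly t + 66.7 ms.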
Step S3300: the data processing apparatus 4000 obtains the attitude data, provided by the location tracking device 2000, at the acquisition time obtained in step S3200.
The attitude data is the attitude data generated when the user uses the virtual reality device.
For example, the attitude data includes the pitch angle (forward-backward tilt), roll angle (side tilt) and yaw angle (left-right swing) of the head-mounted device 3000.
As another example, the attitude data includes the pitch angle, roll angle and yaw angle of the user himself or herself.
In one example, the acquisition time obtained in step S3200 is the actual acquisition time of the target image obtained from the camera device 1000; accordingly, in step S3300, the data processing apparatus 4000 obtains from the location tracking device 2000 the attitude data that was collected and stored at that actual acquisition time.
In one example, the acquisition time obtained in step S3200 is calculated before the camera device 1000 captures the target image, i.e., it is a calculated acquisition time; accordingly, in step S3300, when the calculated acquisition time arrives, the data processing apparatus 4000 can directly obtain from the location tracking device 2000 the attitude data at that acquisition time, i.e., the attitude data currently obtained by the location tracking device 2000.
Step S3400: associating the target image with the attitude data according to the acquisition time, to generate the display image of the virtual reality device.
In one example, the acquisition time is the actual acquisition time of the target image obtained from the camera device 1000, and the attitude data is the attitude data of the virtual reality device recorded by the location tracking device 2000 at that actual acquisition time; the two are associated together on the basis of corresponding to the same acquisition time.
In this example, the camera device 1000 can provide the actual acquisition time of the target image, and the location tracking device 2000 can provide the timestamp of the attitude data. In this way, the data processing apparatus 4000 can look up the attitude data of the corresponding time according to the actual acquisition time and associate the two.
In one example, the data processing apparatus 4000 can calculate the acquisition time of the target image before the target image is captured, and then, when the acquisition time arrives, directly obtain the attitude data of the current moment from the location tracking device 2000 and associate that attitude data with the target image.
In this example, associating the target image with the attitude data according to the acquisition time in step S3400 may further include:
Step S3410: after the camera device 1000 captures the target image, the data processing apparatus 4000 obtains the actual acquisition time of the target image.
In step S3410, the data processing apparatus 4000 can get the actual acquisition time of the target image from the camera device 1000.
Step S3420: obtaining a calculation error according to the calculated acquisition time of the target image and the actual acquisition time of the target image.
According to step S3420, the calculation error is equal to the difference between the acquisition time calculated in step S3200 and the actual acquisition time, or to the absolute value of that difference.
Step S3430: associating the target image with the attitude data when the calculation error is within a set tolerance range.
In this example, the target image corresponds to the actual acquisition time and the attitude data corresponds to the calculated acquisition time. If the calculation error of the calculated acquisition time relative to the actual acquisition time is within the allowed tolerance range, the target image and the attitude data can be associated, i.e., they match each other; otherwise this group of data needs to be discarded, so as not to affect the accuracy of the display image.
The tolerance range can be less than or equal to 2 ms, for example 1 ms.
In one example, a first list reflecting the mapping relationship between target images and actual acquisition times can be generated, and a second list reflecting the mapping relationship between attitude data and calculated acquisition times can be generated; the first list and the second list are then associated according to the tolerance range to generate the image data.
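A minimal sketch of this two-list association under the tolerance check of steps S3420/S3430 follows; the list entry types, names and the reuse of the hypothetical AttitudeSample container above are assumptions for illustration.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: the first list pairs each target image with its actual acquisition
// time, the second list pairs each attitude sample with the calculated
// acquisition time it was fetched for; a pair is kept only when the calculation
// error is within the tolerance range, otherwise the group is discarded.
final class ImagePoseAssociator {

    static class TimedImage {                 // entry of the first list
        final Object image;                   // target image (placeholder type)
        final long actualTimeNs;              // actual acquisition (exposure) time
        TimedImage(Object image, long actualTimeNs) { this.image = image; this.actualTimeNs = actualTimeNs; }
    }

    static class TimedPose {                  // entry of the second list
        final AttitudeSample pose;
        final long calculatedTimeNs;          // calculated acquisition time
        TimedPose(AttitudeSample pose, long calculatedTimeNs) { this.pose = pose; this.calculatedTimeNs = calculatedTimeNs; }
    }

    static class AssociatedFrame {            // optimized data: image plus matching pose
        final Object image;
        final AttitudeSample pose;
        AssociatedFrame(Object image, AttitudeSample pose) { this.image = image; this.pose = pose; }
    }

    static List<AssociatedFrame> associate(List<TimedImage> firstList,
                                           List<TimedPose> secondList,
                                           long toleranceNs) {
        List<AssociatedFrame> optimizedData = new ArrayList<>();
        for (TimedImage img : firstList) {
            for (TimedPose candidate : secondList) {
                long calculationError = Math.abs(candidate.calculatedTimeNs - img.actualTimeNs);
                if (calculationError <= toleranceNs) {
                    optimizedData.add(new AssociatedFrame(img.image, candidate.pose));
                    break;                    // matched; continue with the next target image
                }
            }
        }
        return optimizedData;
    }
}
```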
In an example of the invention, the tolerance range can be a preset fixed value.
In an example of the invention, the tolerance range can also be adjusted adaptively according to accumulated data. For this purpose, in one example, the method of the present invention can further include the following step: adjusting and updating the tolerance range according to the calculation error.
For example, the adjustment can be to recalculate the tolerance range according to the average of the calculation errors of all target images processed so far.
In this example, the tolerance range can also be adjusted and updated according to the calculation error only when the calculation error exceeds a set threshold, so as to balance the amount of data processing against the tolerance precision.
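As a hedged sketch of this adaptive adjustment (the trigger threshold and the use of a running arithmetic mean are illustrative assumptions):

```java
// Sketch: when an observed calculation error exceeds a set threshold, recompute
// the tolerance range as the average of all calculation errors seen so far.
final class ToleranceUpdater {
    private long toleranceNs;
    private final long triggerThresholdNs;
    private long errorSumNs = 0;
    private long errorCount = 0;

    ToleranceUpdater(long initialToleranceNs, long triggerThresholdNs) {
        this.toleranceNs = initialToleranceNs;
        this.triggerThresholdNs = triggerThresholdNs;
    }

    long onCalculationError(long calculationErrorNs) {
        errorSumNs += calculationErrorNs;
        errorCount++;
        if (calculationErrorNs > triggerThresholdNs) {
            toleranceNs = errorSumNs / errorCount;  // arithmetic mean of all errors so far
        }
        return toleranceNs;
    }
}
```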
According to the method of the embodiment of the present invention, time-synchronized attitude data can be associated with the images captured by the camera, so that the optimized data usable by the optimization algorithms can be obtained, improving the performance of the virtual reality device and the experience of the user.
The optimized data obtained by the method of the embodiment of the present invention can, for example, be processed by an optimization algorithm with frame interpolation and position correction to generate the display image of the virtual reality device, so that the display frame rate of the processed images is consistent with the screen refresh rate of the head-mounted device 3000, which prevents the user from experiencing problems such as stuttering and dizziness when viewing the external real environment through the screen of the head-mounted device 3000.
Fig. 4 is a schematic flowchart of a data processing method according to another embodiment of the present invention.
As shown in Fig. 4, generating the display image of the virtual reality device in the above step S3400 may further include the following steps:
Step S4100: obtaining target images of consecutive frames, wherein each target image is associated with corresponding attitude data.
According to step S4100, each target image in the consecutive frames has been associated with time-synchronized attitude data according to the method of the embodiment of the present invention, forming the optimized data.
Step S4200: performing frame interpolation and position correction processing on the target images of the consecutive frames according to the target images of the consecutive frames and the associated attitude data, to generate the display image, wherein the processing makes the display frame rate of the display image consistent with the screen refresh rate of the head-mounted device of the virtual reality device.
In step S4200, frame interpolation and position correction can be performed on the target images of the consecutive frames by, for example, an existing asynchronous timewarp algorithm, an asynchronous reprojection algorithm, or the like.
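The patent does not spell out these algorithms; purely as a conceptual sketch, the per-frame attitude data made available by the association step allows a pose to be estimated for any intermediate screen refresh time, for example by interpolating between the poses of two consecutive target frames. The linear scheme below, reusing the hypothetical AttitudeSample container, is an assumption for illustration and is not the asynchronous timewarp algorithm itself.

```java
// Sketch: estimate the attitude at a display refresh time that falls between
// two consecutive pose-associated target frames. A timewarp / reprojection step
// would then warp the frame using the difference between this estimated pose
// and the pose at capture time.
final class PoseInterpolator {

    static AttitudeSample interpolate(AttitudeSample a, AttitudeSample b, long displayTimeNs) {
        double t = (double) (displayTimeNs - a.timestampNs)
                 / (double) (b.timestampNs - a.timestampNs);
        t = Math.max(0.0, Math.min(1.0, t));  // clamp to the interval between the two frames
        return new AttitudeSample(
                displayTimeNs,
                lerpAngle(a.pitchDeg, b.pitchDeg, t),
                lerpAngle(a.rollDeg,  b.rollDeg,  t),
                lerpAngle(a.yawDeg,   b.yawDeg,   t));
    }

    // Interpolate an angle along the shortest arc, so that e.g. 359 deg -> 1 deg passes through 0 deg.
    private static float lerpAngle(float fromDeg, float toDeg, double t) {
        double delta = ((toDeg - fromDeg + 540.0) % 360.0) - 180.0;
        return (float) (fromDeg + delta * t);
    }
}
```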
The method of the embodiment of the present invention prevents the user from experiencing problems such as stuttering and dizziness when viewing the external real environment through the screen of the head-mounted device 3000, and improves the user experience.
<embodiment of the apparatus>
Fig. 5 is a functional block diagram of a data processing apparatus according to an embodiment of the present invention.
As shown in Fig. 5, the data processing apparatus 4000 according to this embodiment includes an image data acquisition module 45100, an attitude data acquisition module 45200, an association module 45300 and a time acquisition module 45400.
The image data acquisition module 45100 is used to obtain an image, captured by the camera device 1000, of the environment where the virtual reality device is located, and take the image as the target image.
The time acquisition module 45400 is used to obtain the acquisition time of the target image.
The attitude data acquisition module 45200 is used to obtain the attitude data, provided by the location tracking device 2000, at the acquisition time obtained by the time acquisition module 45400.
The association module 45300 is used to associate the target image with the attitude data according to the acquisition time, to generate the display image of the virtual reality device.
In one embodiment, the time acquisition module 45400 is used to obtain the acquisition time of the target image from the camera device 1000 after the camera device 1000 captures the target image.
In one embodiment, the time acquisition module 45400 is used to calculate the acquisition time of the target image before the camera device 1000 captures the target image, so as to notify the attitude data acquisition module 45200 when the acquisition time arrives to obtain the attitude data at the acquisition time from the location tracking device 2000.
In one embodiment, the time acquisition module 45400 is used to: obtain an image, captured by the camera device, of the environment where the virtual reality device is located as the reference image of the target image, wherein the reference image is an image preceding the target image; obtain the actual acquisition time of the reference image; and calculate the acquisition time of the target image according to the actual acquisition time of the reference image and the capture frame rate of the camera device.
In one embodiment, the time acquisition module 45400 is used to select, before the actual acquisition time of the target image arrives, the latest image whose actual acquisition time has been obtained as the reference image of the target image.
In one embodiment, the association module 45300 is used to: obtain the actual acquisition time of the target image after the camera device 1000 captures the target image; obtain a calculation error according to the calculated acquisition time of the target image and the actual acquisition time of the target image; and associate the target image with the attitude data when the calculation error is within the set tolerance range.
In one embodiment of the invention, the data processing apparatus 4000 may also include an update module (not shown in the figure), which is used to adjust and update the tolerance range according to the calculation error.
The apparatus 4000 of the embodiment of the present invention can associate time-synchronized attitude data with the images captured by the camera device 1000, so that the optimized data usable by the optimization algorithms can be obtained, improving the performance of the virtual reality device and the experience of the user.
Fig. 6 is a functional block diagram of a data processing apparatus according to another embodiment of the present invention.
As shown in Fig. 6, the data processing apparatus 4000 of this embodiment further includes, on the basis of the embodiment shown in Fig. 5, an optimization processing module 46100.
The optimization processing module 46100 can be used to: obtain target images of consecutive frames, wherein each target image is associated with corresponding attitude data; and perform frame interpolation and position correction processing on the target images of the consecutive frames according to the target images of the consecutive frames and the associated attitude data, to generate the display image, wherein the processing makes the display frame rate of the display image consistent with the screen refresh rate of the head-mounted device of the virtual reality device.
The apparatus 4000 of the embodiment of the present invention prevents the user from experiencing problems such as stuttering and dizziness when viewing the external real environment through the screen of the head-mounted device 3000, and improves the user experience.
The present invention may be a system, a method and/or a computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions thereon for causing a processor to carry out aspects of the present invention.
The computer-readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer-readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer-readable storage medium includes: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (for example, light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer-readable program instructions described herein can be downloaded to respective computing/processing devices from a computer-readable storage medium, or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium within the respective computing/processing device.
Computer-readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, in order to carry out aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer-implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other devices implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
Various embodiments of the present invention have been described above. The foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present invention is defined by the appended claims.

Claims (10)

1. A data processing method for a virtual reality device, characterized in that the virtual reality device includes a camera device and a location tracking device, and the method comprises:
obtaining an image, captured by the camera device, of the environment where the virtual reality device is located, and taking the image as a target image;
obtaining the acquisition time of the target image;
obtaining the attitude data provided by the location tracking device at the acquisition time; and
associating the target image with the attitude data according to the acquisition time, to generate a display image of the virtual reality device.
2. The method according to claim 1, characterized in that the step of obtaining the acquisition time of the target image comprises:
after the camera device captures the target image, obtaining the acquisition time of the target image from the camera device.
3. The method according to claim 1, characterized in that the step of obtaining the acquisition time of the target image comprises:
before the camera device captures the target image, calculating the acquisition time of the target image, so that
when the acquisition time arrives, the attitude data at the acquisition time is obtained from the location tracking device.
4. The method according to claim 3, characterized in that the step of calculating the acquisition time of the target image comprises:
obtaining an image, captured by the camera device, of the environment where the virtual reality device is located as a reference image of the target image, wherein the reference image is an image preceding the target image;
obtaining the actual acquisition time of the reference image; and
calculating the acquisition time of the target image according to the actual acquisition time of the reference image and the capture frame rate of the camera device.
5. The method according to claim 4, characterized in that the step of obtaining an image, captured by the camera device, of the environment where the virtual reality device is located as the reference image of the target image comprises:
before the actual acquisition time of the target image arrives, selecting the latest image whose actual acquisition time has been obtained as the reference image of the target image.
6. The method according to any one of claims 3-5, characterized in that the step of associating the target image with the attitude data according to the acquisition time comprises:
after the camera device captures the target image, obtaining the actual acquisition time of the target image; obtaining a calculation error according to the calculated acquisition time of the target image and the actual acquisition time of the target image; and
associating the target image with the attitude data when the calculation error is within a set tolerance range.
7. The method according to claim 6, characterized in that the method further comprises:
adjusting and updating the tolerance range according to the calculation error.
8. The method according to any one of claims 1 to 5, characterized in that generating the display image of the virtual reality device comprises:
obtaining target images of consecutive frames, wherein each target image is associated with corresponding attitude data; and
performing frame interpolation and position correction processing on the target images of the consecutive frames according to the target images of the consecutive frames and the associated attitude data, to generate the display image, wherein the processing makes the display frame rate of the display image consistent with the screen refresh rate of the head-mounted device of the virtual reality device.
9. A data processing apparatus for a virtual reality device, characterized in that the virtual reality device includes a camera device and a location tracking device, and the data processing apparatus comprises:
an image data acquisition module, configured to obtain an image, captured by the camera device, of the environment where the virtual reality device is located, and take the image as a target image;
a time acquisition module, configured to obtain the acquisition time of the target image;
an attitude data acquisition module, configured to obtain the attitude data provided by the location tracking device at the acquisition time; and
an association module, configured to associate the target image with the attitude data according to the acquisition time, to generate a display image of the virtual reality device; alternatively,
the data processing apparatus includes a memory and a processor, the memory storing instructions which control the processor to operate so as to execute the method according to any one of claims 1 to 8.
10. A virtual reality device, characterized by comprising a camera device, a location tracking device and the data processing apparatus according to claim 9, wherein the camera device is used to capture images of the environment where the virtual reality device is located, and the location tracking device is used to obtain the attitude data generated when the virtual reality device is used.
CN201811613471.4A 2018-12-27 2018-12-27 Data processing method and device and virtual reality equipment Active CN109814710B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811613471.4A CN109814710B (en) 2018-12-27 2018-12-27 Data processing method and device and virtual reality equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811613471.4A CN109814710B (en) 2018-12-27 2018-12-27 Data processing method and device and virtual reality equipment

Publications (2)

Publication Number Publication Date
CN109814710A 2019-05-28
CN109814710B 2022-05-13

Family

ID=66602569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811613471.4A Active CN109814710B (en) 2018-12-27 2018-12-27 Data processing method and device and virtual reality equipment

Country Status (1)

Country Link
CN (1) CN109814710B (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102598064A (en) * 2009-10-12 2012-07-18 Metaio有限公司 Method for representing virtual information in a view of a real environment
CN107690067A (en) * 2016-08-05 2018-02-13 成都理想境界科技有限公司 The detection method and device of head-mounted display apparatus frame per second
CN107168518A (en) * 2017-04-05 2017-09-15 北京小鸟看看科技有限公司 A kind of synchronous method, device and head-mounted display for head-mounted display
CN107065195A (en) * 2017-06-02 2017-08-18 福州光流科技有限公司 A kind of modularization MR equipment imaging methods
CN107835404A (en) * 2017-11-13 2018-03-23 歌尔科技有限公司 Method for displaying image, equipment and system based on wear-type virtual reality device
CN108765563A (en) * 2018-05-31 2018-11-06 北京百度网讯科技有限公司 Processing method, device and the equipment of SLAM algorithms based on AR
CN108965721A (en) * 2018-08-22 2018-12-07 Oppo广东移动通信有限公司 The control method and device of camera module, electronic equipment

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021175055A1 (en) * 2020-03-05 2021-09-10 Oppo广东移动通信有限公司 Video processing method and related device
CN112461245A (en) * 2020-11-26 2021-03-09 浙江商汤科技开发有限公司 Data processing method and device, electronic equipment and storage medium
WO2022110801A1 (en) * 2020-11-26 2022-06-02 浙江商汤科技开发有限公司 Data processing method and apparatus, electronic device, and storage medium
CN113766119A (en) * 2021-05-11 2021-12-07 腾讯科技(深圳)有限公司 Virtual image display method, device, terminal and storage medium
CN113766119B (en) * 2021-05-11 2023-12-05 腾讯科技(深圳)有限公司 Virtual image display method, device, terminal and storage medium

Also Published As

Publication number Publication date
CN109814710B (en) 2022-05-13

Similar Documents

Publication Publication Date Title
CN110908503B (en) Method of tracking the position of a device
KR102517876B1 (en) Technique for recording augmented reality data
JP7443602B2 (en) Mixed reality system with virtual content warping and how to use it to generate virtual content
CN110402425B (en) Mixed reality system with color virtual content distortion and method for generating virtual content using the same
US20160350595A1 (en) Feedback based remote maintenance operations
Raaen et al. Measuring latency in virtual reality systems
CN107577045B (en) The method, apparatus and storage medium of predicting tracing for head-mounted display
CN106716303B (en) Stablize the movement of interaction ray
CN110050250A (en) The synchronous image regulation of display
CN109814710A (en) Data processing method and device and virtual reality equipment
US10403045B2 (en) Photorealistic augmented reality system
WO2018086295A1 (en) Application interface display method and apparatus
US11016560B1 (en) Video timewarp for mixed reality and cloud rendering applications
JP7201869B1 (en) Generate new frames with rendered and unrendered content from the previous eye
US11328475B2 (en) Gravity estimation and bundle adjustment for visual-inertial odometry
CN109613984A (en) Processing method, equipment and the system of video image in VR live streaming
US11353955B1 (en) Systems and methods for using scene understanding for calibrating eye tracking
CN108829627A (en) Synchronisation control means and system between virtual reality device
CN110332930B (en) Position determination method, device and equipment
CN105630152A (en) Device and method for processing visual data, and related computer program product
CN103517061A (en) Method and device for display control of terminal device
CN117529700A (en) Human body pose estimation using self-tracking controller
US20230037750A1 (en) Systems and methods for generating stabilized images of a real environment in artificial reality
CN114356082A (en) Image optimization method and device of augmented reality equipment, electronic equipment and system
JP6515512B2 (en) Display device, display device calibration method, and calibration program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant