CN107835404A - Image display method, device, and system based on a head-mounted virtual reality device - Google Patents
- Publication number: CN107835404A
- Application number: CN201711117228.9A
- Authority
- CN
- China
- Prior art keywords
- attitude data
- real scene
- scene image
- virtual reality
- wear
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Processing Or Creating Images (AREA)
Abstract
An embodiment of the present invention provides an image display method, device, and system based on a head-mounted virtual reality device. The method includes: the head-mounted virtual reality device receives an original real-scene image sent by a camera; it then obtains first attitude data at the time the original real-scene image is to be displayed on the screen and second attitude data at the time the original real-scene image was shot. The original real-scene image is adjusted according to the first attitude data and the second attitude data, and the adjusted image is the real-scene image to be displayed. After this adjustment, the real-scene image to be displayed matches the real-scene image corresponding to the direction the user's body faces under the first attitude data. Finally, this real-scene image to be displayed is shown on the head-mounted virtual reality device. This solves the image-mismatch problem and, accordingly, the user dizziness it causes.
Description
Technical field
The present invention relates to the technical field of image processing, and in particular to an image display method, device, and system based on a head-mounted virtual reality device.
Background
In recent years, virtual reality technology has been widely applied in various fields. By viewing images with a virtual reality device, a user can achieve an immersive, on-the-spot sensation.
When a head-mounted virtual reality device is actually used, a certain amount of time elapses between the capture device shooting an image and that image being shown on the display screen of the head-mounted virtual reality device; this time is the delay. The delay typically consists of two parts: the first part is the time from the capture device shooting the image to the head-mounted virtual reality device receiving it, and the second part is the time from the head-mounted virtual reality device receiving the image to displaying it. The duration of the first part is usually much greater than that of the second, so in practice the second part can be ignored and only the first part of the delay considered. Because of this delay, the real-scene image corresponding to the direction the user's body currently faces may not match the real-scene image the user sees in the head-mounted virtual reality device, and this mismatch may make the user feel dizzy.
Summary of the invention
In view of this, embodiments of the present invention provide an image display method, device, and system based on a head-mounted virtual reality device, so that the real-scene image the user watches in the head-mounted virtual reality device has a good matching relationship with the real-scene image corresponding to the user's current body orientation.
An embodiment of the present invention provides an image display method based on a head-mounted virtual reality device, including:
receiving an original real-scene image sent by a camera;
obtaining first attitude data at the time the original real-scene image is displayed and second attitude data at the time the original real-scene image was shot;
adjusting the original real-scene image according to the first attitude data and the second attitude data to generate a real-scene image to be displayed;
displaying the real-scene image to be displayed.
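The four claimed steps can be sketched as a small pipeline. This is only an illustration of the control flow; all function names and parameters here are hypothetical, not from the patent:

```python
def display_frame(receive_image, get_attitude, adjust_image, show_image):
    """Sketch of the claimed pipeline: receive, fetch two attitudes, adjust, display."""
    original = receive_image()                       # original real-scene image from the camera
    first_attitude = get_attitude(when="display")    # head pose at display time
    second_attitude = get_attitude(when="shooting")  # head pose at shooting time
    to_display = adjust_image(original, first_attitude, second_attitude)
    show_image(to_display)
    return to_display
```

The callables are injected so the sketch stays independent of any particular camera or sensor API.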
Optionally, after the receiving of the original real-scene image sent by the camera, the method further includes:
obtaining the shooting time at which the original real-scene image was shot and the display time at which the original real-scene image is displayed;
the obtaining of the first attitude data at the time the original real-scene image is displayed and the second attitude data at the time the original real-scene image was shot includes:
obtaining the first attitude data corresponding to the display time and the second attitude data corresponding to the shooting time.
Optionally, the obtaining of the first attitude data corresponding to the display time and the second attitude data corresponding to the shooting time includes:
obtaining the first attitude data and the second attitude data from a storage queue that stores the attitude data collected within a preset time period, the preset time period being greater than or equal to the time difference between the display time and the shooting time.
Optionally, the method further includes:
in response to a start-up operation of the camera, reading the collected attitude data from an attitude database at a preset time interval and storing it in the storage queue;
if the number of elapsed preset time intervals reaches the preset time period, deleting, in first-in-first-out order, the attitude data stored earliest in the storage queue, and storing the newly read attitude data in the storage queue.
Optionally, the adjusting of the original real-scene image according to the first attitude data and the second attitude data to generate the real-scene image to be displayed includes:
determining the data difference between the first attitude data and the second attitude data;
determining, according to the data difference, a transformation matrix for adjusting the original real-scene image;
multiplying the pixel coordinate matrix corresponding to each pixel in the original real-scene image by the transformation matrix, respectively, to generate the real-scene image to be displayed.
Optionally, the method further includes:
determining, according to the virtual reality scene type selected by the user, whether to start the camera.
An embodiment of the present invention provides a head-mounted virtual reality device, including a memory and a processor connected to the memory;
the memory is configured to store one or more computer instructions for the processor to call and execute;
the processor is configured to execute the one or more computer instructions to perform any one of the above image display methods based on a head-mounted virtual reality device.
Optionally, the device further includes:
a camera, configured to shoot the original real-scene image;
a sensor, configured to collect the attitude data of the user;
a thread controller, configured to control, in response to the start-up operation of the camera, the opening of an attitude-data reading thread, so that the processor reads the first attitude data and the second attitude data through the attitude-data reading thread.
An embodiment of the present invention provides an image display system based on a head-mounted virtual reality device, including: a data processor for performing any one of the above image display methods based on a head-mounted virtual reality device, and a head-mounted display device with a camera.
Optionally, the head-mounted display device with a camera is configured to shoot the original real-scene image and record the shooting time of the original real-scene image.
With the image display method, device, and system based on a head-mounted virtual reality device provided by the embodiments of the present invention, the head-mounted virtual reality device receives the original real-scene image sent by the camera. After receiving it, the head-mounted virtual reality device obtains the first attitude data at the time the original real-scene image is to be displayed on the screen and the second attitude data at the time the original real-scene image was shot. The original real-scene image is then adjusted according to the first attitude data and the second attitude data; the adjusted image is the real-scene image to be displayed, which is finally displayed. Because a large time difference often exists between shooting the original real-scene image and displaying it on the screen, if the original real-scene image were not adjusted, the problem mentioned in the background would occur: the real-scene image corresponding to the direction the user's body faces under the current attitude data would not match the real-scene image the user watches in the head-mounted virtual reality device. After the original real-scene image is adjusted according to the first attitude data and the second attitude data, the real-scene image to be displayed matches the real-scene image corresponding to the direction the user's body faces under the first attitude data, which also solves the user dizziness caused by the real-scene image mismatch.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flow chart of embodiment one of the image display method based on a head-mounted virtual reality device provided by an embodiment of the present invention;
Fig. 2 is a flow chart of embodiment two of the image display method based on a head-mounted virtual reality device provided by an embodiment of the present invention;
Fig. 3 is a structural diagram of embodiment one of the image display apparatus based on a head-mounted virtual reality device provided by an embodiment of the present invention;
Fig. 4 is a structural diagram of embodiment two of the image display apparatus based on a head-mounted virtual reality device provided by an embodiment of the present invention;
Fig. 5 is a structural diagram of embodiment one of the head-mounted virtual reality device provided by an embodiment of the present invention;
Fig. 6 is a structural diagram of embodiment one of the image display system based on a head-mounted virtual reality device provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of the internal configuration of the head-mounted virtual reality device provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The terms used in the embodiments of the present invention are for the purpose of describing specific embodiments only and are not intended to limit the present invention. The singular forms "a", "said", and "the" used in the embodiments of the present invention and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise; "a plurality of" generally includes at least two, but does not exclude the case of at least one.
It should be understood that the term "and/or" used herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent three cases: A alone, both A and B, and B alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects.
It should be understood that although the terms first, second, third, etc. may be used in the embodiments of the present invention to describe XXX, the XXX should not be limited to these terms. These terms are only used to distinguish XXX from one another. For example, without departing from the scope of the embodiments of the present invention, a first XXX may also be called a second XXX, and similarly, a second XXX may also be called a first XXX.
Depending on the context, the word "if" as used herein may be interpreted as "when", "while", "in response to determining", or "in response to detecting". Similarly, depending on the context, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (the stated condition or event) is detected", or "in response to detecting (the stated condition or event)".
It should also be noted that the terms "comprise", "include", or any other variants thereof are intended to cover a non-exclusive inclusion, so that a commodity or system that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a commodity or system. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the commodity or system that includes the element.
Fig. 1 is a flow chart of embodiment one of the image display method based on a head-mounted virtual reality device provided by an embodiment of the present invention. The execution subject of the method provided by this embodiment may be the head-mounted virtual reality device. As shown in Fig. 1, the method includes the following steps:
S101: receive the original real-scene image sent by the camera.
S102: obtain the first attitude data at the time the original real-scene image is displayed and the second attitude data at the time the original real-scene image was shot.
The camera configured on the head-mounted virtual reality device can shoot the real scene to obtain the original real-scene image. Then, optionally, the camera can send the shot original real-scene image to the head-mounted virtual reality device in the form of a data packet. The head-mounted virtual reality device receives this data packet and obtains the original real-scene image by parsing it.
A sensor is provided on the head-mounted virtual reality device to collect the attitude data of the user's head at a preset time interval. Optionally, the head-mounted virtual reality device can store the attitude data and the collection time of the attitude data in a data file. The head-mounted virtual reality device can then query the data file for the first attitude data at the time the original real-scene image is displayed and the second attitude data at the time the original real-scene image was shot. Optionally, the attitude data may specifically be the attitude angles of the user's head, that is, the pitch angle, roll angle, and heading (yaw) angle of the head.
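The timestamped attitude records and the lookup by display or shooting time described above can be sketched as follows. The class and field names are illustrative assumptions, not from the patent; the nearest-time matching policy is likewise an assumption, since the patent only says the attitude data "corresponding to" a time is obtained:

```python
import bisect

class AttitudeLog:
    """Timestamped (pitch, roll, yaw) records with nearest-time lookup."""

    def __init__(self):
        self.times = []      # collection times, appended in increasing order
        self.attitudes = []  # (pitch, roll, yaw) tuples, parallel to self.times

    def record(self, t, pitch, roll, yaw):
        self.times.append(t)
        self.attitudes.append((pitch, roll, yaw))

    def query(self, t):
        # Return the attitude whose collection time is closest to t.
        i = bisect.bisect_left(self.times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(self.times)]
        best = min(candidates, key=lambda j: abs(self.times[j] - t))
        return self.attitudes[best]
```

With such a log, the first attitude data is `query(display_time)` and the second is `query(shooting_time)`.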
S103: adjust the original real-scene image according to the first attitude data and the second attitude data to generate the real-scene image to be displayed.
S104: display the real-scene image to be displayed.
The posture of the user's head changes constantly, and the first attitude data and the second attitude data represent two different head poses of the user. Therefore, the scene image the user sees under the first attitude data differs from the scene image seen under the second attitude data. The head-mounted virtual reality device can adjust the original real-scene image according to the data difference between the two attitude data, adjusting the original image corresponding to the shooting time into the real-scene image to be displayed corresponding to the reception time. Optionally, since the attitude data are the attitude angles of the user's head, this data difference is an angle difference, and the head-mounted virtual reality device can adjust the original real-scene image according to this angle difference to obtain the real-scene image to be displayed. Finally, this real-scene image to be displayed is shown on the screen of the head-mounted virtual reality device. After the original real-scene image has been adjusted, the real-scene image the user sees in the head-mounted virtual reality device is identical to the real-scene image corresponding to the direction the user's body faces under the first attitude data. That is, the real-scene image shown in the head-mounted virtual reality device matches the real-scene image corresponding to the user's body orientation, and once this match is achieved, the user will not feel dizzy.
In this embodiment, the head-mounted virtual reality device receives the original real-scene image sent by the camera. After receiving it, the head-mounted virtual reality device obtains the first attitude data at the time the original real-scene image is to be displayed on the screen and the second attitude data at the time the original real-scene image was shot. The original real-scene image is then adjusted according to the first attitude data and the second attitude data; the adjusted image is the real-scene image to be displayed, which is finally displayed. Because a large time difference often exists between shooting the original real-scene image and displaying it on the screen, if the original real-scene image were not adjusted, the problem mentioned in the background would occur: the real-scene image corresponding to the direction the user's body faces under the current attitude data would not match the real-scene image the user watches in the head-mounted virtual reality device. After adjustment according to the first attitude data and the second attitude data, the real-scene image to be displayed matches the real-scene image corresponding to the direction the user's body faces under the first attitude data, which also solves the user dizziness caused by the real-scene image mismatch.
Fig. 2 is a flow chart of embodiment two of the image display method based on a head-mounted virtual reality device provided by an embodiment of the present invention. As shown in Fig. 2, the method includes the following steps:
S201: receive the original real-scene image sent by the camera.
The execution of step S201 is similar to the corresponding step of the previous embodiment; refer to the related description of the embodiment shown in Fig. 1, which is not repeated here.
S202: obtain the shooting time at which the original real-scene image was shot and the display time at which the original real-scene image is displayed.
While shooting the original real-scene image, the camera configured on the head-mounted virtual reality device can also record its shooting time. The camera can send this shooting time to the head-mounted virtual reality device together with the original real-scene image, so the device obtains the shooting time. Meanwhile, the head-mounted virtual reality device can also obtain the current reception time when it receives the original real-scene image. Because the interval from the head-mounted virtual reality device receiving the original real-scene image to showing it on the display screen is very small and can be ignored, the reception time can be regarded as the display time; that is, the head-mounted virtual reality device thereby obtains the display time.
S203: obtain the first attitude data corresponding to the display time and the second attitude data corresponding to the shooting time.
Based on the display time and the shooting time obtained in step S202, and following the related description in embodiment one, the head-mounted virtual reality device can optionally obtain the first attitude data corresponding to the display time and the second attitude data corresponding to the shooting time by reading the data file.
Optionally, the head-mounted virtual reality device can also obtain the first attitude data and the second attitude data in the following manner: obtaining them from a storage queue that stores the attitude data collected within a preset time period, the preset time period being greater than or equal to the time difference between the display time and the shooting time.
Specifically, after the sensor collects the attitude data, the collected attitude data and its collection time can optionally be stored in an attitude database. The head-mounted virtual reality device can read attitude data from the attitude database at a preset time interval and store the read attitude data in the storage queue, where the preset time interval for reading can be the same as the preset time interval at which the sensor collects attitude data. In addition, a time difference actually also exists between the read time and the collection time of the attitude data, but this difference is generally very small and can often be ignored in practice; that is, the read time and the collection time are regarded as the same time.
The storage queue stores the attitude data collected within the preset time period. Because the head-mounted virtual reality device reads attitude data at the preset time interval, the number of attitude data read within the preset time period is fixed, which is equivalent to setting a storage cap for the storage queue. The head-mounted virtual reality device can obtain the first attitude data and the second attitude data from the storage queue according to the reception time and the shooting time, respectively. Of course, to ensure that the head-mounted virtual reality device can obtain both the first attitude data and the second attitude data, the preset time period should be greater than or equal to the time difference between the reception time and the shooting time.
Because the storage queue has a storage cap, attitude data in the storage queue can be deleted when the attitude data exceeds the cap. Optionally, the attitude data beyond the storage cap can be deleted in the following manner.
First, in response to the start-up operation of the camera, the collected attitude data is read from the attitude database at the preset time interval and stored in the storage queue.
In response to the camera start-up operation triggered by the user, the head-mounted virtual reality device starts to read attitude data and stores the read attitude data in the storage queue. Optionally, the camera start-up operation can be the user pressing a button on the head-mounted virtual reality device, or the camera can be started automatically in response to a specific type of virtual reality scene selected by the user. The types of virtual reality scene can include interactive games, panoramic pictures, real-scene videos, and so on; the specific virtual reality scene here can be a real-scene video.
Then, if the number of elapsed preset time intervals reaches the preset time period, the attitude data stored earliest in the storage queue is deleted in first-in-first-out order, and the newly read attitude data is stored in the storage queue.
The attitude data read by the head-mounted virtual reality device is stored in the storage queue in the order of reading. The head-mounted display device reads attitude data at the preset time interval; the more preset time intervals the read operation has gone through, that is, the longer the read operation has been running, the more attitude data has been read. When the number of preset time intervals elapsed, that is, the time span of the read operation, reaches the preset time period, the attitude data with the earliest collection time is deleted according to the order of entry into the storage queue, and the newly read attitude data is stored, so that the amount of data in the storage queue remains constant.
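The bounded first-in-first-out behavior described above maps naturally onto a fixed-capacity queue. A minimal sketch follows; the capacity, read interval, and record layout are illustrative assumptions, not values from the patent:

```python
from collections import deque

# Capacity = preset time period / preset read interval (illustrative values).
PRESET_PERIOD_S = 1.0
READ_INTERVAL_S = 0.01
CAPACITY = int(PRESET_PERIOD_S / READ_INTERVAL_S)  # 100 records

# A deque with maxlen evicts the earliest entry automatically (FIFO),
# so the amount of attitude data in the queue stays constant once full.
storage_queue = deque(maxlen=CAPACITY)

def store(record):
    """Append a (tick, pitch, roll, yaw) record; the oldest is evicted when full."""
    storage_queue.append(record)
```

Using `deque(maxlen=...)` makes the "delete earliest, then store newest" step implicit: the eviction happens inside `append`, which matches the constant-size property the patent describes.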
S204: determine the data difference between the first attitude data and the second attitude data.
S205: determine, according to the data difference, the transformation matrix for adjusting the original real-scene image.
S206: multiply the pixel coordinate matrix corresponding to each pixel in the original real-scene image by the transformation matrix, respectively, to generate the real-scene image to be displayed.
S207: display the real-scene image to be displayed.
The head-mounted virtual reality device can obtain the original real-scene image from the shot data. Optionally, this original real-scene image can exist in the form of a data matrix, in which the elements are the gray values of the pixels of the original real-scene image.
After the head-mounted virtual reality device obtains the first attitude data and the second attitude data, the two attitude data can be subtracted to obtain the data difference. Optionally, as described above, the data difference between the two attitude data can be an angle difference, specifically a pitch angle difference θ1, a roll angle difference θ2, and a heading angle difference θ3. The transformation matrix C for adjusting the original real-scene image is then determined from these angle differences, for example as the composition of the rotations about the three axes by θ1, θ2, and θ3.
The original real-scene image can then be adjusted in the following manner. This adjustment can be understood as adjusting the position of each pixel gray value in the original real-scene image.
Specifically, suppose the pixel coordinate matrix of a certain pixel in the original real-scene image is expressed as A = (x, y, z), where the x value and the y value respectively represent the row x and column y of this pixel in the data matrix of the original real-scene image, and the z value is only a value set uniformly for the convenience of the rotation calculation, without concrete meaning; optionally, in practice, z is generally taken as 1.
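The exact transformation matrix C is not reproduced in this text, so the sketch below uses one plausible construction under stated assumptions: composing the three axis rotations by the angle differences θ1 (pitch), θ2 (roll), and θ3 (heading). The axis assignment and multiplication order are assumptions, not the patent's:

```python
import math

def transform_matrix(theta1, theta2, theta3):
    """Compose rotations about x (pitch θ1), y (roll θ2), and z (heading θ3).

    One plausible construction C = Rz @ Ry @ Rx; the patent's actual
    formula for C is not given in this text.
    """
    c1, s1 = math.cos(theta1), math.sin(theta1)
    c2, s2 = math.cos(theta2), math.sin(theta2)
    c3, s3 = math.cos(theta3), math.sin(theta3)
    rx = [[1, 0, 0], [0, c1, -s1], [0, s1, c1]]   # rotation about x
    ry = [[c2, 0, s2], [0, 1, 0], [-s2, 0, c2]]   # rotation about y
    rz = [[c3, -s3, 0], [s3, c3, 0], [0, 0, 1]]   # rotation about z

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(rz, matmul(ry, rx))
```

When all three angle differences are zero, C reduces to the identity, so pixels are left where they are, which matches the intuition that identical attitudes need no adjustment.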
The pixel coordinate matrix corresponding to each pixel in the original real-scene image is multiplied by this transformation matrix, i.e., D = A * C, to obtain the adjusted pixel coordinate of this pixel gray value in the data matrix of the real-scene image to be displayed. Here A is the pixel coordinate matrix of any element in the data matrix of the original real-scene image, C is the above transformation matrix, and D is the pixel coordinate matrix of the corresponding element in the data matrix of the real-scene image to be displayed.
Repeating this process yields the position of each pixel gray value of the original real-scene image's data matrix in the data matrix of the real-scene image to be displayed, forming the data matrix of the real-scene image to be displayed, from which the real-scene image to be displayed is generated. Finally, this real-scene image to be displayed is shown on the head-mounted virtual display device.
In this embodiment, in response to the start-up operation of the camera, the head-mounted virtual reality device reads attitude data from the attitude database and stores it in a storage queue of preset length. When the attitude data in the storage queue exceeds the preset length, the attitude data stored earliest is deleted according to the order in which the attitude data entered the storage queue, so that the amount of attitude data in the storage queue always remains constant. The attitude data in the storage queue covers only a short period of time, so the amount of data is small; when the head-mounted virtual reality device looks up the first attitude data and the second attitude data according to the reception time and the shooting time, it searches a storage queue with a small amount of data, which speeds up the acquisition of the attitude data and allows the head-mounted virtual reality device to start adjusting the original real-scene image more quickly, obtain the real-scene image to be displayed, and finally display it.
Fig. 3 is a structural diagram of embodiment one of the image display apparatus based on a head-mounted virtual reality device provided by an embodiment of the present invention. As shown in Fig. 3, the image display apparatus based on a head-mounted virtual reality device includes: a receiving module 11, a first acquisition module 12, an adjusting module 13, and a display module 14.
The receiving module 11 is configured to receive the original real-scene image sent by the camera.
The first acquisition module 12 is configured to obtain the first attitude data at the time the original real-scene image is displayed and the second attitude data at the time the original real-scene image was shot.
The adjusting module 13 is configured to adjust the original real-scene image according to the first attitude data and the second attitude data to generate the real-scene image to be displayed.
The display module 14 is configured to display the real-scene image to be displayed.
The apparatus shown in Fig. 3 can perform the method of the embodiment shown in Fig. 1; for the parts not described in detail in this embodiment, refer to the related description of the embodiment shown in Fig. 1. For the execution process and technical effect of this technical solution, see the description in the embodiment shown in Fig. 1, which is not repeated here.
Fig. 4 is the image display device embodiment two provided in an embodiment of the present invention based on wear-type virtual reality device
Structural representation, as shown in figure 4, on the basis of embodiment illustrated in fig. 3, it is somebody's turn to do the image based on wear-type virtual reality device and shows
Device also includes:Second acquisition module 21.
Second acquisition module 21, shooting time and the original realistic picture of display when shooting original real scene image for obtaining
As when the display time.
First acquisition module 12, for obtaining with showing the first attitude data corresponding to the time and corresponding with shooting time
The second attitude data.
Alternatively, the acquisition module 12 being somebody's turn to do in the image display device based on wear-type virtual reality device is specifically used for:
Obtained from the storage queue for being stored with the attitude data collected in predetermined time period the first attitude data and
Second attitude data, predetermined time period are more than or equal to the time difference for receiving time and shooting time.
Alternatively, being somebody's turn to do the image display device based on wear-type virtual reality device also includes:Read module 22, delete mould
Block 23.
Read module 22, for, in response to the start-up operation of the camera, reading the gathered attitude data from the attitude data store at preset time intervals and storing it in the storage queue.
Removing module 23, for, if the number of elapsed preset time intervals reaches the predetermined time period, deleting in first-in-first-out order the attitude data stored earliest in the storage queue, and storing the newly read attitude data in the storage queue.
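The read and removing modules above amount to a fixed-time-window FIFO buffer of timestamped attitude samples, from which the pose nearest a shooting or display time can be fetched. The following sketch is illustrative only: the class name `PoseBuffer`, the timestamp units and the nearest-sample query policy are assumptions, since the patent fixes none of these details.

```python
import bisect
from collections import deque

class PoseBuffer:
    """Hypothetical sketch of the storage queue: keeps only the attitude
    samples gathered within a predetermined time window and serves the
    sample nearest to a requested timestamp."""

    def __init__(self, window):
        self.window = window      # predetermined time period
        self.samples = deque()    # (timestamp, pose) in arrival order

    def push(self, timestamp, pose):
        self.samples.append((timestamp, pose))
        # First-in-first-out: drop the earliest samples once they fall
        # outside the predetermined time window (removing module 23).
        while self.samples and timestamp - self.samples[0][0] > self.window:
            self.samples.popleft()

    def nearest(self, timestamp):
        # Return the stored pose whose timestamp is closest to the query
        # (used for both the first and the second attitude data).
        times = [t for t, _ in self.samples]
        i = bisect.bisect_left(times, timestamp)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - timestamp))
        return self.samples[best][1]
```

With, say, millisecond timestamps and a one-second window, samples older than one second are evicted on each push, so the queue length stays bounded while still covering the display-to-shooting time difference.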
Alternatively, the image display device based on the wear-type virtual reality device also includes: a starting module 24.
Starting module 24, for determining whether to start the camera according to the virtual reality scene type selected by the user.
Alternatively, the adjusting module 13 in the image display device based on the wear-type virtual reality device includes: a first determining unit 131, a second determining unit 132 and an adjustment unit 133.
First determining unit 131, for determining the data difference between the first attitude data and the second attitude data.
Second determining unit 132, for determining, according to the data difference, the transformation matrix used to adjust the original real scene image.
Adjustment unit 133, for multiplying the pixel coordinate matrix corresponding to each pixel in the original real scene image by the transformation matrix, to generate the real scene image to be shown.
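The three units above can be sketched as one coordinate-warping function. This is only an illustration under stated assumptions: the patent does not fix the form of the attitude data or of the transformation matrix, so here the attitude data is taken to be a single yaw angle in degrees and the transform to be a planar rotation about the image centre in homogeneous pixel coordinates.

```python
import numpy as np

def adjust_image_coords(h, w, first_pose_deg, second_pose_deg):
    """Illustrative sketch of units 131-133: derive the data difference
    between the two attitude samples, build a transformation matrix from
    it, and multiply it with each pixel's homogeneous coordinate vector."""
    # Unit 131: data difference between first and second attitude data.
    delta = np.deg2rad(first_pose_deg - second_pose_deg)

    # Unit 132: transformation matrix determined from the difference
    # (assumed here: rotation about the image centre).
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    c, s = np.cos(delta), np.sin(delta)
    to_centre = np.array([[1, 0, -cx], [0, 1, -cy], [0, 0, 1]])
    rotate = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    back = np.array([[1, 0, cx], [0, 1, cy], [0, 0, 1]])
    transform = back @ rotate @ to_centre

    # Unit 133: multiply every pixel-coordinate vector by the matrix.
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    mapped = transform @ coords
    return mapped[:2].reshape(2, h, w)   # per-pixel (x, y) after adjustment
```

Sampling the original image at the mapped coordinates would then yield the real scene image to be shown; when the two attitude samples are equal, the transform degenerates to the identity and the image is displayed unchanged.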
The device shown in Fig. 4 can perform the method of the embodiment illustrated in Fig. 2; for the parts not described in detail in this embodiment, reference may be made to the related description of the embodiment illustrated in Fig. 2. The implementation process and technical effect of this technical scheme are described in the embodiment shown in Fig. 2 and will not be repeated here.
The built-in functions and structure of the image display device based on the wear-type virtual reality device are described above. In one possible design, the structure of the image display device based on the wear-type virtual reality device can be realized as a wear-type virtual reality device. Fig. 5 is a structural representation of embodiment one of the wear-type virtual reality device provided in an embodiment of the present invention. As shown in Fig. 5, the electronic equipment includes: a memory 31, and a processor 32 connected with the memory; the memory 31 is used for storing a program supporting the electronic equipment in performing the method for displaying images provided in any of the above embodiments, and the processor 32 is configured to execute the program stored in the memory 31.
The program includes one or more computer instructions, wherein the following steps can be realized when the one or more computer instructions are executed by the processor 32:
Receive the original real scene image sent by the camera;
Obtain the first attitude data when displaying the original real scene image and the second attitude data when shooting the original real scene image;
Adjust the original real scene image according to the first attitude data and the second attitude data, to generate the real scene image to be shown;
Display the real scene image to be shown.
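The four processor steps can be sketched as a single loop iteration with the hardware-specific pieces injected as callables. Everything here is a hypothetical stand-in: `receive_frame`, `get_pose`, `adjust` and `show` are placeholder names, and the fixed 20 ms receive-to-display latency is an assumption used only to separate the two pose queries.

```python
def display_pipeline(receive_frame, get_pose, adjust, show):
    """One iteration of the four processor steps, with all device-specific
    work delegated to the injected callables."""
    frame, shooting_time = receive_frame()      # step 1: receive the image
    display_time = shooting_time + 0.02         # assumed pipeline latency
    second = get_pose(shooting_time)            # step 2: pose when shot...
    first = get_pose(display_time)              # ...and pose when displayed
    adjusted = adjust(frame, first, second)     # step 3: adjust the image
    show(adjusted)                              # step 4: display it
    return adjusted
```

In a real device, `get_pose` would query something like the storage queue of attitude data and `adjust` would apply the transformation-matrix correction; the structure of the loop is the point here, not the stubs.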
Alternatively, the processor 32 is also used for performing all or part of the steps in the foregoing method.
Alternatively, the structure of the electronic equipment can also include a camera 33, a sensor 34 and a thread controller 35.
Camera 33, for shooting the original real scene image.
Sensor 34, for gathering the attitude data of the user.
Thread controller 35, for, in response to the start-up operation of the camera, controlling the opening of the attitude data reading thread, so that the processor reads the first attitude data and the second attitude data through the attitude data reading thread.
Alternatively, the structure of the electronic equipment can also include a communication interface 36, used for communication between the electronic equipment and other equipment or a communication network.
Fig. 6 is an image display system based on a head-mounted display apparatus provided in an embodiment of the present invention. The image display system based on the head-mounted display apparatus includes: a wear-type virtual reality device 41 with a camera, and a data processing device 42 for realizing the method for displaying images based on the wear-type virtual reality device shown in Fig. 1 and Fig. 2.
Alternatively, the wear-type virtual reality device 41 in the image display system based on the head-mounted display apparatus can specifically be used for shooting the original real scene image and collecting the shooting time at which the original real scene image is shot.
Fig. 7 is a schematic diagram of the internal configuration structure of a wear-type virtual reality device provided in an embodiment of the present invention.
Display unit 501 can include a display panel arranged on the side surface of the wear-type virtual reality device 500 facing the user's face; it can be an entire panel, or a left panel and a right panel corresponding respectively to the user's left eye and right eye. The display panel can be an electroluminescent (EL) element, a liquid crystal display, a miniature display with a similar structure, or a retinal direct-display or similar laser scanning display.
Virtual image optical unit 502 projects the image shown by display unit 501 in a magnified way and allows the user to observe the displayed image as an enlarged virtual image. The display image output on the display unit 501 can be an image of a virtual scene provided by a content reproduction device (a Blu-ray Disc or DVD player) or a streaming media server, or an image of a real scene shot by the external camera 510. In some embodiments, virtual image optical unit 502 can include a lens unit, such as a spherical lens, an aspherical lens or a Fresnel lens.
Input operation unit 503 includes at least one functional component used for performing input operations, such as a key, a button, a switch or another component with similar functions; user instructions are received through the functional component and output to control unit 507.
State information acquisition unit 504 is used for obtaining the state information of the user wearing the wear-type virtual reality device 500. State information acquisition unit 504 can include various types of sensors for detecting state information itself, and can obtain state information through communication unit 505 from external equipment, such as a smart phone, a watch or another multi-functional terminal worn by the user. State information acquisition unit 504 can obtain the position information and/or attitude information of the head of the user. State information acquisition unit 504 can include one or more of a gyroscope sensor, an acceleration sensor, a global positioning system (GPS) sensor, a geomagnetic sensor, a Doppler effect sensor, an infrared sensor and a radio-frequency field intensity sensor. In addition, state information acquisition unit 504 obtains the state information of the user wearing the wear-type virtual reality device, for example the operating state of the user (such as whether the user is wearing the wear-type virtual reality device 500), the action state of the user (a moving state such as being static, walking or running, the posture of the hand or fingertip, the open or closed state of the eyes, the sight direction, the pupil size), the mental state (such as whether the user is immersed in observing the displayed image), or even the physiological state.
Communication unit 505 performs communication processing with external devices, modulation and demodulation processing, and the coding and decoding processing of communication signals. In addition, control unit 507 can send transmission data to external devices through communication unit 505. The communication mode can be wired or wireless, for example Mobile High-Definition Link (MHL), Universal Serial Bus (USB), High Definition Multimedia Interface (HDMI), Wireless Fidelity (Wi-Fi), Bluetooth communication or Bluetooth Low Energy communication, the mesh network of the IEEE 802.11s standard, etc. In addition, communication unit 505 can be a cellular radio transceiver operating according to Wideband Code Division Multiple Access (W-CDMA), Long Term Evolution (LTE) and similar standards.
In some embodiments, wear-type virtual reality device 500 can also include a memory cell; memory cell 506 is configured as a mass-storage device such as a solid-state drive (SSD). In some embodiments, memory cell 506 can store application programs or various types of data. For example, the content viewed by the user using the wear-type virtual reality device 500 can be stored in memory cell 506.
In some embodiments, wear-type virtual reality device 500 can also include a control unit; control unit 507 can include a central processing unit (CPU) or another device with similar functions. In some embodiments, control unit 507 can be used for executing the application programs stored by memory cell 506, or control unit 507 can also be used for executing the methods, functions and circuits of the operations disclosed in some embodiments of the application.
Graphics processing unit 508 is used for performing signal processing, for example image quality correction related to the image signal output from control unit 507, and converting its resolution into the resolution matching the screen of display unit 501. Then, the display driving unit 504 selects each row of pixels of display unit 501 in turn and scans each row of pixels of display unit 501 line by line, thus providing pixel signals based on the signal-processed image signals.
In some embodiments, wear-type virtual reality device 500 can also include an external camera. External camera 510 can be arranged on the front surface of the main body of wear-type virtual reality device 500; there can be one or more external cameras 510. External camera 510 can obtain three-dimensional information and can also be used as a range sensor. In addition, a position sensitive detector (PSD) detecting the reflected signal from an object, or another kind of range sensor, can be used together with external camera 510. External camera 510 and the range sensor can be used for detecting the body position, posture and shape of the user wearing the wear-type virtual reality device 500. In addition, under certain conditions the user can directly view or preview the real scene through external camera 510.
In some embodiments, wear-type virtual reality device 500 can also include a sound processing unit; sound processing unit 511 can perform sound quality correction or sound amplification of the sound signal output from control unit 507, signal processing of the input sound signal, etc. Then, sound input/output unit 512 outputs sound to the outside after sound processing and inputs the sound from the microphone.
It should be noted that the structures or parts shown in the bold frame in Fig. 7 can be independent of the wear-type virtual reality device 500, and can for example be arranged in an external processing system, such as a computer system, used in cooperation with the wear-type virtual reality device 500; or the structures or parts shown in the dotted frame can be arranged inside or on the surface of the wear-type virtual reality device 500.
The device embodiments described above are only schematic; the units illustrated as separate components may or may not be physically separate, and the parts shown as units may or may not be physical units, i.e. they can be located in one place or distributed over multiple network elements. Some or all of the modules can be selected according to actual needs to realize the purpose of the scheme of this embodiment. Those of ordinary skill in the art can understand and implement it without creative work.
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be realized by means of a general hardware platform plus the necessary software, and naturally can also be realized by a combination of hardware and software. Based on such an understanding, the part of the above technical solution that essentially contributes to the prior art can be embodied in the form of a product; the computer product can be stored in a computer-readable storage medium, such as a ROM/RAM, magnetic disc or CD, and includes some instructions causing a computer device (which can be a personal computer, a server, a network device, etc.) to perform the method described in each embodiment or some parts of the embodiments.
Finally it should be noted that the above embodiments are merely illustrative of the technical solutions of the present invention, rather than limiting them. Although the present invention is described in detail with reference to the foregoing embodiments, it will be understood by those within the art that the technical schemes described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; and these modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the various embodiments of the present invention.
Claims (10)
- 1. A method for displaying images based on a wear-type virtual reality device, characterised by including: receiving the original real scene image sent by a camera; obtaining the first attitude data when displaying the original real scene image and the second attitude data when shooting the original real scene image; adjusting the original real scene image according to the first attitude data and the second attitude data, to generate the real scene image to be shown; displaying the real scene image to be shown.
- 2. The method according to claim 1, characterised in that after receiving the original real scene image sent by the camera, the method also includes: obtaining the shooting time when shooting the original real scene image and the display time when displaying the original real scene image; the obtaining of the first attitude data when displaying the original real scene image and the second attitude data when shooting the original real scene image includes: obtaining the first attitude data corresponding to the display time and the second attitude data corresponding to the shooting time.
- 3. The method according to claim 2, characterised in that the obtaining of the first attitude data corresponding to the display time and the second attitude data corresponding to the shooting time includes: obtaining the first attitude data and the second attitude data from a storage queue storing the attitude data collected within a predetermined time period, the predetermined time period being greater than or equal to the time difference between the display time and the shooting time.
- 4. The method according to claim 3, characterised in that the method also includes: in response to the start-up operation of the camera, reading the gathered attitude data from the attitude data store at preset time intervals and storing it in the storage queue; if the number of elapsed preset time intervals reaches the predetermined time period, deleting in first-in-first-out order the attitude data stored earliest in the storage queue, and storing the newly read attitude data in the storage queue.
- 5. The method according to any one of claims 1 to 4, characterised in that the adjusting of the original real scene image according to the first attitude data and the second attitude data, to generate the real scene image to be shown, includes: determining the data difference between the first attitude data and the second attitude data; determining, according to the data difference, the transformation matrix used to adjust the original real scene image; multiplying the pixel coordinate matrix corresponding to each pixel in the original real scene image by the transformation matrix, to generate the real scene image to be shown.
- 6. The method according to claim 5, characterised in that the method also includes: determining, according to the virtual reality scene type selected by the user, whether to start the camera.
- 7. A wear-type virtual reality device, characterised by including: a memory, and a processor connected with the memory; the memory, for storing one or more computer instructions, wherein the one or more computer instructions are called and executed by the processor; the processor, for executing the one or more computer instructions to realize the method of any one of claims 1 to 6.
- 8. The device according to claim 7, characterised in that the device also includes: a camera, for shooting the original real scene image; a sensor, for gathering the attitude data of the user; a thread controller, for, in response to the start-up operation of the camera, controlling the opening of the attitude data reading thread, so that the processor reads the first attitude data and the second attitude data through the attitude data reading thread.
- 9. An image display system based on a wear-type virtual reality device, characterised in that the system includes: a data processing device for performing the method of any one of claims 1 to 6, and a head-mounted display apparatus with a camera.
- 10. The system according to claim 9, characterised in that the head-mounted display apparatus with a camera is used for: shooting the original real scene image and collecting the shooting time at which the original real scene image is shot.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711117228.9A CN107835404A (en) | 2017-11-13 | 2017-11-13 | Method for displaying image, equipment and system based on wear-type virtual reality device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711117228.9A CN107835404A (en) | 2017-11-13 | 2017-11-13 | Method for displaying image, equipment and system based on wear-type virtual reality device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107835404A true CN107835404A (en) | 2018-03-23 |
Family
ID=61655232
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711117228.9A Pending CN107835404A (en) | 2017-11-13 | 2017-11-13 | Method for displaying image, equipment and system based on wear-type virtual reality device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107835404A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109671033A (en) * | 2018-12-21 | 2019-04-23 | 上海联影医疗科技有限公司 | A kind of image dark field correction method and system |
CN109814710A (en) * | 2018-12-27 | 2019-05-28 | 青岛小鸟看看科技有限公司 | Data processing method and device and virtual reality equipment |
CN110519247A (en) * | 2019-08-16 | 2019-11-29 | 上海乐相科技有限公司 | A kind of one-to-many virtual reality display method and device |
CN110881102A (en) * | 2018-09-06 | 2020-03-13 | 佳能株式会社 | Image processing apparatus, image processing method, and computer readable medium |
CN111949114A (en) * | 2019-05-15 | 2020-11-17 | 中国移动通信有限公司研究院 | Image processing method and device and terminal |
CN112446965A (en) * | 2020-12-04 | 2021-03-05 | 上海影创信息科技有限公司 | Delay detection safety protection method and system of VR glasses and VR glasses |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160105669A1 (en) * | 2014-10-13 | 2016-04-14 | Samsung Electronics Co., Ltd. | Method and apparatus for rendering content |
CN106909221A (en) * | 2017-02-21 | 2017-06-30 | 北京小米移动软件有限公司 | Image processing method and device based on VR systems |
CN106998409A (en) * | 2017-03-21 | 2017-08-01 | 华为技术有限公司 | A kind of image processing method, head-mounted display and rendering apparatus |
- 2017
  - 2017-11-13 CN CN201711117228.9A patent/CN107835404A/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160105669A1 (en) * | 2014-10-13 | 2016-04-14 | Samsung Electronics Co., Ltd. | Method and apparatus for rendering content |
CN106909221A (en) * | 2017-02-21 | 2017-06-30 | 北京小米移动软件有限公司 | Image processing method and device based on VR systems |
CN106998409A (en) * | 2017-03-21 | 2017-08-01 | 华为技术有限公司 | A kind of image processing method, head-mounted display and rendering apparatus |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110881102A (en) * | 2018-09-06 | 2020-03-13 | 佳能株式会社 | Image processing apparatus, image processing method, and computer readable medium |
CN110881102B (en) * | 2018-09-06 | 2022-01-04 | 佳能株式会社 | Image capturing apparatus, control method of image capturing apparatus, and computer readable medium |
US11750916B2 (en) | 2018-09-06 | 2023-09-05 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and non-transitory computer readable medium |
CN109671033A (en) * | 2018-12-21 | 2019-04-23 | 上海联影医疗科技有限公司 | A kind of image dark field correction method and system |
CN109671033B (en) * | 2018-12-21 | 2024-09-06 | 上海联影医疗科技股份有限公司 | Image dark field correction method and system |
CN109814710A (en) * | 2018-12-27 | 2019-05-28 | 青岛小鸟看看科技有限公司 | Data processing method and device and virtual reality equipment |
CN109814710B (en) * | 2018-12-27 | 2022-05-13 | 青岛小鸟看看科技有限公司 | Data processing method and device and virtual reality equipment |
CN111949114A (en) * | 2019-05-15 | 2020-11-17 | 中国移动通信有限公司研究院 | Image processing method and device and terminal |
CN110519247A (en) * | 2019-08-16 | 2019-11-29 | 上海乐相科技有限公司 | A kind of one-to-many virtual reality display method and device |
CN110519247B (en) * | 2019-08-16 | 2022-01-21 | 上海乐相科技有限公司 | One-to-many virtual reality display method and device |
CN112446965A (en) * | 2020-12-04 | 2021-03-05 | 上海影创信息科技有限公司 | Delay detection safety protection method and system of VR glasses and VR glasses |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108491775B (en) | Image correction method and mobile terminal | |
CN109949412B (en) | Three-dimensional object reconstruction method and device | |
CN107835404A (en) | Method for displaying image, equipment and system based on wear-type virtual reality device | |
CN109361865B (en) | Shooting method and terminal | |
CN106558025B (en) | Picture processing method and device | |
CN108712603B (en) | Image processing method and mobile terminal | |
CN108184050B (en) | Photographing method and mobile terminal | |
EP3965003A1 (en) | Image processing method and device | |
US20140160129A1 (en) | Information processing apparatus and recording medium | |
CN107566728A (en) | A kind of image pickup method, mobile terminal and computer-readable recording medium | |
CN108322644A (en) | A kind of image processing method, mobile terminal and computer readable storage medium | |
RU2745737C1 (en) | Video recording method and video recording terminal | |
US11792351B2 (en) | Image processing method, electronic device, and computer-readable storage medium | |
CN109002164A (en) | It wears the display methods for showing equipment, device and wears display equipment | |
CN109002248B (en) | VR scene screenshot method, equipment and storage medium | |
CN109190509A (en) | A kind of personal identification method, device and computer readable storage medium | |
CN111885307B (en) | Depth-of-field shooting method and device and computer readable storage medium | |
CN110650294A (en) | Video shooting method, mobile terminal and readable storage medium | |
CN112702533B (en) | Sight line correction method and sight line correction device | |
CN109462745A (en) | A kind of white balancing treatment method and mobile terminal | |
CN102043942A (en) | Visual direction judging method, image processing method, image processing device and display device | |
CN108270971B (en) | Mobile terminal focusing method and device and computer readable storage medium | |
CN110825897A (en) | Image screening method and device and mobile terminal | |
CN111866388B (en) | Multiple exposure shooting method, equipment and computer readable storage medium | |
CN109803087A (en) | A kind of image generating method and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180323 |