CN103941851B - Method and system for realizing virtual touch calibration - Google Patents
Method and system for realizing virtual touch calibration
- Publication number
- CN103941851B CN103941851B CN201310180909.5A CN201310180909A CN103941851B CN 103941851 B CN103941851 B CN 103941851B CN 201310180909 A CN201310180909 A CN 201310180909A CN 103941851 B CN103941851 B CN 103941851B
- Authority
- CN
- China
- Prior art keywords
- user
- virtual
- coordinate
- coordinate system
- calibration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
Abstract
The invention discloses a method and system for realizing virtual touch calibration. The method includes: creating a virtual calibration menu; establishing a first coordinate system whose x-axis and y-axis span the plane in which the displayed virtual calibration menu lies; establishing a second coordinate system and expressing the user's gesture position in coordinates of the second coordinate system; calculating the correspondence between the first coordinate system and the second coordinate system; according to the correspondence, expressing the gesture position coordinates given in the second coordinate system in coordinates of the first coordinate system; and, according to the gesture position expressed in the first coordinate system, correcting the correspondence between the user's gesture and the virtual calibration menu. Compared with the prior art, the invention improves the accuracy of virtual touch and improves the user's sensory experience.
Description
Technical field
The present invention relates to augmented reality, and more particularly to a method and system for realizing virtual touch calibration.
Background technology
Emerging augmented reality technology makes interaction between the virtual world and the real world more direct and natural, and gesture-based human-computer interaction is an indispensable key technology for realizing such interaction. Allowing a real person or object to interact directly with a 3D virtual projection, that is, allowing gestures to interact directly with virtual objects so that the user manipulates the 3D virtual projection by hand, is natural and intuitive and makes the human-computer interaction experience far more attractive.
In the prior art, the user interaction interface and implementation of 3D virtual projection and virtual touch include a depth detector, a binocular-image disparity calculation module, a binocular-image processing module, a 3D display device, a gesture recognition module, a camera and a virtual touch controller. As shown in Fig. 1, the depth detector measures the distances from the user's head and hands to the 3D display device; the binocular-image disparity calculation module calculates the binocular image disparity from the distance information detected by the depth detector; the binocular-image processing module processes the images according to the disparity information; the processed images are then sent to the 3D display device, which projects the virtual picture within arm's reach of the user; the gesture recognition module waits until the user's gesture operates on the virtual projected picture and then recognizes the user's finger motion track using the depth detector and the camera; and the virtual touch controller reacts accordingly to the user's gesture and motion track.
However, in the course of implementing the above technical solution, the present inventors found that it has at least the following technical problems. To realize the virtual touch effect, the position space of the gesture must be mapped onto the pixel space of the virtual projected picture by a spatial projection transformation. Because the depth detector is movable, the projection transformation rule set before the detector is moved no longer applies after it is moved; likewise, when the user changes, the interpupillary distance of the viewing eyes changes, so the out-of-screen depth of the perceived virtual projected picture changes, and the transformation rule set for the previous user no longer applies to the new one. All of these factors easily produce the disordered effect of a gesture clicking in one place while the response appears in another, impairing interaction accuracy.
Summary of the invention
By providing a method for realizing virtual touch calibration, the embodiments of the present application solve the prior-art problem that, after the depth detector is moved or the user changes so that the interpupillary distance changes, gesture clicks and responses become inconsistent; they improve interaction accuracy and improve the user's viewing experience.
An embodiment of the present application provides a method for realizing virtual touch calibration, the method including:
creating a virtual calibration menu;
establishing a first coordinate system whose x-axis and y-axis span the plane in which the displayed virtual calibration menu lies;
establishing a second coordinate system and expressing the user's gesture position in coordinates of the second coordinate system;
calculating the correspondence between the first coordinate system and the second coordinate system;
according to the correspondence, expressing the gesture position coordinates given in the second coordinate system in coordinates of the first coordinate system; and
according to the gesture position coordinates expressed in the first coordinate system, correcting the correspondence between the user's gesture and the virtual calibration menu.
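The claimed steps can be sketched in code. The following is a minimal illustration assuming a simple per-axis scale-and-offset correspondence in the menu plane; the function names and the fitting method are illustrative, not taken from the patent:

```python
def compute_correspondence(menu_pts, touched_pts):
    """Fit a per-axis scale and offset mapping second-coordinate-system
    points onto first-coordinate-system (pixel) points, using the
    calibration points of the virtual calibration menu."""
    mx = [p[0] for p in menu_pts]; my = [p[1] for p in menu_pts]
    tx = [p[0] for p in touched_pts]; ty = [p[1] for p in touched_pts]
    sx = (max(mx) - min(mx)) / (max(tx) - min(tx))  # x scale factor
    sy = (max(my) - min(my)) / (max(ty) - min(ty))  # y scale factor
    ox = min(mx) - sx * min(tx)                     # x offset
    oy = min(my) - sy * min(ty)                     # y offset
    return lambda p: (sx * p[0] + ox, sy * p[1] + oy)

# A gesture measured by the depth detector is re-expressed in screen pixels:
menu = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]   # first coordinate system
touched = [(10, 5), (106, 5), (10, 59), (106, 59)]    # second coordinate system
to_first = compute_correspondence(menu, touched)
print(to_first((58, 32)))  # centre of the touched menu -> (960.0, 540.0)
```

A full implementation would also handle the rotation component discussed in the detailed description below; this sketch covers only translation and scaling.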
Further, the virtual calibration menu comprises three circular icons, a first circle, a second circle and a third circle, whose three centre coordinates are respectively the coordinates of the lower-left corner, the lower-right corner and the upper-left corner of the rectangular displayed virtual screen.
Further, calculating the correspondence between the first coordinate system and the second coordinate system specifically comprises: according to the first-coordinate-system coordinates of the calibration points of the virtual calibration menu, and the acquired second-coordinate-system coordinates of the calibration points of the virtual calibration menu virtually touched by the user, calculating the translation amount, scaling ratio and rotation angle that convert the coordinates of any point in the second coordinate system into first-coordinate-system coordinates.
Further, calculating the correspondence between the first coordinate system and the second coordinate system specifically comprises: making the coordinates of any point of the virtual screen virtually touched by the user, expressed in the second coordinate system, correspond to its first-coordinate-system coordinates, wherein A is the horizontal resolution of the displayed virtual screen, B is the vertical resolution of the displayed virtual screen, and P1, P2 and P3 are the second-coordinate-system coordinates of the first circle centre, second circle centre and third circle centre of the virtual calibration menu as virtually touched.
Further, after the correspondence between the user's gesture and the virtual calibration menu is corrected, the virtual picture to be operated is displayed, and changes of the virtual picture position caused by changes of the position of the user's eyes are compensated, the compensation specifically including:
compensating the shift of the virtual picture caused by an eye-position offset (x_m, y_m) parallel to the display screen plane, where a negative sign indicates that the offset direction of the eye position and the offset direction of the virtual picture position are opposite;
compensating the shift of the virtual picture caused by an eye-position offset z_m perpendicular to the display screen plane; and
obtaining a vertical disparity compensation of the left-eye and right-eye images when the user's eye position tilts, the left-eye compensation being opposite in direction to the right-eye compensation;
wherein Δ is the out-of-screen distance of the virtual object, x_m and y_m are the displacements of the user's eyes parallel to the display screen plane, L is the distance between the user and the screen, s_x and s_y are the offsets by which the displayed virtual picture is shifted in response to the eye offset, (x_e, y_e, z_e) is the position coordinate of the eyes, z_m is the displacement of the user's eyes perpendicular to the display screen plane, p is the disparity of the left-eye and right-eye images of the virtual picture to be operated, and θ is the tilt angle of the user's eyes.
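The exact compensation formulas are published as images and are not reproduced in this text. A plausible reconstruction under a similar-triangles assumption is sketched below; the factor Δ/(L − Δ) for the parallel shift and the half-disparity tilt term are assumptions, not the patent's verbatim equations:

```python
import math

def parallel_shift_compensation(x_m, y_m, delta, L):
    """Screen-plane image shift (s_x, s_y) that holds the virtual picture
    still when the eyes move by (x_m, y_m) parallel to the display. The
    negative sign encodes the opposite-direction relation stated above."""
    k = delta / (L - delta)  # assumed similar-triangles ratio
    return (-x_m * k, -y_m * k)

def tilt_compensation(p, theta):
    """Assumed vertical-disparity correction when the eyes tilt by theta:
    the left-eye image is shifted opposite to the right-eye image."""
    v = (p / 2.0) * math.sin(theta)
    return (v, -v)  # (left-eye vertical shift, right-eye vertical shift)

# Eyes 10 cm to the right, picture 20 cm out of a screen 120 cm away:
print(parallel_shift_compensation(10.0, 0.0, 20.0, 120.0))  # -> (-2.0, -0.0)
```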
Further, when the user operates the virtual picture, an audio feedback prompt is given.
Further, when the audio feedback prompt is given, if the signal strengths produced for the left ear and the right ear are S_L and S_R respectively, then the left and right intensities P_L and P_R of the loudspeaker output signal are determined from S_L, S_R and the distances L1 and L2, where L1 and L2 are respectively the distances between the user's gesture position and the user's left and right ears.
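The intensity formulas are likewise published as images; one simple reading, assuming the nearer ear receives the stronger output via complementary distance weighting (this weighting is an assumption), is:

```python
def prompt_intensities(S_L, S_R, L1, L2):
    """Left/right loudspeaker output intensities (P_L, P_R) derived from
    the per-ear signal strengths S_L, S_R and the gesture-to-ear
    distances L1, L2. The nearer ear gets the larger share."""
    total = L1 + L2
    return (S_L * L2 / total, S_R * L1 / total)

# Gesture close to the left ear (L1 = 1, L2 = 3):
print(prompt_intensities(1.0, 1.0, 1.0, 3.0))  # -> (0.75, 0.25)
```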
An embodiment of the present application also provides a system for realizing virtual touch calibration, the system including:
a creating unit for creating a virtual calibration menu;
a construction unit for establishing a first coordinate system whose x-axis and y-axis span the plane in which the displayed virtual calibration menu lies, and for establishing a second coordinate system in which the user's gesture position is expressed;
a computing unit for calculating the correspondence between the first coordinate system and the second coordinate system; and
a calibration unit for, according to the correspondence, expressing the gesture position coordinates given in the second coordinate system in coordinates of the first coordinate system, and correcting the correspondence between the user's gesture and the virtual calibration menu.
Further, the creating unit is specifically configured to create three circular icons comprising a first circle, a second circle and a third circle, whose three centre coordinates are respectively the coordinates of the lower-left corner, the lower-right corner and the upper-left corner of the rectangular displayed virtual screen.
Further, the computing unit is specifically configured to: according to the first-coordinate-system coordinates of the calibration points of the virtual calibration menu, and the acquired second-coordinate-system coordinates of the calibration points of the virtual calibration menu virtually touched by the user, calculate the translation amount, scaling ratio and rotation angle that convert the coordinates of any point in the second coordinate system into first-coordinate-system coordinates.
Further, the computing unit is specifically configured to make the coordinates of any point of the virtual screen virtually touched by the user, expressed in the second coordinate system, correspond to its first-coordinate-system coordinates, wherein A is the horizontal resolution of the displayed virtual screen, B is the vertical resolution of the displayed virtual screen, and P1, P2 and P3 are the second-coordinate-system coordinates of the first circle centre, second circle centre and third circle centre of the virtual calibration menu as virtually touched.
Further, the system also includes a compensating unit for, after the virtual picture to be operated is displayed, compensating changes of the virtual picture position caused by changes of the position of the user's eyes, specifically including:
a first compensating module for compensating the shift of the virtual picture caused by an eye-position offset (x_m, y_m) parallel to the display screen plane, where a negative sign indicates that the offset direction of the eye position and the offset direction of the virtual picture position are opposite;
a second compensating module for compensating the shift of the virtual picture caused by an eye-position offset z_m perpendicular to the display screen plane; and
a third compensating module for obtaining a vertical disparity compensation of the left-eye and right-eye images when the user's eye position tilts, the left-eye compensation being opposite in direction to the right-eye compensation;
wherein Δ is the out-of-screen distance of the virtual object, x_m and y_m are the displacements of the user's eyes parallel to the display screen plane, L is the distance between the user and the screen, s_x and s_y are the offsets by which the displayed virtual picture is shifted in response to the eye offset, (x_e, y_e, z_e) is the position coordinate of the eyes, z_m is the displacement of the user's eyes perpendicular to the display screen plane, p is the disparity of the left-eye and right-eye images of the virtual picture to be operated, and θ is the tilt angle of the user's eyes.
Further, the system also includes a prompt unit for giving an audio feedback prompt when the user operates the virtual picture.
Further, when giving the audio feedback prompt, the prompt unit determines, from the signal strengths S_L and S_R produced for the left and right ears and the distances L1 and L2, the left and right intensities P_L and P_R of the loudspeaker output signal, where L1 and L2 are respectively the distances between the user's gesture position and the user's left and right ears.
The one or more technical solutions provided in the embodiments of the present application have at least the following technical effects or advantages. When the position of the depth detector changes, or the interpupillary distance changes after the user is replaced, the technical means of having the user click the calibration points of the virtual calibration menu is employed to recalibrate the user's gesture operation against the virtual projected picture. This effectively solves the prior-art problem of disordered gesture clicks and inconsistent responses after such changes occur, so that interaction accuracy is maintained even when such changes take place.
Description of the drawings
Fig. 1 is a method flow diagram of the prior art;
Fig. 2 is a flow diagram of a virtual touch operation incorporating the method of the invention;
Fig. 3 is a flow diagram of the method for realizing virtual touch calibration provided by an embodiment of the present application;
Fig. 4 is a block diagram of the system for realizing virtual touch calibration provided by an embodiment of the present application;
Fig. 5 shows the calibration menu created in this embodiment;
Fig. 6 shows the first coordinate system constructed in this embodiment;
Figs. 7 to 10 show the correspondence between the second coordinate system and the first coordinate system in this embodiment;
Fig. 11 is a 3D display schematic diagram of this embodiment;
Fig. 12 is a schematic diagram of displacement compensation parallel to the display plane provided by this embodiment;
Fig. 13 is a schematic diagram of displacement compensation perpendicular to the display plane provided by this embodiment;
Fig. 14 is a schematic diagram of head-tilt compensation provided by this embodiment;
Fig. 15 is a schematic diagram of the audio prompt provided by this embodiment.
Specific embodiment
By providing a method and system for realizing virtual touch calibration, the embodiments of the present application add, on top of the prior art, calibration and compensation for changes of the user's eye position. This solves the prior-art problem of disordered virtual touch when the depth detector position changes or the interpupillary distance changes after the user is replaced, as well as the problem that, during virtual operation, slight changes of the eye position degrade operating accuracy and spoil the sensory experience.
To solve the above problems, the general idea of the technical solution in the embodiments of the present application is as follows.
For the problem of disordered virtual touch, a user-calibration method is adopted: when the position of the depth detector changes or the user is replaced, the user recalibrates the virtual projection device according to the virtual calibration menu provided by the system, so that the user's 3D gesture position can be correctly projected into the 3D virtual picture space.
Meanwhile, methods of compensating for user eye-position displacement and eye-position tilt are adopted: after the user's eye position shifts slightly, a corresponding displacement compensation is applied to the displayed 3D virtual picture so that the picture the user sees does not move with the user; and when the user's eye position tilts slightly, a corresponding angular compensation is applied to the displayed 3D virtual picture so that the user sees a virtual image whose disparities coincide correctly. This improves the precision of operation and improves the user's sensory experience.
For a better understanding of the above technical solution, it is described in detail below with reference to the accompanying drawings and specific embodiments.
The embodiments of the present application provide a method and system for realizing virtual touch calibration. The significance of calibration lies in the following: in the processor serving as the 3D display control core, the position and size of 3D graphics are all computed in pixels, whereas in the depth detector serving as the 3D gesture-recognition core, the position information of a gesture is computed in physical dimensions. Therefore, to decide whether the position of the user's gesture touches the position of the 3D image, a calibration process is needed, that is, a bridge must be built between physical space and digital space. Such a bridge is usually a fixed correspondence, set on the premise that the position of the depth detector does not change and that the interpupillary distance of the eyes is a constant. So when the position of the depth detector changes in actual use (at present, the depth detector is an external stand-alone device that can be moved arbitrarily), the previously set calibration relation can no longer complete virtual touch actions correctly; at the same time, because the interpupillary distance differs between users, using a fixed interpupillary distance for the calibration of different users naturally makes virtual touch inaccurate.
Embodiment one
To state the calibration method provided by this embodiment more fully, it is explained in conjunction with the user's virtual touch process, as shown in Fig. 2. The steps are as follows.
S01: judge whether the position of the depth detector has changed or whether the user has been replaced.
In the embodiment of the present application, a motion sensor such as an accelerometer or gyroscope is built into the depth detector, and the sensor monitors whether the depth detector has moved; meanwhile, face detection can be performed on the user through the 2D camera configured on the display to judge whether the user has been replaced.
When the motion sensor senses that the position of the depth detector has changed, or the 2D camera detects that the viewing user has been replaced, the display interface pops up a menu prompt informing the user that the above state has changed and that recalibration is needed for normal viewing, and asks the user whether to calibrate; the user may choose yes or no.
If the user chooses to calibrate, the calibration method provided by this embodiment is started, namely:
S02: invoke the user calibration system to perform user calibration.
The user calibration process is described below in conjunction with the user calibration method (see Fig. 3).
Step S021: create a virtual calibration menu.
The calibration menu created in this embodiment is shown in Fig. 5: four circular icons numbered 1, 2, 3 and 4 (each corresponding to a three-dimensional computer-graphics model, displayed stereoscopically), whose four centres correspond to the four corners of a rectangle having the same length and width as the displayed virtual screen. In fact, because three points determine a plane, three circular icons are enough to build a calibration menu and complete the calibration. Those skilled in the art will understand that the form of the calibration menu is not limited to this embodiment.
Step S022: establish a first coordinate system whose x-axis and y-axis span the plane in which the displayed virtual calibration menu lies.
This coordinate system takes the plane of the virtually displayed calibration menu as the x-y plane, with z = 0 and the positive z direction pointing toward the user, away from the screen, as shown in Fig. 6. The first coordinate system in this embodiment is defined on the display plane, in units of pixels (those skilled in the art will understand that the virtual screen is formed by the display screen alternately showing left-eye and right-eye images with disparity at a certain frequency, producing a virtual image in the viewer's eyes). For example, defining the centre coordinates of the circles in pixels, if the resolution of the displayed virtual screen is 1920*1080, the coordinates of the four centres are respectively 1(0, 0, 0), 2(1920, 0, 0), 3(0, 1080, 0) and 4(1920, 1080, 0).
Step S023: establish a second coordinate system and express the user's gesture position in coordinates of the second coordinate system.
The second coordinate system is usually defined as the coordinate system used by the depth detector when detecting the user's gesture position, and is used to express the gesture position coordinates. When the user clicks the centre of circle 1 of the virtual calibration menu with a finger, the depth detector records the collected finger position P1, which corresponds to point 1(0, 0, 0) on the virtual screen; similarly, the positions P2, P3 and P4 at which the user clicks circles 2, 3 and 4 are recorded, corresponding respectively to 2(1920, 0, 0), 3(0, 1080, 0) and 4(1920, 1080, 0).
Step S024: calculate the correspondence between the first coordinate system and the second coordinate system.
After the four groups of space coordinates are acquired, and with the corresponding four groups of computer-graphics coordinates known, the calibration work can begin.
The specific calibration algorithms are described as follows.
Algorithm 1: according to the first-coordinate-system coordinates of the four calibration points of the virtual calibration menu, and the acquired second-coordinate-system coordinates of the four calibration points virtually touched by the user, calculate the translation amount, scaling ratio and rotation angle that convert the coordinates of any point in the second coordinate system into first-coordinate-system coordinates.
The computer-graphics coordinate system corresponding to the calibration menu is a pixel coordinate system. Once the display screen size is known, the physical size of one pixel is known, so pixel coordinates can be converted into space coordinates. Taking a 42-inch display screen with resolution 1920*1080 as an example, a pixel of this size is roughly 0.5 mm across; the four circle-centre coordinates can thus be deduced to be 1(0, 0, 0), 2(96, 0, 0), 3(0, 54, 0) and 4(96, 54, 0), in centimetres.
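The pixel-to-centimetre conversion used in this example can be checked directly; the 0.5 mm pixel pitch is the figure given above for the 42-inch 1920*1080 screen:

```python
PIXEL_PITCH_CM = 0.05  # 0.5 mm per pixel, as stated for the 42-inch screen

def px_to_cm(p):
    """Convert a first-coordinate-system point from pixels to centimetres."""
    return tuple(c * PIXEL_PITCH_CM for c in p)

centres_px = [(0, 0, 0), (1920, 0, 0), (0, 1080, 0), (1920, 1080, 0)]
print([px_to_cm(p) for p in centres_px])
# -> (0, 0, 0), (96, 0, 0), (0, 54, 0), (96, 54, 0) in cm, as in the text
```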
Then only the correspondences of translation, scaling and rotation exist between the first coordinate system and the second coordinate system.
1) Translation
The translation parameters can be obtained as follows: average the four groups of space coordinates to obtain the coordinate of the midpoint of the touched calibration menu, which corresponds to the midpoint (48, 27, 0) of the calibration menu; the difference between the two midpoints gives the translation parameters for converting the second coordinate system to the first coordinate system.
Applying this translation maps the origin of the second coordinate system onto the origin of the first coordinate system, as shown in Fig. 7.
2) Scaling
The scaling parameters can be obtained as follows: because the depth-detector coordinate data are mapped into the coordinate system whose x-y plane is the menu plane with z = 0, only scaling factors in the x and y directions exist and no z-direction factor is needed; the z factor c here can be set to a constant. For example, the x factor may be taken as the ratio of the menu width to the touched width, and the y factor as the ratio of the menu height to the touched height.
Applying this scaling maps every point of the second coordinate system other than the origin onto the first coordinate system, as shown in Fig. 8.
3) Rotation
If the user has been replaced so that the interpupillary distance has changed, only the display of the virtual screen changes; this involves only translation and scaling of the coordinate system and no rotation. For the depth detector, only a change of horizontal-direction angle occurs, that is, it turns toward a different horizontal direction without elevation or depression and without left-right imbalance, so there is only rotation about the y-axis.
The rotation angle of the plane determined by points 1, 2, 3 and 4 can be derived from the X and Z coordinates of points 1 and 2, or of points 3 and 4.
Then, through the corresponding rotation-matrix transformation, the gesture position expressed in the rotated second coordinate system can be converted into the coordinate representation of the first coordinate system, as shown in Fig. 9.
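The three stages can be combined into one mapping. The sketch below assumes the rotation angle is taken from the x-z line through touched points 1 and 2, and the scale factors from the point spans, which is one plausible reading of the elided formulas rather than their verbatim reproduction:

```python
import math

MENU_CM = ((0, 0, 0), (96, 0, 0), (0, 54, 0), (96, 54, 0))  # circle centres, cm

def second_to_first(p, touched):
    """Map a depth-detector point p (second system) into the first system
    (cm) by rotation about the y-axis, per-axis scaling, and translation of
    the touched midpoint onto the menu midpoint (48, 27, 0)."""
    dx = touched[1][0] - touched[0][0]
    dz = touched[1][2] - touched[0][2]
    th = math.atan2(dz, dx)                   # y-axis rotation angle
    c, s = math.cos(th), math.sin(th)
    rot = lambda q: (c * q[0] + s * q[2], q[1], -s * q[0] + c * q[2])
    r = [rot(q) for q in touched]
    rp = rot(p)
    kx = 96.0 / (r[1][0] - r[0][0])           # x scale: menu width / touched width
    ky = 54.0 / (r[2][1] - r[0][1])           # y scale: menu height / touched height
    mx = sum(q[0] for q in r) / 4.0           # touched midpoint after rotation
    my = sum(q[1] for q in r) / 4.0
    return (48.0 + kx * (rp[0] - mx), 27.0 + ky * (rp[1] - my), 0.0)

touched = [(10, 5, 0), (58, 5, 0), (10, 32, 0), (58, 32, 0)]
print(second_to_first((10, 5, 0), touched))   # corner 1 -> (0.0, 0.0, 0.0)
```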
After the above three stages of processing, namely translation, scaling and rotation, whether the position of the depth detector has changed or the user has been replaced, the gesture position coordinates expressed in the changed second coordinate system can be expressed in the coordinates of the first coordinate system, achieving the effect of correction.
Algorithm 2: make the coordinates of any point of the virtual screen virtually touched by the user, expressed in the second coordinate system, correspond to its first-coordinate-system coordinates, wherein A is the horizontal resolution of the displayed virtual screen, B is the vertical resolution of the displayed virtual screen, and P1, P2 and P3 are the second-coordinate-system coordinates of the first circle centre, second circle centre and third circle centre of the virtual calibration menu as virtually touched.
For example, as shown in Fig. 10, on a 1920*1080 display screen, a point with a given pixel coordinate corresponds to a space coordinate inside the region spanned by the touched circle centres. That is, when a gesture click is detected at that space position, it can be determined that the click triggers the event drawn at the corresponding place on the screen.
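Algorithm 2 can be read as a linear interpolation against the three touched circle centres. The sketch below assumes P1 is treated as the origin, P2 spans the x edge and P3 the y edge; this interpolation is an assumed reading of the elided formula:

```python
def detector_to_pixel(p, P1, P2, P3, A=1920, B=1080):
    """Map a detector-space point p to pixel coordinates on the A x B
    virtual screen, using the touched first, second and third circle
    centres (lower-left, lower-right, upper-left of the menu rectangle)."""
    u = (p[0] - P1[0]) / (P2[0] - P1[0])  # fraction along the x edge
    v = (p[1] - P1[1]) / (P3[1] - P1[1])  # fraction along the y edge
    return (A * u, B * v)

print(detector_to_pixel((58, 32), (10, 5), (106, 5), (10, 59)))  # -> (960.0, 540.0)
```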
Based on this principle, the central point of the user's operation picture can be mapped to its space coordinates in the depth-detector coordinate system (the second coordinate system). Then, by comparing this space coordinate with the gesture coordinates collected by the depth detector in real time, it can be judged whether the gesture lies within a close range of the central point of the user's action menu, and hence whether a trigger has occurred.
Algorithm 3 can be regarded as an improvement of Algorithm 1. Because in Algorithm 1 the depth-detector coordinate data are mapped into the coordinate system whose x-y plane is the menu plane with z = 0, there are only scaling factors in the x and y directions and none in the z direction. Accordingly, by using two parallel planes, one in front of the other, and collecting the space coordinates of eight points, the mapping to the computer-graphics 3D model can be established. The specific method is similar to Algorithm 1, except that the z-axis scaling factor can also be derived; the details are omitted here.
Step S025: according to the correspondence, express the gesture position coordinates given in the second coordinate system in coordinates of the first coordinate system; and, according to the gesture position coordinates expressed in the first coordinate system, correct the correspondence between the user's gesture and the virtual calibration menu.
After the algorithmic processing of step S024, the correspondence between the two coordinate systems is obtained: a convertible relation has been found between the second coordinate system and the first coordinate system, so coordinates expressed in the second coordinate system can be converted into coordinates of the first coordinate system. With this correspondence, the user gesture positions expressed in the second coordinate system can conveniently be put into one-to-one correspondence with the image points of the first coordinate system, achieving the effect of calibration.
After the user calibration is completed, according to the set screen display size and disparity, the left-eye and right-eye images of the picture to be displayed can be drawn and synthesized into a stereoscopic picture of the required size and out-of-screen distance.
However, while watching and operating the 3D virtual screen, the user cannot keep the head (both eyes) perfectly still; this movement includes spatial displacement and tilting of the head, and both affect the user's viewing and operation. A spatial displacement of the user causes the virtual screen to move along with the user, so operating on the original calibration basis would then produce deviations and reduce operating accuracy; a tilt of the user's head prevents the left-eye and right-eye images from being fused properly, causing sensory discomfort.
Therefore, to ensure that the position of the displayed picture remains fixed after calibration and does not move with the user, corresponding position compensation of the displayed picture must be performed for changes of the user's head (eyes); that is, a picture that has moved because the user moved is brought back to the position it occupied after calibration. In this way, the accuracy of user operation remains high and the sensory effect is good.
In step S03, the virtual screen to be operated is displayed, and changes of the virtual screen position caused by changes of the position of the user's eyes are compensated.
In the embodiment of the present application, the virtual screen is imaged in front of the display screen, near the user's position. Specifically, the out-of-screen distance z0 has the following relations with the parallax:

z0 = L·p/(d + p)  (1)

w = W·(L − z0)/L  (2)

As shown in Figure 11, L is the distance between the user and the screen, z0 is the out-of-screen distance of the virtual screen, d is the spacing between the user's left and right eyes, and p is the left-right parallax with which the display screen shows the virtual screen, where p is known. Let w be the size of the virtual screen and W the size of the parallax left-eye/right-eye images on the display screen, where W is known; w can be obtained from the four groups of coordinates recorded when the user touches the calibration menu.
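Numerically, relations (1) and (2) follow from similar triangles between the eyes, the display plane and the virtual image. The sketch below is our reading of that geometry; the function names and sample values are illustrative only:

```python
def out_of_screen_distance(L, p, d):
    """Relation (1): z0 = L*p/(d+p), the distance at which the virtual
    screen is imaged in front of the display, for viewing distance L,
    on-screen parallax p and eye spacing d (same length unit)."""
    return L * p / (d + p)

def virtual_screen_size(W, L, z0):
    """Relation (2): w = W*(L-z0)/L, the apparent size of the virtual
    screen given the on-screen image size W and out-of-screen
    distance z0."""
    return W * (L - z0) / L

# Example: viewer 2 m away, 65 mm parallax equal to the eye spacing.
z0 = out_of_screen_distance(L=2000.0, p=65.0, d=65.0)  # 1000.0 mm
w = virtual_screen_size(W=800.0, L=2000.0, z0=z0)      # 400.0 mm
```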
1. Head displacement tracking compensation
As described above, the out-of-screen distance z0 of the virtual screen depends only on the interpupillary distance of the user's eyes and the parallax of the left-eye/right-eye images, whereas the position of the virtual screen depends on the spatial position of the human eyes. A television user moves slightly back and forth and side to side while watching; that is, the position of the eyes changes slightly from the position at calibration time, so the 3D display must be compensated according to the real-time eye position.
When the position of the user's head changes, the imaging position of the virtual screen changes correspondingly. To keep the position of the virtual screen from moving with the user's head, the picture on the display screen is adjusted in the opposite direction, back to its position before the change.
Since stereoscopic tracking of the user's eyes is difficult to implement, this embodiment substitutes head-position tracking for eye-position tracking; because the position of the eyes is fixed in proportion to the head, the error introduced by this substitution is small.
According to the 3D parallax display principle, when the left and right eyes receive different images, the human visual system automatically extends the lines of sight until a virtual image is formed. The left-eye and right-eye images shown on the display screen are therefore the projections of the virtual screen under each eye.
For the center of the virtual screen, the displacement compensation can be obtained from simple geometric proportional relations. The head displacement can be decomposed into a displacement parallel to the display-screen plane and a displacement perpendicular to it.
1) Displacement compensation parallel to the display-screen plane:
As shown in Figure 12, when the user's head (eyes) is displaced by ΔXe, ΔYe, the corresponding adjustment displacement of the displayed virtual screen is ΔXd, ΔYd, where

ΔXd = −(z0/(L − z0))·ΔXe and ΔYd = −(z0/(L − z0))·ΔYe,

the negative sign indicating that the picture is adjusted in the direction opposite to the eye displacement. Here ΔXe and ΔYe are obtained from the coordinate values of the two groups of coordinates detected by the depth detector before and after the user moves. L − z0, the distance between the user and the virtual screen, is obtained as the difference between the z-axis data of the user's spatial coordinates detected by the depth detector and the mean z-axis value of the four groups of coordinates at which the user touched the virtual calibration menu. From formula (2), (L − z0)/L = w/W, so the out-of-screen distance of the virtual screen is z0 = L·(1 − w/W).
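The parallel-displacement rule (shift the on-screen picture against the head motion, scaled by z0/(L − z0)) can be sketched as follows; this is our reading of the Figure 12 geometry, with illustrative names:

```python
def parallel_compensation(delta_eye, L, z0):
    """Display-plane shift that cancels an eye displacement delta_eye
    parallel to the screen, for a virtual image z0 in front of a
    screen viewed from distance L; the sign opposes the head motion."""
    return -z0 / (L - z0) * delta_eye
```

For example, with z0 = 1000 mm and L = 2000 mm, a 50 mm sideways head shift calls for a 50 mm picture shift in the opposite direction.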
2) Displacement compensation perpendicular to the display-screen plane:
As shown in Figure 13, when the user's head (eyes) is at position (Xe, Ye, Ze) and is displaced by ΔZe perpendicular to the screen, the corresponding adjustment displacement of the displayed virtual screen follows from the same projection geometry; ΔZe can be obtained from the coordinate values of the two groups of coordinates detected by the depth detector before and after the user moves.
2. Head tilt tracking compensation
With ordinary 3D parallax stereoscopic displays, when the user tilts the head, the left-right parallax can no longer be fused into the virtual image, making the user visually uncomfortable and even causing dizziness and nausea. This is because the usual 3D left-eye/right-eye images contain only horizontal parallax and no vertical parallax; when the user tilts the head, the left and right eyes acquire a vertical displacement, so the extended lines of sight cannot intersect at one point, causing visual confusion.
In this case, the tilt angle of the user's eyes can be detected by image recognition, and the vertical parallax of the left-eye/right-eye images can then be compensated, so that the user can continue to watch and operate the 3D virtual screen normally while facing the screen.
As shown in Figure 14, the compensation data can be obtained with trigonometric functions. If the tilt angle of the user's head (eyes) is θ (obtainable through the image-recognition function of the 2D camera), then, combined with formula (1), the vertical compensation of the left and right eyes is ±(p/2)·sin θ, applied in opposite directions. From (1) and (2), the spacing between the user's left and right eyes can be obtained as d = p·w/(W − w).
Regarding the signs: for a virtual image popping out of the screen (z0 > 0), if θ is defined as positive clockwise, the left-eye compensation is negative and the right-eye compensation is positive; for a virtual image receding behind the screen (z0 < 0), if θ is defined as positive clockwise, the left-eye compensation is positive and the right-eye compensation is negative.
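The tilt rule can be sketched the same way: each eye's image is shifted vertically by half the vertical component of the rotated parallax, in opposite directions. This is our trigonometric reading of the Figure 14 geometry (names illustrative; the sign convention follows the out-of-screen, clockwise-positive case described in the text):

```python
import math

def tilt_compensation(p, theta_rad):
    """Vertical shifts for the (left, right) eye images when the head
    tilts by theta_rad; the magnitudes are (p/2)*sin(theta), applied
    with opposite signs. Shown for an out-of-screen image with
    clockwise theta taken as positive."""
    dy = 0.5 * p * math.sin(theta_rad)
    return -dy, dy  # left-eye compensation negative, right-eye positive
```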
It should be noted that the above human-eye (head) tilt compensation principle is applicable not only to this embodiment, but can also be widely applied to stereoscopic 3D displays based on any parallax principle.
In step S04, after the user operates the virtual screen, the operation result is judged and a sound prompt is fed back.
Compared with real operation, operating a virtual screen easily leaves the user uncertain whether the desired picture option has actually been clicked. To improve the user experience, this embodiment proposes a sound feedback method: during operation, if the user correctly clicks a virtual screen option, a sound is fed back as a prompt, improving the experience.
According to the sound localization principle, the intensity (energy) of a point source is inversely proportional to the square of the propagation distance, and the propagation time difference can be calculated from the speed of sound.
As shown in Figure 15, suppose the virtual object A emits a sound, and the sound signals received by the left and right ears are S1 and S2 respectively. Let the distances from A to the left and right ears be L1 and L2, and let the speed of sound be c. Then the intensity ratio of S1 to S2 is L2²/L1², and the time difference is (L1 − L2)/c.
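These two localization cues can be computed directly; the sketch below is illustrative (names ours, inverse-square attenuation as stated above):

```python
def binaural_cues(L1, L2, c=343.0):
    """Return (intensity ratio S1/S2, arrival-time difference) for a
    point source at distances L1 and L2 (metres) from the left and
    right ears. Intensity falls off with the square of distance; the
    time difference is the path difference over the speed of sound c."""
    ratio = (L2 / L1) ** 2
    dt = (L1 - L2) / c  # positive: the sound reaches the right ear first
    return ratio, dt
```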
For earphones, S1 and S2 are simply the direct output signals.
For loudspeakers, however, S1 and S2 must additionally be transformed.
For the loudspeaker output signals S'1 and S'2: assuming the user views the center of the television head-on, and given the distances between the user's head and the left and right loudspeakers, S'1 and S'2 can be derived from S1 and S2 together with the relations above.
Of course, the sound prompt is not limited to the correct-operation prompt provided in this embodiment; the system can also set different sounds to indicate different operations, such as an erroneous-operation prompt, an unavailable prompt, and so on.
In practice, the calibration steps in this embodiment can be simplified by combining them with adaptive methods, for example:
1. A motion sensor, such as an accelerometer or gyroscope, is built into the depth detector. If no motion of the depth detector is monitored, its position is considered unmoved, so no calibration is needed; the previous calibration data (the coordinate-system translation and rotation parts) are simply recalled and reused.
2. Through the recognition function of the 2D camera, face detection is performed on the user. If the current viewer is detected to be a previously calibrated user, no recalibration is needed; the previous calibration data (the coordinate-system scaling part) are recalled and reused.
3. If the depth detector has moved but the user is not a new user, only the coordinate-system translation and rotation parts need to be recalibrated, combined with the old user's coordinate-system scaling calibration data.
4. If the depth detector has not moved but the user is a new user, only the coordinate-system scaling part needs to be recalibrated, combined with the translation and rotation calibration data for the current depth-detector position.
5. If the depth detector has moved and the user is a new user, a full recalibration is required.
The specific implementation process is not repeated in this embodiment.
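The five reuse rules above reduce to a small decision table on two flags; a minimal sketch (names ours):

```python
def recalibration_plan(detector_moved, new_user):
    """Return the calibration parts that must be redone, following
    rules 1-5 above: translation/rotation depend on the detector
    position, scaling depends on the user."""
    parts = []
    if detector_moved:
        parts += ["translation", "rotation"]  # rules 1 and 3
    if new_user:
        parts += ["scaling"]                  # rules 2 and 4
    return parts  # both flags set gives the full recalibration of rule 5
```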
Embodiment two
The embodiment of the present application also provides a system for realizing virtual touch calibration, as shown in Figure 4, including: a creating unit, a construction unit, a computing unit, a calibration unit, a compensating unit and a prompt unit.
The creating unit 01 is used to create the virtual calibration menu.
The construction unit 02 is used to establish the first coordinate system in the plane, formed by the x-axis and the y-axis, of the display plane in which the virtual calibration menu lies, and to establish the second coordinate system, in whose coordinates the user gesture position is expressed.
The computing unit 03 is used to calculate the corresponding relation between the first coordinate system and the second coordinate system.
The calibration unit 04 is used to express, according to the corresponding relation, the user gesture position coordinates represented in the second coordinate system in the coordinates of the first coordinate system, and to correct the correspondence between the user gesture and the virtual calibration menu according to the user gesture position coordinates expressed in the first coordinate system.
The compensating unit 05 includes:
1) a first compensating module 051, for displacement compensation parallel to the display-screen plane;
2) a second compensating module 052, for displacement compensation perpendicular to the display-screen plane;
3) a third compensating module 053, for head tilt tracking compensation.
The prompt unit 06 is used to perform sound feedback prompting.
Through one or more embodiments of the present invention, the following technical effects or advantages can be achieved.
Because the technical means of user calibration is adopted, the user's gesture operation is spatially projection-calibrated against the virtual screen. This effectively solves the prior-art problem that, with a fixed projection relation between the user gesture space and the virtual screen space, a change in the position of the depth detector, or a change of the virtual screen position caused by a different interpupillary distance after a change of user, makes the gesture operation inconsistent with the picture. Meanwhile, compensation measures for shifts or tilts of the user's eyes are adopted: after calibration is complete, if the user's eyes shift or tilt slightly, the resulting change of the display position of the virtual screen is compensated to follow the change of eye position, so that the virtual screen keeps its calibrated position even when the eye position changes. The projection relation established by calibration therefore does not change, which effectively improves the accuracy of user operation and the user's sensory experience.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, can make other changes and modifications to these embodiments. The appended claims are therefore intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the invention.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If these modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalent technologies, the present invention is also intended to include them.
Claims (12)
1. A method for realizing virtual touch calibration, characterized in that the method comprises:
creating a virtual calibration menu, the virtual calibration menu being three circular icons comprising a first circle, a second circle and a third circle, the center coordinates of the first, second and third circles being respectively the coordinates of the lower-left corner, the lower-right corner and the upper-left corner of the rectangular virtual screen to be displayed;
establishing a first coordinate system in the plane, formed by the x-axis and the y-axis, of the display plane in which the virtual calibration menu lies;
establishing a second coordinate system, and expressing the user gesture position in the coordinates of the second coordinate system;
calculating the corresponding relation between the first coordinate system and the second coordinate system;
according to the corresponding relation, expressing the user gesture position coordinates represented in the second coordinate system in the coordinates of the first coordinate system;
according to the user gesture position coordinates expressed in the coordinates of the first coordinate system, correcting the correspondence between the user gesture and the virtual calibration menu.
2. The method of claim 1, characterized in that calculating the corresponding relation between the first coordinate system and the second coordinate system is specifically:
according to the first-coordinate-system coordinates of four calibration points of the virtual calibration menu, and the acquired second-coordinate-system coordinates of three calibration points of the virtual calibration menu virtually touched by the user, calculating the translation amount, scaling and rotation angle by which the coordinates of any point in the second coordinate system are converted into first-coordinate-system coordinates.
3. The method of claim 1, characterized in that calculating the corresponding relation between the first coordinate system and the second coordinate system is specifically:
putting the coordinates (X, Y, Z), in the second coordinate system, of any point of the virtual screen virtually touched by the user into correspondence with the coordinates (x, y, z) of that point in the first coordinate system, wherein
A is the horizontal resolution of the displayed virtual screen, B is the vertical resolution of the displayed virtual screen, and (X1,Y1,Z1), (X2,Y2,Z2) and (X3,Y3,Z3) are the coordinates at which the first, second and third circle centers of the virtual calibration menu are virtually touched.
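The conversion in claim 3 maps a touched point onto display pixels using the three touched circle centers (lower-left, lower-right, upper-left) as anchors. The exact formula is not reproduced in the text, so the following linear interpolation is an assumed reconstruction (the names and the linear form are our hypothesis; the z coordinate, handled analogously via a depth scale, is omitted):

```python
def to_display_coords(X, Y, A, B, c1, c2, c3):
    """Map a touch point (X, Y) in the second coordinate system to
    display pixels (x, y), given horizontal/vertical resolutions A, B
    and the touched centers c1 (lower-left), c2 (lower-right) and
    c3 (upper-left) of the calibration menu."""
    X1, Y1 = c1
    X2, _ = c2
    _, Y3 = c3
    x = A * (X - X1) / (X2 - X1)  # horizontal interpolation
    y = B * (Y - Y1) / (Y3 - Y1)  # vertical interpolation
    return x, y
```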
4. The method of claim 1, characterized in that after the correspondence between the user gesture and the virtual calibration menu is corrected, the virtual screen to be operated is displayed, and changes of the virtual screen position caused by changes of the position of the user's eyes are compensated, the compensation specifically including:
according to the formulas ΔXd = −(z0/(L − z0))·ΔXe and ΔYd = −(z0/(L − z0))·ΔYe, compensating the shift of the virtual screen position caused by an offset of the user's eyes parallel to the display-screen plane, the negative sign indicating that the direction of the eye offset is opposite to the direction of the virtual-screen position compensation;
compensating the shift of the virtual screen position caused by an offset ΔZe of the user's eyes perpendicular to the display-screen plane;
according to ΔY = ±(p/2)·sin θ, obtaining the vertical parallax compensation of the left-eye and right-eye images when the user's eyes tilt, the left-eye compensation being opposite in direction to the right-eye compensation;
wherein z0 is the out-of-screen distance of the virtual object, ΔXe and ΔYe are the offset displacements of the user's eyes parallel to the display-screen plane, L is the distance between the user and the screen, ΔXd and ΔYd are the offset compensations of the displayed virtual screen relative to the offset of the user's eyes, (Xe,Ye,Ze) is the position coordinate of the human eyes, ΔZe is the offset displacement of the user's eyes perpendicular to the display-screen plane, p is the parallax of the left-eye/right-eye images of the virtual screen to be operated, and θ is the tilt angle of the user's eyes.
5. The method of any one of claims 1-4, characterized in that a sound prompt is fed back when the user operates the virtual screen.
6. The method of any one of claims 1-4, characterized in that, for the sound feedback prompt, if the signal strengths intended for the left and right ears are respectively S1 and S2, then the left and right loudspeaker output strengths S'1 and S'2 are respectively obtained from S1 and S2, wherein L1 and L2 are respectively the distances from the user gesture position to the user's left and right ears.
7. A system for realizing virtual touch calibration, characterized in that the system includes:
a creating unit, for creating a virtual calibration menu, wherein the virtual calibration menu is three circular icons comprising a first circle, a second circle and a third circle, the center coordinates of the first, second and third circles being respectively the coordinates of the lower-left corner, the lower-right corner and the upper-left corner of the rectangular virtual screen to be displayed;
a construction unit, for establishing a first coordinate system in the plane, formed by the x-axis and the y-axis, of the display plane in which the virtual calibration menu lies, and for establishing a second coordinate system in whose coordinates the user gesture position is expressed;
a computing unit, for calculating the corresponding relation between the first coordinate system and the second coordinate system;
a calibration unit, for expressing, according to the corresponding relation, the user gesture position coordinates represented in the second coordinate system in the coordinates of the first coordinate system, and correcting the correspondence between the user gesture and the virtual calibration menu.
8. The system of claim 7, characterized in that the computing unit is specifically configured to:
according to the first-coordinate-system coordinates of four calibration points of the virtual calibration menu, and the acquired second-coordinate-system coordinates of four calibration points of the virtual calibration menu virtually touched by the user, calculate the translation amount, scaling and rotation angle by which the coordinates of any point in the second coordinate system are converted into first-coordinate-system coordinates.
9. The system of claim 7, characterized in that the computing unit is specifically configured to:
put the coordinates (X, Y, Z), in the second coordinate system, of any point of the virtual screen virtually touched by the user into correspondence with the coordinates (x, y, z) of that point in the first coordinate system, wherein
A is the horizontal resolution of the displayed virtual screen, B is the vertical resolution of the displayed virtual screen, and (X1,Y1,Z1), (X2,Y2,Z2) and (X3,Y3,Z3) are the coordinates at which the first, second and third circle centers of the virtual calibration menu are virtually touched.
10. The system of any one of claims 7-9, characterized in that the system further includes a compensating unit, for compensating, after the virtual screen to be operated is displayed, changes of the virtual screen position caused by changes of the position of the user's eyes, specifically including:
a first compensating module, for compensating, according to the formulas ΔXd = −(z0/(L − z0))·ΔXe and ΔYd = −(z0/(L − z0))·ΔYe, the shift of the virtual screen position caused by an offset of the user's eyes parallel to the display-screen plane, the negative sign indicating that the direction of the eye offset is opposite to the direction of the virtual-screen position compensation;
a second compensating module, for compensating the shift of the virtual screen position caused by an offset ΔZe of the user's eyes perpendicular to the display-screen plane;
a third compensating module, for obtaining, according to ΔY = ±(p/2)·sin θ, the vertical parallax compensation of the left-eye and right-eye images when the user's eyes tilt, the left-eye compensation being opposite in direction to the right-eye compensation;
wherein z0 is the out-of-screen distance of the virtual object, ΔXe and ΔYe are the offset displacements of the user's eyes parallel to the display-screen plane, L is the distance between the user and the screen, ΔXd and ΔYd are the offset compensations of the displayed virtual screen relative to the offset of the user's eyes, (Xe,Ye,Ze) is the position coordinate of the human eyes, ΔZe is the offset displacement of the user's eyes perpendicular to the display-screen plane, p is the parallax of the left-eye/right-eye images of the virtual screen to be operated, and θ is the tilt angle of the user's eyes.
11. The system of any one of claims 7-9, characterized in that the system further includes a prompt unit, for feeding back a sound prompt when the user operates the virtual screen.
12. The system of any one of claims 7-9, characterized in that, when the prompt unit feeds back a sound prompt, if the signal strengths intended for the left and right ears are respectively S1 and S2, then the left and right loudspeaker output strengths S'1 and S'2 are respectively obtained from S1 and S2, wherein L1 and L2 are respectively the distances from the user gesture position to the user's left and right ears.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310180909.5A CN103941851B (en) | 2013-01-23 | 2013-05-16 | A kind of method and system for realizing virtual touch calibration |
CN201710139509.8A CN106951074B (en) | 2013-01-23 | 2013-05-16 | method and system for realizing virtual touch calibration |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310024722 | 2013-01-23 | ||
CN201310024722.6 | 2013-01-23 | ||
CN2013100247226 | 2013-01-23 | ||
CN201310180909.5A CN103941851B (en) | 2013-01-23 | 2013-05-16 | A kind of method and system for realizing virtual touch calibration |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710139509.8A Division CN106951074B (en) | 2013-01-23 | 2013-05-16 | method and system for realizing virtual touch calibration |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103941851A CN103941851A (en) | 2014-07-23 |
CN103941851B true CN103941851B (en) | 2017-03-15 |
Family
ID=51189548
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310180909.5A Active CN103941851B (en) | 2013-01-23 | 2013-05-16 | A kind of method and system for realizing virtual touch calibration |
CN201710139509.8A Active CN106951074B (en) | 2013-01-23 | 2013-05-16 | method and system for realizing virtual touch calibration |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710139509.8A Active CN106951074B (en) | 2013-01-23 | 2013-05-16 | method and system for realizing virtual touch calibration |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN103941851B (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105487647A (en) * | 2014-09-16 | 2016-04-13 | 洪永川 | Intelligent terminal system with signal converting function |
CN105404387A (en) * | 2014-09-16 | 2016-03-16 | 洪永川 | Gesture remote control system for intelligent terminal |
CN105487643B (en) * | 2014-09-16 | 2019-05-07 | 深圳市冠凯科技有限公司 | Interactive intelligent terminal system |
CN105487642B (en) * | 2014-09-16 | 2019-05-07 | 深圳市冠凯科技有限公司 | A kind of interactive intelligent terminal system |
CN105407262A (en) * | 2014-09-16 | 2016-03-16 | 洪永川 | Camera |
CN105407371A (en) * | 2014-09-16 | 2016-03-16 | 洪永川 | Television system |
CN104391603B (en) * | 2014-12-08 | 2017-06-09 | 京东方科技集团股份有限公司 | A kind of calibration method of touch screen, calibrating installation and calibration system |
CN104765156B (en) * | 2015-04-22 | 2017-11-21 | 京东方科技集团股份有限公司 | A kind of three-dimensional display apparatus and 3 D displaying method |
CN108885496B (en) * | 2016-03-29 | 2021-12-10 | 索尼公司 | Information processing apparatus, information processing method, and program |
CN109643208B (en) * | 2016-06-28 | 2022-08-19 | 株式会社尼康 | Display device, storage medium, display method, and control device |
US10496353B2 (en) | 2016-09-29 | 2019-12-03 | Jiang Chang | Three-dimensional image formation and color correction system and method |
CN107036628B (en) * | 2017-04-10 | 2020-09-01 | 中国船舶重工集团公司第七0七研究所 | Method for correcting paper chart |
CN109905754B (en) * | 2017-12-11 | 2021-05-07 | 腾讯科技(深圳)有限公司 | Virtual gift receiving method and device and storage equipment |
CN108459770B (en) * | 2018-03-09 | 2021-05-25 | 北京硬壳科技有限公司 | Coordinate correction method and device |
CN112748798B (en) * | 2019-10-31 | 2023-05-26 | Oppo广东移动通信有限公司 | Eyeball tracking calibration method and related equipment |
CN111476104B (en) * | 2020-03-17 | 2022-07-01 | 重庆邮电大学 | AR-HUD image distortion correction method, device and system under dynamic eye position |
CN112378398B (en) * | 2020-11-12 | 2023-03-17 | 展讯通信(上海)有限公司 | Method, device and equipment for determining attitude of terminal equipment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5583526A (en) * | 1995-07-28 | 1996-12-10 | Chrysler Corporation | Hand calibration system for virtual reality vehicle simulator |
CN1694056A (en) * | 2004-05-06 | 2005-11-09 | 阿尔派株式会社 | Operation input device and method of operation input |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7190331B2 (en) * | 2002-06-06 | 2007-03-13 | Siemens Corporate Research, Inc. | System and method for measuring the registration accuracy of an augmented reality system |
CN100416336C (en) * | 2003-06-12 | 2008-09-03 | 美国西门子医疗解决公司 | Calibrating real and virtual views |
CN102880352A (en) * | 2011-07-11 | 2013-01-16 | 北京新岸线移动多媒体技术有限公司 | Non-contact interface operation method and non-contact interface operation system |
CN102306053B (en) * | 2011-08-29 | 2014-09-10 | Tcl集团股份有限公司 | Virtual touch screen-based man-machine interaction method and device and electronic equipment |
CN102508562B (en) * | 2011-11-03 | 2013-04-10 | 深圳超多维光电子有限公司 | Three-dimensional interaction system |
Also Published As
Publication number | Publication date |
---|---|
CN103941851A (en) | 2014-07-23 |
CN106951074A (en) | 2017-07-14 |
CN106951074B (en) | 2019-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103941851B (en) | A kind of method and system for realizing virtual touch calibration | |
JP6514089B2 (en) | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD | |
US20220366598A1 (en) | Calibration system and method to align a 3d virtual scene and a 3d real world for a stereoscopic head-mounted display | |
EP2979127B1 (en) | Display method and system | |
CN106454311B (en) | A kind of LED 3-D imaging system and method | |
US10623721B2 (en) | Methods and systems for multiple access to a single hardware data stream | |
US7907167B2 (en) | Three dimensional horizontal perspective workstation | |
US10133073B2 (en) | Image generation apparatus and image generation method | |
US20190206112A1 (en) | 3d digital painting | |
CN106774880B (en) | Three-dimensional tracking of user control devices in space | |
US10650573B2 (en) | Synthesizing an image from a virtual perspective using pixels from a physical imager array weighted based on depth error sensitivity | |
US20060126927A1 (en) | Horizontal perspective representation | |
CN206961066U (en) | A kind of virtual reality interactive device | |
WO2007013833A1 (en) | Method and system for visualising virtual three-dimensional objects | |
US9734622B2 (en) | 3D digital painting | |
CN110447224B (en) | Method for controlling virtual images in a display | |
KR20120015564A (en) | Display system and method using hybrid user tracking sensor | |
CN108881893A (en) | Naked eye 3D display method, apparatus, equipment and medium based on tracing of human eye | |
EP3413165A1 (en) | Wearable system gesture control method and wearable system | |
TW201349844A (en) | Method for creating a naked-eye 3D effect | |
CN108153417B (en) | Picture compensation method and head-mounted display device adopting same | |
JP6446465B2 (en) | I / O device, I / O program, and I / O method | |
Jones et al. | Correction of geometric distortions and the impact of eye position in virtual reality displays | |
RU2406150C2 (en) | Method and system for viewing virtual three-dimensional objects | |
TW202205851A (en) | Light transmitting display system, image output method thereof and processing device thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CP01 | Change in the name or title of a patent holder |
Address after: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218 Patentee after: Hisense Video Technology Co.,Ltd. Address before: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218 Patentee before: HISENSE ELECTRIC Co.,Ltd. |
|
CP01 | Change in the name or title of a patent holder |