CN110298924A - Coordinate transformation method for displaying detection information in an AR system - Google Patents
Coordinate transformation method for displaying detection information in an AR system
- Publication number
- CN110298924A (application CN201910394854.5A)
- Authority
- CN
- China
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Abstract
The invention discloses a coordinate transformation method for displaying detection information in an AR system, comprising: determining the position of the observer's eyes and the position of a target of interest; transforming the real-world position of the target of interest into its projected position on the AR display device; and converting the position coordinates of the target of interest on the AR display device into its coordinates on the AR projection device. The method solves the technical problem that existing AR systems lack accurate calculation of position coordinates, so that virtual information is misaligned when superimposed on real information, which reduces the realism of the virtual information. Accurate position-coordinate transformation is achieved; combined with real-time localization of the viewer's eyes, the image display position of the AR system can be adjusted automatically, guaranteeing that, within the viewer's field of view, the virtual projection on the AR display device matches the position of the target in the real scene exactly, achieving the technical effect of seamless fusion of the virtual and the real.
Description
Technical field
The present invention relates to the technical field of augmented reality (AR), and in particular to a coordinate transformation method for displaying detection information in an AR system.
Background technique
Augmented reality is a technology that fuses virtual information (objects, pictures, video, sound, etc.) into the real environment. Built on virtual reality technology, it enriches the real world by supplementing it with information, and it is commonly used to interact with people. The technology has great application potential in many fields, such as driver assistance, online shopping, and games.
Because augmented reality is used to interact with people, it places high demands on user experience. However, current AR systems still struggle to fuse virtual information seamlessly with real information. One important reason is that the computer does not compute accurate position coordinates when visualizing the virtual information; as a result, the virtual information is misaligned when displayed together with real information, which reduces its realism and degrades the user experience of the whole AR system.
Summary of the invention
The present invention provides a coordinate transformation method for displaying detection information in an AR system, to solve the technical problem that AR systems in the prior art struggle to fuse virtual information seamlessly with real information: they lack accurate calculation of position coordinates, so the virtual information is misaligned when displayed together with real information, reducing its realism.
The present invention provides a coordinate transformation method for displaying detection information in an AR system, the method comprising: determining the position of the observer's eyes and the position of a target of interest; transforming the real-world position of the target of interest into its projected position on the AR display device; and converting the position coordinates of the target of interest on the AR display device into its coordinates on the AR projection device.
Preferably, determining the position of the observer's eyes and the position of the target of interest comprises: establishing a three-dimensional coordinate system in the real world, with the ground as the XOY plane and the upward direction perpendicular to the ground as the positive Z axis; obtaining the eye observation position Po=(xo,yo,zo); detecting the position of the target of interest with an image acquisition device, the target of interest being the target that needs to be displayed virtually; and obtaining the real position of the target of interest in the three-dimensional coordinate system, Pw=(xw,yw,zw).
Preferably, transforming the real-world position of the target of interest into its projected position on the AR display device comprises: obtaining the equation of the surface on which the AR display device lies; obtaining the equation of the space line from the observation position Po and the real position Pw of the target of interest; and, from the surface equation and the line equation, obtaining the position coordinates of the target of interest on the AR display device, Ps=(xs,ys,zs).
Preferably, the equation of the surface on which the AR display device lies is obtained by: determining, in the three-dimensional coordinate system, the normal vector of the projection surface, n=(xn,yn,zn), and a fixed point Pi=(xi,yi,zi) through which the surface passes; and obtaining the surface equation from the normal vector and the fixed point.
Preferably, obtaining the equation of the space line from the observation position Po and the real position Pw of the target of interest comprises: determining the space line and its direction vector d = Pw - Po = (xw-xo, yw-yo, zw-zo), and obtaining from the direction vector the equation of the space line:

(x-xo)/(xw-xo) = (y-yo)/(yw-yo) = (z-zo)/(zw-zo)
Preferably, converting the position coordinates of the target of interest on the AR display device into its coordinates on the AR projection device comprises: determining the equation of the surface on which the AR projection device lies; and, from the eye observation position, the position coordinates of the target of interest on the AR display device, the equation of the surface on which the AR display device lies, and the equation of the surface on which the AR projection device lies, obtaining the position coordinates of the target of interest on the AR projection device.
The one or more technical solutions in the embodiments of the present invention have at least the following one or more technical effects:
The coordinate transformation method for displaying detection information in an AR system provided by the embodiments of the present invention comprises: first determining the coordinates of the observer's eye position and obtaining the position of the target of interest in the real-world coordinate system; determining the equation of the plane on which the virtual-information display device lies; calculating the coordinates at which the real position of the target of interest projects onto the AR display device; and then, from those coordinates, calculating the position of the target on the projection apparatus. Accurate position-coordinate transformation is thus achieved; combined with real-time localization of the viewer's eyes, the image display position of the AR system can be adjusted automatically, guaranteeing that, within the viewer's field of view, the virtual projection on the AR display device matches the target in the real scene exactly, achieving seamless fusion of the virtual and the real. This solves the technical problem that AR systems in the prior art struggle to fuse virtual information seamlessly with real information, lack accurate calculation of position coordinates, and therefore show misalignment when virtual and real information are displayed together, reducing the realism of the virtual information.
The above is only an overview of the technical solution of the present invention. To make the technical means of the present invention easier to understand and implementable according to the contents of the specification, and to make the above and other objects, features and advantages of the present invention clearer, specific embodiments of the present invention are given below.
Detailed description of the invention
Fig. 1 is a flow diagram of the coordinate transformation method for displaying detection information in an AR system according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of obtaining the real position of the target of interest with the camera in an embodiment of the present invention;
Fig. 3 is a schematic diagram of the position-coordinate transformation from the real target position to the virtual-information display in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the coordinate transformation from the virtual-information display position to the target position on the projection apparatus in an embodiment of the present invention.
Description of reference numerals: camera 1.
Specific embodiment
Embodiments of the present invention provide a coordinate transformation method for displaying detection information in an AR system, to solve the technical problem that AR systems in the prior art struggle to fuse virtual information seamlessly with real information, lack accurate calculation of position coordinates, and therefore show misalignment when virtual and real information are displayed together, reducing the realism of the virtual information.
The general idea of the technical solution in the embodiments of the present invention is as follows:
Determine the position of the observer's eyes and the position of the target of interest; transform the real-world position of the target of interest into its projected position on the AR display device; and convert the position coordinates of the target of interest on the AR display device into its coordinates on the AR projection device. Accurate position-coordinate transformation is thus achieved; combined with real-time localization of the viewer's eyes, the image display position of the AR system can be adjusted automatically, guaranteeing that, within the viewer's field of view, the virtual projection on the AR display device matches the target in the real scene exactly, achieving the technical effect of seamless fusion of the virtual and the real.
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative work fall within the protection scope of the present invention.
Embodiment one
Fig. 1 is a flow diagram of the coordinate transformation method for displaying detection information in an AR system according to an embodiment of the present invention. As shown in Fig. 1, the method comprises:
Step 10: determine the position of the observer's eyes and the position of the target of interest.
Step 20: transform the real-world position of the target of interest into its projected position on the AR display device.
Step 30: convert the position coordinates of the target of interest on the AR display device into its coordinates on the AR projection device.
Specifically, the coordinates of the observer's eye position are first determined and the position of the target of interest in the real-world coordinate system is obtained; the equation of the plane on which the virtual-information display device lies is determined; the coordinates at which the real position of the target of interest projects onto the AR display device are calculated; and from those coordinates the position of the target on the projection apparatus is calculated. Accurate position-coordinate transformation is thus achieved; combined with real-time localization of the viewer's eyes, the image display position of the AR system can be adjusted automatically, guaranteeing that, within the viewer's field of view, the virtual projection on the AR display device matches the target in the real scene exactly, achieving seamless fusion of the virtual and the real. This solves the technical problem that AR systems in the prior art struggle to fuse virtual information seamlessly with real information, lack accurate calculation of position coordinates, and therefore show misalignment when virtual and real information are displayed together, reducing the realism of the virtual information.
Further, determining the position of the observer's eyes and the position of the target of interest comprises: establishing a three-dimensional coordinate system in the real world, with the ground as the XOY plane and the upward direction perpendicular to the ground as the positive Z axis; obtaining the eye observation position Po=(xo,yo,zo); detecting the position of the target of interest with an image acquisition device, the target of interest being the target that needs to be displayed virtually; and obtaining the real position of the target of interest in the three-dimensional coordinate system, Pw=(xw,yw,zw).
Specifically, a real-world three-dimensional coordinate system XYZ is established as shown in Fig. 2, with the ground as the XOY plane. The eye observation position can be located in real time by a detection algorithm, so that in practical applications the system adapts to the observer's eye position in real time; from the detected eye position and the established coordinate system, the coordinates of the eye observation position, Po=(xo,yo,zo), are determined. The position of the target of interest is then detected by an ordinary camera 1 serving as the image acquisition device, whose field of view is kept perpendicular to the YOZ plane. This step relies on existing target-detection algorithms, such as lane-line detection or pedestrian detection. It is first necessary to determine the intrinsic parameters of camera 1, its extrinsic parameters, and its position in the real-world coordinate system; the acquired position of the target of interest is then converted by existing coordinate transformation methods from the image pixel coordinate system to the image physical coordinate system, finally yielding the real position of the target of interest in the three-dimensional coordinate system, Pw=(xw,yw,zw). In the embodiments of the present invention the target of interest is a point target, but the method of the invention applies equally to a line or even a solid target; a line, for example, can be determined by two points. With detection and localization of the viewer's eyes in place, the coordinate transformation of the target position can be computed in real time.
Further, transforming the real-world position of the target of interest into its projected position on the AR display device comprises: obtaining the equation of the surface on which the AR display device lies; obtaining the equation of the space line from the observation position Po and the real position Pw of the target of interest; and, from the surface equation and the line equation, obtaining the position coordinates of the target of interest on the AR display device, Ps=(xs,ys,zs).
Further, the equation of the surface on which the AR display device lies is obtained by: determining, in the three-dimensional coordinate system, the normal vector of the projection surface, n=(xn,yn,zn), and a fixed point Pi=(xi,yi,zi) through which the surface passes, and obtaining the surface equation from them. The surface on which the AR display device lies may be a plane or a curved surface, i.e. its equation may be a plane equation or a surface equation.
Further, obtaining the equation of the space line from the observation position Po and the real position Pw of the target of interest comprises: determining the space line and its direction vector d = Pw - Po = (xw-xo, yw-yo, zw-zo), and obtaining from the direction vector the equation of the space line:

(x-xo)/(xw-xo) = (y-yo)/(yw-yo) = (z-zo)/(zw-zo)
Specifically, in the established coordinate system, the normal vector of the projection plane, n=(xn,yn,zn), and a fixed point Pi=(xi,yi,zi) through which the plane passes are determined, giving the equation of the projection plane:
xn(x-xi)+yn(y-yi)+zn(z-zi)=0
The projection surface is described here as a plane by way of example, but it may equally well be a curved surface, in which case the corresponding surface equation is determined instead. From the observation position Po and the real position Pw of the target of interest, a space line Lo is determined, with direction vector d = (xd, yd, zd) = (xw-xo, yw-yo, zw-zo); its equation is:
(x-xo)/xd = (y-yo)/yd = (z-zo)/zd
Written in parametric form:
x = xd·t + xo
y = yd·t + yo
z = zd·t + zo
where t is the parameter to be solved for. Substituting the parametric equations of Lo into the equation of the projection plane gives:
t = [xn(xi-xo) + yn(yi-yo) + zn(zi-zo)] / (xn·xd + yn·yd + zn·zd)
Substituting this expression for t back into the parametric equations of Lo yields the position of the projection of the target of interest on the projection plane as seen from the observation position Po, Ps=(xs,ys,zs):
xs = xd·t + xo
ys = yd·t + yo
zs = zd·t + zo
Accurate calculation of the position coordinates is thus achieved: the position at which the virtual target must be displayed on the AR system's display device is computed directly from the real target position, avoiding the misalignment that would otherwise occur, due to inaccurate position coordinates, when virtual and real information are displayed at the same time.
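The projection step described above is a standard line-plane intersection. A minimal Python sketch, assuming the display surface is a plane given by a fixed point Pi and normal vector n as in the description (function and variable names are illustrative, not from the patent):

```python
# Sketch of the line-plane intersection used to project a real-world target
# onto the AR display plane. Po is the eye observation position, Pw the
# target's real position; the plane is given by a point Pi and normal n.

def project_to_display(Po, Pw, Pi, n):
    """Return Ps, the intersection of the line Po -> Pw with the plane (Pi, n)."""
    d = tuple(w - o for w, o in zip(Pw, Po))          # direction vector d = Pw - Po
    denom = sum(nc * dc for nc, dc in zip(n, d))      # n . d
    if abs(denom) < 1e-12:
        raise ValueError("line of sight is parallel to the display plane")
    # t = n . (Pi - Po) / n . d, as in the derivation above
    t = sum(nc * (ic - oc) for nc, ic, oc in zip(n, Pi, Po)) / denom
    return tuple(oc + t * dc for oc, dc in zip(Po, d))  # Ps = Po + t*d

# Example: eye 1.2 m above the ground, target 10 m ahead on the ground,
# vertical display plane at x = 2 with normal along the X axis.
Ps = project_to_display(Po=(0.0, 0.0, 1.2), Pw=(10.0, 0.0, 0.0),
                        Pi=(2.0, 0.0, 0.0), n=(1.0, 0.0, 0.0))
print(Ps)  # approximately (2.0, 0.0, 0.96)
```

The same helper applies unchanged to any eye/target pair, which is why the patent can recompute Ps in real time as the eye position moves.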
Further, converting the position coordinates of the target of interest on the AR display device into its coordinates on the AR projection device comprises: determining the equation of the surface on which the AR projection device lies; and, from the eye observation position, the position coordinates of the target of interest on the AR display device, the equation of the surface on which the AR display device lies, and the equation of the surface on which the AR projection device lies, obtaining the position coordinates of the target of interest on the AR projection device.
Specifically, the equation of the surface on which the AR projection device lies is determined in the same way. The projection line is perpendicular to the AR projection device and passes through the fixed point Ps, so its equation can be determined; the intersection of the projection line with the plane of the AR projection device is the position at which the target is to be displayed on the AR projection device, which yields the target's position on that device. Using an AR projection device is only one case: other projection apparatus may be used, and it is equally possible to dispense with the AR projection device and display the position coordinates of the virtual target, obtained by the foregoing calculation, directly on the AR display device. The position at which the target is to be displayed on the AR image display device can thus be calculated accurately; with the aid of the eye-localization device the image display position can be adjusted accurately in real time, the virtual projection on the AR display device matches the target in the real scene exactly, and seamless fusion of the virtual and the real is achieved, giving the AR system a better user experience. This solves the technical problem that AR systems in the prior art struggle to fuse virtual information seamlessly with real information, lack accurate calculation of position coordinates, and therefore show misalignment when virtual and real information are displayed together, reducing the realism of the virtual information.
Embodiment two
To better introduce the technical features and application of the coordinate transformation method for displaying detection information in an AR system of the present invention, the advantages of the invention are described in detail below with reference to a specific embodiment; see Figs. 2, 3 and 4.
With reference to Figs. 2-4, the detection and display of lane lines by a vehicle-mounted HUD system using the coordinate transformation method proposed by the present invention is further described, together with the derivation of the formulas.
In the first step, the position of the target of interest in the real-world coordinate system is obtained with the camera.
(1a) A real-world three-dimensional coordinate system XYZ is established as shown in Fig. 2, with the ground as the XOY plane;
(1b) As in Fig. 2, the position coordinates of camera 1, Pc=(xc,yc,zc), are determined in the real-world coordinate system, and the field of view of camera 1 is kept perpendicular to the YOZ plane;
(1c) The intrinsic parameters of camera 1 are determined: the physical size dx of a pixel along the U axis and dy along the V axis, the coordinates u0 and v0 of the image center on the U and V axes, and the camera focal length f;
(1d) The extrinsic parameters of camera 1 are determined: the rotation matrix r and the translation vector t of camera 1;
(1e) Lane lines are detected in the image obtained by camera 1 using an appropriate detection method, giving their positions in the image pixel coordinate system, Ppix=(upix,vpix); the positions of the two lane lines are then converted from the pixel coordinate system to the image physical coordinate system, Pimg=(ximg,yimg), by the standard transformation:
ximg = (upix - u0)·dx
yimg = (vpix - v0)·dy
(1f) As shown in Fig. 2, the positions of the two lane lines are converted from the image physical coordinate system Pimg=(ximg,yimg) to the coordinate system of camera 1, Pcam=(xcam,ycam,zcam), by the pinhole relation
xcam = zcam·ximg/f,  ycam = zcam·yimg/f,
where the depth zcam is fixed by the constraint that the lane lines lie on the ground plane;
(1g) The positions of the two lane lines are converted from the coordinate system of camera 1, Pcam=(xcam,ycam,zcam), to the real-world coordinate system Pw=(xw,yw,zw) using the extrinsic parameters, via the homogeneous relation
[Pcam; 1] = [r t; 0T 1]·[Pw; 1],
where 0T=[0,0,0]. This yields the position coordinates in the real-world coordinate system of the two actual lane lines on the road surface shown in Fig. 3, L1g=P11gP12g and L2g=P21gP22g, where P11g, P12g, P21g and P22g are the four endpoints of segments L1g and L2g, with coordinates:
P11g=(x11g,y11g,z11g)
P12g=(x12g,y12g,z12g)
P21g=(x21g,y21g,z21g)
P22g=(x22g,y22g,z22g)
The two lane lines lie in the XOY plane, so z11g = z12g = z21g = z22g = 0.
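The chain of transformations in (1c)-(1g) can be sketched as follows. This is a simplified illustration under assumptions not stated in the patent: the intrinsic values are made up, the camera axes are taken as aligned with the world axes (identity rotation), and the depth is supplied directly rather than recovered from the ground-plane constraint:

```python
# Sketch of step one's coordinate chain: pixel -> image-physical -> camera -> world.
# All numeric values below are hypothetical; a real system would calibrate them.

def pixel_to_physical(u, v, u0, v0, dx, dy):
    # (1e): shift to the principal point and scale by the pixel pitch
    return (u - u0) * dx, (v - v0) * dy

def physical_to_camera(x_img, y_img, f, z_cam):
    # (1f): pinhole model; the depth z_cam is assumed known here
    return (z_cam * x_img / f, z_cam * y_img / f, z_cam)

def camera_to_world(p_cam, cam_pos):
    # (1g) with identity rotation: just undo the camera translation
    return tuple(c + o for c, o in zip(p_cam, cam_pos))

# Hypothetical numbers: 0.01 mm pixels, principal point (320, 240), f = 4 mm,
# a lane-line pixel at (420, 240) seen at 8000 mm depth, camera at the origin.
x_img, y_img = pixel_to_physical(420, 240, 320, 240, 0.01, 0.01)
p_cam = physical_to_camera(x_img, y_img, f=4.0, z_cam=8000.0)
p_world = camera_to_world(p_cam, cam_pos=(0.0, 0.0, 0.0))
print(p_world)  # (2000.0, 0.0, 8000.0), in the same units as z_cam
```

With a real extrinsic calibration, `camera_to_world` would apply the inverse of the full rotation-plus-translation transform rather than a pure translation.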
In the second step, the plane equation of the plane on which the virtual-information display device lies is determined.
(2a) In Fig. 3, Peye denotes the driver's eye position, i.e. the eye observation position; its coordinates in the real-world coordinate system, Peye=(xeye,yeye,zeye), are detected and located in real time by an algorithm such as face detection;
(2b) Fig. 3 shows the two lane lines to be displayed on the display device (the vehicle windshield), L1s=P11sP12s and L2s=P21sP22s, whose four endpoints have the position coordinates:
P11s=(x11s,y11s,z11s)
P12s=(x12s,y12s,z12s)
P21s=(x21s,y21s,z21s)
P22s=(x22s,y22s,z22s)
(2c) The plane determined by the two lane lines L1s=P11sP12s and L2s=P21sP22s in Fig. 3 is the plane of the vehicle windshield; a fixed point Ps=(xs,ys,zs) through which the plane passes and the plane's normal vector ns=(xns,yns,zns) are determined, giving the point-normal equation of the plane:
xns(x-xs)+yns(y-ys)+zns(z-zs)=0
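The point-normal form used in (2c) requires a normal vector. If the windshield plane is instead known from three non-collinear points on it, the normal can be obtained as the cross product of two in-plane vectors; a minimal sketch with illustrative point coordinates:

```python
# Building the point-normal plane description of (2c) from three non-collinear
# points on the windshield plane. The corner points below are illustrative.

def plane_from_points(p0, p1, p2):
    """Return (point, normal) for the plane through p0, p1, p2."""
    u = tuple(b - a for a, b in zip(p0, p1))   # in-plane vector p1 - p0
    v = tuple(b - a for a, b in zip(p0, p2))   # in-plane vector p2 - p0
    n = (u[1] * v[2] - u[2] * v[1],            # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    return p0, n

def on_plane(p, point, normal, tol=1e-9):
    # point-normal test: n . (p - point) == 0 for points in the plane
    return abs(sum(n * (a - b) for n, a, b in zip(normal, p, point))) < tol

# A tilted windshield-like plane through three corner points.
point, normal = plane_from_points((1.0, -0.6, 0.8), (1.2, 0.6, 0.8), (1.4, -0.6, 1.4))
print(on_plane((1.3, 0.0, 1.1), point, normal))  # True: this point lies on the plane
```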
In the third step, the position coordinates of the target projected onto the display device are calculated.
(3a) Two points determine a line: from the driver's eye position Peye and the four endpoints of the two actual lane lines L1g and L2g, four space lines L1sg, L2sg, L3sg and L4sg are determined, whose direction vectors are respectively:
d11 = P11g - Peye = (x11d, y11d, z11d)
d12 = P12g - Peye = (x12d, y12d, z12d)
d21 = P21g - Peye = (x21d, y21d, z21d)
d22 = P22g - Peye = (x22d, y22d, z22d)
(3b) The point-direction form then gives the equations of the four lines, e.g. for L1sg:
(x-xeye)/x11d = (y-yeye)/y11d = (z-zeye)/z11d
from which the position coordinates of the intersections P11s, P12s, P21s and P22s of the windshield plane with the four lines can be found;
(3c) Taking point P11s as an example, this point is the intersection of the windshield plane with the line L1sg; the equation of L1sg is written in parametric form:
x = x11d·t + xeye
y = y11d·t + yeye
z = z11d·t + zeye
where t is the parameter to be solved for;
(3d) The parametric equations are substituted into the plane equation of the windshield plane:
(x11d·t+xeye-xs)xns+(y11d·t+yeye-ys)yns+(z11d·t+zeye-zs)zns=0
(3e) Solving gives the expression for the parameter t:
t = [(xs-xeye)xns + (ys-yeye)yns + (zs-zeye)zns] / (x11d·xns + y11d·yns + z11d·zns)
(3f) Substituting this expression for t into the parametric equations of line L1sg gives the coordinates of P11s:
x11s = x11d·t + xeye
y11s = y11d·t + yeye
z11s = z11d·t + zeye
(3g) The position coordinates of P12s, P21s and P22s can be found in the same way, which determines the positions of the two lane lines L1s and L2s projected by the HUD image display device onto the vehicle windshield.
In the fourth step, the position coordinates of the target on the projection apparatus are calculated.
(4a) In Fig. 4, the plane containing L1v=P11vP12v and L2v=P21vP22v is the plane of the HUD projection apparatus; it is parallel to the XOY plane, and its plane equation is taken as z = zv;
(4b) The two lane lines displayed on the HUD projection apparatus are L1v=P11vP12v and L2v=P21vP22v, where P11v, P12v, P21v and P22v denote the four endpoints of segments L1v and L2v, with position coordinates:
P11v=(x11v,y11v,zv)
P12v=(x12v,y12v,zv)
P21v=(x21v,y21v,zv)
P22v=(x22v,y22v,zv)
(4c) When the plane of the HUD image display device is parallel to the XOY plane, the lines P11sP11v, P12sP12v, P21sP21v and P22sP22v are all perpendicular to the XOY plane; that is, P11s and P11v, P12s and P12v, P21s and P21v, P22s and P22v share the same X and Y coordinates, while the Z coordinate is determined by the plane of the projection apparatus. Since P11s, P12s, P21s and P22s were found in step three, the coordinates of P11v, P12v, P21v and P22v follow immediately:
x11v = x11s, y11v = y11s, z11v = zv
and likewise for P12v, P21v and P22v.
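Under the assumption of (4c) that the HUD projection plane is parallel to the XOY plane at height z = zv, the mapping from windshield points to projection-apparatus points reduces to replacing the Z coordinate. A short sketch with made-up windshield coordinates:

```python
# Sketch of step four's mapping: each windshield point keeps its X and Y
# coordinates and only the Z coordinate is replaced by the HUD plane height.
# The windshield endpoint coordinates below are illustrative.

def windshield_to_hud(p_s, z_v):
    """Map a windshield point P_s onto the HUD projection plane z = z_v."""
    x_s, y_s, _ = p_s
    return (x_s, y_s, z_v)

# Four windshield endpoints from step three (made-up values), HUD plane at z = 0.5.
windshield_pts = [(1.5, -0.4, 1.0), (1.5, 0.4, 1.0), (1.8, -0.5, 1.1), (1.8, 0.5, 1.1)]
hud_pts = [windshield_to_hud(p, z_v=0.5) for p in windshield_pts]
print(hud_pts[0])  # (1.5, -0.4, 0.5)
```

If the projection plane were not parallel to XOY, the general line-plane intersection of step three would be needed here instead of this shortcut.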
(4d) Using the formula of step (4c), the lane-line positions displayed on the HUD projection apparatus can be calculated directly from the driver's eye position, the lane-line positions obtained by camera 1, the plane of the vehicle windshield and the plane of the HUD image display device, and it is guaranteed that, seen from the driver's eye position, the lane lines projected onto the windshield coincide exactly with the actual lane lines. Accurate position-coordinate transformation is thus achieved; combined with real-time localization of the viewer's eyes, the image display position of the AR system can be adjusted automatically, guaranteeing that, within the viewer's field of view, the virtual projection on the AR display device matches the target in the real scene exactly, achieving the technical effect of seamless fusion of the virtual and the real.
The one or more technical solutions in the embodiments of the present invention have at least the following one or more technical effects:
The coordinate transformation method for displaying detection information in an AR system provided by the embodiments of the present invention comprises: first determining the coordinates of the observer's eye position and obtaining, with camera 1, the position of the target of interest in the real-world coordinate system; determining the equation of the plane on which the virtual-information display device lies; calculating the coordinates at which the real position of the target of interest projects onto the AR display device; and then, from those coordinates, calculating the position of the target on the projection apparatus. Accurate position-coordinate transformation is thus achieved; combined with real-time localization of the viewer's eyes, the image display position of the AR system can be adjusted automatically, guaranteeing that, within the viewer's field of view, the virtual projection on the AR display device matches the target in the real scene exactly, achieving seamless fusion of the virtual and the real. This solves the technical problem that AR systems in the prior art struggle to fuse virtual information seamlessly with real information, lack accurate calculation of position coordinates, and therefore show misalignment when virtual and real information are displayed together, reducing the realism of the virtual information.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, can make additional changes and modifications to these embodiments. The appended claims are therefore intended to be interpreted as covering the preferred embodiments and all changes and modifications that fall within the scope of the present invention.
Obviously, those skilled in the art can make various modifications and variations to the embodiments of the present invention without departing from their spirit and scope. If these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.
Claims (6)
1. A coordinate transformation method for displaying detection information in an AR system, characterized in that the method comprises:
determining the human-eye observation position and the position of a target of interest;
transforming the real position of the target of interest into the position at which it projects onto an AR display device;
converting the position coordinates of the target of interest on the AR display device into the coordinates of the target of interest on an AR projection device.
2. The method of claim 1, characterized in that determining the human-eye observation position and the position of the target of interest comprises:
establishing a three-dimensional coordinate system in the real world, the coordinate system taking the ground as the XOY plane and the upward direction perpendicular to the ground as the positive Z axis;
obtaining the human-eye observation position Po = (xo, yo, zo);
detecting the position of the target of interest by an image acquisition device, the target of interest being the target that needs to be displayed virtually;
obtaining the real position Pw = (xw, yw, zw) of the target of interest in the three-dimensional coordinate system.
3. The method of claim 2, characterized in that transforming the real position of the target of interest into the position at which it projects onto the AR display device comprises:
obtaining the equation of the plane in which the AR display device lies;
obtaining the equation of the space line from the observation position Po and the real position Pw of the target of interest;
obtaining, from the plane equation of the AR display device and the line equation, the position coordinates Ps = (xs, ys, zs) of the target of interest on the AR display device.
4. The method of claim 3, characterized in that obtaining the equation of the plane in which the AR display device lies comprises:
determining, in the three-dimensional coordinate system, the normal vector n = (a, b, c) of the projection plane and a first fixed point Pi = (xi, yi, zi) through which the plane of the AR display device passes;
obtaining the equation of the plane of the AR display device from its normal vector and the first fixed point.
5. The method of claim 3, characterized in that obtaining the equation of the space line from the observation position Po and the real position Pw of the target of interest comprises:
determining the space line from the observation position Po and the real position Pw of the target of interest, the direction vector of the space line being (xw − xo, yw − yo, zw − zo);
obtaining, from the direction vector of the space line, the equation of the space line:
(x − xo)/(xw − xo) = (y − yo)/(yw − yo) = (z − zo)/(zw − zo).
6. The method of claim 4, characterized in that converting the position coordinates of the target of interest on the AR display device into the coordinates of the target of interest on the AR projection device comprises:
determining the equation of the plane in which the AR projection device lies;
obtaining the position coordinates of the target of interest on the AR projection device from the human-eye observation position, the position coordinates of the target of interest on the AR display device, the equation of the plane of the AR display device, and the equation of the plane of the AR projection device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910394854.5A CN110298924A (en) | 2019-05-13 | 2019-05-13 | For showing the coordinate transformation method of detection information in a kind of AR system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110298924A true CN110298924A (en) | 2019-10-01 |
Family
ID=68026909
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910394854.5A Pending CN110298924A (en) | 2019-05-13 | 2019-05-13 | For showing the coordinate transformation method of detection information in a kind of AR system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110298924A (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110856254A (en) * | 2019-11-22 | 2020-02-28 | 上海图聚智能科技股份有限公司 | Vision-based indoor positioning method, device, equipment and storage medium |
CN111405263A (en) * | 2019-12-26 | 2020-07-10 | 的卢技术有限公司 | Method and system for enhancing head-up display by combining two cameras |
CN111918114A (en) * | 2020-07-31 | 2020-11-10 | 北京市商汤科技开发有限公司 | Image display method, image display device, display equipment and computer readable storage medium |
CN112464870A (en) * | 2020-12-08 | 2021-03-09 | 未来汽车科技(深圳)有限公司 | Target object real scene fusion method, system, equipment and storage medium for AR-HUD |
CN115002431A (en) * | 2022-05-20 | 2022-09-02 | 广景视睿科技(深圳)有限公司 | Projection method, control device and projection system |
CN115995161A (en) * | 2023-02-01 | 2023-04-21 | 华人运通(上海)自动驾驶科技有限公司 | Method and electronic device for determining parking position based on projection |
WO2023071834A1 (en) * | 2021-10-28 | 2023-05-04 | 虹软科技股份有限公司 | Alignment method and alignment apparatus for display device, and vehicle-mounted display system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107564089A (en) * | 2017-08-10 | 2018-01-09 | 腾讯科技(深圳)有限公司 | Three dimensional image processing method, device, storage medium and computer equipment |
CN107784693A (en) * | 2017-09-22 | 2018-03-09 | 西安点云生物科技有限公司 | A kind of information processing method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110298924A (en) | For showing the coordinate transformation method of detection information in a kind of AR system | |
CN107564089B (en) | Three-dimensional image processing method, device, storage medium and computer equipment | |
Behringer | Registration for outdoor augmented reality applications using computer vision techniques and hybrid sensors | |
CN102968809B (en) | The method of virtual information mark and drafting marking line is realized in augmented reality field | |
CN106228538B (en) | Binocular vision indoor orientation method based on logo | |
CN106934772B (en) | Horizontal calibration method and system for panoramic image or video and portable terminal | |
CN104750969B (en) | The comprehensive augmented reality information superposition method of intelligent machine | |
WO2018163898A1 (en) | Free viewpoint movement display device | |
Li et al. | Easy calibration of a blind-spot-free fisheye camera system using a scene of a parking space | |
CN106920276B (en) | A kind of three-dimensional rebuilding method and system | |
CN111199560B (en) | Video monitoring positioning method and video monitoring system | |
CN110148169A (en) | A kind of vehicle target 3 D information obtaining method based on PTZ holder camera | |
CN101408422B (en) | Traffic accident on-site mapper based on binocular tridimensional all-directional vision | |
KR20150013709A (en) | A system for mixing or compositing in real-time, computer generated 3d objects and a video feed from a film camera | |
WO2009065003A1 (en) | Method and apparatus of taking aerial surveys | |
CN107145224B (en) | Human eye sight tracking and device based on three-dimensional sphere Taylor expansion | |
CN104463778A (en) | Panoramagram generation method | |
CN109724586B (en) | Spacecraft relative pose measurement method integrating depth map and point cloud | |
CN109685855A (en) | A kind of camera calibration optimization method under road cloud monitor supervision platform | |
JP2004265396A (en) | Image forming system and image forming method | |
WO2009069165A2 (en) | Transition method between two three-dimensional geo-referenced maps | |
Zheng et al. | Scanning scene tunnel for city traversing | |
CN111914790A (en) | Real-time human body rotation angle identification method based on double cameras under different scenes | |
CN110197524B (en) | Stereoscopic display method, apparatus, device, and computer-readable storage medium | |
Aliakbarpour et al. | Geometric exploration of virtual planes in a fusion-based 3D data registration framework |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20191001 |