CN106710002A - AR implementation method and system based on positioning of visual angle of observer - Google Patents


Info

Publication number
CN106710002A
CN106710002A (application CN201611244524.0A)
Authority
CN
China
Prior art keywords
scene
real
observer
attitude
dimensional virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611244524.0A
Other languages
Chinese (zh)
Inventor
刘林运
温晓晴
丁淑华
田媛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHENZHEN DLP DIGITAL TECHNOLOGY CO LTD
Original Assignee
SHENZHEN DLP DIGITAL TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHENZHEN DLP DIGITAL TECHNOLOGY CO LTD filed Critical SHENZHEN DLP DIGITAL TECHNOLOGY CO LTD
Priority to CN201611244524.0A priority Critical patent/CN106710002A/en
Publication of CN106710002A publication Critical patent/CN106710002A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an AR implementation method and system based on positioning of the observer's viewing angle. The method comprises the steps of: obtaining the position and attitude of an observer located in a real scene and assigning them to a virtual camera; sending the position and attitude to a rendering center for rendering to obtain a three-dimensional virtual scene; sending the three-dimensional virtual scene to a display device worn by the observer, superimposing it on the real scene, and displaying the superimposed scene; and matching real objects in the real scene with virtual objects in the three-dimensional virtual scene. The method fuses and displays the real world and a virtual world. Because rendering is performed by a rendering center integrated in a server, the method places no restriction on the size or complexity of the real scene, offers strong rendering capability, has strong applicability, and has a wide range of applications.

Description

AR implementation method and system based on observer viewing-angle positioning
Technical field
The present invention relates to the technical field of AR (augmented reality), and more specifically to an AR implementation method and system based on observer viewing-angle positioning.
Background art
VR (virtual reality) and AR (augmented reality) are emerging display technologies that have become popular worldwide in recent years; many enterprises are developing terminals and content services around them. AR in particular must not only superimpose the image of the virtual scene seamlessly on the real scene, but must also keep the virtual scene synchronized with the real scene as the observer's position and viewing angle move, and therefore has a relatively high technical difficulty.
Current AR technology mainly uses the following two implementations:
The first is based on depth-recognition technology: the real scene is scanned in three dimensions and its positions are extracted in advance, and virtual objects and scenes are designed and produced on that basis. In actual use, the observer's position and viewing angle are obtained in real time and assigned to a virtual camera; the viewpoint position is passed to the three-dimensional rendering engine as the positional information of the three-dimensional virtual objects, so that the real physical space and the virtual scene are superimposed and fused. Microsoft's HoloLens is an example.
The second is based on image-recognition technology: real three-dimensional objects are scanned and registered in advance, and in actual use real objects are recognized in real time and a virtual enhancement is displayed for them. The 0glass AR glasses are an example.
Each approach has shortcomings. The first has two: the range of depth recognition is limited, working best within 4-5 meters and essentially failing to recognize anything beyond 10 meters; and the microprocessor is integrated into the AR glasses worn by the observer, so rendering capability is limited, the fineness and complexity of the scene are insufficient, and rich dynamic elements cannot be supported. The shortcoming of the second is that it has no three-dimensional positional information about the real scene; it augments only three-dimensional objects rather than the three-dimensional real scene, so its range of application is very limited.
It is therefore necessary to design an AR implementation method based on observer viewing-angle positioning that places no restriction on the size or complexity of the real scene, offers high rendering capability and strong applicability, and has a wide range of applications.
Summary of the invention
The object of the present invention is to overcome the defects of the prior art and to provide an AR implementation method and system based on observer viewing-angle positioning.
To achieve the above object, the present invention adopts the following technical scheme. An AR implementation method based on observer viewing-angle positioning comprises:
obtaining the position and attitude of an observer located in a real scene, and assigning them to a virtual camera;
sending the position and the attitude to a rendering center for rendering, and obtaining a three-dimensional virtual scene;
sending the three-dimensional virtual scene to a display device worn by the observer, and displaying it after superimposing it on the real scene; and
matching real objects in the real scene with virtual objects of the three-dimensional virtual scene.
In a further technical scheme, in the step of obtaining the position and attitude of the observer located in the real scene and assigning them to the virtual camera, the position of the observer located in the real scene is obtained using at least one of beam positioning, infrared positioning and ultrasonic positioning.
In a further technical scheme, in the step of obtaining the position and attitude of the observer located in the real scene and assigning them to the virtual camera, the attitude of the observer located in the real scene is obtained using gyroscope attitude positioning.
In a further technical scheme, the step of sending the position and the attitude to the rendering center for rendering and obtaining the three-dimensional virtual scene includes the following steps:
sending the position and the attitude to the rendering center in a server; and
receiving a triggering instruction from a control end in the server, whereupon the rendering center renders the three-dimensional virtual scene.
In a further technical scheme, the step of matching the real objects in the real scene with the virtual objects of the three-dimensional virtual scene specifically includes the following steps:
measuring in advance, under the viewing angle of the observer, the position and attitude of a real object in the real scene; and
adjusting the position and attitude of the corresponding virtual object of the three-dimensional virtual scene according to the measured position and attitude of the real object.
In a further technical scheme, the step of matching the real objects in the real scene with the virtual objects of the three-dimensional virtual scene specifically comprises adjusting in real time the position parameters and attitude parameters of the virtual objects of the three-dimensional virtual scene rendered by the rendering center, until the real objects in the real scene match the virtual objects of the three-dimensional virtual scene.
In a further technical scheme, after the step of matching the real objects in the real scene with the virtual objects of the three-dimensional virtual scene, the method further includes performing interactive operations in the three-dimensional virtual scene using an interactive device.
The present invention also provides an AR implementation system based on observer viewing-angle positioning, including an acquiring unit, a rendering unit, a superposition unit and a matching unit:
the acquiring unit is used for obtaining the position and attitude of the observer located in the real scene and assigning them to a virtual camera;
the rendering unit is used for sending the position and the attitude to a rendering center for rendering and obtaining a three-dimensional virtual scene;
the superposition unit is used for sending the three-dimensional virtual scene to a display device worn by the observer and displaying it after superimposing it on the real scene; and
the matching unit is used for matching real objects in the real scene with virtual objects of the three-dimensional virtual scene.
In a further technical scheme, the rendering unit includes a sending module and a receiving-and-rendering module:
the sending module is used for sending the position and the attitude to the rendering center in a server; and
the receiving-and-rendering module is used for receiving a triggering instruction from a control end in the server, whereupon the rendering center renders the three-dimensional virtual scene.
In a further technical scheme, the matching unit includes a measurement module and an adjusting module:
the measurement module is used for measuring in advance, under the viewing angle of the observer, the position and attitude of a real object in the real scene; and
the adjusting module is used for adjusting the position and attitude of the virtual object of the three-dimensional virtual scene according to the measured position and attitude of the real object.
Compared with the prior art, the present invention has the following advantages. In the AR implementation method based on observer viewing-angle positioning, the position and attitude of the observer located in the real scene are obtained; the position, attitude and trigger-signal data of an interactive device are given to the rendering center, where they act as trigger signals for rendering various animations; according to the observer's position and attitude, the rendering center located in the server renders the three-dimensional virtual scene; the three-dimensional virtual scene is sent to the AR glasses worn by the observer and superimposed on the real scene for display; and the displayed content is adjusted and matched. The real world and the virtual world are thereby fused and displayed. Because rendering is performed by the rendering center in the server, the rendering process is integrated in the server; the size and complexity of the real scene are unrestricted, the rendering capability is high, the applicability is strong, and the method has a wide range of applications.
The present invention is further described below with reference to the accompanying drawings and specific embodiments.
Brief description of the drawings
Fig. 1 is a flow chart of the AR implementation method based on observer viewing-angle positioning provided by a specific embodiment of the present invention;
Fig. 2 is a detailed flow chart of sending the position and attitude to the rendering center for rendering and obtaining the three-dimensional virtual scene, provided by a specific embodiment of the present invention;
Fig. 3 is a detailed flow chart of matching the real objects in the real scene with the virtual objects of the three-dimensional virtual scene, provided by a specific embodiment of the present invention;
Fig. 4 is a structural block diagram of the AR implementation system based on observer viewing-angle positioning provided by a specific embodiment of the present invention;
Fig. 5 is a structural block diagram of the rendering unit provided by a specific embodiment of the present invention;
Fig. 6 is a structural block diagram of the matching unit provided by a specific embodiment of the present invention.
Specific embodiment
In order to better understand the technical content of the present invention, the technical scheme is further introduced and explained below with reference to specific embodiments, but the invention is not limited to them.
As shown in Figs. 1 to 6, the AR implementation method based on observer viewing-angle positioning provided by this embodiment can be used in any scene; it places no restriction on the size or complexity of the real scene, has high rendering capability and strong applicability, and has a wide range of applications.
As shown in Fig. 1, the AR implementation method based on observer viewing-angle positioning includes:
S1, obtaining the position and attitude of the observer located in the real scene, and assigning them to a virtual camera;
S2, sending the position and the attitude to a rendering center for rendering, and obtaining a three-dimensional virtual scene;
S3, sending the three-dimensional virtual scene to a display device worn by the observer, and displaying it after superimposing it on the real scene;
S4, matching real objects in the real scene with virtual objects of the three-dimensional virtual scene.
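The flow of steps S1 to S4 can be sketched as a minimal pipeline. All names here (`Pose`, `render_center`, `display_overlay`) are illustrative, not taken from the patent, and the server-side rendering engine is reduced to a stub:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in the real-scene coordinate frame
    attitude: tuple  # e.g. (yaw, pitch, roll) in degrees

class VirtualCamera:
    """The virtual camera that receives the observer's pose (step S1)."""
    def __init__(self):
        self.pose = None
    def assign(self, pose):
        self.pose = pose

def render_center(pose):
    # Stub for the server-side rendering engine (step S2): returns a
    # "three-dimensional virtual scene" tagged with its viewpoint.
    return {"viewpoint": pose, "objects": ["virtual_object"]}

def display_overlay(virtual_scene, real_scene):
    # Step S3: the display device shows real and virtual layers together.
    return {"layers": [real_scene, virtual_scene]}

observer_pose = Pose(position=(1.0, 0.0, 2.0), attitude=(90.0, 0.0, 0.0))
camera = VirtualCamera()
camera.assign(observer_pose)                                  # S1
scene = render_center(camera.pose)                            # S2
frame = display_overlay(scene, {"objects": ["real_object"]})  # S3
```

Step S4 (matching) is detailed separately in steps S41 and S42 below.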
In this embodiment, steps S1 to S4 need to be performed only on first use in a given real scene; in other embodiments they are performed on every use in that real scene, depending on the actual situation.
In step S1, the position and attitude of the observer located in the real scene are obtained and assigned to the virtual camera. Specifically, the position of the observer located in the real scene is obtained using at least one of beam positioning, infrared positioning and ultrasonic positioning.
In beam positioning, positioning light towers that emit lasers are built in the space to be located and sweep the space with laser fire; multiple optical-signal receivers are placed on the object to be positioned, and the receiving end performs calculation on the data to obtain the three-dimensional position directly, as in laser positioning systems.
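Under the stated assumption that each optical receiver can recover a bearing angle toward a known tower from the timing of the laser sweep, the position fix can be sketched in 2D as the intersection of two bearing rays (a real system works in 3D with more towers and receivers; the function name and signature are illustrative):

```python
import math

def locate_2d(tower_a, angle_a, tower_b, angle_b):
    """Intersect two bearing rays (angles in radians from the +x axis)
    emitted from two laser towers at known 2D positions."""
    ax, ay = tower_a
    bx, by = tower_b
    dax, day = math.cos(angle_a), math.sin(angle_a)  # direction of ray A
    dbx, dby = math.cos(angle_b), math.sin(angle_b)  # direction of ray B
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; no unique fix")
    # Solve tower_a + t * dir_a = tower_b + u * dir_b for t.
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)
```

For example, a receiver seen at 45 degrees from a tower at the origin and at 135 degrees from a tower at (4, 0) is located at (2, 2).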
The attitude of the observer located in the real scene is obtained and assigned to the virtual camera mainly in order to obtain the observer's viewing angle; specifically, the attitude of the observer located in the real scene is obtained here using gyroscope attitude positioning.
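Gyroscope attitude positioning amounts to integrating the measured angular rate into an orientation estimate. A minimal first-order quaternion integration sketch follows; it is an assumption for illustration, not from the patent, and real systems typically also fuse accelerometer or magnetometer data to limit drift:

```python
import math

def integrate_gyro(q, omega, dt):
    """Advance orientation quaternion q = (w, x, y, z) by the body
    angular rate omega = (wx, wy, wz) in rad/s over dt seconds."""
    w, x, y, z = q
    wx, wy, wz = omega
    # Quaternion kinematics: q_dot = 0.5 * q (x) (0, omega)
    dw = 0.5 * (-x * wx - y * wy - z * wz)
    dx = 0.5 * ( w * wx + y * wz - z * wy)
    dy = 0.5 * ( w * wy - x * wz + z * wx)
    dz = 0.5 * ( w * wz + x * wy - y * wx)
    w, x, y, z = w + dw * dt, x + dx * dt, y + dy * dt, z + dz * dt
    n = math.sqrt(w * w + x * x + y * y + z * z)  # renormalise to unit length
    return (w / n, x / n, y / n, z / n)
```

Integrating a constant rotation of pi/2 rad/s about the vertical axis for one second in small steps yields (approximately) the quaternion for a 90-degree turn.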
Further, step S2, sending the position and the attitude to the rendering center for rendering and obtaining the three-dimensional virtual scene, includes the following steps:
S21, sending the position and the attitude to the rendering center in a server;
S22, receiving a triggering instruction from a control end in the server, whereupon the rendering center renders the three-dimensional virtual scene.
In step S21, the position and the attitude are sent to the rendering center in the server. The rendering center is a rendering engine that is provided separately in the server rather than integrated into the display device worn by the observer, so an engine with higher rendering capability can be used; arbitrary complexity and fineness of the virtual scene can therefore be guaranteed, and various complicated animations and special effects can be achieved.
In step S22, a triggering instruction from the control end in the server is received and the rendering center renders the three-dimensional virtual scene. The rendering mainly consists of assigning the observer's viewing angle and position to the virtual camera; the virtual camera records the real scene as acquired from the observer's viewing angle and position, and the three-dimensional virtual scene is rendered accordingly.
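Assigning the observer's position and viewing angle to the virtual camera amounts to building a world-to-camera transform from them. The sketch below is deliberately reduced, using only the yaw component of the attitude; a full implementation would use the complete attitude, and all names are illustrative:

```python
import math

def view_matrix(position, yaw_deg):
    """World-to-camera 4x4 matrix for a camera at `position`, rotated
    `yaw_deg` degrees about the vertical (y) axis."""
    px, py, pz = position
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    # Rows are the transposed rotation; the last column is -R^T * position.
    return [
        [c,   0.0, -s,  -(c * px - s * pz)],
        [0.0, 1.0, 0.0, -py],
        [s,   0.0, c,   -(s * px + c * pz)],
        [0.0, 0.0, 0.0, 1.0],
    ]

def transform(m, p):
    """Apply the 4x4 matrix to a 3D point (homogeneous w = 1)."""
    x, y, z = p
    return tuple(m[i][0] * x + m[i][1] * y + m[i][2] * z + m[i][3]
                 for i in range(3))
```

In such a sketch the rendering center would hand this matrix to the rendering engine as the virtual camera's view transform for each frame.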
In step S3, the three-dimensional virtual scene is sent to the display device worn by the observer and displayed after being superimposed on the real scene: the three-dimensional virtual scene is presented by the display device worn by the observer and simply superimposed on the real scene for display. Sending the three-dimensional virtual scene means transmitting its video signal; current wearable devices generally receive the video signal over a wired connection, although it can also be transmitted wirelessly.
In this embodiment, step S4, matching the real objects in the real scene with the virtual objects of the three-dimensional virtual scene, specifically includes the following steps:
S41, measuring in advance, under the viewing angle of the observer, the position and attitude of a real object in the real scene;
S42, adjusting the position and attitude of the virtual object of the three-dimensional virtual scene according to the measured position and attitude of the real object.
In step S41, the specific position and attitude of the real object in the real scene under the observer's viewing angle are obtained first; after the three-dimensional virtual scene is generated, the position and attitude of its virtual object are adjusted according to the position and attitude obtained above. The matching is thus settled in one pass, so adjustment and matching are fast.
In addition, in other embodiments, step S4 specifically comprises adjusting in real time the position parameters and attitude parameters of the virtual objects of the three-dimensional virtual scene rendered by the rendering center, until the real objects in the real scene match the virtual objects of the three-dimensional virtual scene. Here the adjustment is made according to the effect after superimposed display: the matching effect is reached by changing the position parameters and attitude parameters of the virtual objects of the three-dimensional virtual scene.
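The real-time variant can be pictured as a feedback loop that nudges the rendered virtual object's parameters toward the observed real object until the residual falls below a tolerance. The gain, tolerance and iteration cap below are illustrative assumptions, not values from the patent:

```python
def adjust_until_matched(render_pose, target_pose, gain=0.5, tol=1e-3, max_iter=200):
    """Iteratively move the rendered virtual object's position parameters
    toward the observed real object's position until they match."""
    pose = list(render_pose)
    for _ in range(max_iter):
        err = [t - p for t, p in zip(target_pose, pose)]
        if max(abs(e) for e in err) < tol:
            break  # matched within tolerance
        pose = [p + gain * e for p, e in zip(pose, err)]  # proportional nudge
    return tuple(pose)
```

With a gain of 0.5 the residual halves each pass, so a pose several metres off converges within a tolerance of 1e-3 in about a dozen iterations.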
Either matching approach recognizes real objects and matches them with virtual objects based on the position and viewing-angle information of the three-dimensional scene, has strong applicability, and has a wide range of applications.
In this embodiment, after step S4, the method further includes S5, performing interactive operations in the three-dimensional virtual scene using an interactive device. The interactive device can be any interactive device, for example an interactive handle, gloves or a wristband; by giving the position and attitude of the interactive handle to the rendering center as trigger signals, various animation triggers are rendered, so that various interactive operations are realized.
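The trigger mechanism of step S5 can be sketched as a small dispatch table on the render-center side: each registered trigger signal maps to an animation, and an incoming device pose with a matching signal fires the corresponding animation render. All names here are hypothetical illustrations:

```python
class RenderCenterDispatch:
    """Sketch of the render-center side of interaction: interactive
    devices send pose plus trigger-signal data, and each registered
    trigger fires an animation render."""
    def __init__(self):
        self.animations = {}  # trigger signal -> animation name
        self.fired = []       # (animation, device pose) render log
    def on_trigger(self, signal, animation):
        self.animations[signal] = animation
    def receive(self, device_pose, signal=None):
        if signal in self.animations:
            self.fired.append((self.animations[signal], device_pose))

center = RenderCenterDispatch()
center.on_trigger("grab", "pick_up_animation")
center.receive(device_pose=(0.2, 1.1, 0.5), signal="grab")
```

A handle press would thus reach the server as a ("pose", "signal") pair, and the server, not the headset, renders the resulting animation.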
In the AR implementation method based on observer viewing-angle positioning described above, the position and attitude of the observer located in the real scene are obtained, and the position, attitude and trigger-signal data of the interactive device are given to the rendering center, where they act as trigger signals for rendering various animations; according to the observer's position and attitude, the rendering center in the server renders the three-dimensional virtual scene, which is sent to the display device worn by the observer and superimposed on the real scene for display, and the displayed content is adjusted and matched. The real world and the virtual world are thereby fused and displayed; because rendering is performed by the rendering center in the server, the rendering process is integrated in the server, the size and complexity of the real scene are unrestricted, the rendering capability is high, the applicability is strong, and the method has a wide range of applications.
The display device includes at least one of AR glasses, a helmet and a mobile terminal; in other embodiments it also includes other electronic products capable of displaying the three-dimensional virtual scene.
When the display device is a mobile terminal, viewing takes place through a device with a shooting function, such as a mobile phone: the real scene is shot by the camera of the mobile terminal and then superimposed with the virtual scene, and a delay is applied to the shot real footage so that the shot real foreground can be synchronized with the rendered virtual part.
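The delay applied to the shot footage can be sketched as a fixed-length frame queue: live camera frames are held back a few frames so that each one is composited with the virtual layer rendered for it. The two-frame delay below is an illustrative assumption:

```python
from collections import deque

class DelayBuffer:
    """Delay the live camera feed by a fixed number of frames so it
    lines up with the later-arriving rendered virtual layer."""
    def __init__(self, delay_frames):
        self.queue = deque()
        self.delay = delay_frames
    def push(self, frame):
        self.queue.append(frame)
        if len(self.queue) > self.delay:
            return self.queue.popleft()  # frame now old enough to composite
        return None  # buffer still filling: nothing to display yet

buf = DelayBuffer(delay_frames=2)
outputs = [buf.push(f) for f in ["f0", "f1", "f2", "f3"]]
# outputs: [None, None, "f0", "f1"]
```

Each real frame thus emerges exactly `delay_frames` pushes late, which is when the virtual render for it would arrive from the server in this sketch.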
As shown in Fig. 4, the AR implementation system based on observer viewing-angle positioning includes an acquiring unit 10, a rendering unit 20, a superposition unit 30 and a matching unit 40:
the acquiring unit 10 is used for obtaining the position and attitude of the observer located in the real scene and assigning them to a virtual camera;
the rendering unit 20 is used for sending the position and the attitude to a rendering center for rendering and obtaining a three-dimensional virtual scene;
the superposition unit 30 is used for sending the three-dimensional virtual scene to a display device worn by the observer and displaying it after superimposing it on the real scene; and
the matching unit 40 is used for matching real objects in the real scene with virtual objects of the three-dimensional virtual scene.
The acquiring unit 10 specifically obtains the position of the observer located in the real scene using at least one of beam positioning, infrared positioning and ultrasonic positioning, and obtains the attitude of the observer located in the real scene using gyroscope attitude positioning.
In beam positioning, positioning light towers that emit lasers are built in the space to be located and sweep the space with laser fire; multiple optical-signal receivers are placed on the object to be positioned, and the receiving end performs calculation on the data to obtain the three-dimensional position directly, as in laser positioning systems.
The rendering unit 20 includes a sending module 21 and a receiving-and-rendering module 22:
the sending module 21 is used for sending the position and the attitude to the rendering center in a server; and
the receiving-and-rendering module 22 is used for receiving a triggering instruction from a control end in the server, whereupon the rendering center renders the three-dimensional virtual scene.
The rendering center referred to in the sending module 21 is a rendering engine provided separately in the server rather than integrated into the display device worn by the observer, so an engine with higher rendering capability can be used; arbitrary complexity and fineness of the virtual scene can therefore be guaranteed, and various complicated animations and special effects can be achieved.
The receiving-and-rendering module 22 renders mainly by assigning the observer's viewing angle and position to the virtual camera; the virtual camera records the real scene as acquired from the observer's viewing angle and position, and the three-dimensional virtual scene is rendered accordingly.
The superposition unit 30 presents the three-dimensional virtual scene through the display device worn by the observer, simply superimposed on the real scene for display.
In this embodiment, the matching unit 40 includes a measurement module 41 and an adjusting module 42:
the measurement module 41 is used for measuring in advance, under the viewing angle of the observer, the position and attitude of a real object in the real scene; and
the adjusting module 42 is used for adjusting the position and attitude of the virtual object of the three-dimensional virtual scene according to the measured position and attitude of the real object.
The measurement module 41 mainly obtains first the specific position and attitude of the real object in the real scene under the observer's viewing angle; after the three-dimensional virtual scene is generated, the position and attitude of its virtual object are adjusted according to the position and attitude obtained above, settling the matching in one pass so that adjustment and matching are fast.
In other embodiments, the matching unit 40 adjusts in real time the position parameters and attitude parameters of the virtual objects of the three-dimensional virtual scene rendered by the rendering center, until the real objects in the real scene match the virtual objects of the three-dimensional virtual scene. The adjustment is made according to the effect after superimposed display: the matching effect is reached by changing the position parameters and attitude parameters of the virtual objects of the three-dimensional virtual scene.
The matching unit 40 recognizes real objects and matches them with virtual objects based on the position and viewing-angle information of the three-dimensional scene, has strong applicability, and has a wide range of applications.
In this embodiment, the AR implementation system based on observer viewing-angle positioning also includes an operating unit 50, which is used for performing interactive operations in the three-dimensional virtual scene with an interactive device. The interactive device here can be any interactive device, for example an interactive handle, gloves or a wristband; by giving the position and attitude of the interactive handle to the rendering center as trigger signals, various animation triggers are rendered, so that various interactive operations are realized.
In the AR implementation system based on observer viewing-angle positioning described above, the acquiring unit 10 obtains the position and attitude of the observer located in the real scene, and the position, attitude and trigger-signal data of the interactive device are given to the rendering center, where the trigger signals cause various animations to be rendered; the rendering unit 20 uses the rendering center in the server to render the three-dimensional virtual scene according to the observer's position and attitude and sends it to the display device worn by the observer; the superposition unit 30 superimposes the three-dimensional virtual scene on the real scene for display; and the matching unit 40 adjusts and matches the displayed content. The real world and the virtual world are thereby fused and displayed; because rendering is performed by the rendering center in the server, the rendering process is integrated in the server, the size and complexity of the real scene are unrestricted, the rendering capability is high, the applicability is strong, and the system has a wide range of applications.
The display device includes at least one of AR glasses, a helmet and a mobile terminal; in other embodiments it also includes other electronic products capable of displaying the three-dimensional virtual scene.
The above embodiments only further illustrate the technical content of the present invention so that readers can understand it more easily; they do not mean that the embodiments of the present invention are limited thereto, and any technical extension or re-creation made according to the present invention falls under the protection of the present invention. The protection scope of the present invention is defined by the claims.

Claims (10)

1. An AR implementation method based on observer viewing-angle positioning, characterised in that the method comprises:
obtaining the position and attitude of an observer located in a real scene, and assigning them to a virtual camera;
sending the position and the attitude to a rendering center for rendering, and obtaining a three-dimensional virtual scene;
sending the three-dimensional virtual scene to a display device worn by the observer, and displaying it after superimposing it on the real scene; and
matching real objects in the real scene with virtual objects of the three-dimensional virtual scene.
2. The AR implementation method based on observer viewing-angle positioning according to claim 1, characterised in that in the step of obtaining the position and attitude of the observer located in the real scene and assigning them to the virtual camera, the position of the observer located in the real scene is obtained using at least one of beam positioning, infrared positioning and ultrasonic positioning.
3. The AR implementation method based on observer viewing-angle positioning according to claim 2, characterised in that in the step of obtaining the position and attitude of the observer located in the real scene and assigning them to the virtual camera, the attitude of the observer located in the real scene is obtained using gyroscope attitude positioning.
4. The AR implementation method based on observer viewing-angle positioning according to claim 3, characterised in that the step of sending the position and the attitude to the rendering center for rendering and obtaining the three-dimensional virtual scene includes the following steps:
sending the position and the attitude to the rendering center in a server; and
receiving a triggering instruction from a control end in the server, whereupon the rendering center renders the three-dimensional virtual scene.
5. The AR implementation method based on observer viewing-angle positioning according to claim 4, characterised in that the step of matching the real objects in the real scene with the virtual objects of the three-dimensional virtual scene specifically includes the following steps:
measuring in advance, under the viewing angle of the observer, the position and attitude of a real object in the real scene; and
adjusting the position and attitude of the virtual object of the three-dimensional virtual scene according to the measured position and attitude of the real object.
6. The AR implementation method based on observer-viewpoint positioning according to claim 4, characterised in that the step of matching the real objects in the real scene with the virtual objects of the three-dimensional virtual scene consists in adjusting, in real time, the position and attitude parameters of the virtual objects of the three-dimensional virtual scene rendered by the render center, until the real objects in the real scene match the virtual objects of the three-dimensional virtual scene.
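Claims 5 and 6 offer two matching strategies: snapping the virtual object to a pre-measured real-object pose, or adjusting the rendered parameters in real time until the poses agree. The real-time variant of claim 6 can be sketched as a simple convergence loop; the damping rate, tolerance and flat pose tuples are assumptions for illustration:

```python
def step_toward(value, target, rate=0.5):
    # Move one pose component a fraction of the way toward its target.
    return value + rate * (target - value)

def match_in_real_time(virtual_pose, real_pose, tolerance=1e-3, rate=0.5):
    """Adjust the virtual object's position/attitude parameters until they
    match the real object's measured pose, per the claim-6 strategy."""
    iterations = 0
    while any(abs(v - r) > tolerance
              for v, r in zip(virtual_pose, real_pose)):
        virtual_pose = tuple(step_toward(v, r, rate)
                             for v, r in zip(virtual_pose, real_pose))
        iterations += 1
    return virtual_pose, iterations

# Virtual object starts offset from the measured real-object pose.
matched, steps = match_in_real_time((0.0, 0.0, 0.0), (1.0, 2.0, 0.5))
```

Each step halves the remaining offset, so the loop converges geometrically; in a live system the "target" pose would itself be refreshed every frame from new measurements.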
7. The AR implementation method based on observer-viewpoint positioning according to claim 5 or 6, characterised in that, after the step of matching the real objects in the real scene with the virtual objects of the three-dimensional virtual scene, the method further comprises performing interactive operations in the three-dimensional virtual scene using an interaction device.
8. An AR implementation system based on observer-viewpoint positioning, characterised by comprising an acquiring unit, a rendering unit, a superposition unit and a matching unit;
the acquiring unit is configured to obtain the position and attitude of the observer in the real scene and assign them to a virtual camera;
the rendering unit is configured to send the position and the attitude to a render center for rendering, obtaining the three-dimensional virtual scene;
the superposition unit is configured to send the three-dimensional virtual scene to a display device worn by the observer, where it is superimposed on the real scene and displayed;
the matching unit is configured to match the real objects in the real scene with the virtual objects of the three-dimensional virtual scene.
9. The AR implementation system based on observer-viewpoint positioning according to claim 8, characterised in that the rendering unit comprises a sending module and a reception-rendering module;
the sending module is configured to send the position and the attitude to the render center on a server;
the reception-rendering module is configured such that, upon the server receiving a triggering command from a control end, the render center renders the three-dimensional virtual scene.
10. The AR implementation system based on observer-viewpoint positioning according to claim 9, characterised in that the matching unit comprises a measurement module and an adjusting module;
the measurement module is configured to measure in advance, from the observer's viewpoint, the position and attitude of a real object in the real scene;
the adjusting module is configured to adjust the position and attitude of the corresponding virtual object in the three-dimensional virtual scene according to the measured position and attitude of the real object.
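The four units of claims 8–10 form a straightforward per-frame pipeline: acquire the pose, render from it, superimpose on the real view. A structural sketch under the assumption of single-frame synchronous operation, with all class and function names hypothetical:

```python
class AcquiringUnit:
    """Obtains the observer's position and attitude for the virtual camera."""
    def acquire(self, tracker, gyro):
        # position from the external tracker, attitude from the gyroscope
        return {"position": tracker(), "attitude": gyro()}

class RenderingUnit:
    """Sends the pose to the render center and returns the rendered scene."""
    def render(self, pose):
        return {"scene": "three-dimensional virtual scene", "viewpoint": pose}

class SuperpositionUnit:
    """Overlays the rendered virtual scene on the real view for the display."""
    def overlay(self, real_view, virtual_frame):
        return {"display": (real_view, virtual_frame)}

class MatchingUnit:
    """Aligns a virtual object with the measured pose of its real counterpart."""
    def match(self, virtual_object, measured_real_pose):
        virtual_object.update(measured_real_pose)
        return virtual_object

def ar_frame(tracker, gyro, real_view):
    # Acquire pose -> render from that viewpoint -> superimpose on real view.
    pose = AcquiringUnit().acquire(tracker, gyro)
    frame = RenderingUnit().render(pose)
    return SuperpositionUnit().overlay(real_view, frame)

out = ar_frame(lambda: (0.0, 1.6, 0.0), lambda: (0.0, 0.0, 0.0), "camera view")
```

A real system would run this loop per display refresh and hand the composite to the head-mounted display; the sketch only fixes the data flow between the four claimed units.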
CN201611244524.0A 2016-12-29 2016-12-29 AR implementation method and system based on positioning of visual angle of observer Pending CN106710002A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611244524.0A CN106710002A (en) 2016-12-29 2016-12-29 AR implementation method and system based on positioning of visual angle of observer


Publications (1)

Publication Number Publication Date
CN106710002A true CN106710002A (en) 2017-05-24

Family

ID=58903883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611244524.0A Pending CN106710002A (en) 2016-12-29 2016-12-29 AR implementation method and system based on positioning of visual angle of observer

Country Status (1)

Country Link
CN (1) CN106710002A (en)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107198876A (en) * 2017-06-07 2017-09-26 北京小鸟看看科技有限公司 The loading method and device of scene of game
CN107888600A (en) * 2017-11-21 2018-04-06 北京恒华伟业科技股份有限公司 A kind of localization method
CN107944129A (en) * 2017-11-21 2018-04-20 北京恒华伟业科技股份有限公司 A kind of method and device for creating cable model
CN108022306A (en) * 2017-12-30 2018-05-11 华自科技股份有限公司 Scene recognition method, device, storage medium and equipment based on augmented reality
CN108090966A (en) * 2017-12-13 2018-05-29 广州市和声信息技术有限公司 A kind of dummy object reconstructing method and system suitable for virtual scene
CN108255291A (en) * 2017-12-05 2018-07-06 腾讯科技(深圳)有限公司 Transmission method, device, storage medium and the electronic device of virtual scene data
CN108762501A (en) * 2018-05-23 2018-11-06 歌尔科技有限公司 AR display methods, intelligent terminal, AR equipment and system
CN108830944A (en) * 2018-07-12 2018-11-16 北京理工大学 Optical perspective formula three-dimensional near-eye display system and display methods
CN109218709A (en) * 2018-10-18 2019-01-15 北京小米移动软件有限公司 The method of adjustment and device and computer readable storage medium of holographic content
CN109214265A (en) * 2017-07-06 2019-01-15 佳能株式会社 Image processing apparatus, its image processing method and storage medium
CN109379551A (en) * 2018-11-26 2019-02-22 京东方科技集团股份有限公司 A kind of enhancing content display method, processing method, display device and processing unit
CN109615703A (en) * 2018-09-28 2019-04-12 阿里巴巴集团控股有限公司 Image presentation method, device and the equipment of augmented reality
CN109829964A (en) * 2019-02-11 2019-05-31 北京邮电大学 The rendering method and device of Web augmented reality
CN109978945A (en) * 2019-02-26 2019-07-05 浙江舜宇光学有限公司 A kind of information processing method and device of augmented reality
CN110850977A (en) * 2019-11-06 2020-02-28 成都威爱新经济技术研究院有限公司 Stereoscopic image interaction method based on 6DOF head-mounted display
WO2020078354A1 (en) * 2018-10-16 2020-04-23 北京凌宇智控科技有限公司 Video streaming system, video streaming method and apparatus
CN111127621A (en) * 2019-12-31 2020-05-08 歌尔科技有限公司 Picture rendering method and device and readable storage medium
CN111162840A (en) * 2020-04-02 2020-05-15 北京外号信息技术有限公司 Method and system for setting virtual objects around optical communication device
CN111937051A (en) * 2018-06-15 2020-11-13 谷歌有限责任公司 Smart home device placement and installation using augmented reality visualization
WO2021083031A1 (en) * 2019-10-31 2021-05-06 中兴通讯股份有限公司 Time delay error correction method, terminal device, server, and storage medium
CN113129358A (en) * 2019-12-30 2021-07-16 北京外号信息技术有限公司 Method and system for presenting virtual objects
CN113342167A (en) * 2021-06-07 2021-09-03 深圳市金研微科技有限公司 Space interaction AR realization method and system based on multi-person visual angle positioning
CN113632498A (en) * 2019-03-28 2021-11-09 多玩国株式会社 Content distribution system, content distribution method, and content distribution program
CN113728301A (en) * 2019-06-01 2021-11-30 苹果公司 Device, method and graphical user interface for manipulating 3D objects on a 2D screen
WO2022040983A1 (en) * 2020-08-26 2022-03-03 南京翱翔智能制造科技有限公司 Real-time registration method based on projection marking of cad model and machine vision
WO2022088918A1 (en) * 2020-10-30 2022-05-05 北京字节跳动网络技术有限公司 Virtual image display method and apparatus, electronic device and storage medium
WO2023040551A1 (en) * 2021-09-18 2023-03-23 华为技术有限公司 Method for displaying image on display screen, electronic device, and apparatus
CN116400878A (en) * 2023-06-07 2023-07-07 优奈柯恩(北京)科技有限公司 Display method and device of head-mounted display device, electronic device and storage medium
US12033288B2 (en) 2023-02-28 2024-07-09 Google Llc Smart-home device placement and installation using augmented-reality visualizations

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140168735A1 (en) * 2012-12-19 2014-06-19 Sheng Yuan Multiplexed hologram tiling in a waveguide display
CN104321681A (en) * 2012-06-29 2015-01-28 英特尔公司 Enhanced information delivery using a transparent display
CN104603673A (en) * 2012-09-03 2015-05-06 Smi创新传感技术有限公司 Head mounted system and method to compute and render stream of digital images using head mounted system
CN105468142A (en) * 2015-11-16 2016-04-06 上海璟世数字科技有限公司 Interaction method and system based on augmented reality technique, and terminal
CN106204431A (en) * 2016-08-24 2016-12-07 中国科学院深圳先进技术研究院 The display packing of intelligent glasses and device


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
杨铁军: "Industry Patent Analysis Report (Volume 20): Satellite Navigation Terminals" (《产业专利分析报告(第20册)卫星导航终端》), 31 May 2014, Intellectual Property Publishing House *
王海宇: "Aircraft Assembly Technology" (《飞机装配工艺学》), 31 August 2013, Northwestern Polytechnical University Press *

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107198876A (en) * 2017-06-07 2017-09-26 北京小鸟看看科技有限公司 The loading method and device of scene of game
CN109214265A (en) * 2017-07-06 2019-01-15 佳能株式会社 Image processing apparatus, its image processing method and storage medium
CN109214265B (en) * 2017-07-06 2022-12-13 佳能株式会社 Image processing apparatus, image processing method thereof, and storage medium
CN107888600A (en) * 2017-11-21 2018-04-06 北京恒华伟业科技股份有限公司 A kind of localization method
CN107944129A (en) * 2017-11-21 2018-04-20 北京恒华伟业科技股份有限公司 A kind of method and device for creating cable model
CN108255291A (en) * 2017-12-05 2018-07-06 腾讯科技(深圳)有限公司 Transmission method, device, storage medium and the electronic device of virtual scene data
CN108090966B (en) * 2017-12-13 2021-06-01 广州市和声信息技术有限公司 Virtual object reconstruction method and system suitable for virtual scene
CN108090966A (en) * 2017-12-13 2018-05-29 广州市和声信息技术有限公司 A kind of dummy object reconstructing method and system suitable for virtual scene
CN108022306B (en) * 2017-12-30 2021-09-21 华自科技股份有限公司 Scene recognition method and device based on augmented reality, storage medium and equipment
CN108022306A (en) * 2017-12-30 2018-05-11 华自科技股份有限公司 Scene recognition method, device, storage medium and equipment based on augmented reality
CN108762501A (en) * 2018-05-23 2018-11-06 歌尔科技有限公司 AR display methods, intelligent terminal, AR equipment and system
US11593999B2 (en) 2018-06-15 2023-02-28 Google Llc Smart-home device placement and installation using augmented-reality visualizations
CN111937051A (en) * 2018-06-15 2020-11-13 谷歌有限责任公司 Smart home device placement and installation using augmented reality visualization
CN108830944A (en) * 2018-07-12 2018-11-16 北京理工大学 Optical perspective formula three-dimensional near-eye display system and display methods
CN109615703A (en) * 2018-09-28 2019-04-12 阿里巴巴集团控股有限公司 Image presentation method, device and the equipment of augmented reality
CN109615703B (en) * 2018-09-28 2020-04-14 阿里巴巴集团控股有限公司 Augmented reality image display method, device and equipment
TWI712918B (en) * 2018-09-28 2020-12-11 開曼群島商創新先進技術有限公司 Method, device and equipment for displaying images of augmented reality
US11500455B2 (en) 2018-10-16 2022-11-15 Nolo Co., Ltd. Video streaming system, video streaming method and apparatus
WO2020078354A1 (en) * 2018-10-16 2020-04-23 北京凌宇智控科技有限公司 Video streaming system, video streaming method and apparatus
CN109218709A (en) * 2018-10-18 2019-01-15 北京小米移动软件有限公司 The method of adjustment and device and computer readable storage medium of holographic content
US11409241B2 (en) 2018-10-18 2022-08-09 Beijing Xiaomi Mobile Software Co., Ltd. Method and apparatus for adjusting holographic content and computer readable storage medium
CN109379551A (en) * 2018-11-26 2019-02-22 京东方科技集团股份有限公司 A kind of enhancing content display method, processing method, display device and processing unit
CN109829964A (en) * 2019-02-11 2019-05-31 北京邮电大学 The rendering method and device of Web augmented reality
CN109978945B (en) * 2019-02-26 2021-08-31 浙江舜宇光学有限公司 Augmented reality information processing method and device
CN109978945A (en) * 2019-02-26 2019-07-05 浙江舜宇光学有限公司 A kind of information processing method and device of augmented reality
US11961190B2 (en) 2019-03-28 2024-04-16 Dwango Co., Ltd. Content distribution system, content distribution method, and content distribution program
CN113632498A (en) * 2019-03-28 2021-11-09 多玩国株式会社 Content distribution system, content distribution method, and content distribution program
CN113728301A (en) * 2019-06-01 2021-11-30 苹果公司 Device, method and graphical user interface for manipulating 3D objects on a 2D screen
WO2021083031A1 (en) * 2019-10-31 2021-05-06 中兴通讯股份有限公司 Time delay error correction method, terminal device, server, and storage medium
CN110850977A (en) * 2019-11-06 2020-02-28 成都威爱新经济技术研究院有限公司 Stereoscopic image interaction method based on 6DOF head-mounted display
CN110850977B (en) * 2019-11-06 2023-10-31 成都威爱新经济技术研究院有限公司 Stereoscopic image interaction method based on 6DOF head-mounted display
CN113129358A (en) * 2019-12-30 2021-07-16 北京外号信息技术有限公司 Method and system for presenting virtual objects
CN111127621A (en) * 2019-12-31 2020-05-08 歌尔科技有限公司 Picture rendering method and device and readable storage medium
CN111127621B (en) * 2019-12-31 2024-02-09 歌尔科技有限公司 Picture rendering method, device and readable storage medium
CN111162840A (en) * 2020-04-02 2020-05-15 北京外号信息技术有限公司 Method and system for setting virtual objects around optical communication device
WO2022040983A1 (en) * 2020-08-26 2022-03-03 南京翱翔智能制造科技有限公司 Real-time registration method based on projection marking of cad model and machine vision
WO2022088918A1 (en) * 2020-10-30 2022-05-05 北京字节跳动网络技术有限公司 Virtual image display method and apparatus, electronic device and storage medium
CN113342167A (en) * 2021-06-07 2021-09-03 深圳市金研微科技有限公司 Space interaction AR realization method and system based on multi-person visual angle positioning
WO2023040551A1 (en) * 2021-09-18 2023-03-23 华为技术有限公司 Method for displaying image on display screen, electronic device, and apparatus
US12033288B2 (en) 2023-02-28 2024-07-09 Google Llc Smart-home device placement and installation using augmented-reality visualizations
CN116400878A (en) * 2023-06-07 2023-07-07 优奈柯恩(北京)科技有限公司 Display method and device of head-mounted display device, electronic device and storage medium
CN116400878B (en) * 2023-06-07 2023-09-08 优奈柯恩(北京)科技有限公司 Display method and device of head-mounted display device, electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN106710002A (en) AR implementation method and system based on positioning of visual angle of observer
EP3051525B1 (en) Display
EP2613296B1 (en) Mixed reality display system, image providing server, display apparatus, and display program
CN106484116B (en) The treating method and apparatus of media file
CN109358754B (en) Mixed reality head-mounted display system
WO2013175923A1 (en) Simulation device
CN108431738A (en) Cursor based on fluctuation ties
CN104380347A (en) Video processing device, video processing method, and video processing system
CN106444023A (en) Super-large field angle binocular stereoscopic display transmission type augmented reality system
EP3128413A1 (en) Sharing mediated reality content
US11699259B2 (en) Stylized image painting
WO2018175335A1 (en) Method and system for discovering and positioning content into augmented reality space
CN113228688B (en) System and method for creating wallpaper images on a computing device
US20220239886A1 (en) Depth sculpturing of three-dimensional depth images utilizing two-dimensional input selection
Kimura et al. Eyeglass-based hands-free videophone
US11589024B2 (en) Multi-dimensional rendering
EP3323241A1 (en) Immersive teleconferencing system with translucent video stream
CN109445596B (en) Integrated mixed reality head-mounted display system
CN205610838U (en) Virtual stereoscopic display's device
US20230412779A1 (en) Artistic effects for images and videos
EP2827589A1 (en) Display device and device for adapting an information
KR20190126990A (en) Viewer motion responsive vr system
EP3503101A1 (en) Object based user interface
CN108093222A (en) A kind of hand-held VR imaging methods
CN111213111A (en) Wearable device and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170524