CN108090968A - Method, device and computer-readable storage medium for implementing augmented reality AR - Google Patents

Method, device and computer-readable storage medium for implementing augmented reality AR Download PDF

Info

Publication number
CN108090968A
CN108090968A (application CN201711483513.2A; granted as CN108090968B)
Authority
CN
China
Prior art keywords
elements
deployment
models
plane
video stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711483513.2A
Other languages
Chinese (zh)
Other versions
CN108090968B (en)
Inventor
杨颖慧
李海燕
戴星阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guang Heng Hengyu (beijing) Technology Co Ltd
Original Assignee
Guang Heng Hengyu (beijing) Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guang Heng Hengyu (beijing) Technology Co Ltd
Priority to CN201711483513.2A
Publication of CN108090968A
Application granted
Publication of CN108090968B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method, a device and a computer-readable storage medium for implementing augmented reality AR. The method includes: obtaining a video stream captured by a camera; identifying a plane in the video stream; using the plane as a reference plane for an AR model, deploying at least some of the AR elements of the AR model into the video stream, generating an AR video stream and showing it on a display interface. This solution uses a plane in the real world as the tie that binds it to the virtual world, achieving an ingenious fusion of the two; the generated AR video stream looks more natural and is more engaging.

Description

Method, device and computer-readable storage medium for implementing augmented reality AR
Technical field
The present invention relates to the field of augmented reality, and in particular to a method, a device and a computer-readable storage medium for implementing augmented reality AR.
Background art
AR (Augmented Reality) is a technology that computes the position and angle of a camera image in real time and overlays corresponding images, video or 3D models on it; its goal is to fit the virtual world over the real world on the screen and let the two interact. A typical AR system comes with preset AR models, but how to combine an AR model with the real scene to achieve a better effect remains a problem to be solved.
Summary of the invention
In view of the above problems, the present invention is proposed in order to provide a method, a device and a computer-readable storage medium for implementing augmented reality AR that overcome the above problems or at least partly solve them.
According to one aspect of the invention, a method for implementing augmented reality AR is provided, including:
obtaining a video stream captured by a camera;
identifying a plane in the video stream;
using the plane as a reference plane for an AR model, deploying at least some of the AR elements of the AR model into the video stream, generating an AR video stream and showing it on a display interface.
Optionally, the method further includes:
identifying a target object in the video stream;
deploying at least some of the AR elements of the AR model into the video stream includes: judging whether the target object matches the deployment condition of any AR element of the AR model;
completing the deployment of the corresponding AR element according to the matched deployment condition.
Optionally, the method further includes: identifying the posture of the target object;
deploying at least some of the AR elements of the AR model into the video stream further includes: judging whether the posture of the target object matches the deployment condition of any AR element of the AR model, and/or judging whether a change in the target object's posture matches the deployment condition of any AR element;
completing the deployment of the corresponding AR element according to the matched deployment condition.
Optionally, the deployment condition of an AR element includes one or more of the following:
deploying an AR element that has not yet been deployed;
deleting an AR element that has already been deployed;
changing the display state of an AR element that has already been deployed.
Optionally, completing the deployment of the corresponding AR element according to the matched deployment condition includes:
determining the deployment parameters in the matched deployment condition according to one or more of the speed, amplitude and type of the change in the target object's posture, and/or determining the deployment parameters in the matched deployment condition according to the posture of the target object.
Optionally, the method further includes:
first showing the video stream on the display interface;
using the plane as the reference plane of the AR model includes: when multiple planes are identified, in response to a selection instruction on the display interface, determining the plane closest to the selection instruction and using that closest plane as the reference plane of the AR model.
According to another aspect of the invention, a device for implementing augmented reality AR is provided, including:
an acquiring unit, adapted to obtain a video stream captured by a camera;
a recognition unit, adapted to identify a plane in the video stream;
an AR unit, adapted to use the plane as a reference plane for an AR model, deploy at least some of the AR elements of the AR model into the video stream, generate an AR video stream and show it on a display interface.
Optionally, the recognition unit is further adapted to identify a target object in the video stream;
the AR unit is adapted to judge whether the target object matches the deployment condition of any AR element of the AR model and to complete the deployment of the corresponding AR element according to the matched deployment condition.
Optionally, the recognition unit is further adapted to identify the posture of the target object;
the AR unit is adapted to judge whether the posture of the target object matches the deployment condition of any AR element of the AR model, and/or to judge whether a change in the target object's posture matches the deployment condition of any AR element, and to complete the deployment of the corresponding AR element according to the matched deployment condition.
Optionally, the deployment condition of an AR element includes one or more of the following:
deploying an AR element that has not yet been deployed;
deleting an AR element that has already been deployed;
changing the display state of an AR element that has already been deployed.
Optionally, the AR unit is adapted to determine the deployment parameters in the matched deployment condition according to one or more of the speed, amplitude and type of the change in the target object's posture, and/or to determine the deployment parameters in the matched deployment condition according to the posture of the target object.
Optionally, the AR unit is further adapted to first show the video stream on the display interface and, when multiple planes are identified, in response to a selection instruction on the display interface, determine the plane closest to the selection instruction and use that closest plane as the reference plane of the AR model.
According to another aspect of the invention, a computer-readable storage medium is provided. The computer-readable storage medium stores one or more programs which, when executed by a processor, implement any of the methods described above.
It can be seen from the above that, after obtaining the video stream captured by a camera, the technical solution of the present invention identifies a plane in it to serve as the reference plane of an AR model and deploys at least some AR elements of the AR model into the video stream, thereby generating an AR video stream that can be shown on a display interface. This solution uses a plane in the real world as the tie that binds it to the virtual world, achieving an ingenious fusion of the two; the generated AR video stream looks more natural and is more engaging.
The above description is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and implemented according to the content of the specification, and in order that the above and other objects, features and advantages of the present invention may be more comprehensible, specific embodiments of the present invention are set forth below.
Description of the drawings
By reading the following detailed description of the preferred embodiments, various other advantages and benefits will become apparent to those of ordinary skill in the art. The drawings serve only to show the preferred embodiments and are not to be considered limiting of the present invention. Throughout the drawings, the same reference numerals denote the same parts. In the drawings:
Fig. 1 shows a schematic flow chart of a method for implementing augmented reality AR according to an embodiment of the present invention;
Fig. 2 shows a schematic structural diagram of a device for implementing augmented reality AR according to an embodiment of the present invention;
Fig. 3 shows a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Specific embodiment
Exemplary embodiments of the present disclosure are described in more detail below with reference to the accompanying drawings. Although the drawings show exemplary embodiments of the present disclosure, it should be understood that the present disclosure may be implemented in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be understood thoroughly and its scope fully conveyed to those skilled in the art.
Fig. 1 shows a schematic flow chart of a method for implementing augmented reality AR according to an embodiment of the present invention. As shown in Fig. 1, the method includes:
Step S110: obtain the video stream captured by a camera.
Taking as an example a user running an AR application on a mobile phone, the phone's camera captures the video stream as data of the real world.
Step S120: identify a plane in the video stream.
For example, a plane in the horizontal direction such as the ground or a desktop, a plane in the vertical direction such as a wall or a mirror, or an inclined plane such as a slide or a slope.
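The patent does not prescribe how the plane is identified. One common approach — an assumption here, not the patent's stated method — is to reconstruct 3D feature points from the video stream and fit the dominant plane to them with RANSAC, along these lines:

```python
import random

def fit_plane(p1, p2, p3):
    """Plane through three points, as (unit normal n, offset d) with n.x + d = 0."""
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)  # cross product
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:  # collinear sample, no unique plane
        return None
    n = tuple(c / norm for c in n)
    d = -sum(n[i] * p1[i] for i in range(3))
    return n, d

def ransac_plane(points, iters=200, tol=0.02, seed=0):
    """Return the plane supported by the most points, plus its inliers."""
    rng = random.Random(seed)
    best, best_inliers = None, []
    for _ in range(iters):
        plane = fit_plane(*rng.sample(points, 3))
        if plane is None:
            continue
        n, d = plane
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) < tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = plane, inliers
    return best, best_inliers
```

In a production system this role is typically played by the plane detection built into an AR framework rather than hand-rolled geometry.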
Step S130: using the plane as a reference plane for the AR model, deploy at least some AR elements of the AR model into the video stream, generate an AR video stream and show it on the display interface.
An AR model is three-dimensional: its x-, y- and z-axes pairwise form three reference planes, and the identified plane can be made to coincide with any one of them, so as to realize the merging of the virtual and real worlds.
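Making the identified plane coincide with one of the model's reference planes amounts to rotating the model so that the reference plane's normal lands on the detected plane's normal. The patent gives no formulas; the Rodrigues-style sketch below is one possible realization, assuming both normals are unit vectors:

```python
def rotation_between(a, b):
    """3x3 rotation matrix R taking unit vector a onto unit vector b (Rodrigues)."""
    ax, ay, az = a
    bx, by, bz = b
    vx, vy, vz = (ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx)  # a x b
    c = ax * bx + ay * by + az * bz  # cosine of the angle between a and b
    if c < -0.999999:
        raise ValueError("antiparallel normals: a rotation axis must be chosen explicitly")
    k = 1.0 / (1.0 + c)
    return [
        [1 - k * (vy * vy + vz * vz), -vz + k * vx * vy, vy + k * vx * vz],
        [vz + k * vx * vy, 1 - k * (vx * vx + vz * vz), -vx + k * vy * vz],
        [-vy + k * vx * vz, vx + k * vy * vz, 1 - k * (vx * vx + vy * vy)],
    ]

def apply(matrix, p):
    """Apply a 3x3 matrix to a 3-vector."""
    return tuple(sum(matrix[i][j] * p[j] for j in range(3)) for i in range(3))
```

For example, `rotation_between((0, 0, 1), wall_normal)` would reorient a model whose reference plane is the xy-plane so that it stands on a detected wall; a full placement would additionally translate the model onto a point of the plane.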
It can be seen that the method shown in Fig. 1, after obtaining the video stream captured by the camera, identifies a plane in it to serve as the reference plane of the AR model and deploys at least some AR elements of the AR model into the video stream, thereby generating an AR video stream that can be shown on the display interface. This solution uses a plane in the real world as the tie that binds it to the virtual world, achieving an ingenious fusion of the two; the generated AR video stream looks more natural and is more engaging.
In one embodiment of the present invention, the method further includes: identifying a target object in the video stream. Deploying at least some AR elements of the AR model into the video stream then includes: judging whether the target object matches the deployment condition of any AR element of the AR model, and completing the deployment of the corresponding AR element according to the matched deployment condition.
Consider the following scene: the AR model is a group of fountains containing sixteen water jets. After an open door is identified in the video stream, the fountains are deployed outside the door; but, limited by the size of the door, only eight of the water jets are deployed.
In one embodiment of the present invention, the method further includes: identifying the posture of the target object. Deploying at least some AR elements of the AR model into the video stream then further includes: judging whether the posture of the target object matches the deployment condition of any AR element of the AR model, and/or judging whether a change in the target object's posture matches the deployment condition of any AR element, and completing the deployment of the corresponding AR element according to the matched deployment condition.
For example, the target object is a person striking the "aircraft-carrier pose"; each AR element of the AR model (an aircraft carrier), such as the fighter jets and the deck, is then deployed into the video stream. This is a case where the AR elements to deploy are determined from a static posture. As another example, the user holds a stick-like object in both hands and waves it from side to side; the two AR elements of the AR model (a flag), namely the flagpole and the flag face, are then deployed into the video stream. Specifically, the flagpole is overlaid on the stick-like object, and the flag face swings along with the waving of the user's hands.
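A minimal sketch of posture-conditioned deployment follows. The posture labels, condition kinds and element names are all invented for illustration; the patent defines no data format:

```python
# Hypothetical rule table: a recognized posture or posture change maps to the
# AR elements whose deployment condition it satisfies.
DEPLOYMENT_CONDITIONS = {
    ("static_posture", "aircraft_carrier_pose"): ["deck", "fighter_jet"],
    ("posture_change", "wave_stick"): ["flagpole", "flag_face"],
}

def elements_to_deploy(kind, label, already_deployed):
    """Return the matched AR elements that are not yet in the scene."""
    matched = DEPLOYMENT_CONDITIONS.get((kind, label), [])
    return [e for e in matched if e not in already_deployed]
```

The point of the table is that static postures and posture changes are matched through the same mechanism, which is how the method clause's "and/or" reads.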
In one embodiment of the present invention, the deployment condition of an AR element includes one or more of the following: deploying an AR element that has not yet been deployed; deleting an AR element that has already been deployed; changing the display state of an AR element that has already been deployed.
For example, the user holds an ordinary hat, which is replaced in the AR video stream by a magician's hat. When the user turns the hat over for the first time, it is completely empty. When the user puts the hat on, takes it off again (a posture change has occurred) and turns it over once more, a rabbit emerges from it; to achieve this effect, the AR element "rabbit" has to be deployed at that moment. The user then presses the rabbit back into the hat, and when the hat is turned over again (another posture change has occurred) the rabbit has vanished; to achieve this effect, the AR element "rabbit" has to be deleted. As another example, an enormous pillar is deployed on the ground as an AR element; when the user makes the motion of pushing hard against the pillar, the pillar slowly tilts. This is an example of changing the display state of an AR element that has already been deployed.
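The three condition types — deploy, delete, change display state — can be pictured as actions on a scene, as in the magic-hat example. The action names and scene structure below are invented for illustration:

```python
def apply_deployment_action(scene, action, element, state=None):
    """Mutate the scene (element name -> display-state dict) per one condition type."""
    if action == "deploy" and element not in scene:
        scene[element] = dict(state or {"visible": True})
    elif action == "delete":
        scene.pop(element, None)
    elif action == "change" and element in scene:
        scene[element].update(state or {})
    return scene

scene = {}
apply_deployment_action(scene, "deploy", "rabbit")                      # hat flipped: rabbit appears
apply_deployment_action(scene, "delete", "rabbit")                      # flipped again: rabbit gone
apply_deployment_action(scene, "deploy", "pillar", {"tilt_deg": 0.0})
apply_deployment_action(scene, "change", "pillar", {"tilt_deg": 15.0})  # user pushes the pillar
```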
In one embodiment of the present invention, completing the deployment of the corresponding AR element according to the matched deployment condition includes: determining the deployment parameters in the matched deployment condition according to one or more of the speed, amplitude and type of the change in the target object's posture, and/or determining the deployment parameters in the matched deployment condition according to the posture of the target object.
For example, the AR element "playing cards" is deployed on the desktop in front of the user. When the user slaps the desktop, the cards fly up and hover in the air; when realizing this effect, the speed and height at which the cards rise can be determined from the speed and amplitude of the user's slap. Another effect based on the same AR element is the following: the user makes the motion of picking up the cards and performing a fancy shuffle, and a spectacular shuffling effect is shown. This effect differs from the previous one, yet both are realized from the same AR element; in other words, when the type of the target object's (the person's) posture change differs, the deployment parameters in the AR element's deployment condition change accordingly, and what is shown is a different AR effect.
Determining the deployment parameters in the matched deployment condition according to the posture of the target object is easier to understand: taking the earlier "aircraft-carrier pose" as an example, the positions of the fighter jets, their take-off angles and so on need to be determined according to the user's posture.
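The playing-cards example can be sketched as a mapping from a measured gesture to deployment parameters. The gesture names and scaling factors are invented; the patent specifies only that speed, amplitude and type feed into the parameters:

```python
def card_deployment_params(gesture_type, speed, amplitude):
    """Derive deployment parameters for the 'playing cards' AR element from a gesture."""
    if gesture_type == "slap":
        # a faster, bigger slap makes the cards rise faster and higher
        return {"effect": "fly_up",
                "rise_speed": 0.5 * speed,
                "rise_height": 0.2 * amplitude}
    if gesture_type == "fancy_shuffle":
        return {"effect": "shuffle", "intensity": speed * amplitude}
    return {"effect": "none"}
```

The same element thus yields different AR effects purely through the parameters selected by the gesture type, which is the distinction the paragraph above draws.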
In one embodiment of the present invention, the method further includes: first showing the video stream on the display interface. Using the plane as the reference plane of the AR model then includes: when multiple planes are identified, in response to a selection instruction on the display interface, determining the plane closest to the selection instruction and using that closest plane as the reference plane of the AR model.
Sometimes multiple planes, such as a desktop and the ground, are recognized in the video stream, and if the desired effect is a flag fixed on a plane, the user needs to choose whether it is placed on the desktop or on the ground. In this case, the video stream can first be shown on the display interface; the user can tap the displayed desktop or ground, and the background can automatically determine, from the user's tap, the corresponding plane on which the flag is placed.
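Resolving the tap to a plane can be as simple as a nearest-center lookup in screen space. This is a sketch under the assumption that each detected plane's on-screen center is available; the patent only states that the plane closest to the selection is chosen:

```python
def closest_plane(tap_xy, plane_centers):
    """plane_centers: name -> (x, y) on-screen center. Returns the nearest plane's name."""
    def sq_dist(name):
        cx, cy = plane_centers[name]
        return (cx - tap_xy[0]) ** 2 + (cy - tap_xy[1]) ** 2
    return min(plane_centers, key=sq_dist)
```

A real implementation would more likely cast a ray from the tap through the camera and intersect it with each candidate plane, but the selection rule is the same: take the closest hit.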
Fig. 2 shows a schematic structural diagram of a device for implementing augmented reality AR according to an embodiment of the present invention. As shown in Fig. 2, the device 200 for implementing augmented reality AR includes:
an acquiring unit 210, adapted to obtain the video stream captured by a camera.
Taking as an example a user running an AR application on a mobile phone, the phone's camera captures the video stream as data of the real world.
a recognition unit 220, adapted to identify a plane in the video stream.
For example, a plane in the horizontal direction such as the ground or a desktop, a plane in the vertical direction such as a wall or a mirror, or an inclined plane such as a slide or a slope.
an AR unit 230, adapted to use the plane as a reference plane for the AR model, deploy at least some AR elements of the AR model into the video stream, generate an AR video stream and show it on the display interface.
An AR model is three-dimensional: its x-, y- and z-axes pairwise form three reference planes, and the identified plane can be made to coincide with any one of them, so as to realize the merging of the virtual and real worlds.
It can be seen that the device shown in Fig. 2, through the cooperation of its units, after obtaining the video stream captured by the camera, identifies a plane in it to serve as the reference plane of the AR model and deploys at least some AR elements of the AR model into the video stream, thereby generating an AR video stream that can be shown on the display interface. This solution uses a plane in the real world as the tie that binds it to the virtual world, achieving an ingenious fusion of the two; the generated AR video stream looks more natural and is more engaging.
In one embodiment of the present invention, in the above device, the recognition unit 220 is further adapted to identify a target object in the video stream; the AR unit 230 is adapted to judge whether the target object matches the deployment condition of any AR element of the AR model and to complete the deployment of the corresponding AR element according to the matched deployment condition.
Consider the following scene: the AR model is a group of fountains containing sixteen water jets. After an open door is identified in the video stream, the fountains are deployed outside the door; but, limited by the size of the door, only eight of the water jets are deployed.
In one embodiment of the present invention, in the above device, the recognition unit 220 is further adapted to identify the posture of the target object; the AR unit 230 is adapted to judge whether the posture of the target object matches the deployment condition of any AR element of the AR model, and/or to judge whether a change in the target object's posture matches the deployment condition of any AR element, and to complete the deployment of the corresponding AR element according to the matched deployment condition.
For example, the target object is a person striking the "aircraft-carrier pose"; each AR element of the AR model (an aircraft carrier), such as the fighter jets and the deck, is then deployed into the video stream. This is a case where the AR elements to deploy are determined from a static posture. As another example, the user holds a stick-like object in both hands and waves it from side to side; the two AR elements of the AR model (a flag), namely the flagpole and the flag face, are then deployed into the video stream. Specifically, the flagpole is overlaid on the stick-like object, and the flag face swings along with the waving of the user's hands.
In one embodiment of the present invention, in the above device, the deployment condition of an AR element includes one or more of the following: deploying an AR element that has not yet been deployed; deleting an AR element that has already been deployed; changing the display state of an AR element that has already been deployed.
For example, the user holds an ordinary hat, which is replaced in the AR video stream by a magician's hat. When the user turns the hat over for the first time, it is completely empty. When the user puts the hat on, takes it off again (a posture change has occurred) and turns it over once more, a rabbit emerges from it; to achieve this effect, the AR element "rabbit" has to be deployed at that moment. The user then presses the rabbit back into the hat, and when the hat is turned over again (another posture change has occurred) the rabbit has vanished; to achieve this effect, the AR element "rabbit" has to be deleted. As another example, an enormous pillar is deployed on the ground as an AR element; when the user makes the motion of pushing hard against the pillar, the pillar slowly tilts. This is an example of changing the display state of an AR element that has already been deployed.
In one embodiment of the present invention, in the above device, the AR unit 230 is adapted to determine the deployment parameters in the matched deployment condition according to one or more of the speed, amplitude and type of the change in the target object's posture, and/or to determine the deployment parameters in the matched deployment condition according to the posture of the target object.
For example, the AR element "playing cards" is deployed on the desktop in front of the user. When the user slaps the desktop, the cards fly up and hover in the air; when realizing this effect, the speed and height at which the cards rise can be determined from the speed and amplitude of the user's slap. Another effect based on the same AR element is the following: the user makes the motion of picking up the cards and performing a fancy shuffle, and a spectacular shuffling effect is shown. This effect differs from the previous one, yet both are realized from the same AR element; in other words, when the type of the target object's (the person's) posture change differs, the deployment parameters in the AR element's deployment condition change accordingly, and what is shown is a different AR effect.
Determining the deployment parameters in the matched deployment condition according to the posture of the target object is easier to understand: taking the earlier "aircraft-carrier pose" as an example, the positions of the fighter jets, their take-off angles and so on need to be determined according to the user's posture.
In one embodiment of the present invention, in the above device, the AR unit 230 is adapted to first show the video stream on the display interface and, when multiple planes are identified, in response to a selection instruction on the display interface, determine the plane closest to the selection instruction and use that closest plane as the reference plane of the AR model.
Sometimes multiple planes, such as a desktop and the ground, are recognized in the video stream, and if the desired effect is a flag fixed on a plane, the user needs to choose whether it is placed on the desktop or on the ground. In this case, the video stream can first be shown on the display interface; the user can tap the displayed desktop or ground, and the background can automatically determine, from the user's tap, the corresponding plane on which the flag is placed.
In conclusion, after obtaining the video stream captured by a camera, the technical solution of the present invention identifies a plane in it to serve as the reference plane of an AR model and deploys at least some AR elements of the AR model into the video stream, thereby generating an AR video stream that can be shown on a display interface. This solution uses a plane in the real world as the tie that binds it to the virtual world, achieving an ingenious fusion of the two; the generated AR video stream looks more natural and is more engaging.
It should be noted that:
The algorithms and displays provided herein are not inherently related to any particular computer, virtual device or other equipment; various general-purpose devices may also be used with the teachings herein. The structure required to construct such devices is apparent from the description above. Furthermore, the present invention is not directed to any particular programming language. It should be understood that the content of the invention described herein can be implemented using various programming languages, and the description above of a specific language is given to disclose the best mode of the invention.
In the specification provided here, numerous specific details are set forth. It should be understood, however, that embodiments of the present invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail so as not to obscure the understanding of this description.
Similarly, it should be understood that, in order to streamline the disclosure and aid the understanding of one or more of the various inventive aspects, in the description of exemplary embodiments above, various features of the invention are sometimes grouped together in a single embodiment, figure or description thereof. However, the disclosed method is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single embodiment disclosed above. Thus, the claims following the detailed description are hereby expressly incorporated into that detailed description, with each claim standing on its own as a separate embodiment of the present invention.
Those skilled in the art will appreciate that the modules in the devices of the embodiments can be adaptively changed and arranged in one or more devices different from those of the embodiments. The modules, units or components of the embodiments can be combined into one module, unit or component, or divided into multiple sub-modules, sub-units or sub-components. Except where at least some of such features and/or processes or units are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Unless expressly stated otherwise, each feature disclosed in this specification (including the accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose.
Furthermore, those skilled in the art will understand that although some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the present invention and to form different embodiments. For example, in the following claims, any one of the claimed embodiments can be used in any combination.
The component embodiments of the present invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will understand that a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of a device for implementing augmented reality AR according to embodiments of the present invention. The present invention may also be implemented as a device or program (for example, a computer program or a computer program product) for performing some or all of the method described herein. Such a program implementing the present invention may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
Fig. 3 shows a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention. The computer-readable storage medium 300 stores computer-readable program code 310 for performing the steps of the method according to the present invention, for example program code readable by a processor of an electronic device. When this program code is run by the electronic device, the electronic device is caused to perform each step of the method described above; specifically, the program code stored on the computer-readable storage medium can perform the method shown in any of the above embodiments. The program code may be compressed in a suitable form.
It should be noted that the above embodiments illustrate rather than limit the present invention, and that those skilled in the art can design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of multiple such elements. The present invention can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices can be embodied by one and the same item of hardware. The use of the words first, second and third does not indicate any order; these words may be interpreted as names.
The embodiments of the present invention disclose A1, a method for implementing augmented reality AR, including:
obtaining a video stream captured by a camera;
identifying a plane from the video stream;
using the plane as a reference plane for an AR model, deploying at least some of the AR elements in the AR model in the video stream, generating an AR video stream, and displaying it on a display interface.
A2. The method of A1, wherein the method further includes:
identifying a target object from the video stream;
the deploying at least some of the AR elements in the AR model in the video stream includes: judging whether the target object matches a deployment condition of any AR element in the AR model;
completing the deployment of the corresponding AR element according to the matched deployment condition.
A3. The method of A2, wherein the method further includes: identifying a pose of the target object;
the deploying at least some of the AR elements in the AR model in the video stream further includes: judging whether the pose of the target object matches a deployment condition of any AR element in the AR model, and/or judging whether a change in the pose of the target object matches a deployment condition of any AR element;
completing the deployment of the corresponding AR element according to the matched deployment condition.
A4. The method of A2 or A3, wherein the deployment condition of an AR element includes one or more of the following:
deploying an AR element that has not been deployed;
deleting an AR element that has already been deployed;
changing the display state of an AR element that has already been deployed.
A5. The method of A3, wherein completing the deployment of the corresponding AR element according to the matched deployment condition includes:
determining deployment parameters in the matched deployment condition according to one or more of the speed, amplitude, and type of the change in the pose of the target object, and/or determining deployment parameters in the matched deployment condition according to the pose of the target object.
A6. The method of A1, wherein the method further includes:
first displaying the video stream on the display interface;
the using the plane as a reference plane for the AR model includes: when multiple planes are identified, in response to a selection instruction on the display interface, determining the plane closest to the selection instruction, and using that closest plane as the reference plane for the AR model.
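Embodiment A6 resolves ambiguity when several planes are detected by picking the one closest to the user's selection instruction. A minimal sketch of that selection step, assuming each detected plane is represented by the 2D screen-space center of its projection; the `Plane` class and `closest_plane` function names are illustrative, not from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class Plane:
    name: str
    center: tuple  # (x, y) screen-space center of the plane's projection

def closest_plane(planes, tap):
    """Return the detected plane whose projected center is nearest to the tap point."""
    return min(planes, key=lambda p: math.dist(p.center, tap))

planes = [Plane("floor", (160, 400)), Plane("table", (240, 260)), Plane("wall", (80, 90))]
print(closest_plane(planes, (230, 250)).name)  # the table plane is nearest to this tap
```

A production system would select among plane hit-test results in 3D rather than 2D screen distances, but the selection rule — minimize distance to the selection instruction — is the same.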
The embodiments of the present invention also disclose B7, a device for implementing augmented reality AR, including:
an acquiring unit, adapted to obtain a video stream captured by a camera;
a recognition unit, adapted to identify a plane from the video stream;
an AR unit, adapted to use the plane as a reference plane for an AR model, deploy at least some of the AR elements in the AR model in the video stream, generate an AR video stream, and display it on a display interface.
B8. The device of B7, wherein:
the recognition unit is further adapted to identify a target object from the video stream;
the AR unit is adapted to judge whether the target object matches a deployment condition of any AR element in the AR model, and to complete the deployment of the corresponding AR element according to the matched deployment condition.
B9. The device of B8, wherein:
the recognition unit is further adapted to identify a pose of the target object;
the AR unit is adapted to judge whether the pose of the target object matches a deployment condition of any AR element in the AR model, and/or to judge whether a change in the pose of the target object matches a deployment condition of any AR element, and to complete the deployment of the corresponding AR element according to the matched deployment condition.
B10. The device of B8 or B9, wherein the deployment condition of an AR element includes one or more of the following:
deploying an AR element that has not been deployed;
deleting an AR element that has already been deployed;
changing the display state of an AR element that has already been deployed.
B11. The device of B9, wherein:
the AR unit is adapted to determine deployment parameters in the matched deployment condition according to one or more of the speed, amplitude, and type of the change in the pose of the target object, and/or to determine deployment parameters in the matched deployment condition according to the pose of the target object.
B12. The device of B7, wherein:
the AR unit is further adapted to first display the video stream on the display interface and, when multiple planes are identified, in response to a selection instruction on the display interface, determine the plane closest to the selection instruction and use that closest plane as the reference plane for the AR model.
The embodiments of the present invention also disclose C13, a computer-readable storage medium, wherein the computer-readable storage medium stores one or more programs which, when executed by a processor, implement the method of any one of A1–A6.
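Embodiments A2–A5 (and the mirrored device embodiments B8–B11) match a recognized target object's pose change against per-element deployment conditions, then deploy, delete, or restyle the element, with deployment parameters derived from the speed, amplitude, or type of the change. A minimal sketch of that matching loop under a deliberately simple condition representation; the `DeploymentCondition` fields and the `"wave"`/`"fist"` gesture names are illustrative assumptions, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class ARElement:
    name: str
    deployed: bool = False
    display_state: str = "normal"

@dataclass
class DeploymentCondition:
    trigger: str        # pose change that activates this condition, e.g. "wave"
    action: str         # "deploy", "delete", or "change_display" (A4)
    element: ARElement

def apply_pose_change(pose_change, speed, conditions):
    """Match a pose change against each AR element's deployment condition
    and complete the corresponding deployment (A2-A5)."""
    for cond in conditions:
        if cond.trigger != pose_change:
            continue
        if cond.action == "deploy" and not cond.element.deployed:
            cond.element.deployed = True
            # A5: a deployment parameter derived from the speed of the pose change
            cond.element.display_state = "large" if speed > 1.0 else "normal"
        elif cond.action == "delete" and cond.element.deployed:
            cond.element.deployed = False
        elif cond.action == "change_display" and cond.element.deployed:
            cond.element.display_state = "highlighted"

hat = ARElement("hat")
conditions = [DeploymentCondition("wave", "deploy", hat),
              DeploymentCondition("fist", "delete", hat)]
apply_pose_change("wave", speed=1.5, conditions=conditions)
print(hat.deployed, hat.display_state)  # True large
```

A real implementation would obtain `pose_change` and `speed` from a pose-estimation module running on the video stream; the dispatch over the three condition types in A4 is the part this sketch is meant to show.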

Claims (10)

1. A method for implementing augmented reality AR, including:
obtaining a video stream captured by a camera;
identifying a plane from the video stream;
using the plane as a reference plane for an AR model, deploying at least some of the AR elements in the AR model in the video stream, generating an AR video stream, and displaying it on a display interface.
2. The method of claim 1, wherein the method further includes:
identifying a target object from the video stream;
the deploying at least some of the AR elements in the AR model in the video stream includes: judging whether the target object matches a deployment condition of any AR element in the AR model;
completing the deployment of the corresponding AR element according to the matched deployment condition.
3. The method of claim 2, wherein the method further includes: identifying a pose of the target object;
the deploying at least some of the AR elements in the AR model in the video stream further includes: judging whether the pose of the target object matches a deployment condition of any AR element in the AR model, and/or judging whether a change in the pose of the target object matches a deployment condition of any AR element;
completing the deployment of the corresponding AR element according to the matched deployment condition.
4. The method of claim 2 or 3, wherein the deployment condition of an AR element includes one or more of the following:
deploying an AR element that has not been deployed;
deleting an AR element that has already been deployed;
changing the display state of an AR element that has already been deployed.
5. The method of claim 3, wherein completing the deployment of the corresponding AR element according to the matched deployment condition includes:
determining deployment parameters in the matched deployment condition according to one or more of the speed, amplitude, and type of the change in the pose of the target object, and/or determining deployment parameters in the matched deployment condition according to the pose of the target object.
6. The method of claim 1, wherein the method further includes:
first displaying the video stream on the display interface;
the using the plane as a reference plane for the AR model includes: when multiple planes are identified, in response to a selection instruction on the display interface, determining the plane closest to the selection instruction, and using that closest plane as the reference plane for the AR model.
7. A device for implementing augmented reality AR, including:
an acquiring unit, adapted to obtain a video stream captured by a camera;
a recognition unit, adapted to identify a plane from the video stream;
an AR unit, adapted to use the plane as a reference plane for an AR model, deploy at least some of the AR elements in the AR model in the video stream, generate an AR video stream, and display it on a display interface.
8. The device of claim 7, wherein:
the recognition unit is further adapted to identify a target object from the video stream;
the AR unit is adapted to judge whether the target object matches a deployment condition of any AR element in the AR model, and to complete the deployment of the corresponding AR element according to the matched deployment condition.
9. The device of claim 8, wherein:
the recognition unit is further adapted to identify a pose of the target object;
the AR unit is adapted to judge whether the pose of the target object matches a deployment condition of any AR element in the AR model, and/or to judge whether a change in the pose of the target object matches a deployment condition of any AR element, and to complete the deployment of the corresponding AR element according to the matched deployment condition.
10. A computer-readable storage medium, wherein the computer-readable storage medium stores one or more programs which, when executed by a processor, implement the method of any one of claims 1-6.
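Claim 1's pipeline — capture a video stream, identify a plane, anchor the AR model to it, and emit an AR video stream — can be sketched as a per-frame loop. The dictionary-based frames and the `detect_plane`/`render` stubs below are illustrative placeholders for a real camera feed, plane detector, and renderer, not part of the patent:

```python
def detect_plane(frame):
    """Stand-in for a real plane detector: reports a fixed plane once enough of
    the scene has been observed (here, simply from the third frame onward)."""
    return {"origin": (0, 0), "normal": (0, 1, 0)} if frame["index"] >= 2 else None

def render(frame, plane, ar_elements):
    """Stand-in renderer: tag the frame with the elements drawn on the plane."""
    return {**frame, "ar_elements": list(ar_elements), "anchored_to": plane["origin"]}

def ar_pipeline(frames, ar_elements):
    """Claim 1: use the first identified plane as the AR model's reference
    plane and deploy the AR elements into every subsequent frame."""
    reference_plane = None
    for frame in frames:
        if reference_plane is None:
            reference_plane = detect_plane(frame)
        if reference_plane is None:
            yield frame                      # no plane yet: pass the frame through
        else:
            yield render(frame, reference_plane, ar_elements)

frames = [{"index": i} for i in range(4)]
out = list(ar_pipeline(frames, ["virtual_hat"]))
print([f.get("ar_elements") for f in out])  # [None, None, ['virtual_hat'], ['virtual_hat']]
```

The design point the claim captures is that plane identification gates deployment: until a reference plane is found, the raw video is displayed unmodified, and once it is found, every AR frame is composed relative to that plane.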
CN201711483513.2A 2017-12-29 2017-12-29 Method and device for realizing augmented reality AR and computer readable storage medium Active CN108090968B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711483513.2A CN108090968B (en) 2017-12-29 2017-12-29 Method and device for realizing augmented reality AR and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN108090968A true CN108090968A (en) 2018-05-29
CN108090968B CN108090968B (en) 2022-01-25

Family

ID=62181267

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711483513.2A Active CN108090968B (en) 2017-12-29 2017-12-29 Method and device for realizing augmented reality AR and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108090968B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157011A (en) * 2010-12-10 2011-08-17 北京大学 Method for carrying out dynamic texture acquisition and virtuality-reality fusion by using mobile shooting equipment
CN104740869A (en) * 2015-03-26 2015-07-01 北京小小牛创意科技有限公司 True environment integrated and virtuality and reality combined interaction method and system
CN105759960A (en) * 2016-02-02 2016-07-13 上海尚镜信息科技有限公司 Augmented reality remote guidance method and system in combination with 3D camera
CN106341720A (en) * 2016-08-18 2017-01-18 北京奇虎科技有限公司 Method for adding face effects in live video and device thereof
CN107025662A (en) * 2016-01-29 2017-08-08 成都理想境界科技有限公司 A kind of method for realizing augmented reality, server, terminal and system
US20170329402A1 (en) * 2014-03-17 2017-11-16 Spatial Intelligence Llc Stereoscopic display
CN107358656A (en) * 2017-06-16 2017-11-17 珠海金山网络游戏科技有限公司 The AR processing systems and its processing method of a kind of 3d gaming


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUNG-CHUN LIN, ET AL.: "Augmented reality using holographic display", Optical Data Processing and Storage *
TENG Jian et al.: "Research on product display app design based on augmented reality", Packaging Engineering *
GE Lin: "Research on mobile augmented reality technology for urban scenes with a high degree of fusion", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109002813A (en) * 2018-08-17 2018-12-14 浙江大丰实业股份有限公司 Stage fountain blocked state resolution system
CN109002813B (en) * 2018-08-17 2022-05-27 浙江大丰实业股份有限公司 Stage fountain blockage state analysis system
CN110009952A (en) * 2019-04-12 2019-07-12 上海乂学教育科技有限公司 Adaptive learning mobile terminal and learning method based on augmented reality
CN110390730A (en) * 2019-07-05 2019-10-29 北京悉见科技有限公司 The method and electronic equipment of augmented reality object arrangement
CN110390730B (en) * 2019-07-05 2023-12-29 北京悉见科技有限公司 Method for arranging augmented reality object and electronic equipment
CN111475026A (en) * 2020-04-10 2020-07-31 李斌 Space positioning method based on mobile terminal application augmented virtual reality technology
CN111475026B (en) * 2020-04-10 2023-08-22 李斌 Spatial positioning method based on mobile terminal application augmented virtual reality technology
CN114900722A (en) * 2022-05-06 2022-08-12 浙江工商大学 AR technology-based personalized advertisement implanting method and system

Also Published As

Publication number Publication date
CN108090968B (en) 2022-01-25

Similar Documents

Publication Publication Date Title
CN108090968A (en) Implementation method, device and the computer readable storage medium of augmented reality AR
JP6276882B1 (en) Information processing method, apparatus, and program for causing computer to execute information processing method
CN106355153B (en) A kind of virtual objects display methods, device and system based on augmented reality
CN109035373B (en) Method and device for generating three-dimensional special effect program file package and method and device for generating three-dimensional special effect
US10126553B2 (en) Control device with holographic element
JP6750046B2 (en) Information processing apparatus and information processing method
CN107622524A (en) Display methods and display device for mobile terminal
CN109420336A (en) Game implementation method and device based on augmented reality
CN108109209A (en) A kind of method for processing video frequency and its device based on augmented reality
CN206741428U (en) Support with reflective mirror
KR20180013892A (en) Reactive animation for virtual reality
CN110688003B (en) Electronic drawing system, display method, device and medium based on augmented reality
CN105892651A (en) Virtual object display method and electronic equipment
CN105847583A (en) Method and apparatus for image processing on mobile terminal
Lee Research and development of haptic simulator for dental education using virtual reality and user motion
US20180218631A1 (en) Interactive vehicle control system
JP2018124981A (en) Information processing method, information processing device and program causing computer to execute information processing method
CN108230448A (en) Implementation method, device and the computer readable storage medium of augmented reality AR
CN112206525B (en) Information processing method and device for hand-twisting virtual article in UE4 engine
Vasudevan et al. An intelligent boxing application through augmented reality for two users-human computer interaction attempt
CN108022305A (en) It is a kind of that room body check system is seen based on AR technologies
CN111640199B (en) AR special effect data generation method and device
Borisova et al. Developing an augmented reality textbook for Bachelor and Master programmes “Design, technology and management of the fashion industry”
JP2018190196A (en) Information processing method, information processing device, program causing computer to execute information processing method
Wang Augmented Reality with Kinect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant