CN109166182A - AR simulation processing method and apparatus, electronic device, and readable storage medium - Google Patents


Info

Publication number
CN109166182A
CN109166182A (application CN201810991019.5A)
Authority
CN
China
Prior art keywords
simulated object
reality scene
type
video
kinematic parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810991019.5A
Other languages
Chinese (zh)
Other versions
CN109166182B (en)
Inventor
曹逸凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201810991019.5A priority Critical patent/CN109166182B/en
Publication of CN109166182A publication Critical patent/CN109166182A/en
Application granted granted Critical
Publication of CN109166182B publication Critical patent/CN109166182B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/10: Services
    • G06Q 50/20: Education

Abstract

An embodiment of the present invention provides an AR simulation processing method and apparatus, an electronic device, and a readable storage medium. The method comprises: capturing an application video, the application video being generated by shooting a reality scene; determining kinematic parameters of a simulated object according to the type of the reality scene; and fusing the simulated object into the application video according to the kinematic parameters of the simulated object. The method gives the user the direct feel of an actual physics experiment while requiring no extra experimental materials, and therefore significantly improves the teaching effect of physics experiments.

Description

AR simulation processing method and apparatus, electronic device, and readable storage medium
Technical field
Embodiments of the present invention relate to intelligent hardware technology, and in particular to an AR simulation processing method and apparatus, an electronic device, and a readable storage medium.
Background technique
In the field of physics teaching, it is often necessary to demonstrate physics experiments to students.
In the prior art, physics experiments can be demonstrated to students by means of in-kind simulation or video simulation. In-kind simulation requires experimental materials to be prepared in advance, after which the experiment is demonstrated by manual operation. Video simulation plays a pre-recorded experiment video to students.
However, the teaching effect achieved with the existing methods is poor.
Summary of the invention
Embodiments of the present invention provide an AR simulation processing method and apparatus, an electronic device, and a readable storage medium, to solve the problem of the poor experiment teaching effect in the prior art.
A first aspect of the embodiments of the present invention provides an AR simulation processing method, comprising:
capturing an application video, the application video being generated by shooting a reality scene;
determining kinematic parameters of a simulated object according to the type of the reality scene; and
fusing the simulated object into the application video according to the kinematic parameters of the simulated object.
Further, the fusing the simulated object into the application video according to the kinematic parameters of the simulated object comprises:
determining an initial fusion position of the simulated object according to the type of the reality scene;
fusing the simulated object onto the initial fusion position in a first frame of the application video; and
fusing the simulated object, in sequence, into frames after the first frame of the application video according to the kinematic parameters of the simulated object.
Further, before the determining kinematic parameters of a simulated object according to the type of the reality scene, the method further comprises:
establishing the simulated object corresponding to an experimental subject; and
determining kinematic parameters of the simulated object in at least one reality scene.
Further, before the determining kinematic parameters of a simulated object according to the type of the reality scene, the method further comprises:
selecting the simulated object from a preset simulated object library according to the type of the reality scene.
Further, before the determining kinematic parameters of a simulated object according to the type of the reality scene, the method further comprises:
determining the type of the reality scene.
Further, the determining the type of the reality scene comprises:
determining the type of the reality scene according to texture information in the application video.
Further, the determining the type of the captured reality scene comprises:
determining the type of the reality scene according to shape information in the application video.
Further, the method further comprises:
playing audio corresponding to the type of the reality scene.
Further, the kinematic parameters include a movement velocity and a motion trajectory.
A second aspect of the embodiments of the present invention provides an AR simulation processing apparatus, comprising:
a capture module, configured to capture an application video, the application video being generated by shooting a reality scene;
a first determining module, configured to determine kinematic parameters of a simulated object according to the type of the reality scene; and
a fusion module, configured to fuse the simulated object into the application video according to the kinematic parameters of the simulated object.
Further, the fusion module comprises:
a determining unit, configured to determine an initial fusion position of the simulated object according to the type of the reality scene;
a first fusion unit, configured to fuse the simulated object onto the initial fusion position in a first frame of the application video; and
a second fusion unit, configured to fuse the simulated object, in sequence, into frames after the first frame of the application video according to the kinematic parameters of the simulated object.
Further, the apparatus further comprises:
an establishing module, configured to establish the simulated object corresponding to an experimental subject; and
a second determining module, configured to determine kinematic parameters of the simulated object in at least one reality scene.
Further, the apparatus further comprises:
a selecting module, configured to select the simulated object from a preset simulated object library according to the type of the reality scene.
Further, the apparatus further comprises:
a third determining module, configured to determine the type of the reality scene.
Further, the third determining module comprises:
a first determining unit, configured to determine the type of the reality scene according to texture information in the application video.
Further, the third determining module further comprises:
a second determining unit, configured to determine the type of the reality scene according to shape information in the application video.
Further, the apparatus further comprises:
a playing module, configured to play audio corresponding to the type of the reality scene.
Further, the kinematic parameters include a movement velocity and a motion trajectory.
A third aspect of the embodiments of the present invention provides an electronic device, comprising:
a memory, configured to store program instructions; and
a processor, configured to call and execute the program instructions in the memory to perform the method steps of the first aspect.
A fourth aspect of the embodiments of the present invention provides a readable storage medium having a computer program stored therein, the computer program being configured to perform the method of the first aspect.
With the AR simulation processing method and apparatus, electronic device, and readable storage medium provided by the embodiments of the present invention, a video of a reality scene is captured, and a simulated object is fused into that video according to the kinematic parameters of the simulated object. Because the simulated object is fused with the reality scene in real time during its motion, and its kinematic parameters correspond to the type of the reality scene, the user gets the direct feel of an actual physics experiment while no extra experimental materials need to be prepared; the teaching effect of physics experiments is therefore significantly improved.
Detailed description of the invention
In order to more clearly illustrate the technical solutions in the embodiments of the present invention or in the prior art, the accompanying drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without any creative effort.
Fig. 1 is a schematic flowchart of an AR simulation processing method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of an AR simulation processing method according to an embodiment of the present invention;
Fig. 3 is a schematic flowchart of an AR simulation processing method according to an embodiment of the present invention;
Fig. 4 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention;
Fig. 5 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention;
Fig. 6 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention;
Fig. 7 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention;
Fig. 8 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention;
Fig. 9 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention;
Figure 10 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention;
Figure 11 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention;
Figure 12 is a structural block diagram of an electronic device according to an embodiment of the present invention.
Specific embodiment
To make the objectives, technical solutions, and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
In the prior art, physics experiments are demonstrated mainly by in-kind simulation or video simulation. In-kind simulation requires considerable experimental material to be prepared and requires the experiment to be performed by hand. With video simulation, students can only watch a video and cannot gain an intuitive experimental experience. Both existing methods therefore have shortcomings, resulting in a poor teaching effect for physics experiments.
In view of the above problems, embodiments of the present invention propose an augmented reality (AR) simulation processing method: a video of a reality scene is captured, and a simulated object is fused into the video of the reality scene according to the kinematic parameters of the simulated object. Because the simulated object is fused with the reality scene in real time during its motion, and its kinematic parameters correspond to the type of the reality scene, the user gets the direct feel of an actual physics experiment while no extra experimental materials need to be prepared, significantly improving the teaching effect of physics experiments.
It should be noted that the method provided by the embodiments of the present invention can be applied not only in the field of physics teaching but also in fields such as healthcare.
Fig. 1 is a schematic flowchart of an AR simulation processing method according to an embodiment of the present invention. The method may be executed by an electronic device with image processing capability, such as a desktop or laptop computer. As shown in Fig. 1, the method includes:
S101: capture an application video, the application video being generated by shooting a reality scene.
Optionally, a physics experiment mainly involves two factors, an "object" and a "scene". The "object" is the experimental subject; for example, if an experiment verifies the motion of a slider on an inclined plane, the slider is the "object" under test. The "scene" is the environment in which the experimental subject is located; in the above example, the "inclined plane" can be regarded as the experimental scene.
Optionally, in the embodiments of the present invention, the above "scene" is a real scene, namely the above "reality scene". Before this step is executed, the reality scene must already exist. For example, when the method of the embodiments of the present invention is used to verify the motion of a slider on an inclined plane, the real scene must be obtained in advance, and a video of it is captured to form the application video. The real scene may be prepared by the experimenter, or an existing scene in the real world may be used directly. For the "inclined plane" scene, for instance, an inclined plane in the real world, such as the slope of a staircase, can be selected and shot directly to form the application video.
Optionally, in the embodiments of the present invention, the above "object" is a simulated object. Illustratively, assuming the experimental subject is a slider, a three-dimensional model of the slider can be generated in the embodiments of the present invention and used as the simulated object.
S102: determine kinematic parameters of the simulated object according to the type of the reality scene.
Optionally, in a real environment, the object of a physics experiment has different kinetic characteristics in different environments. Illustratively, the movement velocity of the slider on a smooth inclined plane differs from that on a rough inclined plane. There is therefore a correspondence between the reality scene and the kinematic parameters of the simulated object, and in this step the kinematic parameters of the simulated object can be determined according to the type of the reality scene.
Optionally, the type of the reality scene may cover multiple dimensions; for example, scene types may be distinguished by the material of the scene, or by the shape of the scene.
Optionally, the kinematic parameters of the simulated object may include a movement velocity, a motion trajectory, and the like.
S103: fuse the simulated object into the application video according to the kinematic parameters of the simulated object.
During its motion, the position of the simulated object in the displayed scene changes. In this step, the simulated object can therefore be fused into the application video in real time according to its kinematic parameters.
In this embodiment, a video of a reality scene is captured, and a simulated object is fused into the video of the reality scene according to its kinematic parameters. Because the simulated object is fused with the reality scene in real time during its motion, and its kinematic parameters correspond to the type of the reality scene, the user gets the direct feel of an actual physics experiment while no extra experimental materials need to be prepared; the teaching effect of physics experiments is therefore significantly improved.
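The S101-S103 flow can be sketched in code. This is a minimal illustrative sketch only: the function names, scene-type labels, and numeric speeds below are assumptions introduced for illustration, not part of the patent, which leaves the classification and fusion implementations open.

```python
# Minimal sketch of the S101-S103 pipeline. All names, scene-type labels,
# and numeric speeds are illustrative assumptions, not patent text.

def classify_scene(frame):
    """S102 (part 1): determine the reality-scene type from a frame (stubbed)."""
    return "smooth_incline"

def kinematic_parameters(scene_type):
    """S102 (part 2): map the scene type to the simulated object's parameters."""
    table = {
        "smooth_incline": {"speed": 2.0, "trajectory": "linear"},
        "rough_incline": {"speed": 0.5, "trajectory": "linear"},
    }
    return table[scene_type]

def fuse(frame, params):
    """S103: overlay the simulated object on the frame (represented abstractly)."""
    return {"frame": frame, "overlay_speed": params["speed"]}

def process_frame(frame):
    scene_type = classify_scene(frame)         # S102: scene type
    params = kinematic_parameters(scene_type)  # S102: kinematic parameters
    return fuse(frame, params)                 # S103: fusion

fused = process_frame("frame_0")
```

The point of the sketch is only the data flow: the scene type drives the parameters, and the parameters drive the fusion.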
On the basis of the above embodiments, this embodiment relates to the process of fusing the simulated object into the application video.
Fig. 2 is a schematic flowchart of an AR simulation processing method according to an embodiment of the present invention. As shown in Fig. 2, the above step S103 includes:
S201: determine an initial fusion position of the simulated object according to the type of the reality scene.
The initial fusion position is the initial position of the simulated object in the application video. Different types of scene may have different initial fusion positions.
Illustratively, assuming the reality scene is an inclined plane, the initial fusion position of the simulated object can be determined based on the type of the inclined plane. When the inclined plane is smooth, the initial fusion position may be the top of the inclined plane; when the inclined plane is rough, the initial fusion position may be the middle of the inclined plane.
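A sketch of S201 under the example just given (smooth incline: top; rough incline: middle). The normalized coordinates and the mapping's keys are hypothetical, introduced only to make the scene-type-to-position dependence concrete.

```python
# Sketch of S201: initial fusion position as a function of scene type.
# Coordinates are hypothetical normalized (x, y) values, not patent data.

def initial_fusion_position(scene_type):
    mapping = {
        "smooth_incline": (0.0, 1.0),  # top of the incline
        "rough_incline": (0.5, 0.5),   # middle of the incline
    }
    return mapping[scene_type]
```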
S202: fuse the simulated object onto the initial fusion position in the first frame of the application video.
In the embodiments of the present invention, the electronic device can capture the application video of the reality scene in real time, and the initial fusion position corresponds to the first frame of the application video. The simulated object is fused onto the initial fusion position in the first frame.
S203: fuse the simulated object, in sequence, into the frames after the first frame of the application video according to its kinematic parameters.
Optionally, after the simulated object is fused onto the initial fusion position in the first frame, its position in the reality scene changes continuously as it moves. The simulated object can therefore be fused into each frame of the application video, frame by frame, according to its kinematic parameters.
In one example, the position of the simulated object at each point in time can be calculated from its movement velocity, and the simulated object can then be fused onto the corresponding position in the frame corresponding to that point in time.
In another example, the position of the simulated object at each point in time can be calculated from its motion trajectory, and the simulated object can then be fused onto the corresponding position in the frame corresponding to that point in time.
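The velocity-based example can be sketched as follows, assuming for simplicity a constant velocity along a one-dimensional coordinate on the incline and a fixed frame rate; both assumptions, and the numbers used, are illustrative and not from the patent.

```python
# Sketch of S203 under a constant-velocity assumption: compute the simulated
# object's position for every frame so it can be fused frame by frame.
# The 1-D coordinate and the fps value are illustrative simplifications.

def positions_per_frame(initial_pos, speed, fps, n_frames):
    """Return the object's 1-D position for frames 0 .. n_frames-1."""
    dt = 1.0 / fps                      # time between consecutive frames
    return [initial_pos + speed * dt * i for i in range(n_frames)]

track = positions_per_frame(initial_pos=0.0, speed=2.0, fps=30, n_frames=4)
```

A trajectory-based variant would replace the linear formula with evaluation of the stored trajectory at each frame's timestamp.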
On the basis of the above embodiments, this embodiment relates to the process of determining the kinematic parameters of the simulated object.
Fig. 3 is a schematic flowchart of an AR simulation processing method according to an embodiment of the present invention. As shown in Fig. 3, before the above step S102, the method further includes:
S301: establish the simulated object corresponding to the experimental subject.
Optionally, the experimental subject is a real object, such as a real slider. When establishing the simulated object of the experimental subject, images of the subject from all angles can be collected first; information such as the size and shape of the subject is then obtained from these images, and the simulated object corresponding to the subject is constructed from this information.
S302: determine kinematic parameters of the simulated object in at least one reality scene.
Optionally, the kinematic parameters of the simulated object are those of the experimental subject. The kinematic parameters of the subject in different reality scenes can be obtained in advance by means of actual verification, statistics, and the like. In this step, after the simulated object of the experimental subject is established, the kinematic parameters of the subject in at least one reality scene can be used as the kinematic parameters of the simulated object in that reality scene.
In an optional embodiment, after the application video is captured, the simulated object can be selected manually by the user.
Illustratively, the electronic device can display the saved simulated objects in the form of a list or icons, and the user can select a simulated object by dragging it into the display area of the application video.
In another optional embodiment, after the application video is captured, the simulated object can be selected from a preset simulated object library according to the type of the reality scene.
The simulated object library can be generated in advance by the electronic device and updated continuously. For example, whenever a new simulated object is generated from an experimental subject, the electronic device can add it to the simulated object library for the user's subsequent experiments.
Optionally, for some reality scenes the corresponding object under test may be fixed, so the simulated object can be selected directly according to the types of those scenes.
Illustratively, assume the type of a certain reality scene is "smooth surface" and scenes of this type apply only to motion tests of a slider. In this embodiment, the simulated object applicable to that reality scene can then be selected directly.
On the basis of the above embodiments, this embodiment relates to the process of determining the type of the reality scene.
Optionally, before the kinematic parameters of the simulated object are determined according to the type of the reality scene, the type of the reality scene can first be determined.
The type of the reality scene may cover multiple dimensions. Correspondingly, when determining the type of the reality scene, the determination can be based on features of different dimensions.
Two optional ways of determining the type of the reality scene are described below.
In an optional embodiment, the type of the reality scene can be determined according to texture information in the application video.
During an experiment, different textures may correspond to different movement velocities, motion trajectories, and the like. For example, if the texture of the inclined plane is smooth, the slider moves faster; if the texture of the inclined plane is rough, the slider moves more slowly. By evaluating the texture in the video, the type of the reality scene can therefore be determined, and the kinematic parameters of the simulated object can in turn be determined from the scene type.
Illustratively, the electronic device can divide textures into multiple grades in advance, one grade per type, with each grade representing a degree of smoothness. After the application video is obtained, a frame image of the application video can be selected and examined to determine which grade its texture is closest to; the type of the reality scene can then be determined as the type corresponding to that grade.
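The nearest-grade matching can be illustrated as below. How the roughness of a frame is actually measured is left open by the patent; the scoring scale and the grade reference values here are assumptions for illustration only.

```python
# Illustration of texture grading: preset grades, each standing for one
# degree of smoothness and one scene type; a frame's measured roughness
# score is matched to the nearest grade. Scale and values are assumed.

GRADE_ROUGHNESS = {
    "smooth_incline": 0.1,
    "medium_incline": 0.5,
    "rough_incline": 0.9,
}

def nearest_grade(roughness):
    """Return the grade whose reference roughness is closest to the measurement."""
    return min(GRADE_ROUGHNESS, key=lambda g: abs(GRADE_ROUGHNESS[g] - roughness))
```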
In another optional embodiment, the type of the reality scene can be determined according to shape information in the application video.
During an experiment, scenes of different shapes may also correspond to different movement velocities, motion trajectories, and the like. For example, the sliding speed of a slider on an inclined plane differs from its sliding speed on a flat plane. By evaluating the shape of the scene in the video, the type of the reality scene can therefore be determined, and the kinematic parameters of the simulated object can in turn be determined from the scene type.
Illustratively, the electronic device can preset various shape types, each corresponding to one type of reality scene. After the application video is obtained, a frame image of the application video can be selected and examined to determine which shape type the shapes contained in the image are closest to; the type of the reality scene can then be determined as the type corresponding to that shape type.
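A deliberately reduced illustration of shape-based typing, collapsing "shape" to an estimated surface slope angle; the threshold and the labels are arbitrary illustrative choices, not part of the patent.

```python
# Illustration of shape-based scene typing, reduced to the estimated
# surface slope angle. The 5-degree threshold is an arbitrary assumption.

def shape_scene_type(slope_degrees):
    if slope_degrees < 5.0:
        return "plane"
    return "incline"
```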
Further, in an optional embodiment, while fusing the simulated object into the application video as described in the above embodiments, the electronic device can also play audio corresponding to the type of the reality scene.
Illustratively, the sound generated by a slider sliding on inclined planes of different textures differs. Audio corresponding to each type of reality scene can therefore be saved in advance, and when the slider moves in a scene of a certain type, the corresponding audio can be played in real time, further improving the realism of the teaching.
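The per-scene-type audio lookup can be sketched as a plain mapping; the clip file names are hypothetical placeholders.

```python
# Sketch of the audio step: one pre-saved clip per reality-scene type,
# played while the simulated object moves. File names are hypothetical.

AUDIO_BY_SCENE = {
    "smooth_incline": "slide_smooth.wav",
    "rough_incline": "slide_rough.wav",
}

def audio_for(scene_type):
    """Return the clip for the scene type, or None if none is registered."""
    return AUDIO_BY_SCENE.get(scene_type)
```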
Fig. 4 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention. As shown in Fig. 4, the apparatus includes:
a capture module 401, configured to capture an application video, the application video being generated by shooting a reality scene;
a first determining module 402, configured to determine kinematic parameters of a simulated object according to the type of the reality scene; and
a fusion module 403, configured to fuse the simulated object into the application video according to the kinematic parameters of the simulated object.
The apparatus is used to implement the foregoing method embodiments; its implementation principle and technical effect are similar and are not repeated here.
Fig. 5 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention. As shown in Fig. 5, the fusion module 403 includes:
a determining unit 4031, configured to determine an initial fusion position of the simulated object according to the type of the reality scene;
a first fusion unit 4032, configured to fuse the simulated object onto the initial fusion position in a first frame of the application video; and
a second fusion unit 4033, configured to fuse the simulated object, in sequence, into frames after the first frame of the application video according to the kinematic parameters of the simulated object.
Fig. 6 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention. As shown in Fig. 6, the apparatus further includes:
an establishing module 404, configured to establish the simulated object corresponding to an experimental subject; and
a second determining module 405, configured to determine kinematic parameters of the simulated object in at least one reality scene.
Fig. 7 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention. As shown in Fig. 7, the apparatus further includes:
a selecting module 406, configured to select the simulated object from a preset simulated object library according to the type of the reality scene.
Fig. 8 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention. As shown in Fig. 8, the apparatus further includes:
a third determining module 407, configured to determine the type of the reality scene.
Fig. 9 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention. As shown in Fig. 9, the third determining module 407 includes:
a first determining unit 4071, configured to determine the type of the reality scene according to texture information in the application video.
Figure 10 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention. As shown in Figure 10, the third determining module 407 further includes:
a second determining unit 4072, configured to determine the type of the reality scene according to shape information in the application video.
Figure 11 is a structural diagram of an AR simulation processing apparatus according to an embodiment of the present invention. As shown in Figure 11, the apparatus further includes:
a playing module 408, configured to play audio corresponding to the type of the reality scene.
In another embodiment, the kinematic parameters include a movement velocity and a motion trajectory.
Figure 12 is a structural block diagram of an electronic device according to an embodiment of the present invention. As shown in Figure 12, the electronic device 1200 includes:
a memory 1201, configured to store program instructions; and
a processor 1202, configured to call and execute the program instructions in the memory 1201 to perform the method steps in the above method embodiments.
Those of ordinary skill in the art will appreciate that all or some of the steps of the above method embodiments may be implemented by hardware under the control of program instructions. The foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The foregoing storage media include various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some or all of the technical features therein; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (20)

1. An AR simulation processing method, characterized by comprising:
acquiring an application video, the application video being a video generated by shooting a reality scene;
determining a kinematic parameter of a simulated object according to a type of the reality scene;
fusing the simulated object into the application video according to the kinematic parameter of the simulated object.
2. The method according to claim 1, wherein the fusing the simulated object into the application video according to the kinematic parameter of the simulated object comprises:
determining an initial fusion position of the simulated object according to the type of the reality scene;
fusing the simulated object onto the initial fusion position in a first frame of the application video;
sequentially fusing the simulated object into frames after the first frame of the application video according to the kinematic parameter of the simulated object.
3. The method according to claim 1, wherein before the determining a kinematic parameter of a simulated object according to the type of the reality scene, the method further comprises:
establishing a simulated object corresponding to an experimental subject;
determining a kinematic parameter of the simulated object in at least one reality scene.
4. The method according to claim 1, wherein before the determining a kinematic parameter of a simulated object according to the type of the reality scene, the method further comprises:
selecting the simulated object from a preset simulated object library according to the type of the reality scene.
5. The method according to claim 1, wherein before the determining a kinematic parameter of a simulated object according to the type of the reality scene, the method further comprises:
determining the type of the reality scene.
6. The method according to claim 5, wherein the determining the type of the reality scene comprises:
determining the type of the reality scene according to texture information in the application video.
7. The method according to claim 5, wherein the determining the type of the captured reality scene comprises:
determining the type of the reality scene according to shape information in the application video.
8. The method according to any one of claims 1-7, characterized by further comprising:
playing audio corresponding to the type of the reality scene.
9. The method according to any one of claims 1-7, wherein the kinematic parameter comprises a movement velocity and a motion trajectory.
10. An AR simulation processing apparatus, characterized by comprising:
an acquisition module, configured to acquire an application video, the application video being a video generated by shooting a reality scene;
a first determining module, configured to determine a kinematic parameter of a simulated object according to a type of the reality scene;
a fusion module, configured to fuse the simulated object into the application video according to the kinematic parameter of the simulated object.
11. The apparatus according to claim 10, wherein the fusion module comprises:
a determination unit, configured to determine an initial fusion position of the simulated object according to the type of the reality scene;
a first fusion unit, configured to fuse the simulated object onto the initial fusion position in a first frame of the application video;
a second fusion unit, configured to sequentially fuse the simulated object into frames after the first frame of the application video according to the kinematic parameter of the simulated object.
12. The apparatus according to claim 10, characterized by further comprising:
an establishing module, configured to establish a simulated object corresponding to an experimental subject;
a second determining module, configured to determine a kinematic parameter of the simulated object in at least one reality scene.
13. The apparatus according to claim 10, characterized by further comprising:
a selecting module, configured to select the simulated object from a preset simulated object library according to the type of the reality scene.
14. The apparatus according to claim 10, characterized by further comprising:
a third determining module, configured to determine the type of the reality scene.
15. The apparatus according to claim 14, wherein the third determining module comprises:
a first determination unit, configured to determine the type of the reality scene according to texture information in the application video.
16. The apparatus according to claim 14, wherein the third determining module further comprises:
a second determination unit, configured to determine the type of the reality scene according to shape information in the application video.
17. The apparatus according to any one of claims 10-16, characterized by further comprising:
a playing module, configured to play audio corresponding to the type of the reality scene.
18. The apparatus according to any one of claims 10-16, wherein the kinematic parameter comprises a movement velocity and a motion trajectory.
19. An electronic device, characterized by comprising:
a memory, configured to store program instructions;
a processor, configured to call and execute the program instructions in the memory to perform the method steps of any one of claims 1-9.
20. A readable storage medium, wherein a computer program is stored in the readable storage medium, and the computer program is configured to perform the method of any one of claims 1-9.
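Claims 2 and 9 together describe fusing the object at an initial position in the first frame and then advancing it through later frames according to a movement velocity and a motion trajectory. A minimal sketch of that per-frame position computation follows; the initial-position table, the straight-line trajectory encoding, and every identifier here are hypothetical rather than prescribed by the claims:

```python
# Hypothetical mapping from reality-scene type to the simulated object's
# initial fusion position, in pixels; the disclosure only says that the
# position depends on the scene type.
INITIAL_POSITION = {"road": (0, 200), "grassland": (0, 120)}

def fuse_positions(scene_type, n_frames, velocity, trajectory):
    """Return the simulated object's position in each frame of the application
    video: the initial fusion position in the first frame, then positions
    advanced per frame by the kinematic parameters.

    velocity   -- pixels moved along the trajectory per frame
    trajectory -- direction of the motion path as a (dx, dy) step vector
    """
    x0, y0 = INITIAL_POSITION[scene_type]     # initial position from scene type
    dx, dy = trajectory
    return [(x0 + velocity * i * dx, y0 + velocity * i * dy)
            for i in range(n_frames)]         # i = 0 is the first frame
```

An actual fusion module would composite the object's pixels into each frame at the returned positions; this sketch computes only the positions.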
CN201810991019.5A 2018-08-28 2018-08-28 AR simulation processing method and device, electronic equipment and readable storage medium Active CN109166182B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810991019.5A CN109166182B (en) 2018-08-28 2018-08-28 AR simulation processing method and device, electronic equipment and readable storage medium


Publications (2)

Publication Number Publication Date
CN109166182A true CN109166182A (en) 2019-01-08
CN109166182B CN109166182B (en) 2020-11-03

Family

ID=64893230

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810991019.5A Active CN109166182B (en) 2018-08-28 2018-08-28 AR simulation processing method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN109166182B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111669666A (en) * 2019-03-08 2020-09-15 北京京东尚科信息技术有限公司 Method, device and system for simulating reality

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105657294A (en) * 2016-03-09 2016-06-08 北京奇虎科技有限公司 Method and device for presenting virtual special effect on mobile terminal
CN105955456A (en) * 2016-04-15 2016-09-21 深圳超多维光电子有限公司 Virtual reality and augmented reality fusion method, device and intelligent wearable equipment
CN105955455A (en) * 2016-04-15 2016-09-21 北京小鸟看看科技有限公司 Device and method for adding object in virtual scene
CN107529091A (en) * 2017-09-08 2017-12-29 广州华多网络科技有限公司 Video clipping method and device
CN107551554A (en) * 2017-09-29 2018-01-09 广州云友网络科技有限公司 Indoor sport scene simulation system and method are realized based on virtual reality
CN107596683A (en) * 2017-09-25 2018-01-19 晋江市博感电子科技有限公司 The virtual amusement method of perambulator, apparatus and system based on augmented reality



Also Published As

Publication number Publication date
CN109166182B (en) 2020-11-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20190108

Assignee: Beijing Intellectual Property Management Co.,Ltd.

Assignor: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

Contract record no.: X2023110000094

Denomination of invention: AR simulation processing methods, devices, electronic devices, and readable storage media

Granted publication date: 20201103

License type: Common License

Record date: 20230818
