CN107730591A - Assembly guidance method and system based on a mixed reality device
- Publication number: CN107730591A
- Application number: CN201710829291.9A
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Abstract
The present invention relates to an assembly guidance method and system based on a mixed reality device, comprising: constructing a virtual assembly scene; fusing the virtual assembly scene with the real assembly scene through the mixed reality device; identifying the assembly stage and the assembly bodies in the real assembly scene; obtaining assembly guidance information according to the identified assembly stage and the identified assembly bodies; and displaying the assembly guidance information through the mixed reality device to guide the assembly process. The method and system provide guidance for specific assembly steps and match the virtual scene with the real assembly scene in real time, so that each assembly stage can be flexibly identified in real time and corresponding guidance provided. Displaying guidance information through the mixed reality device is intuitive and immersive, improves the efficiency of assembly guidance, and reduces the error rate during assembly.
Description
Technical field
The present invention relates to the field of virtual reality, and in particular to an assembly guidance method and system based on a mixed reality device.
Background art
At present, the assembly and maintenance of products or equipment are guided mainly by the experience of assembly personnel, written assembly instructions, or assembly guidance videos. Although virtual assembly technology has made good progress, guidance technology based on virtual reality and augmented reality scenes is still largely a blank, and complete assembly guidance solutions are lacking, especially for the assembly of complex products.
Existing virtual assembly schemes cannot build an assembly system for specific assembly steps; the virtual assembly scene cannot be matched with the real scene in real time; effective guidance from virtual assembly to actual assembly is lacking; the virtual display interface is not realistic or immersive; and the virtual assembly state cannot be updated in real time according to the actual scene.
Summary of the invention
The technical problem to be solved by the present invention is, in view of the deficiencies of the prior art, to provide an assembly guidance method and system based on a mixed reality device.
The technical solution of the present invention for solving the above technical problem is as follows:
An assembly guidance method based on a mixed reality device, comprising:
constructing a virtual assembly scene;
fusing the virtual assembly scene with the real assembly scene through a mixed reality device;
identifying the assembly stage and the assembly bodies in the real assembly scene;
obtaining assembly guidance information according to the identified assembly stage and the identified assembly bodies;
displaying the assembly guidance information through the mixed reality device to guide the assembly process.
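The claimed steps can be read as a simple loop: build the scene once, then repeatedly identify, look up guidance, and display. The sketch below is a minimal illustration of that loop under invented names (the dict-keyed scene, the marker-free `identify` stub, and the guidance text are all assumptions, not the patent's implementation):

```python
# Hypothetical sketch of the claimed method. The scene maps
# (assembly stage, assembly body) pairs to guidance text.

def build_virtual_assembly_scene():
    # Guidance information is stored in the virtual scene in advance.
    return {("wheel_stage", "wheel"): "Torque the five lug nuts in a star pattern."}

def identify(real_scene_frame):
    # Stand-in for marker-based recognition of the stage and body.
    return real_scene_frame["stage"], real_scene_frame["body"]

def guide(scene, real_scene_frame):
    stage, body = identify(real_scene_frame)
    info = scene.get((stage, body), "No guidance for this step.")
    return f"[{stage}] {body}: {info}"   # would be rendered on the headset

message = guide(build_virtual_assembly_scene(),
                {"stage": "wheel_stage", "body": "wheel"})
print(message)
```

A real system would replace `identify` with the marker- and sensor-based recognition described later in the description.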
The beneficial effects of the invention are as follows: the assembly guidance method provided by the invention fuses the constructed virtual assembly scene with the real scene through the mixed reality device, and identifies the assembly stage and assembly bodies in the real assembly scene. It thereby provides guidance for specific assembly steps, matches the virtual scene with the real assembly scene in real time, and flexibly identifies each assembly stage in real time to provide corresponding guidance. Displaying assembly guidance information through the mixed reality device is intuitive and immersive, improves the efficiency of assembly guidance, and reduces the error rate during assembly.
On the basis of the above technical solution, the present invention can also be improved as follows.
Further, identifying the assembly stage and assembly bodies in the real assembly scene specifically includes:
obtaining an image of each assembly body in the real assembly scene;
identifying each assembly body in the image according to the identification marks preset on all assembly bodies;
determining the hierarchical relationship between the identified assembly bodies;
determining the current assembly stage according to the identified assembly bodies and the hierarchical relationship.
The beneficial effect of this further scheme is: by identifying the assembly bodies according to the identification marks preset on them, and judging the hierarchical relationship between the identified assembly bodies, the current assembly stage can be identified accurately, improving the efficiency and accuracy of recognition.
Further, the method also includes:
obtaining the RGB image, depth image, motion data, temperature data, stress data, and strain data of each identified assembly body;
performing fusion decision processing on the RGB image, depth image, motion data, temperature data, stress data, and strain data of each identified assembly body at the data level, feature level, and decision level according to a preset multi-source data fusion algorithm, to obtain the real-time status information of each identified assembly body.
The beneficial effect of this further scheme is: integrating multi-source data processing into augmented reality scene recognition and interaction greatly improves the utilization of data and the practicality of augmented reality. In particular, fusing the RGB image, depth image, motion data, stress data, and strain data of an assembly body through multi-source data processing makes it possible to effectively identify the real-time status of the assembly body, accurately guide the assembly process, and adjust and monitor the guidance process in real time.
Further, the method also includes:
monitoring the assembly process and assembly result of each identified assembly body according to the real-time status information; when the assembly succeeds, recording the assembly result; when an assembly error occurs, sending an error prompt message to the mixed reality device.
The beneficial effect of this further scheme is: monitoring the assembly process through the real-time status information of the identified assembly bodies allows the process to be corrected in real time and makes the guidance more timely, which reduces operating errors during assembly. Monitoring the assembly result through the real-time status information allows assembly results to be counted more accurately and unsuccessfully assembled bodies to be located easily.
Further, constructing the virtual assembly scene specifically includes:
constructing a virtual scene;
obtaining the assembly flow and the attribute information of all assembly bodies, and generating multiple assembly stages according to the assembly flow and attribute information;
determining the hierarchical and sequential relationships between the assembly bodies according to their attribute information;
generating the assembly guidance information of each assembly stage according to the hierarchical and sequential relationships between the assembly bodies;
adding the attribute information of the assembly bodies, the assembly stages, the assembly guidance information of each assembly stage, and the hierarchical and sequential relationships between the assembly bodies to the virtual scene, to obtain the virtual assembly scene.
The beneficial effect of this further scheme is: constructing the virtual assembly scene from the assembly flow and the attribute information of all assembly bodies means the resulting scene contains the assembly flow and assembly body attributes, making it easy to identify different assembly stages and to provide comprehensive assembly guidance information.
Further, the assembly guidance information specifically includes operation information and guidance information, wherein:
the operation information includes text, images, and voice prompting the user to operate;
the guidance information includes basic information about the assembly bodies.
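The two-part split described above can be sketched as a small data structure. The field names below (`text`, `image`, `voice`, `specification`, `parameters`) are illustrative assumptions chosen to mirror the content lists given later in the description, not fields defined by the patent:

```python
# Illustrative-only container for assembly guidance information, split into
# operation information (prompts) and guidance information (basic facts
# about the assembly body).
from dataclasses import dataclass, field

@dataclass
class OperationInfo:
    text: str               # written prompt shown to the user
    image: str = ""         # id/path of a prompt image, if any
    voice: str = ""         # id/path of a voice prompt, if any

@dataclass
class GuidanceInfo:
    body_name: str
    specification: str = ""
    parameters: dict = field(default_factory=dict)

@dataclass
class AssemblyGuidance:
    operation: OperationInfo
    guidance: GuidanceInfo

g = AssemblyGuidance(
    OperationInfo(text="Install the wheel before the hub cap."),
    GuidanceInfo(body_name="wheel", specification="R17 alloy"))
print(g.guidance.body_name)
```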
The other technical solution of the present invention for solving the above technical problem is as follows:
An assembly guidance system based on a mixed reality device, comprising:
a model building device for constructing a virtual assembly scene;
a mixed reality device for fusing the virtual assembly scene with the real assembly scene;
an identification device for identifying the assembly stage and assembly bodies in the real assembly scene;
wherein the mixed reality device is further configured to obtain assembly guidance information according to the identified assembly stage and the identified assembly bodies, and to display the assembly guidance information to guide the assembly process.
The assembly guidance system provided by the invention fuses the constructed virtual assembly scene with the real scene through the mixed reality device, and identifies the assembly stage and assembly bodies in the real assembly scene. It thereby provides guidance for specific assembly steps, matches the virtual scene with the real assembly scene in real time, and flexibly identifies each assembly stage in real time to provide corresponding guidance. Displaying assembly guidance information through the mixed reality device is intuitive and immersive, improves the efficiency of assembly guidance, and reduces the error rate during assembly.
Further, the identification device includes:
a camera unit for obtaining images of the assembly bodies in the real assembly scene;
an assembly body recognition unit for identifying each assembly body in the images according to the identification marks preset on all assembly bodies;
a hierarchical relationship recognition unit for determining the hierarchical relationship between the identified assembly bodies;
an assembly stage recognition unit for determining the current assembly stage according to the identified assembly bodies and the hierarchical relationship.
Further, the identification device also includes:
a sensor group for obtaining the RGB image, depth image, motion data, temperature data, stress data, and strain data of each identified assembly body;
a data processing unit for performing fusion decision processing on the RGB image, depth image, motion data, temperature data, stress data, and strain data of each identified assembly body at the data level, feature level, and decision level according to a preset multi-source data fusion algorithm, to obtain the real-time status information of each identified assembly body.
Further, the system also includes:
a monitoring device for monitoring the assembly process and assembly result of each identified assembly body according to the real-time status information; when the assembly succeeds, recording the assembly result; when an assembly error occurs, sending an error prompt message to the mixed reality device.
Further, the model building device includes:
a modeling unit for constructing a virtual scene;
an assembly stage generation unit for obtaining the assembly flow and the attribute information of all assembly bodies, and generating multiple assembly stages according to the assembly flow and attribute information;
an assembly relationship recognition unit for determining the hierarchical and sequential relationships between the assembly bodies according to their attribute information;
a guidance information generation unit for generating the assembly guidance information of each assembly stage according to the hierarchical and sequential relationships between the assembly bodies;
an assembly scene modeling unit for adding the attribute information of the assembly bodies, the assembly stages, the assembly guidance information of each assembly stage, and the hierarchical and sequential relationships between the assembly bodies to the virtual scene, to obtain the virtual assembly scene.
Additional advantages of aspects of the invention will be set forth in part in the following description, will in part become apparent from the description, or will be learned through practice of the invention.
Brief description of the drawings
Fig. 1 is a schematic flowchart of an assembly guidance method based on a mixed reality device according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of an assembly guidance method based on a mixed reality device according to another embodiment of the present invention;
Fig. 3 is a schematic flowchart of a method for constructing a virtual assembly scene according to another embodiment of the present invention;
Fig. 4 is a schematic flowchart of a method for identifying the assembly stage and assembly bodies in a real assembly scene according to another embodiment of the present invention;
Fig. 5 is a structural block diagram of an assembly guidance system based on a mixed reality device according to another embodiment of the present invention.
Detailed description of the embodiments
The principles and features of the present invention are described below with reference to the accompanying drawings. The examples given are only used to explain the present invention and are not intended to limit its scope.
As shown in Fig. 1, which is a schematic flowchart of an assembly guidance method based on a mixed reality device according to an embodiment of the present invention, the method comprises the following steps:
S1: construct a virtual assembly scene. The virtual assembly scene refers to a virtual model that, on the basis of traditional model building and virtual 3D scene construction, contains assembly information such as the assembly flow and the attribute information of the assembly bodies.
During the construction of the virtual assembly scene, LOD (level-of-detail) techniques from game development can be used to optimize 3D display performance and loading speed and improve the user experience.
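The LOD idea mentioned above can be sketched in a few lines: pick a coarser mesh variant as the virtual model moves further from the viewer. The three-level split and the distance thresholds below are invented for illustration; a real engine would tie this into its render loop:

```python
# Minimal level-of-detail (LOD) selection sketch, assuming three mesh
# variants per model and distance thresholds in metres (illustrative).
def pick_lod(distance_m, thresholds=(2.0, 8.0)):
    """Return which mesh variant to draw for a model at `distance_m`."""
    if distance_m < thresholds[0]:
        return "high"    # full-detail mesh when the user is close
    if distance_m < thresholds[1]:
        return "medium"
    return "low"         # coarse mesh keeps frame rate up far away

print(pick_lod(1.0), pick_lod(5.0), pick_lod(20.0))
```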
S2: fuse the virtual assembly scene with the real assembly scene through a mixed reality device.
This can be achieved, for example, with a HoloLens, with mixed reality devices from brands such as Acer, Asus, or Dell, or with a traditional computer and its display device.
S3: identify the assembly stage and assembly bodies in the real assembly scene. Information such as images and speeds of the assembly bodies can be obtained through multiple sensors; the assembly bodies are identified by identification points or characteristic values preset on them; after the assembly bodies are identified, the assembly stage can be determined according to the preset hierarchical relationship between them.
S4: obtain assembly guidance information according to the identified assembly stage and the identified assembly bodies. The assembly guidance information is stored in advance in the virtual assembly scene and is retrieved through trigger conditions. For example, the eye-tracking function of the mixed reality device can detect the user's eye movement; when the time the user has gazed at a certain assembly body reaches a preset duration, the assembly guidance information can automatically pop up on the display interface of the mixed reality device. The guidance information can also be triggered by voice, gestures, and other conditions.
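The gaze-dwell trigger described above can be sketched as a small state machine: the timer restarts whenever the gaze target changes, and guidance fires once the dwell time is reached. Timestamps are passed in explicitly so the logic is easy to test; the class name and dwell value are assumptions for the sketch:

```python
# Sketch of a gaze-dwell trigger: guidance pops up once the gaze has
# rested on one assembly body for a preset duration (illustrative only).
class GazeTrigger:
    def __init__(self, dwell_seconds=1.5):
        self.dwell = dwell_seconds
        self.target = None
        self.since = None

    def update(self, gazed_body, now):
        """Return the body whose guidance should pop up, else None."""
        if gazed_body != self.target:
            self.target, self.since = gazed_body, now  # gaze moved: restart timer
            return None
        if gazed_body is not None and now - self.since >= self.dwell:
            return gazed_body
        return None

t = GazeTrigger(dwell_seconds=1.5)
t.update("wheel", 0.0)          # gaze lands on the wheel
print(t.update("wheel", 0.9))   # too soon, no pop-up
print(t.update("wheel", 1.6))   # dwell reached, guidance for "wheel" fires
```

Voice or gesture triggers would feed the same lookup with a different front end.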
It should be noted that the assembly guidance information specifically includes operation information and guidance information, wherein:
the operation information includes text, images, and voice prompting the user to operate; the prompted content includes the current assembly body, the current assembly stage, the assembly body the user is gazing at, the interface to be assembled, the assembly time, the assembly sequence, assembly precautions, safe-operation guides, information on correct assembly steps, menu operations, and so on;
the guidance information includes basic information about the assembly bodies; the prompted content includes text panels, assembly body information, information about parts to be assembled, specifications and parameters, and so on.
S5: display the assembly guidance information through the mixed reality device to guide the assembly process. After an assembler wearing the mixed reality device sees the virtually displayed guidance information, he or she can assemble the real parts and equipment according to it, which improves the accuracy and efficiency of assembly and greatly saves time and labor costs.
It should be understood that the above process also includes conventional steps such as virtual-real registration and visual display. The assembly guidance method provided by this embodiment fuses the constructed virtual assembly scene with the real scene through the mixed reality device, identifies the assembly stage and assembly bodies in the real assembly scene, provides guidance for specific assembly steps, matches the virtual scene with the real assembly scene in real time, and flexibly identifies each assembly stage in real time to provide corresponding guidance. Displaying assembly guidance information through the mixed reality device is intuitive and immersive, improves the efficiency of assembly guidance, and reduces the error rate during assembly.
As shown in Fig. 2, which is a schematic flowchart of an assembly guidance method based on a mixed reality device according to another embodiment of the present invention, the method comprises the following steps:
S1: construct a virtual assembly scene.
Preferably, as shown in Fig. 3, step S1 can be subdivided into the following steps:
S11: construct a virtual scene; the virtual scene here is a three-dimensional scene based on traditional 3D model building.
S12: obtain the assembly flow and the attribute information of all assembly bodies, and generate multiple assembly stages according to the assembly flow and attribute information.
It should be noted that the attribute information of an assembly body refers to information such as its shape and material. The division of assembly stages can be set according to actual needs; for example, for the assembly of an automobile, assembling different parts of the vehicle can be divided into different assembly stages, each with its own assembly flow and assembly bodies. In the wheel assembly stage, for instance, the assembly bodies are the wheels, and the assembly flow is the related procedure for mounting the wheels.
S13: determine the hierarchical and sequential relationships between the assembly bodies according to their attribute information. The hierarchical and sequential relationships here refer to the assembly order between the assembly bodies and the structural relationships of the assembly.
S14: generate the assembly guidance information of each assembly stage according to the hierarchical and sequential relationships between the assembly bodies. Once the assembly order and structural relationships between the assembly bodies are known, the assembly method of each assembly body is known, and the guidance information for guiding the assembly process can be generated.
S15: add the attribute information of the assembly bodies, the assembly stages, the assembly guidance information of each assembly stage, and the hierarchical and sequential relationships between the assembly bodies to the virtual scene, to obtain the virtual assembly scene.
Constructing the virtual assembly scene from the assembly flow and the attribute information of all assembly bodies means that the resulting scene contains the assembly flow and assembly body attributes, making it easy to identify different assembly stages and to provide comprehensive assembly guidance information.
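Steps S11-S15 amount to assembling one data structure from the flow and attributes. The sketch below shows one plausible shape for that structure; the field names, the `parent` attribute, and the generated guidance strings are all assumptions made for the illustration:

```python
# Illustrative data shape for the virtual assembly scene built in S11-S15:
# attributes, stages (S12), hierarchy and sequence (S13), per-stage
# guidance (S14), all collected into one scene object (S15).
def build_scene(flow, attributes):
    stages = [step["stage"] for step in flow]                        # S12
    hierarchy = {a: attributes[a].get("parent") for a in attributes} # S13
    sequence = [step["body"] for step in flow]                       # S13
    guidance = {step["stage"]: f"Attach {step['body']} next."        # S14
                for step in flow}
    return {"attributes": attributes, "stages": stages,              # S15
            "hierarchy": hierarchy, "sequence": sequence,
            "guidance": guidance}

scene = build_scene(
    flow=[{"stage": "wheel_stage", "body": "wheel"},
          {"stage": "door_stage", "body": "door"}],
    attributes={"wheel": {"shape": "disc", "material": "alloy",
                          "parent": "chassis"},
                "door": {"shape": "panel", "material": "steel",
                         "parent": "frame"},
                "chassis": {}, "frame": {}})
print(scene["guidance"]["wheel_stage"])
```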
S2: fuse the virtual assembly scene with the real assembly scene through the mixed reality device. Mature virtual reality, augmented reality, and mixed reality devices can already fuse virtual scenes with real scenes, so this is not repeated here.
S3: identify the assembly stage and assembly bodies in the real assembly scene.
Preferably, as shown in Fig. 4, step S3 can be subdivided into the following steps:
S31: obtain the image of each assembly body in the real assembly scene; for example, the image of each assembly body can be obtained by a depth camera to determine its position.
S32: identify each assembly body in the image according to the identification marks preset on all assembly bodies.
It should be noted that identification and tracking of the assembly bodies can be achieved by techniques such as feature matching and binocular tracking. Specifically, from the two-dimensional information and depth information of the assembly bodies obtained by the sensors, combined with RGB camera identification and tracking, the specific position of the actual assembly body in the virtual scene and its relative position in the actual scene are determined, and virtual-real registration is performed.
S33: determine the hierarchical relationship between the identified assembly bodies. Because the hierarchical relationship of each assembly is pre-stored in the virtual assembly scene, once the current assembly bodies are identified, the hierarchical relationship between them can be derived.
S34: determine the current assembly stage according to the identified assembly bodies and the hierarchical relationship. Because the assembly stages are preset according to the assembly flow and assembly bodies, once the current assembly bodies and their hierarchical relationship are identified, the current assembly stage can be determined.
In the refinement of step S3, identifying the assembly bodies according to the identification marks preset on them, and judging the hierarchical relationship between the identified assembly bodies, allows the current assembly stage to be identified accurately and improves the efficiency and accuracy of recognition.
S4: obtain assembly guidance information according to the identified assembly stage and the identified assembly bodies.
S5: display the assembly guidance information through the mixed reality device to guide the assembly process.
S6: obtain the RGB image, depth image, motion data, temperature data, stress data, and strain data of each identified assembly body.
For example, these data can be obtained by an RGB camera, a depth camera, a velocity sensor, an acceleration sensor, a temperature sensor, a nine-axis sensor, a strain sensor, a stress sensor, and so on.
S7: according to a preset multi-source data fusion algorithm, perform fusion decision processing on the RGB image, depth image, motion data, temperature data, stress data, and strain data of each identified assembly body at the data level, feature level, and decision level, to obtain the real-time status information of each identified assembly body. After fusion decision processing, virtual-real registration can be performed on the assembly bodies being assembled, and the assembly body states and the 3D scene can be recorded, providing a basis for later system maintenance and management.
For example, feature extraction can be performed on the RGB image, depth image, motion data, temperature data, stress data, and strain data; the extracted features are fused at the feature level; the fused data are then merged with the virtual assembly scene, and the real-time status information of each assembly body is displayed in real time.
Integrating multi-source data processing into augmented reality scene recognition and interaction greatly improves the utilization of data and the practicality of augmented reality. In particular, fusing the RGB image, depth image, motion data, temperature data, stress data, and strain data of an assembly body through multi-source data processing makes it possible to effectively identify its real-time status, accurately guide the assembly process, and adjust and monitor the guidance process in real time.
S8: monitor the assembly process and assembly result of each identified assembly body according to the real-time status information. When the assembly succeeds, record the assembly result; when an assembly error occurs, send an error prompt message to the mixed reality device to help the assembler correct the error.
For example, by detecting the real-time status information of an assembly body, its position, state, and so on can be determined and compared against a preset model of successful assembly, to judge whether the assembly succeeded or an error occurred.
Monitoring the assembly process through the real-time status information of the identified assembly bodies allows the process to be corrected in real time and makes the guidance more timely, which reduces operating errors during assembly. Monitoring the assembly result through the real-time status information allows assembly results to be counted more accurately and unsuccessfully assembled bodies to be located easily.
As shown in Fig. 5, which is a structural block diagram of an assembly guidance system based on a mixed reality device according to another embodiment of the present invention, the system includes a model building device 1, a mixed reality device 2, an identification device 3, and a monitoring device 4; these four devices are described in detail below.
The model building device 1 is used to construct the virtual assembly scene; its functions can be realized by a computer. It includes:
a modeling unit 11 for constructing a virtual scene;
an assembly stage generation unit 12 for obtaining the assembly flow and the attribute information of all assembly bodies, and generating multiple assembly stages according to the assembly flow and attribute information;
an assembly relationship recognition unit 13 for determining the hierarchical and sequential relationships between the assembly bodies according to their attribute information;
a guidance information generation unit 14 for generating the assembly guidance information of each assembly stage according to the hierarchical and sequential relationships between the assembly bodies;
an assembly scene modeling unit 15 for adding the attribute information of the assembly bodies, the assembly stages, the assembly guidance information of each assembly stage, and the hierarchical and sequential relationships between the assembly bodies to the virtual scene, to obtain the virtual assembly scene.
The mixed reality device 2 is used to fuse the virtual assembly scene with the real assembly scene, and is further used to obtain assembly guidance information according to the identified assembly stage and the identified assembly bodies, and to display the assembly guidance information to guide the assembly process.
For example, the mixed reality device 2 can be a HoloLens, a mixed reality device from a brand such as Acer, Asus, or Dell, or a traditional computer and its display device.
The identification device 3 is used to identify the assembly stage and assembly bodies in the real assembly scene, and includes:
a camera unit 31 for obtaining images of the assembly bodies in the real assembly scene;
preferably, the camera unit 31 can include an RGB camera and a depth camera;
an assembly body recognition unit 32 for identifying each assembly body in the images according to the identification marks preset on all assembly bodies;
a hierarchical relationship recognition unit 33 for determining the hierarchical relationship between the identified assembly bodies;
an assembly stage recognition unit 34 for determining the current assembly stage according to the identified assembly bodies and the hierarchical relationship;
a sensor group 35 for obtaining the RGB image, depth image, motion data, temperature data, stress data, and strain data of each identified assembly body;
preferably, the sensor group 35 can include an RGB camera, a depth camera, a velocity sensor, an acceleration sensor, a nine-axis sensor, a temperature sensor, a strain sensor, a stress sensor, and so on;
a data processing unit 36 for performing fusion decision processing on the RGB image, depth image, motion data, temperature data, stress data, and strain data of each identified assembly body at the data level, feature level, and decision level according to a preset multi-source data fusion algorithm, to obtain the real-time status information of each identified assembly body.
The monitoring device 4 is used to monitor the assembly process and assembly result of each identified assembly body according to the real-time status information; when the assembly succeeds, to record the assembly result; and when an assembly error occurs, to send an error prompt message to the mixed reality device 2.
It should be noted that the monitoring device 4 can be integrated into the mixed reality device 2, or can exist as an independent physical device to monitor the assembly process.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present invention. In this specification, such terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described can be combined in any suitable manner in one or more embodiments or examples. In addition, where no conflict arises, those skilled in the art can combine the different embodiments or examples described in this specification, and the features of different embodiments or examples.
It will be apparent to those skilled in the art that, for convenience and brevity of description, the specific working processes of the devices and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the device embodiments described above are merely schematic: the division into units is only a division by logical function, and other divisions are possible in actual implementation; multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments of the present invention.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a virtual reality device, a mixed reality device, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present invention. The foregoing storage medium includes, but is not limited to: local storage (RAM/ROM), removable storage (USB flash drive, removable hard disk, etc.), web server storage, cloud storage, and other media capable of storing program code.
The above are merely embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can readily conceive of various equivalent modifications or substitutions within the technical scope disclosed by the present invention, and these modifications or substitutions shall all fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the protection scope of the claims.
Claims (10)
- 1. An assembling guidance method based on mixed reality equipment, characterized by comprising: building a virtual assembling scene; fusing the virtual assembling scene with the real assembling scene through the mixed reality equipment; identifying the assembling stage and the assembly components in the real assembling scene; obtaining assembling guidance information according to the identified assembling stage and the identified assembly components; displaying the assembling guidance information through the mixed reality equipment to guide the assembling process.
- 2. The assembling guidance method according to claim 1, characterized in that identifying the assembling stage and the assembly components in the real assembling scene specifically comprises: acquiring images of the assembly components in the real assembling scene; identifying each assembly component in the images according to identification marks set in advance on all of the assembly components; determining the hierarchical relationships among the identified assembly components; determining the current assembling stage according to the identified assembly components and the hierarchical relationships.
- 3. The assembling guidance method according to claim 2, characterized by further comprising: respectively acquiring the RGB image, depth image, motion data, temperature data, stress data and strain data of each identified assembly component; according to a preset multi-source data fusion algorithm, performing fusion decision processing on the RGB image, depth image, motion data, temperature data, stress data and strain data of each identified assembly component at the data level, feature level and decision level respectively, to obtain real-time state information of each identified assembly component.
- 4. The assembling guidance method according to claim 3, characterized by further comprising: monitoring the assembling process and assembling result of each identified assembly component according to the real-time state information, and recording the assembling result when the assembling succeeds.
- 5. The assembling guidance method according to any one of claims 1 to 4, characterized in that building the virtual assembling scene specifically comprises: building a virtual scene; obtaining an assembling flow and attribute information of all the assembly components, and generating multiple assembling stages according to the assembling flow and the attribute information; determining the hierarchical relationships and sequential relationships among the assembly components according to the attribute information of each assembly component; generating assembling guidance information for each assembling stage according to the hierarchical relationships and sequential relationships among the assembly components; adding the attribute information of the assembly components, the assembling stages, the assembling guidance information of each assembling stage, and the hierarchical relationships and sequential relationships among the assembly components into the virtual scene, to obtain the virtual assembling scene.
- 6. An assembling guidance system based on mixed reality equipment, characterized by comprising: a modeling device, for building a virtual assembling scene; mixed reality equipment, for fusing the virtual assembling scene with the real assembling scene; an identification device, for identifying the assembling stage and the assembly components in the real assembling scene; the mixed reality equipment being further configured to obtain assembling guidance information according to the identified assembling stage and the identified assembly components, and to display the assembling guidance information to guide the assembling process.
- 7. The assembling guidance system according to claim 6, characterized in that the identification device comprises: a camera unit, for acquiring images of the assembly components in the real assembling scene; an assembly component recognition unit, for identifying each assembly component in the images according to identification marks set in advance on all of the assembly components; a hierarchical relationship recognition unit, for determining the hierarchical relationships among the identified assembly components; an assembling stage recognition unit, for determining the current assembling stage according to the identified assembly components and the hierarchical relationships.
- 8. The assembling guidance system according to claim 7, characterized in that the identification device further comprises: a sensor group, for respectively acquiring the RGB image, depth image, motion data, temperature data, stress data and strain data of each identified assembly component; a data processing unit, for performing fusion decision processing on the RGB image, depth image, motion data, temperature data, stress data and strain data of each identified assembly component at the data level, feature level and decision level respectively, according to a preset multi-source data fusion algorithm, to obtain real-time state information of each identified assembly component.
- 9. The assembling guidance system according to claim 8, characterized by further comprising: a supervising device, for monitoring the assembling process and assembling result of each identified assembly component according to the real-time state information, and recording the assembling result when the assembling succeeds.
- 10. The assembling guidance system according to any one of claims 6 to 9, characterized in that the modeling device comprises: a modeling unit, for building a virtual scene; an assembling stage generation unit, for obtaining an assembling flow and attribute information of all the assembly components, and generating multiple assembling stages according to the assembling flow and the attribute information; an assembly relationship recognition unit, for determining the hierarchical relationships and sequential relationships among the assembly components according to the attribute information of each assembly component; a guidance information generation unit, for generating assembling guidance information for each assembling stage according to the hierarchical relationships and sequential relationships among the assembly components; an assembling scene modeling unit, for adding the attribute information of the assembly components, the assembling stages, the assembling guidance information of each assembling stage, and the hierarchical relationships and sequential relationships among the assembly components into the virtual scene, to obtain the virtual assembling scene.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710829291.9A CN107730591A (en) | 2017-09-14 | 2017-09-14 | A kind of assembling bootstrap technique and system based on mixed reality equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107730591A true CN107730591A (en) | 2018-02-23 |
Family
ID=61206277
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710829291.9A Pending CN107730591A (en) | 2017-09-14 | 2017-09-14 | A kind of assembling bootstrap technique and system based on mixed reality equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107730591A (en) |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103460256A (en) * | 2011-03-29 | 2013-12-18 | 高通股份有限公司 | Anchoring virtual images to real world surfaces in augmented reality systems |
US20130318784A1 (en) * | 2012-05-31 | 2013-12-05 | Build-A-Bear Workshop, Inc. | Interactive toy assembly workshop and method for creating a toy with a digital profile |
CN103035135A (en) * | 2012-11-27 | 2013-04-10 | 北京航空航天大学 | Children cognitive system based on augment reality technology and cognitive method |
CN103941861A (en) * | 2014-04-02 | 2014-07-23 | 北京理工大学 | Multi-user cooperation training system adopting mixed reality technology |
US20160292779A1 (en) * | 2015-03-31 | 2016-10-06 | Kyle Smith Rose | Modification of three-dimensional garments using gestures |
CN105511600A (en) * | 2015-07-31 | 2016-04-20 | 华南理工大学 | Multi-media man-machine interaction platform based on mixed reality |
US20170053449A1 (en) * | 2015-08-19 | 2017-02-23 | Electronics And Telecommunications Research Institute | Apparatus for providing virtual contents to augment usability of real object and method using the same |
CN106340217A (en) * | 2016-10-31 | 2017-01-18 | 华中科技大学 | Augmented reality technology based manufacturing equipment intelligent system and its implementation method |
CN106529838A (en) * | 2016-12-16 | 2017-03-22 | 湖南拓视觉信息技术有限公司 | Virtual assembling method and device |
Non-Patent Citations (5)
Title |
---|
J. ZAUNER et al.: "Authoring of a Mixed Reality Assembly Instructor for Hierarchical Structures", The Second IEEE and ACM International Symposium on Mixed and Augmented Reality, 2003. Proceedings * |
LI SHIQI et al.: "Mixed reality-based interactive technology for aircraft cabin assembly", Chinese Journal of Mechanical Engineering * |
WU YICAN (ed.): "Innovating Nuclear Energy Safety", China Atomic Energy Press, 31 December 2016 * |
ZHANG QIUYUE et al.: "Application of virtual reality and augmented reality technology in aircraft assembly", Aeronautical Manufacturing Technology * |
PENG TAO: "Research on intelligent assembly technology and application for in-cabin structures", China Doctoral Dissertations Full-text Database, Engineering Science and Technology I * |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109491497A (en) * | 2018-10-19 | 2019-03-19 | 华中科技大学 | A kind of human assistance assembly application system based on augmented reality |
CN110224994A (en) * | 2019-03-06 | 2019-09-10 | 顶拓科技(武汉)有限公司 | A kind of method, AR glasses and system by diagonal assembly control |
CN110223393A (en) * | 2019-03-06 | 2019-09-10 | 顶拓科技(武汉)有限公司 | A kind of AR glasses and a kind of assembling process management system and method |
CN110136268A (en) * | 2019-04-26 | 2019-08-16 | 广州供电局有限公司 | Cable accessory makes guidance system and method |
CN110136268B (en) * | 2019-04-26 | 2023-12-05 | 广东电网有限责任公司广州供电局 | Cable accessory manufacturing guiding system and method |
CN110119206A (en) * | 2019-05-10 | 2019-08-13 | 西北工业大学 | A kind of AR resource dynamic loading method towards industrial operations operation guide |
US11030819B1 (en) | 2019-12-02 | 2021-06-08 | International Business Machines Corporation | Product build assistance and verification |
CN110928418A (en) * | 2019-12-11 | 2020-03-27 | 北京航空航天大学 | Aviation cable auxiliary assembly method and system based on MR |
CN111618550A (en) * | 2020-05-18 | 2020-09-04 | 上海交通大学 | Flexible matching system for augmented reality auxiliary assembly of missile cabin and monitoring method |
CN111862716A (en) * | 2020-07-30 | 2020-10-30 | 江苏建筑职业技术学院 | Prefabricated assembled structure construction virtual training system and method based on building information model |
CN113489963A (en) * | 2021-07-08 | 2021-10-08 | 宁波宝贝第一母婴用品有限公司 | Method and device for guiding installation of cart |
CN113489963B (en) * | 2021-07-08 | 2024-02-23 | 宁波宝贝第一母婴用品有限公司 | Cart installation guiding method and device |
CN113673894A (en) * | 2021-08-27 | 2021-11-19 | 东华大学 | Multi-person cooperation AR assembly method and system based on digital twin |
CN113673894B (en) * | 2021-08-27 | 2024-02-02 | 东华大学 | Multi-person cooperation AR assembly method and system based on digital twinning |
CN114197884A (en) * | 2021-12-27 | 2022-03-18 | 广东景龙建设集团有限公司 | Assembling guiding method and system for customized decorative wallboard |
CN114197884B (en) * | 2021-12-27 | 2022-07-08 | 广东景龙建设集团有限公司 | Assembling guiding method and system for customized decorative wallboard |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107730591A (en) | A kind of assembling bootstrap technique and system based on mixed reality equipment | |
CN106340217B (en) | Manufacturing equipment intelligence system and its implementation based on augmented reality | |
CN106379237B (en) | Vehicle lane-changing overall process DAS (Driver Assistant System) based on augmented reality | |
US20180357978A1 (en) | Method and devices used for implementing augmented reality interaction and displaying | |
EP3796112B1 (en) | Virtual vehicle control method, model training method, control device and storage medium | |
CN204952205U (en) | Wear -type combination body -building system | |
JP2020513957A5 (en) | ||
CN109117706A (en) | Moving body detection device and detection method, moving body learning device and learning method, moving body detecting system and program | |
CN101295206A (en) | System for stereovision | |
CN111428571B (en) | Vehicle guiding method, device, equipment and storage medium | |
CN107670272A (en) | Intelligent body-sensing based on VR technologies, sense of touch interactive scene analogy method | |
CN110320883A (en) | A kind of Vehicular automatic driving control method and device based on nitrification enhancement | |
KR101916863B1 (en) | Apparatus for controlling virtual reality game and control method thereof | |
JP2022520016A (en) | Information display methods, devices, electronic devices and computer programs | |
CN106327946A (en) | Virtual reality integrated machine for driving training | |
CN110007752A (en) | The connection of augmented reality vehicle interfaces | |
KR20170104846A (en) | Method and apparatus for analyzing virtual reality content | |
WO2020145224A1 (en) | Video processing device, video processing method and video processing program | |
CN109087546A (en) | Waste paper based on 3d virtual technology sorts machining simulation system | |
CN107230427A (en) | Intelligent display platform based on laser navigation, Activity recognition | |
CN111872928A (en) | Obstacle attribute distinguishing method and system and intelligent robot | |
Benoit et al. | Multimodal signal processing and interaction for a driving simulator: Component-based architecture | |
CN115440107A (en) | VR virtual reality-based intelligent driving training system and method for deaf-mute | |
CN108733962A (en) | A kind of method for building up and system of anthropomorphic driver's Controlling model of unmanned vehicle | |
CN109711349A (en) | Method and apparatus for generating control instruction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20180223 |