CN113610985B - Virtual-real interaction method, electronic equipment and storage medium - Google Patents
- Publication number
- CN113610985B (application CN202110692215.4A)
- Authority
- CN
- China
- Prior art keywords
- virtual
- state data
- real
- jig
- mixed reality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The application provides a virtual-real interaction method, an electronic device, and a storage medium. The method includes: obtaining a virtual material corresponding to a real material; superimposing the virtual material onto a physical jig in the real world through a mixed reality device; controlling the physical jig to adjust its operation mode according to first state data of the virtual material; and adjusting the running state of the virtual material through the mixed reality device according to second state data of the physical jig. The application can save training costs and enhance training effects.
Description
Technical Field
The present application relates to the field of mixed reality, and in particular, to a virtual-real interaction method, an electronic device, and a storage medium.
Background
At present, virtual-real interaction training based on mixed reality technology in the industrial field is mostly applied to scenarios in which both the materials and the jigs are physical objects, and virtual prompts are added to the training scene to inform the user of the current steps or precautions. The user is often required to manually switch to the next step after completing the current one to continue training, and the state of the material after training is irreversible. In another training mode, both the materials and the jig are virtual models; in this mode the user feels no tactile feedback when operating the jig, so the training effect is poor.
Disclosure of Invention
In view of the foregoing, it is necessary to provide a virtual-real interaction method, an electronic device, and a storage medium, which can enhance training effects and save training costs.
The application provides a virtual-real interaction method, which includes: obtaining a virtual material corresponding to a real material; superimposing the virtual material onto a physical jig in the real world through a mixed reality device; controlling the physical jig to adjust its operation mode according to first state data of the virtual material; and adjusting the running state of the virtual material through the mixed reality device according to second state data of the physical jig.
In one possible implementation, superimposing the virtual material onto the physical jig in the real world through the mixed reality device includes: setting a real marker point; scanning the real marker point through the mixed reality device at preset intervals to obtain virtual world coordinates of the real marker point; and adjusting the position of the virtual material according to the virtual world coordinates.
In one possible implementation, adjusting the position of the virtual material according to the virtual world coordinates includes: extracting the current virtual world coordinate as a first virtual world coordinate and the previous virtual world coordinate as a second virtual world coordinate; calculating the distance between the first virtual world coordinate and the second virtual world coordinate; determining whether the distance is greater than a preset distance threshold; if the distance is greater than the preset distance threshold, moving the virtual material to the first virtual world coordinate; and if the distance is smaller than the preset distance threshold, keeping the position of the virtual material unchanged.
In one possible implementation, the second state data includes continuous state data and discontinuous state data. The continuous state data includes the opening and closing angle, pressure value, and infrared sensing of the physical jig; the discontinuous state data includes switching of the operation mode of the physical jig and triggering of a switch.
In one possible implementation, the method further includes: in response to a change in the state of the physical jig, sending the changed second state data to a message server through the physical jig.
In one possible implementation, before adjusting the running state of the virtual material through the mixed reality device according to the second state data of the physical jig, the method further includes: receiving the second state data of the physical jig sent by the message server.
In one possible implementation, after adjusting the running state of the virtual material through the mixed reality device according to the second state data of the physical jig, the method further includes: acquiring changed third state data through the mixed reality device.
In one possible implementation, the method further includes: acquiring real-time state data of the virtual material; acquiring the target state data that the virtual material ultimately needs to satisfy; determining whether the real-time state data and the target state data are the same; and when they are the same, generating a training-completion prompt.
The application also provides an electronic device including a processor and a memory, where the processor is configured to implement the virtual-real interaction method when executing a computer program stored in the memory.
The application also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the virtual-real interaction method.
According to the virtual-real interaction method and related devices, a virtual material corresponding to a real material is obtained, the virtual material is superimposed onto the real-world physical jig using a mixed reality device, the physical jig is controlled to adjust its operation mode according to first state data of the virtual material, and the running state of the virtual material is adjusted by the mixed reality device according to second state data of the physical jig. By synchronizing the states of the virtual material and the physical jig, the training effect can be improved.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to a preferred embodiment of the present application for implementing a virtual-real interaction method.
Fig. 2 is a flow chart of a preferred embodiment of a method of virtual-real interaction disclosed in the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described in detail with reference to the accompanying drawings and specific embodiments.
Referring to fig. 1, fig. 1 is a schematic diagram of an electronic device according to an embodiment of the application. The electronic device 1 includes, but is not limited to, a memory 11 and at least one processor 12, which may be connected via a communication bus 13 or directly connected.
The electronic device 1 may be a computer, a mobile phone, a tablet computer, a personal digital assistant (PDA), or the like, in which an application program is installed. It will be appreciated by those skilled in the art that fig. 1 is merely an example of the electronic device 1 and does not constitute a limitation of it; the device may include more or fewer components than illustrated, combine certain components, or use different components. For example, the electronic device 1 may further include input-output devices, network access devices, buses, etc.
Fig. 2 is a flow chart of a preferred embodiment of the virtual-real interaction method of the present application. The method of virtual-real interaction is applied in the electronic device 1. The order of the steps in the flowchart may be changed and some steps may be omitted according to various needs. In this embodiment, the method for virtual-real interaction includes:
S21, obtaining a virtual material corresponding to the real material.
In industrial workflows, it is important to standardize the operations of the technicians involved, so job training is often required. For example, on an electronic-parts assembly line, assemblers must be trained before taking up their posts. To save training costs and improve training efficiency, the real-world materials required for training need to be converted into virtual-world materials.
In an embodiment of the present application, the real material is a real existing object, such as an electronic part. The virtual material is a three-dimensional virtual object generated by utilizing a three-dimensional modeling technology.
In an embodiment of the present application, the acquiring the virtual material corresponding to the real material includes:
(1) Acquiring the material information and geometric information of the real material. The material information represents physical properties of the real material, such as color and reflectivity; the geometric information represents geometric properties, such as shape and size.
(2) Constructing the virtual material from the material information and geometric information using three-dimensional modeling software, such as BIM tools or SolidWorks; the application places no limitation on the software used.
In a specific implementation, a three-dimensional model library can be generated from the virtual materials to store the three-dimensional models of various real materials; a model is extracted from the library when it is needed.
By generating the virtual material instead of the real material, training costs may be saved.
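The model-library idea above can be sketched as a simple registry keyed by a real-material ID. This is a minimal illustration only; the class, field names, and sample values are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class VirtualMaterial:
    """Three-dimensional virtual counterpart of a real material."""
    material_id: str
    color: str              # material information (physical properties)
    reflectivity: float
    shape: str              # geometric information
    size_mm: tuple

class ModelLibrary:
    """Stores three-dimensional models of various real materials."""
    def __init__(self):
        self._models = {}

    def register(self, material: VirtualMaterial):
        self._models[material.material_id] = material

    def extract(self, material_id: str) -> VirtualMaterial:
        # Extract the model from the library when it is needed for training.
        return self._models[material_id]

library = ModelLibrary()
library.register(VirtualMaterial("cpu-01", "grey", 0.3, "square", (37.5, 37.5, 3.0)))
print(library.extract("cpu-01").shape)  # square
```

A training session would register one entry per real material and look models up by ID when the mixed reality device loads the scene.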
S22, superimposing the virtual material onto a physical jig in the real world through a mixed reality device.
On an electronic-parts assembly line, a combined virtual-real training mode is adopted so that the user can feel the physical jig, which increases the sense of immersion and improves training efficiency. Before training begins, the virtual material is first superimposed onto the physical jig.
In an embodiment of the present application, the physical jig is the jig corresponding to the real material; for example, when the real material is an electronic part, the physical jig is an electronic-part assembly jig.
In an embodiment of the present application, the mixed reality device may be a head-mounted device, for example, a Microsoft HoloLens device.
In an embodiment of the application, the mixed reality device may extract the corresponding virtual materials from the three-dimensional model library according to the training content. For example, when the training content is the assembly of a mobile phone motherboard, the mixed reality device extracts the three-dimensional models of a central processing unit (CPU), a graphics processing unit (GPU), a read-only memory (ROM), a random access memory (RAM), and an external memory from the library.
In an embodiment of the present application, superimposing the virtual material onto the physical jig in the real world through the mixed reality device includes:
(1) Setting a real marker point. The marker point can be selected according to the actual requirements of the user; for example, a point on the work surface of the electronic-part assembly jig is selected as the real marker point.
(2) Scanning the real marker point through the mixed reality device at a preset first interval to obtain the virtual world coordinates of the real marker point. The first interval may be selected according to the actual requirements of the user, for example, 0.01 seconds. Note that the smaller the first interval, the better the superposition of the virtual material on the physical jig.
(3) Adjusting the position of the virtual material according to the virtual world coordinates. In a specific implementation, the sensors of the mixed reality device scan the real training scene, and spatial information of the scene is obtained using simultaneous localization and mapping (SLAM). The current virtual world coordinate is extracted as a first virtual world coordinate and the previous one as a second virtual world coordinate, and the distance between them is calculated. If the distance is greater than a preset distance threshold, the virtual material is moved to the first virtual world coordinate; otherwise, the position of the virtual material remains unchanged.
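The marker-tracking update in step (3) can be sketched as follows. This is a minimal illustration: coordinates are plain 3-tuples, and the threshold value and function names are assumptions, not values from the patent.

```python
import math

DISTANCE_THRESHOLD = 0.005  # meters; the preset distance threshold (assumed value)

def distance(a, b):
    """Euclidean distance between two virtual world coordinates."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def adjust_material_position(material_pos, current_coord, previous_coord):
    """Move the virtual material only when the marker has moved noticeably.

    current_coord  -- first virtual world coordinate (latest scan)
    previous_coord -- second virtual world coordinate (previous scan)
    """
    if distance(current_coord, previous_coord) > DISTANCE_THRESHOLD:
        return current_coord   # move the virtual material to the new coordinate
    return material_pos        # keep the position unchanged (filters sensor jitter)

pos = (0.0, 0.0, 0.0)
# Large marker displacement: the material follows the marker.
pos = adjust_material_position(pos, (0.5, 0.0, 0.0), (0.0, 0.0, 0.0))
print(pos)  # (0.5, 0.0, 0.0)
```

The threshold acts as a dead band: sub-threshold scan-to-scan differences are treated as noise, which keeps the overlay stable between scans.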
After step S22, the method further includes: splitting the operation flow into a plurality of training steps according to the training content, and sending training-step prompt information to the mixed reality device, where the prompt information includes at least one of voice, text, pictures, graphics, and direction marks.
S23, controlling the physical jig to adjust its operation mode according to the first state data of the virtual material.
When the virtual material is superimposed on the physical jig in the real world, the physical jig can adjust its operation mode according to the state data of the virtual material.
In an embodiment of the present application, the first state data may include the virtual position coordinates, angle, shape, and size of the virtual material.
In an embodiment of the application, at a preset second interval, the mixed reality device acquires the first state data of the virtual material and sends it to a message server. The message server receives data from each sender and forwards it to the corresponding receiver. The message server sends the first state data to the physical jig, and the physical jig switches to the corresponding operation mode according to the first state data. For example, when the virtual world coordinates of the virtual material coincide with those of the real marker point, the operation mode of the physical jig is adjusted to the start mode.
By using the message server to forward data, erroneous transmission (for example, manually sending data to unrelated users) is avoided, and data security is improved.
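The forwarding role of the message server can be sketched as a tiny in-process broker. This is illustrative only: a real deployment would use a networked message queue, and all class and topic names here are assumptions.

```python
class MessageServer:
    """Receives data from senders and forwards it to registered receivers."""
    def __init__(self):
        self._routes = {}   # sender id -> list of receiver callbacks

    def register(self, sender_id, receiver_callback):
        self._routes.setdefault(sender_id, []).append(receiver_callback)

    def publish(self, sender_id, data):
        # Forward only to receivers registered for this sender, so data
        # cannot be mis-sent to unrelated users.
        for receiver in self._routes.get(sender_id, []):
            receiver(data)

jig_modes = []
server = MessageServer()
# The physical jig subscribes to first state data from the mixed reality device
# and enters start mode when the material sits on the real marker point.
server.register("mr_device", lambda state: jig_modes.append(
    "start" if state["coords"] == state["marker_coords"] else "idle"))
server.publish("mr_device", {"coords": (0, 0, 0), "marker_coords": (0, 0, 0)})
print(jig_modes)  # ['start']
```

The same broker would carry second state data in the opposite direction, from the jig back to the mixed reality device.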
S24, adjusting the running state of the virtual material through the mixed reality device according to the second state data of the physical jig.
During training, the user operates the physical jig, so its state data may change, and the state of the virtual material needs to be adjusted in real time.
In an embodiment of the present application, the second state data includes continuous state data and discontinuous state data. The continuous state data includes the opening and closing angle, pressure value, and infrared sensing of the physical jig; the discontinuous state data includes switching of the operation mode of the physical jig and triggering of a switch.
In an embodiment of the present application, the physical jig obtains its second state data in real time, compares the current second state data with the last acquired second state data, and sends any inconsistent second state data to the message server. The message server forwards the second state data to the mixed reality device, which adjusts the state of the virtual material accordingly. For example, when the opening and closing angle of the physical jig changes, the mixed reality device adjusts the angle, shape, and size of the virtual material according to the changed angle.
Note that, in addition to the state of the virtual material, the training-step prompt information also changes with the physical jig. For example, when the second state data of the physical jig reaches a preset first target state, the first training step is completed and the prompt information automatically switches to that of the second training step; when the second state data reaches a preset second target state, the second training step is completed and the prompt information switches to that of the third training step.
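The prompt switching described above behaves like a small state machine over the preset target states, one target per training step. A sketch, with step encoding and state fields assumed for illustration:

```python
def advance_training_step(current_step, jig_state, target_states):
    """Switch to the next training-step prompt when the jig reaches the
    target state preset for the current step; otherwise stay on this step.

    target_states[i] is the second state data the jig must reach
    to complete step i.
    """
    if current_step < len(target_states) and jig_state == target_states[current_step]:
        return current_step + 1   # show the next step's prompt information
    return current_step

targets = [{"angle": 45}, {"angle": 90}]
step = 0
step = advance_training_step(step, {"angle": 45}, targets)  # first step completed
step = advance_training_step(step, {"angle": 30}, targets)  # second target not reached
print(step)  # 1
```

Each incoming second-state update from the message server would be fed through this check to decide whether the prompt advances.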
By adjusting the state of the virtual material in real time according to the state of the physical jig, the states of the two are synchronized and the training effect is enhanced.
After step S24, the method further includes: acquiring the changed third state data through the mixed reality device.
In an embodiment of the present application, the changed third state data may include the user's finger joints and eye gaze point. The third state data can be acquired in real time through the camera and sensors of the mixed reality device. The current third state data is compared with the last acquired third state data, and the state of the virtual material is adjusted according to any inconsistent third state data.
In an embodiment of the application, the user wears the mixed reality device, the virtual material is superimposed on the physical jig, the physical jig adjusts its operation mode according to the state of the virtual material, and the user operates the physical jig according to the training-step prompts. The mixed reality device adjusts the state of the virtual material according to the state data of the physical jig, as well as according to the user's finger joints and eye gaze point.
As an alternative embodiment, the method further includes: acquiring real-time state data of the virtual material; acquiring the target state data that the virtual material ultimately needs to satisfy; determining whether the real-time state data and the target state data are the same; and when they are the same, generating a training-completion prompt.
In a specific implementation, the target state data of the virtual material is preset, that is, the state data the virtual material needs to reach after training is completed. The real-time state data of the virtual material is acquired; when it matches the target state data, a training-completion prompt is generated and displayed through the mixed reality device.
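The completion check above reduces to comparing the material's real-time state with its preset target state. A sketch, with the prompt text and state fields assumed:

```python
def check_training_complete(realtime_state, target_state):
    """Generate a completion prompt when the virtual material's real-time
    state data equals the target state data it must finally satisfy."""
    if realtime_state == target_state:
        return "Training completed"   # shown through the mixed reality device
    return None

target = {"position": (0, 0, 0), "assembled": True}
print(check_training_complete({"position": (0, 0, 0), "assembled": True}, target))
# Training completed
```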
With continued reference to fig. 1, in this embodiment, the memory 11 may be an internal memory of the electronic device 1, that is, a memory built in the electronic device 1. In other embodiments, the memory 11 may also be an external memory of the electronic device 1, i.e. a memory external to the electronic device 1.
In some embodiments, the memory 11 is used to store program codes and various data, and to implement high-speed, automatic access to programs or data during operation of the electronic device 1.
The memory 11 may include random access memory, and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, a flash card, at least one disk storage device, a flash memory device, or another non-volatile solid-state storage device.
In one embodiment, the processor 12 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or any other conventional processor.
The program code and various data in the memory 11 may be stored in a computer-readable storage medium if implemented in the form of software functional units and sold or used as an independent product. Based on this understanding, the present application may implement all or part of the procedures in the methods of the above embodiments, for example the steps of the virtual-real interaction method, by instructing the relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and when executed by a processor may implement the steps of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), or the like.
It will be appreciated that the division of modules described above is a division by logical function and may be implemented in other ways in practice. In addition, the functional modules in the embodiments of the present application may be integrated in the same processing unit, each module may exist alone physically, or two or more modules may be integrated in the same unit. The integrated modules may be implemented in hardware or in hardware plus software functional modules.
Finally, it should be noted that the above-mentioned embodiments are merely for illustrating the technical solution of the present application and not for limiting the same, and although the present application has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made to the technical solution of the present application without departing from the spirit and scope of the technical solution of the present application.
Claims (9)
1. A method of virtual-to-real interaction, the method comprising:
obtaining a virtual material corresponding to the real material;
superimposing the virtual material onto a physical jig in the real world through a mixed reality device;
controlling the physical jig to adjust an operation mode according to first state data of the virtual material, wherein the first state data comprises virtual position coordinates, an angle, a shape, and a size of the virtual material;
adjusting the running state of the virtual material through the mixed reality device according to second state data of the physical jig, wherein the second state data comprises continuous state data and discontinuous state data, the continuous state data comprising an opening and closing angle, a pressure value, and infrared sensing of the physical jig, and the discontinuous state data comprising switching of the operation mode of the physical jig and triggering of a switch; and the running state of the virtual material comprises the angle, shape, and size of the virtual material.
2. The method of claim 1, wherein superimposing the virtual material onto the physical jig in the real world through the mixed reality device comprises:
setting a real marker point;
scanning the real marker point through the mixed reality device at preset intervals to obtain virtual world coordinates of the real marker point; and
adjusting the position of the virtual material according to the virtual world coordinates.
3. The method of virtual-real interaction of claim 2, wherein said adjusting the position of the virtual material according to the virtual world coordinates comprises:
extracting a current virtual world coordinate as a first virtual world coordinate, and extracting a previous virtual world coordinate as a second virtual world coordinate;
calculating a distance between the first virtual world coordinate and the second virtual world coordinate;
determining whether the distance is greater than a preset distance threshold;
if the distance is greater than the preset distance threshold, moving the virtual material to the first virtual world coordinate; and
if the distance is smaller than the preset distance threshold, keeping the position of the virtual material unchanged.
4. The method of virtual-real interaction of claim 1, further comprising:
in response to a change in the state of the physical jig, sending the changed second state data to a message server through the physical jig.
5. The method of claim 4, wherein before the adjusting, through the mixed reality device, of the running state of the virtual material according to the second state data of the physical jig, the method further comprises:
receiving the second state data of the physical jig sent by the message server.
6. The method of claim 1, wherein after the adjusting, through the mixed reality device, of the running state of the virtual material according to the second state data of the physical jig, the method further comprises:
obtaining, through the mixed reality device, changed third state data, the third state data comprising the user's finger joints and eye gaze point.
7. The method of virtual-real interaction of claim 6, further comprising:
acquiring real-time state data of the virtual material;
acquiring target state data that the virtual material finally needs to satisfy;
determining whether the real-time state data and the target state data are the same; and
when the real-time state data and the target state data are the same, generating a training-completion prompt.
8. An electronic device comprising a processor and a memory, the processor being configured to execute a computer program stored in the memory to implement the method of virtual-real interaction of any one of claims 1 to 7.
9. A computer-readable storage medium storing at least one instruction which, when executed by a processor, implements the method of virtual-real interaction of any one of claims 1 to 7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110692215.4A CN113610985B (en) | 2021-06-22 | 2021-06-22 | Virtual-real interaction method, electronic equipment and storage medium |
TW110126153A TWI796729B (en) | 2021-06-22 | 2021-07-15 | Virtual reality interaction method, electronic device, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110692215.4A CN113610985B (en) | 2021-06-22 | 2021-06-22 | Virtual-real interaction method, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113610985A (en) | 2021-11-05
CN113610985B (en) | 2024-05-17
Family
ID=78303612
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110692215.4A Active CN113610985B (en) | 2021-06-22 | 2021-06-22 | Virtual-real interaction method, electronic equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113610985B (en) |
TW (1) | TWI796729B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103941861A (en) * | 2014-04-02 | 2014-07-23 | 北京理工大学 | Multi-user cooperation training system adopting mixed reality technology |
CN105023215A (en) * | 2015-07-21 | 2015-11-04 | 中国矿业大学(北京) | Dangerous trade safety training system based on head-mounted mixed reality equipment |
CN108153240A (en) * | 2016-12-06 | 2018-06-12 | 发那科株式会社 | Augmented reality simulator and computer-readable medium |
WO2019041900A1 (en) * | 2017-09-04 | 2019-03-07 | 全球能源互联网研究院有限公司 | Method and device for recognizing assembly operation/simulating assembly in augmented reality environment |
CN109859538A (en) * | 2019-03-28 | 2019-06-07 | 中广核工程有限公司 | A kind of key equipment training system and method based on mixed reality |
CN111467789A (en) * | 2020-04-30 | 2020-07-31 | 厦门潭宏信息科技有限公司 | Mixed reality interaction system based on HoloLens |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016141208A1 (en) * | 2015-03-04 | 2016-09-09 | Usens, Inc. | System and method for immersive and interactive multimedia generation |
2021
- 2021-06-22 CN CN202110692215.4A patent/CN113610985B/en active Active
- 2021-07-15 TW TW110126153A patent/TWI796729B/en active
Also Published As
Publication number | Publication date |
---|---|
CN113610985A (en) | 2021-11-05 |
TWI796729B (en) | 2023-03-21 |
TW202301279A (en) | 2023-01-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11055890B2 (en) | Electronic device for generating avatar and method thereof | |
US20200349353A1 (en) | Method and apparatus for identifying a damaged part of a vehicle, server, client and system | |
CN108665373B (en) | Interactive processing method and device for vehicle loss assessment, processing equipment and client | |
TWI654539B (en) | Virtual reality interaction method, device and system | |
KR100940862B1 (en) | Head motion tracking method for 3d facial model animation from a video stream | |
WO2022000755A1 (en) | Robot, motion control method and apparatus therefor, and computer-readable storage medium | |
US9792731B2 (en) | System and method for controlling a display | |
CN110942479B (en) | Virtual object control method, storage medium and electronic device | |
CN111383345B (en) | Virtual content display method and device, terminal equipment and storage medium | |
KR20110104686A (en) | Marker size based interaction method and augmented reality system for realizing the same | |
KR20150105479A (en) | Realization method and device for two-dimensional code augmented reality | |
CN109035415B (en) | Virtual model processing method, device, equipment and computer readable storage medium | |
CN113610985B (en) | Virtual-real interaction method, electronic equipment and storage medium | |
CN112732075B (en) | Virtual-real fusion machine teacher teaching method and system for teaching experiments | |
US20230281354A1 (en) | System and method for providing autonomous driving simulation architecture with switchable models | |
KR101211178B1 (en) | System and method for playing contents of augmented reality | |
US20220172441A1 (en) | Virtual reality data-processing device, system and method | |
CN114596582A (en) | Augmented reality interaction method and system with vision and force feedback | |
CN111913562B (en) | Virtual content display method and device, terminal equipment and storage medium | |
KR101592977B1 (en) | Display apparatus and control method thereof | |
CN117655601B (en) | MR-based intelligent welding method apparatus, computer device, and medium | |
KR100951362B1 (en) | Object Virtual Image Materialization Method | |
JP2004306182A (en) | Simulation system of robot using image processing | |
US20240203068A1 (en) | Method and system for providing augmented reality object based on identification code | |
CN114332327A (en) | Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and server |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||