CN108845669B - AR/MR interaction method and device - Google Patents


Info

Publication number
CN108845669B
CN108845669B (application CN201810625030.XA)
Authority
CN
China
Prior art keywords: virtual, dimensional space, information, interactive device, user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810625030.XA
Other languages
Chinese (zh)
Other versions
CN108845669A (en)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yantai Mirage Intelligent Technology Co., Ltd.
Original Assignee
Liu Ling (刘玲)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liu Ling
Priority to CN201810625030.XA
Publication of CN108845669A
Application granted
Publication of CN108845669B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The invention, applicable to the technical field of AR/MR, provides an AR/MR interaction method and device, comprising: an interaction device, and a method for acquiring the position and posture information of the interaction device in the real world and mapping that information into the three-dimensional virtual space of an AR/MR scene. With this device and method, the user can interact with the three-dimensional virtual scene through the interaction device, improving the ease of operation in AR/MR scenes.

Description

AR/MR interaction method and device
Technical Field
The invention relates to the technical field of AR/MR, in particular to an AR/MR interaction method and device.
Background
Augmented Reality (AR) is a technology that computes the position and angle of the camera image in real time and overlays corresponding images, videos, and 3D models.
Mixed Reality (MR) is a further development of augmented reality that builds an interactive feedback loop among the real world, the virtual world, and the user by presenting virtual scene information within the real scene, enhancing the realism of the user experience.
ARKit is the AR development platform launched by Apple at WWDC 2017. With this toolset, developers can create augmented reality applications for iPhone and iPad.
AR/MR technology has developed rapidly and has made great progress in display quality, but user interaction is still limited to gestures and voice, whose stability and reliability are low. For a three-dimensional immersive experience, a laser-pointer-like interaction is the direction the industry is working toward. To realize laser-pointer interaction, the system must acquire the posture and position of the interaction device so that it knows where the user is pointing. The invention therefore provides an AR/MR interaction method and device to solve these problems.
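Once the device pose is known, the laser-pointer interaction described above reduces to casting a ray along the device's forward axis. A minimal sketch in Python (all names are illustrative, not from the patent; the forward axis is assumed to be -Z, the convention ARKit uses for a camera):

```python
def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * cross(q_vec, v)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    # v' = v + w*t + cross(q_vec, t)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

def pointed_at(position, orientation, distance):
    """Point `distance` metres along the device's forward (-Z) axis."""
    fx, fy, fz = rotate(orientation, (0.0, 0.0, -1.0))
    px, py, pz = position
    return (px + distance * fx, py + distance * fy, pz + distance * fz)

# Identity orientation: the device points straight down -Z.
print(pointed_at((0.0, 1.0, 0.0), (1.0, 0.0, 0.0, 0.0), 2.0))  # (0.0, 1.0, -2.0)
```

Intersecting this ray with virtual scene geometry then gives the "pointed-at" target for the interaction design.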
Disclosure of Invention
The embodiments of the invention provide an AR/MR interaction method and device, aiming to obtain the position and posture information of the user's interaction device in the user's virtual three-dimensional space, so that richer interaction experiences can be built on it.
In a first aspect of the embodiments of the present invention, an AR/MR interaction apparatus is provided, the core of which is a mobile device running iOS.
The mobile device running iOS may be an iOS phone, a device obtained by modifying an iOS phone, or another device supporting the ARKit technology.
Modifications of such a phone include, but are not limited to: changing the screen position and/or orientation, and changing the camera position and/or orientation.
In a second aspect of the embodiments of the present invention, an AR/MR interaction method is provided, where the method includes:
Acquiring information of the interaction device, including: the position information of the interaction device in the virtual three-dimensional space provided by the ARKit technology, the posture information of the interaction device in that space, and the data the interaction device transfers to the user's AR/MR scene.
Calculating the position mapping relation between the position information of the interaction device in the virtual three-dimensional space provided by the ARKit technology and its position information in the user's virtual three-dimensional space, and the posture mapping relation between the posture information of the interaction device in the ARKit space and its posture information in the user's virtual three-dimensional space.
Calculating, from this spatial mapping relation together with the position and posture information of the interaction device in the ARKit space, the position and posture information of the interaction device in the user's virtual three-dimensional space.
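Once the mapping relation is known, applying it is one matrix product per frame. A minimal sketch under the assumption that position and posture are carried together in a 4x4 homogeneous transform (function names and the numeric mapping below are illustrative, not from the patent):

```python
def matmul4(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    r, t = rotation, translation
    return [[r[0][0], r[0][1], r[0][2], t[0]],
            [r[1][0], r[1][1], r[1][2], t[1]],
            [r[2][0], r[2][1], r[2][2], t[2]],
            [0.0, 0.0, 0.0, 1.0]]

IDENTITY3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

# Assumed result of calibration: the user frame is the ARKit frame
# shifted by (1, 0, 2) metres (rotation part identity for simplicity).
T_map = pose(IDENTITY3, (1.0, 0.0, 2.0))

def to_user_space(T_arkit):
    """Map a device pose from the ARKit frame into the user's virtual space."""
    return matmul4(T_map, T_arkit)

# Device at (0.5, 1.0, 0.0) in the ARKit frame:
T_user = to_user_space(pose(IDENTITY3, (0.5, 1.0, 0.0)))
print([row[3] for row in T_user[:3]])  # position in user space: [1.5, 1.0, 2.0]
```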
In the embodiments of the invention, because the position and posture information of the interaction device in the user's virtual three-dimensional space can be calculated in real time, deeper interaction designs can be built on this information, making AR/MR operation more convenient, faster, and more natural.
Description of the drawings:
fig. 1 is a flowchart of an interaction method according to an embodiment of the present invention.
Fig. 2 is a system structure diagram of an interaction apparatus provided in the embodiment of the present invention, which operates by using the interaction method provided in the embodiment of the present invention.
Detailed description:
in order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The embodiments of the invention provide an interaction device, and a method for acquiring the position and posture information of the interaction device in the real world and mapping this information into the three-dimensional virtual space of an AR/MR scene.
In order to illustrate the technical process and apparatus of the present invention, the following description is given by way of specific examples.
Fig. 1 shows a flowchart of an interaction method provided in an embodiment of the present invention, which is detailed as follows:
Step S11: acquire the position and posture information of the interaction device in the virtual three-dimensional space coordinate system provided by the ARKit technology.
Specifically, the interaction device runs an application built on the ARKit technology; the application scans and recognizes the real world, generates a virtual three-dimensional space from its understanding of the real world, and acquires the position and posture of the interaction device in this space in real time.
Step S12: place a virtual interaction device with the same appearance and size as the real interaction device at a preset position in the user's virtual three-dimensional space, and acquire its position and posture in that space in real time.
Specifically, the AR/MR device used by the user generates a virtual three-dimensional space from its understanding of the real world through spatial scanning and recognition; the user opens the corresponding three-dimensional application, places the virtual interaction device in the space, and acquires its position and posture in the user's virtual three-dimensional space in real time through the application interface.
Step S13: move the interaction device and/or the virtual interaction device until their positions and postures match, and calculate the spatial mapping relation between them.
Specifically, through the AR/MR device the user can see both the virtual interaction device and the real interaction device; the user matches their positions and postures by moving either or both of them, and the spatial mapping relation between them is calculated by the spatial matching calibration method.
Preferably, the virtual interaction device is identical to the real interaction device in appearance, shape, and size.
Preferably, the spatial matching calibration method can be invoked at any time to correct deviations in the spatial mapping relation caused by interference during use.
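The spatial matching calibration admits a closed-form solution from a single matched pose pair: T_map = T_user * inverse(T_arkit). A minimal sketch in pure Python (the patent only names "a spatial geometric algorithm"; the single-pair closed form below is one possible instance, with rotation and translation kept separate):

```python
def transpose3(r):
    """Transpose of a 3x3 matrix; for a rotation this is also its inverse."""
    return [[r[j][i] for j in range(3)] for i in range(3)]

def rot_apply(r, v):
    """Apply a 3x3 rotation to a 3-vector."""
    return [sum(r[i][k] * v[k] for k in range(3)) for i in range(3)]

def rot_mul(a, b):
    """Product of two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def calibrate(user_pose, arkit_pose):
    """Closed-form mapping from one matched pair: T_map = T_user * T_arkit^-1.

    Poses are (R, t); a rigid transform's inverse is (R^T, -R^T t), so
    R_map = R_u R_a^T and t_map = t_u - R_map t_a.
    """
    (r_u, t_u), (r_a, t_a) = user_pose, arkit_pose
    r_map = rot_mul(r_u, transpose3(r_a))
    rt = rot_apply(r_map, t_a)
    t_map = [t_u[i] - rt[i] for i in range(3)]
    return r_map, t_map

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

# Matched pair recorded when the real and virtual devices coincide:
r_map, t_map = calibrate((I3, [2.0, 0.0, 1.0]), (I3, [0.5, 0.0, 0.0]))
print(t_map)  # [1.5, 0.0, 1.0]
```

Because calibration is stateless, re-running `calibrate` with a fresh matched pair at any time simply replaces the old mapping, which is how the deviation correction described above can be realized.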
Step S14: calculate the position and posture information of the interaction device in the user's virtual three-dimensional space from the spatial mapping relation and the position and posture information of the interaction device in the virtual three-dimensional space provided by the ARKit technology.
Specifically, the position and posture information of the interaction device in the ARKit space is obtained in real time, and its real-time position and posture in the user's virtual three-dimensional space are calculated through the spatial mapping relation.
Fig. 2 is a system structure diagram illustrating an interaction apparatus provided in the embodiment of the present invention operating by using an interaction method provided in the embodiment of the present invention.
To illustrate the operating mechanism of the interaction method and apparatus in detail, the embodiment also shows the other systems 213 needed for the whole interaction method and apparatus to operate normally. It should be understood that those skilled in the art may use other systems to interface with the interaction device of the invention, which should not be limited to the system shown in the embodiment.
Specifically, the interaction apparatus runs iOS and opens an application 21 built on the ARKit technology, which scans the environment over a period of time, automatically builds an understanding of it, and creates a virtual three-dimensional space coordinate system 22 based on the environment. The posture and position information of the interaction device in this ARKit space is acquired in real time through the ARKit interface and passed to the data transmission module 23, which transmits it to the calculation module 28 through wired and/or wireless means 24. The AR/MR device used by the user transmits, in real time, the position and posture information of the virtual interaction device in the user's virtual three-dimensional space 25 to the calculation module 28 via the data transmission module 26 through wired and/or wireless means 27. The spatial matching calibration module 29 in the calculation module 28 calculates, by the spatial matching calibration method of the invention, the spatial mapping relation 210 between the pose information of the interaction device in the ARKit space and its pose information in the user's virtual three-dimensional space, completing the spatial matching calibration.
After the spatial mapping relation is obtained, the real-time spatial information calculation module 211 calculates, from the position and posture information of the interaction device in the ARKit space, its position and posture information in the user's virtual three-dimensional space, i.e. the final data 212.
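The data transmission modules 23 and 26 must move pose samples over a wired or wireless link; the patent does not specify a wire format, so the layout below is purely an assumption for illustration: a fixed 28-byte packet of position plus orientation quaternion.

```python
import struct

# Hypothetical packet layout (not from the patent): little-endian,
# 3 float32 position + 4 float32 quaternion (w, x, y, z) = 28 bytes.
POSE_FORMAT = "<3f4f"

def encode_pose(position, orientation):
    """Serialize one pose sample for the transmission module."""
    return struct.pack(POSE_FORMAT, *position, *orientation)

def decode_pose(payload):
    """Inverse of encode_pose, for the receiving calculation module."""
    values = struct.unpack(POSE_FORMAT, payload)
    return values[:3], values[3:]

packet = encode_pose((0.5, 1.0, -2.0), (1.0, 0.0, 0.0, 0.0))
print(len(packet))             # 28
print(decode_pose(packet)[0])  # (0.5, 1.0, -2.0)
```

A fixed-size binary layout like this keeps per-frame overhead constant, which matters when poses are streamed in real time over a wireless link.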
It should be understood that, in the embodiments of the invention, the sequence numbers of the steps do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments.

Claims (5)

1. An AR/MR interaction method, the method comprising:
acquiring information of an interaction device, wherein the information of the interaction device comprises: the position information of the interaction device in the virtual three-dimensional space provided by the ARKit technology, the posture information of the interaction device in the virtual three-dimensional space provided by the ARKit technology, and other data transferred by the interaction device to the AR/MR scene;
wherein the user virtual three-dimensional space refers to the virtual space coordinate system provided by the AR/MR display device used by the user;
calculating, through spatial matching calibration, a position mapping relation between the position information of the interaction device in the virtual three-dimensional space provided by the ARKit technology and the position information of the interaction device in the user virtual three-dimensional space, and a posture mapping relation between the posture information of the interaction device in the virtual three-dimensional space provided by the ARKit technology and the posture information of the interaction device in the user virtual three-dimensional space;
and calculating in real time, through the calculated mapping relations, the position information and posture information of the interaction device in the user virtual three-dimensional space from the position information and posture information of the interaction device in the virtual three-dimensional space provided by the ARKit technology, acquired in real time.
2. The method of claim 1, wherein the interaction device obtains, through the ARKit interface, the current posture information and position information of the interaction device in the virtual three-dimensional space provided by the ARKit technology.
3. The method of claim 1, wherein the spatial matching calibration is carried out according to the following steps:
first, an application developed with the ARKit interface is established on the interaction device side; once running, it acquires in real time the position and posture information of the interaction device in the virtual three-dimensional space coordinate system provided by the ARKit technology;
second, a virtual interaction device is placed at a preset position in the user virtual three-dimensional space, and its position and posture information in the user virtual three-dimensional space is acquired in real time;
third, the interaction device and/or the virtual interaction device is moved until the interaction device matches the position and posture of the virtual interaction device, and the current position and posture information of the interaction device in the virtual three-dimensional space provided by the ARKit technology, together with the position and posture information of the virtual interaction device in the user virtual three-dimensional space, is recorded;
fourth, from the information recorded in the third step, a calculation module calculates through a spatial geometric algorithm the position mapping relation between the position information of the interaction device in the virtual three-dimensional space provided by the ARKit technology and its position information in the user virtual three-dimensional space, and the posture mapping relation between the corresponding posture information in the two spaces.
4. The method of claim 1, wherein the spatial matching calibration can be performed anew at any time to recalibrate.
5. The method of claim 3, wherein there is an associated feature between the virtual interaction device and the interaction device, wherein the associated feature is: the same pattern and/or the same shape and/or the same size.
CN201810625030.XA 2018-06-17 2018-06-17 AR/MR interaction method and device Active CN108845669B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810625030.XA CN108845669B (en) 2018-06-17 2018-06-17 AR/MR interaction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810625030.XA CN108845669B (en) 2018-06-17 2018-06-17 AR/MR interaction method and device

Publications (2)

Publication Number / Publication Date
CN108845669A (2018-11-20)
CN108845669B (2021-10-15)

Family

ID=64202001

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810625030.XA Active CN108845669B (en) 2018-06-17 2018-06-17 AR/MR interaction method and device

Country Status (1)

Country Link
CN (1) CN108845669B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110568927A * 2019-08-30 2019-12-13 Shenzhen SenseTime Technology Co., Ltd. Augmented reality information interaction method and device, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106648038A * 2015-10-30 2017-05-10 Beijing Chuizi Digital Technology Co., Ltd. Method and apparatus for displaying interactive object in virtual reality
CN107958491A * 2017-12-06 2018-04-24 Henan Water Conservancy Survey and Design Research Co., Ltd. Mobile augmented reality virtual coordinates and construction site coordinate matching method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100480780B1 * 2002-03-07 2005-04-06 Samsung Electronics Co., Ltd. Method and apparatus for tracking an object from video data


Also Published As

Publication number Publication date
CN108845669A (en) 2018-11-20

Similar Documents

Publication Publication Date Title
US20230341930A1 (en) Systems and methods for tracking a controller
US11270460B2 (en) Method and apparatus for determining pose of image capturing device, and storage medium
US20210233272A1 (en) Data processing method and device used in virtual scenario
US11481982B2 (en) In situ creation of planar natural feature targets
US8681179B2 (en) Method and system for coordinating collisions between augmented reality and real reality
JP2018125000A (en) Apparatus and method to generate realistic rigged three-dimensional (3d) model animation for view-point transform
CN109144252B (en) Object determination method, device, equipment and storage medium
US20220245859A1 (en) Data processing method and electronic device
US10402657B2 (en) Methods and systems for training an object detection algorithm
US11156843B2 (en) End-to-end artificial reality calibration testing
WO2016114930A2 (en) Systems and methods for augmented reality art creation
Viyanon et al. AR furniture: Integrating augmented reality technology to enhance interior design using marker and markerless tracking
US20170287165A1 (en) Computer program used for image processing
US20220067968A1 (en) Motion capture calibration using drones with multiple cameras
JP2015125641A (en) Information processing device, control method therefor, and program
Nishihara et al. Object recognition in assembly assisted by augmented reality system
CN107479701B (en) Virtual reality interaction method, device and system
CN108845669B (en) AR/MR interaction method and device
CN112732075B (en) Virtual-real fusion machine teacher teaching method and system for teaching experiments
CN110060354B (en) Positioning and interaction method of real image in virtual space
CN112270242A (en) Track display method and device, readable medium and electronic equipment
CN115131528A (en) Virtual reality scene determination method, device and system
Okamoto et al. Assembly assisted by augmented reality (A3R)
CN117274558B (en) AR navigation method, device and equipment for visual positioning and storage medium
CN106599893B (en) Processing method and device for object deviating from recognition graph based on augmented reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 2022-02-28
Address after: No. 1 Lanhai Road, High-tech Zone, Yantai City, Shandong Province
Patentee after: Yantai Mirage Intelligent Technology Co., Ltd.
Address before: Room 401, Building 9, Ruixiang Garden, Dajijia Subdistrict Office, Yantai Economic and Technological Development Zone, Yantai City, Shandong Province, 264000
Patentee before: Liu Ling