CN110135238B - Markless Internet of things equipment identification method based on mobile AR - Google Patents

Markless Internet of things equipment identification method based on mobile AR

Info

Publication number
CN110135238B
CN110135238B (application CN201910230198.5A)
Authority
CN
China
Prior art keywords
equipment
mobile
internet
things
things equipment
Prior art date
Legal status
Active
Application number
CN201910230198.5A
Other languages
Chinese (zh)
Other versions
CN110135238A (en
Inventor
夏明�
向文秀
Current Assignee
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN201910230198.5A priority Critical patent/CN110135238B/en
Publication of CN110135238A publication Critical patent/CN110135238A/en
Application granted granted Critical
Publication of CN110135238B publication Critical patent/CN110135238B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/20 Scenes; Scene-specific elements in augmented reality scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Automation & Control Theory (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Telephonic Communication Services (AREA)

Abstract

A label-free Internet of things equipment identification method based on mobile AR comprises two steps: Internet of things equipment information acquisition at deployment time and Internet of things equipment identification after deployment. When the Internet of things equipment is deployed, after the mobile AR equipment detects the Internet of things equipment through its camera, the global coordinate of the target Internet of things equipment is calculated from image data and various sensor data (including a global positioning system, an accelerometer, a gyroscope and an electronic compass), and the global coordinate together with the target Internet of things equipment ID input by the user is stored in a mapping database. After deployment, the Internet of things equipment can again be detected through the mobile AR equipment camera, the global coordinate of the target Internet of things equipment is calculated in real time from the image data and sensor data, and the corresponding ID is looked up in the mapping database, so that the individual Internet of things equipment is accurately identified. The invention provides a simple and effective method for identifying Internet of things equipment without depending on individual marks, which facilitates the configuration and diagnosis of Internet of things equipment.

Description

Markless Internet of things equipment identification method based on mobile AR
Technical Field
The invention relates to an identification method of Internet of things equipment, in particular to an identification method of unmarked Internet of things equipment based on mobile AR.
Background
Object recognition algorithms are widely used to identify specific target objects in complex image scenes. Currently, image-based target recognition algorithms are mainly used to distinguish objects of different classes. However, Internet of things devices may have identical appearances. Identifying an individual Internet of things device is a key technology for interacting with it, including configuration, diagnosis and other operations, and image-based target recognition alone cannot achieve this at present.
In order to identify individual Internet of things devices, identification methods based on characteristics such as RFID tags, barcodes and wireless signals have been developed. Document [1], Zhu B F, Gui Q. EPC identification technology in IoT systems based on RFID [J]. Journal of Henan Institute of Education (Natural Science Edition), 2012, and document [2], Montaser A, Moselhi O, et al. RFID indoor location identification for construction projects [J]. Automation in Construction, 2014, 39(4): 167-, give each device a unique RFID tag to enable identification; deployment cost is therefore high, and identification accuracy is easily disturbed by signal interference. Document [3], Shatte A, Holdsworth J, Lee I. Mobile augmented reality based context-aware library management system [J]. Expert Systems with Applications, 2014, 41(5): 2174-, prints a unique barcode on the Internet of things equipment to distinguish individuals, which likewise results in high deployment cost; moreover, a barcode exposed over a long period is easily damaged, which impairs recognition. Document [5], Haramaki T, Shimizu D, Nishino H. A wireless network visualization based on signal strength [C]. IEEE International Conference on Consumer Electronics - Taiwan, IEEE, 2017, document [6], Haramaki T, Nishino H. A device identification method for AR-based network topology visualization [C]. International Conference on Broadband & Wireless Computing, IEEE, 2015, document [7], Mandeljc R, Kristan M, et al. Tracking by identification using computer vision and radio [J]. Sensors, 2012, 13(1): 241-, and document [8], Haramaki T. A sensor fusion for network visualization [C]. 2016 IEEE International Conference on Consumer Electronics - Taiwan (ICCE-TW), judge device positions from the strength of wireless signals. Among them, documents [6] and [7] also combine image recognition techniques to improve identification accuracy, and document [8] extends document [6] by determining the user's posture (head up or head down) from acceleration sensor data to roughly distinguish devices in the vertical direction. However, wireless signals are susceptible to environmental interference, and positioning based on signal strength is inaccurate; these techniques are mainly intended for small-scale network equipment management and are difficult to apply to identification of Internet of things devices over a wide area. Document [9], Oguntala G, Abd-Alhameed R, Jones S, et al. Indoor location identification technologies for real-time IoT-based applications: an inclusive survey [J]. Computer Science Review, 2018, 30: 55-79, uses magnetic field information to identify devices, but this method is easily disturbed by magnetic interference.
The identification of a target Internet of things device can be abstracted as a positioning problem for the device. Document [10], Zhang B, et al. Passive target localization algorithm in multimedia sensor networks [J]. Journal of Southeast University (Natural Science Edition), 2011, 41(2): 266-269, addresses such a positioning problem; the difference from the present solution is that it uses a plurality of fixed cameras, whereas the present solution realizes identification with only one mobile AR device. Document [11], Oztarak H, Akkaya K, Yazici A. Lightweight object localization with a single camera in wireless multimedia sensor networks [C]. IEEE Global Telecommunications Conference, IEEE, 2009, also uses a fixed camera and focuses mainly on calculating the two-dimensional positional offset of the target relative to the camera.
Disclosure of Invention
In order to overcome the poor accuracy of existing methods for identifying target Internet of things equipment, the invention provides a label-free Internet of things equipment identification method based on mobile AR. The method uses the camera of the mobile AR equipment to recognize Internet of things equipment of the same kind, then comprehensively uses various sensor data (including a global positioning system, an accelerometer, a gyroscope and an electronic compass) to calculate the global coordinate of the individual target Internet of things equipment, and obtains the unique ID of the target Internet of things equipment through database matching.
The technical scheme adopted by the invention for solving the technical problems is as follows:
a label-free identification method for Internet of things equipment based on mobile AR comprises the steps of information acquisition during deployment and Internet of things equipment identification after deployment;
wherein, the information acquisition during deployment comprises the following substeps:
1.1) after the Internet of things equipment is deployed, a user aims the mobile AR equipment at the deployed Internet of things equipment and starts to acquire information of target Internet of things equipment;
1.2) the mobile AR equipment calculates the global coordinate of the target Internet of things equipment through the image data and the sensor data;
1.3) storing the global coordinate obtained by calculation and the unique address of the target Internet of things equipment input by the user into a mapping database, and finishing information acquisition;
the post-deployment internet of things equipment identification comprises the following substeps:
2.1) the user starts the mobile AR equipment and starts the equipment identification of the Internet of things;
2.2) when the mobile AR equipment detects the Internet of things equipment through image data, then sensor data are integrated, and the global coordinate of the target Internet of things equipment is calculated;
and 2.3) retrieving in a mapping database according to the obtained global coordinates of the target Internet of things equipment, matching the unique address of the corresponding Internet of things target equipment, and finishing identification.
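The mapping database in steps 1.3) and 2.3) only needs to associate a stored global coordinate with the identifier entered at deployment time and, at identification time, return the identifier whose stored coordinate lies closest to the freshly computed one. The following is a minimal in-memory sketch of that idea; the class name, tolerance value and flat-Earth distance metric are illustrative assumptions rather than anything prescribed by the patent.

```python
import math

class MappingDatabase:
    """Store (latitude, longitude, altitude) -> device ID and look up by proximity."""

    def __init__(self, tolerance_m: float = 1.0):
        self.tolerance_m = tolerance_m
        self.entries: list[tuple[tuple[float, float, float], str]] = []

    def register(self, coord: tuple[float, float, float], device_id: str) -> None:
        """Step 1.3): save the coordinate computed at deployment with the user-entered ID."""
        self.entries.append((coord, device_id))

    def identify(self, coord: tuple[float, float, float]) -> str | None:
        """Step 2.3): return the ID of the nearest stored device within tolerance."""
        best_id, best_dist = None, float("inf")
        for stored, device_id in self.entries:
            d = self._distance_m(stored, coord)
            if d < best_dist:
                best_id, best_dist = device_id, d
        return best_id if best_dist <= self.tolerance_m else None

    @staticmethod
    def _distance_m(a: tuple[float, float, float], b: tuple[float, float, float]) -> float:
        """Approximate 3-D distance in metres between two (lat, lon, alt) points."""
        lat_m = 111_320.0 * (a[0] - b[0])                            # metres per degree latitude
        lon_m = 111_320.0 * math.cos(math.radians(a[0])) * (a[1] - b[1])
        alt_m = a[2] - b[2]
        return math.sqrt(lat_m**2 + lon_m**2 + alt_m**2)
```

A production system would more likely back this with a geospatial index, but nearest-neighbour matching within a tolerance captures the retrieval described in step 2.3).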
Further, in the steps 1.2) and 2.2), the calculation process of the global coordinate of the unmarked Internet of things equipment is as follows: firstly, the relative displacement of the target Internet of things equipment in a standard coordinate system of the mobile AR equipment is calculated, the standard coordinate system taking the mobile AR equipment as the origin, with the Xs axis pointing due east, the Ys axis pointing due north, and the Zs axis perpendicular to the ground; then the global coordinate of the target Internet of things equipment is calculated by combining the global coordinate of the mobile AR equipment.
In the steps 1.2) and 2.2), the calculation process of the global coordinate of the unmarked internet of things equipment is as follows:
firstly, calculating the vertical displacement of the target Internet of things equipment in a standard coordinate system of the mobile AR equipment:
1.2.1.1) after the mobile AR device detects the Internet of things device, calculating a displacement matrix of a target Internet of things device in a camera coordinate system of the mobile AR device in the acquired image information; and calculating a connecting line between the origin of the coordinate system in the coordinate system of the mobile AR equipment camera and the coordinates of the target Internet of things equipment and a coordinate system X of the mobile AR equipment camera through the displacement matrixcZcAngle α formed by the planes:
Figure BDA0002006454540000051
wherein x isc、yc、zcRepresenting the displacement of the target Internet of things equipment in a mobile AR equipment camera coordinate system;
1.2.1.2) calculating the pitch angle beta of the mobile AR device, i.e. the mobile AR device camera coordinate system Y, using the accelerometer and gyroscope of the mobile AR devicecStandard coordinate system X of axis and mobile AR equipmentsYsThe planar included angle is calculated through alpha and beta, and a connecting line between the target Internet of things equipment and the mobile AR equipment and a standard coordinate system X of the mobile AR equipment are calculatedsYsThe included angle gamma of the plane;
Figure BDA0002006454540000061
1.2.1.3) calculating the vertical displacement z of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment according to the gammas
Figure BDA0002006454540000062
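The formulas for α, γ and the vertical displacement zs appear only as images in the original publication. The sketch below shows one plausible geometric realization of steps 1.2.1.1) to 1.2.1.3) under explicitly stated assumptions about the camera frame and the pitch convention; it is not a reproduction of the patent's own equations.

```python
import math

def vertical_displacement(xc: float, yc: float, zc: float, beta_deg: float) -> float:
    """Estimate the vertical offset z_s (metres) of the target in the
    east-north-up 'standard' frame of the mobile AR device.

    Assumptions (illustrative, not taken from the patent's image formulas):
      * (xc, yc, zc) is the target displacement in the camera frame, metres;
      * alpha is the elevation of the target above the camera's XcZc plane;
      * beta_deg is the upward pitch of the camera's optical axis above the
        horizontal (from accelerometer + gyroscope), with zero roll, so the
        elevation above the horizontal XsYs plane is gamma = alpha + beta.
    """
    distance = math.sqrt(xc * xc + yc * yc + zc * zc)        # straight-line range to target
    alpha = math.atan2(yc, math.sqrt(xc * xc + zc * zc))     # elevation in the camera frame
    gamma = alpha + math.radians(beta_deg)                   # elevation in the standard frame
    return distance * math.sin(gamma)                        # vertical component z_s
```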
Then, the horizontal displacement of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment is calculated:
1.2.2.1) after the mobile AR equipment recognizes the Internet of things equipment, the displacement matrix of the target Internet of things equipment in the mobile AR equipment camera coordinate system is calculated from the acquired image information; from this displacement matrix, the line connecting the target Internet of things equipment to the origin of the camera coordinate system is projected onto the XcZc plane, and the angle θ between this projection and the Xc axis is calculated:
[formula image in original]
where xc, zc respectively represent the displacement of the target Internet of things equipment along the Xc axis and the Zc axis of the mobile AR equipment camera coordinate system;
1.2.2.2) the magnetic heading angle δ of the mobile AR equipment is calculated using the electronic compass of the mobile AR equipment, and the magnetic heading angle ε of the target Internet of things equipment is calculated from θ and δ:
[formula image in original]
1.2.2.3) from ε, the horizontal displacements xs and ys of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment are calculated:
[formula image in original]
[formula image in original]
Finally, xs, ys and zs are converted into the longitude difference, latitude difference and height difference of the target Internet of things equipment relative to the mobile AR equipment in global coordinates, and these differences are added to the global coordinate acquired by the mobile AR equipment through the global positioning system to obtain the global coordinate of the target Internet of things equipment.
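Likewise, the formulas for θ, ε, the horizontal displacements xs, ys and the final conversion to a global coordinate are given only as images. A hedged sketch of one consistent realization follows: the bearing of the target in the camera frame is combined with the compass heading δ, and the resulting east/north/up offsets are added to the GPS fix using a local spherical-Earth approximation. The axis conventions, function names and Earth-radius constant are assumptions for illustration.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius, used as a local approximation

def horizontal_displacement(xc: float, zc: float, delta_deg: float) -> tuple[float, float]:
    """Return (x_s, y_s): east and north offsets (metres) of the target.

    Assumptions (illustrative, not the patent's image formulas):
      * the camera looks along -Zc with Xc to the right, so the horizontal
        bearing of the target relative to the view direction is atan2(xc, -zc);
      * delta_deg is the device heading in degrees clockwise from north,
        so the absolute bearing of the target is epsilon = delta + theta;
      * the ground-plane range is approximated by hypot(xc, zc), which is
        exact when the device pitch is small.
    """
    horizontal_range = math.hypot(xc, zc)                  # range projected onto the ground plane
    theta = math.degrees(math.atan2(xc, -zc))              # bearing relative to the view direction
    epsilon = math.radians(delta_deg + theta)              # absolute bearing of the target
    x_s = horizontal_range * math.sin(epsilon)             # east offset
    y_s = horizontal_range * math.cos(epsilon)             # north offset
    return x_s, y_s

def offset_to_global(lat_deg: float, lon_deg: float, alt_m: float,
                     x_s: float, y_s: float, z_s: float) -> tuple[float, float, float]:
    """Add an east/north/up offset (metres) to a GPS fix (degrees, metres)."""
    dlat = math.degrees(y_s / EARTH_RADIUS_M)
    dlon = math.degrees(x_s / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon, alt_m + z_s
```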
In the steps 1.2) and 2.2), the sensors include a global positioning system, an accelerometer, a gyroscope, and an electronic compass.
The technical conception of the invention is as follows: when the Internet of things equipment is deployed, after the mobile AR equipment detects the Internet of things equipment through the camera, the global coordinate of the target Internet of things equipment is calculated through image data and various sensor data (including a global positioning system, an accelerometer, a gyroscope and an electronic compass), and the global coordinate and the unique ID of the target Internet of things equipment input by a user are stored in a mapping database. In the configuration and diagnosis process of the deployed Internet of things equipment, the Internet of things equipment can be detected through the mobile AR equipment camera, the global coordinate of the target Internet of things equipment is calculated in real time by utilizing image data and various sensor data, and the corresponding ID is searched in the mapping database, so that the individual Internet of things equipment is accurately identified.
The invention has the beneficial effects that: a simple and convenient method for uniquely identifying an individual of the Internet of things equipment by using the mobile AR equipment is provided, so that the configuration and diagnosis of the Internet of things equipment are simplified.
Drawings
Fig. 1 is a flowchart of a method for identifying a device of the internet of things without a mark based on a mobile AR according to the present invention.
Fig. 2 is a schematic diagram of calculating vertical displacement of target internet of things equipment.
Fig. 3 is a schematic diagram of horizontal displacement calculation of target internet of things equipment.
Detailed Description
The invention is further illustrated by the following figures and examples.
Referring to fig. 1 to 3, a method for identifying a label-free internet of things device based on mobile AR includes acquiring information during deployment and identifying the device after deployment;
wherein, the information acquisition during deployment comprises the following substeps:
1.1) after the Internet of things equipment is deployed, a user aims the mobile AR equipment at the deployed Internet of things equipment and starts to acquire information of target Internet of things equipment;
1.2) the mobile AR equipment calculates the global coordinate of the target Internet of things equipment through the image data and the sensor data;
1.3) storing the global coordinate obtained by calculation and the unique address of the target Internet of things equipment input by the user into a mapping database, and finishing information acquisition;
the post-deployment internet of things equipment identification comprises the following substeps:
2.1) the user starts the mobile AR equipment and starts the equipment identification of the Internet of things;
2.2) when the mobile AR equipment detects the Internet of things equipment through image data, then sensor data are integrated, and the global coordinate of the target Internet of things equipment is calculated;
and 2.3) retrieving in a mapping database according to the obtained global coordinates of the target Internet of things equipment, matching the unique address of the corresponding Internet of things target equipment, and finishing identification.
Referring to fig. 2, the vertical displacement of the target internet of things device in the standard coordinate system of the mobile AR device is calculated:
1.2.1.1) after the mobile AR equipment detects the Internet of things equipment, the displacement matrix of the target Internet of things equipment in the mobile AR equipment camera coordinate system is calculated from the acquired image information; from this displacement matrix, the angle α between the line connecting the origin of the camera coordinate system to the target Internet of things equipment and the XcZc plane of the mobile AR equipment camera coordinate system is calculated:
[formula image in original]
where xc, yc, zc represent the displacement of the target Internet of things equipment in the mobile AR equipment camera coordinate system;
1.2.1.2) using the accelerometer and gyroscope of the mobile AR equipment, the pitch angle β of the mobile AR equipment (i.e. the angle between the Yc axis of the mobile AR equipment camera coordinate system and the XsYs plane of the mobile AR equipment standard coordinate system) is calculated; then, from α and β, the angle γ between the line connecting the target Internet of things equipment to the mobile AR equipment and the XsYs plane of the mobile AR equipment standard coordinate system is calculated:
[formula image in original]
1.2.1.3) from γ, the vertical displacement zs of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment is calculated:
[formula image in original]
Referring to fig. 3, calculating the horizontal displacement of the target internet of things device in the standard coordinate system of the mobile AR device:
1.2.2.1) after the mobile AR equipment recognizes the Internet of things equipment, the displacement matrix of the target Internet of things equipment in the mobile AR equipment camera coordinate system is calculated from the acquired image information; from this displacement matrix, the line connecting the target Internet of things equipment to the origin of the camera coordinate system is projected onto the XcZc plane, and the angle θ between this projection and the Xc axis is calculated:
[formula image in original]
where xc, zc respectively represent the displacement of the target Internet of things equipment along the Xc axis and the Zc axis of the mobile AR equipment camera coordinate system;
1.2.2.2) the magnetic heading angle δ of the mobile AR equipment is calculated using the electronic compass of the mobile AR equipment, and the magnetic heading angle ε of the target Internet of things equipment is calculated from θ and δ:
[formula image in original]
1.2.2.3) from ε, the horizontal displacements xs and ys of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment are calculated:
[formula image in original]
[formula image in original]
Finally, xs, ys and zs are converted into the longitude difference, latitude difference and height difference of the target Internet of things equipment relative to the mobile AR equipment in global coordinates, and these differences are added to the global coordinate acquired by the mobile AR equipment through the global positioning system to obtain the global coordinate of the target Internet of things equipment.
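Taken together, the deployment-time and identification-time flows of Figs. 1 to 3 reduce to: camera-frame displacement plus IMU pitch gives zs, camera-frame displacement plus compass heading gives xs and ys, the GPS fix plus these offsets gives the global coordinate, and the mapping database resolves that coordinate to an ID. A short usage sketch reusing the illustrative helpers introduced earlier (all numeric values are invented for illustration):

```python
# Illustrative end-to-end flow; vertical_displacement, horizontal_displacement,
# offset_to_global and MappingDatabase are the sketch helpers defined above.

db = MappingDatabase(tolerance_m=1.0)

# --- at deployment time (steps 1.1 to 1.3) ---
xc, yc, zc = 0.3, 0.4, -2.0               # target displacement in the camera frame, metres
beta, delta = 5.0, 80.0                   # device pitch and compass heading, degrees
lat, lon, alt = 30.2300, 120.0800, 12.0   # GPS fix of the mobile AR device

z_s = vertical_displacement(xc, yc, zc, beta)
x_s, y_s = horizontal_displacement(xc, zc, delta)
target_coord = offset_to_global(lat, lon, alt, x_s, y_s, z_s)
db.register(target_coord, device_id="sensor-node-17")

# --- after deployment (steps 2.1 to 2.3) ---
# the same computation is repeated on a fresh detection and the ID is looked up
observed_coord = offset_to_global(lat, lon, alt, x_s, y_s, z_s)
print(db.identify(observed_coord))        # -> "sensor-node-17"
```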

Claims (2)

1. A label-free Internet of things equipment identification method based on mobile AR is characterized by comprising the steps of information acquisition during deployment and Internet of things equipment identification after deployment;
the deployment-time information acquisition comprises the following substeps:
1.1) after the Internet of things equipment is deployed, a user aims the mobile AR equipment at the deployed Internet of things equipment and starts to acquire information of target Internet of things equipment;
1.2) the mobile AR equipment calculates the global coordinate of the target Internet of things equipment through the image data and the sensor data;
1.3) storing the global coordinate obtained by calculation and the unique ID of the target Internet of things equipment input by the user into a mapping database, and finishing information acquisition;
the post-deployment internet of things device identification comprises the following substeps:
2.1) the user starts the mobile AR equipment and starts the equipment identification of the Internet of things;
2.2) when the mobile AR equipment detects the Internet of things equipment through image data, then sensor data are integrated, and the global coordinate of the target Internet of things equipment is calculated;
2.3) retrieving in a mapping database according to the obtained global coordinates of the target Internet of things equipment, matching the unique address of the corresponding Internet of things target equipment, and finishing identification;
in the steps 1.2) and 2.2), the calculation process of the global coordinate of the unmarked Internet of things equipment is as follows: firstly, the relative displacement of the target Internet of things equipment in a standard coordinate system of the mobile AR equipment is calculated, the standard coordinate system taking the mobile AR equipment as the origin, with the Xs axis pointing due east, the Ys axis pointing due north, and the Zs axis perpendicular to the ground; then the global coordinate of the target Internet of things equipment is calculated by combining the global coordinate of the mobile AR equipment;
in the steps 1.2) and 2.2), the calculation process of the global coordinate of the unmarked internet of things equipment is as follows:
firstly, the vertical displacement of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment is calculated, with the following steps:
1.2.1.1) after the mobile AR equipment detects the Internet of things equipment, the displacement matrix of the target Internet of things equipment in the mobile AR equipment camera coordinate system is calculated from the acquired image information; from this displacement matrix, the angle α between the line connecting the origin of the camera coordinate system to the target Internet of things equipment and the XcZc plane of the mobile AR equipment camera coordinate system is calculated:
[formula image in original]
where xc, yc, zc represent the displacement of the target Internet of things equipment in the mobile AR equipment camera coordinate system;
1.2.1.2) the pitch angle β of the mobile AR equipment, i.e. the angle between the Yc axis of the mobile AR equipment camera coordinate system and the XsYs plane of the mobile AR equipment standard coordinate system, is calculated using the accelerometer and gyroscope of the mobile AR equipment; then, from α and β, the angle γ between the line connecting the target Internet of things equipment to the mobile AR equipment and the XsYs plane of the mobile AR equipment standard coordinate system is calculated:
[formula image in original]
1.2.1.3) from γ, the vertical displacement zs of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment is calculated:
[formula image in original]
Then, the horizontal displacement of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment is calculated, with the following steps:
1.2.2.1) after the mobile AR equipment recognizes the Internet of things equipment, the displacement matrix of the target Internet of things equipment in the mobile AR equipment camera coordinate system is calculated from the acquired image information; from this displacement matrix, the line connecting the target Internet of things equipment to the origin of the camera coordinate system is projected onto the XcZc plane, and the angle θ between this projection and the Xc axis is calculated:
[formula image in original]
where xc, zc respectively represent the displacement of the target Internet of things equipment along the Xc axis and the Zc axis of the mobile AR equipment camera coordinate system;
1.2.2.2) the magnetic heading angle δ of the mobile AR equipment is calculated using the electronic compass of the mobile AR equipment, and the magnetic heading angle ε of the target Internet of things equipment is calculated from θ and δ:
[formula image in original]
1.2.2.3) from ε, the horizontal displacements xs and ys of the target Internet of things equipment in the standard coordinate system of the mobile AR equipment are calculated:
[formula image in original]
[formula image in original]
Finally, xs, ys and zs are converted into the longitude difference, latitude difference and height difference of the target Internet of things equipment relative to the mobile AR equipment in global coordinates, and these differences are added to the global coordinate acquired by the mobile AR equipment through the global positioning system to obtain the global coordinate of the target Internet of things equipment.
2. The method for identifying a device of the markerless internet of things based on mobile AR as claimed in claim 1, wherein in steps 1.2) and 2.2), the sensors comprise a global positioning system, an accelerometer, a gyroscope and an electronic compass.
CN201910230198.5A 2019-03-26 2019-03-26 Markless Internet of things equipment identification method based on mobile AR Active CN110135238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910230198.5A CN110135238B (en) 2019-03-26 2019-03-26 Markless Internet of things equipment identification method based on mobile AR

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910230198.5A CN110135238B (en) 2019-03-26 2019-03-26 Markless Internet of things equipment identification method based on mobile AR

Publications (2)

Publication Number Publication Date
CN110135238A CN110135238A (en) 2019-08-16
CN110135238B (en) 2021-04-06

Family

ID=67568862

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910230198.5A Active CN110135238B (en) 2019-03-26 2019-03-26 Markless Internet of things equipment identification method based on mobile AR

Country Status (1)

Country Link
CN (1) CN110135238B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102158556A (en) * 2011-04-08 2011-08-17 南京邮电大学 Method for implementing intelligent property management system in Internet of things environment
CN102263800A (en) * 2010-05-26 2011-11-30 四川大学 Geographic information internet of things system constructed based on cloud computing, and application
CN103175524A (en) * 2013-02-20 2013-06-26 清华大学 Visual-sense-based aircraft position and attitude determination method under mark-free environment
CN103500452A (en) * 2013-10-12 2014-01-08 杭州师范大学 Scenic spot scenery moving augmented reality method based on space relationship and image analysis
CN105760106A (en) * 2016-03-08 2016-07-13 网易(杭州)网络有限公司 Interaction method and interaction device of intelligent household equipment
CN106225778A (en) * 2016-07-18 2016-12-14 北京邮电大学 The vision localization research of random fluctuation in multimedium propagation
CN206974440U (en) * 2017-07-25 2018-02-06 广州阿路比电子科技有限公司 A kind of 6DOF locus and the detecting system of posture

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180101746A (en) * 2017-03-06 2018-09-14 삼성전자주식회사 Method, electronic device and system for providing augmented reality contents

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102263800A (en) * 2010-05-26 2011-11-30 四川大学 Geographic information internet of things system constructed based on cloud computing, and application
CN102158556A (en) * 2011-04-08 2011-08-17 南京邮电大学 Method for implementing intelligent property management system in Internet of things environment
CN103175524A (en) * 2013-02-20 2013-06-26 清华大学 Visual-sense-based aircraft position and attitude determination method under mark-free environment
CN103500452A (en) * 2013-10-12 2014-01-08 杭州师范大学 Scenic spot scenery moving augmented reality method based on space relationship and image analysis
CN105760106A (en) * 2016-03-08 2016-07-13 网易(杭州)网络有限公司 Interaction method and interaction device of intelligent household equipment
CN106225778A (en) * 2016-07-18 2016-12-14 北京邮电大学 The vision localization research of random fluctuation in multimedium propagation
CN206974440U (en) * 2017-07-25 2018-02-06 广州阿路比电子科技有限公司 A kind of 6DOF locus and the detecting system of posture

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A markerless augmented reality system using one-shot structured light; Huang Bingyao; Theses and Dissertations; 2015-12-31; pp. 1-10 *
EPC identification technology in IoT systems based on RFID (基于RFID物联网系统的EPC识别技术); 朱保锋 et al.; Journal of Henan Institute of Education (Natural Science Edition); 2012-12-31; Vol. 21, No. 4; pp. 5-7 *
Passive target localization algorithm in multimedia sensor networks (多媒体传感器网络中被动目标定位算法); 张波 et al.; Journal of Southeast University (Natural Science Edition); 2011-03-31; Vol. 41, No. 2; pp. 266-269 *

Also Published As

Publication number Publication date
CN110135238A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
CN106793086B (en) Indoor positioning method
CN104936283B (en) Indoor orientation method, server and system
CN106529538A (en) Method and device for positioning aircraft
CN111199564A (en) Indoor positioning method and device of intelligent mobile terminal and electronic equipment
CN111091587B (en) Low-cost motion capture method based on visual markers
CN106249267A (en) A kind of target location tracking method and device
WO2018207426A1 (en) Information processing device, information processing method, and program
CN106470478B (en) Positioning data processing method, device and system
CN112308930B (en) Camera external parameter calibration method, system and device
CN117876723B (en) Unmanned aerial vehicle aerial image global retrieval positioning method under refusing environment
CN109977853B (en) Underground worker panoramic monitoring method based on multiple identification devices
CN108512888B (en) Information labeling method, cloud server, system and electronic equipment
CN111340884B (en) Dual-target positioning and identity identification method for binocular heterogeneous camera and RFID
CN110135238B (en) Markless Internet of things equipment identification method based on mobile AR
US11741151B1 (en) Indexing key frames for localization
Arth et al. Full 6dof pose estimation from geo-located images
Karimi et al. A new method for automatic and accurate coded target recognition in oblique images to improve augmented reality precision
Feng et al. Image-based localization for facilitating construction field reporting on mobile devices
CN110268438B (en) Image database construction device, position and inclination estimation device, and image database construction method
CN111402324A (en) Target measuring method, electronic equipment and computer storage medium
Zhang et al. Smart phone camera image localization method for narrow corridors based on epipolar geometry
Qiu et al. A new monocular vision simultaneous localization and mapping process for high-precision positioning in structured indoor environments
CN116266402A (en) Automatic object labeling method and device, electronic equipment and storage medium
CN114513746B (en) Indoor positioning method integrating triple vision matching model and multi-base station regression model
CN110826568A (en) Indoor positioning method and device based on road signs and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant