CN110135238A - Marker-free Internet of Things device identification method based on mobile AR - Google Patents

Marker-free Internet of Things device identification method based on mobile AR

Info

Publication number
CN110135238A
CN110135238A
Authority
CN
China
Prior art keywords
equipment
mobile
internet
things equipment
things
Prior art date
Legal status
Granted
Application number
CN201910230198.5A
Other languages
Chinese (zh)
Other versions
CN110135238B (en)
Inventor
夏明�
向文秀
Current Assignee
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN201910230198.5A priority Critical patent/CN110135238B/en
Publication of CN110135238A publication Critical patent/CN110135238A/en
Application granted granted Critical
Publication of CN110135238B publication Critical patent/CN110135238B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 - Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/10 - Navigation by using measurements of speed or acceleration
    • G01C21/12 - Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Inertial navigation combined with non-inertial navigation instruments
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 - Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 - Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 - Systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 - Determining position
    • G01S19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 - Determining position with the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval of structured data, e.g. relational data
    • G06F16/29 - Geographical information databases
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/20 - Scenes; Scene-specific elements in augmented reality scenes

Abstract

A marker-free Internet of Things (IoT) device identification method based on mobile AR, comprising two steps: IoT device information collection at deployment time and IoT device identification after deployment. When an IoT device is deployed, after the mobile AR device detects the IoT device through its camera, it computes the global coordinates of the target IoT device from the image data and multiple sensor readings (from the global positioning system, accelerometer, gyroscope and electronic compass), and stores these coordinates together with the target IoT device ID entered by the user in a mapping database. After deployment, an IoT device can again be detected by the camera of the mobile AR device; the global coordinates of the target IoT device are computed in real time from the image data and the sensor readings, and the corresponding ID is looked up in the mapping database, so that the individual IoT device is accurately identified. The present invention provides a simple and effective IoT device identification method that does not rely on per-device markers, and facilitates IoT device configuration and diagnosis.

Description

Marker-free Internet of Things device identification method based on mobile AR
Technical field
The present invention relates to Internet of Things (IoT) device identification methods, and in particular to a marker-free IoT device identification method based on mobile AR.
Background art
Target recognition algorithms are widely used to identify specific objects in complex image scenes. At present, image-based target recognition algorithms are mainly used to distinguish objects of different categories. However, IoT devices may have identical appearances, while identifying the individual IoT device is the key to interacting with it, including operations such as configuration and diagnosis. Image-based target recognition algorithms alone cannot achieve this goal.
To identify individual IoT devices, recognition methods based on features such as RFID, barcodes and wireless signals have been proposed. Reference [1] (Zhu B F, Gui Q. EPC Identification Technology in IOT System Based on RFID [J]. Journal of Henan Institute of Education, 2012), an EPC identification technology for RFID-based IoT systems, and reference [2] (Montaser A, Moselhi O, et al. RFID indoor location identification for construction projects [J]. Automation in Construction, 2014, 39(4): 167-179) give each device a unique RFID tag to achieve identification; the deployment cost is therefore high, and the recognition accuracy is easily disturbed by signal interference. Reference [3] (Shatte A, Holdsworth J, Lee I. Mobile augmented reality based context-aware library management system [J]. Expert Systems with Applications, 2014, 41(5): 2174-2185) and reference [4] (Haramaki T, Nishino H. A network topology visualization system based on mobile AR technology [C]. IEEE International Conference on Advanced Information Networking & Applications, IEEE, 2015) distinguish individuals by printing a unique barcode on each IoT device, which likewise leads to a high deployment cost; in addition, barcodes exposed to the environment for a long time are easily damaged, which degrades recognition. Reference [5] (Haramaki T, Shimizu D, Nishino H. A wireless network visualizer based on signal strength observation [C]. IEEE International Conference on Consumer Electronics-Taiwan, IEEE, 2017), reference [6] (Haramaki T, Nishino H. A device identification method for AR-based network topology visualization [C]. International Conference on Broadband & Wireless Computing, IEEE, 2015), reference [7] (Mandeljc R, Kristan M, et al. Tracking by identification using computer vision and radio [J]. Sensors, 2012, 13(1): 241-273) and reference [8] (Haramaki T. A sensor fusion approach for network visualization [C]. 2016 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), IEEE, 2016) judge device position from the strength of wireless signals. Among these, [6] and [7] additionally combine image recognition to improve identification accuracy, and [8] builds on [6] by using accelerometer data to judge the user's posture (head raised or lowered) and thereby roughly distinguish devices in the vertical direction. However, wireless signals are vulnerable to environmental interference, so positioning based on their strength is inaccurate; moreover, these techniques are mainly intended for managing network devices in a small area and can hardly achieve wide-area IoT device identification. Reference [9] (Oguntala G, Abd-Alhameed R, Jones S, et al. Indoor location identification technologies for real-time IoT-based applications: An inclusive survey [J]. Computer Science Review, 2018, 30: 55-79) identifies devices using magnetic field data, but this method is easily affected by magnetic field interference.
The identification of a target IoT device can be abstracted as the problem of locating that device. Reference [10] (Zhang Bo, Luo Haiyong, et al. Passive target localization algorithm in multimedia sensor networks [J]. Journal of Southeast University (Natural Science Edition), 2011, 41(2): 266-269) identifies targets by co-locating them with multiple cameras. The difference from the present scheme is that it uses several fixed cameras, whereas the present scheme needs only a single mobile AR device to achieve identification. Reference [11] (Oztarak H, Akkaya K, Yazici A. Lightweight object localization with a single camera in wireless multimedia sensor networks [C]. IEEE Conference on Global Telecommunications, IEEE, 2009) performs target localization from images captured by a camera, but it also uses a fixed camera and focuses mainly on computing the two-dimensional position offset of the target relative to that camera.
Summary of the invention
To overcome the poor accuracy of existing target IoT device identification methods, the present invention proposes a marker-free IoT device identification method based on mobile AR. The method uses the camera of a mobile AR device to detect IoT devices of the same type, then fuses multiple sensor readings (from the global positioning system, accelerometer, gyroscope and electronic compass) to compute the global coordinates of the individual target IoT device, and obtains the unique ID of the target IoT device by matching against a database.
The technical solution adopted by the present invention to solve the technical problem is as follows:
A marker-free IoT device identification method based on mobile AR, comprising information collection at deployment time and IoT device identification after deployment;
The information collection at deployment time comprises the following sub-steps:
1.1) after an IoT device is deployed, the user points the mobile AR device at the deployed IoT device and starts information collection for the target IoT device;
1.2) the mobile AR device computes the global coordinates of the target IoT device from the image data and the sensor data;
1.3) the computed global coordinates and the unique address of the target IoT device entered by the user are stored in a mapping database, completing the information collection;
The IoT device identification after deployment comprises the following sub-steps:
2.1) the user opens the mobile AR device and starts IoT device identification;
2.2) when the mobile AR device detects an IoT device in the image data, it combines the sensor data to compute the global coordinates of the target IoT device;
2.3) using the obtained global coordinates of the target IoT device, the mapping database is searched and the matching unique address of the target IoT device is returned, completing the identification (a workflow sketch follows this list).
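Operationally, the two phases reduce to writing and then querying a mapping database keyed by global coordinates. The following Python sketch illustrates that workflow under stated assumptions: the MappingDatabase class, the compute_global_coordinates() interface, the 2 m matching tolerance and the spherical-Earth distance are all illustrative choices, not details taken from the patent.

import math

EARTH_RADIUS_M = 6378137.0  # assumed spherical-Earth radius used for metre/degree conversion


class MappingDatabase:
    """Maps global coordinates (latitude, longitude, height) to a device's unique address (ID)."""

    def __init__(self):
        self.records = []  # list of ((lat_deg, lon_deg, height_m), device_id)

    def add(self, coords, device_id):
        self.records.append((coords, device_id))

    def lookup(self, coords, tolerance_m=2.0):
        """Return the ID of the stored device closest to coords, if it lies within
        tolerance_m metres (the threshold is an illustrative assumption)."""
        lat, lon, height = coords
        best_id, best_dist = None, float("inf")
        for (r_lat, r_lon, r_height), device_id in self.records:
            d_north = math.radians(lat - r_lat) * EARTH_RADIUS_M
            d_east = math.radians(lon - r_lon) * EARTH_RADIUS_M * math.cos(math.radians(r_lat))
            d_up = height - r_height
            dist = math.sqrt(d_north ** 2 + d_east ** 2 + d_up ** 2)
            if dist < best_dist:
                best_id, best_dist = device_id, dist
        return best_id if best_dist <= tolerance_m else None


def collect_device(db, ar_device, device_id):
    """Deployment phase (steps 1.1-1.3): store coordinates -> ID in the mapping database."""
    coords = ar_device.compute_global_coordinates()  # image + GPS/IMU/compass fusion (assumed interface)
    db.add(coords, device_id)


def identify_device(db, ar_device):
    """Post-deployment phase (steps 2.1-2.3): recompute the coordinates and look up the ID."""
    coords = ar_device.compute_global_coordinates()
    return db.lookup(coords)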
Further, in steps 1.2) and 2.2), the global coordinates of the marker-free IoT device are computed as follows: first, the relative displacement of the target IoT device in the standard coordinate system of the mobile AR device is computed, where the mobile AR device is the origin, the X_s axis points due east, the Y_s axis points due north and the Z_s axis is perpendicular to the ground; then, combining this displacement with the global coordinates of the mobile AR device, the global coordinates of the target IoT device are computed.
In steps 1.2) and 2.2), the global coordinates of the marker-free IoT device are computed as follows:
First, the vertical displacement of the target IoT device in the standard coordinate system of the mobile AR device is computed:
1.2.1.1) after the mobile AR device detects an IoT device, the displacement of the target IoT device in the camera coordinate system of the mobile AR device is computed from the captured image; from this displacement, denoted x_c, y_c, z_c, the angle α formed between the line joining the camera coordinate origin and the target IoT device and the X_cZ_c plane of the camera coordinate system is computed;
1.2.1.2) using the accelerometer and gyroscope of the mobile AR device, the pitch angle β of the mobile AR device is computed, i.e. the angle between the Y_c axis of the camera coordinate system and the X_sY_s plane of the standard coordinate system; from α and β, the angle γ between the line joining the target IoT device and the mobile AR device and the X_sY_s plane of the standard coordinate system is computed;
1.2.1.3) from γ, the vertical displacement z_s of the target IoT device in the standard coordinate system of the mobile AR device is computed (a reconstruction of these formulas is sketched after this step).
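The formulas for α, γ and z_s appear only as images in the original publication and are not reproduced in the text above. A plausible reconstruction from the definitions of x_c, y_c, z_c, α, β and γ, assuming the elevation angles simply add and the range is taken from the camera-frame displacement, is:

\[
\alpha = \arctan\frac{y_c}{\sqrt{x_c^{2} + z_c^{2}}}, \qquad
\gamma = \alpha + \beta, \qquad
z_s = \sqrt{x_c^{2} + y_c^{2} + z_c^{2}} \, \sin\gamma .
\]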
Then, the horizontal displacement of the target IoT device in the standard coordinate system of the mobile AR device is computed:
1.2.2.1) after the mobile AR device detects an IoT device, the displacement of the target IoT device in the camera coordinate system of the mobile AR device is computed from the captured image; from this displacement, the angle θ between the X_c axis and the projection onto the X_cZ_c plane of the line joining the target IoT device and the coordinate origin is computed, where x_c and z_c denote the displacement of the target IoT device along the X_c and Z_c axes of the camera coordinate system;
1.2.2.2) using the electronic compass of the mobile AR device, the magnetic heading angle δ of the mobile AR device is computed, and from θ and δ the magnetic heading angle ε of the target IoT device is computed;
1.2.2.3) from ε, the horizontal displacements x_s, y_s of the target IoT device in the standard coordinate system of the mobile AR device are computed (a reconstruction of these formulas is sketched after this step).
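As with the vertical displacement, the original formulas are images only. A plausible reconstruction, assuming θ is measured in the X_cZ_c plane from the X_c axis, ε combines the device heading δ with θ up to a fixed 90° offset, and the horizontal range r is the projection of the camera-frame distance onto the horizontal plane, is:

\[
\theta = \arctan\frac{z_c}{x_c}, \qquad
\varepsilon = \delta + \theta - 90^{\circ}, \qquad
r = \sqrt{x_c^{2} + y_c^{2} + z_c^{2}} \, \cos\gamma, \qquad
x_s = r \sin\varepsilon, \qquad
y_s = r \cos\varepsilon .
\]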
Finally, x_s, y_s, z_s are converted into the longitude difference, latitude difference and height difference of the target IoT device relative to the mobile AR device in global coordinates, and added to the global coordinates of the mobile AR device obtained from the global positioning system, yielding the global coordinates of the target IoT device (a conversion sketch follows).
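A minimal sketch of this final conversion, assuming a local spherical-Earth approximation for turning east/north offsets into longitude/latitude differences (the patent does not state which geodetic model is used):

import math

EARTH_RADIUS_M = 6378137.0  # assumed spherical-Earth radius (WGS-84 equatorial value)


def local_offset_to_global(ar_lat, ar_lon, ar_height, x_s, y_s, z_s):
    """Add an east/north/up offset (x_s, y_s, z_s, in metres) expressed in the mobile AR
    device's standard coordinate system to the device's GPS fix, returning the target's
    global coordinates (latitude and longitude in degrees, height in metres)."""
    d_lat = math.degrees(y_s / EARTH_RADIUS_M)  # north offset -> latitude difference
    d_lon = math.degrees(x_s / (EARTH_RADIUS_M * math.cos(math.radians(ar_lat))))  # east offset -> longitude difference
    return ar_lat + d_lat, ar_lon + d_lon, ar_height + z_s


# Example: a target 3 m east, 4 m north and 1.5 m above the AR device
print(local_offset_to_global(30.2936, 120.1614, 10.0, 3.0, 4.0, 1.5))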
In steps 1.2) and 2.2), the sensors include the global positioning system, an accelerometer, a gyroscope and an electronic compass.
The technical concept of the invention is as follows: when an IoT device is deployed, after the mobile AR device detects the IoT device through its camera, it computes the global coordinates of the target IoT device from the image data and multiple sensor readings (from the global positioning system, accelerometer, gyroscope and electronic compass), and stores these coordinates together with the unique ID of the target IoT device entered by the user in a mapping database. When configuring or diagnosing IoT devices after deployment, an IoT device can again be detected through the camera of the mobile AR device; the global coordinates of the target IoT device are computed in real time from the image data and the sensor readings, and the corresponding ID is looked up in the mapping database, so that the individual IoT device is accurately identified.
The beneficial effect of the present invention is that it provides a simple method for uniquely identifying individual IoT devices with a mobile AR device, thereby simplifying IoT device configuration and diagnosis.
Brief description of the drawings
Fig. 1 is a flow chart of the marker-free IoT device identification method based on mobile AR according to the invention.
Fig. 2 is a schematic diagram of the calculation of the vertical displacement of the target IoT device.
Fig. 3 is a schematic diagram of the calculation of the horizontal displacement of the target IoT device.
Specific embodiment
The present invention will be further described below with reference to the accompanying drawings and embodiments.
Referring to Fig. 1 to Fig. 3, a marker-free IoT device identification method based on mobile AR comprises information collection at deployment time and device identification after deployment;
The information collection at deployment time comprises the following sub-steps:
1.1) after an IoT device is deployed, the user points the mobile AR device at the deployed IoT device and starts information collection for the target IoT device;
1.2) the mobile AR device computes the global coordinates of the target IoT device from the image data and the sensor data;
1.3) the computed global coordinates and the unique address of the target IoT device entered by the user are stored in a mapping database, completing the information collection;
The IoT device identification after deployment comprises the following sub-steps:
2.1) the user opens the mobile AR device and starts IoT device identification;
2.2) when the mobile AR device detects an IoT device in the image data, it combines the sensor data to compute the global coordinates of the target IoT device;
2.3) using the obtained global coordinates of the target IoT device, the mapping database is searched and the matching unique address of the target IoT device is returned, completing the identification.
Referring to Fig. 2, the vertical displacement of the target IoT device in the standard coordinate system of the mobile AR device is computed:
1.2.1.1) after the mobile AR device detects an IoT device, the displacement of the target IoT device in the camera coordinate system of the mobile AR device is computed from the captured image; from this displacement, denoted x_c, y_c, z_c, the angle α formed between the line joining the camera coordinate origin and the target IoT device and the X_cZ_c plane of the camera coordinate system is computed;
1.2.1.2) using the accelerometer and gyroscope of the mobile AR device, the pitch angle β of the mobile AR device is computed, i.e. the angle between the Y_c axis of the camera coordinate system and the X_sY_s plane of the standard coordinate system; from α and β, the angle γ between the line joining the target IoT device and the mobile AR device and the X_sY_s plane of the standard coordinate system is computed;
1.2.1.3) from γ, the vertical displacement z_s of the target IoT device in the standard coordinate system of the mobile AR device is computed.
Referring to Fig. 3, the horizontal displacement of the target IoT device in the standard coordinate system of the mobile AR device is computed:
1.2.2.1) after the mobile AR device detects an IoT device, the displacement of the target IoT device in the camera coordinate system of the mobile AR device is computed from the captured image; from this displacement, the angle θ between the X_c axis and the projection onto the X_cZ_c plane of the line joining the target IoT device and the coordinate origin is computed, where x_c and z_c denote the displacement of the target IoT device along the X_c and Z_c axes of the camera coordinate system;
1.2.2.2) using the electronic compass of the mobile AR device, the magnetic heading angle δ of the mobile AR device is computed, and from θ and δ the magnetic heading angle ε of the target IoT device is computed;
1.2.2.3) from ε, the horizontal displacements x_s, y_s of the target IoT device in the standard coordinate system of the mobile AR device are computed;
Finally, x_s, y_s, z_s are converted into the longitude difference, latitude difference and height difference of the target IoT device relative to the mobile AR device in global coordinates, and added to the global coordinates of the mobile AR device obtained from the global positioning system, yielding the global coordinates of the target IoT device.

Claims (4)

1. A marker-free IoT device identification method based on mobile AR, characterized in that the method comprises information collection at deployment time and IoT device identification after deployment;
The information collection at deployment time comprises the following sub-steps:
1.1) after an IoT device is deployed, the user points the mobile AR device at the deployed IoT device and starts information collection for the target IoT device;
1.2) the mobile AR device computes the global coordinates of the target IoT device from the image data and the sensor data;
1.3) the computed global coordinates and the unique address of the target IoT device entered by the user are stored in a mapping database, completing the information collection;
The IoT device identification after deployment comprises the following sub-steps:
2.1) the user opens the mobile AR device and starts IoT device identification;
2.2) when the mobile AR device detects an IoT device in the image data, it combines the sensor data to compute the global coordinates of the target IoT device;
2.3) using the obtained global coordinates of the target IoT device, the mapping database is searched and the matching unique address of the target IoT device is returned, completing the identification.
2. The marker-free IoT device identification method based on mobile AR according to claim 1, characterized in that in steps 1.2) and 2.2), the global coordinates of the marker-free IoT device are computed as follows: first, the relative displacement of the target IoT device in the standard coordinate system of the mobile AR device is computed, where the mobile AR device is the origin, the X_s axis points due east, the Y_s axis points due north and the Z_s axis is perpendicular to the ground; then, combining this displacement with the global coordinates of the mobile AR device, the global coordinates of the target IoT device are computed.
3. The marker-free IoT device identification method based on mobile AR according to claim 2, characterized in that in steps 1.2) and 2.2), the global coordinates of the marker-free IoT device are computed as follows:
First, the vertical displacement of the target IoT device in the standard coordinate system of the mobile AR device is computed, by the following steps:
1.2.1.1) after the mobile AR device detects an IoT device, the displacement of the target IoT device in the camera coordinate system of the mobile AR device is computed from the captured image; from this displacement, denoted x_c, y_c, z_c, the angle α formed between the line joining the camera coordinate origin and the target IoT device and the X_cZ_c plane of the camera coordinate system is computed;
1.2.1.2) using the accelerometer and gyroscope of the mobile AR device, the pitch angle β of the mobile AR device is computed, i.e. the angle between the Y_c axis of the camera coordinate system and the X_sY_s plane of the standard coordinate system; from α and β, the angle γ between the line joining the target IoT device and the mobile AR device and the X_sY_s plane of the standard coordinate system is computed;
1.2.1.3) from γ, the vertical displacement z_s of the target IoT device in the standard coordinate system of the mobile AR device is computed;
Then, the horizontal displacement of the target IoT device in the standard coordinate system of the mobile AR device is computed, by the following steps:
1.2.2.1) after the mobile AR device detects an IoT device, the displacement of the target IoT device in the camera coordinate system of the mobile AR device is computed from the captured image; from this displacement, the angle θ between the X_c axis and the projection onto the X_cZ_c plane of the line joining the target IoT device and the coordinate origin is computed, where x_c and z_c denote the displacement of the target IoT device along the X_c and Z_c axes of the camera coordinate system;
1.2.2.2) using the electronic compass of the mobile AR device, the magnetic heading angle δ of the mobile AR device is computed, and from θ and δ the magnetic heading angle ε of the target IoT device is computed;
1.2.2.3) from ε, the horizontal displacements x_s, y_s of the target IoT device in the standard coordinate system of the mobile AR device are computed;
Finally, x_s, y_s, z_s are converted into the longitude difference, latitude difference and height difference of the target IoT device relative to the mobile AR device in global coordinates, and added to the global coordinates of the mobile AR device obtained from the global positioning system, yielding the global coordinates of the target IoT device.
4. The marker-free IoT device identification method based on mobile AR according to any one of claims 1 to 3, characterized in that in steps 1.2) and 2.2), the sensors comprise a global positioning system, an accelerometer, a gyroscope and an electronic compass.
CN201910230198.5A 2019-03-26 2019-03-26 Markless Internet of things equipment identification method based on mobile AR Active CN110135238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910230198.5A CN110135238B (en) 2019-03-26 2019-03-26 Markless Internet of things equipment identification method based on mobile AR

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910230198.5A CN110135238B (en) 2019-03-26 2019-03-26 Markless Internet of things equipment identification method based on mobile AR

Publications (2)

Publication Number Publication Date
CN110135238A true CN110135238A (en) 2019-08-16
CN110135238B CN110135238B (en) 2021-04-06

Family

ID=67568862

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910230198.5A Active CN110135238B (en) 2019-03-26 2019-03-26 Markless Internet of things equipment identification method based on mobile AR

Country Status (1)

Country Link
CN (1) CN110135238B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102263800A (en) * 2010-05-26 2011-11-30 四川大学 Geographic information internet of things system constructed based on cloud computing, and application
CN102158556A (en) * 2011-04-08 2011-08-17 南京邮电大学 Method for implementing intelligent property management system in Internet of things environment
CN103175524A (en) * 2013-02-20 2013-06-26 清华大学 Visual-sense-based aircraft position and attitude determination method under mark-free environment
CN103500452A (en) * 2013-10-12 2014-01-08 杭州师范大学 Scenic spot scenery moving augmented reality method based on space relationship and image analysis
CN105760106A (en) * 2016-03-08 2016-07-13 网易(杭州)网络有限公司 Interaction method and interaction device of intelligent household equipment
CN106225778A (en) * 2016-07-18 2016-12-14 北京邮电大学 The vision localization research of random fluctuation in multimedium propagation
US20180253601A1 (en) * 2017-03-06 2018-09-06 Samsung Electronics Co., Ltd. Method of providing augmented reality content, and electronic device and system adapted to the method
CN206974440U (en) * 2017-07-25 2018-02-06 广州阿路比电子科技有限公司 A kind of 6DOF locus and the detecting system of posture

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUANG BINGYAO: "A markerless augmented reality system using one-shot structured light", Theses and Dissertations *
ZHANG BO et al.: "Passive target localization algorithm in multimedia sensor networks", Journal of Southeast University (Natural Science Edition) *
ZHU BAOFENG et al.: "EPC identification technology based on the RFID Internet of Things system", Journal of Henan Institute of Education (Natural Science Edition) *

Also Published As

Publication number Publication date
CN110135238B (en) 2021-04-06

Similar Documents

Publication Publication Date Title
CN106793086B (en) Indoor positioning method
CN111091587B (en) Low-cost motion capture method based on visual markers
CN106952289B (en) The WiFi object localization method analyzed in conjunction with deep video
CN108692701B (en) Mobile robot multi-sensor fusion positioning method based on particle filter
CN101916437A (en) Method and system for positioning target based on multi-visual information
CN106845392B (en) Indoor corner landmark matching and identifying method based on crowdsourcing track
CN104090263B (en) Localization method, system based on RFID technique
CN110675453B (en) Self-positioning method for moving target in known scene
CN109029429B (en) WiFi and geomagnetic fingerprint based multi-classifier global dynamic fusion positioning method
US10949579B2 (en) Method and apparatus for enhanced position and orientation determination
KR101757884B1 (en) Apparatus for providing circumstance information based on battlefield situation awareness and method thereof
CN112823321A (en) Position positioning system and method for mixing position identification results based on multiple types of sensors
Shangguan et al. Towards accurate object localization with smartphones
JP5843288B2 (en) Information presentation system
KR101720097B1 (en) User device locating method and apparatus for the same
CN108512888B (en) Information labeling method, cloud server, system and electronic equipment
CN109977853B (en) Underground worker panoramic monitoring method based on multiple identification devices
Daniş et al. An indoor localization dataset and data collection framework with high precision position annotation
Chen et al. MeshMap: A magnetic field-based indoor navigation system with crowdsourcing support
Henriques Abreu et al. Using Kalman filters to reduce noise from RFID location system
Singh et al. Ubiquitous hybrid tracking techniques for augmented reality applications
Wang et al. iNavigation: an image based indoor navigation system
WO2016072240A1 (en) Information processing device for processing sensor information, information management system, information display system, information processing method, program, recording medium, and server device
CN111340884A (en) Binocular heterogeneous camera and RFID dual target positioning and identity identification method
CN110135238A (en) A kind of marker free networked devices recognition methods based on mobile AR

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant