CN108650245A - Internet of things system based on augmented reality and operation method - Google Patents
- Publication number
- CN108650245A CN108650245A CN201810375417.4A CN201810375417A CN108650245A CN 108650245 A CN108650245 A CN 108650245A CN 201810375417 A CN201810375417 A CN 201810375417A CN 108650245 A CN108650245 A CN 108650245A
- Authority
- CN
- China
- Prior art keywords
- internet
- module
- data
- things
- augmented reality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
Abstract
The present invention discloses an Internet of Things system based on augmented reality and a method for operating it. The system comprises at least four kinds of modules: Internet of Things nodes, a server module, terminal modules and a control platform, wherein: the Internet of Things nodes are deployed across a wide-area space, are each equipped with an identification code and sensors, and exchange data bidirectionally with the other modules; the server module performs data storage, analysis, processing and transmission, carries out database processing, analysis and anomaly detection on the data, and transmits data and reports; the terminal module acquires signals and sensor data, shoots video or images, and realizes local computation and two-way communication; and the control platform performs information extraction, display, user interaction and processing analysis, and supervises the Internet of Things nodes and terminal modules and transmits instructions to them. The invention is applicable to Internet of Things nodes in applications such as smart cities, campuses, warehousing and pipelines, and to the display, interactive operation and data updating of terminal devices. It effectively integrates status monitoring of Internet of Things nodes, analysis and scheduling by the control platform, and AR interactive operation at the terminal, accelerating the anomaly-reporting workflow and safeguarding system security.
Description
Technical field
The present invention relates to the fields of the Internet of Things and augmented reality, and in particular to an Internet of Things system based on augmented reality and a method of operating it.
Background technology
With the development of technologies such as the mobile Internet, cloud computing and micro-electro-mechanical sensors, we are moving from today's "Internet of Things" (IoT) era into an era of the "Internet of Everything" (IoE), in which every object gains context awareness, enhanced processing power and better sensing capability. By adding people and information to the Internet, networks of billions or even trillions of connections can be built. The data and information generated by these connections can effectively drive the transformation and upgrading of traditional industries and the development of new industries, and are an important internal driving force for economic development.
How to convert massive Internet of Things data, conveniently and effectively, into information that people can perceive, so as to serve daily life, is a problem the industry has yet to solve. Only when Internet of Things data can connect with and relate to people's lives can its value be fully realized. At present, the Internet of Things is still mostly at the stage of connecting objects with objects: node spatial information (such as position, velocity and attitude) and data are generally obtained through a sensor network; the spatial information then connects the nodes in space and in time, while technologies such as big-data analysis and machine learning screen, analyse and process the related data to extract more valuable information. With the development of technology and growing public demand, however, the above schemes and functions are no longer sufficient, and there is an increasing need to add the attribute of "people" to the Internet of Things. This attribute is not limited to letting users enjoy the convenience brought by Internet of Things big data; it also requires people to participate in and perceive the Internet of Things system more and more, and to interact with it through the senses (such as vision, hearing and touch). This is the goal the present invention aims to achieve and improve upon.
Through its development in recent years, augmented reality (AR) has become an important branch of the field of virtual reality. Compared with traditional virtual-reality technology, AR organically combines the real world with a virtual environment, and can therefore provide better realistic texture in the virtual environment and interact with reality. At the same time, the cost of AR technology is relatively low; it can even be realized on mobile intelligent terminals such as smartphones and tablets, which opens up huge application prospects. Because of these advantages, AR technology has found preliminary application in fields such as the military, entertainment, education, tourism and medical treatment. In the Internet of Things field, however, applications are still relatively few, and no practical application model has yet emerged.
In order to apply AR technology effectively to the Internet of Things field with a good application experience, it is imperative to invent a new and effective Internet of Things system and method that realize the fusion and synergistic application of AR technology and the Internet of Things.
Summary of the invention
The purpose of the present invention is to provide an Internet of Things system based on augmented-reality technology, and an operation method, that realize the fusion and synergistic application of AR technology and the Internet of Things.
In accordance with the above purpose, the present invention provides an Internet of Things system based on augmented reality, comprising at least four kinds of modules: Internet of Things nodes, a server module, terminal modules and a control platform, wherein: the Internet of Things nodes are deployed in a wide-area space, are equipped with identification codes and sensors, and exchange data bidirectionally with the other modules; the server module performs data storage, analysis, processing and transmission, carries out database processing, analysis and anomaly detection on the data, and transmits data and reports; the terminal module acquires signals and sensor data, shoots video or images, and realizes local computation and two-way communication; and the control platform performs information extraction, display, user interaction and processing analysis, and supervises the Internet of Things nodes and terminal modules and transmits instructions to them.
In the Internet of Things system based on augmented reality, the server module is provided with a database, which includes at least a node database, a geographic-information-system (GIS) database and a location-and-navigation database. The server module keeps in communication with the Internet of Things nodes, the terminal modules and the control platform; the Internet of Things nodes, server module, terminal modules and control platform are each provided with memory, a hard disk, a processor and a communication module.
In the Internet of Things system based on augmented reality, the terminal module further includes a sensor module and a display-and-user-interaction module. The sensor module includes wireless sensors and motion sensors. The terminal module periodically acquires wireless signals and motion-sensor data, and performs local computation, including position and attitude determination and AR scene generation.
The Internet of Things system based on augmented reality further calibrates the motion sensors of the terminal module and the camera that shoots the video or images, obtaining the camera parameters, the sensor error parameters and the misalignment angles between the axes of the camera and of the sensors; the calibration results are stored and used to compensate the image and sensor data.
In the Internet of Things system based on augmented reality, the AR scene generation includes displaying the acquired image on the terminal module and computing the mutual conversion relationship between the current image coordinate system and the three-dimensional space coordinate system; the three-dimensional space coordinates of a target point are obtained, after the AR measurement mode is opened, by clicking the target point or by entering the target point's number.
In the Internet of Things system based on augmented reality, the AR scene generation further comprises: computing the two-dimensional coordinates and depth of the target point in the image pixel coordinate system; if the projection of the target point in the image pixel coordinate system falls within the range of the camera image of the terminal module and the depth is less than a threshold, the target point, together with its distance from the terminal, is marked on the image at the corresponding pixel coordinates.
In the Internet of Things system based on augmented reality, the terminal module or the server module further fuses the motion-sensor and wireless-sensor data to perform the navigation calculation, obtaining the terminal's position, velocity and attitude.
In accordance with the above purpose, the present invention also provides a method of operating an Internet of Things system based on augmented reality, comprising the following steps: acquiring sensor data through the Internet of Things nodes, pre-processing the data and transmitting it to the server module; carrying out database processing, analysis and anomaly detection of the data at the server module; acquiring signals and sensor data and shooting video or images at the terminal module, and realizing local computation and two-way communication; and carrying out instruction analysis and processing at the server module and connecting to the terminal module.
In the operation method of the Internet of Things system based on augmented reality, the server module is provided with a database, which includes at least a node database, a GIS database and a location-and-navigation database. The server module keeps in communication with the Internet of Things nodes, the terminal modules and the control platform; the Internet of Things nodes, server module, terminal modules and control platform are each provided with memory, a hard disk, a processor and a communication module.
In the operation method of the Internet of Things system based on augmented reality, the terminal module further includes a sensor module and a display-and-user-interaction module. The sensor module includes wireless sensors and motion sensors. The terminal module periodically acquires wireless signals and motion-sensor data, and performs local computation, including position and attitude determination and AR scene generation.
The operation method of the Internet of Things system based on augmented reality further includes calibrating the camera and motion sensors of the terminal module, obtaining the intrinsic and extrinsic camera parameters, the sensor error parameters and the misalignment angles between the axes of the camera and of the sensors; the calibration results are stored and used to compensate the image and sensor data.
In the operation method of the Internet of Things system based on augmented reality, the AR scene generation includes displaying the acquired image on the terminal module and computing the mutual conversion relationship between the current image coordinate system and the three-dimensional space coordinate system; the three-dimensional space coordinates of a target point are obtained, after the AR measurement mode is opened, by clicking the target point or by entering the target point's number.
In the operation method of the Internet of Things system based on augmented reality, the AR scene generation further comprises: computing the two-dimensional coordinates and depth of the target point in the image pixel coordinate system; if the projection of the target point in the image pixel coordinate system falls within the range of the terminal module's camera image and the depth is less than a threshold, the target point, together with its distance from the terminal, is marked on the image at the corresponding pixel coordinates.
The operation method of the Internet of Things system based on augmented reality further includes the step of fusing the motion-sensor and wireless-sensor data to perform the navigation calculation.
In the operation method of the Internet of Things system based on augmented reality, the terminal module acquires signal data, and the server module acquires the data within the target area and judges whether the navigation calculation is to be performed at the server module. If so, the sensor data is transmitted to the server module, which performs the location-and-navigation solution; if not, the data within the target area is transmitted to the terminal module, which performs the location-and-navigation solution.
The operation method of the Internet of Things system based on augmented reality further includes the step of detecting whether an Internet of Things node lies within the measurement range of the camera. If so, the Internet of Things node is selected and its identification code is stored; the coordinates of the selected node are transformed from the three-dimensional space coordinate system into the two-dimensional pixel coordinate system and the node is overlaid in the AR scene. The identification code is used to query the data and state of the selected node at the server; if the data or state of the node changes, the data is transmitted to the terminal module for display.
With the above technical features, the present invention is applicable to Internet of Things nodes in applications such as smart cities, campuses, warehousing and pipelines, and to the display, interactive operation and data updating of terminal devices. Using the technical scheme of the present invention, status monitoring of Internet of Things nodes, analysis and scheduling by the control platform, and AR interactive operation at the terminal can be effectively integrated and kept up to date, accelerating the anomaly-reporting workflow and safeguarding system security. Through AR technology, the present invention realizes sensory interaction between the user and the Internet of Things system, including vision and hearing, while adding real-time positioning and path-planning functions that make target points easier to find. The present invention uses an LPWAN data link for real-time data transmission, reducing the power consumption of the Internet of Things and of location services and enlarging the usable range.
Other advantages, objects and features of the present invention will be partly embodied in the following description, and will partly be understood by those skilled in the art through study and practice of the invention.
Description of the drawings
In the present invention, identical reference numerals always denote identical features, wherein:
Fig. 1 is a schematic circuit diagram of the Internet of Things system based on augmented reality according to the present invention;
Fig. 2 is a flow diagram of the operation method of the Internet of Things system based on augmented reality according to the present invention.
Detailed description of the embodiments
The technical solution of the present invention is further illustrated below with reference to the accompanying drawings and embodiments.
In accordance with these purposes of the present invention and other advantages, an Internet of Things system based on augmented reality and an operation method are provided. As shown in Fig. 1, the Internet of Things system based on augmented reality of the present invention includes:
Multiple Internet of Things nodes, which can be deployed in a wide-area space according to actual demand. Each Internet of Things node has a globally unique identification code. Environmental sensors configured on a node include, but are not limited to, temperature, humidity, smoke, harmful-gas and human-presence sensors. Wireless sensors such as LoRa, BLE and WiFi may also be configured on the node, as may motion sensors such as accelerometers and gyros. In addition, each Internet of Things node contains memory, a hard disk, a processor and so on, can perform data analysis and processing, and includes a communication module through which it can exchange data with the cloud platform.
A cloud platform, also called the server module, which contains physical units such as a communication module, memory, a hard disk and a processor, and can perform data storage, analysis, processing and transmission. The cloud platform also stores the databases required by the Internet of Things system, such as the node database, the geographic-information-system (GIS) database and the location-and-navigation database. The cloud platform keeps in communication with the Internet of Things nodes, the mobile terminals and the control platform.
A large number of terminal modules; a terminal module may be a mobile terminal comprising a sensor module, memory and a processor, a display-and-user-interaction module and a communication module. The terminal module can periodically acquire surrounding wireless signals, including but not limited to BLE, WiFi, LoRa, GNSS and communication base stations, and motion-sensor data, including but not limited to gyros, accelerometers, magnetometers and barometers, and can shoot video or images. At the same time, the terminal module can perform local computation, including but not limited to position and attitude determination and AR scene generation, and can communicate with the cloud platform (server module).
A control platform, including physical units such as a communication module, memory, a hard disk and a processor, which can perform functions such as information extraction, display, user interaction and processing analysis. Through the control platform the user can monitor and manage the whole Internet of Things system, including the terminal modules, and transmit instructions to it. The control platform keeps in communication with the server module.
As shown in Fig. 2, on the basis of the above structure of the Internet of Things system based on augmented reality, the operation method of the present invention comprises the following steps:
Step 1: the deployed Internet of Things nodes continuously acquire sensor data, which is pre-processed, compressed and transmitted to the server module. The server module carries out database processing, analysis and anomaly detection of the data, and sends data and reports to the control platform. The control platform further analyses the data and handles human-computer interaction. The user can view and obtain Internet of Things node data through the control platform, and can also enter instructions there; the entered instructions are transmitted to the cloud platform, and the server module carries out instruction analysis and processing and connects to the terminal modules. The Internet of Things nodes are preferably stationary at points whose space coordinates are known, but may also be carried and moved by a carrier. The collected node data includes at least the data-acquisition timestamp, the node's unique identifier (including but not limited to its MAC address) and the corresponding measurements, such as temperature, humidity and air quality (for example SO2, NO2, O3, PM2.5, PM10 and CO).
This step also includes binary-encoding the data according to preset rules so as to reduce the data size. The data of a certain time span is packaged and sent together. Signal transmission is carried out over a LoRa data link, but other channels, such as a communication network or the Internet, can also be used.
At the server module, the global unique identifier of an Internet of Things node is used to retrieve data. In the present embodiment the node wireless-signal database is stored and searched in the form of a hash table, from which the scene where the node is located, and its unique number within that scene, can be obtained. The scene and number can in turn be used to look up the node's unique three-dimensional space coordinates and other parameters. Meanwhile, the data of different nodes can be synchronized according to the timestamps in the data and analysed in combined processing.
After the Internet of Things node data reaches the server module, it is stored in memory in the form of a linked list and continuously updated; at the same time, the data is stored on the hard disk in the form of files. The data stored in memory and on the hard disk is extracted and the node database stored at the server module is invoked, so that data analysis can identify whether the data is abnormal. There are at least two kinds of abnormality: one is node failure, the other is an emergency such as an excessive CO level. Different abnormal conditions are signalled using different alerts, such as particular ringtones or patterns. The node nearest to the position where the abnormality occurred is marked as the target node. Information such as the target node's unique identifier, scene number and in-scene number is transmitted to the control platform and the terminals for display. When displayed, the area surrounding the target node is marked as the target area; in this embodiment the target-area shape is set to a circle, whose size can be set manually at the control platform. Other shapes can of course also be selected for the target area, and its size can also be set by software.
After logging in to the control platform, administrative staff can check the state of the Internet of Things system, for example obtaining the state of the data in a given scene by its scene number, or checking node data by entering a node's identification code or number. The Internet of Things system can also be controlled through the control platform, for example by writing node parameters. Furthermore, one or more selected terminal modules can be connected, after which their state, such as position, trajectory and battery level, can be checked at the control platform; control instructions for the terminal modules can also be entered at the control platform and, after being processed by the server module, transmitted to the terminal modules.
Step 2: a staff member carries a terminal module to the site and opens the terminal application. Once the application is open, the terminal module continuously acquires the available wireless signals, including but not limited to WiFi, BLE, GNSS and LoRa, and motion-sensor data, including but not limited to gyro, accelerometer, magnetometer and barometer data. At the same time, the data within the target area, including the node data, the GIS data and the location-and-navigation data, is obtained at the server module. It is then judged whether the navigation calculation, which includes at least computing one of the target's position, velocity, attitude and motion path, is to be performed at the server module. If so, the sensor data is compressed and transmitted to the server module, which performs the navigation calculation; if not, the data within the target area is transmitted to the terminal module, which performs the navigation calculation. Meanwhile, the terminal module periodically acquires image data or captures video. The navigation calculation yields position and attitude, which are then used for AR scene generation: it is precisely position and attitude that associate the camera image with the spatial data. In practice, Step 1 need not come first; it may be skipped so that operation enters Step 2 directly.
When the various sensor data are acquired, timestamps are recorded at the same time. According to the timestamps in each sensor's data, all sensor data of a given period are packaged together for the location-and-navigation calculation.
According to the spatial position of the target area, the databases within the target area, including the node database, the GIS database and the location-and-navigation database, are obtained from the server-module database. Selecting database entries can be realized by computing whether the coordinates of each candidate entry lie within the target area. The comparison can be done by traversal, or data structures such as k-d trees and hash tables can be used to accelerate the operation.
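The in-area test and the plain traversal it describes can be sketched as follows; a k-d tree (e.g. scipy.spatial.KDTree) could replace the loop to accelerate the query, as the text suggests. Function and field names are assumptions:

```python
import math

def in_target_area(entry_xyz, center_xyz, radius):
    """True if a candidate database entry lies within a circular target area
    (2-D check in the horizontal plane, matching the circular shape above)."""
    dx = entry_xyz[0] - center_xyz[0]
    dy = entry_xyz[1] - center_xyz[1]
    return math.hypot(dx, dy) <= radius

def select_entries(entries, center_xyz, radius):
    """Plain traversal over all candidate entries."""
    return [e for e in entries if in_target_area(e["xyz"], center_xyz, radius)]
```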
Step 3: the camera and motion sensors configured on the terminal module undergo system-level calibration, obtaining the intrinsic and extrinsic camera parameters, the sensor error parameters (such as zero biases and scale factors) and the misalignment angles between the axes of the camera and of each sensor system. The system-level calibration results are stored in memory or in file form and are used to compensate subsequent image and sensor data.
During the system-level calibration, the user acquires calibration data in turn at a series of ground calibration points whose three-dimensional space coordinates are known. At each calibration point, the terminal module (for example a handheld terminal) acquires image data at a certain frequency, such as 1 Hz, and simultaneously acquires gyro and accelerometer data at a certain frequency, such as 20 Hz; the images and the gyro and accelerometer data are stored in file form. At each ground calibration point, while keeping feature points on the calibration wall within the camera's line of sight, the user changes the terminal's attitude and uses the camera to photograph each part of the calibration wall. The three-dimensional space coordinates of the ground calibration points and of the feature points on the calibration wall are measured by satellite positioning, total station or other means.
In the present embodiment, a vector of 29 parameters is used as the parameter set to be estimated in the system-level calibration of the camera and motion sensors, namely P = [p^n, φ^n, c, p^c, φ^c, b_g, b_a, s_g, s_a], where the elements are, respectively:
- the position of the camera's optical centre in the three-dimensional space coordinate system (a three-dimensional vector p^n = [X_n, Y_n, Z_n], whose components are the coordinates along the three orthogonal spatial directions);
- the attitude angles (a three-dimensional vector φ^n = [φ_n, θ_n, ψ_n], whose components are the angles about the terminal's three orthogonal axes);
- the camera parameters (a five-dimensional vector c = [t_c, U_0, V_0, d_x, d_y], whose components are respectively the correction of the camera timestamp relative to the gyro timestamp, the abscissa and ordinate of the pixel-coordinate-system centre, i.e. the intersection of the camera's optical axis with the image plane, and the physical size of a pixel in the transverse and longitudinal directions of the photo coordinate system);
- the relative position of the camera coordinate system with respect to the gyro coordinate system (a three-dimensional vector p^c = [X_c, Y_c, Z_c], whose components are the coordinate offsets along the three orthogonal spatial directions);
- the relative attitude angles (a three-dimensional vector φ^c = [φ_c, θ_c, ψ_c], whose components are the angle offsets about the terminal's three orthogonal axes);
- the zero biases of the gyros (a three-dimensional vector b_g = [b_gx, b_gy, b_gz]);
- the zero biases of the accelerometers (a three-dimensional vector b_a = [b_ax, b_ay, b_az]);
- the scale factors of the gyros (a three-dimensional vector s_g = [s_gx, s_gy, s_gz]);
- and the scale factors of the accelerometers (a three-dimensional vector s_a = [s_ax, s_ay, s_az]).
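The grouping of the 29 estimated parameters can be made concrete with a small container type; the field names are assumptions chosen to mirror the symbols above:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CalibParams:
    """The 29-parameter system-level calibration vector, grouped by block."""
    p_n: List[float]    # camera optical-centre position [Xn, Yn, Zn]  (3)
    phi_n: List[float]  # attitude angles [phi, theta, psi]            (3)
    c: List[float]      # camera params [tc, U0, V0, dx, dy]           (5)
    p_c: List[float]    # camera-to-gyro relative position             (3)
    phi_c: List[float]  # camera-to-gyro relative attitude angles      (3)
    b_g: List[float]    # gyro zero biases                             (3)
    b_a: List[float]    # accelerometer zero biases                    (3)
    s_g: List[float]    # gyro scale factors                           (3)
    s_a: List[float]    # accelerometer scale factors                  (3)

    def as_vector(self) -> List[float]:
        """Flatten the blocks into the single estimation vector P."""
        return (self.p_n + self.phi_n + self.c + self.p_c + self.phi_c
                + self.b_g + self.b_a + self.s_g + self.s_a)
```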
The data used in the system-level calibration are the coordinates of the feature points in the three-dimensional space coordinate system, their coordinates in each image coordinate system, the timestamps of image acquisition, and the gyro and accelerometer data together with their acquisition timestamps. In this embodiment the calibration is computed iteratively, minimizing a cost function in order to solve for the parameters to be estimated. The cost function used is a nonlinear least-squares criterion (the equations themselves are not reproduced in this text version), wherein the symbol h(·) denotes a nonlinear function; C_g^c and C_n^g are respectively the transformation matrix of the camera coordinate system relative to the gyro coordinate system and of the gyro coordinate system relative to the three-dimensional space coordinate system; R_c is the camera parameter matrix; t_i is the timestamp of the gyro data at the moment photo i was shot; p^n is the coordinate of a feature point in the three-dimensional space coordinate system; ω_g,i and f_a,i are respectively the angular-rate and specific-force vectors of the i-th gyro and accelerometer measurements; ω_e,i and g_i are respectively the Earth-rotation angular-rate and gravitational-acceleration vectors; and v_c,i,j, v_g,i and v_a,i are respectively the measurement noises of the image, gyro and accelerometer.
In the above formula, each transformation matrix is represented by a three-dimensional position and three-dimensional attitude angles, as
wherein sφ = sin(φn), cφ = cos(φn), sθ = sin(θn), cθ = cos(θn), sψ = sin(ψn), cψ = cos(ψn); φn, θn, and ψn are respectively the roll, pitch, and heading attitude angles (i.e., the three elements of the vector φn), and Xn, Yn, and Zn are the three-dimensional position (i.e., the three elements of the vector pn). The transformation matrix of the gyro coordinate system is represented by φn, θn, ψn, Xn, Yn, Zn, and that of the camera coordinate system by φc, θc, ψc, Xc, Yc, Zc.
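The matrix itself appears only as an image in the published text. Under the common aerospace (heading, pitch, roll; Z-Y-X) rotation-order assumption, a direction cosine matrix consistent with the sφ/cφ shorthand above would take the standard form (a sketch of the conventional expression, not a reproduction of the patent's own figure):

```latex
C_n =
\begin{bmatrix}
c_\theta c_\psi & s_\phi s_\theta c_\psi - c_\phi s_\psi & c_\phi s_\theta c_\psi + s_\phi s_\psi \\
c_\theta s_\psi & s_\phi s_\theta s_\psi + c_\phi c_\psi & c_\phi s_\theta s_\psi - s_\phi c_\psi \\
-s_\theta       & s_\phi c_\theta                        & c_\phi c_\theta
\end{bmatrix}
```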
The camera parameter matrix is expressed as:
wherein f is the effective focal length of the camera.
Step 4: AR scene building. The image or video captured by the camera of the terminal module is displayed to the user while, in the background, the conversion relationship between the current picture coordinate system and the three-dimensional space coordinate system is calculated. On the one hand, the two-dimensional coordinates and depth of the target point in the pixel coordinate system are calculated; if the projection of the target point in the pixel coordinate system falls within the camera image range and the depth is less than a threshold (the threshold is a positive number, set according to the target distance to be detected in the specific application; for example, if targets within d meters of the user need to be detected, the threshold may be set to d meters), the target point and its distance from the terminal are marked at the corresponding pixel coordinate on the image. On the other hand, if the user opens the AR measurement mode, the three-dimensional space coordinates of a target point can be obtained by clicking the target point or inputting its number, and the spatial relationship between it and other target points, such as distance and angle, can be measured. The user can also mark and edit target points in the terminal module and send the result to the server module for data update.
For a given Internet of Things node, the formula that converts its three-dimensional space coordinates into image pixel coordinates is:
wherein pw = [Xw, Yw, Zw] is the coordinate of the node in the three-dimensional space coordinate system, and pI = [U0, V0, 1], where [U0, V0] are the abscissa and ordinate of the node in the pixel coordinate system.
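The conversion formula is published only as an image; a minimal sketch of such a conversion, assuming an axis-aligned pinhole model with effective focal length f and principal point (u0, v0) (the helper name and the identity camera orientation are illustrative assumptions, not the patent's exact formula):

```python
def project_to_pixel(p_w, cam_pos, f, u0, v0, max_depth):
    # Translate the node's world coordinate into the camera frame
    # (camera assumed axis-aligned with the space coordinate system).
    x = p_w[0] - cam_pos[0]
    y = p_w[1] - cam_pos[1]
    depth = p_w[2] - cam_pos[2]
    # Only points in front of the camera and closer than the depth
    # threshold are marked on the image (cf. step 4).
    if depth <= 0 or depth >= max_depth:
        return None
    u = f * x / depth + u0   # abscissa U0 in the pixel coordinate system
    v = f * y / depth + v0   # ordinate V0 in the pixel coordinate system
    return u, v, depth
```

A node directly in front of the camera projects to the principal point; a node beyond max_depth returns None and is not marked.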
Step 5: Navigation solutions are computed using motion sensor and wireless sensor data. The detailed process is as follows. First, scene recognition is performed using the wireless signals to determine the positioning scene. Then, the database for that positioning scene is obtained and combined with real-time wireless signal data to compute position, velocity, and heading; meanwhile, position, velocity, and attitude are computed from the compensated motion sensor data. Next, the wireless sensor and motion sensor data are fused to obtain the terminal position, velocity, and attitude; alternatively, the raw data and/or navigation solutions (including position, velocity, attitude, and similar information) of the wireless sensor and motion sensor are fused. The relative position and distance from the terminal module to the target point are computed, and it is judged whether the terminal module is already within the target area. If so, the user is prompted that the target area has been reached, and the working mode starts; if not, a path between the terminal module and the target area is planned, displayed on the terminal module, and direction guidance is given to the user until the target area is reached. Fusing navigation solutions makes it convenient to shield a faulty sensor from direct use, and because navigation solutions have lower dimensionality than raw data, the implementation is simple. Fusing raw data, in contrast, allows errors in the raw data to be better detected and rejected, improving the reliability of the navigation solution. Meanwhile, the user position and trajectory are transmitted to the server module and displayed and dispatched on the control platform.
The wireless sensor data navigation solution is specifically described as follows. The terminal module acquires wireless signal data from a series of wireless base stations, including at least the signal reception time, the terminal MAC address, the wireless base station MAC address, and the signal strength (RSS), and possibly other information such as channel information and load data. Using the wireless base station MAC address in each record, the global wireless signal database (stored as a hash table in this embodiment) is searched to obtain the unique number corresponding to that wireless signal. This number can in turn be used to look up the corresponding unique three-dimensional space coordinates and wireless signal propagation model of the base station. Using the three-dimensional space coordinates and signal propagation models of the base stations, the three-dimensional position of the terminal module can be computed. The formula for computing distance from wireless signal strength is given in terms of r, d, n, and b, where r and d are respectively the wireless signal strength and the distance, and n and b are signal propagation parameters. Intersection-based location computation is a standard procedure and is not described further. Given the three-dimensional position of the terminal module, the global wireless signal database can be searched to retrieve the base station data within a certain surrounding range, e.g., 100 meters. The user can also use fingerprinting or other methods for wireless positioning. Using two consecutive wireless positioning results, the two-dimensional velocity and heading of the terminal module can be computed: the velocity is the position change between the two positioning instants divided by the time interval, and the heading angle is the arctangent of the quotient of the velocities in the two directions. In addition, the computed heading angle is centralized; specifically, if the heading angle is greater than 180 degrees, 360 degrees is subtracted from it, and if it is less than -180 degrees, 360 degrees is added. This operation is repeated until the heading angle lies within the range of -180 to 180 degrees.
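The propagation formula itself appears only as an image. A minimal sketch of the distance computation, velocity/heading derivation, and heading centralization, assuming the common log-distance path-loss form rss = b − 10·n·log10(d) for the r–d relation (this model form is an assumption, not the patent's reproduced formula):

```python
import math

def rss_to_distance(rss, n, b):
    # Assumed log-distance model: rss = b - 10*n*log10(d),
    # inverted to d = 10 ** ((b - rss) / (10 * n)).
    return 10 ** ((b - rss) / (10.0 * n))

def centralize_heading(deg):
    # Fold the heading angle into [-180, 180] as described:
    # subtract 360 if above 180, add 360 if below -180, repeat.
    while deg > 180.0:
        deg -= 360.0
    while deg < -180.0:
        deg += 360.0
    return deg

def velocity_and_heading(p0, p1, dt):
    # Two-dimensional velocity from two positioning fixes, heading from
    # the arctangent of the quotient of the two velocity components.
    ve = (p1[0] - p0[0]) / dt
    vn = (p1[1] - p0[1]) / dt
    heading = math.degrees(math.atan2(ve, vn))  # east component over north
    return ve, vn, centralize_heading(heading)
```

For example, with n = 2 and b = −40 dBm, an RSS of −60 dBm corresponds to a distance of 10 m under the assumed model.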
The navigation solution using motion sensor data, including the concrete operations of position and attitude computation, proceeds as follows. Whenever gyro and accelerometer data are acquired, the gyro zero bias and scale factor data are used to perform error compensation on the gyro data, and the accelerometer zero bias and scale factor data are used to perform error compensation on the accelerometer data. Then, initialization is performed: the initial attitude is determined using the magnetometer and accelerometer data, and the initial velocity is set to zero, to a user-input speed, or to the wireless positioning result. Next, the angular rate vector measured by the gyro is integrated over time to compute an attitude increment, which is added to the attitude of the previous instant to compute the current attitude. Then, using the current attitude data, the specific force vector measured by the accelerometer is transformed from the device coordinate system into the three-dimensional space coordinate system. The specific force in the three-dimensional space coordinate system is added to the gravity vector to form the acceleration vector, and the acceleration is integrated over time to compute a velocity increment; the velocity increment is added to the velocity of the previous instant to compute the current velocity. The current velocity is then integrated over time to compute a position increment, which is added to the position of the previous instant to compute the current position. And so on: the attitude, velocity, and position information of the previous instant is continuously combined with the gyro and accelerometer data of the current instant to compute the attitude, velocity, and position information of the current instant.
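The compensation and integration loop above can be sketched for a simplified planar case (a single yaw gyro and two horizontal accelerometer axes are an illustrative assumption; the patent's full solution is three-dimensional):

```python
import math

def compensate(raw, bias, scale):
    # Error compensation with per-axis zero bias b and scale factor s.
    return [(r - b) / (1.0 + s) for r, b, s in zip(raw, bias, scale)]

def dead_reckon_step(state, gyro_z, accel_xy, dt):
    """One strapdown update. state = (heading_rad, vx, vy, x, y),
    expressed in the space coordinate frame."""
    heading, vx, vy, x, y = state
    heading += gyro_z * dt                      # attitude increment
    c, s = math.cos(heading), math.sin(heading)
    # Rotate specific force from the device frame to the space frame.
    ax = c * accel_xy[0] - s * accel_xy[1]
    ay = s * accel_xy[0] + c * accel_xy[1]
    vx += ax * dt                               # velocity increment
    vy += ay * dt
    x += vx * dt                                # position increment
    y += vy * dt
    return heading, vx, vy, x, y
```

Each call consumes the previous instant's attitude, velocity, and position together with the current gyro and accelerometer sample, exactly as the recursion in the text describes.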
The position, velocity, and heading angle computed from the wireless sensor data and from the motion sensor data are fused to obtain the terminal position, velocity, and attitude. The information fusion can directly use a weighted average, or can use optimal estimation methods such as Kalman filtering or least squares. Taking Kalman filtering as an example, the gyro and accelerometer data can be used to construct the system equation, with the position, velocity, and heading data obtained from the wireless signals constructing the measurement update; alternatively, a virtual system equation, such as a constant-velocity assumption, can be used, with the position, velocity, and heading angle data computed from the wireless sensor and motion sensor data constructing the measurement equation.
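A minimal sketch of the two fusion options named above, reduced to one dimension for clarity (the scalar state and the class name are illustrative; the patent's filter is multi-dimensional):

```python
def weighted_average(x_wifi, x_ins, w_wifi, w_ins):
    # Direct weighted-average fusion of two position estimates.
    return (w_wifi * x_wifi + w_ins * x_ins) / (w_wifi + w_ins)

class ScalarKalman:
    """One-dimensional Kalman filter: the motion-sensor solution drives
    the prediction (system equation), the wireless fix provides the
    measurement update."""
    def __init__(self, x0, p0, q, r):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def predict(self, ins_increment):
        self.x += ins_increment         # propagate with the INS increment
        self.p += self.q                # grow the state uncertainty

    def update(self, z):
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct with the wireless fix
        self.p *= (1.0 - k)
```

With equal weights the weighted average is simply the midpoint of the two estimates; the Kalman variant instead weights by the modeled uncertainties q and r.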
Whether the position or attitude of the terminal module has changed significantly can be determined by comparing the difference between the position and attitude results at two instants against respective thresholds. Each threshold is a positive number and is set according to the sensitivity to position or attitude changes required by the specific application. For example, to detect any position change exceeding X1 meters, the position threshold is set to X1 meters; to detect any attitude change exceeding X2 degrees, the attitude threshold is set to X2 degrees. For path planning, algorithms such as A* (A-Star) can be used to compute the shortest path between the terminal module and the target point, which is displayed in the terminal module in arrow form. When the distance between the terminal module and the target point changes, the shortest path is updated.
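One possible realization of the A* path planning mentioned above, on a 4-connected occupancy grid with a Manhattan-distance heuristic (the grid representation is an illustrative assumption; the patent does not specify the map model):

```python
import heapq

def a_star(grid, start, goal):
    """Shortest path on an occupancy grid (0 = free, 1 = blocked)
    using A* with an admissible Manhattan-distance heuristic."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                step = (nr, nc)
                heapq.heappush(open_set,
                               (g + 1 + h(step), g + 1, step, path + [step]))
    return None  # no path exists
```

Re-running the planner whenever the terminal-to-target distance changes, as the text describes, simply means calling a_star again with the updated start cell.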
Step 6: Detect whether nearby Internet of Things nodes are within the camera measurement space. If an Internet of Things node is selected, its identification code is stored. Then, on the one hand, the coordinates of the selected Internet of Things node are transformed from the three-dimensional space coordinate system into the two-dimensional pixel coordinate system and overlaid for display in the AR scene; on the other hand, the identification code is used to query the server for the data and state of the selected Internet of Things node, such as temperature, humidity, and air quality. If the data or state of the Internet of Things node changes, the data is transmitted to the terminal module for display. If desired, Internet of Things nodes can be selected and filtered in the terminal module or the control platform. In particular, if a possible abnormality or emergency is detected, such as an excessive CO level, an emergency signal is sent to the terminal module and the control platform, and the user is alerted by means such as ringing or displaying a risk marking. In addition, based on the relative distance between the target point and the user and the attitude information of the terminal module, left-right asymmetric audio can be generated to indicate the azimuth of the target point to the user.
Step 7: Steps 4 to 6 are repeated until the terminal module or the control platform ends the task. After the terminal module personnel finish their work, an end signal is sent to the server module and then forwarded by the server module to the control platform. The control platform analyzes the Internet of Things system, and if it determines that the whole system is normal and the task can be ended, it sends an end-of-task signal to the server.
It will be understood by those skilled in the art that the above description is only one or several of the numerous embodiments of the present invention and is not intended to limit the invention. Any equivalent changes, modifications, equivalent substitutions, and similar technical solutions of the embodiments described above, as long as they conform to the spirit of the present invention, fall within the scope of protection of the claims of the present invention.
Claims (16)
1. An Internet of Things system based on augmented reality, characterized in that it includes at least four modules: an Internet of Things node, a server module, a terminal module, and a control platform, wherein:
the Internet of Things node is laid out in a wide-area space, is equipped with an identification code and sensors, and transfers data bidirectionally with the other modules;
the server module executes data storage, analysis, processing, and transmission, performs database data processing, analysis, and abnormality detection, and transmits data and reports;
the terminal module acquires signals and sensor data, shoots video or images, and realizes local operations and two-way communication; and
the control platform executes information extraction, display, user interaction, and processing analysis, and supervises and sends instructions to the Internet of Things node and the terminal module.
2. The Internet of Things system based on augmented reality as claimed in claim 1, characterized in that the server module is equipped with a database including at least a node database, a GIS database, and a location and navigation database; the server module keeps communicating with the Internet of Things node, the terminal module, and the control platform; and the Internet of Things node, the server module, the terminal module, and the control platform are each provided with memory, a hard disk, a processor, and a communication module.
3. The Internet of Things system based on augmented reality as claimed in claim 2, characterized in that the terminal module further includes a sensor module, a display, and a user interaction module, the sensor module including a wireless sensor and a motion sensor; the terminal module periodically acquires wireless signals and motion sensor data and realizes local operations, including position and attitude determination and AR scene generation.
4. The Internet of Things system based on augmented reality as claimed in claim 3, characterized in that it further includes calibrating the motion sensor of the terminal module and the camera that shoots video or images, to obtain the camera parameters, the sensor error parameters, and the axis misalignment angles between the camera and the sensor; the calibration results are stored and used to compensate the image and sensor data.
5. The Internet of Things system based on augmented reality as claimed in claim 3, characterized in that the AR scene generation includes displaying the acquired image in the terminal module and calculating the conversion relationship between the current picture coordinate system and the three-dimensional space coordinate system;
the three-dimensional space coordinates are obtained, after opening the AR measurement mode, by clicking a target point or inputting a target point number.
6. The Internet of Things system based on augmented reality as claimed in claim 5, characterized in that the AR scene generation further comprises: calculating the two-dimensional coordinates and depth of the target point in the image pixel coordinate system; if the projection of the target point in the image pixel coordinate system falls within the camera image range of the terminal module and the depth is less than a threshold, the target point and its distance from the terminal are marked at the corresponding image pixel coordinate on the image.
7. The Internet of Things system based on augmented reality as claimed in claim 3, characterized in that it further includes the terminal module or the server module fusing motion sensor and wireless sensor data to perform navigation solutions and obtain the terminal position, velocity, and attitude.
8. An operation method for the Internet of Things system based on augmented reality as claimed in claim 1, characterized in that it includes the following steps:
acquiring sensor data through the Internet of Things node, preprocessing it, and transmitting it to the server module;
performing database data processing, analysis, and abnormality detection through the server module;
acquiring signals and sensor data and shooting video or images in the terminal module, realizing local operations and two-way communication; and
performing instruction analysis and processing through the server module and connecting with the terminal module.
9. The operation method for the Internet of Things system based on augmented reality as claimed in claim 8, characterized in that the server module is equipped with a database including at least a node database, a GIS database, and a location and navigation database; the server module keeps communicating with the Internet of Things node, the terminal module, and the control platform; and the Internet of Things node, the server module, the terminal module, and the control platform are each provided with memory, a hard disk, a processor, and a communication module.
10. The operation method for the Internet of Things system based on augmented reality as claimed in claim 9, characterized in that the terminal module further includes a sensor module, a display, and a user interaction module, the sensor module including a wireless sensor and a motion sensor; the terminal module periodically acquires wireless signals and motion sensor data and realizes local operations, including position and attitude determination and AR scene generation.
11. The operation method for the Internet of Things system based on augmented reality as claimed in claim 10, characterized in that it further includes calibrating the camera and the motion sensor of the terminal module to obtain the intrinsic and extrinsic parameters of the camera, the sensor error parameters, and the axis misalignment angles between the camera and the sensor; the calibration results are stored and used to compensate the image and sensor data.
12. The operation method for the Internet of Things system based on augmented reality as claimed in claim 10, characterized in that the AR scene generation includes displaying the acquired image in the terminal module and calculating the conversion relationship between the current picture coordinate system and the three-dimensional space coordinate system; the three-dimensional space coordinates are obtained, after opening the AR measurement mode, by clicking a target point or inputting a target point number.
13. The operation method for the Internet of Things system based on augmented reality as claimed in claim 12, characterized in that the AR scene generation further comprises: calculating the two-dimensional coordinates and depth of the target point in the image pixel coordinate system; if the projection of the target point in the image pixel coordinate system falls within the camera image range of the terminal module and the depth is less than a threshold, the target point and its distance from the terminal are marked at the corresponding image pixel coordinate on the image.
14. The operation method for the Internet of Things system based on augmented reality as claimed in claim 10, characterized in that it further includes the step of fusing motion sensor and wireless sensor data to perform navigation solutions.
15. The operation method for the Internet of Things system based on augmented reality as claimed in claim 14, characterized in that the terminal module acquires signal data, the target-area data is obtained from the server module, and it is judged whether the navigation operations are performed in the server module; if so, the sensor data is transmitted to the server module, where the location and navigation solution is performed; if not, the target-area data is transmitted to the terminal module, where the location and navigation solution is performed.
16. The operation method for the Internet of Things system based on augmented reality as claimed in claim 12, characterized in that it further includes the step of detecting whether an Internet of Things node is within the camera measurement space;
if an Internet of Things node is selected, its identification code is stored, and the coordinates of the selected Internet of Things node are transformed from the three-dimensional space coordinate system into the two-dimensional pixel coordinate system and then overlaid for display in the AR scene;
the identification code is used to query the server for the data and state of the selected Internet of Things node, and if the data or state of the Internet of Things node changes, the data is transmitted to the terminal module for display.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810375417.4A CN108650245A (en) | 2018-04-24 | 2018-04-24 | Internet of things system based on augmented reality and operation method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108650245A true CN108650245A (en) | 2018-10-12 |
Family
ID=63747256
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810375417.4A Pending CN108650245A (en) | 2018-04-24 | 2018-04-24 | Internet of things system based on augmented reality and operation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108650245A (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111272764A (en) * | 2020-01-22 | 2020-06-12 | 哈尔滨工业大学 | Large intelligent temporary platform non-contact image recognition mobile management and control system and method |
CN111787189A (en) * | 2020-07-17 | 2020-10-16 | 塔盾信息技术(上海)有限公司 | Gridding automatic monitoring system for integration of augmented reality and geographic information |
CN111885707A (en) * | 2020-08-05 | 2020-11-03 | 济南浪潮高新科技投资发展有限公司 | AR (augmented reality) -device-based Internet of things device control method and system |
CN113259912A (en) * | 2020-02-13 | 2021-08-13 | 虎尾科技大学 | Many-to-many state identification system for Internet of things broadcasting equipment name |
CN113487278A (en) * | 2021-07-02 | 2021-10-08 | 钦州云之汇大数据科技有限公司 | Enterprise cooperative office system based on Internet of things |
CN113542689A (en) * | 2021-07-16 | 2021-10-22 | 金茂智慧科技(广州)有限公司 | Image processing method based on wireless Internet of things and related equipment |
CN114489344A (en) * | 2022-02-16 | 2022-05-13 | 海南热带海洋学院 | Augmented reality digital culture content display device and method based on Internet of things |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20110128096A (en) * | 2010-05-20 | 2011-11-28 | 성균관대학교산학협력단 | System and method for implementing augmented reality using building enviornment data |
CN105404231A (en) * | 2016-01-12 | 2016-03-16 | 西北工业大学 | Internet of things-based intelligent building monitoring managing system |
CN105427504A (en) * | 2015-12-24 | 2016-03-23 | 重庆甲虫网络科技有限公司 | Wireless intelligent augmented reality firefighting monitoring system |
CN106713416A (en) * | 2016-11-23 | 2017-05-24 | 宁波市镇海百硕机械科技有限公司 | Wireless smart augmented reality fire-fighting monitoring system |
CN106909215A (en) * | 2016-12-29 | 2017-06-30 | 深圳市皓华网络通讯股份有限公司 | Based on the fire-fighting operation three-dimensional visualization command system being accurately positioned with augmented reality |
US20180018867A1 (en) * | 2016-07-12 | 2018-01-18 | Tyco Fire & Security Gmbh | Holographic Technology Implemented Security Solution |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | |

Application publication date: 20181012