CN111787273A - Real-environment positioning system - Google Patents

Real-environment positioning system

Info

Publication number
CN111787273A
CN111787273A (Application No. CN201910273090.4A)
Authority
CN
China
Prior art keywords
module
unit
data
server
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910273090.4A
Other languages
Chinese (zh)
Inventor
穆宜霖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhijun Shenzhen Data Technology Co ltd
Original Assignee
Zhijun Shenzhen Data Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhijun Shenzhen Data Technology Co ltd filed Critical Zhijun Shenzhen Data Technology Co ltd
Priority to CN201910273090.4A
Publication of CN111787273A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a real-environment positioning system, which comprises a server, a camera access module, an identification module, an extraction module, a calling module, a VR application module, a display module and an acquisition module. The server is in communication connection with the calling module, the VR application module, the display module, the acquisition module, the identification module and the extraction module; the camera access module is in communication connection with the identification module and the acquisition module; and the identification module is in communication connection with the extraction module. The system provides convenient and clear indoor navigation, keeps its data up to date, and remains compatible with existing indoor guide signage; it is also simple in structure, easy to implement, broadly applicable and highly adaptable.

Description

Real-environment positioning system
Technical Field
The invention belongs to the technical field of electronic positioning, and specifically relates to a real-environment positioning system.
Background
A positioning system is an assembly of related devices whose purpose is to determine a spatial position; the term generally refers to a global positioning system. In short, such a system is a constellation of 24 satellites covering the whole world. It guarantees that at least 4 satellites can be observed from any point on the earth at any time, so that a receiver can determine the longitude, latitude and altitude of the observation point, enabling navigation, positioning, time service and other functions. The technology can guide aircraft, ships, vehicles and individuals safely and accurately along a selected route to a destination on time. Existing positioning technology nevertheless has defects, and when applied indoors the following shortcomings are evident:
Existing positioning technologies cannot be combined with indoor guide signage to provide positioning and navigation, so the original indoor signage can only convey very limited information.
Location-based services are essentially unavailable in indoor environments due to technical and cost issues.
Current solutions to these positioning difficulties are proprietary and are not compatible with mainstream positioning and mapping platforms such as iOS and Android.
Existing positioning technology lacks a front-line platform for collecting data at scale, so the data cannot be updated in time, leaving serious gaps in adaptability and timeliness.
In view of the above problems, the prior art has introduced three-dimensional real-scene technology, for example the method for integrating a facility and equipment monitoring system based on a three-dimensional real scene disclosed under application number 201310036817X. Firstly, database technology is used to integrate and correlate the static data of a three-dimensional real-scene model library with the dynamic data in the facility and equipment monitoring subsystem, so that the information displayed on three-dimensional real-scene objects is dynamic and real-time. Secondly, the intelligent facility equipment is modelled as physical objects to form recognizable three-dimensional primitives; its characteristic information is extracted, and a visual information data structure for the equipment is established. Finally, by applying an industrial data bridge and SOA technology, real-time data acquired by the intelligent facility equipment is integrated into the three-dimensional real-scene map, realizing display control and statistical analysis, providing administrators with accurate real-time visual dynamic information, and meeting all-round monitoring requirements.
With the continuous development of three-dimensional real-scene, geographic information and visualization technologies, three-dimensional real-scene models are gradually replacing traditional two-dimensional models and are widely applied in traffic systems, digital campuses, city management and other fields. A three-dimensional real-scene model is intuitive, realistic and rich in content, giving the user an immersive impression. It models and simulates real-world objects: according to the research target and its key points, it studies shape, material, illumination, colour and other attributes in three-dimensional space to achieve a 3D reproduction. Technology for monitoring real objects in a three-dimensional model scene is being continuously developed and applied in daily life and work.
Although three-dimensional real-scene technology can solve some of these problems, convenient and clear indoor navigation, timely data updating and compatibility with existing indoor guide signage remain unresolved.
Disclosure of Invention
In order to solve the above problems, an object of the present invention is to provide a real-environment positioning system that achieves convenient and clear indoor navigation, keeps its data up to date, and is compatible with existing indoor guide signage.
Another object of the present invention is to provide a real-environment positioning system that is simple in structure, easy to implement, broadly applicable, compatible and highly adaptable.
To achieve the above objects, the present invention provides a real-environment positioning system, comprising:
the server is used for storing data, analyzing the data and issuing a control command;
the camera access module is used for accessing and opening the camera function of the mobile equipment;
the identification module is used for identifying the image data collected by the camera, comparing and analyzing it against the data stored in the server, and transmitting the comparison result back to the server;
the extraction module is used for extracting server data for the identification module to compare and analyze;
the calling module is used for calling the real-time navigation data of the conventional navigation software;
the VR application module is used for integrating the real-time navigation data from the conventional navigation software with the comparison data from the identification module and processing them into a VR display format;
the display module is used for displaying a VR navigation picture;
the acquisition module is used for calling the camera module to acquire navigation data, analyzing the acquired data and storing the analyzed data in the server;
the server is in communication connection with the calling module, the VR application module, the display module, the acquisition module, the identification module and the extraction module; the camera access module is in communication connection with the identification module and the acquisition module; and the identification module is in communication connection with the extraction module.
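As a non-limiting illustration (not part of the claimed subject matter), the following Python sketch wires up the eight modules and the communication links listed above; the class and function names are assumptions introduced only for this example.

```python
# Minimal architectural sketch (illustrative only): the modules below and their
# wiring mirror the connections described above; names are assumptions.
from dataclasses import dataclass, field


@dataclass
class Module:
    name: str
    links: set = field(default_factory=set)


def connect(a: Module, b: Module) -> None:
    """Record a bidirectional communication link between two modules."""
    a.links.add(b.name)
    b.links.add(a.name)


# Instantiate the eight components named in the summary.
server = Module("server")
camera_access = Module("camera_access")
identification = Module("identification")
extraction = Module("extraction")
calling = Module("calling")
vr_application = Module("vr_application")
display = Module("display")
acquisition = Module("acquisition")

# Server <-> calling, VR application, display, acquisition, identification, extraction.
for m in (calling, vr_application, display, acquisition, identification, extraction):
    connect(server, m)

# Camera access <-> identification and acquisition; identification <-> extraction.
connect(camera_access, identification)
connect(camera_access, acquisition)
connect(identification, extraction)

if __name__ == "__main__":
    print(sorted(server.links))  # modules the server is directly linked to
```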
When working, the real-environment positioning system operates mainly in two modes: a positioning mode and a data acquisition mode.
In the positioning mode, the camera access module opens the camera of the mobile device and transmits the captured data to the identification module. At the same time, the extraction module extracts the instruction data stored on the server and transmits it to the identification module. The identification module compares the data sent by the extraction module and the camera access module and transmits the comparison result to the server; the server maps the result to the corresponding instruction and transmits that instruction to the VR application module. Meanwhile, the calling module calls real-time navigation data from other conventional navigation software and transmits it to the VR application module through the server. The VR application module combines the two sets of data sent by the server and presents the resulting display data on the display module. The picture captured by the mobile phone camera is finally shown together with additional prompt text or graphics, realizing the real-environment navigation function.
In the data acquisition mode, the acquisition module connects, through the camera access module, to the camera of the mobile device, receives the image information captured by the camera, and parses it into server data for the extraction module to extract. In this way the server's data is continuously improved and its applicability continuously increases.
Further, the acquisition module comprises a mobile phone acquisition unit, a simple device acquisition unit and an acquisition analysis unit. The mobile phone acquisition unit and the simple device acquisition unit are each in communication connection with the camera access module and with the acquisition analysis unit, and the acquisition analysis unit is in communication connection with the server. The mobile phone acquisition unit is an ordinary mobile phone; it can collect data anytime and anywhere without dedicated personnel, making collection quick and convenient. The simple device acquisition unit comprises simple collectors such as tablet computers and notebook computers. The acquisition analysis unit analyses and converts the data from the mobile phone acquisition unit and the simple device acquisition unit and finally transmits it to the server, where it becomes server data for the extraction module to extract.
Further, the camera access module comprises a camera opening unit, a model identification unit and a data transmission unit connected in sequence; the camera opening unit is connected with the acquisition module, and the data transmission unit is connected with the identification module. The camera opening unit opens the camera of the mobile device. The model identification unit identifies mobile devices of different models so that the appropriate control instruction can be called from the server to open the camera of each device, improving the adaptability of the camera access module. The data transmission unit transmits the data captured by the camera to the identification module for identification and comparison.
Further, the identification module comprises an image receiving unit, an image feature capturing unit and a comparison unit connected in sequence. The image receiving unit is connected with the camera access module, the image feature capturing unit and the comparison unit are both connected with the extraction module, and the comparison unit is further in communication connection with the server. The image receiving unit receives the image data sent by the camera access module. The image feature capturing unit captures the features of the image data transmitted by the camera access module and the extraction module and passes them to the comparison unit. After comparing the features and the image data, the comparison unit transmits the comparison result to the server for the next step of display control.
Further, the extraction module comprises an image data extraction unit and a stored image extraction unit. The input end of the image data extraction unit is connected with the server, its output end is connected with both the stored image extraction unit and the comparison unit, and the stored image extraction unit is connected with the image feature capturing unit. The image data extraction unit extracts the instruction data corresponding to the image data on the server and transmits it to the comparison unit for standby; the image data itself is extracted by the stored image extraction unit and sent to the image feature capturing unit for feature capture. Finally, once the comparison result is matched with the instruction data, the comparison unit transmits the corresponding instruction data to the server to realize the server's subsequent display control.
Compared with the prior art, the invention has the following beneficial effects:
When working, the system operates in a positioning mode and a data acquisition mode. In the positioning mode, the camera access module opens the camera of the mobile device and sends the captured data to the identification module, which compares it with the instruction data supplied by the extraction module and returns the comparison result to the server; the server maps the result to the corresponding instruction and passes it to the VR application module, which merges it with the real-time navigation data obtained by the calling module and presents the result on the display module, so that the picture captured by the mobile phone camera is shown together with additional prompt text or graphics, realizing real-environment navigation. In the data acquisition mode, the acquisition module receives image information from the camera through the camera access module and parses it into server data for the extraction module to extract, so that the server's data is continuously improved and its applicability continuously increases. The combination of these modules realizes convenient and clear indoor navigation, timely data updating and compatibility with existing indoor guide signage. In addition, the system is simple in structure, easy to implement, broadly applicable, compatible and highly adaptable.
Drawings
FIG. 1 is a block diagram of a real-environment positioning system according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to FIG. 1, the present invention provides a real-environment positioning system, comprising:
the server is used for storing data, analyzing the data and issuing a control command;
the camera access module is used for accessing and opening the camera function of the mobile equipment;
the identification module is used for identifying the image data collected by the camera, comparing and analyzing it against the data stored in the server, and transmitting the comparison result back to the server;
the extraction module is used for extracting server data for the identification module to compare and analyze;
the calling module is used for calling the real-time navigation data of the conventional navigation software;
the VR application module is used for integrating the real-time navigation data from the conventional navigation software with the comparison data from the identification module and processing them into a VR display format;
the display module is used for displaying a VR navigation picture;
the acquisition module is used for calling the camera module to acquire navigation data, analyzing the acquired data and storing the analyzed data in the server;
the server is in communication connection with the calling module, the VR application module, the display module, the acquisition module, the identification module and the extraction module; the camera access module is in communication connection with the identification module and the acquisition module; and the identification module is in communication connection with the extraction module.
When working, the real-environment positioning system operates mainly in two modes: a positioning mode and a data acquisition mode.
In the positioning mode, the camera access module opens the camera of the mobile device and transmits the captured data to the identification module. At the same time, the extraction module extracts the instruction data stored on the server and transmits it to the identification module. The identification module compares the data sent by the extraction module and the camera access module and transmits the comparison result to the server; the server maps the result to the corresponding instruction and transmits that instruction to the VR application module. Meanwhile, the calling module calls real-time navigation data from other conventional navigation software and transmits it to the VR application module through the server. The VR application module combines the two sets of data sent by the server and presents the resulting display data on the display module. The picture captured by the mobile phone camera is finally shown together with additional prompt text or graphics, realizing the real-environment navigation function.
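The positioning-mode data flow described above can be sketched, purely as an assumed and non-limiting example, as the following Python pipeline; every function is a stub standing in for a module, and the concrete comparison, navigation and rendering logic is not specified by this disclosure.

```python
# Hypothetical end-to-end flow of the positioning mode (all names are illustrative).
from typing import Any, Dict


def capture_frame() -> bytes:
    """Camera access module: return the current camera frame (stub)."""
    return b"\x00" * 16


def extract_reference_data(server_db: Dict[str, Any]) -> Dict[str, Any]:
    """Extraction module: pull stored instruction data from the server."""
    return server_db["instructions"]


def identify(frame: bytes, reference: Dict[str, Any]) -> str:
    """Identification module: compare the frame with reference data (stub)."""
    return "corridor_A" if frame else "unknown"


def call_navigation_data() -> Dict[str, Any]:
    """Calling module: fetch real-time route data from ordinary navigation software."""
    return {"next_turn": "left", "distance_m": 12}


def render_vr(instruction: Dict[str, Any], navigation: Dict[str, Any]) -> str:
    """VR application module: merge both data sets into a display overlay."""
    return (f"Overlay: {instruction['hint']} | "
            f"turn {navigation['next_turn']} in {navigation['distance_m']} m")


def positioning_mode(server_db: Dict[str, Any]) -> str:
    frame = capture_frame()
    reference = extract_reference_data(server_db)
    location = identify(frame, reference)              # comparison result to the server
    instruction = server_db["instructions"][location]  # server maps result to an instruction
    navigation = call_navigation_data()                # routed via the server to the VR module
    return render_vr(instruction, navigation)          # shown by the display module


server_db = {"instructions": {"corridor_A": {"hint": "Exit is ahead"},
                              "unknown": {"hint": "No match"}}}
print(positioning_mode(server_db))
```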
In the data acquisition mode, the acquisition module connects, through the camera access module, to the camera of the mobile device, receives the image information captured by the camera, and parses it into server data for the extraction module to extract. In this way the server's data is continuously improved and its applicability continuously increases.
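Similarly, a minimal and purely illustrative sketch of the data acquisition mode, under assumed data shapes not taken from the disclosure, might look as follows.

```python
# Hypothetical data-acquisition flow: frames captured by the camera are parsed
# by the acquisition module and stored as new reference data on the server.
from typing import Any, Dict, List


def parse_frames(frames: List[bytes]) -> Dict[str, Any]:
    """Acquisition module: turn raw frames into server-ready reference entries (stub)."""
    return {f"scene_{i}": {"hint": "collected landmark"} for i, _ in enumerate(frames)}


def acquisition_mode(server_db: Dict[str, Any], frames: List[bytes]) -> None:
    """Merge newly parsed data into the server store for later extraction."""
    server_db.setdefault("instructions", {}).update(parse_frames(frames))


server_db: Dict[str, Any] = {}
acquisition_mode(server_db, [b"frame1", b"frame2"])
print(len(server_db["instructions"]))  # 2 new reference entries now available for extraction
```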
In this embodiment, the acquisition module comprises a mobile phone acquisition unit, a simple device acquisition unit and an acquisition analysis unit. The mobile phone acquisition unit and the simple device acquisition unit are each in communication connection with the camera access module and with the acquisition analysis unit, and the acquisition analysis unit is in communication connection with the server. The mobile phone acquisition unit is an ordinary mobile phone; it can collect data anytime and anywhere without dedicated personnel, making collection quick and convenient. The simple device acquisition unit comprises simple collectors such as tablet computers and notebook computers. The acquisition analysis unit analyses and converts the data from the mobile phone acquisition unit and the simple device acquisition unit and finally transmits it to the server, where it becomes server data for the extraction module to extract.
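A non-limiting sketch of this acquisition-module arrangement, with all class and field names assumed for illustration only, is given below: two source units feed a shared analysis unit whose output would be transmitted to the server.

```python
# Illustrative sketch of the acquisition module (names are assumptions).
from typing import Dict, List


class AcquisitionUnit:
    """A data source: either a mobile phone or a simple device (tablet, laptop)."""

    def __init__(self, kind: str) -> None:
        self.kind = kind

    def collect(self) -> List[bytes]:
        """Grab frames via the camera access module (stubbed here)."""
        return [b"raw-frame"]


class AcquisitionAnalysisUnit:
    """Converts raw frames from either source unit into a server-ready record."""

    def analyse(self, source: AcquisitionUnit) -> Dict[str, str]:
        frames = source.collect()
        return {"source": source.kind, "frames": str(len(frames))}


analysis = AcquisitionAnalysisUnit()
records = [analysis.analyse(AcquisitionUnit("mobile_phone")),
           analysis.analyse(AcquisitionUnit("simple_device"))]
print(records)  # these records would be transmitted to the server for extraction
```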
In this embodiment, the camera access module comprises a camera opening unit, a model identification unit and a data transmission unit connected in sequence; the camera opening unit is connected with the acquisition module, and the data transmission unit is connected with the identification module. The camera opening unit opens the camera of the mobile device. The model identification unit identifies mobile devices of different models so that the appropriate control instruction can be called from the server to open the camera of each device, improving the adaptability of the camera access module. The data transmission unit transmits the data captured by the camera to the identification module for identification and comparison.
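As an assumed, non-limiting example, the camera access chain could be sketched as follows; the per-model instruction table and all function names are invented for illustration and are not part of the disclosure.

```python
# Sketch of the camera access module chain: identify the device model, fetch the
# matching open-camera instruction from the server, then forward frames onward.
from typing import Callable, Dict

# Hypothetical per-model control instructions held on the server.
OPEN_CAMERA_INSTRUCTIONS: Dict[str, str] = {
    "android": "open_camera_android",
    "ios": "open_camera_ios",
}


def identify_model(device_signal: str) -> str:
    """Model identification unit: map a device signal to a known model family."""
    return "ios" if "iPhone" in device_signal else "android"


def open_camera(model: str) -> str:
    """Camera opening unit: apply the server-provided instruction for this model."""
    return OPEN_CAMERA_INSTRUCTIONS.get(model, "open_camera_generic")


def transmit(frame: bytes, sink: Callable[[bytes], None]) -> None:
    """Data transmission unit: hand the captured frame to the identification module."""
    sink(frame)


instruction = open_camera(identify_model("iPhone; iOS 13"))
transmit(b"frame-bytes", lambda f: print(instruction, len(f)))
```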
In this embodiment, the identification module comprises an image receiving unit, an image feature capturing unit and a comparison unit connected in sequence. The image receiving unit is connected with the camera access module, the image feature capturing unit and the comparison unit are both connected with the extraction module, and the comparison unit is further in communication connection with the server. The image receiving unit receives the image data sent by the camera access module. The image feature capturing unit captures the features of the image data transmitted by the camera access module and the extraction module and passes them to the comparison unit. After comparing the features and the image data, the comparison unit transmits the comparison result to the server for the next step of display control.
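A toy, non-limiting sketch of this identification chain follows; the byte-histogram "features" and the distance-based match are placeholders, since the disclosure does not specify the feature-capture or comparison algorithm.

```python
# Sketch of the identification module: receive an image, capture a crude feature
# vector, and compare it with stored features (all of this is illustrative only).
from collections import Counter
from typing import Dict, List


def capture_features(image: bytes) -> List[float]:
    """Image feature capturing unit: a toy normalised byte histogram."""
    counts = Counter(image)
    total = max(len(image), 1)
    return [counts.get(v, 0) / total for v in range(256)]


def compare(features: List[float], stored: Dict[str, List[float]]) -> str:
    """Comparison unit: pick the stored entry with the smallest L1 distance."""
    def distance(a: List[float], b: List[float]) -> float:
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(stored, key=lambda name: distance(features, stored[name]))


stored = {"lobby": capture_features(b"lobby-reference-image"),
          "corridor": capture_features(b"corridor-reference-image")}
result = compare(capture_features(b"lobby-reference-imagx"), stored)
print(result)  # comparison result that would be sent back to the server
```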
In this embodiment, the extraction module comprises an image data extraction unit and a stored image extraction unit. The input end of the image data extraction unit is connected with the server, its output end is connected with both the stored image extraction unit and the comparison unit, and the stored image extraction unit is connected with the image feature capturing unit. The image data extraction unit extracts the instruction data corresponding to the image data on the server and transmits it to the comparison unit for standby; the image data itself is extracted by the stored image extraction unit and sent to the image feature capturing unit for feature capture. Finally, once the comparison result is matched with the instruction data, the comparison unit transmits the corresponding instruction data to the server to realize the server's subsequent display control.
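Finally, a non-limiting sketch of the extraction module, with assumed data shapes not taken from the disclosure, might look as follows: one path yields stored images for feature capture, the other yields the matching instruction data held on standby for the comparison unit.

```python
# Sketch of the extraction module (data shapes and names are assumptions).
from typing import Any, Dict, Tuple


def extract_for_comparison(server_db: Dict[str, Any]) -> Tuple[Dict[str, bytes], Dict[str, Any]]:
    """Return (stored images for feature capture, instruction data on standby)."""
    images = {name: entry["image"] for name, entry in server_db.items()}
    instructions = {name: entry["instruction"] for name, entry in server_db.items()}
    return images, instructions


server_db = {
    "lobby": {"image": b"lobby-reference-image",
              "instruction": {"hint": "Reception ahead"}},
    "corridor": {"image": b"corridor-reference-image",
                 "instruction": {"hint": "Turn left for exit"}},
}
images, instructions = extract_for_comparison(server_db)
print(list(images), instructions["lobby"])  # handed to the feature capture / comparison units
```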
The present invention is not limited to the above preferred embodiments, and any modifications, equivalent substitutions and improvements made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (5)

1. A real-environment positioning system, comprising:
the server is used for storing data, analyzing the data and issuing a control command;
the camera access module is used for accessing and opening the camera function of the mobile equipment;
the identification module is used for identifying the image data collected by the camera, comparing and analyzing it against the data stored in the server, and transmitting the comparison result back to the server;
the extraction module is used for extracting server data for the identification module to compare and analyze;
the calling module is used for calling the real-time navigation data of the conventional navigation software;
the VR application module is used for integrating the real-time navigation data from the conventional navigation software with the comparison data from the identification module and processing them into a VR display format;
the display module is used for displaying a VR navigation picture;
the acquisition module is used for calling the camera module to acquire navigation data, analyzing the acquired data and storing the analyzed data in the server;
wherein the server is in communication connection with the calling module, the VR application module, the display module, the acquisition module, the identification module and the extraction module; the camera access module is in communication connection with the identification module and the acquisition module; and the identification module is in communication connection with the extraction module.
2. The real-environment positioning system according to claim 1, wherein the acquisition module comprises a mobile phone acquisition unit, a simple device acquisition unit and an acquisition analysis unit, the mobile phone acquisition unit and the simple device acquisition unit are each in communication connection with the camera access module and with the acquisition analysis unit, and the acquisition analysis unit is in communication connection with the server.
3. The real-environment positioning system according to claim 1, wherein the camera access module comprises a camera opening unit, a model identification unit and a data transmission unit connected in sequence, the camera opening unit is connected with the acquisition module, and the data transmission unit is connected with the identification module.
4. The real-environment positioning system according to claim 1, wherein the identification module comprises an image receiving unit, an image feature capturing unit and a comparison unit connected in sequence, the image receiving unit is connected with the camera access module, the image feature capturing unit and the comparison unit are both connected with the extraction module, and the comparison unit is further in communication connection with the server.
5. The real-environment positioning system according to claim 1, wherein the extraction module comprises an image data extraction unit and a stored image extraction unit, the input end of the image data extraction unit is connected with the server, the output end of the image data extraction unit is connected with both the stored image extraction unit and the comparison unit, and the stored image extraction unit is connected with the image feature capturing unit.
CN201910273090.4A 2019-04-04 2019-04-04 Real-environment positioning system Pending CN111787273A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910273090.4A CN111787273A (en) 2019-04-04 2019-04-04 Real-environment positioning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910273090.4A CN111787273A (en) 2019-04-04 2019-04-04 Real-environment positioning system

Publications (1)

Publication Number Publication Date
CN111787273A true CN111787273A (en) 2020-10-16

Family

ID=72755413

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910273090.4A Pending CN111787273A (en) 2019-04-04 2019-04-04 Real-environment positioning system

Country Status (1)

Country Link
CN (1) CN111787273A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102980570A (en) * 2011-09-06 2013-03-20 上海博路信息技术有限公司 Live-scene augmented reality navigation system
CN104266654A (en) * 2014-09-26 2015-01-07 广东好帮手电子科技股份有限公司 Vehicle real scene navigation system and method
US20170116783A1 (en) * 2015-10-26 2017-04-27 Institute Of Nuclear Energy Research Atomic Energy Council, Executive Yuan Navigation System Applying Augmented Reality
CN109520510A (en) * 2018-12-26 2019-03-26 安徽智恒信科技有限公司 A kind of indoor navigation method and system based on virtual reality technology


Similar Documents

Publication Publication Date Title
EP2207113B1 (en) Automated annotation of a view
CN105847750B (en) The method and device of UAV Video image real-time display based on geocoding
CN107885096B (en) Unmanned aerial vehicle patrols and examines three-dimensional emulation monitored control system of flight path
AU2018426323B2 (en) Method and apparatus for planning sample points for surveying and mapping, control terminal, and storage medium
CN103514626A (en) Method and device for displaying weather information and mobile terminal
CN103514621B (en) The authentic dynamic 3D reproducting method of case, event scenarios and reconfiguration system
US20230162449A1 (en) Systems and methods for data transmission and rendering of virtual objects for display
CN103971589A (en) Processing method and device for adding interest point information of map to street scene images
JP6985777B1 (en) Educational service provision methods and equipment using satellite images of artificial intelligence infrastructure
CN110362895B (en) Land acquisition removal application management system based on BIM + GIS technology
US20220138467A1 (en) Augmented reality utility locating and asset management system
CN116343103B (en) Natural resource supervision method based on three-dimensional GIS scene and video fusion
CN107808009B (en) Stamp platform-based two-dimensional and three-dimensional map linkage method
CN111710041B (en) System and environment simulation method based on multi-source heterogeneous data fusion display technology
CN115798265A (en) Digital tower construction method based on digital twinning technology and implementation system thereof
CN114372107A (en) GIS-based method and system for visualizing homeland improvement and ecological restoration data
CN109857826B (en) Video camera visual field marking system and marking method thereof
CN115035626A (en) Intelligent scenic spot inspection system and method based on AR
CN111787273A (en) Real-environment positioning system
Hong et al. The use of CCTV in the emergency response: A 3D GIS perspective
CN110196638B (en) Mobile terminal augmented reality method and system based on target detection and space projection
CN113157158A (en) Community global analysis large-screen interaction method and system
CN113239076A (en) Geographic information inquiry management platform based on three-dimensional image
CN113612927B (en) Digital imaging system based on photogrammetry and remote sensing technology
CN115906206B (en) Space computing equipment based on GIS and space planning analysis method

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (Application publication date: 20201016)