CN108020854B - Scene target situation display method and system - Google Patents

Scene target situation display method and system

Info

Publication number
CN108020854B
CN108020854B (application CN201711013533.3A)
Authority
CN
China
Prior art keywords
coordinate information
target situation
scene
scene target
station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711013533.3A
Other languages
Chinese (zh)
Other versions
CN108020854A (en)
Inventor
邱志豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Zhongnan Civil Aviation Air Traffic Control Technology Equipment Engineering Co ltd
Original Assignee
Guangzhou Zhongnan Civil Aviation Air Traffic Control Technology Equipment Engineering Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Zhongnan Civil Aviation Air Traffic Control Technology Equipment Engineering Co ltd filed Critical Guangzhou Zhongnan Civil Aviation Air Traffic Control Technology Equipment Engineering Co ltd
Priority to CN201711013533.3A priority Critical patent/CN108020854B/en
Publication of CN108020854A publication Critical patent/CN108020854A/en
Application granted granted Critical
Publication of CN108020854B publication Critical patent/CN108020854B/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/43 Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G01S19/40 Correcting position, velocity or attitude
    • G01S19/41 Differential correction, e.g. DGPS [differential GPS]
    • G01S19/421 Determining position by combining or switching between position solutions or signals derived from different satellite radio beacon positioning systems; by combining or switching between position solutions or signals derived from different modes of operation in a single system
    • G01S19/423 Determining position by combining or switching between position solutions or signals derived from different satellite radio beacon positioning systems

Abstract

The invention discloses a scene target situation display method, which comprises the following steps: acquiring first coordinate information of a mobile device located in a current area; calculating second coordinate information from the distances between each receiving station and a reference beacon located on the mobile device; calculating the positioning accuracy of the current area from the first coordinate information and the second coordinate information; and outputting a scene target situation map for the receiving stations in their current working state according to the positioning accuracy of each area, wherein the working state of the receiving stations indicates the number of receiving stations that are switched on. The invention also discloses a scene target situation display system. The method and system can display changes in the positioning accuracy of the scene in real time, so that a user can make decisions and coordinate work according to those changes.

Description

Scene target situation display method and system
Technical Field
The invention relates to the field of positioning systems, in particular to a scene target situation display method and a scene target situation display system.
Background
The existing way of displaying the target situation on the scene (i.e., the airport surface) is to use a static map of the local airport layout as a background and to overlay target information (aircraft or vehicles) on it for dynamic display. The static map accurately depicts the shape and area of each component, such as runways, taxiways, link taxiways, aprons and stands, and also marks the relative positions of the receivers of the scene positioning system. The static map corresponds point by point to the actual layout of the local airport: by selecting several fixed reference points in the static map and associating them with the longitude and latitude of the corresponding positions at the airport, the map can be zoomed and browsed at a consistent scale.
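The association between the static map and real-world coordinates can be implemented in several ways; the following is a minimal sketch (an assumption for illustration, not a method stated in the patent) that fits an affine transform from a few longitude/latitude control points to map pixel coordinates, so that any position can be projected onto the static map at a consistent scale. Function names are illustrative.

```python
# Illustrative sketch (assumption): fit an affine transform from control points
# (lat, lon) -> (pixel_x, pixel_y) so targets can be drawn on the static map.
import numpy as np

def fit_affine(latlon_pts, pixel_pts):
    """Fit pixel = [lat, lon, 1] @ A from >= 3 control-point pairs (least squares)."""
    latlon = np.column_stack([np.asarray(latlon_pts, float), np.ones(len(latlon_pts))])
    pixels = np.asarray(pixel_pts, float)
    A, *_ = np.linalg.lstsq(latlon, pixels, rcond=None)  # (3, 2) transform matrix
    return A

def to_pixels(A, lat, lon):
    """Project a lat/lon position onto the static map using the fitted transform."""
    return np.array([lat, lon, 1.0]) @ A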
Positioning accuracy is one of the key technical indicators of a scene multipoint (multilateration) positioning system and is closely related to the siting of the receivers on the field. When equipment fails or maintenance and repair are carried out, a station may have to be shut down and the positioning accuracy changes as a result. Both scene-control users and equipment-maintenance users need to monitor changes in the positioning accuracy of the scene multipoint positioning system, to ensure that it still meets the technical requirements of the industry and that normal operations are not affected. However, existing scene target situation display methods do not take changes of the on-scene positioning accuracy into account and cannot meet the user's need to view the real-time positioning accuracy of the scene.
Disclosure of Invention
In order to overcome the defects of the prior art, a first objective of the present invention is to provide a scene target situation display method that solves the problem that existing scene target situation display methods cannot meet the user's need to view the real-time positioning accuracy of the scene.
A second objective of the present invention is to provide a scene target situation display system that solves the same problem.
The first objective of the invention is achieved by the following technical solution:
a scene target situation display method comprises the following steps:
acquiring first coordinate information of a mobile device located in a current area;
calculating second coordinate information according to the distance between each receiving station and a reference beacon positioned on the mobile equipment;
calculating the positioning precision of the current area according to the first coordinate information and the second coordinate information;
and outputting a scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region, wherein the working state of each receiving station represents the number of the opened receiving stations.
Further, the obtaining first coordinate information of a mobile device located in the current area includes:
position information of an RTK GNSS rover located on a mobile device is acquired, wherein the RTK GNSS rover cooperates with a fixedly-located RTK GNSS reference station.
Further, the calculating the positioning accuracy of the current area according to the first coordinate information and the second coordinate information includes:
and if the acquisition time of the first coordinate information is the same as the acquisition time of the second coordinate information, calculating the error distance between the first coordinate information and the second coordinate information.
Further, the calculating the positioning accuracy of the current area according to the first coordinate information and the second coordinate information further includes:
if the acquisition time of the second coordinate information is between the acquisition times of the two pieces of first coordinate information, calculating third coordinate information between the two pieces of first coordinate information through an interpolation algorithm;
and calculating the error distance of the third coordinate information and the second coordinate information.
Further, the obtaining the first coordinate information of the mobile device located in the current area further comprises:
and dividing the area to be measured into a plurality of cells, wherein the current area is any cell.
Further, the outputting the scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region includes:
and marking different colors for the cells with different positioning accuracies.
The second objective of the invention is achieved by the following technical solution:
a scene object situation display system comprising: the system comprises mobile equipment, a scene multipoint positioning system and a differential GPS system; the mobile device is used for moving in the current area; the scene multipoint positioning system comprises a reference beacon, a processor and a plurality of receiving stations; the differential GPS system is used for acquiring first coordinate information of the mobile equipment and sending the first coordinate information to the processor;
the reference beacon is located on the mobile device, the reference beacon for transmitting signals to a plurality of the receiving stations; the receiving station is used for feeding back the received signals to the processor; the processor is used for calculating the distance between each receiving station and the reference beacon according to the feedback time of each receiving station, so as to calculate second coordinate information of the mobile device;
the processor is further used for calculating the positioning precision of the current area according to the first coordinate information and the second coordinate information; the processor is also used for recording the working state of each receiving station; and the processor is also used for drawing a scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region.
Further, the differential GPS system includes an RTK GNSS rover station located on the mobile device and an RTK GNSS base station fixedly disposed, the RTK GNSS rover station being configured to cooperate with the RTK GNSS base station to acquire position information of the RTK GNSS rover station.
Further, the processor comprises a central processing server and a display terminal, wherein the central processing server is used for calculating the positioning accuracy of the current area according to the first coordinate information and the second coordinate information; and the display terminal is used for outputting a scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region.
Further, the central processing server is further configured to divide the area to be measured into a plurality of cells, where the current area is any cell.
Compared with the prior art, the invention has the following beneficial effects. First coordinate information of a mobile device located in the current area is acquired, second coordinate information is calculated from the receiving stations and the reference beacon on the mobile device, and the positioning accuracy of the current area is calculated from the first and second coordinate information. By driving the mobile device through every area and calculating the positioning accuracy of each area from the received data, the positioning accuracy of each area under the current working state of the receiving stations is obtained; by adjusting the working state of the receiving stations, scene target situation maps under different working states can be output. The change of the scene positioning accuracy can therefore be displayed in real time, so that the user can make decisions and coordinate work according to the change of the positioning accuracy.
Drawings
Fig. 1 is a flowchart of a scene target situation display method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a scene target situation display system according to an embodiment of the present invention.
Detailed Description
The present invention is further described below with reference to the accompanying drawings and the detailed description; it should be noted that, provided there is no conflict, the embodiments or technical features described below can be combined to form new embodiments.
As shown in fig. 1, a method for displaying a situation of a scene object according to an embodiment of the present invention includes:
step S101: first coordinate information of a mobile device located in a current area is acquired.
Specifically, taking an area to be measured located at an airport as an example, several receiving stations and an RTK GNSS reference station are arranged around the area to be measured according to the measurement requirements. An RTK GNSS rover station is fixed on the mobile device; the mobile device may be a vehicle-mounted device, which moves at low speed along a designed route so that the driving route completely covers all runways, taxiways, link taxiways, aprons and stands, i.e. the area to be measured. While the mobile device moves, the RTK GNSS rover station works together with the fixed RTK GNSS reference station: the rover station acquires its own position information in real time and sends it to the processor, and this position information is the first coordinate information of the mobile device in the current area. How the RTK GNSS rover station and the fixed RTK GNSS reference station cooperate to acquire the rover's position is prior art and is not repeated here. In order to display the situation map of the area to be measured intuitively, a static map corresponding to the actual layout of the airport is drawn before data collection; the static map contains elements such as runways, taxiways, link taxiways, aprons and stands, and the position of each receiving station is marked on it. The static map is then divided in the form of a grid matrix, so that the area to be measured in the static map consists of a series of grids and the actual area to be measured is correspondingly divided into a plurality of cells; according to the scale of the static map, each cell of the actual area is 5 m x 5 m, and the current area is any one of these cells. The route of the mobile device covers every cell, and while the mobile device is in each cell the RTK GNSS rover station collects the first coordinate information at least once and sends it to the processor.
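As a minimal illustration of the grid division described above, the following sketch (an assumption, not code from the patent) assigns RTK fixes to 5 m x 5 m cells, assuming positions have already been converted into a local east/north frame in metres; the function names and data layout are illustrative.

```python
# Illustrative sketch (assumption): group first-coordinate fixes by 5 m x 5 m grid cell.
from collections import defaultdict

CELL_SIZE_M = 5.0  # cell edge length used in this embodiment

def cell_index(x_m, y_m, cell_size=CELL_SIZE_M):
    """Return the (column, row) index of the grid cell containing point (x_m, y_m)."""
    return int(x_m // cell_size), int(y_m // cell_size)

def group_fixes_by_cell(fixes):
    """fixes: iterable of (t, x_m, y_m) tuples; returns {cell: [(t, x, y), ...]}."""
    by_cell = defaultdict(list)
    for t, x, y in fixes:
        by_cell[cell_index(x, y)].append((t, x, y))
    return by_cell
```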
Step S102: second coordinate information is calculated based on the distance of each receiving station from a reference beacon located on the mobile device.
Specifically, a reference beacon is fixed on the mobile device; the reference beacon sends signals to the receiving stations, and each receiving station feeds the received signal back to the processor. Because the distances from the receiving stations to the reference beacon differ, the times at which the stations receive a signal sent by the beacon also differ. From the feedback times of the receiving stations the processor can calculate the distance between each receiving station and the reference beacon, and from these distances it can calculate the coordinate information of the reference beacon, i.e. the second coordinate information. How the processor calculates the coordinates of the reference beacon from the distances to the receiving stations is prior art and is not repeated here. The processor calculates the second coordinate information at least once while the mobile device is in each cell.
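The patent treats the position computation from the station-to-beacon distances as prior art; for context only, the following sketch shows one standard way to perform it, an iterative least-squares (Gauss-Newton) solution of the range equations. It is an assumption for illustration, not the patent's specific algorithm.

```python
# Illustrative sketch (standard prior-art technique, an assumption): estimate the beacon
# position from its measured ranges to the receiving stations by linearizing the range
# equations and solving in the least-squares sense.
import numpy as np

def multilaterate(stations, ranges, x0=None, iterations=10):
    """stations: (N, 2) station positions [m]; ranges: (N,) measured distances [m]."""
    stations = np.asarray(stations, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    x = np.mean(stations, axis=0) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iterations):
        diffs = x - stations                         # (N, 2)
        pred = np.maximum(np.linalg.norm(diffs, axis=1), 1e-9)  # predicted ranges
        J = diffs / pred[:, None]                    # Jacobian of range w.r.t. position
        dx, *_ = np.linalg.lstsq(J, ranges - pred, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-4:
            break
    return x  # estimated (x, y), i.e. the second coordinate information
```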
Step S103: and calculating the positioning precision of the current area according to the first coordinate information and the second coordinate information. The method comprises the following steps:
step S1031: and if the acquisition time of the first coordinate information is the same as the acquisition time of the second coordinate information, calculating the error distance between the first coordinate information and the second coordinate information.
Specifically, assume that, based on the feedback signals of the receiving stations, the processor calculates and outputs a trace point PM_i with coordinates (PMX_i, PMY_i), i.e. the second coordinate information, and records its time as PMT_i. Correspondingly, from the signals sent by the RTK GNSS rover station, the processor outputs, in the time segment close to PMT_i, two adjacent trace points PD_i and PD_(i+1) with coordinates (PDX_i, PDY_i) and (PDX_(i+1), PDY_(i+1)) respectively, i.e. the first coordinate information, and records their times as PDT_i and PDT_(i+1).
Within the same cell, if the time PMT_i is equal to PDT_i, the error distance L_i between trace point PM_i and trace point PD_i can be calculated directly as L_i * L_i = (PMX_i - PDX_i) * (PMX_i - PDX_i) + (PMY_i - PDY_i) * (PMY_i - PDY_i), which gives the positioning accuracy of the cell.
Step S1032: and if the acquisition time of the second coordinate information is between the acquisition times of the two pieces of first coordinate information, calculating third coordinate information between the two pieces of first coordinate information by an interpolation algorithm.
Specifically, within the same cell, if the time PMT_i lies between PDT_i and PDT_(i+1), a new trace point PD'_i with coordinates (PDX'_i, PDY'_i), i.e. the third coordinate information, can be calculated by interpolation, where PDX'_i = PDX_i + (PDX_(i+1) - PDX_i) * (PMT_i - PDT_i) / (PDT_(i+1) - PDT_i) and PDY'_i = PDY_i + (PDY_(i+1) - PDY_i) * (PMT_i - PDT_i) / (PDT_(i+1) - PDT_i).
Step S1033: and calculating the error distance of the third coordinate information and the second coordinate information.
Specifically, the error distance L'_i between the trace point PM_i and the new trace point PD'_i is calculated as L'_i * L'_i = (PMX_i - PDX'_i) * (PMX_i - PDX'_i) + (PMY_i - PDY'_i) * (PMY_i - PDY'_i), which gives the positioning accuracy of the cell.
If, within the same cell, the second coordinate information output by the processor from the feedback of the receiving stations consists of several trace points and the first coordinate information also consists of several trace points, the error distance between the first and second coordinate information is calculated for each pair of identical or adjacent times as described above, and the average of these error distances is taken as the positioning accuracy of the cell.
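The per-cell accuracy computation of steps S1031 to S1033 can be summarised in a short sketch (its naming and data layout are assumptions, not code from the patent): for each multilateration trace point, the RTK fix at the same time is used, or a third point is interpolated between the two bracketing RTK fixes, and the error distances within the cell are averaged.

```python
# Illustrative sketch (assumption) of steps S1031-S1033 plus the per-cell average.
import math

def interpolate(p0, p1, t):
    """Linear interpolation of a (t, x, y) track between fixes p0 and p1 at time t."""
    t0, x0, y0 = p0
    t1, x1, y1 = p1
    a = (t - t0) / (t1 - t0)
    return x0 + a * (x1 - x0), y0 + a * (y1 - y0)

def cell_accuracy(mlat_points, rtk_points):
    """mlat_points/rtk_points: time-sorted lists of (t, x, y); returns mean error [m]."""
    errors = []
    for tm, xm, ym in mlat_points:
        for (t0, x0, y0), (t1, x1, y1) in zip(rtk_points, rtk_points[1:]):
            if t0 <= tm <= t1:
                if tm == t0:
                    xr, yr = x0, y0            # same acquisition time (step S1031)
                elif tm == t1:
                    xr, yr = x1, y1
                else:                          # interpolated third point (step S1032)
                    xr, yr = interpolate((t0, x0, y0), (t1, x1, y1), tm)
                errors.append(math.hypot(xm - xr, ym - yr))  # error distance (S1033)
                break
    return sum(errors) / len(errors) if errors else None
```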
Step S104: and outputting a scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region, wherein the working state of each receiving station represents the number of the opened receiving stations.
Specifically, when all receiving stations are switched on, the processor calculates the positioning accuracy of each cell from the received data according to step S103 and marks cells of different positioning accuracy with different colours, the colours representing the accuracy values, thereby obtaining the scene target situation map with all receiving stations switched on. For example, the colours of the cells may be defined as follows (a sketch of this mapping is given after the list):
Grayscale 1 (light): positioning accuracy below 7.5 metres;
Grayscale 2 (medium): positioning accuracy between 7.5 and 12 metres;
Grayscale 3 (dark): positioning accuracy between 12 and 20 metres;
Grayscale 4 (darkest): positioning accuracy above 20 metres.
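A minimal sketch of this colour grading, using the example thresholds above; the returned labels and the handling of cells without data are assumptions for display purposes.

```python
# Illustrative sketch (assumption): map a cell's positioning accuracy to a display grade.
def grade_cell(accuracy_m):
    """accuracy_m: mean error distance in metres for the cell, or None if no data."""
    if accuracy_m is None:
        return "no data"
    if accuracy_m < 7.5:
        return "grayscale 1 (light)"
    if accuracy_m < 12.0:
        return "grayscale 2 (medium)"
    if accuracy_m < 20.0:
        return "grayscale 3 (dark)"
    return "grayscale 4 (darkest)"
```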
Switching off one or more receiving stations is equivalent to not collecting the data of those stations. To reduce testing time, the positioning accuracy of each cell can be recalculated after removing all test data of the corresponding stations; the result is the positioning accuracy of each cell when those stations are switched off, and cells of different positioning accuracy can again be marked with different colours. For example, assume there are ten receiving stations. By collecting coordinate information with all ten stations switched on, the processor can output the scene target situation map for that state; to output the map for a state in which one or more of the ten stations are switched off, the feedback data of those stations are removed, the positioning accuracy of each cell is recalculated, and the corresponding map is output. In another embodiment of the invention, the positioning accuracy may instead be measured separately in each working state of the receiving stations and the corresponding scene target situation map output; for example, with ten receiving stations, if a situation map with one or more stations switched off is required, the positioning accuracy of each cell is tested in that state and the map for that state is output. Drawing the scene target situation maps for the different working states visually shows the influence of each receiver on the positioning accuracy of each cell and helps the user to make decisions and coordinate arrangements.
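The "remove the data of the closed stations and recalculate" approach can be sketched as follows, reusing the multilateration and per-cell accuracy helpers from the earlier sketches; the data layout, the STATION_POS lookup table and the minimum of three usable stations per fix are assumptions, not taken from the patent.

```python
# Illustrative sketch (assumption): recompute per-cell accuracy using only the data of
# the receiving stations assumed to be switched on, reusing multilaterate() and
# cell_accuracy() from the sketches above.
STATION_POS = {}  # assumption: station_id -> (x_m, y_m) surveyed station coordinates

def situation_map(measurements, rtk_tracks, stations_on):
    """
    measurements: {cell: [(t, {station_id: range_m}), ...]} raw ranges per cell
    rtk_tracks:   {cell: [(t, x, y), ...]} RTK reference fixes per cell
    stations_on:  set of station ids assumed to be switched on
    Returns {cell: accuracy_m} using only data from the stations that are on.
    """
    result = {}
    for cell, samples in measurements.items():
        mlat_points = []
        for t, ranges in samples:
            usable = {sid: r for sid, r in ranges.items() if sid in stations_on}
            if len(usable) >= 3:  # need enough stations for a 2-D position fix
                ids = sorted(usable)
                pos = multilaterate([STATION_POS[sid] for sid in ids],
                                    [usable[sid] for sid in ids])
                mlat_points.append((t, pos[0], pos[1]))
        result[cell] = cell_accuracy(mlat_points, rtk_tracks.get(cell, []))
    return result
```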
As shown in fig. 2, a scene target situation display system provided in an embodiment of the present invention comprises a mobile device 1, a scene multipoint positioning system 2 and a differential GPS system 3. The mobile device 1 is used for moving in the current area; the scene multipoint positioning system 2 comprises a reference beacon 21, a processor 22 and a plurality of receiving stations 23; the differential GPS system 3 is used for acquiring first coordinate information of the mobile device 1 and sending it to the processor 22. The reference beacon 21 is located on the mobile device 1 and transmits signals to the receiving stations 23; each receiving station 23 feeds the received signal back to the processor 22; the processor 22 calculates the distance between each receiving station 23 and the reference beacon 21 from the feedback time of each receiving station 23 and thereby calculates the second coordinate information of the mobile device 1. The processor 22 is further configured to calculate the positioning accuracy of the current area from the first and second coordinate information, to record the working state of each receiving station 23 (the working state indicating the number of receiving stations 23 that are switched on), and to draw a scene target situation map of the receiving stations 23 in the current working state according to the positioning accuracy of each area. During the movement of the mobile device 1, the processor 22 receives the feedback signals from the receiving stations 23 and calculates the second coordinate information, and calculates the positioning accuracy of the current area from the first coordinate information received from the differential GPS system 3, so the positioning accuracy of each area can be output. By changing the number of receiving stations 23 that are switched on and repeating the measurement or calculation, scene target situation maps of the receiving stations in different working states can be output.
The differential GPS system 3 comprises an RTK GNSS rover station 31 and an RTK GNSS reference station 32; the RTK GNSS rover station 31 is located on the mobile device 1 and the RTK GNSS reference station 32 is fixedly arranged, the rover station 31 cooperating with the reference station 32 to acquire the position information of the rover station 31.
The processor 22 includes a central processing server 221 and a display terminal 222, the central processing server 221 is configured to receive the feedback information of the receiving stations 23, calculate second coordinate information of the reference beacon 21 according to the feedback time of each receiving station 23, receive first coordinate information of the RTK GNSS rover station 31, and calculate positioning accuracy of the current area according to the first coordinate information and the second coordinate information; the display terminal 222 is configured to output a scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region.
As a preferred embodiment, the central processing server 221 is further configured to divide the area to be measured into a plurality of cells, where the current area is any cell. And drawing a scene target situation map according to the positioning accuracy of the cells.
The test and calculation principle of the scene target situation display system provided by this embodiment is the same as that of the scene target situation display method, and is not described herein again.
With the scene target situation display method and system provided by the invention, first coordinate information of the mobile device in the current area is acquired, second coordinate information is calculated from the receiving stations and the reference beacon on the mobile device, and the positioning accuracy of the current area is calculated from the first and second coordinate information. The mobile device passes through every area and the positioning accuracy of each area is calculated from the received data, giving the positioning accuracy of each area under the current working state of the receiving stations; by adjusting the working state of the receiving stations, scene target situation maps under different working states are output. The change of the scene positioning accuracy can therefore be displayed in real time, so that the user can make decisions and coordinate work according to the change of the positioning accuracy.
The above embodiments are only preferred embodiments of the present invention, and the protection scope of the present invention is not limited thereby, and any insubstantial changes and substitutions made by those skilled in the art based on the present invention are within the protection scope of the present invention.

Claims (10)

1. A scene target situation display method is characterized by comprising the following steps:
acquiring first coordinate information of a mobile device located in a current area;
calculating second coordinate information according to the distance between each receiving station and a reference beacon positioned on the mobile equipment;
calculating the positioning precision of the current area according to the first coordinate information and the second coordinate information;
and outputting a scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region, wherein the working state of each receiving station represents the number of the opened receiving stations.
2. The scene target situation display method according to claim 1, wherein said obtaining first coordinate information of a mobile device located in a current area comprises:
position information of an RTK GNSS rover located on a mobile device is acquired, wherein the RTK GNSS rover cooperates with a fixedly-located RTK GNSS reference station.
3. The scene target situation display method according to claim 1, wherein said calculating the positioning accuracy of the current area based on the first coordinate information and the second coordinate information comprises:
and if the acquisition time of the first coordinate information is the same as the acquisition time of the second coordinate information, calculating the error distance between the first coordinate information and the second coordinate information.
4. The scene target situation display method according to claim 3, wherein said calculating the positioning accuracy of the current area based on the first coordinate information and the second coordinate information further comprises:
if the acquisition time of the second coordinate information is between the acquisition times of the two pieces of first coordinate information, calculating third coordinate information between the two pieces of first coordinate information through an interpolation algorithm;
and calculating the error distance of the third coordinate information and the second coordinate information.
5. The method for displaying scene target situation according to claim 1, wherein said obtaining the first coordinate information of the mobile device located in the current area further comprises:
and dividing the area to be measured into a plurality of cells, wherein the current area is any cell.
6. The method for displaying a scene target situation according to claim 5, wherein said outputting a scene target situation map of each receiving station in a current working state according to the positioning accuracy of each region comprises:
and marking different colors for the cells with different positioning precisions.
7. A scene object posture display system, comprising: the system comprises mobile equipment, a scene multipoint positioning system and a differential GPS system; the mobile device is used for moving in a current area; the scene multipoint positioning system comprises a reference beacon, a processor and a plurality of receiving stations; the differential GPS system is used for acquiring first coordinate information of the mobile equipment and sending the first coordinate information to the processor;
the reference beacon is located on the mobile device, the reference beacon for transmitting signals to a plurality of the receiving stations; the receiving station is used for feeding back the received signals to the processor; the processor is used for calculating the distance between each receiving station and the reference beacon according to the feedback time of each receiving station, so as to calculate second coordinate information of the mobile device;
the processor is further used for calculating the positioning precision of the current area according to the first coordinate information and the second coordinate information; the processor is also used for recording the working state of each receiving station; and the processor is also used for drawing a scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region.
8. The situational target situation display system of claim 7, wherein said differential GPS system includes an RTK GNSS rover station and an RTK GNSS base station, said RTK GNSS rover station being located on said mobile device, said RTK GNSS base station being fixedly disposed, said RTK GNSS rover station being adapted to cooperate with said RTK GNSS base station for acquiring position information of said RTK GNSS rover station.
9. The scene target situation display system of claim 7, wherein the processor comprises a central processing server and a display terminal, the central processing server is configured to calculate a positioning accuracy of a current area according to the first coordinate information and the second coordinate information; and the display terminal is used for outputting a scene target situation map of each receiving station in the current working state according to the positioning accuracy of each region.
10. The scene target situation display system of claim 9, wherein the central processing server is further configured to divide the area to be measured into a plurality of cells, wherein the current area is any one of the cells.
CN201711013533.3A 2017-10-26 2017-10-26 Scene target situation display method and system Active CN108020854B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711013533.3A CN108020854B (en) 2017-10-26 2017-10-26 Scene target situation display method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711013533.3A CN108020854B (en) 2017-10-26 2017-10-26 Scene target situation display method and system

Publications (2)

Publication Number Publication Date
CN108020854A CN108020854A (en) 2018-05-11
CN108020854B (en) 2022-05-17

Family

ID=62080326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711013533.3A Active CN108020854B (en) 2017-10-26 2017-10-26 Scene target situation display method and system

Country Status (1)

Country Link
CN (1) CN108020854B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109117776B (en) * 2018-08-02 2021-09-07 西安电子工程研究所 Aircraft and meteorological clutter classification and identification method based on flight path information
CN111277945A (en) * 2018-11-20 2020-06-12 北京华信泰科技股份有限公司 RTK positioning method and device
CN109581442B (en) * 2018-12-27 2021-05-11 华为技术有限公司 High-precision satellite positioning method, positioning terminal and positioning system
CN112986980A (en) * 2021-02-09 2021-06-18 北京理工大学 Monitoring system of target situation characteristics

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6119838B2 (en) * 2013-02-26 2017-04-26 日本電気株式会社 Status detection method, correction value processing apparatus, positioning system, and status detection program
KR101349396B1 (en) * 2013-08-02 2014-01-13 명화지리정보(주) Map data processing system for confirming topography change by gps coordinate data
CN106941703B (en) * 2016-01-04 2020-02-18 上海交通大学 Indoor and outdoor seamless positioning device and method based on situation awareness
EP3444636B1 (en) * 2016-04-13 2021-06-09 Positec Power Tools (Suzhou) Co., Ltd Differential global positioning system and a positioning method therefor

Also Published As

Publication number Publication date
CN108020854A (en) 2018-05-11

Similar Documents

Publication Publication Date Title
CN108020854B (en) Scene target situation display method and system
De Angelis et al. GNSS/cellular hybrid positioning system for mobile users in urban scenarios
RU2473443C2 (en) Device and method of defining location of resources with railway station limits
CN106530794B (en) The automatic identification and calibration method and system of carriage way
CN103116990B (en) Traffic speed vehicle-mounted acquisition system and method based on mobile phone switch position
KR20190082071A (en) Method, apparatus, and computer readable storage medium for updating electronic map
Al-Sobky et al. Traffic density determination and its applications using smartphone
CN109544932A (en) A kind of city road network flow estimation method based on GPS data from taxi Yu bayonet data fusion
CN105241465B (en) A kind of method of road renewal
CN104309650B (en) High accuracy Rail Detection route survey device based on high accuracy global positioning system
EP3073451A1 (en) Bus station optimization evaluation method and system
US10728536B2 (en) System and method for camera commissioning beacons
CN109163715B (en) Electric power station selection surveying method based on unmanned aerial vehicle RTK technology
US11237007B2 (en) Dangerous lane strands
US8483961B2 (en) Systems, methods, and computer program products of flight validation
CN102279406A (en) Fence identification method using global positioning system (GPS) to position tracks
CN112965077A (en) Road inspection system and method based on vehicle-mounted laser radar
Xing et al. Traffic volume estimation in multimodal urban networks using cell phone location data
CN105445729A (en) Unmanned plane three-dimensional flight track precision detection method and system
CN104729529B (en) The method and system that map surveying systematic error judges
CN111524394A (en) Method, device and system for improving accuracy of comprehensive track monitoring data of apron
Rogers Creating and evaluating highly accurate maps with probe vehicles
CN111998857B (en) System and method for positioning indoor object position in real time
CN103777196B (en) Based on terrain object distance single station measuring method and the measuring system thereof of geography information
CN111356074B (en) Method and device for positioning bus station, server and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant