CN114690223A - Scene recognition method and device, electronic equipment and computer readable medium

Scene recognition method and device, electronic equipment and computer readable medium

Info

Publication number
CN114690223A
Authority
CN
China
Prior art keywords
satellite
value
observation
satellites
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011633209.3A
Other languages
Chinese (zh)
Inventor
孔超
孙海鹏
王力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qianxun Spatial Intelligence Inc
Original Assignee
Qianxun Spatial Intelligence Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qianxun Spatial Intelligence Inc filed Critical Qianxun Spatial Intelligence Inc
Priority to CN202011633209.3A
Publication of CN114690223A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/396Determining accuracy or reliability of position or pseudorange measurements

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The present disclosure relates to a scene recognition method and apparatus, an electronic device, and a computer-readable medium, and belongs to the technical field of satellite positioning. The method includes the following steps: obtaining valid observation data of each satellite at the current time; extracting a basic observation value of each satellite from the valid observation data, and deriving a satellite signal feature value for the current time from those basic observation values; obtaining a feature value threshold corresponding to each scene type, and determining the scene type at the current time according to the relationship between the satellite signal feature value and the feature value threshold of each scene type; and outputting a scene type label corresponding to the scene type at the current time, and issuing a satellite positioning deviation warning according to that scene type. By extracting and statistically analyzing satellite signal feature values and comparing them with the feature value thresholds of the various scene types, the method determines the scene type at the current time and can improve the reliability and accuracy of scene recognition.

Description

Scene recognition method and device, electronic equipment and computer readable medium
Technical Field
The present disclosure relates to the field of satellite positioning technologies, and in particular, to a scene recognition method, a scene recognition apparatus, an electronic device, and a computer-readable medium.
Background
A Global Navigation Satellite System (GNSS) offers global coverage, all-weather operation, high-precision positioning, and other advantages, and plays an important role in vehicle navigation and positioning applications. With the rise of autonomous driving and ever-higher demands on vehicle-grade positioning accuracy, greater requirements are placed on the reliability and safety of GNSS positioning.
However, because GNSS satellite signals are susceptible to environmental and scene interference, GNSS positioning accuracy may suffer large errors in some complex and severe scenes. If the current scene cannot be accurately identified, the personal safety of the user may be endangered.
In view of this, there is a need in the art for a scene recognition method that can improve the reliability and accuracy of scene recognition.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a scene recognition method, a scene recognition apparatus, an electronic device, and a computer-readable medium, so as to improve reliability and accuracy of scene recognition at least to some extent.
According to a first aspect of the present disclosure, there is provided a scene recognition method, including:
obtaining effective observation data of each satellite at the current moment;
extracting the basic observation value of each satellite from the effective observation data, and obtaining a satellite signal characteristic value at the current moment according to the basic observation value of each satellite;
acquiring a characteristic value threshold corresponding to each scene type, and determining the scene type at the current moment according to the relationship between the satellite signal characteristic value and the characteristic value threshold of each scene type;
and outputting a scene type label corresponding to the scene type at the current moment, and performing satellite positioning deviation early warning according to the scene type at the current moment.
In an exemplary embodiment of the disclosure, the acquiring valid observation data of each satellite at the current time includes:
acquiring original message data obtained at the current moment according to received satellite signals, and decoding the original message data to obtain original observation data of each satellite at the current moment;
and performing data validity verification on the original observation data, and if the verification is passed, taking the original observation data of each satellite at the current moment as valid observation data.
In an exemplary embodiment of the present disclosure, the obtaining the satellite signal feature value at the current time according to the basic observation value of each satellite includes:
acquiring an effective signal-to-noise ratio threshold value and an effective altitude angle threshold value, and determining satellites of which the signal-to-noise ratios are greater than the effective signal-to-noise ratio threshold value and the altitude angles are greater than the effective altitude angle threshold value as effective observation satellites;
and acquiring the number of the effective observation satellites, and acquiring the total number of the satellites at the current moment according to the number of the effective observation satellites.
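As a minimal sketch of the screening step above, the following Python fragment counts effective observation satellites. The threshold values (30 dB-Hz SNR, 15° elevation) and the observation-record layout are illustrative assumptions, not values taken from the disclosure:

```python
# Illustrative sketch: a satellite is "effective" when both its SNR and its
# elevation (altitude) angle exceed the configured thresholds. Threshold
# values and dict keys are assumptions for demonstration only.
def count_effective_satellites(observations, snr_threshold=30.0, elev_threshold=15.0):
    """Count satellites whose SNR and elevation both exceed the thresholds."""
    effective = [o for o in observations
                 if o["snr"] > snr_threshold and o["elevation"] > elev_threshold]
    return len(effective)

epoch = [
    {"prn": "G01", "snr": 45.0, "elevation": 62.0},
    {"prn": "G07", "snr": 27.5, "elevation": 41.0},   # SNR below threshold
    {"prn": "C12", "snr": 41.0, "elevation": 9.0},    # elevation below threshold
    {"prn": "E19", "snr": 38.0, "elevation": 24.0},
]
print(count_effective_satellites(epoch))  # 2
```

The resulting count serves as the "total number of satellites at the current moment" used by the subsequent feature calculations.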
In an exemplary embodiment of the present disclosure, the obtaining the satellite signal characteristic value at the current time according to the basic observation value of each satellite includes:
acquiring the altitude angles of all the effective observation satellites at the current moment, and obtaining the sum of the altitude angles of the satellites according to the altitude angles of all the effective observation satellites;
and obtaining the average satellite altitude at the current moment according to the ratio of the satellite altitude sum to the total number of the satellites.
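The average-elevation feature described above is the elevation sum divided by the total satellite count; a hedged one-function sketch (input layout assumed for illustration):

```python
# Illustrative sketch: average satellite elevation at the current epoch,
# i.e. the sum of effective-satellite elevations over the satellite count.
def average_elevation(elevations):
    """elevations: elevation angles (degrees) of the effective satellites."""
    return sum(elevations) / len(elevations)

print(average_elevation([62.0, 24.0, 40.0]))  # 42.0
```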
In an exemplary embodiment of the present disclosure, the obtaining the satellite signal characteristic value at the current time according to the basic observation value of each satellite includes:
determining a plurality of frequency points corresponding to each effective observation satellite, and respectively acquiring the signal-to-noise ratio of each effective observation satellite at different frequency points at the current moment;
obtaining the sum of the signal-to-noise ratios of the effective observation satellites at different frequency points according to the signal-to-noise ratios of the effective observation satellites at the different frequency points;
and obtaining the average signal-to-noise ratio of the effective observation satellite of each frequency point at the current moment according to the ratio of the sum of the signal-to-noise ratios to the total number of the satellites.
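The per-frequency average-SNR feature above can be sketched as follows; the nested-dict layout `{sat_id: {freq: snr}}` and the frequency labels are assumptions made for illustration:

```python
from collections import defaultdict

# Illustrative sketch: sum the SNRs of all effective satellites per frequency
# point, then divide each sum by the total satellite count.
def average_snr_per_frequency(snr_by_sat, total_sats):
    sums = defaultdict(float)
    for per_freq in snr_by_sat.values():
        for freq, snr in per_freq.items():
            sums[freq] += snr
    return {freq: s / total_sats for freq, s in sums.items()}

snr = {
    "G01": {"L1": 45.0, "L2": 40.0},
    "E19": {"L1": 39.0, "L2": 36.0},
}
print(average_snr_per_frequency(snr, total_sats=2))  # {'L1': 42.0, 'L2': 38.0}
```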
In an exemplary embodiment of the present disclosure, the obtaining the satellite signal feature value at the current time according to the basic observation value of each satellite includes:
performing cycle slip detection and identification on the effective observation satellites according to carrier observation values of the effective observation satellites at different frequency points at the current moment so as to judge whether the effective observation satellites generate cycle slips or not;
and marking the effective observation satellites with cycle slip as carrier signal interruption satellites, and obtaining the cycle slip ratio of the satellite number of each frequency point at the current moment according to the ratio of the number of the carrier signal interruption satellites of each frequency point to the total number of the satellites.
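The cycle-slip-ratio feature reduces to a per-frequency ratio once the carrier-interrupted satellites are flagged. A hedged sketch (the flag layout `{freq: set_of_slipped_sat_ids}` is an assumption; the cycle-slip detector itself is not reproduced here):

```python
# Illustrative sketch: for each frequency point, the ratio of satellites
# marked as carrier-signal-interrupted (cycle slip detected) to the total
# number of satellites at the current epoch.
def cycle_slip_ratio(slipped_by_freq, total_sats):
    return {freq: len(sats) / total_sats for freq, sats in slipped_by_freq.items()}

slipped = {"L1": {"G07"}, "L2": {"G07", "C12"}}
print(cycle_slip_ratio(slipped, total_sats=4))  # {'L1': 0.25, 'L2': 0.5}
```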
In an exemplary embodiment of the present disclosure, the obtaining of the satellite signal feature value at the current time according to the basic observation value of each satellite includes:
determining real-time pseudo-range multipath values of the effective observation satellites at different frequency points according to carrier observation values and pseudo-range observation values of the effective observation satellites at different frequency points at the current moment;
obtaining the multipath sum of all effective observation satellites of each frequency point according to the real-time pseudo-range multipath values of each effective observation satellite at different frequency points;
and obtaining an average pseudo-range multipath value of each frequency point at the current moment according to the ratio of the multipath sum to the total number of the satellites.
In an exemplary embodiment of the present disclosure, the determining, according to a carrier observation value and a pseudo-range observation value of each effective observation satellite at different frequency points at a current time, a real-time pseudo-range multipath value of each effective observation satellite at different frequency points includes:
acquiring the frequency of each frequency point and a frequency point error value, and acquiring the size of a pseudo-range noise sliding window;
obtaining pseudo-range noise values of the effective observation satellites at different frequency points according to carrier observation values and pseudo-range observation values of the effective observation satellites at different frequency points at the current moment, and frequencies of the frequency points and the frequency point error values;
performing cycle slip detection and identification on the effective observation satellites according to carrier observation values of the effective observation satellites at different frequency points at the current moment;
if the effective observation satellite does not generate cycle slip at the frequency point, obtaining a real-time pseudo-range multipath value of the effective observation satellite at the frequency point according to the pseudo-range noise value of the effective observation satellite at the frequency point and the average pseudo-range noise value in the pseudo-range noise sliding window;
and if the cycle slip of the effective observation satellite occurs at the frequency point, clearing the pseudo-range noise value in the pseudo-range noise sliding window, and calculating the real-time pseudo-range multipath value of the next effective observation satellite again.
In an exemplary embodiment of the disclosure, the obtaining a size of a pseudorange noise sliding window includes:
and acquiring the sampling rate of the basic observation value of the effective observation satellite, and determining the size of a pseudo-range noise sliding window according to the sampling rate of the basic observation value.
In an exemplary embodiment of the present disclosure, the obtaining a real-time pseudorange multipath value of the effective observed satellite at the frequency point according to the pseudorange noise value of the effective observed satellite at the frequency point and the average pseudorange noise value within the pseudorange noise sliding window includes:
obtaining pseudo range noise values of the effective observation satellite at multiple moments in the pseudo range noise sliding window according to the size of the pseudo range noise sliding window, and obtaining an average pseudo range noise value in the pseudo range noise sliding window according to the pseudo range noise values at the multiple moments;
and obtaining a real-time pseudo-range multipath value of the effective observation satellite at the frequency point according to the absolute value of the difference value between the pseudo-range noise value of the effective observation satellite at the frequency point at the current moment and the average pseudo-range noise value in the pseudo-range noise sliding window.
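The sliding-window logic of the two steps above can be sketched as follows. How the per-epoch pseudorange noise value is formed from the carrier and pseudorange observations is not reproduced here; the noise inputs, window size, and class shape are assumptions for illustration. Note the cycle-slip branch, which clears the window as described earlier:

```python
from collections import deque

# Illustrative sketch: the real-time pseudorange multipath value is the
# absolute difference between the current pseudorange noise value and the
# mean noise value inside the sliding window. On a cycle slip the noise
# history is no longer continuous, so the window is cleared and no value
# is produced for that epoch.
class MultipathWindow:
    def __init__(self, size):
        # In the disclosure the window size is derived from the observation
        # sampling rate; here it is simply passed in.
        self.window = deque(maxlen=size)

    def update(self, noise, cycle_slip=False):
        if cycle_slip:
            self.window.clear()
            return None
        self.window.append(noise)
        mean = sum(self.window) / len(self.window)
        return abs(noise - mean)

w = MultipathWindow(size=4)
for n in (0.5, 0.7, 0.6):
    w.update(n)
print(round(w.update(1.0), 3))  # |1.0 - mean(0.5, 0.7, 0.6, 1.0)| = 0.3
```

Whether the current epoch's noise value is included in the window mean is an implementation choice; the sketch includes it.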
In an exemplary embodiment of the present disclosure, the determining the scene type at the current time according to the relationship between the satellite signal feature value and the feature value threshold of each scene type includes:
comparing the satellite signal characteristic value with a characteristic value threshold value of each evaluable scene type respectively;
if the satellite signal characteristic value meets a characteristic value threshold value of one evaluable scene type, determining the scene type at the current moment as a corresponding evaluable scene type;
and if the satellite signal characteristic value does not meet the characteristic value threshold of any evaluable scene type, determining the scene type at the current moment as other scene types.
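The threshold-comparison logic above amounts to checking each evaluable scene type in turn and falling back to "Other". A hedged sketch; the rule predicates and threshold numbers below are invented for illustration and are not the disclosure's actual thresholds:

```python
# Illustrative sketch: the first evaluable scene type whose feature-value
# thresholds are satisfied is returned; otherwise the epoch is classified
# as "Other". Rules and threshold values are assumptions for demonstration.
def classify_scene(features, scene_rules):
    for scene, satisfies in scene_rules:
        if satisfies(features):
            return scene
    return "Other"

rules = [
    ("OS",     lambda f: f["n_sats"] >= 12 and f["avg_snr"] >= 40.0),
    ("NOS1",   lambda f: f["n_sats"] >= 8),
    ("NOS2",   lambda f: f["n_sats"] >= 4),
    ("Tunnel", lambda f: f["n_sats"] < 4),
]
print(classify_scene({"n_sats": 9, "avg_snr": 35.0}, rules))  # NOS1
```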
In an exemplary embodiment of the present disclosure, the performing satellite positioning deviation early warning according to the scene type at the current time includes:
acquiring a maximum deviation threshold corresponding to the scene type at the current moment and actual measurement data at the current moment, and obtaining a positioning accuracy value under the scene type at the current moment according to the actual measurement data;
if the positioning accuracy value is smaller than or equal to the maximum deviation threshold value, satellite positioning deviation early warning is not carried out;
and if the positioning accuracy value is greater than the maximum deviation threshold value, performing satellite positioning deviation early warning.
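The warning decision above is a single comparison against the scene-specific maximum deviation threshold; a minimal sketch with illustrative values:

```python
# Illustrative sketch: a satellite positioning deviation warning is issued
# only when the measured positioning accuracy exceeds the maximum deviation
# threshold configured for the recognized scene type. Values are examples.
def needs_deviation_warning(accuracy_m, max_deviation_m):
    return accuracy_m > max_deviation_m

print(needs_deviation_warning(1.8, max_deviation_m=2.0))  # False: within bounds
print(needs_deviation_warning(3.5, max_deviation_m=2.0))  # True: warn
```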
According to a second aspect of the present disclosure, there is provided a scene recognition apparatus including:
the observation data acquisition module is used for acquiring effective observation data of each satellite at the current moment;
the satellite characteristic determination module is used for extracting the basic observation value of each satellite from the effective observation data and obtaining a satellite signal characteristic value at the current moment according to the basic observation value of each satellite;
the scene type determining module is used for acquiring a characteristic value threshold corresponding to each scene type and determining the scene type at the current moment according to the relationship between the satellite signal characteristic value and the characteristic value threshold of each scene type;
and the positioning deviation early warning module is used for outputting a scene type label corresponding to the scene type at the current moment and carrying out satellite positioning deviation early warning according to the scene type at the current moment.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the scene recognition method of any one of the above via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the scene recognition method of any one of the above.
The exemplary embodiments of the present disclosure may have the following advantageous effects:
in the scene recognition method of this example embodiment of the disclosure, raw satellite observation data are collected, satellite signal feature values are extracted and statistically analyzed from those data, and the feature values at the current time are compared with the feature value thresholds of the various scene types, so that the scene type at the current time is identified and a positioning deviation warning for the corresponding scene is issued. On the one hand, by acquiring observation data at the current time and extracting feature values, scene recognition can be completed in real time in dynamic scenes, improving the reliability and accuracy of scene recognition; on the other hand, real-time scene detection and recognition, together with a positioning deviation warning for the recognized scene, can be achieved without adding extra hardware, which lowers the cost of scene recognition and makes the method inexpensive and easy to implement.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 shows a flow diagram of a scene recognition method of an example embodiment of the present disclosure;
FIG. 2 illustrates a flowchart of an exemplary embodiment of the present disclosure for obtaining valid observation data for each satellite at a current time;
fig. 3 shows a flowchart of obtaining an average signal-to-noise ratio at a current time according to an example embodiment of the present disclosure;
FIG. 4 shows a flowchart for obtaining a number of satellites over a cycle slip ratio at a current time according to an example embodiment of the disclosure;
fig. 5 illustrates a flowchart for obtaining an average pseudorange multipath value for a current time in an example embodiment of the disclosure;
FIG. 6 is a flowchart illustrating the determination of real-time pseudorange multipath values for various valid observed satellites at different frequency points in accordance with an exemplary embodiment of the present disclosure;
FIG. 7 illustrates a flowchart for obtaining real-time pseudorange multipath values for valid observed satellites according to an example embodiment of the present disclosure;
FIG. 8 illustrates a flowchart of determining a scene type at a current time in an example embodiment of the present disclosure;
fig. 9 is a schematic diagram illustrating a flow of performing satellite positioning deviation warning according to a scene type at a current time according to an example embodiment of the present disclosure;
FIG. 10 illustrates a schematic diagram of a GNSS system in accordance with an embodiment of the present disclosure;
FIG. 11 illustrates a flow diagram of a scene recognition method in accordance with an embodiment of the present disclosure;
FIG. 12 illustrates a flow diagram for computing real-time pseudorange multipath values for active satellites in accordance with one embodiment of the present disclosure;
fig. 13 shows a block diagram of a scene recognition apparatus of an example embodiment of the present disclosure;
FIG. 14 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
A Global Navigation Satellite System (GNSS) offers global coverage, all-weather operation, and high-precision positioning, and plays an important role in vehicle navigation and positioning applications. With the rise of autonomous driving and ever-higher demands on vehicle-grade positioning accuracy, greater requirements are placed on the reliability and safety of GNSS positioning. Because GNSS satellite signals are easily disturbed by the environment and the scene, GNSS positioning accuracy can degrade sharply in severe scenes, producing positioning errors of more than 10 m, far beyond the accuracy range acceptable for autonomous driving.
In vehicle navigation and positioning practice, scenes are conventionally divided into open-sky (OS) scenes, semi-open (NOS1, non-open-sky 1) scenes, severely occluded (NOS2, non-open-sky 2) scenes, Tunnel scenes, and Other scenes in order to evaluate positioning performance and service availability. GNSS performs well in OS scenes and can fully reach the expected positioning performance; in NOS1 scenes, the positioning performance suffers some interference; in NOS2 scenes, the positioning result is worse; in Tunnel scenes, positioning is unavailable; and in Other scenes, positioning is highly unreliable and cannot be incorporated into a reliable positioning service. Consequently, for autonomous driving navigation that relies on GNSS as its key positioning technology, failing to identify the scene accurately and to issue the corresponding warning can endanger the personal safety of the user.
In some related approaches, a multi-dimensional driving scene is decomposed step by step from main scenes into sub-scenes, the sub-scenes are identified in parallel by dedicated modules for each scene type, and the initially identified sub-scenes are fused and adjudicated using a multi-scene fusion method, finally yielding a multi-dimensional scene directly related to the collected key scene information; traffic scenes encountered while the vehicle is driving can thus be evaluated, classified, and identified along multiple dimensions. However, the scene recognition granularity of this approach is fine: dynamic main scenes, natural-environment main scenes, road main scenes, and so on are distinguished, and each main scene is further divided into several specific scenes, so the method is complex to implement, requires an additional scene recognition sensor, and increases the cost of scene recognition. Moreover, this scene recognition technique cannot provide a positioning deviation value or a deviation warning for the recognized scene.
In other related approaches, scene recognition is implemented by a system comprising a data extraction and calculation module and a typical-scene-type recognition module: the data extraction and calculation module obtains key driving-scene parameters from a vehicle driving database through screening, matching, and calculation, and the typical-scene-type recognition module then recognizes typical scenes from these parameters together with the driving characteristics of the host vehicle and the relative state between the host vehicle and the target object. However, because this method depends on the chosen typical-scene characteristics, it cannot fully cover real scenes, and the probability of missed or erroneous judgments is high. Again, this scene recognition technique cannot provide positioning deviation values or deviation warnings for different scenes.
In still other related approaches, driving data of an autonomous vehicle in an autonomous-driving simulation scene and environmental data around the vehicle are taken as input, and the simulation scenes are classified with a deep-learning model comprising an encoding layer and a clustering layer: the encoding layer maps scene features into vectors of uniform dimension, avoiding inaccurate classification caused by inconsistent feature dimensions, and the clustering layer classifies the scene features without supervision into reasonable or abnormal scenes, achieving high efficiency and accuracy. However, the accuracy of this method depends on a large number of samples and on how well those samples are cleaned, and it is easily disturbed by abnormal samples and features, which reduces the reliability of scene recognition. Once more, this scene recognition technique cannot provide positioning deviation values or deviation warnings for different scenes.
In view of the above problem, the present exemplary embodiment first provides a scene recognition method. Referring to fig. 1, the scene recognition method may include the following steps:
and S110, obtaining effective observation data of each satellite at the current moment.
And S120, extracting basic observation values of all satellites from the effective observation data, and obtaining satellite signal characteristic values at the current moment according to the basic observation values of all the satellites.
S130, obtaining a characteristic value threshold corresponding to each scene type, and determining the scene type at the current moment according to the relation between the satellite signal characteristic value and the characteristic value threshold of each scene type.
And S140, outputting a scene type label corresponding to the scene type at the current moment, and performing satellite positioning deviation early warning according to the scene type at the current moment.
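Steps S110 to S140 can be wired together as a single pipeline. The skeleton below is a hedged sketch: all helper names, signatures, and return shapes are assumptions made for illustration, with the real decoding, feature extraction, classification, and warning logic injected as callables:

```python
# Illustrative end-to-end sketch of S110-S140. The callables stand in for the
# modules described in the disclosure; their names and signatures are assumed.
def recognize_scene(raw_rtcm, now, decode, validate, extract_features, classify, warn):
    observations = decode(raw_rtcm)            # S110: decode raw message data
    if not validate(observations, now):        # S110: validity check; exit if invalid
        return None
    features = extract_features(observations)  # S120: satellite signal feature values
    scene = classify(features)                 # S130: compare against thresholds
    warn(scene)                                # S140: output label + deviation warning
    return scene

# Stub wiring for demonstration only.
scene = recognize_scene(
    "raw-bytes", now=0,
    decode=lambda msg: [{"snr": 40.0}],
    validate=lambda obs, t: bool(obs),
    extract_features=lambda obs: {"n_sats": len(obs)},
    classify=lambda feats: "OS" if feats["n_sats"] >= 1 else "Other",
    warn=lambda s: None,
)
print(scene)  # OS
```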
In the scene recognition method of this example embodiment of the disclosure, raw satellite observation data are collected, satellite signal feature values are extracted and statistically analyzed from those data, and the feature values at the current time are compared with the feature value thresholds of the various scene types, so that the scene type at the current time is identified and a positioning deviation warning for the corresponding scene is issued. On the one hand, by acquiring observation data at the current time and extracting feature values, scene recognition can be completed in real time in dynamic scenes, improving the reliability and accuracy of scene recognition; on the other hand, real-time scene detection and recognition, together with a positioning deviation warning for the recognized scene, can be achieved without adding extra hardware, which lowers the cost of scene recognition and makes the method inexpensive and easy to implement.
Next, the above steps of the present exemplary embodiment will be described in more detail with reference to fig. 2 to 9.
In step S110, effective observation data of each satellite at the current time is acquired.
In this exemplary embodiment, first, effective observation data of each satellite at the current time is obtained through the positioning system, where the effective observation data of the satellite refers to observation data whose time tag is consistent with the current time and whose data type is complete.
Taking vehicle navigation as an example, the most common configuration is a combined GNSS + IMU (Inertial Measurement Unit) positioning system. Before entering the scene recognition system, it must be ensured that the GNSS positioning system itself works normally, i.e., that it stably receives satellite signals in real time and normally outputs RTCM (Radio Technical Commission for Maritime Services) data in the standard format.
In the present exemplary embodiment, a real-time scene recognition system may be extended based on the original architecture of the GNSS/IMU combined positioning system, and scene recognition of data processing may be performed by the real-time scene recognition system.
In this exemplary embodiment, as shown in fig. 2, the obtaining of the effective observation data of each satellite at the current time may specifically include the following steps:
and S210, acquiring original message data obtained at the current moment according to the received satellite signals, and decoding the original message data to obtain original observation data of each satellite at the current moment.
When the scene recognition system is started, the positioning system firstly outputs original message data at the current moment, such as original RTCM format data, according to the received satellite signals, and then sends the original message data to the scene recognition system to decode the RTCM data. The decoded data contains the original observation data of each satellite at the current moment, including the basic observations such as pseudo range, carrier wave, Doppler, signal-to-noise ratio and the like. The scene recognition system will store the raw observation data at each time. The scene recognition system may internally retain the raw observation data for a period of time, such as retaining all raw observation data for 2 minutes, according to the set parameters.
And S220, performing data validity verification on the original observation data, and if the verification is passed, taking the original observation data of each satellite at the current moment as valid observation data.
After the original observation data at the latest time is stored, it is also necessary to check whether the data at the latest time is valid, for example, whether the time tag of the original observation data is consistent with the current time and whether the data types of the original observation data are complete. If the data is invalid, the scene recognition operation at the current moment is exited directly. If the data is valid, the subsequent steps are continued.
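The validity check described above can be sketched as follows. This is a minimal illustration, assuming a hypothetical epoch structure (the field names `timeTag`, `pseudorange`, `carrier`, `snr` are not from the patent), where an epoch is rejected when its time tag is stale or any satellite record is incomplete:

```cpp
#include <cassert>
#include <cmath>
#include <map>

// One satellite's raw observations at an epoch (illustrative structure).
struct SatObs {
    double pseudorange;  // meters
    double carrier;      // carrier phase, meters
    double doppler;      // Hz
    double snr;          // dB
};

struct Epoch {
    double timeTag;                    // e.g. GPS seconds of week
    std::map<int, SatObs> satellites;  // keyed by satellite number
};

// Validity check of step S220: the epoch is usable only if its time tag
// matches the current time (within a tolerance) and every satellite record
// carries a complete set of observation types.
bool isEpochValid(const Epoch& e, double currentTime, double tol = 0.5) {
    if (std::fabs(e.timeTag - currentTime) > tol) return false;  // stale epoch
    if (e.satellites.empty()) return false;
    for (const auto& kv : e.satellites) {
        const SatObs& obs = kv.second;
        if (obs.pseudorange <= 0.0 || obs.carrier == 0.0 || obs.snr <= 0.0)
            return false;  // incomplete record -> reject the whole epoch
    }
    return true;
}
```

If the check fails, the recognition run for this epoch is skipped, matching the early-exit behavior described above.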
In step S120, a basic observation value of each satellite is extracted from the effective observation data, and a satellite signal feature value at the current time is obtained according to the basic observation value of each satellite.
After the effective observation data of each satellite is obtained, the basic observation value of each satellite is extracted from the effective observation data, where the basic observation values include basic observed quantities such as the signal-to-noise ratio, altitude angle, carrier observation value and pseudo-range observation value. The acquired basic observation values of all the satellites are then traversed, and the satellite signal characteristic values at the current moment are extracted and counted. The satellite signal characteristic values include the total number of satellites, the Position Dilution of Precision (PDOP) value, the average satellite altitude angle, the average signal-to-noise ratio, the maximum signal-to-noise ratio, the satellite number cycle slip ratio, the average pseudo-range multipath value, and the like. Most of these characteristic values can be obtained by direct statistics over the original observation data.
In order to realize real-time scene recognition, the satellite signal characteristic value needs to be updated and calculated in real time. Because the information in the satellite signal characteristic value can represent the observation quality of the satellite signal in the current state, and the observation quality of the satellite is mainly interfered by the environment and the scene, the scene of the current time can be identified based on the satellite signal characteristic value extracted at the current time. The calculation methods of the satellite signal characteristic values are as follows:
in this exemplary embodiment, a specific method for determining the total number of satellites at the current time may be: acquiring an effective signal-to-noise ratio threshold value and an effective altitude angle threshold value, and determining satellites of which the signal-to-noise ratios are greater than the effective signal-to-noise ratio threshold value and the altitude angles are greater than the effective altitude angle threshold value as effective observation satellites; and acquiring the number of the effective observation satellites, and acquiring the total number of the satellites at the current moment according to the number of the effective observation satellites.
The total number of satellites can be obtained by counting the number of effective observation satellites, where an effective observation satellite refers to a satellite whose signal-to-noise ratio is greater than the effective signal-to-noise ratio threshold and whose altitude angle is greater than the effective altitude angle threshold; for example, a satellite with a signal-to-noise ratio > 15 dB and an altitude angle > 10° can be determined to be an effective observation satellite.
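The filtering rule above can be sketched as a small function. The default thresholds are the example values quoted in the text (15 dB and 10°); the `SatFeature` struct is illustrative:

```cpp
#include <cassert>
#include <vector>

struct SatFeature {
    double snr;        // signal-to-noise ratio, dB
    double elevation;  // altitude (elevation) angle, degrees
};

// A satellite counts as an effective observation satellite when both its
// SNR and its altitude angle clear the configured thresholds; the total
// number of satellites is the count of such satellites at this epoch.
int countEffectiveSatellites(const std::vector<SatFeature>& sats,
                             double snrThreshold = 15.0,
                             double elevThreshold = 10.0) {
    int total = 0;
    for (const auto& s : sats)
        if (s.snr > snrThreshold && s.elevation > elevThreshold) ++total;
    return total;
}
```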
In this exemplary embodiment, the satellite PDOP value may be directly obtained from a GNSS chip of the positioning system.
In this exemplary embodiment, a specific method for determining the average satellite altitude at the current time may be: acquiring the altitude angles of all effective observation satellites at the current moment, and obtaining the sum of the altitude angles of the satellites according to the altitude angles of all the effective observation satellites; and obtaining the average satellite altitude at the current moment according to the ratio of the sum of the satellite altitude and the total number of the satellites.
The statistical method of the average satellite altitude angle can calculate the sum of the altitude angles of all effective observation satellites, and then divide the sum by the total number of the satellites to obtain the average satellite altitude angle at the current moment.
In this exemplary embodiment, as shown in fig. 3, a specific method for determining the average snr at the current time may include the following steps:
s310, determining a plurality of frequency points corresponding to each effective observation satellite, and respectively obtaining the signal-to-noise ratio of each effective observation satellite at different frequency points at the current moment.
And S320, obtaining the sum of the signal-to-noise ratios of the effective observation satellites at different frequency points according to the signal-to-noise ratios of the effective observation satellites at the different frequency points.
And S330, obtaining the average signal-to-noise ratio of the effective observation satellite of each frequency point at the current moment according to the ratio of the sum of the signal-to-noise ratios to the total number of the satellites.
The frequency points refer to numbers of fixed frequencies corresponding to satellite signals with different frequencies, and in the present exemplary embodiment, two frequency points corresponding to each satellite are taken as an example for detailed description, and the signal-to-noise ratio of an effective observation satellite can be divided into two frequency points L1 and L2 for statistics.
For two frequency points of L1 and L2, the sum of the signal-to-noise ratios of all effective observation satellites of the two frequency points is calculated respectively, and then the sum is divided by the total number of the satellites, so that the average signal-to-noise ratio of the effective observation satellites of the two frequency points of L1 and L2 at the current moment can be obtained.
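The averaging described for the altitude angle and for the per-frequency signal-to-noise ratio is the same sum-then-divide statistic. A minimal sketch, assuming an illustrative per-satellite record with SNR on the L1 and L2 frequency points plus the altitude angle:

```cpp
#include <cassert>
#include <vector>

// Illustrative record for one effective observation satellite.
struct DualFreqSat {
    double snrL1;      // SNR on frequency point L1, dB
    double snrL2;      // SNR on frequency point L2, dB
    double elevation;  // altitude angle, degrees
};

struct AvgFeatures { double avgSnrL1, avgSnrL2, avgElevation; };

// Steps S310-S330 (and the altitude-angle statistic): sum each quantity
// over all effective observation satellites, then divide by the total
// number of satellites.
AvgFeatures averageFeatures(const std::vector<DualFreqSat>& sats) {
    AvgFeatures f{0.0, 0.0, 0.0};
    if (sats.empty()) return f;
    for (const auto& s : sats) {
        f.avgSnrL1 += s.snrL1;
        f.avgSnrL2 += s.snrL2;
        f.avgElevation += s.elevation;
    }
    const double n = static_cast<double>(sats.size());
    f.avgSnrL1 /= n;
    f.avgSnrL2 /= n;
    f.avgElevation /= n;
    return f;
}
```

The maximum signal-to-noise ratio mentioned next is obtained from the same traversal by tracking the largest per-satellite SNR instead of the sum.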
In this example embodiment, the maximum snr may be obtained by traversing all available observation satellites and extracting the maximum snr from the snrs of all available observation satellites.
In this exemplary embodiment, as shown in fig. 4, a specific method for determining the cycle slip ratio of the number of satellites at the current time may include the following steps:
and S410, performing cycle slip detection and identification on the effective observation satellites according to the carrier observation values of the effective observation satellites at different frequency points at the current moment so as to judge whether the effective observation satellites generate cycle slips or not.
And S420, marking the effective observation satellites with the cycle slips as carrier signal interruption satellites, and obtaining the cycle slip ratio of the number of the satellites at each frequency point at the current moment according to the ratio of the number of the carrier signal interruption satellites at each frequency point to the total number of the satellites.
The satellite number cycle slip ratio refers to the ratio of the number of satellites in which the cycle slip phenomenon occurs to the total number of satellites. Firstly, judging whether the carrier signal interruption occurs in the satellite at two frequency points according to the carrier observed quantities of the two frequency points L1 and L2, and marking the effective observation satellite with the carrier signal interruption as a carrier signal interruption satellite. And then, according to the ratio of the number of the marked carrier signal interruption satellites to the total number of the satellites, obtaining the cycle slip ratio of the number of the satellites at each frequency point at the current moment.
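The ratio statistic above can be sketched as follows. The patent does not publish its cycle-slip detector, so the `carrierInterrupted` flag below is only an illustrative epoch-difference check (carrier change versus a predicted change, e.g. from Doppler); the ratio itself is just flagged satellites over total satellites:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Illustrative carrier-interruption flag for one satellite on one frequency
// point: a large residual between the observed carrier change and the
// predicted change is treated as a cycle slip. NOT the patent's (unspecified)
// detection algorithm.
bool carrierInterrupted(double carrierPrev, double carrierNow,
                        double predictedDelta, double tolMeters = 1.0) {
    return std::fabs((carrierNow - carrierPrev) - predictedDelta) > tolMeters;
}

// Satellite number cycle slip ratio for one frequency point: number of
// satellites flagged as carrier-signal interrupted divided by the total
// number of satellites.
double cycleSlipRatio(const std::vector<bool>& slipFlags) {
    if (slipFlags.empty()) return 0.0;
    int slipped = 0;
    for (bool s : slipFlags)
        if (s) ++slipped;
    return static_cast<double>(slipped) / slipFlags.size();
}
```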
The above satellite signal characteristic values can be obtained by simple statistics and calculation, and the calculation of the average pseudorange multipath value is more complicated and critical.
In this exemplary embodiment, as shown in fig. 5, a specific method for determining an average pseudorange multipath value at a current time may include the following steps:
and S510, determining real-time pseudo-range multipath values of each effective observation satellite at different frequency points according to the carrier observation value and the pseudo-range observation value of each effective observation satellite at different frequency points at the current moment.
In this exemplary embodiment, as shown in fig. 6, determining real-time pseudorange multipath values of each effective observation satellite at different frequency points according to a carrier observed value and a pseudorange observed value of each effective observation satellite at different frequency points at the current time may specifically include the following steps:
and S610, acquiring the frequency of each frequency point and the frequency point error value, and acquiring the size of a pseudo-range noise sliding window.
Taking two frequency points of L1 and L2 as examples, the frequencies of L1 and L2 are f1 and f2, and the frequency point error values are Bp1 and Bp2, respectively, where the frequency point error values include phase ambiguity and frequency deviation, and Bp1 and Bp2 at different times do not change under the condition of no cycle slip.
In this exemplary embodiment, the size of the pseudo-range noise sliding window may be determined by obtaining a sampling rate of a basic observation value of an effective observation satellite and determining the size of the pseudo-range noise sliding window according to the sampling rate of the basic observation value.
In general, the sliding window may be determined according to the GNSS sampling rate size of each positioning system, for example, the sampling rate size is 1Hz, and the size of the sliding window may be set to 5 minutes.
And S620, obtaining pseudo-range noise values of the effective observation satellites at different frequency points according to the carrier observed values and the pseudo-range observed values of the effective observation satellites at different frequency points at the current moment, and the frequency and frequency point error values of the frequency points.
And selecting data at the latest moment from all the original observation data stored in real time, traversing each effective observation satellite, and independently calculating the pseudo-range noise value of each effective observation satellite.
Taking the two frequency points L1 and L2 as an example, the pseudo-range noise values MP1 and MP2 of the two frequency points are calculated as follows:

MP1 = P1 − ((α + 1)/(α − 1))·L1 + (2/(α − 1))·L2 − Bp1

MP2 = P2 − (2α/(α − 1))·L1 + ((α + 1)/(α − 1))·L2 − Bp2

Here, P1 and P2 represent the pseudo-range observations at frequency points L1 and L2, respectively, α = f1²/f2² is the squared ratio of the frequencies of frequency points L1 and L2, L1 and L2 denote the carrier observations (expressed in meters), and Bp1 and Bp2 are the frequency point error values.
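These two combinations can be evaluated directly. A sketch under the definitions above (α = f1²/f2², carrier observations already scaled to meters, Bp absorbing the constant ambiguity/bias term that only changes on a cycle slip); for GPS L1/L2 frequencies, α ≈ 1.6469:

```cpp
#include <cassert>

// Pseudo-range noise (multipath combination) on frequency point L1.
// P1: pseudo-range observation (m); l1m, l2m: carrier observations (m);
// alpha = f1^2 / f2^2; bp1: constant frequency point error value.
double mpL1(double P1, double l1m, double l2m, double alpha, double bp1) {
    return P1 - ((alpha + 1.0) / (alpha - 1.0)) * l1m
              + (2.0 / (alpha - 1.0)) * l2m - bp1;
}

// Pseudo-range noise (multipath combination) on frequency point L2.
double mpL2(double P2, double l1m, double l2m, double alpha, double bp2) {
    return P2 - (2.0 * alpha / (alpha - 1.0)) * l1m
              + ((alpha + 1.0) / (alpha - 1.0)) * l2m - bp2;
}
```

Note the geometry, clock and ionospheric terms cancel in these combinations, which is why the result isolates pseudo-range noise and multipath.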
And S630, performing cycle slip detection and identification on the effective observation satellites according to the carrier observation values of the effective observation satellites at different frequency points at the current moment.
And performing cycle slip detection and identification on the effective observation satellite according to the carrier observed quantities of the two frequency points L1 and L2, and judging whether the satellite has carrier signal interruption at the two frequency points.
And step 640, if the cycle slip of the effective observation satellite at the frequency point does not occur, obtaining a real-time pseudo-range multipath value of the effective observation satellite at the frequency point according to the pseudo-range noise value of the effective observation satellite at the frequency point and the average pseudo-range noise value in the pseudo-range noise sliding window.
And if the cycle slip does not occur in the current effective observation satellite, directly calculating the real-time pseudo-range multipath value of the effective observation satellite at each frequency point.
In this exemplary embodiment, as shown in fig. 7, obtaining a real-time pseudorange multipath value of an effective observed satellite at a frequency point according to a pseudorange noise value of the effective observed satellite at the frequency point and an average pseudorange noise value in a pseudorange noise sliding window may specifically include the following steps:
and S710, obtaining pseudo range noise values of the effective observation satellite at multiple moments in the pseudo range noise sliding window according to the size of the pseudo range noise sliding window, and obtaining an average pseudo range noise value in the pseudo range noise sliding window according to the pseudo range noise values at the multiple moments.
And S720, obtaining a real-time pseudo-range multipath value of the effective observation satellite at the frequency point according to the absolute value of the difference value between the pseudo-range noise value of the effective observation satellite at the frequency point at the current moment and the average pseudo-range noise value in the pseudo-range noise sliding window.
The method for calculating the real-time pseudo-range multipath values of the effective observation satellites at each frequency point is to subtract the average pseudo-range noise value in a sliding window from the pseudo-range noise value of the satellite at the current moment and then take the absolute value of the obtained result. The average pseudo-range noise value in the sliding window is an average value of pseudo-range noise values at a plurality of corresponding times in the sliding window.
And after the real-time pseudo-range multipath values of the current effective observation satellite are obtained, updating the sliding window, then completing traversal of all effective observation satellites, and acquiring and storing the multipath values of all effective observation satellites.
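The sliding-window computation of steps S710–S720 (and the cycle-slip reset of step S650) can be sketched as a small class. This assumes one window per satellite per frequency point and a window length in epochs, e.g. 300 epochs for a 1 Hz stream and a 5-minute window; whether the current value is included in the window mean is not fully specified in the text, so here the mean is taken over the previously stored values:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <deque>
#include <numeric>

// Sliding window of pseudo-range noise values for one satellite/frequency.
// The real-time multipath value is |current noise - window mean|; on a
// detected cycle slip the window is cleared and re-initialized.
class MultipathWindow {
public:
    explicit MultipathWindow(std::size_t maxEpochs) : maxEpochs_(maxEpochs) {}

    // Feed this epoch's pseudo-range noise value; returns the real-time
    // pseudo-range multipath value, then updates the window.
    double update(double noise) {
        double mean = noise;  // empty window: multipath value is 0
        if (!window_.empty())
            mean = std::accumulate(window_.begin(), window_.end(), 0.0) /
                   static_cast<double>(window_.size());
        window_.push_back(noise);
        if (window_.size() > maxEpochs_) window_.pop_front();
        return std::fabs(noise - mean);
    }

    // Step S650: clear the window when a cycle slip is detected.
    void clearOnCycleSlip() { window_.clear(); }

private:
    std::size_t maxEpochs_;
    std::deque<double> window_;
};
```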
And S650, if the cycle slip of the effective observation satellite occurs at the frequency point, clearing the pseudo-range noise value in the pseudo-range noise sliding window, and calculating the real-time pseudo-range multipath value of the next effective observation satellite again.
And if the cycle slip phenomenon occurs to the current effective observation satellite, clearing the pseudo-range noise sliding window of the current effective observation satellite, then re-initializing and updating the sliding window.
And S520, obtaining the multipath sum of all effective observation satellites of each frequency point according to the real-time pseudo-range multipath values of each effective observation satellite at different frequency points.
And after the real-time pseudo-range multipath values of all effective observation satellites at different frequency points are obtained, adding the real-time pseudo-range multipath values of the effective observation satellites at the same frequency point to obtain the multipath sum of all the effective observation satellites at the frequency point.
And S530, obtaining an average pseudo-range multipath value of each frequency point at the current moment according to the ratio of the multipath sum to the total number of the satellites.
And dividing the multipath sum of each frequency point by the total number of the satellites respectively to obtain an average pseudo range multipath value of each frequency point at the current moment.
In step S130, a feature value threshold corresponding to each scene type is obtained, and the scene type at the current time is determined according to the relationship between the satellite signal feature value and the feature value threshold of each scene type.
In this example embodiment, the scene types may include an open scene type, a semi-open scene type, a severe occlusion scene type, a tunnel scene type, and other scene types. The open scene type, semi-open scene type, severe occlusion scene type and tunnel scene type belong to the evaluable scene types, while the other scene types are non-evaluable scene types.
In this exemplary embodiment, as shown in fig. 8, the obtaining of the scene type determined at the current time according to the relationship between the satellite signal feature value and the feature value threshold of each scene type may specifically include the following steps:
and step S810, comparing the satellite signal characteristic value with the characteristic value threshold of each evaluable scene type respectively.
Step S820, if the satellite signal characteristic value meets the characteristic value threshold of a certain evaluable scene type, determining the scene type at the current moment as the corresponding evaluable scene type.
And S830, if the satellite signal characteristic value does not meet the characteristic value threshold of any evaluable scene type, determining the scene type at the current moment as other scene types.
Through testing a large amount of measured data, a set of specific GNSS empirical characteristic value threshold ranges corresponding to each of the OS, NOS1, NOS2, Tunnel and Other scenes can be obtained. Taking the OS scene as an example, the characteristic value threshold range may include empirical thresholds such as: PDOP value < 2.0, signal-to-noise ratio > 33 dB, average pseudo-range noise < 1.1 m, average altitude angle > 30°, and cycle slip ratio < 0.2. By comparing the satellite signal characteristic values at the current moment with these empirical characteristic value thresholds, the scene to which the current moment belongs can be judged, thereby achieving the purpose of scene recognition.
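The threshold cascade of steps S810–S830 can be sketched as below. Only the OS thresholds are quoted in the text; the NOS1/NOS2/Tunnel branches here use invented placeholder values purely to illustrate the strictest-to-loosest fall-through, and are not the patent's empirical thresholds:

```cpp
#include <cassert>

struct GnssFeatures {
    double pdop;
    double avgSnr;                 // dB
    double avgPseudorangeNoise;    // m
    double avgElevation;           // degrees
    double cycleSlipRatio;
};

// OS thresholds quoted in the text: PDOP < 2.0, SNR > 33 dB, average
// pseudo-range noise < 1.1 m, average altitude angle > 30 deg,
// cycle slip ratio < 0.2.
bool matchesOpenScene(const GnssFeatures& f) {
    return f.pdop < 2.0 && f.avgSnr > 33.0 && f.avgPseudorangeNoise < 1.1 &&
           f.avgElevation > 30.0 && f.cycleSlipRatio < 0.2;
}

enum class Scene { OS, NOS1, NOS2, Tunnel, Other };

// Evaluable scene types are tried in order; anything that matches none of
// them falls through to Other (step S830). Non-OS thresholds below are
// placeholders for illustration only.
Scene classify(const GnssFeatures& f) {
    if (matchesOpenScene(f)) return Scene::OS;
    if (f.avgSnr > 28.0) return Scene::NOS1;
    if (f.avgSnr > 20.0) return Scene::NOS2;
    if (f.avgSnr <= 20.0 && f.pdop > 10.0) return Scene::Tunnel;
    return Scene::Other;
}
```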
In step S140, a scene type tag corresponding to the scene type at the current time is output, and satellite positioning deviation early warning is performed according to the scene type at the current time.
After the scene type at the current moment is determined, the scene identification system outputs a scene type label corresponding to the scene type at the current moment, and then satellite positioning deviation early warning is carried out through the GNSS positioning system.
Taking vehicle navigation as an example, the positioning performance of the vehicle navigation positioning system is limited by hardware devices such as the GNSS antenna and GNSS chip and by the built-in fusion positioning algorithm, so that, provided no major hardware or algorithm changes are made, the positioning performance and the maximum positioning deviation in different scenes are stable and predictable.
In this exemplary embodiment, as shown in fig. 9, the satellite positioning deviation early warning is performed according to the scene type at the current time, which may specifically include the following steps:
step S910, acquiring a maximum deviation threshold corresponding to the scene type at the current moment and actual measurement data at the current moment, and obtaining a positioning accuracy value under the scene type at the current moment according to the actual measurement data.
And S920, if the positioning accuracy value is smaller than or equal to the maximum deviation threshold value, not performing satellite positioning deviation early warning.
And S930, if the positioning accuracy value is greater than the maximum deviation threshold value, performing satellite positioning deviation early warning.
While scenes are divided into OS, NOS1, NOS2, Tunnel and Other, performance verification of the vehicle navigation positioning system can be carried out based on a large amount of measured data collected in each scene, statistical accuracy values can be obtained, and performance analysis can be performed.
Specifically, the CEP99.7 statistic of the accuracy values, where CEP denotes circular error probable, may be adopted as the maximum positioning deviation threshold. The positioning accuracy value under the scene type at the current moment is compared with this maximum positioning deviation threshold; if the positioning accuracy value is within the CEP99.7 accuracy range, the situation is considered safe and no early warning is issued; otherwise, the system outputs a satellite positioning deviation early warning.
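The warning decision of steps S910–S930 then reduces to a per-scene threshold comparison. In this sketch the threshold table uses invented placeholder numbers; in practice each entry would be the CEP99.7 statistic measured offline for that scene, and the conservative handling of non-evaluable scenes is an assumption:

```cpp
#include <cassert>
#include <map>
#include <string>

// Per-scene maximum positioning deviation thresholds (meters). Placeholder
// values for illustration; real values come from CEP99.7 road-test statistics.
const std::map<std::string, double> kMaxDeviation = {
    {"OS", 0.5}, {"NOS1", 1.5}, {"NOS2", 3.0}, {"Tunnel", 10.0}};

// Returns true when a satellite positioning deviation early warning should
// be issued: the accuracy value exceeds the scene's threshold (step S930),
// or the scene has no defined threshold (assumed: warn conservatively).
bool shouldWarn(const std::string& scene, double accuracyValue) {
    auto it = kMaxDeviation.find(scene);
    if (it == kMaxDeviation.end()) return true;
    return accuracyValue > it->second;
}
```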
For example, after the scene type is determined, each positioning system calculates positioning performance and a maximum deviation threshold value in a corresponding scene and gives a safety protection threshold value in each scene based on a large amount of existing measured data including original GNSS observed data, high-precision reference positioning data, visual images and the like. And finally, outputting the scene label, the protection threshold and the early warning to a vehicle navigation positioning system. The vehicle-mounted navigation system can provide complete and reliable positioning service for a user according to the scene label, the protection threshold and early warning, and reduces risks caused by positioning deviation, so that the reliability and the safety of the vehicle-mounted navigation positioning system are greatly improved.
Fig. 10 is a schematic diagram of a GNSS system in an embodiment of the present disclosure, by which the scene recognition method in the above steps in the present exemplary embodiment can be implemented. The specific structure of the schematic diagram of the GNSS system is as follows:
on the basis of the original architecture of the GNSS/IMU integrated navigation system, a real-time scene recognition system 1010 is extended, which includes an original observation data management module 1001, a GNSS feature value extraction module 1002, a scene detection and recognition module 1003, and a positioning deviation and deviation early warning module 1004.
The GNSS antenna 1005 is configured to receive satellite signals, and the GNSS chip 1006 obtains raw observation data according to the received satellite signals.
The raw observation data management module 1001 collects raw observation data collected by the GNSS chip 1006 and stores the raw observation data according to the sequence of collection time. In order to ensure the convenience and high efficiency of feature value extraction, all satellites can be numbered inside the real-time scene recognition system 1010 and are stored in a sequence according to the number size, and each satellite at least comprises three observation values of pseudo range, carrier wave and signal to noise ratio.
The GNSS feature value extraction module 1002 traverses each satellite of the GNSS original observation data acquired at each time epoch, extracts and calculates satellite signal feature values including a total number of satellites, an average signal-to-noise ratio of the satellites, a PDOP value of the satellites, a cycle slip ratio of the satellites, a pseudo-range multipath, and the like, and these GNSS feature values substantially reflect the quality of the satellite signals.
The scene detection and identification module 1003 performs comparison and analysis according to the series of GNSS feature values extracted in the previous step and the set empirical feature value thresholds. Generally, the GNSS feature value thresholds set for the OS scene are the strictest, indicating the highest level; they can be met only when the GNSS observation quality is good enough. For the other scenes such as NOS1 and NOS2, the GNSS feature value thresholds are relaxed in sequence.
The positioning deviation and deviation early warning module 1004 can give out the maximum positioning deviation and deviation early warning in the corresponding scene according to the identified scene type, then send the deviation and early warning information to the fusion positioning algorithm module 1007, and finally output as navigation positioning, so that the positioning early warning function is achieved, and the positioning reliability and safety are improved.
Fig. 11 is a complete flowchart of a scene recognition method in an embodiment of the present disclosure, which is an illustration of the above steps in the present exemplary embodiment, and the specific steps in the flowchart are as follows:
and step S1102, acquiring and decoding RTCM data.
And S1104, storing original observation data.
Step S1106, judging whether the GNSS observation data at the latest moment is valid.
If the observation data at the latest moment is valid data, the method proceeds to step S1108 and continues to execute the subsequent steps, and if the data is invalid, the scene recognition operation at the current moment is directly exited.
And S1108, extracting the characteristic value of the satellite signal.
The satellite signal characteristic values include the total number of satellites, the Position Dilution of Precision (PDOP) value, the average satellite altitude angle, the average signal-to-noise ratio, the maximum signal-to-noise ratio, the satellite number cycle slip ratio, the average pseudo-range multipath value, and the like. The average pseudo-range multipath value can be calculated from the real-time pseudo-range multipath values of all satellites.
And S1110, judging the type of the open scene.
And judging whether the satellite signal characteristic value meets the characteristic value threshold of the open scene type, and if so, determining the scene type at the current moment as the open scene type. If the satellite signal feature value does not meet the feature value threshold of the open scene type, the process proceeds to step S1114.
And S1112, determining the maximum positioning deviation and deviation early warning of the open scene type.
And S1114, judging the type of the semi-open scene.
And judging whether the satellite signal characteristic value meets the characteristic value threshold of the semi-open scene type, and if so, determining the scene type at the current moment as the semi-open scene type. If the satellite signal eigenvalue does not meet the eigenvalue threshold of the semi-open scene type, step S1118 is performed.
And S1116, determining the maximum positioning deviation and deviation early warning of the semi-open scene type.
And S1118, judging the type of the severely occluded scene.
And judging whether the satellite signal characteristic value meets the characteristic value threshold of the severe occlusion scene type, and if so, determining the scene type at the current moment as the severe occlusion scene type. If the satellite signal feature value does not satisfy the feature value threshold of the severe occlusion scene type, the process proceeds to step S1122.
And S1120, determining the maximum positioning deviation and deviation early warning of the type of the severely-shielded scene.
And S1122, judging the type of the tunnel scene.
And judging whether the satellite signal characteristic value meets the characteristic value threshold of the tunnel scene type, and if so, determining the scene type at the current moment as the tunnel scene type. If the satellite signal feature value does not meet the feature value threshold of the tunnel scene type, the process proceeds to step S1126.
And S1124, determining the maximum positioning deviation and deviation early warning of the tunnel scene type.
Step S1126, determining other scene types.
And if the characteristic value thresholds of the several evaluable scene types do not meet, determining the scene type at the current moment as other scene types.
And S1128, outputting a scene type label.
Fig. 12 is a complete flowchart for calculating the real-time pseudorange multipath values of different frequency points for each effective observed satellite according to an embodiment of the present disclosure, which is a detailed description of the method for calculating the real-time pseudorange multipath values in step S1108 of fig. 11, and the specific steps of the flowchart are as follows:
and S1202, acquiring all original observation data.
And S1204, acquiring original observation data at the latest moment.
Step S1206, traversing and circulating each satellite.
And S1208, calculating a pseudo range noise value of the current satellite.
And S1210, detecting and identifying the cycle slip of the current satellite.
And S1212, judging whether the current satellite generates cycle slip.
If the current satellite has a cycle slip, the process goes to step S1214; if the cycle slip does not occur in the current satellite, the process proceeds to step S1216.
And S1214, clearing the pseudo range noise of the sliding window.
Step S1216, obtaining sliding window average pseudo range noise.
Wherein the size of the sliding window may be set to 5 minutes.
And step S1218, calculating the current satellite multipath value.
And S1220, updating the sliding window pseudo range noise.
Step S1222, storing the multipath values of all the satellites.
The scene recognition method in the present exemplary embodiment may be implemented in the C++ language. The reliability and performance of the algorithm were verified with road test data totaling 4 kilometers collected on national highways and compared against visually identified scene labels; this finally verifies that the system and method can effectively identify different scenes and accommodate the variation and differences of actual scenes, achieving a good implementation effect.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Furthermore, the disclosure also provides a scene recognition device. Referring to fig. 13, the scene recognition apparatus may include an observation data acquisition module 1310, a satellite characteristic determination module 1320, a scene type determination module 1330, and a positioning deviation warning module 1340. Wherein:
the observation data obtaining module 1310 may be configured to obtain valid observation data of each satellite at the current time;
the satellite characteristic determination module 1320 may be configured to extract a basic observation value of each satellite from the effective observation data, and obtain a satellite signal characteristic value at the current time according to the basic observation value of each satellite;
the scene type determining module 1330 may be configured to obtain feature value thresholds corresponding to the respective scene types, and determine the scene type at the current time according to a relationship between the satellite signal feature value and the feature value threshold of the respective scene type;
the positioning deviation early warning module 1340 may be configured to output a scene type tag corresponding to a scene type at the current time, and perform satellite positioning deviation early warning according to the scene type at the current time.
In some exemplary embodiments of the present disclosure, the observation data acquisition module 1310 may include a raw data decoding unit and a raw data verification unit. Wherein:
the original data decoding unit can be used for acquiring original message data obtained at the current moment according to the received satellite signals, and decoding the original message data to obtain original observation data of each satellite at the current moment;
the original data verification unit can be used for performing data validity verification on the original observation data, and if the data validity verification is passed, the original observation data of each satellite at the current moment is used as valid observation data.
In some example embodiments of the present disclosure, the satellite characteristic determination module 1320 may include a valid observed satellite determination unit and a total number of satellites determination unit. Wherein:
the effective observation satellite determining unit can be used for acquiring an effective signal-to-noise ratio threshold value and an effective altitude angle threshold value, and determining the satellites of which the signal-to-noise ratios are greater than the effective signal-to-noise ratio threshold value and the altitude angles are greater than the effective altitude angle threshold value as effective observation satellites;
the total satellite number determining unit may be configured to obtain the number of effective observation satellites, and obtain the total number of satellites at the current time according to the number of effective observation satellites.
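The valid-observation filter used by these two units can be sketched as follows; the struct and function names are hypothetical, and the threshold values in the usage example are illustrative only.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// A satellite is kept as a "valid observation satellite" only when both its
// signal-to-noise ratio and its altitude (elevation) angle exceed the
// configured thresholds; the count of survivors is the total number of
// satellites at the current time.
struct Satellite {
    double snrDbHz;         // carrier-to-noise density, dB-Hz
    double elevationDeg;    // altitude (elevation) angle, degrees
};

std::size_t CountValidSatellites(const std::vector<Satellite>& sats,
                                 double snrThreshold, double elevThreshold) {
    std::size_t count = 0;
    for (const auto& s : sats) {
        if (s.snrDbHz > snrThreshold && s.elevationDeg > elevThreshold) {
            ++count;
        }
    }
    return count;
}
```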
In some exemplary embodiments of the present disclosure, the satellite characteristic determination module 1320 may further include a satellite altitude sum determination unit and an average satellite altitude determination unit. Wherein:
the satellite altitude angle sum determining unit can be used for acquiring altitude angles of all effective observation satellites at the current moment and obtaining a satellite altitude angle sum according to the altitude angles of all the effective observation satellites;
the average satellite altitude determining unit may be configured to obtain the average satellite altitude at the current time according to a ratio of the sum of the satellite altitudes to the total number of satellites.
In some exemplary embodiments of the present disclosure, the satellite characteristic determination module 1320 may further include a satellite signal-to-noise ratio acquisition unit, a signal-to-noise ratio sum determination unit, and an average signal-to-noise ratio determination unit. Wherein:
the satellite signal-to-noise ratio acquisition unit can be used for determining a plurality of frequency points corresponding to each effective observation satellite and respectively acquiring the signal-to-noise ratio of each effective observation satellite at different frequency points at the current moment;
the signal-to-noise ratio sum determining unit can be used for obtaining the signal-to-noise ratio sum of each effective observation satellite at different frequency points according to the signal-to-noise ratio of each effective observation satellite at different frequency points;
the average signal-to-noise ratio determining unit can be used for obtaining the average signal-to-noise ratio of the effective observation satellite of each frequency point at the current moment according to the ratio of the sum of the signal-to-noise ratios to the total number of the satellites.
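A minimal sketch of the per-frequency average SNR computation described by these units. Note that, per the description, the divisor is the total number of satellites, not the count of satellites tracked on that particular frequency point; function and parameter names are illustrative.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Average SNR of one frequency point: sum the SNRs of the valid observation
// satellites tracked on that frequency, then divide by the total number of
// satellites at the current time.
double AverageSnr(const std::vector<double>& snrsOnFreq,
                  std::size_t totalSatellites) {
    double sum = 0.0;
    for (double s : snrsOnFreq) sum += s;
    return totalSatellites == 0 ? 0.0
                                : sum / static_cast<double>(totalSatellites);
}
```

The same sum-then-divide pattern applies to the average satellite altitude angle described earlier.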
In some exemplary embodiments of the present disclosure, the satellite characteristic determination module 1320 may further include a cycle slip detection identification unit and a satellite number cycle slip ratio determination unit. Wherein:
the cycle slip detection and identification unit can be used for detecting and identifying the cycle slip of each effective observation satellite according to the carrier observation values of the effective observation satellite at different frequency points at the current moment so as to judge whether the effective observation satellite generates the cycle slip;
the satellite number cycle slip ratio determining unit can be used for marking the effective observation satellite with the cycle slip as a carrier signal interruption satellite, and obtaining the satellite number cycle slip ratio of each frequency point at the current moment according to the ratio of the number of the carrier signal interruption satellites of each frequency point to the total number of the satellites.
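The patent does not specify which cycle-slip detector is used. One common choice for dual-frequency carrier observations is an epoch-to-epoch jump test on the geometry-free combination, sketched here under that assumption; the GPS L1/L2 wavelengths are standard constants, while the 0.05 m jump threshold is a hypothetical value.

```cpp
#include <cassert>
#include <cmath>

// Geometry-free combination L_GF = lambda1*phi1 - lambda2*phi2 cancels the
// geometric range and clock terms, so a jump between consecutive epochs that
// exceeds the threshold indicates a cycle slip on one of the carriers.
bool DetectCycleSlip(double phi1, double phi2,  // carrier phase, cycles
                     double prevGf,             // previous epoch's L_GF, metres
                     double* gfOut,             // this epoch's L_GF, metres
                     double thresholdM) {
    const double lambda1 = 0.19029367;  // GPS L1 wavelength, m
    const double lambda2 = 0.24421021;  // GPS L2 wavelength, m
    double gf = lambda1 * phi1 - lambda2 * phi2;
    *gfOut = gf;
    return std::fabs(gf - prevGf) > thresholdM;
}
```

Satellites flagged in this way would be marked as carrier signal interruption satellites, and the cycle slip ratio of a frequency point is their count divided by the total number of satellites.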
In some exemplary embodiments of the present disclosure, the satellite characteristic determination module 1320 may further include a pseudo-range multipath value determination unit, a multipath sum determination unit, and an average pseudo-range multipath value determination unit. Wherein:
the pseudo-range multipath value determination unit can be used for determining real-time pseudo-range multipath values of each effective observation satellite at different frequency points according to the carrier observation values and the pseudo-range observation values of each effective observation satellite at different frequency points at the current moment;
the multipath sum determining unit can be used for obtaining the multipath sum of all effective observation satellites of each frequency point according to the real-time pseudo-range multipath values of each effective observation satellite at different frequency points;
the average pseudorange multipath value determining unit may be configured to obtain an average pseudorange multipath value of each frequency point at the current time according to a ratio of the multipath sum to the total number of satellites.
In some exemplary embodiments of the present disclosure, the pseudorange multipath value determining unit may include a pseudorange multipath parameter obtaining unit, a pseudorange noise value determining unit, a cycle slip detecting and identifying unit, a pseudorange multipath value calculating unit, and a sliding window updating unit. Wherein:
the pseudo-range multipath parameter acquisition unit may be used to acquire the frequency of each frequency point and the frequency point error value, and to acquire the size of the pseudo-range noise sliding window;
the pseudo-range noise value determination unit can be used for obtaining the pseudo-range noise value of each effective observation satellite at different frequency points according to the carrier observed value and the pseudo-range observed value of each effective observation satellite at different frequency points at the current moment, and the frequency and frequency point error value of each frequency point;
the cycle slip detection and identification unit can be used for detecting and identifying the cycle slip of each effective observation satellite according to the carrier observation values of each effective observation satellite at different frequency points at the current moment;
the pseudo-range multipath value calculation unit can be used for obtaining a real-time pseudo-range multipath value of the effective observation satellite at the frequency point according to the pseudo-range noise value of the effective observation satellite at the frequency point and the average pseudo-range noise value in the pseudo-range noise sliding window if the effective observation satellite does not generate cycle slip at the frequency point;
the sliding window updating unit may be configured to empty the pseudo-range noise value in the pseudo-range noise sliding window if the cycle slip occurs at the frequency point for the effective observation satellite, and perform the calculation of the real-time pseudo-range multipath value of the next effective observation satellite again.
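The patent only states that the pseudo-range noise value is formed from the carrier and pseudo-range observations together with the frequencies and frequency point error values. One standard realisation of such a term is the classical dual-frequency code-minus-carrier (MP) combination, sketched below as an assumption, not necessarily the patented formula: MP1 = P1 - (a+1)/(a-1)·L1 + 2/(a-1)·L2 with a = f1²/f2² and Li = λi·φi.

```cpp
#include <cassert>
#include <cmath>

// Dual-frequency code-minus-carrier combination: the geometric range,
// ionosphere (first order) and clock terms cancel, leaving pseudo-range
// multipath and noise plus a carrier-ambiguity bias that is constant between
// cycle slips (hence the sliding-window averaging and the reset on slips).
double PseudorangeNoise(double p1,      // pseudo-range on f1, metres
                        double phi1,    // carrier phase on f1, cycles
                        double phi2,    // carrier phase on f2, cycles
                        double f1Hz, double f2Hz) {
    const double c = 299792458.0;       // speed of light, m/s
    double l1 = (c / f1Hz) * phi1;      // carrier range on f1, metres
    double l2 = (c / f2Hz) * phi2;     // carrier range on f2, metres
    double a = (f1Hz * f1Hz) / (f2Hz * f2Hz);
    return p1 - (a + 1.0) / (a - 1.0) * l1 + 2.0 / (a - 1.0) * l2;
}
```

With noise-free observations and zero ambiguities, the combination evaluates to zero, which is a useful sanity check.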
In some exemplary embodiments of the present disclosure, the pseudo-range multipath parameter obtaining unit may include a pseudo-range noise sliding window determining unit, which may be configured to obtain the sampling rate of the basic observation values of the effective observation satellites and determine the size of the pseudo-range noise sliding window according to that sampling rate.
In some exemplary embodiments of the present disclosure, the pseudo-range multipath value calculation unit may include an average pseudo-range noise value calculation unit and a real-time pseudo-range multipath value calculation unit. Wherein:
the average pseudo-range noise value calculation unit may be configured to obtain pseudo-range noise values of the effective observed satellite at multiple times within the pseudo-range noise sliding window according to the size of the pseudo-range noise sliding window, and obtain an average pseudo-range noise value within the pseudo-range noise sliding window according to the pseudo-range noise values at the multiple times;
the real-time pseudo-range multipath value calculation unit may be configured to obtain a real-time pseudo-range multipath value of the effective observation satellite at the frequency point according to an absolute value of a difference between a pseudo-range noise value of the effective observation satellite at the frequency point at the current time and an average pseudo-range noise value in a pseudo-range noise sliding window.
In some example embodiments of the present disclosure, the scene type determination module 1330 may include a feature value threshold comparison unit, an evaluable scene type determination unit, and other scene type determination units. Wherein:
the characteristic value threshold comparison unit can be used for comparing the satellite signal characteristic value with the characteristic value threshold of each evaluable scene type respectively;
the evaluable scene type determining unit may be configured to determine a scene type at a current time as a corresponding evaluable scene type if the satellite signal feature value satisfies a feature value threshold of a certain evaluable scene type;
the other scene type determining unit may be configured to determine the scene type at the current time as the other scene type if the satellite signal feature value does not satisfy the feature value threshold of any evaluable scene type.
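The threshold-comparison logic of these three units can be sketched as a rule list: each evaluable scene type carries a predicate over the epoch's feature values, the first satisfied predicate determines the scene type, and "other" is the fallback. The scene names, feature fields, and threshold values below are illustrative assumptions.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Satellite signal feature values for one epoch.
struct Features {
    int totalSatellites;
    double avgElevationDeg;
    double avgSnrDbHz;
    double cycleSlipRatio;
    double avgMultipathM;
};

// One evaluable scene type: a label plus its feature-value threshold test.
struct SceneRule {
    std::string label;
    std::function<bool(const Features&)> matches;
};

std::string ClassifyScene(const Features& f,
                          const std::vector<SceneRule>& rules) {
    for (const auto& r : rules) {
        if (r.matches(f)) return r.label;   // first satisfied rule wins
    }
    return "other";                          // no evaluable type satisfied
}
```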
In some exemplary embodiments of the present disclosure, the positioning deviation warning module 1340 may include a positioning accuracy value determination unit, a safety positioning unit, and a positioning deviation warning unit. Wherein:
the positioning accuracy value determining unit can be used for acquiring a maximum deviation threshold corresponding to the scene type at the current moment and actual measurement data at the current moment, and obtaining a positioning accuracy value under the scene type at the current moment according to the actual measurement data;
the safety positioning unit can be used for not carrying out satellite positioning deviation early warning if the positioning accuracy value is less than or equal to the maximum deviation threshold value;
the positioning deviation early warning unit can be used for carrying out satellite positioning deviation early warning if the positioning accuracy value is greater than the maximum deviation threshold value.
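The early-warning decision of these units can be sketched as a per-scene threshold lookup; the scene labels and deviation values here are illustrative assumptions.

```cpp
#include <cassert>
#include <map>
#include <string>

// Each scene type maps to a maximum tolerated positioning deviation. A warning
// is raised only when the measured positioning accuracy value exceeds the
// threshold of the recognised scene; accuracy <= threshold is considered safe.
bool ShouldWarn(const std::string& scene, double accuracyM,
                const std::map<std::string, double>& maxDeviationM) {
    auto it = maxDeviationM.find(scene);
    if (it == maxDeviationM.end()) return true;  // unknown scene: warn conservatively
    return accuracyM > it->second;
}
```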
The details of each module/unit in the scene recognition apparatus have been described in detail in the corresponding method embodiment section, and are not described herein again.
FIG. 14 illustrates a schematic structural diagram of a computer system of an electronic device suitable for implementing an embodiment of the present invention.
It should be noted that the computer system 1400 of the electronic device shown in fig. 14 is only an example, and should not bring any limitation to the function and the scope of the application of the embodiment of the present invention.
As shown in fig. 14, the computer system 1400 includes a Central Processing Unit (CPU)1401, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)1402 or a program loaded from a storage portion 1408 into a Random Access Memory (RAM) 1403. In the RAM 1403, various programs and data necessary for system operation are also stored. The CPU1401, ROM 1402, and RAM 1403 are connected to each other via a bus 1404. An input/output (I/O) interface 1405 is also connected to bus 1404.
The following components are connected to the I/O interface 1405: an input portion 1406 including a keyboard, a mouse, and the like; an output portion 1407 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), a speaker, and the like; a storage portion 1408 including a hard disk and the like; and a communication section 1409 including a network interface card such as a LAN card, a modem, or the like. The communication section 1409 performs communication processing via a network such as the internet. A drive 1410 is also connected to the I/O interface 1405 as necessary. A removable medium 1411 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1410 as necessary, so that a computer program read out therefrom is installed into the storage section 1408 as necessary.
In particular, the processes described below with reference to the flowcharts may be implemented as computer software programs, according to embodiments of the present invention. For example, embodiments of the invention include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network via the communication portion 1409 and/or installed from the removable medium 1411. When the computer program is executed by a Central Processing Unit (CPU)1401, various functions defined in the system of the present application are executed.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the above embodiments.
It should be noted that although in the above detailed description several modules of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more of the modules described above may be embodied in one module, in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module described above may be further divided so as to be embodied by a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (15)

1. A method for scene recognition, comprising:
obtaining effective observation data of each satellite at the current moment;
extracting the basic observation value of each satellite from the effective observation data, and obtaining a satellite signal characteristic value at the current moment according to the basic observation value of each satellite;
acquiring a characteristic value threshold corresponding to each scene type, and determining the scene type at the current moment according to the relationship between the satellite signal characteristic value and the characteristic value threshold of each scene type;
and outputting a scene type label corresponding to the scene type at the current moment, and performing satellite positioning deviation early warning according to the scene type at the current moment.
2. The method of claim 1, wherein the obtaining valid observation data of each satellite at the current time comprises:
acquiring original message data obtained at the current moment according to received satellite signals, and decoding the original message data to obtain original observation data of each satellite at the current moment;
and performing data validity verification on the original observation data, and if the verification is passed, taking the original observation data of each satellite at the current moment as valid observation data.
3. The scene recognition method of claim 1, wherein the basic observation values of the satellites include a signal-to-noise ratio and an altitude angle, the satellite signal feature value includes a total number of satellites, and the obtaining the satellite signal feature value at the current time according to the basic observation value of each of the satellites comprises:
acquiring an effective signal-to-noise ratio threshold value and an effective altitude angle threshold value, and determining the satellites of which the signal-to-noise ratios are greater than the effective signal-to-noise ratio threshold value and the altitude angles are greater than the effective altitude angle threshold value as effective observation satellites;
and acquiring the number of the effective observation satellites, and acquiring the total number of the satellites at the current moment according to the number of the effective observation satellites.
4. The scene recognition method according to claim 3, wherein the satellite signal feature value includes an average satellite altitude angle, and the obtaining of the satellite signal feature value at the current time from the basic observation value of each of the satellites includes:
acquiring the altitude angles of all the effective observation satellites at the current moment, and obtaining the sum of the altitude angles of the satellites according to the altitude angles of all the effective observation satellites;
and obtaining the average satellite altitude at the current moment according to the ratio of the satellite altitude sum to the total number of the satellites.
5. The method of claim 3, wherein the satellite signal feature value comprises an average signal-to-noise ratio, and the obtaining the satellite signal feature value at the current time according to the basic observation value of each satellite comprises:
determining a plurality of frequency points corresponding to each effective observation satellite, and respectively acquiring the signal-to-noise ratio of each effective observation satellite at different frequency points at the current moment;
obtaining the sum of the signal-to-noise ratios of the effective observation satellites at different frequency points according to the signal-to-noise ratios of the effective observation satellites at different frequency points;
and obtaining the average signal-to-noise ratio of the effective observation satellite of each frequency point at the current moment according to the ratio of the sum of the signal-to-noise ratios to the total number of the satellites.
6. The scene recognition method of claim 5, wherein the basic observation values of the satellites include carrier observation values, the satellite signal feature value includes a satellite number cycle slip ratio, and the obtaining the satellite signal feature value at the current time according to the basic observation value of each of the satellites comprises:
performing cycle slip detection and identification on the effective observation satellites according to carrier observation values of the effective observation satellites at different frequency points at the current moment so as to judge whether the effective observation satellites generate cycle slips or not;
and marking the effective observation satellites with cycle slip as carrier signal interruption satellites, and obtaining the cycle slip ratio of the satellite number of each frequency point at the current moment according to the ratio of the number of the carrier signal interruption satellites of each frequency point to the total number of the satellites.
7. The scene recognition method of claim 5, wherein the basic observation values of the satellites include carrier observation values and pseudo-range observation values, the satellite signal feature value includes an average pseudo-range multipath value, and the obtaining the satellite signal feature value at the current time according to the basic observation value of each of the satellites comprises:
determining real-time pseudo-range multipath values of the effective observation satellites at different frequency points according to carrier observation values and pseudo-range observation values of the effective observation satellites at different frequency points at the current moment;
obtaining the multipath sum of all effective observation satellites of each frequency point according to the real-time pseudo-range multipath values of each effective observation satellite at different frequency points;
and obtaining an average pseudo-range multipath value of each frequency point at the current moment according to the ratio of the multipath sum to the total number of the satellites.
8. The method of claim 7, wherein the determining the real-time pseudorange multipath values of the effective observation satellites at different frequency points according to the carrier observation values and the pseudorange observation values of the effective observation satellites at different frequency points at the current time comprises:
acquiring the frequency of each frequency point and a frequency point error value, and acquiring the size of a pseudo-range noise sliding window;
obtaining pseudo-range noise values of the effective observation satellites at different frequency points according to carrier observation values and pseudo-range observation values of the effective observation satellites at different frequency points at the current moment, and frequencies of the frequency points and the frequency point error values;
performing cycle slip detection and identification on the effective observation satellites according to carrier observation values of the effective observation satellites at different frequency points at the current moment;
if the effective observation satellite does not generate cycle slip at the frequency point, obtaining a real-time pseudo-range multipath value of the effective observation satellite at the frequency point according to the pseudo-range noise value of the effective observation satellite at the frequency point and the average pseudo-range noise value in the pseudo-range noise sliding window;
and if the cycle slip of the effective observation satellite occurs at the frequency point, clearing the pseudo-range noise value in the pseudo-range noise sliding window, and calculating the real-time pseudo-range multipath value of the next effective observation satellite again.
9. The method of claim 8, wherein obtaining the pseudorange noise sliding window size comprises:
and acquiring the sampling rate of the basic observation value of the effective observation satellite, and determining the size of a pseudo-range noise sliding window according to the sampling rate of the basic observation value.
10. The method according to claim 8, wherein the obtaining a real-time pseudorange multipath value of the effective observed satellite at the frequency point according to the pseudorange noise value of the effective observed satellite at the frequency point and the average pseudorange noise value within the pseudorange noise sliding window comprises:
obtaining pseudo range noise values of the effective observation satellite at multiple moments in the pseudo range noise sliding window according to the size of the pseudo range noise sliding window, and obtaining an average pseudo range noise value in the pseudo range noise sliding window according to the pseudo range noise values at the multiple moments;
and obtaining a real-time pseudo-range multipath value of the effective observation satellite at the frequency point according to the absolute value of the difference value between the pseudo-range noise value of the effective observation satellite at the frequency point at the current moment and the average pseudo-range noise value in the pseudo-range noise sliding window.
11. The scene recognition method according to claim 1, wherein the scene types include an evaluable scene type and other scene types, and the determining the scene type at the current time according to the relationship between the satellite signal feature value and the feature value threshold of each of the scene types includes:
comparing the satellite signal characteristic value with a characteristic value threshold value of each evaluable scene type respectively;
if the satellite signal characteristic value meets a characteristic value threshold of a certain evaluable scene type, determining the scene type at the current moment as a corresponding evaluable scene type;
and if the satellite signal characteristic value does not meet the characteristic value threshold of any one evaluable scene type, determining the scene type at the current moment as other scene types.
12. The scene recognition method of claim 1, wherein the performing satellite positioning deviation early warning according to the scene type at the current time comprises:
acquiring a maximum deviation threshold corresponding to the scene type at the current moment and actual measurement data at the current moment, and obtaining a positioning accuracy value under the scene type at the current moment according to the actual measurement data;
if the positioning accuracy value is smaller than or equal to the maximum deviation threshold value, satellite positioning deviation early warning is not carried out;
and if the positioning accuracy value is greater than the maximum deviation threshold value, performing satellite positioning deviation early warning.
13. A scene recognition apparatus, comprising:
the observation data acquisition module is used for acquiring effective observation data of each satellite at the current moment;
the satellite characteristic determination module is used for extracting the basic observation value of each satellite from the effective observation data and obtaining a satellite signal characteristic value at the current moment according to the basic observation value of each satellite;
the scene type determining module is used for acquiring a characteristic value threshold corresponding to each scene type and determining the scene type at the current moment according to the relationship between the satellite signal characteristic value and the characteristic value threshold of each scene type;
and the positioning deviation early warning module is used for outputting a scene type label corresponding to the scene type at the current moment and carrying out satellite positioning deviation early warning according to the scene type at the current moment.
14. An electronic device, comprising:
a processor; and
memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the scene recognition method of any of claims 1 to 12.
15. A computer-readable medium, on which a computer program is stored, which program, when being executed by a processor, carries out the scene recognition method according to any one of claims 1 to 12.
CN202011633209.3A 2020-12-31 2020-12-31 Scene recognition method and device, electronic equipment and computer readable medium Pending CN114690223A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011633209.3A CN114690223A (en) 2020-12-31 2020-12-31 Scene recognition method and device, electronic equipment and computer readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011633209.3A CN114690223A (en) 2020-12-31 2020-12-31 Scene recognition method and device, electronic equipment and computer readable medium

Publications (1)

Publication Number Publication Date
CN114690223A true CN114690223A (en) 2022-07-01

Family

ID=82134061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011633209.3A Pending CN114690223A (en) 2020-12-31 2020-12-31 Scene recognition method and device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN114690223A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115267861A (en) * 2022-09-30 2022-11-01 智道网联科技(北京)有限公司 Automatic driving fusion positioning precision testing method and device and electronic equipment
CN115267861B (en) * 2022-09-30 2023-03-10 智道网联科技(北京)有限公司 Automatic driving fusion positioning precision testing method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN107966724B (en) It is a kind of based on 3D city model auxiliary urban canyons in satellite positioning method
Wang et al. Smartphone shadow matching for better cross-street GNSS positioning in urban environments
AU2016331625B2 (en) System and method for localization and tracking using GNSS location estimates, satellite SNR data and 3D maps
US7030814B2 (en) System and method to estimate the location of a receiver in a multi-path environment
US20020042268A1 (en) Systems and methods for determining signal coverage
US11899117B2 (en) Moving body positioning system, method, and program
JP5425039B2 (en) Satellite signal determination apparatus and program
KR20140138027A (en) Receivers and methods for multi-mode navigation
TW201508305A (en) Cloud-offloaded global satellite positioning
CN114624741A (en) Positioning accuracy evaluation method and device
US20200110182A1 (en) Positioning device and positioning method
Zair et al. A-contrario modeling for robust localization using raw GNSS data
CN113031031A (en) Weighting positioning method based on GNSS signal accurate classification in urban canyon
CN113970761A (en) Non-line-of-sight signal identification method, system, computer equipment and storage medium
CN114690223A (en) Scene recognition method and device, electronic equipment and computer readable medium
Ziedan Optimized position estimation in mobile multipath environments using machine learning
US20140085138A1 (en) Efficient detection of movement using satellite positioning systems
Nasr-Azadani et al. Detecting Multipath Effects on Smartphone GNSS Measurements Using CMCD and Elevation-Dependent SNR Selection Technique
KR101274629B1 (en) Hardware bias calculating system and method
CN113687397B (en) Method for detecting tightly-combined navigation forwarding type deception jamming
KR101334507B1 (en) Positioning system and methdo
Lv et al. Research on shadow matching algorithm based on consistency probability weighting
KR101392068B1 (en) Apparatus and method for fault detecting of global navigation satellite system using receiver baseline
CN113325450A (en) Positioning method, positioning device, electronic equipment and storage medium
CN111175789A (en) Ionized layer anomaly monitoring method, device and system of foundation enhancement system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination