CN112396125B - Classification method, device, equipment and storage medium for positioning test scenes - Google Patents

Classification method, device, equipment and storage medium for positioning test scenes

Info

Publication number
CN112396125B
Authority
CN
China
Prior art keywords
occlusion
shielding
source
degree
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011386443.0A
Other languages
Chinese (zh)
Other versions
CN112396125A (en)
Inventor
刘阳
王硕
高洪伟
李璇
黄志福
张金柱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Group Corp filed Critical FAW Group Corp
Priority to CN202011386443.0A priority Critical patent/CN112396125B/en
Publication of CN112396125A publication Critical patent/CN112396125A/en
Application granted granted Critical
Publication of CN112396125B publication Critical patent/CN112396125B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention discloses a classification method, device, equipment and storage medium for positioning test scenes. The method comprises: obtaining an image to be classified of the current test scene; determining the categories of the shielding objects in the image to be classified; determining the area of the image to be classified and the area of each category of shielding object; and determining the shielding degree based on the total area of all categories of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding, so that the image to be classified under each test scene can be subdivided and its shielding degree determined accurately. The scene classification result of the image to be classified is then determined according to the shielding degree and the radiation category of the radiation source, so that positioning test scenes are classified comprehensively and accurately, a reasonable positioning accuracy is set for each category of test scene, and the evaluation accuracy of the high-precision satellite positioning effect is improved based on the positioning accuracy set for each category of test scene.

Description

Classification method, device, equipment and storage medium for positioning test scenes
Technical Field
The embodiment of the invention relates to a positioning technology, in particular to a method, a device, equipment and a storage medium for classifying positioning test scenes.
Background
With the continuous development of positioning and navigation technology, vehicles' requirements for positioning precision keep rising, and vehicle-mounted sub-meter-level and even centimeter-level positioning products have appeared one after another. However, satellite positioning and navigation rely on satellite signals, and as the satellites' angles in the sky change, the positioning signals are easily affected by the surrounding environment. For example, buildings on both sides of the road, metal signboards, and large vehicles traveling on the road differ greatly in their influence on positioning accuracy. Therefore, when a high-precision positioning test is performed, the test scenes need to be classified so that tests can be completed under different test scenes.
In the prior art, high-precision positioning test scenes are generally divided simply according to road conditions and surrounding buildings. For example, positioning test scenes are divided into highways, urban expressways, low buildings, urban canyons, tree-shaded roads, areas under overpasses, and the like. However, the division precision of such positioning test scenes is poor, because positioning test results vary widely even within the same scene. For example, on the same highway, different numbers of large transport vehicles produce different positioning test results, and errors of tens of centimeters in the positioning test results are highly significant for sub-meter or even centimeter-level positioning accuracy.
In summary, in the process of implementing the present invention, the inventors found at least the following problem in the prior art: the division precision of positioning test scenes is low, so the positioning accuracy set for different test scenes is inaccurate, which reduces the evaluation accuracy of the satellite high-precision positioning effect.
Disclosure of Invention
The embodiment of the invention provides a classification method, device, equipment and storage medium for positioning test scenes, which improve the classification accuracy of positioning test scenes, enable the positioning accuracy of each test scene to be set reasonably, and further improve the evaluation accuracy of the high-precision positioning effect of a satellite.
In a first aspect, an embodiment of the present invention provides a method for classifying a positioning test scenario, including:
acquiring an image to be classified of a current test scene;
determining the type of the shielding object in the image to be classified, and determining the area of the image to be classified and the area of each type of shielding object;
determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
In a second aspect, an embodiment of the present invention further provides a classification apparatus for positioning a test scenario, including:
the image acquisition module is used for acquiring an image to be classified of the current test scene;
the area determining module is used for determining the type of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
the shielding degree determining module is used for determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and the scene classification module is used for determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
In a third aspect, an embodiment of the present invention further provides a classification device for a positioning test scenario, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the classification method for a positioning test scenario according to any one of the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, implement the classification method for localization test scenarios according to any one of the first aspect.
According to the technical scheme, an image to be classified of the current test scene is obtained, the categories of the shielding objects in the image to be classified are determined, the area of the image to be classified and the area of each category of shielding object are determined, and the shielding degree is determined based on the total area of all categories of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding; in this way, the images to be classified under open and non-open test scenes can be subdivided, and the shielding degree of the image to be classified can be determined accurately. The scene classification result of the image to be classified is then determined according to the shielding degree and the radiation category of the radiation source, so that positioning test scenes can be classified comprehensively and accurately, a reasonable positioning accuracy can be set for each category of test scene, and the evaluation accuracy of the high-precision satellite positioning effect is further improved according to the set positioning accuracy.
Drawings
Fig. 1 is a schematic flowchart of a classification method for positioning test scenes according to an embodiment of the present invention;
FIG. 2 is a diagram of an image to be classified according to a first embodiment of the present invention;
fig. 3 is a schematic structural diagram of a classification apparatus for positioning a test scenario according to a second embodiment of the present invention;
fig. 4 is a schematic structural diagram of a classification device for positioning a test scenario according to a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flowchart of a method for classifying positioning test scenes according to an embodiment of the present invention. The method is applicable to classifying positioning test scenes and can be executed by a classification apparatus for positioning test scenes, which can be implemented in software and/or hardware and is generally integrated in classification equipment for positioning test scenes. Referring specifically to fig. 1, the method may include the following steps:
and S110, acquiring an image to be classified of the current test scene.
The current test scene refers to the positioning test scene in which a vehicle equipped with a positioning product is located. The current test scene can be an expressway, an urban expressway, a tree-shaded road, an area under a viaduct, an area near a high-voltage tower, and the like. The image to be classified can be collected by a camera or other photographing equipment on the vehicle.
S120, determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects.
The categories of shielding objects may include tree, building, metal and special categories. The tree category comprises tall trees beside the road: they block satellite signals to some extent but cannot block them completely, are not prone to reflecting the signals, and easily cause multipath problems. The building category comprises various concrete buildings, tall buildings, bridges, viaducts, overpasses, tunnels, mountains and the like: they can block satellite signals completely, reflect the signals to a certain extent, and easily cause multipath problems. The metal category comprises large metal advertising boards and guideboards on both sides of the road, large vehicles traveling on the road and the like: they can block satellite signals completely, reflect the signals easily, and easily cause multipath problems. The special category comprises high-voltage towers, signal towers and the like, which interfere with satellite signals, reduce their reliability and affect positioning accuracy.
Optionally, determining the categories of the shielding objects in the image to be classified includes: inputting the image to be classified into a pre-trained classification model to obtain labels of the various shielding objects, and determining the categories of the shielding objects according to the labels, wherein the trained classification model is obtained by performing supervised training on an initial classification model based on sample scene images and sample classification images carrying labels.
The classification model may be a neural network model or another learning algorithm. Illustratively, the classification model may be a decision tree, a logistic regression (LR) model, a support vector machine (SVM), a fully convolutional network, or the like. Specifically, the sample scene images are original images of each sampling scene acquired by a camera or other photographing equipment; sample scene images containing shielding objects are used as positive samples, and sample scene images without shielding objects are used as negative samples. The initial classification model is then trained in a supervised manner on the positive samples, the negative samples and the labeled sample classification images so as to adjust its model parameters, and the classification model whose model parameters have reached a stable state is taken as the trained classification model.
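As a minimal illustrative sketch only (not part of the original disclosure), the following Python code shows how such a supervised training loop could be organized; the model architecture, the five-category label convention and the randomly generated stand-in data are assumptions introduced here for illustration.

```python
# Minimal sketch of supervised training for an occlusion classification model.
# The architecture, label set and data are assumptions, not from the patent.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

NUM_CLASSES = 5  # assumed labels: background, tree, building, metal, special

# Stand-in training data: 64 RGB images with one occlusion label per image.
images = torch.randn(64, 3, 64, 64)
labels = torch.randint(0, NUM_CLASSES, (64,))
loader = DataLoader(TensorDataset(images, labels), batch_size=8, shuffle=True)

# Deliberately small classifier standing in for the patent's classification model.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, NUM_CLASSES),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)  # supervised loss against the sample labels
        loss.backward()
        optimizer.step()               # adjust model parameters toward a stable state
```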
Further, the image to be classified is input into the trained classification model to obtain the label and prediction probability of each shielding object; the categories of the shielding objects are determined according to the labels and prediction probabilities; the area of the image to be classified is determined from its side lengths; and the area of each category of shielding object is determined from the number of pixels belonging to that category in the image to be classified.
As shown in fig. 2, the image to be classified is input into the trained classification model, and the categories of the shielding objects are determined to be buildings, trees and metal signboards; the shielding area of the buildings is calculated from the pixels belonging to buildings in the image to be classified, the shielding area of the trees is calculated from the pixels belonging to trees, and the shielding area of the metal signboards is calculated from the pixels belonging to metal signboards.
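As an illustrative sketch only, the following Python code computes the image area and the per-category shielding areas from a per-pixel label map; the label map, the category ids and the example region layout are assumptions, not details given in the patent.

```python
# Sketch: pixel counting to obtain the image area and per-category occlusion areas.
# The per-pixel label map and category ids are assumed for illustration.
import numpy as np

CATEGORY_IDS = {"tree": 1, "building": 2, "metal": 3, "special": 4}  # 0 = no occlusion

def occlusion_areas(label_map: np.ndarray) -> tuple[int, dict[str, int]]:
    """Return (image area in pixels, area in pixels for each occlusion category)."""
    image_area = label_map.size
    areas = {name: int(np.count_nonzero(label_map == cid))
             for name, cid in CATEGORY_IDS.items()}
    return image_area, areas

# Example: a 480x640 label map produced by the trained classification model.
label_map = np.zeros((480, 640), dtype=np.uint8)
label_map[:200, :300] = CATEGORY_IDS["building"]  # building region
label_map[300:, 500:] = CATEGORY_IDS["metal"]     # metal signboard region
image_area, areas = occlusion_areas(label_map)
print(image_area, areas)
```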
S130, determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified.
Determining the shielding degree based on the total area of all categories of shielding objects and the area of the image to be classified includes: calculating a first ratio of the total area of all categories of shielding objects to the area of the image to be classified, and comparing the first ratio with at least one first shielding threshold to determine the shielding degree. The total area of all categories of shielding objects is the sum of the areas of the individual categories; the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding; and the first shielding threshold comprises at least one of 10%, 30%, 45%, 60% and 90%.
Specifically, no shielding means that there is essentially no obvious shielding object, and the first ratio of the total shielding area to the area of the image to be classified is less than or equal to 10%. Low shielding means that only distant low-angle shielding objects or nearby flexible shielding objects exist, and the first ratio is greater than or equal to 10% and less than or equal to 30%. Medium shielding means that the distant low-angle range is seriously shielded or there are partial shielding objects at vehicle-body height on one side, and the first ratio is greater than or equal to 30% and less than or equal to 45%. High shielding means that there is a large-area shielding object on one side of the vehicle body or partial shielding objects on both sides, and the first ratio is greater than or equal to 45% and less than or equal to 60%. Ultrahigh shielding means that there are large-area shielding objects at vehicle-body height on both sides, or the vehicle is even partially shielded along the driving direction; under this shielding degree satellite positioning basically cannot work and no accuracy requirement is imposed, and the first ratio is greater than or equal to 60% and less than or equal to 90%.
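As an illustrative sketch only, the following Python function maps the first ratio to a shielding degree using the thresholds quoted above (10%, 30%, 45%, 60%, 90%); the exact handling of the shared boundary values is an assumption.

```python
# Sketch: mapping the first ratio (total occlusion area / image area) to a degree.
# Boundary handling at the shared threshold values is an assumption.
def occlusion_degree(total_occlusion_area: float, image_area: float) -> str:
    first_ratio = total_occlusion_area / image_area
    if first_ratio <= 0.10:
        return "no occlusion"
    if first_ratio <= 0.30:
        return "low occlusion"
    if first_ratio <= 0.45:
        return "medium occlusion"
    if first_ratio <= 0.60:
        return "high occlusion"
    return "ultrahigh occlusion"  # roughly the 60%-90% band and above

print(occlusion_degree(120_000, 480 * 640))  # -> "medium occlusion" (ratio ~0.39)
```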
In this way, the images to be classified under open and non-open test scenes can be subdivided, the shielding degree of the image to be classified can be determined accurately, the accuracy of the scene classification result of the image to be classified is improved, and the positioning accuracy can then be set reasonably for different scenes.
S140, determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
In order to improve the scene classification precision of the image to be classified, before the scene classification result is determined, the occlusion sources in scenes whose occlusion degree is medium occlusion, high occlusion or ultrahigh occlusion can be counted. Optionally, the statistical method for the occlusion sources includes: when the occlusion degree is high occlusion or ultrahigh occlusion, calculating a second ratio of the area of each type of occlusion to the total area of all occlusions, and when the occlusion degree is medium occlusion, calculating a third ratio of the area of each type of occlusion to the total area of all occlusions; then comparing the second ratio with a second occlusion threshold and a third occlusion threshold to determine the first occlusion source category corresponding to high occlusion or ultrahigh occlusion, and comparing the third ratio with the second occlusion threshold and the third occlusion threshold to determine the second occlusion source category corresponding to medium occlusion.
For example, if the occlusion is a tree and the occlusion degree is medium occlusion, the third ratio for the tree is

$$P_{\text{tree}} = \frac{M_{\text{tree}}}{M_{\text{all occlusions}}}$$

where $M_{\text{tree}}$ is the area of the tree occlusions and $M_{\text{all occlusions}}$ is the total area of all types of occlusions. If the occlusion is metal and the occlusion degree is high occlusion or ultrahigh occlusion, the second ratio for the metal is

$$P_{\text{metal}} = \frac{M_{\text{metal}}}{M_{\text{all occlusions}}}$$

where $M_{\text{metal}}$ is the area of the metal occlusions and $M_{\text{all occlusions}}$ is the total area of all types of occlusions.
The method for determining the first occlusion source category comprises the following steps: if the second ratio of any occlusion whose occlusion degree is high occlusion or ultrahigh occlusion is greater than or equal to the second occlusion threshold, that occlusion is taken as a main occlusion source; if the second ratio of such an occlusion is less than or equal to the second occlusion threshold and greater than or equal to the third occlusion threshold, that occlusion is taken as a secondary occlusion source; and the first occlusion source category corresponding to high occlusion or ultrahigh occlusion is determined based on the main occlusion source and the secondary occlusion source.
The method for determining the second occlusion source category comprises the following steps: if the third ratio of any shielding object with the shielding degree of medium shielding is larger than or equal to the second shielding threshold value, taking the shielding object as a main shielding source with the shielding degree of medium shielding; and taking the occlusion type corresponding to the main occlusion source with the occlusion degree of middle occlusion as the second occlusion source type.
Wherein the second occlusion threshold may be 60% and the third occlusion threshold may be 30%.
Illustratively, when determining the first occlusion source category corresponding to high occlusion or ultrahigh occlusion, the second ratio is compared with the second occlusion threshold and the third occlusion threshold: if the second ratio of any occlusion is greater than or equal to 60%, that occlusion is taken as a main occlusion source; if the second ratio of any occlusion is less than or equal to 60% and greater than or equal to 30%, that occlusion is taken as a secondary occlusion source; the first occlusion source category is then determined according to the main occlusion source and the secondary occlusion source. As shown in Table 1, a classification table of the first occlusion source category, the first occlusion source categories for high occlusion and for ultrahigh occlusion each include 13 cases.
Table 1: classification table of first occlusion source category
[Table 1 is provided as an image in the original publication; it lists the 13 combinations of main and secondary occlusion sources for each of high occlusion and ultrahigh occlusion.]
Illustratively, when determining the second occlusion source category corresponding to medium occlusion, the third ratio is compared with the second occlusion threshold and the third occlusion threshold: if the third ratio of any occlusion is greater than or equal to 60%, that occlusion is taken as the main occlusion source for medium occlusion. The main occlusion source may be trees, metal, buildings, or a mixture with no dominant source, so the second occlusion source category includes 4 cases.
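As an illustrative sketch only, the following Python function applies the 60% and 30% thresholds quoted above to the per-category area shares in order to pick main and secondary occlusion sources; the category names, the "mixed" fallback label and the return format are assumptions.

```python
# Sketch: selecting main/secondary occlusion sources from per-category area shares.
# Thresholds follow the text above; names and return format are assumptions.
def occlusion_sources(areas: dict[str, float], degree: str,
                      second_threshold: float = 0.60,
                      third_threshold: float = 0.30) -> dict[str, list[str]]:
    total = sum(areas.values()) or 1.0
    shares = {name: area / total for name, area in areas.items()}
    if degree in ("high occlusion", "ultrahigh occlusion"):
        main = [n for n, s in shares.items() if s >= second_threshold]
        secondary = [n for n, s in shares.items()
                     if third_threshold <= s < second_threshold]
        return {"main": main, "secondary": secondary}
    if degree == "medium occlusion":
        main = [n for n, s in shares.items() if s >= second_threshold]
        # If no single category dominates, treat it as a mixed occlusion source.
        return {"main": main or ["mixed"], "secondary": []}
    return {"main": [], "secondary": []}

print(occlusion_sources({"tree": 50.0, "building": 30.0, "metal": 20.0},
                        "high occlusion"))
# -> {'main': [], 'secondary': ['tree', 'building']}
```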
In this way, the occlusion source categories are refined according to the occlusion degree, and the scene classification result of the image to be classified can be determined accurately according to the occlusion source category and the radiation category of the radiation source. Optionally, the method for determining the scene classification result includes: determining the radiation category according to the number of radiation sources and their distance from the current test scene, where the radiation category comprises at least one of strong radiation, weak radiation and no radiation; taking the occlusion categories corresponding to the no-shielding and low-shielding degrees as the third occlusion source category; and combining the first occlusion source category, the second occlusion source category, the third occlusion source category and the radiation categories to obtain the scene classification result of the image to be classified.
The occlusion categories corresponding to no occlusion and to low occlusion each include 1 case, so the third occlusion source category includes 2 cases; as described above, the first occlusion source category includes 26 cases and the second occlusion source category includes 4 cases, so the occlusion source categories over all occlusion degrees include 32 cases in total. Since the radiation category includes 3 cases (strong radiation, weak radiation and no radiation), combining the first, second and third occlusion source categories with the radiation categories yields a scene classification result covering 96 cases. On this basis, positioning test scenes can be classified comprehensively and accurately, a reasonable positioning accuracy can be set for each category of test scene, and the evaluation accuracy of the high-precision positioning effect of the satellite can be further improved based on the set positioning accuracy.
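As an illustrative sketch only, the following Python code forms a combined scene classification label from the occlusion degree, the occlusion source category and the radiation category; the concrete radiation rule (source count plus a 100 m distance threshold) is an assumption, since the patent only states that the radiation category depends on the number of radiation sources and their distance from the current test scene.

```python
# Sketch: combining occlusion degree, occlusion source category and radiation
# category into one scene label. The distance threshold is an assumption.
def radiation_category(num_sources: int, min_distance_m: float) -> str:
    if num_sources == 0:
        return "no radiation"
    return "strong radiation" if min_distance_m < 100.0 else "weak radiation"

def scene_classification(degree: str, source_category: str,
                         num_radiation_sources: int,
                         min_radiation_distance_m: float) -> str:
    radiation = radiation_category(num_radiation_sources, min_radiation_distance_m)
    return f"{degree} / {source_category} / {radiation}"

# Example: medium occlusion dominated by trees, one signal tower 250 m away.
print(scene_classification("medium occlusion", "tree", 1, 250.0))
# -> "medium occlusion / tree / weak radiation"
```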
According to the technical scheme provided by this embodiment, the image to be classified of the current test scene is obtained, the categories of the shielding objects in the image to be classified are determined, the area of the image to be classified and the area of each category of shielding object are determined, and the shielding degree is determined based on the total area of all categories of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding; in this way, the images to be classified under open and non-open test scenes can be subdivided, and the shielding degree of the image to be classified can be determined accurately. The scene classification result of the image to be classified is then determined according to the shielding degree and the radiation category of the radiation source, so that positioning test scenes can be classified comprehensively and accurately, a reasonable positioning accuracy can be set for each category of test scene, and the evaluation accuracy of the high-accuracy satellite positioning effect is improved based on the set positioning accuracy.
Example two
Fig. 3 is a schematic structural diagram of a classification apparatus for positioning a test scenario according to a second embodiment of the present invention. Referring to fig. 3, the apparatus includes: an image acquisition module 210, an area determination module 220, an occlusion degree determination module 230, and a scene classification module 240.
The image obtaining module 210 is configured to obtain an image to be classified of a current test scene;
the area determining module 220 is configured to determine the type of the blocking object in the image to be classified, and determine the area of the image to be classified and the areas of various blocking objects;
the occlusion degree determining module 230 is configured to determine the occlusion degree based on the total area of each type of occlusion object and the area of the image to be classified, where the occlusion degree includes at least one of no occlusion, low occlusion, medium occlusion, high occlusion and ultrahigh occlusion;
and a scene classification module 240, configured to determine a scene classification result of the image to be classified according to the occlusion degree and the radiation category of the radiation source.
On the basis of the above technical solutions, the area determining module 220 is further configured to input the image to be classified into a classification model trained in advance to obtain labels of various types of obstacles, and determine the type of the obstacle according to the labels, where the classification model trained is obtained by performing supervised training on an initial classification model according to a sample scene image and a sample classification image carrying the labels.
On the basis of the above technical solutions, the occlusion degree determining module 230 is further configured to calculate a first ratio between the total area of each type of occlusion object and the area of the image to be classified;
comparing the first ratio with at least one first occlusion threshold to determine the degree of occlusion.
On the basis of the above technical solutions, the device further comprises an occlusion source category determining module, configured to calculate a second ratio of the area of each type of occlusion whose occlusion degree is high occlusion or ultrahigh occlusion to the total area of all types of occlusions, and to calculate a third ratio of the area of each type of occlusion whose occlusion degree is medium occlusion to the total area of all types of occlusions;
and comparing the second ratio with a second occlusion threshold value and a third occlusion threshold value, determining a first occlusion source type corresponding to high occlusion or ultrahigh occlusion, and comparing the third ratio with the second occlusion threshold value and the third occlusion threshold value, and determining a second occlusion source type corresponding to medium occlusion.
On the basis of the above technical solutions, the occlusion source category determining module is further configured to, if the second ratio of any occlusion whose occlusion degree is high occlusion or ultrahigh occlusion is greater than or equal to the second occlusion threshold, take that occlusion as a main occlusion source;
if the second ratio of the shielding object with the shielding degree of high shielding or ultrahigh shielding is less than or equal to the second shielding threshold value and is greater than or equal to the third shielding threshold value, taking the shielding object as a secondary shielding source;
and determining the occlusion degree as a first occlusion source category corresponding to high occlusion or ultrahigh occlusion based on the primary occlusion source and the secondary occlusion source.
On the basis of the foregoing technical solutions, the occlusion source category determining module is further configured to, if the third ratio of any occlusion whose occlusion degree is medium occlusion is greater than or equal to the second occlusion threshold, take that occlusion as the main occlusion source whose occlusion degree is medium occlusion;
and taking the occlusion type corresponding to the main occlusion source with the occlusion degree of the middle occlusion as the second occlusion source type.
On the basis of the foregoing technical solutions, the scene classification module 240 is further configured to determine the radiation category according to the number of the radiation sources and the distance between the radiation source and the current test scene, where the radiation category includes at least one of strong radiation, weak radiation, and no radiation;
taking the occlusion categories corresponding to the occlusion degrees of no occlusion and low occlusion as a third occlusion source category;
and combining the first shielding source category, the second shielding source category, the third shielding source category and the radiation categories to obtain a scene classification result of the image to be classified.
According to the technical scheme provided by this embodiment, the image to be classified of the current test scene is obtained, the categories of the shielding objects in the image to be classified are determined, the area of the image to be classified and the area of each category of shielding object are determined, and the shielding degree is determined based on the total area of all categories of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding; in this way, the images to be classified under open and non-open test scenes can be subdivided, and the shielding degree of the image to be classified can be determined accurately. The scene classification result of the image to be classified is then determined according to the shielding degree and the radiation category of the radiation source, so that positioning test scenes can be classified comprehensively and accurately, a reasonable positioning accuracy can be set for each category of test scene, and the evaluation accuracy of the high-accuracy satellite positioning effect is improved based on the set positioning accuracy.
EXAMPLE III
Fig. 4 is a schematic structural diagram of a classification device for positioning a test scenario according to a third embodiment of the present invention. FIG. 4 shows a block diagram of a classification device 12 suitable for use in implementing an exemplary localization test scenario of an embodiment of the present invention. The classification device 12 for locating test scenarios shown in fig. 4 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present invention.
As shown in FIG. 4, the classification device 12 that locates the test scenario is in the form of a general purpose computing device. The components of the classification device 12 that locate test scenarios may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
The sorting apparatus 12 that locates test scenarios typically includes a variety of computer system readable media. Such media may be any available media that can be accessed by the sorting apparatus 12 for the positioned test scenario and includes both volatile and non-volatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache 32. The sorting apparatus 12 for locating test scenarios may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, and commonly referred to as a "hard drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. The system memory 28 may include at least one program product having a set of program modules (e.g., an image acquisition module 210, an area determination module 220, an occlusion degree determination module 230, and a scene classification module 240 of a classification device that locates a test scene) configured to perform the functions of embodiments of the present invention.
A program/utility 44 having a set of program modules 46 (e.g., an image acquisition module 210, an area determination module 220, an occlusion degree determination module 230, and a scene classification module 240 of a classification device that locates a test scene) may be stored, for example, in the system memory 28, such program modules 46 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which or some combination of which may comprise an implementation of a network environment. Program modules 46 generally carry out the functions and/or methodologies of the described embodiments of the invention.
The sorting device 12 of the localized test scenario may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with the sorting device 12 of the localized test scenario, and/or with any device (e.g., network card, modem, etc.) that enables the sorting device 12 of the localized test scenario to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the sorting apparatus 12 that locates the test scenario may also communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the classification device 12 that locate the test scenario via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the classification device 12 that locates the test scenarios, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
The processing unit 16 executes various functional applications and data processing by running the program stored in the system memory 28, for example, implementing a classification method for locating test scenarios provided by the embodiment of the present invention, the method includes:
acquiring an image to be classified of a current test scene;
determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
The processing unit 16 executes programs stored in the system memory 28 to perform various functional applications and data processing, for example, to implement a classification method for locating test scenarios provided by the embodiment of the present invention.
Of course, those skilled in the art can understand that the processor may also implement the technical solution of the classification method for positioning test scenes provided in any embodiment of the present invention.
Example four
The fourth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for classifying a positioning test scenario provided in the fourth embodiment of the present invention, where the method includes:
acquiring an image to be classified of a current test scene;
determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
Of course, the computer program stored on the computer-readable storage medium provided by the embodiments of the present invention is not limited to the above method operations, and may also perform related operations in a classification method for positioning a test scenario provided by any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, or device.
A computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with computer readable program code embodied therein. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that, in the embodiment of the classification apparatus for positioning a test scenario, the modules included in the classification apparatus are only divided according to functional logic, but are not limited to the above division, as long as corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (6)

1. A classification method for positioning test scenes is characterized by comprising the following steps:
acquiring an image to be classified of a current test scene;
determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source;
before determining a scene classification result of the image to be classified according to the occlusion degree and the radiation category of the radiation source, the method further comprises:
calculating a second ratio of the area of each type of shelter with high shelter degree or ultrahigh shelter degree to the total area of each type of shelter, and calculating a third ratio of the area of each type of shelter with medium shelter degree to the total area of each type of shelter;
comparing the second ratio with a second occlusion threshold and a third occlusion threshold, determining a first occlusion source type corresponding to high occlusion or ultrahigh occlusion, and comparing the third ratio with the second occlusion threshold and the third occlusion threshold, determining a second occlusion source type corresponding to medium occlusion;
comparing the second ratio with a second occlusion threshold and a third occlusion threshold, and determining a first occlusion source category corresponding to a high occlusion or an ultrahigh occlusion degree, including:
if the second ratio of any shielding object whose shielding degree is high shielding or ultrahigh shielding is greater than or equal to the second shielding threshold, taking the shielding object as a main shielding source;
if the second ratio of the shielding object with the shielding degree of high shielding or ultrahigh shielding is less than or equal to the second shielding threshold value and is greater than or equal to the third shielding threshold value, taking the shielding object as a secondary shielding source;
determining a first occlusion source category corresponding to high occlusion or ultrahigh occlusion based on the primary occlusion source and the secondary occlusion source;
comparing the third ratio with the second occlusion threshold value to determine a second occlusion source category corresponding to the medium occlusion in the occlusion degree, including:
if the third ratio of any shielding object with the shielding degree of medium shielding is larger than or equal to the second shielding threshold value, taking the shielding object as a main shielding source with the shielding degree of medium shielding;
taking the occlusion type corresponding to the main occlusion source with the occlusion degree of middle occlusion as the second occlusion source type;
determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source, wherein the scene classification result comprises the following steps:
determining the radiation category according to the number of the radiation sources and the distance between the radiation sources and the current test scene, wherein the radiation category comprises at least one of strong radiation, weak radiation and no radiation;
taking the occlusion categories corresponding to the occlusion degrees of no occlusion and low occlusion as a third occlusion source category;
and combining the first shielding source category, the second shielding source category, the third shielding source category and the radiation categories to obtain a scene classification result of the image to be classified.
2. The method of claim 1, wherein the determining the class of obstruction in the image to be classified comprises:
inputting the images to be classified into a classification model which is trained in advance to obtain labels of various shielding objects, and determining the types of the shielding objects according to the labels, wherein the classification model which is trained is obtained by performing supervision training on an initial classification model according to a sample scene image and a sample classification image carrying the labels.
3. The method according to claim 1, wherein the determining the degree of occlusion based on the total area of the various types of occlusions and the area of the image to be classified comprises:
calculating a first ratio of the total area of the various types of shelters to the area of the image to be classified;
comparing the first ratio with at least one first occlusion threshold to determine the degree of occlusion.
4. A classification device for positioning test scenes is characterized by comprising:
the image acquisition module is used for acquiring an image to be classified of the current test scene;
the area determining module is used for determining the type of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
the shielding degree determining module is used for determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
the scene classification module is used for determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source;
the device further comprises an occlusion source type determining module, used for calculating a second ratio of the area of each type of shielding object with the shielding degree of high shielding or ultrahigh shielding to the total area of each type of shielding object, and calculating a third ratio of the area of each type of shielding object with the shielding degree of medium shielding to the total area of each type of shielding object;
comparing the second ratio with a second occlusion threshold value and a third occlusion threshold value to determine a first occlusion source type corresponding to high occlusion or ultrahigh occlusion, and comparing the third ratio with the second occlusion threshold value and the third occlusion threshold value to determine a second occlusion source type corresponding to medium occlusion;
the occlusion source type determining module is further configured to, if the second ratio of any occlusion object whose occlusion degree is high occlusion or ultrahigh occlusion is greater than or equal to the second occlusion threshold, take the occlusion object as a main occlusion source;
if the second ratio of the shielding object with the shielding degree of high shielding or ultrahigh shielding is less than or equal to the second shielding threshold value and is greater than or equal to the third shielding threshold value, taking the shielding object as a secondary shielding source;
determining a first occlusion source category corresponding to high occlusion or ultrahigh occlusion based on the primary occlusion source and the secondary occlusion source;
the occlusion source type determining module is further configured to, if the third ratio of any occlusion object whose occlusion degree is medium occlusion is greater than or equal to the second occlusion threshold, take the occlusion object as a main occlusion source whose occlusion degree is medium occlusion;
taking the occlusion type corresponding to the main occlusion source with the occlusion degree of middle occlusion as the second occlusion source type;
the scene classification module is further configured to determine the radiation category according to the number of the radiation sources and a distance between the radiation sources and the current test scene, wherein the radiation category includes at least one of strong radiation, weak radiation, and no radiation;
taking the occlusion categories corresponding to the occlusion degrees of no occlusion and low occlusion as a third occlusion source category;
and combining the first shielding source type, the second shielding source type, the third shielding source type and each radiation type to obtain a scene classification result of the image to be classified.
5. A classification device for localization test scenarios, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the classification method for localization test scenarios according to any one of claims 1 to 3 when executing the computer program.
6. A storage medium containing computer-executable instructions which, when executed by a computer processor, implement the classification method of locating test scenarios of any one of claims 1-3.
CN202011386443.0A 2020-12-01 2020-12-01 Classification method, device, equipment and storage medium for positioning test scenes Active CN112396125B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011386443.0A CN112396125B (en) 2020-12-01 2020-12-01 Classification method, device, equipment and storage medium for positioning test scenes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011386443.0A CN112396125B (en) 2020-12-01 2020-12-01 Classification method, device, equipment and storage medium for positioning test scenes

Publications (2)

Publication Number Publication Date
CN112396125A CN112396125A (en) 2021-02-23
CN112396125B true CN112396125B (en) 2022-11-18

Family

ID=74604080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011386443.0A Active CN112396125B (en) 2020-12-01 2020-12-01 Classification method, device, equipment and storage medium for positioning test scenes

Country Status (1)

Country Link
CN (1) CN112396125B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115278538A (en) * 2021-04-29 2022-11-01 北京小米移动软件有限公司 Positioning method, positioning device, electronic equipment and storage medium
CN114001711B (en) * 2021-09-24 2024-04-12 上海东一土地规划勘测设计有限公司 Land mapping method, system, device and storage medium based on positioning system
CN116432090B (en) * 2023-06-13 2023-10-20 荣耀终端有限公司 Scene recognition method, system and terminal equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104105106A (en) * 2014-07-23 2014-10-15 武汉飞脉科技有限责任公司 Wireless communication network intelligent-antenna-covered scene automatic classification and recognition method
WO2019227294A1 (en) * 2018-05-28 2019-12-05 华为技术有限公司 Image processing method, related device and computer storage medium
CN111738329A (en) * 2020-06-19 2020-10-02 中南大学 Land use classification method for time series remote sensing images
CN111753929A (en) * 2020-08-07 2020-10-09 腾讯科技(深圳)有限公司 Artificial intelligence based classification method, device, terminal and storage medium
CN111898423A (en) * 2020-06-19 2020-11-06 北京理工大学 Morphology-based multisource remote sensing image ground object fine classification method
CN111967296A (en) * 2020-06-28 2020-11-20 北京中科虹霸科技有限公司 Iris living body detection method, entrance guard control method and entrance guard control device

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11187035B2 (en) * 2004-05-06 2021-11-30 Mechoshade Systems, Llc Sky camera virtual horizon mask and tracking solar disc
BR112012014661A2 (en) * 2009-12-18 2020-12-29 L'oreal COSMETIC SKIN TREATMENT PROCESS AND COSMETIC ASSEMBLY
US10045427B2 (en) * 2014-09-29 2018-08-07 Philips Lighting Holding B.V. System and method of autonomous restore point creation and restoration for luminaire controllers
US10137890B2 (en) * 2016-06-28 2018-11-27 Toyota Motor Engineering & Manufacturing North America, Inc. Occluded obstacle classification for vehicles
CN110062727A (en) * 2016-10-20 2019-07-26 铁路视像有限公司 System and method for object and detection of obstacles and classification in the collision prevention of railway applications
JP7376468B2 (en) * 2017-09-20 2023-11-08 エーエスエムエル ネザーランズ ビー.ブイ. radiation source
DE102017222258A1 (en) * 2017-12-08 2019-06-13 Robert Bosch Gmbh Method for a LIDAR device for detecting a hidden object
CN108931825B (en) * 2018-05-18 2020-07-14 北京航空航天大学 Remote sensing image cloud thickness detection method based on ground object definition
US11145046B2 (en) * 2018-07-24 2021-10-12 The Regents Of The University Of Michigan Detection of near-field occlusions in images
CN111489384B (en) * 2019-01-25 2023-05-16 曜科智能科技(上海)有限公司 Method, device, system and medium for evaluating shielding based on mutual viewing angle
US10762697B1 (en) * 2019-02-27 2020-09-01 Verizon Patent And Licensing Inc. Directional occlusion methods and systems for shading a virtual object rendered in a three-dimensional scene
CN110275181A (en) * 2019-07-08 2019-09-24 武汉中海庭数据技术有限公司 A kind of vehicle-mounted mobile measuring system and its data processing method
CN110705727B (en) * 2019-09-30 2022-03-01 山东建筑大学 Photovoltaic power station shadow shielding diagnosis method and system based on random forest algorithm
CN111046956A (en) * 2019-12-13 2020-04-21 苏州科达科技股份有限公司 Occlusion image detection method and device, electronic equipment and storage medium
CN111428581B (en) * 2020-03-05 2023-11-21 平安科技(深圳)有限公司 Face shielding detection method and system
CN111860566B (en) * 2020-04-24 2024-06-28 北京嘀嘀无限科技发展有限公司 Shelter identification model training method, shelter identification model training device and storage medium
CN111970424B (en) * 2020-08-25 2022-07-19 武汉工程大学 Light field camera unblocking system and method based on micro-lens array synthetic aperture


Also Published As

Publication number Publication date
CN112396125A (en) 2021-02-23

Similar Documents

Publication Publication Date Title
CN112396125B (en) Classification method, device, equipment and storage medium for positioning test scenes
CN109284348B (en) Electronic map updating method, device, equipment and storage medium
US11105638B2 (en) Method, apparatus, and computer readable storage medium for updating electronic map
JP7082151B2 (en) Map trajectory matching data quality determination method, equipment, server and medium
CN110260870B (en) Map matching method, device, equipment and medium based on hidden Markov model
CN109459734B (en) Laser radar positioning effect evaluation method, device, equipment and storage medium
CN109435955B (en) Performance evaluation method, device and equipment for automatic driving system and storage medium
CN110427444B (en) Navigation guide point mining method, device, equipment and storage medium
CN110415545B (en) Lane positioning method and device, electronic equipment and storage medium
CN109931945B (en) AR navigation method, device, equipment and storage medium
CN110426050B (en) Map matching correction method, device, equipment and storage medium
EP3617997A1 (en) Method, apparatus, device, and storage medium for calibrating posture of moving obstacle
CN111380546A (en) Vehicle positioning method and device based on parallel road, electronic equipment and medium
CN110346825B (en) Vehicle positioning method and device, vehicle and storage medium
CN109635868B (en) Method and device for determining obstacle type, electronic device and storage medium
CN110555352B (en) Interest point identification method, device, server and storage medium
CN111121797B (en) Road screening method, device, server and storage medium
CN109284801A (en) State identification method, device, electronic equipment and the storage medium of traffic light
CN112798004A (en) Vehicle positioning method, device, equipment and storage medium
CN110018503B (en) Vehicle positioning method and positioning system
CN111354217A (en) Parking route determining method, device, equipment and medium
CN112100565A (en) Road curvature determination method, device, equipment and storage medium
CN114565780A (en) Target identification method and device, electronic equipment and storage medium
CN110186472B (en) Vehicle yaw detection method, computer device, storage medium, and vehicle system
CN113449687B (en) Method and device for identifying point of interest outlet and point of interest inlet and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant