CN112396125A - Classification method, device, equipment and storage medium for positioning test scenes - Google Patents

Classification method, device, equipment and storage medium for positioning test scenes

Info

Publication number
CN112396125A
Authority
CN
China
Prior art keywords
shielding
occlusion
image
classified
degree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011386443.0A
Other languages
Chinese (zh)
Other versions
CN112396125B (en)
Inventor
刘阳
王硕
高洪伟
李璇
黄志福
张金柱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Group Corp
Priority to CN202011386443.0A
Publication of CN112396125A
Application granted
Publication of CN112396125B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Abstract

The embodiment of the invention discloses a classification method, a classification apparatus, classification equipment and a storage medium for positioning test scenes. The method includes: acquiring an image to be classified of a current test scene; determining the categories of the shielding objects in the image to be classified; determining the area of the image to be classified and the area of each type of shielding object; and determining the shielding degree based on the total area of the shielding objects and the area of the image to be classified, wherein the shielding degree includes at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding, so that the image to be classified in each test scene can be subdivided and its shielding degree accurately determined. A scene classification result of the image to be classified is then determined according to the shielding degree and the radiation category of the radiation source, so that positioning test scenes are classified comprehensively and accurately, a reasonable positioning accuracy is set for each class of test scene, and the evaluation accuracy of the high-precision satellite positioning effect is improved based on the positioning accuracy set for each class of test scene.

Description

Classification method, device, equipment and storage medium for positioning test scenes
Technical Field
The embodiment of the invention relates to a positioning technology, in particular to a method, a device, equipment and a storage medium for classifying positioning test scenes.
Background
With the continuous development of positioning and navigation technology, the positioning accuracy required by vehicles keeps increasing, and vehicle-mounted sub-meter-level and even centimeter-level positioning products have appeared in succession. However, satellite positioning and navigation rely on satellite signals, and as the elevation angles of the satellites change, the positioning signals are easily affected by the surrounding environment. For example, buildings on both sides of a road, metal signboards, and large vehicles traveling on the road differ greatly in their influence on positioning accuracy. Therefore, when a high-precision positioning test is performed, the test scenes need to be classified so that tests can be completed under different test scenes.
In the prior art, high-precision positioning test scenes are generally divided in a simple way according to road conditions and surrounding buildings. For example, positioning test scenes are divided into expressways, urban expressways, low buildings, urban canyons, tree-shaded roads, overpasses and the like. However, the division precision of such positioning test scenes is poor, because positioning test results within the same scene class can still differ greatly. For example, on the same expressway, different numbers of large transport vehicles produce different positioning test results, and an error of tens of centimeters has an extremely significant impact on sub-meter or even centimeter-level positioning accuracy.
In summary, in the process of implementing the present invention, the inventor finds that at least the following problems exist in the prior art: the division precision of the positioning test scenes is low, so that the positioning precision setting under different test scenes is inaccurate, and the evaluation accuracy of the high-precision positioning effect of the satellite is reduced.
Disclosure of Invention
The embodiment of the invention provides a classification method, a classification apparatus, classification equipment and a storage medium for positioning test scenes, which improve the classification accuracy of positioning test scenes, realize a reasonable setting of the positioning accuracy for each class of test scene, and further improve the evaluation accuracy of the high-precision satellite positioning effect.
In a first aspect, an embodiment of the present invention provides a method for classifying a positioning test scenario, including:
acquiring an image to be classified of a current test scene;
determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
In a second aspect, an embodiment of the present invention further provides a classification apparatus for positioning a test scenario, including:
the image acquisition module is used for acquiring an image to be classified of the current test scene;
the area determining module is used for determining the type of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
the shielding degree determining module is used for determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and the scene classification module is used for determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
In a third aspect, an embodiment of the present invention further provides a classification device for a positioning test scenario, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the classification method for a positioning test scenario according to any one of the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, implement the classification method for localization test scenarios according to any one of the first aspect.
According to the technical scheme, an image to be classified of the current test scene is acquired, the categories of the shielding objects in the image are determined, the area of the image and the area of each type of shielding object are determined, and the shielding degree is determined based on the total area of the shielding objects and the area of the image, wherein the shielding degree includes at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding. In this way, images to be classified in open and non-open test scenes can be subdivided and their shielding degree accurately determined. The scene classification result of the image to be classified is then determined according to the shielding degree and the radiation category of the radiation source, so that positioning test scenes can be classified comprehensively and accurately, a reasonable positioning accuracy can be set for each class of test scene, and the evaluation accuracy of the high-precision satellite positioning effect is further improved according to the set positioning accuracy.
Drawings
Fig. 1 is a schematic flowchart of a classification method for positioning test scenes according to an embodiment of the present invention;
FIG. 2 is a diagram of an image to be classified according to a first embodiment of the present invention;
fig. 3 is a schematic structural diagram of a classification apparatus for positioning a test scenario according to a second embodiment of the present invention;
fig. 4 is a schematic structural diagram of a classification device for positioning a test scenario according to a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic flowchart of a method for classifying positioning test scenes according to an embodiment of the present invention. The method is applicable to situations in which positioning test scenes need to be classified, and can be executed by a classification apparatus for positioning test scenes, which can be implemented in software and/or hardware and is generally integrated in a classification device for positioning test scenes. Referring to fig. 1, the method may include the following steps:
and S110, acquiring an image to be classified of the current test scene.
Wherein, the current test scene refers to a positioning test scene where a vehicle loaded with a positioning product is located. The current test scene can be an expressway, an urban expressway, a forest shade road, an area under a viaduct, a high-voltage iron tower and the like. The images to be classified can be collected through a camera or other photographing equipment on the vehicle.
S120, determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects.
The categories of obstructions may include a tree category, a building category, a metal category and a special category. The tree category includes tall trees beside the road, which can partially but not completely block satellite signals; they do not reflect the signals strongly but readily cause multipath problems. The building category includes various concrete buildings, tall buildings, bridges, viaducts, overpasses, tunnels, mountains and the like, which can completely block satellite signals, reflect them to a certain extent, and easily cause multipath problems. The metal category includes large metal billboards and guideboards on both sides of the road, large vehicles traveling on the road and the like, which can completely block satellite signals, readily reflect them, and easily cause multipath problems. The special category includes high-voltage iron towers, signal towers and the like, which interfere with satellite signals, reduce the reliability of the signals and affect positioning accuracy.
Optionally, determining the category of the obstructions in the image to be classified includes: inputting the image to be classified into a pre-trained classification model to obtain labels of the various obstructions, and determining the categories of the obstructions according to the labels, wherein the trained classification model is obtained by supervised training of an initial classification model on sample scene images and sample classification images carrying labels.
The classification model may be a neural network model, another learning algorithm, or the like. Illustratively, the classification model may be a decision tree, a logistic regression (LR) model, a support vector machine (SVM), a fully convolutional network, or the like. Specifically, the sample scene images are original images of each sampling scene acquired by a camera or other photographing equipment; sample scene images containing obstructions are used as positive samples and sample scene images without obstructions as negative samples. The initial classification model is trained in a supervised manner on the positive samples, the negative samples and the labeled sample classification images so as to adjust its model parameters, and the classification model whose parameters have reached a stable state is taken as the trained classification model.
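As a rough sketch of the supervised training step described above (not the patent's implementation): the class list, the tiny fully convolutional network, the tensor shapes and the dummy data below are illustrative assumptions standing in for the initial classification model and the labeled sample images.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

NUM_CLASSES = 5  # assumed: background, tree, building, metal, special

# A tiny fully convolutional network standing in for the "initial classification model".
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, NUM_CLASSES, 1),            # per-pixel class scores
)

# Dummy stand-ins for the sample scene images and per-pixel label masks.
images = torch.rand(8, 3, 120, 160)                    # N x C x H x W
masks = torch.randint(0, NUM_CLASSES, (8, 120, 160))   # N x H x W labels
loader = DataLoader(TensorDataset(images, masks), batch_size=4, shuffle=True)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(5):                 # supervised training loop
    for x, y in loader:
        optimizer.zero_grad()
        logits = model(x)              # N x NUM_CLASSES x H x W
        loss = criterion(logits, y)
        loss.backward()
        optimizer.step()
```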
Further, the image to be classified is input into the trained classification model to obtain the label and prediction probability of each type of obstruction; the categories of the obstructions are determined according to the labels and prediction probabilities; the area of the image to be classified is determined from its side lengths; and the area of each type of obstruction is determined from the number of pixels occupied by that type of obstruction in the image to be classified.
As shown in fig. 2, the image to be classified is input into the trained classification model, and the obstruction categories are determined to be buildings, trees and metal signboards. The occlusion area of the buildings is calculated from the pixels occupied by the buildings in the image to be classified, the occlusion area of the trees from the pixels occupied by the trees, and the occlusion area of the metal signboards from the pixels occupied by the metal signboards.
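The per-class area computation can be sketched as follows; the class ids and names are assumptions for illustration (the patent does not fix a numbering), and the model output is assumed to be a 1 x NUM_CLASSES x H x W tensor of per-pixel scores.

```python
import torch

CLASS_NAMES = {1: "tree", 2: "building", 3: "metal", 4: "special"}  # 0 = background (assumed)

def class_areas(logits: torch.Tensor) -> dict:
    """logits: 1 x NUM_CLASSES x H x W output of the trained classification model."""
    pred = logits.argmax(dim=1)[0]          # H x W map of predicted class labels
    image_area = pred.numel()               # area of the image to be classified, in pixels
    areas = {name: int((pred == cid).sum()) for cid, name in CLASS_NAMES.items()}
    return {"image_area": image_area, "areas": areas}
```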
S130, determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified.
Determining the occlusion degree based on the total area of all types of obstructions and the area of the image to be classified includes: calculating a first ratio of the total area of all types of obstructions to the area of the image to be classified; and comparing the first ratio with at least one first occlusion threshold to determine the occlusion degree. The total area of the obstructions is the sum of the areas of the various types of obstructions, the occlusion degree includes at least one of no occlusion, low occlusion, medium occlusion, high occlusion and ultrahigh occlusion, and the first occlusion threshold includes at least one of 10%, 30%, 45%, 60% and 90%.
Specifically, no occlusion means that there is essentially no obvious obstruction, and the first ratio of the total obstruction area to the area of the image to be classified is less than or equal to 10%. Low occlusion means that there are only distant low-angle obstructions or nearby flexible obstructions, and the first ratio is greater than or equal to 10% and less than or equal to 30%. Medium occlusion means that the distant low-angle range is heavily blocked or that there is partial obstruction at vehicle-body height on one side, and the first ratio is greater than or equal to 30% and less than or equal to 45%. High occlusion means that there is large-area obstruction on one side of the vehicle body or partial obstruction on both sides, and the first ratio is greater than or equal to 45% and less than or equal to 60%. Ultrahigh occlusion means that there is large-area obstruction at vehicle-body height on both sides, or even partial obstruction along the driving direction; under this occlusion degree the satellite basically cannot be used for positioning and no accuracy requirement is made, and the first ratio is greater than or equal to 60% and less than or equal to 90%.
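A minimal sketch of this thresholding follows; how a value that falls exactly on one of the shared boundaries (10%, 30%, 45%, 60%) is assigned, and how ratios above 90% are treated, are assumptions, since the ranges in the text overlap at their endpoints.

```python
def occlusion_degree(total_obstruction_area: float, image_area: float) -> str:
    first_ratio = total_obstruction_area / image_area   # first ratio from step S130
    if first_ratio <= 0.10:
        return "no occlusion"
    if first_ratio <= 0.30:
        return "low occlusion"
    if first_ratio <= 0.45:
        return "medium occlusion"
    if first_ratio <= 0.60:
        return "high occlusion"
    return "ultrahigh occlusion"   # the source describes this band as 60%-90%
```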
In this way, images to be classified in both open and non-open test scenes can be subdivided, the occlusion degree of the image to be classified can be accurately determined, the accuracy of the scene classification result of the image to be classified is improved, and the positioning accuracy can then be set reasonably for different scenes.
S140, determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
In order to improve the scene classification precision of the image to be classified, the occlusion sources in scenes whose occlusion degree is medium, high or ultrahigh occlusion can be analysed before the scene classification result is determined. Optionally, the occlusion source analysis includes: when the occlusion degree is high or ultrahigh occlusion, calculating a second ratio of the area of each type of obstruction to the total area of all obstructions; when the occlusion degree is medium occlusion, calculating a third ratio of the area of each type of obstruction to the total area of all obstructions; comparing the second ratio with a second occlusion threshold and a third occlusion threshold to determine a first occlusion source category corresponding to high or ultrahigh occlusion; and comparing the third ratio with the second occlusion threshold and the third occlusion threshold to determine a second occlusion source category corresponding to medium occlusion.
For example, if the obstruction is a tree and the occlusion degree is medium occlusion, the third ratio of the tree is

R_tree = M_tree / M_total,

wherein M_tree is the area of the tree-type obstruction and M_total is the total area of all types of obstructions. If the obstruction is metal and the occlusion degree is high or ultrahigh occlusion, the second ratio of the metal is

R_metal = M_metal / M_total,

wherein M_metal is the area of the metal-type obstruction and M_total is the total area of all types of obstructions.
The first occlusion source category is determined as follows: if, for an image whose occlusion degree is high or ultrahigh occlusion, the second ratio of any obstruction is greater than or equal to the second occlusion threshold, that obstruction is taken as a primary occlusion source; if the second ratio of an obstruction is less than or equal to the second occlusion threshold and greater than or equal to the third occlusion threshold, that obstruction is taken as a secondary occlusion source; and the first occlusion source category corresponding to high or ultrahigh occlusion is determined based on the primary occlusion source and the secondary occlusion source.
The second occlusion source category is determined as follows: if, for an image whose occlusion degree is medium occlusion, the third ratio of any obstruction is greater than or equal to the second occlusion threshold, that obstruction is taken as the primary occlusion source for medium occlusion; and the occlusion category corresponding to that primary occlusion source is taken as the second occlusion source category.
Wherein the second occlusion threshold may be 60% and the third occlusion threshold may be 30%.
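The occlusion-source rules above can be sketched as follows; the 60% and 30% thresholds come from the text, while the return structure and the "mixed (no dominant source)" label are illustrative assumptions.

```python
def occlusion_sources(areas: dict, degree: str,
                      second_threshold: float = 0.60,
                      third_threshold: float = 0.30) -> dict:
    total = sum(areas.values()) or 1.0
    ratios = {name: area / total for name, area in areas.items()}   # second/third ratios
    if degree in ("high occlusion", "ultrahigh occlusion"):
        primary = [n for n, r in ratios.items() if r >= second_threshold]
        secondary = [n for n, r in ratios.items()
                     if third_threshold <= r < second_threshold]
        return {"primary": primary, "secondary": secondary}
    if degree == "medium occlusion":
        primary = [n for n, r in ratios.items() if r >= second_threshold]
        return {"primary": primary or ["mixed (no dominant source)"], "secondary": []}
    return {"primary": [], "secondary": []}   # no/low occlusion: no source breakdown
```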
Illustratively, when determining the first occlusion source category corresponding to high or ultrahigh occlusion, the second ratio is compared with the second occlusion threshold and the third occlusion threshold: if the second ratio of any obstruction is greater than or equal to 60%, that obstruction is taken as a primary occlusion source; if the second ratio of any obstruction is less than or equal to 60% and greater than or equal to 30%, that obstruction is taken as a secondary occlusion source; and the first occlusion source category is then determined from the primary and secondary occlusion sources. Table 1 shows the classification of the first occlusion source category; as can be seen from Table 1, the first occlusion source categories for high occlusion and for ultrahigh occlusion each include 13 cases.
Table 1: classification List of first occlusion Source Category
Illustratively, when determining the second occlusion source category corresponding to medium occlusion, the third ratio is compared with the second occlusion threshold and the third occlusion threshold: if the third ratio of any obstruction is greater than or equal to 60%, that obstruction is taken as the primary occlusion source for medium occlusion. The primary occlusion source is one of four cases, namely trees, metal, buildings, or a mixture with no dominant occlusion source, so the second occlusion source category includes 4 cases.
In this way, the occlusion source categories are refined according to the occlusion degree, and the scene classification result of the image to be classified can be accurately determined according to the obstruction categories and the radiation category of the radiation source. Optionally, the scene classification result is determined as follows: determining the radiation category according to the number of radiation sources and the distance between the radiation sources and the current test scene, wherein the radiation category includes at least one of strong radiation, weak radiation and no radiation; taking the occlusion categories corresponding to the occlusion degrees of no occlusion and low occlusion as a third occlusion source category; and combining the first occlusion source category, the second occlusion source category, the third occlusion source category and the radiation categories to obtain the scene classification result of the image to be classified.
The occlusion categories corresponding to no occlusion and to low occlusion each include 1 case, so the third occlusion source category includes 2 cases; from the foregoing description, the first occlusion source category includes 26 cases and the second occlusion source category includes 4 cases, so the occlusion source categories over all occlusion degrees include 32 cases in total. Since the radiation categories include 3 cases (strong radiation, weak radiation and no radiation), combining the first, second and third occlusion source categories with the radiation categories yields a scene classification result with 96 possible cases. On this basis, positioning test scenes can be classified comprehensively and accurately, a reasonable positioning accuracy can be set for each class of test scene, and the evaluation accuracy of the high-precision satellite positioning effect is further improved based on the set positioning accuracy.
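A rough sketch of this final combination step follows; the cutoff used to separate strong from weak radiation is a pure assumption, since the text only states that the radiation category depends on the number of radiation sources and their distance from the current test scene.

```python
def radiation_category(num_sources: int, min_distance_m: float) -> str:
    if num_sources == 0:
        return "no radiation"
    if min_distance_m < 50:          # assumed cutoff, for illustration only
        return "strong radiation"
    return "weak radiation"

def scene_classification(degree: str, sources: dict, radiation: str) -> str:
    if degree in ("no occlusion", "low occlusion"):
        occlusion_part = degree       # third occlusion source category
    else:
        occlusion_part = (f"{degree}: primary={sources['primary']}, "
                          f"secondary={sources['secondary']}")
    return f"{occlusion_part} / {radiation}"

# Example usage with the helpers sketched earlier:
# label = scene_classification("high occlusion",
#                              {"primary": ["building"], "secondary": ["metal"]},
#                              radiation_category(1, 120.0))
```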
According to the technical scheme provided by the embodiment, the image to be classified of the current test scene is obtained, the category of the shielding object in the image to be classified is determined, the area of the image to be classified and the area of each type of shielding object are determined, and the shielding degree is determined based on the total area of each type of shielding object and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding, so that the image to be classified under the open test scene and the non-open test scene can be subdivided, and the shielding degree of the image to be classified is accurately determined; according to the shielding degree and the radiation type of the radiation source, the scene classification result of the image to be classified is determined, the positioning test scenes can be comprehensively and accurately classified, reasonable positioning accuracy is set for the test scenes of all types, and the evaluation accuracy of the high-accuracy satellite positioning effect is improved based on the set positioning accuracy.
Example two
Fig. 3 is a schematic structural diagram of a classification apparatus for positioning a test scenario according to a second embodiment of the present invention. Referring to fig. 3, the apparatus includes: an image acquisition module 210, an area determination module 220, an occlusion degree determination module 230, and a scene classification module 240.
The image obtaining module 210 is configured to obtain an image to be classified of a current test scene;
the area determining module 220 is configured to determine the type of the blocking object in the image to be classified, and determine the area of the image to be classified and the areas of various blocking objects;
the occlusion degree determining module 230 is configured to determine an occlusion degree based on a total area of each type of occlusion object and an area of the image to be classified, where the occlusion degree includes at least one of no occlusion, low occlusion, medium occlusion, high occlusion, and ultrahigh occlusion;
and a scene classification module 240, configured to determine a scene classification result of the image to be classified according to the occlusion degree and the radiation category of the radiation source.
On the basis of the above technical solutions, the area determining module 220 is further configured to input the image to be classified into a classification model trained in advance to obtain labels of various types of obstacles, and determine the type of the obstacle according to the labels, where the classification model trained is obtained by performing supervised training on an initial classification model according to a sample scene image and a sample classification image carrying the labels.
On the basis of the above technical solutions, the occlusion degree determining module 230 is further configured to calculate a first ratio between the total area of each type of occlusion object and the area of the image to be classified;
comparing the first ratio with at least one first occlusion threshold to determine the degree of occlusion.
On the basis of the above technical solutions, the apparatus further includes an occlusion source category determination module, which is used for calculating a second ratio of the area of each type of obstruction to the total area of all obstructions when the occlusion degree is high or ultrahigh occlusion, and calculating a third ratio of the area of each type of obstruction to the total area of all obstructions when the occlusion degree is medium occlusion;
and comparing the second ratio with a second occlusion threshold value and a third occlusion threshold value, determining a first occlusion source type corresponding to high occlusion or ultrahigh occlusion, and comparing the third ratio with the second occlusion threshold value and the third occlusion threshold value, and determining a second occlusion source type corresponding to medium occlusion.
On the basis of the above technical solutions, the occlusion source type determining module is further configured to, if the second ratio of any of the occlusions with the occlusion degree of high occlusion or ultrahigh occlusion is greater than or equal to the second occlusion threshold, take the occlusion as a main occlusion source;
if the second ratio of the shielding object with the shielding degree of high shielding or ultrahigh shielding is less than or equal to the second shielding threshold value and is greater than or equal to the third shielding threshold value, taking the shielding object as a secondary shielding source;
determining the occlusion degree as a first occlusion source category corresponding to high occlusion or ultrahigh occlusion based on the primary occlusion source and the secondary occlusion source.
On the basis of the above technical solutions, the occlusion source type determining module is further configured to, if the third ratio of any one of the occlusions with the moderate occlusion degree is greater than or equal to the second occlusion threshold, take the occlusion object as a main occlusion source with the moderate occlusion degree;
and taking the occlusion type corresponding to the main occlusion source with the occlusion degree of the middle occlusion as the second occlusion source type.
On the basis of the foregoing technical solutions, the scene classification module 240 is further configured to determine the radiation category according to the number of the radiation sources and the distance between the radiation source and the current test scene, where the radiation category includes at least one of strong radiation, weak radiation, and no radiation;
taking the occlusion categories corresponding to the occlusion degrees of no occlusion and low occlusion as a third occlusion source category;
and combining the first shielding source category, the second shielding source category, the third shielding source category and the radiation categories to obtain a scene classification result of the image to be classified.
According to the technical scheme provided by the embodiment, the image to be classified of the current test scene is obtained, the category of the shielding object in the image to be classified is determined, the area of the image to be classified and the area of each type of shielding object are determined, and the shielding degree is determined based on the total area of each type of shielding object and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding, so that the image to be classified under the open test scene and the non-open test scene can be subdivided, and the shielding degree of the image to be classified is accurately determined; according to the shielding degree and the radiation type of the radiation source, the scene classification result of the image to be classified is determined, the positioning test scenes can be comprehensively and accurately classified, reasonable positioning accuracy is set for the test scenes of all types, and the evaluation accuracy of the high-accuracy satellite positioning effect is improved based on the set positioning accuracy.
EXAMPLE III
Fig. 4 is a schematic structural diagram of a classification device for positioning test scenes according to a third embodiment of the present invention. FIG. 4 shows a block diagram of an exemplary classification device 12 for positioning test scenes suitable for implementing embodiments of the present invention. The classification device 12 shown in fig. 4 is only an example and should not impose any limitation on the functionality and scope of use of the embodiments of the present invention.
As shown in FIG. 4, the classification device 12 that locates the test scenario is in the form of a general purpose computing device. The components of the classification device 12 that locate test scenarios may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
The sorting apparatus 12 that locates test scenarios typically includes a variety of computer system readable media. Such media may be any available media that can be accessed by the sorting apparatus 12 for the positioned test scenario and includes both volatile and non-volatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache 32. The sorting apparatus 12 for locating test scenarios may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 4, and commonly referred to as a "hard drive"). Although not shown in FIG. 4, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. The system memory 28 may include at least one program product having a set of program modules (e.g., an image acquisition module 210, an area determination module 220, an occlusion degree determination module 230, and a scene classification module 240 of a classification device that locates a test scene) configured to perform the functions of embodiments of the present invention.
A program/utility 44 having a set of program modules 46 (e.g., an image acquisition module 210, an area determination module 220, an occlusion degree determination module 230, and a scene classification module 240 of a classification device that locates a test scene) may be stored, for example, in system memory 28, such program modules 46 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which or some combination of which may comprise an implementation of a network environment. Program modules 46 generally carry out the functions and/or methodologies of the described embodiments of the invention.
The sorting device 12 of the localized test scenario may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with the sorting device 12 of the localized test scenario, and/or with any device (e.g., network card, modem, etc.) that enables the sorting device 12 of the localized test scenario to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the sorting apparatus 12 that locates the test scenario may also communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the classification device 12 that locate the test scenario via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the classification device 12 that locates test scenarios, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by running a program stored in the system memory 28, for example, implementing a classification method for locating test scenarios provided by an embodiment of the present invention, the method includes:
acquiring an image to be classified of a current test scene;
determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
The processing unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, for example, to implement a classification method for locating test scenarios provided by the embodiment of the present invention.
Of course, those skilled in the art can understand that the processor may also implement the technical solution of the classification method for positioning test scenes provided in any embodiment of the present invention.
Example four
The fourth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for classifying a positioning test scenario provided in the fourth embodiment of the present invention, where the method includes:
acquiring an image to be classified of a current test scene;
determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
Of course, the computer program stored on the computer-readable storage medium provided by the embodiments of the present invention is not limited to the above method operations, and may also perform related operations in a classification method for positioning a test scenario provided by any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, or device.
The computer readable signal medium may include a propagated data signal carrying computer readable program code, for example, data representing the occlusion category, the occlusion degree, the radiation category and the like described above. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that, in the embodiment of the classification device for positioning test scenarios, the modules included in the classification device are only divided according to functional logic, but are not limited to the above division, as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A classification method for positioning test scenes is characterized by comprising the following steps:
acquiring an image to be classified of a current test scene;
determining the category of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
2. The method of claim 1, wherein the determining the class of obstruction in the image to be classified comprises:
inputting the images to be classified into a classification model which is trained in advance to obtain labels of various shielding objects, and determining the types of the shielding objects according to the labels, wherein the classification model which is trained is obtained by performing supervision training on an initial classification model according to a sample scene image and a sample classification image carrying the labels.
3. The method according to claim 1, wherein the determining the degree of occlusion based on the total area of the various types of occlusions and the area of the image to be classified comprises:
calculating a first ratio of the total area of the various types of shelters to the area of the image to be classified;
comparing the first ratio with at least one first occlusion threshold to determine the degree of occlusion.
4. The method according to claim 1, wherein before the determining the scene classification result of the image to be classified according to the occlusion degree and the radiation class of the radiation source, the method further comprises:
calculating a second ratio of the area of each type of shelter with high shelter degree or ultrahigh shelter degree to the total area of each type of shelter, and calculating a third ratio of the area of each type of shelter with medium shelter degree to the total area of each type of shelter;
and comparing the second ratio with a second occlusion threshold value and a third occlusion threshold value, determining a first occlusion source type corresponding to high occlusion or ultrahigh occlusion, and comparing the third ratio with the second occlusion threshold value and the third occlusion threshold value, and determining a second occlusion source type corresponding to medium occlusion.
5. The method of claim 4, wherein comparing the second ratio with a second occlusion threshold and a third occlusion threshold to determine a first occlusion source class corresponding to high occlusion or ultrahigh occlusion comprises:
if the second ratio of the shielding degree of any shielding object with high shielding or ultrahigh shielding is larger than or equal to the second shielding threshold value, taking the shielding object as a main shielding source;
if the second ratio of the shielding object with the shielding degree of high shielding or ultrahigh shielding is less than or equal to the second shielding threshold value and is greater than or equal to the third shielding threshold value, taking the shielding object as a secondary shielding source;
determining the occlusion degree as a first occlusion source category corresponding to high occlusion or ultrahigh occlusion based on the primary occlusion source and the secondary occlusion source.
6. The method of claim 4, wherein comparing the third ratio with the second occlusion threshold and the third occlusion threshold to determine a second occlusion source class corresponding to an occlusion with an occlusion degree of middle includes:
if the third ratio of any shielding object with the shielding degree of medium shielding is larger than or equal to the second shielding threshold value, taking the shielding object as a main shielding source with the shielding degree of medium shielding;
and taking the occlusion type corresponding to the main occlusion source with the occlusion degree of the middle occlusion as the second occlusion source type.
7. The method according to claim 4, wherein the determining the scene classification result of the image to be classified according to the occlusion degree and the radiation category of the radiation source comprises:
determining the radiation category according to the number of the radiation sources and the distance between the radiation sources and the current test scene, wherein the radiation category comprises at least one of strong radiation, weak radiation and no radiation;
taking the occlusion categories corresponding to the occlusion degrees of no occlusion and low occlusion as a third occlusion source category;
and combining the first shielding source category, the second shielding source category, the third shielding source category and the radiation categories to obtain a scene classification result of the image to be classified.
8. A classification device for positioning test scenes is characterized by comprising:
the image acquisition module is used for acquiring an image to be classified of the current test scene;
the area determining module is used for determining the type of the shielding objects in the image to be classified, and determining the area of the image to be classified and the areas of various shielding objects;
the shielding degree determining module is used for determining the shielding degree based on the total area of various types of shielding objects and the area of the image to be classified, wherein the shielding degree comprises at least one of no shielding, low shielding, medium shielding, high shielding and ultrahigh shielding;
and the scene classification module is used for determining a scene classification result of the image to be classified according to the shielding degree and the radiation type of the radiation source.
9. A classification device for localization test scenarios, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the classification method for localization test scenarios according to any one of claims 1 to 7 when executing the computer program.
10. A storage medium containing computer-executable instructions which, when executed by a computer processor, implement the classification method of localization test scenarios of any one of claims 1-7.
CN202011386443.0A 2020-12-01 2020-12-01 Classification method, device, equipment and storage medium for positioning test scenes Active CN112396125B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011386443.0A CN112396125B (en) 2020-12-01 2020-12-01 Classification method, device, equipment and storage medium for positioning test scenes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011386443.0A CN112396125B (en) 2020-12-01 2020-12-01 Classification method, device, equipment and storage medium for positioning test scenes

Publications (2)

Publication Number Publication Date
CN112396125A 2021-02-23
CN112396125B 2022-11-18

Family

ID=74604080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011386443.0A Active CN112396125B (en) 2020-12-01 2020-12-01 Classification method, device, equipment and storage medium for positioning test scenes

Country Status (1)

Country Link
CN (1) CN112396125B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114001711A (en) * 2021-09-24 2022-02-01 上海东一土地规划勘测设计有限公司 Land surveying and mapping method, system, device and storage medium based on positioning system
CN116432090A (en) * 2023-06-13 2023-07-14 荣耀终端有限公司 Scene recognition method, system and terminal equipment

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200240206A1 (en) * 2004-05-06 2020-07-30 Mechoshade Systems, Llc Sky Camera Virtual Horizon Mask and Tracking Solar Disc
CN103260596A (en) * 2009-12-18 2013-08-21 莱雅公司 Cosmetic treatment method using a compound that can be condensed in situ and a uv-adiation-filtering agent
CN104105106A (en) * 2014-07-23 2014-10-15 武汉飞脉科技有限责任公司 Wireless communication network intelligent-antenna-covered scene automatic classification and recognition method
CN110099482A (en) * 2014-09-29 2019-08-06 飞利浦灯具控股公司 System and method for Lighting control
US20170369051A1 (en) * 2016-06-28 2017-12-28 Toyota Motor Engineering & Manufacturing North America, Inc. Occluded obstacle classification for vehicles
WO2018073778A1 (en) * 2016-10-20 2018-04-26 Rail Vision Ltd System and method for object and obstacle detection and classification in collision avoidance of railway applications
NL2021472A (en) * 2017-09-20 2019-03-26 Asml Netherlands Bv Radiation Source
CN110031861A (en) * 2017-12-08 2019-07-19 罗伯特·博世有限公司 Method for detecting the laser radar apparatus of occluded object
CN108931825A (en) * 2018-05-18 2018-12-04 北京航空航天大学 A kind of remote sensing image clouds thickness detecting method based on atural object clarity
WO2019227294A1 (en) * 2018-05-28 2019-12-05 华为技术有限公司 Image processing method, related device and computer storage medium
US20200034959A1 (en) * 2018-07-24 2020-01-30 The Regents Of The University Of Michigan Detection Of Near-Field Occlusions In Images
CN111489384A (en) * 2019-01-25 2020-08-04 曜科智能科技(上海)有限公司 Occlusion assessment method, device, equipment, system and medium based on mutual view
US20200273240A1 (en) * 2019-02-27 2020-08-27 Verizon Patent And Licensing Inc. Directional occlusion methods and systems for shading a virtual object rendered in a three-dimensional scene
CN110275181A (en) * 2019-07-08 2019-09-24 武汉中海庭数据技术有限公司 A kind of vehicle-mounted mobile measuring system and its data processing method
CN110705727A (en) * 2019-09-30 2020-01-17 山东建筑大学 Photovoltaic power station shadow shielding diagnosis method and system based on random forest algorithm
CN111046956A (en) * 2019-12-13 2020-04-21 苏州科达科技股份有限公司 Occlusion image detection method and device, electronic equipment and storage medium
CN111428581A (en) * 2020-03-05 2020-07-17 平安科技(深圳)有限公司 Face shielding detection method and system
CN111860566A (en) * 2020-04-24 2020-10-30 北京嘀嘀无限科技发展有限公司 Method and device for training occlusion recognition model and storage medium
CN111738329A (en) * 2020-06-19 2020-10-02 中南大学 Land use classification method for time series remote sensing images
CN111898423A (en) * 2020-06-19 2020-11-06 北京理工大学 Morphology-based multisource remote sensing image ground object fine classification method
CN111967296A (en) * 2020-06-28 2020-11-20 北京中科虹霸科技有限公司 Iris living body detection method, entrance guard control method and entrance guard control device
CN111753929A (en) * 2020-08-07 2020-10-09 腾讯科技(深圳)有限公司 Artificial intelligence based classification method, device, terminal and storage medium
CN111970424A (en) * 2020-08-25 2020-11-20 武汉工程大学 Light field camera shielding removing system and method based on micro-lens array synthetic aperture

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
SHOUCHENG NI et al.: "Learning discriminative and shareable patches for scene classification", IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS *
ZJU_FISH1996: "[Computer Graphics] Ambient Occlusion (AO)", HTTPS://BLOG.CSDN.NET/ZJU_FISH199 *
QIAO LIYONG (乔立永) et al.: "Review of image complexity measurement methods for infrared target recognition", Infrared Technology (红外技术) *
DAI JIGUANG (戴激光) et al.: "Review of road extraction methods for optical remote sensing images", Journal of Remote Sensing (遥感学报) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114001711A (en) * 2021-09-24 2022-02-01 上海东一土地规划勘测设计有限公司 Land surveying and mapping method, system, device and storage medium based on positioning system
CN114001711B (en) * 2021-09-24 2024-04-12 上海东一土地规划勘测设计有限公司 Land mapping method, system, device and storage medium based on positioning system
CN116432090A (en) * 2023-06-13 2023-07-14 荣耀终端有限公司 Scene recognition method, system and terminal equipment
CN116432090B (en) * 2023-06-13 2023-10-20 荣耀终端有限公司 Scene recognition method, system and terminal equipment

Also Published As

Publication number Publication date
CN112396125B (en) 2022-11-18

Similar Documents

Publication Publication Date Title
CN109284348B (en) Electronic map updating method, device, equipment and storage medium
US11105638B2 (en) Method, apparatus, and computer readable storage medium for updating electronic map
CN110260870B (en) Map matching method, device, equipment and medium based on hidden Markov model
CN109188438B (en) Yaw angle determination method, device, equipment and medium
CN109459734B (en) Laser radar positioning effect evaluation method, device, equipment and storage medium
CN109032102B (en) Unmanned vehicle testing method, device, equipment and storage medium
CN110427444B (en) Navigation guide point mining method, device, equipment and storage medium
CN109931945B (en) AR navigation method, device, equipment and storage medium
CN112396125B (en) Classification method, device, equipment and storage medium for positioning test scenes
US11087474B2 (en) Method, apparatus, device, and storage medium for calibrating posture of moving obstacle
CN110346825B (en) Vehicle positioning method and device, vehicle and storage medium
CN109558854B (en) Obstacle sensing method and device, electronic equipment and storage medium
CN110018503B (en) Vehicle positioning method and positioning system
CN111551190B (en) Method, apparatus, device and medium for determining location capability for autonomous driving
CN110555352B (en) Interest point identification method, device, server and storage medium
CN109635868B (en) Method and device for determining obstacle type, electronic device and storage medium
CN111674388A (en) Information processing method and device for vehicle curve driving
CN114419601A (en) Obstacle information determination method, obstacle information determination device, electronic device, and storage medium
CN113758492A (en) Map detection method and device
CN109270566B (en) Navigation method, navigation effect testing method, device, equipment and medium
CN109188419B (en) Method and device for detecting speed of obstacle, computer equipment and storage medium
CN113847914B (en) Vehicle positioning method and device, electronic equipment and storage medium
CN113449687B (en) Method and device for identifying point of interest outlet and point of interest inlet and electronic equipment
CN114565780A (en) Target identification method and device, electronic equipment and storage medium
CN114136327A (en) Automatic inspection method and system for recall ratio of dotted line segment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant