CN108268831A - Robustness test method and system for unmanned vehicle vision-based detection - Google Patents

Robustness test method and system for unmanned vehicle vision-based detection

Info

Publication number
CN108268831A
CN108268831A CN201711248427.3A CN201711248427A CN108268831A CN 108268831 A CN108268831 A CN 108268831A CN 201711248427 A CN201711248427 A CN 201711248427A CN 108268831 A CN108268831 A CN 108268831A
Authority
CN
China
Prior art keywords
scene
unmanned vehicle
tested
image
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711248427.3A
Other languages
Chinese (zh)
Inventor
冯西
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201711248427.3A priority Critical patent/CN108268831A/en
Publication of CN108268831A publication Critical patent/CN108268831A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01: Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13: Receivers
    • G01S19/14: Receivers specially adapted for specific applications
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Abstract

The application provides a robustness test method and system for unmanned vehicle vision-based detection. The method includes: obtaining an original image, acquired by the unmanned vehicle, that contains a detection target; applying abnormality processing to the original image using the test parameters corresponding to a scene to be tested, to construct a test image corresponding to that scene; and using the vision detection results for the detection target in the original image and in the test image to obtain a robustness test result for unmanned vehicle vision detection under the scene to be tested. With the method and system of this embodiment, test scenes are more targeted and repeatable, which greatly improves the efficiency of robustness testing.

Description

Robustness test method and system for unmanned vehicle vision-based detection
【Technical field】
This application relates to the field of image recognition, and in particular to a robustness test method and system for unmanned vehicle vision-based detection.
【Background technology】
Visual detection algorithms play a crucial role in an unmanned vehicle system. An unmanned vehicle travelling on a road must, before passing through an intersection, first identify the traffic lights (hereinafter referred to as signal lights).
In the prior art, during signal-light recognition the unmanned vehicle photographs the signal light with a camera device. Only after the captured image has been processed by the visual detection algorithm can the system learn whether a signal light is present in the image, along with its type and color, and send this information to the decision and control module, which then performs the operation that complies with traffic rules, i.e. stopping or proceeding.
Because the camera device faces many different scenes, such as rain, snow or fog, or movement of the camera angle, the unmanned vehicle may fail to identify the signal light. This may lead to traffic violations or even accidents.
To ensure safe driving, the signal light must be identified correctly in all kinds of scenes, which requires the unmanned vehicle's visual detection algorithm to be highly robust.
During unmanned vehicle development, real-road testing is needed to verify whether vision detection in the scenes encountered on real roads meets actual requirements. However, different scenes, especially scenes such as rain, snow and fog, are heavily affected by natural conditions and may not be reproducible. Completing tests for all the different scenes therefore consumes considerable time, manpower and money.
【Invention content】
Various aspects of the application provide a robustness test method and system for unmanned vehicle vision-based detection, in order to test the robustness of the unmanned vehicle's vision algorithm.
One aspect of the application provides a robustness test method for unmanned vehicle vision-based detection, including:
obtaining an original image, acquired by the unmanned vehicle, that contains a detection target;
applying abnormality processing to the original image using the test parameters corresponding to a scene to be tested, to construct a test image corresponding to the scene to be tested;
using the vision detection results for the detection target in the original image and in the test image to obtain a robustness test result for unmanned vehicle vision detection under the scene to be tested.
In the above aspect and any possible implementation, an implementation is further provided in which obtaining the original image containing the detection target acquired by the unmanned vehicle includes:
when the unmanned vehicle is monitored to be within a preset distance range of the detection target, starting the camera device on the unmanned vehicle to capture and store the original image containing the detection target.
In the above aspect and any possible implementation, an implementation is further provided in which the vision detection result includes:
at least one of the type, color and position of the detection target.
In the above aspect and any possible implementation, an implementation is further provided in which the scene to be tested includes a bad-weather scene;
applying abnormality processing to the original image using the test parameters corresponding to the scene to be tested, to construct the test image corresponding to the scene to be tested, includes:
adding noise to and adjusting the brightness of the original image according to a meteorological parameter, to obtain the test image corresponding to the bad-weather scene.
In the above aspect and any possible implementation, an implementation is further provided in which the scene to be tested includes a camera-position movement scene;
applying abnormality processing to the original image using the test parameters corresponding to the scene to be tested, to construct the test image corresponding to the scene to be tested, includes:
translating and scaling the original image according to a position information parameter of the camera device in the unmanned vehicle, to obtain the test image corresponding to the camera-position movement scene.
In the above aspect and any possible implementation, an implementation is further provided in which the scene to be tested includes a camera-angle movement scene;
applying abnormality processing to the original image using the test parameters corresponding to the scene to be tested, to construct the test image corresponding to the scene to be tested, includes:
translating and rotating the image according to an attitude information parameter of the camera device in the unmanned vehicle, to obtain the test image corresponding to the camera-angle movement scene.
In the above aspect and any possible implementation, an implementation is further provided in which using the vision detection results for the detection target in the original image and in the test image to obtain the robustness test result of unmanned vehicle vision detection under the scene to be tested includes:
comparing the vision detection results obtained for the detection target in the original image and in the test image, to obtain the error rate of unmanned vehicle vision detection under the scene to be tested.
Another aspect of the application provides a robustness test system for unmanned vehicle vision-based detection, including:
an original image acquisition module, for obtaining an original image, acquired by the unmanned vehicle, that contains a detection target;
an image processing module, for applying abnormality processing to the original image using the test parameters corresponding to a scene to be tested, to construct a test image corresponding to the scene to be tested;
a vision detection module, for using the vision detection results for the detection target in the original image and in the test image to obtain a robustness test result for unmanned vehicle vision detection under the scene to be tested.
In the above aspect and any possible implementation, an implementation is further provided in which the original image acquisition module is specifically configured to:
when the unmanned vehicle is monitored to be within a preset distance range of the detection target, start the camera device on the unmanned vehicle to capture and store the original image containing the detection target.
In the above aspect and any possible implementation, an implementation is further provided in which the vision detection result includes:
at least one of the type, color and position of the detection target.
In the above aspect and any possible implementation, an implementation is further provided in which the scene to be tested includes a bad-weather scene;
the image processing module is specifically configured to:
add noise to and adjust the brightness of the original image according to a meteorological parameter, to obtain the test image corresponding to the bad-weather scene.
In the above aspect and any possible implementation, an implementation is further provided in which the scene to be tested includes a camera-position movement scene;
the image processing module is specifically configured to:
translate and scale the original image according to a position information parameter of the camera device in the unmanned vehicle, to obtain the test image corresponding to the camera-position movement scene.
In the above aspect and any possible implementation, an implementation is further provided in which the scene to be tested includes a camera-angle movement scene;
the image processing module is specifically configured to:
translate and rotate the image according to an attitude information parameter of the camera device in the unmanned vehicle, to obtain the test image corresponding to the camera-angle movement scene.
In the above aspect and any possible implementation, an implementation is further provided in which the vision detection module is specifically configured to:
compare the vision detection results obtained for the detection target in the original image and in the test image, to obtain the error rate of unmanned vehicle vision detection under the scene to be tested.
Another aspect of the application provides a device, characterized in that the device includes:
one or more processors;
a storage device, for storing one or more programs,
such that when the one or more programs are executed by the one or more processors, the one or more processors implement any of the above methods.
Another aspect of the application provides a computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements any of the above methods.
As can be seen from the above technical solutions, the embodiments of the application can effectively test the robustness of unmanned vehicle vision detection.
【Description of the drawings】
To describe the technical solutions in the embodiments of the application more clearly, the drawings needed for the embodiments or the description of the prior art are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the application; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flow diagram of the robustness test method for unmanned vehicle vision-based detection provided by one embodiment of the application;
Fig. 2 is a structural diagram of the robustness test system for unmanned vehicle vision-based detection provided by another embodiment of the application;
Fig. 3 is a block diagram of an exemplary computer system/server suitable for implementing embodiments of the present invention.
【Specific embodiment】
To make the purposes, technical solutions and advantages of the embodiments of the application clearer, the technical solutions in the embodiments are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, rather than all, of the embodiments of the application. Based on the embodiments in the application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the application.
In addition, the term "and/or" herein describes only an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent three cases: A alone, both A and B, and B alone. The character "/" herein generally indicates an "or" relationship between the objects before and after it.
Fig. 1 is a flow diagram of the robustness test method for unmanned vehicle vision-based detection provided by one embodiment of the application. As shown in Fig. 1, the method includes the following steps:
Step S11: obtain an original video, acquired by the unmanned vehicle, that contains a detection target;
Step S12: apply abnormality processing to the original video according to the test parameters of the scene to be tested in a test request, to construct a test video corresponding to the scene to be tested;
Step S13: use the vision detection results for the detection target in the original video and in the test video to obtain a robustness test result for unmanned vehicle vision detection under the scene to be tested.
In a preferred implementation of step S11,
the original video containing the detection target is shot by the camera device on the unmanned vehicle in the actual test scene. In this embodiment the detection target is a signal light, as an example. The step includes the following sub-steps:
Sub-step S111: build a test map corresponding to the unmanned vehicle test site.
In this embodiment, the test map of the test site can be built first.
When building the test map, at least one test map can be drawn according to the coordinates and height of each point of the test site; alternatively, an existing high-precision map can be modified to obtain at least one test map applicable to the test site. The test map contains the position information and orientation information of the signal lights.
Sub-step S112: determine, on the test map, an expected driving trajectory from a preset start position to an end position.
After the test map is built, the expected driving trajectory of the unmanned vehicle from the start position to the end position can be determined on it.
Sub-step S113: send the expected driving trajectory to the unmanned vehicle, so that the unmanned vehicle travels along it on the test site.
After the expected driving trajectory is determined, the server can send it to the unmanned vehicle. Having received the trajectory, the unmanned vehicle can travel along it from the start position to the end position. It is understood that the server can also send the test map to the unmanned vehicle at the same time.
Sub-step S114: control the camera device on the unmanned vehicle to shoot and store the original video containing the signal light.
In a preferred implementation of this embodiment, the camera device on the unmanned vehicle can also be controlled to shoot and store original images containing the signal light, stored directly as images.
In this embodiment, detecting and processing a video both operate frame by frame; detection and processing are performed on each frame image of the video.
Preferably, when it is monitored that a signal light to be captured lies within a preset distance range in the travelling direction of the unmanned vehicle, the camera device on the unmanned vehicle is started to shoot the video containing the signal light. The distance between the camera device and the signal light can be determined from the position information of the unmanned vehicle and the position information of the signal light.
The signal light to be captured is the one the unmanned vehicle needs to shoot and detect before passing through the intersection, so as to decide whether to proceed or stop. It is the signal light governing vehicles in the unmanned vehicle's direction of travel.
The preset distance range is a preset numerical range for the distance from the unmanned vehicle to the signal light. It can be an interval, e.g. 40-60 meters, or a single value, e.g. 50 meters.
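The distance-range trigger described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the planar coordinates and the 40-60 m bounds are assumptions taken from the example values in the text.

```python
import math

def within_capture_range(vehicle_pos, light_pos, min_m=40.0, max_m=60.0):
    """Return True when the signal light lies inside the preset
    distance range of the vehicle, e.g. 40-60 m, so that video
    capture should be started."""
    dx = light_pos[0] - vehicle_pos[0]
    dy = light_pos[1] - vehicle_pos[1]
    return min_m <= math.hypot(dx, dy) <= max_m

# A light 50 m straight ahead falls inside the range:
print(within_capture_range((0.0, 0.0), (0.0, 50.0)))  # True
```

In practice the two positions would come from the vehicle's positioning system and the signal-light coordinates stored in the test map.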
In this embodiment, vision detection can be performed on each frame of the original video and the detection result recorded. Alternatively, the actual state of the signal light corresponding to each frame can be recorded directly; in that case vision detection on the original video is not needed, and the actual state of the signal light at the corresponding time only needs to be obtained according to the timestamp of each frame.
The detection result includes the type and color of the signal light in each frame, for example:
the signal light is a motor-vehicle signal light, and it is green; or
the signal light is a motor-vehicle signal light, and it is yellow; or
the signal light is a motor-vehicle signal light, and it is red;
the signal light is a lane signal light, and the green arrow is lit; or
the signal light is a lane signal light, and the red cross is lit;
the signal light is a direction signal light, and the arrow points left; or
the signal light is a direction signal light, and the arrow points up; or
the signal light is a direction signal light, and the arrow points right.
The detection result can also include the position of the signal light in each frame.
Preferably, the detection result further includes the position information of the unmanned vehicle corresponding to each frame, for later comparison. The position information of the unmanned vehicle is the geographical location of the vehicle, which can be obtained using the vehicle's inertial navigation system or GPS.
Preferably, the unmanned vehicle is built around the ROS robot operating system, and its runtime data can be stored as bag files in the .bag format. A bag file records the detailed information of all topics in the unmanned vehicle's ROS system, including the camera device topic and the signal-light perception topic. The camera device topic is output to the signal-light perception topic, which detects from the camera topic whether a signal light is present, as well as its position, type and color.
In a preferred implementation of step S12,
when a test request is received, abnormality processing is applied to each frame of the original video according to the test parameters corresponding to the scene to be tested in the request, to construct a test video corresponding to that scene.
The test request includes the test map and the scene to be tested. Scenes to be tested include: bad weather, camera-position movement, camera-angle movement, and so on. The test parameter of a bad-weather scene is a meteorological parameter; the test parameter of a camera-position movement scene is the position information of the camera device in the unmanned vehicle; the test parameter of a camera-angle movement scene is the attitude information of the camera device.
A meteorological parameter specifies conditions such as rain, snow, fog, dust, haze or an overcast sky. The position information parameter of the camera device in the unmanned vehicle indicates where the camera is mounted on the vehicle. The attitude information parameter is information related to the camera's pose and indicates its orientation.
By setting the meteorological parameter, a scene under severe weather can be realized; by setting the position information of the camera device in the unmanned vehicle, a scene in which the camera position has moved can be realized; by setting the attitude information of the camera device, a scene in which the camera angle has moved can be realized.
Preferably, these parameters can be set simultaneously to realize scenes for complex situations, for example bad weather combined with movement of both the camera position and the camera angle.
The original video stored in the video container is extracted; each frame of it is subjected to abnormality processing according to the test parameters corresponding to the scene to be tested in the request, and the abnormal image data obtained from abnormality processing is synthesized into the test video corresponding to that scene.
First, the topics relevant to the video are extracted. Video is stored in topics in compressed form; the extraction method is to traverse all topics in the entire bag file and select the specified topics relevant to the signal light. The topic corresponding to the signal light is then processed frame by frame, which first requires converting the signal-light topic into a format OpenCV can handle.
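The topic traversal can be illustrated with plain Python; the topic names below are hypothetical, and a real implementation would enumerate the topics through the rosbag API rather than from a list.

```python
def select_camera_topics(all_topics, keyword="camera"):
    """Traverse every topic recorded in the bag file and keep only
    those relevant to the signal-light camera feed."""
    return [t for t in all_topics if keyword in t]

topics = ["/gps/fix", "/camera/front/image_raw", "/imu/data",
          "/camera/front/camera_info"]
print(select_camera_topics(topics))
# ['/camera/front/image_raw', '/camera/front/camera_info']
```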
Each frame of the original video is translated, scaled, offset or rotated using a computer vision library such as OpenCV, generating the corresponding abnormal image.
Preferably, the image is translated and scaled according to the position information parameter of the camera device in the unmanned vehicle, to obtain the test image corresponding to the camera-position movement scene;
the image is translated and rotated according to the attitude information parameter of the camera device, to obtain the test image corresponding to the camera-angle movement scene.
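A minimal numpy-only sketch of the translation and scaling steps is shown below. A real pipeline would typically use OpenCV operations such as warpAffine and resize; the shift and scale values here are illustrative assumptions, not parameters from the patent.

```python
import numpy as np

def translate(img, dx, dy):
    """Shift the frame by (dx, dy) pixels, zero-filling the exposed
    border, to simulate a camera whose mounting position has moved."""
    h, w = img.shape[:2]
    out = np.zeros_like(img)
    ys_dst = slice(max(dy, 0), min(h, h + dy))
    xs_dst = slice(max(dx, 0), min(w, w + dx))
    ys_src = slice(max(-dy, 0), min(h, h - dy))
    xs_src = slice(max(-dx, 0), min(w, w - dx))
    out[ys_dst, xs_dst] = img[ys_src, xs_src]
    return out

def scale_nn(img, factor):
    """Nearest-neighbour rescale, to simulate a change in apparent
    target size after the camera moves forward or backward."""
    h, w = img.shape[:2]
    rows = (np.arange(int(h * factor)) / factor).astype(int)
    cols = (np.arange(int(w * factor)) / factor).astype(int)
    return img[rows][:, cols]

frame = np.arange(16, dtype=np.uint8).reshape(4, 4)
print(translate(frame, 1, 0).shape)  # (4, 4)
print(scale_nn(frame, 2.0).shape)    # (8, 8)
```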
Since computer vision libraries such as OpenCV do not directly provide the required noise-adding and brightness processing, other image processing tools are used to add noise to and adjust the brightness of each frame of the video according to the meteorological parameter, generating the test image corresponding to the bad-weather scene.
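The noise-and-brightness step for a bad-weather scene might look like the numpy sketch below. The noise sigma and brightness gain are illustrative assumptions; a real test would derive them from the meteorological parameter.

```python
import numpy as np

def degrade_frame(img, noise_sigma=20.0, brightness=0.6, seed=0):
    """Dim the frame and add Gaussian noise to mimic rain, fog or an
    overcast sky; values are clipped back to the uint8 range."""
    rng = np.random.default_rng(seed)
    noisy = img.astype(np.float32) * brightness
    noisy += rng.normal(0.0, noise_sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

frame = np.full((4, 4), 200, dtype=np.uint8)
foggy = degrade_frame(frame)
print(foggy.dtype, foggy.shape)  # uint8 (4, 4)
```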
The generated test images are converted via OpenCV into video data readable by ROS and used as the test video, which is written into the test video topic corresponding to the scene to be tested and stored as a .bag file.
In a preferred implementation of step S13,
the vision detection results for the detection target in the original video and in the test video are used to obtain the robustness test result of unmanned vehicle vision detection under the scene to be tested.
Unmanned vehicle vision detection is performed according to the test video corresponding to the scene to be tested, and the detection result is recorded.
The newly generated bag file is played on the unmanned vehicle's ROS operating system while the signal-light perception topic is enabled; whether a signal light is present, as well as its position, type and color, is detected according to the test video topic corresponding to the scene to be tested.
The detection result of performing unmanned vehicle vision detection on the test video corresponding to the scene to be tested includes the type and color of the signal light in each frame, for example:
the signal light is a motor-vehicle signal light, and it is green; or
the signal light is a motor-vehicle signal light, and it is yellow; or
the signal light is a motor-vehicle signal light, and it is red;
the signal light is a lane signal light, and the green arrow is lit; or
the signal light is a lane signal light, and the red cross or arrow is lit;
the signal light is a direction signal light, and the arrow points left; or
the signal light is a direction signal light, and the arrow points up; or
the signal light is a direction signal light, and the arrow points right.
The detection result can also include the position of the signal light in each frame.
Preferably, the detection result further includes the position information of the unmanned vehicle corresponding to each frame; this position information is the geographical location of the unmanned vehicle recorded while the camera device carried on the vehicle was shooting the signal-light video in the actual test scene.
Preferably, the detection result of the original video is compared with the detection result of the test video corresponding to the scene to be tested, to obtain the error rate of the unmanned vehicle's vision algorithm under that scene and to determine whether the unmanned vehicle can operate normally in it. The detection result of the test-scene video can also be compared with the actual state of the signal light.
For example, by comparing, for corresponding frames under different test scenes, information such as the accuracy with which the unmanned vehicle's visual detection algorithm detects the signal light, the robustness of the algorithm to the different test scenes can be obtained.
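The per-frame comparison can be sketched as follows. The (type, color) tuples are a hypothetical encoding of the detection result, chosen only for illustration.

```python
def error_rate(baseline, under_test):
    """Compare per-frame detections from the original video with those
    from a degraded test video; return the fraction of frames where
    the detections disagree."""
    assert len(baseline) == len(under_test)
    wrong = sum(1 for a, b in zip(baseline, under_test) if a != b)
    return wrong / len(baseline)

orig = [("motor", "red"), ("motor", "red"), ("motor", "green")]
fog  = [("motor", "red"), (None, None),     ("motor", "green")]
print(error_rate(orig, fog))  # 0.3333333333333333
```

The same routine could compare the test-video detections against the recorded actual signal-light states instead of the original-video detections.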
Targeted improvements are then made to the unmanned vehicle's vision algorithm according to the robustness test result, enhancing the robustness of the visual detection algorithm.
With the robustness test method for unmanned vehicle vision-based detection provided by the above embodiment of the application, test scenes are more targeted and repeatable, which greatly improves the efficiency of robustness testing.
It should be noted that, for brevity, the foregoing method embodiments are all expressed as a series of action combinations; however, those skilled in the art should know that the application is not limited by the described order of actions, because according to the application certain steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in this specification are preferred embodiments, and the actions and modules involved are not necessarily required by the application.
In the above embodiments, the descriptions of the embodiments each have their own emphasis; for parts not described in detail in one embodiment, refer to the relevant descriptions of the other embodiments.
Fig. 2 is a structural schematic diagram of the robustness test system for unmanned vehicle vision detection provided by another embodiment of the application. As shown in Fig. 2, the system includes the following modules:
An original video acquisition module 21, for obtaining the original video containing the detection target collected by the unmanned vehicle;
A video processing module 22, for performing abnormality processing on the original video according to the test parameters of the scene to be tested in the test request, building the test video corresponding to the scene to be tested;
A vision detection module 23, for using the vision detection results for the detection target in the original video and the test video to obtain the robustness test result of the unmanned vehicle vision detection under the scene to be tested.
In a preferred implementation of the original video acquisition module 21:
The original video containing the detection target in the actual test scene is shot by the shooting device on the unmanned vehicle. In this embodiment, the detection target is a signal lamp as an example.
A test map corresponding to the unmanned vehicle test site is built.
In this embodiment, the test map of the test site can be built first.
When building the test map, at least one test map can be drawn according to the coordinates and height of each point of the test site; alternatively, an existing high-precision map can be modified to obtain at least one test map applicable to the test site. The test map contains the location information and direction information of the signal lamps.
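As a rough illustration of what such a test map might carry, the sketch below stores each signal lamp's site coordinates, mounting height and facing direction. All field and site names are invented for illustration; the patent only requires that the map contain the lamps' location and direction information.

```python
from dataclasses import dataclass

# Hypothetical minimal test-map record; field names are assumptions.
@dataclass
class SignalLamp:
    lamp_id: str
    x: float            # site coordinate, metres
    y: float
    height: float       # mounting height, metres
    heading_deg: float  # direction the lamp faces

@dataclass
class TestMap:
    site_name: str
    lamps: list

    def lamp_by_id(self, lamp_id):
        return next(lamp for lamp in self.lamps if lamp.lamp_id == lamp_id)

site_map = TestMap("test_site_A", [
    SignalLamp("TL-01", 120.0, 45.5, 6.0, 180.0),
    SignalLamp("TL-02", 300.0, 45.5, 6.0, 90.0),
])
```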
The expected driving trajectory from a preset initial position to a final position is determined on the test map.
After the test map is built, the expected driving trajectory from the initial position of the unmanned vehicle to the final position can be determined on the test map.
The expected driving trajectory is sent to the unmanned vehicle, so that the unmanned vehicle drives on the test site according to the expected trajectory.
After the expected driving trajectory is determined, the server can send it to the unmanned vehicle. Having received the expected driving trajectory, the unmanned vehicle can drive from the initial position to the final position according to it. It can be understood that the server can also send the test map to the unmanned vehicle at the same time.
The shooting device on the unmanned vehicle is controlled to shoot and store the original video containing the signal lamp.
In a preferred implementation of this embodiment, the shooting device on the unmanned vehicle can also be controlled to shoot and store original images containing the signal lamp, stored directly as images.
In this embodiment, the detection and processing of the video are both performed frame by frame on the images in the video.
Preferably, when it is monitored that the signal lamp to be captured on the direction of travel of the unmanned vehicle is within a preset distance range, the shooting device on the unmanned vehicle is started to shoot the video containing the signal lamp. The distance between the shooting device and the signal lamp can be determined from the location information of the unmanned vehicle and the location information of the signal lamp.
The signal lamp to be captured is the one the unmanned vehicle needs to shoot and detect in order to pass the intersection, so as to determine whether to proceed or stop. The signal lamp in question is the one for vehicles in the unmanned vehicle's direction of travel.
The preset distance range is a preset numerical range of the distance from the unmanned vehicle to the signal lamp. It can be a numerical interval, such as 40-60 meters, or a single value, such as 50 meters.
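The range check described above might look like the following sketch, assuming both the vehicle and the lamp are expressed in a planar site coordinate frame in metres; a real system would first convert GPS or inertial output into such a frame.

```python
import math

# Hypothetical sketch: start recording once the signal lamp ahead falls
# inside the preset distance window (here the 40-60 m interval mentioned
# above). Coordinates are assumed planar, in metres.

def within_capture_range(vehicle_xy, lamp_xy, near=40.0, far=60.0):
    dx = lamp_xy[0] - vehicle_xy[0]
    dy = lamp_xy[1] - vehicle_xy[1]
    return near <= math.hypot(dx, dy) <= far

print(within_capture_range((0.0, 0.0), (0.0, 50.0)))  # lamp 50 m ahead -> True
print(within_capture_range((0.0, 0.0), (0.0, 10.0)))  # already too close -> False
```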
In this embodiment, vision detection can be performed on each frame image of the original video and the detection results recorded. Alternatively, the actual state of the signal lamp corresponding to each frame image of the video can be recorded directly; that is, vision detection of the original video is not needed, and the actual state of the signal lamp at the corresponding time need only be obtained according to the timestamp of each frame image in the original video.
The detection result includes the type and color of the signal lamp in each frame image, for example:
The signal lamp is a motor vehicle signal lamp, and the motor vehicle signal lamp is green; or
The signal lamp is a motor vehicle signal lamp, and the motor vehicle signal lamp is yellow; or
The signal lamp is a motor vehicle signal lamp, and the motor vehicle signal lamp is red.
The signal lamp is a lane signal lamp, and the lane signal lamp is a green arrow lamp; or
The signal lamp is a lane signal lamp, and the lane signal lamp is a red fork-shaped lamp.
The signal lamp is a direction signal lamp, and the arrow points to the left; or
The signal lamp is a direction signal lamp, and the arrow points upward; or
The signal lamp is a direction signal lamp, and the arrow points to the right.
The detection result can also include the position of the signal lamp in each frame image.
Preferably, the detection result further includes the location information of the unmanned vehicle corresponding to each frame image, for subsequent comparison. The location information of the unmanned vehicle is the geographical location of the unmanned vehicle, which can be obtained using the inertial navigation system or GPS system of the unmanned vehicle.
Preferably, the unmanned vehicle is built around the robot operating system ROS, and the runtime data of the unmanned vehicle can be stored as bag files in .bag format. A bag file can record the detailed information of all topics in the unmanned vehicle's ROS system, including the corresponding shooting device topic and signal lamp perception topic. The shooting device topic is output to the signal lamp perception topic, and the signal lamp perception topic detects, according to the shooting device topic, whether a signal lamp is present as well as the position, type and color of the signal lamp.
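The topic traversal and selection described above can be sketched as a simple name filter. The topic names below are invented examples, and the actual ROS bag reading API (e.g. `rosbag.Bag`) is deliberately omitted so the sketch stays self-contained.

```python
# Hypothetical sketch of the topic-selection step: traverse all topic names
# recorded in a bag and keep only those related to the shooting device
# (camera) and signal lamp perception. Topic names are invented examples.

def select_relevant_topics(all_topics, keywords=("camera", "traffic_light")):
    return [t for t in all_topics if any(k in t for k in keywords)]

bag_topics = [
    "/sensor/camera/front/image",   # assumed shooting-device topic name
    "/perception/traffic_light",    # assumed signal-lamp perception topic name
    "/localization/pose",
    "/control/command",
]
relevant = select_relevant_topics(bag_topics)  # keeps only the first two
```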
In a preferred implementation of the video processing module 22:
When a test request is received, the video processing module 22 performs abnormality processing on each frame image of the original video according to the test parameters corresponding to the scene to be tested in the test request, building the test video corresponding to the scene to be tested.
The test request includes the test map and the scene to be tested. The scene to be tested includes: bad weather, movement of the shooting device position, movement of the shooting device angle, etc. The test parameter of the bad weather scene is a meteorological parameter; the test parameter of the shooting-device position movement scene is the location information of the shooting device in the unmanned vehicle; the test parameter of the shooting-device angle movement scene is the attitude information of the shooting device.
The meteorological parameter is a specific parameter such as rain, snow, fog, sand and dust, haze or overcast. The location information parameter of the shooting device in the unmanned vehicle indicates the position of the shooting device in the unmanned vehicle. The attitude information parameter is information related to the attitude of the shooting device, recording its attitude.
By setting the meteorological parameter, the scene to be tested under bad weather conditions can be realized; by setting the location information of the shooting device in the unmanned vehicle, the scene under movement of the shooting device position can be realized; and by setting the attitude information of the shooting device, the scene under movement of the shooting device angle can be realized.
Preferably, the above parameters can be set simultaneously to realize scenes to be tested under complex conditions, for example, bad weather combined with movement of both the position and the angle of the shooting device.
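A composite test request might bundle these parameters as in the sketch below; the field names and units are assumptions, chosen only to show how one request can activate several scenes at once.

```python
from dataclasses import dataclass

# Hypothetical parameter set for a scene to be tested; setting several
# fields at once yields a composite scene (e.g. rain plus a shifted,
# tilted shooting device). Field names and units are assumptions.
@dataclass
class SceneParams:
    weather: str = ""                 # e.g. "rain", "fog", "snow"; "" = none
    camera_offset_px: tuple = (0, 0)  # position-movement scene, pixels
    camera_roll_deg: float = 0.0      # angle-movement scene, degrees

def active_scenes(params):
    scenes = []
    if params.weather:
        scenes.append("bad_weather")
    if params.camera_offset_px != (0, 0):
        scenes.append("position_move")
    if params.camera_roll_deg != 0.0:
        scenes.append("angle_move")
    return scenes

combo = SceneParams(weather="rain", camera_offset_px=(12, -5), camera_roll_deg=2.0)
```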
The original video stored in the video container is extracted, each frame of the original video is subjected to abnormality processing according to the test parameters corresponding to the scene to be tested in the test request, and the abnormal image data obtained from the abnormality processing is synthesized into the test video corresponding to the scene to be tested.
First, the topics related to the video are extracted. The video is stored in topics in compressed form; the extraction method is to traverse all topics in the entire bag file and select the specified topics related to the signal lamp. The topic corresponding to the signal lamp is then processed frame by frame, which first requires converting the signal lamp topic into a format that OpenCV can handle.
Each frame image in the original video is moved, scaled, offset and rotated using computer vision libraries such as OpenCV, generating the corresponding abnormal images.
Preferably, according to the location information parameter of the shooting device in the unmanned vehicle, the image is moved and scaled to obtain the test image corresponding to the shooting-device position movement scene;
According to the attitude information parameter of the shooting device, the image is moved and rotated to obtain the test image corresponding to the shooting-device angle movement scene.
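In practice these steps would use OpenCV calls such as `cv2.warpAffine` and `cv2.resize`; the NumPy sketch below shows only the translate and nearest-neighbour scale operations, as a stand-in under stated assumptions rather than the patent's actual implementation.

```python
import numpy as np

# Hypothetical NumPy stand-in for the per-frame geometric processing
# (real code would use cv2.warpAffine / cv2.resize).

def translate(img, dx, dy, fill=0):
    """Shift the image right by dx and down by dy pixels, filling vacated pixels."""
    out = np.full_like(img, fill)
    h, w = img.shape[:2]
    dst_cols = slice(max(dx, 0), min(w + dx, w))
    dst_rows = slice(max(dy, 0), min(h + dy, h))
    src_cols = slice(max(-dx, 0), min(w - dx, w))
    src_rows = slice(max(-dy, 0), min(h - dy, h))
    out[dst_rows, dst_cols] = img[src_rows, src_cols]
    return out

def scale_nearest(img, factor):
    """Nearest-neighbour rescale by a positive factor."""
    h, w = img.shape[:2]
    rows = (np.arange(int(h * factor)) / factor).astype(int)
    cols = (np.arange(int(w * factor)) / factor).astype(int)
    return img[rows][:, cols]

frame = np.arange(16, dtype=np.uint8).reshape(4, 4)
shifted = translate(frame, 1, 0)   # simulates a sideways camera offset
bigger = scale_nearest(frame, 2)   # simulates a camera moved closer
```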
Since computer vision libraries such as OpenCV cannot perform the noise-adding and brightness processing on images, other image processing tools are needed to add noise and adjust brightness on each frame image in the video according to the meteorological parameter, obtaining the test images corresponding to the bad weather scene.
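The add-noise and brightness step might be sketched as follows; the Gaussian noise model and the 0.6 darkening factor are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Hypothetical sketch of the noise-adding and brightness processing used to
# build bad-weather test images. Noise model and factors are assumptions.

def add_noise(img, sigma=10.0, seed=0):
    rng = np.random.default_rng(seed)
    noisy = img.astype(np.float32) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(np.uint8)

def adjust_brightness(img, factor):
    """factor < 1 darkens (overcast, fog); factor > 1 brightens."""
    return np.clip(img.astype(np.float32) * factor, 0, 255).astype(np.uint8)

frame = np.full((4, 4), 100, dtype=np.uint8)
foggy = adjust_brightness(add_noise(frame, sigma=5.0), 0.6)  # dim, noisy frame
```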
The generated test images are converted by OpenCV into video data that ROS can read and used as the test video, which is written into the test video topic corresponding to the scene to be tested and stored as a .bag file.
In a preferred implementation of the vision detection module 23:
The vision detection module 23 uses the vision detection results for the detection target in the original video and the test video to obtain the robustness test result of the unmanned vehicle vision detection under the scene to be tested.
Unmanned vehicle vision detection is performed according to the test video corresponding to the scene to be tested, and the detection results are recorded.
The newly generated bag file is played back on the unmanned vehicle operating system ROS while the signal lamp perception topic is enabled; whether a signal lamp is present, as well as the position, type and color of the signal lamp, is detected according to the test video topic corresponding to the scene to be tested.
The detection result of the unmanned vehicle vision detection performed according to the test video corresponding to the scene to be tested includes the type and color of the signal lamp in each frame image, for example:
The signal lamp is a motor vehicle signal lamp, and the motor vehicle signal lamp is green; or
The signal lamp is a motor vehicle signal lamp, and the motor vehicle signal lamp is yellow; or
The signal lamp is a motor vehicle signal lamp, and the motor vehicle signal lamp is red.
The signal lamp is a lane signal lamp, and the lane signal lamp is a green arrow lamp; or
The signal lamp is a lane signal lamp, and the lane signal lamp is a red fork-shaped lamp or an arrow lamp.
The signal lamp is a direction signal lamp, and the arrow points to the left; or
The signal lamp is a direction signal lamp, and the arrow points upward; or
The signal lamp is a direction signal lamp, and the arrow points to the right.
The detection result can also include the position of the signal lamp in each frame image.
Preferably, the detection result further includes the location information of the unmanned vehicle corresponding to each frame image. The location information of the unmanned vehicle is the geographical location recorded when the video containing the signal lamp was shot in the actual test scene by the shooting device carried on the unmanned vehicle.
Preferably, the detection result of the original video is compared with that of the test video corresponding to the scene to be tested, obtaining the error rate of the unmanned vehicle vision algorithm under the scene to be tested and determining whether the unmanned vehicle can operate normally under that scene. The detection result of the video of the scene to be tested can also be compared with the actual state of the signal lamp.
For example, by comparing information such as the accuracy with which the unmanned vehicle visual detection algorithm detects the signal lamp in corresponding frames under different test scenes, the robustness of the unmanned vehicle visual detection algorithm to the different test scenes can be obtained.
Targeted improvements are then made to the unmanned vehicle vision algorithm according to the robustness test result, enhancing the robustness of the visual detection algorithm.
In the robustness test system for unmanned vehicle vision detection provided by the above embodiment of the application, the test scenes are more targeted and repeatable, which greatly improves the efficiency of robustness testing.
In the several embodiments provided in this application, it should be understood that the disclosed method and apparatus can be realized in other ways. For example, the apparatus embodiments described above are merely exemplary; the division of the units is only a division of logic functions, and other divisions are possible in actual implementation. For example, multiple units or components can be combined or integrated into another system, or some features can be ignored or not performed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed can be an indirect coupling or communication connection through some interfaces, devices or units, and can be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they can be located in one place or distributed over multiple network elements. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, the functional units in the embodiments of the application can be integrated in one processing unit, can each be physically present individually, or two or more units can be integrated in one unit. The integrated unit can be realized in the form of hardware, or in the form of hardware plus software functional units.
Fig. 3 shows a block diagram of an exemplary computer system/server 012 suitable for realizing an embodiment of the present invention. The computer system/server 012 shown in Fig. 3 is only an example and should not bring any restriction on the function and scope of use of the embodiments of the present invention.
As shown in Fig. 3, the computer system/server 012 is embodied in the form of a general-purpose computing device. The components of the computer system/server 012 can include, but are not limited to: one or more processors or processing units 016, a system memory 028, and a bus 018 connecting the different system components (including the system memory 028 and the processing unit 016).
The bus 018 represents one or more of several classes of bus structures, including a memory bus or memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures. For example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus and the Peripheral Component Interconnect (PCI) bus.
The computer system/server 012 typically comprises a variety of computer-system-readable media. These media can be any usable media that can be accessed by the computer system/server 012, including volatile and non-volatile media, and movable and immovable media.
The system memory 028 can include computer-system-readable media in the form of volatile memory, such as a random access memory (RAM) 030 and/or a cache memory 032. The computer system/server 012 may further include other removable/non-removable, volatile/non-volatile computer system storage media. Only as an example, the storage system 034 can be used for reading and writing immovable, non-volatile magnetic media (not shown in Fig. 3, commonly referred to as a "hard disk drive"). Although not shown in Fig. 3, a disk driver for reading and writing a movable non-volatile magnetic disk (such as a "floppy disk") and an optical disk driver for reading and writing a movable non-volatile optical disk (such as a CD-ROM, DVD-ROM or other optical media) can be provided. In these cases, each driver can be connected with the bus 018 through one or more data media interfaces. The memory 028 can include at least one program product having a group of (for example, at least one) program modules configured to perform the functions of the embodiments of the present invention.
A program/utility 040 with a group of (at least one) program modules 042 can be stored in, for example, the memory 028. Such program modules 042 include, but are not limited to, an operating system, one or more application programs, other program modules and program data; each or some combination of these examples may include the realization of a network environment. The program modules 042 usually perform the functions and/or methods in the embodiments described in the present invention.
The computer system/server 012 can also communicate with one or more external devices 014 (such as a keyboard, a pointing device, a display 024, etc.). In the present invention, the computer system/server 012 communicates with an external radar device, and can also communicate with one or more devices that enable a user to interact with the computer system/server 012, and/or with any device (such as a network card, a modem, etc.) that enables the computer system/server 012 to communicate with one or more other computing devices. This communication can be carried out through an input/output (I/O) interface 022. Moreover, the computer system/server 012 can also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN) and/or a public network, such as the Internet) through a network adapter 020. As shown in the figure, the network adapter 020 communicates with the other modules of the computer system/server 012 through the bus 018. It should be understood that, although not shown in Fig. 3, other hardware and/or software modules can be used in combination with the computer system/server 012, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, etc.
The processing unit 016 performs the functions and/or methods in the embodiments described in the present invention by running the programs stored in the system memory 028.
The above computer program can be set in a computer storage medium; that is, the computer storage medium is encoded with a computer program which, when performed by one or more computers, causes the one or more computers to perform the method flows and/or device operations shown in the above embodiments of the present invention.
With the passage of time and the development of technology, the meaning of medium has become more and more extensive, and the transmission route of a computer program is no longer limited to a tangible medium; it can also be downloaded directly from a network, etc. Any combination of one or more computer-readable media may be used. The computer-readable medium can be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium can be, for example, but not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more conducting wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical memory device, a magnetic memory device, or any appropriate combination of the above. In this document, a computer-readable storage medium can be any tangible medium containing or storing a program, and the program can be used by an instruction execution system, apparatus or device, or in connection therewith.
A computer-readable signal medium can include a data signal propagated in a baseband or as a part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take diversified forms, including, but not limited to, an electromagnetic signal, an optical signal or any appropriate combination of the above. A computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium; such a medium can send, propagate or transmit a program for use by an instruction execution system, apparatus or device, or in connection therewith.
The program code contained on a computer-readable medium can be transmitted with any appropriate medium, including, but not limited to, wireless, electric wire, optical cable, RF, etc., or any appropriate combination of the above.
Computer program code for performing the operations of the present invention can be written in one or more programming languages or combinations thereof. The programming languages include object-oriented programming languages such as Java, Smalltalk and C++, and also include conventional procedural programming languages such as the "C" language or similar programming languages. The program code can be performed fully on the user's computer, partly on the user's computer, as an independent software package, partly on the user's computer and partly on a remote computer, or fully on a remote computer or server. In situations involving a remote computer, the remote computer can be connected to the user's computer through a network of any kind, including a local area network (LAN) or a wide area network (WAN), or can be connected to an outer computer (for example, through the Internet using an Internet service provider).
Finally, it should be noted that the above embodiments are only used to illustrate the technical solution of the application, not to limit it. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements of some of the technical features, and these modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the application.

Claims (16)

1. A robustness test method of unmanned vehicle vision detection, characterized by including:
Obtaining the original image containing the detection target collected by the unmanned vehicle;
Using the test parameters corresponding to the scene to be tested, performing abnormality processing on the original image to build the test image corresponding to the scene to be tested;
Using the vision detection results for the detection target in the original image and the test image, obtaining the robustness test result of the unmanned vehicle vision detection under the scene to be tested.
2. The method according to claim 1, characterized in that obtaining the original image containing the detection target collected by the unmanned vehicle includes:
When it is monitored that the unmanned vehicle and the detection target are within a preset distance range, starting the shooting device on the unmanned vehicle to shoot and store the original image containing the detection target.
3. The method according to claim 1, characterized in that the vision detection result includes:
At least one of the type, color and position of the detection target.
4. The method according to claim 1, characterized in that the scene to be tested includes a bad weather scene;
Using the test parameters corresponding to the scene to be tested, performing abnormality processing on the original image to build the test image corresponding to the scene to be tested includes:
Performing noise-adding and brightness processing on the original image according to a meteorological parameter, obtaining the test image corresponding to the bad weather scene.
5. The method according to claim 1, characterized in that the scene to be tested includes a shooting-device position movement scene;
Using the test parameters corresponding to the scene to be tested, performing abnormality processing on the original image to build the test image corresponding to the scene to be tested includes:
According to the location information parameter of the shooting device in the unmanned vehicle, performing moving and scaling processing on the original image, obtaining the test image corresponding to the shooting-device position movement scene.
6. The method according to claim 1, characterized in that the scene to be tested includes a shooting-device angle movement scene;
Using the test parameters corresponding to the scene to be tested, performing abnormality processing on the original image to build the test image corresponding to the scene to be tested includes:
According to the attitude information parameter of the shooting device in the unmanned vehicle, performing moving and rotation processing on the image, obtaining the test image corresponding to the shooting-device angle movement scene.
7. The method according to claim 1, characterized in that using the vision detection results for the detection target in the original image and the test image, obtaining the robustness test result of the unmanned vehicle vision detection under the scene to be tested includes:
Comparing the vision detection results for the detection target in the original image and the test image, obtaining the error rate of the unmanned vehicle vision detection under the scene to be tested.
8. A robustness test system of unmanned vehicle vision detection, characterized by including:
An original image acquisition module, for obtaining the original image containing the detection target collected by the unmanned vehicle;
An image processing module, for using the test parameters corresponding to the scene to be tested to perform abnormality processing on the original image, building the test image corresponding to the scene to be tested;
A vision detection module, for using the vision detection results for the detection target in the original image and the test image to obtain the robustness test result of the unmanned vehicle vision detection under the scene to be tested.
9. The system according to claim 8, characterized in that the original image acquisition module is specifically used for:
When it is monitored that the unmanned vehicle and the detection target are within a preset distance range, starting the shooting device on the unmanned vehicle to shoot and store the original image containing the detection target.
10. The system according to claim 8, characterized in that the vision detection result includes:
At least one of the type, color and position of the detection target.
11. The system according to claim 8, characterized in that the scene to be tested includes a bad weather scene;
The image processing module is specifically used for:
Performing noise-adding and brightness processing on the original image according to a meteorological parameter, obtaining the test image corresponding to the bad weather scene.
12. The system according to claim 8, characterized in that the scene to be tested includes a shooting-device position movement scene;
The image processing module is specifically used for:
According to the location information parameter of the shooting device in the unmanned vehicle, performing moving and scaling processing on the original image, obtaining the test image corresponding to the shooting-device position movement scene.
13. The system according to claim 8, characterized in that the scene to be tested includes a shooting-device angle movement scene;
The image processing module is specifically used for:
According to the attitude information parameter of the shooting device in the unmanned vehicle, performing moving and rotation processing on the image, obtaining the test image corresponding to the shooting-device angle movement scene.
14. The system according to claim 8, characterized in that the vision detection module is specifically used for:
Comparing the vision detection results for the detection target in the original image and the test image, obtaining the error rate of the unmanned vehicle vision detection under the scene to be tested.
15. A device, characterized in that the device includes:
One or more processors;
A storage apparatus, for storing one or more programs,
When the one or more programs are performed by the one or more processors, the one or more processors realize the method as described in any of claims 1-7.
16. A computer-readable storage medium on which a computer program is stored, characterized in that, when the program is performed by a processor, the method as described in any of claims 1-7 is realized.
CN201711248427.3A 2017-12-01 2017-12-01 The robustness test method and system of a kind of unmanned vehicle vision-based detection Pending CN108268831A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711248427.3A CN108268831A (en) 2017-12-01 2017-12-01 The robustness test method and system of a kind of unmanned vehicle vision-based detection


Publications (1)

Publication Number Publication Date
CN108268831A true CN108268831A (en) 2018-07-10

Family

ID=62771819

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711248427.3A Pending CN108268831A (en) 2017-12-01 2017-12-01 The robustness test method and system of a kind of unmanned vehicle vision-based detection

Country Status (1)

Country Link
CN (1) CN108268831A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105954048A (en) * 2016-07-07 2016-09-21 百度在线网络技术(北京)有限公司 Method and device for testing normal driving of an unmanned vehicle
CN106096192A (en) * 2016-06-27 2016-11-09 百度在线网络技术(北京)有限公司 Method and device for constructing test scenes for an autonomous vehicle
CN107330105A (en) * 2017-07-07 2017-11-07 上海木爷机器人技术有限公司 Robustness evaluation method and device for a similar-image retrieval algorithm

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112200217A (en) * 2020-09-09 2021-01-08 天津津航技术物理研究所 Identification algorithm evaluation method and system based on infrared image big data
CN112200217B (en) * 2020-09-09 2023-06-09 天津津航技术物理研究所 Identification algorithm evaluation method and system based on infrared image big data
WO2022068468A1 (en) * 2020-09-29 2022-04-07 京东科技信息技术有限公司 Robot testing site and robot testing method
CN112614372A (en) * 2020-12-24 2021-04-06 奇瑞汽车股份有限公司 Method and device for vehicle to safely pass through crossroad
CN112699765A (en) * 2020-12-25 2021-04-23 北京百度网讯科技有限公司 Method and device for evaluating visual positioning algorithm, electronic equipment and storage medium
CN112995656A (en) * 2021-03-04 2021-06-18 黑芝麻智能科技(上海)有限公司 Anomaly detection method and system for image processing circuit
CN115222610A (en) * 2022-03-11 2022-10-21 广州汽车集团股份有限公司 Image method, image device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN108268831A (en) Robustness test method and system for unmanned vehicle vision-based detection
CN107153363B (en) Simulation test method, device, equipment and readable medium for an unmanned vehicle
CN108318043A (en) Method, apparatus and computer-readable storage medium for updating an electronic map
CN109931945B (en) AR navigation method, device, equipment and storage medium
CN109345510A (en) Object detection method, device, equipment, storage medium and vehicle
CN109343061A (en) Sensor calibration method, device, computer equipment, medium and vehicle
CN109345596A (en) Multi-sensor calibration method, device, computer equipment, medium and vehicle
CN109101861A (en) Obstacle identity recognition method, device, equipment and storage medium
CN109271944A (en) Obstacle detection method, device, electronic equipment, vehicle and storage medium
CN111856963B (en) Parking simulation method and device based on a vehicle-mounted surround-view system
CN109598066A (en) Effect evaluation method, device, equipment and storage medium for a prediction module
CN109145680A (en) Method, apparatus, equipment and computer storage medium for obtaining obstacle information
CN109116374A (en) Method, apparatus, equipment and storage medium for determining obstacle distance
CN107450088A (en) Augmented reality positioning method and device for location-based services (LBS)
CN109032103A (en) Test method, device, equipment and storage medium for an autonomous vehicle
CN109211575A (en) Unmanned vehicle and field test method, apparatus and readable medium therefor
CN108230437A (en) Scene reconstruction method and device, electronic equipment, program and medium
CN109344804A (en) Laser point cloud data recognition method, device, equipment and medium
CN108399752A (en) Driving violation prediction method, device, server and medium
CN106023622B (en) Method and apparatus for determining the recognition performance of a traffic light recognition system
CN109435955A (en) Performance evaluation method, device, equipment and storage medium for an autonomous driving system
CN113535569B (en) Control effect determination method for automatic driving
CN109961522A (en) Image projection method, device, equipment and storage medium
CN108876857A (en) Positioning method, system, equipment and storage medium for an autonomous vehicle
CN110097121A (en) Driving trajectory classification method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20180710)