CN112287806A - Road information detection method, system, electronic equipment and storage medium - Google Patents
- Publication number
- CN112287806A CN112287806A CN202011160991.1A CN202011160991A CN112287806A CN 112287806 A CN112287806 A CN 112287806A CN 202011160991 A CN202011160991 A CN 202011160991A CN 112287806 A CN112287806 A CN 112287806A
- Authority
- CN
- China
- Prior art keywords
- vehicle
- information
- special event
- license plate
- plate number
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/54—Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/62—Text, e.g. of license plates, overlay texts or captions on TV images
- G06V20/63—Scene text, e.g. street names
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/62—Text, e.g. of license plates, overlay texts or captions on TV images
- G06V20/625—License plates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
Abstract
The application discloses a road information detection method and system, an electronic device, and a storage medium, relating to the technical fields of automatic driving and intelligent transportation. The implementation scheme is as follows: environmental information of the area where an autonomous vehicle is located is determined; whether a special event exists in that area is detected according to the environmental information, where the special event comprises a vehicle violation and/or a road fault; and, in the presence of a special event, a picture reflecting the special event is taken in response to the relative position of the autonomous vehicle and the position where the special event occurs satisfying a predetermined condition. The method and system can detect road information accurately and efficiently, solving problems of the traditional technology such as low utilization of road traffic data, slow perception and localization of traffic events, and susceptibility of detection devices to occlusion.
Description
Technical Field
The application relates to the field of image processing, in particular to the technical field of automatic driving and intelligent traffic.
Background
Road information detection covers tasks such as vehicle violation identification and road fault identification, and is a necessary step in urban traffic control. Traditional road information detection is generally realized by detection devices installed along the road, and commonly suffers from problems such as low utilization of road traffic data, slow perception and localization of traffic events, and detection devices that are easily occluded.
Disclosure of Invention
The application provides a road information detection method, a road information detection system, an electronic device, and a storage medium.
According to an aspect of the present application, there is provided a road information detection method including:
determining environmental information of an area where an autonomous vehicle is located;
detecting, according to the environmental information, whether a special event exists in the area where the autonomous vehicle is located, wherein the special event comprises a vehicle violation and/or a road fault;
in the presence of a special event, taking a picture reflecting the special event in response to the relative position of the autonomous vehicle and the position where the special event occurs satisfying a predetermined condition.
According to another aspect of the present application, there is provided a road information detecting system including:
a perception module for determining environmental information of an area where an autonomous vehicle is located;
a detection module for detecting, according to the environmental information, whether a special event exists in the area where the autonomous vehicle is located, wherein the special event comprises a vehicle violation and/or a road fault;
and a picture forensics module for, in the presence of a special event, taking a picture reflecting the special event in response to the relative position of the autonomous vehicle and the position where the special event occurs satisfying a predetermined condition.
According to another aspect of the present application, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method described above.
According to another aspect of the present application, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the above-described method.
According to the road information detection method and system of the application, the autonomous vehicle perceives its surrounding environment, detects from the environmental information special events appearing on the road such as vehicle violations or road faults, and takes pictures reflecting those events, thereby realizing road information detection. By exploiting the perception capability and mobility of the autonomous vehicle, road information can be detected accurately and efficiently, solving the traditional technology's problems of low utilization of road traffic data, slow perception and localization of traffic events, susceptibility to occlusion, and the like.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1 is a flowchart of a road information detection method according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating another implementation of a road information detection method according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a road information detection system according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of another road information detection system according to an embodiment of the present application;
fig. 5 is a block diagram of an electronic device for implementing a road information detection method according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the application. Descriptions of well-known functions and constructions are likewise omitted for clarity and conciseness.
An embodiment of the present application provides a method for detecting road information, and fig. 1 is a flowchart illustrating an implementation of the method for detecting road information according to the embodiment of the present application, where the method includes:
step S101: determining environmental information of an area where an autonomous vehicle is located;
step S102: detecting, according to the environmental information, whether a special event exists in the area where the autonomous vehicle is located, wherein the special event comprises a vehicle violation and/or a road fault;
step S103: in the presence of the special event, taking a picture reflecting the special event in response to the relative position of the autonomous vehicle and the position where the special event occurs satisfying a predetermined condition.
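The three-step flow above can be sketched as a minimal loop. The event categories, field names, and lane-type rules below are illustrative assumptions for the sketch, not the patent's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical event kinds; the patent names vehicle violations and road faults.
VIOLATION = "vehicle_violation"
ROAD_FAULT = "road_fault"

@dataclass
class SpecialEvent:
    kind: str        # VIOLATION or ROAD_FAULT
    position: tuple  # (x, y) of the event in the vehicle frame, metres

def detect_special_event(env_info: dict) -> Optional[SpecialEvent]:
    """Step S102: inspect environment info for a violation or road fault."""
    for obstacle in env_info.get("obstacles", []):
        # Assumed rule: a stopped vehicle in a bus lane, bicycle lane,
        # or sidewalk counts as a vehicle violation.
        if (obstacle["category"] == "vehicle" and obstacle["speed"] == 0
                and obstacle["lane_type"] in ("bus_lane", "bicycle_lane", "sidewalk")):
            return SpecialEvent(VIOLATION, obstacle["position"])
        if obstacle["category"] in ("road_damage", "construction"):
            return SpecialEvent(ROAD_FAULT, obstacle["position"])
    return None

def run_detection_cycle(env_info: dict, relative_ok) -> Optional[str]:
    """Steps S101-S103: given environment info, detect an event and, if
    the relative position satisfies the predetermined condition (modelled
    by the callable `relative_ok`), 'take' a picture (a label here)."""
    event = detect_special_event(env_info)
    if event is not None and relative_ok(event.position):
        return f"photo_of_{event.kind}"
    return None
```

The `relative_ok` callable stands in for the distance/angle check of step S103, so the skeleton stays independent of any particular threshold choice.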
Optionally, the area in which the autonomous vehicle is located may refer to the area within the sensing range of the autonomous vehicle's sensors.
Optionally, determining the environmental information of the area where the autonomous vehicle is located in step S101 includes: acquiring high-precision map information and sensor data of the autonomous vehicle, and determining the environmental information of the area where the autonomous vehicle is located using the high-precision map information and the sensor data.
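As a sketch of this fusion, step S101 might annotate each sensor detection with semantics looked up from the map. The toy map keyed by rounded coordinates is a stand-in for a real high-precision map query, and the field names are assumptions:

```python
def determine_environment(hd_map: dict, sensor_data: dict) -> dict:
    """Sketch of step S101: fuse HD-map semantics with live sensor
    detections. `lane_at` maps rounded (x, y) to a lane type."""
    obstacles = []
    for det in sensor_data.get("detections", []):
        x, y = det["position"]
        # Attach the lane type from the HD map at the detection's position;
        # default to an ordinary driving lane when the map has no entry.
        lane_type = hd_map.get("lane_at", {}).get((round(x), round(y)),
                                                  "driving_lane")
        obstacles.append({**det, "lane_type": lane_type})
    return {"obstacles": obstacles}
```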
Optionally, a vehicle violation may include a parking violation or the like, such as a motor vehicle illegally parked on a sidewalk, in a non-motorized vehicle lane, or in a bus lane. A road fault may include conditions such as road construction, road damage, a traffic signal fault, or a traffic guideboard fault.
Optionally, as shown in fig. 2, after the step S103, the method further includes:
step S204: in the case where the special event comprises a vehicle violation, performing license plate number recognition on the picture reflecting the special event to obtain the license plate number information of the violating vehicle.
Specifically, the picture reflecting the special event may include an image of the violating vehicle, and may be captured by a camera provided on the body of the autonomous vehicle.
Optionally, the relative position of the autonomous vehicle and the position where the special event occurs satisfying the predetermined condition in step S103 may include: the distance between the autonomous vehicle and the position where the special event occurs (e.g., the position of the violating vehicle or of the road fault) satisfying a predetermined condition, and/or the angle of the autonomous vehicle relative to that position satisfying a predetermined condition. When the predetermined condition is satisfied, the camera of the autonomous vehicle can take a clear picture reflecting the special event.
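The distance-and-angle condition can be illustrated as follows. The 15 m distance and 45° field-of-view thresholds are assumed values for the sketch, since the patent does not fix the predetermined condition:

```python
import math

def relative_position_ok(ego_xy, ego_heading_rad, event_xy,
                         max_dist_m=15.0, max_angle_rad=math.radians(45)):
    """Illustrative check of the 'predetermined condition' in step S103:
    the event must lie within a maximum distance of the ego vehicle and
    within a maximum angle off the vehicle's heading (so the camera can
    frame it clearly). Thresholds are assumptions, not patent values."""
    dx = event_xy[0] - ego_xy[0]
    dy = event_xy[1] - ego_xy[1]
    dist = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    # Smallest signed difference between bearing and heading, in [-pi, pi].
    off_axis = abs((bearing - ego_heading_rad + math.pi) % (2 * math.pi) - math.pi)
    return dist <= max_dist_m and off_axis <= max_angle_rad
```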
Because the autonomous vehicle may carry several cameras, information from the multiple cameras can be fused, screened, and so on, in order to find the most suitable picture for license plate number recognition.
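Selecting the most suitable frame from several cameras might, for illustration, score each candidate by the size and sharpness of the visible plate region; this scoring rule and the frame fields are assumptions, not the patent's fusion method:

```python
def best_plate_frame(frames):
    """Pick the most suitable frame for plate recognition from several
    cameras. Each frame dict is assumed to carry the plate's pixel area
    and a sharpness estimate in [0, 1]."""
    def score(frame):
        # Larger and sharper plate regions recognise more reliably.
        return frame["plate_area_px"] * frame["sharpness"]
    return max(frames, key=score)
```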
In one implementation, performing license plate number recognition on the picture reflecting the special event includes: invoking a license plate number recognition system, which performs the recognition on the picture. Specifically, a cloud platform can receive the pictures reflecting special events taken by the autonomous vehicle, invoke the license plate number recognition system to recognize the license plate numbers in the pictures, and receive back the license plate number information of the recognized violating vehicles.
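The picture-to-plate round trip through the cloud platform can be sketched with an in-process callable standing in for the networked license plate recognition system; the real interface is not specified in the patent, so all names here are assumptions:

```python
class CloudPlatform:
    """Minimal sketch of this paragraph's flow: the platform receives a
    picture from the vehicle, forwards it to a plate-recognition service
    through an interface, and stores the returned result."""

    def __init__(self, lpr_service):
        # `lpr_service`: any callable mapping a picture to a plate string,
        # standing in for the real license plate number recognition system.
        self.lpr_service = lpr_service
        self.results = []

    def receive_picture(self, picture):
        plate = self.lpr_service(picture)  # invoke the LPR interface
        self.results.append({"picture": picture, "plate": plate})
        return plate
```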
As shown in fig. 2, after the step S204, the method may further include:
step S205: performing data aggregation on the license plate number information of a plurality of violating vehicles, and displaying the aggregated violation information;
step S206: correcting the aggregated violation information according to a correction instruction directed at the aggregated violation information.
In some embodiments, the cloud platform can aggregate and display the license plate number information of violating vehicles as a preliminary inspection result. Related staff then perform operations such as online confirmation and license plate correction on the preliminary inspection result, and input a correction instruction for it to the cloud platform. After receiving the correction instruction, the cloud platform corrects the preliminary inspection result accordingly to obtain a final result, which may then be passed to a related system. In this way, the application can intelligently detect the behavior of road vehicles, the road environment, non-motorized vehicles, and pedestrians, and further realize recognition and reporting of special events as well as tracking, recognition, and reporting of key vehicles.
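Steps S205 and S206 amount to aggregating repeated sightings of the same plate and then applying a staff correction. The record and instruction field names below are assumptions for the sketch:

```python
from collections import Counter

def aggregate_plates(records):
    """Step S205 sketch: collapse repeated sightings so each violating
    vehicle appears once, mapped to its sighting count."""
    return dict(Counter(r["plate"] for r in records))

def apply_correction(aggregated, correction):
    """Step S206 sketch: a staff correction instruction renames a
    misrecognised plate, merging counts if the corrected plate exists."""
    result = dict(aggregated)
    old, new = correction["old_plate"], correction["new_plate"]
    if old in result:
        result[new] = result.get(new, 0) + result.pop(old)
    return result
```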
Fig. 3 is a schematic structural diagram of a road information detection system according to an embodiment of the present application, including:
a perception module 310 for determining environmental information of an area in which the autonomous vehicle is located;
a detection module 320 for detecting, according to the environmental information, whether a special event exists in the area where the autonomous vehicle is located, wherein the special event comprises a vehicle violation and/or a road fault;
and a picture forensics module 330 for, in the presence of a special event, taking a picture reflecting the special event in response to the relative position of the autonomous vehicle and the position where the special event occurs satisfying a predetermined condition.
In some embodiments, the perception module 310 takes data from the various sensors and high-precision map information as input and, after a series of computations and processing, accurately perceives the surrounding environment of the autonomous vehicle. The perception module 310 provides rich information to downstream modules, including the position, shape, category, and speed of obstacles, as well as semantic understanding of some special scenes (e.g., construction areas, traffic lights, traffic signboards).
Optionally, as shown in fig. 4, the sensing module 310 includes:
an acquisition unit 311 for acquiring high-precision map information and sensor data of an autonomous vehicle;
a determining unit 312 for determining environmental information of an area where the autonomous vehicle is located using the high-precision map information and the sensor data.
In some embodiments, the detection module 320 can accurately determine violating vehicles and road faults based on the data output by the perception module 310, for example vehicles illegally parked at the roadside, such as vehicles illegally occupying bus lanes, non-motorized vehicle lanes, or sidewalks.
In some embodiments, when the distance and angle between the autonomous vehicle and the violating vehicle or road fault (e.g., an illegally parked vehicle) are appropriate, the picture forensics module 330 takes an evidentiary picture through the camera of the unmanned vehicle. Since the autonomous vehicle generally has several cameras, data from the multiple cameras can be fused to find the most suitable picture as evidence.
Optionally, as shown in fig. 4, the system may further include:
and a cloud platform 440 for, in the case where the special event comprises a vehicle violation, performing license plate number recognition on the picture reflecting the special event to obtain the license plate number information of the violating vehicle.
Optionally, the cloud platform 440 is configured to invoke a license plate number recognition system, and receive license plate number information of the violation vehicle returned by the license plate number recognition system.
The license plate number recognition system relies on image processing technology and massive high-quality data; it can support recognition of various common types of license plate information and achieves high recognition accuracy. It performs license plate number recognition on the pictures sent by the cloud platform 440 to obtain the license plate number information of the violating vehicles and feeds that information back to the cloud platform 440, which displays it.
In some embodiments, the cloud platform 440 may display the preliminary inspection results of the autonomous vehicle, perform online confirmation of conditions such as violation recognition and event recognition in the inspection process, correct license plates, and output the confirmed result data to a related system after approval through a manual process. The embodiment of the application can also be combined with road-side detectors to intelligently detect the behavior of road vehicles, the road environment, non-motorized vehicles, and pedestrians, thereby realizing special-event recognition and reporting as well as key-vehicle tracking, recognition, and reporting.
Optionally, the perception module 310, the detection module 320, and the picture forensics module 330 may be disposed in the autonomous vehicle, and the cloud platform 440 may be a cloud server communicatively connected with a plurality of autonomous vehicles. Optionally, an interface exists between the cloud platform 440 and the license plate number recognition system, and the cloud platform 440 can call the services of the license plate number recognition system through that interface.
Optionally, as shown in fig. 4, the cloud platform 440 includes:
a display unit 441 for performing data aggregation on the license plate number information of a plurality of violating vehicles and displaying the aggregated violation information;
a correcting unit 442 for correcting the aggregated violation information according to a correction instruction directed at the aggregated violation information.
The functions of the modules in the system according to the embodiment of the present application may refer to the corresponding descriptions in the above method, and are not described herein again.
As can be seen from the foregoing embodiments, the road information detection method and system provided in the embodiments of the present application have at least the following advantages:
First, stronger event-recognition capability, effectively assisting police work. Specifically: the multi-sensor fusion perception of the autonomous vehicle avoids the shortcomings of traditional perception (e.g., easy occlusion, short range, low dimensionality) and ensures accurate perception of the surrounding environment. The comprehensive and accurate geographic information provided by the high-precision map helps judge events more accurately. Moreover, the autonomous vehicle and the cloud platform have strong computing power, which can further improve the accuracy of event judgment.
Second, the mobility of the autonomous vehicle supplements the shortage of stationary detection devices. In particular, the autonomous vehicle is mobile and flexibly maneuverable, is convenient to maintain, and has a complete vehicle-body maintenance and repair scheme, so it can make up for the limited detection range of fixed equipment.
Third, the autonomous vehicle has growth plasticity and can expand its capability boundary as needed. As road information detection requirements are continuously updated or increased, the autonomous vehicle can improve its detection capability accordingly to meet different requirements.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
As shown in fig. 5, it is a block diagram of an electronic device of a road information detection method according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 5, the electronic device includes: one or more processors 501, a memory 502, and interfaces for connecting the various components, including high-speed and low-speed interfaces. The components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to an interface). In other embodiments, multiple processors and/or multiple buses may be used, as desired, along with multiple memories. Also, multiple electronic devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 5, one processor 501 is taken as an example.
The memory 502, which is a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules (e.g., the perception module 310, the detection module 320, and the picture forensics module 330 shown in fig. 3) corresponding to the road information detection method in the embodiment of the present application. The processor 501 executes various functional applications of the server and data processing, i.e., implements the road information detection method in the above-described method embodiments, by running non-transitory software programs, instructions, and modules stored in the memory 502.
The memory 502 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the road information detection electronic device, and the like. Further, the memory 502 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 502 may optionally include memories located remotely from the processor 501, and these remote memories may be connected to the road information detection electronic device through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the road information detection method may further include: an input device 503 and an output device 504. The processor 501, the memory 502, the input device 503 and the output device 504 may be connected by a bus or other means, and fig. 5 illustrates the connection by a bus as an example.
The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic equipment for road information detection, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, and the like. The output devices 504 may include a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server (also called a cloud computing server or cloud host), a host product in a cloud computing service system that addresses the drawbacks of difficult management and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services.
It should be understood that the flows shown above may be used in various forms, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved; no limitation is imposed herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (12)
1. A road information detection method, comprising:
determining environmental information of an area where an autonomous vehicle is located;
detecting, according to the environmental information, whether a special event exists in the area where the autonomous vehicle is located, wherein the special event comprises a vehicle violation and/or a road fault; and
in the presence of the special event, taking a picture reflecting the special event in response to a relative position between the autonomous vehicle and an occurrence position of the special event satisfying a predetermined condition.
2. The method of claim 1, further comprising:
in a case where the special event comprises a vehicle violation, performing license plate number recognition on the picture reflecting the special event to obtain license plate number information of the violating vehicle.
3. The method of claim 2, wherein the identifying of the license plate number of the picture reflecting the special event comprises:
calling a license plate number recognition system, and performing, by the license plate number recognition system, license plate number recognition on the picture reflecting the special event.
4. The method of any of claims 1 to 3, wherein the determining environmental information of the area in which the autonomous vehicle is located comprises:
acquiring high-precision map information and sensor data of the autonomous vehicle;
determining the environmental information of the area where the autonomous vehicle is located by using the high-precision map information and the sensor data.
5. The method of claim 2 or 3, further comprising:
performing data aggregation on license plate number information of violating vehicles, and displaying the aggregated violating-vehicle information; and
correcting the aggregated violating-vehicle information according to a correction instruction directed to the aggregated violating-vehicle information.
6. A road information detection system, comprising:
a perception module configured to determine environmental information of an area where an autonomous vehicle is located;
a detection module configured to detect, according to the environmental information, whether a special event exists in the area where the autonomous vehicle is located, wherein the special event comprises a vehicle violation and/or a road fault; and
a picture evidence obtaining module configured to, in the presence of the special event, take a picture reflecting the special event in response to a relative position between the autonomous vehicle and an occurrence position of the special event satisfying a predetermined condition.
7. The system of claim 6, further comprising:
a cloud platform configured to, in a case where the special event comprises a vehicle violation, perform license plate number recognition on the picture reflecting the special event to obtain license plate number information of the violating vehicle.
8. The system of claim 7, wherein the cloud platform is configured to invoke a license plate number recognition system and to receive license plate number information of the violating vehicle returned by the license plate number recognition system.
9. The system of any one of claims 6 to 8, wherein the perception module comprises:
an acquisition unit configured to acquire high-precision map information and sensor data of the autonomous vehicle; and
a determining unit configured to determine the environmental information of the area where the autonomous vehicle is located by using the high-precision map information and the sensor data.
10. The system of claim 7 or 8, wherein the cloud platform comprises:
a display unit configured to perform data aggregation on license plate number information of violating vehicles and to display the aggregated violating-vehicle information; and
a correction unit configured to correct the aggregated violating-vehicle information according to a correction instruction directed to the aggregated violating-vehicle information.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
12. A non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-5.
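The method of claims 1-5 can be illustrated with a minimal sketch. All names below (`SpecialEvent`, `patrol_step`, the 20 m distance threshold standing in for the claimed "predetermined condition", and the stubbed camera and OCR callables) are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass


@dataclass
class SpecialEvent:
    kind: str        # "vehicle_violation" or "road_fault"
    position: tuple  # (x, y) occurrence position on the road


def determine_environment(hd_map, sensor_data):
    # Claim 4: environmental information is determined from high-precision
    # map information together with the vehicle's sensor data.
    return {"map": hd_map, "sensors": sensor_data}


def detect_special_events(env):
    # Claim 1: detect whether special events (vehicle violations and/or
    # road faults) exist; the concrete detector is not specified by the
    # claims, so sensor-reported events stand in for it here.
    return env["sensors"].get("events", [])


def within_capture_range(vehicle_pos, event_pos, max_dist=20.0):
    # Claim 1's "predetermined condition" on the relative position,
    # illustrated as a simple Euclidean distance threshold.
    dx = vehicle_pos[0] - event_pos[0]
    dy = vehicle_pos[1] - event_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= max_dist


def patrol_step(vehicle_pos, hd_map, sensor_data, camera, plate_ocr):
    # One pass of the claimed method: determine the environment, detect
    # events, photograph events in range (claim 1), recognize license
    # plates for violations (claims 2-3), and aggregate plate numbers
    # for display and correction (claim 5).
    env = determine_environment(hd_map, sensor_data)
    aggregated = {}
    for event in detect_special_events(env):
        if within_capture_range(vehicle_pos, event.position):
            picture = camera(event)            # take a picture of the event
            if event.kind == "vehicle_violation":
                plate = plate_ocr(picture)     # license plate recognition
                aggregated[plate] = aggregated.get(plate, 0) + 1
    return aggregated
```

For example, with a stubbed camera and OCR, a violation 5 m from the vehicle (inside the assumed 20 m threshold) produces one aggregated plate record, while a road fault 141 m away is not photographed.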
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011160991.1A | 2020-10-27 | 2020-10-27 | Road information detection method, system, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112287806A (en) | 2021-01-29 |
Family
ID=74373063
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011160991.1A (Withdrawn) | Road information detection method, system, electronic equipment and storage medium | 2020-10-27 | 2020-10-27 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112287806A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113176097A (en) * | 2021-03-15 | 2021-07-27 | 北京汽车研究总院有限公司 | Detection method of perception module, computer readable storage medium and vehicle |
CN113741485A (en) * | 2021-06-23 | 2021-12-03 | 阿波罗智联(北京)科技有限公司 | Control method and device for cooperative automatic driving of vehicle and road, electronic equipment and vehicle |
CN114332731A (en) * | 2021-12-24 | 2022-04-12 | 阿波罗智联(北京)科技有限公司 | City event identification method and device, automatic driving vehicle and cloud server |
CN114596704A (en) * | 2022-03-14 | 2022-06-07 | 阿波罗智联(北京)科技有限公司 | Traffic event processing method, device, equipment and storage medium |
CN114596704B (en) * | 2022-03-14 | 2023-06-20 | 阿波罗智联(北京)科技有限公司 | Traffic event processing method, device, equipment and storage medium |
CN114783188A (en) * | 2022-05-17 | 2022-07-22 | 阿波罗智联(北京)科技有限公司 | Inspection method and device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040252193A1 (en) * | 2003-06-12 | 2004-12-16 | Higgins Bruce E. | Automated traffic violation monitoring and reporting system with combined video and still-image data |
CN107870983A (en) * | 2017-09-30 | 2018-04-03 | 深圳市易成自动驾驶技术有限公司 | Vehicle peccancy approaches to IM, block chain and storage medium based on block chain |
JP6605176B1 (en) * | 2018-07-17 | 2019-11-13 | 三菱電機株式会社 | Traffic information generation system |
CN110782657A (en) * | 2018-07-31 | 2020-02-11 | 百度(美国)有限责任公司 | Police cruiser using a subsystem of an autonomous vehicle |
CN111583630A (en) * | 2020-04-10 | 2020-08-25 | 河北德冠隆电子科技有限公司 | Brand-new road high-precision map rapid generation system and method based on space-time trajectory reconstruction |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112287806A (en) | Road information detection method, system, electronic equipment and storage medium | |
US20210335127A1 (en) | Traffic monitoring method, apparatus, device and storage medium | |
EP3933345A2 (en) | Road event detection method, apparatus, device and storage medium | |
CN111784837B (en) | High-precision map generation method, apparatus, device, storage medium, and program product | |
JP2021102437A (en) | Road-vehicle cooperative information processing method, device, apparatus, automatically-operated vehicle, and program | |
CN111292531B (en) | Tracking method, device and equipment of traffic signal lamp and storage medium | |
CN111739344A (en) | Early warning method and device and electronic equipment | |
CN111998860A (en) | Automatic driving positioning data verification method and device, electronic equipment and storage medium | |
CN111854771A (en) | Map quality detection processing method and device, electronic equipment and storage medium | |
CN110689747B (en) | Control method and device of automatic driving vehicle and automatic driving vehicle | |
CN110910665A (en) | Signal lamp control method and device and computer equipment | |
CN111536984A (en) | Positioning method and device, vehicle-end equipment, vehicle, electronic equipment and positioning system | |
CN111652112B (en) | Lane flow direction identification method and device, electronic equipment and storage medium | |
JP2021179964A (en) | Monitoring method for image acquisition equipment, device, electronic equipment, storage medium, and program | |
CN112101223A (en) | Detection method, device, equipment and computer storage medium | |
CN111703371B (en) | Traffic information display method and device, electronic equipment and storage medium | |
CN111666876A (en) | Method and device for detecting obstacle, electronic equipment and road side equipment | |
CN111523471A (en) | Method, device and equipment for determining lane where vehicle is located and storage medium | |
CN113011298A (en) | Truncated object sample generation method, target detection method, road side equipment and cloud control platform | |
CN110866504A (en) | Method, device and equipment for acquiring marked data | |
CN111640301B (en) | Fault vehicle detection method and fault vehicle detection system comprising road side unit | |
CN111540010B (en) | Road monitoring method and device, electronic equipment and storage medium | |
CN110458815A (en) | There is the method and device of mist scene detection | |
CN113091737A (en) | Vehicle-road cooperative positioning method and device, automatic driving vehicle and road side equipment | |
CN110865421A (en) | Business model training method, obstacle detection method and device and electronic equipment |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | Application publication date: 20210129 |
| SE01 | Entry into force of request for substantive examination | |
20211015 | TA01 | Transfer of patent application right | Applicant after: Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. (100176 Room 101, 1st Floor, Building 1, Yard 7, Ruihe West 2nd Road, Economic and Technological Development Zone, Daxing District, Beijing). Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd. (2/F, Baidu Building, 10 Shangdi 10th Street, Haidian District, Beijing 100085). |
| WW01 | Invention patent application withdrawn after publication | Application publication date: 20210129 |