CN114783188A - Inspection method and device - Google Patents

Inspection method and device

Info

Publication number
CN114783188A
CN114783188A · CN202210540370.9A
Authority
CN
China
Prior art keywords
data
inspection
feature
vehicle
patrol
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210540370.9A
Other languages
Chinese (zh)
Inventor
夏娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Apollo Zhilian Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apollo Zhilian Beijing Technology Co Ltd filed Critical Apollo Zhilian Beijing Technology Co Ltd
Priority to CN202210540370.9A priority Critical patent/CN114783188A/en
Publication of CN114783188A publication Critical patent/CN114783188A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
        • G08 — SIGNALLING
            • G08G — TRAFFIC CONTROL SYSTEMS
                • G08G 1/00 — Traffic control systems for road vehicles
                    • G08G 1/01 — Detecting movement of traffic to be counted or controlled
                        • G08G 1/0104 — Measuring and analyzing of parameters relative to traffic conditions
                            • G08G 1/0108 — based on the source of data
                                • G08G 1/0116 — from roadside infrastructure, e.g. beacons
                            • G08G 1/0125 — Traffic data processing
                                • G08G 1/0133 — for classifying traffic situation
                            • G08G 1/0137 — for specific applications
                        • G08G 1/017 — identifying vehicles
                            • G08G 1/0175 — by photographing vehicles, e.g. when violating traffic rules
                        • G08G 1/052 — with provision for determining speed or overspeed
                            • G08G 1/054 — photographing overspeeding vehicles
    • H — ELECTRICITY
        • H04 — ELECTRIC COMMUNICATION TECHNIQUE
            • H04L — TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
                • H04L 67/00 — Network arrangements or protocols for supporting network services or applications
                    • H04L 67/01 — Protocols
                        • H04L 67/12 — Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The disclosure provides an inspection method and an inspection device, relating to the technical field of artificial intelligence and in particular to intelligent transportation, automatic driving, deep learning, big data, cloud computing, computer vision and image processing. A specific implementation comprises the following steps: acquiring inspection data of an inspection site; performing scene feature extraction on the inspection data to obtain scene feature data of the inspection data; determining the feature category of the scene feature data; and determining the inspection result of the inspection data based on the scene feature data and the feature category. In this way, the inspection coverage can be expanded, inspection efficiency improved, and the inspection false-detection rate reduced.

Description

Inspection method and device
Technical Field
The disclosure relates to the technical field of artificial intelligence, in particular to intelligent transportation, automatic driving, deep learning, big data, cloud computing, computer vision and image processing, and specifically to an inspection method and an inspection device.
Background
Inspection, i.e., tour inspection, is a check performed on public facilities, traffic violation conditions, product manufacturing processes, and the like, with the aim of discovering in time problems that may arise or already exist. In the prior art, inspection is usually performed manually, or an inspection result is generated automatically for an inspection task in a single specific scenario.
However, such inspection methods usually limit the inspection coverage.
Disclosure of Invention
Provided are an inspection method, an inspection device, an electronic device, and a storage medium.
According to a first aspect, there is provided an inspection method comprising: acquiring inspection data of an inspection site; performing scene feature extraction on the inspection data to obtain scene feature data of the inspection data; determining the feature category of the scene feature data; and determining an inspection result of the inspection data based on the scene feature data and the feature category.
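The four claimed steps can be sketched as a simple pipeline. This is an illustrative sketch only, not code from the patent; every function body and name below is an invented stand-in.

```python
# Illustrative sketch (not from the patent) of the four claimed steps;
# all function bodies are simplified, hypothetical stand-ins.

def acquire_inspection_data(site):
    # Step 1: acquire inspection data of the inspection site.
    return {"site": site, "frame": "image-bytes"}

def extract_scene_features(data):
    # Step 2: scene feature extraction (e.g. brightness/texture/color).
    return {"has_guardrail": True}

def classify_features(features):
    # Step 3: determine the feature category of the scene feature data.
    return "guardrail_damage" if features.get("has_guardrail") else "unknown"

def inspect(features, category):
    # Step 4: determine the inspection result from features and category.
    return {"category": category, "violation": False}

def inspection_pipeline(site):
    data = acquire_inspection_data(site)
    features = extract_scene_features(data)
    category = classify_features(features)
    return inspect(features, category)

result = inspection_pipeline("highway-section-7")
```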
In some embodiments, determining the feature category of the scene feature data includes: determining the feature category of the scene feature data from a predetermined feature category set, wherein the feature categories in the set correspond one-to-one to inspection models in a pre-trained inspection model set, and the inspection models are used for generating inspection results of inspection data. Determining the inspection result of the inspection data based on the scene feature data and the feature category then includes: determining, from the inspection model set, a target inspection model corresponding to the determined feature category; and generating the inspection result of the inspection data based on the target inspection model.
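The one-to-one mapping between feature categories and pre-trained inspection models can be pictured as a dictionary lookup. A hedged sketch, with plain callables standing in for the models; the category names and thresholds are invented:

```python
# Hypothetical sketch of the category-to-model mapping; each "model"
# is a plain callable rather than a trained network.

def guardrail_model(features):
    return {"damaged": features.get("dent_score", 0) > 0.5}

def speeding_model(features):
    return {"overspeed": features.get("speed_kmh", 0) > 120}

INSPECTION_MODELS = {           # feature category -> inspection model
    "guardrail": guardrail_model,
    "speeding": speeding_model,
}

def run_inspection(features, category):
    target_model = INSPECTION_MODELS[category]  # select the target model
    return target_model(features)

result = run_inspection({"speed_kmh": 135.0}, "speeding")
```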
In some embodiments, the target inspection model is trained as follows: a training sample set is acquired, in which each training sample includes sample inspection data and a corresponding sample label representing the inspection result of that sample inspection data, and the feature category of the scene feature data of the sample inspection data is the same as the determined feature category; the sample inspection data and sample labels in the training sample set are then input into an initial model, which is trained with a machine learning algorithm to obtain the target inspection model.
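The training procedure above can be sketched with any supervised learner. Here a tiny perceptron stands in for the initial model; the feature vectors and labels are invented toy data, not anything from the patent:

```python
# Minimal training sketch for a single-category inspection model: each
# sample is a (feature vector, 0/1 label) pair; a perceptron stands in
# for the "initial model" mentioned in the text.

def train_inspection_model(samples, epochs=20, lr=0.1):
    """samples: list of (features, label) pairs with 0/1 labels."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                       # perceptron update rule
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy training set: label 1 means "violation present".
training_set = [([0.9, 0.8], 1), ([0.1, 0.2], 0),
                ([0.8, 0.7], 1), ([0.2, 0.1], 0)]
model = train_inspection_model(training_set)
```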
In some embodiments, acquiring the inspection data of the inspection site includes: acquiring data of the inspection site collected and sent by an autonomous vehicle, and using the acquired data as the inspection data, wherein the priority with which the autonomous vehicle sends its autonomous-driving data is higher than the priority with which it sends the inspection data.
In some embodiments, the above method further comprises: storing the inspection data and the feature data hierarchically.
In some embodiments, the above method further comprises: if the inspection result indicates that a target vehicle is in a violation state, acquiring an image of the target vehicle in the violation state; sending the image, the inspection result indicating the violation state, and vehicle information of the target vehicle to a target terminal, the target terminal being used to determine, based on the image and the vehicle information, whether the target vehicle is in the violation state; and, if the target terminal determines that the target vehicle is in the violation state, storing the image and the vehicle information in a preset database.
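The detect-confirm-store flow above can be sketched as follows. The terminal-side confirmation and the database are simulated with plain Python; all names and the confirmation rule are hypothetical:

```python
# Hedged sketch of the violation-confirmation flow: detect -> send to
# terminal -> confirm -> store. The "terminal" and "database" are stubs.

violation_db = []   # stands in for the preset database

def terminal_confirms(image, vehicle_info):
    # Stand-in for the target terminal's review of image + vehicle info.
    return vehicle_info.get("plate") is not None

def handle_inspection_result(result, image, vehicle_info):
    if not result.get("violation"):
        return False
    # Send image, inspection result, and vehicle info to the terminal...
    if terminal_confirms(image, vehicle_info):
        violation_db.append({"image": image, "vehicle": vehicle_info})
        return True
    return False

stored = handle_inspection_result(
    {"violation": True}, image=b"jpeg-bytes", vehicle_info={"plate": "A12345"})
```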
In some embodiments, each feature category in the feature category set characterizes one of the following scenarios: whether a road guardrail is damaged, whether a traffic sign or marking is damaged, whether a vehicle is speeding, whether a vehicle is driving the wrong way, whether a vehicle is pressing a lane line, whether road-occupying construction exists, whether a manhole cover is missing or damaged, and whether a driver is driving while fatigued.
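The eight scenario-level categories listed above could be encoded as an enumeration; the identifier names below are one possible, invented encoding:

```python
from enum import Enum

# Hypothetical encoding of the scenario-level feature categories
# enumerated in the text above.
class FeatureCategory(Enum):
    GUARDRAIL_DAMAGED = "road guardrail damaged"
    MARKING_DAMAGED = "traffic sign/marking damaged"
    OVERSPEED = "vehicle overspeed"
    WRONG_WAY = "vehicle driving in reverse direction"
    LINE_PRESSING = "vehicle line-pressing"
    ROAD_OCCUPATION = "road-occupying construction"
    MANHOLE_COVER = "manhole cover missing or damaged"
    FATIGUE_DRIVING = "driver fatigue"
```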
According to a second aspect, there is provided an inspection device comprising: a first acquisition unit configured to acquire inspection data of an inspection site; a feature extraction unit configured to perform scene feature extraction on the inspection data to obtain scene feature data of the inspection data; a first determination unit configured to determine the feature category of the scene feature data; and a second determination unit configured to determine an inspection result of the inspection data based on the scene feature data and the feature category.
In some embodiments, the first determination unit includes a first determination subunit configured to determine the feature category of the scene feature data from a predetermined feature category set, wherein the feature categories in the set correspond one-to-one to inspection models in a pre-trained inspection model set, and the inspection models are used for generating inspection results of inspection data; and the second determination unit includes a second determination subunit configured to determine, from the inspection model set, a target inspection model corresponding to the determined feature category, and a generation subunit configured to generate the inspection result of the inspection data based on the target inspection model.
In some embodiments, the target inspection model is trained as follows: a training sample set is acquired, in which each training sample includes sample inspection data and a corresponding sample label representing the inspection result of that sample inspection data, and the feature category of the scene feature data of the sample inspection data is the same as the determined feature category; the sample inspection data and sample labels in the training sample set are then input into an initial model, which is trained with a machine learning algorithm to obtain the target inspection model.
In some embodiments, the first acquisition unit includes an acquisition subunit configured to acquire data of the inspection site collected and sent by an autonomous vehicle and to use the acquired data as the inspection data, wherein the priority with which the autonomous vehicle sends its autonomous-driving data is higher than the priority with which it sends the inspection data.
In some embodiments, the above device further comprises a first storage unit configured to store the inspection data and the feature data hierarchically.
In some embodiments, the above device further comprises: a second acquisition unit configured to acquire, if the inspection result indicates that a target vehicle is in a violation state, an image of the target vehicle in the violation state; a sending unit configured to send the image, the inspection result indicating the violation state, and vehicle information of the target vehicle to a target terminal, the target terminal being used to determine, based on the image and the vehicle information, whether the target vehicle is in the violation state; and a second storage unit configured to store the image and the vehicle information in a preset database if the target terminal determines that the target vehicle is in the violation state.
In some embodiments, each feature category in the feature category set characterizes one of the following scenarios: whether a road guardrail is damaged, whether a traffic sign or marking is damaged, whether a vehicle is speeding, whether a vehicle is driving the wrong way, whether a vehicle is pressing a lane line, whether road-occupying construction exists, whether a manhole cover is missing or damaged, and whether a driver is driving while fatigued.
According to a third aspect, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the at least one processor to enable the at least one processor to perform the method of any embodiment of the inspection method.
According to a fourth aspect, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any embodiment of the inspection method.
According to a fifth aspect, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of any embodiment of the inspection method.
According to this scheme, inspection data of an inspection site is acquired, scene feature extraction is performed on the inspection data to obtain scene feature data, the feature category of the scene feature data is determined, and the inspection result of the inspection data is then determined based on the scene feature data and the feature category. Determining the inspection result via the feature category of the scene feature data of the inspection site can expand the inspection coverage, improve inspection efficiency, and reduce the false-detection rate.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which some embodiments of the present disclosure may be applied;
FIG. 2 is a flow chart of one embodiment of an inspection method according to the present disclosure;
FIG. 3 is a data acquisition flow chart on the data-acquisition-device side in an embodiment of the inspection method according to the present disclosure;
FIG. 4 is a schematic flow diagram of identifying a violating vehicle in one embodiment of the inspection method according to the present disclosure;
FIG. 5 is a schematic diagram of an application scenario of the inspection method according to the present disclosure;
FIGS. 6 and 7 are exemplary system architecture diagrams in one embodiment of the inspection method according to the present disclosure;
FIG. 8 is a flow chart of yet another embodiment of the inspection method according to the present disclosure;
FIG. 9 is a schematic structural diagram of one embodiment of an inspection device according to the present disclosure;
FIG. 10 is a block diagram of an electronic device for implementing the inspection method of embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings. Various details of the embodiments are included to assist understanding and are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Likewise, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the technical solution of the present disclosure, the acquisition, storage, and use of any personal information involved comply with the relevant laws and regulations, necessary security measures are taken, and public order and good customs are not violated.
It should be noted that, in the present disclosure, the embodiments and the features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the inspection method or inspection device of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. Network 104 is the medium used to provide communication links between terminal devices 101, 102, 103 and server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 over the network 104 to receive or transmit data (e.g., inspection data). Various client applications may be installed on the terminal devices 101, 102, 103, such as navigation applications, vehicle-side applications, instant messaging software, mailbox clients, and social platform software.
Here, the terminal devices 101, 102, and 103 may be hardware or software. When they are hardware, they may be various electronic devices, including but not limited to autonomous vehicles (e.g., autonomous buses), smart phones, tablet computers, e-book readers, laptop computers, and desktop computers. When they are software, they may be installed in the electronic devices listed above and implemented as multiple pieces of software or software modules (e.g., to provide a distributed service) or as a single piece of software or software module. No specific limitation is imposed here.
The server 105 may be a server providing various services, such as a background server supporting the terminal devices 101, 102, 103. The background server can analyze and process the acquired inspection data of the inspection site. Optionally, it may also feed back the processing result (e.g., the inspection result) to another terminal device. As an example, the server 105 may be a cloud server.
It should be noted that the inspection method provided by embodiments of the present disclosure may be executed by the server 105 or by the terminal devices 101, 102, and 103; accordingly, the inspection device may be disposed in the server 105 or in the terminal devices 101, 102, and 103.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for an implementation.
Continuing to refer to fig. 2, a flow 200 of one embodiment of an inspection method according to the present disclosure is shown. The inspection method comprises the following steps:
step 201, obtaining the inspection data of the inspection site.
In the present embodiment, the executing body on which the inspection method runs (for example, the server or a terminal device shown in fig. 1) may acquire inspection data of an inspection site.
The inspection site may be any place where inspection is required, for example the interior of a piece of equipment, a traffic inspection site, or the location of a public facility.
The inspection data is data obtained at the inspection site, and may take the form of images, videos, text, and the like.
Here, the executing body may acquire the inspection data from a device used to collect it, or from a storage device local to or communicatively connected with the executing body. The device used to collect the inspection data may include at least one of the following: a user terminal, an autonomous bus, an unmanned aerial vehicle, and the like.
And 202, extracting scene characteristics of the inspection data to obtain scene characteristic data of the inspection data.
In this embodiment, the executing body may perform scene feature extraction on the inspection data to obtain scene feature data of the inspection data.
The scene feature data may include at least one of: brightness features, texture features, color features, and the like. The texture features may include global feature information (also referred to as Gist information).
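A toy illustration of extracting brightness and color features from raw pixels follows; a real system would use image libraries and learned texture/Gist descriptors, and the function and field names here are invented:

```python
# Toy scene-feature extraction over an RGB pixel list, illustrating the
# brightness and color features named in the text; purely a sketch.

def extract_scene_features(pixels):
    """pixels: list of (r, g, b) tuples with 0-255 channel values."""
    n = len(pixels)
    # Brightness: mean of per-pixel channel averages.
    brightness = sum((r + g + b) / 3 for r, g, b in pixels) / n
    # Color feature: per-channel mean over the whole image.
    mean_color = tuple(sum(p[i] for p in pixels) / n for i in range(3))
    return {"brightness": brightness, "mean_color": mean_color}

features = extract_scene_features([(255, 0, 0), (0, 255, 0), (0, 0, 255)])
```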
Step 203, determining the feature category of the scene feature data.
In this embodiment, the executing body may determine the feature category of the scene feature data.
The number of feature categories determined for the scene feature data may be one or more.
Here, the specific meaning represented by a feature category of the scene feature data may be defined according to actual needs.
As an example, if the scene feature data indicates that the inspection site includes a traffic light, the feature category of the scene feature data may be: whether there is a vehicle running a red light; if the scene feature data indicates that the inspection site includes a guardrail, the feature category may be: whether the guardrail is damaged.
As another example, if the scene feature data indicates that the inspection site includes a traffic light, the feature category may be: includes a traffic light; if the scene feature data indicates that the inspection site includes a guardrail, the feature category may be: includes a guardrail.
Step 204, determining the inspection result of the inspection data based on the scene feature data and the feature category.
In this embodiment, the executing body may determine the inspection result of the inspection data based on the scene feature data and the feature category.
Here, if multiple feature categories are determined for the scene feature data, multiple inspection results may be generated; in other words, one inspection result may be generated for each feature category.
As an example, the executing body may input the scene feature data together with the feature category into a pre-trained inspection result judgment model to obtain the inspection result of the inspection data. The inspection result judgment model characterizes the correspondence between scene feature data, feature categories, and inspection results; it may be a convolutional neural network model trained with a machine learning algorithm on a training sample set containing scene feature data, feature categories, and inspection results.
As another example, an inspection result judgment model may be trained in advance for each feature category of scene feature data, so that feature categories and judgment models correspond one-to-one. Each judgment model characterizes the correspondence between scene feature data of its feature category and inspection results.
On this basis, the executing body may input the scene feature data obtained in step 202 into the judgment model trained for the feature category of that scene feature data, thereby obtaining the inspection result of the inspection data.
Such a judgment model may be a convolutional neural network model trained with a machine learning algorithm on a training sample set containing scene feature data and inspection results, where the feature category of the scene feature data in the training sample set is the feature category corresponding to that judgment model.
In the method provided by the above embodiment of the present disclosure, inspection data of an inspection site is acquired, scene feature extraction is performed on it to obtain scene feature data, the feature category of the scene feature data is determined, and the inspection result of the inspection data is then determined based on the scene feature data and the feature category. Determining the inspection result via the feature category of the scene feature data of the inspection site can expand the inspection coverage, improve inspection efficiency, and reduce the false-detection rate.
In some optional implementations of this embodiment, the executing body may perform step 201 as follows to acquire the inspection data of the inspection site:
acquire data of the inspection site collected and sent by an autonomous vehicle, and use the acquired data as the inspection data.
Wherein, in some cases, the autonomous vehicle may comprise an autonomous bus.
Here, the number of autonomous vehicles used to collect inspection data may be one or more. In some cases, the autonomous vehicles used to collect inspection data may include all autonomous buses operating in a predetermined area (e.g., a country, a province, or a city).
The data of the inspection site collected and sent by the autonomous vehicle may include at least one of the following: map data, positioning data, video data, radar data, order data, user data, and the like.
It can be understood that, in the above optional implementation, the autonomous vehicle serves as the collection device for the inspection data, so that every place the autonomous vehicle reaches can serve as an inspection site whose inspection data can be obtained, further expanding the inspection coverage.
In some application scenarios of the above optional implementations, the construction quality, operation efficiency, service level, and management level of traditional infrastructure can be improved by combining means such as holographic sensing, intelligent decision-making, vehicle-road interconnection, and big-data management. Using the mobility, sensors, and computing power of the autonomous vehicle, the collected map data, positioning data, video data, radar data, order data, and user data can be sent to the executing body; traffic accidents and damage to road facilities can then be identified and early warnings uploaded. In addition, large-scale collection of all data types can be performed on map data, sensor data, user-defined event-trigger data, and manually triggered data. The collected data may include the license plate number, time, position, vehicle state, cruising-condition information, data collected by the sensors, and information computed from the sensor data.
As an example, reference may be made to fig. 3, which is a data acquisition flow chart on the data-acquisition-device side for inspection data.
In fig. 3, real-time data can be transmitted to the cloud server through a 3G/4G module deployed in the vehicle-mounted hardware, where the real-time data may be all or part of the data collected by the autonomous vehicle. The transmission modes may include real-time transmission and offline transmission. Real-time transport protocols may include TCP (Transmission Control Protocol), HTTP (HyperText Transfer Protocol), HTTP/2, and QUIC (Quick UDP Internet Connections). The transmission performance of the different protocols is continuously tracked and evaluated, and the real-time transmission capability is continuously iterated.
The inspection data may include critical data to be persisted, such as ego-vehicle data (e.g., vehicle information of a violating vehicle) and obstacle identification data (e.g., category information of an obstacle). For data transmission, a dedicated acquisition box or acquisition module can serve as the real-time transmission tool, transmitting mainly through the following interfaces: an MQTT (Message Queuing Telemetry Transport) interface for sending messages, an interface for sending files, and the like.
The offline transmission mode can rely on dedicated nodes of a data transmission network established by the vehicle base management center and the operation center to persist the full data to disk, while providing a stable and efficient guarantee. The offline transmission process comprises the following steps:
First, requirements such as device login and device status are monitored.
Next, SDK (Software Development Kit) integration is performed.
Then, data transmission priorities are scheduled to guarantee high quality. Specifically, the autonomous vehicle gives priority to realizing the autonomous driving function; on that basis, if spare transmission capacity remains, the inspection data is transmitted.
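This priority rule can be sketched with a priority queue: autonomous-driving data always dequeues before inspection data. The priority values and payloads below are invented for illustration:

```python
import heapq

# Sketch of the transmission-priority rule: autonomous-driving data is
# always sent before inspection data. Priorities/payloads are invented.

PRIORITY = {"autonomous_driving": 0, "inspection": 1}  # lower = sooner

def make_send_order(messages):
    """messages: list of (kind, payload); returns the send order."""
    heap = [(PRIORITY[kind], i, kind, payload)      # i breaks ties FIFO
            for i, (kind, payload) in enumerate(messages)]
    heapq.heapify(heap)
    return [(kind, payload) for _, _, kind, payload in
            (heapq.heappop(heap) for _ in range(len(heap)))]

order = make_send_order([("inspection", "frame-1"),
                         ("autonomous_driving", "control-state"),
                         ("inspection", "frame-2")])
```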
Subsequently, dedicated logistics transport large capacity hard disks to the data center. Specifically, the full amount of data collected by the autonomous vehicle may be stored to a data center.
And finally, controlling the transmission time length and optimizing the transmission cost. In particular, when the stored data is acquired, slicing may be performed according to time, thereby obtaining the data to improve the efficiency of data retrieval.
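The time-based slicing mentioned in the last step can be sketched as follows; the record layout and the one-hour slice width are assumptions made for illustration, not values fixed by the disclosure.

```python
from collections import defaultdict

SLICE_SECONDS = 3600  # assumed slice width: one hour

def partition_by_time(records):
    """Group (timestamp, payload) records into fixed-width time slices
    so that a later query scans only the slices overlapping its window."""
    slices = defaultdict(list)
    for ts, payload in records:
        slices[int(ts // SLICE_SECONDS)].append((ts, payload))
    return slices

def query(slices, start_ts, end_ts):
    """Fetch records in [start_ts, end_ts) by visiting only relevant slices."""
    out = []
    for key in range(int(start_ts // SLICE_SECONDS),
                     int(end_ts // SLICE_SECONDS) + 1):
        out.extend((ts, p) for ts, p in slices.get(key, [])
                   if start_ts <= ts < end_ts)
    return out

records = [(0, "a"), (1800, "b"), (4000, "c"), (8000, "d")]
slices = partition_by_time(records)
hits = query(slices, 1000, 5000)
```

A query touching a two-hour window here inspects two slices rather than the full store, which is the retrieval-efficiency gain the step describes.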
In some cases of the above alternative implementations, the priority with which the autonomous vehicle sends autonomous driving data is higher than the priority with which it sends patrol data.
It can be understood that, in this situation, the autonomous vehicle first ensures that the autonomous driving function operates normally; on that basis, if spare data transmission capacity remains, the patrol data can be sent. This prevents the sending of patrol data from disrupting the autonomous driving function.
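A minimal sketch of this priority rule, assuming a two-level priority queue at the vehicle end (the class, priority constants, and payload strings are hypothetical, not named in the disclosure):

```python
import heapq
import itertools

# Priority levels: lower number is sent first (an assumption of this sketch).
AUTONOMOUS, PATROL = 0, 1

class TransmitQueue:
    """Drain all autonomous-driving data before any patrol data is sent."""
    def __init__(self):
        self._heap = []
        self._order = itertools.count()  # tie-break: FIFO within one priority

    def put(self, priority, item):
        heapq.heappush(self._heap, (priority, next(self._order), item))

    def get(self):
        return heapq.heappop(self._heap)[2]

q = TransmitQueue()
q.put(PATROL, "patrol-frame-1")
q.put(AUTONOMOUS, "planning-msg")
q.put(PATROL, "patrol-frame-2")
sent = [q.get(), q.get(), q.get()]
```

Because the heap orders first by priority and then by arrival order, the autonomous-driving message is transmitted before either patrol frame even though it was enqueued later.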
In some optional implementations of this embodiment, the execution main body may further store the patrol data and the feature data hierarchically.
Specifically, the patrol data, as cold data accessed at a low frequency, can be stored in a cold data layer, and the feature data, as hot data accessed at a high frequency, can be stored in a hot data layer.
It can be understood that in this optional implementation both the inspection data and the feature data are stored, rather than the feature data alone; the inspection data therefore remains conveniently accessible, and the data loss that would result from deleting the inspection data after scene feature extraction is avoided. In addition, hierarchical storage strikes a balance, to a certain extent, between the performance and the cost of data access.
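A minimal sketch of this layered storage, using simple in-memory stand-ins for the cold and hot tiers (the class name and tier backing are illustrative assumptions; a real system would use, e.g., object storage and a cache):

```python
class TieredStore:
    """Cold/hot layered store: raw patrol data goes to the cold tier
    (cheap, low-frequency access) and extracted feature data to the
    hot tier (fast, high-frequency access)."""
    def __init__(self):
        self.cold = {}  # stands in for cheap bulk storage
        self.hot = {}   # stands in for a fast cache / SSD tier

    def put_patrol(self, key, raw):
        self.cold[key] = raw      # keep raw data: no loss after extraction

    def put_features(self, key, features):
        self.hot[key] = features  # frequently accessed structured data

    def get(self, key):
        # Prefer the hot tier; fall back to cold for raw-data access.
        return self.hot.get(key, self.cold.get(key))

store = TieredStore()
store.put_patrol("site-7", b"raw-video-bytes")
store.put_features("site-7", {"lane_marking": "damaged"})
```

Routine reads hit the hot tier, while the raw recording stays retrievable from the cold tier for evidence or re-extraction.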
In some application scenarios of the above alternative implementation, the raw data (i.e., the patrol data) and the structured data (e.g., the scene feature data) collected by the autonomous vehicle may be stored in a targeted, hierarchical manner.
Here, real-time and offline data storage, multi-medium storage, and intelligent partitioning can be supported, enabling lower-cost data storage. Offline data storage mainly means that a persistence program at the box end reads hardware data such as the camera, the CAN bus (Controller Area Network bus), the IMU (Inertial Measurement Unit), and the GPS (Global Positioning System), and stores the data locally on the autonomous vehicle.
The offline data may include video data, log data, message data, and the like. The data acquired at the vehicle end is stored on the vehicle-end hard disk in a preset format (such as the bag format) and uploaded offline to the cloud server for storage every day.
Real-time data storage mainly means that the vehicle end reports data to the cloud server in real time, and the cloud server processes and stores it according to the type of the reported data.
Data storage management comprises: access control management such as accounts, authentication, permissions, and quotas; version control management through the version numbers contained in the data (unique identifiers for data retrieval); data isolation management; and data storage management such as saving, backup, isolation, and sharing. Data isolation management can store the data uploaded by the autonomous bus (such as the inspection data) separately from users' private data, provide a virtual table for the application layer, and enforce isolation through permissions.
In some optional implementations of this embodiment, the execution main body may further perform the following steps:
First, if the inspection result indicates that a target vehicle is in a violation state, an image of the target vehicle in the violation state is acquired. In other words, if the inspection result indicates that a vehicle in a violation state exists, that vehicle is taken as the target vehicle, and an image of it in the violation state is acquired.
Then, the image, the inspection result indicating that the target vehicle is in a violation state, and the vehicle information of the target vehicle are sent to a target terminal.
The target terminal is used to determine, based on the image and the vehicle information, whether the target vehicle is in the violation state.
Here, the target terminal may automatically determine whether the target vehicle is in the violation state based on the image and the vehicle information; it may also present the image so that a human reviewer can make the determination.
Then, if the target terminal determines that the target vehicle is in the violation state, the image and the vehicle information are stored in a preset database.
It can be understood that in this optional implementation, when the inspection result indicates that the target vehicle is in a violation state, an image of the target vehicle in that state is further obtained, realizing evidence collection for the violating vehicle; sending the image to the target terminal allows a second judgment, which can avoid false detections to a certain extent; and when the target vehicle is confirmed to be in a violation state, storing the image and the vehicle information in the preset database facilitates subsequent retrieval of evidence and on-site follow-up by the relevant personnel.
Referring now to fig. 4, fig. 4 is a schematic flow chart of identifying a violating vehicle in one embodiment of the inspection method according to the present disclosure.
In fig. 4, the scenario described in the above alternative implementation is illustrated with motor vehicle violation identification as an example.
In the first step, judgment rules may be formulated. For example, based on the no-parking zone and no-parking period information provided by the local traffic police, illegal parking areas are marked in the electronic map and their parameters (such as the no-parking period) are recorded; with reference to the relevant laws and regulations, and combined with the video and radar sensing capabilities of the autonomous vehicle, the motor vehicle violation information is further extended to non-motorized lanes, sidewalks, bus stops, and the like, and the violation judgment rules for the autonomous vehicle's operating area are formulated.
In the second step, events may be identified. For example, while the autonomous vehicle is driving, the real-time position and speed of the surrounding motor vehicles (i.e., the target vehicles) can be acquired through video and radar, and basic traffic information such as ground markings, non-motorized lanes, bus stops, sidewalks, and pedestrian crossings can be identified. Whether a motor vehicle is in a violation state is then judged from its position and speed over multiple frames, combined with the violation judgment rules.
In the third step, if the motor vehicle is determined to be in a violation state, evidence is collected. Specifically, if a motor vehicle violation event exists, the violating vehicle is captured and a violation evidence video is recorded and stored as cloud processing material, an audit basis, and a basis for fines.
In the fourth step, the evidence content is uploaded. For example, the vehicle end uploads the captured violation material to the cloud in real time.
In the fifth step, the cloud server performs identification. For example, the cloud extracts the vehicle information of the illegally parked vehicle, including the license plate number, body color, vehicle type, and the like.
In the sixth step, the information is integrated. Specifically, the cloud can complete the information integration and further confirm the violation state. The integrated information includes: the timestamp, the identified vehicle information (license plate number, body color, and vehicle type), the evidence video or picture, and the violation judgment result.
In the seventh step, the information is pushed for review. The violating-vehicle information and the evidence material are pushed to the service platform for manual review.
In the eighth step, manual review is performed. For example, the violation information enters the review workflow on the service platform for manual inspection to ensure accurate identification. States such as to-be-reviewed and reviewed are reflected in the workflow, together with the violation information and the reviewer's operation records.
In the ninth step, the record is reported. If manual review information confirming that the violation judgment is correct is received, the violation information is entered into the electronic police database, completing the entry work and providing evidence for punishment by the traffic management department.
In the tenth step, the records can be viewed and exported. The traffic management department may view and export the reports for download, to be provided to follow-up personnel.
In the eleventh step, the information is archived and backed up. After manual processing, the information is archived and backed up on the cloud platform for retrieval and record-keeping.
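The core of the second and third steps, judging a violation from multi-frame position and speed observations against a rule, can be sketched minimally as follows. The rectangular no-parking zone, the speed threshold, and the frame format are illustrative assumptions; the disclosure does not fix these parameters.

```python
def in_zone(pos, zone):
    """Axis-aligned rectangle check; zone = (xmin, ymin, xmax, ymax)."""
    x, y = pos
    xmin, ymin, xmax, ymax = zone
    return xmin <= x <= xmax and ymin <= y <= ymax

def is_illegal_parking(frames, no_parking_zone, speed_eps=0.1, min_frames=3):
    """frames: per-frame (position, speed) observations of one vehicle.
    Flag a violation if the vehicle is (nearly) stationary inside the
    no-parking zone for at least `min_frames` consecutive frames."""
    streak = 0
    for pos, speed in frames:
        if speed < speed_eps and in_zone(pos, no_parking_zone):
            streak += 1
            if streak >= min_frames:
                return True
        else:
            streak = 0
    return False

zone = (0.0, 0.0, 10.0, 5.0)
parked = [((3.0, 2.0), 0.0)] * 4   # stationary inside the zone
passing = [((3.0, 2.0), 8.0)] * 4  # moving through the zone
```

Requiring several consecutive frames is what distinguishes a parked vehicle from one merely passing through the zone, matching the multi-frame judgment described above.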
Continuing to refer to fig. 5, fig. 5 is a schematic diagram of an application scenario of the inspection method according to this embodiment. In the application scenario of fig. 5, the execution main body 501 first obtains inspection data 502 of an inspection site; the execution main body 501 then performs scene feature extraction on the inspection data 502 to obtain scene feature data 503 of the inspection data 502; next, the execution main body 501 determines a feature category 504 of the scene feature data 503; finally, the execution main body 501 determines an inspection result 505 of the inspection data 502 based on the scene feature data 503 and the feature category 504.
Referring now to figs. 6 and 7, figs. 6 and 7 are exemplary system architecture diagrams in one embodiment of a patrol method according to the present disclosure. In figs. 6 and 7, the scheme realizes an end-to-end "perception, fusion, AI (Artificial Intelligence) recognition, statistical analysis, push" product capability, with multi-dimensional product characteristics such as holographic perception, AI recognition, high-precision maps, and spatio-temporal distribution analysis. It can meet customization requirements for multiple users, multiple scenarios, and the like. Through the fusion of map data, positioning data, and radar data, it accurately identifies asset maintenance scenarios such as damaged road guardrails and traffic markings, traffic enforcement scenarios such as speeding, driving in reverse, and lane-line violations, road-occupying construction reports, and lost or damaged manhole covers. The multi-source data can be uploaded to the cloud for intelligent processing.
Specifically, in the exemplary system architecture diagram shown in fig. 6, inspection data for one or more inspection sites may be obtained from an autonomous bus, users' mobile phones, and cameras. The autonomous bus can be provided with a Human Machine Interface (HMI), through which human-machine interaction is achieved and the sending time of the inspection data collected by the bus can be determined. A user's phone can be provided with an APP (Application), through which the user can shoot and upload accident images or videos from the inspection site. The cameras (for example, fixed cameras installed along the road) can be communicatively connected with the large cloud monitoring screen. The large cloud monitoring screen can display the images or videos shot by the cameras, realizing monitoring of the inspection site; in addition, operations such as screening, marking, reviewing, and analyzing those images or videos can be performed through the large screen, so that an inspection result can be generated later.
In the exemplary system architecture diagram shown in fig. 7, the architecture includes a data collection layer. The patrol data may be obtained, for example, from patrol vehicles (e.g., manned vehicles and unmanned vehicles), cameras (e.g., existing fixed cameras on roads), video terminals, personnel's mobile phones, and the like.
The data storage layer may store the raw patrol data (e.g., video data and photo data taken by a fixed camera or a user's phone) and the processed patrol data (e.g., structured data obtained by structuring the raw data).
The data perception layer can comprise a big data engine, a vehicle-road cooperation engine, a business support engine, a traffic engine, a road AI perception terminal, an in-vehicle autonomous driving terminal, and a user travel service terminal. The big data engine can, for example, implement the training of the inspection models and the analysis of big data. The vehicle-road cooperation engine can analyze and process information related to the infrastructure and the vehicles. The business support engine can analyze and process data obtained from service terminals such as the user APP. The traffic engine can provide processing such as data analysis of traffic information (e.g., traffic lights and vehicle locations). The road AI perception terminal can analyze and process road data. The in-vehicle autonomous driving terminal can analyze and process the autonomous driving data. The user travel service terminal can analyze and process user travel data. Through these components, the data perception layer can realize data perception of the inspection site.
The data application layer can comprise applications such as the mobile phone APP, the vehicle-mounted HMI, the large cloud monitoring screen, event notification, violation event perception, real-time perception of patrol vehicles and emergencies, real-time uploading, hardware device perception, real-time perception of hardware devices' working states, point rewards, public facility perception, and perceived-event review and analysis. For example, after the inspection result is obtained, the place where a traffic accident occurred may be determined from it. The mobile phone APP can then prompt the user that there is a traffic accident nearby; route planning can be performed for unmanned vehicles based on the accident site and presented through the vehicle-mounted HMI; and the accident site can be monitored through the large cloud monitoring screen, with event notifications sent so that the relevant personnel can follow up in time or the perceived event can be reviewed and analyzed. In addition, points can be awarded to the users who upload accident events.
With further reference to fig. 8, a flow 800 of yet another embodiment of a patrol method is illustrated. The process 800 includes the following steps:
step 801, acquiring inspection data of an inspection site.
In the present embodiment, an execution subject (for example, a server or a terminal device shown in fig. 1) on which the patrol method is executed can acquire patrol data of a patrol site.
In this embodiment, step 801 is substantially the same as step 201 in the corresponding embodiment of fig. 2, and is not described herein again.
And 802, extracting scene characteristics of the inspection data to obtain scene characteristic data of the inspection data.
In this embodiment, the execution main body may perform scene feature extraction on the inspection data to obtain scene feature data of the inspection data.
In this embodiment, step 802 is substantially the same as step 202 in the corresponding embodiment of fig. 2, and is not described herein again.
Step 803 is to determine a feature type of the scene feature data from a predetermined feature type set.
In this embodiment, the execution subject may determine the feature type of the scene feature data from a predetermined feature type set.
The feature categories in the feature category set correspond one-to-one with the patrol models in a pre-trained patrol model set. A patrol model in the set is used to generate the patrol result of the patrol data corresponding to that model, where the feature category of the scene feature data of such patrol data is the same as the feature category corresponding to the model.
Here, the feature category set may include various feature categories, and it may be made as large and complete as possible so as to cover as many feature categories as possible. Further, the patrol model set can be obtained by training according to the correspondence between the feature categories and the patrol models.
As a first example, the patrol model (e.g., patrol model Y) corresponding to each feature category (e.g., feature category X) in the feature category set may be trained as follows:
First, a training sample set is obtained. The training samples in the set comprise sample patrol data and a sample label corresponding to the sample patrol data, where the label represents the patrol result of that sample patrol data. The feature category of the scene feature data of the sample patrol data in the training sample set is the same as the determined feature category (e.g., feature category X).
Then, using a machine learning algorithm, the sample patrol data and the sample labels in the training sample set are input into an initial model (e.g., a convolutional neural network model) to train the patrol model (e.g., patrol model Y).
As a second example, the patrol model (e.g., patrol model Y) corresponding to each feature category (e.g., feature category X) in the feature category set may instead comprise at least one piece of patrol data, obtained through statistical analysis, whose scene feature data belongs to that feature category. Each piece of patrol data may correspond to one patrol result, which represents the patrol result of that data.
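The one-to-one correspondence between feature categories and patrol models can be sketched as a simple dispatch table. The two stand-in models, category names, and toy inputs below are hypothetical placeholders for trained models, used only to show the dispatch structure:

```python
# One model per feature category; these callables stand in for trained models.
def guardrail_model(data):
    return "guardrail damaged" if "gap" in data else "guardrail ok"

def marking_model(data):
    return "marking damaged" if "faded" in data else "marking ok"

# The one-to-one category -> model correspondence from the embodiment.
MODEL_SET = {
    "guardrail": guardrail_model,  # feature category X -> patrol model Y
    "marking": marking_model,
}

def inspect(feature_category, patrol_data):
    """Pick the target patrol model for the determined feature category
    and generate the patrol result."""
    model = MODEL_SET[feature_category]
    return model(patrol_data)

result = inspect("guardrail", {"gap"})
```

Determining the feature category first narrows the work to a single specialized model, which is the efficiency argument the embodiment makes.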
And step 804, determining a target inspection model corresponding to the determined characteristic category from the inspection model set.
In this embodiment, the execution subject may determine a target patrol inspection model corresponding to the feature type determined in step 803 from the patrol inspection model set according to a known correspondence relationship.
The target inspection model can be an inspection model corresponding to the determined characteristic category in the inspection model set.
And 805, generating a polling result of the polling data based on the target polling model.
In this embodiment, the execution subject may generate the inspection result of the inspection data based on the target inspection model.
As an example, if the target inspection model is obtained by training in the manner described in the first example, the executing body may input the inspection data into the target inspection model, so as to obtain the inspection result of the inspection data.
As another example, if the target patrol model is obtained in the manner described in the second example, the execution subject may calculate the similarity between the patrol data (e.g., patrol data A) and each piece of patrol data included in the target patrol model, and take the patrol result corresponding to the piece with the maximum calculated similarity as the patrol result of the patrol data (e.g., patrol data A).
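This second-example lookup can be sketched as a nearest-neighbor search. Representing each piece of patrol data by a feature vector and using cosine similarity are assumptions of this sketch; the disclosure does not fix a similarity measure.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nearest_result(query_vec, model_entries):
    """model_entries: (feature_vector, patrol_result) pairs held by the
    second-example target model. Return the result of the stored patrol
    data most similar to the query."""
    _, best_result = max(model_entries, key=lambda e: cosine(query_vec, e[0]))
    return best_result

entries = [
    ([1.0, 0.0], "no violation"),
    ([0.0, 1.0], "illegal parking"),
]
result = nearest_result([0.1, 0.9], entries)  # closest to the second entry
```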
According to the method provided by the embodiment of the disclosure, the inspection result of the inspection data is generated by predetermining the characteristic category set and training the inspection model set in advance, so that the generation efficiency of the inspection result is improved, and the inspection efficiency is further improved.
In some optional implementation manners of this embodiment, the target inspection model is trained in the following manner:
first, a set of training samples is obtained. Wherein, the training samples in the training sample set include: the sample patrol data and a sample label corresponding to the sample patrol data. And the sample label corresponding to the sample inspection data represents the inspection result of the sample inspection data. The feature type of the scene feature data of the sample patrol data in the training sample set is the same as the determined feature type.
And then, inputting the sample inspection data and the sample labels included in the training sample set into an initial model for training a target inspection model by adopting a machine learning algorithm, and training to obtain the target inspection model.
It can be understood that in this optional implementation, training the target patrol model with a machine learning algorithm can improve the accuracy of the generated patrol results and reduce the false detection rate.
In some optional implementations of the present embodiment, the feature classes in the feature class set represent any of the following scenarios:
whether a road guardrail is damaged, whether a traffic sign or marking is damaged, whether a vehicle is speeding, whether a vehicle is driving in reverse, whether a vehicle is driving on a lane line, whether road-occupying construction exists, whether a manhole cover is lost or damaged, and whether a driver is driving while fatigued.
It can be understood that, in the optional implementation manner, inspection can be performed on any scene, so that inspection efficiency in any scene can be improved.
In some application scenarios of the above optional implementations, daily emergency event detection and real-time alarm monitoring can be established, improving the capacity for planned alarms and rapid emergency handling of accidents. By means of technologies such as video analysis, pattern recognition, and information transmission, inspection data of abnormal scenarios such as violations are monitored, collected, and uploaded, helping traffic management departments conduct off-site law enforcement, supervising drivers to drive in a standardized way, reducing the incidence of road traffic accidents, discovering damage to road facilities and special events, and building a safe and orderly road environment.
Optionally, the execution subject may perform multi-angle data verification to ensure the availability and integrity of the underlying data. For example, multiple pictures of the same inspection scene, shot by multiple cameras, are compared and verified to determine whether a violating vehicle exists at the inspection site.
Optionally, the execution subject may also build complete metadata management on the basic data (e.g., the patrol data) to clarify data storage conditions, data relationships, data management, and data service conditions.
Optionally, the execution subject may further provide corresponding data demarcation rule management for different inspection scenarios, so as to clarify the corresponding data definitions. For example, the patrol data of each patrol scenario may be stored separately.
Optionally, the execution subject may also monitor the accuracy of the inspection results and provide measurable indicators for data optimization.
Optionally, the execution subject may systematically monitor data output, data stability, and the like, so as to monitor the stability of alarms and of the API (Application Programming Interface) service.
With further reference to fig. 9, as an implementation of the methods illustrated in the above figures, the present disclosure provides an embodiment of an inspection device. The device embodiment corresponds to the method embodiment illustrated in fig. 2 and, in addition to the features described below, may include the same or corresponding features or effects as that method embodiment. The device can be applied to various electronic devices.
As shown in fig. 9, the inspection equipment 900 of the present embodiment includes: a first acquisition unit 901, a feature extraction unit 902, a first determination unit 903 and a second determination unit 904. The first obtaining unit 901 is configured to obtain inspection data of an inspection site; a feature extraction unit 902, configured to perform scene feature extraction on the inspection data to obtain scene feature data of the inspection data; a first determining unit 903 configured to determine a feature type of the scene feature data; a second determining unit 904 configured to determine a patrol result of the patrol data based on the scene feature data and the feature type.
In this embodiment, the detailed processing of the first obtaining unit 901, the feature extracting unit 902, the first determining unit 903, and the second determining unit 904 of the inspection apparatus 900 and the technical effects brought by the detailed processing may refer to the relevant descriptions of step 201, step 202, step 203, and step 204 in the corresponding embodiment of fig. 2, and are not repeated herein.
In some optional implementations of this embodiment, the first determining unit 903 includes: and a first determining subunit (not shown in the figure) configured to determine the feature class of the scene feature data from a predetermined feature class set, wherein the feature class in the feature class set corresponds to a patrol model in a pre-trained patrol model set in a one-to-one manner, and the patrol model in the patrol model set is used for generating a patrol result of the patrol data.
In some optional implementations of this embodiment, the second determining unit 904 includes: a second determining subunit (not shown in the figure) configured to determine a target inspection model corresponding to the determined feature class from the inspection model set; and a generating subunit (not shown in the figure) configured to generate a patrol result of the patrol data based on the target patrol model.
In some optional implementation manners of this embodiment, the target inspection model is obtained by training in the following manner: acquiring a training sample set; wherein, the training samples in the training sample set include: sample polling data and a sample label corresponding to the sample polling data; the sample label corresponding to the sample inspection data represents the inspection result of the sample inspection data; the feature type of the scene feature data of the sample inspection data in the training sample set is the same as the determined feature type; and inputting the sample inspection data and the sample labels included in the training sample set into an initial model for training a target inspection model by adopting a machine learning algorithm, and training to obtain the target inspection model.
In some optional implementations of this embodiment, the first obtaining unit 901 includes: and an acquisition subunit (not shown in the figure) configured to acquire data of a patrol site collected by the autonomous vehicle, which is transmitted by the autonomous vehicle, and to use the acquired data as patrol data, wherein the transmission priority of the autonomous vehicle to the autonomous data is higher than the transmission priority of the autonomous vehicle to the patrol data.
In some optional implementations of this embodiment, the apparatus 900 further includes: and a first storage unit (not shown) configured to store the patrol data and the feature data hierarchically.
In some optional implementations of this embodiment, the apparatus 900 further includes: a second obtaining unit (not shown in the figure) configured to obtain an image that the target vehicle is in a violation state if the inspection result indicates that the target vehicle is in a violation state; a transmitting unit (not shown in the figure) configured to transmit the image, a patrol result indicating that the target vehicle has a violation state, and vehicle information of the target vehicle to a target terminal, wherein the target terminal is configured to determine whether the target vehicle is in the violation state based on the image and the vehicle information; a second storage unit (not shown in the figure) configured to store the image and the vehicle information to a preset database if the target terminal determines that the target vehicle is in the violation state.
In some optional implementations of this embodiment, the feature class in the feature class set represents any one of the following scenarios: whether the road guardrail is damaged, whether the traffic sign marking is damaged, whether the vehicle runs at overspeed or not, whether the vehicle runs in reverse or not, whether the vehicle runs by pressing a line or not, whether the road occupation construction exists or not, whether the well cover is lost or damaged or not, and whether a driver drives in a fatigue manner or not.
In the apparatus provided by the above embodiment of the present disclosure, the first obtaining unit 901 obtains the inspection data of the inspection site, then the feature extracting unit 902 performs scene feature extraction on the inspection data to obtain scene feature data of the inspection data, then the first determining unit 903 determines a feature type of the scene feature data, and then the second determining unit 904 determines the inspection result of the inspection data based on the scene feature data and the feature type. Therefore, the inspection result of the inspection data is determined by determining the characteristic type of the scene characteristic data of the inspection site, the inspection coverage can be improved, the inspection efficiency can be improved, and the inspection false detection rate can be reduced.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
Fig. 10 is a block diagram of an electronic device for a patrol method according to an embodiment of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital processors, cellular telephones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 10, the electronic device includes: one or more processors 1001, a memory 1002, and interfaces for connecting the various components, including high-speed and low-speed interfaces. The various components are interconnected by different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Likewise, multiple electronic devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 10, one processor 1001 is taken as an example.
The memory 1002 is a non-transitory computer readable storage medium provided by the present disclosure. Wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the inspection methods provided by the present disclosure. The non-transitory computer-readable storage medium of the present disclosure stores computer instructions for causing a computer to perform the inspection method provided by the present disclosure.
The memory 1002, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the inspection method in the embodiments of the present disclosure (for example, the first acquisition unit 901, the feature extraction unit 902, the first determination unit 903, and the second determination unit 904 shown in fig. 9). The processor 1001 executes the non-transitory software programs, instructions, and modules stored in the memory 1002 to perform the various functional applications and data processing of the server, that is, to implement the inspection method in the above method embodiments.
The memory 1002 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the electronic device of the inspection method, and the like. Further, the memory 1002 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 1002 may optionally include memory located remotely from the processor 1001, and such remote memory may be connected to the electronic device of the inspection method via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the inspection method may further include: an input device 1003 and an output device 1004. The processor 1001, the memory 1002, the input device 1003, and the output device 1004 may be connected by a bus or in other manners; in fig. 10, connection by a bus is taken as an example.
The input device 1003 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device of the inspection method; examples include a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, and a joystick. The output device 1004 may include a display device, auxiliary lighting devices (e.g., LEDs), tactile feedback devices (e.g., vibration motors), and the like. The display device may include, but is not limited to, a liquid crystal display (LCD), a light emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, and which receives data and instructions from, and transmits data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user may provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system that overcomes the drawbacks of difficult management and weak service extensibility found in traditional physical hosts and virtual private server (VPS) services. The server may also be a server of a distributed system, or a server combined with a blockchain.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor includes a first acquisition unit, a feature extraction unit, a first determination unit, and a second determination unit. The names of the units do not in some cases constitute a limitation on the units themselves, and for example, the first acquisition unit may also be described as a "unit that acquires patrol data of a patrol site".
As another aspect, the present disclosure also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire inspection data of an inspection site; perform scene feature extraction on the inspection data to obtain scene feature data of the inspection data; determine the feature category of the scene feature data; and determine an inspection result of the inspection data based on the scene feature data and the feature category.
The foregoing description covers only the preferred embodiments of the present disclosure and illustrates the technical principles employed. Those skilled in the art will appreciate that the scope of the invention in the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, technical solutions formed by replacing the above features with technical features of similar functions disclosed in (but not limited to) the present disclosure.

Claims (17)

1. An inspection method, the method comprising:
acquiring inspection data of an inspection site;
performing scene feature extraction on the inspection data to obtain scene feature data of the inspection data;
determining a feature category of the scene feature data;
and determining an inspection result of the inspection data based on the scene feature data and the feature category.
2. The method of claim 1, wherein the determining a feature class of the scene feature data comprises:
determining the feature category of the scene feature data from a predetermined feature category set, wherein the feature categories in the feature category set correspond one-to-one to inspection models in a pre-trained inspection model set, and the inspection models in the inspection model set are used for generating inspection results of inspection data; and
the determining the inspection result of the inspection data based on the scene feature data and the feature category includes:
determining, from the inspection model set, a target inspection model corresponding to the determined feature category;
and generating an inspection result of the inspection data based on the target inspection model.
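Claim 2's one-to-one mapping between feature categories and pre-trained inspection models can be sketched as a dispatch table. The category names and the stand-in model callables below are illustrative assumptions; real inspection models would be trained networks, not lambdas.

```python
# Hypothetical dispatch table: one pre-trained inspection model per feature
# category, as claim 2 requires. Lower-case keys are example category names.
INSPECTION_MODELS = {
    "guardrail_damaged": lambda data: {"violation": data["score"] > 0.5},
    "vehicle_overspeed": lambda data: {"violation": data["speed"] > data["limit"]},
}

def inspect(feature_category: str, scene_feature_data: dict) -> dict:
    # Select the target inspection model for the determined feature category,
    # then generate the inspection result from it.
    model = INSPECTION_MODELS[feature_category]
    return model(scene_feature_data)

res = inspect("vehicle_overspeed", {"speed": 88, "limit": 60})
```

The one-to-one correspondence means each category resolves to exactly one model, so routing is a single dictionary lookup.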
3. The method of claim 2, wherein the target inspection model is trained by:
acquiring a training sample set; wherein the training samples in the training sample set comprise: sample inspection data and a sample label corresponding to the sample inspection data; the sample label corresponding to the sample inspection data represents the inspection result of the sample inspection data; and the feature category of the scene feature data of the sample inspection data in the training sample set is the same as the determined feature category;
and inputting the sample inspection data and the sample labels included in the training sample set into an initial model, and training the initial model with a machine learning algorithm to obtain the target inspection model.
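A minimal sketch of the training step in claim 3, assuming feature-vector sample inspection data and binary sample labels (1 = violation, 0 = normal). The perceptron update here is a stand-in for the machine learning algorithm, which the claim leaves unspecified.

```python
def train_target_model(samples, epochs=20, lr=0.1):
    """Fit a linear model to (sample inspection data, sample label) pairs."""
    w = [0.0] * len(samples[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, label in samples:  # label: 1 = violation, 0 = normal
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = label - pred    # perceptron update on misclassification
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Toy training sample set: each sample pairs feature data with its label.
training_set = [([1.0, 0.9], 1), ([0.1, 0.2], 0), ([0.9, 0.8], 1), ([0.0, 0.1], 0)]
w, b = train_target_model(training_set)
predict = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

Because every sample in the set shares the determined feature category, the resulting model specializes in exactly one inspection scenario.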
4. The method of claim 1, wherein the acquiring inspection data of an inspection site comprises:
acquiring data of the inspection site that is collected and sent by an autonomous vehicle, and using the acquired data as the inspection data, wherein the autonomous vehicle sends autonomous-driving data at a higher priority than the inspection data.
5. The method of claim 1, wherein the method further comprises:
storing the inspection data and the feature data hierarchically.
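One way to read claim 5's layered storage is a tiering policy keyed on data kind. The "hot"/"cold" tier names and the rationale (small feature data queried often, bulky raw data archived) are illustrative assumptions; the claim does not name the layers.

```python
class TieredStore:
    """Hypothetical two-tier store for inspection data and feature data."""

    def __init__(self):
        self.tiers = {"hot": {}, "cold": {}}

    def store(self, key: str, record: dict) -> str:
        # Derived feature data stays in the hot tier for fast lookup;
        # raw inspection data is archived in the cold tier.
        tier = "hot" if record.get("kind") == "feature" else "cold"
        self.tiers[tier][key] = record
        return tier

store = TieredStore()
t1 = store.store("site-1/raw", {"kind": "inspection", "bytes": 10_000_000})
t2 = store.store("site-1/features", {"kind": "feature", "vector": [0.2, 0.9]})
```

Separating the two kinds keeps the frequently consulted feature data cheap to query while the raw data remains available for audits.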
6. The method according to one of claims 1-5, wherein the method further comprises:
if the inspection result indicates that a target vehicle is in a violation state, acquiring an image of the target vehicle in the violation state;
sending the image, an inspection result indicating that the target vehicle is in the violation state, and vehicle information of the target vehicle to a target terminal, wherein the target terminal is configured to determine, based on the image and the vehicle information, whether the target vehicle is in the violation state;
and if the target terminal determines that the target vehicle is in the violation state, storing the image and the vehicle information in a preset database.
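The three steps of claim 6 can be sketched as a confirmation pipeline. Modeling the target terminal as a callback and the preset database as a list is an illustrative assumption; in practice the terminal would be a remote review station and the database a persistent store.

```python
def handle_violation(result: dict, image: bytes, vehicle: dict,
                     terminal_confirms, database: list) -> bool:
    """Send evidence to the target terminal; persist only confirmed violations."""
    if not result.get("violation"):
        return False
    # The target terminal re-checks the violation from the image and vehicle info.
    confirmed = terminal_confirms(image, vehicle)
    if confirmed:
        database.append({"image": image, "vehicle": vehicle})
    return confirmed

db = []
ok = handle_violation(
    {"violation": True},
    b"jpeg-bytes",
    {"plate": "A12345"},
    terminal_confirms=lambda img, veh: True,  # stand-in for the remote check
    database=db,
)
```

The terminal's confirmation gates the write, so only violations verified at the target terminal reach the preset database.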
7. The method according to one of claims 2-5, wherein a feature category in the feature category set characterizes any one of the following scenarios:
whether a road guardrail is damaged, whether a traffic sign or marking is damaged, whether a vehicle is speeding, whether a vehicle is driving in the wrong direction, whether a vehicle is driving on a lane line, whether road-occupying construction is underway, whether a manhole cover is missing or damaged, and whether a driver is driving while fatigued.
8. An inspection device, the device comprising:
a first acquisition unit configured to acquire inspection data of an inspection site;
a feature extraction unit configured to perform scene feature extraction on the inspection data to obtain scene feature data of the inspection data;
a first determination unit configured to determine a feature category of the scene feature data;
and a second determination unit configured to determine an inspection result of the inspection data based on the scene feature data and the feature category.
9. The apparatus of claim 8, wherein the first determining unit comprises:
the first determining subunit is configured to determine the feature class of the scene feature data from a predetermined feature class set, wherein the feature classes in the feature class set correspond to inspection models in a pre-trained inspection model set in a one-to-one manner, and the inspection models in the inspection model set are used for generating inspection results of the inspection data; and
the second determination unit includes:
a second determining subunit configured to determine, from the inspection model set, a target inspection model corresponding to the determined feature category;
a generation subunit configured to generate an inspection result of the inspection data based on the target inspection model.
10. The apparatus of claim 9, wherein the target inspection model is trained by:
acquiring a training sample set; wherein the training samples in the training sample set comprise: sample inspection data and a sample label corresponding to the sample inspection data; the sample label corresponding to the sample inspection data represents the inspection result of the sample inspection data; and the feature category of the scene feature data of the sample inspection data in the training sample set is the same as the determined feature category;
and inputting the sample inspection data and the sample labels included in the training sample set into an initial model, and training the initial model with a machine learning algorithm to obtain the target inspection model.
11. The apparatus of claim 8, wherein the first obtaining unit comprises:
the system comprises an acquisition subunit and a control unit, wherein the acquisition subunit is configured to acquire data of the inspection site, which is transmitted by an automatic driving vehicle and collected by the automatic driving vehicle, and use the acquired data as inspection data, and the transmission priority of the automatic driving vehicle to the automatic driving data is higher than that of the automatic driving vehicle to the inspection data.
12. The apparatus of claim 8, wherein the apparatus further comprises:
a first storage unit configured to store the inspection data and the feature data hierarchically.
13. The apparatus according to one of claims 8-12, wherein the apparatus further comprises:
a second acquisition unit configured to acquire an image of the target vehicle in a violation state if the inspection result indicates that the target vehicle is in the violation state;
a sending unit configured to send the image, an inspection result indicating that the target vehicle is in the violation state, and vehicle information of the target vehicle to a target terminal, wherein the target terminal is configured to determine, based on the image and the vehicle information, whether the target vehicle is in the violation state;
a second storage unit configured to store the image and the vehicle information in a preset database if the target terminal determines that the target vehicle is in the violation state.
14. The apparatus according to one of claims 9-12, wherein a feature category in the feature category set characterizes any one of the following scenarios:
whether a road guardrail is damaged, whether a traffic sign or marking is damaged, whether a vehicle is speeding, whether a vehicle is driving in the wrong direction, whether a vehicle is driving on a lane line, whether road-occupying construction is underway, whether a manhole cover is missing or damaged, and whether a driver is driving while fatigued.
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-7.
17. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-7.
CN202210540370.9A 2022-05-17 2022-05-17 Inspection method and device Pending CN114783188A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210540370.9A CN114783188A (en) 2022-05-17 2022-05-17 Inspection method and device


Publications (1)

Publication Number Publication Date
CN114783188A true CN114783188A (en) 2022-07-22

Family

ID=82409608

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210540370.9A Pending CN114783188A (en) 2022-05-17 2022-05-17 Inspection method and device

Country Status (1)

Country Link
CN (1) CN114783188A (en)


Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108961768A (en) * 2018-07-30 2018-12-07 鄂尔多斯市普渡科技有限公司 The unmanned police cruiser of one kind and patrol method
CN110223413A (en) * 2019-06-12 2019-09-10 深圳铂石空间科技有限公司 Intelligent polling method, device, computer storage medium and electronic equipment
CN110379036A (en) * 2019-06-26 2019-10-25 广东康云科技有限公司 Intelligent substation patrol recognition methods, system, device and storage medium
CN110969166A (en) * 2019-12-04 2020-04-07 国网智能科技股份有限公司 Small target identification method and system in inspection scene
CN112017323A (en) * 2019-05-30 2020-12-01 深圳市优必选科技有限公司 Patrol alarm method and device, readable storage medium and terminal equipment
CN112287806A (en) * 2020-10-27 2021-01-29 北京百度网讯科技有限公司 Road information detection method, system, electronic equipment and storage medium
CN112306051A (en) * 2019-07-25 2021-02-02 武汉光庭科技有限公司 Robot system for unmanned traffic police vehicle on highway
CN112417955A (en) * 2020-10-14 2021-02-26 国电大渡河沙坪水电建设有限公司 Patrol video stream processing method and device
CN112784815A (en) * 2021-02-19 2021-05-11 苏州市大智无疆智能科技有限公司 Unmanned aerial vehicle cruising target identification method and device and cloud server
CN113538370A (en) * 2021-07-14 2021-10-22 宁波旗芯电子科技有限公司 Power grid inspection method and device, computer equipment and storage medium
CN113673459A (en) * 2021-08-26 2021-11-19 中国科学院自动化研究所 Video-based production construction site safety inspection method, system and equipment
CN113706737A (en) * 2021-10-27 2021-11-26 北京主线科技有限公司 Road surface inspection system and method based on automatic driving vehicle
CN113971795A (en) * 2021-11-18 2022-01-25 上海顺诠科技有限公司 Violation inspection system and method based on self-driving visual sensing
WO2022021739A1 (en) * 2020-07-30 2022-02-03 国网智能科技股份有限公司 Humanoid inspection operation method and system for semantic intelligent substation robot
CN114240868A (en) * 2021-12-09 2022-03-25 陕西省地方电力(集团)有限公司渭南供电分公司 Unmanned aerial vehicle-based inspection analysis system and method
CN114489122A (en) * 2021-12-30 2022-05-13 山东奥邦交通设施工程有限公司 UAV and matching airport-based automatic highway inspection method and system


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116402275A (en) * 2023-03-03 2023-07-07 河海大学 Unmanned carrier dynamic selection method for intelligent cooperative inspection of dam
CN116402275B (en) * 2023-03-03 2023-12-15 河海大学 Unmanned carrier dynamic selection method for intelligent cooperative inspection of dam
CN116828288A (en) * 2023-08-28 2023-09-29 广州信邦智能装备股份有限公司 Composite intelligent inspection robot capable of being applied to multiple scenes and related system
CN116828288B (en) * 2023-08-28 2024-01-02 广州信邦智能装备股份有限公司 Composite intelligent inspection robot capable of being applied to multiple scenes and related system
CN116797435A (en) * 2023-08-29 2023-09-22 北京道仪数慧科技有限公司 Processing system for carrying out road traffic sign inspection by utilizing bus
CN116797435B (en) * 2023-08-29 2023-10-31 北京道仪数慧科技有限公司 Processing system for carrying out road traffic sign inspection by utilizing bus

Similar Documents

Publication Publication Date Title
US11244570B2 (en) Tracking and analysis of drivers within a fleet of vehicles
CN111179585B (en) Site testing method and device for automatic driving vehicle
CN110378824B (en) Brain for public security traffic management data and construction method
CN114783188A (en) Inspection method and device
CN109804367B (en) Distributed video storage and search using edge computation
JP2022006181A (en) Traffic monitoring method, device, apparatus, and storage medium
JP7371157B2 (en) Vehicle monitoring method, device, electronic device, storage medium, computer program, cloud control platform and roadway coordination system
CN111652940A (en) Target abnormity identification method and device, electronic equipment and storage medium
CN111739344B (en) Early warning method and device and electronic equipment
Xiong et al. A kind of novel ITS based on space-air-ground big-data
US20160284214A1 (en) Vehicle-based abnormal travel event detecting and reporting
CN112069279B (en) Map data updating method, device, equipment and readable storage medium
WO2021237768A1 (en) Data-driven-based system for implementing automatic iteration of prediction model
Wang et al. Realtime wide-area vehicle trajectory tracking using millimeter-wave radar sensors and the open TJRD TS dataset
US11829959B1 (en) System and methods for fully autonomous potholes detection and road repair determination
CN113807588A (en) Traffic accident-based driving path planning method and device
CN114818056A (en) Traffic data integration method, device, equipment and medium based on BIM technology
CN110782670A (en) Scene restoration method based on data fusion, vehicle cloud platform and storage medium
CN114841712B (en) Method and device for determining illegal operation state of network appointment vehicle tour and electronic equipment
Gowtham et al. An efficient monitoring of real time traffic clearance for an emergency service vehicle using iot
Luo et al. Traffic signal transition time prediction based on aerial captures during peak hours
CN114662583A (en) Emergency event prevention and control scheduling method and device, electronic equipment and storage medium
US20220335730A1 (en) System and method for traffic signage inspection through collection, processing and transmission of data
CN116524210A (en) Automatic driving data screening method, system, electronic equipment and storage medium
CN116434525A (en) Intelligent management early warning system for expressway

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination