CN114724272A - Vehicle detection method and vehicle detection device


Info

Publication number
CN114724272A
Authority
CN
China
Prior art keywords
vehicle
detected
image
server
component
Prior art date
Legal status
Pending
Application number
CN202110015232.4A
Other languages
Chinese (zh)
Inventor
林忠能
许国华
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202110015232.4A priority Critical patent/CN114724272A/en
Priority to PCT/CN2021/120343 priority patent/WO2022148068A1/en
Publication of CN114724272A publication Critical patent/CN114724272A/en
Pending legal-status Critical Current

Classifications

    • G07C5/006 Registering or indicating the working of vehicles; indicating maintenance
    • G07C5/008 Registering or indicating the working of vehicles; communicating information to a remotely located station
    • G06T7/0002 Image analysis; inspection of images, e.g. flaw detection
    • H04L67/12 Network protocols specially adapted for proprietary or special-purpose networking environments, e.g. networks in vehicles
    • G06T2207/30252 Image analysis indexing scheme; vehicle exterior; vicinity of vehicle

Abstract

The embodiments of this application provide a vehicle detection method and a vehicle detection apparatus. In the method, a server may receive a detection request for a component to be detected of a vehicle to be detected; in response to the detection request, determine a first vehicle according to position information of the vehicle to be detected, where the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold; send a first work instruction to the vehicle to be detected, instructing the component to be detected to be switched on at least once within a target time period; send a second work instruction to the first vehicle, instructing the first vehicle to capture a first image within the target time period and send the first image to the server; and identify a fault condition of the component to be detected from the first image. With this technical solution, the server can determine the fault condition of the component to be detected by coordinating the vehicle to be detected with the first vehicle, which improves detection accuracy.

Description

Vehicle detection method and vehicle detection device
Technical Field
This application relates to the field of computer technologies, and in particular, to a vehicle detection method and a vehicle detection device.
Background
With the development of automatic driving technology, fault detection of vehicles has become an important topic in the autonomous driving field. When an autonomous vehicle develops a fault, the fault is usually difficult to discover because there is no driver, which creates a serious safety hazard.
At present, existing vehicle detection methods capture images of a vehicle to be detected with a camera and then judge the fault condition of the vehicle from the captured images. Such methods cannot accurately determine the fault condition of a specific component of the vehicle to be detected.
Disclosure of Invention
The embodiments of this application provide a vehicle detection method and a vehicle detection apparatus, involving a server, a vehicle to be detected, and a first vehicle. In the method, the server may respond to a detection request for a component to be detected of the vehicle to be detected by determining the first vehicle according to the position of the vehicle to be detected, and then send a first work instruction to the vehicle to be detected and a second work instruction to the first vehicle. The first work instruction instructs the component to be detected to be switched on at least once within a target time period; the second work instruction instructs the first vehicle to capture a first image within the target time period and send the first image to the server. Finally, the server identifies the fault condition of the component to be detected from the first image. With this technical solution, the server can determine the fault condition of the component to be detected by coordinating the vehicle to be detected with the first vehicle, which improves detection accuracy.
In a first aspect, an embodiment of this application provides a vehicle detection method applied to a server. The method includes:
determining a first vehicle according to position information of a vehicle to be detected, where the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold;
sending a first work instruction to the vehicle to be detected, where the first work instruction instructs a component to be detected of the vehicle to be detected to be switched on at least once within a target time period;
sending a second work instruction to the first vehicle, where the second work instruction instructs the first vehicle to capture a first image within the target time period and send the first image to the server; and
identifying a fault condition of the component to be detected from the first image.
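For readers who want a concrete picture of the server-side flow above, the following is a minimal sketch in Python. It is illustrative only, not the claimed implementation: the planar coordinates, the message dictionaries, and all identifiers (Vehicle, handle_detection_request, the 10-meter threshold borrowed from the example later in this description) are assumptions introduced for illustration.

```python
# Minimal sketch of the server-side flow of the first aspect (illustrative only).
# All names and the message format are assumptions, not part of the application.
from dataclasses import dataclass

TARGET_THRESHOLD_M = 10.0  # example target threshold from the detailed description


@dataclass
class Vehicle:
    vehicle_id: str
    x_m: float  # position in a local planar frame (assumption)
    y_m: float


def distance_m(a: Vehicle, b: Vehicle) -> float:
    return ((a.x_m - b.x_m) ** 2 + (a.y_m - b.y_m) ** 2) ** 0.5


def handle_detection_request(target: Vehicle, fleet: list[Vehicle],
                             component: str, t0: float, t1: float) -> list[dict]:
    """Return the work instructions the server would send."""
    # Determine first vehicles closer than the target threshold.
    first_vehicles = [v for v in fleet
                      if v.vehicle_id != target.vehicle_id
                      and distance_m(v, target) < TARGET_THRESHOLD_M]
    instructions = []
    # First work instruction: switch the component on at least once in [t0, t1].
    instructions.append({"to": target.vehicle_id, "type": "first_work_instruction",
                         "component": component, "window": (t0, t1)})
    # Second work instruction: capture the first image in [t0, t1] and upload it.
    for v in first_vehicles:
        instructions.append({"to": v.vehicle_id, "type": "second_work_instruction",
                             "window": (t0, t1), "upload_to": "server"})
    return instructions


if __name__ == "__main__":
    fleet = [Vehicle("car-A", 0.0, 0.0), Vehicle("car-B", 4.0, 3.0),
             Vehicle("car-C", 50.0, 0.0)]
    for msg in handle_detection_request(fleet[0], fleet, "right_turn_signal", 0.0, 10.0):
        print(msg)
```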
With reference to the first aspect, in a possible implementation manner, the server may receive a detection request for a component to be detected of a vehicle to be detected, where the detection request is used to instruct the server to send a first work instruction to the vehicle to be detected and send a second work instruction to the first vehicle.
With reference to the first aspect, in a possible implementation manner, the server may receive detection requests from a plurality of vehicles to be detected at the same time, and determine the detection order of the plurality of vehicles to be detected according to the number and positions of the first vehicles around each vehicle to be detected.
With reference to the first aspect, in a possible implementation manner, the server obtains a path plan of each of at least one vehicle and a path plan of the vehicle to be detected, where the distance between each of the at least one vehicle and the vehicle to be detected is smaller than the target threshold; screens out, from the at least one vehicle, vehicles whose path plans partially coincide with the path plan of the vehicle to be detected; and determines the first vehicle from the screened vehicles.
With reference to the first aspect, in a possible implementation manner, the coinciding portion of the screened vehicle's path plan and the path plan of the vehicle to be detected is a straight road within the target time period. A straight road is one whose lane lines do not cross lane lines in other directions.
In some embodiments, the server may select as the first vehicle a vehicle whose path plan coincides with that of the vehicle to be detected along a straight road within the target time period; this avoids disturbing the first vehicle's original path plan once the vehicle detection method is completed.
In other embodiments, the server may select as the first vehicle, according to the component to be detected, a vehicle whose path plan coincides with that of the vehicle to be detected along a curve within the target time period. For example, when the component to be detected is the right turn signal, the server may select as the first vehicle a vehicle whose coinciding path plan contains a right-turn curve ahead within the target time period. Understandably, this prevents the vehicle from switching on its turn signals on a straight road section merely to have the component detected, which would interfere with the normal driving of other vehicles.
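As an illustration of this screening, the sketch below encodes a path plan as an ordered list of road segments with a straight/curve flag and keeps only candidates whose overlap with the target's plan suits the component being detected. The data model and all names are assumptions; the application does not prescribe any particular representation.

```python
# Illustrative sketch of path-plan screening (the data model is an assumption).
from dataclasses import dataclass


@dataclass(frozen=True)
class Segment:
    road_id: str
    kind: str  # "straight" or "right_curve" / "left_curve" (assumed encoding)


def overlapping_segments(plan_a: list[Segment], plan_b: list[Segment]) -> list[Segment]:
    """Segments that appear in both path plans, in plan_a order."""
    b_ids = {s.road_id for s in plan_b}
    return [s for s in plan_a if s.road_id in b_ids]


def screen_first_vehicle(candidates: dict[str, list[Segment]],
                         target_plan: list[Segment],
                         component: str) -> list[str]:
    """Keep candidates whose overlap with the target's plan suits the component.

    For a right turn signal, prefer an overlap containing a right curve;
    otherwise prefer a straight overlapping stretch (see the text above).
    """
    wanted = "right_curve" if component == "right_turn_signal" else "straight"
    selected = []
    for vid, plan in candidates.items():
        overlap = overlapping_segments(plan, target_plan)
        if overlap and all(s.kind == wanted for s in overlap):
            selected.append(vid)
    return selected


if __name__ == "__main__":
    target = [Segment("r1", "straight"), Segment("r2", "right_curve")]
    candidates = {
        "car-B": [Segment("r1", "straight"), Segment("r9", "straight")],
        "car-C": [Segment("r2", "right_curve")],
    }
    print(screen_first_vehicle(candidates, target, "right_turn_signal"))  # ['car-C']
```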
With reference to the first aspect, in a possible implementation manner, the server sends a shooting instruction to the screened vehicles, where the shooting instruction instructs each screened vehicle to capture a second image; and determines the first vehicle from the screened vehicles according to the captured second images and an identifier of the vehicle to be detected, where the second image captured by the first vehicle contains the identifier.
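A sketch of this selection step follows. The identifier recognition is reduced to a placeholder string match (recognize_identifiers is a hypothetical stand-in for a real image-recognition step), so only the selection logic itself should be read as illustrative of the text above.

```python
# Illustrative: pick the first vehicle as the one whose second image
# contains the identifier of the vehicle to be detected.
# `recognize_identifiers` is a hypothetical stand-in for an image recognizer.

def recognize_identifiers(image_bytes: bytes) -> set[str]:
    # Placeholder: pretend identifiers are embedded as ASCII in the image payload.
    return {tok for tok in image_bytes.decode(errors="ignore").split()
            if tok.startswith("ID-")}


def pick_first_vehicle(second_images: dict[str, bytes], target_identifier: str) -> list[str]:
    """Vehicles whose second image contains the target identifier."""
    return [vid for vid, img in second_images.items()
            if target_identifier in recognize_identifiers(img)]


if __name__ == "__main__":
    images = {"car-B": b"... ID-12345 ...", "car-C": b"... ID-99999 ..."}
    print(pick_first_vehicle(images, "ID-12345"))  # ['car-B']
```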
With reference to the first aspect, in one possible implementation manner, when the component to be detected is located on the front side of the vehicle to be detected, the first vehicle is located in front of the vehicle to be detected.
With reference to the first aspect, in one possible implementation manner, when the component to be detected is located at the rear side of the vehicle to be detected, the first vehicle is located behind the vehicle to be detected.
With reference to the first aspect, in a possible implementation manner, the method further includes:
and determining the shooting angle of the camera according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected, wherein the second working instruction carries the shooting angle.
With reference to the first aspect, in a possible implementation manner, the server identifies the component to be detected from the first image, and determines that the component to be detected has no fault when the average gray value of the component in the first image is within a target range.
With reference to the first aspect, in a possible implementation manner, the server identifies the position of the component to be detected from the first image, and determines that the component to be detected has no fault when that position is at a target position in the first image.
With reference to the first aspect, in a possible implementation manner, the first image includes multiple frames of images, and the server identifies the position of the component to be detected in each frame; when the positions of the component across the frames are inconsistent, the server determines that the component to be detected has no fault.
With reference to the first aspect, in a possible implementation manner, the first image includes multiple frames of images; the server identifies the component to be detected in each frame, identifies the working state of the component from its gray value in each frame, and determines that the component to be detected has no fault if the change pattern of the working state identified across the frames satisfies a preset rule, where the first work instruction instructs the component to be detected to work according to the preset rule.
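The checks above can be sketched with plain array operations, as follows. The gray-value thresholds, the region-of-interest model, and the blink-pattern encoding are assumptions for illustration; the application states the criteria only in general terms.

```python
# Illustrative gray-value checks (thresholds and the ROI model are assumptions).
import numpy as np

ON_RANGE = (180.0, 255.0)  # assumed target range for a lit lamp's mean gray value


def mean_gray(frame: np.ndarray, roi: tuple[slice, slice]) -> float:
    """Average gray value of the component's region of interest in one frame."""
    return float(frame[roi].mean())


def lamp_ok_single_frame(frame: np.ndarray, roi: tuple[slice, slice]) -> bool:
    """No fault if the mean gray value falls within the target range."""
    return ON_RANGE[0] <= mean_gray(frame, roi) <= ON_RANGE[1]


def lamp_ok_blink_pattern(frames: list[np.ndarray], roi: tuple[slice, slice],
                          expected: list[bool], on_threshold: float = 128.0) -> bool:
    """No fault if the on/off state over the frames matches the preset rule
    carried by the first work instruction (e.g. blink on-off-on)."""
    observed = [mean_gray(f, roi) >= on_threshold for f in frames]
    return observed == expected


if __name__ == "__main__":
    roi = (slice(10, 20), slice(10, 20))
    bright = np.full((32, 32), 200.0)
    dark = np.full((32, 32), 30.0)
    print(lamp_ok_single_frame(bright, roi))                       # True
    print(lamp_ok_blink_pattern([bright, dark, bright], roi,
                                expected=[True, False, True]))     # True
```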
In a second aspect, an embodiment of the present application provides a vehicle detection method, which is applied to a vehicle to be detected, and the method includes:
receiving a first work instruction sent by a server, where the server is further configured to send a second work instruction to a first vehicle, the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold, the second work instruction instructs the first vehicle to capture a first image within a target time period and send the first image to the server, and the first image is used to identify a fault condition of a component to be detected of the vehicle to be detected; and
controlling, according to the first work instruction, the component to be detected to be switched on at least once within the target time period.
With reference to the second aspect, in a possible implementation manner, the vehicle to be detected may send a detection request for the component to be detected of the vehicle to be detected to the server, where the detection request is used to instruct the server to send the first work instruction to the vehicle to be detected and send the second work instruction to the first vehicle.
With reference to the second aspect, in a possible implementation manner, the vehicle to be detected controls, according to the first work instruction, the component to be detected to operate according to a preset rule within the target time period.
With reference to the second aspect, in one possible implementation manner, the vehicle to be detected controls the component to be detected to be in the working state within the target time period.
With reference to the second aspect, in one possible implementation manner, the vehicle to be detected switches on the component to be detected at a first time point and switches it off at a second time point, where the first time point and the second time point are two time points within the target time period.
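A minimal sketch of this second-aspect behavior is shown below, assuming a toy component interface (switch_on/switch_off) and relative timing; the real vehicle bus and scheduling services are not specified by the application.

```python
# Illustrative: the vehicle to be detected switches the component on at a first
# time point and off at a second time point within the target window.
# The component interface and the scheduler are assumptions.
import time


class ToyComponent:
    def __init__(self, name: str):
        self.name, self.on = name, False

    def switch_on(self):
        self.on = True
        print(f"{self.name}: ON")

    def switch_off(self):
        self.on = False
        print(f"{self.name}: OFF")


def execute_first_work_instruction(component: ToyComponent,
                                   t_on_s: float, t_off_s: float) -> None:
    """Switch the component on at t_on_s and off at t_off_s (relative seconds),
    both inside the target time period."""
    time.sleep(t_on_s)
    component.switch_on()
    time.sleep(max(0.0, t_off_s - t_on_s))
    component.switch_off()


if __name__ == "__main__":
    execute_first_work_instruction(ToyComponent("right_turn_signal"), 0.1, 0.3)
```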
In a third aspect, an embodiment of the present application provides a vehicle detection method, which is applied to a first vehicle, and includes:
receiving a second work instruction sent by a server, where the server is further configured to send a first work instruction to a vehicle to be detected, the first work instruction instructs a component to be detected of the vehicle to be detected to be switched on at least once within a target time period, and the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold;
capturing a first image within the target time period according to the second work instruction; and
sending the first image to the server, where the first image is used to identify a fault condition of the component to be detected.
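Illustratively, the first-vehicle side of this flow might look as follows; the camera read and the upload call are stubs, and the frame rate is an arbitrary assumption.

```python
# Illustrative first-vehicle behavior: capture frames during the target window
# and upload them as the "first image". Camera and upload are stubbed out.
import time


def capture_frame() -> bytes:
    return b"frame"  # stand-in for a real camera read


def upload(frames: list[bytes]) -> None:
    print(f"uploading {len(frames)} frames to server")  # stand-in for the network call


def execute_second_work_instruction(window_s: float, fps: float = 10.0) -> None:
    frames, t_end = [], time.monotonic() + window_s
    while time.monotonic() < t_end:
        frames.append(capture_frame())
        time.sleep(1.0 / fps)
    upload(frames)


if __name__ == "__main__":
    execute_second_work_instruction(window_s=0.3)
```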
With reference to the third aspect, in a possible implementation manner, the second work instruction is generated when the server responds to a detection request for the component to be detected, which is sent by the vehicle to be detected, and the detection request is further used for instructing the server to send the first work instruction to the vehicle to be detected.
With reference to the third aspect, in a possible implementation manner, before receiving the second work instruction sent by the server, the first vehicle receives a shooting instruction sent by the server; according to the shooting instruction, shooting a second image, wherein the second image is used for determining the first vehicle; and sending the second image to the server, wherein the second image comprises the identification of the vehicle to be detected.
With reference to the third aspect, in a possible implementation manner, the second work instruction carries a shooting angle at which the first vehicle can control the camera to shoot the first image, where the shooting angle is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle, and the component to be detected.
With reference to the third aspect, in one possible implementation manner, the second work instruction carries a lane to which the first vehicle can switch, and the lane is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle, and the component to be detected.
With reference to the third aspect, in a possible implementation manner, the second work instruction carries speed information. For example, suppose the vehicle to be detected is driving in the lane to the right of the first vehicle. After the first vehicle finishes shooting the component to be detected on the left side of the vehicle to be detected, if other vehicles are ahead in the first vehicle's lane, the first vehicle can decelerate to leave a gap, so that the vehicle to be detected can change into the first vehicle's lane and then change further into the left lane.
In a possible implementation manner, the server may determine a plurality of first vehicles according to the vehicle to be detected, so as to complete detection of the vehicle to be detected.
Specifically, the server may determine the vehicle to be detected according to the first instruction, and then determine the first vehicles according to the vehicle to be detected. When a plurality of first vehicles within the target threshold range can recognize the identifier of the vehicle to be detected, the server may issue second instructions to the plurality of first vehicles simultaneously, assign different ranges of components to different first vehicles in different time intervals, and have the plurality of first vehicles cooperate, simultaneously or in a time-shared manner, to detect the vehicle to be detected.
For example, the server may screen three first vehicles: one ahead of the vehicle to be detected in the same lane, one behind it in the same lane, and one behind it in the left lane. The server may issue detection of the wiper and washer-spray components to the front first vehicle, detection of the hazard lights to the rear first vehicle, and detection of the left window to the left-rear first vehicle, and may simultaneously issue instructions carrying vehicle speed information so that all the first vehicles and the vehicle to be detected maintain the same speed and detection proceeds at the same time. When detection finishes, the front and left-rear first vehicles may exit the detection, after which the server sends a brake-light detection instruction to the rear first vehicle; this instruction may carry deceleration information used to widen the distance between that first vehicle and the vehicle to be detected so that the vehicle to be detected has room to perform a braking action.
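The dispatch in this example might be represented as in the sketch below. The assignment table, component names, and speed and gap fields are assumptions derived from the example text, not a prescribed format.

```python
# Illustrative dispatch table for several cooperating first vehicles
# (the fields and values are assumptions based on the example above).
ASSIGNMENTS = [
    {"first_vehicle": "front-same-lane", "components": ["wiper", "washer_spray"]},
    {"first_vehicle": "rear-same-lane",  "components": ["hazard_lights"]},
    {"first_vehicle": "rear-left-lane",  "components": ["left_window"]},
]


def build_second_instructions(common_speed_kmh: float) -> list[dict]:
    """One second work instruction per first vehicle, all carrying the same
    speed so the formation holds while detection runs simultaneously."""
    return [{"to": a["first_vehicle"], "components": a["components"],
             "speed_kmh": common_speed_kmh} for a in ASSIGNMENTS]


def brake_light_follow_up(gap_m: float) -> dict:
    """After the others exit, the rear first vehicle gets a brake-light job
    carrying deceleration info to widen the gap for the braking action."""
    return {"to": "rear-same-lane", "components": ["brake_light"],
            "decelerate_to_gap_m": gap_m}


if __name__ == "__main__":
    for msg in build_second_instructions(60.0):
        print(msg)
    print(brake_light_follow_up(30.0))
```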
In a fourth aspect, an embodiment of the present application provides a vehicle detection apparatus, which is applied to a server, and the apparatus includes:
the determining unit is used for determining a first vehicle according to the position information of the vehicle to be detected, and the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold value;
the device comprises a sending unit, a detecting unit and a judging unit, wherein the sending unit is used for sending a first working instruction to a vehicle to be detected, and the first working instruction is used for indicating a part to be detected of the vehicle to be detected to be opened at least once within a target time period;
the sending unit is used for sending a second work instruction to the first vehicle, and the second work instruction is used for indicating the first vehicle to shoot a first image in a target time period and sending the first image to the server;
and the identification unit is used for identifying the fault condition of the component to be detected from the first image.
Optionally, the vehicle detection apparatus may further include a receiving unit, configured to receive a detection request for a to-be-detected component of a to-be-detected vehicle, where the detection request is used to instruct the server to send the first work instruction to the to-be-detected vehicle and send the second work instruction to the first vehicle.
Optionally, the vehicle detection apparatus may further include a storage unit for storing data or computer instructions.
In addition, in this aspect, reference may be made to the related matters of the first aspect for further alternative embodiments of the vehicle detection device, and details are not described here.
As an example, the receiving unit or the transmitting unit may be a transceiver or an interface, the storing unit may be a memory, and the determining unit or the identifying unit may be a processor.
In a fifth aspect, an embodiment of the present application provides a vehicle detection apparatus, which is applied to a vehicle to be detected, and the apparatus includes:
the receiving unit is used for receiving a first work instruction sent by the server, where the server is also used for sending a second work instruction to the first vehicle, the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold, the second work instruction instructs the first vehicle to capture a first image within a target time period and send the first image to the server, and the first image is used to identify a fault condition of a component to be detected of the vehicle to be detected; and
the control unit is used for controlling, according to the first work instruction, the component to be detected to be switched on at least once within the target time period.
Optionally, the vehicle detection apparatus may further include a sending unit, configured to send, to the server, a detection request for a component to be detected of the vehicle to be detected, where the detection request is used to instruct the server to send the first work instruction to the vehicle to be detected and send the second work instruction to the first vehicle.
Optionally, the vehicle detection apparatus may further include a storage unit for storing data or computer instructions.
In addition, in this aspect, reference may be made to the related contents of the second aspect for other alternative embodiments of the vehicle detection device, and details are not described here.
As an example, the receiving unit or the transmitting unit may be a transceiver or an interface, the storage unit may be a memory, and the control unit may be a processor.
In a sixth aspect, an embodiment of the present application provides a vehicle detection apparatus, which is applied to a first vehicle, and includes:
the receiving unit is used for receiving a second work instruction sent by a server, where the server is also used for sending a first work instruction to the vehicle to be detected, the first work instruction instructs a component to be detected of the vehicle to be detected to be switched on at least once within a target time period, and the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold;
the shooting unit is used for shooting a first image in the target time period according to the second working instruction;
and the sending unit is used for sending the first image to the server, and the first image is used for identifying the fault condition of the component to be detected.
Optionally, the vehicle detection apparatus may further include a storage unit for storing data or computer instructions.
In addition, in this aspect, reference may be made to the related contents of the third aspect for other alternative embodiments of the vehicle detection apparatus, and details are not described here.
As an example, the receiving unit or the transmitting unit may be a transceiver or an interface, the storage unit may be a memory, and the photographing unit may be a camera.
In a seventh aspect, an embodiment of the present application provides an electronic device, where the electronic device includes: one or more processors, memory; the memory is coupled to the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions, the one or more processors for invoking the computer instructions to cause the electronic device to perform the method as described in any of the first, second, or third aspects.
In an eighth aspect, an embodiment of the present application provides an electronic device, which may include a processor and a communication interface, where the processor obtains program instructions through the communication interface, and when the program instructions are executed by the processor, the method in any one of the first aspect, the second aspect, or the third aspect is implemented.
In a ninth aspect, embodiments of the present application provide an electronic device, which may include a processing circuit configured to perform the method according to any one of the first, second or third aspects.
In a tenth aspect, an embodiment of this application provides a chip applied to an electronic device. The chip includes one or more processors configured to invoke computer instructions to cause the electronic device to perform the method according to any one of the first, second, or third aspects. In the method according to the first, second, or third aspect, sending an instruction may be understood as the processor outputting the instruction, and receiving an instruction as the processor receiving an input instruction. When outputting an instruction, the processor outputs it to a transceiver for transmission by the transceiver; the instruction may require additional processing after being output by the processor and before reaching the transceiver. Similarly, when the processor receives an input instruction, the transceiver receives the instruction and inputs it into the processor; after the transceiver receives the instruction, it may need to be processed further before being input to the processor.
Unless specifically stated otherwise, or unless contradicted by their actual role or inherent logic in the related description, operations involving the processor such as transmitting, sending, and receiving may be understood more generally as the processor outputting, receiving, and inputting, rather than as operations performed directly by radio-frequency circuitry and an antenna.
In implementation, the processor may be a processor dedicated to performing these methods, or a processor, such as a general-purpose processor, that performs them by executing computer instructions in a memory. The memory may be a non-transitory memory, such as a read-only memory (ROM), which may be integrated on the same chip as the processor or disposed separately on a different chip.
Optionally, with reference to the tenth aspect above, in a first possible implementation manner, the chip system may further include a memory for storing the necessary program instructions and data of the vehicle. The chip system may consist of a chip, or may include a chip and other discrete devices. It may include an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or other programmable logic devices. Further, the chip system may also include an interface circuit and the like.
Optionally, in combination with the above tenth aspect, the memory is located within the processor; or the memory may be external to the processor.
In an eleventh aspect, embodiments of the present application further provide a processor, configured to perform the method according to the first aspect, the second aspect, or the third aspect.
In a twelfth aspect, embodiments of the present application provide a computer program product comprising instructions, which, when run on an electronic device, cause the electronic device to perform the method according to the first, second, or third aspect and any one of the above aspects.
In a thirteenth aspect, an embodiment of the present application provides a computer-readable storage medium, including instructions that, when executed on an electronic device, cause the electronic device to perform the method according to the first, second, or third aspect and any one of the above aspects.
Drawings
The drawings used in the embodiments of the present application are described below.
FIG. 1 is a schematic diagram of a vehicle detection system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of another vehicle detection system according to an embodiment of the present application;
FIG. 3 is a functional block diagram of a vehicle 002 according to an embodiment of the present application;
FIG. 4 is a flowchart of a vehicle detection method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a vehicle detection method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a plurality of first vehicles according to an embodiment of the present application;
FIG. 7 is a flowchart of a method for determining a first vehicle according to an embodiment of the present application;
FIG. 8A is a schematic diagram of a method for determining a vehicle in a first zone according to an embodiment of the present application;
FIG. 8B is a schematic diagram of identifying a second vehicle according to an embodiment of the present application;
FIG. 8C is a schematic diagram of determining a first vehicle from second vehicles according to an embodiment of the present application;
FIG. 9 is a flowchart of a method for determining a first vehicle from a second image according to an embodiment of the present application;
FIG. 10 is a hardware structure diagram of a server according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of a vehicle detection device according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of a vehicle detection device according to an embodiment of the present application;
FIG. 13 is a schematic structural diagram of a vehicle detection device according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of this application is for the purpose of describing particular embodiments only and is not intended to limit the embodiments of this application. As used in the description of the embodiments of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used in the embodiments of this application refers to and encompasses any and all possible combinations of one or more of the listed items.
In order to better understand a vehicle detection method and a vehicle detection apparatus provided in the embodiments of the present application, a system architecture used in the embodiments of the present application is described below.
Referring to fig. 1, fig. 1 is a schematic diagram of a vehicle detection system according to an embodiment of the present disclosure. As shown in fig. 1, the system architecture includes a server 10, a vehicle to be detected 20, and a first vehicle 30, wherein the first vehicle 30 may include a plurality of vehicles. Wherein:
first, the vehicle 20 to be detected may send a detection request for the component to be detected of the vehicle 20 to be detected to the server 10, and accordingly, the server 10 determines the first vehicle 30 according to the position of the vehicle 20 to be detected in response to the detection request. For example, the server 10 may determine a vehicle having a distance to the vehicle to be detected 20 smaller than the target threshold as the first vehicle 30.
After determining the first vehicle 30, the server 10 may send a first work order to the vehicle to be detected 20 and a second work order to the first vehicle 30. The vehicle 20 to be detected opens the part to be detected at least once within the target time period according to the first work instruction; the first vehicle 30 takes a first image according to the second work order and transmits the first image to the server 10. Finally, the server 10 can identify from the first image a fault condition of the component to be detected.
In some embodiments, the server 10 may further transmit a shooting instruction to a vehicle whose distance from the vehicle 20 to be detected is less than the target threshold, the shooting instruction being used to instruct the vehicle that received the shooting instruction to shoot the second image and transmit the second image to the server 10. Further, the server 10 may determine the vehicle including the identification of the vehicle to be detected 20 in the captured second image as the first vehicle 30.
The vehicle to be detected 20 and the first vehicle 30 in the embodiments of this application may each be the vehicle 002 shown in fig. 3. The vehicle 002 is an automobile that senses the road environment through an on-board sensing system, automatically plans a driving route, and controls itself to reach a predetermined destination. Such an intelligent vehicle integrates technologies including computing, modern sensing, information fusion, communication, artificial intelligence, and automatic control, and combines environmental perception, planning and decision-making, and multi-level driver assistance. The intelligent vehicle in this application may be a vehicle whose computer-based intelligent driver enables unmanned driving, a vehicle with a driver-assistance system or a fully automatic driving system, or a wheeled mobile robot, among others.
The server 10 in the embodiment of the present application may be implemented by an independent server or a server cluster formed by multiple servers.
Referring to fig. 2, fig. 2 is a schematic view of another vehicle detection system according to an embodiment of the present disclosure. As shown in fig. 2, the vehicle detection system architecture includes a server, a vehicle to be detected, a first vehicle, and a roadside camera, wherein the server may also interact with the roadside camera. Specifically, when the vehicle to be detected and the first vehicle pass through a road section where the roadside camera is located, the server can send an instruction to the roadside camera so that the roadside camera starts shooting and uploads a shot image to the server, and finally the server can identify the fault condition of the part to be detected of the vehicle to be detected according to the image shot by the roadside camera and the first image.
It is understood that the vehicle detection system architecture in fig. 1 or fig. 2 is only an exemplary implementation manner in the embodiment of the present application, and the vehicle detection system architecture in the embodiment of the present application includes, but is not limited to, the above vehicle detection system architecture.
Based on the vehicle detection system architecture, the embodiment of the present application provides a vehicle 002 applied to the vehicle detection system architecture, please refer to fig. 3, and fig. 3 is a functional block diagram of the vehicle 002 provided in the embodiment of the present application.
In one embodiment, the vehicle 002 may be configured in a fully or partially autonomous driving mode. For example, while in the autonomous driving mode, the vehicle 002 may control itself: the current state of the vehicle and its surrounding environment may be determined through human operation, the possible behavior of at least one other vehicle in that environment may be determined, a confidence level corresponding to the possibility that the other vehicle performs the possible behavior may be determined, and the vehicle 002 may be controlled based on the determined information. When the vehicle 002 is in the automatic driving mode, it may be placed into operation without human interaction.
Vehicle 002 may include various subsystems such as a travel system 202, a sensor system 204, a control system 206, one or more peripheral devices 208, as well as a power supply 210, a computer system 212, and a user interface 216. Alternatively, the vehicle 002 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, each of the sub-systems and elements of the vehicle 002 may be interconnected by wire or wirelessly.
The travel system 202 may include components that provide powered motion to the vehicle 002. In one embodiment, the travel system 202 may include an engine 218, an energy source 219, a transmission 220, and wheels 221. The engine 218 may be an internal combustion engine, an electric motor, an air compression engine, or other type of engine combination, such as a hybrid engine of a gasoline engine and an electric motor, or a hybrid engine of an internal combustion engine and an air compression engine. The engine 218 converts the energy source 219 into mechanical energy.
Examples of energy sources 219 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 219 may also provide energy to other systems of the vehicle 002.
The transmission 220 may transmit mechanical power from the engine 218 to the wheels 221. The transmission 220 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 220 may also include other devices, such as a clutch. Wherein the drive shaft may comprise one or more shafts that may be coupled to one or more wheels 221.
The sensor system 204 may include several sensors that sense information about the environment surrounding the vehicle 002. For example, the sensor system 204 may include a global positioning system 222 (which may be a GPS system, a beidou system, or other positioning system), an Inertial Measurement Unit (IMU) 224, a radar 226, a laser range finder 228, and a camera 230. The sensor system 204 may also include sensors of internal systems of the monitored vehicle 002 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors may be used to detect the object and its corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a key function of the safe operation of the autonomous vehicle 002.
The global positioning system 222 may be used to estimate the geographic location of the vehicle 002. The IMU 224 is used to sense the position and orientation change of the vehicle 002 based on inertial acceleration. In one embodiment, the IMU 224 may be a combination of an accelerometer and a gyroscope. For example: the IMU 224 may be used to measure the curvature of the vehicle 002.
The radar 226 may utilize radio signals to sense objects within the surrounding environment of the vehicle 002. In some embodiments, in addition to sensing objects, radar 226 may also be used to sense the speed and/or heading of an object.
The laser range finder 228 may utilize laser light to sense objects in the environment in which the vehicle 002 is located. In some embodiments, laser rangefinder 228 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The camera 230 may be used to capture multiple images of the surrounding environment of the vehicle 002. The camera 230 may be a still camera or a video camera, a visible light camera or an infrared camera, and may be any camera for acquiring an image, which is not limited in the embodiment of the present application.
In the embodiment of the present application, the cameras 230 may be installed on the front side, the rear side, and the left and right sides of the vehicle 002, and the cameras 230 may be cameras that are rotated to adjust the photographing angle. In addition, the camera in the embodiment of the application can be mounted at any position on the intelligent vehicle through the telescopic rod, and when the image needs to be acquired, the telescopic rod is extended to acquire the image; when the image does not need to be acquired, the telescopic rod is contracted. In this embodiment of the application, the camera 230 may be turned on and off according to an instruction of the second work instruction received by the first vehicle, and perform shooting according to a shooting angle carried in the second work instruction.
The control system 206 is for controlling the operation of the vehicle 002 and its components. The control system 206 may include various elements including a steering unit 232, a throttle 234, a braking unit 236, a sensor fusion algorithm unit 238, a computer vision system 240, a route control system 242, and an obstacle avoidance system 244.
The steering unit 232 is operable to adjust the heading of the vehicle 002. For example, in one embodiment, it may include a steering wheel system.
The throttle 234 is used to control the operating speed of the engine 218 and thus the speed of the vehicle 002.
The brake unit 236 is used to control the vehicle 002 to decelerate. The brake unit 236 may use friction to slow the wheel 221. In other embodiments, the brake unit 236 may convert the kinetic energy of the wheel 221 into an electrical current. The brake unit 236 may take other forms to slow the rotational speed of the wheel 221 to control the speed of the vehicle 002.
The computer vision system 240 may be operable to process and analyze images captured by the camera 230 in order to identify objects and/or features in the environment surrounding the vehicle 002. The objects and/or features may include traffic signals, road boundaries, and obstacles. The computer vision system 240 may use object recognition algorithms, Structure from Motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 240 may be used to map an environment, track objects, estimate the speed of objects, and so forth.
The route control system 242 is used to determine the travel route of the vehicle 002. In some embodiments, the route control system 242 may combine data from the sensor fusion algorithm unit 238, the GPS 222, and one or more predetermined maps to determine a travel route for the vehicle 002.
The obstacle avoidance system 244 is used to identify, assess and avoid or otherwise negotiate potential obstacles in the environment of the vehicle 002.
Of course, in one example, the control system 206 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
The vehicle 002 interacts with external sensors, other vehicles, other computer systems, or users through the peripheral devices 208. The peripheral devices 208 may include a wireless communication system 246, an in-vehicle computer 248, a microphone 250, and/or speakers 252.
In some embodiments, the peripheral device 208 provides a means for a user of the vehicle 002 to interact with the user interface 216. For example, the onboard computer 248 may provide information to the user of the vehicle 002. User interface 216 may also operate in-vehicle computer 248 to receive user inputs. The in-vehicle computer 248 can be operated through a touch screen. In other instances, the peripheral device 208 may provide a means for the vehicle 002 to communicate with other devices located within the vehicle. For example, the microphone 250 may receive audio (e.g., voice commands or other audio input) from a user of the vehicle 002. Similarly, the speaker 252 may output audio to a user of the vehicle 002.
The wireless communication system 246 may communicate wirelessly with one or more devices directly or via a communication network, for example using 3G cellular communication such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communication such as long term evolution (LTE); 5G cellular communication such as the new radio (NR) system; or a future communication system. The wireless communication system 246 may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system 246 may communicate directly with devices using an infrared link, Bluetooth, or a wireless personal area network (ZigBee). Other wireless protocols are also possible, for example various vehicular communication systems: the wireless communication system 246 may include one or more dedicated short range communications (DSRC) devices, which may carry public and/or private data communications between vehicles and/or roadside stations.
The power supply 210 may provide power to various components of the vehicle 002. In one embodiment, the power source 210 may be a rechargeable lithium ion or lead acid battery. One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 002. In some embodiments, the power source 210 and the energy source 219 may be implemented together, such as in some all-electric vehicles.
Some or all of the functions of the vehicle 002 are controlled by the computer system 212. The computer system 212 may include at least one processor 213, the processor 213 executing instructions 215 stored in a non-transitory computer readable medium, such as a data storage device 214. The computer system 212 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 002 in a distributed manner.
The processor 213 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor may be a dedicated device such as an ASIC or other hardware-based processor. Although fig. 3 functionally illustrates a processor, memory, and other elements of the computer 120 in the same block, those skilled in the art will appreciate that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard disk drive or other storage medium located in a different housing than computer 120. Thus, references to a processor or computer are to be understood as including references to a collection of processors or computers or memories which may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering and deceleration components, may each have their own processor that performs only computations related to the component-specific functions.
In various aspects described herein, the processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
In some embodiments, the memory 214 may contain instructions 215 (e.g., program logic) executable by the processor 213 to perform various functions of the vehicle 002, including those described above. The memory 214 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 202, the sensor system 204, the control system 206, and the peripheral devices 208.
In addition to instructions 215, memory 214 may also store data such as road maps, route information, the location, direction, speed of the vehicle, and other such vehicle data, among other information. Such information may be used by the vehicle 002 and the computer system 212 during operation of the vehicle 002 in autonomous, semi-autonomous, and/or manual modes. For example: the current speed and the current curvature of the vehicle can be finely adjusted according to the road information of the target road section and the received vehicle speed range and vehicle curvature range, so that the speed and the curvature of the intelligent vehicle are within the vehicle speed range and the vehicle curvature range.
A user interface 216 for providing information to or receiving information from a user of the vehicle 002. Optionally, the user interface 216 may include one or more input/output devices within the collection of peripheral devices 208, such as a wireless communication system 246, an in-vehicle computer 248, a microphone 250, and a speaker 252.
The computer system 212 may control the functions of the vehicle 002 based on inputs received from various subsystems (e.g., the travel system 202, the sensor system 204, and the control system 206) and from the user interface 216. For example, the computer system 212 may utilize input from the control system 206 to control the steering unit 232 to avoid obstacles detected by the sensor system 204 and the obstacle avoidance system 244. In some embodiments, the computer system 212 is operable to provide control over many aspects of the vehicle 002 and its subsystems.
Alternatively, one or more of these components described above may be mounted or associated separately from the vehicle 002. For example, the memory 214 may exist partially or completely separate from the vehicle 002. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example, in an actual application, components in the above modules may be added or deleted according to an actual need, and fig. 3 should not be construed as limiting the embodiment of the present application.
An autonomous automobile traveling on a roadway, such as vehicle 002 above, may identify objects within its surrounding environment to determine an adjustment to the current speed. The object may be another vehicle, a traffic control device, or another type of object. In some examples, each identified object may be considered independently, and based on the respective characteristics of the object, such as its current speed, acceleration, separation from the vehicle, etc., may be used to determine the speed at which the autonomous vehicle is to be adjusted.
Optionally, the autonomous automobile vehicle 002 or a computing device associated with the autonomous vehicle 002 (e.g., computer system 212, computer vision system 240, memory 214 of fig. 3) may predict the behavior of the identified object based on characteristics of the identified object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.). Optionally, each identified object depends on the behavior of each other, so it is also possible to predict the behavior of a single identified object taking all identified objects together into account. The vehicle 002 is able to adjust its speed based on the predicted behaviour of said identified object. In this process, other factors may also be considered to determine the speed of the vehicle 002, such as the lateral position of the vehicle 002 in the road being traveled, the curvature of the road, the proximity of static and dynamic objects, and so forth.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 002 so that the autonomous vehicle follows a given trajectory and/or maintains a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on the road).
The vehicle 002 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, an amusement car, a playground vehicle, construction equipment, an electric car, a golf cart, a train, a cart, or the like, and the embodiment of the present invention is not particularly limited.
It is understood that the smart vehicle function diagram in fig. 3 is only an exemplary implementation manner in the embodiment of the present application, and the smart vehicle in the embodiment of the present application includes, but is not limited to, the above structure.
Referring to fig. 4, fig. 4 is a flowchart of a vehicle detection method provided by an embodiment of the present application, where the method may be applied to the vehicle detection system described in fig. 1 or fig. 2, where the server 10 may be used to support and execute the method steps 102-105 and 109 shown in fig. 4, the vehicle to be detected may be used to support and execute the method steps 101 and 106 shown in fig. 4, and the first vehicle may be used to support and execute the method steps 107-108 shown in fig. 4. The method may include some or all of the following steps.
101. And the vehicle to be detected sends a detection request to the server.
Optionally, the vehicle to be detected may send the detection request to the server at regular intervals to obtain the fault condition of the vehicle, or send a detection request for a component to be detected when the number of times the component has worked reaches a preset number; this is not limited here. The detection request may concern one or more components to be detected of the vehicle to be detected.
In some embodiments, the vehicle to be detected may send a detection request to the server when capturing a signal triggering that some component to be detected starts to work, where the detection request is used to instruct the server to immediately detect the component to be detected.
For example, when the vehicle to be detected turns on its right turn signal while making a right turn, sensing that the right turn signal has been switched on triggers the vehicle to send a detection request to the server, and the detection request instructs the server to detect the right turn signal at the moment the request is received. This approach detects the component at a time when it is legitimately switched on during normal driving, so it avoids situations in which the vehicle to be detected violates traffic rules just to complete detection, for example switching on a turn signal on a straight road section solely so that the turn signal can be detected.
It can be understood that step 101 is an optional step; the server may instead determine the vehicle to be detected according to the real-time position information of the vehicle to be detected that is stored in the server.
102. The server receives a detection request for a component to be detected of a vehicle to be detected.
The detection request may be sent to the server by the vehicle to be detected in step 101.
In one implementation, the detection request is generated by the server, and the server may determine the component to be detected according to weather conditions and time; for example, the component to be detected may be determined to be the wipers on a rainy day, or the vehicle lamps at night. It can be understood that this method avoids affecting other vehicles.
In another implementation, the detection request received by the server may also be sent by another server or entered manually, which is not limited herein.
103. The server responds to the detection request and determines the first vehicle according to the position information of the vehicle to be detected.
Optionally, after receiving the detection request, the server may search for a vehicle according to the position information of the vehicle to be detected and a preset condition, and determine a vehicle meeting the preset condition as the first vehicle. The preset condition may be that the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold. For example, if the preset condition is that the distance between the first vehicle and the vehicle to be detected is less than 10 meters, the server may first acquire the position of the vehicle to be detected and then determine a vehicle whose distance from the vehicle to be detected is less than 10 meters as the first vehicle. The server can acquire the position of the vehicle to be detected in real time. It can be understood that vehicles around the vehicle to be detected are in a position to photograph it, and thus a vehicle around the vehicle to be detected can be taken as the first vehicle.
The position information may be a position indicated on a map according to longitude and latitude information reported by the vehicle. Based on this position, in combination with lane direction information on the map, the server may determine whether a sought first vehicle is located ahead of or behind the vehicle to be detected in the direction of travel. The position information may also include path planning information and the like, which is not limited herein.
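For illustration, the following is a minimal Python sketch of the distance screening in step 103, using the 10-meter threshold from the example above; the position feed and its field names are assumptions of the sketch.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def candidate_first_vehicles(target, nearby_vehicles, threshold_m=10.0):
    """Return the vehicles whose distance to the vehicle to be detected is
    below the target threshold (the preset condition of step 103)."""
    return [v for v in nearby_vehicles
            if haversine_m(target["lat"], target["lon"], v["lat"], v["lon"]) < threshold_m]
```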
In one implementation, the preset condition may also include a condition determined from the component to be detected. For example, when the component to be detected is located on the left or right side of the vehicle to be detected, the preset condition further includes that the vehicle to be detected is traveling on a road with at least two lanes; the server acquires the lane conditions according to the current position of the vehicle to be detected, and searches for a vehicle whose distance from the vehicle to be detected is smaller than the target threshold only when the vehicle to be detected is traveling on a road with at least two lanes. It can be understood that when the component to be detected is located on the left or right side of the vehicle to be detected, if the first vehicle and the vehicle to be detected were front and rear vehicles in the same lane, it would be difficult for the first vehicle to photograph the component to be detected.
In some embodiments, the server may also determine the first vehicle according to other preset conditions, which is not limited herein. For the method by which the server determines the first vehicle according to the vehicle to be detected, refer to steps 201 to 204, which are not described again here.
It should be noted that, when the server does not find a vehicle meeting the preset condition, the server may feed this back to the vehicle to be detected, or may continue to acquire the position of the vehicle to be detected and search for the first vehicle; when a vehicle meeting the preset condition appears, that vehicle is determined as the first vehicle, and steps 104 to 108 are then executed.
In some embodiments, the server may receive detection requests from a plurality of vehicles to be detected simultaneously. The server then determines the first vehicles for the plurality of vehicles to be detected respectively, determines a detection sequence for them according to the number and positions of their first vehicles, and detects each of the vehicles to be detected in turn.
104. The server sends a first work instruction to the vehicle to be detected. Correspondingly, the vehicle to be detected receives the first work instruction.
The first work instruction is used for instructing the component to be detected to be turned on at least once within a target time period.
In some embodiments, the first work instruction may be generated by the server according to the component to be detected, and may further include the working time and working sequence of the component to be detected. For example, when the component to be detected is a wiper, the first work instruction may instruct the vehicle to be detected to operate the wiper for 5 seconds upon receiving the instruction. As another example, when the component to be detected is a turn signal, the first work instruction may instruct the vehicle to be detected to turn on the right turn signal upon receiving the instruction, turn it off after 5 seconds of operation and turn on the left turn signal, and turn off the left turn signal after another 5 seconds. Optionally, the server may further adjust the detection time of the component to be detected according to weather conditions and the like.
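For illustration, the following minimal Python sketch shows one possible encoding of the first work instruction for the turn-signal example above; the schema and field names are assumptions of the sketch rather than a format defined by this application.

```python
def build_turn_signal_instruction():
    """First work instruction for the turn-signal example: right signal on
    for 5 s, then left signal on for 5 s, within the target time period."""
    return {
        "type": "first_work_instruction",
        "component": "turn_signals",
        "target_period_s": 10.0,
        "schedule": [
            {"at_s": 0.0,  "part": "right_turn_signal", "action": "on"},
            {"at_s": 5.0,  "part": "right_turn_signal", "action": "off"},
            {"at_s": 5.0,  "part": "left_turn_signal",  "action": "on"},
            {"at_s": 10.0, "part": "left_turn_signal",  "action": "off"},
        ],
    }
```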
105. The server sends a second work instruction to the first vehicle. Correspondingly, the first vehicle receives the second work instruction.
The second work instruction is used for instructing the first vehicle to capture a first image within the target time period and send the first image to the server. It can be understood that the first work instruction and the second work instruction together enable the first vehicle to capture the working condition of the component to be detected, resulting in the first image.
In some embodiments, the second work instruction may also carry a shooting angle for the camera. Optionally, the server may determine the shooting angle according to the position of the vehicle to be detected, the position of the first vehicle, and the component to be detected, where the two positions are used to determine the relative position between the vehicle to be detected and the first vehicle. For example, the server may conclude from the positions of the two vehicles that the vehicle to be detected is located behind the first vehicle, and further determine the shooting angle according to the specific position of the component to be detected, such as the left turn signal. It can be understood that the positions of the vehicle to be detected, the first vehicle, and the component to be detected in this embodiment of the application may be their spatial geographic coordinates, so the resulting shooting angle may be a direction in three-dimensional space.
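For illustration, the following minimal sketch derives a camera pointing direction from the spatial coordinates of the first vehicle's camera and the component to be detected; the coordinate frame (a local east-north-up frame in meters) is an assumption of the sketch.

```python
import numpy as np

def shooting_direction(camera_pos, component_pos):
    """Return (yaw, pitch) in degrees pointing from the first vehicle's
    camera toward the component to be detected, given positions in a
    local east-north-up frame (meters)."""
    d = np.asarray(component_pos, dtype=float) - np.asarray(camera_pos, dtype=float)
    yaw = np.degrees(np.arctan2(d[1], d[0]))                    # heading in the horizontal plane
    pitch = np.degrees(np.arctan2(d[2], np.hypot(d[0], d[1])))  # elevation above the horizon
    return yaw, pitch
```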
In other embodiments, the first and second work instructions may further include instructions to control vehicle speed, switch lanes, and so on, so as to adjust the positions of the vehicle to be detected and the first vehicle. For example, when the component to be detected is a brake lamp, the first work instruction instructs the vehicle to be detected to brake at least once, and the second work instruction also instructs the first vehicle to adjust its distance from the vehicle to be detected before and after each braking. In a specific cooperation process, the first vehicle keeps a first distance from the vehicle to be detected according to the second work instruction; the vehicle to be detected presses the brake according to the first work instruction, holds it for 2 seconds, and releases it; the distance between the two vehicles is then a second distance, and the first distance is larger than the second distance. It can be understood that having the first vehicle control the distance between the two vehicles does not interfere with the normal operation of the component to be detected, and can prevent both a collision caused by the vehicles being too close and an insufficiently clear first image caused by the vehicles being too far apart.
106. The vehicle to be detected controls the component to be detected to be turned on at least once within the target time period according to the first work instruction.
Specifically, the vehicle to be detected may turn on the component to be detected within the target time period according to the first work instruction. For example, according to a first work instruction for a vehicle lamp, the vehicle to be detected may turn on the lamp at a first time point of the target time period and turn it off at a second time point, where the first and second time points are two time points within the target time period. As another example, according to a first work instruction for a wiper, the vehicle to be detected may keep the wiper in the working state throughout the target time period.
In some embodiments, the vehicle to be detected may also control the component to be detected to operate according to a preset rule within the target time period as indicated by the first work instruction. For example, the vehicle to be detected turns the fog lamp on and off at preset time intervals within one minute after receiving the first work instruction.
In other embodiments, the vehicle to be detected may control the component to be detected to operate within the target time period according to the first work instruction while simultaneously activating other components to cooperate with the first vehicle in capturing the first image. Referring to fig. 5, fig. 5 is a schematic view of a vehicle detection method according to an embodiment of the present disclosure. In fig. 5, the dotted-line vehicle represents the position of the vehicle at the previous moment, the solid-line vehicle represents the current position of the vehicle, and the dotted arrow represents the driving direction from the previous moment to the current moment. Specifically, as shown in fig. 5 (a), after switching to the left lane according to the first work instruction, the vehicle to be detected may keep driving straight in the left lane for 10 seconds, during which it lowers all the windows to the lowest position and then raises them to the highest position. Further, as shown in fig. 5 (b), the vehicle to be detected switches back to the original lane according to the first work instruction and then switches to the right lane, keeps straight in the right lane for 10 seconds, and again lowers all windows to the lowest position and raises them to the highest position within those 10 seconds; during this period, the first vehicle keeps driving straight and photographs the vehicle to be detected, as shown in fig. 5. When there is only a left lane or only a right lane beside the lane of the vehicle to be detected, the server may send a second work instruction carrying the lane to the first vehicle after the vehicle to be detected has switched lanes once, so that the first vehicle switches to that lane and photographs the fault conditions of the windows and rearview-mirror lamps of the vehicle to be detected. It can be appreciated that the above method makes it possible to detect components on the sides of the vehicle, increasing the coverage of detectable components.
Optionally, the vehicle to be detected may send feedback to the server that execution of the first work instruction is complete.
107. The first vehicle captures the first image within the target time period according to the second work instruction.
The first image may be a single image or a video. It can be understood that a video is an image sequence made up of a plurality of images.
In some embodiments, the second work instruction further includes a shooting angle. Specifically, the first vehicle may determine the target camera according to the shooting angle, for example, determine that the target camera is the front camera (the on-vehicle camera located on the front side of the first vehicle), then adjust the camera to the shooting angle, for example rotate the front camera so that the shooting direction points down and to the left toward the vehicle lamp, and turn the camera on within the target time period to obtain the first image.
108. The first vehicle sends the first image to the server. Correspondingly, the server receives the first image.
Optionally, the first vehicle may send the first image to the server after step 107 is executed, or, when the first image comprises a plurality of images, may send each acquired image to the server in real time, which is not limited herein.
109. The server identifies the fault condition of the component to be detected from the first image.
Specifically, the server may identify the component to be detected in the first image, and then identify the fault condition of the component to be detected according to the component to be detected in the first image.
In some embodiments, the first image is one image.
In one implementation, the server may identify the position of the component to be detected from the first image, and determine that the component is fault-free when its position in the first image is at the target position. For example, if the component to be detected is a window and the first work instruction indicates that the window should be in the half-open state, the server may identify the position of the window from the first image, compare it with the target position, and determine that the window is fault-free when it detects the window at the half-open position in the first image.
In another implementation, the server may identify the component to be detected from the first image, and determine that the component is fault-free when its average gray value in the first image is within a target range. For example, if the component to be detected is a vehicle lamp and the first work instruction controls the lamp to be in the on state, the server determines that the lamp is fault-free when it recognizes that the average gray value of the lamp in the first image is within the gray-value range expected for a lit lamp.
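For illustration, the following minimal sketch shows the average-gray-value check described above, assuming the lamp region in the first image has already been located; the OpenCV usage and the target range are assumptions of the sketch.

```python
import cv2
import numpy as np

def lamp_is_on(image_bgr, lamp_box, target_range=(180, 255)):
    """Return True (no fault for an 'on' instruction) if the mean gray
    value of the lamp region falls in the range expected for a lit lamp.
    lamp_box is (x, y, w, h) in pixels; target_range is illustrative."""
    x, y, w, h = lamp_box
    region = image_bgr[y:y + h, x:x + w]
    gray = cv2.cvtColor(region, cv2.COLOR_BGR2GRAY)
    mean = float(np.mean(gray))
    return target_range[0] <= mean <= target_range[1]
```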
In other embodiments, the first image comprises a plurality of images.
In one implementation, the server may identify the position of the component to be detected in each frame of the first image, and determine that the component is fault-free when its positions in the multiple frames are inconsistent. For example, the first work instruction instructs the wiper to complete one stroke within the target time period, where the target time period is the time required for one stroke; the server determines that the wiper is fault-free when it recognizes that the wiper positions in the multiple frames are inconsistent. Alternatively, the server may identify both the wiper and the distribution of the sprayed water in each frame, and determine that the wiper is fault-free when the wiper position and water distribution change regularly from frame to frame.
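For illustration, the following minimal sketch shows the multi-frame position check for the wiper example, assuming the wiper has already been localized in each frame as a bounding-box center; the tolerance value is an assumption of the sketch.

```python
def wiper_moves(centers, tol=1.0):
    """No fault if the wiper position differs across the frame sequence.
    centers: list of (x, y) wiper positions, one per frame; tol in pixels."""
    if len(centers) < 2:
        return False
    x0, y0 = centers[0]
    return any(abs(x - x0) > tol or abs(y - y0) > tol for x, y in centers[1:])
```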
In another implementation, the server may identify the component to be detected in each frame of the multiple images, identify the working state of the component according to its gray value in each frame, and determine that the component is fault-free if the change pattern of the working state identified from the multiple frames satisfies a preset rule, where the first work instruction instructs the component to operate according to that preset rule.
In another implementation, the server may determine that the component to be detected is fault-free if the working state of the component differs between two of the multiple images. For example, the first work instruction instructs the component to be detected to turn on and off once within the target time period.
Optionally, after obtaining the fault condition of the component to be detected, the server may send it to the vehicle to be detected. When the component to be detected has a fault, the vehicle to be detected may send a query request to the server. According to the query request, the server can search the map for repair point information within a preset distance of the vehicle to be detected and send the found repair point information to the vehicle to be detected.
Optionally, the server may further interact with a fixed roadside camera, acquire an image captured by the roadside camera, and identify the fault condition of the component to be detected according to that image. It can be appreciated that roadside cameras have a height advantage and can capture the condition of the roof and sunroof of the vehicle to be detected.
Optionally, the server may further identify the fault condition of the component to be detected based on both the image from the fixed roadside camera and the image of the vehicle to be detected acquired by the first vehicle.
Optionally, after step 103 is executed and before step 104 is executed, the server may send a command to the vehicle to be detected so that it keeps its lane and travels at a constant speed, and simultaneously send a command to the first vehicle so that it accelerates or decelerates until it is within a target distance of the vehicle to be detected, then keeps a constant speed and sends feedback to the server once it is doing so. The target distance may be determined according to the component to be detected; the smaller the area of the component, the smaller the target distance may be. It can be understood that having the first vehicle and the vehicle to be detected keep a constant speed and the target distance can improve the clarity of the first image captured by the first vehicle.
An example is described below in which the detection request is generated by the vehicle to be detected when the component to be detected starts to operate.
Optionally, in step 101, the vehicle to be detected sends a detection request for the component to be detected to the server when the component is turned on. In steps 102 and 103, the server can recognize from the detection request that the component to be detected is already in the working state, so steps 104 and 106 may be omitted.
The server can respond to such a detection request with priority, screen out a plurality of first vehicles according to preset priority conditions, and complete the detection of the component to be detected through steps 105, 107, 108, and 109. Optionally, the server issues second work instructions to the plurality of first vehicles to capture images at the corresponding angles, and may determine the fault condition of the component to be detected by combining the vehicle identifier in each image with the corresponding working state of the component.
In this case, the server's process of screening the first vehicle need not include all of the following steps 201 to 204. It can be understood that, because the detection request is sent to the server at the moment the component to be detected is turned on, the server determines the first vehicle in the shortest possible time and has it photograph the vehicle to be detected, so detection can be completed within the time the component is normally on, avoiding unnecessary impact on traffic. By reducing the steps for determining the first vehicle as much as possible, the server shortens the detection process and prevents an erroneous detection result caused by the component to be detected stopping operation before the first vehicle photographs the vehicle to be detected.
With reference to fig. 6, an embodiment of detecting a vehicle to be detected by a plurality of first vehicles is described below.
Specifically, the server may determine the vehicle to be detected according to a first instruction, and then determine the first vehicle according to the vehicle to be detected. When a plurality of first vehicles that can recognize the identifier of the vehicle to be detected are screened out within the target threshold range, the server can issue second work instructions to the plurality of first vehicles simultaneously, assign different ranges of components to be detected to different first vehicles in different time periods, and have the plurality of first vehicles cooperate to detect the vehicle to be detected simultaneously or in a time-shared manner.
Referring to fig. 6, fig. 6 is a schematic view of a plurality of first vehicles according to an embodiment of the present disclosure. As shown in fig. 6, vehicle No. 1, vehicle No. 2, and vehicle No. 3 are three first vehicles screened by the server, located in front of and behind the vehicle to be detected in the same lane, and to its left rear, respectively. Specifically, the server can issue a detection instruction for the wiper and water-spraying components to vehicle No. 1 in front, a detection instruction for the hazard-lamp component to vehicle No. 2 behind, and a detection instruction for the left-window component to vehicle No. 3 at the left rear, each instruction carrying vehicle speed information used to make all the first vehicles and the vehicle to be detected maintain the same speed while detection is performed simultaneously. When this detection is finished, vehicle No. 1 and vehicle No. 3 can exit the detection; the server then sends a brake-lamp detection instruction to vehicle No. 2, which may carry deceleration information used to increase the distance between vehicle No. 2 and the vehicle to be detected so that the vehicle to be detected has room to perform the braking action.
Referring to fig. 7, fig. 7 is a flowchart of a method for determining a first vehicle according to an embodiment of the present application. As shown in fig. 7, the method for determining the first vehicle by the server may include the following steps:
201. Determine the vehicles in the first area according to the position of the vehicle to be detected.
Referring to fig. 8A, fig. 8A is a schematic diagram illustrating how the vehicles in the first area are determined according to an embodiment of the present disclosure. As shown in fig. 8A, the server may determine the vehicles around the vehicle to be detected as the vehicles within the first area according to the position of the vehicle to be detected. The vehicles in the first area may be vehicles whose distance from the vehicle to be detected is smaller than the target threshold; for details, refer to the related description of step 103.
202. Screen out, from the vehicles in the first area, vehicles whose path plans partially overlap the path plan of the vehicle to be detected.
Specifically, the server may obtain the path plan of each vehicle in the first area, compare it with the path plan of the vehicle to be detected, and screen out the vehicles whose path plans partially overlap that of the vehicle to be detected. A path plan is the route a vehicle plans to travel.
It can be understood that, because the screened vehicles' path plans partially overlap that of the vehicle to be detected, the first vehicle performs detection on the overlapping planned road. This avoids detection failure caused by the two vehicles driving in different directions during detection, and prevents the first vehicle from having to deviate from its original path plan after the detection process ends.
In some embodiments, the server may determine the first vehicle from those screened vehicles for which the overlapping portion of the path plans within the target time period is a straight road.
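For illustration, the following minimal sketch shows the path-plan overlap screening of step 202, modeling a path plan as an ordered list of road-segment identifiers; this representation is an assumption of the sketch.

```python
def overlaps_target_plan(vehicle_plan, target_plan):
    """True if the two path plans share at least one planned road segment."""
    return not set(vehicle_plan).isdisjoint(target_plan)

def screen_by_path(plans_by_vehicle, target_plan):
    """Keep vehicles whose planned route partially overlaps the target's
    (step 202); plans_by_vehicle maps vehicle id -> list of segment ids."""
    return [vid for vid, plan in plans_by_vehicle.items()
            if overlaps_target_plan(plan, target_plan)]
```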
203. Determine second vehicles from the screened vehicles according to the position of the component to be detected on the vehicle to be detected.
In one implementation, when the component to be detected is located on the front side of the vehicle to be detected, a vehicle located in front of the vehicle to be detected is determined as a second vehicle. For example, if the component to be detected is a front wiper, a front washer sprayer, a front turn signal, a headlight, a front fog lamp, a hazard lamp, a daytime running lamp, or the like, the server may determine those screened vehicles located in front of the vehicle to be detected as second vehicles.
In another implementation, when the component to be detected is located on the rear side of the vehicle to be detected, a vehicle located behind the vehicle to be detected is determined as a second vehicle. For example, if the component to be detected is a rear washer sprayer, a rear wiper, a rear turn signal, a rearview-mirror turn signal, a tail lamp, a rear fog lamp, a hazard lamp, a rearview mirror, or the like, the server may determine those screened vehicles located behind the vehicle to be detected as second vehicles. It can be appreciated that a vehicle behind the vehicle to be detected can use its on-vehicle front camera to capture images of the rear side of the vehicle to be detected.
Referring to fig. 8B, fig. 8B is a schematic diagram of determining second vehicles according to an embodiment of the present disclosure. As shown in fig. 8B, the component to be detected is located on the rear side of the vehicle to be detected; vehicle No. 1 and vehicle No. 2 are located behind the vehicle to be detected, and vehicle No. 3 is located in front of it, so vehicle No. 1 and vehicle No. 2 are determined as second vehicles.
In some embodiments, the server may directly determine a second vehicle as the first vehicle. It can be appreciated that this reduces the steps for determining the first vehicle, shortens the detection time, and improves detection efficiency.
204. Determine the first vehicle from the second vehicles according to the images respectively captured by the second vehicles and the identifier of the vehicle to be detected.
Optionally, the server may send a shooting instruction to the screened vehicles, instructing them to each capture a second image; the server then identifies the identifier of the vehicle to be detected in the second images and determines the vehicle whose second image contains the identifier as the first vehicle. For details, refer to steps 2041 to 2044.
Referring to fig. 8C, fig. 8C is a schematic diagram illustrating an example of determining the first vehicle from the second vehicles according to the present application. As shown in fig. 8C, the second vehicles include vehicle No. 1 and vehicle No. 2; the server may acquire the second images captured by them and identify the identifier of the vehicle to be detected in those images. Because vehicle No. 2 blocks the view between vehicle No. 1 and the vehicle to be detected, the second image captured by vehicle No. 1 does not include the identifier, whereas the second image captured by vehicle No. 2 does, so the server may determine vehicle No. 2 as the first vehicle.
It should be noted that the order in which the server executes steps 201 to 204 may be adjusted according to a specific embodiment, and is not limited.
Referring to fig. 9, fig. 9 is a flowchart of a method for determining a first vehicle according to a second image according to an embodiment of the present application. Specifically, the method can comprise the following steps:
2041. The server sends a shooting instruction to the second vehicle.
The shooting instruction may include a camera position and a camera angle. For example, if the component to be detected is a front wiper, the shooting instruction may instruct the second vehicle to turn on its rear camera.
2042. The second vehicle captures a second image according to the shooting instruction.
The second vehicle may be one or more vehicles. Specifically, each vehicle receives its own shooting instruction and turns on the corresponding camera according to that instruction to obtain a second image, where the second image may be a single image or an image sequence consisting of a plurality of images.
2043. The second vehicle sends the second image to the server.
2044. The server determines the first vehicle from the second vehicles according to the second images and the identifier of the vehicle to be detected.
Optionally, the server may identify the identifier of the vehicle to be detected in each second image; if the identifier is included in the second image sent by a vehicle, that vehicle is the first vehicle. The identifier may be the license plate number of the vehicle, the number of the vehicle, or the like.
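For illustration, the following minimal sketch shows the identifier check of step 2044; plate recognition itself is abstracted behind recognize_plates(), a hypothetical helper standing in for whatever recognition pipeline the server uses.

```python
def is_first_vehicle(second_image, target_plate, recognize_plates):
    """True if the second image contains the license plate (identifier)
    of the vehicle to be detected; recognize_plates() is a hypothetical
    helper returning the plate strings found in an image."""
    return target_plate in recognize_plates(second_image)
```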
Optionally, if the server finds that none of the vehicles in the same lane in front of or behind the vehicle to be detected can capture an image containing the identifier, but a vehicle in an adjacent lane can, the server may instruct that adjacent-lane vehicle to switch to the same lane as the vehicle to be detected, or may stop the vehicle detection.
It can be understood that when there is an obstacle between the first vehicle and the vehicle to be detected, the first vehicle cannot photograph the vehicle to be detected. Determining the first vehicle through the second image and the identifier of the vehicle to be detected ensures that the first vehicle can photograph the vehicle to be detected.
Fig. 10 is a schematic hardware structure diagram of a server according to an embodiment of the present application. The server 10 shown in fig. 10 (the server 10 may specifically be a computer device) includes a memory 101, a processor 102, a communication interface 103, and a bus 104. The memory 101, the processor 102, and the communication interface 103 are connected to one another through the bus 104. The memory 101 may be a Read Only Memory (ROM), a static memory device, a dynamic memory device, or a Random Access Memory (RAM). The memory 101 may store a program; when the program stored in the memory 101 is executed by the processor 102, the processor 102 and the communication interface 103 are used to perform the respective steps of the vehicle detection method in the embodiment of the present application.
The processor 102 may be a general-purpose Central Processing Unit (CPU), a microprocessor, an Application Specific Integrated Circuit (ASIC), a Graphics Processing Unit (GPU), or one or more integrated circuits, and is configured to execute related programs to implement the vehicle detection method according to the embodiments of the present application.
The processor 102 may also be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the vehicle detection method of the present application may be completed by an integrated logic circuit of hardware in the processor 102 or by instructions in the form of software. The processor 102 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks provided in the embodiments of the present application. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods provided in the embodiments of the present application may be directly performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EEPROM, or a register. The storage medium is located in the memory 101, and the processor 102 reads the information in the memory 101 and completes the vehicle fault detection method of the embodiment of the present application in combination with its hardware.
The communication interface 103 enables communication between the server 10 and other devices or communication networks using transceiver means such as, but not limited to, a transceiver. For example, data (such as the first image in the embodiment of the present application) may be acquired through the communication interface 103.
The bus 104 may include a path that transfers information between the various components of the server 10 (e.g., the memory 101, the processor 102, and the communication interface 103).
Referring to fig. 11, fig. 11 is a schematic structural diagram of a vehicle detection device according to an embodiment of the present disclosure. The vehicle detection device 300 is applied to a server, and comprises:
a determining unit 301, configured to determine a first vehicle according to the position information of the vehicle to be detected, where a distance between the first vehicle and the vehicle to be detected is smaller than a target threshold;
a sending unit 302, configured to send a first work instruction to the vehicle to be detected, where the first work instruction is used to instruct the component to be detected to be opened at least once within a target time period;
the sending unit 302 is further configured to send a second work instruction to the first vehicle, where the second work instruction is used to instruct the first vehicle to capture a first image in the target time period and send the first image to the server;
an identifying unit 303 for identifying a failure condition of the component to be detected from the first image.
In a possible implementation manner, the vehicle detection apparatus may further include a receiving unit 304, configured to receive a detection request for a to-be-detected component of a to-be-detected vehicle, where the detection request is used to instruct the server to send the first work instruction to the to-be-detected vehicle and send the second work instruction to the first vehicle.
In a possible implementation manner, the determining unit 301 is further configured to:
obtaining a path plan of each vehicle in at least one vehicle and a path plan of the vehicle to be detected, wherein the distance between the at least one vehicle and the vehicle to be detected is smaller than the target threshold value;
screening out vehicles with the path plans partially overlapped with the path plans of the vehicles to be detected from the at least one vehicle;
determining the first vehicle from the screened vehicles.
In a possible implementation manner, the determining unit 301 is further configured to:
sending a shooting instruction to the screened vehicles, wherein the shooting instruction is used for indicating the screened vehicles to respectively shoot second images;
and determining the first vehicle from the screened vehicles according to the second images respectively shot by the screened vehicles and the identification of the vehicle to be detected, wherein the second image shot by the first vehicle comprises the identification.
In one possible implementation, when the component to be detected is located on the front side of the vehicle to be detected, the first vehicle is located in front of the vehicle to be detected.
In one possible implementation, when the to-be-detected component is located on the rear side of the to-be-detected vehicle, the first vehicle is located behind the to-be-detected vehicle.
In a possible implementation manner, the determining unit 301 is configured to:
and determining the shooting angle of the camera according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected, wherein the second working instruction carries the shooting angle.
In a possible implementation manner, the identifying unit 303 is configured to:
identifying the component to be detected from the first image;
and when the average gray value of the component to be detected in the first image is within a target range, determining that no fault exists in the component to be detected.
In a possible implementation manner, the identifying unit 303 is configured to:
identifying the position of the component to be detected from the first image;
and when the position of the component to be detected in the first image is located at the target position, determining that no fault exists in the component to be detected.
In a possible implementation manner, the first image includes a plurality of frame images, and the identifying unit 303 is configured to:
respectively identifying the position of the component to be detected in each frame of the first image;
and when the positions of the component to be detected in the multiple frames of images are inconsistent, determining that no fault exists in the component to be detected.
In a possible implementation manner, the first image includes a plurality of frame images, and the identifying unit 303 is configured to:
identifying the component to be detected in each frame of the multiple frames of images;
identifying the working state of the component to be detected according to the gray value of the component to be detected in each frame of image;
and if the change pattern of the working state of the component to be detected, identified from the multiple frames of images, satisfies a preset rule, determining that no fault exists in the component to be detected, where the first work instruction is used for instructing the component to be detected to work according to the preset rule.
More detailed descriptions about the receiving unit 304, the determining unit 301, the sending unit 302, and the identifying unit 303 can be directly obtained by referring to the related descriptions in the embodiment of the method shown in fig. 4, which are not repeated herein.
Referring to fig. 12, fig. 12 is a schematic structural diagram of a vehicle detection device according to an embodiment of the present application. The vehicle detecting device 400 is applied to a vehicle to be detected, and includes:
the receiving unit 401 is configured to receive a first work instruction sent by the server, where the server is further configured to send a second work instruction to the first vehicle, where a distance between the first vehicle and the vehicle to be detected is smaller than a target threshold, the second work instruction is used to instruct the first vehicle to take a first image within a target time period and send the first image to the server, and the first image is used to identify a fault condition of a component to be detected of the vehicle to be detected;
and the control unit 402 is configured to control the to-be-detected component to be started at least once within the target time period according to the first working instruction.
In a possible implementation manner, the vehicle detection apparatus may further include a sending unit 403, configured to send, to the server, a detection request for a component to be detected of the vehicle to be detected, where the detection request is used to instruct the server to send the first work instruction to the vehicle to be detected and the second work instruction to the first vehicle.
In a possible implementation manner, the control unit 402 is configured to:
and controlling the part to be detected to work according to a preset rule in the target time period according to the first working instruction.
In a possible implementation manner, the control unit 402 is configured to:
and controlling the part to be detected to be in the working state in the target time period.
In a possible implementation manner, the component to be detected is a vehicle lamp, and the control unit 402 is configured to:
turning on the vehicle lamp at a first time point;
and turning off the vehicle lamp at a second time point, wherein the first time point and the second time point are two time points in the target time period.
More detailed descriptions about the sending unit 403, the receiving unit 401, and the control unit 402 can be directly obtained by referring to the related descriptions in the method embodiment shown in fig. 4, which are not repeated herein.
Referring to fig. 13, fig. 13 is a schematic structural diagram of a vehicle detection device according to an embodiment of the present application. The vehicle inspection device 500 is applied to a vehicle to be inspected, and includes:
the receiving unit 501 is configured to receive a second work instruction sent by a server, where the server is further configured to send a first work instruction to the vehicle to be detected, and the first work instruction is used to instruct a component to be detected of the vehicle to be detected to be started at least once within a target time period; the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold value;
a shooting unit 502, configured to shoot a first image within the target time period according to the second work instruction;
a sending unit 503, configured to send the first image to the server, where the first image is used to identify a fault condition of the component to be detected.
In a possible implementation manner, the second work instruction is generated when the server responds to a detection request sent by the vehicle to be detected for the component to be detected, where the detection request is further used for instructing the server to send the first work instruction to the vehicle to be detected.
In a possible implementation manner, the receiving unit 501 is configured to receive a shooting instruction sent by the server; the shooting unit 502 is configured to capture a second image according to the shooting instruction, where the second image is used to determine the first vehicle; and the sending unit 503 is configured to send the second image to the server, where the second image includes the identifier of the vehicle to be detected.
In a possible implementation manner, the second work instruction carries a shooting angle, and the shooting unit 502 is configured to:
and controlling a camera to shoot the first image at the shooting angle, wherein the shooting angle is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected.
In a possible implementation, the second work instruction carries a lane, and the apparatus further includes a control unit 504, where the control unit 504 is configured to:
and switching to the lane, wherein the lane is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected.
More detailed descriptions about the receiving unit 501, the capturing unit 502, the sending unit 503, and the controlling unit 504 can be directly obtained by referring to the description in the embodiment of the method shown in fig. 4, which is not repeated herein.
An embodiment of the present application provides an electronic device, which includes: one or more processors, memory; the memory is coupled to the one or more processors and is configured to store computer program code comprising computer instructions, the one or more processors are configured to invoke the computer instructions to cause the electronic device to perform the method as performed by the server of fig. 4-9.
The embodiment of the present application further provides an electronic device, which may include a processor and a communication interface, where the processor obtains program instructions through the communication interface, and when the program instructions are executed by the processor, the electronic device implements the method performed by the vehicle to be detected in fig. 4 to 9.
Embodiments of the present application further provide an electronic device, which may include a processing circuit configured to execute the method performed by the first vehicle in fig. 4 to 9.
The embodiment of the present application provides a chip applied to an electronic device, where the chip includes one or more processors, and the processors are configured to invoke computer instructions to cause the electronic device to execute the method described in fig. 4 to 9.
Optionally, in a first possible implementation, the chip system may further include a memory for storing necessary program instructions and data of the vehicle. The chip system may consist of a chip, or may include a chip and other discrete devices. The chip system may include an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), or another programmable logic device. Further, the chip system may also include an interface circuit and the like.
Optionally, the memory is located within the processor; or the memory may be external to the processor.
Embodiments of the present application provide a computer program product including instructions, which, when run on an electronic device, cause the electronic device to perform the method described in fig. 4 to 9.
An embodiment of the present application provides a computer-readable storage medium, which includes instructions that, when executed on an electronic device, cause the electronic device to perform the method described in fig. 4 to 9.
In the above-described embodiments, all or part of the functions may be implemented by software, hardware, or a combination of software and hardware. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer readable storage medium. The computer readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media capable of storing program codes, such as ROM or RAM, magnetic or optical disks, etc.

Claims (46)

1. A vehicle detection method, applied to a server, the method comprising:
determining a first vehicle according to the position information of the vehicle to be detected, wherein the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold value;
sending a first work instruction to the vehicle to be detected, wherein the first work instruction is used for instructing a component to be detected of the vehicle to be detected to be turned on at least once within a target time period;
sending a second work instruction to the first vehicle, wherein the second work instruction is used for instructing the first vehicle to take a first image in the target time period and sending the first image to the server;
and identifying the fault condition of the component to be detected from the first image.
2. The method according to claim 1, wherein before said determining the first vehicle from the position information of the vehicle to be detected, the method further comprises:
receiving a detection request for the component to be detected of the vehicle to be detected, wherein the detection request is used for instructing the server to send the first work instruction to the vehicle to be detected and the second work instruction to the first vehicle.
3. The method according to claim 1 or 2, wherein determining the first vehicle based on the position information of the vehicle to be detected comprises:
obtaining a path plan of each vehicle in at least one vehicle and a path plan of the vehicle to be detected, wherein the distance between the at least one vehicle and the vehicle to be detected is smaller than the target threshold value;
screening out, from the at least one vehicle, vehicles whose path plans partially overlap the path plan of the vehicle to be detected;
determining the first vehicle from the screened vehicles.
4. The method of claim 3, wherein said determining the first vehicle from the screened vehicles comprises:
sending a shooting instruction to the screened vehicles, wherein the shooting instruction is used for instructing the screened vehicles to respectively shoot second images;
and determining the first vehicle from the screened vehicles according to the second images respectively shot by the screened vehicles and the identification of the vehicle to be detected, wherein the second image shot by the first vehicle comprises the identification.
5. The method according to any one of claims 1-4, wherein the first vehicle is located in front of the vehicle to be detected when the component to be detected is located on a front side of the vehicle to be detected.
6. The method according to any one of claims 1-4, wherein the first vehicle is located behind the vehicle to be detected when the component to be detected is located on a rear side of the vehicle to be detected.
7. The method according to any one of claims 1-6, further comprising:
and determining the shooting angle of the camera according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected, wherein the second working instruction carries the shooting angle.
8. The method according to any one of claims 1 to 7, wherein said identifying a fault condition of said component to be detected in said first image comprises:
identifying the component to be detected from the first image;
and when the average gray value of the component to be detected in the first image is within a target range, determining that no fault exists in the component to be detected.
9. The method according to any one of claims 1 to 7, wherein said identifying a fault condition of said component to be detected in said first image comprises:
identifying the position of the component to be detected from the first image;
and when the position of the component to be detected in the first image is located at the target position, determining that no fault exists in the component to be detected.
10. The method according to any one of claims 1 to 7, wherein the first image comprises a plurality of frames of images, and the identifying the fault condition of the component to be detected in the first image comprises:
respectively identifying the position of the component to be detected in each frame of the first image;
and when the positions of the component to be detected in the multiple frames of images are inconsistent, determining that no fault exists in the component to be detected.
11. The method according to any one of claims 1 to 7, wherein the first image comprises a plurality of frames of images, and the identifying the fault condition of the component to be detected in the first image comprises:
identifying the component to be detected in each frame of the multiple frames of images;
identifying the working state of the component to be detected according to the gray value of the component to be detected in each frame of image;
and if the change pattern of the working state of the component to be detected, identified from the multiple frames of images, satisfies a preset rule, determining that no fault exists in the component to be detected, wherein the first work instruction is used for instructing the component to be detected to work according to the preset rule.
12. A vehicle detection method, applied to a vehicle to be detected, the method comprising:
receiving a first work instruction sent by a server, wherein the server is further used for sending a second work instruction to a first vehicle, the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold value, the second work instruction is used for indicating the first vehicle to shoot a first image within a target time period and sending the first image to the server, and the first image is used for identifying the fault condition of a component to be detected of the vehicle to be detected;
and controlling, according to the first work instruction, the component to be detected to be turned on at least once in the target time period.
13. The method according to claim 12, wherein before the receiving of the first work instruction sent by the server, the method further comprises:
sending, to the server, a detection request for the component to be detected of the vehicle to be detected, wherein the detection request is used for instructing the server to send the first work instruction to the vehicle to be detected and the second work instruction to the first vehicle.
14. The method according to claim 12 or 13, wherein the controlling the component to be detected to be turned on at least once in the target time period comprises:
controlling, according to the first work instruction, the component to be detected to work according to a preset rule in the target time period.
15. The method according to claim 12 or 13, wherein the controlling the component to be detected to be turned on at least once in the target time period comprises:
controlling the component to be detected to be in a working state in the target time period.
16. The method according to claim 12 or 13, wherein the controlling the component to be detected to be turned on at least once in the target time period comprises:
turning on the component to be detected at a first time point;
and turning off the component to be detected at a second time point, wherein the first time point and the second time point are two time points in the target time period.
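A minimal sketch of the vehicle-side behaviour in claim 16, with turn_on and turn_off as placeholders for the real actuator interface; the 25 %/75 % placement of the two time points inside the window is an arbitrary illustrative choice:

```python
import time

def exercise_component(turn_on, turn_off, window_start, window_end):
    """Turn the component on at a first time point and off at a second,
    both chosen inside [window_start, window_end]."""
    t_on = window_start + 0.25 * (window_end - window_start)
    t_off = window_start + 0.75 * (window_end - window_start)
    time.sleep(max(0.0, t_on - time.time()))
    turn_on()
    time.sleep(max(0.0, t_off - time.time()))
    turn_off()

now = time.time()
exercise_component(lambda: print("lamp on"), lambda: print("lamp off"),
                   now, now + 2.0)
```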
17. A vehicle detection method, applied to a first vehicle, the method comprising:
receiving a second work instruction sent by a server, wherein the server is further configured to send a first work instruction to a vehicle to be detected, the first work instruction is used for instructing a component to be detected of the vehicle to be detected to be turned on at least once within a target time period, and the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold value;
shooting a first image in the target time period according to the second work instruction;
and sending the first image to the server, wherein the first image is used for identifying the fault condition of the component to be detected.
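A minimal sketch of the first-vehicle side of claim 17, with capture and upload as placeholders for the on-board camera driver and the link to the server; the 0.5 s sampling period is an assumption, as the claim only requires shooting within the target time period:

```python
import time

def shoot_and_send(capture, upload, window_start, window_end, period=0.5):
    """Collect frames for the whole target time period, then send them
    to the server as the first image (here, one frame per period)."""
    frames = []
    while time.time() < window_end:
        if time.time() >= window_start:
            frames.append(capture())
        time.sleep(period)
    for frame in frames:
        upload(frame)
    return len(frames)

now = time.time()
print(shoot_and_send(lambda: "frame", print, now, now + 1.5))
```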
18. The method according to claim 17, wherein the second work instruction is generated when the server responds to a detection request for the component to be detected sent by the vehicle to be detected, and the detection request is used for instructing the server to send the first work instruction to the vehicle to be detected and the second work instruction to the first vehicle.
19. The method according to claim 17 or 18, wherein before the receiving of the second work instruction sent by the server, the method further comprises:
receiving a shooting instruction sent by the server;
shooting a second image according to the shooting instruction, wherein the second image is used for determining the first vehicle;
and sending the second image to the server, wherein the second image comprises the identification of the vehicle to be detected.
20. The method according to any one of claims 17-19, wherein the second work instruction carries a shooting angle, and the method further comprises:
and controlling a camera to shoot the first image at the shooting angle, wherein the shooting angle is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected.
21. The method according to any one of claims 17-19, wherein the second work instruction carries a lane, and the method further comprises:
and switching to the lane, wherein the lane is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected.
22. A vehicle detection apparatus, applied to a server, wherein the apparatus comprises:
a determining unit, configured to determine a first vehicle according to position information of a vehicle to be detected, wherein the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold value;
a sending unit, configured to send a first work instruction to the vehicle to be detected, wherein the first work instruction is used for instructing a component to be detected of the vehicle to be detected to be turned on at least once within a target time period;
the sending unit being further configured to send a second work instruction to the first vehicle, wherein the second work instruction is used for instructing the first vehicle to shoot a first image in the target time period and send the first image to the server;
and an identification unit, configured to identify a fault condition of the component to be detected from the first image.
23. The apparatus of claim 22, further comprising a receiving unit configured to:
receiving a detection request for the component to be detected of the vehicle to be detected, wherein the detection request is used for instructing the server to send the first work instruction to the vehicle to be detected and the second work instruction to the first vehicle.
24. The apparatus according to claim 22 or 23, wherein the determining unit is further configured to:
obtaining a path plan of each of at least one vehicle and a path plan of the vehicle to be detected, wherein the distance between each of the at least one vehicle and the vehicle to be detected is smaller than the target threshold value;
screening out, from the at least one vehicle, vehicles whose path plans partially overlap the path plan of the vehicle to be detected;
and determining the first vehicle from the screened vehicles.
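A minimal sketch of the screening step in claim 24, reading "partially overlapped" as sharing at least one road segment; representing routes as ordered lists of segment IDs is an assumption, since the claims do not define the path-plan format:

```python
def routes_overlap(route_a, route_b):
    """True if the two path plans share at least one road segment."""
    return bool(set(route_a) & set(route_b))

candidates = {"carA": ["s1", "s2", "s3"], "carB": ["s7", "s8"]}
target_route = ["s2", "s3", "s4"]
screened = [vid for vid, route in candidates.items()
            if routes_overlap(route, target_route)]
print(screened)   # ['carA']
```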
25. The apparatus of claim 24, wherein the determining unit is further configured to:
sending a shooting instruction to the screened vehicles, wherein the shooting instruction is used for instructing the screened vehicles to respectively shoot second images;
and determining the first vehicle from the screened vehicles according to the second images respectively shot by the screened vehicles and the identification of the vehicle to be detected, wherein the second image shot by the first vehicle comprises the identification.
26. The apparatus according to any one of claims 22-25, wherein the first vehicle is located in front of the vehicle to be detected when the component to be detected is located on a front side of the vehicle to be detected.
27. The apparatus according to any one of claims 22-25, wherein the first vehicle is located behind the vehicle to be detected when the component to be detected is located on a rear side of the vehicle to be detected.
28. The apparatus according to any one of claims 22-27, wherein the determining unit is configured to:
determining a shooting angle of a camera according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected, wherein the second work instruction carries the shooting angle.
29. The apparatus according to any one of claims 22-28, wherein the identification unit is configured to:
identifying the component to be detected from the first image;
and when the average gray value of the component to be detected in the first image is within a target range, determining that no fault exists in the component to be detected.
30. The apparatus according to any one of claims 22-28, wherein the identification unit is configured to:
identifying the position of the component to be detected from the first image;
and when the position of the component to be detected in the first image is located at a target position, determining that no fault exists in the component to be detected.
31. The apparatus according to any one of claims 22-28, wherein the first image comprises a plurality of frames of images, and the identification unit is configured to:
respectively identifying the position of the component to be detected in each frame of the first image;
and when the positions of the component to be detected in the plurality of frames of images are inconsistent, determining that no fault exists in the component to be detected.
32. The apparatus according to any one of claims 22-28, wherein the first image comprises a plurality of frames of images, and the identification unit is configured to:
identifying the component to be detected in each frame of the plurality of frames of images;
identifying the working state of the component to be detected according to the gray value of the component to be detected in each frame of image;
and if the change rule of the working state of the component to be detected, identified from the plurality of frames of images, meets a preset rule, determining that no fault exists in the component to be detected, wherein the first work instruction is used for instructing the component to be detected to work according to the preset rule.
33. A vehicle detection apparatus, applied to a vehicle to be detected, wherein the apparatus comprises:
a receiving unit, configured to receive a first work instruction sent by a server, wherein the server is further configured to send a second work instruction to a first vehicle, the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold value, the second work instruction is used for instructing the first vehicle to shoot a first image in a target time period and send the first image to the server, and the first image is used for identifying a fault condition of a component to be detected of the vehicle to be detected;
and a control unit, configured to control, according to the first work instruction, the component to be detected to be turned on at least once in the target time period.
34. The apparatus according to claim 33, further comprising a sending unit configured to:
sending, to the server, a detection request for the component to be detected of the vehicle to be detected, wherein the detection request is used for instructing the server to send the first work instruction to the vehicle to be detected and the second work instruction to the first vehicle.
35. The apparatus according to claim 33 or 34, wherein the control unit is configured to:
controlling, according to the first work instruction, the component to be detected to work according to a preset rule in the target time period.
36. The apparatus according to claim 33 or 34, wherein the control unit is configured to:
controlling the component to be detected to be in a working state in the target time period.
37. The apparatus according to claim 33 or 34, wherein the control unit is configured to:
turning on the component to be detected at a first time point;
and turning off the component to be detected at a second time point, wherein the first time point and the second time point are two time points in the target time period.
38. A vehicle detection apparatus, for application to a first vehicle, the apparatus comprising:
a receiving unit, configured to receive a second work instruction sent by a server, wherein the server is further configured to send a first work instruction to a vehicle to be detected, the first work instruction is used for instructing a component to be detected of the vehicle to be detected to be turned on at least once within a target time period, and the distance between the first vehicle and the vehicle to be detected is smaller than a target threshold value;
a shooting unit, configured to shoot a first image in the target time period according to the second work instruction;
and a sending unit, configured to send the first image to the server, wherein the first image is used for identifying a fault condition of the component to be detected.
39. The apparatus according to claim 38, wherein the second work instruction is generated when the server responds to a detection request sent by the vehicle to be detected for the component to be detected, and the detection request is further used for instructing the server to send the first work instruction to the vehicle to be detected.
40. The apparatus according to claim 38 or 39, wherein:
the receiving unit is further configured to receive a shooting instruction sent by the server;
the shooting unit is further configured to shoot a second image according to the shooting instruction, the second image being used for determining the first vehicle;
and the sending unit is further configured to send the second image to the server, wherein the second image comprises the identification of the vehicle to be detected.
41. The apparatus according to any one of claims 38-40, wherein the second work instruction carries a shooting angle, and the shooting unit is configured to:
and controlling a camera to shoot the first image at the shooting angle, wherein the shooting angle is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected.
42. The apparatus according to any one of claims 38-40, wherein the second work instruction carries a lane, and the apparatus further comprises a control unit configured to:
and switching to the lane, wherein the lane is determined by the server according to the position of the vehicle to be detected, the position of the first vehicle and the component to be detected.
43. A computer program product comprising instructions for causing an electronic device to perform the method of any one of claims 1 to 11 when the computer program product is run on the electronic device.
44. A computer program product comprising instructions for causing the vehicle to be detected to perform the method according to any one of claims 12 to 16 when the computer program product is run on an electronic device.
45. A computer program product comprising instructions for causing the first vehicle to perform the method of any one of claims 17 to 21 when the computer program product is run on an electronic device.
46. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-21.
CN202110015232.4A 2021-01-06 2021-01-06 Vehicle detection method and vehicle detection device Pending CN114724272A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110015232.4A CN114724272A (en) 2021-01-06 2021-01-06 Vehicle detection method and vehicle detection device
PCT/CN2021/120343 WO2022148068A1 (en) 2021-01-06 2021-09-24 Vehicle detection method and vehicle detection apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110015232.4A CN114724272A (en) 2021-01-06 2021-01-06 Vehicle detection method and vehicle detection device

Publications (1)

Publication Number Publication Date
CN114724272A true CN114724272A (en) 2022-07-08

Family

ID=82233995

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110015232.4A Pending CN114724272A (en) 2021-01-06 2021-01-06 Vehicle detection method and vehicle detection device

Country Status (2)

Country Link
CN (1) CN114724272A (en)
WO (1) WO2022148068A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022119149A1 (en) * 2022-07-29 2024-02-01 Bayerische Motoren Werke Aktiengesellschaft METHOD FOR ANALYZING AN EXTERNAL VEHICLE CONDITION AND COMPUTER PROGRAM

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6281483B2 (en) * 2014-04-17 2018-02-21 株式会社デンソー Failure detection system, information processing apparatus, and vehicle mounting apparatus
KR102034722B1 (en) * 2015-03-19 2019-10-21 현대자동차주식회사 Vehicle, communicating method thereof and wireless communication apparatus therein
JP6915512B2 (en) * 2017-11-28 2021-08-04 トヨタ自動車株式会社 Server device, faulty vehicle estimation method and faulty vehicle estimation program
CN108449583B (en) * 2018-05-09 2021-04-30 爱驰汽车有限公司 Method, system, device and storage medium for mutual monitoring between vehicles
JP7192571B2 (en) * 2019-02-28 2022-12-20 トヨタ自動車株式会社 vehicle
CN110519382A (en) * 2019-08-30 2019-11-29 成都康普斯北斗科技有限公司 A kind of automobile intelligent monitoring system
CN111063054A (en) * 2019-12-23 2020-04-24 智车优行科技(上海)有限公司 Fault information sending method and device and vehicle fault processing method and device

Also Published As

Publication number Publication date
WO2022148068A1 (en) 2022-07-14


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination