CN118190438A - Vehicle detection system, method, electronic device and computer readable medium

Info

Publication number: CN118190438A
Application number: CN202211591296.XA
Authority: CN (China)
Prior art keywords: data acquisition, vehicle, image, control, target vehicle
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 高鸿海 (Gao Honghai)
Current assignee: Beijing Youzhuju Network Technology Co Ltd
Original assignee: Beijing Youzhuju Network Technology Co Ltd
Application filed by: Beijing Youzhuju Network Technology Co Ltd
Priority applications: CN202211591296.XA; PCT/CN2023/137665

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00: Testing of vehicles
    • G01M17/007: Wheeled or endless-tracked vehicles


Abstract

The application discloses a vehicle detection system, a vehicle detection method, an electronic device and a computer readable medium. The system includes a control device, a first data acquisition device and a first sliding device corresponding to the first data acquisition device. The first sliding device is disposed around a vehicle placement platform, and the vehicle placement platform is used to hold a target vehicle. The control device first controls the first data acquisition device to move along the first sliding device to a first acquisition position, so that the first data acquisition device acquires image data of the vehicle part corresponding to that position and obtains a part image to be used; it then generates a detection result for the target vehicle from the part image, so that the detection result reflects the current state of the target vehicle. Remote-controlled detection of the target vehicle is thereby realized, which effectively improves the vehicle detection effect.

Description

Vehicle detection system, method, electronic device and computer readable medium
Technical Field
The present application relates to the field of vehicle technologies, and in particular, to a vehicle detection system, a method, an electronic device, and a computer readable medium.
Background
With the growing number of vehicles in use, vehicle detection (for example, for second-hand vehicles) is becoming increasingly important. For ease of understanding, a second-hand vehicle is taken as an example below.
As an example, detection performed on a second-hand vehicle may be used at least to determine whether the vehicle has been in an accident, whether it has been soaked in water, whether it has been damaged by fire, and so on. Such detection is typically carried out manually on site.
However, because manual on-site detection has inherent drawbacks, the resulting vehicle detection effect is not ideal.
Disclosure of Invention
In order to solve the technical problems, the application provides a vehicle detection system, a method, electronic equipment and a computer readable medium, which can improve the vehicle detection effect.
In order to achieve the above purpose, the technical scheme provided by the application is as follows:
the application provides a vehicle detection system, which comprises a control device, a first data acquisition device and a first sliding device corresponding to the first data acquisition device;
The first sliding device is arranged around the vehicle placing platform; the vehicle placing platform is used for placing a target vehicle;
the control device is used for controlling the first data acquisition device to move to a first acquisition position along the first sliding device so that the first data acquisition device acquires image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used;
the control device is further configured to generate a detection result for the target vehicle according to the to-be-used part image.
In a possible embodiment, the control device is specifically configured to:
Generating a control instruction in response to a control operation triggered for the first data acquisition device;
The control instruction is sent to the first data acquisition equipment, so that the first data acquisition equipment moves to a first acquisition position according to at least one piece of control information carried by the control instruction, and image data acquisition is carried out on a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used; the at least one control information includes location state control information and/or device state control information.
In one possible embodiment, the system further comprises at least one of a chassis data acquisition device and an appearance data acquisition device; the chassis data acquisition equipment is used for acquiring and processing image data aiming at the chassis of the target vehicle; the appearance data acquisition device is used for carrying out image data acquisition processing on the appearance of the vehicle body of the target vehicle.
In one possible embodiment, the chassis data acquisition device comprises a second slider and a second data acquisition device; the second slide device is positioned below a chassis of the target vehicle, the second slide device being disposed around the chassis;
The control device is further used for controlling the second data acquisition device to move to a second acquisition position along the second sliding device, so that the second data acquisition device acquires image data of a chassis area corresponding to the second acquisition position to obtain a chassis area image;
The control device is specifically configured to generate a detection result for the target vehicle according to the to-be-used part image and the chassis region image.
In one possible embodiment, the control device is further configured to control the second sliding device to perform position adjustment processing, so that the relative positional relationship between the adjusted second sliding device and the chassis of the target vehicle satisfies a first relationship condition.
In one possible implementation manner, the first data acquisition device is a robot, and a first image acquisition device is installed at the front end of a mechanical arm of the robot;
the first image acquisition equipment is used for acquiring image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used.
In one possible embodiment, an appearance defect scanning device is installed above the body of the robot; the appearance defect scanning device is used for carrying out appearance defect identification processing on the target vehicle to obtain an appearance defect identification result, and sending the appearance defect identification result to the control device;
The control device is specifically configured to: and generating a detection result aiming at the target vehicle according to the to-be-used part image and the appearance defect identification result.
In one possible embodiment, the control apparatus is further configured to control the first sliding device to perform a position adjustment process so that a relative positional relationship between the target vehicle and the first sliding device satisfies a second relationship condition.
In one possible implementation manner, the control device is further configured to generate the control instruction after receiving the portion image to be used if it is determined that the portion image to be used meets a preset update condition.
In a possible implementation manner, the control device is further configured to generate a three-dimensional model of the target vehicle according to the to-be-used part image; the three-dimensional model carries a detection result aiming at the target vehicle;
the control device is also used for displaying the three-dimensional model.
The application also provides a vehicle detection method which is applied to a vehicle detection system, wherein the system comprises a control device, a first data acquisition device and a first sliding device corresponding to the first data acquisition device; the first sliding device is arranged around the vehicle placing platform; the vehicle placing platform is used for placing a target vehicle;
The method comprises the following steps:
The control equipment controls the first data acquisition equipment to move to a first acquisition position along the first sliding device, so that the first data acquisition equipment acquires image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used;
The control device generates a detection result for the target vehicle according to the to-be-used part image.
The application also provides an electronic device, comprising: a processor and a memory;
The memory is used for storing instructions or computer programs;
The processor is configured to execute the instructions or the computer program in the memory, so that the electronic device executes the vehicle detection method provided by the application.
The application also provides a computer readable medium having instructions or a computer program stored therein, which when run on a device causes the device to perform the vehicle detection method provided by the application.
The present application provides a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the vehicle detection method provided by the present application.
Compared with the prior art, the application has at least the following advantages:
For the vehicle detection system provided by the application, the system comprises a control device, a first data acquisition device and a first sliding device corresponding to the first data acquisition device. The first sliding device is arranged around a vehicle placing platform, and the vehicle placing platform is used for placing a target vehicle, so that the first data acquisition equipment can perform image acquisition processing on certain parts (such as a vehicle body and the like) of the target vehicle in a mode of moving on the first sliding device. In addition, the control device can firstly control the first data acquisition device to move to a first acquisition position along the first sliding device, so that the first data acquisition device acquires image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used; and then generating a detection result aiming at the target vehicle according to the image of the part to be used, so that the detection result can better show the current state of the target vehicle (for example, whether an accident occurs, whether water soaking occurs, whether fire burning occurs or not, etc.), thus realizing remote control detection processing aiming at the target vehicle, effectively avoiding the defects existing in the above manual field detection scheme and further effectively improving the vehicle detection effect.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of a vehicle detection system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of at least one data acquisition device to be controlled according to an embodiment of the present application;
FIG. 3 is a schematic diagram of another vehicle detection system according to an embodiment of the present application;
FIG. 4 is a flowchart of a vehicle detection method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In fact, manual field detection has the following drawbacks: ① Because staff need to travel to the site to inspect the actual vehicle, timeliness is poor and detection costs (such as travel costs) are high. ② Because staff collect images of the vehicle manually on site, the image data collected under some conditions (for example, when the staff are unfamiliar with the vehicle) may be of limited use, so staff may have to return to the site several times to re-collect images of the vehicle; this easily increases the workload and duration of image collection and further raises the detection cost.
Based on the above, in order to solve the technical problems shown in the background art, an embodiment of the present application provides a vehicle detection system, where the vehicle detection system includes a control device, a first data acquisition device, and a first sliding device corresponding to the first data acquisition device. The first sliding device is arranged around a vehicle placing platform, and the vehicle placing platform is used for placing a target vehicle, so that the first data acquisition equipment can perform image acquisition processing on certain parts (such as a vehicle body and the like) of the target vehicle in a mode of moving on the first sliding device. In addition, the control device can firstly control the first data acquisition device to move to a first acquisition position along the first sliding device, so that the first data acquisition device acquires image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used; and then generating a detection result aiming at the target vehicle according to the image of the part to be used, so that the detection result can better show the current state of the target vehicle (for example, whether an accident occurs, whether water soaking occurs, whether fire burning occurs or not, etc.), thus realizing remote control detection processing aiming at the target vehicle, effectively avoiding the defects existing in the above manual field detection scheme and further effectively improving the vehicle detection effect.
In order to make the present application better understood by those skilled in the art, the following description will clearly and completely describe the technical solutions in the embodiments of the present application with reference to the accompanying drawings, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In order to better understand the technical solution provided by the present application, the following describes a vehicle detection system provided by the present application with reference to fig. 1-2. Fig. 1 is a schematic structural diagram of a vehicle detection system according to an embodiment of the present application. Fig. 2 is a schematic diagram of at least one data acquisition device to be controlled according to an embodiment of the present application.
As shown in fig. 1, a vehicle detection system provided by an embodiment of the present application may include a control device and at least one data acquisition device to be controlled. It should be noted that, in the embodiment of the present application, the number of devices in the "at least one data acquisition device to be controlled" is not limited, and for example, as shown in fig. 1, the number of devices may be N. Wherein N is a positive integer.
The control device is used to perform corresponding control processing (such as position state control or device state control) on any data acquisition device to be controlled. For example, in one possible implementation, the control device may be configured to generate a control instruction in response to a control operation triggered for a target device among the at least one data acquisition device to be controlled. After receiving the control instruction sent by the control device, the target device performs image data acquisition according to at least one piece of control information carried by the control instruction to obtain image data to be used, and sends the image data to be used to the control device. The control device may then be further configured to generate a detection result for the target vehicle according to the image data to be used. To facilitate understanding of the working principle of the control device, the following description is given in connection with steps 11 to 13.
In one possible embodiment, the working principle of the above control device may specifically comprise the following steps 11-13.
Step 11: the control device responds to a control operation triggered by the target device in at least one data acquisition device to be controlled, and generates a control instruction, or after the control device determines that a preset control condition corresponding to the target device is met, the control instruction is generated.
The target equipment refers to data acquisition equipment to be controlled, which needs to be controlled by the control equipment; moreover, the embodiment of the present application does not limit the determination process of the target device, and for ease of understanding, the following description will be made in connection with two cases.
In case 1, in some application scenarios, each data acquisition device to be controlled may be remotely controlled by a detecting person.
Based on the above case 1, in a possible embodiment, when the above "at least one data acquisition device to be controlled" includes N data acquisition devices to be controlled, after the control device receives a selection operation of a detection person for the n-th data acquisition device to be controlled (or a selection operation for certain image data transmitted by the n-th data acquisition device to be controlled), the n-th data acquisition device to be controlled may be determined as the target device, where n is a positive integer and n ∈ {1, 2, …, N}.
In case 2, in some application scenarios, in order to better improve the vehicle detection effect, trigger conditions of some automatic control flows may be set in advance for each to-be-controlled data acquisition device, so that the control flows for the to-be-controlled data acquisition device can be automatically triggered based on the trigger conditions.
Based on the above case 2, in one possible implementation, when the above "at least one data acquisition device to be controlled" includes N data acquisition devices to be controlled and the control device determines that the preset control condition corresponding to the n-th data acquisition device to be controlled has been reached, it may be determined that the automatic control flow corresponding to that preset control condition needs to be triggered automatically for the n-th data acquisition device to be controlled, so that the n-th data acquisition device to be controlled may be determined as the target device, where n is a positive integer and n ∈ {1, 2, …, N}. The preset control condition corresponding to the n-th data acquisition device to be controlled can be set in advance according to the application scenario.
In addition, the embodiment of the present application does not limit the "preset control condition corresponding to the n-th data acquisition device to be controlled". For example, when the n-th data acquisition device to be controlled includes the second data acquisition device shown in fig. 2, the preset control condition may be: determining that the relative positional relationship between the track required by the second data acquisition device (i.e., the second sliding device described below) and the target vehicle does not satisfy a preset first positional relationship (for example, that the track required by the second data acquisition device should be located directly below the target vehicle); or determining that the sharpness of certain image data acquired by the second data acquisition device is below a preset sharpness threshold. For another example, when the n-th data acquisition device to be controlled includes the six-axis robot shown in fig. 2, the preset control condition may be: determining that the relative positional relationship between the track required by the six-axis robot (i.e., the first sliding device described below) and the target vehicle does not satisfy a preset second positional relationship (for example, that the two long sides of the track required by the six-axis robot should remain parallel to the body of the target vehicle); or determining that the sharpness of certain image data acquired by the six-axis robot is below a preset sharpness threshold.
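For illustration, the following minimal Python sketch (thresholds and field names are assumptions) shows how the two example preset control conditions above, a misaligned track or insufficient image sharpness, could be checked by the control device:

```python
# Minimal sketch of the "preset control condition" check described above for
# the n-th data acquisition device: an automatic control flow is triggered when
# the device's track is misaligned with the target vehicle or when a previously
# acquired image is not sharp enough. Thresholds and field names are assumed.
from dataclasses import dataclass


@dataclass
class DeviceStatus:
    rail_offset_deg: float         # angle between track long side and vehicle body
    rail_lateral_offset_mm: float  # lateral offset of track centre from chassis centre
    last_image_sharpness: float    # sharpness score of the most recent image


def preset_condition_met(status: DeviceStatus,
                         max_offset_deg: float = 2.0,
                         max_lateral_mm: float = 50.0,
                         min_sharpness: float = 0.6) -> bool:
    """Return True if an automatic control flow should be triggered."""
    misaligned = (abs(status.rail_offset_deg) > max_offset_deg
                  or abs(status.rail_lateral_offset_mm) > max_lateral_mm)
    blurry = status.last_image_sharpness < min_sharpness
    return misaligned or blurry


# Example: the track is aligned well enough but the last image was blurry,
# so the control device would generate a new control instruction.
print(preset_condition_met(DeviceStatus(0.5, 10.0, 0.35)))  # True
```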
The above "control instructions" are used to control the above target devices; and the control instruction carries at least one control information to enable the target device to adjust accordingly in accordance with the control information.
In addition, the embodiment of the present application is not limited to the above-described at least one control information, and for example, the at least one control information may include location status control information and/or device status control information. The position state control information is used for adjusting the position of the target equipment so that the target equipment can move to the target position designated by the position state control information for image data acquisition processing. The device state control information is used for performing adjustment processing on a state in which the target device is located (for example, at what inclination angle the camera shoots an image, at what posture the robot performs data acquisition, etc.), so that the target device can perform image data acquisition processing according to the target posture specified by the device state control information.
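As an illustration only, the following Python sketch (all class and field names are assumptions; the application does not prescribe any concrete format) shows one possible structure for such a control instruction, bundling optional position state control information and optional device state control information:

```python
# Hypothetical sketch of the "control instruction" described above: it carries
# optional position state control information (a destination on the sliding
# device) and optional device state control information (e.g. camera tilt, arm
# pose). All names are illustrative placeholders.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class PositionStateControl:
    rail_position_mm: float        # target position along the first/second sliding device
    acquisition_point_id: str      # e.g. "front-left door", "rear bumper"


@dataclass
class DeviceStateControl:
    camera_tilt_deg: float         # inclination angle at which the camera shoots
    arm_pose: Tuple[float, ...]    # joint angles for a six-axis robot arm


@dataclass
class ControlInstruction:
    target_device_id: str
    position_state: Optional[PositionStateControl] = None
    device_state: Optional[DeviceStateControl] = None


# Example: move device "robot-1" to a first acquisition position and tilt the
# camera slightly downward before acquiring the image.
instruction = ControlInstruction(
    target_device_id="robot-1",
    position_state=PositionStateControl(rail_position_mm=1500.0,
                                        acquisition_point_id="front-left door"),
    device_state=DeviceStateControl(camera_tilt_deg=-15.0,
                                    arm_pose=(0.0, 30.0, -45.0, 0.0, 60.0, 0.0)),
)
```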
In addition, the embodiment of the present application is not limited to the above determination process of the control instruction, for example, it may determine the control instruction for the target device according to a preset rule or algorithm and according to some data related to the target device (for example, the current location of the target device, etc.), so that the control instruction can better control the target device to complete the corresponding image data acquisition task (for example, to acquire the image data of a new location, to acquire the clearer image data of a certain acquired location, etc.).
The above "control operation" refers to an operation for performing a certain control process with respect to a target device, which is remotely triggered by a detection person via a control device; the embodiment of the present application is not limited to this control operation, and for example, it may be specifically set according to the application scenario.
Based on the above information about step 11, in some cases, the control device may generate a control instruction for the above target device, and send the control instruction to the target device, so that the target device can perform corresponding image acquisition processing based on the control instruction, so that the control device can control the image data acquisition process of the target device.
Step 12: after receiving a control instruction sent by a control device, the target device performs image data acquisition processing according to at least one control information carried by the control instruction to obtain image data to be used, and sends the image data to be used to the control device.
The image data to be used refers to image data acquired by the target equipment according to the control instruction, so that the image data to be used can better meet the image acquisition requirement represented by the control instruction.
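As a non-authoritative sketch of step 12, the following Python fragment (reusing the ControlInstruction sketch above; the motion and camera calls are placeholders, not a real device API) shows how a target device could apply the carried control information and return the image data to be used:

```python
# Illustrative device-side handling of a control instruction: apply the
# position/device state information it carries, acquire the image, and return
# the image data so it can be sent back to the control device.
from typing import Callable


class TargetDevice:
    def __init__(self, move_to: Callable[[float], None],
                 set_pose: Callable[[tuple], None],
                 capture: Callable[[], bytes]):
        # The three callables stand in for the real motion and camera drivers.
        self._move_to = move_to
        self._set_pose = set_pose
        self._capture = capture

    def handle(self, instruction: "ControlInstruction") -> bytes:
        """Apply the carried control information, then acquire and return image data."""
        if instruction.position_state is not None:
            self._move_to(instruction.position_state.rail_position_mm)
        if instruction.device_state is not None:
            self._set_pose(instruction.device_state.arm_pose)
        # The returned bytes are the "image data to be used".
        return self._capture()
```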
Step 13: the control device generates a detection result for the target vehicle according to the image data to be used.
In the embodiment of the application, for the control device, after the control device receives the image data to be used sent by the target device, the control device can generate the detection result for the target vehicle according to the image data to be used, so that the detection result can carry the image information carried by the image data to be used, and the detection result can show the current state of the target vehicle as accurately as possible.
In addition, the embodiment of the present application is not limited to the above determination process of the "detection result for the target vehicle", and may be implemented, for example, by any of existing or future methods of determining the detection result of the vehicle based on a large amount of image data. As another example, the determination process of the "detection result for the target vehicle" may specifically be: with the image data of a large number of vehicles and the detection label information thereof stored in advance in the database, analysis processing is performed on all the image data (for example, the above image data to be used, etc.) related to the target vehicle acquired by the control device to obtain a detection result for the target vehicle. The detection labeling information is used for describing the actual detection result of the vehicle in the database.
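Purely for illustration, the following Python sketch makes the data flow of step 13 concrete under the assumption that each acquired image has been reduced to a feature vector and the database stores labelled reference vectors; the nearest-neighbour vote used here is only one of many possible analysis methods and is not prescribed by the application:

```python
# Toy sketch of step 13: each acquired image (as a feature vector) votes for
# the detection label of its most similar pre-stored, labelled reference.
from collections import Counter
from math import sqrt
from typing import Dict, List, Sequence, Tuple


def cosine(a: Sequence[float], b: Sequence[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a)) or 1.0
    nb = sqrt(sum(y * y for y in b)) or 1.0
    return dot / (na * nb)


def detection_result(images_to_use: List[Sequence[float]],
                     reference_db: List[Tuple[Sequence[float], str]]) -> Dict[str, int]:
    """Vote per acquired image for the label of its most similar reference."""
    votes: Counter = Counter()
    for feat in images_to_use:
        best_label = max(reference_db, key=lambda ref: cosine(feat, ref[0]))[1]
        votes[best_label] += 1
    return dict(votes)


# Example with toy 3-dimensional "features".
db = [([1.0, 0.0, 0.0], "normal"), ([0.0, 1.0, 0.0], "water-soaked")]
print(detection_result([[0.9, 0.1, 0.0], [0.1, 0.95, 0.0]], db))
# {'normal': 1, 'water-soaked': 1}
```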
Based on the above description of steps 11 to 13, it can be known that, in one possible implementation manner, the control device may control the image capturing process of some data capturing devices to be controlled by the detection personnel, or the control device may automatically control and adjust the image capturing process of some data capturing devices to be controlled, so that it is beneficial to improve the image capturing effect of the data capturing devices to be controlled, thereby improving the vehicle detection effect.
In addition, for a better understanding of the above control device, the following description is continued in connection with the relevant content of the above "at least one data acquisition device to be controlled".
For the above "at least one data acquisition device to be controlled", different data acquisition devices to be controlled are used for performing image data acquisition processing on different areas in the target vehicle, so that the data acquisition devices to be controlled can perform image data acquisition processing on the target vehicle as comprehensively as possible, and thus the image data acquired by the data acquisition devices to be controlled can describe the current state of the target vehicle as comprehensively as possible. Wherein the target vehicle is a vehicle that needs to be subjected to vehicle detection processing; moreover, the embodiment of the present application is not limited to the target vehicle, and for example, in some application scenarios (for example, a second-hand vehicle related scenario, etc.), the target vehicle may refer to any second-hand vehicle.
In addition, the embodiment of the present application is not limited to the above "at least one data acquisition device to be controlled", and for convenience of understanding, the following description will be made with reference to two cases.
In case 1, in some application scenarios, the above "at least one data acquisition device to be controlled" may include a vehicle body data acquisition device.
The "vehicle body data acquisition device" is used to acquire image data of the body of the target vehicle. The embodiment of the present application does not limit the vehicle body data acquisition device; for example, it may include a first data acquisition device (such as the six-axis robot shown in fig. 2) and a first sliding device corresponding to the first data acquisition device (such as the six-axis robot moving track shown in fig. 2), where the first sliding device is arranged around a vehicle placement platform and the vehicle placement platform is used for placing the target vehicle.
The "first data acquisition device" can at least move along its corresponding first sliding device, so that it can acquire image data of the body of the target vehicle. The present application does not limit the implementation of the "first data acquisition device".
The above "first sliding means corresponding to the first data collection device" refers to the above moving track of the first data collection device; and the present application is not limited to the "first sliding means corresponding to the first data collection device", and for example, it may be the "six-axis robot moving track" shown in fig. 2.
The above "vehicle placement platform" refers to a place where a vehicle needs to be placed for which a vehicle detection process is required, so that the vehicle placement platform can be used for placing the above target vehicle; the vehicle placement platform is not limited to this embodiment, and may be any platform capable of placing a vehicle (e.g., a lifter, a floor of a room, a ceiling in a space, etc.), for example.
Based on the above description of case 1, it is known that, in one possible embodiment, when the above "vehicle body data collection device" includes the first data collection device and the first sliding means corresponding to the first data collection device, the first sliding means is provided around the vehicle placement platform, and the vehicle placement platform is used for placing the target vehicle, as shown in fig. 3, the above system may include the control device, the first data collection device, and the first sliding means corresponding to the first data collection device, so that the control device in the system has at least the functions shown in (1) - (2) below.
(1) The above control device is used for controlling the above first data acquisition device to move to a first acquisition position along the first sliding device, so that the first data acquisition device acquires image data of a vehicle part corresponding to the first acquisition position, and an image of the part to be used is obtained.
Wherein the first acquisition position refers to a destination that is required to be used when the above control device is used for position control of the above first data acquisition device; the present application is not limited to this first acquisition position, and for example, it may refer to any one of the reachable positions of the first sliding means corresponding to the first data acquisition device.
The above "vehicle portion corresponding to the first collection position" refers to a vehicle portion that the first data collection device can collect when the above first data collection device is located at the first collection position.
The above "portion to be used image" refers to image data acquired by the above first data acquisition device for the above "vehicle portion corresponding to the first acquisition position".
In addition, the present application is not limited to the specific implementation of the above function (1), and for example, it may specifically include the following matters of steps 21 to 23.
Step 21: the control device responds to the control operation triggered by the first data acquisition device and generates a control instruction.
It should be noted that, the embodiment of step 21 is similar to the embodiment of step 11, and will not be described herein for brevity.
Step 22: the control equipment sends a control instruction to the first data acquisition equipment so that the first data acquisition equipment moves to a first acquisition position according to at least one piece of control information carried by the control instruction, and performs image data acquisition on a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used; wherein the at least one control information comprises location state control information and/or device state control information.
It should be noted that, the content of the "at least one control information" is referred to the content of step 11, and is not described herein for brevity.
Based on the above related content of steps 21 to 22, in some possible embodiments, the above control device may implement remote control on the above first data acquisition device by means of a control instruction, so that the first data acquisition device can acquire image data meeting requirements as much as possible, which is beneficial to improving user experience.
(2) The control device is also used for generating a detection result aiming at the target vehicle according to the part image to be used.
It should be noted that the relevant content of the above function (2) is similar to that of the above step 13, and for brevity, a detailed description is omitted here.
Based on the related content of the first data acquisition device, the first data acquisition device can acquire and process the image at any vehicle body position of the target vehicle by means of the first sliding device, so that the image acquisition effect of the vehicle body is improved.
Indeed, to better provide one possible embodiment of the "first data acquisition device" above, in this embodiment, the "first data acquisition device" may be a robot (e.g., the six-axis robot shown in fig. 2) so that the robot may move along a first slide (e.g., "six-axis robot movement track" shown in fig. 2) corresponding to the vehicle placement platform when the robot is embedded in the first slide; and a first image pickup device (for example, a multifunctional camera shown in fig. 2) is mounted at the front end of the mechanical arm of the robot so that the first image pickup device is used for image data pickup processing with respect to the body of the target vehicle.
In addition, for the above robot, in some application scenarios, if the first image acquisition device is mounted at the front end of the robot's mechanical arm, the mechanical arm can extend the first image acquisition device over the engine to take pictures when the engine cover is open, and extend it through the driver's window to take pictures when the driver's window and the rear door of the vehicle are open (for example, photographing the entire driver's area, the seats, the steering wheel, the instrument panel, the center console, the front passenger area, the interior roof, the entire rear seat, etc.).
Based on the above paragraph, the embodiment of the present application further provides a possible implementation of the robot, in which, when it is determined that the portion to be detected in the target vehicle is in its preset state, the robot is specifically configured to control the first image acquisition device to acquire image data of the internal state of that portion. The portion to be detected refers to an area of the target vehicle whose interior needs to be inspected; the embodiment of the present application does not limit the portion to be detected, which may, for example, be the engine area or the cab area. The preset state may be set in advance; for example, if the portion to be detected is the engine area, the corresponding preset state is that the engine cover is open, and if the portion to be detected is the cab area, the corresponding preset state is that the driver's window and a rear door are open.
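As an illustrative sketch only (the part names and state flags are assumptions), the following Python fragment shows how the mapping between a portion to be detected and its preset state could be checked before the robot extends the first image acquisition device into that portion:

```python
# Hypothetical mapping, following the examples above: the engine area may only
# be photographed while the engine cover is open, and the cab only while the
# driver's window and a rear door are open.
PRESET_STATES = {
    "engine_area": {"engine_cover_open"},
    "cab_area": {"driver_window_open", "rear_door_open"},
}


def may_capture_interior(part: str, current_states: set) -> bool:
    """True when the portion to be detected is in its required preset state."""
    required = PRESET_STATES.get(part, set())
    return required.issubset(current_states)


print(may_capture_interior("engine_area", {"engine_cover_open"}))  # True
print(may_capture_interior("cab_area", {"driver_window_open"}))    # False
```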
Further, in some application scenarios, when the robot is at the left-front and right-front of the target vehicle, the first image acquisition device may also capture left-front and right-front exterior photographs of the vehicle, respectively, so that after the robot has travelled clockwise to the right-rear side of the vehicle body and completed the scan, all of the data is sent to the control device, which can then generate a three-dimensional model of the target vehicle based on that data.
Further, in the above robot, the robot can automatically move in the above first slide device in accordance with a preset movement control program, so that automatic image acquisition processing for the vehicle body in the target vehicle can be realized. The mobile control program may be set in advance according to an application scenario.
Furthermore, in order to better improve the image data acquisition effect, the robot can automatically move in the first sliding device according to a preset movement control program, and can also move according to a command triggered by a detection person. Based on this, the embodiment of the present application further provides a possible implementation manner of the robot, in this implementation manner, when the above control device performs remote control on the robot by means of a control instruction, where the position state control information carried by the control instruction includes second image acquisition position description information, and the device state control information carried by the control instruction is used to describe a state in which the robot is located (for example, a pose in which a mechanical arm is located, etc.), the robot is specifically configured to: and after receiving the second image acquisition position description information and the equipment state control information, performing image data acquisition processing according to the second image acquisition position description information and the equipment state control information. Wherein the second image acquisition position description information describes a destination at which the robot moves within the first sliding device; the device state control information is used to describe the final state of each component (e.g., a robot arm, etc.) in the robot.
In fact, in some application scenarios it may be necessary to confirm whether the appearance of the target vehicle has certain defects (e.g., paint defects, deformation, etc.). To meet this need, the embodiment of the present application also provides a possible implementation of the above robot in which an appearance defect scanning device (e.g., the defect scanning device shown in fig. 2) is installed above the body of the robot. The appearance defect scanning device is used to perform appearance defect identification processing on the target vehicle to obtain an appearance defect identification result and to send that result to the above control device, so that the control device can generate a detection result for the target vehicle according to the part image to be used and the appearance defect identification result, and the detection result can further show which appearance defects exist on the target vehicle.
The embodiment of the present application is not limited to the implementation of the appearance defect scanning device, and may be implemented by any device that can perform defect scanning identification processing for the appearance of a vehicle, for example, existing or future.
It should be further noted that, the embodiment of the present application is not limited to the above method for obtaining the appearance defect recognition result, and for example, it may specifically be: when the robot moves along the above first sliding means, the appearance defect scanning apparatus mounted on the robot can recognize the vehicle appearance defect state by performing a back and forth movement scanning on both sides of the target vehicle to obtain the appearance defect recognition result.
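For illustration, the following Python sketch (the scanner interface and the duplicate-confirmation rule are assumptions, not the application's method) shows how defect observations from back-and-forth passes on both sides of the vehicle could be aggregated into an appearance defect identification result:

```python
# Sketch of aggregating an appearance defect identification result: the
# scanning device makes a forward and a return pass on each side of the
# vehicle, and a defect reported on both passes of a side is kept as confirmed.
# `scan_pass` is a placeholder for the real scanner output.
from typing import Callable, Dict, List, Tuple

Defect = Tuple[str, float]  # (defect type, position along the track in mm)


def appearance_defect_result(scan_pass: Callable[[str], List[Defect]],
                             sides: Tuple[str, str] = ("left", "right")) -> Dict[str, List[Defect]]:
    """Collect confirmed defects from a back-and-forth pass on each side."""
    result: Dict[str, List[Defect]] = {}
    for side in sides:
        forward = scan_pass(side)   # forward pass
        backward = scan_pass(side)  # return pass
        result[side] = [d for d in forward if d in backward]
    return result


# Toy scanner that always reports one paint defect on the left side.
print(appearance_defect_result(lambda side: [("paint", 1200.0)] if side == "left" else []))
```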
In fact, in some cases (for example, improper placement of the target vehicle), in order to better enhance the vehicle body detection effect, the above vehicle placement platform may be adaptively adjusted according to the control instruction so that the relative positional relationship between the target vehicle placed on the vehicle placement platform and the first slider corresponding to the vehicle placement platform reaches the second relationship condition (for example, the vehicle body of the target vehicle remains parallel with the long side in the first slider, etc.). Based on this, the present embodiment also provides a possible implementation of the above control device, in which the control device is further configured to: the first sliding device is controlled to perform position adjustment processing so that the relative position relationship between the target vehicle and the first sliding device satisfies a second relationship condition. The second relationship condition may be preset according to an application scenario, for example, as shown in fig. 2, which may specifically be: the body of the target vehicle is kept parallel to the long side in the first sliding apparatus.
The present application is not limited to the control method for the first sliding device described above, and may be, for example: after the first sliding device receives the control instruction sent by the control equipment, position adjustment processing is performed according to the platform position adjustment information carried by the control instruction, so that the relative position relationship between the target vehicle placed on the vehicle placing platform and the first sliding device meets a second relationship condition. The platform position adjustment information is used for describing a position adjustment mode for the vehicle placement platform, so that the vehicle placement platform adjusted according to the platform position adjustment information can better meet the image data acquisition requirement for the vehicle body in the target vehicle.
In addition, the embodiment of the present application is not limited to the above determination process of the platform position adjustment information, and for example, it may specifically be: and determining the platform position adjustment information according to the actual placement state description information of the target vehicle and the actual position description information of the first sliding device. The "actual position description information of the first sliding device" is used to describe the position where the first sliding device is actually located. It should be noted that, the embodiment of the present application is not limited to the implementation of the step of determining the platform position adjustment information according to the actual placement state description information of the target vehicle and the actual position description information of the first sliding device.
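Assuming, purely for illustration, that the second relationship condition is the parallelism example given above, the following Python sketch shows how the platform position adjustment could be derived from the vehicle's heading and the heading of the long side of the first sliding device:

```python
# Minimal sketch: the platform position adjustment is taken to be the rotation
# that cancels the heading difference between the vehicle body and the long
# side of the first sliding device. All angles and the tolerance are assumed.
def platform_position_adjustment(vehicle_heading_deg: float,
                                 rail_long_side_heading_deg: float,
                                 tolerance_deg: float = 1.0):
    """Return the platform rotation (degrees) needed, or None if already parallel."""
    delta = (rail_long_side_heading_deg - vehicle_heading_deg + 180.0) % 360.0 - 180.0
    if abs(delta) <= tolerance_deg:
        return None   # second relationship condition already satisfied
    return delta      # rotate the platform by this angle


print(platform_position_adjustment(3.5, 0.0))  # -3.5, i.e. rotate the platform by -3.5 degrees
```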
Also, to better enhance the vehicle detection experience, at least one position sensor may be installed within the above vehicle placement platform so that the position sensors can send the actual location of the vehicle placement platform to the control device in real time. Similarly, a position sensor may be installed in the above robot and each device (for example, the first image capturing device or the appearance defect scanning device) to which the robot relates, so that the position sensor can send the robot and the actual positions of the respective devices to which the robot relates to the control device in real time. The control device can accurately determine the vehicle body area aimed by data information (such as image data or appearance defect recognition results) collected and sent by each device in the robot in real time based on the actual position of the target vehicle, the actual position of the vehicle placement platform, the actual position of the robot and the actual positions of the devices involved in the robot.
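As a simplified illustration (region boundaries and field names are assumptions), the following Python sketch shows how the control device could map a sensor-reported rail position to the vehicle body region that an incoming image covers:

```python
# Express the robot's track position relative to the reported vehicle position
# and look it up in a simple region table. All numbers are assumptions.
from bisect import bisect_right

# Region boundaries along the vehicle, in mm from the front bumper.
REGION_EDGES = [0.0, 1200.0, 3000.0, 4500.0]
REGION_NAMES = ["front", "front-doors", "rear-doors", "rear"]


def body_region(robot_rail_position_mm: float, vehicle_front_mm: float) -> str:
    """Map a sensor-reported track position to the body region it faces."""
    along_vehicle = robot_rail_position_mm - vehicle_front_mm
    index = max(0, min(bisect_right(REGION_EDGES, along_vehicle) - 1, len(REGION_NAMES) - 1))
    return REGION_NAMES[index]


print(body_region(robot_rail_position_mm=2500.0, vehicle_front_mm=500.0))  # 'front-doors'
```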
Based on the above description of case 1, in one possible implementation manner, the vehicle detection system provided by the present application includes a control device, a first data acquisition device, and a first sliding device corresponding to the first data acquisition device. The first sliding device is arranged around a vehicle placing platform, and the vehicle placing platform is used for placing a target vehicle, so that the first data acquisition equipment can perform image acquisition processing on certain parts (such as a vehicle body and the like) of the target vehicle in a mode of moving on the first sliding device. In addition, the control device can firstly control the first data acquisition device to move to a first acquisition position along the first sliding device, so that the first data acquisition device acquires image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used; and then generating a detection result aiming at the target vehicle according to the image of the part to be used, so that the detection result can better show the current state of the target vehicle (for example, whether an accident occurs, whether water soaking occurs, whether fire burning occurs or not, etc.), thus realizing remote control detection processing aiming at the target vehicle, effectively avoiding the defects existing in the above manual field detection scheme and further effectively improving the vehicle detection effect.
In case 2, in some application scenarios, the above "at least one data acquisition device to be controlled" may include not only the above "vehicle body data acquisition device", but also some other image acquisition device (for example, a device other than the six-axis robot and the six-axis robot moving track shown in fig. 2).
As can be seen from the above case 2, in order to better improve the vehicle detection effect, the embodiment of the present application further provides another possible implementation manner of the above "at least one data acquisition device to be controlled", where the "at least one data acquisition device to be controlled" includes not only the above "vehicle body data acquisition device", but also at least one of the following chassis data acquisition device and appearance data acquisition device.
The "chassis data acquisition device" is used to acquire image data of the chassis of the target vehicle. The embodiment of the present application does not limit the chassis data acquisition device; for example, it may include a second sliding device and a second data acquisition device (such as the second data acquisition device shown in fig. 2), where the second sliding device is located below the chassis of the target vehicle and is arranged around the chassis. Because the second sliding device is located below the chassis and the second data acquisition device moves along the second sliding device as a rail, the second data acquisition device can perform image acquisition (e.g., scanning, photographing or video recording) on the chassis of the target vehicle. The second data acquisition device can then send the acquired image data to a data processing device (e.g., the above control device, or a server capable of data communication with the control device), and the data processing device can compare the image data with pre-stored chassis image data of vehicles of the same model to obtain a chassis detection result for the target vehicle. The chassis image data of vehicles of the same model refers to a large amount of pre-stored image data describing the chassis state of vehicles of that model, which can be used in the chassis detection process.
In practice, for the above second data collection device, the second data collection device can be automatically moved on the second slider according to a preset movement control program, thus enabling automatic image collection processing for the chassis in the target vehicle. The mobile control program may be set in advance according to an application scenario.
In fact, in order to better enhance the image data acquisition effect, the above second data acquisition device may not only automatically move on the second sliding device according to a preset movement control program, but also move according to an instruction triggered by the inspector. Based on this, the embodiment of the present application also provides one possible implementation of the above control device, in which the control device may have at least the functions shown in (3) to (4) below.
(3) The above control device is further configured to control the above second data acquisition device to move to a second acquisition position along the above second sliding device, so that the second data acquisition device performs image data acquisition on a chassis area corresponding to the second acquisition position, and a chassis area image is obtained.
Wherein the second acquisition location refers to a destination that is required to be used when the above control device is used for location control of the above second data acquisition device; and the present application is not limited to this second acquisition position, for example, it may refer to any one of the reachable positions of the second sliding means corresponding to the second data acquisition device above.
The above "chassis region corresponding to the second acquisition position" refers to a chassis range that the second data acquisition device can acquire when the above second data acquisition device is located at the second acquisition position.
The above "chassis region image" refers to image data acquired by the above second data acquisition device for the above "chassis region corresponding to the second acquisition position".
In addition, the present application is not limited to the specific implementation of the above function (3), for example, the implementation similar to the implementation of the above function (1), and will not be described here again for brevity.
In addition, the present application is not limited to the control manner of the above control device for the above second data acquisition device by means of the control command, for example, after the second data acquisition device receives the control command sent by the control device, the second data acquisition device may move to the second acquisition position along the above second sliding device according to the first image acquisition position description information carried by the control command, so that the second data acquisition device may perform image data acquisition for the chassis area corresponding to the second acquisition position, and obtain the chassis area image, so that the purpose of performing image acquisition for the target position described by the first image acquisition position description information may be achieved. Wherein the first image capturing position description information is used to describe a destination of the second data capturing device when moving in the above second sliding apparatus.
Further, the embodiment of the present application is not limited to the above determination process of the first image capturing position description information, and for example, it may determine the first image capturing position description information according to the actual position of the target area in the chassis of the target vehicle, the current position of the second sliding device, and the current position of the second data capturing device, so that the second data capturing device can move to the target area according to the first image capturing position description information for image data capturing processing. It should be noted that the embodiment of the present application is not limited to the implementation of the step of determining the first image acquisition position description information according to the actual position of the target area in the chassis of the target vehicle, the current position of the second sliding device, and the current position of the second data acquisition device.
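Purely as an illustration of the quantities named above, the following Python sketch derives a first image acquisition position description from the position of the target chassis area, the current position of the second sliding device and the current position of the second data acquisition device, assuming the second sliding device runs along the vehicle's longitudinal axis:

```python
# Project the target chassis area onto the second sliding device and express
# the destination as a distance the second data acquisition device must travel.
from dataclasses import dataclass


@dataclass
class Point:
    x: float  # along the vehicle, mm
    y: float  # across the vehicle, mm


def first_image_acquisition_position(target_area: Point,
                                     rail_origin: Point,
                                     device_rail_position_mm: float) -> dict:
    """Return the destination on the rail and the remaining travel distance."""
    # Assumption: the second sliding device runs along the x axis under the chassis.
    destination_mm = target_area.x - rail_origin.x
    return {
        "destination_mm": destination_mm,
        "travel_mm": destination_mm - device_rail_position_mm,
    }


print(first_image_acquisition_position(Point(2800.0, 0.0), Point(300.0, 0.0), 1000.0))
# {'destination_mm': 2500.0, 'travel_mm': 1500.0}
```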
(4) The above control device is specifically configured to generate a detection result for the target vehicle according to the above to-be-used part image and the above chassis region image.
It should be noted that the relevant content of the above function (4) is similar to that of the above step 13, and for brevity, a detailed description is omitted here.
Based on the related content of the second data acquisition device, the second data acquisition device can perform image acquisition processing on any area in the chassis of the target vehicle by means of the second sliding device, so that the image acquisition effect on the chassis is improved.
In fact, in some cases (for example, in the case where the target vehicle is improperly placed or the detection person wants to view more detailed image data of a certain position in the chassis, etc.), the above second sliding apparatus may also be controlled to perform position adjustment in order to better improve the chassis detection effect. Based on this, the present application also provides a possible embodiment of the above control device, in which the control device includes at least the function shown in the following (6).
(6) The control device is further configured to control the second sliding device to perform position adjustment processing, so that the relative positional relationship between the adjusted second sliding device and the chassis of the target vehicle satisfies the first relationship condition. The first relationship condition may be preset according to the application scenario; for example, as shown in fig. 2, it may specifically be that the second sliding device is located directly under the chassis of the target vehicle.
In addition, the present application is not limited to the control manner of the above control device for the above second sliding means, and may be implemented by means of a control instruction, for example. It can be seen that, in one possible implementation, after the second sliding device receives the control instruction sent by the control apparatus, the second sliding device may perform a position adjustment process according to the track position adjustment information carried by the control instruction, so that the second sliding device is located directly under the chassis of the target vehicle, so that the second data acquisition apparatus may perform better image data acquisition processing for the chassis of the target vehicle by means of the second sliding device.
The track position adjustment information is used to describe the position adjustment manner of the second sliding device, so that the second sliding device adjusted according to the track position adjustment information can better meet the image data acquisition requirements of the chassis of the target vehicle. The embodiment of the present application is not limited to the track position adjustment information. For example, in some cases (for example, when the relative positional relationship between the second sliding device before adjustment and the target vehicle does not satisfy the preset first positional relationship), the track position adjustment information is used to adjust the second sliding device so that the relative positional relationship between the adjusted second sliding device and the target vehicle satisfies the preset first positional relationship. As another example, in other cases (for example, when an inspector wants to view image data of certain chassis areas that have not yet been acquired), the track position adjustment information is used to control the second sliding device to pass under those new chassis areas, so that the second data acquisition device can perform image data acquisition processing for the new areas along the adjusted second sliding device.
In addition, the embodiment of the present application is not limited to the determination process of the track position adjustment information. For example, the track position adjustment information may be determined according to the actual placement state description information of the target vehicle and the pre-adjustment position of the second sliding device. The "actual placement state description information of the target vehicle" is used to describe the current placement state of the target vehicle (for example, where the target vehicle is placed, and at what angle it sits relative to the vehicle placement platform below it). The "pre-adjustment position of the second sliding device" refers to the position where the second sliding device is actually located before the adjustment is performed.
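Purely as an illustrative sketch (the present application does not prescribe any particular computation), the track position adjustment information could be derived from the two inputs just mentioned roughly as follows. All class, field, and function names here (PlacementState, TrackAdjustment, compute_track_adjustment) are hypothetical, and the strategy shown simply centers and aligns the track under the chassis, which is only one of the adjustment purposes described above.

```python
from dataclasses import dataclass

@dataclass
class PlacementState:
    # Hypothetical description of how the target vehicle sits on the platform.
    chassis_center_x_m: float   # lateral position of the chassis centerline
    chassis_center_y_m: float   # longitudinal position of the chassis center
    yaw_deg: float              # angle between the vehicle axis and the platform axis

@dataclass
class TrackAdjustment:
    # Hypothetical "track position adjustment information" payload.
    shift_x_m: float
    shift_y_m: float
    rotate_deg: float

def compute_track_adjustment(state: PlacementState,
                             track_x_m: float,
                             track_y_m: float,
                             track_yaw_deg: float) -> TrackAdjustment:
    """Shift/rotate the second sliding device so it lies directly under the chassis."""
    return TrackAdjustment(
        shift_x_m=state.chassis_center_x_m - track_x_m,
        shift_y_m=state.chassis_center_y_m - track_y_m,
        rotate_deg=state.yaw_deg - track_yaw_deg,
    )

if __name__ == "__main__":
    adj = compute_track_adjustment(
        PlacementState(0.35, 0.0, 4.0),  # vehicle parked slightly off-center, 4 degrees of skew
        track_x_m=0.0, track_y_m=0.0, track_yaw_deg=0.0,
    )
    print(adj)  # TrackAdjustment(shift_x_m=0.35, shift_y_m=0.0, rotate_deg=4.0)
```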
In addition, in order to further enhance the vehicle detection experience, for the above chassis data acquisition device, at least one position sensor may be installed in the second sliding device of the chassis data acquisition device, so that the position sensor can send the actual position of the second sliding device to the control device in real time. Similarly, a position sensor may be installed in the second data acquisition device of the chassis data acquisition device, so that this position sensor can send the actual position of the second data acquisition device to the control device in real time. Based on the real-time position of the second sliding device and the real-time position of the second data acquisition device, the control device can accurately determine which chassis area the image data transmitted in real time by the second data acquisition device was acquired from.
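As a minimal sketch under assumed conventions (a one-dimensional rail and a fixed grid of named chassis regions, neither of which is specified by the present application), the control device could resolve the chassis area of an incoming frame from the two real-time sensor readings like this; the region names and boundaries are invented for illustration.

```python
from bisect import bisect_right

# Hypothetical longitudinal boundaries (meters from the front axle) of named chassis regions.
REGION_BOUNDS = [0.8, 1.8, 2.8]
REGION_NAMES = ["front_subframe", "transmission_tunnel", "fuel_tank_area", "rear_axle"]

def resolve_chassis_region(slider_offset_m: float, camera_offset_m: float) -> str:
    """Combine the real-time offset of the second sliding device with the camera's offset
    along the rail to estimate which chassis region the current frame was acquired from."""
    longitudinal_pos = slider_offset_m + camera_offset_m
    return REGION_NAMES[bisect_right(REGION_BOUNDS, longitudinal_pos)]

# Example: slider shifted 1.5 m toward the rear, camera 0.4 m further along the rail.
print(resolve_chassis_region(1.5, 0.4))  # -> "fuel_tank_area"
```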
The above "appearance data collection device" is used for performing image data collection processing for the body appearance of the target vehicle; moreover, the embodiment of the present application is not limited to the appearance data acquisition device, for example, the appearance data acquisition device may include at least one second image acquisition device (for example, an appearance high definition camera shown in fig. 2) so that the second image acquisition devices are used for performing the full view image data acquisition process with respect to the appearance of the target vehicle. In addition, the embodiment of the present application is not limited to the installation positions of the second image capturing devices, for example, a bracket (for example, a camera bracket shown in fig. 2) may be installed at each preset position around the vehicle placement platform, and the second image capturing devices may be installed in preset areas of the bracket, so that the image data captured by the second image capturing devices for the target vehicle can represent the target vehicle as completely as possible. It should be noted that the preset positions may be preset, and the number of the preset positions is not limited in the embodiment of the present application, for example, the preset positions may be 2 or 4 as shown in fig. 2. The preset area may be preset.
Based on the above description of the at least one data acquisition device to be controlled, each data acquisition device to be controlled can not only automatically perform image data acquisition processing for the target vehicle according to a preset program, but can also perform corresponding image data acquisition processing according to a control instruction sent by the control device. This allows a detection person to drive the image data acquisition process of the data acquisition devices to be controlled by means of the control device, helps to obtain image data that better meets the vehicle detection requirements, and thereby helps to improve the vehicle detection effect.
In addition, in one possible implementation, each of the above data acquisition devices to be controlled may be used to transmit the image data it acquires to the control device in real time, and the above control device may be further configured to receive the image data transmitted by each data acquisition device to be controlled. That is, the control device can receive, in real time, all of the image data acquired and transmitted by the data acquisition devices to be controlled, so that the control device can determine the detection result of the target vehicle based on this image data.
It can be seen that, for the i-th data acquisition device to be controlled, after it acquires image data for a certain area of the target vehicle (for example, the chassis, the vehicle body, or the appearance), it may send this image data to the control device in real time. The control device thus obtains, in real time, the image data acquired by the i-th data acquisition device to be controlled, can determine the actual state of that area of the target vehicle based on this data, and can then, in combination with the actual states of the individual areas, determine the actual state of the target vehicle (for example, whether an accident has occurred, whether the vehicle has been soaked in water, whether it has been damaged by fire, and so on). Here, i is a positive integer, i is less than or equal to N, and N is a positive integer.
In fact, in some application scenarios, some image data acquired by a data acquisition device to be controlled may be defective (for example, unclear). Therefore, in order to further improve the vehicle detection effect, the embodiment of the present application also provides a possible implementation of the above control device, in which the control device is further configured to: when an image to be updated (for example, the to-be-used part image) in the image data sent by the target device meets a preset update condition, generate a control instruction and send the control instruction to the data acquisition device to be controlled (for example, the first data acquisition device) corresponding to the image to be updated, so that this device can perform image acquisition processing again for the area represented by the image to be updated. The image to be updated refers to defective image data acquired by the target device. The preset update condition may be set in advance; for example, it may specifically be that the sharpness of the image to be updated is lower than a preset sharpness threshold. For the relevant content of the target device and the control instruction, please refer to the above.
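The present application does not fix a particular sharpness measure; as an assumed example, the variance of the Laplacian (a common blur indicator, computed below with OpenCV and NumPy) could serve as the sharpness score, with a hypothetical threshold and an invented instruction payload standing in for the control instruction described above.

```python
import cv2
import numpy as np

SHARPNESS_THRESHOLD = 120.0  # hypothetical "preset sharpness threshold"

def sharpness(image_bgr: np.ndarray) -> float:
    """Variance of the Laplacian: low values usually indicate a blurry image."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def needs_reacquisition(image_bgr: np.ndarray) -> bool:
    """Check the assumed 'preset update condition' for an image to be updated."""
    return sharpness(image_bgr) < SHARPNESS_THRESHOLD

def handle_incoming_image(image_bgr: np.ndarray, device_id: str, send_instruction) -> None:
    """If the image fails the update condition, ask the originating device to re-acquire it."""
    if needs_reacquisition(image_bgr):
        # 'send_instruction' stands in for the control device's instruction channel.
        send_instruction({"device_id": device_id, "action": "reacquire_current_area"})
```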
Based on the above, the control device can generate corresponding control instructions from control operations triggered by detection personnel, so that the detection personnel can remotely control the data acquisition devices to be controlled. In some cases (for example, when the image data is unclear), the control device can also trigger control instructions automatically, so that the data acquisition devices to be controlled perform corresponding image acquisition processing based on these control instructions. In this way, how well the acquired image data represents the target vehicle can be effectively improved, which in turn improves the detection effect for the target vehicle.
In fact, in order to further improve the vehicle detection effect, the embodiment of the present application also provides a possible implementation of the above control device, in which the control device may be specifically configured to: generate a three-dimensional model of the target vehicle from the image data (for example, the above to-be-used part image) transmitted by the at least one data acquisition device to be controlled. The three-dimensional model carries the detection result for the target vehicle.
The three-dimensional model of the target vehicle is used to present the current state of the target vehicle by means of stereoscopic image data, so that the model can restore the current state of the target vehicle as accurately as possible. The three-dimensional model carries the detection result for the target vehicle (for example, the detection result for the chassis region is marked on the chassis region, the detection result for a door is marked on that door, and the detection result for the engine is marked on the engine), so that a user can learn not only the three-dimensional structure of the target vehicle but also whether each region of the target vehicle has a problem, which helps to improve the detection effect for the target vehicle.
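As a sketch only (no model format is prescribed by the present application), "carrying" per-region detection results on the three-dimensional model can be pictured as a mapping from named model regions to findings; the class names, region names, and status labels below are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class RegionFinding:
    # Hypothetical per-region detection result attached to the 3D model.
    region_name: str          # e.g. "chassis", "left_front_door", "engine"
    status: str               # e.g. "ok", "accident_damage", "water_soaked", "fire_damage"
    evidence_images: List[str] = field(default_factory=list)  # identifiers of supporting images

@dataclass
class AnnotatedVehicleModel:
    mesh_path: str                                    # path to the reconstructed 3D mesh
    findings: Dict[str, RegionFinding] = field(default_factory=dict)

    def mark(self, finding: RegionFinding) -> None:
        """Attach a detection result to the corresponding model region."""
        self.findings[finding.region_name] = finding

model = AnnotatedVehicleModel(mesh_path="target_vehicle.obj")
model.mark(RegionFinding("chassis", "ok"))
model.mark(RegionFinding("left_front_door", "accident_damage", ["img_0042"]))
print(model.findings["left_front_door"].status)  # -> "accident_damage"
```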
Based on the above, in one possible implementation, the above control device may be further configured to display the three-dimensional model, so that a user of the control device (for example, a detection person) can view the three-dimensional model on the control device. In this way, the user can better understand the detection result of the target vehicle by means of the three-dimensional model, and can also judge whether the detection result of the target vehicle is reasonable and accurate, which helps to improve the detection effect for the target vehicle.
In addition, the embodiment of the present application is not limited to the implementation of the above control device; for example, it may be implemented by means of an electronic device having a control function. As another example, in some application scenarios, the control device includes a terminal device and a server, so that the steps performed by the control device are implemented by means of data interaction between the terminal device and the server.
Based on the above description of the vehicle detection system, the vehicle detection system provided by the embodiment of the present application includes a control device and at least one data acquisition device to be controlled, and different data acquisition devices to be controlled are used for performing image data acquisition processing on different areas of a target vehicle, so that the image data acquired by these devices can describe the target vehicle as comprehensively as possible. In addition, the control device may be used to control (for example, remotely control) any one of the data acquisition devices to be controlled, so that that device can acquire image data that better represents the target vehicle. The control process is specifically as follows: after the control device receives a control operation triggered for a target device among the data acquisition devices to be controlled, it generates a control instruction corresponding to the target device and sends the control instruction to the target device; after the target device receives the control instruction, it performs image data acquisition processing according to at least one piece of control information carried by the control instruction (for example, position state control information and/or device state control information) to obtain image data to be used, and sends the image data to be used to the control device; the control device then generates a detection result for the target vehicle according to the image data to be used, so that the detection result can better represent the current state of the target vehicle (for example, whether an accident has occurred, whether the vehicle has been soaked in water, whether it has been damaged by fire, and so on). In this way, remote-controlled detection processing for the target vehicle can be realized, the defects of the above manual on-site detection scheme can be effectively avoided, and the vehicle detection effect can be effectively improved.
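Under assumptions, the remote-control loop just summarized can be pictured as a short exchange between the control device and the target device; every function name and message field below (including the dictionary keys used for the position-state and device-state control information) is invented for illustration rather than defined by the present application.

```python
from typing import Any, Callable, Dict, Optional

def build_control_instruction(target_device_id: str,
                              position_state: Optional[Dict[str, Any]] = None,
                              device_state: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
    """Assemble a control instruction carrying position-state and/or device-state control info."""
    instruction: Dict[str, Any] = {"target_device_id": target_device_id}
    if position_state:
        instruction["position_state_control_info"] = position_state   # e.g. target acquisition position
    if device_state:
        instruction["device_state_control_info"] = device_state       # e.g. zoom or exposure settings
    return instruction

def remote_detection_round(send: Callable[[Dict[str, Any]], bytes],
                           analyze: Callable[[bytes], str]) -> str:
    """One round: instruct the target device, receive the to-be-used image, produce a result."""
    instruction = build_control_instruction(
        "first_data_acquisition_device",
        position_state={"first_acquisition_position": "left_front_wheel_arch"},
        device_state={"zoom_level": 2},
    )
    image_bytes = send(instruction)   # device moves, acquires image data, and returns it
    return analyze(image_bytes)       # e.g. "no accident damage detected"
```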
In addition, based on the above description of the vehicle detection system, the embodiment of the present application further provides a vehicle detection method, which is described below with reference to the accompanying drawings for ease of understanding. As shown in fig. 3, the vehicle detection method provided by the embodiment of the present application may be applied to the above vehicle detection system. When the vehicle detection system includes a control device, a first data acquisition device, and a first sliding device corresponding to the first data acquisition device, where the first sliding device is disposed around a vehicle placement platform and the vehicle placement platform is used for placing a target vehicle, the vehicle detection method may include the following S1-S2. Fig. 4 is a flowchart of a vehicle detection method according to an embodiment of the present application.
S1: the control equipment controls the first data acquisition equipment to move to a first acquisition position along the first sliding device, so that the first data acquisition equipment acquires image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used;
S2: and the control equipment generates a detection result aiming at the target vehicle according to the to-be-used part image.
It should be noted that, please refer to the related content of the control device above for the related content of S1-S2, and for brevity, description thereof is omitted herein.
In one possible embodiment, the S1 specifically includes:
the control equipment responds to the control operation triggered by the first data acquisition equipment and generates a control instruction;
The control equipment sends the control instruction to the first data acquisition equipment so that the first data acquisition equipment moves to a first acquisition position according to at least one piece of control information carried by the control instruction, and performs image data acquisition on a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used; the at least one control information includes location state control information and/or device state control information.
In one possible embodiment, the system further comprises at least one of a chassis data acquisition device and an appearance data acquisition device; the chassis data acquisition equipment is used for acquiring and processing image data aiming at the chassis of the target vehicle; the appearance data acquisition device is used for carrying out image data acquisition processing on the appearance of the vehicle body of the target vehicle.
In one possible embodiment, the chassis data acquisition device comprises a second sliding device and a second data acquisition device; the second sliding device is positioned below the chassis of the target vehicle, the second sliding device being disposed around the chassis;
the method further comprises the steps of:
The control equipment controls the second data acquisition equipment to move to a second acquisition position along the second sliding device so that the second data acquisition equipment acquires image data of a chassis area corresponding to the second acquisition position to obtain a chassis area image;
The step S2 specifically comprises the following steps: and generating a detection result aiming at the target vehicle according to the to-be-used part image and the chassis area image.
In one possible embodiment, the method further comprises:
the control device controls the second sliding device to perform position adjustment processing, so that the relative positional relationship between the adjusted second sliding device and the chassis of the target vehicle satisfies a first relationship condition.
In one possible implementation manner, the first data acquisition device is a robot, and a first image acquisition device is installed at the front end of a mechanical arm of the robot;
the first image acquisition equipment is used for acquiring image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used.
In one possible embodiment, an appearance defect scanning device is installed above the body of the robot; the appearance defect scanning device is used for carrying out appearance defect identification processing on the target vehicle to obtain an appearance defect identification result, and sending the appearance defect identification result to the control device;
The step S2 specifically comprises the following steps: and generating a detection result aiming at the target vehicle according to the to-be-used part image and the appearance defect identification result.
In one possible embodiment, the method further comprises:
the control device controls the first sliding device to perform position adjustment processing, so that the relative positional relationship between the target vehicle and the first sliding device satisfies a second relationship condition.
In one possible embodiment, the method further comprises:
and after receiving the to-be-used part image, the control equipment generates the control instruction if it is determined that the to-be-used part image meets a preset updating condition.
In one possible embodiment, the method further comprises:
The control equipment generates a three-dimensional model of the target vehicle according to the to-be-used part image; the three-dimensional model carries a detection result aiming at the target vehicle;
the control device displays the three-dimensional model.
Based on the above related content of S1 to S2, in the vehicle detection method provided by the embodiment of the present application, the control device in the vehicle detection system may first control the first data acquisition device to move to the first acquisition position along the first sliding device, so that the first data acquisition device performs image data acquisition on the vehicle part corresponding to the first acquisition position to obtain the to-be-used part image; it then generates a detection result for the target vehicle according to the to-be-used part image, so that the detection result can better present the current state of the target vehicle (for example, whether an accident has occurred, whether the vehicle has been soaked in water, whether it has been damaged by fire, and so on). In this way, remote-controlled detection processing for the target vehicle can be realized, the defects of the above manual on-site detection scheme can be effectively avoided, and the vehicle detection effect can be effectively improved.
In addition, the embodiment of the application also provides electronic equipment, which comprises a processor and a memory: the memory is used for storing instructions or computer programs; the processor is configured to execute the instructions or the computer program in the memory, so that the electronic device executes any implementation mode of the vehicle detection method provided by the embodiment of the application.
Referring to fig. 5, a schematic structural diagram of an electronic device 500 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 5 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 5, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. The RAM 503 also stores various programs and data required for the operation of the electronic device 500. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or from the storage means 508, or from the ROM 502. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
The electronic device provided by the embodiment of the present disclosure belongs to the same inventive concept as the method provided by the above embodiment, and technical details not described in detail in the present embodiment can be seen in the above embodiment, and the present embodiment has the same beneficial effects as the above embodiment.
The embodiment of the application also provides a computer readable medium, wherein instructions or a computer program are stored in the computer readable medium, and when the instructions or the computer program are run on the device, the device is caused to execute any implementation mode of the vehicle detection method provided by the embodiment of the application.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the method described above.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. In some cases, the name of a unit/module does not constitute a limitation on the unit itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that, in the present description, each embodiment is described in a progressive manner, with each embodiment focusing on its differences from the other embodiments; for identical or similar parts between the embodiments, reference may be made to one another. For the system or device disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, the description is relatively brief, and for relevant points reference may be made to the description of the method.
It should be understood that in the present application, "at least one (item)" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: only A exists, only B exists, or both A and B exist, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of the following" or similar expressions refers to any combination of these items, including any combination of single items or plural items. For example, at least one (item) of a, b, or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may each be singular or plural.
It is further noted that relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may be disposed in Random Access Memory (RAM), memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (13)

1. A vehicle detection system, characterized in that the system comprises a control device, a first data acquisition device and a first sliding device corresponding to the first data acquisition device;
The first sliding device is arranged around the vehicle placing platform; the vehicle placing platform is used for placing a target vehicle;
the control device is used for controlling the first data acquisition device to move to a first acquisition position along the first sliding device so that the first data acquisition device acquires image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used;
the control device is further configured to generate a detection result for the target vehicle according to the to-be-used part image.
2. The system according to claim 1, characterized in that said control device is specifically configured to:
Generating a control instruction in response to a control operation triggered for the first data acquisition device;
The control instruction is sent to the first data acquisition equipment, so that the first data acquisition equipment moves to a first acquisition position according to at least one piece of control information carried by the control instruction, and image data acquisition is carried out on a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used; the at least one control information includes location state control information and/or device state control information.
3. The system of claim 1, further comprising at least one of a chassis data acquisition device and an appearance data acquisition device; the chassis data acquisition equipment is used for acquiring and processing image data aiming at the chassis of the target vehicle; the appearance data acquisition device is used for carrying out image data acquisition processing on the appearance of the vehicle body of the target vehicle.
4. The system according to claim 3, wherein the chassis data acquisition device comprises a second sliding device and a second data acquisition device; the second sliding device is positioned below a chassis of the target vehicle, the second sliding device being disposed around the chassis;
The control device is further used for controlling the second data acquisition device to move to a second acquisition position along the second sliding device, so that the second data acquisition device acquires image data of a chassis area corresponding to the second acquisition position to obtain a chassis area image;
The control device is specifically configured to generate a detection result for the target vehicle according to the to-be-used part image and the chassis region image.
5. The system according to claim 4, wherein the control device is further configured to control the second sliding device to perform position adjustment processing, so that the relative positional relationship between the adjusted second sliding device and the chassis of the target vehicle satisfies a first relationship condition.
6. A system according to claim 3, wherein the first data acquisition device is a robot, and a first image acquisition device is mounted at the front end of a mechanical arm of the robot;
the first image acquisition equipment is used for acquiring image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used.
7. The system of claim 6, wherein an appearance defect scanning device is installed above the body of the robot; the appearance defect scanning device is used for carrying out appearance defect identification processing on the target vehicle to obtain an appearance defect identification result, and sending the appearance defect identification result to the control device;
The control device is specifically configured to: and generating a detection result aiming at the target vehicle according to the to-be-used part image and the appearance defect identification result.
8. The system according to claim 1, wherein the control device is further configured to control the first sliding device to perform position adjustment processing, so that the relative positional relationship between the target vehicle and the first sliding device satisfies a second relationship condition.
9. The system according to claim 2, wherein the control device is further configured to generate the control instruction after receiving the portion to be used image if it is determined that the portion to be used image meets a preset update condition.
10. The system according to any one of claims 1-9, wherein the control device is further configured to generate a three-dimensional model of the target vehicle from the to-be-used site image; the three-dimensional model carries a detection result aiming at the target vehicle;
the control device is also used for displaying the three-dimensional model.
11. A vehicle detection method is characterized in that the method is applied to a vehicle detection system, and the system comprises a control device, a first data acquisition device and a first sliding device corresponding to the first data acquisition device; the first sliding device is arranged around the vehicle placing platform; the vehicle placing platform is used for placing a target vehicle;
The method comprises the following steps:
The control equipment controls the first data acquisition equipment to move to a first acquisition position along the first sliding device, so that the first data acquisition equipment acquires image data of a vehicle part corresponding to the first acquisition position to obtain an image of the part to be used;
The control device generates a detection result for the target vehicle according to the to-be-used part image.
12. An electronic device, the device comprising: a processor and a memory;
The memory is used for storing instructions or computer programs;
The processor is configured to execute the instructions or the computer program in the memory to cause the electronic device to perform the method of claim 11.
13. A computer readable medium, characterized in that it has stored therein instructions or a computer program which, when run on a device, causes the device to perform the method of claim 11.
CN202211591296.XA 2022-12-12 2022-12-12 Vehicle detection system, method, electronic device and computer readable medium Pending CN118190438A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211591296.XA CN118190438A (en) 2022-12-12 2022-12-12 Vehicle detection system, method, electronic device and computer readable medium
PCT/CN2023/137665 WO2024125426A1 (en) 2022-12-12 2023-12-08 Vehicle detection system and method, electronic device, and computer readable medium

Publications (1)

Publication Number Publication Date
CN118190438A true CN118190438A (en) 2024-06-14

Family

ID=91412639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211591296.XA Pending CN118190438A (en) 2022-12-12 2022-12-12 Vehicle detection system, method, electronic device and computer readable medium

Country Status (2)

Country Link
CN (1) CN118190438A (en)
WO (1) WO2024125426A1 (en)


Also Published As

Publication number Publication date
WO2024125426A1 (en) 2024-06-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination