CN116818070A - Jump pound identification method, apparatus, device and storage medium - Google Patents

Jump pound identification method, apparatus, device and storage medium

Info

Publication number
CN116818070A
CN116818070A (application number CN202310773313.XA)
Authority
CN
China
Prior art keywords
wagon balance
target vehicle
image
monitoring
wagon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310773313.XA
Other languages
Chinese (zh)
Inventor
林国森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Chuangxin Qizhi Technology Group Co ltd
Original Assignee
Qingdao Chuangxin Qizhi Technology Group Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Chuangxin Qizhi Technology Group Co ltd filed Critical Qingdao Chuangxin Qizhi Technology Group Co ltd
Priority to CN202310773313.XA priority Critical patent/CN116818070A/en
Publication of CN116818070A publication Critical patent/CN116818070A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01G WEIGHING
    • G01G19/00 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
    • G01G19/02 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups for weighing wheeled or rolling bodies, e.g. vehicles
    • G01G19/03 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups for weighing wheeled or rolling bodies, e.g. vehicles for weighing during motion
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00 Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/64 Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P3/68 Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a jump pound identification method, apparatus, device, and storage medium. The method comprises the following steps: collecting, through a monitoring camera, a monitoring image of a target vehicle passing through a wagon balance area, wherein the monitoring image includes an image acquired while the target vehicle passes over the wagon balance; after identifying, based on the monitoring image, that the wagon balance is blocked, calculating the moving speed of the target vehicle passing over the wagon balance; and determining whether the target vehicle exhibits jump-pound behavior by judging whether the moving speed is abnormal. Because the method photographs the vehicle passing through the wagon balance area, judges from the images whether the wagon balance is blocked, calculates the forward speed of the vehicle head, and thereby identifies whether the vehicle cheats by jumping the wagon balance while crossing it, no axle-type and tire-type feature library needs to be continuously maintained, the identification result is more reliable, the installation angle and position of the camera are more flexible, and the deployment cost is lower; the method is simple and reliable.

Description

Jump pound identification method, apparatus, device and storage medium
Technical Field
The present application relates to the field of image recognition technologies, and in particular, to a jump pound identification method, apparatus, device, and storage medium.
Background
The electronic truck scale, commonly known as a wagon balance, is an accurate and convenient weighing and metering device that has for many years been increasingly applied in industries such as logistics, steel, building materials, coal, and asphalt, and it plays a very important role in unattended weighing systems. Jump pound (also called scale jumping) is a cheating method in which the driver precisely controls the vehicle's clutch, accelerator, and brakes so that the tires accelerate while crossing the wagon balance; by the principle of inertia, this changes the force on the weighing platform and reduces the measured axle weight.
At present, because vehicles come in many axle and tire configurations, axle-type and tire-type information for numerous vehicle models is needed to build a vehicle-type feature library, against which the observed information is then matched; the identification result is therefore not reliable enough. In addition, the angle and position of the monitoring camera are strictly constrained, incurring high installation and deployment costs.
Disclosure of Invention
In view of the above, an object of the embodiments of the present application is to provide a jump pound identification method, apparatus, device, and storage medium, in which an image of a vehicle passing through the wagon balance area is captured by a monitoring camera, whether the wagon balance is blocked is identified from the image, the forward speed of the vehicle head is calculated, and whether the vehicle cheats by jumping the wagon balance while crossing it is thereby identified.
In a first aspect, an embodiment of the present application provides a jump pound identification method, where the method includes: collecting a monitoring image of a target vehicle passing through a wagon balance area through a monitoring camera; wherein the monitoring image includes: an image acquired when the target vehicle passes over the wagon balance; after identifying, based on the monitoring image, that the wagon balance is blocked, calculating the moving speed of the target vehicle passing over the wagon balance; and determining whether the target vehicle exhibits jump-pound behavior according to the judgment of whether the moving speed is abnormal.
In the above implementation, the monitoring camera photographs the vehicle passing through the wagon balance area, whether the wagon balance is blocked is identified from the image, the forward speed of the vehicle head is calculated, and whether the vehicle cheats by jumping the wagon balance is identified while it crosses; no axle-type and tire-type feature library needs to be continuously maintained, so the identification result is more reliable, the installation angle and position of the camera are more flexible, the deployment cost is lower, and the method is simple and reliable.
Optionally, the monitoring image further includes: a wagon balance image acquired when the target vehicle does not pass through the wagon balance; after the wagon balance is identified to be blocked based on the monitoring image, calculating the moving speed of the target vehicle passing through the wagon balance comprises the following steps: extracting the wagon balance image in the monitoring image; comparing the wagon balance image with a wagon balance position area image corresponding to a plurality of continuously acquired monitoring images to obtain a comparison similarity; and if the comparison similarity exceeds a first preset threshold value, judging that the wagon balance is blocked, and calculating the moving speed of the target vehicle passing through the wagon balance.
In this implementation, whether the wagon balance is blocked is judged by analyzing a plurality of continuously collected monitoring images; this identifies whether a vehicle is being weighed, facilitates the subsequent identification of jump-pound behavior, and improves identification accuracy.
Optionally, the determining that the wagon balance is blocked, and calculating the moving speed of the target vehicle through the wagon balance, includes: extracting the central point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image; calculating the pixel distance between the center point position in the current monitoring image and the corresponding center point position in the monitoring image acquired in the previous time at intervals of a first preset time; and determining the moving speed of the target vehicle passing through the wagon balance based on the first preset time and the pixel distance.
In this implementation, center points are extracted from monitoring images shot continuously at preset intervals, the pixel distance between them is calculated, and the vehicle speed is derived; this is convenient and fast and improves jump pound identification efficiency.
Optionally, the determining that the wagon balance is blocked, calculating a moving speed of the target vehicle passing through the wagon balance, includes: calibrating the travelling route of the target vehicle passing through the wagon balance in the field of view of the monitoring camera to obtain the actual distance corresponding to the pixel distance; extracting the central point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image; calculating the actual distance between the center point position in the current monitoring image and the corresponding center point position in the monitoring image acquired in the previous time at intervals of a first preset time; and determining the moving speed of the target vehicle passing through the wagon balance based on the first preset time and the actual distance.
In this implementation, center points are extracted from monitoring images shot continuously at preset intervals, the pixel distance between them is calculated and converted to an actual distance through a calibration algorithm, and the vehicle speed is then derived; this is convenient, fast, and more accurate, improving jump pound identification efficiency and accuracy.
Optionally, the extracting a center point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image includes: and detecting and identifying a rectangular frame to which the headstock belongs in the monitoring image based on the yolov5 model, and calculating the center point position of the rectangular frame.
In this implementation, the lighter Yolov5 model framework is used to detect the vehicle-head rectangular frame in the monitoring image; it is faster and more accurate, improving jump pound identification efficiency and accuracy.
Optionally, the comparing the wagon balance image with the wagon balance position area image corresponding to the plurality of continuously collected monitoring images to obtain a comparison similarity includes: and comparing the wagon balance image with the wagon balance position area image corresponding to the continuously acquired monitoring images through a structural similarity algorithm to obtain the comparison similarity.
In this implementation, the structural similarity algorithm is used to compute the similarity between the monitoring images; it is fast and accurate, improving jump pound identification efficiency and accuracy.
Optionally, the determining whether the target vehicle has a skip behavior according to the determination of whether the moving speed is abnormal or not includes: comparing the moving speed with the average value of the moving speed of the target vehicle in a second preset time before the target vehicle passes through the wagon balance to obtain a speed deviation value; and if the speed deviation value exceeds a second preset threshold value, determining that the target vehicle has a jump behavior.
In this implementation, jump-pound behavior is judged by comparing the moving speed on the wagon balance with the average moving speed of the target vehicle over a period before it reaches the wagon balance; this avoids misjudging unintentional violations, such as a driver hastily adjusting speed just after weighing begins, and improves jump pound identification accuracy.
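A minimal sketch of this speed-deviation check follows; the function name, the relative-deviation formulation, and the 0.3 threshold are illustrative assumptions for the example, not values fixed by the application:

```python
import numpy as np

def has_jump_behavior(speed_on_scale, speeds_before, deviation_threshold=0.3):
    """Judge jump-pound behavior by comparing the speed measured while the
    vehicle is on the wagon balance against its average speed over the
    second preset time before it reached the scale."""
    baseline = float(np.mean(speeds_before))
    # Relative deviation of the on-scale speed from the approach baseline.
    deviation = abs(speed_on_scale - baseline) / baseline
    return deviation > deviation_threshold
```

For example, a truck that approached at a steady 0.6 m/s but crossed the scale at 1.0 m/s deviates by roughly 67% from its baseline and would be flagged.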
In a second aspect, an embodiment of the present application provides a device for identifying a skip pound, the device including: the image acquisition module is used for acquiring a monitoring image of the target vehicle passing through the wagon balance area through the monitoring camera; the identification calculation module is used for calculating the moving speed of the target vehicle passing through the wagon balance after the wagon balance is identified to be blocked based on the monitoring image; and the jump weight judging module is used for determining whether the jump weight behavior of the target vehicle exists according to the judgment on whether the moving speed is abnormal or not.
In a third aspect, an embodiment of the present application further provides an electronic device, including: a processor and a memory storing machine-readable instructions executable by the processor; when the electronic device runs, the instructions, executed by the processor, perform the steps of the method described above.
In a fourth aspect, embodiments of the present application provide a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method described above.
In order to make the above objects, features and advantages of the present application more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and should not be considered as limiting the scope, and other related drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for identifying a skip pound according to an embodiment of the present application;
FIG. 2 is a schematic view of a wagon balance zone according to an embodiment of the present application;
fig. 3 is a schematic functional block diagram of a jump pound identification device according to an embodiment of the present application;
fig. 4 is a block diagram of an electronic device provided with a jump pound identification apparatus according to an embodiment of the present application.
Reference numerals: 210-an image acquisition module; 220-identifying a computing module; 230-jump pound judging module; 300-an electronic device; 311-memory; 312-a storage controller; 313-processor; 314-peripheral interface; 315-an input-output unit; 316-display unit.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. The terms "first," "second," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
The inventor has noticed that, in the prior art, video acquisition devices and image processors can identify a vehicle's axle-type and tire-type information and thereby diagnose cheating behaviors such as jumping the scale, dragging the scale, and driving an S-shaped path; that is, jump-pound behavior can be found through recognition of axle-type and tire-type information. However, because vehicles come in many axle and tire configurations, axle-type and tire-type information for numerous vehicle models is needed to build a vehicle-type feature library against which the observed information is matched, and the identification result is not reliable enough. Moreover, the angle and position of the monitoring camera are strictly constrained, incurring high installation and deployment costs. Therefore, to obtain the effects of not having to continuously maintain an axle-type and tire-type feature library, a more reliable identification result, a more flexible camera installation angle and position, and a lower deployment cost, the embodiments of the present application provide the jump pound identification method, apparatus, device, and storage medium described below.
Referring to fig. 1, fig. 1 is a flowchart of a jump pound identification method according to an embodiment of the present application. The embodiments of the present application will be explained in detail below. The method comprises the following steps: step 100, step 120 and step 140.
Step 100: collecting a monitoring image of a target vehicle passing through a wagon balance area through a monitoring camera; wherein, the monitoring image includes: an image acquired when a target vehicle passes through a wagon balance;
step 120: after the wagon balance is identified to be blocked based on the monitoring image, calculating the moving speed of the target vehicle passing through the wagon balance;
step 140: and determining whether the target vehicle has the jump behavior according to the judgment on whether the moving speed is abnormal.
Illustratively, the monitoring camera may be an industrial camera installed anywhere above the wagon balance that can photograph and record, from above, the entire area where the wagon balance is located and the scene of vehicles passing over it, storing the footage as image frames. For example, as shown in fig. 2, the monitoring camera is installed directly above the wagon balance and shoots downward; within the speed-measuring area, the monitoring system continuously observes the truck's behavior, and when a clear speed anomaly is found in this area, the truck is judged to be suspected of violations such as jumping the wagon balance.
Alternatively, the monitoring camera shown in fig. 2 continuously monitors and photographs the target vehicle to be weighed; the captured images may cover the whole process, both while the target vehicle is within the wagon balance position area and while it is not. The target vehicle may be a large truck. When the monitoring image shows that the wagon balance is blocked, i.e., the truck is being weighed, the moving speed (travel speed) of the truck in the speed-measuring area is calculated in real time from the monitoring images, and the camera is used to judge whether the truck shifts its center of gravity by accelerating or decelerating in order to jump the wagon balance — that is, whether the truck keeps a steady speed while crossing. If not, the truck is suspected of violations such as jumping the wagon balance. Since the speed measurement reflects the vehicle speed, and the truck's actual speed is expressed in real-world distance while in the camera's field of view it is expressed in pixel distance, the moving speed calculated from either the pixel distance or the actual distance can serve as the measure of vehicle speed.
By photographing the vehicle passing through the wagon balance area with the monitoring camera, judging from the images whether the wagon balance is blocked, calculating the forward speed of the vehicle head, and thereby identifying whether the vehicle cheats by jumping the wagon balance while crossing it, no axle-type and tire-type feature library needs to be continuously maintained, the identification result is more reliable, the installation angle and position of the camera are more flexible, the deployment cost is lower, and the method is simple and reliable.
In one embodiment, the monitoring image further comprises: a wagon balance image acquired when the target vehicle does not pass through the wagon balance; step 120 may include: step 121, step 122 and step 123.
Step 121: extracting a wagon balance image in the monitoring image;
step 122: comparing the wagon balance image with the wagon balance position area image corresponding to the continuously acquired monitoring images to obtain a comparison similarity;
step 123: if the comparison similarity exceeds a first preset threshold value, judging that the wagon balance is blocked, and calculating the moving speed of the target vehicle passing through the wagon balance.
For example, since the monitoring camera generally shoots continuously, the monitoring images may cover the whole process, both while the target vehicle is within the wagon balance position area and while it is not. The wagon balance image may be a frame captured before the target vehicle enters the wagon balance position area; its purpose is to serve as a reference picture taken while no vehicle is on the wagon balance, against which newly captured pictures are continuously compared to obtain their similarity to the reference. The region compared is the wagon balance area, the aim being to judge whether a vehicle is above the wagon balance: if the wagon balance is blocked, a vehicle is being weighed. Optionally, once a vehicle head is first detected in a frame from the monitoring camera, the image of the wagon balance region in that frame is saved, and as the vehicle travels this region is continuously compared against the corresponding wagon balance region in subsequent frames. When the comparison result exceeds the first preset threshold — that is, the difference is large and the similarity of the wagon balance region falls below, for example, 0.8 — the wagon balance is regarded as blocked, indicating that a vehicle is being weighed. With 0.8 as the first preset threshold, a wagon balance region whose picture similarity is below 0.8 is considered blocked, i.e., a vehicle is passing; the first preset threshold may be set according to actual conditions.
Judging whether the wagon balance is blocked by analyzing a plurality of continuously collected monitoring images identifies whether a vehicle is being weighed, facilitates the subsequent identification of jump-pound behavior, and improves identification accuracy.
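The occlusion check described above can be sketched as follows. The text names a structural similarity algorithm; this sketch implements a simplified, single-window form of the SSIM index in NumPy (a windowed implementation such as scikit-image's `structural_similarity` could be substituted), and the 0.8 threshold follows the example in the text:

```python
import numpy as np

def global_ssim(ref, img, data_range=255.0):
    """Single-window structural similarity between two grayscale images:
    a simplified, non-windowed form of the SSIM index."""
    ref = ref.astype(np.float64)
    img = img.astype(np.float64)
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mu_x, mu_y = ref.mean(), img.mean()
    var_x, var_y = ref.var(), img.var()
    cov = ((ref - mu_x) * (img - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))

def scale_is_blocked(ref_region, cur_region, threshold=0.8):
    """The wagon balance region is considered blocked when its similarity
    to the empty-scale reference falls below the threshold."""
    return global_ssim(ref_region, cur_region) < threshold
```

Comparing the saved empty-scale reference region against the same region cropped from each new frame then yields a per-frame blocked/unblocked decision.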
In one embodiment, "determining that the wagon balance is blocked, and calculating the moving speed of the target vehicle through the wagon balance" in step 123 may include: step 1231, step 1232, and step 1233.
Step 1231: extracting the center point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image;
step 1232: calculating the pixel distance between the position of the central point in the current monitoring image and the position of the corresponding central point in the monitoring image acquired in the previous time at intervals of a first preset time;
step 1233: and determining the moving speed of the target vehicle passing through the wagon balance based on the first preset time and the pixel distance.
The monitoring camera continuously monitors and photographs the target vehicle to be weighed, obtaining images along the vehicle's trajectory over the wagon balance, and frames containing the vehicle head are extracted at different detection times so that the vehicle's motion can be analyzed from the head images. In each frame, spaced by the first preset time, a pre-trained deep-learning vehicle-head detection model can identify the pixel at the center point of the rectangular frame to which the head of the target vehicle belongs; the pixel distance between successive center points is calculated, and the vehicle's current travel speed — i.e., the moving speed of the target vehicle over the wagon balance — is computed from the pixel distance and the interval using the distance-speed formula. The first preset time may be 0.5 seconds, 1 second, 2 seconds, 3 seconds, or another value, set according to actual conditions. Extracting center points from images shot continuously at preset intervals and computing the pixel distance between them makes the speed calculation convenient and fast and improves jump pound identification efficiency.
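The pixel-distance speed computation reduces to the distance-speed formula; a minimal sketch, with an illustrative function name:

```python
def pixel_speed(center_prev, center_cur, interval_s):
    """Moving speed in pixels per second, from two head-center positions
    captured interval_s seconds apart (distance-speed formula)."""
    dx = center_cur[0] - center_prev[0]
    dy = center_cur[1] - center_prev[1]
    # Euclidean pixel distance divided by the first preset time.
    return (dx ** 2 + dy ** 2) ** 0.5 / interval_s
```

A head center moving from (0, 0) to (3, 4) over a 1-second interval gives 5 pixels per second.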
In one embodiment, "determining that the wagon balance is blocked, and calculating the moving speed of the target vehicle through the wagon balance" in step 123 may include: step 1234, step 1235, step 1236, and step 1237.
Step 1234: calibrating the travelling route of the target vehicle passing through the wagon balance in the field of view of the monitoring camera to obtain the actual distance corresponding to the pixel distance;
step 1235: extracting the center point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image;
step 1236: calculating the actual distance between the position of the central point in the current monitoring image and the position of the corresponding central point in the monitoring image acquired in the previous time at intervals of a first preset time;
step 1237: and determining the moving speed of the target vehicle passing through the wagon balance based on the first preset time and the actual distance.
For example, to obtain a more accurate real moving speed, the pixel distance is mapped to the real-world distance, and the two can be converted equivalently. The vehicle's travel line can first be calibrated: measure what actual distance the pixel spacing at different positions in the picture represents, then compute the vehicle's actual travel speed from the pixel distance the head moves each second. This more effectively handles camera picture distortion and shooting-angle deviation. Within the camera's field of view, a traffic cone can be placed every 1 meter (or 0.5 meter), or marks such as white or red dots can be drawn, along the entire straight route the vehicle follows. Then, in the monitoring picture, the pixel distances between adjacent mark center points are computed, giving the pixel distance in the picture that corresponds to an actual distance of 1 meter at each interval (because the picture may be distorted, the pixel spacing differs slightly at different positions). In the image frame, these points are connected by a straight line, denoted L. When the vehicle passes, the center point X of the head's rectangular frame is mapped onto the line L: drop a perpendicular from X to L, and the foot of the perpendicular is the point Y at which the head center maps onto L. By computing the position of Y along L at each interval, the distance traveled by the vehicle can be converted.
Alternatively, a plurality of spaced points S0, S1, S2, S3, S4, S5, etc. may be provided on the straight line L, with the distance between two adjacent points being 1 meter in the real world. At time T0, the projection of the vehicle-head center point on the line L is Y0, where Y0 lies between S2 and S3 and the distance from Y0 to S2 is 0.2 of the distance from S2 to S3. At time T1, one second later, the projection of the vehicle-head center point on L is Y1, where Y1 lies between S2 and S3 and the distance from Y1 to S2 is 0.8 of the distance from S2 to S3. The actual distance traveled by the vehicle head is therefore (0.8 - 0.2) x 1 m = 0.6 m, so the current actual speed of the vehicle is 0.6 m/s.
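The projection-and-marker scheme above can be sketched in a few lines of Python. All coordinates, marker positions and function names below are illustrative assumptions, not taken from the application; the sketch assumes markers placed 1 meter apart along a straight calibration line L, and reproduces the worked example (Y moving from 0.2 to 0.8 of the S2-S3 interval in one second).

```python
import numpy as np

def project_onto_line(p, a, b):
    """Perpendicular projection of point p onto the line through a and b."""
    a, b, p = (np.asarray(v, dtype=float) for v in (a, b, p))
    ab = b - a
    t = np.dot(p - a, ab) / np.dot(ab, ab)
    return a + t * ab

def fractional_position(y, markers):
    """Express projected point y as a fractional index along the marker
    points (1 m apart in the real world); e.g. 2.2 means 0.2 m past S2."""
    y = np.asarray(y, dtype=float)
    for i in range(len(markers) - 1):
        a = np.asarray(markers[i], dtype=float)
        b = np.asarray(markers[i + 1], dtype=float)
        seg = b - a
        t = np.dot(y - a, seg) / np.dot(seg, seg)
        if 0.0 <= t <= 1.0 or i == len(markers) - 2:
            return i + t
    return float(len(markers) - 1)

# Markers S0..S5 on a straight line L, 1 m apart in the real world
markers = [(100 + 80 * i, 400) for i in range(6)]

# Head-center detections at T0 and T1 (one second apart), projected onto L
y0 = project_onto_line((276, 380), markers[0], markers[-1])  # ~ S2 + 0.2
y1 = project_onto_line((324, 380), markers[0], markers[-1])  # ~ S2 + 0.8

# Fractional index difference is meters traveled; interval is 1 second
speed = (fractional_position(y1, markers) - fractional_position(y0, markers)) / 1.0
print(f"estimated speed: {speed:.1f} m/s")  # estimated speed: 0.6 m/s
```

Because each marker interval is measured in pixels independently, this conversion stays valid even when lens distortion makes equal real-world intervals occupy unequal pixel distances.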
By extracting the center points from monitoring images continuously captured at preset time intervals, calculating the pixel distance between the center points, converting it to an actual distance through the calibration algorithm, and then calculating the vehicle speed, the method is convenient and fast and has higher accuracy, improving the efficiency and accuracy of jump-pound identification.
In one embodiment, step 1231 or step 1235 may include: step 1238.
Step 1238: detecting and identifying, based on the YOLOv5 model, the rectangular frame to which the vehicle head belongs in the monitoring image, and calculating the center point position of the rectangular frame.
Exemplarily, YOLOv5 is a deep-learning-based target detection algorithm and an improved version of YOLOv4; it adopts a lighter model architecture, giving higher speed and higher precision. Since the rectangle is a very basic shape in vision tasks, YOLOv5 can be used directly for rectangular-box inference. In YOLOv5, the input monitoring image is divided into a number of grid cells, each responsible for predicting bounding boxes. For rectangular objects, the bounding box is a rectangle described by the coordinates of its upper-left and lower-right corners, which can be specified in the YOLOv5 configuration file; once configured, YOLOv5 can be used to predict the position of the rectangular object. The specific steps may be as follows: input the monitoring image to be detected into the model to obtain the confidence and coordinate information of each bounding box; draw the corresponding rectangular frame according to the coordinates of each bounding box whose confidence exceeds the set threshold, thereby identifying the rectangular frame to which the vehicle head belongs in the monitoring image; and calculate the center point position of the rectangular frame from its pixel coordinates. Using the lighter YOLOv5 model architecture to identify the vehicle-head rectangular frame in the monitoring image is faster and more precise, improving the efficiency and accuracy of jump-pound identification.
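The post-processing just described (filter boxes by confidence, keep the vehicle-head box, compute its center) can be sketched as follows. In practice the detections would come from a trained YOLOv5 model (e.g. loaded via `torch.hub`); here they are hard-coded sample rows in the xyxy-plus-confidence format, and the function name and threshold are illustrative assumptions.

```python
import numpy as np

CONF_THRESH = 0.5  # hypothetical confidence threshold for accepting a box

def head_center(detections, conf_thresh=CONF_THRESH):
    """detections: rows of (x1, y1, x2, y2, confidence) in YOLOv5-style
    xyxy format. Returns the center of the highest-confidence box above
    the threshold, or None if no box qualifies."""
    det = np.asarray(detections, dtype=float).reshape(-1, 5)
    det = det[det[:, 4] >= conf_thresh]
    if det.size == 0:
        return None
    x1, y1, x2, y2 = det[np.argmax(det[:, 4])][:4]
    return (float((x1 + x2) / 2), float((y1 + y2) / 2))

dets = [(120, 200, 320, 360, 0.91),   # vehicle-head box
        (400, 210, 460, 260, 0.30)]   # low-confidence clutter, filtered out
print(head_center(dets))  # (220.0, 280.0)
```

The center returned here is the point X that step 1231/1235 projects onto the calibration line L.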
In one embodiment, step 122 may include: step 125.
Step 125: and comparing the wagon balance image with the wagon balance position area image corresponding to the continuously acquired monitoring images through a structural similarity algorithm to obtain the comparison similarity.
Illustratively, the structural similarity algorithm may be SSIM (Structural Similarity), an index for measuring the similarity of two images of the same size, mainly used to detect how similar two images are or how distorted an image is. The monitoring images may include the whole process of the target vehicle approaching and passing through the wagon balance position area. The wagon balance image may be a frame, captured from the monitoring images, in which no target vehicle occupies the wagon balance position area. The brightness, contrast and structure of the wagon balance image and each continuously acquired monitoring image are compared, the three elements are weighted and combined as a product, and the corresponding similarity is calculated; 0.8 may be used as the threshold, and when the similarity of the wagon balance area falls below 0.8, the area is regarded as blocked, i.e., a vehicle is passing. Using the structural similarity algorithm to calculate the similarity of the monitoring images is fast and precise, improving the efficiency and accuracy of jump-pound identification.
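A minimal sketch of this comparison step, using a simplified single-window SSIM that combines the luminance, contrast and structure terms as a product (production code would typically use a library implementation such as scikit-image's `structural_similarity`, which computes SSIM over local windows). The stand-in images and the 0.8 threshold below are illustrative.

```python
import numpy as np

def ssim(img1, img2, data_range=255.0):
    """Simplified global SSIM: luminance, contrast and structure terms
    combined as a product, using the standard stabilizing constants."""
    x = np.asarray(img1, dtype=float)
    y = np.asarray(img2, dtype=float)
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return float((2 * mx * my + c1) * (2 * cov + c2)
                 / ((mx**2 + my**2 + c1) * (vx + vy + c2)))

rng = np.random.default_rng(0)
empty_scale = rng.integers(0, 256, (64, 64)).astype(float)  # stand-in for the empty wagon-balance region
occluded = np.full((64, 64), 30.0)                          # scale region covered by a vehicle

print(round(ssim(empty_scale, empty_scale), 3))  # 1.0 -- identical images
print(ssim(empty_scale, occluded) < 0.8)         # True -> region treated as blocked
```

Comparing the stored empty-scale frame against each incoming frame's scale region, a similarity below the threshold marks the scale as occluded and triggers the speed calculation.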
In one embodiment, step 140 may include: step 141 and step 142.
Step 141: comparing the moving speed with the average value of the moving speed of the target vehicle in a second preset time before the target vehicle passes through the wagon balance to obtain a speed deviation value;
step 142: and if the speed deviation value exceeds the second preset threshold value, determining that the target vehicle has the jump behavior.
For example, in an actual scenario the truck driver is required to keep a constant-speed driving state before weighing, to prevent the driver from hastily adjusting the speed just as the truck reaches the scale and committing an unintentional foul; the second preset time may therefore be set to 4 seconds, 5 seconds, 6 seconds, or other durations that give the driver time to react. Through experimental verification, a statistical threshold is obtained from a large number of speed deviation ratios recorded during abnormal and uniform driving; the second preset threshold may be set to 10% or a nearby value, i.e., driving is judged abnormal when the speed variation ratio exceeds 10%. Optionally, the truck speed calculated in the above step is compared with the average speed over the previous 5 seconds: when the speed deviation is higher than 10%, the truck's driving is judged abnormal, the vehicle is deemed to exhibit jump-pound behavior, and alarm information is sent; if the deviation is below 10%, monitoring continues. Comparing the moving speed with the average moving speed over a period before the target vehicle passes the wagon balance identifies abnormal jump-pound behavior, avoids penalizing only the hurried speed adjustment made at weighing time, and improves jump-pound identification accuracy.
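The deviation check of steps 141 and 142 can be sketched as follows; the function name, the sample speeds and the 10% default threshold are illustrative assumptions.

```python
def is_skip_pound(current_speed, speed_history, threshold=0.10):
    """Compare the current speed against the mean speed over the preceding
    window (e.g. the last 5 one-second samples). A deviation ratio above
    the threshold is flagged as abnormal driving (possible jump-pound)."""
    if not speed_history:
        return False  # no history yet: nothing to compare against
    mean_speed = sum(speed_history) / len(speed_history)
    if mean_speed == 0:
        return False
    deviation = abs(current_speed - mean_speed) / mean_speed
    return deviation > threshold

history = [0.60, 0.61, 0.59, 0.60, 0.60]  # steady approach speeds, m/s
print(is_skip_pound(0.61, history))  # False -- within 10% of the mean
print(is_skip_pound(0.45, history))  # True  -- sudden slowdown on the scale
```

A True result would trigger the alarm path described above; a False result lets monitoring continue.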
Referring to fig. 3, fig. 3 is a schematic functional block diagram of a jump pound identification device according to an embodiment of the present application. The device comprises: an image acquisition module 210, an identification calculation module 220 and a skip-pound judgment module 230.
The image collecting module 210 is configured to collect, by using a monitoring camera, a monitoring image of a target vehicle passing through a wagon balance area;
the recognition calculation module 220 is used for calculating the moving speed of the target vehicle passing through the wagon balance after recognizing that the wagon balance is blocked based on the monitoring image;
the skip-pound judging module 230 is configured to determine, according to whether the moving speed is abnormal, whether the target vehicle exhibits skip-pound behavior.
Optionally, the monitoring image further includes: a wagon balance image acquired when the target vehicle does not pass through the wagon balance; the identification calculation module 220 may be configured to:
extracting the wagon balance image in the monitoring image;
comparing the wagon balance image with a wagon balance position area image corresponding to a plurality of continuously acquired monitoring images to obtain a comparison similarity;
and if the comparison similarity exceeds a first preset threshold value, judging that the wagon balance is blocked, and calculating the moving speed of the target vehicle passing through the wagon balance.
Alternatively, the identification calculation module 220 may be configured to:
extracting the central point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image;
calculating, at intervals of a first preset time, the pixel distance between the center point position in the current monitoring image and the corresponding center point position in the previously acquired monitoring image;
and determining the moving speed of the target vehicle passing through the wagon balance based on the first preset time and the pixel distance.
Alternatively, the identification calculation module 220 may be configured to:
calibrating the travelling route of the target vehicle passing through the wagon balance in the field of view of the monitoring camera to obtain the actual distance corresponding to the pixel distance;
extracting the central point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image;
calculating, at intervals of a first preset time, the actual distance between the center point position in the current monitoring image and the corresponding center point position in the previously acquired monitoring image;
and determining the moving speed of the target vehicle passing through the wagon balance based on the first preset time and the actual distance.
Alternatively, the identification calculation module 220 may be configured to:
and detecting and identifying, based on the yolov5 model, the rectangular frame to which the vehicle head belongs in the monitoring image, and calculating the center point position of the rectangular frame.
Alternatively, the identification calculation module 220 may be configured to:
and comparing the wagon balance image with the wagon balance position area image corresponding to the continuously acquired monitoring images through a structural similarity algorithm to obtain the comparison similarity.
Optionally, the skip-pound judging module 230 may be configured to:
comparing the moving speed with the average value of the moving speed of the target vehicle in a second preset time before the target vehicle passes through the wagon balance to obtain a speed deviation value;
and if the speed deviation value exceeds a second preset threshold value, determining that the target vehicle has a jump behavior.
Referring to fig. 4, fig. 4 is a block schematic diagram of an electronic device. The electronic device 300 may include a memory 311, a memory controller 312, a processor 313, a peripheral interface 314, an input output unit 315, a display unit 316. It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 4 is merely illustrative and is not intended to limit the configuration of the electronic device 300. For example, electronic device 300 may also include more or fewer components than shown in FIG. 4, or have a different configuration than shown in FIG. 4.
The above-mentioned memory 311, memory controller 312, processor 313, peripheral interface 314, input/output unit 315, and display unit 316 are electrically connected directly or indirectly to each other to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The processor 313 is used to execute executable modules stored in the memory.
The memory 311 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), etc. The memory 311 is configured to store a program, and the processor 313 executes the program after receiving an execution instruction; the method executed by the electronic device 300 defined by the process disclosed in any embodiment of the present application may be applied to the processor 313 or implemented by the processor 313.
The processor 313 may be an integrated circuit chip having signal processing capabilities. The processor 313 may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (digital signal processor, DSP for short), application specific integrated circuits (Application Specific Integrated Circuit, ASIC for short), field Programmable Gate Arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components. The disclosed methods, steps, and logic blocks in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The peripheral interface 314 couples various input/output devices to the processor 313 and the memory 311. In some embodiments, the peripheral interface 314, the processor 313, and the memory controller 312 may be implemented in a single chip. In other examples, they may be implemented by separate chips.
The input/output unit 315 is used for a user to provide input data. The input/output unit 315 may be, but is not limited to, a mouse, a keyboard, and the like.
The display unit 316 provides an interactive interface (e.g., a user interface) between the electronic device 300 and a user. In this embodiment, the display unit 316 may be a liquid crystal display or a touch display, which may display the process of the program being executed by the processor.
The electronic device 300 in this embodiment may be used to perform each step in each method provided in the embodiment of the present application.
Furthermore, an embodiment of the application also provides a storage medium on which a computer program is stored; when executed by a processor, the computer program performs the steps in the above method embodiments.
The computer program product of the above method according to the embodiment of the present application includes a storage medium storing program codes, where the instructions included in the program codes may be used to execute the steps in the above method embodiment, and specifically, reference may be made to the above method embodiment, which is not repeated herein.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, and the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form. The functional modules in the embodiment of the application can be integrated together to form a single part, or each module can exist alone, or two or more modules can be integrated to form a single part.
It should be noted that the functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and variations will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method of identifying a skip pound, the method comprising:
collecting a monitoring image of a target vehicle passing through a wagon balance area through a monitoring camera; wherein the monitoring image includes: an image acquired when a target vehicle passes through a wagon balance;
after identifying, based on the monitoring image, that the wagon balance is blocked, calculating the moving speed of the target vehicle passing through the wagon balance;
and determining whether the target vehicle has a jump behavior according to the judgment on whether the moving speed is abnormal.
2. The method of claim 1, wherein the monitoring image further comprises: a wagon balance image acquired when the target vehicle does not pass through the wagon balance;
after the wagon balance is identified to be blocked based on the monitoring image, calculating the moving speed of the target vehicle passing through the wagon balance comprises the following steps:
extracting the wagon balance image in the monitoring image;
comparing the wagon balance image with a wagon balance position area image corresponding to a plurality of continuously acquired monitoring images to obtain a comparison similarity;
and if the comparison similarity exceeds a first preset threshold value, judging that the wagon balance is blocked, and calculating the moving speed of the target vehicle passing through the wagon balance.
3. The method of claim 2, wherein the determining that the wagon balance is blocked and calculating a movement speed of a target vehicle through the wagon balance include:
extracting the central point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image;
calculating, at intervals of a first preset time, the pixel distance between the center point position in the current monitoring image and the corresponding center point position in the previously acquired monitoring image;
and determining the moving speed of the target vehicle passing through the wagon balance based on the first preset time and the pixel distance.
4. The method of claim 2, wherein the determining that the wagon balance is occluded, calculating a movement speed of a target vehicle past the wagon balance, comprises:
calibrating the travelling route of the target vehicle passing through the wagon balance in the field of view of the monitoring camera to obtain the actual distance corresponding to the pixel distance;
extracting the central point position of a rectangular frame to which the head of the target vehicle belongs in the monitoring image;
calculating, at intervals of a first preset time, the actual distance between the center point position in the current monitoring image and the corresponding center point position in the previously acquired monitoring image;
and determining the moving speed of the target vehicle passing through the wagon balance based on the first preset time and the actual distance.
5. The method according to claim 3 or 4, wherein the extracting the center point position of the rectangular frame to which the head of the target vehicle belongs in the monitoring image includes:
and detecting and identifying, based on the yolov5 model, the rectangular frame to which the vehicle head belongs in the monitoring image, and calculating the center point position of the rectangular frame.
6. The method according to claim 2, wherein comparing the wagon balance image with a wagon balance position area image corresponding to a plurality of continuously acquired monitoring images to obtain a comparison similarity comprises:
and comparing the wagon balance image with the wagon balance position area image corresponding to the continuously acquired monitoring images through a structural similarity algorithm to obtain the comparison similarity.
7. The method of claim 1, wherein determining whether the target vehicle has a skip behavior based on the determination of whether the movement speed is abnormal comprises:
comparing the moving speed with the average value of the moving speed of the target vehicle in a second preset time before the target vehicle passes through the wagon balance to obtain a speed deviation value;
and if the speed deviation value exceeds a second preset threshold value, determining that the target vehicle has a jump behavior.
8. A skip-pound identification device, the device comprising:
the image acquisition module is used for acquiring a monitoring image of the target vehicle passing through the wagon balance area through the monitoring camera;
the identification calculation module is used for calculating the moving speed of the target vehicle passing through the wagon balance after the wagon balance is identified to be blocked based on the monitoring image;
and the skip-pound judging module is used for determining, according to whether the moving speed is abnormal, whether the target vehicle exhibits skip-pound behavior.
9. An electronic device, comprising: a processor, a memory storing machine-readable instructions executable by the processor, which when executed by the processor perform the steps of the method of any of claims 1 to 7 when the electronic device is run.
10. A storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method according to any of claims 1 to 7.
CN202310773313.XA 2023-06-28 2023-06-28 Jump pound identification method, apparatus, device and storage medium Pending CN116818070A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310773313.XA CN116818070A (en) 2023-06-28 2023-06-28 Jump pound identification method, apparatus, device and storage medium

Publications (1)

Publication Number Publication Date
CN116818070A true CN116818070A (en) 2023-09-29

Family

ID=88125284

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310773313.XA Pending CN116818070A (en) 2023-06-28 2023-06-28 Jump pound identification method, apparatus, device and storage medium

Country Status (1)

Country Link
CN (1) CN116818070A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination