CN114387311A - LKJ file and locomotive video automatic time synchronization method, device and computer equipment - Google Patents


Info

Publication number
CN114387311A
Authority
CN
China
Prior art keywords
image frame
locomotive
video
intensity
intensities
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111570909.7A
Other languages
Chinese (zh)
Inventor
李俊成
张晋楷
闫帅
闫龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guoneng Xinshuo Railway Co ltd
Original Assignee
Guoneng Xinshuo Railway Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guoneng Xinshuo Railway Co ltd filed Critical Guoneng Xinshuo Railway Co ltd
Priority to CN202111570909.7A priority Critical patent/CN114387311A/en
Publication of CN114387311A publication Critical patent/CN114387311A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/269 - Analysis of motion using gradient-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/23 - Clustering techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to a method, an apparatus, and computer equipment for automatic time synchronization between an LKJ file and a locomotive video. The method comprises the following steps: acquiring a locomotive video, and calculating the optical flow intensity of adjacent image frames in the locomotive video at a plurality of identical pixel positions; for each pair of adjacent image frames, determining the motion intensity between them according to the corresponding plurality of optical flow intensities; classifying the motion intensities, and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm, where two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first and last image frames of the locomotive video; determining the optimal separation according to the separation scores, and determining a target image frame according to the optimal separation; and time-synchronizing the locomotive video with the LKJ file according to the target image frame. With this method, the locomotive video and the LKJ file can be time-synchronized automatically.

Description

LKJ file and locomotive video automatic time synchronization method, device and computer equipment
Technical Field
The application relates to the technical field of image analysis, in particular to an automatic time synchronization method, device and computer equipment for an LKJ file and a locomotive video.
Background
In the running process of the locomotive, the locomotive video monitoring host records information such as running road conditions of the locomotive and behaviors of crew members in a video mode. Meanwhile, an LKJ (train operation monitoring and recording device) of the train can record the operation condition of the train and form the recorded data into an LKJ file. In the process of analyzing the locomotive video file of the locomotive video monitoring host, the time of the locomotive video and the time of the LKJ file are required to be synchronous, so that the relation between the data recorded by the LKJ and the locomotive video can be accurately judged. However, in practical situations, the locomotive video monitoring host often cannot acquire the time of the LKJ file due to communication faults, equipment faults and the like, so that the locomotive video time and the LKJ time are completely out of synchronization.
In the prior art, image OCR technology is used to extract the text information overlaid on the locomotive video, which typically includes the locomotive speed, kilometer post, train number, and the like; automatic synchronization of the locomotive video time with the LKJ file time is then achieved by matching the extracted kilometer post and train number against those recorded in the LKJ file.
However, the inventors found through research that the existing method cannot automatically synchronize the locomotive video time with the LKJ file time when the locomotive video carries no text information.
Disclosure of Invention
In view of the above, it is necessary to provide a method, an apparatus, and computer equipment for automatic time synchronization of an LKJ file and a locomotive video.
In a first aspect, the application provides an automatic time synchronization method for an LKJ file and a locomotive video. The method comprises the following steps:
acquiring a locomotive video, and calculating the optical flow intensity of adjacent image frames in the locomotive video at a plurality of identical pixel positions; the locomotive video comprises a target image frame, which is the image frame at the moment the locomotive starts moving;
for each pair of adjacent image frames, determining the motion intensity between them according to the corresponding plurality of optical flow intensities;
classifying the motion intensities, and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first and last image frames of the locomotive video;
determining the optimal separation according to the separation score, and determining a target image frame according to the optimal separation;
and time synchronization is carried out on the locomotive video and the LKJ file according to the target image frame.
In one embodiment, the step of determining the motion intensity between adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames comprises:
comparing the plurality of optical flow intensities with a threshold value respectively;
and counting, according to the comparison result, the number of pixels whose optical flow intensity exceeds the threshold value, and taking this number as the motion intensity between the adjacent image frames.
In one embodiment, the step of classifying the motion intensities comprises:
taking the logarithm of each motion intensity, and clustering the logarithm-transformed motion intensities;
and classifying the motion intensities according to the clustering result.
In one embodiment, the step of taking the logarithm of each motion intensity is preceded by:
for each motion intensity, if the motion intensity is 0, updating it to 1.
In one embodiment, the step of calculating the optical flow intensity of adjacent image frames in the locomotive video at a plurality of identical pixel positions comprises:
removing noise of each image frame in the locomotive video to obtain a de-noised video;
and for each pair of adjacent image frames in the de-noised video, calculating the optical flow intensity at the plurality of identical pixel positions.
In one embodiment, the time-aligning the locomotive video with the LKJ file based on the target image frames comprises:
acquiring an LKJ file, and analyzing the LKJ file;
extracting a target time period according to the analysis result; the target time period comprises a starting moment;
and aligning the time corresponding to the target image frame with the starting time in the target time period, so as to time-synchronize the locomotive video with the LKJ file.
In a second aspect, the application further provides an automatic time synchronization device for the LKJ file and the locomotive video. The device comprises:
the optical flow calculation module is used for acquiring a locomotive video and calculating the optical flow intensity of adjacent image frames in the locomotive video at a plurality of same pixel point positions; the locomotive video comprises a target image frame, and the target image frame is an image frame of the locomotive at the starting moment;
the intensity determination module is used for determining the motion intensity between the adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames for each adjacent image frame;
the score determining module is used for classifying the motion intensities and determining the separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first and last image frames of the locomotive video;
the target frame determining module is used for determining the optimal separation according to the separation score and determining a target image frame according to the optimal separation;
and the time synchronization module is used for synchronizing the locomotive video and the LKJ file according to the target image frame.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the steps of the method described above when executing the computer program.
In a fourth aspect, the present application further provides a computer-readable storage medium. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method.
In a fifth aspect, the present application further provides a computer program product. Computer program product comprising a computer program which, when being executed by a processor, carries out the steps of the method as described above.
With the method, apparatus, and computer equipment for automatic time synchronization of an LKJ file and a locomotive video, a locomotive video is acquired, where the locomotive video comprises a target image frame, namely the image frame at the moment the locomotive starts moving; the optical flow intensity of adjacent image frames at a plurality of identical pixel positions can be calculated; the motion intensity between each pair of adjacent image frames is determined according to the corresponding plurality of optical flow intensities; the motion intensities are classified, and a separation score between two adjacent motion intensities is determined according to the type of each motion intensity and a separation algorithm, where two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first and last image frames of the locomotive video; an optimal separation can be determined based on the separation scores, and a target image frame determined based on the optimal separation; and the locomotive video is time-synchronized with the LKJ file according to the target image frame. In this way, the motion intensity between image frames can be determined from the optical flow intensities, the optimal separation determined from the motion intensities, and the target image frame at the locomotive starting moment obtained; the starting moment of the locomotive in the locomotive video is thus known, and the locomotive video and the LKJ file can be automatically time-synchronized accordingly.
Drawings
Fig. 1 is a schematic flowchart of an automatic time synchronization method for an LKJ file and a locomotive video in an embodiment;
FIG. 2 is a flowchart illustrating the steps of determining the motion intensity between adjacent image frames based on a plurality of optical flow intensities corresponding to the adjacent image frames in one embodiment;
FIG. 3 is a flow diagram illustrating the steps of classifying the motion intensities in one embodiment;
FIG. 4 is a flowchart illustrating the steps for time-synchronizing a locomotive video with an LKJ file based on a target image frame in one embodiment;
fig. 5 is another schematic flow chart of an automatic time synchronization method of an LKJ file and a locomotive video in an embodiment;
fig. 6 is a block diagram of an embodiment of an apparatus for automatically synchronizing an LKJ file with a locomotive video;
FIG. 7 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In current locomotive video applications, a locomotive video device records the driving process of the crew; after the run ends, a professional analyst reviews the video to confirm whether the crew engaged in violations during driving, such as using a mobile phone, dozing off, or failing to perform the required hand-signal checks. However, to obtain a correct result, the analysis must be performed against the data in the LKJ file with the two time bases synchronized. The current process flow is as follows:
1) when the locomotive returns to the depot, a crew member transfers the LKJ file and a manager transfers the locomotive video file to a ground server;
2) a video analyst retrieves the locomotive video recording the crew's behavior together with the LKJ file and analyzes the run: either by manually playing the LKJ file and the locomotive video file side by side on one computer, or by finding key points in the locomotive video, looking up the corresponding times in the LKJ file, and then comparing the two to obtain an analysis result.
As the above process shows, locomotive video analysis requires the locomotive video time and the LKJ file time to be synchronized so that the relationship between the LKJ data and the video can be judged accurately. In practice, however, the on-board video monitoring host often cannot obtain the LKJ file time because of communication faults, equipment faults, and other causes, so the video time and the LKJ time are completely out of sync; that is, the video monitoring host does not use the LKJ clock. In such cases, the video and the LKJ file can only be synchronized manually, with an analyst relying on experience to match the recorded road conditions against the LKJ file.
In the current method for synchronously analyzing the LKJ file and the locomotive video file, image OCR technology is used to extract the text information overlaid on the video, which typically includes the locomotive speed, kilometer post, train number, and the like; the video is then automatically aligned with the LKJ file by matching the extracted kilometer post and train number against those in the LKJ file. However, when communication between the video monitoring host and the LKJ device is abnormal and no subtitles are superimposed on the video, OCR cannot capture any text, so automatic time synchronization fails. In practice, most desynchronization is caused precisely by abnormal communication between the video monitoring host and the LKJ device; this technology therefore cannot solve the desynchronization problem in a large share of cases.
In order to solve the problems, the application provides an automatic time synchronization method, device and computer equipment for an LKJ file and a locomotive video. The locomotive video and the LKJ file of the locomotive running road condition can be automatically synchronized, so that all locomotive videos in the video monitoring host can be automatically synchronized with the LKJ file.
In one embodiment, as shown in fig. 1, an LKJ file and locomotive video automatic time synchronization method is provided, and this embodiment is illustrated by applying this method to a terminal, and it is to be understood that this method may also be applied to a server, and may also be applied to a system including a terminal and a server, and is implemented by interaction between the terminal and the server. In this embodiment, the method includes the steps of:
step S102, a locomotive video is obtained, and the optical flow intensity of adjacent image frames in the locomotive video at the positions of a plurality of same pixel points is calculated.
The locomotive video comprises a target image frame, and the target image frame is an image frame of the locomotive at the starting moment.
The locomotive video may be a section of the road-condition video obtained by recording the running road conditions of the locomotive, and its content includes the process of the locomotive starting to move. Extracting the image frame sequence of the locomotive video yields consecutive adjacent image frames. The optical flow intensity is the amount by which the same pixel moves from one frame to the next, and is generally calculated by an optical flow algorithm such as the HS (Horn-Schunck) method, the Lucas-Kanade algorithm, or the pyramid LK algorithm. It is understood that in a locomotive video, optical flow is generated in the image whenever the foreground and the background move relative to each other.
Specifically, a locomotive video is obtained, and the optical flow intensity of a plurality of same pixel points in adjacent image frames of the locomotive video is calculated through an optical flow algorithm.
In one embodiment, when the speed of the locomotive changes, the objects captured by the road-condition camera of the locomotive video device undergo relative displacement, so the optical flow of the image pixels of the locomotive video can be calculated. However, because the locomotive speed changes substantially, the plain Lucas-Kanade algorithm would produce a large error; to reduce this error, the optical flow intensity of a plurality of identical pixels in adjacent image frames can be calculated with the pyramid (hierarchical) LK optical flow algorithm proposed by Jean-Yves Bouguet.
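By way of a minimal, editor-supplied sketch (the function name and array layout are assumptions, not part of the application): once a dense flow field has been obtained for a pair of adjacent frames, for example from a pyramid LK implementation, the optical flow intensity at each pixel is simply the magnitude of its displacement.

```python
import numpy as np

def optical_flow_intensities(flow):
    """Per-pixel optical flow intensity: the displacement magnitude of
    each pixel between two adjacent frames.

    `flow` is an H x W x 2 array of (dx, dy) displacements; in practice
    it would be produced by a pyramid LK (or similar) optical flow
    routine run on the two frames.
    """
    return np.hypot(flow[..., 0], flow[..., 1])
```

A pixel that moved 3 units horizontally and 4 vertically thus has an optical flow intensity of 5.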
Step S104, for each adjacent image frame, determining the motion intensity between the adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames.
Wherein the intensity of motion may vary between each adjacent image frame. For example, when the locomotive moves at a high speed, the motion intensity between adjacent image frames is high; when only the constructor walks in the locomotive video and the locomotive is in a stop state, the motion intensity between the adjacent image frames is low (including the motion intensity is 0).
Specifically, for each adjacent image frame, there are a plurality of optical flow intensities calculated by the same pixels, and the motion intensity between the adjacent image frames can be determined according to the plurality of optical flow intensities of each adjacent image frame.
And step S106, classifying the motion intensities, and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm.
Two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first and last image frames of the locomotive video; it is understood that, except for the image frames at the starting and ending moments, every image frame of the locomotive video sits between two motion intensities, and those are its two adjacent motion intensities. The separation score measures how well a cut between two adjacent motion intensities splits the sequence; the separation itself is the image frame of the locomotive video corresponding to those two motion intensities.
Specifically, the classification may be performed according to the similarity of the motion intensities; as an example, intensities of similar magnitude may be considered highly similar and grouped together. For the classified motion intensities, the separation score of every two adjacent motion intensities is calculated according to the type of each motion intensity and the separation algorithm.
In a specific embodiment, the types of motion intensity are high intensity and low intensity. The type of each motion intensity is determined as high or low; motion intensities of type high intensity are set to 1 and those of type low intensity to 0, and the separation score between two adjacent motion intensities is calculated according to the following expression (1):
S=A+B-C-D (1)
wherein S is the separation score; A is the number of motion intensities of type low intensity before the separation, i.e. the number of 0s before the separation; B is the number of motion intensities of type high intensity after the separation, i.e. the number of 1s after the separation; C is the number of motion intensities of type high intensity before the separation, i.e. the number of 1s before the separation; and D is the number of motion intensities of type low intensity after the separation, i.e. the number of 0s after the separation.
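Expression (1) can be sketched directly; here `labels` (an assumed name) holds the motion-intensity types in frame order, 1 for high intensity and 0 for low, and `k` is the candidate separation index:

```python
def separation_score(labels, k):
    """S = A + B - C - D for a cut placed before index k.

    A: low-intensity (0) labels before the cut; B: high-intensity (1)
    labels after it; C: 1s before the cut; D: 0s after it.  S is
    largest when the cut cleanly splits a quiet prefix (all 0s) from a
    moving suffix (all 1s).
    """
    before, after = labels[:k], labels[k:]
    return before.count(0) + after.count(1) - before.count(1) - after.count(0)
```

Scanning every candidate k and keeping the one with the largest score gives the optimal separation; the image frame at that index is the target frame.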
Step S108, determining the optimal separation according to the separation score, and determining a target image frame according to the optimal separation;
specifically, according to each calculated separation score, determining the optimal separation; and determining the target image frame based on the optimal separation.
In a specific embodiment, the separation with the largest separation score is determined as the optimal separation, and the image frame of the locomotive video corresponding to the optimal separation is determined as the target image frame.
And step S110, time-synchronizing the locomotive video with the LKJ file according to the target image frame.
The target image frame is an image frame of the locomotive at the starting time, the LKJ file comprises a time axis, and the time on the time axis comprises the starting time.
Specifically, a time axis of the LKJ file may be extracted, and the starting time of the locomotive on the time axis in the LKJ file may be aligned with the target image frame, so as to obtain a synchronized locomotive video and the LKJ file.
Further, the locomotive video is one section of the road-condition video obtained by recording the running road conditions of the locomotive; based on the synchronized locomotive video and LKJ file, the complete road-condition video can also be synchronized with the LKJ file.
furthermore, a behavior video and a road condition video obtained by recording the behavior actions of the crew are connected to the same locomotive video monitoring host, and the behavior video and the LKJ file can be synchronized according to the synchronized road condition video and the LKJ file; so that the analysis of whether the crew violates the rule can be performed based on the data in the LKJ file.
In this embodiment, a locomotive video is obtained, where the locomotive video comprises a target image frame, namely the image frame at the moment the locomotive starts moving; the optical flow intensity of adjacent image frames at a plurality of identical pixel positions can be calculated; the motion intensity between each pair of adjacent image frames is determined according to the corresponding plurality of optical flow intensities; the motion intensities are classified, and a separation score between two adjacent motion intensities is determined according to the type of each motion intensity and a separation algorithm, where two adjacent motion intensities are the two motion intensities corresponding to each image frame other than the first and last image frames of the locomotive video; an optimal separation can be determined based on the separation scores, and a target image frame determined based on the optimal separation; and the locomotive video is time-synchronized with the LKJ file according to the target image frame. In this way, the motion intensity between image frames can be determined from the optical flow intensities, the optimal separation determined from the motion intensities, and the target image frame at the locomotive starting moment obtained; the starting moment of the locomotive in the locomotive video is thus known, and the locomotive video and the LKJ file can be automatically time-synchronized accordingly.
In one embodiment, as shown in fig. 2, the step of determining the motion intensity between adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames comprises:
step S202, comparing a plurality of optical flow intensities with threshold values respectively;
step S204, according to the comparison result, counting the number of pixel points with the optical flow intensity exceeding the threshold value, and taking the number as the motion intensity between the adjacent image frames.
Wherein, the threshold value can be set according to the optical flow intensity of the image frame of the video when the actual locomotive runs; when the locomotive is in a stop state, a moving object may exist in a locomotive video, and in an image frame at the moment when the object moves, a plurality of optical flow intensities between adjacent image frames may exceed a threshold value; the optical flow intensity exceeding the threshold value means that the optical flow intensity is greater than the threshold value, and the optical flow intensity being less than or equal to the threshold value means that the optical flow intensity does not exceed the threshold value.
Specifically, each adjacent image frame includes a plurality of optical flow intensities, and the plurality of optical flow intensities of each image frame are respectively compared with a preset threshold value to obtain a comparison result of each optical flow intensity and the threshold value. And counting the number of pixel points with the optical flow intensity larger than the threshold according to the comparison result of the optical flow intensity and the threshold in each adjacent image frame, and taking the number as the motion intensity of the adjacent image frame. It will be appreciated that the intensity of motion between adjacent image frames may be 0.
In this embodiment, the number of pixels whose optical flow intensity exceeds the threshold is used as the motion intensity between adjacent image frames, which reflects the motion between frames more intuitively and more accurately; the target image frame can therefore be located more precisely, improving the accuracy of time synchronization.
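A hedged one-line implementation of this embodiment (the function name is an assumption): the motion intensity of an adjacent-frame pair is the count of pixels whose optical flow intensity strictly exceeds the threshold, matching the comparison rule above.

```python
import numpy as np

def motion_intensity(flow_intensities, threshold):
    """Motion intensity between two adjacent frames: the number of
    pixels whose optical flow intensity exceeds the threshold
    (strictly greater; values equal to the threshold do not count)."""
    return int(np.count_nonzero(np.asarray(flow_intensities) > threshold))
```

Note that the result may legitimately be 0, e.g. for a stationary locomotive with a static scene.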
In one embodiment, as shown in fig. 3, the step of classifying the motion intensities comprises:
Step S302, taking the logarithm of each motion intensity, and clustering the logarithm-transformed motion intensities;
and step S304, classifying the motion intensities according to the clustering result.
Clustering is a class of data-processing algorithms; examples include the K-Means clustering algorithm, the mean-shift clustering algorithm, and the K-Means++ clustering algorithm.
Specifically, the logarithm of the motion intensity between each pair of adjacent image frames is taken, and the logarithm-transformed motion intensities are clustered to obtain a clustering result. According to the clustering result, the motion intensities can be classified and the type of each motion intensity determined.
In a specific embodiment, the logarithm of each motion intensity is taken, the logarithm-transformed motion intensities are clustered with the K-Means algorithm, and, according to the clustering result, the motion intensities are divided into a high-intensity class and a low-intensity class.
In this embodiment, taking the logarithm of each motion intensity compresses the dynamic range, so clustering is more accurate and the cluster boundaries are clearer; the target image frame can thus be located more precisely, further improving the accuracy of time synchronization.
In one embodiment, the step of taking a logarithm of each motion intensity is preceded by: for each motion intensity, if the motion intensity is 0, updating the motion intensity to 1.
Specifically, the motion intensity between adjacent image frames may be 0, and the logarithm of 0 is undefined; attempting to take it would produce an error and skew the classification obtained after clustering.
If a motion intensity is 0, it is updated to 1, and the logarithm is taken of the updated value, which avoids the error and reduces misclassification after clustering.
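A minimal sketch of this zero-handling step, assuming the motion intensities are non-negative pixel counts (the function name and sample values are hypothetical):

```python
import math

def safe_log_intensities(intensities):
    """Replace zero motion intensities with 1 before taking the logarithm.

    log(0) is undefined; since log(1) == 0, a stationary frame pair maps to
    the smallest possible log intensity instead of raising an error.
    """
    return [math.log(v if v > 0 else 1) for v in intensities]

logged = safe_log_intensities([0, 1, 0, 500])
print(logged)  # the zeros (and the one) all map to 0.0
```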
In one embodiment, the step of calculating the optical flow intensities of adjacent image frames in the locomotive video at a plurality of identical pixel positions comprises:
removing noise from each image frame in the locomotive video to obtain a denoised video;
and for each pair of adjacent image frames in the denoised video, calculating the optical flow intensities of the adjacent image frames at a plurality of identical pixel positions.
Specifically, each image frame of the locomotive video may contain noise; removing the noise from each image frame yields the denoised locomotive video. The optical flow intensities at a plurality of identical pixel positions are then calculated between every pair of adjacent image frames in the denoised video.
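The threshold-counting step that turns per-pixel optical flow intensities into a single motion intensity (compare step S503 below) can be sketched as follows. The pyramidal Lucas-Kanade flow computation itself is assumed to come from a library (e.g. OpenCV) applied to the denoised frames and is represented here by a precomputed magnitude map; the values and threshold are hypothetical.

```python
def motion_intensity(flow_magnitudes, threshold):
    """Count pixels whose optical-flow magnitude exceeds the threshold.

    flow_magnitudes: 2-D list of per-pixel flow magnitudes for one pair of
    adjacent denoised frames (the flow step itself is assumed upstream).
    The count of moving pixels is the motion intensity of the frame pair.
    """
    return sum(1 for row in flow_magnitudes for mag in row if mag > threshold)

# Hypothetical 3x4 magnitude map: only two pixels move noticeably.
mags = [
    [0.1, 0.0, 2.5, 0.2],
    [0.0, 3.1, 0.0, 0.1],
    [0.2, 0.0, 0.1, 0.0],
]
print(motion_intensity(mags, threshold=1.0))  # -> 2
```

Thresholding before counting suppresses residual sensor noise that denoising did not remove, so a stationary locomotive yields a motion intensity near zero.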
In one embodiment, as shown in fig. 4, the step of time-synchronizing the locomotive video with the LKJ file based on the target image frame comprises:
step S402, acquiring an LKJ file, and parsing the LKJ file;
step S404, extracting a target time period according to the parsing result; the target time period comprises a starting moment;
and step S406, aligning the time corresponding to the target image frame with the starting moment in the target time period, so as to time-synchronize the locomotive video and the LKJ file.
The LKJ file is a file formed from data recorded by the LKJ device for the same train number and the same running time as the locomotive video. The target time period comprises the starting moment of the locomotive in the locomotive video, and that starting moment is the time corresponding to the target image frame; if the locomotive starts multiple times, the target time period comprises the times corresponding to the target image frames obtained from the locomotive video.
Specifically, an LKJ file with the same train number and the same running time as the locomotive video is acquired and parsed. According to the parsing result, a target time period that includes the time corresponding to the target image frame is extracted. The time corresponding to the target image frame is then aligned with the starting moment in the target time period, so that the locomotive video and the LKJ file are time-synchronized.
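As a hedged illustration of the alignment in step S406, the code below computes the clock offset between the video and the LKJ record from the target image frame. The frame rate, timestamps, and function name are hypothetical and not taken from the patent:

```python
from datetime import datetime, timedelta

def align_offset(video_start, frame_index, fps, lkj_departure):
    """Return the offset to add to LKJ timestamps to match video time.

    video_start:   wall-clock time of the first video frame
    frame_index:   index of the target (departure) image frame in the video
    fps:           video frame rate
    lkj_departure: departure (starting) moment parsed from the LKJ file
    """
    frame_time = video_start + timedelta(seconds=frame_index / fps)
    return frame_time - lkj_departure

offset = align_offset(
    video_start=datetime(2021, 12, 21, 8, 0, 0),
    frame_index=750,   # departure detected 30 s into a 25 fps video
    fps=25,
    lkj_departure=datetime(2021, 12, 21, 8, 0, 27),
)
print(offset.total_seconds())  # -> 3.0
```

Once the offset is known, every LKJ record can be mapped to a video frame (and vice versa) without manual alignment.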
In a specific embodiment, as shown in fig. 5, the method for automatically time-synchronizing the LKJ file with the locomotive video includes the following steps:
step S501, obtaining a locomotive video, and removing noise from each image frame in the locomotive video to obtain a denoised video; the locomotive video comprises a target image frame, and the target image frame is the image frame of the locomotive at the starting moment;
step S502, for each pair of adjacent image frames in the denoised video, calculating the optical flow intensities of the adjacent image frames at a plurality of identical pixel positions using the Pyramidal LK algorithm;
step S503, for each pair of adjacent image frames, comparing the plurality of optical flow intensities with a threshold value respectively; counting, according to the comparison result, the number of pixels whose optical flow intensity exceeds the threshold value, and taking that number as the motion intensity between the adjacent image frames;
step S504, for each motion intensity, if the motion intensity is 0, updating the motion intensity to 1;
step S505, taking a logarithm of each motion intensity, and clustering the plurality of logarithmic motion intensities; classifying the motion intensities according to the clustering result into a high-intensity class and a low-intensity class;
step S506, determining whether the type of each motion intensity is high intensity or low intensity, setting each high-intensity motion intensity to 1 and each low-intensity motion intensity to 0, and calculating the separation score between two adjacent motion intensities according to expression (1); the two adjacent motion intensities are the two motion intensities corresponding to each image frame except the image frame at the starting time and the image frame at the ending time of the locomotive video;
step S507, determining the separation corresponding to the separation score with the maximum value as the optimal separation, and determining the image frame of the locomotive video corresponding to the optimal separation as the target image frame;
step S508, acquiring an LKJ file and parsing it; extracting a target time period according to the parsing result; the target time period comprises the starting moment;
step S509, aligning the time corresponding to the target image frame with the starting moment in the target time period, so as to time-synchronize the locomotive video and the LKJ file.
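The separation step (S506–S507) can be sketched as follows. The patent's expression (1) is not reproduced in this excerpt, so the score below is a hypothetical stand-in chosen to match the description: it rewards a split with low (0) motion intensities before it and high (1) intensities after it, so the best split marks the departure frame. The real expression may differ.

```python
def best_separation(binary_intensities):
    """Find the split index that best separates low (0) from high (1) intensity.

    Hypothetical stand-in for the patent's expression (1): score each
    interior split by zeros counted before it plus ones counted at or
    after it, i.e. how well it matches the pattern 'stationary before
    departure, moving after departure'.
    """
    n = len(binary_intensities)
    best_idx, best_score = 1, -1
    for i in range(1, n):  # interior splits only (skip first/last frame)
        score = (binary_intensities[:i].count(0)
                 + binary_intensities[i:].count(1))
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx, best_score

# Binarized motion intensities: the locomotive starts moving at index 4.
idx, score = best_separation([0, 0, 0, 0, 1, 1, 1, 1, 1])
print(idx, score)  # -> 4 9
```

The frame at the winning index would then be taken as the target image frame for alignment against the LKJ departure time.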
It should be understood that, although the steps in the flowcharts of the embodiments described above are displayed in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in these flowcharts may comprise multiple sub-steps or stages, which need not be completed at the same moment but may be performed at different moments, and which need not be executed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides an automatic time synchronization device for an LKJ file and a locomotive video, used to implement the automatic time synchronization method described above. The solution the device provides is similar to the solution recorded in the method embodiments, so for the specific limitations in the one or more device embodiments below, reference may be made to the limitations on the automatic time synchronization method above; details are not repeated here.
In one embodiment, as shown in fig. 6, there is provided an automatic time synchronization device for an LKJ file and a locomotive video, comprising: an optical flow calculation module 610, an intensity determination module 620, a score determination module 630, a target frame determination module 640, and a time synchronization module 650, wherein:
the optical flow calculation module 610 is configured to obtain a locomotive video and calculate the optical flow intensities of adjacent image frames in the locomotive video at a plurality of identical pixel positions; the locomotive video comprises a target image frame, and the target image frame is the image frame of the locomotive at the starting moment;
the intensity determination module 620 is configured to determine, for each pair of adjacent image frames, the motion intensity between the adjacent image frames according to the plurality of optical flow intensities corresponding to those frames;
the score determination module 630 is configured to classify each motion intensity and determine the separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; the two adjacent motion intensities are the two motion intensities corresponding to each image frame except the image frame at the starting time and the image frame at the ending time of the locomotive video;
the target frame determination module 640 is configured to determine an optimal separation according to the separation scores and to determine the target image frame according to the optimal separation;
the time synchronization module 650 is configured to time-synchronize the locomotive video and the LKJ file according to the target image frame.
In one embodiment, the intensity determination module 620 includes a threshold comparison unit and a statistics unit.
The threshold comparison unit is used for comparing the plurality of optical flow intensities with a threshold value respectively;
and the statistics unit is used for counting, according to the comparison result, the number of pixels whose optical flow intensity exceeds the threshold value, and taking that number as the motion intensity between adjacent image frames.
In one embodiment, the score determination module 630 includes a log unit, a cluster unit, and a classification unit;
the logarithm unit is used for taking a logarithm of each motion intensity;
the clustering unit is used for clustering the plurality of logarithmic motion intensities;
the classification unit is used for classifying the motion intensities according to the clustering result.
In one embodiment, the optical flow computation module 610 includes a denoising unit and an optical flow unit;
the denoising unit is used for removing noise from each image frame in the locomotive video to obtain a denoised video;
the optical flow unit is used for calculating, for each pair of adjacent image frames in the denoised video, the optical flow intensities of the adjacent image frames at a plurality of identical pixel positions.
In one embodiment, the time synchronization module 650 includes a file acquisition unit, a parsing unit, a time extraction unit, and a synchronization unit;
the file acquisition unit is used for acquiring an LKJ file;
the analyzing unit is used for analyzing the LKJ file;
the time extraction unit is used for extracting a target time period according to the analysis result; the target time period comprises a starting moment;
the synchronization unit is used for aligning the time corresponding to the target image frame with the starting time in the target time period so as to time the locomotive video and the LKJ file.
All or part of the modules in the above LKJ file and locomotive video automatic time synchronization device may be implemented in software, in hardware, or in a combination of the two. The modules may be embedded, in hardware form, in or independent of the processor of the computer device, or stored, in software form, in the memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The database of the computer equipment is used for storing data such as locomotive videos, LKJ files and the like. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to realize an automatic time synchronization method of an LKJ file and a locomotive video.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer devices to which the solution applies; a particular computer device may include more or fewer components than shown, combine certain components, or arrange the components differently.
In one embodiment, a computer device is further provided, which includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In an embodiment, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory may include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases; the non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like, without limitation.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described, but as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (10)

1. An automatic time synchronization method for an LKJ file and a locomotive video is characterized by comprising the following steps:
acquiring a locomotive video, and calculating the optical flow intensities of adjacent image frames in the locomotive video at a plurality of identical pixel positions; the locomotive video comprises a target image frame, and the target image frame is the image frame of the locomotive at the starting moment;
for each of the adjacent image frames, determining a motion intensity between the adjacent image frames according to a plurality of the optical flow intensities corresponding to the adjacent image frames;
classifying each motion intensity, and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; the two adjacent motion intensities are the two motion intensities corresponding to each image frame except the image frame at the starting time and the image frame at the ending time of the locomotive video;
determining an optimal segmentation according to the segmentation score, and determining the target image frame according to the optimal segmentation;
and according to the target image frame, carrying out time synchronization on the locomotive video and the LKJ file.
2. The method of claim 1, wherein the step of determining the motion intensity between the adjacent image frames according to a plurality of the optical flow intensities corresponding to the adjacent image frames comprises:
comparing the plurality of optical flow intensities with a threshold value respectively;
and counting, according to the comparison result, the number of pixels whose optical flow intensity exceeds the threshold value, and taking that number as the motion intensity between the adjacent image frames.
3. The method of claim 1, wherein the step of classifying each of the motion intensities comprises:
taking a logarithm of each motion intensity, and clustering the plurality of logarithmic motion intensities;
and classifying the motion intensities according to the clustering result.
4. The method of claim 3, wherein the step of taking a logarithm of each of the motion intensities is preceded by:
for each motion intensity, if the motion intensity is 0, updating the motion intensity to 1.
5. The method of claim 1, wherein the step of calculating the optical flow intensities of adjacent image frames in the locomotive video at a plurality of identical pixel positions comprises:
removing noise from each image frame in the locomotive video to obtain a denoised video;
and for each pair of adjacent image frames in the denoised video, calculating the optical flow intensities of the adjacent image frames at a plurality of identical pixel positions.
6. The method of claim 1, wherein the step of time-aligning the locomotive video with an LKJ file based on the target image frame comprises:
acquiring the LKJ file, and parsing the LKJ file;
extracting a target time period according to the parsing result; the target time period comprises the starting time;
and aligning the time corresponding to the target image frame with the starting time in the target time period, so as to time-synchronize the locomotive video and the LKJ file.
7. An automatic time synchronization device for an LKJ file and a locomotive video, which is characterized by comprising:
the optical flow calculation module is used for acquiring a locomotive video and calculating the optical flow intensities of adjacent image frames in the locomotive video at a plurality of identical pixel positions; the locomotive video comprises a target image frame, and the target image frame is the image frame of the locomotive at the starting moment;
the intensity determination module is used for determining the motion intensity between the adjacent image frames according to a plurality of optical flow intensities corresponding to the adjacent image frames for each adjacent image frame;
the score determining module is used for classifying each motion intensity and determining a separation score between two adjacent motion intensities according to the type of each motion intensity and a separation algorithm; the two adjacent motion intensities are the two motion intensities corresponding to each image frame except the image frame at the starting time and the image frame at the ending time of the locomotive video;
a target frame determination module for determining an optimal partition according to the partition score and determining the target image frame according to the optimal partition;
and the time synchronization module is used for synchronizing the locomotive video and the LKJ file according to the target image frame.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 6.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the method of any one of claims 1 to 6 when executed by a processor.
CN202111570909.7A 2021-12-21 2021-12-21 LKJ file and locomotive video automatic time synchronization method, device and computer equipment Pending CN114387311A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111570909.7A CN114387311A (en) 2021-12-21 2021-12-21 LKJ file and locomotive video automatic time synchronization method, device and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111570909.7A CN114387311A (en) 2021-12-21 2021-12-21 LKJ file and locomotive video automatic time synchronization method, device and computer equipment

Publications (1)

Publication Number Publication Date
CN114387311A true CN114387311A (en) 2022-04-22

Family

ID=81196961

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111570909.7A Pending CN114387311A (en) 2021-12-21 2021-12-21 LKJ file and locomotive video automatic time synchronization method, device and computer equipment

Country Status (1)

Country Link
CN (1) CN114387311A (en)

Similar Documents

Publication Publication Date Title
CN109710780B (en) Archiving method and device
CN108229456B (en) Target tracking method and device, electronic equipment and computer storage medium
US10803357B2 (en) Computer-readable recording medium, training method, and object detection device
US9418297B2 (en) Detecting video copies
CN103678299A (en) Method and device for monitoring video abstract
CN113301430A (en) Video clipping method, video clipping device, electronic equipment and storage medium
US20220172476A1 (en) Video similarity detection method, apparatus, and device
CN110941978B (en) Face clustering method and device for unidentified personnel and storage medium
CN112989962B (en) Track generation method, track generation device, electronic equipment and storage medium
KR102028930B1 (en) method of providing categorized video processing for moving objects based on AI learning using moving information of objects
CN112434178A (en) Image classification method and device, electronic equipment and storage medium
CN113869137A (en) Event detection method and device, terminal equipment and storage medium
CN111738042A (en) Identification method, device and storage medium
Ding et al. Mit-avt clustered driving scene dataset: Evaluating perception systems in real-world naturalistic driving scenarios
JP2014110020A (en) Image processor, image processing method and image processing program
CN112581489A (en) Video compression method, device and storage medium
CN114387311A (en) LKJ file and locomotive video automatic time synchronization method, device and computer equipment
CN115719428A (en) Face image clustering method, device, equipment and medium based on classification model
CN111062294B (en) Passenger flow queuing time detection method, device and system
CN111553408B (en) Automatic test method for video recognition software
CN115049963A (en) Video classification method and device, processor and electronic equipment
CN110781710B (en) Target object clustering method and device
CN112749660A (en) Method and equipment for generating video content description information
CN117176979B (en) Method, device, equipment and storage medium for extracting content frames of multi-source heterogeneous video
CN114926973B (en) Video monitoring method, device, system, server and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination