CN116128769B - Track vision recording system of swinging motion mechanism - Google Patents

Track vision recording system of swinging motion mechanism

Info

Publication number
CN116128769B
Authority
CN
China
Prior art keywords
image
cutting device
images
cutting
pixel point
Prior art date
Legal status
Active
Application number
CN202310408989.9A
Other languages
Chinese (zh)
Other versions
CN116128769A (en)
Inventor
冯乐坤
冯云鹏
李茂申
Current Assignee
Liaocheng Jinbang Mechanical Equipment Co ltd
Original Assignee
Liaocheng Jinbang Mechanical Equipment Co ltd
Priority date
Filing date
Publication date
Application filed by Liaocheng Jinbang Mechanical Equipment Co ltd filed Critical Liaocheng Jinbang Mechanical Equipment Co ltd
Priority to CN202310408989.9A
Publication of CN116128769A
Application granted
Publication of CN116128769B

Classifications

    • G06T5/73 — Deblurring; Sharpening (G — Physics; G06 — Computing; calculating or counting; G06T — Image data processing or generation, in general; G06T5/00 — Image enhancement or restoration)
    • G07C3/00 — Registering or indicating the condition or the working of machines or other apparatus, other than vehicles (G — Physics; G07C — Time or attendance registers; registering or indicating the working of machines)
    • G06T2207/10016 — Video; Image sequence (G06T2207/00 — Indexing scheme for image analysis or image enhancement; G06T2207/10 — Image acquisition modality)
    • G06T2207/30188 — Vegetation; Agriculture (G06T2207/30 — Subject of image; context of image processing; G06T2207/30181 — Earth observation)
    • Y02P90/30 — Computing systems specially adapted for manufacturing (Y02P — Climate change mitigation technologies in the production or processing of goods; Y02P90/00 — Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation)

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image data processing, and in particular to a track vision recording system of a swinging motion mechanism. The system comprises: an image data acquisition module, for acquiring images of the cutting device of the swing machine during motion; a period analysis module, for obtaining a cutting angle from the edge direction at the fixed pixel point and dividing the motion process into periods; a related characteristic analysis module, for acquiring the fuzzy characteristic value of each cutting device image and the related characteristic value between any two frames of images; a reference index analysis module, for obtaining the period similarity from the related characteristic values among the images within each period and deriving the reference degree between images; and a track vision recording module, for performing deblurring based on the reference degree between images and recording the motion track of the swinging motion mechanism from the processed images. Because the reference degree takes the relationships among the periods of the images into account, the selected images yield a better deblurring effect, and the precision of track vision recording for the swinging motion mechanism is improved.

Description

Track vision recording system of swinging motion mechanism
Technical Field
The invention relates to the technical field of image data processing, in particular to a track vision recording system of a swinging motion mechanism.
Background
The swinging motion mechanism is an important part of a harvester: it converts continuous rotary motion into reciprocating motion to realize the cutting motion. Because the motion of the swinging motion mechanism affects cutting efficiency, its motion trail needs to be recorded so that the trail can be optimized and the efficiency of the harvester improved. When a vision recording system records the motion trail, the reciprocation of the swinging motion mechanism causes motion blur in the acquired images, making the recorded trail inaccurate; the images therefore need motion deblurring.
In the prior art, blur is reduced by estimating the motion of image blocks across the frames of a video sequence, detecting the corresponding positions of clear image blocks in multiple frames, and then inserting frames or directly replacing frames. The replacement image directly affects deblurring precision: when the selected image is poor, the deblurring effect is poor. That is, for a video sequence in which every frame contains blur, a multi-frame-fusion method struggles to find a good clear reference frame among the frames, so the deblurring effect degrades.
Disclosure of Invention
In order to solve the technical problem that the deblurring effect of an image is poor due to the fact that the blurring degree of an image block to be selected and replaced is high, the invention aims to provide a track vision recording system of a swinging motion mechanism, and the adopted technical scheme is as follows:
the invention provides a track vision recording system of a swinging motion mechanism, which comprises:
the image data acquisition module is used for acquiring at least two frames of images of the cutting device in the reciprocating motion process of the swinging motion mechanism;
the period analysis module is used for acquiring the cutting angle of each frame of cutting device image; dividing a reciprocating process into at least two cycles based on the cutting angle;
the relevant characteristic analysis module is used for acquiring a fuzzy characteristic value of each frame of cutting device image according to the rotation area and the smoothed degree of the rotation area of each frame of cutting device image; combining the difference of the cutting angles of any two frames of cutting device images, the number of pixel points in the overlapping area and the motion information difference of the pixel points in the overlapping area to obtain a related characteristic value between any two frames of cutting device images;
the reference index analysis module is used for screening clear images according to the fuzzy characteristic values; acquiring the period similarity between any two periods according to the number difference of the images in any two periods and the correlation characteristic value between the common clear images; combining the related characteristic values between any two frames of cutting device images, the period similarity between the periods and the fuzzy characteristic values to obtain a reference degree between any two frames of cutting device images;
The track vision recording module is used for carrying out deblurring processing on each frame of cutting device image according to the reference degree between the cutting device images; and recording the motion trail of the swinging motion mechanism according to the pixel point position in the image after deblurring.
Further, the method for acquiring the cutting angle comprises the following steps:
the cutting device image and the preset still image both comprise a fixed pixel point;
taking an edge pixel point in a preset neighborhood of the fixed pixel point in the cutting device image as a target edge point; forming a target edge point pair by the target edge point and a corresponding edge pixel point in a preset still image;
taking the included angle between the straight lines formed, in their respective images, by the two edge pixel points of the target edge point pair and the fixed pixel point as the cutting angle difference of that pair; and taking the average of the cutting angle differences over all target edge points corresponding to the fixed pixel point as the cutting angle of the cutting device image.
Further, the method for obtaining the fixed pixel point comprises the following steps:
edge detection is carried out on each frame of cutting device image to obtain edge pixel points, and the edge pixel points of each frame of cutting device area are mapped to the same frame of image to obtain coincident points; and taking the corresponding edge pixel point of the coincident point in each frame of cutting device image as the fixed pixel point.
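The coincidence test above can be sketched in a few lines of pure Python (an illustrative assumption: edge maps are represented as sets of (row, col) coordinates, and "mapping to the same frame" becomes a set intersection; the patent prescribes no implementation):

```python
def fixed_pixels(edge_masks):
    """edge_masks: a list of per-frame sets of (row, col) edge coordinates.
    Pixels that appear as edge pixels in every frame coincide when the frames
    are mapped onto one image; those coincident points are the fixed pixels."""
    fixed = set(edge_masks[0])
    for mask in edge_masks[1:]:
        fixed &= set(mask)  # keep only points present in every frame
    return fixed
```

In practice the per-frame edge sets would come from an edge detector such as Canny, as the detailed description later notes.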
Further, the periodic acquisition method includes:
mapping the cutting angles of all the cutting device images into a two-dimensional coordinate system in time order to obtain a cutting angle change curve, where the abscissa is time and the ordinate is the cutting angle; taking the end points and the trough points of the curve as cutting points, and taking the time interval spanned by the curve segments between three adjacent cutting points as one period; each period has three cutting points, of which the first is the starting point of the period and the third is its ending point; the end point of one period is the start point of the next.
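One hedged reading of this period division, sketched in pure Python (the local-minimum trough rule and the grouping of cut points into triples are assumptions where the text leaves room for interpretation):

```python
def find_troughs(angles):
    """Indices of local minima (trough points) in the cutting-angle sequence."""
    return [i for i in range(1, len(angles) - 1)
            if angles[i] < angles[i - 1] and angles[i] < angles[i + 1]]


def split_periods(angles):
    """Cut points are the two end points of the curve plus the trough points;
    each period spans three adjacent cut points, and the end of one period is
    the start of the next (one reading of the patent's description)."""
    cuts = sorted({0, len(angles) - 1, *find_troughs(angles)})
    return [(cuts[i], cuts[i + 2]) for i in range(0, len(cuts) - 2, 2)]
```

For an angle sequence rising and falling twice, e.g. `[0, 1, 2, 1, 0, 1, 2, 1, 0]`, the trough at index 4 plus the two end points give one period with cut points (0, 4, 8).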
Further, the method for acquiring the fuzzy characteristic value comprises the following steps:
mapping a preset still image into each frame of cutting device image, and taking the overlapped part as a rotation area of the corresponding cutting device image; removing the rotating area of the cutting device image to obtain a non-overlapped area of each frame of cutting device image;
taking the absolute value of the gray level difference value between each pixel point in the rotating area in the cutting device image and the corresponding pixel point in the preset still image as the fuzzy difference of the corresponding pixel point in the rotating area; adding up the fuzzy differences of each pixel point in the rotation area to be used as the smoothed degree; and taking the product of the smoothed degree and the number of pixel points in the non-overlapped area as the fuzzy characteristic value of the cutting device image.
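A minimal sketch of this fuzzy characteristic value in pure Python (images as 2-D lists of gray values; the rotation area is given as a coordinate list; function names are illustrative, not from the patent):

```python
def blur_feature(cut_img, still_img, rotation_area, non_overlap_count):
    """Fuzzy characteristic value as described: the smoothed degree is the
    sum over the rotation area of |gray(cut) - gray(still)| per pixel, and
    the fuzzy characteristic value is that sum multiplied by the number of
    pixel points in the non-overlapped area."""
    smoothed = sum(abs(cut_img[r][c] - still_img[r][c])
                   for (r, c) in rotation_area)
    return smoothed * non_overlap_count
```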
Further, the method for acquiring the relevant characteristic value comprises the following steps:
mapping any two frames of cutting device images into the same frame of image, and taking the overlapped part as the overlapping area of the two frames of cutting device images; acquiring the motion speed and motion direction of each pixel point in each frame of the cutting device image by using an optical flow method;
two corresponding pixel points of each position in the overlapped area in the two frames of cutting device images form an area point group; performing negative correlation mapping on the product of the absolute value of the difference value of the motion speed and the absolute value of the difference value of the motion direction of two pixel points in the region point group to obtain the motion similarity of each position in the overlapped region; accumulating the motion similarity of each position in the overlapped area to obtain a motion characteristic value of the corresponding two-frame cutting device image;
taking the absolute value of the difference value of the cutting angles of the two frames of cutting device images as a cutting angle difference value, and carrying out negative correlation mapping on the cutting angle difference value to obtain a cutting angle similarity value;
and taking the product of the cutting angle similarity value of any two frames of cutting device images, the motion characteristic value and the number of pixel points in the overlapping area as the related characteristic value of the two frames of cutting device images.
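A pure-Python sketch of the related characteristic value (the patent says only "negative correlation mapping"; the choice of exp(-x) for that mapping, and the dictionary representation of per-pixel optical flow, are assumptions):

```python
import math


def motion_similarity(v1, d1, v2, d2):
    """Negative-correlation mapping of |speed difference| * |direction
    difference| for one region point group; exp(-x) is assumed."""
    return math.exp(-abs(v1 - v2) * abs(d1 - d2))


def related_feature(angle_a, angle_b, flow_a, flow_b, overlap):
    """Related characteristic value between two cutting device frames:
    cutting-angle similarity * accumulated motion similarity * number of
    overlap pixels. flow_a/flow_b map (r, c) -> (speed, direction);
    overlap is the list of (r, c) positions in the overlapping area."""
    angle_sim = math.exp(-abs(angle_a - angle_b))  # negative-correlation map
    motion = sum(motion_similarity(*flow_a[p], *flow_b[p]) for p in overlap)
    return angle_sim * motion * len(overlap)
```

For identical angles and identical flow fields, every factor is maximal, so the value reduces to (number of overlap pixels) squared.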
Further, the method for obtaining the period similarity comprises the following steps:
setting a fuzzy judgment threshold value; normalizing the fuzzy characteristic value of each frame of cutting device image to obtain a normalized fuzzy characteristic value; taking cutting device images whose normalized fuzzy characteristic value is greater than or equal to the fuzzy judgment threshold as blurred images, and those whose normalized fuzzy characteristic value is smaller than the threshold as clear images;
selecting any two periods as a period group; selecting any clear image in one period of the period group as a target image, and taking a clear image corresponding to the maximum correlation characteristic value of the target image in the other period as a correlation image of the target image; the target image and the related image form a matching pair;
accumulating the correlation characteristic values between the two clear images in the matching pair in any two periods to obtain an initial period correlation degree; performing negative correlation mapping on absolute values of differences of the number of images in two periods in the period group to obtain the number similarity of the periodic images; taking the product of the reciprocal of the number similarity of the periodic images and the initial period correlation as the period similarity between two corresponding periods.
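A sketch of the period similarity, implemented literally as the text states (the exp(-x) negative-correlation mapping is an assumption; note that multiplying by the *reciprocal* of the image-count similarity, as written, makes a larger count difference increase the score — the sketch follows the text rather than second-guessing it):

```python
import math


def period_similarity(pair_correlations, n_images_a, n_images_b):
    """pair_correlations: related characteristic values of the matched
    clear-image pairs between two periods. The accumulated sum is the
    initial period correlation; the image-count difference is mapped with
    exp(-x) to a count similarity, and the period similarity is the
    initial correlation times the reciprocal of that count similarity."""
    initial = sum(pair_correlations)
    count_sim = math.exp(-abs(n_images_a - n_images_b))
    return initial / count_sim
```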
Further, the reference degree obtaining method includes:
performing negative correlation mapping on the normalized fuzzy characteristic value of each fuzzy image to obtain a fuzzy weight; and taking the product of the correlation characteristic value between each blurred image and each clear image, the period similarity between the two periods and the blurring weight value of the clear image as the reference degree between the corresponding blurred image and the corresponding clear image.
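The reference degree then combines the three factors above; a minimal sketch (the two sentences disagree on whose blur weight is used — the clear image's is assumed here, and exp(-x) again stands in for the unspecified negative-correlation mapping):

```python
import math


def reference_degree(rel_value, period_sim, clear_blur_norm):
    """Reference degree between a blurred image and a clear image:
    related characteristic value * period similarity between their two
    periods * blur weight of the clear image, where the blur weight is
    a negative-correlation mapping of the clear image's normalized
    fuzzy characteristic value."""
    blur_weight = math.exp(-clear_blur_norm)
    return rel_value * period_sim * blur_weight
```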
Further, the specific method for deblurring is as follows:
normalizing the reference degree of the blurred image and the clear image of each frame to obtain an initial normalized reference degree of the blurred image and the clear image of each frame;
setting a reference threshold, and taking the clear image with the initial normalized reference degree larger than the reference threshold as a reference image of the blurred image;
normalizing the reference degree between the blurred image and the reference image to obtain a normalized reference degree between the blurred image and each frame of reference image; selecting any pixel point in a blurred image as a target pixel point, and taking a corresponding pixel point of the target pixel point in the reference image as a reference pixel point;
Taking the product of the normalized reference degree between the reference image and the blurred image of each frame and the gray value of the reference pixel point in the corresponding reference image as an initial reference value of the reference pixel point in the corresponding reference image; taking the average value of the initial reference values of the reference pixel points in all the reference images as the reference value of the target pixel point;
changing the target pixel point to obtain the reference value of each pixel point in the blurred image; and taking the reference value of each pixel point in the blurred image as the gray value of the corresponding pixel point to finish the deblurring process of the blurred image.
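The per-pixel replacement rule above can be sketched as a weighted mean over the reference images (pure Python, images as 2-D gray-value lists; function names are illustrative):

```python
def deblur_pixel(ref_grays, norm_refs):
    """Reference value of one target pixel: for each reference image, the
    initial reference value is (normalized reference degree) * (gray value
    of the reference pixel); the final value is the mean over all of them."""
    initial = [w * g for w, g in zip(norm_refs, ref_grays)]
    return sum(initial) / len(initial)


def deblur_image(blurred, ref_images, norm_refs):
    """Replace every gray value in the blurred image with its reference
    value computed from the reference images (all the same size)."""
    h, w = len(blurred), len(blurred[0])
    return [[deblur_pixel([ref[r][c] for ref in ref_images], norm_refs)
             for c in range(w)] for r in range(h)]
```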
Further, the specific method for recording is as follows:
deblurring the blurred image during the reciprocating motion; carrying out corner detection on each frame of cutting device image in the reciprocating motion process, wherein each frame of cutting device image obtains at least two corner points; selecting the corner point farthest from the fixed pixel point as a monitoring point of the image of the corresponding cutting device;
and recording the positions of the monitoring points of the images of each frame of cutting device in the reciprocating motion process, and finishing the recording of the motion trail of the swinging motion mechanism.
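Selecting the monitoring point is a simple farthest-corner search; a sketch (corner detection itself, e.g. Harris or Shi-Tomasi, is outside the snippet and the corners are passed in as coordinates):

```python
def monitoring_point(corners, fixed_point):
    """Pick the corner farthest from the fixed pixel point as the frame's
    monitoring point; recording these positions frame by frame over the
    reciprocating motion yields the motion trail."""
    def sq_dist(p):
        # squared Euclidean distance avoids an unnecessary sqrt
        return (p[0] - fixed_point[0]) ** 2 + (p[1] - fixed_point[1]) ** 2
    return max(corners, key=sq_dist)
```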
The invention has the following beneficial effects:
according to the embodiment of the invention, the reciprocating motion and the swinging motion of the cutting device are the same, the cutting angle of each frame of image of the cutting device in the reciprocating motion process is obtained, the cutting angle can clearly show the position condition of the cutting device in the cutting motion process at a certain moment, the reciprocating motion process is divided into periods based on the cutting angle, and the accuracy of period division is improved; when the image is blurred, the pixel points are smoothed, the smoothed degree can reflect the blurring condition of the image, the blurring degree of the image is directly obtained according to the smoothed degree of the whole image, and errors are easily generated at different swinging positions of the cutting device, so that the rotating area of the image of the cutting device and the smoothed degree thereof are analyzed, and the precision of a blurring characteristic value is increased; the number of pixel points in an overlapping area of two frames of images shows the coincidence degree between the images, the motion information difference of the pixel points in the overlapping area reflects the information correlation between the two frames of images from the motion condition of the pixel points, and the swinging characteristic of the cutting device is considered, and meanwhile, the difference analysis of the cutting angles of the two frames of images is combined, so that the correlation characteristic value can more accurately reflect the correlation degree between the two frames of images; the growth condition of crops can influence the reciprocating motion in the operation process of the cutting device, so that the difference exists in the number of images in different periods, meanwhile, the characteristics of clear images are well reserved, so that the correlation characteristic values among the clear images in different periods can reflect the similar 
conditions of the periods to a large extent, and the period similarity between two periods is obtained according to the difference in the number of the images in the periods and the correlation characteristic values among the common clear images; when the harvester cuts crops, the growth of the crops is not completely consistent, the period of reciprocating motion is not completely the same, the replacement images are directly selected without considering the difference of the period, the deblurring precision cannot be ensured, so that the period similarity between the images of different periods is obtained, the acquired images and the replaced images are higher in reference by combining the relevant characteristic value and the fuzzy characteristic value between the images of the cutting device, the images selected based on the reference are clearer and are higher in correlation degree, the deblurring effect is greatly improved, the pixel point positions in the images after deblurring treatment are accurately positioned, and the accuracy of the motion track of the swinging motion mechanism is improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions and advantages of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are only some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a system block diagram of a track vision recording system for a rocking motion mechanism according to one embodiment of the present invention;
fig. 2 is a simplified schematic diagram of a cutting motion process provided by one embodiment of the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the intended aim, the following is a detailed description of the specific implementation, structure, features and effects of the track vision recording system of a swinging motion mechanism according to the invention, with reference to the accompanying drawings and preferred embodiments. In the following description, different instances of "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The invention is directed at the following scenario: the swing transmission box is an important part of a harvester and drives the cutting device to reciprocate to harvest crops. The motion trail of the cutting device while harvesting directly influences harvesting efficiency, since different cutting trails require different cutting forces; the cutting trail therefore needs to be recorded and analyzed. Because the harvesting process is a reciprocating motion, motion blur can occur in the images acquired to record the trail, making the record inaccurate. The invention analyzes the movement process of the cutting device.
The following specifically describes a specific scheme of a track vision recording system of a rocking motion mechanism provided by the invention with reference to the accompanying drawings.
Referring to fig. 1, a system block diagram of a track vision recording system of a rocking motion mechanism according to an embodiment of the present invention is shown, the system includes: the system comprises an image data acquisition module 101, a period analysis module 102, a relevant feature analysis module 103, a reference index analysis module 104 and a track visual recording module 105.
The image data acquisition module 101 is used for acquiring at least two frames of images of the cutting device in the reciprocating motion process of the swinging motion mechanism.
Specifically, in the embodiment of the invention, an industrial camera is fixed directly above the swing transmission box and used to acquire images of the cutting device in real time; the acquired images form a video sequence. Weighted graying is applied to each image to obtain the initial cutting device images. Weighted graying is a known technique, and the specific method is not described here.
Other image capturing devices and image preprocessing algorithms, which are well known to those skilled in the art, may be used in other embodiments of the present invention, and are not limited herein.
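Weighted graying itself is a one-line pixel operation; a minimal sketch (the patent does not name its weights, so the common BT.601 luma coefficients 0.299/0.587/0.114 are assumed here for illustration):

```python
def weighted_gray(r, g, b):
    """Weighted grayscale of one RGB pixel; BT.601 luma weights assumed."""
    return 0.299 * r + 0.587 * g + 0.114 * b


def to_grayscale(rgb_image):
    """rgb_image: nested list of (r, g, b) tuples -> 2-D list of gray values."""
    return [[weighted_gray(r, g, b) for (r, g, b) in row] for row in rgb_image]
```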
Because the initial cutting device image is acquired in the operation process of the harvester, objects such as a cutting device, crops and the like are arranged in the initial cutting device image, and a cutting device area in the initial cutting device image is acquired for the purpose of analyzing the cutting device.
In the embodiment of the invention, an initial cutting device image in the reciprocating motion process of a cutting device is divided into a still image and a moving image by utilizing a frame difference method, moving pixel points in each frame of moving image are obtained, and an area formed by the moving pixel points in each frame of moving image is used as a cutting device area of a corresponding moving image; since the cutting device is in a stationary state in the still image, the cutting device region can be obtained by manually labeling the cutting device portion within the image. Thus, a cutting device image of each frame of the swinging motion mechanism in the reciprocating motion process is obtained, and only a cutting device area is arranged in the cutting device image. The frame difference method is a known technique, and a specific method is not described herein.
In another embodiment of the present invention, the cutting device region in the initial cutting device image may also be obtained by threshold segmentation, and the specific algorithm content is a technical means well known to those skilled in the art, which is not described herein. Other methods of acquiring the cutting device area may be selected in other embodiments of the present invention, and are not limited in this regard.
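The frame-difference step can be sketched as a thresholded absolute difference between consecutive gray frames (a minimal pure-Python illustration; the threshold value 25 is an arbitrary example, not from the patent):

```python
def frame_difference_mask(prev_frame, curr_frame, threshold=25):
    """Binary mask of moving pixels: True where the absolute gray-level
    change between consecutive frames exceeds the threshold. The region
    formed by the True pixels approximates the cutting device area of
    the moving image."""
    return [[abs(c - p) > threshold
             for p, c in zip(prev_row, curr_row)]
            for prev_row, curr_row in zip(prev_frame, curr_frame)]
```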
The period analysis module 102 is configured to obtain a cutting angle of each frame of the image of the cutting device; the reciprocation process is divided into at least two cycles based on the cutting angle.
Because the swinging motion mechanism reciprocates, the collected images form a video sequence. When deblurring such images, the traditional algorithm relies directly on the correlation between video frames, i.e., it substitutes a clear image for the blurred one. The chosen replacement image therefore directly affects deblurring precision: when the selected image is poor, the deblurring effect is poor. That is, for a video sequence in which every frame is blurred, a multi-frame-fusion method struggles to find a good clear image among the frames, so the deblurring effect degrades.
In practical situations, the swing transmission case has a plurality of periods in the reciprocating motion process, and clear images with higher correlation with blurred images can be distributed in different periods, so that when the clear images are selected to replace the blurred images, the deblurring effect of the clear images in different periods on the blurred images needs to be considered. Therefore, the reciprocating motion is decomposed into a plurality of periods, and the deblurring is completed based on the motion characteristics of the cutting device in the plurality of periods, so that the deblurring precision can be effectively improved.
The cutting device is typically fixed at one end and cuts at the other end as it reciprocates. Its reciprocating motion is the same as a swinging motion; the most familiar example is a pocket watch hung in the air and swinging left and right, with the upper end fixed and the lower end sweeping from side to side. Similarly, in a top view the tail of the cutting device is fixed, and the knife edge reciprocates to complete the cutting motion.
Acquiring the motion state of the cutting device requires determining the cutting angle in the image of the cutting device. Preferably, in one embodiment of the present invention, the method for acquiring the cutting angle of each frame of the image of the cutting device by taking the fixed point in the image of the continuous multi-frame cutting device as a reference includes: edge detection is carried out on each frame of cutting device image to obtain edge pixel points, and the edge pixel points of each frame of cutting device area are mapped to the same frame of image to obtain coincident points; and taking the corresponding edge pixel point of the coincident point in the image of each frame of cutting device as a fixed pixel point.
For ease of understanding, assume that the cutting device moves left and right in the image to complete cutting, the image is a top view, and the fixed end lies at the bottom of the image. Fig. 2 is a simplified schematic diagram of the cutting motion process according to an embodiment of the present invention. As shown in fig. 2, which is a top view of the cutting motion of the cutting device, the two rectangles are simplified representations of the cutting device and their common bottom point q is the fixed pixel point. Canny edge detection is performed on each of the multi-frame cutting device images to obtain the edge pixel points of each frame; the edge pixel points of each frame are mapped into the same image, the overlapping edge pixel points are taken as coincident points, and the pixel points corresponding to the coincident points in each frame of cutting device image are taken as fixed pixel points. Canny edge detection is a well-known technique, and the specific method is not described here.
The direction of the edge on which the fixed pixel point lies in the cutting device image intuitively reflects the movement of the cutting device area, so the cutting angle of the cutting device image is obtained from this feature. Preferably, the cutting angle is acquired as follows: the cutting device image and the preset still image both contain the fixed pixel point; the edge pixel points within a preset neighborhood of the fixed pixel point in the cutting device image are taken as target edge points, and each target edge point is paired with its corresponding edge pixel point in the preset still image to form a target edge point pair; the included angle between the straight lines formed, in their respective images, by the two edge pixel points of the pair and the fixed pixel point is taken as the cutting angle difference of the pair; and the average of the cutting angle differences over all target edge points corresponding to the fixed pixel point is taken as the cutting angle of the cutting device image.
As an example, since the information in the cutting device image acquired in the stationary state of the cutting device is accurate, and the cutting device image acquired in the moving state may be blurred, the cutting angle of the cutting device image in the moving state is more accurate based on the information in the cutting device image in the stationary state. Taking a preset still image Q and any cutting device image W as an example for analysis, the motion process of the cutting device is equivalent to taking a fixed pixel point Q as a fixed point, and continuously rotating the cutting device image by a certain angle, so that each pixel point in the preset still image Q can find a corresponding pixel point in the cutting device image W. The straight line formed by the edge pixel points in the neighborhood of the fixed pixel points and the fixed pixel points can intuitively reflect the movement condition of the cutting device region, the clockwise included angle of the straight line formed by each edge pixel point in the preset neighborhood of the fixed pixel points in the preset still image Q and the fixed pixel points is respectively obtained, the clockwise included angle of the straight line formed by the corresponding edge pixel points in the preset neighborhood of the fixed pixel points in the cutting device image W and the fixed pixel points is obtained, and the cutting angle of the cutting device image W is obtained based on the difference between the obtained included angles in the two frames of images. The embodiment of the invention sets the preset still image as the first still image in the reciprocating motion process, the size of the preset neighborhood of the fixed pixel point takes the empirical value of 3 multiplied by 3, and an implementer can set the size of the preset neighborhood according to actual conditions.
The cutting angle of the cutting device image W is acquired according to the difference of the edge directions at the fixed pixel point between the cutting device image W and the preset still image Q. The calculation formula of the cutting angle is:

$$\theta_W = \frac{1}{N}\sum_{i=1}^{N}\Delta\theta_i$$

where $\theta_W$ is the cutting angle of the cutting device image W; $\Delta\theta_i$ is the included angle between the straight line formed by the i-th edge pixel point in the preset neighborhood of the fixed pixel point and the fixed pixel point in the preset still image Q, and the straight line formed by the corresponding edge pixel point and the fixed pixel point in the cutting device image W; and N is the number of edge pixel points in the preset neighborhood of the fixed pixel point.

It should be noted that the smaller the included angle $\Delta\theta_i$, the smaller the rotation of the cutting device image W, and the smaller the cutting angle $\theta_W$.
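The averaging of per-pair angle differences described above can be sketched in a few lines; this is an illustrative example only (the function names and the corresponding-point-pair representation are assumptions, not part of the patent):

```python
import math

def line_angle(fixed, pt):
    # Angle (degrees, in [0, 360)) of the straight line from the fixed
    # pixel point to an edge pixel point, in image coordinates.
    return math.degrees(math.atan2(pt[1] - fixed[1], pt[0] - fixed[0])) % 360.0

def cutting_angle(fixed, edges_q, edges_w):
    # Mean included angle over corresponding edge-point pairs (preset still
    # image Q vs. cutting device image W), i.e. the average of the per-pair
    # cutting angle differences.
    diffs = []
    for pq, pw in zip(edges_q, edges_w):
        d = abs(line_angle(fixed, pw) - line_angle(fixed, pq))
        diffs.append(min(d, 360.0 - d))  # smallest included angle
    return sum(diffs) / len(diffs)
```

For example, two edge points rotated by 90° about the fixed point yield a cutting angle of 90°.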
In other embodiments of the present invention, the cutting angle of each frame of the image of the cutting device may also be determined by a feature point matching method, and a specific algorithm is a technical means well known to those skilled in the art, which is not described herein.
The cutting angle of the cutting device image intuitively presents the position of the cutting device at a given moment of the cutting motion, so the reciprocating motion is divided into periods based on the change process of the cutting angle. Preferably, the specific acquisition method of the period is as follows: mapping the cutting angles of all the cutting device images into a two-dimensional coordinate system in time order to obtain a cutting angle change curve, the abscissa being time and the ordinate being the cutting angle; taking the end points and the trough points of the cutting angle change curve as cutting points, and taking the time interval corresponding to the curve segment between every three adjacent cutting points as one period; each period has three cutting points, the first being the starting point of the period and the third being its end point; and the end point of one period is the starting point of the next period.
It should be noted that the growth condition of the crops affects the reciprocating motion during operation of the cutting device, so the peak points of the cutting angle change curve are not necessarily identical, while the values at the two end points and at all trough points of the curve are 0. In the embodiment of the invention, the time period between three adjacent cutting points on the cutting angle change curve is taken as one period: the starting point of the period is a cutting device image with a cutting angle of 0; within one period the cutting angle increases from 0 to a maximum and decreases back to 0, then increases again to a maximum and decreases back to 0; and the cutting device image with a cutting angle of 0 at that moment serves as both the end point of the period and the starting point of the next period, so that one period of the reciprocating motion is obtained.
To this end, the reciprocation process is divided into a plurality of cycles.
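A minimal sketch of this period division, assuming the unsigned cutting angle returns exactly to 0 at the end points and trough points (the function name and tolerance parameter are illustrative assumptions):

```python
def split_periods(angles, tol=1e-6):
    # Indices where the cutting angle is (approximately) zero are the
    # cutting points; every three adjacent cutting points bound one period,
    # so the end point of a period is the starting point of the next.
    cuts = [i for i, a in enumerate(angles) if abs(a) <= tol]
    return [(cuts[k], cuts[k + 2]) for k in range(0, len(cuts) - 2, 2)]
```

On a curve with five zero crossings this yields two back-to-back periods sharing a boundary frame.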
The relevant feature analysis module 103 is configured to obtain a fuzzy feature value of each frame of the cutting device image according to the rotation area and the smoothed degree of the rotation area of each frame of the cutting device image; and combining the difference of the cutting angles of the images of any two frames of cutting devices and the difference of the number of the pixel points in the overlapping area and the motion information of the pixel points in the overlapping area to obtain the correlation characteristic value between the images of any two frames of cutting devices.
When the cutting device reciprocates, the angle presented by the cutting device region differs between images, so directly analyzing the overlapping information of the cutting device regions of two frames introduces errors into the blur feature value. When an image is blurred, its pixel points are smoothed, and the smoothed degree reflects the blurring of the image; however, deriving the blur degree from the smoothed degree of the whole image easily produces errors at different swing positions of the cutting device. Therefore, the rotation area of the cutting device image and its smoothed degree are analyzed, increasing the precision of the blur feature value.
Preferably, the method for acquiring the blur feature value of the cutting device image comprises the following steps: mapping the preset still image into each frame of cutting device image, and taking the overlapped part as the rotation area of the corresponding cutting device image; removing the rotation area of the cutting device image to obtain the non-overlapping area of each frame; taking the absolute value of the gray difference between each pixel point of the rotation area in the cutting device image and the corresponding pixel point in the preset still image as the blur difference of that pixel point; accumulating the blur differences of all pixel points in the rotation area as the smoothed degree; and taking the product of the smoothed degree and the number of pixel points in the non-overlapping area as the blur feature value of the cutting device image.
As an example, since the information in a cutting device image acquired while the cutting device is stationary is accurate and an image acquired during motion may be blurred, analyzing against the stationary-state image makes the blur degree of the cutting device image more accurate. The analysis is performed on the preset still image Q and any cutting device image E in any period, the cutting device image E being obtained by continuously rotating the cutting device region by a certain angle about the fixed pixel point q. The pixel points of the preset still image Q are mapped into the cutting device image E; the region where the pixel points overlap is taken as the rotation area of the cutting device image E, and the region formed by the non-overlapping pixel points is taken as its non-overlapping area. When the image is blurred, its pixel points are smoothed, so the gray differences between the pixel points of the rotation area in the cutting device image E and the corresponding pixel points in the preset still image Q reflect the blurring of the cutting device image E; combining the non-overlapping area of the cutting device image E in the analysis makes its blur feature value more accurate.
The blur feature value of the cutting device image E is obtained by combining the gray differences between the pixel points of the rotation area in the cutting device image E and their corresponding pixel points in the preset still image Q with the number of pixel points in the non-overlapping area. The calculation formula of the blur feature value is:

$$ME = n \times \sum_{i=1}^{m}\left|G_i^{E} - G_i^{Q}\right|$$

where $ME$ is the blur feature value of the cutting device image E; $n$ is the number of pixel points in the non-overlapping area of the cutting device image E; $m$ is the number of pixel points in the rotation area of the cutting device image E; $G_i^{E}$ is the gray value of the i-th pixel point of the rotation area in the cutting device image E; $G_i^{Q}$ is the gray value of the pixel point in the preset still image Q corresponding to the i-th pixel point of the rotation area; and $|\cdot|$ is the absolute value function.
It should be noted that the more serious the blurring of the cutting device image E, the more the boundaries in the image are widened, enlarging the cutting device region; the non-overlapping area of the cutting device image E is then larger and the number of its pixel points n is larger. Likewise, the greater the difference between the gray values of the pixel points in the rotation area of the cutting device image and those of the corresponding pixel points in the preset still image, the greater the smoothed degree of the cutting device region and hence the blurring of the cutting device image. In both cases the blur feature value ME of the cutting device image E is larger.
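The blur feature formula above reduces to a few lines of code; this sketch flattens the rotation-area pixels into gray-value lists for illustration (the names are assumptions):

```python
def blur_feature(rot_grays_e, rot_grays_q, n_non_overlap):
    # Smoothed degree: accumulated absolute gray differences between the
    # rotation-area pixels of image E and the corresponding pixels of the
    # preset still image Q, weighted by the non-overlapping pixel count n.
    smoothed = sum(abs(ge - gq) for ge, gq in zip(rot_grays_e, rot_grays_q))
    return n_non_overlap * smoothed
```

A larger gray mismatch or a larger non-overlapping area both raise the feature value, matching the note above.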
The correlation feature value between cutting device images is obtained from local features of the images: the number of pixel points in the overlapping area of two cutting device images presents their degree of coincidence, and the difference of the motion information of the pixel points in the overlapping area reflects the correlation between the two frames from the motion of the pixel points, taking the swinging characteristics of the cutting device into account. Combined with the difference of the cutting angles of the two frames, the correlation feature value obtained by comprehensively analyzing these three factors represents the correlation between two cutting device images more accurately.
Preferably, the method for acquiring the relevant characteristic values of the two-frame cutting device image comprises the following steps: mapping any two frames of cutting device images into the same frame of image, and taking the overlapped part as an overlapped area of the two frames of cutting device images; acquiring the motion speed and the motion direction of each pixel point in the image of each frame of cutting device by using an optical flow method; two corresponding pixel points of each position in the overlapping area in the two frames of cutting device images form an area point group; performing negative correlation mapping on the product of the absolute value of the difference value of the motion speeds and the absolute value of the difference value of the motion directions of two pixel points in the region point group to obtain the motion similarity of each position in the overlapped region; accumulating the motion similarity of each position in the overlapped area to obtain a motion characteristic value of the corresponding two frames of cutting device images; taking the absolute value of the difference value of the cutting angles of the two frames of images of the cutting device as a cutting angle difference value, and carrying out negative correlation mapping on the cutting angle difference value to obtain a cutting angle similarity value; and taking the product of the cutting angle similarity value, the motion characteristic value and the number of pixel points in the overlapping area of any two frames of cutting device images as the related characteristic value of the two frames of cutting device images. The optical flow method is a known technique, and the specific method is not described here.
Taking a cutting device image A and a cutting device image B as an example, the correlation feature value between the two frames is obtained by combining the difference of their cutting angles, the number of pixel points in the overlapping area, and the difference of the motion information of the pixel points of the overlapping area in the two frames. The calculation formula of the correlation feature value is:

$$s_i = \frac{1}{\left|v_i^{A}-v_i^{B}\right| \times \left|d_i^{A}-d_i^{B}\right| + \varepsilon}$$

$$RG = m \times \exp\left(-\left|\theta_A-\theta_B\right|\right) \times \sum_{i=1}^{m} s_i$$

where $RG$ is the correlation feature value between the cutting device image A and the cutting device image B; $s_i$ is the motion similarity of the i-th position within the overlapping area; $v_i^{A}$ and $v_i^{B}$ are the motion speeds of the pixel points corresponding to the i-th position of the overlapping area in the cutting device images A and B respectively; $d_i^{A}$ and $d_i^{B}$ are the motion directions of the pixel points corresponding to the i-th position of the overlapping area in the cutting device images A and B respectively; $\theta_A$ and $\theta_B$ are the cutting angles of the cutting device images A and B; $m$ is the number of pixel positions within the overlapping area; $\varepsilon$ is a preset first constant preventing the denominator from being 0, with empirical value 0.01; $\exp$ is the exponential function with base the natural constant e; and $|\cdot|$ is the absolute value function.
It should be noted that the smaller the differences of motion speed and motion direction of the corresponding pixel points of each position of the overlapping area in the two frames, the closer the positions of the cutting device image A and the cutting device image B are within their respective periods, the greater the correlation of the two frames, and the larger the correlation feature value RG; and the closer the cutting angles of the two frames, the greater their overlap, the larger the number m of pixel points in the overlapping area, indicating that the two frames are closer, and the larger the correlation feature value between the cutting device image A and the cutting device image B.
By using the method, the correlation characteristic value between any two frames of images of the cutting device in the reciprocating motion process is obtained.
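A compact sketch of the correlation feature computation, assuming the per-position optical-flow speeds and directions for the overlapping area are already given as lists (the patent cites the optical flow method without specifying an implementation):

```python
import math

def correlation_feature(va, vb, da, db, theta_a, theta_b, eps=0.01):
    # Motion similarity per overlap position: negative correlation mapping
    # of the product of the speed and direction differences; eps is the
    # preset first constant guarding against a zero denominator.
    motion = sum(1.0 / (abs(sa - sb) * abs(ra - rb) + eps)
                 for sa, sb, ra, rb in zip(va, vb, da, db))
    m = len(va)  # number of pixel positions in the overlapping area
    return m * math.exp(-abs(theta_a - theta_b)) * motion
```

Identical motion at one position with equal cutting angles gives 1 × exp(0) × (1/0.01) = 100, the maximum for a single-position overlap.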
The reference index analysis module 104 is used for screening out the clear images according to the blur feature values; acquiring the period similarity between any two periods according to the difference of the image numbers of the two periods and the correlation feature values between their clear images; and combining the correlation feature value between any two frames of cutting device images, the period similarity between the periods, and the blur feature value to obtain the reference degree between any two frames of cutting device images.
The video images acquired in the reciprocating motion process of the cutting device often have certain correlation, so that the reference degree between images in different periods can be constructed based on the correlation of the video images and the periodicity of the reciprocating motion, the deblurring processing of the blurred image is completed based on the reference degree, and the deblurring precision is improved. Therefore, to improve the deblurring accuracy of the image, the degree of correlation between different periods during the reciprocation is analyzed.
Although the reciprocating motion has periodicity, when a certain image is blurred, directly selecting a clear image from another period for replacement makes the deblurring operation insufficiently accurate. The reason is that the cutting of the harvester cannot completely conform to the growth of the crops, so the cutting processes are not completely consistent, i.e. the periods into which the reciprocating process is divided are not completely consistent. Direct replacement therefore cannot guarantee deblurring precision, so the period similarity of two periods is constructed from the difference of the image numbers of the periods and the correlation feature values between the clear images of the two periods.
Preferably, the specific method for obtaining the period similarity of two periods is as follows: setting a blur judgment threshold; normalizing the blur feature value of each frame of cutting device image to obtain a normalized blur feature value; taking cutting device images whose normalized blur feature value is greater than or equal to the blur judgment threshold as blurred images, and those below the threshold as clear images; selecting any two periods as a period group; selecting any clear image in one period of the period group as a target image, and taking the clear image in the other period with the maximum correlation feature value with the target image as its related image; the target image and the related image form a matching pair; accumulating the correlation feature values between the two clear images of all matching pairs of the two periods to obtain the initial period correlation; performing negative correlation mapping on the absolute value of the difference of the image numbers of the two periods in the period group to obtain the image-number similarity of the periods; and taking the product of the image-number similarity and the initial period correlation as the period similarity between the two periods.
As an example, for different periods, a blurred image in one period needs to refer to a clear image at the corresponding time in another period. If the similarity of two periods were calculated directly through the correlation feature values between all pairs of frames, the smoothed image features of blurred images would make the correlation feature values between blurred and clear images smaller, introducing errors into the period similarity. The invention therefore obtains the similarity between periods based on the correlation feature values between the clear images of the two periods. Taking the period c and the period d as examples, the clear images in the period c and the period d are acquired respectively. A clear image C in the period c is selected, the correlation feature value between the clear image C and each clear image in the period d is calculated, and when the correlation feature value between the clear image C and a clear image D in the period d is the largest, the clear image C and the clear image D form a matching pair; all the matching pairs between the period c and the period d are acquired in this way. The period similarity between the period c and the period d is then acquired according to the correlation feature values between the two clear images of the matching pairs and the difference of the image numbers of the two periods. When the two periods are the same period, the period similarity is 1. In the embodiment of the invention, the blur judgment threshold takes the empirical value 0.2, and an implementer can set it according to the specific implementation scene. The calculation formula of the period similarity between the period c and the period d is:
$$AS = \frac{1}{\left|N_c - N_d\right| + \beta} \times \sum_{i=1}^{L} RG_i$$

where $AS$ is the period similarity between the period c and the period d; $N_c$ is the number of cutting device images in the period c; $N_d$ is the number of cutting device images in the period d; $L$ is the number of matching pairs between the period c and the period d; $RG_i$ is the correlation feature value between the two clear images in the i-th matching pair between the period c and the period d; $\beta$ is a preset second constant preventing the denominator from being 0, with empirical value 0.01; and $|\cdot|$ is the absolute value function.
It should be noted that the larger the correlation feature value $RG_i$ between the two clear images in a matching pair of the period c and the period d, the greater the correlation between the clear images of the two periods and the higher the period similarity AS between the period c and the period d; and the smaller the difference of the image numbers of the period c and the period d, the more similar the two periods, and the greater the period similarity between them.
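Following the formula above, the period similarity for two distinct periods can be sketched as follows (identical periods are assigned similarity 1 directly per the text; the function name is an assumption):

```python
def period_similarity(n_c, n_d, pair_rgs, beta=0.01):
    # Initial period correlation (sum of matching-pair correlation feature
    # values) weighted by a negative correlation mapping of the image-count
    # difference; beta is the preset second constant guarding against a
    # zero denominator.
    return sum(pair_rgs) / (abs(n_c - n_d) + beta)
```

Equal image counts and strong matching-pair correlations drive the similarity up; a large count difference shrinks it.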
The correlation characteristic value between the images of any two frames of cutting devices, the period similarity between the periods and the fuzzy characteristic value can influence the reference degree between the images of the two frames of cutting devices, and the three factors are comprehensively analyzed to obtain the reference degree between the images of the two frames, so that the reference property between the images is stronger. Preferably, the specific acquisition method of the reference degree is as follows: performing negative correlation mapping on the normalized fuzzy characteristic value of each fuzzy image to obtain a fuzzy weight; and taking the product of the correlation characteristic value between each blurred image and each clear image, the period similarity between the two periods and the blurring weight of the clear image as the reference degree between the corresponding blurred image and the corresponding clear image.
It should be noted that, in the deblurring process, a clear image with a relatively high correlation with the blurred image is to be selected to replace the blurred image; therefore, when considering the reference degree between the blurred image and other images, only the reference degrees between the blurred image and the clear images of each period are calculated, which also reduces the amount of calculation.
As an example, taking a blurred image H and a clear image Z for analysis: when the blurred image H and the clear image Z are in the same period, the period similarity AS between them is 1; when they are in different periods, the period similarity between the periods to which the two frames belong is obtained. In the embodiment of the invention, the blur weight of the clear image Z is obtained by subtracting its normalized blur feature value $ME'_Z$ from the constant 1, i.e. $1-ME'_Z$, which realizes the negative correlation mapping of the normalized blur feature value. In other embodiments, other negative correlation methods such as function transformation may be selected, which is not limited here. The correlation feature value between the blurred image H and the clear image Z, the period similarity between the two periods, and the blur weight are combined to obtain the reference degree between the blurred image H and the clear image Z. The calculation formula of the reference degree is:
$$CK = RS \times RG \times \left(1 - ME'_Z\right)$$

where $CK$ is the reference degree between the blurred image H and the clear image Z; $RS$ is the period similarity between the periods to which the blurred image H and the clear image Z belong; $RG$ is the correlation feature value between the blurred image H and the clear image Z; and $ME'_Z$ is the normalized blur feature value of the clear image Z.
It should be noted that the larger the period similarity RS between the two periods to which the blurred image H and the clear image Z belong, the more similar the two periods and the larger the reference degree CK between the blurred image H and the clear image Z; the larger the correlation feature value between the blurred image H and the clear image Z, the closer the two frames and the larger the reference degree CK; and the smaller the normalized blur feature value of the clear image Z, the clearer the clear image Z, the better the subsequent deblurring of the blurred image H using the clear image Z, and the larger the reference degree CK.
According to the method for acquiring the reference degree between the blurred image and the clear image, the reference degree between the blurred image and each clear image in each period is acquired respectively.
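The three-factor product above is a one-liner; a minimal sketch (names assumed) combining period similarity, correlation feature value, and the blur weight of the clear image:

```python
def reference_degree(rs, rg, norm_blur_z):
    # Blur weight of the clear image Z is 1 minus its normalized blur
    # feature value; the three factors are simply multiplied together.
    return rs * rg * (1.0 - norm_blur_z)
```

A clearer reference candidate (smaller normalized blur value) and a more similar period both increase the reference degree.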
A track vision recording module 105, configured to perform deblurring processing on each frame of cutting device image according to a reference degree between the cutting device images; and recording the motion trail of the swinging motion mechanism according to the pixel point position in the image after deblurring.
And screening out an image which is more suitable for the cutting device image to be replaced based on the reference degree between the cutting device images, and performing deblurring treatment on the cutting device image through the image to improve the deblurring precision.
Preferably, the specific method for deblurring is as follows: normalizing the reference degree of the blurred image and each frame of clear image to obtain the initial normalized reference degree of the blurred image and each frame of clear image; setting a reference threshold value, and taking a clear image with initial normalization reference degree larger than the reference threshold value as a reference image of the blurred image; normalizing the reference degree between the blurred image and the reference image to obtain a normalized reference degree between the blurred image and each frame of reference image; selecting any pixel point in the blurred image as a target pixel point, and taking a corresponding pixel point of the target pixel point in the reference image as a reference pixel point; taking the product of the normalized reference degree between each frame of reference image and the blurred image and the gray value of the reference pixel point in the corresponding reference image as the initial reference value of the reference pixel point in the corresponding reference image; taking the average value of the initial reference values of the reference pixel points in all the reference images as the reference value of the target pixel point; changing target pixel points to obtain a reference value of each pixel point in the blurred image; and taking the reference value of each pixel point in the blurred image as the gray value of the corresponding pixel point to finish the deblurring treatment of the blurred image.
Taking the blurred image H as an example, assume the clear images in the periods are X1, X2, X3 and X4 respectively; the reference degrees between the blurred image H and these clear images, namely CK1, CK2, CK3 and CK4, are obtained in sequence, and the four reference degrees are normalized to obtain the initial normalized reference degrees $CK'_1$, $CK'_2$, $CK'_3$ and $CK'_4$ in sequence. If the initial normalized reference degrees $CK'_1$ and $CK'_2$ are above the reference threshold, the clear image X1 and the clear image X2 are reference images of the blurred image H. The reference degree CK1 between the blurred image H and the reference image X1 and the reference degree CK2 between the blurred image H and the reference image X2 are normalized to obtain the normalized reference degrees $CK''_1$ and $CK''_2$ in sequence. A pixel point a in the blurred image H is selected for subsequent analysis, its corresponding pixel points a1 and a2 in the reference images X1 and X2 are acquired, and the average of the product of the gray value of the pixel point a1 and the normalized reference degree $CK''_1$ and the product of the gray value of the pixel point a2 and the normalized reference degree $CK''_2$ is taken as the reference value of the pixel point a; this reference value is used as the gray value of the pixel point a in the blurred image H. In the implementation of the invention, normalization methods such as function transformation, maximum-minimum normalization or the Sigmoid function may be selected to normalize the reference degrees between the blurred image and the clear images and between the blurred image and the reference images respectively, which is not limited here.
In the embodiment of the invention, the reference threshold takes the empirical value 0.5, and an implementer can set it according to the specific implementation scene.
In the calculation process, two normalization processes are required, the objects of the two normalization processes are different, the object of the first normalization process is the reference degree between the blurred image and each clear image in each period, and the object of the second normalization process is the reference degree between the blurred image and each reference image. When the initial normalized reference degree between the blurred image and the clear image is larger than the reference threshold value after the first normalization processing, the second normalization processing can be performed on the reference degree between the blurred image and the clear image.
The reference value of the pixel point a of the blurred image H is obtained according to the reference degrees between the blurred image H and the reference images X1 and X2 and the gray values of the pixel points in the reference images X1 and X2. The calculation formula of the reference value is:

$$G_a = \frac{1}{D}\sum_{i=1}^{D} CK''_i \times G_i^{a}$$

where $G_a$ is the reference value of the pixel point a within the blurred image H; $D$ is the number of reference images of the blurred image H; $G_i^{a}$ is the gray value of the pixel point with the same coordinates as the pixel point a in the i-th reference image of the blurred image H; and $CK''_i$ is the normalized reference degree between the blurred image H and the i-th reference image.
It should be noted that the larger the reference degree between the blurred image H and a reference image, the larger the normalized reference degree $CK''_i$, the greater the correlation between the blurred image H and that reference image, and the more that reference image is worth referring to. Using the normalized reference degree $CK''_i$ between the blurred image H and each reference image as the weight of the gray values of the pixel points in that reference image allows the gray values to be adjusted according to the reference degree between the images, improving the deblurring effect on the blurred image H.
Since the degree of blurring of the clear image in each period is small, in order to reduce the amount of calculation, the clear image does not need to be subjected to subsequent processing, and deblurring operation is performed only on the blurred image in each period.
According to the method, the reference value of each pixel point in the blurred image is obtained, the reference value is used as the gray value of the pixel point in the blurred image, and the deblurring operation of the blurred image is completed.
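The per-pixel reference-value formula reduces to a weighted mean; a minimal sketch for one pixel position, assuming the reference-image gray values and normalized reference degrees are given as parallel lists:

```python
def deblur_pixel(ref_grays, norm_ref_degrees):
    # Reference value of one blurred-image pixel: the mean over the D
    # reference images of (normalized reference degree x gray value of the
    # pixel with the same coordinates in that reference image).
    d = len(ref_grays)
    return sum(g * w for g, w in zip(ref_grays, norm_ref_degrees)) / d
```

Applied to every pixel of a blurred image, this replaces its gray values and completes the deblurring step.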
The blurred images in the reciprocating process are deblurred; corner detection is performed on each frame of cutting device image in the reciprocating motion process, and at least two corner points are obtained from each frame; the corner point farthest from the fixed pixel point is selected as the monitoring point of the corresponding cutting device image; and the positions of the monitoring points of each frame of cutting device image in the reciprocating motion process are recorded, completing the recording of the motion trail of the swinging motion mechanism. It should be noted that corner points are selected as monitoring points because they are easy to obtain in the image and have obvious features; the closer a monitoring point is to the fixed pixel point, the less obvious the motion trail of the swinging motion mechanism recorded with it, so the corner point farthest from the fixed pixel point is selected as the monitoring point for recording the motion trail of the swinging motion mechanism. Corner detection is a well-known technique, and the specific method is not described here.
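Given the detected corner points of a frame, the monitoring-point selection above is a farthest-point query; a small sketch (the corner list is assumed to come from any standard corner detector):

```python
def monitoring_point(corners, fixed):
    # The corner point farthest from the fixed pixel point traces the most
    # visible trajectory, so it is chosen as the monitoring point; squared
    # distance suffices for comparison.
    return max(corners, key=lambda p: (p[0] - fixed[0]) ** 2 + (p[1] - fixed[1]) ** 2)
```

Recording this point frame by frame yields the trajectory of the swinging motion mechanism.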
In another embodiment of the present invention, the trajectory of every corner point may instead be recorded directly; that process is not repeated here.
In summary, in the embodiments of the present invention, the image data acquisition module acquires images of the cutting device of the swinging motion mechanism during motion; the period analysis module obtains the cutting angle from the edge direction at the fixed pixel point and divides the motion process into periods; the relevant characteristic analysis module obtains the fuzzy characteristic value of each cutting device image and the related characteristic value between two frames of images; the reference index analysis module obtains the period similarity from the related characteristic values between the images of two periods and, from it, the reference degree between images; and the track vision recording module performs deblurring based on the reference degree between images and records the motion trajectory of the swinging motion mechanism from the processed images. Because the reference degree takes the periodic correlation of the images into account, the clear images selected on its basis yield a better deblurring effect, improving the precision and efficiency of the track vision recording of the swinging motion mechanism.
It should be noted that: the sequence of the embodiments of the present invention is for description only and does not represent the relative merits of the embodiments. The processes depicted in the accompanying drawings do not necessarily require the particular order shown, or a sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible and may be advantageous.
In this specification, the embodiments are described in a progressive manner; identical or similar parts of the embodiments may be referred to across embodiments, and each embodiment focuses on its differences from the others.
The foregoing is only a description of preferred embodiments of the present invention and is not intended to limit it; any modifications, equivalent substitutions, improvements, and the like made within the principles of the present invention shall fall within the scope of protection of the present invention.

Claims (7)

1. A track vision recording system for a rocking motion mechanism, the system comprising:
the image data acquisition module is used for acquiring at least two frames of images of the cutting device in the reciprocating motion process of the swinging motion mechanism;
the period analysis module is used for acquiring the cutting angle of each frame of cutting device image; dividing a reciprocating process into at least two cycles based on the cutting angle;
the relevant characteristic analysis module is used for acquiring a fuzzy characteristic value of each frame of cutting device image according to the rotation area and the smoothed degree of the rotation area of each frame of cutting device image; combining the difference of the cutting angles of any two frames of cutting device images, the number of pixel points in the overlapping area and the motion information difference of the pixel points in the overlapping area to obtain a related characteristic value between any two frames of cutting device images;
the reference index analysis module is used for screening clear images according to the fuzzy characteristic values; acquiring the period similarity between any two periods according to the number difference of the images in any two periods and the correlation characteristic value between the common clear images; combining the related characteristic values between any two frames of cutting device images, the period similarity between the periods and the fuzzy characteristic values to obtain a reference degree between any two frames of cutting device images;
the track vision recording module is used for carrying out deblurring processing on each frame of cutting device image according to the reference degree between the cutting device images; recording a motion track of the swinging motion mechanism according to the pixel point position in the image after deblurring treatment;
the method for acquiring the cutting angle comprises the following steps:
the cutting device image and the preset still image both comprise a fixed pixel point;
taking an edge pixel point in a preset neighborhood of the fixed pixel point in the cutting device image as a target edge point; forming a target edge point pair by the target edge point and a corresponding edge pixel point in a preset still image;
taking the included angle between the straight lines formed by each edge pixel point of a target edge point pair and the fixed pixel point in the corresponding image as the cutting angle difference of the target edge point pair; taking the average value of the cutting angle differences of all the target edge point pairs corresponding to the fixed pixel point as the cutting angle of the cutting device image;
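A minimal NumPy sketch of this cutting-angle computation, assuming the target edge point pairs have already been matched; the function name and the (row, col) coordinate convention are illustrative, not prescribed by the claim:

```python
import numpy as np

def cutting_angle(target_pairs, fixed_point):
    """Cutting angle of one cutter image.

    target_pairs: list of ((r, c) in the cutter image, (r, c) in the preset
                  still image) target edge point pairs near the fixed pixel.
    """
    fr, fc = fixed_point
    diffs = []
    for (r1, c1), (r2, c2) in target_pairs:
        a1 = np.arctan2(r1 - fr, c1 - fc)  # line: edge point -> fixed point
        a2 = np.arctan2(r2 - fr, c2 - fc)
        diffs.append(abs(a1 - a2))         # cutting-angle difference of the pair
    return float(np.mean(diffs))           # mean over all target edge point pairs
```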
the method for acquiring the relevant characteristic value comprises the following steps:
mapping any two frames of cutting device images into the same frame of image, and taking the overlapped part as an overlapped area of the two frames of cutting device images; acquiring the motion speed and the motion direction of each pixel point in each frame of the cutting device image by using an optical flow method;
two corresponding pixel points of each position in the overlapped area in the two frames of cutting device images form an area point group; performing negative correlation mapping on the product of the absolute value of the difference value of the motion speed and the absolute value of the difference value of the motion direction of two pixel points in the region point group to obtain the motion similarity of each position in the overlapped region; accumulating the motion similarity of each position in the overlapped area to obtain a motion characteristic value of the corresponding two-frame cutting device image;
taking the absolute value of the difference value of the cutting angles of the two frames of cutting device images as a cutting angle difference value, and carrying out negative correlation mapping on the cutting angle difference value to obtain a cutting angle similarity value;
taking the product of the cutting angle similarity value of any two frames of cutting device images, the motion characteristic value and the number of pixel points in the overlapping area as the related characteristic value of the two frames of cutting device images;
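The related characteristic value above might be sketched as follows. The claim does not specify the negative-correlation mapping, so `exp(-x)` is assumed here; the function name and flat-array layout of the overlap region are also illustrative:

```python
import numpy as np

def relevant_feature(speed1, dir1, speed2, dir2, angle1, angle2):
    """Correlation feature between two cutter images over their overlap region.

    speed*/dir*: per-pixel motion speed and direction in the overlap (flat arrays)
    angle*     : cutting angles of the two frames
    """
    speed1, dir1 = np.asarray(speed1, float), np.asarray(dir1, float)
    speed2, dir2 = np.asarray(speed2, float), np.asarray(dir2, float)
    # motion similarity of each overlap position (negative-correlation mapping)
    motion_sim = np.exp(-np.abs(speed1 - speed2) * np.abs(dir1 - dir2))
    motion_feature = motion_sim.sum()      # accumulated over the overlap
    angle_sim = np.exp(-abs(angle1 - angle2))  # cutting-angle similarity value
    n_overlap = speed1.size                # number of pixels in the overlap
    return angle_sim * motion_feature * n_overlap
```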
the specific method for deblurring comprises the following steps:
normalizing the reference degree of the blurred image and the clear image of each frame to obtain an initial normalized reference degree of the blurred image and the clear image of each frame;
setting a reference threshold, and taking the clear image with the initial normalized reference degree larger than the reference threshold as a reference image of the blurred image;
normalizing the reference degree between the blurred image and the reference image to obtain a normalized reference degree between the blurred image and each frame of reference image; selecting any pixel point in a blurred image as a target pixel point, and taking a corresponding pixel point of the target pixel point in the reference image as a reference pixel point;
taking the product of the normalized reference degree between the reference image and the blurred image of each frame and the gray value of the reference pixel point in the corresponding reference image as an initial reference value of the reference pixel point in the corresponding reference image; taking the average value of the initial reference values of the reference pixel points in all the reference images as the reference value of the target pixel point;
changing the target pixel point to obtain the reference value of each pixel point in the blurred image; and taking the reference value of each pixel point in the blurred image as the gray value of the corresponding pixel point to finish the deblurring process of the blurred image.
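A sketch of these deblurring steps, under the assumption that the reference degrees are normalized by their sum and the reference threshold takes the example value 0.5 (the claim fixes neither); all names are hypothetical:

```python
import numpy as np

def deblur(blurred, references, ref_degrees, ref_threshold=0.5):
    """Replace the blurred image's grays by a reference-degree weighted mean.

    blurred     : (H, W) gray image flagged as blurred
    references  : list of (H, W) clear images
    ref_degrees : reference degree between the blurred image and each clear image
    """
    d = np.asarray(ref_degrees, float)
    norm = d / d.sum()                  # initial normalized reference degree
    keep = norm > ref_threshold         # clear images kept as reference images
    refs = [r for r, k in zip(references, keep) if k]
    w = d[keep] / d[keep].sum()         # renormalize over the kept references
    # initial reference value of each pixel: weight * reference gray;
    # final reference value: mean over all reference images
    stack = np.stack([wi * np.asarray(r, float) for wi, r in zip(w, refs)])
    return stack.mean(axis=0)
```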
2. The track vision recording system of a rocking motion mechanism of claim 1, wherein the method for obtaining the fixed pixel point comprises:
edge detection is carried out on each frame of cutting device image to obtain edge pixel points, and the edge pixel points of each frame of cutting device area are mapped to the same frame of image to obtain coincident points; and taking the corresponding edge pixel point of the coincident point in each frame of cutting device image as the fixed pixel point.
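The coincident-point computation of claim 2 reduces to intersecting the per-frame edge masks. A NumPy sketch, assuming edge detection (e.g. Canny) has already produced a boolean mask per frame:

```python
import numpy as np

def fixed_pixel_points(edge_masks):
    """Coincident points: pixels that are edge points in every cutter image."""
    coincident = np.logical_and.reduce([np.asarray(m, bool) for m in edge_masks])
    return np.argwhere(coincident)  # (row, col) of each fixed pixel point
```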
3. The track vision recording system of a rocking motion mechanism of claim 1, wherein the periodic acquisition method comprises:
mapping the cutting angles of all the cutting device images to a two-dimensional coordinate system according to time sequence to obtain a cutting angle change curve; the abscissa of the two-dimensional coordinate system is time, and the ordinate is a cutting angle; taking the end points and the trough points of the cutting angle change curve as cutting points, and taking the time intervals corresponding to curve segments among the adjacent three cutting points as a period; each period has three cutting points, wherein the first cutting point is a starting point of the period, and the third cutting point is an ending point of the period; the end point of one cycle is the start point of the next cycle.
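The period division of claim 3 can be sketched as follows, assuming trough points are simple local minima of the sampled cutting-angle curve (the claim does not specify how troughs are detected):

```python
def split_periods(times, angles):
    """Divide the reciprocating motion into periods.

    Cut points are the two curve endpoints plus every trough point of the
    cutting-angle curve; each period spans three adjacent cut points, and the
    end point of one period is the start point of the next.
    """
    troughs = [i for i in range(1, len(angles) - 1)
               if angles[i] < angles[i - 1] and angles[i] < angles[i + 1]]
    cuts = [0] + troughs + [len(angles) - 1]
    return [(times[cuts[i]], times[cuts[i + 2]])
            for i in range(0, len(cuts) - 2, 2)]
```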
4. The track vision recording system of a rocking motion mechanism of claim 1, wherein the method for obtaining the blur feature value comprises:
mapping a preset still image into each frame of cutting device image, and taking the overlapped part as a rotation area of the corresponding cutting device image; removing the rotating area of the cutting device image to obtain a non-overlapped area of each frame of cutting device image;
taking the absolute value of the gray level difference value between each pixel point in the rotating area in the cutting device image and the corresponding pixel point in the preset still image as the fuzzy difference of the corresponding pixel point in the rotating area; adding up the fuzzy differences of each pixel point in the rotation area to be used as the smoothed degree; and taking the product of the smoothed degree and the number of pixel points in the non-overlapped area as the fuzzy characteristic value of the cutting device image.
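A sketch of claim 4's fuzzy characteristic value, assuming the rotation area is supplied as a boolean mask over the image; the function name is illustrative:

```python
import numpy as np

def blur_feature(cutter_img, still_img, rotation_mask):
    """Fuzzy characteristic value of one cutter image.

    rotation_mask: True where the cutter image overlaps the preset still image.
    """
    img = np.asarray(cutter_img, float)
    still = np.asarray(still_img, float)
    # blur difference of each pixel in the rotation area
    blur_diff = np.abs(img[rotation_mask] - still[rotation_mask])
    smoothed_degree = blur_diff.sum()       # accumulated blur differences
    n_non_overlap = (~rotation_mask).sum()  # pixels outside the rotation area
    return smoothed_degree * n_non_overlap
```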
5. The track vision recording system of a rocking motion mechanism of claim 2, wherein the cycle similarity obtaining method comprises:
setting a fuzzy judgment threshold value; normalizing the fuzzy characteristic value of each frame of cutting device image to obtain a normalized fuzzy characteristic value; taking the cutting device image with the normalized fuzzy characteristic value greater than or equal to the fuzzy judgment threshold value as a blurred image, and the cutting device image with the normalized fuzzy characteristic value smaller than the fuzzy judgment threshold value as the clear image;
selecting any two periods as a period group; selecting any clear image in one period of the period group as a target image, and taking a clear image corresponding to the maximum correlation characteristic value of the target image in the other period as a correlation image of the target image; the target image and the related image form a matching pair;
accumulating the correlation characteristic values between the two clear images in the matching pair in any two periods to obtain an initial period correlation degree; performing negative correlation mapping on absolute values of differences of the number of images in two periods in the period group to obtain the number similarity of the periodic images; taking the product of the reciprocal of the number similarity of the periodic images and the initial period correlation as the period similarity between two corresponding periods.
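Claim 5's period similarity, implemented literally (including the reciprocal of the image-count similarity, as the claim states); `exp(-x)` is again an assumed negative-correlation mapping, and matching pairs are taken as each row's maximum correlation:

```python
import numpy as np

def period_similarity(corr, n_imgs_a, n_imgs_b):
    """Similarity between two periods.

    corr: (A, B) correlation feature values between the clear images of the
          two periods; each clear image in period A is matched with its
          best-correlated clear image in period B.
    """
    corr = np.asarray(corr, float)
    # initial period correlation: sum of each matching pair's correlation
    initial = corr.max(axis=1).sum()
    # image-count similarity via a negative-correlation mapping
    count_sim = np.exp(-abs(n_imgs_a - n_imgs_b))
    # product of the reciprocal of the count similarity and the initial
    # correlation, per claim 5
    return initial / count_sim
```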
6. The system for visual recording of trajectories of rocking motion mechanisms of claim 5, wherein the reference degree obtaining method comprises:
performing negative correlation mapping on the normalized fuzzy characteristic value of each fuzzy image to obtain a fuzzy weight; and taking the product of the correlation characteristic value between each blurred image and each clear image, the period similarity between the two periods and the blurring weight value of the clear image as the reference degree between the corresponding blurred image and the corresponding clear image.
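Claim 6's reference degree is then a three-factor product; the negative-correlation mapping `exp(-x)` for the fuzzy weight is an assumption, as the claim leaves it open:

```python
import numpy as np

def reference_degree(corr, period_sim, norm_blur_of_clear):
    """Reference degree between a blurred image and a clear image.

    corr               : correlation feature value between the two images
    period_sim         : similarity between their two periods
    norm_blur_of_clear : normalized fuzzy characteristic value of the clear image
    """
    blur_weight = np.exp(-norm_blur_of_clear)  # sharper image -> larger weight
    return corr * period_sim * blur_weight
```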
7. The track vision recording system of a rocking motion mechanism of claim 5, wherein the specific method of recording is:
deblurring the blurred image during the reciprocating motion; carrying out corner detection on each frame of cutting device image in the reciprocating motion process, wherein each frame of cutting device image obtains at least two corner points; selecting the corner point farthest from the fixed pixel point as a monitoring point of the image of the corresponding cutting device;
and recording the positions of the monitoring points of the images of each frame of cutting device in the reciprocating motion process, and finishing the recording of the motion trail of the swinging motion mechanism.
CN202310408989.9A 2023-04-18 2023-04-18 Track vision recording system of swinging motion mechanism Active CN116128769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310408989.9A CN116128769B (en) 2023-04-18 2023-04-18 Track vision recording system of swinging motion mechanism


Publications (2)

Publication Number Publication Date
CN116128769A CN116128769A (en) 2023-05-16
CN116128769B true CN116128769B (en) 2023-06-23

Family

ID=86312150

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310408989.9A Active CN116128769B (en) 2023-04-18 2023-04-18 Track vision recording system of swinging motion mechanism

Country Status (1)

Country Link
CN (1) CN116128769B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117671014B (en) * 2024-02-02 2024-04-19 泰安大陆医疗器械有限公司 Mechanical arm positioning grabbing method and system based on image processing

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111275626A (en) * 2018-12-05 2020-06-12 深圳市炜博科技有限公司 Video deblurring method, device and equipment based on ambiguity

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103489201B (en) * 2013-09-11 2016-10-05 华南理工大学 Method for tracking target based on motion blur information
US10062151B2 (en) * 2016-01-21 2018-08-28 Samsung Electronics Co., Ltd. Image deblurring method and apparatus
WO2020123999A1 (en) * 2018-12-13 2020-06-18 Diveplane Corporation Synthetic data generation in computer-based reasoning systems
CN113992847A (en) * 2019-04-22 2022-01-28 深圳市商汤科技有限公司 Video image processing method and device
US10989833B2 (en) * 2019-09-24 2021-04-27 Deere & Company Systems and methods for monitoring grain loss
CN113538294B (en) * 2021-08-20 2023-09-12 西安交通大学 Method and system for eliminating image motion blur
CN114820773B (en) * 2022-06-26 2022-09-27 山东济宁运河煤矿有限责任公司 Silo transport vehicle carriage position detection method based on computer vision
CN115659160B (en) * 2022-12-28 2023-06-16 北京中航路通科技有限公司 Data quality measurement method for digital twin model optimization



Similar Documents

Publication Publication Date Title
CN110334635B (en) Subject tracking method, apparatus, electronic device and computer-readable storage medium
CN111753577B (en) Apple identification and positioning method in automatic picking robot
CN109389086B (en) Method and system for detecting unmanned aerial vehicle image target
CN110610150B (en) Tracking method, device, computing equipment and medium of target moving object
CN116128769B (en) Track vision recording system of swinging motion mechanism
CN112598713A (en) Offshore submarine fish detection and tracking statistical method based on deep learning
US20200057886A1 (en) Gesture recognition method and apparatus, electronic device, and computer-readable storage medium
CN109684941B (en) Litchi fruit picking area division method based on MATLAB image processing
CN110992288B (en) Video image blind denoising method used in mine shaft environment
CN110443247A (en) A kind of unmanned aerial vehicle moving small target real-time detecting system and method
CN112116633A (en) Mine drilling counting method
CN115131346B (en) Fermentation tank processing procedure detection method and system based on artificial intelligence
CN114140384A (en) Transverse vibration image recognition algorithm for hoisting steel wire rope based on contour fitting and centroid tracking
CN112464933A (en) Intelligent recognition method for small dim target of ground-based staring infrared imaging
CN117218161B (en) Fish track tracking method and system in fish tank
CN111476804A (en) Method, device and equipment for efficiently segmenting carrier roller image and storage medium
CN110378934B (en) Subject detection method, apparatus, electronic device, and computer-readable storage medium
CN113781523A (en) Football detection tracking method and device, electronic equipment and storage medium
CN112884803B (en) Real-time intelligent monitoring target detection method and device based on DSP
CN111611953B (en) Target feature training-based oil pumping unit identification method and system
CN113505629A (en) Intelligent storage article recognition device based on light weight network
CN114693556B (en) High-altitude parabolic frame difference method moving object detection and smear removal method
CN115512263A (en) Dynamic visual monitoring method and device for falling object
CN110765991B (en) High-speed rotating electrical machine fuse real-time detection system based on vision
Woods et al. Development of a pineapple fruit recognition and counting system using digital farm image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant