CN117351444A - Line inspection method and device, computer readable storage medium and sports equipment - Google Patents

Line inspection method and device, computer readable storage medium and sports equipment Download PDF

Info

Publication number
CN117351444A
Authority
CN
China
Prior art keywords
image
advancing
predicted
area image
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311243415.7A
Other languages
Chinese (zh)
Inventor
张申
焦继超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202311243415.7A priority Critical patent/CN117351444A/en
Publication of CN117351444A publication Critical patent/CN117351444A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/457Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by analysing connectivity, e.g. edge linking, connected component analysis or slices

Abstract

The application belongs to the technical field of motion control, and particularly relates to a line inspection method and device, a computer-readable storage medium, and a motion device. The method includes: obtaining a visible area image at the current moment, the visible area image being an image containing lane lines; predicting the advancing track of the motion device with a preset Kalman filter to obtain a predicted advancing track of the motion device; determining a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image; and performing lane line identification on the predicted advancing area image to determine the lane lines. In this way, the advancing track of the motion device can be predicted with the Kalman filter, and the predicted advancing area in the visible area image can be obtained from the predicted advancing track, so that the range of the visible area image is narrowed, interference from factors such as illumination and background in the visible area image is reduced, and the accuracy of the line inspection method is improved.

Description

Line inspection method and device, computer-readable storage medium, and motion device
Technical Field
The application belongs to the technical field of motion control, and particularly relates to a line inspection method and device, a computer-readable storage medium, and a motion device.
Background
The application of science and technology in teaching has become an important part of modern education. In such teaching, motion devices (e.g., smart carts) serve as an important carrier; in particular, more and more science and technology museums use motion devices such as carts to demonstrate technological progress to students and teenagers.
For a motion device to perform autonomous line inspection, it is necessary to identify the lane line ahead and determine the advancing direction of the motion device. However, existing line inspection methods are prone to false detection when affected by factors such as illumination and background, so their accuracy is not high.
Disclosure of Invention
In view of this, the embodiments of the present application provide a line inspection method and device, a computer-readable storage medium, and a motion device, so as to solve the problem in the prior art that line inspection methods are prone to false detection when affected by factors such as illumination and background, resulting in low accuracy.
A first aspect of the embodiments of the present application provides a line inspection method applied to a motion device. The method may include:
obtaining a visible area image at the current moment, the visible area image being an image containing lane lines;
predicting the advancing track of the motion device with a preset Kalman filter to obtain a predicted advancing track of the motion device;
determining a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image;
performing lane line identification on the predicted advancing area image to determine the lane lines; and
controlling the motion device to move according to the lane lines.
In a specific implementation manner of the first aspect, the determining a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image may include:
determining the predicted advancing area according to the predicted advancing track and the current position of the motion device; and
setting the pixel points of a background area in the visible area image to a preset pixel value to obtain the predicted advancing area image, the background area being the area of the visible area image other than the predicted advancing area.
In a specific implementation manner of the first aspect, the performing lane line identification on the predicted advancing area image to determine the lane lines may include:
performing morphological processing on the predicted advancing area image to obtain a first identification image;
performing connected-domain calculation on the first identification image to obtain a second identification image; and
determining the edge lines of the lane lines according to the second identification image.
In a specific implementation manner of the first aspect, the performing morphological processing on the predicted advancing area image to obtain a first identification image may include:
performing dilation processing on the predicted advancing area image to obtain a dilated image;
performing erosion processing on the predicted advancing area image to obtain an eroded image; and
subtracting the eroded image from the dilated image to obtain the first identification image.
In a specific implementation manner of the first aspect, the controlling the motion device to move according to the lane lines may include:
selecting a set of pixel points to be fitted according to the edge lines;
performing curve fitting on the set of pixel points to be fitted to obtain a fitted curve equation;
determining the advancing direction of the motion device according to the fitted curve equation; and
controlling the motion device to move in the advancing direction.
In a specific implementation manner of the first aspect, the determining the advancing direction of the motion device according to the fitted curve equation may include:
calculating the curve slope corresponding to the fitted curve equation; and
determining the direction corresponding to the curve slope as the advancing direction of the motion device.
In a specific implementation manner of the first aspect, after the obtaining of the visible area image at the current moment, the method may further include:
performing Gaussian blur on the visible area image to obtain a Gaussian-blurred visible area image.
A second aspect of the embodiments of the present application provides a line inspection device applied to a motion device. The device may include:
an image acquisition module, configured to obtain a visible area image at the current moment, the visible area image being an image containing lane lines;
a track prediction module, configured to predict the advancing track of the motion device with a preset Kalman filter to obtain a predicted advancing track of the motion device;
an image determining module, configured to determine a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image;
a lane line identification module, configured to perform lane line identification on the predicted advancing area image to determine the lane lines; and
a motion control module, configured to control the motion device to move according to the lane lines.
In a specific implementation manner of the second aspect, the image determining module may include:
a predicted advancing area determining submodule, configured to determine the predicted advancing area according to the predicted advancing track and the current position of the motion device; and
a predicted advancing area image determining submodule, configured to set the pixel points of a background area in the visible area image to a preset pixel value to obtain the predicted advancing area image, the background area being the area of the visible area image other than the predicted advancing area.
In a specific implementation manner of the second aspect, the lane line identification module may include:
a morphological processing submodule, configured to perform morphological processing on the predicted advancing area image to obtain a first identification image;
a connected-domain calculation submodule, configured to perform connected-domain calculation on the first identification image to obtain a second identification image; and
an edge line determining submodule, configured to determine the edge lines of the lane lines according to the second identification image.
In a specific implementation manner of the second aspect, the morphological processing submodule may include:
a dilation processing unit, configured to perform dilation processing on the predicted advancing area image to obtain a dilated image;
an erosion processing unit, configured to perform erosion processing on the predicted advancing area image to obtain an eroded image; and
an image subtraction unit, configured to subtract the eroded image from the dilated image to obtain the first identification image.
In a specific implementation manner of the second aspect, the motion control module may include:
a pixel point selecting submodule, configured to select a set of pixel points to be fitted according to the edge lines;
a curve fitting submodule, configured to perform curve fitting on the set of pixel points to be fitted to obtain a fitted curve equation;
an advancing direction determining submodule, configured to determine the advancing direction of the motion device according to the fitted curve equation; and
a motion control submodule, configured to control the motion device to move in the advancing direction.
In a specific implementation manner of the second aspect, the advancing direction determining submodule may include:
a curve slope calculation unit, configured to calculate the curve slope corresponding to the fitted curve equation; and
an advancing direction determining unit, configured to determine the direction corresponding to the curve slope as the advancing direction of the motion device.
In a specific implementation manner of the second aspect, the line inspection device may further include:
a Gaussian blur module, configured to perform Gaussian blur on the visible area image to obtain a Gaussian-blurred visible area image.
A third aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any of the above line inspection methods.
A fourth aspect of the embodiments of the present application provides a motion device including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of any of the above line inspection methods when executing the computer program.
A fifth aspect of the embodiments of the present application provides a computer program product which, when run on a motion device, causes the motion device to perform the steps of any of the above line inspection methods.
Compared with the prior art, the embodiments of the present application have the following beneficial effects. The embodiments obtain a visible area image at the current moment, the visible area image being an image containing lane lines; predict the advancing track of the motion device with a preset Kalman filter to obtain a predicted advancing track of the motion device; determine a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image; and perform lane line identification on the predicted advancing area image to determine the lane lines. In this way, the advancing track of the motion device can be predicted with the Kalman filter, and the predicted advancing area in the visible area image can be obtained from the predicted advancing track, so that the range of the visible area image is narrowed, interference from factors such as illumination and background in the visible area image is reduced, and the accuracy of the line inspection method is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings may be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic diagram of an application scenario of trolley line inspection;
FIG. 2 is a flow chart of one embodiment of a line inspection method according to the embodiments of the present application;
FIG. 3 is a schematic illustration of morphological processing of a predicted advancing area image;
FIG. 4 is a schematic diagram of one embodiment of a line inspection device according to the embodiments of the present application;
Fig. 5 is a schematic block diagram of a motion device according to an embodiment of the present application.
Detailed Description
In order to make the objects, features, and advantages of the present application more obvious and understandable, the technical solutions of the embodiments of the present application are described clearly and completely below with reference to the drawings. Obviously, the embodiments described below are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art from the present disclosure without inventive effort fall within the scope of the present disclosure.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when", "once", "in response to a determination", or "in response to detection", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
In addition, in the description of the present application, the terms "first," "second," "third," etc. are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
The application of science and technology in teaching has become an important part of modern education. In such teaching, motion devices (e.g., smart carts) serve as an important carrier; in particular, more and more science and technology museums use motion devices such as carts to demonstrate technological progress to students and teenagers. Referring to fig. 1, in practical application, trolley line inspection may be used for such demonstrations; specifically, lane lines may be laid out in a preset area, and the trolley may be controlled to move along the lane lines.
For a motion device to perform autonomous line inspection, it is necessary to identify the lane line ahead and determine the advancing direction of the motion device. However, existing line inspection methods are prone to false detection when affected by factors such as illumination and background, so their accuracy is not high.
In view of this, the embodiments of the present application provide a line inspection method and device, a computer-readable storage medium, and a motion device, so as to solve the problem in the prior art that line inspection methods are prone to false detection when affected by factors such as illumination and background, resulting in low accuracy.
It should be noted that the method of the present application may be applied to a motion device, which may include, but is not limited to, motion devices commonly used in the prior art, such as smart carts and unmanned aerial vehicles.
Referring to fig. 2, an embodiment of a line inspection method in an embodiment of the present application may include:
step S201, obtaining a visible area image at the current moment.
The visible area image is an image containing lane lines.
It should be understood that the motion device may be provided with an image acquisition device (e.g., a camera) for acquiring images of the surrounding environment of the motion device; the acquired images may contain information such as roads and obstacles, and can thus support functions of the motion device such as movement, obstacle avoidance, and exploration.
In the embodiment of the present application, the image acquisition device preset on the motion device may be used to capture an image of the surrounding environment at the current moment, thereby obtaining a visible area image containing the lane lines at the current moment. The current moment is the acquisition time of the current frame of the visible area image, and the visible area image may be an RGB image.
It should be understood that the visible area image may contain impurities other than the lane lines, and these impurities may differ obviously from the background; if the visible area image is processed directly, these impurities may cause interference and thus affect the accuracy of the line inspection method.
Therefore, in the embodiment of the present application, after the visible area image at the current moment is acquired, Gaussian blur may further be applied to the visible area image to obtain a Gaussian-blurred visible area image; this reduces the difference between the impurities and the background, reduces the interference of the impurities, and improves the accuracy of the line inspection method.
It can be appreciated that other image denoising methods in the prior art may also be selected to process the visible area image according to actual needs, including, but not limited to, any common image denoising method such as mean blur or median blur.
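As an illustrative sketch of this denoising step (the image contents, noise pattern, and `sigma` below are hypothetical, and SciPy's `gaussian_filter` stands in for whatever blur implementation the device actually uses):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Hypothetical visible-area image: a bright lane-line stripe on a dark
# background, with sparse bright specks standing in for "impurities".
rng = np.random.default_rng(0)
image = np.zeros((40, 40))
image[:, 18:22] = 1.0                        # lane-line stripe
noisy = image.copy()
noisy[rng.random((40, 40)) < 0.02] = 1.0     # scattered impurity pixels

# Gaussian blur attenuates isolated specks far more than the wide stripe,
# shrinking the contrast between impurities and background.
blurred = gaussian_filter(noisy, sigma=1.5)
```

Mean blur (`uniform_filter`) or median blur (`median_filter`) from the same module could be substituted, matching the alternatives mentioned above.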
Step S202, predicting the advancing track of the motion device with a preset Kalman filter to obtain a predicted advancing track of the motion device.
The visible area image may contain environmental or object information other than the lane lines; for example, it may contain environmental information such as trees and sky, or object information such as standing signs, tables, and chairs. Therefore, in the embodiment of the present application, the advancing track of the motion device may be predicted with a preset Kalman filter to obtain a predicted advancing track, and the range of the visible area image may be narrowed according to the predicted advancing track, so that the lane lines can be located more accurately.
The Kalman filter may be set up when, or before, the line inspection method of the embodiment of the present application is first executed. Specifically, the motion device can be modeled in advance to obtain a state transition matrix and a measurement matrix, where the state transition matrix describes how the position of the motion device changes over time, and the measurement matrix describes how the position of the motion device is derived from measured position data (e.g., samples from an associated positioning sensor). The state estimate, observation matrix, measurement noise, and process noise of the Kalman filter are then initialized. The state estimate is the predicted position of the motion device, the observation matrix holds the measured position data, the measurement noise describes errors in the measurement process, and the process noise describes errors in the prediction process. Specifically, the initial values of the state estimate and the observation matrix may be set according to the starting position of the motion device, and the measurement noise and the process noise may be set according to the actual scene; in general, they may be initialized to relatively large values.
Here, the historical motion track of the motion device may be acquired to obtain each historical position of the motion device, and the starting position may be set as the initial value of the state estimate. When the Kalman filter performs advancing-track prediction, the position of the motion device at time k+1 may be predicted from the state transition matrix and the posterior state estimate at time k, yielding the prior state estimate at time k+1; the prior state estimate may then be updated with the measurement at time k+1 to obtain the posterior state estimate of the motion device at time k+1, and the measurement noise and the process noise may be updated accordingly to obtain more accurate values. Repeating this process yields the posterior state estimate of the motion device at the current moment, from which the prior state estimate at the next moment can be predicted; this prior state estimate serves as the predicted advancing position at the next moment, giving the predicted advancing track. The next moment is the acquisition time of the next frame of the visible area image.
In a specific implementation manner of the embodiment of the present application, after the predicted advancing position at the next moment is obtained, the observation matrix at the next moment may be obtained from the observed position data at the next moment, so that the posterior state estimate at the next moment can be updated; in this way, when the advancing track is predicted at the next moment, that posterior state estimate can be used, and the line inspection method of the present application can be executed continuously in the subsequent process.
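The predict/update cycle described above can be sketched as a minimal constant-velocity Kalman filter over one spatial coordinate; the matrices, noise values, and position measurements below are hypothetical illustrations, not parameters from the application:

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition matrix (position, velocity)
H = np.array([[1.0, 0.0]])              # measurement matrix: only position is observed
Q = np.eye(2) * 1e-3                    # process noise covariance
R = np.array([[0.5]])                   # measurement noise covariance

x = np.array([[0.0], [0.0]])            # state estimate, initialized at the start position
P = np.eye(2) * 10.0                    # initial uncertainty set to a larger value

for z in [1.0, 2.1, 2.9, 4.2, 5.0]:     # observed positions, one per frame
    # Predict: prior state estimate at k+1 from the posterior at k.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: fold in the measurement to get the posterior estimate at k+1.
    innovation = np.array([[z]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ innovation
    P = (np.eye(2) - K @ H) @ P

# The prior estimate for the next frame serves as the predicted advancing position.
predicted_next = float((F @ x)[0, 0])
```

With measurements advancing by roughly one unit per frame, the predicted position for the next frame lands ahead of the last observation, as the prediction step extrapolates along the estimated velocity.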
Step S203, determining a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image.
In this embodiment of the present application, specifically, the range of the visible area image may be narrowed according to the predicted advancing position at the next moment and the current position of the motion device, so as to obtain an image of the area between the current position of the motion device and the predicted advancing position at the next moment.
Specifically, the position at which the motion device may appear in the next frame of the visible area image may be determined according to the predicted advancing track, and the area between that position and the current position of the motion device may be determined as the predicted advancing area of the motion device. Then, the pixel points of the background area in the visible area image at the current moment may be set to a preset pixel value to obtain the predicted advancing area image, the background area being the area of the visible area image other than the predicted advancing area.
It should be noted that, in order to reduce the interference of the background area with lane line identification, the background area may be set to black, i.e., the preset pixel value may be (0, 0, 0); in this way, the visible area image is focused on the predicted advancing area, which helps to improve the accuracy and efficiency of lane line identification.
It can be understood that the pixel points of the background area may also be set to other values according to actual needs, which is not particularly limited in this application.
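A minimal sketch of this background-blanking step (the frame size, region rectangle, and pixel values below are hypothetical):

```python
import numpy as np

visible = np.full((60, 80, 3), 128, dtype=np.uint8)   # stand-in RGB visible-area image
mask = np.zeros((60, 80), dtype=bool)
mask[20:60, 25:55] = True                             # hypothetical predicted advancing area

# Background pixels (everything outside the predicted advancing area)
# are set to the preset value (0, 0, 0), i.e. black.
predicted_region_image = visible.copy()
predicted_region_image[~mask] = 0
```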
Step S204, performing lane line identification on the predicted advancing area image to determine the lane lines.
In the embodiment of the present application, morphological processing may be performed on the predicted advancing area image to obtain a first identification image. Specifically, referring to fig. 3, dilation processing may be performed on the predicted advancing area image to obtain a dilated image, and erosion processing may be performed on the predicted advancing area image to obtain an eroded image; then, the eroded image may be subtracted from the dilated image and the absolute value of the result taken to obtain the first identification image. The first identification image may contain the two initial edge lines of the lane line.
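The dilation-minus-erosion step is a morphological gradient; the sketch below applies it to a hypothetical stripe with SciPy's grey-scale morphology operators (a stand-in for whatever implementation the device uses), leaving only the stripe's two edge bands:

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

img = np.zeros((9, 15), dtype=np.uint8)
img[:, 5:10] = 255                          # lane line as a vertical stripe

dilated = grey_dilation(img, size=(3, 3))   # stripe grows by one pixel per side
eroded = grey_erosion(img, size=(3, 3))     # stripe shrinks by one pixel per side
# Subtract and take the absolute value: only the edge bands survive.
edges = np.abs(dilated.astype(int) - eroded.astype(int)).astype(np.uint8)
```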
It should be noted that the first identification image may still, with small probability, contain impurities, which could cause misjudgment of the lane line edge lines. Since the edge lines of the lane lines are continuous areas of considerable length, image binarization may be performed on the first identification image to obtain a binary image, and connected-domain calculation may be performed on the binary image to obtain a second identification image; in this way, impurities with small connected domains can be filtered out, and the lane line edge lines, whose connected domains are large, can be obtained.
In a specific implementation manner of the embodiment of the present application, the area of each connected domain in the binary image may be calculated; connected domains with areas smaller than a preset area threshold may be filtered out, and connected domains with areas greater than or equal to the preset area threshold may be retained, so as to obtain the second identification image.
It should be understood that the edge lines of the lane lines in the second identification image are white, and the areas other than the edge lines are black.
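A sketch of this connected-domain filter (the image, blob sizes, and area threshold are illustrative; SciPy's `label` stands in for the connected-component routine):

```python
import numpy as np
from scipy.ndimage import label

binary = np.zeros((20, 20), dtype=bool)
binary[2:18, 4] = True          # long edge-line segment (area 16)
binary[5, 14] = True            # single-pixel impurity (area 1)

labels, n = label(binary)       # connected-component labelling
area_threshold = 5              # hypothetical preset area threshold
second_image = np.zeros_like(binary)
for i in range(1, n + 1):
    component = labels == i
    if component.sum() >= area_threshold:   # keep only large components
        second_image |= component
```

The long segment survives while the isolated impurity pixel is discarded.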
Since some steps of the above processing operate on an RGB image with three channels (R, G, and B), they may occupy large resources and take a long time. Therefore, to increase the calculation speed, the RGB image may be converted to grayscale to obtain a gray image containing only a single channel, and subsequent processing may be performed on the gray image.
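The grayscale conversion can be sketched with the common luminance weights (the specific weights are an assumption; the application does not fix them):

```python
import numpy as np

rgb = np.zeros((4, 4, 3), dtype=np.uint8)
rgb[..., 0] = 200   # pure-red stand-in frame

# Weighted sum of the three channels collapses RGB to a single channel.
gray = (0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]).astype(np.uint8)
```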
According to the second identification image, the edge lines of the lane lines can be determined accurately; specifically, the white areas in the second identification image may be determined as the edge lines of the lane lines.
Step S205, controlling the motion device to move according to the lane lines.
In the embodiment of the present application, the advancing direction of the motion device may be determined according to the lane lines, and the motion device may be controlled to move in the advancing direction.
Specifically, a set of pixel points to be fitted may be selected according to the edge lines of the lane lines, where each pixel point to be fitted lies on an edge line of a lane line; curve fitting may be performed on the set of pixel points to be fitted to obtain a fitted curve equation; and the advancing direction of the motion device may be determined according to the fitted curve equation.
It should be understood that, since the lane line has a high probability of being located in the middle region of the visible area image, the edge lines of the lane line may be searched for from the middle of the second identification image towards the left and the right, respectively. Once an edge line of the lane line is found, a preset number of pixel points to be fitted in the edge line may be selected from bottom to top, so as to obtain the pixel point set to be fitted. The preset number may be set according to actual needs, and may preferably be set to 100.
It can be understood that the method for searching for the edge line of the lane line and the method for selecting the pixel point set to be fitted may both be set according to actual needs, which is not specifically limited in this application.
After the pixel point set to be fitted is selected, curve fitting may be performed on the pixel points to be fitted to obtain a fitted curve equation. The curve fitting method may be any common curve fitting method in the prior art, such as the least squares method or maximum likelihood estimation, which is not limited in this application.
Then, the curve slope of the fitted curve equation at the starting pixel point is calculated; the starting pixel point is the pixel point in the pixel point set to be fitted that is closest to the lower boundary of the second identification image. Specifically, the fitted curve equation may be differentiated at the starting pixel point to obtain the curve slope at that point, and the direction corresponding to this curve slope may be determined as the advancing direction of the motion device.
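As an illustrative sketch, the fitting and slope calculation might look as follows. Fitting the column coordinate x as a polynomial in the row coordinate y, and the second-order polynomial, are assumptions of this sketch rather than details fixed by the description above:

```python
import numpy as np

def heading_from_edge_points(points, order=2):
    """Fit x = f(y) to edge-line pixels (y = row, x = column) by
    least squares, then evaluate df/dy at the starting pixel (the
    point with the largest row index, i.e. nearest the image's
    lower boundary) to obtain the local slope of the lane edge."""
    pts = np.asarray(points, dtype=np.float64)
    ys, xs = pts[:, 0], pts[:, 1]
    coeffs = np.polyfit(ys, xs, order)          # fitted curve equation
    start_y = ys.max()                          # starting pixel row
    slope = np.polyval(np.polyder(coeffs), start_y)  # derivative at start
    angle = np.arctan(slope)  # heading relative to straight ahead
    return slope, angle
```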
It should be noted that, at the next moment, line inspection can be continued according to the method, and the above process is repeated until the line inspection task is completed.
In summary, the embodiment of the present application acquires the visible area image at the current moment, where the visible area image is an image containing a lane line; predicts the advancing track of the motion device by using a preset Kalman filter to obtain a predicted advancing track of the motion device; determines a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image; and performs lane line identification on the predicted advancing area image to determine the lane line. In the embodiment of the present application, the advancing track of the motion device can be predicted by using the Kalman filter, and the predicted advancing area in the visible area image can be obtained according to the predicted advancing track, so that the range of the visible area image to be processed is narrowed, interference from factors such as illumination and background in the visible area image is reduced, and the accuracy of the line inspection method is improved.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Corresponding to the line inspection method described in the above embodiments, fig. 4 shows a structural block diagram of a line inspection device provided in an embodiment of the present application.
In this embodiment of the present application, the line inspection device may be applied to a motion device, and the line inspection device may include:
an image acquisition module 401, configured to acquire a visible region image at a current moment; the visible area image is an image containing lane lines;
the track prediction module 402 is configured to predict a forward track of the motion device by using a preset kalman filter, so as to obtain a predicted forward track of the motion device;
an image determining module 403, configured to determine a predicted advancing area in the visible area image according to the predicted advancing track, so as to obtain a predicted advancing area image;
the lane line recognition module 404 is configured to perform lane line recognition on the predicted advancing area image and determine the lane line;
and the motion control module 405 is configured to control the motion device to move according to the lane line.
In a specific implementation manner of the embodiment of the present application, the image determining module may include:
a predicted advancing area determining sub-module, configured to determine the predicted advancing area according to the predicted advancing track and the current position of the motion device;
and a predicted advancing area image determining sub-module, configured to set pixel points of a background area in the visible area image to a preset pixel value, so as to obtain the predicted advancing area image; the background area is the area of the visible area image other than the predicted advancing area.
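The background masking performed by this sub-module can be sketched as follows; representing the predicted advancing area as a boolean mask and using 0 as the default preset pixel value are assumptions of this sketch:

```python
import numpy as np

def mask_background(image, region_mask, preset_value=0):
    """Return a copy of `image` in which every pixel outside the
    predicted advancing area (where region_mask is False) is set to
    preset_value, leaving only the region of interest for later
    lane line identification."""
    out = image.copy()
    out[~region_mask] = preset_value
    return out
```

Boolean indexing makes the same function work for both single-channel and multi-channel images.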
In a specific implementation manner of the embodiment of the present application, the lane line identification module may include:
the morphological processing sub-module is used for performing morphological processing on the predicted advancing area image to obtain a first identification image;
the connected domain calculation sub-module is used for carrying out connected domain calculation on the first identification image to obtain a second identification image;
and the edge line determining sub-module is used for determining the edge line of the lane line according to the second identification image.
In a specific implementation manner of the embodiment of the present application, the morphological processing sub-module may include:
the dilation processing unit is used for performing dilation processing on the predicted advancing area image to obtain a dilated image;
the erosion processing unit is used for performing erosion processing on the predicted advancing area image to obtain an eroded image;
and the image subtraction unit is used for subtracting the eroded image from the dilated image to obtain the first identification image.
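This dilation/erosion/subtraction sequence is the classical morphological gradient, which keeps only boundary pixels. A minimal single-channel sketch follows, with an assumed 3x3 square structuring element (in practice `cv2.dilate` and `cv2.erode`, or `cv2.morphologyEx` with `cv2.MORPH_GRADIENT`, would typically be used):

```python
import numpy as np

def morph_gradient(binary):
    """Morphological gradient with a 3x3 square structuring element:
    dilation takes the neighbourhood maximum, erosion the minimum,
    and their difference keeps only boundary pixels."""
    h, w = binary.shape
    padded = np.pad(binary, 1, mode="edge")
    dilated = np.zeros_like(binary)
    eroded = np.full_like(binary, 255)
    for dy in range(3):
        for dx in range(3):
            window = padded[dy:dy + h, dx:dx + w]
            dilated = np.maximum(dilated, window)
            eroded = np.minimum(eroded, window)
    # dilated >= eroded everywhere, so uint8 subtraction is safe
    return dilated - eroded
```

On a solid white block, the interior cancels to zero and only the outline remains.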
In a specific implementation manner of the embodiment of the present application, the motion control module may include:
the pixel point selecting submodule is used for selecting a pixel point set to be fitted according to the edge line;
the curve fitting sub-module is used for carrying out curve fitting on the pixel point set to be fitted to obtain a fitting curve equation;
the advancing direction determining sub-module is used for determining the advancing direction of the motion device according to the fitted curve equation;
and the motion control sub-module is used for controlling the motion device to move according to the advancing direction.
In a specific implementation manner of the embodiment of the present application, the advancing direction determining sub-module may include:
the curve slope calculation unit is used for calculating the curve slope corresponding to the fitted curve equation;
and an advancing direction determining unit, configured to determine the direction of the curve slope as the advancing direction of the motion device.
In a specific implementation manner of the embodiment of the present application, the line inspection device may further include:
and the Gaussian blur module is used for carrying out Gaussian blur on the visible area image to obtain the visible area image after Gaussian blur.
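The Gaussian blur performed by this module can be sketched as a separable convolution for a single-channel image; the kernel size of 5 and sigma of 1.0 are assumptions of this sketch (in practice `cv2.GaussianBlur` would typically be used):

```python
import numpy as np

def gaussian_blur_gray(image, ksize=5, sigma=1.0):
    """Separable Gaussian blur for a single-channel image: build a
    normalized 1-D Gaussian kernel, then convolve along rows and
    along columns with edge-replicated borders, suppressing pixel
    noise before edge extraction."""
    half = ksize // 2
    x = np.arange(-half, half + 1, dtype=np.float64)
    kernel = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    kernel /= kernel.sum()
    img = image.astype(np.float64)
    # horizontal pass
    padded = np.pad(img, ((0, 0), (half, half)), mode="edge")
    img = np.stack([np.convolve(row, kernel, mode="valid") for row in padded])
    # vertical pass
    padded = np.pad(img, ((half, half), (0, 0)), mode="edge")
    img = np.stack([np.convolve(col, kernel, mode="valid")
                    for col in padded.T]).T
    return np.clip(np.rint(img), 0, 255).astype(np.uint8)
```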
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described apparatus, modules and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Fig. 5 shows a schematic block diagram of a motion device provided in an embodiment of the present application; for convenience of explanation, only the portion relevant to the embodiment of the present application is shown.
As shown in fig. 5, the motion device 5 of this embodiment includes: a processor 50, a memory 51, and a computer program 52 stored in the memory 51 and executable on the processor 50. When executing the computer program 52, the processor 50 implements the steps in the above embodiments of the line inspection method, for example, steps S201 to S205 shown in fig. 2. Alternatively, when executing the computer program 52, the processor 50 implements the functions of the modules/units in the above apparatus embodiments, such as the functions of the modules 401 to 405 shown in fig. 4.
By way of example, the computer program 52 may be partitioned into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution of the computer program 52 in the motion device 5.
It will be appreciated by those skilled in the art that fig. 5 is merely an example of the motion device 5 and does not constitute a limitation of the motion device 5; the motion device 5 may include more or fewer components than shown, may combine certain components, or may have different components; for example, the motion device 5 may also include input-output devices, network access devices, buses, and so on.
The processor 50 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSPs), application specific integrated circuits (Application Specific Integrated Circuit, ASICs), field programmable gate arrays (Field-Programmable Gate Array, FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the motion device 5, such as a hard disk or a memory of the motion device 5. The memory 51 may also be an external storage device of the motion device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the motion device 5. Further, the memory 51 may include both an internal storage unit and an external storage device of the motion device 5. The memory 51 is used for storing the computer program as well as other programs and data required by the motion device 5. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in this application, it should be understood that the disclosed apparatus/motion device and method may be implemented in other ways. For example, the apparatus/motion device embodiments described above are merely illustrative; for instance, the division into modules or units is merely a logical function division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections via interfaces, devices or units, and may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program implements the steps of each method embodiment described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer-readable storage medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable storage media do not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A line inspection method, characterized by being applied to a motion device and comprising the following steps:
obtaining a visible area image at the current moment; the visible area image is an image containing a lane line;
predicting the advancing track of the motion device by using a preset Kalman filter to obtain a predicted advancing track of the motion device;
determining a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image;
performing lane line identification on the predicted advancing area image to determine the lane line;
and controlling the motion device to move according to the lane line.
2. The line inspection method according to claim 1, wherein said determining a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image comprises:
determining the predicted advancing area according to the predicted advancing track and the current position of the motion device;
setting pixel points of a background area in the visible area image to a preset pixel value to obtain the predicted advancing area image; the background area is the area of the visible area image other than the predicted advancing area.
3. The line inspection method according to claim 1, wherein said performing lane line identification on the predicted advancing area image to determine the lane line comprises:
performing morphological processing on the predicted advancing area image to obtain a first identification image;
performing connected domain calculation on the first identification image to obtain a second identification image;
and determining the edge line of the lane line according to the second identification image.
4. The line inspection method according to claim 3, wherein said performing morphological processing on the predicted advancing area image to obtain a first identification image comprises:
performing dilation processing on the predicted advancing area image to obtain a dilated image;
performing erosion processing on the predicted advancing area image to obtain an eroded image;
and subtracting the eroded image from the dilated image to obtain the first identification image.
5. The line inspection method according to claim 3, wherein said controlling the motion device to move according to the lane line comprises:
selecting a pixel point set to be fitted according to the edge line;
performing curve fitting on the pixel point set to be fitted to obtain a fitted curve equation;
determining the advancing direction of the motion device according to the fitted curve equation;
and controlling the motion device to move according to the advancing direction.
6. The line inspection method according to claim 5, wherein said determining the advancing direction of the motion device according to the fitted curve equation comprises:
calculating a curve slope corresponding to the fitted curve equation;
and determining the direction of the curve slope as the advancing direction of the motion device.
7. The line inspection method according to any one of claims 1 to 6, further comprising, after said obtaining a visible area image at the current moment:
performing Gaussian blur on the visible area image to obtain a Gaussian-blurred visible area image.
8. A line inspection device, characterized by being applied to a motion device and comprising:
an image acquisition module, used for acquiring a visible area image at the current moment; the visible area image is an image containing a lane line;
a track prediction module, used for predicting the advancing track of the motion device by using a preset Kalman filter to obtain a predicted advancing track of the motion device;
an image determining module, used for determining a predicted advancing area in the visible area image according to the predicted advancing track to obtain a predicted advancing area image;
a lane line identification module, used for performing lane line identification on the predicted advancing area image to determine the lane line;
and a motion control module, used for controlling the motion device to move according to the lane line.
9. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the line inspection method according to any one of claims 1 to 7.
10. A motion device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the line inspection method according to any one of claims 1 to 7 when executing the computer program.
CN202311243415.7A 2023-09-25 2023-09-25 Line inspection method and device, computer readable storage medium and sports equipment Pending CN117351444A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311243415.7A CN117351444A (en) 2023-09-25 2023-09-25 Line inspection method and device, computer readable storage medium and sports equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311243415.7A CN117351444A (en) 2023-09-25 2023-09-25 Line inspection method and device, computer readable storage medium and sports equipment

Publications (1)

Publication Number Publication Date
CN117351444A true CN117351444A (en) 2024-01-05

Family

ID=89365898

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311243415.7A Pending CN117351444A (en) 2023-09-25 2023-09-25 Line inspection method and device, computer readable storage medium and sports equipment

Country Status (1)

Country Link
CN (1) CN117351444A (en)

Similar Documents

Publication Publication Date Title
Wu et al. Lane-mark extraction for automobiles under complex conditions
JP7073247B2 (en) Methods for generating lane boundary detection models, methods for detecting lane boundaries, devices for generating lane boundary detection models, devices for detecting lane boundaries, equipment, computers readable Storage media and computer programs
CN112132156B (en) Image saliency target detection method and system based on multi-depth feature fusion
EP3007099B1 (en) Image recognition system for a vehicle and corresponding method
CN112528878A (en) Method and device for detecting lane line, terminal device and readable storage medium
CN113340334B (en) Sensor calibration method and device for unmanned vehicle and electronic equipment
Mu et al. Lane detection based on object segmentation and piecewise fitting
CN111399492A (en) Robot and obstacle sensing method and device thereof
CN104282020A (en) Vehicle speed detection method based on target motion track
Mu et al. Multiscale edge fusion for vehicle detection based on difference of Gaussian
CN104915642A (en) Method and apparatus for measurement of distance to vehicle ahead
CN111382625A (en) Road sign identification method and device and electronic equipment
CN111507340B (en) Target point cloud data extraction method based on three-dimensional point cloud data
CN112927283A (en) Distance measuring method and device, storage medium and electronic equipment
CN103093481B (en) A kind of based on moving target detecting method under the static background of watershed segmentation
CN112837384A (en) Vehicle marking method and device and electronic equipment
CN116052120A (en) Excavator night object detection method based on image enhancement and multi-sensor fusion
CN116363628A (en) Mark detection method and device, nonvolatile storage medium and computer equipment
Dai et al. A driving assistance system with vision based vehicle detection techniques
CN117351444A (en) Line inspection method and device, computer readable storage medium and sports equipment
WO2017077261A1 (en) A monocular camera cognitive imaging system for a vehicle
CN114973205A (en) Traffic light tracking method and device and unmanned automobile
CN116152127A (en) 3D point cloud processing method, device, equipment and medium
CN113901903A (en) Road identification method and device
CN116503695B (en) Training method of target detection model, target detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination