CN116242322A - Object rolling gesture detection method, device and system - Google Patents


Info

Publication number
CN116242322A
CN116242322A (application CN202310529530.4A)
Authority
CN
China
Prior art keywords
point
image
angle
edge
acquired image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310529530.4A
Other languages
Chinese (zh)
Other versions
CN116242322B (en)
Inventor
Zi Chunyuan (訾春元)
Liu Binbin (刘彬彬)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kaifeng Navigation Control Technology Co ltd
Original Assignee
Kaifeng Navigation Control Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kaifeng Navigation Control Technology Co., Ltd.
Priority to CN202310529530.4A
Publication of CN116242322A
Application granted
Publication of CN116242322B
Legal status: Active
Anticipated expiration


Classifications

    • F42B 35/02 — Gauging, sorting, trimming or shortening cartridges or missiles
    • G01C 1/00 — Measuring angles
    • G01C 11/00 — Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
    • G01C 11/04 — Interpretation of pictures
    • G01C 11/06 — Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C 21/20 — Instruments for performing navigational calculations
    • G06T 7/12 — Edge-based segmentation
    • G06T 7/13 — Edge detection
    • G06T 7/187 — Segmentation involving region growing, region merging or connected component labelling
    • G06T 7/74 — Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06T 2207/10016 — Video; image sequence
    • G06T 2207/20112 — Image segmentation details
    • G06T 2207/20164 — Salient point detection; corner detection
    • Y02A 90/10 — Information and communication technologies supporting adaptation to climate change

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method, a device and a system for detecting the roll attitude of an object. The method comprises the following steps: acquiring a captured image of the object under test during flight; detecting the sky region and the skyline in the captured image; calculating the angle of the skyline, and determining which corner point of the captured image is contained in the sky region; and obtaining the roll angle of the object under test from the skyline angle and the corner point. The method detects the roll attitude of the object under test from captured images alone, and thereby overcomes two shortcomings of gyroscope-based measurement: a gyroscope cannot detect the 0-degree attitude of the object, and its output drifts during operation.

Description

Object roll attitude detection method, device and system
Technical Field
The present invention relates to the field of detection technologies, and in particular to a method, an apparatus, and a system for detecting the roll attitude of an object.
Background
In the related art, the roll attitude of an object (such as a projectile) in flight is obtained from the angle signal output by a gyroscope. This approach has two problems: 1) the gyroscope can only output a rotation angle and cannot establish the 0-degree attitude of the projectile; 2) the gyroscope drifts during operation, and the drift accumulates in the output angle over time, so the output angle becomes erroneous.
Disclosure of Invention
The present invention aims to solve, at least to some extent, one of the technical problems in the related art. To this end, the invention provides a method, a device and a system for detecting the roll attitude of an object, which detect the roll attitude of the object under test from captured images and thereby overcome the inability of a gyroscope to establish the 0-degree attitude of the object and the drift of a gyroscope during operation.
In a first aspect, the present invention provides a method for detecting the roll attitude of an object, the method comprising: acquiring a captured image of the object under test during flight; detecting the sky region and the skyline in the captured image; calculating the angle of the skyline, and determining which corner point of the captured image is contained in the sky region; and obtaining the roll angle of the object under test from the skyline angle and the corner point.
According to one embodiment of the present invention, after the roll angles of the object under test have been obtained for two adjacent captured frames, the method further comprises: performing edge detection on the preceding frame to obtain first edge points, and on the following frame to obtain second edge points; performing feature matching between the first and second edge points, and screening first key points and second key points from them according to the matching result; obtaining a first angle for each first key point and a second angle for each second key point; and obtaining the roll angle of the following frame relative to the preceding frame from the first and second angles.
According to one embodiment of the present invention, detecting the sky region and the skyline in the captured image comprises: performing connected-region analysis on the captured image to obtain the sky region; extracting the inner-edge contour points of the sky region; and fitting a straight line to the inner-edge contour points, the fitted line being taken as the skyline.
According to an embodiment of the present invention, performing the connected-region analysis to obtain the sky region comprises: analyzing the captured image into a plurality of connected regions; and selecting as the sky region a connected region that satisfies the following conditions: its area is larger than a preset multiple of the area of the captured image, and it contains exactly one corner point of the captured image, where the preset multiple is larger than 0 and smaller than 1/8.
According to an embodiment of the present invention, before the connected-region analysis is performed to obtain the sky region, the method further comprises: converting the captured image into an HSV image; binarizing the S channel of the HSV image to obtain a binarized image; and performing a morphological opening operation on the binarized image.
According to one embodiment of the present invention, the skyline angle is the included angle between the skyline and a preset edge of the captured image, the preset edge being the edge along the image width. Obtaining the roll angle of the object under test from the skyline angle and the corner point then comprises: if the corner point is (0, 0), the roll angle is θ' + 2π; if the corner point is (w−1, 0), it is θ'; if the corner point is (w−1, h−1), it is θ' + π; if the corner point is (0, h−1), it is θ' + π; where w is the width of the captured image, h is its height, 1 is the length of one pixel, and θ' is the skyline angle.
According to an embodiment of the present invention, before edge detection is performed on the preceding and following frames, the method further comprises: applying sharpening, gradient calculation and binarization to each of the two frames, in that order.
According to an embodiment of the present invention, performing feature matching between the first and second edge points and screening the first and second key points according to the matching result comprises: matching the first and second edge points with a k-nearest-neighbor (KNN) algorithm; and taking a successfully matched first edge point as a first key point and a successfully matched second edge point as a second key point, where a first edge point is successfully matched if, for a preset k larger than 1, the ratio of its nearest-neighbor distance to its second-nearest-neighbor distance is smaller than or equal to a preset threshold.
In a second aspect, the present invention provides a device for detecting the roll attitude of an object, comprising a memory, a processor, and a computer program stored in the memory, where the above method for detecting the roll attitude of an object is carried out when the computer program is executed by the processor.
In a third aspect, the present invention provides a system for detecting the roll attitude of an object, the system comprising: an image acquisition circuit for capturing images of the object under test during flight; and the above device for detecting the roll attitude of an object, connected to the image acquisition circuit.
The method, device and system for detecting the roll attitude of an object according to the embodiments of the invention detect the roll attitude of the object under test from images captured in flight in real time, and thereby overcome the inability of a gyroscope to establish the 0-degree state and the drift of a gyroscope in the related art.
Drawings
FIG. 1 is a flowchart of a method for detecting the roll attitude of an object according to a first embodiment of the present invention;
FIG. 2(a) is a captured image of one example of the present invention;
FIG. 2(b) is the S-channel image of the HSV image corresponding to the image of FIG. 2(a);
FIG. 2(c) is a binarized image of the image shown in FIG. 2(b);
FIG. 2(d) is the image obtained by applying a morphological opening operation to the image shown in FIG. 2(c);
FIG. 2(e) is the image obtained by connected-region analysis of the image shown in FIG. 2(d);
FIG. 2(f) is an image showing the contour of the sky region in the image shown in FIG. 2(e);
FIG. 3 is a flowchart of a method for detecting the roll attitude of an object according to one embodiment of the present invention;
FIG. 4 is a flowchart of a method for detecting the roll attitude of an object according to a second embodiment of the present invention;
FIG. 5 is a flowchart of a method for detecting the roll attitude of an object according to another embodiment of the present invention;
FIG. 6 is a block diagram of a device for detecting the roll attitude of an object according to an embodiment of the present invention;
FIG. 7 is a block diagram of a system for detecting the roll attitude of an object according to one embodiment of the invention;
FIG. 8 is a block diagram of a system for detecting the roll attitude of an object according to another embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative and intended to explain the present invention and should not be construed as limiting the invention.
A method, a device and a system for detecting the roll attitude of an object according to embodiments of the present invention are described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a method for detecting the roll attitude of an object according to a first embodiment of the present invention.
As shown in fig. 1, the method for detecting the roll attitude of an object includes:
S1, acquiring a captured image of the object under test during flight.
In this embodiment, the object under test may be any object that rotates while airborne, such as a projectile, which rotates at high speed during flight after being fired. Taking a projectile as an example, an image acquisition circuit can be mounted on the projectile to capture images during its flight; one such captured image is shown in fig. 2(a).
S2, detecting the sky region and the skyline in the captured image.
In some embodiments, detecting the sky region and the skyline in a captured image includes: performing connected-region analysis on the captured image to obtain the sky region; extracting the inner-edge contour points of the sky region; and fitting a straight line to the inner-edge contour points, the fitted line being taken as the skyline.
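The line-fitting step above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the function name `skyline_angle` and the least-squares form y = a·x + b are assumptions, and a near-vertical skyline would need the x = a·y + b form instead.

```python
import numpy as np

def skyline_angle(contour_points):
    # contour_points: (N, 2) array of (x, y) inner-edge pixels of the sky region.
    # Fit y = a*x + b by least squares; the skyline angle theta' is measured
    # against the horizontal (width) edge of the image.
    x, y = contour_points[:, 0], contour_points[:, 1]
    a, _b = np.polyfit(x, y, 1)
    return np.degrees(np.arctan(a))
```

For example, contour points along the diagonal y = x give a 45-degree skyline angle.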
Specifically, the connected-region analysis to obtain the sky region may include: analyzing the captured image into a plurality of connected regions; and selecting as the sky region a connected region that satisfies the following conditions: its area is larger than a preset multiple of the area of the captured image and it contains exactly one corner point of the captured image, where the preset multiple is larger than 0 and smaller than 1/8, for example 1/16.
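The screening conditions above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's code: the 4-connected flood-fill labelling and the function names are invented for the example, and a real implementation would use an optimized connected-component routine.

```python
import numpy as np
from collections import deque

def label_regions(binary):
    # 4-connected component labelling by breadth-first flood fill
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    cur = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not labels[sy, sx]:
                cur += 1
                labels[sy, sx] = cur
                q = deque([(sy, sx)])
                while q:
                    y, x = q.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not labels[ny, nx]:
                            labels[ny, nx] = cur
                            q.append((ny, nx))
    return labels, cur

def find_sky_region(binary, min_fraction=1 / 16):
    # the sky region must contain exactly one image corner and be larger
    # than min_fraction of the image area (this embodiment uses 1/16)
    labels, n = label_regions(binary)
    h, w = binary.shape
    corners = [(0, 0), (0, w - 1), (h - 1, 0), (h - 1, w - 1)]
    candidates = []
    for lab in range(1, n + 1):
        area = (labels == lab).sum()
        corner_hits = sum(labels[cy, cx] == lab for cy, cx in corners)
        if corner_hits == 1 and area > min_fraction * h * w:
            candidates.append(lab)
    # ambiguous frames are skipped; the search resumes on the next frame
    return candidates[0] if len(candidates) == 1 else None
```

A region touching two corners (e.g. a full top row of sky) is rejected as ambiguous, matching the "exactly one corner" condition.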
In some embodiments, to improve the result of the connected-region analysis, the captured image may be preprocessed before the analysis is performed: converting the captured image into an HSV image; binarizing the S channel of the HSV image to obtain a binarized image; and performing a morphological opening operation on the binarized image.
Specifically, the morphological opening may consist of a morphological erosion followed by a morphological dilation, where the erosion uses a 7×7 convolution kernel and the dilation uses a 3×3 convolution kernel.
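A morphological opening with those kernel sizes can be sketched in plain NumPy as below. This is an illustrative sketch with assumed function names; a production implementation would use an image-processing library's erosion and dilation with 7×7 and 3×3 structuring elements.

```python
import numpy as np

def binary_erode(img, k):
    # a pixel survives only if the whole k x k window around it is 1
    p = k // 2
    padded = np.pad(img, p, mode="constant", constant_values=0)
    out = np.ones_like(img)
    for dy in range(k):
        for dx in range(k):
            out &= padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def binary_dilate(img, k):
    # a pixel is set if any pixel in the k x k window around it is 1
    p = k // 2
    padded = np.pad(img, p, mode="constant", constant_values=0)
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out |= padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def morphological_open(binary):
    # erosion with a 7x7 kernel, then dilation with a 3x3 kernel,
    # matching the kernel sizes given in this embodiment
    return binary_dilate(binary_erode(binary, 7), 3)
```

The asymmetric kernel sizes mean small noise specks (smaller than 7×7) are removed outright, while large regions such as the sky shrink slightly rather than being restored to full size.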
Taking the preprocessing of the captured image in fig. 2(a) as an example, the S channel of the corresponding HSV image is shown in fig. 2(b), the binarized image in fig. 2(c), and the image after the morphological opening in fig. 2(d). The sky region obtained by connected-region analysis of the image in fig. 2(d) is the upper-left region in fig. 2(e), and the inner contour of the sky region is shown in fig. 2(f).
S3, calculating the angle of the skyline and determining which corner point of the captured image is contained in the sky region.
The skyline angle may be the included angle between the skyline and a preset edge of the captured image, where the preset edge may be the edge along the image width (the horizontal edge in fig. 2(a)) or the edge along the image height (the vertical edge in fig. 2(a)).
S4, obtaining the roll angle of the object under test from the skyline angle and the corner point.
In some embodiments, when the preset edge is the edge along the image width, obtaining the roll angle of the object under test from the skyline angle and the corner point includes: if the corner point is (0, 0), the roll angle is θ' + 2π; if the corner point is (w−1, 0), it is θ'; if the corner point is (w−1, h−1), it is θ' + π; if the corner point is (0, h−1), it is θ' + π; where w is the width of the captured image, h is its height, 1 is the length of one pixel, and θ' is the skyline angle.
When the sky is on the upper side of the image and the skyline is parallel to the upper edge (the edge along the image width), the roll angle of the object under test is 0 degrees. In the captured image of fig. 2(a), point (0, 0) is point A, (w−1, 0) is point B, (w−1, h−1) is point C, and (0, h−1) is point D.
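The corner-to-roll-angle mapping can be written out directly. The function below is an illustrative sketch (its name and argument layout are assumptions); it reproduces the mapping exactly as stated above, including both lower corners mapping to θ' + π.

```python
import math

def roll_angle(theta_prime, corner, w, h):
    # corner: the image corner contained in the sky region, in pixel coordinates;
    # theta_prime: skyline angle in radians, measured against the width edge
    mapping = {
        (0, 0):         theta_prime + 2 * math.pi,   # point A
        (w - 1, 0):     theta_prime,                 # point B
        (w - 1, h - 1): theta_prime + math.pi,       # point C
        (0, h - 1):     theta_prime + math.pi,       # point D
    }
    return mapping[corner]
```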
In one embodiment of the present invention, as shown in fig. 3, after the projectile is launched, the image acquisition circuit captures images in real time, the skyline is detected from each image, and the projectile roll angle is calculated in the following steps:
A1, convert the captured image to HSV, where H is hue, S is saturation and V is value (brightness).
A2, binarize the S channel to obtain a binarized image.
A3, apply a morphological opening to the binarized image, i.e. a morphological erosion followed by a morphological dilation, where the erosion uses a 7×7 convolution kernel and the dilation uses a 3×3 convolution kernel.
A4, perform connected-region analysis on the image obtained by the morphological opening to obtain a region map (the connected regions).
A5, screen the sky region out of the region map.
While the projectile is in flight, the sky region necessarily contains a corner of the captured image, and because the projectile rotates, that corner may be any one of the four. Candidate sky regions can therefore be screened by whether a connected region contains a corner of the captured image. Among the candidates, those whose area exceeds 1/16 of the pixel count of the captured image are retained; if exactly one candidate remains, it is taken directly as the sky region, and if more than one remains, the search for the sky region is deferred to the next captured frame.
A6, perform contour detection on the sky region, remove the contour points lying on the image border, and extract the inner-edge contour points of the sky region.
A7, fit a straight line to the inner-edge contour points to obtain the skyline.
A8, calculate the skyline angle θ' in the current image coordinates.
A9, calculate the projectile roll angle θ from the skyline angle θ' and the corner contained in the sky region:
θ = θ' + 2π if the corner is (0, 0); θ = θ' if the corner is (w−1, 0); θ = θ' + π if the corner is (w−1, h−1) or (0, h−1).
According to this method for detecting the roll attitude of an object, the images captured in flight in real time by the image acquisition circuit are used to identify the skyline, and the roll attitude (including the roll angle) of the object under test, such as a projectile, is then calculated from the skyline. This overcomes the inability of a gyroscope to establish the 0-degree state and the drift of a gyroscope during operation in the related art.
Fig. 4 is a flowchart of a method for detecting the roll attitude of an object according to a second embodiment of the present invention.
As shown in fig. 4, after the roll angles of the object under test have been obtained for two adjacent captured frames, the method for detecting the roll attitude of an object further includes:
s5, performing edge detection on the front frame collected image to obtain a first edge point, and performing edge detection on the rear frame collected image to obtain a second edge point.
And S6, respectively carrying out feature matching on the first edge point and the second edge point, and respectively screening out a first key point and a second key point from the first edge point and the second edge point according to feature matching results.
S7, acquiring a first angle of the first key point and a second angle of the second key point.
And S8, obtaining the rolling angle of the rear frame acquisition image relative to the front frame acquisition image according to the first angle and the second angle.
In some embodiments, to allow fast edge detection, the preceding and following frames may each be sharpened, gradient-computed and binarized, in that order, before edge detection is performed.
Any one or two of sharpening, gradient calculation and binarization could be used alone, but the result is inferior to performing all three in sequence; this order was determined through repeated experiments as the preprocessing that gives good edge-detection results.
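The three-stage preprocessing can be sketched as below. The specific kernels (a 3×3 Laplacian-style sharpening kernel and Sobel operators) and the threshold value are illustrative assumptions, since the patent does not specify them.

```python
import numpy as np

def conv2d(img, kernel):
    # 'valid' 2-D correlation, sufficient for this sketch
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * img[i:i + H - kh + 1, j:j + W - kw + 1]
    return out

SHARPEN = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], float)
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
SOBEL_Y = SOBEL_X.T

def preprocess(img, thresh=50.0):
    sharp = conv2d(img, SHARPEN)                    # 1. sharpen
    gx, gy = conv2d(sharp, SOBEL_X), conv2d(sharp, SOBEL_Y)
    mag = np.hypot(gx, gy)                          # 2. gradient magnitude
    return (mag > thresh).astype(np.uint8)          # 3. binarize
```

After this pipeline, edge detection reduces to locating the nonzero pixels of the binarized gradient map.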
In some embodiments, performing feature matching between the first and second edge points and screening the first and second key points according to the matching result includes: matching the first and second edge points with a k-nearest-neighbor (KNN) algorithm; and taking a successfully matched first edge point as a first key point and a successfully matched second edge point as a second key point. A first edge point is successfully matched if, for a preset k larger than 1, the ratio of its nearest-neighbor distance to its second-nearest-neighbor distance is smaller than or equal to a preset threshold.
Correspondingly, a second edge point is successfully matched if, for the same preset k, the ratio of its nearest-neighbor distance to its second-nearest-neighbor distance is smaller than or equal to the preset threshold.
The preset k may be 2, 3, 4 and so on, and the distances are the distances between the feature vector of the current edge point and the feature vectors of the edge points in the other frame.
In one embodiment of the present invention, as shown in fig. 5, after skyline detection and projectile roll-angle calculation are completed, the roll angle between the preceding and following frames is obtained in the following steps:
B1, sharpen, compute gradients of, and binarize the preceding and following frames.
B2, perform edge detection on the sharpened, gradient-computed and binarized images.
B3, obtain the feature vectors of the edge points.
Specifically, a reference point can be chosen at random in the captured image, and for every edge point the angle and the length of the segment from the edge point to the reference point are listed; this correspondence gives the feature vector of the edge point.
In some examples, the feature vector of an edge point may instead be obtained as the normalized histogram of pixel gradient directions in its neighborhood. Further, the obtained feature vectors can be fused into a single vector.
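A normalized gradient-direction histogram of this kind can be sketched as follows; the bin count and the L2 normalization are assumptions for illustration.

```python
import numpy as np

def orientation_histogram(gx, gy, n_bins=8):
    # histogram of per-pixel gradient directions in a neighborhood patch,
    # L2-normalized so descriptors can be compared by Euclidean distance
    angles = np.arctan2(gy, gx)                       # in (-pi, pi]
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi))
    hist = hist.astype(float)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist
```

A patch whose gradients all point the same way yields a unit vector with a single nonzero bin.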
B4, feature matching and key-point screening.
Feature matching puts the feature points (i.e. the edge points above) of the preceding and following frames into one-to-one correspondence. A KNN matching method can be used, for example with k = 2: take a feature point Xf of the preceding frame and compute the distances between its feature vector and the feature vectors of all feature points of the following frame, giving a nearest distance d1 and a second-nearest distance d2. If d1 is close to d2, i.e. d1/d2 is larger than a threshold (which can take a value in the range 0.8–1), the feature point Xf is considered similar to two feature points of the following frame and is discarded; otherwise Xf is taken as a key point, and the corresponding feature point of the following frame is taken as a key point of that frame. Proceeding in this way gives all key points of the preceding frame together with their one-to-one corresponding key points of the following frame.
Similarly, a feature point Xb of the following frame may be taken and the distances between its feature vector and the feature vectors of all feature points of the preceding frame computed, giving a nearest distance d1' and a second-nearest distance d2'. If d1' is close to d2', i.e. d1'/d2' is larger than the threshold (in the range 0.8–1), the feature point Xb is considered similar to two feature points of the preceding frame and is discarded; otherwise Xb is taken as a key point, and the corresponding feature point of the preceding frame is taken as a key point of that frame. Proceeding in this way gives all key points of the following frame together with their one-to-one corresponding key points of the preceding frame.
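The k = 2 ratio test can be sketched as follows. `match_keypoints` and the brute-force distance computation are illustrative (a real system would typically use a KD-tree or a library matcher), and the 0.8 ratio is an assumed value at the low end of the 0.8–1 range given above.

```python
import numpy as np

def match_keypoints(desc_a, desc_b, ratio=0.8):
    # keep (i, j) only when the nearest distance is clearly smaller than the
    # second-nearest one, i.e. d1/d2 <= ratio (ratio test with k = 2)
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        d1, d2 = dists[order[0]], dists[order[1]]
        if d2 > 0 and d1 / d2 <= ratio:
            matches.append((i, int(order[0])))
    return matches
```

Descriptors with two near-equal closest matches fail the test and are discarded, exactly as the Xf/Xb screening above describes.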
B5, obtain the angle of each key point, taking the centre of the highest bin of its angle histogram as the angle of the key point.
Computing the angle histogram requires dividing the angle range into a number of small intervals, i.e. the bins of the histogram.
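The highest-bin selection can be sketched as below; the bin width and the use of the bin centre are illustrative assumptions.

```python
def dominant_angle(angles, bin_width=10.0):
    # quantize key-point angles (in degrees) into fixed-width bins and
    # return the centre of the most populated bin
    counts = {}
    for a in angles:
        b = int(a // bin_width)
        counts[b] = counts.get(b, 0) + 1
    best = max(counts, key=counts.get)
    return best * bin_width + bin_width / 2
```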
Step B6: calculate the roll angle between the preceding and following frame images.
Specifically, the angle differences of the matched key points in the two acquired images are collected and averaged; the resulting mean is the roll angle of the following frame image relative to the preceding frame image.
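Step B6 then reduces to averaging the per-key-point angle differences. This sketch assumes matched key-point angles in degrees and, for brevity, ignores wrap-around at 360 degrees; the function name is illustrative:

```python
import numpy as np

def roll_angle(prev_angles, next_angles):
    """Step B6: the roll of the following frame relative to the
    preceding frame is the mean of the angle differences of the
    matched key points (same order in both lists)."""
    diffs = np.subtract(next_angles, prev_angles)  # per-key-point angle difference
    return float(np.mean(diffs))

prev_kp = [10.0, 95.0, 181.0]        # angles of matched key points, preceding frame
next_kp = [15.0, 100.0, 186.0]       # same key points, following frame
print(roll_angle(prev_kp, next_kp))  # -> 5.0
```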
According to the object roll attitude detection method of the embodiments of the present invention, the skyline is identified from the acquired image captured in the air by the image acquisition circuit, and the roll attitude (including the roll angle) of the object to be measured, such as a projectile body, is then calculated from the skyline. This solves the problems in the related art that a gyroscope cannot acquire the 0-degree state and that a gyroscope drifts during operation; moreover, the roll angle between adjacent frame images can be obtained by processing and computing on the adjacent acquired images.
Fig. 6 is a block diagram of a detection apparatus for a rolling posture of an object according to an embodiment of the present invention.
As shown in fig. 6, the object roll attitude detection apparatus 500 includes a processor 501 and a memory 503. The processor 501 is coupled to the memory 503, for example via a bus 502. Optionally, the apparatus 500 may further include a transceiver 504. It should be noted that, in practical applications, the number of transceivers 504 is not limited to one, and the structure of the apparatus 500 does not limit the embodiments of the present invention.
The processor 501 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logical blocks, modules and circuits described in connection with the present disclosure. The processor 501 may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
The bus 502 may include a path for transferring information between the above components. The bus 502 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like, and may be divided into an address bus, a data bus, a control bus and so on. For ease of illustration, only one thick line is shown in fig. 6, but this does not mean that there is only one bus or only one type of bus.
The memory 503 is used to store the computer program corresponding to the object roll attitude detection method of the above embodiments of the present invention, the execution of which is controlled by the processor 501. The processor 501 executes the computer program stored in the memory 503 to implement what is shown in the foregoing method embodiments.
The object roll attitude detection apparatus 500 includes, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players) and vehicle-mounted terminals (e.g., in-car navigation terminals), and fixed terminals such as digital TVs and desktop computers; it may also be implemented by a field programmable gate array (FPGA). The apparatus 500 shown in fig. 6 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the present invention.
Fig. 7 is a block diagram of a detection system for a roll attitude of an object according to an embodiment of the present invention.
As shown in fig. 7, the object roll attitude detection system 700 includes: the image acquisition circuit 710 and the object rolling posture detection device 500 of the above embodiment.
In this embodiment, the image acquisition circuit 710 is configured to capture images of the object to be measured during flight, and the object roll attitude detection device 500 is connected to the image acquisition circuit 710.
In some embodiments, as shown in fig. 8, the object roll attitude detection system 700 may further include a storage circuit 720 configured to store the acquired images, so that the object roll attitude detection device 500 can retrieve the corresponding images to calculate the roll angle of the following-frame acquired image relative to the preceding-frame acquired image.
In this embodiment, the image acquisition circuit 710 may be connected to the object roll gesture detection device 500 through the input interface circuit 730 to transmit the acquired image.
It should be noted that, referring to fig. 8, the object roll attitude detection system 700 may further include a power supply circuit 740 to supply power to the object roll attitude detection device 500.
In summary, the object roll attitude detection method, device and system of the embodiments of the present invention use acquired images captured in the air to detect the roll attitude of the object to be measured and the roll angle between the preceding and following frame images, thereby solving the problems in the related art that a gyroscope cannot acquire the 0-degree state and that a gyroscope drifts during operation.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that changes, modifications, substitutions and variations may be made to them by those of ordinary skill in the art within the scope of the invention.

Claims (10)

1. An object roll attitude detection method, the method comprising:
acquiring an acquired image of an object to be measured during flight;
detecting a sky region and a skyline in the acquired image;
calculating an angle of the skyline, and determining the corner point of the acquired image contained in the sky region;
and obtaining the roll angle of the object to be measured according to the angle of the skyline and the corner point.
2. The object roll attitude detection method according to claim 1, wherein, for two adjacent frames of acquired images, after obtaining the roll angle of the object to be measured, the method further comprises:
performing edge detection on the preceding-frame acquired image to obtain a first edge point, and performing edge detection on the following-frame acquired image to obtain a second edge point;
performing feature matching between the first edge point and the second edge point, and screening out a first key point and a second key point from the first edge point and the second edge point respectively according to the feature matching results;
acquiring a first angle of the first key point and a second angle of the second key point;
and obtaining the roll angle of the following-frame acquired image relative to the preceding-frame acquired image according to the first angle and the second angle.
3. The object roll attitude detection method according to claim 1, wherein the detecting a sky region and a skyline in the acquired image comprises:
performing connected region analysis on the acquired image to obtain the sky region;
extracting inner edge contour points of the sky region;
and performing straight-line fitting on the inner edge contour points to obtain a fitted straight line, and taking the fitted straight line as the skyline.
4. The object roll attitude detection method according to claim 3, wherein the performing connected region analysis on the acquired image to obtain the sky region comprises:
performing connected region analysis on the acquired image to obtain a plurality of connected regions;
screening, from the plurality of connected regions, a connected region satisfying the following conditions as the sky region: its area is greater than a preset multiple of the area of the acquired image, it contains a corner point of the acquired image, and the number of corner points it contains is one, wherein the preset multiple is greater than 0 and less than 1/8.
5. The object roll attitude detection method according to claim 3, wherein before the connected region analysis is performed on the acquired image to obtain the sky region, the method further comprises:
converting the acquired image into an HSV channel image;
performing binarization processing on an S channel of the HSV channel image to obtain a binarized image;
and carrying out morphological open operation on the binarized image.
6. The object roll attitude detection method according to claim 1, wherein the angle of the skyline is the included angle between the skyline and a preset edge of the acquired image, the preset edge being a wide edge of the acquired image, and the obtaining the roll angle of the object to be measured according to the angle of the skyline and the corner point comprises:
if the corner point is the (0, 0) point, determining that the roll angle of the object to be measured is θ′ + 2π;
if the corner point is the (w-1, 0) point, determining that the roll angle of the object to be measured is θ′;
if the corner point is the (w-1, h-1) point, determining that the roll angle of the object to be measured is θ′ + π;
if the corner point is the (0, h-1) point, determining that the roll angle of the object to be measured is θ′ + π;
wherein w is the width of the acquired image, h is the height of the acquired image, 1 is the length of one pixel in the acquired image, and θ′ is the angle of the skyline.
7. The object roll attitude detection method according to claim 2, wherein before the edge detection is performed on the preceding-frame acquired image and the following-frame acquired image, the method further comprises:
sequentially performing sharpening, gradient calculation and binarization processing on the preceding-frame acquired image and the following-frame acquired image, respectively.
8. The object roll attitude detection method according to claim 2, wherein the performing feature matching between the first edge point and the second edge point, and screening out the first key point and the second key point from the first edge point and the second edge point respectively according to the feature matching results, comprises:
performing feature matching between the first edge point and the second edge point using a KNN nearest-neighbour algorithm;
taking a successfully matched first edge point as the first key point, and a successfully matched second edge point as the second key point, wherein a successfully matched first edge point is a first edge point for which, when k takes a preset value, the ratio of the nearest distance to the second-nearest distance is less than or equal to a preset threshold, the preset value being greater than 1.
9. An object roll attitude detection device comprising a memory, a processor and a computer program stored on the memory, wherein the computer program, when executed by the processor, implements the object roll attitude detection method according to any one of claims 1 to 8.
10. An object roll attitude detection system, the system comprising:
an image acquisition circuit, configured to capture acquired images of an object to be measured during flight;
and the object roll attitude detection device according to claim 9, wherein the object roll attitude detection device is connected to the image acquisition circuit.
CN202310529530.4A 2023-05-11 2023-05-11 Object rolling gesture detection method, device and system Active CN116242322B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310529530.4A CN116242322B (en) 2023-05-11 2023-05-11 Object rolling gesture detection method, device and system


Publications (2)

Publication Number Publication Date
CN116242322A true CN116242322A (en) 2023-06-09
CN116242322B CN116242322B (en) 2023-07-25

Family

ID=86629948



Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003139532A (en) * 2001-11-05 2003-05-14 Virtual Brains:Kk Map and map preparing method
US20070150198A1 (en) * 2005-12-28 2007-06-28 Solmetric Corporation Solar access measurement device
CN103453875A (en) * 2013-08-07 2013-12-18 北京理工大学 Real-time calculating method for pitch angle and roll angle of unmanned aerial vehicle
CN107340711A (en) * 2017-06-23 2017-11-10 中国人民解放军陆军军官学院 A kind of minute vehicle attitude angle automatic testing method based on video image
CN111460898A (en) * 2020-03-04 2020-07-28 北京空间飞行器总体设计部 Skyline acquisition method based on monocular camera image of lunar surface inspection tour device
CN111806354A (en) * 2020-06-05 2020-10-23 北京嘀嘀无限科技发展有限公司 Visual angle adjusting method, storage medium and system for automobile data recorder


Non-Patent Citations (2)

Title
Christopher Dahlin Rodin et al.: "Skyline Based Camera Attitude Estimation Using a Digital Surface Model" *
Cong Yang et al.: "Real-time attitude angle estimation for UAVs based on skyline recognition", Chinese Journal of Scientific Instrument, vol. 30, no. 5, pages 938-942 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN117330084A (en) * 2023-12-01 2024-01-02 中国航空工业集团公司西安飞机设计研究所 Civil aircraft attitude envelope determining method
CN117330084B (en) * 2023-12-01 2024-02-23 中国航空工业集团公司西安飞机设计研究所 Civil aircraft attitude envelope determining method



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant