CN115035356A - Method, device and equipment for changing feature points in embedded system - Google Patents


Publication number
CN115035356A
CN115035356A (application CN202210960214.8A)
Authority
CN
China
Prior art keywords: image frame, rectangle, pixel points, points, black
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210960214.8A
Other languages
Chinese (zh)
Other versions
CN115035356B (en)
Inventor
冀振
Current Assignee
Shenzhen Xin Zhi Lian Software Co ltd
Original Assignee
Shenzhen Xin Zhi Lian Software Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Xin Zhi Lian Software Co ltd
Priority to CN202210960214.8A
Publication of CN115035356A
Application granted
Publication of CN115035356B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/7715 Feature extraction, e.g. by transforming the feature space, e.g. multi-dimensional scaling [MDS]; Mappings, e.g. subspace methods
    • G06V 10/20 Image preprocessing
    • G06V 10/28 Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V 10/40 Extraction of image or video features
    • G06V 10/56 Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure provides a method, device and equipment for transforming feature points in an embedded system, relating to the field of image correction technologies. The method comprises the following steps: for any pair of adjacent image frames to be processed, inputting feature points into a preset rectangle, and constructing feature pixel points from the feature points of each image frame to be processed within the rectangle; subtracting the ordinates of the feature pixel points at corresponding coordinate positions in the adjacent image frames to be processed to obtain a standby image frame; performing Gaussian filtering and binarization on the standby image frame to obtain a black-and-white image frame; performing edge calculation from the rectangle and the pixel points in the black-and-white image frame to obtain the coordinates of the four corners of the rectangle; determining, based on a least square method, target straight lines corresponding to the sides of the rectangle from the coordinates of the four corners and the pixel points in the black-and-white image frame; and determining target feature points in the adjacent image frames to be processed from the intersection points of the target straight lines. The accuracy of the transformed feature points is thereby improved.

Description

Method, device and equipment for changing feature points in embedded system
Technical Field
The present disclosure relates to the field of image correction technologies, and in particular, to a method, an apparatus, and a device for feature point transformation in an embedded system.
Background
In the automatic correction process of a projection system, feature points must undergo a spatial distance transformation according to changes in the feature points captured by the camera. Because pictures captured by the camera in a real environment suffer from noise interference and illumination effects, large errors remain after the spatial distance transformation, and the transformation process also occupies a large share of the projector's computing power.
Disclosure of Invention
To solve the technical problems in related scenarios that the spatial distance transformation occupies a large share of the projector's computing power, and that noise interference and illumination effects make the transformed feature points inaccurate, the present disclosure provides a method, device and equipment for transforming feature points in an embedded system.
According to a first aspect of the embodiments of the present disclosure, there is provided a feature point transformation method in an embedded system, the feature point transformation method in the embedded system including:
for any pair of adjacent image frames to be processed, inputting feature points into a preset rectangle, and constructing feature pixel points from the feature points of each image frame to be processed within the rectangle;
subtracting the ordinates of the feature pixel points at corresponding coordinate positions in the adjacent image frames to be processed to obtain a standby image frame;
performing Gaussian filtering and binarization on the standby image frame to obtain a black-and-white image frame;
performing edge calculation from the rectangle and the pixel points in the black-and-white image frame to obtain the coordinates of the four corners of the rectangle;
determining, based on a least square method, a target straight line corresponding to each side of the rectangle from the coordinates of the four corners and the pixel points in the black-and-white image frame;
and determining target feature points in the adjacent image frames to be processed from the intersection points of the target straight lines.
In one embodiment, the step of constructing a feature pixel point according to a feature point of the image frame to be processed in the rectangle includes:
calculating the difference value between the maximum value of each color channel and the parameter value of the color channel corresponding to the characteristic point in the rectangle;
taking the difference value as a reverse color parameter value of a color channel corresponding to the characteristic point to obtain reverse color of the characteristic point;
and constructing corresponding characteristic pixel points according to the reverse color parameter values of the characteristic points of the image frame to be processed in the rectangle.
In one embodiment, the step of performing gaussian filtering and binarization on the spare image frame to obtain a black-and-white image frame includes:
performing Gaussian filtering on the standby image frame through a Gaussian filtering matrix;
determining background pixel points from the Gaussian filtered image according to a preset gray threshold, and calculating the average gray value of the background pixel points to obtain a first average gray value and the average gray values of all pixel points in the Gaussian filtered image to obtain a second average gray value;
determining a binarization threshold according to the first average gray value and the second average gray value;
and carrying out gray level binarization on the Gaussian filtered image according to the binarization threshold value to obtain a black-and-white image frame.
In one embodiment, the step of determining a binarization threshold according to the first average gray-scale value and the second average gray-scale value includes:
and calculating the average value of the first average gray value and the second average gray value to obtain a third average gray value, and taking the third average gray value as the binarization threshold value.
In one embodiment, the step of performing edge calculation according to the rectangle and the pixel points in the black-and-white image frame to obtain coordinates of four corners of the rectangle includes:
calculating the variance of all pixel points in the black-and-white image frame and the gradient information of adjacent pixel points;
and according to the sides of the rectangle, the variance and the gradient information, performing edge calculation on the rectangle to obtain coordinates of four corners of the rectangle.
In one embodiment, the step of determining, based on the least square method, a target straight line corresponding to each side of the rectangle from the coordinates of the four corners and the pixel points in the black-and-white image frame includes:
screening pixel points in the black-and-white image frame according to the coordinates of the four corners to obtain standby pixel points;
and determining a target straight line corresponding to the side of the rectangle according to the spare pixel point based on a least square method.
In one embodiment, the spare pixel points are obtained through the following formula:
fabs(r) > 1;
where r is a correlation function, r = E/F, E = sum[(Xi - Xmean) × (Yi - Ymean)], F = sqrt{sum[(Xi - Xmean) × (Xi - Xmean)]} × sqrt{sum[(Yi - Ymean) × (Yi - Ymean)]}, Xmean and Ymean are the abscissa and ordinate of any one of the four corners, and Xi and Yi are respectively the abscissa and ordinate of the i-th pixel point in the black-and-white image frame.
In one embodiment, the straight line corresponding to a side of the rectangle is determined by the following formula:
y = a × x_i + b;
where a = (n × C - B × D)/(n × A - B × B), b = (A × D - B × C)/(n × A - B × B), A = sum(Xi × Xi), B = sum(Xi), C = sum(Xi × Yi), D = sum(Yi), x_i is the abscissa of the i-th spare pixel point, and n is the total number of spare pixel points.
According to a second aspect of the embodiments of the present disclosure, there is provided a feature point transforming apparatus in an embedded system, the feature point transforming apparatus in the embedded system including:
the image processing device comprises a construction module, a processing module and a processing module, wherein the construction module is configured to input feature points into a preset rectangle aiming at any adjacent image frame to be processed and construct feature pixel points according to the feature points of the image frame to be processed in the rectangle;
the first determining module is configured to subtract vertical coordinates of feature pixel points on corresponding coordinate positions in the adjacent image frames to be processed to obtain a standby image frame;
the second determining module is configured to perform Gaussian filtering and binarization on the standby image frame to obtain a black-and-white image frame;
the calculation module is configured to perform edge calculation according to the rectangle and the pixel points in the black-and-white image frame to obtain coordinates of four corners of the rectangle;
a third determining module configured to determine a target straight line corresponding to an edge of the rectangle according to coordinates of four corners of pixel points in the black-and-white image frame based on a least square method;
a fourth determining module configured to determine a target feature point in the adjacent image frames to be processed according to the intersection point of the target straight line.
In one embodiment, the building module is configured to:
calculating the difference value between the maximum value of each color channel and the parameter value of the color channel corresponding to the characteristic point in the rectangle;
taking the difference value as a reverse color parameter value of a color channel corresponding to the characteristic point to obtain reverse color of the characteristic point;
and constructing corresponding characteristic pixel points according to the reverse color parameter values of the characteristic points of the image frame to be processed in the rectangle.
In one embodiment, the second determining module is configured to:
performing Gaussian filtering on the standby image frame through a Gaussian filtering matrix;
determining background pixel points from the Gaussian filtered image according to a preset gray threshold, and calculating the average gray value of the background pixel points to obtain a first average gray value and the average gray values of all the pixel points in the Gaussian filtered image to obtain a second average gray value;
determining a binarization threshold value according to the first average gray value and the second average gray value;
and carrying out gray level binarization on the Gaussian filtered image according to the binarization threshold value to obtain a black-and-white image frame.
In one embodiment, the second determining module is configured to:
and calculating the average value of the first average gray value and the second average gray value to obtain a third average gray value, and taking the third average gray value as the binarization threshold value.
In one embodiment, the computing module is configured to:
calculating the variance of all pixel points in the black-and-white image frame and the gradient information of adjacent pixel points;
and according to the sides of the rectangle, the variance and the gradient information, performing edge calculation on the rectangle to obtain coordinates of four corners of the rectangle.
In one embodiment, the third determining module is configured to:
screening pixel points in the black-and-white image frame according to the coordinates of the four corners to obtain standby pixel points;
and determining a target straight line corresponding to the side of the rectangle according to the spare pixel point based on a least square method.
In one embodiment, the spare pixel points are obtained through the following formula:
fabs(r) > 1;
where r is a correlation function, r = E/F, E = sum[(Xi - Xmean) × (Yi - Ymean)], F = sqrt{sum[(Xi - Xmean) × (Xi - Xmean)]} × sqrt{sum[(Yi - Ymean) × (Yi - Ymean)]}, Xmean and Ymean are the abscissa and ordinate of any one of the four corners, and Xi and Yi are respectively the abscissa and ordinate of the i-th pixel point in the black-and-white image frame.
In one embodiment, the straight line corresponding to a side of the rectangle is determined by the following formula:
y = a × x_i + b;
where a = (n × C - B × D)/(n × A - B × B), b = (A × D - B × C)/(n × A - B × B), A = sum(Xi × Xi), B = sum(Xi), C = sum(Xi × Yi), D = sum(Yi), x_i is the abscissa of the i-th spare pixel point, and n is the total number of spare pixel points.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the feature point transformation method in the embedded system according to any one of the first aspect.
The above technical solution can achieve at least the following beneficial effects:
for any pair of adjacent image frames to be processed, feature points are input into a preset rectangle, and feature pixel points are constructed from the feature points of each image frame to be processed within the rectangle; the ordinates of the feature pixel points at corresponding coordinate positions in the adjacent image frames to be processed are subtracted to obtain a standby image frame; Gaussian filtering and binarization are performed on the standby image frame to obtain a black-and-white image frame; edge calculation is performed from the rectangle and the pixel points in the black-and-white image frame to obtain the coordinates of the four corners of the rectangle; based on a least square method, a target straight line corresponding to each side of the rectangle is determined from the coordinates of the four corners and the pixel points in the black-and-white image frame; and target feature points are determined in the adjacent image frames to be processed from the intersection points of the target straight lines. The accuracy of the transformed feature points is thereby improved, and the resource occupation of the projector is reduced.
Drawings
Fig. 1 is a flowchart of a feature point transformation method in an embedded system according to an embodiment.
FIG. 2 is a flowchart of implementing step S11 in FIG. 1, according to one embodiment.
FIG. 3 is a flowchart of implementing step S13 in FIG. 1, according to one embodiment.
Fig. 4 is a block diagram of a feature point conversion device in the embedded system according to one embodiment.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, embodiments accompanying the present disclosure are described in detail below. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.
The embodiment of the present disclosure provides a feature point transformation method in an embedded system, and fig. 1 is a flowchart of the feature point transformation method in the embedded system according to one embodiment, where the feature point transformation method in the embedded system includes the following steps:
in step S11, for any adjacent image frame to be processed, inputting feature points into a preset rectangle, and constructing feature pixel points according to the feature points of the image frame to be processed in the rectangle;
in step S12, subtracting the ordinate of the feature pixel point at the corresponding coordinate position in the adjacent image frame to be processed to obtain a standby image frame;
in step S13, performing gaussian filtering and binarization on the spare image frame to obtain a black-and-white image frame;
in step S14, performing edge calculation according to the rectangle and the pixel points in the black-and-white image frame to obtain coordinates of four corners of the rectangle;
in step S15, determining a target straight line corresponding to the side of the rectangle according to the coordinates of the four corners of the pixel points in the monochrome image frame based on a least square method;
in step S16, a target feature point is determined in the adjacent image frames to be processed according to the intersection of the target straight lines.
In the above technical solution, for any pair of adjacent image frames to be processed, feature points are input into a preset rectangle, and feature pixel points are constructed from the feature points of each image frame to be processed within the rectangle; the ordinates of the feature pixel points at corresponding coordinate positions in the adjacent image frames to be processed are subtracted to obtain a standby image frame; Gaussian filtering and binarization are performed on the standby image frame to obtain a black-and-white image frame; edge calculation is performed from the rectangle and the pixel points in the black-and-white image frame to obtain the coordinates of the four corners of the rectangle; based on a least square method, a target straight line corresponding to each side of the rectangle is determined from the coordinates of the four corners and the pixel points in the black-and-white image frame; and target feature points are determined in the adjacent image frames to be processed from the intersection points of the target straight lines. The accuracy of the transformed feature points is thereby improved, and the resource occupation of the projector is reduced.
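The subtraction step above can be illustrated loosely as an element-wise difference of feature-pixel values at corresponding coordinate positions; the patent gives no code, so the array shapes and values in this sketch are purely illustrative:

```python
import numpy as np

# Two hypothetical 2x2 grids of feature-pixel values taken from adjacent
# image frames to be processed; the values are illustrative only.
frame_a = np.array([[120, 130], [140, 150]], dtype=np.int16)
frame_b = np.array([[100, 135], [140, 160]], dtype=np.int16)

# Subtracting values at corresponding coordinate positions yields the
# standby image frame; the absolute difference keeps it a valid image.
standby = np.abs(frame_a - frame_b).astype(np.uint8)

print(standby.tolist())  # [[20, 5], [0, 10]]
```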
In one embodiment, referring to fig. 2, in step S11, the step of constructing a feature pixel point according to the feature point of the image frame to be processed in the rectangle includes:
in step S111, calculating a difference between the maximum value of each color channel and a parameter value of the color channel corresponding to the feature point in the rectangle;
It can be understood that the maximum value of each color channel is 255. Any feature point has three color channels: red, green and blue, and the parameter value of a channel represents that channel's value. Subtracting each channel's parameter value from 255 gives the feature point's corresponding difference value in each of the three color channels.
In step S112, taking the difference value as an inverse color parameter value of a color channel corresponding to the feature point, so as to obtain an inverse color for the feature point;
For example, if the values of the three color channels of a feature point are 122, 94 and 106, subtracting each value from 255 gives inverse color parameter values of 133, 161 and 149 for that feature point.
In step S113, a corresponding feature pixel point is constructed according to the inverse color parameter value of the feature point of the image frame to be processed in the rectangle.
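Steps S111 to S113 reduce to the channel maximum minus each channel value; a minimal sketch (the function name is illustrative, not from the source):

```python
def invert_color(rgb, channel_max=255):
    # Inverse color parameter value of each channel: maximum minus value.
    return tuple(channel_max - c for c in rgb)

# Reproduces the worked example from the description.
print(invert_color((122, 94, 106)))  # (133, 161, 149)
```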
In one embodiment, referring to fig. 3, in step S13, the step of performing gaussian filtering and binarization on the spare image frame to obtain a black-and-white image frame includes:
in step S131, gaussian filtering is performed on the spare image frame through a gaussian filter matrix;
in step S132, according to a preset gray threshold, determining a background pixel point from the gaussian-filtered image, and calculating an average gray value of the background pixel point to obtain a first average gray value and average gray values of all pixel points in the gaussian-filtered image, so as to obtain a second average gray value;
For example, with a preset gray threshold of 127, the pixel points whose gray value is greater than 127 are treated as foreground pixel points and removed from the Gaussian-filtered image, and the remaining pixel points, whose gray values are less than or equal to 127, are treated as background pixel points. The average of the gray values of the background pixel points is then calculated to obtain the first average gray value.
In step S133, a binarization threshold is determined according to the first average grayscale value and the second average grayscale value;
in step S134, performing gray level binarization on the gaussian-filtered image according to the binarization threshold to obtain a black-and-white image frame.
In one embodiment, in step S133, the step of determining a binarization threshold according to the first average gray-scale value and the second average gray-scale value includes:
and calculating the average value of the first average gray value and the second average gray value to obtain a third average gray value, and taking the third average gray value as the binarization threshold value.
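Combining steps S132 to S134 with the averaging rule above gives a short sketch; it assumes the input has already been Gaussian filtered, and the sample values are illustrative:

```python
import numpy as np

def binarize(gray, background_threshold=127):
    """Binarize an (already Gaussian-filtered) grayscale image."""
    gray = np.asarray(gray, dtype=np.float64)
    # Background pixel points: gray value at or below the preset threshold.
    background = gray[gray <= background_threshold]
    first_mean = background.mean()               # first average gray value
    second_mean = gray.mean()                    # second average gray value
    threshold = (first_mean + second_mean) / 2   # third average gray value
    # Gray-level binarization into a black-and-white image frame.
    return np.where(gray > threshold, 255, 0).astype(np.uint8), threshold

bw, t = binarize([[10, 200], [30, 220]])
print(t)             # 67.5
print(bw.tolist())   # [[0, 255], [0, 255]]
```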
In one embodiment, in step S14, the step of performing edge calculation according to the rectangle and the pixel points in the black-and-white image frame to obtain coordinates of four corners of the rectangle includes:
calculating the variance of all pixel points in the black-and-white image frame and the gradient information of adjacent pixel points;
and according to the sides of the rectangle, the variance and the gradient information, performing edge calculation on the rectangle to obtain coordinates of four corners of the rectangle.
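The patent names the variance of all pixel points and the gradient information of adjacent pixel points but does not give their formulas; one plausible reading, sketched below under that assumption, uses the global variance and forward differences between neighbouring pixels:

```python
import numpy as np

def variance_and_gradients(bw):
    """Variance over all pixels plus gradients between adjacent pixels.
    The forward-difference gradients here are an assumption, since the
    source does not specify how the gradient information is computed."""
    bw = np.asarray(bw, dtype=np.float64)
    variance = bw.var()                   # variance of all pixel values
    grad_x = np.abs(np.diff(bw, axis=1))  # horizontal neighbour gradient
    grad_y = np.abs(np.diff(bw, axis=0))  # vertical neighbour gradient
    return variance, grad_x, grad_y

v, gx, gy = variance_and_gradients([[0, 255], [0, 255]])
print(v)  # 16256.25
```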
In one embodiment, in step S15, the step of determining, based on a least square method, a target straight line corresponding to each side of the rectangle from the coordinates of the four corners and the pixel points in the black-and-white image frame includes:
screening pixel points in the black-and-white image frame according to the coordinates of the four corners to obtain standby pixel points;
and determining a target straight line corresponding to the side of the rectangle according to the spare pixel point based on a least square method.
In one embodiment, the spare pixel points are obtained through the following formula:
fabs(r) > 1;
where r is a correlation function, r = E/F, E = sum[(Xi - Xmean) × (Yi - Ymean)], F = sqrt{sum[(Xi - Xmean) × (Xi - Xmean)]} × sqrt{sum[(Yi - Ymean) × (Yi - Ymean)]}, Xmean and Ymean are the abscissa and ordinate of any one of the four corners, and Xi and Yi are respectively the abscissa and ordinate of the i-th pixel point in the black-and-white image frame.
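The correlation function r can be computed as in the sketch below; the helper name and sample coordinates are illustrative, and the corner supplies Xmean and Ymean:

```python
import math

def correlation(points, corner):
    # corner supplies Xmean, Ymean (the abscissa/ordinate of one corner).
    x_mean, y_mean = corner
    e = sum((x - x_mean) * (y - y_mean) for x, y in points)
    f = math.sqrt(sum((x - x_mean) ** 2 for x, _ in points)) * \
        math.sqrt(sum((y - y_mean) ** 2 for _, y in points))
    return e / f

# Pixel points collinear with the corner give |r| close to 1.
print(correlation([(1, 1), (2, 2), (3, 3)], (0, 0)))
```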
In one embodiment, the straight line corresponding to a side of the rectangle is determined by the following formula:
y = a × x_i + b;
where a = (n × C - B × D)/(n × A - B × B), b = (A × D - B × C)/(n × A - B × B), A = sum(Xi × Xi), B = sum(Xi), C = sum(Xi × Yi), D = sum(Yi), x_i is the abscissa of the i-th spare pixel point, and n is the total number of spare pixel points.
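The closed-form least-squares fit can be sketched as follows; it reads the formula as fitting y = a·x + b over the spare pixel points' coordinates, which is this sketch's interpretation of the garbled subscript in the source:

```python
def fit_line(points):
    # Least-squares line y = a*x + b over the spare pixel points,
    # using the patent's sums A, B, C, D.
    n = len(points)
    A = sum(x * x for x, _ in points)   # A = sum(Xi*Xi)
    B = sum(x for x, _ in points)       # B = sum(Xi)
    C = sum(x * y for x, y in points)   # C = sum(Xi*Yi)
    D = sum(y for _, y in points)       # D = sum(Yi)
    denom = n * A - B * B
    a = (n * C - B * D) / denom         # slope
    b = (A * D - B * C) / denom         # intercept
    return a, b

# Points on y = 2x + 1 recover slope 2 and intercept 1 exactly.
print(fit_line([(0, 1), (1, 3), (2, 5)]))  # (2.0, 1.0)
```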
Based on the same inventive concept, an embodiment of the present disclosure further provides a feature point transforming apparatus in an embedded system, and fig. 4 is a block diagram of the feature point transforming apparatus in the embedded system according to an embodiment of the present disclosure, where the feature point transforming apparatus 400 in the embedded system includes:
the construction module 410 is configured to input feature points into a preset rectangle for any adjacent image frame to be processed, and construct feature pixel points according to the feature points of the image frame to be processed in the rectangle;
the first determining module 420 is configured to subtract the vertical coordinates of the feature pixel points at the corresponding coordinate positions in the adjacent image frames to be processed to obtain a standby image frame;
a second determining module 430, configured to perform gaussian filtering and binarization on the spare image frame to obtain a black-and-white image frame;
the calculation module 440 is configured to perform edge calculation according to the rectangle and the pixel points in the black-and-white image frame to obtain coordinates of four corners of the rectangle;
a third determining module 450, configured to determine a target straight line corresponding to an edge of the rectangle according to coordinates of four corners of pixel points in the black-and-white image frame based on a least square method;
a fourth determining module 460 configured to determine a target feature point in the adjacent image frames to be processed according to the intersection point of the target straight line.
In one embodiment, the building module 410 is configured to:
calculating the difference value between the maximum value of each color channel and the parameter value of the color channel corresponding to the characteristic point in the rectangle;
taking the difference value as a reverse color parameter value of a color channel corresponding to the characteristic point to obtain reverse color of the characteristic point;
and constructing corresponding characteristic pixel points according to the reverse color parameter values of the characteristic points of the image frame to be processed in the rectangle.
In one embodiment, the second determining module 430 is configured to:
performing Gaussian filtering on the standby image frame through a Gaussian filtering matrix;
determining background pixel points from the Gaussian filtered image according to a preset gray threshold, and calculating the average gray value of the background pixel points to obtain a first average gray value and the average gray values of all the pixel points in the Gaussian filtered image to obtain a second average gray value;
determining a binarization threshold according to the first average gray value and the second average gray value;
and carrying out gray level binarization on the Gaussian filtered image according to the binarization threshold value to obtain a black-and-white image frame.
In one embodiment, the second determining module 430 is configured to:
and calculating the average value of the first average gray value and the second average gray value to obtain a third average gray value, and taking the third average gray value as the binarization threshold value.
In one embodiment, the calculation module 440 is configured to:
calculating the variance of all pixel points in the black-and-white image frame and the gradient information of adjacent pixel points;
and according to the sides of the rectangle, the variance and the gradient information, performing edge calculation on the rectangle to obtain coordinates of four corners of the rectangle.
In one embodiment, the third determining module 450 is configured to:
screening pixel points in the black-and-white image frame according to the coordinates of the four corners to obtain standby pixel points;
and determining a target straight line corresponding to the side of the rectangle according to the spare pixel point based on a least square method.
In one embodiment, the spare pixel points are obtained through the following formula:
fabs(r) > 1;
where r is a correlation function, r = E/F, E = sum[(Xi - Xmean) × (Yi - Ymean)], F = sqrt{sum[(Xi - Xmean) × (Xi - Xmean)]} × sqrt{sum[(Yi - Ymean) × (Yi - Ymean)]}, Xmean and Ymean are the abscissa and ordinate of any one of the four corners, and Xi and Yi are respectively the abscissa and ordinate of the i-th pixel point in the black-and-white image frame.
In one embodiment, the straight line corresponding to a side of the rectangle is determined by the following formula:
y = a × x_i + b;
where a = (n × C - B × D)/(n × A - B × B), b = (A × D - B × C)/(n × A - B × B), A = sum(Xi × Xi), B = sum(Xi), C = sum(Xi × Yi), D = sum(Yi), x_i is the abscissa of the i-th spare pixel point, and n is the total number of spare pixel points.
An embodiment of the present disclosure further provides an electronic device, including:
a memory having a computer program stored thereon;
a processor, configured to execute the computer program in the memory to implement the steps of the feature point transformation method in the embedded system in any one of the foregoing embodiments.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above embodiments merely illustrate several implementations of the present disclosure; their description is specific and detailed, but it is not to be construed as limiting the scope of the disclosure. It should be noted that those skilled in the art can make various changes and modifications without departing from the concept of the present disclosure, and all such changes and modifications fall within its protection scope. Therefore, the protection scope of the present disclosure shall be subject to the appended claims.
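As one illustration of the adaptive binarization recited in the claims below (binarization threshold = average of the background mean gray value and the global mean gray value), here is a sketch that assumes Gaussian filtering has already been applied (e.g. via scipy.ndimage.gaussian_filter) and assumes a preset gray threshold of 128; both assumptions are illustrative, not from the source.

```python
import numpy as np

def binarize(filtered, gray_threshold=128):
    """Adaptive binarization of a Gaussian-filtered gray image.

    Pixels at or below `gray_threshold` are treated as background (an
    assumption; the text only says the threshold is preset). The
    binarization threshold is the average of the background mean gray
    value (first average) and the global mean gray value (second
    average), i.e. the third average gray value of claim 4.
    """
    filtered = filtered.astype(np.float64)
    background = filtered[filtered <= gray_threshold]
    first_mean = background.mean()   # first average gray value
    second_mean = filtered.mean()    # second average gray value
    threshold = (first_mean + second_mean) / 2
    return np.where(filtered > threshold, 255, 0).astype(np.uint8)

frame = np.array([[10, 20, 200], [30, 210, 220]], dtype=np.uint8)
bw = binarize(frame)   # black-and-white image frame
```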

Claims (10)

1. A method for transforming feature points in an embedded system, characterized in that the method comprises the following steps:
inputting feature points into a preset rectangle for any pair of adjacent image frames to be processed, and constructing feature pixel points from the feature points of the image frames to be processed within the rectangle;
subtracting the vertical coordinates of the feature pixel points at corresponding coordinate positions in the adjacent image frames to be processed to obtain a spare image frame;
performing Gaussian filtering and binarization on the spare image frame to obtain a black-and-white image frame;
performing edge calculation according to the rectangle and the pixel points in the black-and-white image frame to obtain the coordinates of the four corners of the rectangle;
determining, based on a least squares method, a target straight line corresponding to each side of the rectangle from the pixel points in the black-and-white image frame according to the coordinates of the four corners; and
determining a target feature point in the adjacent image frames to be processed according to the intersection points of the target straight lines.
2. The method for transforming feature points in an embedded system according to claim 1, wherein the step of constructing feature pixel points from the feature points of the image frames to be processed within the rectangle comprises:
calculating, for each color channel, the difference between the maximum value of the channel and the channel's parameter value at the feature point within the rectangle;
taking the difference as the inverse-color parameter value of the corresponding color channel to obtain the inverse color of the feature point; and
constructing the corresponding feature pixel points from the inverse-color parameter values of the feature points of the image frames to be processed within the rectangle.
3. The method for transforming feature points in an embedded system according to claim 1, wherein the step of performing Gaussian filtering and binarization on the spare image frame to obtain a black-and-white image frame comprises:
performing Gaussian filtering on the spare image frame through a Gaussian filtering matrix;
determining background pixel points from the Gaussian-filtered image according to a preset gray threshold; calculating the average gray value of the background pixel points to obtain a first average gray value, and the average gray value of all pixel points in the Gaussian-filtered image to obtain a second average gray value;
determining a binarization threshold according to the first average gray value and the second average gray value; and
performing gray-level binarization on the Gaussian-filtered image according to the binarization threshold to obtain the black-and-white image frame.
4. The method according to claim 3, wherein the step of determining a binarization threshold according to the first average gray value and the second average gray value comprises:
calculating the average of the first average gray value and the second average gray value to obtain a third average gray value, and taking the third average gray value as the binarization threshold.
5. The method for transforming feature points in an embedded system according to claim 1, wherein the step of performing edge calculation according to the rectangle and the pixel points in the black-and-white image frame to obtain the coordinates of the four corners of the rectangle comprises:
calculating the variance of all pixel points in the black-and-white image frame and the gradient information of adjacent pixel points;
and performing edge calculation on the rectangle according to the sides of the rectangle, the variance, and the gradient information to obtain the coordinates of the four corners of the rectangle.
6. The method for transforming feature points in an embedded system according to any one of claims 1-5, wherein the step of determining, based on a least squares method, a target straight line corresponding to a side of the rectangle from the pixel points in the black-and-white image frame according to the coordinates of the four corners comprises:
screening the pixel points in the black-and-white image frame according to the coordinates of the four corners to obtain spare pixel points; and
determining, based on a least squares method, the target straight line corresponding to the side of the rectangle from the spare pixel points.
7. The method according to claim 6, wherein the spare pixel points are obtained through the following formula:
fabs(r) > 1;
wherein r is a correlation function, r = E/F, E = sum[(Xi − Xmean) × (Yi − Ymean)], F = sqrt{sum[(Xi − Xmean) × (Xi − Xmean)]} × sqrt{sum[(Yi − Ymean) × (Yi − Ymean)]}, Xmean and Ymean are the abscissa and ordinate of any one of the four corners, and Xi and Yi are respectively the abscissa and ordinate of the i-th pixel point in the black-and-white image frame.
8. The method according to claim 7, wherein the target straight line corresponding to the side of the rectangle is determined by the following formula:
y = a × n_i + b;
wherein a = (n × C − B × D)/(n × A − B × B), b = (A × D − B × C)/(n × A − B × B), A = sum(Xi × Xi), B = sum(Xi), C = sum(Xi × Yi), D = sum(Yi), n_i is the abscissa of the i-th spare pixel point, and n is the total number of spare pixel points.
9. A feature point transforming apparatus in an embedded system, the feature point transforming apparatus in the embedded system comprising:
a construction module configured to input feature points into a preset rectangle for any pair of adjacent image frames to be processed, and to construct feature pixel points from the feature points of the image frames to be processed within the rectangle;
a first determining module configured to subtract the vertical coordinates of the feature pixel points at corresponding coordinate positions in the adjacent image frames to be processed to obtain a spare image frame;
a second determining module configured to perform Gaussian filtering and binarization on the spare image frame to obtain a black-and-white image frame;
a calculation module configured to perform edge calculation according to the rectangle and the pixel points in the black-and-white image frame to obtain the coordinates of the four corners of the rectangle;
a third determining module configured to determine, based on a least squares method, a target straight line corresponding to each side of the rectangle from the pixel points in the black-and-white image frame according to the coordinates of the four corners; and
a fourth determining module configured to determine a target feature point in the adjacent image frames to be processed according to the intersection points of the target straight lines.
10. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the feature point transformation method in the embedded system according to any one of claims 1-8.
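The inverse-color construction of claim 2 can be sketched as follows. For 8-bit channels the maximum channel value is 255, so the inverse parameter value of each channel is simply 255 − v; the function name and the array representation of feature points are illustrative assumptions.

```python
import numpy as np

def invert_feature_points(pixels, channel_max=255):
    """Invert feature-point colors per claim 2: each channel's
    inverse-color parameter value is the difference between the
    channel maximum and the original parameter value (255 - v for
    8-bit channels)."""
    pixels = np.asarray(pixels, dtype=np.int64)
    return (channel_max - pixels).astype(np.uint8)

# A red pixel and a dark pixel, as (R, G, B) rows:
inv = invert_feature_points([[255, 0, 0], [10, 20, 30]])
```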
CN202210960214.8A 2022-08-11 2022-08-11 Method, device and equipment for changing feature points in embedded system Active CN115035356B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210960214.8A CN115035356B (en) 2022-08-11 2022-08-11 Method, device and equipment for changing feature points in embedded system


Publications (2)

Publication Number Publication Date
CN115035356A true CN115035356A (en) 2022-09-09
CN115035356B CN115035356B (en) 2022-12-06

Family

ID=83131118

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210960214.8A Active CN115035356B (en) 2022-08-11 2022-08-11 Method, device and equipment for changing feature points in embedded system

Country Status (1)

Country Link
CN (1) CN115035356B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130027599A1 (en) * 2011-07-28 2013-01-31 Aptos Technology Inc. Projection system and image processing method thereof
CN106954054A (en) * 2017-03-22 2017-07-14 成都市极米科技有限公司 A kind of image correction method, device and projecting apparatus
CN107423739A (en) * 2016-05-23 2017-12-01 北京陌上花科技有限公司 Image characteristic extracting method and device
CN112584113A (en) * 2020-12-02 2021-03-30 深圳市当智科技有限公司 Wide-screen projection method and system based on mapping correction and readable storage medium
WO2021136981A1 (en) * 2019-12-30 2021-07-08 Sensetime International Pte. Ltd. Image processing method and apparatus, and electronic device
CN114727081A (en) * 2022-06-09 2022-07-08 深圳新智联软件有限公司 Projector projection correction method and device and projector



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Method, device and equipment for changing feature points in embedded system

Effective date of registration: 20230918

Granted publication date: 20221206

Pledgee: Shenzhen small and medium sized small loan Co.,Ltd.

Pledgor: Shenzhen Xin Zhi Lian Software Co.,Ltd.

Registration number: Y2023980057251

CP03 Change of name, title or address

Address after: 518000, Commercial Building 305, Block C, Huameiju A, Area 82, Haiyu Community, Xin'an Street, Bao'an District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Xin Zhi Lian Software Co.,Ltd.

Country or region after: China

Address before: 518000 room 406, block F, Huafeng Baoan Zhigu science and Technology Innovation Park, Yintian Road, Yantian community, Xixiang street, Bao'an District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Xin Zhi Lian Software Co.,Ltd.

Country or region before: China