CN115131387A - Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing - Google Patents

Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing

Info

Publication number
CN115131387A
Authority
CN
China
Prior art keywords
edge information
point
growing
points
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211022129.3A
Other languages
Chinese (zh)
Other versions
CN115131387B (en)
Inventor
孙士刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Xiaochen Chemical Technology Co ltd
Original Assignee
Shandong Dingtai New Energy Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Dingtai New Energy Co ltd filed Critical Shandong Dingtai New Energy Co ltd
Priority to CN202211022129.3A priority Critical patent/CN115131387B/en
Publication of CN115131387A publication Critical patent/CN115131387A/en
Application granted granted Critical
Publication of CN115131387B publication Critical patent/CN115131387B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Abstract

The invention relates to the technical field of image processing, and in particular to a gasoline engine spray wall collision parameter automatic extraction method and system based on image processing. A spray image of an in-cylinder direct-injection gasoline engine fuel injector is acquired, and the initial spray area is extracted from it; the edge information corresponding to the initial spray area is obtained, the breaking point pairs and connecting point pairs corresponding to that edge information are acquired, and the missing edge information between each breaking point pair is reconstructed; the edge information to be selected of each connecting point pair is obtained and recorded, together with the detected edge information of each connecting point pair, as edge information to be detected; a judgment index is calculated from the first feature region and second feature region corresponding to each piece of edge information to be detected, and the termination edge information of each connecting point pair is selected according to this index, from which the final edge information is obtained. The spray area is then recovered from the final edge information, and the spray wall collision parameters are extracted from it. The method acquires the spray area accurately and can therefore extract the spray wall collision parameters accurately.

Description

Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing
Technical Field
The invention relates to the technical field of image processing, in particular to a gasoline engine spray wall collision parameter automatic extraction method and system based on image processing.
Background
In recent years, in order to improve the fuel economy of gasoline engines, more and more researchers have turned to in-cylinder direct injection. Because the quality of the spray characteristics directly determines the power performance, economy and emission performance of an in-cylinder direct-injection gasoline engine, research on these spray characteristics is very important, and accurate processing of the spray image together with accurate measurement of the spray wall collision parameters is a precondition for such research.
In the prior art, the spray image is generally preprocessed to obtain the spray boundary, a spray area image is then obtained from that boundary, and the spray wall collision parameters are finally derived from the spray area image; whether the spray boundary is extracted accurately therefore directly affects the accuracy of the wall collision parameters. The preprocessing typically uses the Otsu threshold segmentation method or the canny operator. The Otsu method segments images with a bimodal gray-level distribution well and yields a good boundary, but its effect on images with multiple gray-level clusters is poor, and because the information contained in a spray image is complex, the Otsu method alone cannot obtain the spray boundary accurately. The canny operator, in turn, is prone to broken detected edges, leading to lost edge information and incomplete boundary images.
Disclosure of Invention
In order to solve the technical problems, the invention aims to provide an automatic extraction method of gasoline engine spray wall-collision parameters based on image processing, which adopts the following technical scheme:
acquiring a spray image of a gasoline direct injection injector, and extracting an initial spray area in the spray image by utilizing an Otsu threshold segmentation method;
acquiring edge information corresponding to the initial spraying area through a canny operator, wherein the edge information is non-closed edge information;
acquiring a plurality of breaking point pairs corresponding to the edge information, wherein the edge information is lost between each breaking point pair, randomly selecting one breaking point from any breaking point pair as a growing point, and acquiring the lost edge information between the breaking point pairs based on the next growing point obtained by growing the growing point;
acquiring a plurality of connecting point pairs corresponding to the edge information, wherein a plurality of pieces of edge information exist between each connecting point pair, randomly selecting one connecting point from any one connecting point pair as a seed point, growing the seed point in an area surrounded by two pieces of edge information on the outermost side between the connecting point pairs, and acquiring a plurality of pieces of edge information to be selected between the connecting point pairs;
recording edge information and edge information to be selected between each connecting point pair as edge information to be detected, acquiring a first characteristic region and a second characteristic region corresponding to the edge information to be detected, calculating a judgment index corresponding to each piece of edge information to be detected between each connecting point pair based on an average gray value corresponding to the first characteristic region and the second characteristic region and a gray value of each pixel point on the edge information to be detected, and recording the edge information to be detected corresponding to the maximum judgment index as termination edge information between each connecting point pair;
obtaining final edge information corresponding to the initial spraying area based on the termination edge information and the obtained missing edge information between each breaking point pair; and acquiring a spraying area based on the final edge information, and extracting spraying wall collision parameters according to the spraying area.
Preferably, the method for acquiring the missing edge information between a breaking point pair includes: taking the growing point as the starting point, sequentially selecting a plurality of pixel points on the edge information where the growing point is located; calculating the included angle between the connecting line of every two adjacent selected pixel points and the horizontal line, and taking the difference of every two adjacent included angles as a difference feature value, attributed to the pixel point the two angles share. A number of non-edge pixel points on the vertical line adjacent to the growing point are then obtained and recorded as first pixel points; a preferred value is calculated for each first pixel point from the included angle between its connecting line to the growing point and the horizontal line and from the difference feature values, and the first pixel point with the maximum preferred value is taken as the next growing point and added to the edge information. This next growing point (the first next growing point) is then used as the growing point to continue growing and obtain the second next growing point, and so on, until the m-th next growing point lies on the vertical line of the non-edge pixel point adjacent to the other breaking point of the pair, at which point growth stops; the next growing points obtained in this way form the missing edge information between the breaking point pair, wherein m is greater than or equal to 1.
Preferably, the preferred value is:

$$Y_q=\exp\!\left(-\left|\left(\alpha_q-\theta\right)-\sum_{i=2}^{n-1} w_{q,i}\,c_i\right|\right),\qquad w_{q,i}=\frac{1/d_{q,i}}{\sum_{i'=2}^{n-1} 1/d_{q,i'}}$$

wherein $Y_q$ is the preferred value corresponding to the q-th first pixel point; $P_1$ is the 1st pixel point selected, taking the growing point as the starting point, on the edge information where the growing point is located, $P_2$ is the 2nd such pixel point and $P_i$ the i-th; $\alpha_q$ is the included angle between the connecting line of $P_1$ and the q-th first pixel point and the horizontal line; $\theta$ is the included angle between the connecting line of $P_1$ and $P_2$ and the horizontal line; $d_{q,i}$ is the Euclidean distance between the q-th first pixel point and $P_i$; $c_i$ is the difference feature value corresponding to $P_i$; $n$ is the total number of pixel points sequentially selected on the edge information where the growing point is located; $\exp(\cdot)$ is the exponential function with base e, $|\cdot|$ is the absolute-value function and $\max(\cdot)$ is the maximum function, the first pixel point with the maximum preferred value being taken as the next growing point.
Preferably, the method for acquiring a plurality of pairs of connection points corresponding to the edge information includes:
carrying out corner point detection on the edge information to obtain a plurality of corner points corresponding to the edge information; and randomly selecting one corner point, acquiring edge information between the corner point and the rest of other corner points by a chain code method, and marking the corner point and one of the rest of other corner points as a connecting point pair when a plurality of pieces of edge information exist between the corner point and the one of the rest of other corner points.
Preferably, the method for acquiring the first and second feature regions corresponding to a piece of edge information to be detected includes: recording the region of width a pixel points on the left side of the edge information to be detected as the first feature region, and the region of width a pixel points on its right side as the second feature region, wherein a is greater than or equal to 1.
Preferably, the judgment index is:

$$F=\frac{\left|\bar g_1-\bar g_2\right|}{\max\!\left(\bar g_1,\bar g_2\right)}\cdot e^{-\frac{1}{N-1}\sum_{j=1}^{N-1}\left|x_j-x_{j+1}\right|}$$

wherein $F$ is the judgment index; $\bar g_1$ is the average gray value corresponding to the first feature region; $\bar g_2$ is the average gray value corresponding to the second feature region; $x_j$ is the gray value of the j-th pixel point on the edge information to be detected and $x_{j+1}$ that of the (j+1)-th; $N$ is the total number of pixel points on the edge information to be detected; $\max(\cdot)$ is the maximum function and e is a natural constant.
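Since the original judgment-index formula is embedded as an unrecoverable image, the sketch below only follows the surrounding description: the index rewards contrast between the two feature regions and penalizes gray-value fluctuation along the candidate edge; the exact functional form and the function name are assumptions, not the patent's definition.

```python
import numpy as np

def judgment_index(region1_gray, region2_gray, edge_gray):
    """Judgment index for one candidate edge (assumed form:
    normalized contrast between the two feature regions, damped
    by the mean gray fluctuation along the edge itself)."""
    g1 = float(np.mean(region1_gray))
    g2 = float(np.mean(region2_gray))
    x = np.asarray(edge_gray, dtype=np.float64)
    # mean absolute gray difference between adjacent edge pixels
    fluct = np.abs(np.diff(x)).mean() if x.size > 1 else 0.0
    return abs(g1 - g2) / max(g1, g2, 1e-9) * np.exp(-fluct / 255.0)
```

The candidate with the maximal index would then be taken as the termination edge information, matching the selection rule in the text.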
Preferably, the spray wall collision parameters include the spray cone angle, spray penetration distance, spray radius and spray height.
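These parameters can be measured geometrically from the final spray mask. The definitions sketched below (nozzle at a given pixel on the top row, spray developing downward, cone angle subtended at the nozzle) are common conventions in spray imaging and are an assumption, since the patent does not define them at this point; all names are illustrative.

```python
import numpy as np

def spray_parameters(mask, nozzle=(0, None)):
    """Geometric spray parameters from a binary spray mask
    (row 0 = nozzle plane, spray develops downward).
    Returns (penetration, height, radius, cone_angle_deg)."""
    ys, xs = np.nonzero(mask)
    ny = nozzle[0]
    nx = nozzle[1] if nozzle[1] is not None else mask.shape[1] // 2
    # penetration distance: farthest spray pixel from the nozzle tip
    penetration = np.hypot(ys - ny, xs - nx).max()
    # spray height: axial extent below the nozzle plane
    height = (ys - ny).max()
    # spray radius: maximal half-width of the spray
    radius = (xs.max() - xs.min()) / 2.0
    # cone angle: full angle subtended at the nozzle by the
    # leftmost and rightmost spray pixels
    left = np.arctan2((xs - nx).min(), height)
    right = np.arctan2((xs - nx).max(), height)
    return (float(penetration), float(height), float(radius),
            float(np.degrees(right - left)))
```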
The invention also provides a gasoline engine spray wall collision parameter automatic extraction system based on image processing, which comprises a processor and a memory, the processor executing a program, stored in the memory, of the gasoline engine spray wall collision parameter automatic extraction method based on image processing described above.
The embodiment of the invention at least has the following beneficial effects:
according to the invention, the missing edge information between the breaking point pairs corresponding to the growing points is obtained through the next growing points obtained by growing the growing points, so that the missing edge information formed by each next growing point can better accord with the edge characteristics of the edge information, and the obtained missing edge information is more real; recording edge information between each connecting point pair and edge information to be selected as edge information to be detected between each connecting point pair, calculating a judgment index according to an average gray value of a first characteristic region and a second characteristic region corresponding to the edge information to be detected and a gray value of each pixel point on the edge information to be detected, and obtaining termination edge information between each connecting point pair according to the judgment index; the termination edge information obtained by judging the index is more accurate edge information between the corresponding connecting point pairs; obtaining final edge information according to the missing edge information and the terminating edge information between the obtained breaking point pairs, so that the obtained final edge information has no breaking condition and can completely obtain the boundary of the spraying area; the method can accurately acquire the spraying area, and further accurately extract the spraying wall collision parameters.
Drawings
In order to more clearly illustrate the embodiments of the present invention and the technical solutions of the prior art, the drawings used in their description are briefly introduced below. It is obvious that the drawings described in the following are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart illustrating steps of an embodiment of an automatic extraction method for gasoline engine spray wall collision parameters based on image processing according to the present invention;
FIG. 2 is a schematic illustration of a spray image;
figure 3 is a schematic view of a corner point.
Detailed Description
To further explain the technical means adopted by the invention to achieve its intended objects and their effects, the proposed solution, its specific implementation, structure, features and effects are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The purpose of the invention is: acquiring missing edge information between breaking point pairs corresponding to the growing points through the next growing point obtained by growing the growing points, and then acquiring terminating edge information between each connecting point pair through judging indexes; obtaining accurate final edge information corresponding to the initial spraying area according to the termination edge information and the obtained missing edge information between each breaking point pair; and acquiring a spraying area based on the final edge information, and further accurately extracting spraying wall collision parameters.
Referring to fig. 1, a flowchart illustrating steps of a gasoline engine spray wall collision parameter automatic extraction method based on image processing according to an embodiment of the present invention is shown, where the method includes the following steps:
step 1, obtaining a spray image of a gasoline direct injection engine fuel injector, and extracting an initial spray area in the spray image by using an Otsu threshold segmentation method.
Specifically, a high-speed camera is used to obtain a spray image of the gasoline direct-injection engine injector; the spray image is stored in the camera as an 8-bit unsigned grayscale digital image, as shown in fig. 2, in which the brightness of each pixel is represented by a value between 0 and 255, with 0 representing black, 255 representing white and intermediate values representing shades of gray.
And then extracting an initial spraying area in the spraying image by using an Otsu threshold segmentation method, namely, taking the initial spraying area as a foreground and other areas as backgrounds, and segmenting the initial spraying area by using the Otsu threshold segmentation method. The Otsu threshold segmentation method is a well-known technique and is not described in detail.
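The Otsu step above selects the gray level that maximizes the between-class variance of foreground and background. A minimal NumPy sketch of this standard method follows (the patent only cites Otsu; the function names are illustrative, and the spray is assumed to be the brighter class):

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu threshold of an 8-bit grayscale image: the gray level
    that maximizes the between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def extract_initial_spray_area(gray):
    """Binary mask: spray (assumed brighter) vs. background."""
    return (gray >= otsu_threshold(gray)).astype(np.uint8)
```

On a bimodal image the threshold falls between the two modes, which is the case Otsu handles well according to the background section.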
Step 2, obtaining edge information corresponding to the initial spraying area through a canny operator, wherein the edge information is non-closed edge information; the method comprises the steps of obtaining a plurality of breaking point pairs corresponding to edge information, wherein the edge information is lost between each breaking point pair, randomly selecting one breaking point from any breaking point pair as a growing point, and obtaining the lost edge information between the breaking point pairs based on the next growing point obtained by growing the growing point.
The edge information corresponding to the initial spray area is obtained through the canny operator. The canny operator has poor noise immunity: small amounts of noise can cause large deviations in the detected image boundary, and local noise is amplified while the boundary is detected, so that broken edge information or noise edge information is easily produced. The edge information corresponding to the initial spray area is therefore not accurate; it may contain broken (missing) edge information as well as noise edge information, and the missing edge information needs to be recovered in order to obtain the edge information of the initial spray area accurately.
Because the edge information is non-closed, the breaking point pairs corresponding to the edge information are obtained first. Specifically, each end point corresponding to the edge information is obtained, and when edge information is lacking between two adjacent end points, those two end points form a breaking point pair. For example, suppose the end points corresponding to the obtained edge information are, in sequence, end point 1, end point 2, end point 3 and end point 4: if edge information exists between end points 1 and 2 and between end points 3 and 4 but not between end points 2 and 3, then end points 2 and 3 are recorded as a breaking point pair, end point 2 being one breaking point of the pair and end point 3 the other. Edge information should exist between end points 2 and 3, but the canny operator failed to detect it, which is why they form a breaking point pair.
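One simple way to realize the end-point and breaking-point-pair search just described is sketched below, assuming a one-pixel-wide binary edge map: end points are edge pixels with exactly one edge pixel among their 8 neighbours. The pairing rule here (nearest unused end point within a gap limit) is a simplification of the adjacency rule in the text, and all names are illustrative.

```python
import numpy as np

def edge_endpoints(edge):
    """End points of a one-pixel-wide edge map: edge pixels with
    exactly one edge pixel among their 8 neighbours."""
    pts = []
    for y, x in zip(*np.nonzero(edge)):
        # neighbourhood sum minus the pixel itself
        nb = edge[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2].sum() - 1
        if nb == 1:
            pts.append((y, x))
    return pts

def break_point_pairs(edge, max_gap=20):
    """Pair up end points facing each other across a missing stretch
    of edge (simplified: closest unused end point within max_gap)."""
    pts = edge_endpoints(edge)
    pairs, used = [], set()
    for i, p in enumerate(pts):
        if i in used:
            continue
        best, best_d = None, max_gap + 1
        for j, q in enumerate(pts):
            if j <= i or j in used:
                continue
            d = np.hypot(p[0] - q[0], p[1] - q[1])
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            pairs.append((p, pts[best]))
            used.update({i, best})
    return pairs
```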
And then randomly selecting one breaking point from any breaking point pair as a growing point, and obtaining the missing edge information between the breaking point pairs based on the next growing point obtained by growing the growing point.
The method for acquiring the missing edge information between the fracture point pairs comprises the following steps:
sequentially selecting a plurality of pixel points on edge information where the growth points are located by taking the growth points as starting points, calculating an included angle between a connecting line of any two adjacent pixel points and a horizontal line, and calculating a difference value of two adjacent included angles to obtain a difference characteristic value, wherein the difference characteristic value is the difference characteristic value of common pixel points corresponding to the two adjacent included angles; then, a plurality of non-edge information pixel points are obtained on a vertical line where the non-edge information pixel points adjacent to the growing point are located, and are marked as first pixel points; in the embodiment, 3 non-edge information pixel points, namely 3 first pixel points, are obtained, and in the actual operation process, an implementer can adjust the non-edge information pixel points according to the actual situation; and then, taking the first next growing point as a growing point to continuously grow to obtain a second next growing point, and so on until the obtained mth next growing point is on the vertical line of the non-edge information pixel point adjacent to the other breaking point in the breaking point pair, stopping growing, and obtaining each next growing point to form missing edge information between the breaking point pair.
The preferred value is:

$$Y_q=\exp\!\left(-\left|\left(\alpha_q-\theta\right)-\sum_{i=2}^{n-1} w_{q,i}\,c_i\right|\right),\qquad w_{q,i}=\frac{1/d_{q,i}}{\sum_{i'=2}^{n-1} 1/d_{q,i'}}$$

wherein $Y_q$ is the preferred value corresponding to the q-th first pixel point; $P_1$ is the 1st pixel point selected, taking the growing point as the starting point, on the edge information where the growing point is located, $P_2$ is the 2nd such pixel point and $P_i$ the i-th; $\alpha_q$ is the included angle between the connecting line of $P_1$ and the q-th first pixel point and the horizontal line; $\theta$ is the included angle between the connecting line of $P_1$ and $P_2$ and the horizontal line; $d_{q,i}$ is the Euclidean distance between the q-th first pixel point and $P_i$; $c_i$ is the difference feature value corresponding to $P_i$; $n$ is the total number of pixel points sequentially selected on the edge information where the growing point is located; $\exp(\cdot)$ is the exponential function with base e, $|\cdot|$ is the absolute-value function and $\max(\cdot)$ is the maximum function. The implementer chooses $n$ according to the actual situation: when the trend of the edge information where the growing point is located is relatively smooth, fewer pixel points may be selected, and when it is not smooth, correspondingly more pixel points should be selected. The calculation of the Euclidean distance and of the included angles is well-known technology and is not described further.

The preferred value characterizes the similarity between the actual difference feature value of $P_1$ obtained from the q-th first pixel point, $\alpha_q-\theta$, and its predicted difference feature value, $\sum_i w_{q,i}c_i$: the greater the similarity, the greater the possibility that the q-th first pixel point is the next growing point, and the greater the preferred value. The predicted difference feature value of $P_1$ is obtained from the trend of the edge information where the growing point is located, as captured by the difference feature values of the sequentially selected pixel points. The prediction takes the distance between each sequentially selected pixel point and the q-th first pixel point into account, a closer pixel point receiving a larger weight for its difference feature value, so that the predicted difference feature value of $P_1$ is obtained more accurately.

It should be noted that the difference feature value at the growing point is predicted from the difference feature values calculated on the sequentially selected pixel points, and the preferred value of each first pixel point as a candidate for the missing edge information is obtained from this prediction. The preferred value considers not only the angle change trend of the edge information, i.e. the difference feature values, but also the influence of distance on the predicted difference feature value, which makes the prediction more accurate. Moreover, each accepted next growing point is added to the sequentially selected pixel points used for the following prediction: when the predicted difference feature value for the second next growing point is calculated, the first next growing point is included; when the third is calculated, the first and second are included, and so on. Because every prediction feeds into the subsequent ones, the missing edge information formed by the next growing points conforms to the edge characteristics of the existing edge information, and the obtained missing edge information is more realistic.
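The growth step can be sketched as follows. Since the original preferred-value formula is embedded as an image, the functional form here (inverse-distance weighting of the difference feature values, scored by exp of the negative deviation) is an assumption consistent with the description, and all names are illustrative; coordinates are (row, column).

```python
import numpy as np

def angle_deg(p, q):
    """Angle (degrees) of segment p->q against the horizontal."""
    return np.degrees(np.arctan2(q[0] - p[0], q[1] - p[1]))

def preferred_values(edge_pts, candidates):
    """edge_pts: pixels of the known edge ordered along the growth
    direction, edge_pts[-1] being the growing point (at least 3 pts);
    candidates: the 'first pixel points' on the adjacent vertical line.
    Each candidate scores exp(-|actual - predicted|), where 'actual'
    is the angle change the candidate would introduce and 'predicted'
    is an inverse-distance-weighted mean of the difference feature
    values (successive angle changes) along the known edge."""
    pts = [np.asarray(p, float) for p in edge_pts]
    angles = [angle_deg(pts[i], pts[i + 1]) for i in range(len(pts) - 1)]
    diffs = np.diff(angles)                        # difference feature values
    interior = np.asarray(edge_pts[1:-1], float)   # their common pixels
    grow = pts[-1]
    vals = []
    for c in candidates:
        c = np.asarray(c, float)
        actual = angle_deg(grow, c) - angles[-1]
        d = np.hypot(interior[:, 0] - c[0], interior[:, 1] - c[1])
        w = 1.0 / np.maximum(d, 1e-9)              # closer -> larger weight
        predicted = float((w / w.sum() * diffs).sum())
        vals.append(float(np.exp(-abs(actual - predicted))))
    return vals
```

The candidate with the maximum preferred value becomes the next growing point and is appended to `edge_pts` before the following prediction, matching the cumulative behaviour described above.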
And 3, acquiring a plurality of connecting point pairs corresponding to the edge information, wherein a plurality of pieces of edge information exist between each connecting point pair, randomly selecting one connecting point from any one connecting point pair as a seed point, growing the seed point in an area surrounded by two pieces of edge information on the outermost side between the connecting point pairs, and acquiring a plurality of pieces of edge information to be selected between the connecting point pairs.
Firstly, a plurality of connecting point pairs corresponding to the edge information are obtained, and the method for obtaining the plurality of connecting point pairs specifically comprises the following steps:
carrying out corner detection on the edge information by using a Harris corner detection algorithm to obtain a plurality of corners corresponding to the edge information; and randomly selecting one corner point, acquiring edge information between the corner point and the rest of other corner points by a chain code method, and marking the corner point and one of the rest of other corner points as a connecting point pair when a plurality of pieces of edge information exist between the corner point and the one of the rest of other corner points. The implementer may also select another corner detection algorithm to obtain a plurality of corners corresponding to the edge information, and both the Harris corner detection algorithm and the chain code method are known technologies and are not further described.
For example, fig. 3 shows a schematic diagram of the corner points corresponding to the edge information; there, corner point d and corner point e form a connecting point pair, i.e. corner point d is one connecting point of the pair and corner point e is the other.
Then one connecting point is randomly selected from any connecting point pair as a seed point, and the seed point is grown within the area enclosed by the two outermost pieces of edge information between the connecting point pair to obtain the pieces of to-be-selected edge information between that pair. For example, corner point d is taken as the seed point and corner point e as the end point, and the seed point is grown in the region Q enclosed by the two outermost pieces of edge information between the connecting point pair. The growing directions are the 0°, 45° and -45° directions of the seed point. When the difference between the gray value of a pixel point in a growing direction and the gray value of the seed point is smaller than a difference threshold, the seed point grows once, and the pixel point in that direction becomes the new seed point; growing continues until the difference exceeds the threshold or the end point (corner point e) is reached, at which point growing stops and the to-be-selected edge information is obtained. Because of the many interference factors in the growing process, several pieces of to-be-selected edge information may be obtained. The value of the difference threshold differs between spray images and is set by the implementer according to the specific situation.
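A minimal sketch of this directional growing, assuming the image is given as a dict of gray values over pixel coordinates and that x increases from the seed towards the end point (the data structures and the threshold value are assumptions, not the patent's implementation):

```python
def grow_candidate_edges(gray, seed, end, region, diff_threshold=10):
    """Grow from `seed` towards `end` inside `region`, stepping only in the
    0°, 45° and -45° directions; a step is accepted when the gray-level
    difference to the current point is below `diff_threshold`.  Every path
    that reaches `end` is one piece of to-be-selected edge information."""
    steps = [(1, 0), (1, 1), (1, -1)]            # 0°, 45°, -45°
    candidates = []
    stack = [[seed]]
    while stack:
        path = stack.pop()
        x, y = path[-1]
        if (x, y) == end:
            candidates.append(path)
            continue
        for dx, dy in steps:
            nxt = (x + dx, y + dy)
            if nxt in region and abs(gray[nxt] - gray[path[-1]]) < diff_threshold:
                stack.append(path + [nxt])
    return candidates

# Region Q between seed d=(0,0) and end point e=(2,0); the bright pixel (1,1)
# blocks one growing direction, leaving two candidate edges.
region = {(0, 0), (1, 0), (1, 1), (1, -1), (2, 0)}
gray = {p: 100 for p in region}
gray[(1, 1)] = 200
candidates = grow_candidate_edges(gray, (0, 0), (2, 0), region)
```

Because the growth explores every admissible direction, several candidate paths can survive, matching the observation that a plurality of pieces of to-be-selected edge information may be obtained.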
It should be noted that the existence of multiple pieces of edge information between one connecting point pair is caused by inaccuracy of the canny operator. Since the canny operator detects several pieces of edge information between a connecting point pair, exactly one accurate piece of edge information must exist between that pair. Therefore, one connecting point is randomly selected from the pair as a seed point, the to-be-selected edge information between the pair is obtained by growing the seed point, and the accurate piece of edge information is then chosen from among the to-be-selected edge information and the detected edge information between the connecting point pair.
Step 4: record the edge information and the to-be-selected edge information between each connecting point pair as to-be-detected edge information, acquire the first and second characteristic regions corresponding to the to-be-detected edge information, calculate the judgment index of each piece of to-be-detected edge information between each connecting point pair from the average gray values of the first and second characteristic regions and the gray values of the pixel points on the to-be-detected edge information, and record the to-be-detected edge information with the maximum judgment index as the termination edge information between that connecting point pair.
The method for acquiring the first and second characteristic regions corresponding to the to-be-detected edge information is as follows: the region with a width of a pixel points on the left side of the to-be-detected edge information is recorded as the first characteristic region, and the region with a width of a pixel points on its right side as the second characteristic region, where a ≥ 1. In this embodiment a = 2; the implementer may choose the value of a according to the specific situation.
Specifically, the judgment index is computed as follows (the formula is published only as an image in the original document):

[formula image not reproduced]

wherein P is the judgment index; G1 is the average gray value of the first characteristic region; G2 is the average gray value of the second characteristic region; g_j is the gray value of the j-th pixel point on the to-be-detected edge information and g_{j+1} that of the (j+1)-th; n is the total number of pixel points on the to-be-detected edge information; max() is the maximum function; and e is the natural constant. (The symbol names are assigned here for readability; the original renders the formula and each symbol as images.)
The judgment index represents the probability that a piece of to-be-detected edge information is the termination edge information; the larger it is, the more likely the corresponding to-be-detected edge information is the termination edge information. The first factor (the adjacent-pixel term, shown as an image in the original) represents the degree of difference between every two adjacent pixel points on the to-be-detected edge information: the larger it is, the smaller the similarity between adjacent pixel points, i.e. the more disordered the gray distribution of the pixel points, and the less likely the to-be-detected edge information is the termination edge information. The second factor (the region term) represents the difference between the first and second characteristic regions corresponding to the to-be-detected edge information: the larger it is, the better the segmentation effect of the to-be-detected edge information and the more likely it is the termination edge information. From this analysis, the adjacent-pixel term is negatively correlated with the judgment index, though not linearly, and the region term is positively correlated with it, likewise not linearly; the calculation formula of the judgment index is therefore constructed by mathematical modeling so that these relationships are satisfied.
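Since the formula itself is published only as images, the sketch below assumes one plausible form that satisfies the stated relationships: an exponential decay in the mean adjacent-pixel gray difference (negative and non-linear) multiplied by the gray difference of the two flanking regions normalized by the larger mean (positive, using the max function mentioned in the text). The exact patented expression may differ:

```python
import math

def judgment_index(edge_grays, mean_region1, mean_region2):
    """Judgment index for one piece of to-be-detected edge information.
    Assumed form: decays with the mean gray difference between adjacent
    edge pixels; grows with the normalized gray difference between the
    two flanking characteristic regions."""
    n = len(edge_grays)
    adjacent_diff = sum(abs(edge_grays[j] - edge_grays[j + 1])
                        for j in range(n - 1)) / (n - 1)
    region_term = abs(mean_region1 - mean_region2) / max(mean_region1, mean_region2)
    return region_term * math.exp(-adjacent_diff)

# A smooth edge separating a dark and a bright region scores higher than a
# noisy edge whose flanking regions are nearly identical.
smooth = judgment_index([100, 101, 100, 102], 40.0, 200.0)
noisy = judgment_index([100, 150, 90, 160], 120.0, 125.0)
```

Under this assumed form the candidate with the largest index, like the termination edge information in step 4, is the smooth, well-separating edge.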
It should be noted that, as described in step 3, exactly one piece of accurate edge information exists between each connecting point pair, and the termination edge information is that accurate piece of edge information between the corresponding connecting point pair.
Step 5: obtain the final edge information corresponding to the initial spraying area based on the termination edge information and the obtained missing edge information between each breaking point pair; acquire the spraying area based on the final edge information, and extract the spray wall collision parameters from the spraying area.
The final edge information comprises the edge information obtained with the canny operator, the termination edge information, and the obtained missing edge information between each breaking point pair. Here, the edge information obtained with the canny operator refers to edges for which only one piece of edge information exists between two corner points, such as the edge information between corner point a and corner point b, between corner point c and corner point d, and between corner point e and corner point a in fig. 3. The final edge information thus gives an accurate boundary of the spraying area, so that the spraying area, and in turn the spray wall collision parameters, can be accurately extracted.
The spray wall collision parameters include the spray cone angle, the spray penetration distance, the spray radius, and the spray height. The spray cone angle is the included angle between the tangent lines on the two sides of the spray from the nozzle position to 1/2 of the spray penetration distance. The spray penetration distance is the distance from the nozzle position to the farthest end of the developing spray before wall impingement, and the vertical distance from the nozzle position to the wall surface after wall impingement. The spray radius is the distance the jet spreads along the wall surface after the spray hits the wall. The spray height is the entrainment height of the spray perpendicular to the wall surface, outward along the wall direction, after the free spray hits the wall.
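As an illustration of what such an extraction might look like, the sketch below computes two of the parameters from a binary spray mask; the representation (a set of pixel coordinates with y increasing toward the wall), the nozzle position and the wall row are all assumptions, not the patent's procedure:

```python
def spray_parameters(mask, nozzle, wall_y):
    """From a binary spray mask (set of (x, y) pixels), the nozzle pixel and
    the wall row, compute:
    - the penetration distance after impingement (vertical nozzle-to-wall
      distance), and
    - the spray radius (half the spread of the spray along the wall row)."""
    nozzle_x, nozzle_y = nozzle
    penetration = wall_y - nozzle_y
    wall_xs = [x for (x, y) in mask if y == wall_y]
    radius = (max(wall_xs) - min(wall_xs)) / 2 if wall_xs else 0.0
    return penetration, radius

# Nozzle at (5, 0), wall at row 10, spray spreading from x=2 to x=8 on the wall.
mask = {(5, 0)} | {(x, 10) for x in range(2, 9)}
penetration, radius = spray_parameters(mask, (5, 0), 10)
```

The spray cone angle and spray height would similarly be measured from the mask geometry once the final edge information fixes the region boundary.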
it should be noted that extracting the spray wall collision parameter according to the spray area is a known technique and will not be described in detail.
The invention also provides a gasoline engine spray wall-hitting parameter automatic extraction system based on image processing, which comprises a processor and a memory, wherein the processor executes a program of the gasoline engine spray wall-hitting parameter automatic extraction method based on image processing stored in the memory.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; the modifications or substitutions do not make the essence of the corresponding technical solutions deviate from the technical solutions of the embodiments of the present application, and are included in the protection scope of the present application.

Claims (8)

1. A gasoline engine spray wall collision parameter automatic extraction method based on image processing is characterized by comprising the following steps:
acquiring a spray image of a gasoline direct injection injector, and extracting an initial spray area in the spray image by utilizing an Otsu threshold segmentation method;
acquiring edge information corresponding to the initial spraying area through a canny operator, wherein the edge information is non-closed edge information;
acquiring a plurality of breaking point pairs corresponding to the edge information, wherein the edge information is lost between each breaking point pair, randomly selecting one breaking point from any breaking point pair as a growing point, and acquiring the lost edge information between the breaking point pairs based on the next growing point obtained by growing the growing point;
acquiring a plurality of connecting point pairs corresponding to the edge information, wherein a plurality of pieces of edge information exist between each connecting point pair, randomly selecting one connecting point from any one connecting point pair as a seed point, growing the seed point in an area surrounded by two pieces of edge information on the outermost side between the connecting point pairs, and acquiring a plurality of pieces of edge information to be selected between the connecting point pairs;
recording edge information and edge information to be selected between each connecting point pair as edge information to be detected, acquiring a first characteristic region and a second characteristic region corresponding to the edge information to be detected, calculating a judgment index corresponding to each piece of edge information to be detected between each connecting point pair based on an average gray value corresponding to the first characteristic region and the second characteristic region and a gray value of each pixel point on the edge information to be detected, and recording the edge information to be detected corresponding to the maximum judgment index as termination edge information between each connecting point pair;
obtaining final edge information corresponding to the initial spraying area based on the termination edge information and the obtained missing edge information between each breaking point pair; and acquiring a spraying area based on the final edge information, and extracting spraying wall collision parameters according to the spraying area.
2. The automatic extraction method of gasoline engine spray wall collision parameters based on image processing as claimed in claim 1, wherein the method for obtaining the missing edge information between the pair of fracture points is: sequentially selecting a plurality of pixel points on edge information where the growth points are located by taking the growth points as starting points, calculating an included angle between a connecting line of any two adjacent pixel points and a horizontal line, and calculating a difference value of two adjacent included angles to obtain a difference characteristic value, wherein the difference characteristic value is the difference characteristic value of common pixel points corresponding to the two adjacent included angles; then obtaining a plurality of non-edge information pixel points on the vertical line of the non-edge information pixel points adjacent to the growing point, marking the non-edge information pixel points as first pixel points, calculating the optimal value corresponding to each first pixel point according to the included angle between the connecting line of each first pixel point and the growth point and the horizontal line and the difference characteristic value, taking the first pixel point corresponding to the maximum optimal value as the next growth point, and the next growing point is classified into the edge information, the next growing point at the moment is the first next growing point, then the first next growing point is taken as a growing point to continue growing to obtain a second next growing point, by analogy, until the obtained mth next growth point stops growing on the vertical line where the non-edge information pixel point adjacent to the other fracture point in the fracture point pair is located, the obtained next growth points form missing edge information between the fracture point pairs; wherein m is greater than or equal to 1.
3. The gasoline engine spray wall-collision parameter automatic extraction method based on image processing as claimed in claim 2, wherein the preferred values are:
[formula image not reproduced]

wherein F_q is the preferred value corresponding to the q-th first pixel point; x_1, x_2 and x_i are respectively the 1st, 2nd and i-th pixel points sequentially selected on the edge information where the growing point is located, taking the growing point as the starting point; θ_q is the angle between the line connecting x_1 and the q-th first pixel point and the horizontal line; θ_{1,2} is the angle between the line connecting x_1 and x_2 and the horizontal line; d_{q,i} is the Euclidean distance between the q-th first pixel point and x_i; c_i is the difference characteristic value corresponding to x_i; N is the total number of pixel points sequentially selected on the edge information where the growing point is located, taking the growing point as the starting point; exp() is an exponential function with e as the base; |·| is the absolute value function; and max() is the maximum function. (The symbol names are assigned here for readability; the original renders the formula and each symbol as images.)
4. The method for automatically extracting the gasoline engine spray wall collision parameter based on the image processing as claimed in claim 1, wherein the method for acquiring the plurality of connecting point pairs corresponding to the edge information comprises:
carrying out corner detection on the edge information to obtain a plurality of corners corresponding to the edge information; and randomly selecting one corner point, acquiring edge information between the corner point and the rest of other corner points by a chain code method, and marking the corner point and one of the rest of other corner points as a connecting point pair when a plurality of pieces of edge information exist between the corner point and the rest of other corner points.
5. The method for automatically extracting the gasoline engine spray wall collision parameter based on the image processing as claimed in claim 1, wherein the method for acquiring the first and second characteristic regions corresponding to the edge information to be detected comprises the following steps: and recording the region with the width of a pixel points on the left side of the edge information to be detected as a first characteristic region, and recording the region with the width of a pixel points on the right side of the edge information to be detected as a second characteristic region, wherein a is more than or equal to 1.
6. The gasoline engine spray wall-collision parameter automatic extraction method based on image processing as claimed in claim 1, wherein the determination index is:
[formula image not reproduced]

wherein P is the determination index; G1 and G2 are the average gray values corresponding to the first and second characteristic regions respectively; g_j and g_{j+1} are the gray values of the j-th and (j+1)-th pixel points on the edge information to be detected; n is the total number of pixel points on the edge information to be detected; max() is the maximum function; and e is the natural constant. (The symbol names are assigned here for readability; the original renders the formula and each symbol as images.)
7. The method for automatically extracting the gasoline engine spray impingement wall parameters based on the image processing as claimed in claim 1, wherein the spray impingement wall parameters comprise a spray cone angle, a spray penetration distance, a spray radius and a spray height.
8. An automatic extraction system for gasoline engine spray wall-hitting parameters based on image processing, comprising a processor and a memory, wherein the processor executes the program of the automatic extraction method for gasoline engine spray wall-hitting parameters based on image processing, which is stored in the memory, according to any one of claims 1 to 7.
CN202211022129.3A 2022-08-25 2022-08-25 Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing Active CN115131387B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211022129.3A CN115131387B (en) 2022-08-25 2022-08-25 Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing

Publications (2)

Publication Number Publication Date
CN115131387A true CN115131387A (en) 2022-09-30
CN115131387B CN115131387B (en) 2023-01-24

Family

ID=83387766

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211022129.3A Active CN115131387B (en) 2022-08-25 2022-08-25 Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing

Country Status (1)

Country Link
CN (1) CN115131387B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050047678A1 (en) * 2003-09-03 2005-03-03 Jones James L. Image change detection systems, methods, and articles of manufacture
CN107220988A (en) * 2017-04-30 2017-09-29 南京理工大学 Based on the parts image edge extraction method for improving canny operators
CN108564091A (en) * 2018-03-08 2018-09-21 佛山市云米电器科技有限公司 Target area weak boundary extracting method and oil smoke concentration detection and interference elimination method
CN109102518A (en) * 2018-08-10 2018-12-28 广东工业大学 A kind of method of Image Edge-Detection, system and associated component
CN109102517A (en) * 2018-08-10 2018-12-28 广东工业大学 A kind of method of Image Edge-Detection, system and associated component
WO2019041590A1 (en) * 2017-08-31 2019-03-07 中国科学院微电子研究所 Edge detection method using arbitrary angle
CN109558908A (en) * 2018-11-28 2019-04-02 西安邮电大学 A kind of determination method of given area optimal edge
CN111127498A (en) * 2019-12-12 2020-05-08 重庆邮电大学 Canny edge detection method based on edge self-growth
CN112053345A (en) * 2020-09-02 2020-12-08 长春大学 GDI gasoline engine spraying wall-collision parameter automatic extraction method and system based on machine vision
CN112070792A (en) * 2020-08-25 2020-12-11 清华大学 Edge growth connection method and device for image segmentation
CN113269791A (en) * 2021-04-26 2021-08-17 西安交通大学 Point cloud segmentation method based on edge judgment and region growth
CN113343976A (en) * 2021-05-13 2021-09-03 武汉大学 Anti-highlight interference engineering measurement mark extraction method based on color-edge fusion feature growth
CN113935992A (en) * 2021-12-15 2022-01-14 武汉和众成设备工贸有限公司 Image processing-based oil pollution interference resistant gear crack detection method and system
CN113989313A (en) * 2021-12-23 2022-01-28 武汉智博通科技有限公司 Edge detection method and system based on image multidimensional analysis
CN114399522A (en) * 2022-01-14 2022-04-26 东南大学 High-low threshold-based Canny operator edge detection method
CN114627140A (en) * 2022-05-16 2022-06-14 新风光电子科技股份有限公司 Coal mine ventilator intelligent adjusting method based on high-voltage frequency converter
CN114937039A (en) * 2022-07-21 2022-08-23 阿法龙(山东)科技有限公司 Intelligent detection method for steel pipe defects

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
MEIPING SONG: "Crack Detection Algorithm for Photovoltaic Image Based on Multi-Scale Pyramid and Improved Region Growing", 《2018 IEEE 3RD INTERNATIONAL CONFERENCE ON IMAGE, VISION AND COMPUTING (ICIVC)》 *
XUEFENG NI等: "Detection for Rail Surface Defects via Partitioned Edge Feature", 《IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS》 *
SHI QINGNAN: "Image Edge Detection and Segmentation Based on Ant Colony Algorithm", 《China Master's Theses Full-text Database, Information Science and Technology》 *
ZHANG NA: "Research on Segmentation Methods for Complex Curved Surface Structure Models Based on Multi-view Region Growing", 《China Master's Theses Full-text Database, Information Science and Technology》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116843682A (en) * 2023-08-30 2023-10-03 江苏太湖锅炉股份有限公司 Boiler thermal efficiency on-line detection and analysis system using thermal infrared imager
CN116843682B (en) * 2023-08-30 2023-11-03 江苏太湖锅炉股份有限公司 Boiler thermal efficiency on-line detection and analysis system using thermal infrared imager

Also Published As

Publication number Publication date
CN115131387B (en) 2023-01-24

Similar Documents

Publication Publication Date Title
CN115082419B (en) Blow-molded luggage production defect detection method
CN107578035A (en) Human body contour outline extracting method based on super-pixel polychrome color space
CN107564017B (en) Method for detecting and segmenting urban high-resolution remote sensing image shadow
CN107066952A (en) A kind of method for detecting lane lines
CN110110687B (en) Method for automatically identifying fruits on tree based on color information and three-dimensional contour information
CN109409181B (en) Independent detection method for upper and lower edges of fingers for low-quality finger vein image
CN107992856B (en) High-resolution remote sensing building shadow detection method under urban scene
CN110415241A (en) A kind of surface of concrete structure quality determining method based on computer vision
CN115131387B (en) Gasoline engine spray wall collision parameter automatic extraction method and system based on image processing
CN116645373B (en) Wood surface defect identification method
CN114863493B (en) Detection method and detection device for low-quality fingerprint image and non-fingerprint image
CN105405148B (en) A kind of remote sensing image Chinese white poplar recognition methods of combination trees shadow character
CN115393657A (en) Metal pipe production abnormity identification method based on image processing
Liu et al. Development of a machine vision algorithm for recognition of peach fruit in a natural scene
CN102938074B (en) Self-adaptive extraction method of badminton field or tennis field in virtual advertising system during sports live broadcast
CN106324708B (en) Digitizing solution, the device of rainfall record drawing
CN115546615A (en) Chinese herbal medicine rhizome slice identification method, storage medium and electronic equipment
CN116823820A (en) Aluminum-carbon integral stopper rod contour bending detection method
CN114758139B (en) Method for detecting accumulated water in foundation pit
CN113066041A (en) Pavement crack detection method based on stack sparse self-coding deep learning
CN105678795B (en) A kind of field shoe watermark image method of inspection
CN115439891A (en) Fingerprint identification method based on small fingerprint head, low computational power and low memory chip
CN105894000A (en) RANSAC-based laser network mark image feature extraction
Su et al. Effective target extraction of automatic target-scoring system
Wang et al. A novel method concerning about image transition region extraction and segmentation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231025

Address after: 274000 North Road, 800 meters east of the intersection of Leize Avenue and Chengpu Street, Fenghuang Town, Juancheng County, Heze City, Shandong Province

Patentee after: Shandong Xiaochen Chemical Technology Co.,Ltd.

Address before: 274600 west of Fuma Zhangzhuang, north section of Leize Avenue, juancheng County, Heze City, Shandong Province

Patentee before: Shandong Dingtai new energy Co.,Ltd.

TR01 Transfer of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Automatic extraction method and system of gasoline engine spray impingement parameters based on image processing

Effective date of registration: 20231221

Granted publication date: 20230124

Pledgee: Shandong juancheng Rural Commercial Bank Co.,Ltd.

Pledgor: Shandong Xiaochen Chemical Technology Co.,Ltd.

Registration number: Y2023980073195

PE01 Entry into force of the registration of the contract for pledge of patent right