CN116718791A - Method, device, system and storage medium for detecting rotation speed of torque spring - Google Patents

Publication number
CN116718791A
Authority
CN
China
Prior art keywords
face
target
torque spring
gray value
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310396912.4A
Other languages
Chinese (zh)
Other versions
CN116718791B (en)
Inventor
杜智生
柯琳健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongguan Dushi Chengfa Precision Spring Co ltd
Original Assignee
Dongguan Dushi Chengfa Precision Spring Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongguan Dushi Chengfa Precision Spring Co ltd filed Critical Dongguan Dushi Chengfa Precision Spring Co ltd
Priority to CN202310396912.4A priority Critical patent/CN116718791B/en
Priority claimed from CN202310396912.4A external-priority patent/CN116718791B/en
Publication of CN116718791A publication Critical patent/CN116718791A/en
Application granted granted Critical
Publication of CN116718791B publication Critical patent/CN116718791B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G01P 3/00 — Measuring linear or angular speed; measuring differences of linear or angular speeds
    • G01P 3/36 — Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
    • G06T 7/0002 — Image analysis; inspection of images, e.g. flaw detection
    • G06T 7/66 — Analysis of geometric attributes of image moments or centre of gravity
    • G06T 2207/10016 — Image acquisition modality: video; image sequence
    • G06T 2207/10068 — Image acquisition modality: endoscopic image
    • Y02E 10/72 — Wind turbines with rotation axis in wind direction

Abstract

The invention discloses a method, a device, a system and a storage medium for detecting the rotational speed of a torque spring, relating to the technical field of computer vision. The method comprises: acquiring a video of the rotational motion of a target end face of the torque spring, the video comprising multiple frames of end face images shot at a preset frame rate; detecting the end face center position of the target end face in each frame of end face image; drawing a gray-value circumferential variation curve of each end face image; determining the translation distance between the two gray-value circumferential variation curves corresponding to two adjacent frames of end face images; and obtaining the instantaneous rotational speed of the target end face from the translation distance and the preset frame rate. The invention solves the technical problems that conventional rotational-speed detection methods can hardly detect the instantaneous rotational speed of the torque spring and therefore cannot accurately analyze the motion stability of the imaging catheter.

Description

Method, device, system and storage medium for detecting rotation speed of torque spring
Technical Field
The invention relates to the technical field of computer vision, and in particular to a method, a device, a system and a storage medium for detecting the rotational speed of a torque spring.
Background
A torsion spring can be arranged inside a protective tube to serve as the imaging catheter of a flexibly driven medical endoscopic imaging device: it performs the scanning motion and, together with the imaging tip, acquires scan images. However, friction between the torsion spring and the protective tube makes the motion of the imaging catheter non-uniform along its length, so the scanning motion of the imaging tip at the distal end of the catheter is unstable. The scan images acquired by the imaging tip therefore exhibit non-uniform rotational distortion (NURD) artifacts, which interfere with the interpretation and analysis of the imaging results.
In the related art, a blurred-image method can be used: a single frame is captured with a relatively long exposure while the imaging catheter rotates, so that the distal end face leaves a concentric-circle smear in the blurred image. From a mathematical model of motion blur, rotational-speed information is then calculated from the smear, which measures the rotational speed of the distal end face, allows the motion stability of the imaging catheter to be analyzed, and in turn allows the severity of NURD artifacts during imaging with the endoscopic imaging device to be evaluated. However, because the instantaneous rotational speed of the distal end face fluctuates strongly, and the long exposure degrades the temporal resolution, the blurred-image method can hardly reflect the instantaneous rotational speed, and the motion stability of the imaging catheter cannot be analyzed accurately.
Disclosure of Invention
The main purpose of the invention is to provide a method, a device, a system and a storage medium for detecting the rotational speed of a torque spring, aiming to solve the technical problems that conventional rotational-speed detection methods can hardly detect the instantaneous rotational speed of the torque spring and cannot accurately analyze the motion stability of the imaging catheter.
In order to achieve the above purpose, the invention adopts the following technical scheme:
in a first aspect, the present invention provides a method for detecting a rotational speed of a torsion spring, including:
acquiring a video of the rotational motion of a target end face of the torque spring, the video comprising multiple frames of end face images shot at a preset frame rate;
detecting the end face center position of the target end face in each frame of end face image;
drawing a gray-value circumferential variation curve of the end face image; the vertical axis of the curve is the total pixel value of each sub-area, the sub-areas being centered on the end face center position and evenly distributed along the circumferential direction of the target end face, and the horizontal axis is the angular offset, in the circumferential direction, of each sub-area relative to a reference sub-area;
determining the translation distance between the two gray-value circumferential variation curves corresponding to two adjacent frames of end face images; and
obtaining the instantaneous rotational speed of the target end face from the translation distance and the preset frame rate.
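The last conversion step follows directly from the definitions above: a curve shift of one full set of sub-areas corresponds to one revolution, and consecutive frames are 1/frame-rate seconds apart. The following sketch illustrates this relationship; the function and parameter names are ours, not the patent's.

```python
def instantaneous_speed_rps(shift_sectors: float, num_sectors: int,
                            frame_rate: float) -> float:
    """Instantaneous rotational speed in revolutions per second.

    A curve shift of `num_sectors` sectors corresponds to one full
    revolution, and consecutive frames are 1 / frame_rate seconds apart.
    """
    revolutions_per_frame = shift_sectors / num_sectors
    return revolutions_per_frame * frame_rate
```

For example, with 360 sub-areas and a camera running at 500 frames per second, a shift of 18 sectors between two adjacent frames means 0.05 revolutions per frame, i.e. about 25 revolutions per second.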
Optionally, before detecting the end face center position of the target end face in each frame of end face image, the method further includes:
performing inter-frame differencing on every two adjacent frames of initial end face images to obtain multiple inter-frame difference images;
superimposing the multiple inter-frame difference images to obtain an inter-frame difference result image;
identifying an initial positioning area of the target end face in the inter-frame difference result image according to the brightness distribution of its pixels;
determining the centroid position of the target end face in the initial positioning area from the pixel values and pixel coordinates of all pixels in that area; and
cropping each initial end face image with a preset cropping frame centered on the centroid position to obtain the end face images.
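This coarse-localization chain (inter-frame differencing, superposition, centroid, crop) can be sketched on a grayscale frame stack as below. This is an illustrative numpy reconstruction under simplifying assumptions, not the patent's exact procedure: the brightness-distribution analysis is reduced to an intensity-weighted centroid over the summed differences, and all names are ours.

```python
import numpy as np

def locate_and_crop(frames: np.ndarray, crop_half: int) -> np.ndarray:
    """Coarse localization of the moving end face.

    frames: (T, H, W) grayscale stack. Absolute inter-frame differences
    are summed so that only the moving region (the rotating end face)
    accumulates energy; its intensity-weighted centroid then gives the
    crop center. Crops near the border are simply clipped by slicing.
    """
    diff_sum = np.abs(np.diff(frames.astype(np.int32), axis=0)).sum(axis=0)
    total = diff_sum.sum()
    if total == 0:
        raise ValueError("no motion detected between frames")
    ys, xs = np.indices(diff_sum.shape)
    cy = int(round((ys * diff_sum).sum() / total))
    cx = int(round((xs * diff_sum).sum() / total))
    y0, x0 = max(cy - crop_half, 0), max(cx - crop_half, 0)
    return frames[:, y0:cy + crop_half, x0:cx + crop_half]
```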
Optionally, detecting the end face center position of the target end face in each frame of end face image includes:
identifying a positioning area of the target end face in the end face image according to the brightness distribution of its pixels;
performing a morphological closing operation on the positioning area according to the brightness distribution of the pixels within it, filling its unclosed parts, to obtain a target positioning area;
detecting the boundary curve of the target end face in the target positioning area according to the gradient magnitude of each pixel; and
obtaining the end face center position from the boundary center position of the boundary curve.
Optionally, obtaining the end face center position from the boundary center position of the boundary curve includes:
identifying the inner-hole circle of the target end face in the target positioning area with a Hough transform algorithm;
performing circle fitting on the boundary curve to obtain several fitted circles of the boundary curve;
selecting a target fitted circle from the fitted circles according to the inner-hole circle; and
obtaining the end face center position from the center of the target fitted circle.
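Hough-circle detection is available in common vision libraries (e.g. OpenCV's `HoughCircles`). The circle fitting and the selection against the inner-hole circle can be sketched as below with an algebraic least-squares (Kåsa) fit; this is our illustrative reconstruction under the assumption that "according to the inner-hole circle" means picking the fitted circle whose center is nearest the inner-hole center, and the function names are ours.

```python
import numpy as np

def fit_circle(xs: np.ndarray, ys: np.ndarray) -> tuple[float, float, float]:
    """Algebraic (Kasa) least-squares circle fit: solve
    x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F); the center is
    (-D/2, -E/2) and the radius is sqrt(cx^2 + cy^2 - F)."""
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = -(xs ** 2 + ys ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    return float(cx), float(cy), float(np.sqrt(cx ** 2 + cy ** 2 - F))

def pick_target_circle(circles, hole_center):
    """Among several fitted circles (cx, cy, r), keep the one whose
    center lies closest to the inner-hole circle center."""
    hx, hy = hole_center
    return min(circles, key=lambda c: (c[0] - hx) ** 2 + (c[1] - hy) ** 2)
```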
Optionally, after identifying the inner-hole circle of the target end face in the target positioning area with the Hough transform algorithm, the method further includes:
if inner-hole circle recognition fails for an end face image, taking, from the inner-hole recognition results of the other end face images, the result of the end face image closest to it as the inner-hole circle of the target positioning area.
Optionally, drawing the gray-value circumferential variation curve of an end face image includes:
dividing the end face image into a preset number of sub-areas along the circumferential direction of the target end face, with the end face center position as the circle center; and
drawing the gray-value circumferential variation curve of the end face image from the total pixel value of each sub-area, in the preset circumferential order.
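The two steps above amount to binning pixels into equal angular sectors around the end face center and summing each sector. A minimal numpy sketch, with illustrative names of our own:

```python
import numpy as np

def circumferential_profile(img: np.ndarray, center: tuple[float, float],
                            num_sectors: int) -> np.ndarray:
    """Total pixel value in each of `num_sectors` equal angular sectors
    around `center` (row, col); sector i covers angles
    [i, i + 1) * 2*pi / num_sectors, measured in image coordinates."""
    cy, cx = center
    ys, xs = np.indices(img.shape)
    angles = np.arctan2(ys - cy, xs - cx) % (2 * np.pi)
    sector = np.minimum((angles / (2 * np.pi) * num_sectors).astype(int),
                        num_sectors - 1)  # guard against float round-up
    return np.bincount(sector.ravel(), weights=img.ravel().astype(float),
                       minlength=num_sectors)
```

The resulting vector is the gray-value circumferential variation curve: its index is the angular offset relative to the reference sector, its value the total pixel value of that sector.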
Optionally, determining the translation distance between the two gray-value circumferential variation curves corresponding to two adjacent frames of end face images includes:
keeping either one of the two curves fixed, translating the other back and forth according to a preset translation rule, and calculating the resulting cross-covariance values between the two curves; and
taking as the translation distance the translation at which the cross-covariance value is largest.
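Because the curve is periodic in the circumferential direction, the "back and forth" translation can be implemented as a cyclic shift, and the best alignment found by maximizing the cross-covariance over all shifts. A brute-force numpy sketch of this idea (names and the signed-shift convention are ours):

```python
import numpy as np

def curve_shift(prev: np.ndarray, curr: np.ndarray) -> int:
    """Circular shift (in sectors) that best aligns the current frame's
    circumferential profile with the previous one, found by maximizing
    the cross-covariance over all cyclic translations."""
    a = prev - prev.mean()
    b = curr - curr.mean()
    scores = [np.dot(a, np.roll(b, -k)) for k in range(len(a))]
    best = int(np.argmax(scores))
    n = len(a)
    # report a signed shift in (-n/2, n/2], i.e. the smaller rotation
    return best if best <= n // 2 else best - n
```

For long curves an FFT-based circular correlation would give the same answer in O(n log n), but the direct loop keeps the correspondence with the patent's description obvious.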
In a second aspect, the invention further provides a rotational-speed detection device for a torque spring, the device comprising a memory, a processor, and a rotational-speed detection program of the torque spring that is stored in the memory and operable on the processor, the program being configured to implement the steps of any of the rotational-speed detection methods described above.
In a third aspect, the present invention also provides a rotation speed detection system of a torque spring, the system comprising:
an endoscopic imaging device comprising a protective tube, a torque spring and a driving member, wherein at least part of the torque spring is arranged in the protective tube, at least part of the protective tube is arranged in a pipe to be imaged, and the driving member is arranged at one end of the pipe to be imaged and connected with the torque spring to drive the torque spring to move in the pipe;
a high-speed camera arranged at the side of the pipe to be imaged away from the driving member, for shooting, at a preset frame rate, a video of the rotational motion of the target end face of the torque spring while the torque spring moves in the pipe; and
the rotational-speed detection device described above, connected to the high-speed camera.
In a fourth aspect, the present invention also provides a computer readable storage medium, on which a rotation speed detection program of a torque spring is stored, the rotation speed detection program of the torque spring, when executed by a processor, implementing the steps of the rotation speed detection method of the torque spring as described in any one of the above.
The invention provides a method for detecting the rotational speed of a torque spring: acquiring a video of the rotational motion of a target end face of the torque spring, the video comprising multiple frames of end face images shot at a preset frame rate; detecting the end face center position of the target end face in each frame; drawing a gray-value circumferential variation curve of each end face image, whose vertical axis is the total pixel value of each sub-area (the sub-areas being centered on the end face center position and distributed in sequence along the circumferential direction of the target end face) and whose horizontal axis is the angular offset of each sub-area relative to a reference sub-area; determining the translation distance between the two curves corresponding to two adjacent frames; and obtaining the instantaneous rotational speed of the target end face from the translation distance and the preset frame rate.
Thus, by analyzing end face images shot at a preset frame rate with a short exposure, the invention obtains the instantaneous rotational speed of the target end face of the torque spring, and can accurately analyze the motion stability of the torque spring even when the instantaneous rotational speed of the distal end face fluctuates strongly. This solves the technical problems that conventional rotational-speed detection methods can hardly detect the instantaneous rotational speed of the torque spring and cannot accurately analyze the motion stability of the imaging catheter. Because the motion stability is analyzed accurately from the instantaneous rotational speed, the severity of NURD artifacts during imaging with an endoscopic imaging device built around the torque spring can also be evaluated accurately.
The invention further combines the end face center position in each end face image when obtaining the instantaneous rotational speed, fully considering the motion characteristics of the torque spring: calculation deviations of the instantaneous rotational speed caused by the swinging of the end face center are avoided, and the accuracy of the rotational-speed detection is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to the structures shown in these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of a rotational speed detection apparatus of a torsion spring of the present invention;
FIG. 2 is a rotational speed detection system for a torsion spring according to an exemplary embodiment;
FIG. 3 is a flowchart of a first embodiment of a method for detecting rotational speed of a torsion spring according to the present invention;
FIG. 4 is a schematic diagram of a refinement flow chart of step S300 in FIG. 3;
FIG. 5 is a schematic diagram of a refinement flow chart of step S400 in FIG. 3;
FIG. 6 is an exemplary diagram of an end face image;
FIG. 7 is an exemplary graph of a gray value circumferential variation curve;
FIG. 8 is an exemplary graph of cross-covariance values and translation distances of the gray value circumferential variation curve of FIG. 7;
FIG. 9 is a flowchart of a second embodiment of a method for detecting rotational speed of a torsion spring according to the present invention;
FIG. 10 is a diagram of a cropping example of an end-face image;
FIG. 11 is a detailed flowchart of step S200 in FIG. 9;
FIG. 12 is an exemplary view of a preprocessed end face image;
FIG. 13 is a detailed flowchart of step S240 in FIG. 11;
FIG. 14 is an exemplary diagram of a boundary curve, an inner circle, and a target fit circle;
FIG. 15 is an exemplary diagram of a rotational speed variation curve and rotation center wobble.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In the present disclosure, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a device or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a device or system. Without further limitation, an element preceded by the phrase "comprising a" does not exclude the presence of additional identical elements in the device or system comprising that element.
If there is a description of "first", "second", etc. in an embodiment of the present invention, the description of "first", "second", etc. is for descriptive purposes only and is not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature.
A torsion spring is a spring assembly composed of multiple layers of springs. It can be arranged in a protective tube to serve as the imaging catheter of a flexibly driven medical endoscopic imaging device, performing the scanning motion and, together with the imaging tip, acquiring scan images. Taking intravascular ultrasound as an example, the imaging catheter mainly comprises a protective tube, a torque spring and an imaging tip. Typically, the proximal end of the torque spring is connected to a motor and its distal end to the imaging tip, and the spring is arranged within the protective tube. During imaging, the imaging catheter is advanced into a human blood vessel; the motor drives the torque spring to rotate, transmitting torque to the distal imaging tip so that it rotates as well and scans the vessel for lesion information.
However, friction between the torsion spring and the protective tube makes the motion of the imaging catheter non-uniform along its length, so the scanning motion of the imaging tip at the distal end is unstable and the acquired scan images exhibit non-uniform rotational distortion (NURD) artifacts, which interfere with the interpretation and analysis of the imaging results. The severity of NURD artifacts is closely related to the structure and quality of the torque spring: a high-quality torque spring effectively resists the influence of friction, its distal end rotates stably, and the NURD artifacts in the imaging results are weak. Currently, the severity of NURD artifacts can be assessed by detecting the rotational speed of the distal motion of the torsion spring during imaging and analyzing the stability of the torsion spring.
Specifically, existing rotational-speed detection methods include: (1) detection based on the photoelectric tachometry principle or on electromagnetic or capacitive sensors; however, this requires mounting a code disc on the rotating body under test along with additional transmitting and receiving devices, which easily causes eccentric vibration and affects the accuracy of the measurement; (2) adding a marker to the rotating body under test and detecting the rotational speed from a motion video; however, because the torque spring is soft, the marker easily causes eccentric motion of the spring and affects measurement accuracy; (3) the blurred-image method, which captures a single frame with a relatively long exposure while the imaging catheter rotates so that the distal end face forms a concentric-circle smear in the blurred image, then calculates rotational-speed information from the smear according to a mathematical model of motion blur, thereby measuring the rotational speed of the distal end face and evaluating the severity of NURD artifacts during imaging.
However, because the instantaneous rotational speed of the distal end face fluctuates strongly and the rotation center swings, a stable concentric-circle smear is hard to form, and the blurred-image method fails. Moreover, the long exposure degrades the temporal resolution, so the instantaneous rotational speed can hardly be reflected and the severity of NURD artifacts during imaging cannot be evaluated accurately. In addition, none of the above methods considers the swinging of the center of the torque spring during motion, so the accuracy of the stability analysis is low.
In view of the technical problems that conventional rotational-speed detection methods can hardly detect the instantaneous rotational speed of the torque spring and cannot accurately analyze the motion stability of the imaging catheter, the invention provides a rotational-speed detection method for a torque spring, with the following overall idea:
acquiring a video of the rotational motion of a target end face of the torque spring, the video comprising multiple frames of end face images shot at a preset frame rate; detecting the end face center position of the target end face in each frame; drawing a gray-value circumferential variation curve of each end face image, whose vertical axis is the total pixel value of each sub-area (the sub-areas being centered on the end face center position and distributed in sequence along the circumferential direction of the target end face) and whose horizontal axis is the angular offset of each sub-area relative to a reference sub-area; determining the translation distance between the two curves corresponding to two adjacent frames; and obtaining the instantaneous rotational speed of the target end face from the translation distance and the preset frame rate.
By analyzing end face images shot at a preset frame rate with a short exposure, the method obtains the instantaneous rotational speed of the target end face and can accurately analyze the motion stability of the torque spring even when the instantaneous rotational speed of the distal end face fluctuates strongly, thereby solving the technical problems above; based on the accurate stability analysis, the severity of NURD artifacts during imaging with an endoscopic imaging device built around the torque spring can also be evaluated accurately.
The method further combines the end face center position in each end face image when obtaining the instantaneous rotational speed, fully considering the motion characteristics of the torque spring: calculation deviations of the instantaneous rotational speed caused by the swinging of the end face center are avoided, and the accuracy of the rotational-speed detection is improved.
The following describes in detail a method, a device, a system and a storage medium for detecting the rotational speed of a torque spring applied in the implementation of the technology of the present invention:
referring to fig. 1, fig. 1 is a schematic structural view of a rotation speed detecting apparatus of a torsion spring according to the present invention;
as shown in fig. 1, the apparatus may include: a processor 1001, such as a central processing unit (Central Processing Unit, CPU), a communication bus 1002, a user interface 1003, a network interface 1004, a memory 1005. Wherein the communication bus 1002 is used to enable connected communication between these components. The user interface 1003 may include user equipment such as an endoscopic imaging device, a high speed camera, etc., and the optional user interface 1003 may also include standard wired, wireless interfaces. The network interface 1004 may optionally include a standard wired interface, a Wireless interface (e.g., a Wireless-Fidelity (Wi-Fi) interface). The Memory 1005 may be a high-speed random access Memory (Random Access Memory, RAM) Memory or a stable nonvolatile Memory (NVM), such as a disk Memory. The memory 1005 may also optionally be a storage device separate from the processor 1001 described above.
It will be appreciated by those skilled in the art that the structure shown in fig. 1 is not limiting of the apparatus and may include more or fewer components than shown, or certain components may be combined, or a different arrangement of components.
As shown in fig. 1, an operating system, a data storage module, a network communication module, a user interface module, and a rotation speed detection program of the torque spring may be included in the memory 1005 as one type of storage medium.
In the device shown in fig. 1, the network interface 1004 is mainly used for data communication with other network devices, and the user interface 1003 is mainly used for data interaction with user equipment. The device calls, through the processor 1001, the rotational-speed detection program of the torque spring stored in the memory 1005, and executes the rotational-speed detection method of the torque spring.
Referring to fig. 2, fig. 2 is a rotational speed detection system for a torque spring according to an exemplary embodiment; the embodiment provides a rotational speed detecting system of torque spring, the system includes:
The endoscopic imaging device 10 comprises a protection tube 13, a torque spring 14 and a driving piece 11, wherein at least part of the torque spring 14 is arranged in the protection tube 13, at least part of the protection tube 13 is arranged in a pipeline 20 to be imaged, and the driving piece 11 is arranged at one end of the pipeline 20 to be imaged and connected with the torque spring 14 for driving the torque spring 14 to move in the pipeline 20 to be imaged;
a high-speed camera 30 arranged at the side of the pipe 20 to be imaged away from the driving member 11, for shooting, at a preset frame rate, a video of the rotational motion of the target end face of the torsion spring 14 while the torsion spring 14 moves in the pipe 20; and
the rotational-speed detection device 40 described above, connected to the high-speed camera 30.
In this embodiment, the endoscopic imaging device 10 further comprises a clamping device 12 for fixing the torque spring 14 to the rotation shaft of the driving member 11. The driving member 11 includes a motor connected to the rotation speed detection device 40, which operates according to a control signal output by the rotation speed detection device 40 to drive the torque spring 14 to move in the pipeline 20 to be imaged. The pipeline 20 to be imaged is designed according to a real organ or tissue of the human body, so that when the torque spring 14 moves in it, the real bending state of the torque spring 14 in the human body can be simulated. The rotation speed detection device 40 includes the detection device shown in fig. 1, and may be a physical server comprising a separate host or a virtual server carried by a host cluster. The rotation speed detection device 40 is connected to the high-speed camera 30, obtains the rotary motion video of the target end face from the high-speed camera 30, and performs rotation speed detection of the torque spring according to this video, so as to obtain the instantaneous rotation speed of the target end face of the torque spring. The system further comprises a display device 50 connected with the rotation speed detection device 40 for realizing human-machine interaction functions, including displaying detection results, configuring the endoscopic imaging simulation, and the like. In addition, the system further includes illumination devices 60; their number is set according to actual use requirements, and they are disposed around the side of the high-speed camera 30 near the pipeline 20 to be imaged, so as to provide illumination.
For further implementation details of how the rotation speed detection device 40 performs rotation speed detection of the torque spring in this embodiment, refer to the specific implementation of the rotation speed detection method of the torque spring in the first or second embodiment below.
The method for detecting the rotation speed of the torque spring according to the present invention will be described in detail below with reference to the accompanying drawings and the detailed description.
Based on the above hardware structure, but not limited to it, referring to fig. 3 to 7: fig. 3 is a flowchart of a first embodiment of a method for detecting the rotation speed of a torque spring according to the present invention; FIG. 4 is a refinement flowchart of step S300 in FIG. 3; FIG. 5 is a refinement flowchart of step S400 in FIG. 3; FIG. 6 is an exemplary end face image, and FIG. 7 is an exemplary gray value circumferential change curve; FIG. 8 is an exemplary graph of the cross-covariance values and translation distances of the gray value circumferential change curves of FIG. 7. This embodiment provides a method for detecting the rotation speed of a torque spring, comprising the following steps:
step S100: and acquiring a rotation motion video of the target end face of the torque spring.
The rotary motion video comprises a plurality of frames of end face images shot at a preset frame rate.
In this embodiment, the execution body is the above rotation speed detection device, and the device may be a physical server including an independent host, or may be a virtual server carried by a host cluster.
In this embodiment, the torque spring is disposed in the protective tube, and the torque spring is fixed on the rotating shaft of the motor by the clamping device, thereby forming the endoscopic imaging device. When the rotating speed of the torque spring is detected, the protection tube and the torque spring are arranged in the pipeline to be imaged, and the torque spring is driven by the motor to move in the pipeline to be imaged, so that the real bending state of the torque spring in a human body can be simulated. At this time, a rotational motion video of the end face of the target can be captured by the high-speed camera.
The target end surface is the end surface of the torque spring far away from the motor, and is usually an annular torque spring section; the preset frame rate is a higher frame rate and may be set according to the performance of the high-speed camera. The multi-frame end face image is a rotary motion image of all frames in the rotary motion video. Wherein the end face image may be a gray scale image. It will be appreciated that if the end face image obtained from the rotary motion video is a color image, the color image may be converted into a grayscale image to obtain an end face image.
In the specific implementation, when the torque spring moves in a pipeline to be imaged, the rotating speed detection equipment obtains a rotating motion video of a target end face shot by a high-speed camera according to a preset frame rate, and the rotating motion video comprises multi-frame end face images arranged according to time.
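As an illustration of the color-to-grayscale conversion mentioned above, a minimal sketch (the function name and the ITU-R BT.601 luma weights are assumptions for illustration, not part of the disclosure):

```python
import numpy as np

def to_gray(frame_rgb):
    """Convert an RGB end face frame to a grayscale end face image.

    Hypothetical helper; the ITU-R BT.601 luma weights are an assumption.
    """
    w = np.array([0.299, 0.587, 0.114])
    # weighted sum over the color channels, rounded back to 8-bit
    return np.round(frame_rgb.astype(np.float64) @ w).astype(np.uint8)
```

A frame already stored as grayscale can be used directly without this step.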
Step S200: and detecting the end face center position of the target end face in the end face image of each frame.
In this embodiment, the end face center position is the position of the center of the target end face of the torque spring in the end face image. Because the target end face of the torque spring is not a mirror surface, scattered points with different brightness appear in the end face image shot by the high-speed camera; thus, a relatively bright annular region exists in the end face image, which is the target end face of the torque spring. In contrast, the region corresponding to the cavity in the torque spring is a darker inner circle region in the end face image. Therefore, the position of the inner circle region can be located according to the brightness distribution of the annular region and the inner circle region in the end face image, and the circle center position of the inner circle region can be obtained and used as the end face center position.
In specific implementation, the target end face of the torque spring can be detected in each frame of end face image, the inner circle region corresponding to the target end face is detected according to the brightness distribution of the pixel points at the target end face, and the circle center positions of these inner circle regions are detected to obtain a plurality of end face center positions.
Step S300: and drawing a gray value circumferential change curve of the end face image.
The vertical axis of the gray value circumferential change curve is the total pixel value of the subareas, the subareas are evenly distributed along the circumferential direction of the target end surface by taking the center position of the end surface as the center, and the horizontal axis of the gray value circumferential change curve is the angle difference of each subarea relative to the reference subarea in the subareas in the circumferential direction.
In this embodiment, provided the rotation angle between frames is small, the scattered points of different brightness in the end face image are little affected by illumination and rotate together with the target end face of the torque spring. Thus, these naturally occurring scattered points can be used as features to identify how far the torque spring has rotated.
The specific practice can be as follows: and drawing a gray value circumferential change curve of the gray value of the target end face in the end face image along with the circumferential change, and identifying how much the torque spring end face rotates according to two gray value circumferential change curves corresponding to two adjacent end face images.
Specifically, as an option of the present embodiment, as shown in fig. 4, step S300 includes:
step S310: and dividing the end face image into a preset number of subareas along the circumferential direction of the target end face by taking the center position of the end face as the circle center.
In this embodiment, the preset number may be set according to the distribution of the scattered points in the end face image, and the sub-areas all have the same area. In one example, a circular region of the end face image whose radius is the outer radius of the target end face is divided, with the end face center position as the circle center, into a preset number of equal sub-areas along the circumferential direction, e.g., clockwise, of the target end face. Each sub-area is a sector of the same area.
Step S320: and drawing a gray value circumferential change curve of the end face image according to the total pixel value of each subarea in the preset circumferential direction.
In this embodiment, the preset circumferential direction includes a counterclockwise circumferential direction and a clockwise circumferential direction.
In the specific implementation, when a gray value circumferential change curve is drawn, the sum of the pixel values of all pixels in each sub-area is calculated to obtain its total pixel value; one sub-area among the preset number of sub-areas is taken as a reference sub-area, and the angle difference between the position of each sub-area in the circumferential direction and the position of the reference sub-area in the circumferential direction is obtained according to the preset circumferential direction; the gray value circumferential change curve is then drawn from the total pixel values and the corresponding angle differences.
As shown in fig. 6 and 7, fig. 6 is an exemplary end face image, and fig. 7 is the gray value circumferential change curve corresponding to the exemplary end face image; the vertical axis of the gray value circumferential change curve is the total pixel value of each sub-area, and the horizontal axis is the angle difference between the position of the sub-area in the circumferential direction and the position of the reference sub-area in the circumferential direction.
In the example shown in fig. 6, the circular region may be divided into 360 sector-shaped sub-regions with a radius in either radial direction of the circular region as one radius of the reference sub-region and 1 ° as a division scale. The other radius of the reference subarea is one radius of the next subarea, and the other radius of the next subarea is one radius of the next subarea in turn. The angular difference between the sub-regions in the circumferential direction and the reference sub-region in the circumferential direction may be the angle of one radius of each sub-region with one radius of the reference sub-region.
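The sub-area division and curve drawing of steps S310 and S320 can be sketched as follows; this is an illustrative implementation only, with hypothetical names, assuming a grayscale image, a known end face center position given as (column, row) pixel coordinates, and 360 sectors of 1° by default:

```python
import numpy as np

def gray_value_circumferential_curve(img, center, outer_radius, n_sectors=360):
    """Divide the circular region around `center` into `n_sectors` equal
    angular sectors and sum the pixel values in each (steps S310/S320).

    Returns the angle difference of each sector relative to the reference
    sector (sector 0) and the total pixel value of each sector.
    """
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    dx, dy = xx - center[0], yy - center[1]
    # angle of each pixel in [0, 360); 0 deg lies on the reference sector's radius
    theta = np.degrees(np.arctan2(dy, dx)) % 360.0
    inside = np.hypot(dx, dy) <= outer_radius
    step = 360.0 / n_sectors
    sector = np.clip((theta[inside] / step).astype(int), 0, n_sectors - 1)
    # total pixel value per sector
    totals = np.bincount(sector, weights=img[inside].astype(np.float64),
                         minlength=n_sectors)
    angles = np.arange(n_sectors) * step
    return angles, totals
```

The returned `totals` array plays the role of the vertical axis of the gray value circumferential change curve and `angles` the horizontal axis.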
Step S400: and determining the translation distance between two gray value circumferential change curves corresponding to the end face images of two adjacent frames.
In this embodiment, when the rotation angle between two adjacent frames of end face images is small, the two gray value circumferential change curves corresponding to them are similar in shape and differ only by a shift in position; the translation distance between the two curves is the rotation angle of the target end face during the time interval between the two frames. Therefore, the instantaneous rotation speed of the target end face can be calculated from the translation distance and the image time interval. As shown in fig. 7, curves I and II are the two gray value circumferential change curves corresponding to two adjacent frames of end face images.
In the specific implementation, one gray value circumferential change curve can be kept motionless, the other gray value circumferential change curve is translated back and forth, and when the two gray value circumferential change curves are completely overlapped, the translation distance between the two gray value circumferential change curves is obtained.
Specifically, as an option of the present embodiment, as shown in fig. 5, step S400 includes:
step S410: And keeping either of the two gray value circumferential change curves motionless, translating the other back and forth according to a preset translation rule, and calculating a plurality of cross-covariance values between the two gray value circumferential change curves.
In this embodiment, when calculating the target translation distance, either of the two gray value circumferential change curves is kept motionless, and the other is translated back and forth along the horizontal axis according to a preset translation rule; after each translation, a cross-covariance value between the two curves is calculated. The preset translation rule comprises a preset translation distance and a preset translation range for the other curve; the preset translation range comprises a preset left translation range and a preset right translation range of the same size. In this way, a plurality of cross-covariance values and the distance by which the other curve was translated for each of them can be obtained. The preset translation range can be determined according to the maximum expected distance between the two curves, and the preset translation distance is set according to actual requirements.
Step S420: And determining the translation distance according to the maximum cross-covariance value among the plurality of cross-covariance values.
In this embodiment, the cross-covariance value takes its maximum only when the two gray value circumferential change curves completely coincide. Therefore, the translation corresponding to the maximum cross-covariance value is the translation distance between the two gray value circumferential change curves corresponding to the two adjacent frames of end face images. Fig. 8 shows the relationship between the cross-covariance value of two gray value circumferential change curves and the corresponding translation distance; the horizontal axis is the translation distance, and the vertical axis is the cross-covariance value. Here the preset translation range is 10 a.u. and the preset translation distance is 0.5 a.u.
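The back-and-forth translation and cross-covariance maximization of steps S410 and S420 can be sketched as follows (hypothetical names; circular shifting of the curve and a translation step of one sub-area are simplifying assumptions):

```python
import numpy as np

def translation_distance(curve_a, curve_b, max_shift=20):
    """Keep curve_a fixed, circularly shift curve_b within +/- max_shift
    sectors, and return the shift whose cross-covariance with curve_a is
    largest (steps S410/S420). The returned value is the number of sectors
    by which curve_b is rotated relative to curve_a."""
    a = curve_a - curve_a.mean()
    best_shift, best_cov = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        b = np.roll(curve_b, -s) - curve_b.mean()
        cov = float(np.dot(a, b)) / len(a)   # cross-covariance at shift s
        if cov > best_cov:
            best_cov, best_shift = cov, s
    return best_shift
```

Multiplying the returned shift by the angular width of one sub-area gives the rotation angle between the two frames.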
Step S500: and obtaining the instantaneous rotating speed of the target end face according to the translation distance and the preset frame rate.
In this embodiment, according to the rotation angle of the target end face between two adjacent frame image time intervals and the image time intervals, the instantaneous rotation speed of the target end face can be obtained; the translation distance between two gray value circumferential change curves is the rotation angle of the target end face between the time intervals of two adjacent frames of images.
In specific implementation, the translation distance obtained through the determination is taken as the rotation angle of the target end face, the image time interval is obtained according to the preset frame rate, and the instantaneous rotation speed of the target end face is obtained according to the translation distance and the image time interval.
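Step S500 then reduces to a small conversion; the sketch below (hypothetical name) assumes the translation distance is expressed in sub-areas and the preset frame rate in frames per second:

```python
def instantaneous_speed_rpm(shift_sectors, n_sectors, frame_rate):
    """Convert a translation distance between adjacent frames into an
    instantaneous rotation speed (step S500 sketch)."""
    angle_deg = shift_sectors * (360.0 / n_sectors)   # rotation per frame interval
    deg_per_s = angle_deg * frame_rate                # frame interval = 1 / frame_rate
    return deg_per_s / 6.0                            # 1 rpm = 6 deg/s
```

For example, with 360 sub-areas, a shift of 6 sub-areas at a 1000 fps preset frame rate corresponds to 1000 rpm.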
This embodiment provides a method for detecting the rotation speed of a torque spring, which obtains the instantaneous rotation speed of the target end face of the torque spring by analyzing end face images shot at a preset frame rate with a short exposure time. Even when the instantaneous rotation speed of the distal end face fluctuates greatly, the motion stability of the torque spring can be accurately analyzed based on the instantaneous rotation speed, thereby solving the technical problem that conventional rotation speed detection methods can hardly detect the instantaneous rotation speed of the torque spring and therefore cannot accurately analyze the motion stability of an imaging catheter. Because the motion stability of the torque spring is accurately analyzed based on the instantaneous rotation speed, the severity of NURD artifacts when imaging with an endoscopic imaging device formed with the torque spring can be accurately estimated, and a torque spring that effectively resists the influence of friction can be selected according to the NURD artifact severity.
This embodiment also uses the end face center position in each end face image when obtaining the instantaneous rotation speed of the target end face, fully considering the motion characteristics of the torque spring; this avoids the calculation deviation of the instantaneous rotation speed caused by swing of the end face center and improves the accuracy of rotation speed detection. It also avoids the failure of the blurred-image method, which, when the instantaneous rotation speed of the distal end face fluctuates greatly and the rotation center swings, can hardly form a stable concentric-circle smear.
In addition, the embodiment also utilizes the cross covariance to analyze the translation distance between two gray value circumferential change curves corresponding to two adjacent frames of end face images, does not need complex mathematical analysis operation, and is suitable for real-time analysis of high rotation speed; in addition, the gray value circumferential change curve reflects the displacement change of the bright and dark areas of the image, so that the characteristics in the end face image are not required to be clear, and the instantaneous rotating speed of the target end face can be well obtained by the rotating speed detection method of the torque spring under the condition that the shot end face image is out of focus and blurred, and the performance requirement and the installation precision requirement of a camera are reduced.
In this embodiment, the naturally occurring scattered points in the end face image are used as features to identify the rotation angle of the torque spring, making full use of the characteristics of the torque spring itself: no marking or modification of the torque spring is needed, and the instantaneous rotation speed of the target end face and the swing of the end face center during high-speed movement of the torque spring can be obtained without any contact with the torque spring.
Further, referring to fig. 9 to 15: fig. 9 is a flowchart of a second embodiment of a method for detecting the rotation speed of a torque spring according to the present invention; FIG. 10 is a cropping example of an end face image; fig. 11 is a refinement flowchart of step S200 in fig. 9; FIG. 12 is an example of a preprocessed end face image; FIG. 13 is a refinement flowchart of step S240 in FIG. 11; FIG. 14 is an exemplary diagram of a boundary curve, an inner circle, and a target fitting circle; fig. 15 is an exemplary diagram of a rotation speed change curve and rotation center swing. This embodiment provides a method for detecting the rotation speed of a torque spring; before step S200, the method further includes:
Step S600: carrying out inter-frame differential processing on each two adjacent frames of initial end face images to obtain multi-frame inter-frame differential images;
in this embodiment, when the rotation speed detection device reads the rotary motion video, the target end face occupies only a small part of each initial end face image, while the background region occupies a large part; if rotation speed detection is performed directly on the initial end face images, the amount of data to be analyzed is relatively large, and the background region may interfere with the analysis result. Therefore, the initial end face images can first be cropped to obtain end face images with a smaller background region, reducing the amount of data for subsequent analysis and avoiding interference from the background.
In specific implementation, an adjacent inter-frame difference method can be utilized to perform inter-frame difference processing on initial end face images of every two adjacent frames, so as to obtain multi-frame inter-frame difference images.
Step S700: and superposing the multi-frame inter-frame difference images to obtain an inter-frame difference result image.
In this embodiment, since only the target end face of the torque spring moves in the initial end face images, after inter-frame difference processing of two adjacent frames only relatively bright pixel points at the target end face of the torque spring remain. Therefore, after all the inter-frame difference images are superposed, the target end face of the torque spring in the end face image can be roughly located from the positions of the brighter pixel points in the inter-frame difference result image.
Step S800: and identifying an initial positioning area of the target end face in the inter-frame difference result image according to the brightness distribution condition of each pixel point in the inter-frame difference result image.
In this embodiment, a relatively bright annular region exists in the inter-frame difference result image, which is the target end face of the torque spring; in contrast, a darker inner circle region exists, which corresponds to the cavity in the torque spring. Therefore, the inter-frame difference result image can be binarized according to the brightness of the pixel points in the inner circle region and in the annular region to obtain the initial positioning area of the target end face.
Step S900: and determining the centroid position of the target end face in the initial positioning area according to the pixel values and the pixel coordinates of all the pixel points in the initial positioning area.
In this embodiment, after the binarization processing, the pixel values of all the pixel points in the inner circle region in the initial positioning region are 1, and the centroid position of the target end face in the initial positioning region is obtained according to the average coordinates of the pixel coordinates of all the pixel points whose pixel values are 1.
Step S1000: And cropping the initial end face images according to a preset cropping frame and the centroid position to obtain the end face images.
In this embodiment, the preset cropping frame may be determined according to the outer diameter of the torque spring.
In specific implementation, with the centroid position as the center point, all initial end face images are cropped according to the preset length and width of the preset cropping frame, so as to obtain multiple frames of end face images.
As shown in fig. 10, the left image is an exemplary initial end face image before cropping, and the right image is the corresponding end face image after cropping. The background region in the initial end face image is large, while the background region in the cropped end face image is small. In the subsequent rotation speed detection, working on the cropped end face images effectively reduces the amount of analysis data and increases the analysis speed; it also reduces the interference of the background on subsequent analysis and improves analysis accuracy.
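Steps S600 to S900 and the subsequent cropping step can be sketched as follows (hypothetical names; the binarization threshold and the border handling are simplifying assumptions, and the crop is assumed to lie inside the frame):

```python
import numpy as np

def centroid_crop(frames, crop_h, crop_w, thresh=10):
    """Locate the moving target end face by adjacent inter-frame
    differencing, superpose the difference images, take the centroid of
    the bright pixels, and crop every frame around it."""
    stack = np.stack([f.astype(np.int32) for f in frames])
    diff_sum = np.abs(np.diff(stack, axis=0)).sum(axis=0)  # superposed differences
    mask = diff_sum > thresh                               # initial positioning area
    ys, xs = np.nonzero(mask)
    cy, cx = int(ys.mean()), int(xs.mean())                # centroid of bright pixels
    y0 = max(cy - crop_h // 2, 0)
    x0 = max(cx - crop_w // 2, 0)
    return [f[y0:y0 + crop_h, x0:x0 + crop_w] for f in frames]
```

The cropped frames then serve as the end face images for the remaining steps.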
Specifically, as shown in fig. 11, as an option of the present embodiment, step S200 includes:
step S210: And identifying the positioning area of the target end face in the end face image according to the brightness distribution of each pixel point in the end face image.
In this embodiment, after obtaining the end face image, the end face image may be preprocessed, and edge detection is performed on the preprocessed end face image, so as to obtain a boundary circle of the inner circle region of the target end face; and obtaining the center position of the end face according to the center position of the inner circle region. Specifically, a relatively bright ring area exists in the end face image, and the ring area is a target end face of the torque spring; in contrast, a darker inner circle region exists in the end face image, and the inner circle region is a region corresponding to the cavity in the torque spring. Therefore, the end face image can be subjected to binarization processing according to the brightness of the pixel points of the inner circle region and the brightness of the pixel points of the annular region, and a positioning region of the target end face can be obtained.
Step S220: and carrying out morphological closing operation processing on the positioning area according to the brightness distribution condition of all the pixel points in the positioning area, and filling the non-closed part of the positioning area to obtain the target positioning area.
In this embodiment, some non-closed portions may exist at the edge of the inner circle region whose pixel values are 1 in the positioning area obtained after binarization. In this case, morphological closing can be applied to the positioning area: the eight-neighborhood pixel points of each pixel with value 1 in the positioning area are filled, and the filled positioning area is then eroded from its edge pixels, so that the non-closed portions are filled and the positioning area is optimized, yielding the target positioning area.
As shown in fig. 12, the left image is the positioning area obtained by binarizing an end face image, and the right image is the target positioning area obtained by optimizing it. Many non-closed portions exist at the edges of the positioning area in the binarized end face image; after the morphological closing operation these portions are filled, giving a more complete positioning area.
Step S230: and detecting a boundary curve of the end face of the target in the target positioning area according to the gradient intensity of each pixel in the target positioning area.
In this embodiment, after the target positioning area is obtained, a Canny edge detection algorithm may be used to detect a boundary curve of an inner circle area of the target end surface in the target positioning area according to the gradient intensity of each pixel point in the target positioning area.
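The binarization, morphological closing, and boundary extraction of steps S210 to S230 can be sketched as follows; for self-containedness this sketch uses a fixed threshold, a single 3x3 closing, and a simple neighbour test in place of the Canny detector, all of which are simplifications of the method described above:

```python
import numpy as np

def shift(mask, dy, dx):
    """Shift a boolean mask by (dy, dx), padding with zeros."""
    out = np.zeros_like(mask)
    h, w = mask.shape
    out[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
        mask[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
    return out

def close_and_trace_boundary(gray, thresh):
    """Binarize the dark inner circle region, apply one 3x3 morphological
    closing (dilation then erosion), and extract its boundary pixels."""
    mask = gray < thresh                       # dark cavity region -> 1
    offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    dil = np.zeros_like(mask)
    for dy, dx in offsets:                     # dilation
        dil |= shift(mask, dy, dx)
    closed = np.ones_like(mask)
    for dy, dx in offsets:                     # erosion of the dilation
        closed &= shift(dil, dy, dx)
    ero = np.ones_like(closed)
    for dy, dx in offsets:                     # one more erosion
        ero &= shift(closed, dy, dx)
    boundary = closed & ~ero                   # pixels on the region edge
    return closed, boundary
```

The `boundary` mask stands in for the boundary curve that the described method obtains by Canny edge detection.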
Step S240: and obtaining the center position of the end face according to the boundary center position of the boundary curve.
In this embodiment, the boundary center position of the boundary curve is the end face center position of the center of the target end face in the end face image.
In the specific implementation, after the boundary curve of the inner circle is obtained, detecting the boundary center position of the boundary curve, and obtaining the end face center position.
Specifically, as shown in fig. 13, as an option of the present embodiment, step S240 includes:
step S241: identifying the inner circle of the end face of the target in the target positioning area by using a Hough transformation algorithm;
in this embodiment, the Hough transform algorithm is used to detect the boundary curve and identify the inner circle of the target end face corresponding to the boundary curve, so as to obtain the center coordinates (C_hx, C_hy) and the radius R_h of the inner circle.
Step S242: performing circular fitting on the boundary curve to obtain a plurality of fitting circles of the boundary curve;
in this embodiment, circle fitting can be performed on the boundary curve using the least squares algorithm. However, the least squares algorithm fits a circle to any closed curve segment, so a plurality of fitting circles are obtained; the center coordinates of the fitting circle of the i-th boundary curve segment are recorded as (C_ix, C_iy) and its radius as R_i.
Step S243: according to the inner circle, determining a target fitting circle from a plurality of fitting circles;
in this embodiment, the Hough transform detection result easily produces eccentric inscribed or circumscribed circles, and the plurality of fitting circles obtained by least squares fitting include many unnecessary ones; therefore, using the inner circle obtained by the Hough transform algorithm as a reference, the fitting circles obtained by least squares fitting are screened, the unnecessary fitting circles are removed, and the target fitting circle is obtained.
Specifically, a screening parameter D_i is obtained from the center coordinates and radius of the inner circle and of each fitting circle using formula one; according to the screening parameter D_i, the plurality of fitting circles obtained by least squares fitting are screened to obtain the target fitting circle; wherein formula one is: D_i = sqrt((C_hx - C_ix)^2 + (C_hy - C_iy)^2) + |R_h - R_i|.
it will be appreciated that the larger the value of D_i, the larger the deviation between the fitting circle obtained by least squares fitting and the inner circle identified by the Hough transform algorithm; conversely, the closer D_i is to 0, the closer the circles identified by the two algorithms. Thus, the fitting circle corresponding to the minimum value of D_i can be taken as the target fitting circle. As shown in fig. 14, the left side is the boundary curve obtained by edge detection on the target positioning area, the middle is the inner circle identified by the Hough transform algorithm, and the right side is the final target fitting circle; it can be seen that the inner circle identified by the Hough transform algorithm is eccentric, while the target fitting circle obtained by combining it with the least squares algorithm overlaps the boundary curve much better.
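The least squares circle fitting and D_i-based screening can be sketched as follows (hypothetical names; the Kasa least squares formulation and the exact form of the screening parameter, which is elided in the text, are assumptions consistent with the stated behavior that D_i approaches 0 as the two circles coincide):

```python
import numpy as np

def fit_circle_lsq(xs, ys):
    """Least squares circle fit (Kasa method): solves
    x^2 + y^2 + a*x + b*y + c = 0 for a, b, c."""
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    rhs = -(xs**2 + ys**2)
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx**2 + cy**2 - c)
    return cx, cy, r

def select_target_fit(hough_circle, fitted_circles):
    """Pick the fitted circle closest to the Hough reference; the score
    used here (center offset plus radius difference) is one plausible
    reading of the screening parameter D_i."""
    chx, chy, rh = hough_circle
    scores = [np.hypot(cx - chx, cy - chy) + abs(r - rh)
              for cx, cy, r in fitted_circles]
    return fitted_circles[int(np.argmin(scores))]
```

Each boundary curve segment is fitted with `fit_circle_lsq`, and `select_target_fit` keeps the candidate with the smallest score.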
Step S244: and obtaining the center position of the end face according to the fitting center position of the target fitting circle.
In this embodiment, the fitting circle center position is the circle center coordinate of the target fitting circle, and the circle center coordinate corresponding to the fitting circle center position is the end face center position.
After step S241, the method further includes: if inner circle identification fails for an end face image, the inner circle identification result of the other end face image closest in time to it among the successfully identified end face images is taken as the inner circle of its target positioning area.
In this embodiment, since the Hough transform algorithm has a high requirement on the circularity of the boundary curve, detection easily fails when the boundary curve is elliptical or only quasi-circular, so it is difficult to guarantee that the boundary curve corresponding to every frame of end face image can be successfully identified. If identification fails for a certain frame or several consecutive frames, the inner circle corresponding to the successfully identified end face image closest in time to those frames is used as their inner circle.
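The nearest-frame fallback described here can be sketched as follows (hypothetical name; failed detections are represented by None):

```python
def fill_failed_detections(circles):
    """Replace failed inner circle detections (None) with the result of
    the nearest frame whose detection succeeded."""
    ok = [i for i, c in enumerate(circles) if c is not None]
    return [circles[min(ok, key=lambda j: abs(j - i))]
            for i in range(len(circles))]
```

Ties between equally distant frames fall back on the earlier frame, which is a design choice of this sketch.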
It can be understood that, after the end face center positions are obtained in this embodiment, the multiple end face center positions may be plotted in the same graph to obtain a center position detection result, from which the swing of the rotation center of the target end face of the torque spring can be observed. Combined with the instantaneous rotation speed of the torque spring during high-speed motion, the motion stability of the torque spring can then be analyzed, enabling evaluation of the severity of NURD artifacts when an endoscopic imaging device built with the torque spring performs actual imaging.
In addition, a rotation speed variation curve of instantaneous rotation speed over time can be drawn from the multiple instantaneous rotation speeds and their corresponding times. The motion stability of the torque spring over a given time period can then be analyzed from the rotation speed variation curve together with the rotation center swing, allowing accurate estimation of the NURD artifact severity during imaging with an endoscopic imaging device built with the torque spring. As shown in fig. 15, the left graph shows the rotation speed variation curve, with time t on the horizontal axis and instantaneous rotation speed v on the vertical axis; the right graph shows the swing of the rotation center.
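Once the instantaneous speeds are available, one simple way to quantify the stability analysis described above (a sketch under assumptions, not the patent's own metric) is the coefficient of variation of the speed series: the closer it is to 0, the steadier the rotation and, by the reasoning above, the milder the expected NURD artifacts:

```python
import statistics

def speed_stability(speeds):
    """Summarize a series of instantaneous rotation speeds.

    Returns (mean, std, cv), where cv = std / mean is a dimensionless
    indicator of rotational non-uniformity (an illustrative choice).
    """
    mean = statistics.fmean(speeds)
    std = statistics.pstdev(speeds)
    return mean, std, std / mean
```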
This embodiment provides a rotation speed detection method for a torque spring that combines two algorithms, the Hough transform algorithm and the least squares algorithm, to detect the end face center position, improving both the detection success rate and accuracy. Moreover, when the Hough transform algorithm fails to identify a frame, the identification result of the nearest end face image replaces that of the failed frame, ensuring that all fitting results of the least squares algorithm can be screened, avoiding interference from spurious fitting results and further improving the accuracy of end face center position detection.
In addition, an embodiment of the present application further provides a computer storage medium, on which a rotation speed detection program of a torque spring is stored; when executed by a processor, the program implements the steps of the rotation speed detection method of a torque spring described above. A detailed description is therefore omitted here, as is the description of the corresponding beneficial effects. For technical details not disclosed in the computer-readable storage medium embodiments of the present application, refer to the description of the method embodiments. As an example, the program instructions may be deployed to be executed on one computing device, or on multiple computing devices at one site, or distributed across multiple sites and interconnected by a communication network.
Those skilled in the art will appreciate that all or part of the above-described methods may be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the steps of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The foregoing description covers only preferred embodiments of the present invention and is not intended to limit its scope; any equivalent structural or process transformation made using the contents of this description and the drawings, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

1. A method for detecting a rotational speed of a torsion spring, the method comprising:
acquiring a rotation motion video of a target end face of the torque spring; the rotary motion video comprises a plurality of frames of end face images shot at a preset frame rate;
detecting the center position of the end face of the target end face in the end face image of each frame;
drawing a gray value circumferential change curve of the end face image; the vertical axis of the gray value circumferential change curve is the total pixel value of the subareas, the plurality of subareas take the center position of the end face as the center of a circle and are evenly distributed along the circumferential direction of the target end face, and the horizontal axis of the gray value circumferential change curve is the angle difference of each subarea relative to the reference subarea in the plurality of subareas in the circumferential direction;
determining translation distances between two gray value circumferential change curves corresponding to two adjacent frames of end face images;
And obtaining the instantaneous rotating speed of the target end face according to the translation distance and the preset frame rate.
2. The method of claim 1, wherein the detecting the end face center position of the target end face in each frame of the end face image is preceded by:
carrying out inter-frame differential processing on each two adjacent frames of initial end face images to obtain multi-frame inter-frame differential images;
overlapping the multi-frame inter-frame difference images to obtain an inter-frame difference result image;
according to the brightness distribution condition of each pixel point in the inter-frame difference result image, identifying an initial positioning area of the target end face in the inter-frame difference result image;
determining the centroid position of the target end face in the initial positioning area according to the pixel values and the pixel coordinates of all the pixel points in the initial positioning area;
and cutting the initial end face image according to a preset cutting frame and the centroid position to obtain the end face image.
3. The method of claim 2, wherein detecting the end face center position of the target end face in the end face image of each frame comprises:
identifying a positioning area of the target end face in the end face image according to the brightness distribution condition of each pixel point in the end face image;
Performing morphological closing operation on the positioning area according to brightness distribution conditions of all pixel points in the positioning area, and filling an unoccluded part of the positioning area to obtain a target positioning area;
detecting a boundary curve of the end face of the target in the target positioning area according to the gradient intensity of each pixel point in the target positioning area;
and obtaining the center position of the end face according to the boundary center position of the boundary curve.
4. A method according to claim 3, wherein said deriving said end face center position from a boundary center position of said boundary curve comprises:
identifying the inner hole circle of the target end face in the target positioning area by using a Hough transformation algorithm;
performing circular fitting on the boundary curve to obtain a plurality of fitting circles of the boundary curve;
determining a target fitting circle from a plurality of fitting circles according to the inner hole circle;
and obtaining the center position of the end face according to the fitting center position of the target fitting circle.
5. The method of claim 4, wherein after identifying the hole circle of the target end surface in the target positioning area using a hough transform algorithm, the method further comprises:
If the inner hole circle identification of the end face image fails, taking the inner hole identification result of other end face images closest to the end face image in the inner hole identification results of multiple frames of end face images as the inner hole circle of the target positioning area.
6. The method of claim 1, wherein the drawing a gray value circumferential change curve of the end face image comprises:
dividing the end face image into a preset number of subareas along the circumferential direction of the target end face by taking the center position of the end face as the circle center;
and drawing the gray value circumferential change curve of the end face image according to the total pixel value of each of the preset number of subareas.
7. The method of claim 1, wherein determining a translation distance between two gray value circumferential change curves corresponding to two adjacent frames of the end face image comprises:
keeping either one of the two gray value circumferential change curves fixed, translating the other of the two gray value circumferential change curves back and forth according to a preset translation rule, and calculating a plurality of cross-covariance values between the two gray value circumferential change curves;
and taking the translation amount corresponding to the maximum cross-covariance value among the plurality of cross-covariance values as the translation distance.
8. A rotational speed detection apparatus of a torsion spring, characterized in that the apparatus comprises: a memory, a processor, and a rotational speed detection program of a torque spring stored on the memory and operable on the processor, the rotational speed detection program of the torque spring being configured to implement the steps of the rotational speed detection method of a torque spring as claimed in any one of claims 1 to 7.
9. A rotational speed detection system for a torque spring, the system comprising:
the endoscopic imaging device comprises a protection tube, a torque spring and a driving piece, wherein at least part of the protection tube is arranged in a pipeline to be imaged, at least part of the torque spring is arranged in the protection tube, and the driving piece is arranged at one end of the pipeline to be imaged and connected with the torque spring and is used for driving the torque spring to move in the pipeline to be imaged;
the high-speed camera is arranged at the end of the pipeline to be imaged far away from the driving piece, and is used for shooting, at a preset frame rate, a rotary motion video of a target end face of the torque spring while the torque spring moves in the pipeline to be imaged; and,
The rotational speed detection apparatus of claim 8, connected to the high-speed camera.
10. A computer-readable storage medium, wherein a rotation speed detection program of a torque spring is stored on the storage medium, and the rotation speed detection program of the torque spring, when executed by a processor, implements the steps of the rotation speed detection method of the torque spring according to any one of claims 1 to 7.
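Taken together, claims 1, 6 and 7 can be sketched as the following pipeline (all function names, and the discretization of the circumference into equal angular sectors, are illustrative assumptions; the claimed "translation distance" is recovered as the circular shift, in sectors, that maximizes the cross-covariance between the two curves, and the instantaneous speed follows from that shift and the preset frame rate):

```python
import math

def circumferential_curve(image, center, n_sectors):
    """Claim 6: sum pixel values in n_sectors equal angular sectors
    around the end face center. The curve's horizontal axis is sector
    angle, its vertical axis the sector's total pixel value."""
    cx, cy = center
    totals = [0.0] * n_sectors
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            ang = math.atan2(y - cy, x - cx) % (2 * math.pi)
            totals[int(ang / (2 * math.pi) * n_sectors) % n_sectors] += v
    return totals

def translation_distance(curve_a, curve_b):
    """Claim 7: circular shift of curve_b (in sectors) maximizing the
    cross-covariance with curve_a."""
    n = len(curve_a)
    ma = sum(curve_a) / n
    mb = sum(curve_b) / n

    def cov(shift):
        return sum((curve_a[i] - ma) * (curve_b[(i + shift) % n] - mb)
                   for i in range(n))

    return max(range(n), key=cov)

def instantaneous_speed(shift_sectors, n_sectors, frame_rate):
    """Claim 1: rotation speed in revolutions per second from the
    inter-frame shift and the capture frame rate."""
    return shift_sectors / n_sectors * frame_rate
```

For example, with 8 sectors and a frame rate of 1000 frames per second, a one-sector shift per frame would correspond to 125 revolutions per second.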
CN202310396912.4A 2023-04-13 Method, device, system and storage medium for detecting rotation speed of torque spring Active CN116718791B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310396912.4A CN116718791B (en) 2023-04-13 Method, device, system and storage medium for detecting rotation speed of torque spring


Publications (2)

Publication Number Publication Date
CN116718791A true CN116718791A (en) 2023-09-08
CN116718791B CN116718791B (en) 2024-04-26


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018013450A (en) * 2016-07-22 2018-01-25 株式会社小野測器 Rotary velocity measurement device and rotary velocity measurement method
CN108022249A (en) * 2017-11-29 2018-05-11 中国科学院遥感与数字地球研究所 A kind of remote sensing video satellite moving vehicle target region of interest extraction method
CN111275036A (en) * 2018-12-04 2020-06-12 北京嘀嘀无限科技发展有限公司 Target detection method, target detection device, electronic equipment and computer-readable storage medium
CN111462066A (en) * 2020-03-30 2020-07-28 华南理工大学 Thread parameter detection method based on machine vision
CN112184734A (en) * 2020-09-30 2021-01-05 南京景瑞康分子医药科技有限公司 Long-time animal posture recognition system based on infrared images and wearable optical fibers
CN112288693A (en) * 2020-10-19 2021-01-29 佛山(华南)新材料研究院 Round hole detection method and device, electronic equipment and storage medium
CN113358351A (en) * 2021-06-17 2021-09-07 武汉理工大学 Rotating shaft end face torsional vibration extraction method and device based on photogrammetry
CN113893517A (en) * 2021-11-22 2022-01-07 动者科技(杭州)有限责任公司 Rope skipping true and false judgment method and system based on difference frame method
CN113989683A (en) * 2021-09-16 2022-01-28 中国科学院空天信息创新研究院 Ship detection method for synthesizing synchronous orbit sequence optical image space-time information
CN114964770A (en) * 2022-05-17 2022-08-30 同济大学 Tooth end deblurring imaging monitoring device and method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI YIYUE: "Research and Application of Shape Image Detection Technology", China Master's Theses Full-text Database, Information Science and Technology Series, p. 53 *
WANG BAIQUAN: "Research on Transmission Coaxial Photoacoustic Endoscopic Imaging Methods", China Master's Theses Full-text Database, Engineering Science and Technology II Series, pp. 32-33 *

Similar Documents

Publication Publication Date Title
JP6240064B2 (en) Method for determining depth-resolved physical and / or optical properties of a scattering medium
US8622548B2 (en) 3D retinal disruptions detection using optical coherence tomography
CN102525381B (en) The recording equipment of image processing apparatus, image processing method and embodied on computer readable
CN108292433B (en) Detection and validation of shadows in intravascular images
CN102525405B (en) Image processing apparatus and method
US20110137157A1 (en) Image processing apparatus and image processing method
JP5757724B2 (en) Image processing apparatus, image processing method, and image processing program
US10699411B2 (en) Data display and processing algorithms for 3D imaging systems
CN111429451B (en) Medical ultrasonic image segmentation method and device
CN102056533A (en) Method of eye registration for optical coherence tomography
JPH09191408A (en) Method for automatically adjusting gradation scale using picture activity measurement
Klemenčič et al. Automated segmentation of muscle fiber images using active contour models
WO2007142682A2 (en) Method for detecting streaks in digital images
JP2008022928A (en) Image analysis apparatus and image analysis program
EP2693399A1 (en) Method and apparatus for tomography imaging
CN111462156A (en) Image processing method for acquiring corneal vertex
KR20210014267A (en) Ultrasound diagnosis apparatus for liver steatosis using the key points of ultrasound image and remote medical-diagnosis method using the same
JP6646921B2 (en) Computed tomography (CT) method and CT system
CA3104562A1 (en) Method and computer program for segmentation of optical coherence tomography images of the retina
US10918275B2 (en) Optical texture analysis of the inner retina
US20210161604A1 (en) Systems and methods of navigation for robotic colonoscopy
CN116718791B (en) Method, device, system and storage medium for detecting rotation speed of torque spring
CN116718791A (en) Method, device, system and storage medium for detecting rotation speed of torque spring
CN109447948B (en) Optic disk segmentation method based on focus color retina fundus image
EP2693397B1 (en) Method and apparatus for noise reduction in an imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant