CN111414124A - Image measuring method, device, equipment and storage medium - Google Patents

Image measuring method, device, equipment and storage medium

Info

Publication number
CN111414124A
CN111414124A (application CN202010222303.3A)
Authority
CN
China
Prior art keywords
image
target object
interest
region
measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010222303.3A
Other languages
Chinese (zh)
Inventor
牟晓勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New Famous Medical Beijing Technology Co ltd
Original Assignee
New Famous Medical Beijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New Famous Medical Beijing Technology Co ltd filed Critical New Famous Medical Beijing Technology Co ltd
Priority to CN202010222303.3A
Publication of CN111414124A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

Embodiments of the invention disclose an image measuring method, apparatus, device, and storage medium. The method comprises: receiving a touch operation instruction input by a user on an original image in a first display window, the touch operation instruction comprising a click operation instruction and/or an image sliding instruction; acquiring a region-of-interest image corresponding to the touch operation instruction in the original image, and displaying the region-of-interest image in a second display window at a preset scaling ratio; and, when an image measurement instruction is detected, determining measurement data of a target object in the region-of-interest image. By displaying the user's region of interest in the second display window, embodiments of the invention overcome the inability of finger touch to position accurately, allow the region-of-interest image to be selected precisely even under finger-touch control, and improve the convenience and accuracy of image measurement.

Description

Image measuring method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of image processing, in particular to an image measuring method, device, equipment and storage medium.
Background
In daily life, there is a need to accurately measure the size of an object in an image, or the distance between two points in an image, in activities such as photography, drawing, and medical imaging. In the field of medical imaging in particular, when an imaging device images a region to be examined, the size of a lesion or the thickness of target tissue in the medical image must be measured accurately in order to reach a diagnosis and to provide a reference for the subsequent treatment plan.
Conventional image measurement locates the points to be measured in an image by clicking with an external control device such as a mouse, trackball, or stylus, and obtains measurement data by calculating the distances between those points. Portable and household medical devices, valued for their simple operation, small size, and easy portability, are attracting growing attention. Such compact medical devices usually dispense with peripheral control devices, and operations on the image are performed mainly by finger touch. However, a finger contacts a comparatively large area and occludes the screen during operation, so the measured points cannot be clicked and set accurately by touch; image measurement is therefore inconvenient and the measurement results inaccurate.
Disclosure of Invention
The embodiment of the invention provides an image measuring method, device, equipment and storage medium, which are used for accurately selecting an image of a region of interest and improving the convenience and accuracy of image measurement.
In a first aspect, an embodiment of the present invention provides an image measurement method, where the method includes:
receiving a touch operation instruction input by a user based on an original image in a first display window, wherein the touch operation instruction comprises a click operation instruction and/or an image sliding instruction;
acquiring an area-of-interest image corresponding to the touch operation instruction in the original image, and displaying the area-of-interest image in a second display window based on a preset scaling;
when an image measurement instruction is detected, measurement data of a target object in the region-of-interest image is determined.
In a second aspect, an embodiment of the present invention further provides an image measuring apparatus, including:
the touch operation instruction receiving module is used for receiving a touch operation instruction input by a user based on an original image in a first display window, wherein the touch operation instruction comprises a click operation instruction and/or an image sliding instruction;
the region-of-interest image display module is used for acquiring a region-of-interest image corresponding to the touch operation instruction in the original image and displaying the region-of-interest image in a second display window based on a preset scaling;
and the target object measurement module is used for determining the measurement data of the target object in the image of the region of interest when the image measurement instruction is detected.
In a third aspect, an embodiment of the present invention further provides an apparatus, where the apparatus includes:
one or more processors;
a memory for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement any of the image measurement methods recited above.
In a fourth aspect, embodiments of the present invention also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform any of the image measurement methods referred to above.
According to the embodiment of the invention, the image of the region of interest of the user is displayed through the second display window, so that the problem that the finger touch cannot be accurately positioned is solved, the image of the region of interest can be accurately selected even under the touch condition of the finger, and the convenience and the accuracy of image measurement are improved.
Drawings
Fig. 1 is a flowchart of an image measuring method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a second display window according to an embodiment of the invention.
Fig. 3 is a flowchart of an image measuring method according to a second embodiment of the present invention.
Fig. 4 is a flowchart of an image measuring method according to a third embodiment of the present invention.
Fig. 5 is a schematic diagram of another second display window provided in the third embodiment of the present invention.
Fig. 6 is a schematic diagram of an image measuring apparatus according to a fourth embodiment of the present invention.
Fig. 7 is a schematic structural diagram of an apparatus according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of an image measuring method according to an embodiment of the present invention, where the embodiment is applicable to a case of measuring a size of a target object in an image in a touch screen mode, and the method may be executed by an image measuring apparatus, where the apparatus may be implemented in a software and/or hardware manner, and the apparatus may be configured in a terminal device, and for example, the terminal device may be a smart phone, a desktop computer, a tablet computer, a notebook computer, and the like. The method specifically comprises the following steps:
s110, receiving a touch operation instruction input by a user based on the original image in the first display window, wherein the touch operation instruction comprises a click operation instruction and/or an image sliding instruction.
The first display window includes a display interface displayed on a display screen of the terminal device, and for example, the first display window may occupy a part, such as 50%, of the display interface of the display screen, and of course, may occupy the whole display interface of the display screen, where a specific display size of the first display window is not limited.
Wherein the original image is an image for which a measurement is required. In one embodiment, the raw image comprises a medical image, in particular, the medical image comprises an ultrasound image. The ultrasonic image is an image obtained by imaging a detected part by using ultrasonic equipment. Illustratively, the ultrasound images may be ultrasound vessel images, tumor images, carotid artery images, and the like.
In one embodiment, the touch technology optionally includes at least one of Laser Light Plane multi-touch (LLP), Frustrated Total Internal Reflection multi-touch (FTIR), Diffused Surface Illumination multi-touch (DSI), Rear Diffused Illumination multi-touch (Rear-DI), and LED Light Plane multi-touch (LED-LP).
The click operation instruction and/or image sliding instruction are generated from click and slide operations input by the user. In one embodiment, the click operation instruction comprises a single-point click instruction and/or a multi-finger click instruction: the former is generated when a click on one position point of the touch interface is received, the latter when clicks on a plurality of position points are received. In one embodiment, the image sliding instructions include instantaneous slide instructions and/or long-duration slide instructions, i.e. instructions generated when a slide operation within a preset time is received on the touch interface; illustratively, the preset time for an instantaneous slide may be 1 s and for a long-duration slide 5 min. The specific click and slide operations are not limited here and may be determined according to the actual touch settings.
S120, acquiring a region-of-interest image corresponding to the touch operation instruction in the original image, and displaying the region-of-interest image in a second display window based on a preset scaling.
In one embodiment, optionally, a region-of-interest image corresponding to the touch operation instruction in the original image is obtained according to the touch position and a preset selection rule, wherein the preset selection rule comprises at least one of a reference position, a selection shape and a selection range of the selected image.
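As a concrete illustration of such a preset selection rule, the sketch below crops a square region of interest centered on the touch point, clamped so that it stays inside the original image. The function name, the square selection shape, and the 80 x 80-pixel selection range are illustrative assumptions, not fixed by this disclosure.

```python
import numpy as np

def select_roi(image, touch_xy, half_size=40):
    """Crop a square region of interest around the touch position.

    The rule used here (touch point as reference position, square
    selection shape, fixed selection range of 2*half_size pixels) is one
    possible preset selection rule; the embodiment leaves the rule
    configurable.
    """
    h, w = image.shape[:2]
    x, y = touch_xy
    # Clamp the window so the crop always stays inside the original image.
    x0 = max(0, min(x - half_size, w - 2 * half_size))
    y0 = max(0, min(y - half_size, h - 2 * half_size))
    return image[y0:y0 + 2 * half_size, x0:x0 + 2 * half_size]

# Example: a 480x640 original image with a touch near the left edge.
image = np.zeros((480, 640), dtype=np.uint8)
roi = select_roi(image, touch_xy=(10, 240))
print(roi.shape)  # (80, 80): full-size ROI even at the image border
```

The clamping step matters in practice: without it, a touch near the image border would yield a smaller (or empty) crop, and the second display window would show a distorted region.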
In one embodiment, optionally, the preset zoom ratio comprises a preset zoom-in ratio or a preset zoom-out ratio. For example, the preset enlargement ratio may be 5 times enlargement, and the preset reduction ratio may be 2 times reduction. The second display window comprises a display interface displayed on a display screen of the terminal device, and the display interface of the second display window can be square, circular, triangular or irregular in shape. In one embodiment, optionally, the display interface of the second display window overlaps the display interface of the first display window based on a preset overlap ratio. For example, the preset overlap ratio may be 50% or 100%. And when the preset overlapping proportion is 100%, the display interface of the second display window is included in the display interface of the first display window.
Fig. 2 is a schematic diagram of a second display window according to an embodiment of the present invention, where fig. 2 illustrates that a display interface of the second display window is circular and a preset overlap ratio is 100%. Fig. 2 shows that the original image displayed in the first display window is an ultrasound grayscale image, a white dot at a position to the left of the center on the ultrasound grayscale image represents a detected touch position, and a circular area at the upper right corner of fig. 2 represents a display interface of the second display window. In one embodiment, the position of the display interface of the second display window is different from the touch position in the touch operation instruction. In one embodiment, optionally, based on the display interface of the first display window, a distance between the touch position and the second display window is determined; and when the distance is smaller than the preset distance, updating the display position of the second display window. For example, the distance between the touch position and the second display window may be a distance between the touch position and a center position of the second display window, or a distance between the touch position and a boundary position of the second display window. Specifically, the intersection point position of the connecting line between the center position and the touch position of the second display window and the display interface of the second display window may be used as the boundary position for distance measurement. Wherein, for example, the preset distance may be 10 cm. In one embodiment, updating the display position of the second display window comprises: and taking the target position with the distance from the touch position greater than the preset distance as the updated display position of the second display window, and displaying the second display window based on the updated display position. 
For example, the target position may be chosen so that its distance from the touch position is a maximum.
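The repositioning rule just described can be sketched as follows. The candidate positions (here, the four corners of the first display window) and the pixel-unit threshold are assumptions for illustration; the embodiment states the preset distance in physical units (e.g. 10 cm).

```python
import math

def update_window_position(touch, window_center, candidates, preset_distance=100.0):
    """Move the second display window away from the touch position.

    If the touch point is closer to the window than `preset_distance`,
    the window jumps to the candidate position farthest from the touch
    (the "maximum distance" rule of the embodiment).
    """
    if math.dist(touch, window_center) >= preset_distance:
        return window_center  # far enough apart: keep the current position
    return max(candidates, key=lambda c: math.dist(touch, c))

# Candidate positions: the four corners of the first display window.
corners = [(60, 60), (580, 60), (60, 420), (580, 420)]
new_pos = update_window_position(touch=(100, 100), window_center=(60, 60),
                                 candidates=corners)
print(new_pos)  # (580, 420): the corner farthest from the touch point
```

Measuring to the window center, as here, is the simpler of the two distance conventions mentioned above; measuring to the window boundary only changes the threshold comparison, not the repositioning logic.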
S130, when the image measurement instruction is detected, the measurement data of the target object in the region-of-interest image is determined.
In one embodiment, optionally, the image measurement instruction is generated when an image measurement operation input by a user is received. Here, the user may input an image measuring operation through an image measuring button or an image measuring option, for example. In an embodiment, optionally, when the touch operation instruction is a click operation instruction, if a next touch operation instruction is not received within a preset touch time, an image measurement instruction is generated. The preset touch time includes starting timing after the previous touch operation instruction is finished and the time until the next touch operation instruction is received, and for example, the preset touch time may be 5 min. In one embodiment, optionally, if the touch operation instruction is an image sliding instruction, when it is detected that the image sliding operation is finished, an image measurement instruction is generated. In an exemplary embodiment, the user slides in the first display window with a finger, when the user lifts the finger, the image sliding instruction disappears, which indicates that the image sliding operation of the user is finished, and at this time, an image measurement instruction is generated, and the target object in the region-of-interest image is measured. On the basis of the above embodiment, optionally, when the image measurement instruction is detected, the second display window is hidden.
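A minimal sketch of how the two triggers above might be combined, assuming a click timeout and slide-end detection as described; the class and method names and the 5-second timeout are hypothetical (the embodiment suggests a preset touch time such as 5 min).

```python
import time

class MeasureTrigger:
    """Decides when to generate an image measurement instruction.

    Two triggers from the embodiment: (1) no further click arrives within
    a preset touch time after the last click; (2) a slide operation ends
    (the finger is lifted).
    """

    def __init__(self, timeout=5.0):
        self.timeout = timeout
        self.last_click = None

    def on_click(self, now=None):
        # Record the moment of the most recent click operation.
        self.last_click = time.monotonic() if now is None else now

    def should_measure(self, now=None):
        # True once the preset touch time has elapsed with no new click.
        if self.last_click is None:
            return False
        now = time.monotonic() if now is None else now
        return (now - self.last_click) >= self.timeout

    def on_slide_end(self):
        # A finished slide operation triggers measurement immediately.
        return True

trigger = MeasureTrigger(timeout=5.0)
trigger.on_click(now=0.0)
print(trigger.should_measure(now=3.0))  # False: still inside the preset time
print(trigger.should_measure(now=6.0))  # True: timeout elapsed, measure now
```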
In one embodiment, optionally, an image feature extraction algorithm is used to determine contour data of the target object in the region-of-interest image, and the measurement data of the target object is calculated from the contour data. Illustratively, the image feature extraction algorithm includes at least one of Histogram of Oriented Gradients (HOG), Local Binary Patterns (LBP), Scale-Invariant Feature Transform (SIFT), and Speeded-Up Robust Features (SURF).
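Whichever feature extractor produces the contour, the final step is turning contour data into measurement data. A minimal sketch, assuming the contour is available as a binary mask and using a hypothetical calibration factor `mm_per_pixel` supplied by the imaging device:

```python
import numpy as np

def measure_target(mask, mm_per_pixel=0.1):
    """Width and height (in mm) of the target object from its contour mask.

    `mask` is a boolean array marking contour/object pixels, as any of the
    feature-extraction algorithms listed above could produce; the bounding
    box of the marked pixels gives the extents to be measured.
    """
    ys, xs = np.nonzero(mask)
    width_px = xs.max() - xs.min() + 1
    height_px = ys.max() - ys.min() + 1
    return float(width_px * mm_per_pixel), float(height_px * mm_per_pixel)

# Toy contour: a 30x20-pixel rectangular region inside an 80x80 ROI.
mask = np.zeros((80, 80), dtype=bool)
mask[25:45, 20:50] = True
print(measure_target(mask, mm_per_pixel=0.1))  # (3.0, 2.0)
```

The bounding-box extents suffice for size measurements such as lesion diameter; distances between arbitrary measured points would instead be computed pairwise from the contour coordinates.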
According to the technical scheme, the image of the region of interest of the user is displayed through the second display window, the problem that finger touch cannot be accurately positioned is solved, the image of the region of interest can be accurately selected even under the touch condition of the finger, and convenience and accuracy of image measurement are improved.
Example two
Fig. 3 is a flowchart of an image measuring method according to a second embodiment of the present invention, and the technical solution of the present embodiment is further detailed based on the above-mentioned embodiment. Optionally, the determining measurement data of the target object in the image of the region of interest includes: and determining a target measuring method according to the image type of the original image, and determining the measuring data of the target object in the image of the region of interest based on the target measuring method.
S210, receiving a touch operation instruction input by a user based on the original image in the first display window, wherein the touch operation instruction comprises a click operation instruction and/or an image sliding instruction.
S220, obtaining an area-of-interest image corresponding to the touch operation instruction in the original image, and displaying the area-of-interest image in a second display window based on a preset scaling.
And S230, when the image measurement instruction is detected, determining a target measurement method according to the image type of the original image, and determining measurement data of the target object in the region-of-interest image based on the target measurement method.
In one embodiment, optionally, the image type of the original image comprises a grayscale image, a color image, a contour image, or a contour gradient image. A grayscale image is an image represented in shades of gray: each pixel carries a single sample value, typically displayed from darkest black to brightest white. Illustratively, a grayscale image may be acquired using grayscale ultrasound. Each pixel of a color image is composed of red, green, and blue components. Illustratively, a color image may be acquired using color Doppler ultrasound. A contour image is a grayscale image on which the contour of the target object has been annotated; specifically, it contains the target object image, the contour annotation of the target object, and the background image, and the annotation itself may be gray or colored. A contour gradient image is obtained from a contour image by removing the target object image and the background image, i.e. it contains only the contour annotation of the target object.
In one embodiment, optionally, when the image type of the original image is a color image, a contour image, or a contour gradient image, the measurement data of the target object in the region-of-interest image is determined based on a pixel gradient algorithm. In an embodiment, optionally, gradient calculation is performed on pixel values of adjacent pixel points in the region-of-interest image, and contour data of the target object in the region-of-interest image is determined; and calculating the measurement data of the target object based on the contour data.
Wherein, the pixel gradient algorithm satisfies the following formula:
T = Px - Py
where T denotes the gradient difference, and Px and Py are the pixel values of adjacent pixel points in the region-of-interest image. In one embodiment, when the gradient difference satisfies a preset gradient threshold, at least one of the pixel points participating in the gradient calculation is stored as a boundary pixel point. In one embodiment, the adjacent pixel points include top-to-bottom and/or bottom-to-top neighbours, and left-to-right and/or right-to-left neighbours. Illustratively, gradient calculation on top-to-bottom neighbours determines the upper and lower boundaries in the region-of-interest image, and gradient calculation on left-to-right neighbours determines the left and right boundaries, yielding the contour data of the target object in the region-of-interest image.
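The gradient rule T = Px - Py and the boundary-pixel threshold can be sketched as follows. The threshold of 50 grey levels and the toy image are illustrative; only the structure (vertical scan for upper/lower boundaries, horizontal scan for left/right boundaries) comes from the embodiment.

```python
import numpy as np

def boundary_rows_cols(roi, threshold=50):
    """Find candidate boundary rows and columns via pixel gradients.

    Scanning top-to-bottom yields rows where the upper/lower boundaries
    lie; scanning left-to-right yields the left/right boundary columns.
    """
    roi = roi.astype(np.int32)  # avoid uint8 wrap-around when subtracting
    # T for vertically adjacent pixel pairs (row i vs. row i+1).
    t_vert = np.abs(roi[1:, :] - roi[:-1, :])
    # T for horizontally adjacent pixel pairs (column j vs. column j+1).
    t_horz = np.abs(roi[:, 1:] - roi[:, :-1])
    rows = np.unique(np.nonzero(t_vert >= threshold)[0])
    cols = np.unique(np.nonzero(t_horz >= threshold)[1])
    return rows, cols

# Toy ROI: a bright 4x6 block (value 200) on a dark background (value 20).
roi = np.full((10, 12), 20, dtype=np.uint8)
roi[3:7, 4:10] = 200
rows, cols = boundary_rows_cols(roi)
print(rows, cols)  # upper/lower boundary rows [2 6], left/right columns [3 9]
```

Casting to a signed type before subtracting is essential; with raw uint8 pixels the difference would wrap around and false boundaries would appear.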
In an embodiment, optionally, when the image type of the original image is a grayscale image, a pre-trained recognition model performs contour recognition on the target object in the region-of-interest image to obtain its contour data, and the measurement data of the target object is calculated from that contour data. The pixel gradient algorithm above measures well when the region-of-interest image contains a clearly visible contour of the target object; a grayscale image, however, renders the target object less distinctly than a color image, contour image, or contour gradient image. The applicant therefore proposes that, when the original image is a grayscale image, a pre-trained recognition model be used for contour recognition of the target object in the region-of-interest image. In an embodiment, optionally, a training image set is obtained and the target objects in it are annotated; the annotated set is input to an initial recognition model as training samples, and the model parameters are adjusted according to the output results to obtain the trained recognition model.
According to the technical scheme of the embodiment, the target measuring method is determined according to the image type of the original image, the target object is measured, the problem of inaccurate image measuring result is solved, and the accurate image measuring result can be obtained by adopting the technical scheme no matter the original image with an obvious outline and the original image with a fuzzy outline.
EXAMPLE III
Fig. 4 is a flowchart of an image measuring method provided by a third embodiment of the present invention, and the technical solution of the present embodiment is further detailed based on the above embodiments. Optionally, a ruler is displayed in the second display window, and a display direction and/or a display position of the ruler in the second display window are determined according to a target object in the original image and/or according to a received ruler sliding instruction input by a user.
The specific implementation steps of this embodiment include:
s310, receiving a touch operation instruction input by a user based on the original image in the first display window, wherein the touch operation instruction comprises a click operation instruction and/or an image sliding instruction.
S320, acquiring a region-of-interest image corresponding to the touch operation instruction in the original image, and displaying the region-of-interest image in a second display window based on a preset scaling.
In one embodiment, optionally, the second display window further displays a scale, and the scale marked by the scale is the same as the actual size of the target object. For example, the actual length of the target object is 1m, the size of the image of the target object displayed in the second display window is 10cm, and the scale marked on the scale at the position of 10cm in the image is 1 m.
In one embodiment, optionally, determining the display direction and/or display position of the ruler in the second display window according to the target object in the original image includes: when the target object in the original image is the carotid intima-media, the display direction of the scale is vertical; when the target object is the fetal nuchal translucency, the display direction of the scale is horizontal. The carotid intima-media thickness is an important ultrasound index for assessing atherosclerosis and cardiovascular disease. Fetal nuchal translucency is an important ultrasound index for assessing the fetal risk of Down syndrome, other chromosomal abnormalities, and major congenital heart disease, and is generally measured at 11 to 14 weeks of pregnancy. Specifically, the target object in the original image may be input by the user before the image measurement instruction is received.
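The scale labelling from the example above (a 1 m object drawn 10 cm wide) and the per-target display direction reduce to simple look-ups. Both functions below are illustrative sketches; in particular, falling back to a horizontal direction for targets the embodiment does not name is an assumption.

```python
def ruler_label(display_pos_cm, actual_per_display=10.0):
    """Actual-size label (in cm) to draw at a given on-screen ruler position.

    `actual_per_display` is the ratio of actual size to displayed size;
    with a 1 m object shown 10 cm wide it is 10, so the mark at the 10 cm
    position is labelled 100 cm, i.e. 1 m.
    """
    return display_pos_cm * actual_per_display

def ruler_direction(target_object):
    """Preset display direction per target type, as in this embodiment."""
    directions = {"carotid_intima_media": "vertical",
                  "fetal_nuchal_translucency": "horizontal"}
    return directions.get(target_object, "horizontal")  # fallback: assumed

print(ruler_label(10))                          # 100.0 (the 1 m mark)
print(ruler_direction("carotid_intima_media"))  # vertical
```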
In one embodiment, optionally, determining the display direction and/or the display position of the scale in the second display window according to the received scale sliding instruction input by the user includes: and determining the display direction and/or the display position of the scale in the second display window according to the received scale sliding instruction input by the user based on the first display window or the second display window.
In one embodiment, determining the display direction and/or display position of the scale in the second display window according to a ruler sliding instruction input by the user in the first display window comprises: after the image sliding instruction is detected to have ended, treating a newly detected sliding instruction in the first display window as a ruler sliding instruction, and determining the display direction and/or display position of the scale in the second display window from it. In another embodiment, it comprises: upon receiving a scale operation instruction, treating the sliding instruction detected in the first display window as a ruler sliding instruction and determining the display direction and/or display position of the scale accordingly. Illustratively, the scale operation instruction is generated when a scale operation option input by the user is received; the option may be input by text, by clicking a selection option, or via a scale operation button.
In one embodiment, the display direction and/or display position of the scale in the second display window is determined according to a ruler sliding instruction input by the user in the second display window. Specifically, multi-touch technology is used to capture the ruler sliding instruction input in the second display window, and the display direction and/or display position of the ruler are determined from it. Multi-touch is a technology that lets a user control a graphical interface with several fingers or touch points simultaneously, enabling human-computer interaction. Fig. 5 is a schematic diagram of another second display window provided in the third embodiment of the present invention; the scale in Fig. 5 is displayed in the center of the second display window.
In one embodiment, the user aligns a contour boundary of the target object in the region-of-interest image with the scale through an image sliding operation, and the target object in the region-of-interest image is then measured with the target measurement method based on the contour boundary aligned with the scale. For example, the contour boundary may be aligned with a preset scale line. The advantage of this arrangement is that the user can fix at least one contour boundary of the target object against the scale, which improves the accuracy of the subsequent measurement based on that boundary.
On the basis of the above embodiment, optionally, a pixel gradient calculation is performed on the target object in the region-of-interest image and a gradient auxiliary line is drawn; the gradient auxiliary line is then displayed in the second display window. In one embodiment, when top-to-bottom dimensions need to be measured, the gradient auxiliary line is drawn with the full top-to-bottom extent of the second display window as the abscissa and the sum of the pixel values corresponding to each abscissa position as the ordinate. The abscissa of the gradient auxiliary line is parallel to the display direction of the scale. In another embodiment, when left-to-right width data needs to be measured, the full left-to-right extent of the second display window is taken as the abscissa and the sum of the pixel values corresponding to each abscissa position as the ordinate, and the gradient auxiliary line is drawn accordingly; again the abscissa of the gradient auxiliary line is parallel to the display direction of the scale. The advantage of this arrangement is that the user can roughly gauge the size of the target object in the region-of-interest image from the gradient auxiliary line and, once the size is confirmed to meet the requirement, measure the target object precisely. The gradient auxiliary line also makes it easier for the user to align the contour boundary of the target object with the scale, which improves the accuracy of the subsequent measurement.
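The per-position pixel sums that form the gradient auxiliary line can be sketched as follows. This is a minimal illustration, assuming the region-of-interest image is available as a 2-D NumPy array of grayscale values; the function names `gradient_auxiliary_profile` and `rough_width` are illustrative, not from the patent:

```python
import numpy as np

def gradient_auxiliary_profile(roi, direction="horizontal"):
    """Sum pixel values along one direction of the region-of-interest image.

    For left-to-right measurement the profile has one value per column
    (abscissa = x position); for top-to-bottom measurement it has one
    value per row.  Plotting this profile yields a curve of the kind the
    text calls a gradient auxiliary line.
    """
    roi = np.asarray(roi, dtype=np.float64)
    if direction == "horizontal":      # left-to-right: sum each column
        return roi.sum(axis=0)
    if direction == "vertical":        # top-to-bottom: sum each row
        return roi.sum(axis=1)
    raise ValueError("direction must be 'horizontal' or 'vertical'")

def rough_width(profile, threshold):
    """Rough object extent in pixels: the span of positions whose summed
    intensity exceeds a threshold (the user's quick size check)."""
    idx = np.flatnonzero(profile > threshold)
    return 0 if idx.size == 0 else int(idx[-1] - idx[0] + 1)
```

With a bright 10-pixel-wide object on a dark background, `rough_width` returns 10, which is the kind of coarse estimate the auxiliary line is meant to give before a precise measurement is taken.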
S330, displaying the scale in the second display window, and determining the display direction and/or the display position of the scale in the second display window according to the target object in the original image and/or according to the received scale sliding instruction input by the user.
S340, when the image measurement instruction is detected, determining a target measurement method according to the image type of the original image, and determining measurement data of the target object in the region-of-interest image based on the target measurement method.
When selecting the region-of-interest image, the user may have a requirement on the size of the target object, for example measuring only target objects larger than 10 cm. By displaying a scale in the second display window, the technical scheme solves the problem that the size of the target object cannot be judged in advance: it helps the user pre-screen the selected region-of-interest image and avoids the situation where the measured target object turns out not to meet the size requirement and an image must be reselected, which would otherwise reduce image measurement efficiency.
Example four
Fig. 6 is a schematic diagram of an image measuring apparatus according to a fourth embodiment of the present invention. The apparatus can be implemented in software and/or hardware and can be configured in a terminal device. Illustratively, the terminal device may be a smart phone, a desktop computer, a tablet computer, a notebook computer, or the like. The image measuring apparatus includes: a touch operation instruction receiving module 410, a region-of-interest image display module 420 and a target object measurement module 430.
The touch operation instruction receiving module 410 is configured to receive a touch operation instruction input by a user based on an original image in a first display window, where the touch operation instruction includes a click operation instruction and/or an image sliding instruction;
the region-of-interest image display module 420 is configured to acquire a region-of-interest image corresponding to the touch operation instruction in the original image, and display the region-of-interest image in the second display window based on a preset scaling;
a target object measurement module 430, configured to determine measurement data of the target object in the region of interest image when the image measurement instruction is detected.
According to the technical scheme, by displaying the user's region of interest in the second display window, the problem that a finger touch cannot be accurately positioned is solved, so that the region-of-interest image can be accurately selected even under finger touch, improving the convenience and accuracy of image measurement.
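What the two display modules do can be sketched in a few lines: crop a region of interest around the touch point and magnify it by the preset scaling for the second display window. This is a minimal sketch, assuming the original image is a 2-D NumPy array and an integer scaling factor; the helper names `extract_roi` and `scale_roi` are hypothetical:

```python
import numpy as np

def extract_roi(image, center_xy, size_wh):
    """Crop a region of interest around a touch point, clamped to the
    image bounds (mirrors the region-of-interest image display module)."""
    h, w = image.shape[:2]
    cx, cy = center_xy
    rw, rh = size_wh
    x0 = max(0, min(w - rw, cx - rw // 2))
    y0 = max(0, min(h - rh, cy - rh // 2))
    return image[y0:y0 + rh, x0:x0 + rw]

def scale_roi(roi, factor):
    """Nearest-neighbour magnification by a preset integer scaling factor,
    as the second display window would show the cropped region."""
    return np.repeat(np.repeat(roi, factor, axis=0), factor, axis=1)
```

In practice a display toolkit would do the resampling with interpolation; nearest-neighbour repetition is used here only to keep the sketch self-contained.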
On the basis of the above technical solution, optionally, the apparatus further includes:
and the scale display module is used for displaying the scale in the second display window, and determining the display direction and/or the display position of the scale in the second display window according to the target object in the original image and/or according to the received scale sliding instruction input by the user.
Optionally, the target object measurement module 430 includes:
and the target object measuring unit is used for determining a target measuring method according to the image type of the original image and determining the measuring data of the target object in the interested area image based on the target measuring method.
Optionally, the image type of the original image includes a grayscale image, a color image, a contour image, or a contour gradient image.
Optionally, the target object measuring unit includes:
the target object measurement first subunit is used for carrying out contour recognition on a target object in the image of the region of interest by adopting a recognition model which is trained in advance when the image type of the original image is a gray image, so as to obtain contour data of the target object;
and calculating the measurement data of the target object based on the contour data.
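The recognition model itself is not specified in the text, but once contour data is available as a set of points, measurement data can be derived as in the following sketch (bounding extents plus shoelace area; it assumes the contour is an (N, 2) array of (x, y) pixel coordinates, and the function name is illustrative):

```python
import numpy as np

def measurements_from_contour(contour):
    """Derive basic measurement data from contour points.

    `contour` is an (N, 2) array of (x, y) pixel coordinates such as a
    segmentation model might output.  Returns pixel-space width, height
    and enclosed area (shoelace formula); converting to physical units
    would additionally require the pixel spacing of the original image.
    """
    c = np.asarray(contour, dtype=np.float64)
    width = float(c[:, 0].max() - c[:, 0].min())
    height = float(c[:, 1].max() - c[:, 1].min())
    x, y = c[:, 0], c[:, 1]
    # Shoelace formula for the area of the polygon traced by the contour.
    area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    return {"width": width, "height": height, "area": float(area)}
```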
Optionally, the target object measuring unit includes:
and the target object measurement second subunit is used for determining the measurement data of the target object in the interested area image based on the pixel gradient algorithm when the image type of the original image is a color image, a contour image or a contour gradient image.
Optionally, the target object measurement second subunit is specifically configured to:
performing gradient calculation on pixel values of all adjacent pixel points in the interested area image, and determining contour data of a target object in the interested area image;
and calculating the measurement data of the target object based on the contour data.
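A minimal one-dimensional illustration of the pixel gradient idea above: take differences between adjacent pixel values, treat positions where the gradient magnitude exceeds a threshold as contour edges, and measure the extent between the outermost edges. This is a sketch under those assumptions, not the patent's exact implementation:

```python
import numpy as np

def edge_extent_from_gradient(roi_row, grad_threshold):
    """Locate the object's edges in one image row from the gradient
    between adjacent pixel values, and return the pixel extent between
    the outermost edges (None if fewer than two edges are found)."""
    row = np.asarray(roi_row, dtype=np.float64)
    grad = np.abs(np.diff(row))                # gradient between adjacent pixels
    edges = np.flatnonzero(grad > grad_threshold)
    if edges.size < 2:
        return None
    return int(edges[-1] - edges[0])
```

A full two-dimensional version would apply the same differencing along both axes to trace the whole contour before computing the measurement data.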
The image measuring device provided by the embodiment of the invention can be used for executing the image measuring method provided by the embodiment of the invention, and has corresponding functions and beneficial effects of the executing method.
It should be noted that, in the embodiment of the image measuring apparatus, the included units and modules are only divided according to the functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
EXAMPLE five
Fig. 7 is a schematic structural diagram of a device according to a fifth embodiment of the present invention. The device provides a service for implementing the image measurement method of the foregoing embodiments and may be configured with the image measuring apparatus described above. Fig. 7 illustrates a block diagram of an exemplary device 12 suitable for implementing embodiments of the present invention. The device 12 shown in Fig. 7 is only an example and should not impose any limitation on the function or scope of use of the embodiments of the present invention.
As shown in FIG. 7, device 12 is in the form of a general purpose computing device. The components of device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. Device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 7, and commonly referred to as a "hard drive"). Although not shown in FIG. 7, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24), with one or more devices that enable a user to interact with device 12, and/or with any device (e.g., a network card or modem) that enables device 12 to communicate with one or more other computing devices. Such communication can occur via input/output (I/O) interfaces 22. Furthermore, device 12 can communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via network adapter 20. As shown in FIG. 7, network adapter 20 communicates with the other modules of device 12 via bus 18. It should be appreciated that, although not shown, other hardware and/or software modules may be used in conjunction with device 12, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
The processing unit 16 executes various functional applications and data processing, such as implementing an image measurement method provided by an embodiment of the present invention, by running a program stored in the system memory 28.
Through the above device, the problem that a finger touch cannot be accurately positioned is solved, so that the region-of-interest image can be accurately selected even under finger touch, improving the convenience and accuracy of image measurement.
EXAMPLE six
An embodiment of the present invention further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform an image measurement method, the method comprising:
receiving a touch operation instruction input by a user based on an original image in a first display window, wherein the touch operation instruction comprises a click operation instruction and/or an image sliding instruction;
acquiring an interesting area image corresponding to the touch operation instruction in the original image, and displaying the interesting area image in a second display window based on a preset scaling;
when an image measurement instruction is detected, measurement data of a target object in the region-of-interest image is determined.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages.
Of course, the storage medium provided by the embodiment of the present invention contains computer-executable instructions, and the computer-executable instructions are not limited to the above method operations, and may also perform related operations in the image measurement method provided by any embodiment of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An image measuring method, characterized by comprising:
receiving a touch operation instruction input by a user based on an original image in a first display window, wherein the touch operation instruction comprises a click operation instruction and/or an image sliding instruction;
acquiring an area-of-interest image corresponding to the touch operation instruction in the original image, and displaying the area-of-interest image in a second display window based on a preset scaling;
when an image measurement instruction is detected, measurement data of a target object in the region-of-interest image is determined.
2. The method of claim 1, further comprising:
and displaying a scale in the second display window, and determining the display direction and/or the display position of the scale in the second display window according to a target object in the original image and/or according to a received scale sliding instruction input by a user.
3. The method of claim 1, wherein the determining measurement data of a target object in the region of interest image comprises:
and determining a target measuring method according to the image type of the original image, and determining the measuring data of the target object in the image of the region of interest based on the target measuring method.
4. The method of claim 3, wherein the image type of the original image comprises a grayscale image, a color image, a contour image, or a contour gradient image.
5. The method of claim 4, wherein determining a target measurement method according to the image type of the raw image and determining measurement data of a target object in the region-of-interest image based on the target measurement method comprises:
when the image type of the original image is a gray image, carrying out contour recognition on a target object in the image of the region of interest by adopting a recognition model trained in advance to obtain contour data of the target object;
and calculating the measurement data of the target object based on the profile data.
6. The method of claim 4, wherein determining a target measurement method according to the image type of the raw image and determining measurement data of a target object in the region-of-interest image based on the target measurement method comprises:
when the image type of the original image is a color image, a contour image or a contour gradient image, the measurement data of the target object in the region-of-interest image is determined based on a pixel gradient algorithm.
7. The method of claim 6, wherein the determining measurement data of a target object in the region-of-interest image based on the pixel gradient algorithm comprises:
performing gradient calculation on pixel values of adjacent pixel points in the region-of-interest image, and determining contour data of a target object in the region-of-interest image;
and calculating the measurement data of the target object based on the profile data.
8. An image measuring apparatus, characterized by comprising:
the touch operation instruction receiving module is used for receiving a touch operation instruction input by a user based on an original image in a first display window, wherein the touch operation instruction comprises a click operation instruction and/or an image sliding instruction;
the interesting region image display module is used for acquiring an interesting region image corresponding to the touch operation instruction in the original image and displaying the interesting region image in a second display window based on a preset scaling;
and the target object measurement module is used for determining the measurement data of the target object in the image of the region of interest when the image measurement instruction is detected.
9. An apparatus, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the image measurement method of any one of claims 1-7.
10. A storage medium containing computer-executable instructions for performing the image measurement method of any one of claims 1-7 when executed by a computer processor.
CN202010222303.3A 2020-03-26 2020-03-26 Image measuring method, device, equipment and storage medium Pending CN111414124A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010222303.3A CN111414124A (en) 2020-03-26 2020-03-26 Image measuring method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111414124A (en) 2020-07-14

Family

ID=71493266

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010222303.3A Pending CN111414124A (en) 2020-03-26 2020-03-26 Image measuring method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111414124A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5999892A (en) * 1997-05-05 1999-12-07 Fan; Rodric C. Method for providing an instant distance measurement on a map
CN104598228A (en) * 2015-01-05 2015-05-06 深圳市金立通信设备有限公司 Measuring method of size of object
CN107480673A (en) * 2017-06-30 2017-12-15 上海联影医疗科技有限公司 Determine the method, apparatus and image editing system of area-of-interest in medical image
CN110680399A (en) * 2019-10-25 2020-01-14 深圳度影医疗科技有限公司 Automatic measurement method of prenatal ultrasound image, storage medium and ultrasound equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113175879A (en) * 2021-03-25 2021-07-27 潮州三环(集团)股份有限公司 Method, device, equipment and medium for detecting T surface of sharp-nose ceramic column
CN113778302A (en) * 2021-08-20 2021-12-10 北京达佳互联信息技术有限公司 Interface element display method and device, storage medium and electronic equipment
CN114066781A (en) * 2022-01-18 2022-02-18 浙江鸿禾医疗科技有限责任公司 Capsule endoscope intestinal tract image identification and positioning method, storage medium and equipment
CN114066781B (en) * 2022-01-18 2022-05-10 浙江鸿禾医疗科技有限责任公司 Capsule endoscope intestinal image identification and positioning method, storage medium and equipment

Similar Documents

Publication Publication Date Title
US10545643B2 (en) User interface for data comparison
US11113836B2 (en) Object detection method, device, apparatus and computer-readable storage medium
CN111414124A (en) Image measuring method, device, equipment and storage medium
US8401285B1 (en) Methods, apparatuses, and computer program products for controlling luminance of non-tissue objects within an image
US20150030249A1 (en) Overlapped handwriting input method
JP6089886B2 (en) Region dividing method and inspection apparatus
US9058650B2 (en) Methods, apparatuses, and computer program products for identifying a region of interest within a mammogram image
CN111625668B (en) Object detection and candidate filtering system
US20180239957A1 (en) Heuristic finger detection method based on depth image
JP2015153195A (en) Gesture recognition device and control method therefor
CN108596955A (en) A kind of image detecting method, image detection device and mobile terminal
JP2015153120A (en) Image processor, image processing method and image processing program
CN112215769A (en) Ultrasonic image processing method and device, ultrasonic equipment and storage medium
WO2020103462A1 (en) Video search method and apparatus, computer device, and storage medium
US9569661B2 (en) Apparatus and method for neck and shoulder landmark detection
CN117372415A (en) Laryngoscope image recognition method, device, computer equipment and storage medium
CN110063753B (en) Method and device for measuring head circumference of fetus in ultrasonic image
JP2017534121A (en) Handwritten content classification method and device, and handwritten content editing device
TWI571753B (en) Electronic calculating apparatus for generating an interaction index map of an image, method thereof and computer program product thereof
KR20200005853A (en) Method and System for People Count based on Deep Learning
CN117333487B (en) Acne classification method, device, equipment and storage medium
TWI825643B (en) Medical auxiliary information generation method and medical auxiliary information generation system
TWI825633B (en) Image processing method, image processing apparatus, and image processing system
JP2014102711A (en) Contour detection device, contour detection method, and contour detection program
CN114098663A (en) Pulse wave acquisition method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200714