CN113436171A - Processing method and device for cupping mark images - Google Patents

Processing method and device for cupping mark images

Info

Publication number
CN113436171A
Authority
CN
China
Prior art keywords
cupping mark
partition
image
viscera
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110721540.9A
Other languages
Chinese (zh)
Other versions
CN113436171B (en)
Inventor
张智
任睿芳
滕慧慧
曹晨思
程京
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CapitalBio Corp
Original Assignee
CapitalBio Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CapitalBio Corp
Priority to CN202110721540.9A
Publication of CN113436171A
Application granted
Publication of CN113436171B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G06T 7/0012 - Biomedical image inspection
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 9/00 - Pneumatic or hydraulic massage
    • A61H 9/005 - Pneumatic massage
    • A61H 9/0057 - Suction
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M - DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 1/00 - Suction or pumping devices for medical purposes; Devices for carrying-off, for treatment of, or for carrying-over, body-liquids; Drainage systems
    • A61M 1/08 - Cupping glasses, i.e. for enhancing blood circulation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 - Geometric image transformations in the plane of the image
    • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/11 - Region-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30004 - Biomedical image processing
    • G06T 2207/30101 - Blood vessel; Artery; Vein; Vascular
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Epidemiology (AREA)
  • Rehabilitation Therapy (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Pain & Pain Management (AREA)
  • Vascular Medicine (AREA)
  • Anesthesiology (AREA)
  • Biomedical Technology (AREA)
  • Hematology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Embodiments of the present application provide a processing method and device for a cupping mark image. The method obtains a target image; the target image contains cupping mark sub-images of the cupping marks left on the back of a human body after cupping according to preset viscera partitions, the preset viscera partitions being the viscera function reflex zones into which the back of the human body is divided; determines the cupping mark partition of each cupping mark sub-image according to the preset viscera partitions; and calls a preset organ function reflex zone division algorithm corresponding to each cupping mark partition to draw the viscera function reflex zone image corresponding to each cupping mark sub-image. The application can automatically recognize cupping marks and extract function reflex zone images, avoiding subjective human influence, obtaining accurate images of the viscera function reflex zones, and laying a foundation for computer-automated cupping diagnosis.

Description

Processing method and device for cupping mark images
Technical Field
The present application relates to the field of image processing technologies, and in particular to a method and an apparatus for processing a cupping mark image.
Background
With the spread and popularization of traditional Chinese medicine, cupping for health preservation is used increasingly widely both in China and abroad; at sporting events such as the Olympic Games and the European Cup, athletes from many countries have used cupping for health care.
In traditional Chinese medicine, cupping diagnosis infers the functional states of the viscera and the health of the human body by observing the color and morphological characteristics of the cupping marks left in different areas of the back after cupping.
At present, the application and research of cupping diagnosis rely mainly on visual observation and personal operating experience: based on the traditional Chinese medicine theories of visceral manifestation and meridians and on Western anatomy and physiology of the dorsal splanchnic nerves, the viscera function cupping areas are divided and located on the back, and an approximate viscera function reflex zone is delineated within each cupping area to determine the reflex zone image of each visceral organ.
Therefore, in the prior art, the visceral organ function reflex zone images corresponding to the cupping marks can only be divided by manual experience; accurate images of specific visceral organ function reflex zones cannot be obtained, so computer-automated cupping diagnosis cannot be realized.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for processing a cupping mark image, so as to obtain visceral organ function reflex zone images objectively and accurately and lay a foundation for computer-automated cupping diagnosis. The specific technical scheme is as follows:
An embodiment of the present application discloses a processing method for a cupping mark image, comprising the following steps:
obtaining a target image; the target image contains cupping mark sub-images of the cupping marks left on the back of a human body after cupping according to preset viscera partitions; the preset viscera partitions are the viscera function reflex zones into which the back of the human body is divided;
determining the cupping mark partition of each cupping mark sub-image according to the preset viscera partitions;
and calling a preset organ function reflex zone division algorithm corresponding to each cupping mark partition to draw the viscera function reflex zone image corresponding to each cupping mark sub-image.
In some embodiments, the parameters of the preset organ function reflex zone division algorithm specifically include:
one or more items of curve data, standard cupping mark circle data, the offset of the curve relative to the center of the standard cupping mark circle, and a curve effective-area division rule.
In some embodiments, calling the preset organ function reflex zone division algorithm corresponding to each cupping mark partition to draw the viscera function reflex zone image corresponding to each cupping mark sub-image includes:
for each cupping mark sub-image, taken as the target cupping mark sub-image:
drawing the viscera reflex zone curves corresponding to the target cupping mark sub-image according to the one or more items of curve data corresponding to the cupping mark partition of the target cupping mark sub-image and the standard cupping mark circle data;
translating the viscera reflex zone curves onto the target cupping mark sub-image according to the offset;
and segmenting the target cupping mark sub-image according to the viscera reflex zone curves and the curve effective-area division rules corresponding to the viscera reflex zone curves, to obtain the viscera function reflex zone image corresponding to the target cupping mark sub-image.
In some embodiments, if it is determined that the cupping mark circle data of the target cupping mark sub-image does not match the standard cupping mark circle data, the method further includes, before translating the viscera reflex zone curves onto the target cupping mark sub-image according to the offset:
calculating a cupping mark scaling ratio according to the standard cupping mark circle data and the cupping mark circle data of the target cupping mark sub-image;
and scaling the viscera reflex zone curves according to the scaling ratio.
In some embodiments, the preset viscera partitions specifically comprise:
a lung area, a heart area, a gallbladder area, a liver area, a spleen area, a stomach area, a large intestine area, a small intestine area, a left kidney area, a right kidney area and a bladder area.
In some embodiments, the lung region consists of a nasal partition, a throat-tonsil partition, a trachea-left bronchus partition, a trachea-right bronchus partition, a thyroid partition and a lung partition;
the heart area is composed of a coronary artery partition, a cardiac muscle partition, a cerebral vessel-right brain partition, a cerebral vessel-left brain partition and a cervical vertebra partition;
the biliary region consists of a gallbladder partition and a bile duct partition;
the liver region consists of fatty liver partition, liver-brain-breast partition;
the spleen area is the entire standard cupping mark circle;
the stomach area consists of a pylorus partition, a cardia partition, a stomach partition and a duodenum partition;
the large intestine area consists of a large intestine-colon-pancreas subarea, an oral cavity subarea and a rectum subarea;
the small intestine area consists of a small intestine subarea and a duodenum subarea;
the left kidney area consists of a lumbar-kidney-ureter partition, a left kidney partition and a right lower limb partition;
the right kidney area consists of a lumbar-kidney-ureter partition, a right kidney partition and a left lower limb partition;
the bladder zone consists of a bladder partition, a uterus body partition, a cervix partition, a vagina-urethra partition, a prostate/ovary-fallopian tube partition and an anus partition.
The present application also provides a processing device for a cupping mark image, comprising:
an obtaining module, configured to obtain a target image; the target image contains cupping mark sub-images of the cupping marks left on the back of a human body after cupping according to preset viscera partitions; the preset viscera partitions are the viscera function reflex zones into which the back of the human body is divided;
a partition module, configured to determine the cupping mark partition of each cupping mark sub-image according to the preset viscera partitions;
and a drawing module, configured to call a preset organ function reflex zone division algorithm corresponding to each cupping mark partition to draw the viscera function reflex zone image corresponding to each cupping mark sub-image.
In some embodiments, the parameters of the preset organ function reflex zone division algorithm specifically include:
one or more items of curve data, standard cupping mark circle data, the offset of the curve relative to the center of the standard cupping mark circle, and a curve effective-area division rule.
In some embodiments, the drawing module is specifically configured to:
for each cupping mark sub-image, taken as the target cupping mark sub-image:
draw the viscera reflex zone curves corresponding to the target cupping mark sub-image according to the one or more items of curve data corresponding to the cupping mark partition of the target cupping mark sub-image and the standard cupping mark circle data;
translate the viscera reflex zone curves onto the target cupping mark sub-image according to the offset;
and segment the target cupping mark sub-image according to the viscera reflex zone curves and the curve effective-area division rules corresponding to the viscera reflex zone curves, to obtain the viscera function reflex zone image corresponding to the target cupping mark sub-image.
In some embodiments, the drawing module is further configured to:
if it is determined that the cupping mark circle data of the target cupping mark sub-image does not match the standard cupping mark circle data, before translating the viscera reflex zone curves onto the target cupping mark sub-image according to the offset:
calculate a cupping mark scaling ratio according to the standard cupping mark circle data and the cupping mark circle data of the target cupping mark sub-image;
and scale the viscera reflex zone curves according to the scaling ratio.
The processing method and device for cupping mark images provided by the present application can automatically recognize cupping marks and extract function reflex zone images, avoid subjective human influence, obtain accurate viscera function reflex zone images, and lay a foundation for computer-automated cupping diagnosis.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of the viscera function reflex zones into which the back of the human body is divided according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a processing method for a cupping mark image disclosed in the present application;
FIGS. 3.1.1-3.9.6 are schematic diagrams of the organ function reflex zone division algorithms;
FIG. 4 is a schematic structural diagram of a processing device for a cupping mark image disclosed in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Traditional Chinese medicine is characterized mainly by basic theories such as the unity of heaven and human, the holistic concept and treatment based on syndrome differentiation, and emphasizes analyzing the human body from a holistic viewpoint. Under the guidance of meridian theory and visceral manifestation theory, the physical stimulation of cupping at specific positions can stimulate the conduction of qi, blood and essence, so as to achieve the purpose of health care.
Cupping diagnosis is based on the phenomenological (image-taking) theory of traditional Chinese medicine as applied to cupping marks and on the division of the cupping mark areas. According to the biological holographic theory, each part of a living body carries information about the whole, and the essence of this holographic view also runs through traditional Chinese medicine theory: the 11 viscera function reflex zones on the back are considered to correspond one-to-one with the viscera of the human body, and the states of the visceral organs are expressed in the cupping marks of these 11 areas. Therefore, the functional states and the health of the human viscera can be reflected by observing and analyzing the cupping marks in these areas and the color and morphological characteristics within their sub-divisions.
Modern medicine holds that the negative-pressure mechanical action of cupping can cause mild autohemolysis, producing local congestion and edema, enhancing local blood supply, improving blood oxygen supply, strengthening the excretory function of local sweat and sebaceous glands, and enhancing local gas exchange. The back carries visceral motor nerves, visceral sensory nerves and the blood supply of the spinal cord, so the specific reflex zones can form marks corresponding to the visceral states, giving cupping a strong auxiliary diagnostic function.
Under the guidance of traditional Chinese and Western medical theory, in the embodiments of the present application 11 viscera function reflex zones on the back of the human body are defined in advance as the preset viscera partitions, and a preset organ function reflex zone division algorithm is further provided for each preset viscera partition to divide it into finer organ function reflex partitions that represent the health state of specific organs. Each preset viscera partition can be described by a circle corresponding to a standard cupping mark and a set of differently shaped regions.
It should be understood that in the embodiments of the present application the standard cupping mark is defined as a circle C0 with radius r. Here r = 200 pixels is taken as an example, i.e. all parameters in the curve formulas that appear later are calculated using this radius value; if the radius r changes, the corresponding curves are scaled proportionally.
In the embodiment of the present application, a pixel coordinate system with the center of the circle C0 as the origin is adopted, and the numbers involved are all in units of pixels unless otherwise specified.
Referring to fig. 1, fig. 1 is a schematic diagram of the viscera function reflex zones into which the back of the human body is divided according to an embodiment of the present application.
In the embodiment of the present application, the predetermined visceral partition may be divided into a lung area, a heart area, a gallbladder area, a liver area, a spleen area, a stomach area, a large intestine area, a small intestine area, a left kidney area, a right kidney area, and a bladder area.
The lung area is located at the Dazhui acupoint, i.e. the depression below the spinous process of the 7th cervical vertebra; the bladder area is located at the Changqiang acupoint, at the top of the gluteal cleft; the large intestine area is at the midpoint of the line connecting the lung area and the bladder area, and the remaining 6 areas are spaced evenly along that line; the spleen area is 3 cun lateral on the left side and the liver area is 3 cun lateral on the right side, between the gallbladder area and the stomach area.
The specific parameters of the preset organ function reflex zone division algorithm may include: 1) one or more items of curve data; 2) data of the standard cupping mark circle C0, namely the width and height of the rectangle circumscribing C0; 3) the offset of the curve from the center of C0; 4) the curve effective-area division rule, also called the curve effective-area flag, i.e. which side of the curve the target area lies on.
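These four kinds of parameters can be grouped directly in code. The following is a minimal illustrative sketch in Python, not taken from the patent: the class and field names, the default values and the choice of the throat-tonsil parabola (quoted later in the description) as an example are all assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

import numpy as np


@dataclass
class ReflexZoneParams:
    """Hypothetical container for one reflex-zone division rule."""
    # 1) One or more curve definitions, each a callable mapping x to y (or y to x)
    #    in the coordinate system centred on the standard circle C0.
    curves: List[Callable[[np.ndarray], np.ndarray]]
    # 2) Data of the standard cupping mark circle C0: width and height of its
    #    circumscribed rectangle (both 2*r = 400 px for the r = 200 px example).
    standard_circle_wh: Tuple[int, int] = (400, 400)
    # 3) Offset of the curve relative to the centre of C0, in pixels.
    offset: Tuple[float, float] = (0.0, 0.0)
    # 4) Effective-area flag: which side of the curve is the target area,
    #    e.g. "above", "below", "left", "right" or "inside".
    valid_side: str = "below"


# Example using the throat-tonsil parabola; the choice of "below" as the valid
# side is only a guess for illustration.
throat_tonsil = ReflexZoneParams(
    curves=[lambda x: -0.0013 * x**2 - 0.0008 * x - 47.47],
    valid_side="below",
)
```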
On this basis, the technical scheme of the present application proposes a processing method for a cupping mark image.
Referring to fig. 2, fig. 2 is a flow chart of the processing method for a cupping mark image disclosed in the present application.
An embodiment of the present application discloses a processing method for a cupping mark image, comprising the following steps:
S201, obtaining a target image; the target image contains cupping mark sub-images of the cupping marks left on the back of a human body after cupping according to the preset viscera partitions; the preset viscera partitions are the viscera function reflex zones into which the back of the human body is divided;
the execution main body in the embodiment of the present application may be a server, a computer, a mobile terminal, or a cluster device with an automatic operation capability, and the like, and is not specifically limited herein, and only the processing method in the embodiment of the present application needs to be executed.
In this embodiment of the application, the obtained target image may be acquired in real time by an acquisition device, or may be acquired from a cloud platform, a database, or other mobile terminal devices, which is not particularly limited in this embodiment of the application.
The target image contains cupping mark sub-images of the cupping marks left on the back of the human body after cupping according to the preset viscera partitions. The number of cupping mark sub-images may be one or more; to provide a basis for computer-automated cupping diagnosis, eleven is preferred in the embodiments of the present application.
In the embodiment of the application, the target images can also correspond to the identity information of the user, so that when a plurality of target images are processed in batch, corresponding results are output according to the identity information.
S202, determining the cupping mark partition of each cupping mark sub-image according to the preset viscera partitions;
In the embodiments of the present application, after the target image is obtained, each cupping mark sub-image contained in the target image is assigned to a partition. Referring to the preset viscera partitions shown in fig. 1, the cupping mark partition corresponding to each cupping mark sub-image can be identified and calculated.
In the partition data of the preset viscera partitions, the coordinate position of each viscera function reflex zone in the standard cupping mark image, in the pixel coordinate system, can be predefined; the cupping mark partition is then obtained by matching these coordinate positions against the coordinate position of each cupping mark in the target image. Of course, before step S202 is executed, the image parameters of the cupping mark sub-images are processed to be the same as those of the standard cupping mark image; the image parameters may include the pixel count, size, resolution and so on of the image.
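As one possible realisation of this coordinate matching, the following minimal sketch assigns a detected cup-mark circle to the nearest predefined partition centre. It is an illustrative assumption rather than the patent's stated method: the centre coordinates, the partition names and the nearest-centre rule are all placeholders.

```python
import numpy as np

# Hypothetical centres (pixel coordinates) of the 11 viscera partitions in the
# standard cupping mark image; the values below are placeholders, not the
# actual layout of Fig. 1.
PARTITION_CENTERS = {
    "lung": (500, 150),
    "heart": (500, 350),
    "gallbladder": (650, 550),
    # ... the remaining eight partitions would be listed here
}


def assign_partition(cup_center):
    """Assign a detected cup-mark circle centre to the nearest partition centre."""
    names = list(PARTITION_CENTERS)
    centers = np.array([PARTITION_CENTERS[n] for n in names], dtype=float)
    distances = np.linalg.norm(centers - np.asarray(cup_center, dtype=float), axis=1)
    return names[int(np.argmin(distances))]
```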
Of course, it is also possible to manually input the cupping mark partition of each cupping mark sub-image after the target image is obtained.
S203, calling the preset organ function reflex zone division algorithm corresponding to each cupping mark partition to draw the viscera function reflex zone image corresponding to each cupping mark sub-image;
In the embodiments of the present application, the following drawing steps can be executed in a loop, or in parallel, for each cupping mark partition:
drawing the viscera reflex zone curves corresponding to the target cupping mark sub-image according to the one or more items of curve data corresponding to the cupping mark partition of the target cupping mark sub-image and the standard cupping mark circle data;
translating the viscera reflex zone curves onto the target cupping mark sub-image according to the offset;
and segmenting the target cupping mark sub-image according to the viscera reflex zone curves and the curve effective-area division rules corresponding to the viscera reflex zone curves, to obtain the viscera function reflex zone image corresponding to the target cupping mark sub-image.
For example, if the cupping mark partition is the gallbladder area, the viscera reflex zone curves are drawn according to the preset organ function reflex zone division algorithm corresponding to the gallbladder area, which may include a gallbladder partition algorithm and a bile duct partition algorithm. The preset organ function reflex zone division algorithms are described in detail in the embodiments below.
In the embodiments of the present application, the viscera reflex zone curves can be drawn on a separate drawing canvas rather than directly on the original image, ensuring that the original image is not modified or damaged.
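The three steps of S203 can be illustrated with a short sketch, assuming OpenCV and NumPy, a cup circle already aligned to the standard radius, and a single bounding curve whose valid side is "above" or "below". All function and parameter names are illustrative assumptions, and the y-axis sign may need flipping depending on how the original coordinate system is oriented.

```python
import cv2
import numpy as np


def reflex_zone_image(sub_image, curve_fn, offset=(0, 0), valid_side="below", r=200):
    """Illustrative sketch of the three S203 steps for one cup-mark sub-image.

    Assumes the cup circle in sub_image already has the standard radius r and
    that curve_fn gives y as a function of x in the C0-centred system.
    """
    h, w = sub_image.shape[:2]
    # Assumed: the cup circle centre sits at the sub-image centre shifted by the offset.
    cx, cy = w / 2 + offset[0], h / 2 + offset[1]

    # Step 1: draw (sample) the curve in the C0-centred coordinate system.
    # Image y grows downward; flip the sign of curve_fn if the formulas assume y up.
    xs = np.arange(-r, r + 1, dtype=float)
    ys = curve_fn(xs)

    # Steps 2 and 3: translate the curve into pixel coordinates and keep the
    # pixels that lie inside circle C0 and on the valid side of the curve.
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    inside_c0 = (xx - cx) ** 2 + (yy - cy) ** 2 <= r * r
    curve_y = np.interp(xx - cx, xs, ys) + cy          # translated curve height
    on_valid_side = (yy >= curve_y) if valid_side == "below" else (yy <= curve_y)

    mask = np.zeros((h, w), dtype=np.uint8)
    mask[inside_c0 & on_valid_side] = 255
    # The original image is untouched; the reflex-zone image is returned as a copy.
    return cv2.bitwise_and(sub_image, sub_image, mask=mask)
```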
Since the scale of a cupping mark sub-image may differ from that of the standard cupping mark, a scaling operation is required if it is determined that the cupping mark circle data of the target cupping mark sub-image does not match the standard cupping mark circle data.
In that case, before translating the viscera reflex zone curves onto the target cupping mark sub-image according to the offset:
calculating a cupping mark scaling ratio according to the standard cupping mark circle data and the cupping mark circle data of the target cupping mark sub-image;
and scaling the viscera reflex zone curves according to the scaling ratio.
Alternatively, in the embodiments of the present application, the target cupping mark sub-image itself may be scaled, and the scaling can be set as needed.
The center, radius and circumscribed-rectangle width and height of the cupping mark circle Ct in the cupping mark sub-image can be identified and calculated. The scaling ratio rt is then calculated as the size ratio of Ct to the standard cupping mark circle C0.
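A minimal sketch of this scaling step is shown below, assuming the radius of the detected cup circle Ct is already known (it could, for instance, be estimated with a circle-detection step such as a Hough transform); the function name and arguments are illustrative.

```python
import numpy as np


def scale_curve_to_cup(curve_xy, ct_radius, c0_radius=200.0):
    """Scale curve points drawn for the standard circle C0 to a detected circle Ct.

    curve_xy  : (N, 2) array of curve points in the C0-centred coordinate system.
    ct_radius : radius of the detected cup circle Ct (e.g. half the width of its
                circumscribed rectangle).
    """
    rt = ct_radius / c0_radius            # scaling ratio of Ct to C0
    return np.asarray(curve_xy, dtype=float) * rt
```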
The cupping mark image processing method provided by the embodiments of the present application can automatically recognize cupping marks and extract viscera function reflex zone images, avoiding subjective human influence, obtaining accurate viscera function reflex zone images, and laying a foundation for computer-automated cupping diagnosis.
The division algorithm for each predetermined organ functional compartment will be described in detail below.
In the embodiment of the present application, the algorithm for dividing the predetermined organ functional reflex zones related to the 11 zang-fu functional reflex zones can be seen in fig. 3.1.1-3.9.6.
Referring to fig. 3.1.1-3.1.6, the division algorithm of the predetermined organ function reflex zone corresponding to the lung zone, i.e. the division algorithm of the lung zone R1, is shown.
The lung region R1 is composed of a nose region R11, a throat-tonsil region R12, a trachea-left bronchus region R13, a trachea-right bronchus region R14, a thyroid region R15 and a lung region R16 respectively, and the division algorithm of each region is as follows:
referring to fig. 3.1.1, the algorithm is schematically illustrated for dividing the nasal partition R11.
Define area BA 11: a region where an elliptical line segment E11 located above the circle C0 intersects the circle C0; the ellipse segment E11 is from an ellipse with a centroid of (0,18), a major axis of 513, and a minor axis of 392.
Nasal partition R11 is a sub-area of any size located within area BA 11.
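As an illustration only, region BA11 could be rasterised as a mask by filling the ellipse and intersecting it with circle C0, for example with OpenCV as sketched below. Interpreting the region bounded by the "elliptical line segment" as the filled ellipse clipped to C0, and the sign of the y-offset, are assumptions.

```python
import cv2
import numpy as np

R = 200                       # radius of the standard cup circle C0, in pixels
H = W = 2 * R + 1             # canvas just large enough to hold C0
CENTER = (R, R)               # pixel position of the C0 centre (the origin)

circle_mask = np.zeros((H, W), dtype=np.uint8)
cv2.circle(circle_mask, CENTER, R, 255, -1)          # filled circle C0

# Ellipse E11: centroid (0, 18), major axis 513, minor axis 392, no rotation.
# cv2.ellipse expects HALF-axes; the y-offset is applied downward here and may
# need its sign flipped if the original coordinate system points y upward.
ellipse_mask = np.zeros((H, W), dtype=np.uint8)
cv2.ellipse(ellipse_mask, (CENTER[0], CENTER[1] + 18),
            (513 // 2, 392 // 2), 0, 0, 360, 255, -1)

ba11 = cv2.bitwise_and(circle_mask, ellipse_mask)    # candidate mask for BA11
```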
Referring to fig. 3.1.2, the algorithm is schematically illustrated for dividing the throat-tonsil partition R12.
Define area BA 12: the region where one parabola P12 intersects the circle C0; the parabola P12 is formulated as:
P12:y=-0.0013x²-0.0008x-47.47.
the laryngo-tonsillar partition R12 is a sub-area of arbitrary size located within the area BA 12.
Referring to fig. 3.1.3, the algorithm for partitioning the trachea-left bronchus partition R13 is shown.
Define area BA 13: the region where one parabola P13 intersects the circle C0; the parabola P13 is formulated as:
P13:x=-0.0045y²-0.02y-27.
the tracheo-left bronchial section R13 is a sub-region of any size located within the area BA 13.
Referring to fig. 3.1.4, the algorithm is schematically illustrated for dividing the trachea-right bronchus partition R14.
Define area BA 14: the region where one parabola P14 intersects the circle C0; the parabola P14 is formulated as:
P14:x=0.0045y²+0.02y+27.
the trachea-right bronchus partition R14 is a sub-area of arbitrary size located within the area BA 14.
See fig. 3.1.5, which is a schematic diagram of the thyroid partition R15 partitioning algorithm.
Define area BA 15: the region where one parabola P15 intersects the circle C0; the parabola P15 is formulated as:
P15:y=0.002x²+0.0006x+18.46.
thyroid zone R15 is a sub-zone of any size located within zone BA 15.
Referring to fig. 3.1.6, the algorithm for dividing the lung partition R16 is shown.
Define area BA 16: the intersected area of the eight straight lines L16-1-L16-8 in the circle C0; the formula of eight straight lines is:
L16-1:y=0.5225x-132
L16-2:y=-0.5086x-130
L16-3:y=-1.9259x+355
L16-4:y=1.9717x-360
L16-5:y=0.5225x+132
L16-6:y=-0.5086x+130
L16-7:y=-1.9259x-355
L16-8:y=1.9717x+360
the lung partition R16 is a sub-area of arbitrary size located within the area BA 16.
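Since BA16 is the intersection of eight half-planes with circle C0, one illustrative way to rasterise it is a per-pixel test, as sketched below; the assumption that the effective side of each line is the side containing the origin is not stated in the text and is made here only for illustration.

```python
import numpy as np

R = 200
# C0-centred grid; whether y points up or down does not matter here because the
# eight lines come in sign-symmetric pairs.
y, x = np.mgrid[-R:R + 1, -R:R + 1].astype(float)

# The eight boundary lines of BA16, each written as y = m*x + b.
LINES = [(0.5225, -132), (-0.5086, -130), (-1.9259, 355), (1.9717, -360),
         (0.5225, 132), (-0.5086, 130), (-1.9259, -355), (1.9717, 360)]

inside = x**2 + y**2 <= R**2                 # start from the filled circle C0
for m, b in LINES:
    # Assumption: the effective side of each line is the side containing the
    # origin, i.e. the residual has the same sign there as at (0, 0).
    residual = y - (m * x + b)
    inside &= residual * (0.0 - b) >= 0

ba16_mask = inside.astype(np.uint8) * 255    # candidate mask for BA16
```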
Referring to fig. 3.2.1-3.2.5, a schematic diagram of the heart region R2 partition algorithm is shown.
The heart region R2 is composed of a coronary artery region R21, a myocardial region R22, a cerebral blood vessel-right brain region R23, a cerebral blood vessel-left brain region R24, and a cervical vertebra region R25, respectively, each region being defined as follows:
referring to fig. 3.2.1, the algorithm is schematically illustrated for partitioning the coronary artery partition R21.
Define area BA 21: a region where the two straight lines L21-1, L21-2 intersect within the circle C0; the formula of two straight lines is:
L21-1:y=-100
L21-2:y=100
the coronary artery partition R21 is a sub-area of arbitrary size located within the area BA 21.
Referring to fig. 3.2.2, the algorithm for dividing the myocardial partition R22 is shown.
Define area BA 22: the area consists of the region enclosed by straight line L22-1 and parabola P22-1 together with the region enclosed by straight line L22-2 and parabola P22-2; the formulas of the 2 straight lines and 2 parabolas are:
L22-1:y=-50
L22-2:y=50
P22-1:x=0.0053y²+0.0260y-163.20
P22-2:x=-0.0053y²-0.0260y+162.20
the myocardial partition R22 is a sub-area of an arbitrary size located within the area BA 22.
Referring to fig. 3.2.3, the algorithm is schematically illustrated for dividing the cerebral vessel-right brain partition R23.
Define area BA 23: the region where one parabola P23 intersects the circle C0; the parabola P23 is expressed by the formula y being 0.0022x2-0.0060x +25.16 parabola rotated-121.5 degrees parabola. Clockwise rotation is a negative number and counterclockwise rotation is a positive number.
The cerebral-vascular-right-brain partition R23 is a sub-area of arbitrary size located within the area BA 23.
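One way to realise such a "rotated parabola" is to sample the un-rotated parabola and rotate the sample points about the origin, as sketched below in a y-up coordinate system; this construction and the sampling range are assumptions for illustration.

```python
import numpy as np


def rotated_parabola_points(a, b, c, angle_deg, x_range=(-200.0, 200.0), n=801):
    """Sample y = a*x**2 + b*x + c and rotate the points about the origin.

    In a y-up coordinate system, positive angles rotate counter-clockwise and
    negative angles clockwise, matching the sign convention stated for P23/P24.
    """
    x = np.linspace(x_range[0], x_range[1], n)
    y = a * x**2 + b * x + c
    t = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    return (rot @ np.vstack([x, y])).T        # (n, 2) rotated curve points


# Parabola P23: y = 0.0022x² - 0.0060x + 25.16 rotated by -121.5 degrees.
p23 = rotated_parabola_points(0.0022, -0.0060, 25.16, -121.5)
```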
See fig. 3.2.4 for a schematic diagram of the algorithm for dividing the cerebral vessel-left brain partition R24.
Define area BA 24: the region where one parabola P24 intersects the circle C0; the parabola P24 is obtained by rotating the parabola y=-0.0020x²+0.0007x-23.56 by -58 degrees. Clockwise rotation is negative and counterclockwise rotation is positive.
The cerebrovascular-left brain partition R24 is a sub-region of arbitrary size located within the area BA 24.
See fig. 3.2.5, which is a schematic diagram of the algorithm for dividing the cervical vertebra partition R25.
Define area BA 25: the region where one parabola P25 intersects the circle C0; the formula for parabola P25 is:
P25:y=0.0014x²+0.0005x+13.14
The cervical vertebra partition R25 is a sub-area of arbitrary size located within the area BA 25.
See fig. 3.3.1-3.3.2, which are schematic diagrams of the biliary region segmentation algorithm.
The biliary region R3 is composed of a gallbladder partition R31 and a bile duct partition R32, respectively, each partition being defined as follows:
referring to fig. 3.3.1, the gallbladder partition R31 is divided into algorithm diagrams.
Define area BA 31: the upper region where one parabola P31 intersects the circle C0; the formula for parabola P31 is:
P31:y=0.0013x²-0.0014x+87.88
the gallbladder partition R31 is a sub-area of any size located within the area BA 31.
Referring to fig. 3.3.2, the bile duct partition R32 is divided into an algorithm diagram.
Define area BA 32: the lower region where one parabola P32 intersects the circle C0; the formula for parabola P32 is:
P32:y=0.0011x²-0.0019x+22.77
the bile duct partition R32 is a sub-area of arbitrary size located within the area BA 32.
See fig. 3.4.1-3.4.2 for a schematic diagram of the algorithm for dividing the liver region R4.
The liver partition R4 is composed of a fatty liver partition R41 and a liver-brain-breast partition R42, respectively, each of which is defined as follows:
referring to fig. 3.4.1, the partitioning algorithm for fatty liver partition R41 is shown.
Define area BA 41: two side regions where the two parabolas P41-1 and P41-2 intersect with the circle C0; wherein, the formula of the two parabolas is as follows:
P41-1:x=-0.0017y²-0.0037y-63
P41-2:x=0.0017y²+0.0037y+63
fatty liver partition R41 is a sub-region of arbitrary size located within region BA 41.
See fig. 3.4.2 for a schematic diagram of the R42 partition algorithm for the liver-brain-breast partition.
Define area BA 42: the middle region where the two parabolas P42-1, P42-2 intersect with the circle C0; wherein, the formula of the two parabolas is as follows:
P42-1:x=-0.0020y²-0.014y-145
P42-2:x=0.0020y²+0.014y+145
the liver-brain-breast partition R42 is a sub-region of arbitrary size located within the area BA 42.
The spleen region R5 is the entire standard cupping mark circle C0.
Referring to fig. 3.5.1-3.5.4, a schematic diagram of the algorithm for R6 segmentation of the gastric region is shown.
The gastric region R6 is composed of pyloric region R61, cardiac region R62, gastric region R63, and duodenal region R64, respectively, each defined as follows:
referring to fig. 3.5.1, the algorithm for partitioning the pyloric region R61 is shown.
Define area BA 61: the region where one elliptical line segment E61 intersects circle C0; wherein the ellipse line segment E61 is from an ellipse with a centroid of (-146, -67), a major axis of 312, a minor axis of 266, and an angle of 29 degrees. Clockwise rotation is positive and counterclockwise rotation is negative.
The pyloric section R61 is a sub-section of any size located within the area BA 61.
See fig. 3.5.2, which is a schematic diagram of the cardia R62 division algorithm.
Define area BA 62: the intersection regions of three straight lines L62-1-L62-3 in the circle C0; the formula of the three straight lines is:
L62-1:x=-70
L62-2:y=-1
L62-3:x=88
cardiac region R62 is a sub-region of any size located within region BA 62.
See FIG. 3.5.3 for a schematic diagram of the stomach R63 partition algorithm.
Define area BA 63: the region where the four straight lines L63-1 to L63-4 intersect within the circle C0; the formulas of the four straight lines are:
L63-1:y=0.6014x-121.47
L63-2:y=-0.6014x-121.47
L63-3:y=0.6014x+121.47
L63-4:y=-0.6014x+121.47
The stomach partition R63 is a sub-area of any size located within the area BA 63.
See FIG. 3.5.4 for a schematic diagram of the partitioning algorithm for duodenum partition R64.
Define area BA 64: the lower region where one parabola P64 intersects the circle C0; the formula for parabola P64 is:
P64:y=0.0011x²-0.0019x+22.77
duodenal partition R64 is a sub-area of any size located within area BA 64.
See figs. 3.6.1-3.6.3, which are schematic diagrams of the algorithm for dividing the large intestine region R7.
The large intestine region R7 is composed of a large intestine-colon-pancreas region R71, an oral cavity region R72 and a rectum region R73, and the regions are defined as follows:
see fig. 3.6.1, which is a schematic diagram of the large intestine-colon-pancreas partition R71 partitioning algorithm.
Define area BA 71: an upper region where one straight line L71 intersects the circle C0; wherein, the formula of the straight line is as follows:
L71:y=106
the large intestine-colon-pancreas partition R71 is a subregion of any size that lies within the region BA 71.
Referring to fig. 3.6.2, the algorithm is schematically divided for the oral cavity partition R72.
Define area BA 72: an oval E72 inner region located within circle C0; the centroid of ellipse E72 is (0,26), the major axis is 323 and the minor axis is 210.
The buccal partition R72 is a sub-region of arbitrary size located within the area BA 72.
See fig. 3.6.3 for a schematic diagram of the partitioning algorithm for rectum R73.
Define area BA 73: the intersection regions of three straight lines L73-1-L73-3 in the circle C0; the formula of the three straight lines is:
L73-1:x=-157
L73-2:y=60
L73-3:x=157
rectal partition R73 is a sub-area of arbitrary size located within area BA 73.
See fig. 3.7.1-3.7.2, which is a schematic diagram of the R8 partition algorithm for small intestine region.
The small intestine region R8 is composed of a small intestine region R81 and a duodenum region R82, and each region is defined as follows:
see fig. 3.7.1 for a schematic diagram of the algorithm for small bowel partition R81 partitioning.
Define area BA 81: the upper region where one parabola P81 intersects the circle C0; the formula for parabola P81 is:
P81:y=0.0013x²-0.0014x+87.88
small intestine partition R81 is a sub-region of any size located within region BA 81.
Referring to fig. 3.7.2, the duodenum partition R82 is a schematic diagram of the partitioning algorithm.
Define area BA 82: the upper region where one parabola P82 intersects the circle C0; the formula for parabola P82 is:
P82:y=0.0011x²-0.0019x+22.77
duodenal partition R82 is a sub-area of any size located within area BA 82.
See fig. 3.8.1-3.8.3 for a schematic diagram of the algorithm for R9 segmentation of the left renal region.
The left kidney region R9 is composed of a lumbar-kidney-ureter region R91, a left kidney region R92, and a right lower limb region R93, respectively, each region being defined as follows:
see figure 3.8.1 for a schematic diagram of the lumbar-renal-ureteral partition R91 partitioning algorithm.
Define area BA 91: the middle area of the intersection of the two straight lines L91-1 and L91-2 and the circle C0; the formula of two straight lines is:
L91-1:y=-143
L91-2:y=143
lumbar-renal-ureteral partition R91 is a sub-region of any size located within region BA 91.
See FIG. 3.8.2 for a schematic diagram of the left kidney partition R92 partitioning algorithm.
Define area BA 92: a left region where one straight line L92 intersects the circle C0; the formula for line L92 is:
L92:y=-70
the left kidney partition R92 is a sub-area of arbitrary size located within the area BA 92.
See FIG. 3.8.3 for a schematic diagram of the algorithm for right lower limb segment R93 partitioning.
Define area BA 93: a left region where one straight line L93 intersects the circle C0; the formula for line L93 is:
L93:y=70
the right lower limb partition R93 is a sub-area of any size located within the area BA 93.
The right kidney region R10 is composed of a lumbar-kidney-ureter region R101, a right kidney region R102, and a left lower limb region R103, respectively, each region being defined as follows:
lumbar-kidney-ureter partition R101:
lumbar-renal-ureteral region R101 is a sub-region of arbitrary size located within region BA101, where region BA101 is the same as region BA91 in fig. 3.8.1.
Right renal partition R102:
the right kidney partition R102 is a sub-area of arbitrary size located within the area BA102, wherein the area BA102 is identical to the area BA93 in fig. 3.8.3.
Left lower limb partition R103:
the left lower limb partition R103 is a sub-area of any size located within the area BA103, wherein the area BA103 is the same as the area BA92 in fig. 3.8.2.
See fig. 3.9.1-3.9.6 for a schematic diagram of the algorithm for R11 segmentation of the bladder region.
The bladder region R11 is composed of a bladder partition R111, a uterine body partition R112, a cervical partition R113, a vaginal-urethral partition R114, a prostate/ovarian-fallopian tube partition R115, and an anal partition R116, respectively, each of which is defined as follows:
see FIG. 3.9.1 for a schematic diagram of the algorithm for dividing the bladder sector R111.
Define area BA 111: the region where one parabola P111 intersects the circle C0; the formula for parabola P111 is:
P111:y=-0.0014x²-0.0029x-58.64
bladder sector R111 is a sub-area of any size located within area BA 111.
See fig. 3.9.2 for a schematic diagram of the algorithm for dividing the uterine body region R112.
Define area BA 112: the region where one parabola P112 and one straight line L112 intersect within the circle C0; the formula of the straight line L112 and the parabola P112 is:
P112:y=-0.0016x²-0.0047x-133.82
L112:y=23
the uterine body compartment R112 is a sub-compartment of any size located within the area BA 112.
See FIG. 3.9.3 for a schematic diagram of the algorithm for classifying the cervical partition R113.
Define area BA 113: an inner region of circle C113 located within circle C0; the formula for circle C113 is:
C113:x²+y²=67.5
cervical partition R113 is a sub-area of any size located within area BA 113.
See figure 3.9.4 for an algorithm for dividing the vaginal-urethral partition R114.
Define area BA 114: the intersection areas of the three straight lines L114-1 to L114-3 in the circle C0; the formula of the three straight lines is:
L114-1:x=-76
L114-2:y=-29
L114-3:x=76
vaginal-urethral partition R114 is a sub-area of any size located within area BA 114.
See fig. 3.9.5, which is a schematic diagram of the prostate/ovary-fallopian tube partition R115 partitioning algorithm.
Define area BA 115: the area where the two elliptical line segments E115-1, E115-2 intersect within the circle C0; the ellipse line segment E115-1 is from an ellipse with a centroid of (-219,141), a major axis of 466, a minor axis of 292, and a rotation angle of 55 degrees; the ellipse line segment E115-2 is from an ellipse with a centroid of (219,141), a major axis of 466, a minor axis of 292, and a rotation angle of -55 degrees. Clockwise rotation is positive and counterclockwise rotation is negative.
Prostate/ovary-fallopian tube region R115 is a sub-region of any size located within region BA 115.
See FIG. 3.9.6 for a schematic diagram of the algorithm for partitioning the anal region R116.
Define area BA 116: the region where one parabola P116 intersects the circle C0; the formula for parabola P116 is:
P116:y=-0.0001x²-0.0335x+1.76
the anal region R116 is a sub-region of any size located within the area BA 116.
In the embodiments of the present application, the viscera function reflex zones of the cupping marks are thus automatically segmented according to the different viscera partitions and their corresponding preset organ function reflex zone division algorithms.
Compared with the traditional manual method, the embodiments of the present application realize automatic identification and automatic segmentation of the cupping mark viscera function reflex zones, avoid subjective human influence, establish a uniform partition standard, improve efficiency and reduce labor cost. The identification target of cupping diagnosis can thus be refined from the level of the whole cupping area down to the level of the viscera function reflex zones within each cupping area.
The embodiments of the present application not only retain the partition granularity of the traditional manual method, but also convert the phenomenological Chinese medical descriptions of the cupping mark viscera function reflex zone partitions into clear numerical definitions, establish an objective and accurate division method and standard, realize automatic identification and division of cupping mark images, and lay a foundation for later computer-automated cupping diagnosis.
Corresponding to the method embodiments, an embodiment of the present application also discloses a processing device for a cupping mark image.
Referring to fig. 4, fig. 4 is a schematic structural diagram of the processing device for a cupping mark image in an embodiment of the present application. The device comprises:
an obtaining module 1, configured to obtain a target image; the target image contains cupping mark sub-images of the cupping marks left on the back of a human body after cupping according to the preset viscera partitions; the preset viscera partitions are the viscera function reflex zones into which the back of the human body is divided;
a partition module 2, configured to determine the cupping mark partition of each cupping mark sub-image according to the preset viscera partitions;
and a drawing module 3, configured to call the preset organ function reflex zone division algorithm corresponding to each cupping mark partition to draw the viscera function reflex zone image corresponding to each cupping mark sub-image.
In some embodiments, the parameters of the preset organ function reflex zone division algorithm specifically include:
one or more items of curve data, standard cupping mark circle data, the offset of the curve relative to the center of the standard cupping mark circle, and a curve effective-area division rule.
In some embodiments, the drawing module is specifically configured to:
for each cupping mark sub-image, taken as the target cupping mark sub-image:
draw the viscera reflex zone curves corresponding to the target cupping mark sub-image according to the one or more items of curve data corresponding to the cupping mark partition of the target cupping mark sub-image and the standard cupping mark circle data;
translate the viscera reflex zone curves onto the target cupping mark sub-image according to the offset;
and segment the target cupping mark sub-image according to the viscera reflex zone curves and the curve effective-area division rules corresponding to the viscera reflex zone curves, to obtain the viscera function reflex zone image corresponding to the target cupping mark sub-image.
In some embodiments, the drawing module is further configured to:
if it is determined that the cupping mark circle data of the target cupping mark sub-image does not match the standard cupping mark circle data, before translating the viscera reflex zone curves onto the target cupping mark sub-image according to the offset:
calculate a cupping mark scaling ratio according to the standard cupping mark circle data and the cupping mark circle data of the target cupping mark sub-image;
and scale the viscera reflex zone curves according to the scaling ratio.
Since the function of each module corresponds to a step in the foregoing method embodiment, the same effects as the method can be achieved; the functions of the modules in the device embodiment are therefore not described again here.
This embodiment of the processing device for cupping mark images realizes automatic identification and segmentation of the cupping mark viscera function reflex zones, avoids subjective human influence, establishes a uniform partition standard, improves efficiency and reduces labor cost.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a device includes one or more processors (CPUs), memory, and a bus. The device may also include input/output interfaces, network interfaces, and the like.
The memory may include volatile memory in a computer readable medium, Random Access Memory (RAM) and/or nonvolatile memory such as Read Only Memory (ROM) or flash memory (flash RAM), and the memory includes at least one memory chip. The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A processing method for a cupping mark image, comprising:
obtaining a target image; the target image contains cupping mark sub-images of the cupping marks left on the back of a human body after cupping according to preset viscera partitions; the preset viscera partitions are the viscera function reflex zones into which the back of the human body is divided;
determining the cupping mark partition of each cupping mark sub-image according to the preset viscera partitions;
and calling a preset organ function reflex zone division algorithm corresponding to each cupping mark partition to draw the viscera function reflex zone image corresponding to each cupping mark sub-image.
2. The processing method according to claim 1, wherein the parameters of the preset organ function reflex zone division algorithm specifically comprise:
one or more items of curve data, standard cupping mark circle data, the offset of the curve relative to the center of the standard cupping mark circle, and a curve effective-area division rule.
3. The processing method according to claim 2, wherein calling the preset organ function reflex zone division algorithm corresponding to each cupping mark partition to draw the viscera function reflex zone image corresponding to each cupping mark sub-image comprises:
for each cupping mark sub-image, taken as the target cupping mark sub-image:
drawing the viscera reflex zone curves corresponding to the target cupping mark sub-image according to the one or more items of curve data corresponding to the cupping mark partition of the target cupping mark sub-image and the standard cupping mark circle data;
translating the viscera reflex zone curves onto the target cupping mark sub-image according to the offset;
and segmenting the target cupping mark sub-image according to the viscera reflex zone curves and the curve effective-area division rules corresponding to the viscera reflex zone curves, to obtain the viscera function reflex zone image corresponding to the target cupping mark sub-image.
4. The processing method according to claim 3, wherein if it is determined that the cupping mark circle data of the target cupping mark sub-image does not match the standard cupping mark circle data, the method further comprises, before translating the viscera reflex zone curves onto the target cupping mark sub-image according to the offset:
calculating a cupping mark scaling ratio according to the standard cupping mark circle data and the cupping mark circle data of the target cupping mark sub-image;
and scaling the viscera reflex zone curves according to the scaling ratio.
5. The processing method according to any one of claims 1 to 4, wherein the preset viscera partitions comprise:
a lung area, a heart area, a gallbladder area, a liver area, a spleen area, a stomach area, a large intestine area, a small intestine area, a left kidney area, a right kidney area and a bladder area.
6. The processing method according to claim 5,
the lung area consists of a nose partition, a throat-tonsil partition, a trachea-left bronchus partition, a trachea-right bronchus partition, a thyroid partition and a lung partition;
the heart area is composed of a coronary artery partition, a cardiac muscle partition, a cerebral vessel-right brain partition, a cerebral vessel-left brain partition and a cervical vertebra partition;
the biliary region consists of a gallbladder partition and a bile duct partition;
the liver region consists of fatty liver partition, liver-brain-breast partition;
the spleen area is a whole standard canned circle;
the stomach area consists of a pylorus partition, a cardia partition, a stomach partition and a duodenum partition;
the large intestine area consists of a large intestine-colon-pancreas subarea, an oral cavity subarea and a rectum subarea;
the small intestine area consists of a small intestine subarea and a duodenum subarea;
the left kidney area consists of a lumbar-kidney-ureter partition, a left kidney partition and a right lower limb partition;
the right kidney area consists of a lumbar-kidney-ureter partition, a right kidney partition and a left lower limb partition;
the bladder zone consists of a bladder partition, a uterus body partition, a cervix partition, a vagina-urethra partition, a prostate/ovary-fallopian tube partition and an anus partition.
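The claim-5 and claim-6 composition can also be read as configuration data; the mapping below only restates the zone names from the claims, and the dictionary layout itself is an illustrative assumption.

VISCERA_PARTITION_SUBZONES = {
    "lung":            ["nose", "throat-tonsil", "trachea-left bronchus",
                        "trachea-right bronchus", "thyroid", "lung"],
    "heart":           ["coronary artery", "myocardium", "cerebral vessel-right brain",
                        "cerebral vessel-left brain", "cervical vertebra"],
    "gallbladder":     ["gallbladder", "bile duct"],
    "liver":           ["fatty liver", "liver-brain-breast"],
    "spleen":          [],  # the spleen area is the whole standard cupping-mark circle
    "stomach":         ["pylorus", "cardia", "stomach", "duodenum"],
    "large intestine": ["large intestine-colon-pancreas", "oral cavity", "rectum"],
    "small intestine": ["small intestine", "duodenum"],
    "left kidney":     ["lumbar-kidney-ureter", "left kidney", "right lower limb"],
    "right kidney":    ["lumbar-kidney-ureter", "right kidney", "left lower limb"],
    "bladder":         ["bladder", "uterine body", "cervix", "vagina-urethra",
                        "prostate/ovary-fallopian tube", "anus"],
}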
7. An apparatus for processing a cupping-mark image, comprising:
an obtaining module configured to obtain a target image, wherein the target image comprises cupping-mark sub-images of the cupping marks left on the back of a human body after cupping performed according to preset viscera partitions, the preset viscera partitions being viscera-function reflex areas into which the back of the human body is divided;
a partition module configured to determine, according to the preset viscera partitions, the cupping-mark partition to which each cupping-mark sub-image belongs; and
a drawing module configured to invoke a preset viscera-function reflex-area division algorithm corresponding to each cupping-mark partition, so as to draw a viscera-function reflex-area image corresponding to each cupping-mark sub-image.
8. The processing apparatus according to claim 7, wherein the parameters of the preset viscera-function reflex-area division algorithm comprise:
data of one or more curves, data of a standard cupping-mark circle, an offset between each curve and the center of the standard cupping-mark circle, and a curve effective-region division rule.
9. The processing apparatus according to claim 8, wherein the drawing module is specifically configured to, for each target cupping-mark sub-image:
draw the viscera reflex-area curves corresponding to the target cupping-mark sub-image according to the data of the one or more curves corresponding to the cupping-mark partition of the target cupping-mark sub-image and the data of the standard cupping-mark circle;
translate the viscera reflex-area curves onto the target cupping-mark sub-image according to the offset; and
segment the target cupping-mark sub-image according to the viscera reflex-area curves and the curve effective-region division rules corresponding to those curves, to obtain the viscera-function reflex-area image corresponding to the target cupping-mark sub-image.
10. The processing apparatus according to claim 9, wherein the drawing module is further configured to, if it is determined that the cupping-mark circle data of the target cupping-mark sub-image does not match the standard cupping-mark circle data, before translating the viscera reflex-area curves onto the target cupping-mark sub-image according to the offset:
calculate a cupping-mark scaling ratio from the standard cupping-mark circle data and the cupping-mark circle data of the target cupping-mark sub-image; and
scale the viscera reflex-area curves according to the scaling ratio.
CN202110721540.9A 2021-06-28 2021-06-28 Processing method and device for canned image Active CN113436171B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110721540.9A CN113436171B (en) 2021-06-28 2021-06-28 Processing method and device for canned image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110721540.9A CN113436171B (en) 2021-06-28 2021-06-28 Processing method and device for canned image

Publications (2)

Publication Number Publication Date
CN113436171A (en) 2021-09-24
CN113436171B (en) 2024-02-09

Family

ID=77755091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110721540.9A Active CN113436171B (en) 2021-06-28 2021-06-28 Processing method and device for canned image

Country Status (1)

Country Link
CN (1) CN113436171B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104732551A (en) * 2015-04-08 2015-06-24 西安电子科技大学 Level set image segmentation method based on superpixel and graph-cup optimizing
US20170301083A1 (en) * 2016-04-13 2017-10-19 Fujifilm Corporation Image registration device, method, and program
CN107292894A (en) * 2017-06-28 2017-10-24 新绎健康科技有限公司 A kind of method and system for being handled tank spot characteristics of image
CN107432822A (en) * 2017-09-04 2017-12-05 杭州真医疗器械有限公司 A kind of intelligent cup diagnosis and therapy system
US20210110511A1 (en) * 2018-10-30 2021-04-15 Beijing Sensetime Technology Development Co., Ltd. Image processing method and apparatus, computer device, and computer storage medium
WO2021004402A1 (en) * 2019-07-05 2021-01-14 深圳数字生命研究院 Image recognition method and apparatus, storage medium, and processor
CN110490844A (en) * 2019-07-24 2019-11-22 广州三得医疗科技有限公司 A kind of recognition methods, system, device and the therapeutic equipment of electromagnetic therapeutic apparatus tank print

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王金淼; 田丰: "罐印效应的临床意义" [Clinical significance of the cupping-mark effect], 按摩与康复医学 [Massage and Rehabilitation Medicine], No. 02 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115661178A (en) * 2022-11-17 2023-01-31 博奥生物集团有限公司 Method and apparatus for segmenting an imprinted image

Also Published As

Publication number Publication date
CN113436171B (en) 2024-02-09

Similar Documents

Publication Publication Date Title
CN107808377B (en) The positioning device of lesion in a kind of lobe of the lung
CN107464230B (en) Image processing method and device
WO2020122357A1 (en) Method and device for reconstructing medical image
CN110464633A (en) Acupuncture point recognition methods, device, equipment and storage medium
CN110766701B (en) Network model training method and device, and region division method and device
CN106355043A (en) Automatic analysis system and method for breast infrared information
CN113436171A (en) Processing method and device for canned image
CN109215104B (en) Brain structure image display method and device for transcranial stimulation treatment
CN112017185A (en) Focus segmentation method, device and storage medium
CN110880366A (en) Medical image processing system
Chen et al. Missing teeth and restoration detection using dental panoramic radiography based on transfer learning with CNNs
CN106780377A (en) A kind of contour smoothing method based on Freeman chain codes in medical image segmentation
KR20210008397A (en) Automatic liver segmentation in CT
CN115063386A (en) Medical image processing method, device, equipment and storage medium
CN106462971B (en) Imaging device for registering different imaging modalities
CN110120052B (en) Target area image segmentation system and device
CN109712186B (en) Method, computer device and storage medium for delineating a region of interest in an image
CN111583212A (en) Method and device for determining brain midline shift
JP2020151450A (en) Image identification method and image identification device
CN110647947A (en) Method and device for lesion fusion
CN109461143A (en) Image display method, device, computer equipment and storage medium
WO2020101265A1 (en) Myocardium image analysis method and device
JP7104913B2 (en) Imaging device, imaging program, image determination device, image determination program, and image processing system
CN114298957A (en) Classification method, system and storage medium for breast molybdenum target image lesions
CN111967540A (en) Maxillofacial fracture identification method and device based on CT database and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant