CN108648229B - Human back feature point extraction method based on Kinect camera - Google Patents
Human back feature point extraction method based on Kinect camera
- Publication number
- CN108648229B (application CN201810479306.8A)
- Authority
- CN
- China
- Prior art keywords
- human body
- coordinate
- point
- human
- contour
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T7/60—Image analysis; Analysis of geometric attributes
- G06T7/12—Image analysis; Segmentation; Edge-based segmentation
- G06T7/181—Image analysis; Segmentation; Edge detection involving edge growing or edge linking
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06T2207/10024—Image acquisition modality; Color image
- G06T2207/30012—Biomedical image processing; Bone; Spine; Backbone
- G06T2207/30196—Subject of image; Human being; Person
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a human back feature point extraction method based on a Kinect camera, comprising the steps of acquiring a human back image, graying, contrast stretching, binarization, connected-domain labeling, traversing the resulting contour to locate the armpit, shoulder, waist and crotch feature points of the human back, and marking them in the human back image. By extracting the outer contour of the human back image and traversing the contour, the feature points of the human back can be extracted efficiently and accurately, providing accurate data for the subsequent calculation of the degree of scoliosis of the human body.
Description
Technical Field
The invention belongs to the field of digital image processing, and particularly relates to a human back feature point extraction method based on a Kinect camera.
Background
Scoliosis, also known as lateral curvature of the spine, is characterized by its diverse forms. In recent years the incidence of scoliosis among adolescents has increased year by year, and the condition affects both their daily life and their later work, so scoliosis screening has a wide range of applications. At present the scoliometer (scoliosis ruler) and the Adams forward bending test are the common screening methods, but they are not accurate enough: manual detection is labor-intensive, the results are imprecise, and false or missed detections occur easily. Automatic feature extraction based on digital image processing can ensure accuracy while saving a large amount of manpower and material resources.
Disclosure of Invention
The invention aims to solve the above problems in the prior art by providing a human back feature point extraction method based on a Kinect camera.
The technical scheme of the invention is as follows: a human back feature point extraction method based on a Kinect camera comprises the following steps:
A. acquiring a human back image and converting it to grayscale;
B. performing contrast stretching on the grayscale image obtained in step A;
C. binarizing the image obtained in step B with the Otsu threshold method to obtain a binary image;
D. processing the binary image obtained in step C with a connected-domain method to obtain a human back binary image;
E. processing the human back binary image obtained in step D with the Canny operator to obtain a human back outer contour image;
F. traversing the human back outer contour image obtained in step E to respectively obtain the left armpit, right armpit, left shoulder, right shoulder, waist and crotch coordinate points of the human back;
G. marking the coordinate points obtained in step F in the human back image of step A.
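For concreteness, a minimal sketch of steps A–C is given below, assuming OpenCV (cv2) and NumPy; the file name, the use of min–max normalization for contrast stretching, and all parameter values are illustrative assumptions rather than part of the claimed method.

```python
import cv2
import numpy as np

# Step A: obtain a back image (read from disk here for illustration; in the
# method it is captured with a Kinect camera) and convert it to grayscale.
back_bgr = cv2.imread("back.png")                      # hypothetical file name
gray = cv2.cvtColor(back_bgr, cv2.COLOR_BGR2GRAY)

# Step B: contrast stretching - linearly map the observed intensity range
# onto the full 0-255 range.
stretched = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX)

# Step C: Otsu threshold binarization.
_, binary = cv2.threshold(stretched, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
```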
Further, step D processes the binary image obtained in step C with a connected-domain method to obtain the human back binary image, specifically: performing connected-domain labeling on the binary image obtained in step C and counting the number of connected domains; calculating the area of each connected domain, and setting every connected domain smaller than the largest one to zero to obtain the human back binary image.
Further, step D also includes performing hole filling on the obtained human back binary image.
Further, step F first calculates the center coordinates of the human back outer contour image obtained in step E and the back contour coordinates (i_1, k_1) and (i_2, k_2) on the left and right sides of the i-th row of the contour image.
Further, step F traverses the human back outer contour image to obtain the left armpit coordinate point of the human back as follows: starting from the j-th column of row i_1 - 1 of the human back outer contour image, find the column coordinate k_1' of the left-side contour of the human back, and judge whether the difference between the column coordinate k_1' and the column coordinate k_1 is greater than a set threshold; if yes, take the left contour coordinate of row i_1 as the left armpit coordinate point (m_1, n_1) of the human back; if not, continue traversing row i_1 - 2.
Further, step F traverses the human back outer contour image to obtain the right armpit coordinate point of the human back as follows: starting from the j-th column of row i_2 - 1 of the human back outer contour image, find the column coordinate k_2' of the right-side contour of the human back, and judge whether the difference between the column coordinate k_2' and the column coordinate k_2 is greater than a set threshold; if yes, take the right contour coordinate of row i_2 as the right armpit coordinate point (m_2, n_2) of the human back; if not, continue traversing row i_2 - 2.
Further, step F traverses the human back outer contour image to obtain the left shoulder coordinate point of the human back as follows: starting from row m_1 - 1 of the human back outer contour image, traverse column n_1 upward and take the first nonzero point found as the left shoulder coordinate point of the human back.
Further, step F traverses the human back outer contour image to obtain the right shoulder coordinate point of the human back as follows: starting from row m_2 - 1 of the human back outer contour image, traverse column n_2 upward and take the first nonzero point found as the right shoulder coordinate point of the human back.
Further, step F traverses the human back outer contour image to obtain the waist coordinate point and the crotch coordinate point of the human back, specifically by performing corner detection on the human back outer contour image with a curvature-based corner detection algorithm, comprising the following sub-steps:
S1. starting from the center-line coordinate of the human back outer contour image, respectively acquiring the contour coordinates of the left and right sides of the human back;
S2. starting from the N-th contour point of the left-side and right-side contour coordinates of the human back, sequentially selecting each point as the current contour point p_i, with the contour points spaced N-1 points before and after it as the front contour point p_{i-(N-1)} and the back contour point p_{i+(N-1)};
S3. calculating the curvature of the current contour point p_i according to the curvature formula;
S4. among the curvatures of the contour points on the left and right sides of the human back, finding the contour points corresponding to the maximum and minimum curvature values, which are the crotch coordinate point and the waist coordinate point of the human back.
Further, the curvature formula in step S3 is defined in terms of the following quantities: K(i) is the curvature of the i-th contour point, |p_i p_{i-k}| is the distance between the current contour point and the front contour point spaced k points away, |p_i p_{i+k}| is the distance between the current contour point and the back contour point spaced k points away, and |p_{i-k} p_{i+k}| is the distance between the front contour point p_{i-k} and the back contour point p_{i+k}.
The invention has the following beneficial effects: by extracting the outer contour of the human back image and traversing the contour, the feature points of the human back can be extracted efficiently and accurately, providing accurate data for the subsequent calculation of the degree of scoliosis of the human body.
Drawings
FIG. 1 is a schematic flow chart of a human back feature point extraction method based on a Kinect camera according to the invention;
FIG. 2 is a gray scale image after a graying process in an embodiment of the present invention;
FIG. 3 is a binary image after binarization by using an Otsu threshold value in the embodiment of the present invention;
FIG. 4 is a schematic diagram of the external contour of the back of a human body according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a labeling result of the image of the back of the human body according to the embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a schematic flow chart of the human back feature point extraction method based on a Kinect camera according to the invention. The method comprises the following steps:
A. acquiring a human back image and converting it to grayscale;
B. performing contrast stretching on the grayscale image obtained in step A;
C. binarizing the image obtained in step B with the Otsu threshold method to obtain a binary image;
D. processing the binary image obtained in step C with a connected-domain method to obtain a human back binary image;
E. processing the human back binary image obtained in step D with the Canny operator to obtain a human back outer contour image;
F. traversing the human back outer contour image obtained in step E to respectively obtain the left armpit, right armpit, left shoulder, right shoulder, waist and crotch coordinate points of the human back;
G. marking the coordinate points obtained in step F in the human back image of step A.
In an optional embodiment of the present invention, in step A a Kinect camera is used to capture an image of the back of the human body with a size of 1920×1080, and the obtained human back image is converted to grayscale. Fig. 2 shows the grayscale image after the graying process in this embodiment; Fig. 3 shows the binary image obtained by Otsu threshold binarization in this embodiment; Fig. 4 is a schematic diagram of the outer contour of the human back in this embodiment.
In an optional embodiment of the present invention, in step D the binary image obtained in step C is processed with a connected-domain method to obtain the human back binary image, specifically: performing connected-domain labeling on the binary image obtained in step C and counting the number of connected domains; calculating the area of each connected domain, screening them, and setting every connected domain smaller than the largest one to zero to obtain the human back binary image.
Because of possible color differences in the human back image, holes may appear after binarization, so hole filling is performed on the obtained human back binary image.
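A sketch of step D under the same assumptions follows: connected-domain labeling keeps only the largest component, and a flood fill from the image border fills interior holes. The variable `binary` is the Otsu result from the earlier sketch, and flood filling is one common way to realize the hole filling described above, not necessarily the inventors' exact procedure.

```python
# Step D: connected-domain labeling; keep only the largest component (the body).
num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))   # label 0 is background
body = np.where(labels == largest, 255, 0).astype(np.uint8)

# Hole filling: flood-fill the background from a corner (assumed to lie outside
# the body), invert, and OR with the mask so interior holes become foreground.
h, w = body.shape
flood = body.copy()
mask = np.zeros((h + 2, w + 2), np.uint8)
cv2.floodFill(flood, mask, (0, 0), 255)
body_filled = body | cv2.bitwise_not(flood)
```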
In an optional embodiment of the present invention, step F first calculates the center coordinates (i, j) of the human back outer contour image obtained in step E and the back contour coordinates (i_1, k_1) and (i_2, k_2) on the left and right sides of the i-th row of the contour image.
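Continuing the same sketch, step E and the per-row contour coordinates can be realized as follows; `body_filled` comes from the previous sketch, the Canny thresholds are illustrative, and taking the mask centroid as the center coordinate (i, j) is an assumption where the text leaves the definition open.

```python
# Step E: Canny edge detection on the filled body mask yields the outer contour.
contour = cv2.Canny(body_filled, 50, 150)

def row_contour_columns(contour_img, row):
    """Return the left-most and right-most contour columns of a row, or None
    if the row contains no contour pixel."""
    cols = np.flatnonzero(contour_img[row])
    if cols.size == 0:
        return None
    return int(cols[0]), int(cols[-1])

# Center coordinates (i, j), here taken as the centroid of the body mask,
# and the left/right back contour coordinates (i, k_1) and (i, k_2) of row i.
ys, xs = np.nonzero(body_filled)
i_c, j_c = int(ys.mean()), int(xs.mean())
k_1, k_2 = row_contour_columns(contour, i_c)
```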
The human back outer contour image is traversed to obtain the left armpit coordinate point of the human back as follows: start the upward traversal at row i_1 - 1 of the human back outer contour image, traversing from the j-th column of row i_1 - 1 toward column 0 to find the column coordinate k_1' of the left-side contour of the human back, and judge whether the difference between the column coordinate k_1' and the column coordinate k_1 is greater than a set threshold, here set to 10; if yes, take the left contour coordinate of row i_1 as the left armpit coordinate point (m_1, n_1) of the human back; if not, continue traversing row i_1 - 2, find the new column coordinate of the left-side contour of the human back, and judge whether its difference from the column coordinate k_1' is greater than the set threshold; if yes, take the left contour coordinate of row i_1 - 1 as the left armpit coordinate point of the human back; iterate in this way until the left armpit coordinate point of the human back is found.
The human back outer contour image is traversed to obtain the right armpit coordinate point of the human back as follows: start the traversal at the j-th column of row i_2 - 1 of the human back outer contour image to find the column coordinate k_2' of the right-side contour of the human back, and judge whether the difference between the column coordinate k_2' and the column coordinate k_2 is greater than the set threshold; if yes, take the right contour coordinate of row i_2 as the right armpit coordinate point (m_2, n_2) of the human back; if not, continue traversing row i_2 - 2, find the new column coordinate of the right-side contour of the human back, and judge whether its difference from the column coordinate k_2' is greater than the set threshold; if yes, take the right contour coordinate of row i_2 - 1 as the right armpit coordinate point of the human back; iterate in this way until the right armpit coordinate point of the human back is found.
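A sketch of the row-by-row armpit search under the same assumptions; `row_contour_columns` and `contour` come from the previous sketch, the 10-pixel threshold follows the embodiment, and starting the upward scan at the center row i is an assumption where the text is terse.

```python
def find_armpit(contour_img, start_row, side="left", threshold=10):
    """Scan upward from start_row; when the side contour column jumps by more
    than `threshold` between consecutive contour rows, the row below the jump
    is returned as the armpit coordinate point (row, column)."""
    idx = 0 if side == "left" else 1
    prev_row, prev_col = start_row, row_contour_columns(contour_img, start_row)[idx]
    for row in range(start_row - 1, -1, -1):
        cols = row_contour_columns(contour_img, row)
        if cols is None:
            continue                      # skip rows without contour pixels
        if abs(cols[idx] - prev_col) > threshold:
            return prev_row, prev_col     # (m_1, n_1) or (m_2, n_2)
        prev_row, prev_col = row, cols[idx]
    return None

left_armpit = find_armpit(contour, i_c, side="left")    # (m_1, n_1)
right_armpit = find_armpit(contour, i_c, side="right")  # (m_2, n_2)
```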
The human back outer contour image is traversed to obtain the left shoulder coordinate point of the human back as follows: starting from row m_1 - 1 of the human back outer contour image, traverse column n_1 of the contour image upward and take the first nonzero point found as the left shoulder coordinate point of the human back.
The human back outer contour image is traversed to obtain the right shoulder coordinate point of the human back as follows: starting from row m_2 - 1 of the human back outer contour image, traverse column n_2 of the contour image upward and take the first nonzero point found as the right shoulder coordinate point of the human back.
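The shoulder search then only needs to climb the contour image along the armpit column; a sketch under the same assumptions:

```python
def find_shoulder(contour_img, armpit_row, armpit_col):
    """From the row above the armpit, move upward along the armpit's column and
    return the first nonzero contour pixel as the shoulder coordinate point."""
    for row in range(armpit_row - 1, -1, -1):
        if contour_img[row, armpit_col]:
            return row, armpit_col
    return None

left_shoulder = find_shoulder(contour, *left_armpit)
right_shoulder = find_shoulder(contour, *right_armpit)
```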
The human back outer contour image is traversed to obtain the waist coordinate point and the crotch coordinate point of the human back, specifically by performing corner detection on the human back outer contour image with a curvature-based corner detection algorithm, comprising the following steps:
S1. starting from the center-line coordinate of the human back outer contour image, respectively acquiring the contour coordinates of the left and right sides of the human back;
S2. starting from the N-th contour point of the left-side and right-side contour coordinates of the human back, sequentially selecting each point as the current contour point p_i, with the contour points spaced N-1 points before and after it as the front contour point p_{i-(N-1)} and the back contour point p_{i+(N-1)};
In this embodiment the traversal starts from the 5th contour point of the left-side and right-side contour coordinates of the human back, sequentially selecting each point as the current contour point p_i, with the contour points spaced 4 points before and after it as the front contour point p_{i-4} and the back contour point p_{i+4}.
S3. calculating the curvature of the current contour point p_i according to the curvature formula, which is expressed in terms of the following quantities:
where K(i) is the curvature of the i-th contour point, |p_i p_{i-k}| is the distance between the current contour point and the front contour point spaced k points away, |p_i p_{i+k}| is the distance between the current contour point and the back contour point spaced k points away, and |p_{i-k} p_{i+k}| is the distance between the front contour point p_{i-k} and the back contour point p_{i+k}.
S4. among the curvatures of the contour points on the left and right sides of the human back, find the contour points corresponding to the maximum and minimum curvature values, which are the crotch coordinate point and the waist coordinate point of the human back. Fig. 5 is a schematic diagram of the labeling result of the human back image in this embodiment.
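The curvature formula itself is given only in the patent's figures, so the sketch below uses a plausible stand-in: the turning angle at p_i computed from the three distances named in step S3 (via the law of cosines), signed by the bend direction so the concave waist and the convex crotch separate as the curvature minimum and maximum. `side_points` stands for a hypothetical list of (row, column) contour coordinates of one side of the back, ordered from the center line downward.

```python
def discrete_curvature(points, k=4):
    """Signed curvature proxy at each contour point p_i, built from the
    distances |p_i p_{i-k}|, |p_i p_{i+k}| and |p_{i-k} p_{i+k}| (law of
    cosines gives the turning angle at p_i; the cross product supplies the
    sign). This is an assumed stand-in for the patent's exact formula."""
    pts = np.asarray(points, dtype=float)
    curv = np.zeros(len(pts))
    for i in range(k, len(pts) - k):
        u, v = pts[i] - pts[i - k], pts[i + k] - pts[i]
        a = np.linalg.norm(pts[i] - pts[i - k])       # |p_i p_{i-k}|
        b = np.linalg.norm(pts[i + k] - pts[i])       # |p_i p_{i+k}|
        c = np.linalg.norm(pts[i + k] - pts[i - k])   # |p_{i-k} p_{i+k}|
        cos_t = np.clip((a * a + b * b - c * c) / (2 * a * b + 1e-9), -1.0, 1.0)
        angle = np.pi - np.arccos(cos_t)              # 0 on a straight segment
        curv[i] = np.sign(u[0] * v[1] - u[1] * v[0]) * angle
    return curv

# Step S4: the curvature extrema along each side contour give the crotch
# (maximum) and waist (minimum) candidate points; N = 5, so k = N - 1 = 4.
curv = discrete_curvature(side_points, k=4)
crotch_point = side_points[int(np.argmax(curv))]
waist_point = side_points[int(np.argmin(curv))]
```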
It will be appreciated by those of ordinary skill in the art that the embodiments described herein are intended to assist the reader in understanding the principles of the invention and are to be construed as being without limitation to such specifically recited embodiments and examples. Those skilled in the art can make various other specific changes and combinations based on the teachings of the present invention without departing from the spirit of the invention, and these changes and combinations are within the scope of the invention.
Claims (4)
1. A human back feature point extraction method based on a Kinect camera is characterized by comprising the following steps:
A. acquiring a human back image and converting it to grayscale;
B. performing contrast stretching on the grayscale image obtained in step A;
C. binarizing the image obtained in step B with the Otsu threshold method to obtain a binary image;
D. processing the binary image obtained in step C with a connected-domain method to obtain a human back binary image;
E. processing the human back binary image obtained in step D with the Canny operator to obtain a human back outer contour image;
F. traversing the human back outer contour image obtained in step E to respectively obtain the left armpit, right armpit, left shoulder, right shoulder, waist and crotch coordinate points of the human back, specifically:
calculating the center coordinates of the human back outer contour image obtained in step E and the back contour coordinates (i_1, k_1) and (i_2, k_2) on the left and right sides of the i-th row of the contour image;
starting from the j-th column of row i_1 - 1 of the human back outer contour image, finding the column coordinate k_1' of the left-side contour of the human back, and judging whether the difference between the column coordinate k_1' and the column coordinate k_1 is greater than a set threshold; if yes, taking the left contour coordinate of row i_1 as the left armpit coordinate point (m_1, n_1) of the human back; if not, continuing to traverse row i_1 - 2;
starting from the j-th column of row i_2 - 1 of the human back outer contour image, finding the column coordinate k_2' of the right-side contour of the human back, and judging whether the difference between the column coordinate k_2' and the column coordinate k_2 is greater than a set threshold; if yes, taking the right contour coordinate of row i_2 as the right armpit coordinate point (m_2, n_2) of the human back; if not, continuing to traverse row i_2 - 2;
starting from row m_1 - 1 of the human back outer contour image, traversing column n_1 upward and taking the first nonzero point found as the left shoulder coordinate point of the human back;
starting from row m_2 - 1 of the human back outer contour image, traversing column n_2 upward and taking the first nonzero point found as the right shoulder coordinate point of the human back;
performing corner detection on the human back outer contour image with a curvature-based corner detection algorithm, comprising the following steps:
S1. starting from the center-line coordinate of the human back outer contour image, respectively acquiring the contour coordinates of the left and right sides of the human back;
S2. starting from the N-th contour point of the left-side and right-side contour coordinates of the human back, sequentially selecting each point as the current contour point p_i, with the contour points spaced N-1 points before and after it as the front contour point p_{i-(N-1)} and the back contour point p_{i+(N-1)};
S3. calculating the curvature of the current contour point p_i according to the curvature formula;
S4. among the curvatures of the contour points on the left and right sides of the human back, finding the contour points corresponding to the maximum and minimum curvature values, which are the crotch coordinate point and the waist coordinate point of the human back;
G. marking the coordinate points obtained in step F in the human back image of step A.
2. The method for extracting human back feature points based on a Kinect camera as claimed in claim 1, wherein step D processes the binary image obtained in step C with a connected-domain method to obtain the human back binary image, specifically: performing connected-domain labeling on the binary image obtained in step C and counting the number of connected domains; calculating the area of each connected domain, and setting every connected domain smaller than the largest one to zero to obtain the human back binary image.
3. The method as claimed in claim 2, wherein the step D further comprises performing hole filling processing on the obtained binary image of the back of the human body.
4. The method for extracting human back feature points based on a Kinect camera as claimed in claim 1, wherein the curvature formula in step S3 is defined in terms of the following quantities:
where K(i) is the curvature of the i-th contour point, |p_i p_{i-k}| is the distance between the current contour point and the front contour point spaced k points away, |p_i p_{i+k}| is the distance between the current contour point and the back contour point spaced k points away, and |p_{i-k} p_{i+k}| is the distance between the front contour point p_{i-k} and the back contour point p_{i+k}.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810479306.8A CN108648229B (en) | 2018-05-18 | 2018-05-18 | Human back feature point extraction method based on Kinect camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108648229A CN108648229A (en) | 2018-10-12 |
CN108648229B (en) | 2020-07-28
Family
ID=63756830
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810479306.8A Active CN108648229B (en) | 2018-05-18 | 2018-05-18 | Human back feature point extraction method based on Kinect camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108648229B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110309787B (en) * | 2019-07-03 | 2022-07-29 | 电子科技大学 | Human body sitting posture detection method based on depth camera |
CN112750143B (en) * | 2020-12-02 | 2024-04-26 | 上海海洋大学 | Method for extracting morphological characteristics of stem flexible fish based on extremum method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103578004A (en) * | 2013-11-15 | 2014-02-12 | 西安工程大学 | Method for displaying virtual fitting effect |
CN105469113A (en) * | 2015-11-19 | 2016-04-06 | 广州新节奏智能科技有限公司 | Human body bone point tracking method and system in two-dimensional video stream |
CN106384126A (en) * | 2016-09-07 | 2017-02-08 | 东华大学 | Clothes pattern identification method based on contour curvature feature points and support vector machine |
CN107481228A (en) * | 2017-07-28 | 2017-12-15 | 电子科技大学 | Human body back scoliosis angle measurement method based on computer vision |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE60113046T2 (en) * | 2000-01-14 | 2006-06-14 | Koninkl Philips Electronics Nv | PROCESSING METHOD AND SYSTEM FOR GEOMETRIC 3D MODELING OF THE SPINE |
US8900146B2 (en) * | 2009-07-27 | 2014-12-02 | The Hong Kong Polytechnic University | Three-dimensional (3D) ultrasound imaging system for assessing scoliosis |
CN102930534B (en) * | 2012-10-15 | 2015-05-13 | 北京工业大学 | Method for automatically positioning acupuncture points on back of human body |
Also Published As
Publication number | Publication date |
---|---|
CN108648229A (en) | 2018-10-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| TR01 | Transfer of patent right | Effective date of registration: 2021-11-10. Address after: 314, Industry-University-Research Building, Hong Kong Polytechnic University, 18 Yuexing 1st Road, High-tech Zone Community, Yuehai Street, Nanshan District, Shenzhen, Guangdong 518057. Patentee after: Shenzhen Lingdong Medical Technology Co., Ltd. Address before: No. 1602, Floor 16, Building 1, No. 138 Tianfu Second Street, Chengdu High-tech Zone, China (Sichuan) Pilot Free Trade Zone, Chengdu, Sichuan 610041. Patentee before: Sichuan Efficiency Future Technology Co., Ltd. |