CN107137225B - Method and system for establishing and positioning personalized head and face acupoint recognition model - Google Patents

Method and system for establishing and positioning personalized head and face acupoint recognition model

Info

Publication number
CN107137225B
Authority
CN
China
Prior art keywords
face
head
dimensional
acupoint
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710479753.9A
Other languages
Chinese (zh)
Other versions
CN107137225A (en)
Inventor
付先军 (Fu Xianjun)
李学博 (Li Xuebo)
王振国 (Wang Zhenguo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University of Traditional Chinese Medicine
Original Assignee
Shandong University of Traditional Chinese Medicine
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University of Traditional Chinese Medicine filed Critical Shandong University of Traditional Chinese Medicine
Priority to CN201710479753.9A
Publication of CN107137225A
Application granted
Publication of CN107137225B
Active legal status (current)
Anticipated expiration of legal status

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 39/00 Devices for locating or stimulating specific reflex points of the body for physical therapy, e.g. acupuncture
    • A61H 39/02 Devices for locating such points
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y 50/00 Data acquisition or data processing for additive manufacturing
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B33 ADDITIVE MANUFACTURING TECHNOLOGY
    • B33Y ADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y 80/00 Products made by additive manufacturing
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Abstract

The application relates to a method and a system for establishing and positioning a personalized head and face acupoint recognition model. The establishing method comprises the following steps: manually marking the head and face acupoints of human body samples; scanning and collecting three-dimensional head and face information of each human body sample, wherein the three-dimensional head and face information comprises the relative coordinates of the marked acupoints, the distances between the acupoints and adjacent bony landmarks, and the body-cun (tongshencun) measurement of the human body sample; dividing the three-dimensional head and face information into a head and face information training data set and a head and face information testing data set according to a certain proportion; training an automatic head and face acupoint recognition model on the training data set through a deep learning algorithm; and testing and optimizing the automatic head and face acupoint recognition model based on professional evaluation information and the testing data set.

Description

Method and system for establishing and positioning personalized head and face acupoint recognition model
Technical Field
The application belongs to the technical field of acupoint positioning, and particularly relates to a method and a system for establishing and positioning an individualized head and face acupoint recognition model.
Background
At present, treatment methods in traditional Chinese medicine (TCM) such as acupuncture at acupoints, massage, infrared physiotherapy and drug application at acupoints are popular because they are simple, fast-acting and free of side effects. Acupoint positioning is the basis for implementing TCM acupuncture, massage and acupoint medication. The main acupoint positioning methods are the body surface anatomical landmark method, the bone-length proportional measurement (bone-cun) method and the finger-cun method; the first two locate acupoints according to body surface anatomy and bone segments, while the last locates acupoints by means of finger-cun measurements. Traditionally, acupoint positioning is determined by experienced TCM practitioners according to personal experience and hand feel, that is, by the latter method. However, for non-TCM professionals or beginners, the location of an acupoint is often "clear in the mind but hard to identify under the fingers", making accurate positioning difficult and an ideal curative effect hard to achieve.
With the development of electronic and computer technology, Li Yanze et al. proposed a simple human body acupoint recognition and treatment apparatus to address the difficulty that non-TCM specialists have in positioning acupoints accurately; it judges acupoint positions by measuring human body voltage values, exploiting the difference in impedance, and hence in measured voltage, between acupoint and non-acupoint points. Chinese patent document CN1111029C discloses a circuit for detecting human body acupoints according to the divergence characteristics that acupoints impose on bipolar detection signals: a sampling circuit is connected in series in the output loop of the circuit to rectify and sample the bipolar current signal flowing through the human body, and when the current divergence characteristic, i.e. the acupoint characteristic, is detected, a feedback circuit automatically strengthens the detection output, increasing the intensity of the detection signal applied to the acupoint, enhancing the acupoint sensation of the person being examined and indicating the acupoint position. However, these voltage-based positioning methods have certain problems: a voltage must be applied to the human body, which poses a potential safety hazard, and only relatively large acupoints can be determined.
Chinese patent document CN103735407B proposes a human ear acupoint positioning device, which addresses the problems that existing acupoint positioning methods need to apply a voltage to the human body and that existing positioning devices are unsuitable for accurate positioning in regions where acupoints are dense. The positioning method comprises the following steps: collecting an image of the region to be positioned; selecting the region to be positioned and reading the corresponding local acupoint map from an acupoint database according to that region; obtaining a contour map and fitting it to the local acupoint map to obtain a fitted image; and selecting the acupoint to be positioned and controlling a laser indicator to emit an indicating laser beam toward its position according to its location in the fitted image. The position of the part to be measured is fixed by a bracket in the positioning device, and image acquisition and emission of the indicating laser beam toward the corresponding ear acupoint are realized by an image acquisition and acupoint indication device. However, this method also has certain disadvantages. First, the applicable body parts are limited and it is not suitable for the human head and face. Second, its positioning principle fits contour lines in an acquired two-dimensional plane image to the local acupoint map; but because the human body is a three-dimensional structure with large individual differences in structure and three-dimensional conformation, simply locating acupoints by fitting contours in a planar image can hardly achieve accurate positioning.
In addition, in the prior art, almost no acupoint identification system or device can, after imaging each individual in a personalized way, be directly applied to acupuncture, massage and physiotherapy instruments once the acupoints have been positioned.
In summary, no effective solution is yet available for the problems that existing acupoint positioning methods cannot accurately locate the fine acupoints of the face and that existing acupoint positioning and identification devices cannot perform personalized acupoint positioning and identification.
Disclosure of Invention
The application aims to solve the problems that prior-art acupoint positioning methods cannot accurately locate the fine acupoints of the face and that prior-art acupoint positioning and identification devices cannot perform personalized acupoint positioning and identification, and provides a method and a system for establishing and positioning a personalized head and face acupoint recognition model.
In order to achieve the above purpose, the present application adopts the following technical scheme:
a personalized head and face acupoint automatic identification model building method comprises the following steps:
manually marking the head and face acupoints of the human body sample;
scanning and collecting three-dimensional head and face information of each human body sample, wherein the three-dimensional head and face information comprises the relative coordinates of the marked acupoints, the distances between the acupoints and adjacent bony landmarks, and the body-cun (tongshencun) measurement of the human body sample;
dividing the three-dimensional head-face information into a head-face information training data set and a head-face information testing data set according to a certain proportion;
training an automatic head and face acupoint recognition model by adopting the head and face information training data set through a deep learning algorithm;
and testing and optimizing the automatic head-face acupoint recognition model based on the professional evaluation information and the head-face information testing data set.
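By way of illustration only, the testing and optimization step above could score the trained model against the expert-marked acupoints in the testing data set using a simple mean Euclidean error. The sketch below is an assumption about how such a check might look; the function name, array shapes and the 5 mm tolerance are not taken from the patent.

import numpy as np

def mean_acupoint_error(predicted, expert_marked):
    """predicted, expert_marked: (K, 3) arrays of acupoint coordinates, e.g. in mm."""
    return float(np.mean(np.linalg.norm(predicted - expert_marked, axis=1)))

# Example: flag test samples whose average error exceeds an assumed 5 mm tolerance.
# errors = [mean_acupoint_error(model(x), y) for x, y in test_set]
# needs_review = [i for i, e in enumerate(errors) if e > 5.0]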
Further, training the head-face acupoint automatic recognition model by using the head-face information training data set through a deep learning algorithm includes:
establishing a three-dimensional head and face information database of the human body samples, and denoising to improve data precision;
constructing a grid diagram converted from a three-dimensional model to a two-dimensional plane, projecting the three-dimensional head and face information to the two-dimensional plane through the grid diagram, and establishing a two-dimensional face image;
according to the three-dimensional head and face information, separating to obtain a face depth image, wherein the face depth image is a two-dimensional depth image;
and training a cascade convolutional neural network by taking the two-dimensional face image and the face depth map as inputs, and establishing an automatic head and face acupoint recognition model.
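As a rough, non-authoritative sketch of the two conversion steps listed above (projecting the three-dimensional head and face information onto a two-dimensional plane to build a face image, and separating a two-dimensional depth map), the Python code below shows one possible implementation. The grid resolution, the projection axis and the array layout are assumptions for illustration, not details taken from the patent.

import numpy as np

def project_to_plane(points_xyz, colors_rgb, grid_w=256, grid_h=256):
    """points_xyz: (N, 3) float array of scanned points; colors_rgb: (N, 3) uint8 array."""
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    # Normalize X and Y into pixel indices of the two-dimensional grid.
    u = ((x - x.min()) / (x.max() - x.min() + 1e-9) * (grid_w - 1)).astype(int)
    v = ((y - y.min()) / (y.max() - y.min() + 1e-9) * (grid_h - 1)).astype(int)

    face_image = np.zeros((grid_h, grid_w, 3), dtype=np.uint8)        # two-dimensional face image
    depth_map = np.full((grid_h, grid_w), -np.inf, dtype=np.float32)  # face depth map

    for ui, vi, zi, rgb in zip(u, v, z, colors_rgb):
        if zi > depth_map[vi, ui]:          # keep the front-most point in each grid cell
            depth_map[vi, ui] = zi
            face_image[vi, ui] = rgb

    depth_map[depth_map == -np.inf] = 0.0   # grid cells with no scanned point get zero depth
    return face_image, depth_map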
The application also discloses an automatic identification and positioning method for the personalized head and face acupoints, which comprises the following steps:
scanning and collecting three-dimensional head and face information of a subject;
performing digital reconstruction on the three-dimensional head and face information of the subject to obtain a three-dimensional head and face model of the subject, and respectively converting the three-dimensional head and face model of the subject into a two-dimensional face image of the subject and a face depth map of the subject, wherein the face depth map of the subject is a two-dimensional image;
taking the two-dimensional face image of the subject and the face depth image of the subject as input of a trained head and face acupoint automatic recognition model to obtain face acupoint coordinates (x, y, z) of the subject;
and performing acupoint positioning on the head and the face of the subject according to the coordinates of the acupoints of the face of the subject.
Further, performing acupoint positioning on the head and face of the subject according to the coordinates of the acupoints of the face of the subject comprises:
performing acupoint positioning on the three-dimensional head-face model of the subject;
or performing acupoint positioning on the two-dimensional face image of the subject.
Further, the three-dimensional head and face information of the subject is represented by digitized point data, with shape represented by XYZ coordinates and color represented by 24-bit RGB values.
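For illustration only, a single scanned point as described here (shape as XYZ coordinates plus a 24-bit RGB color, i.e. three 8-bit channels) might be stored as a structured record; the NumPy dtype below is an assumed storage layout, not one specified in the patent.

import numpy as np

point_dtype = np.dtype([
    ("x", np.float32), ("y", np.float32), ("z", np.float32),  # shape: XYZ coordinates
    ("r", np.uint8), ("g", np.uint8), ("b", np.uint8),        # color: 24-bit RGB
])

# A tiny example cloud of two points:
cloud = np.array([(12.5, -3.0, 88.1, 210, 180, 160),
                  (13.1, -2.7, 87.9, 205, 176, 158)], dtype=point_dtype)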
Further, the method may also comprise: receiving the three-dimensional head and face model of the subject and a two-dimensional face image of the subject;
printing a three-dimensional head and face solid model of the subject by adopting a three-dimensional printing device, and positioning acupoints on the solid model;
or adopting a plane printing device to print a two-dimensional face picture of the subject, and performing acupoint positioning on the picture.
the application also provides an automatic personalized head and face acupoint recognition and positioning system, which comprises:
the acquisition module is used for scanning and acquiring three-dimensional head and face information of each human body sample and three-dimensional head and face information of a subject, wherein the three-dimensional head and face information of a human body sample comprises the relative coordinates of the marked acupoints, the distances between the acupoints and adjacent bony landmarks, and the body-cun (tongshencun) measurement of the human body sample;
and
the model building module is configured to select a head and face information training data set and a head and face information testing data set from the three-dimensional head and face information of the human body samples, and to train and optimize an automatic head and face acupoint recognition model through a deep learning algorithm;
and
and the identifying and positioning module is used for identifying and positioning the acupuncture points on the basis of the three-dimensional head and face information of the subject.
Further, the model building module includes:
the training module is used for training the head and face acupoint automatic identification model by adopting the head and face information training data set through a deep learning algorithm;
and the testing module is used for testing and optimizing the automatic head and face acupoint recognition model based on the professional evaluation information and the head and face information testing data set.
Further, the identification positioning module comprises a shape positioning unit, a color positioning unit or a sound positioning unit.
Further, the system also comprises a three-dimensional printing device or a plane printing device, wherein the three-dimensional printing device is used for printing a three-dimensional head and face solid model of the subject, and acupoint positioning is performed on the solid model;
or adopting a plane printing device to print a two-dimensional face picture of the subject, and performing acupoint positioning on the picture.
The application has the beneficial effects that:
(1) The application realizes contact-free, non-invasive and precise positioning of head and face acupoints; the positioning result can be visualized and directly applied to operations such as acupuncture, massage, drug administration and infrared physiotherapy, giving full play to the personalized treatment of traditional Chinese medicine.
(2) The application adopts a three-dimensional head and face scanning system, which imposes no potentially hazardous influence on the human body and acquires more three-dimensional information about the human head and face than planar scanning, providing a richer information basis for the subsequent accurate positioning of acupoints.
(3) The application constructs an automatic head and face acupoint identification and marking model by collecting three-dimensional information of the human head and face, in particular bony landmarks and other parameters, and applying a deep learning method, and the model's algorithm is continuously refined through expert evaluation.
(4) Three-dimensional image reconstruction is carried out on the structure carrying the positioning marks, and a physical model is produced through 3D printing; a mask model can also be printed, and operations such as acupuncture, massage, infrared physiotherapy and drug delivery can be applied at the corresponding acupoint marks on the mask model, which greatly increases the personalized application value of the mask.
Drawings
FIG. 1 is a schematic diagram of an automatic head and face acupoint recognition training system;
FIG. 2 is a schematic diagram of the personalized head and face acupoint positioning method;
FIG. 3 is a schematic diagram of the identification of the acupoints on the head and face of a human body;
FIG. 4 is a schematic diagram of a personalized head-face acupoint positioning method and apparatus;
FIG. 5 is a flowchart of the electrical connections and workflow of the personalized head and face acupoint positioning method and apparatus;
wherein: 1. 3D scanner fixing bracket; 2. 3D scanner; 3. computer system; 4. 3D printing support; 5. 3D print head; 6. chin rest.
The specific embodiment is as follows:
the application is further illustrated by the following examples in conjunction with the accompanying drawings:
it should be noted that the following detailed description is illustrative and is intended to provide further explanation of the application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present application. As used herein, the singular is also intended to include the plural unless the context clearly indicates otherwise, and furthermore, it is to be understood that the terms "comprises" and/or "comprising" when used in this specification are taken to specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
An exemplary embodiment of the present application is a method for establishing an automatic recognition model for personalized head and face acupoints, as shown in fig. 1, including:
manually marking the head and face acupoints of a human body sample: before scanning, the head and the face of a human body are marked directly, the relative coordinates of the acupoints are obtained through three-dimensional scanning, and meanwhile, the distance between the acupoints and the bony marks of the appendages and the cun of the same person (cun-shen method in acupuncture) are collected, so that enough individuation information is obtained, and more useful information is provided for positioning the rear acupoints.
Dividing the three-dimensional head-face information into a head-face information training data set and a head-face information testing data set according to a certain proportion;
training an automatic head and face acupoint recognition model by adopting the head and face information training data set through a deep learning algorithm;
and testing and optimizing the automatic head-face acupoint recognition model based on the professional evaluation information and the head-face information testing data set.
The training of the head and face acupoint automatic identification model by the deep learning algorithm through the head and face information training data set comprises the following steps:
establishing a three-dimensional head and face information database of the human body samples, and denoising to improve data precision;
wherein the three-dimensional head and face information database of human body samples comprises 200 people: 100 men and 100 women;
constructing a grid diagram converted from a three-dimensional model to a two-dimensional plane, projecting the three-dimensional head and face information to the two-dimensional plane through the grid diagram, and establishing a two-dimensional face image;
separating a face depth map from the three-dimensional head and face information; the face depth map may be obtained as follows: 100 persons in the three-dimensional head and face information database are selected, each in 10 poses, giving 1000 samples, from which the face depth maps are obtained by separation;
and training a cascade convolutional neural network with the two-dimensional face image and the face depth map as inputs, and establishing the automatic head and face acupoint recognition model. The facial key points are: the eyes, the nasal tip, the two corners of the mouth, bony landmarks and acupoints.
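The following is a minimal, hedged PyTorch sketch of the kind of two-input cascade convolutional neural network described above: the two-dimensional face image and the face depth map are taken as inputs and the acupoint coordinates are regressed in a coarse-to-fine cascade. The layer sizes, the number of acupoints K and the residual refinement are illustrative assumptions and not the patented architecture.

import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
    )

class AcupointStage(nn.Module):
    """One cascade stage: fuse RGB and depth features and regress K points (x, y, z)."""
    def __init__(self, num_points=10, extra_in=0):
        super().__init__()
        self.rgb_branch = nn.Sequential(conv_block(3, 16), conv_block(16, 32),
                                        nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.depth_branch = nn.Sequential(conv_block(1, 16), conv_block(16, 32),
                                          nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.regressor = nn.Sequential(
            nn.Linear(64 + extra_in, 128), nn.ReLU(inplace=True),
            nn.Linear(128, num_points * 3),
        )

    def forward(self, rgb, depth, prev=None):
        feats = torch.cat([self.rgb_branch(rgb), self.depth_branch(depth)], dim=1)
        if prev is not None:                 # later stages also see the previous estimate
            feats = torch.cat([feats, prev], dim=1)
        return self.regressor(feats)

class CascadeAcupointNet(nn.Module):
    """Stage 1 predicts coarse acupoint coordinates; stage 2 refines them (coarse-to-fine)."""
    def __init__(self, num_points=10):
        super().__init__()
        self.stage1 = AcupointStage(num_points)
        self.stage2 = AcupointStage(num_points, extra_in=num_points * 3)

    def forward(self, rgb, depth):
        coarse = self.stage1(rgb, depth)
        return coarse + self.stage2(rgb, depth, prev=coarse)   # residual refinement

# Usage with dummy tensors (batch of 2, 128x128 inputs):
# net = CascadeAcupointNet(num_points=10)
# out = net(torch.rand(2, 3, 128, 128), torch.rand(2, 1, 128, 128))   # shape (2, 30)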
On the basis of establishing the automatic head and face acupoint recognition model, the application also provides a personalized automatic head and face acupoint recognition positioning method, which comprises the following steps:
Scanning and collecting three-dimensional head and face information of a subject: the subject rests the head on the chin rest (6 in FIG. 4), and the camera of the three-dimensional scanner (2 in FIG. 4) scans the head and face; the features of the head and face are captured by the sensor, producing a set of digitized point data whose shape is represented by XYZ coordinates and whose color is represented by 24-bit RGB values. The information is then transmitted to the computer system (FIG. 5);
performing digital reconstruction on the three-dimensional head and face information of the subject to obtain a three-dimensional head and face model of the subject, and respectively converting the three-dimensional head and face model of the subject into a two-dimensional face image of the subject and a face depth map of the subject, wherein the face depth map of the subject is a two-dimensional image;
taking the two-dimensional face image of the subject and the face depth image of the subject as input of a trained head and face acupoint automatic recognition model to obtain face acupoint coordinates (x, y, z) of the subject;
and performing acupoint positioning on the head and the face of the subject according to the coordinates of the acupoints of the face of the subject.
Acupoint positioning may be performed either on the subject's three-dimensional head and face model or on the subject's two-dimensional face image.
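As an assumed, simplified sketch of positioning on the three-dimensional model, each predicted (x, y, z) acupoint coordinate can be snapped to the nearest vertex of the reconstructed head and face point cloud so that it can be marked on the model; the brute-force nearest-neighbour search below is an illustration, not the method prescribed by the patent.

import numpy as np

def snap_to_model(acupoints_xyz, model_vertices):
    """acupoints_xyz: (K, 3) predicted coordinates; model_vertices: (N, 3) scanned vertices."""
    indices = []
    for p in acupoints_xyz:
        d2 = np.sum((model_vertices - p) ** 2, axis=1)   # squared distance to every vertex
        indices.append(int(np.argmin(d2)))
    return indices   # vertex index of each positioned acupoint

# marked = snap_to_model(predicted_points, vertices)   # e.g. to color or label those vertices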
Furthermore, the three-dimensional printing equipment can be adopted to print a three-dimensional head and face solid model of the subject, and acupuncture point positioning is carried out on the solid model;
the three-dimensional head and face solid model comprises a solid model and a hollowed mask model.
Or printing a two-dimensional face picture of the subject by adopting plane printing equipment, and performing acupoint positioning on the picture.
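Purely as an assumed example of handing the marked three-dimensional model to a printing workflow, the subject's head and face point cloud with the positioned acupoint vertices colored red could be written to an ASCII PLY file, a format that common 3D modelling and printing tools can import; the file name and the red marking are illustrative choices, not requirements of the patent.

import numpy as np

def export_marked_ply(vertices, colors, acupoint_indices, path="head_face_marked.ply"):
    """vertices: (N, 3) float array; colors: (N, 3) uint8 array; acupoint_indices: list of ints."""
    colors = colors.copy()
    colors[acupoint_indices] = (255, 0, 0)                 # mark positioned acupoints in red
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(vertices)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        for (x, y, z), (r, g, b) in zip(vertices, colors):
            f.write(f"{x} {y} {z} {r} {g} {b}\n")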
the application also provides a system as a hardware system for realizing the method, which comprises the following steps:
the acquisition module is used for scanning and acquiring three-dimensional head and face information of each human body sample and three-dimensional head and face information of a subject, wherein the three-dimensional head and face information of a human body sample comprises the relative coordinates of the marked acupoints, the distances between the acupoints and adjacent bony landmarks, and the body-cun (tongshencun) measurement of the human body sample;
and
the model building module is configured to select a head and face information training data set and a head and face information testing data set from the three-dimensional head and face information of the human body samples, and to train and optimize an automatic head and face acupoint recognition model through a deep learning algorithm;
and
and the identifying and positioning module is used for identifying and positioning the acupuncture points on the basis of the three-dimensional head and face information of the subject.
Wherein the model building module comprises:
the training module is used for training the head and face acupoint automatic identification model by adopting the head and face information training data set through a deep learning algorithm;
and the testing module is used for testing and optimizing the automatic head and face acupoint recognition model based on the professional evaluation information and the head and face information testing data set.
The identification positioning module comprises a shape positioning unit, a color positioning unit or a sound positioning unit.
Further, in order to print out a three-dimensional solid model or a two-dimensional picture, the system further comprises a three-dimensional printing device or a plane printing device; the three-dimensional printing device is adopted to print a three-dimensional head and face solid model of the subject, and acupoint positioning is carried out on the solid model;
or adopting a plane printing device to print a two-dimensional face picture of the subject, and performing acupoint positioning on the picture.
Finally, the solid model can be used as a reference object for positioning the individual acupoints.
An infrared device can be mounted directly at the acupoint positions of the mask model for physiotherapy, and acupuncture, massage or drug application can be carried out through the openings at the acupoints.
The above description is only of the preferred embodiments of the present application and is not intended to limit the present application, but various modifications and variations can be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (8)

1. The method for establishing the personalized head and face acupoint automatic identification model is characterized by comprising the following steps of:
manually marking the head and face acupoints of the human body sample;
scanning and collecting three-dimensional head and face information of each human body sample, wherein the three-dimensional head and face information comprises the relative coordinates of the marked acupoints, the distances between the acupoints and adjacent bony landmarks, and the body-cun (tongshencun) measurement of the human body sample;
dividing the three-dimensional head-face information into a head-face information training data set and a head-face information testing data set according to a certain proportion;
training an automatic head and face acupoint recognition model by adopting the head and face information training data set through a deep learning algorithm;
based on the professional evaluation information and the head-face information test data set, testing and optimizing the head-face acupoint automatic identification model;
training the head and face acupoint automatic identification model by adopting the head and face information training data set through a deep learning algorithm comprises the following steps:
establishing a three-dimensional head and face information database with the human body sample, and denoising to improve data precision;
constructing a grid diagram converted from a three-dimensional model to a two-dimensional plane, projecting the three-dimensional head and face information to the two-dimensional plane through the grid diagram, and establishing a two-dimensional face image;
according to the three-dimensional head and face information, separating to obtain a face depth image, wherein the face depth image is a two-dimensional depth image;
and training a cascade convolutional neural network by taking the two-dimensional face image and the face depth map as inputs, and establishing an automatic head and face acupoint recognition model.
2. An automatic personalized head and face acupoint recognition and positioning method, characterized in that, based on the personalized head and face acupoint automatic recognition model established by the method according to claim 1, the method comprises the following steps:
scanning and collecting three-dimensional head and face information of a subject;
performing digital reconstruction on the three-dimensional head and face information of the subject to obtain a three-dimensional head and face model of the subject, and respectively converting the three-dimensional head and face model of the subject into a two-dimensional face image of the subject and a face depth map of the subject, wherein the face depth map of the subject is a two-dimensional image;
taking the two-dimensional face image of the subject and the face depth image of the subject as input of a trained head and face acupoint automatic recognition model to obtain face acupoint coordinates (x, y, z) of the subject;
and performing acupoint positioning on the head and the face of the subject according to the coordinates of the acupoints of the face of the subject.
3. The method of claim 2, wherein performing acupoint positioning on the face of the subject based on the face acupoint coordinates of the subject comprises:
performing acupoint positioning on the three-dimensional head-face model of the subject;
or performing acupoint positioning on the two-dimensional face image of the subject.
4. The method according to claim 2, characterized in that: the three-dimensional head and face information of the subject is represented by digitized point data, with shape represented by XYZ coordinates and color represented by 24-bit RGB values.
5. The method as recited in claim 2, further comprising:
receiving the three-dimensional head-face model of the subject and a two-dimensional face image of the subject;
printing a three-dimensional head and face solid model of the subject by adopting a three-dimensional printing device, and positioning acupoints on the solid model;
or adopting a plane printing device to print a two-dimensional face picture of the subject, and performing acupoint positioning on the picture.
6. An automatic personalized head and face acupoint recognition and positioning system, comprising:
the acquisition module is used for scanning and acquiring three-dimensional head and face information of each human body sample and three-dimensional head and face information of a subject, wherein the three-dimensional head and face information of a human body sample comprises the relative coordinates of the marked acupoints, the distances between the acupoints and adjacent bony landmarks, and the body-cun (tongshencun) measurement of the human body sample;
and
the model building module is configured to select a head-face information training data set and a head-face information testing data set from three-dimensional head-face information of a human body sample, and train and optimize an automatic head-face acupoint recognition model through a deep learning algorithm; the model building module comprises: the training module is used for training the head and face acupoint automatic identification model by adopting the head and face information training data set through a deep learning algorithm; the testing module is used for testing and optimizing the automatic head-face acupoint recognition model based on the professional evaluation information and the head-face information testing data set;
the identifying and positioning module is used for identifying and positioning the acupuncture points on the basis of the three-dimensional head and face information of the subject;
training the head and face acupoint automatic identification model by adopting the head and face information training data set through a deep learning algorithm comprises the following steps:
establishing a three-dimensional head and face information database with the human body sample, and denoising to improve data precision;
constructing a grid diagram converted from a three-dimensional model to a two-dimensional plane, projecting the three-dimensional head and face information to the two-dimensional plane through the grid diagram, and establishing a two-dimensional face image;
according to the three-dimensional head and face information, separating to obtain a face depth image, wherein the face depth image is a two-dimensional depth image;
and training a cascade convolutional neural network by taking the two-dimensional face image and the face depth map as inputs, and establishing an automatic head and face acupoint recognition model.
7. The system of claim 6, wherein the identification location module comprises a shape location unit, a color location unit, or a sound location unit.
8. The system of claim 6, further comprising a stereoscopic printing device or a planar printing device, printing a three-dimensional head-face solid model of the subject with the stereoscopic printing device, and performing acupoint positioning on the solid model;
or adopting a plane printing device to print a two-dimensional face picture of the subject, and performing acupoint positioning on the picture.
CN201710479753.9A 2017-06-22 2017-06-22 Method and system for establishing and positioning personalized head and face acupoint recognition model Active CN107137225B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710479753.9A CN107137225B (en) 2017-06-22 2017-06-22 Method and system for establishing and positioning personalized head and face acupoint recognition model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710479753.9A CN107137225B (en) 2017-06-22 2017-06-22 Method and system for establishing and positioning personalized head and face acupoint recognition model

Publications (2)

Publication Number Publication Date
CN107137225A CN107137225A (en) 2017-09-08
CN107137225B 2023-09-01

Family

ID=59782321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710479753.9A Active CN107137225B (en) 2017-06-22 2017-06-22 Method and system for establishing and positioning personalized head and face acupoint recognition model

Country Status (1)

Country Link
CN (1) CN107137225B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108938396A (en) * 2017-09-26 2018-12-07 炬大科技有限公司 A kind of ear acupuncture point identification device and its method based on deep learning
CN108158804B (en) * 2017-12-15 2021-09-14 苏州科灵医疗科技有限公司 Body surface vein mold for realizing target positioning based on body surface vein characteristics and manufacturing method
CN107898626A (en) * 2017-12-20 2018-04-13 大连交通大学 A kind of binocular vision acupoint positioning instrument and its method of work
CN108514509A (en) * 2018-03-30 2018-09-11 吴佳桐 A kind of intelligence acupuncture point localization method
CN108765546A (en) * 2018-04-18 2018-11-06 北京奇虎科技有限公司 Acupuncture point detection method and device
CN109568123B (en) * 2018-11-02 2021-02-02 广东数相智能科技有限公司 Acupuncture point positioning method based on YOLO target detection
CN109939001A (en) * 2019-04-02 2019-06-28 王云 A kind of device for healing and training of Dysphagia After Stroke
CN110464633A (en) * 2019-06-17 2019-11-19 深圳壹账通智能科技有限公司 Acupuncture point recognition methods, device, equipment and storage medium
CN110448453A (en) * 2019-08-08 2019-11-15 杭州诚一文化创意有限公司 A kind of smart machine and its method for facial point massage
CN110613605B (en) * 2019-08-29 2021-06-22 成都中医药大学 Construction method of acupoint discrimination model, acupoint discrimination method and discrimination system
CN111028950A (en) * 2019-12-26 2020-04-17 中科彭州智慧产业创新中心有限公司 Three-dimensional human body meridian display method and system, electronic device and storage medium
CN111723700B (en) * 2020-06-08 2022-11-11 国网河北省电力有限公司信息通信分公司 Face recognition method and device and electronic equipment
CN112184705B (en) * 2020-10-28 2022-07-05 成都智数医联科技有限公司 Human body acupuncture point identification, positioning and application system based on computer vision technology
CN112519235A (en) * 2020-12-23 2021-03-19 中科彭州智慧产业创新中心有限公司 Rapid model preparation device and method based on 3D printing technology
CN113813169B (en) * 2021-08-30 2023-12-01 中科尚易健康科技(北京)有限公司 Model decreasing deep learning human body acupoint recognition method and physiotherapy equipment
CN113807207A (en) * 2021-08-30 2021-12-17 中科尚易健康科技(北京)有限公司 Human body meridian recognition method and device based on multiple cameras and human body meridian conditioning equipment
CN113842116B (en) * 2021-10-14 2022-09-27 北京鹰之眼智能健康科技有限公司 Automatic positioning method and device for human acupuncture points and electronic equipment
CN113975152B (en) * 2021-11-01 2023-10-31 潍坊信行中直医疗科技有限公司 Individualized skin penetrating and supporting positioning device based on 3D printing and manufacturing method thereof
CN113780250B (en) * 2021-11-11 2022-01-28 四川大学 End-to-end facial acupoint positioning method for small sample and electronic equipment
CN116020094A (en) * 2022-12-28 2023-04-28 四川省八一康复中心(四川省康复医院) Adult swallowing rehabilitation training system with portable acupoint stimulation instrument and man-machine interaction
CN116364238A (en) * 2023-05-26 2023-06-30 青岛市第五人民医院 Acupuncture treatment system and method based on deep learning
CN117636446B (en) * 2024-01-25 2024-05-07 江汉大学 Face acupoint positioning method, acupuncture robot and storage medium


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101958079A (en) * 2010-07-22 2011-01-26 西北师范大学 Positioning model of channel acupuncture point in three-dimensional virtual human anatomy texture and application thereof
CN102525795A (en) * 2012-01-16 2012-07-04 沈阳理工大学 Fast automatic positioning method of foot massaging robot
CN104207931A (en) * 2013-06-05 2014-12-17 上海中医药大学 Accurate human face acupuncture point locating and acupuncture and moxibustion prescription learning method
CN105653875A (en) * 2016-01-19 2016-06-08 中国科学院微电子研究所 Manufacturing method and device of human body meridian point model
CN105930810A (en) * 2016-04-26 2016-09-07 北京工业大学 Facial acupoint positioning method and positioning device based on feature point positioning algorithm

Also Published As

Publication number Publication date
CN107137225A (en) 2017-09-08

Similar Documents

Publication Publication Date Title
CN107137225B (en) Method and system for establishing and positioning personalized head and face acupoint recognition model
Ubelaker et al. Computer-assisted facial reproduction
Gupta et al. Forensic facial reconstruction: the final frontier
Bonnechere et al. Determination of the precision and accuracy of morphological measurements using the Kinect™ sensor: comparison with standard stereophotogrammetry
US20150310629A1 (en) Motion information processing device
US20120086793A1 (en) Video image information processing apparatus and video image information processing method
WO2014112631A1 (en) Movement information processing device and program
Shujaat et al. The clinical application of three-dimensional motion capture (4D): a novel approach to quantify the dynamics of facial animations
DE60212658D1 (en) Computer assisted, automatic system for balancing vital acupuncture points
CN104274183A (en) Motion information processing apparatus
CN109091380B (en) Traditional Chinese medicine system and method for realizing acupoint visualization by AR technology
Pithon et al. Soft tissue thickness in young north eastern Brazilian individuals with different skeletal classes
CN206584354U (en) Physical examination information acquisition system based on image recognition
Mohd et al. Mental stress recognition based on non-invasive and non-contact measurement from stereo thermal and visible sensors
Li et al. Evaluation of the fine motor skills of children with DCD using the digitalised visual‐motor tracking system
Mohd et al. Internal state measurement from facial stereo thermal and visible sensors through SVM classification
Maltais Lapointe et al. Validation of the new interpretation of Gerasimov's nasal projection method for forensic facial approximation using CT data
Samsudin et al. Clinical and non-clinical initial assessment of facial nerve paralysis: A qualitative review
CN111729200B (en) Transcranial magnetic stimulation automatic navigation system and method based on depth camera and magnetic resonance
Chen et al. 3-D printing based production of head and neck masks for radiation therapy using CT volume data: a fully automatic framework
KR101897512B1 (en) Face Fit Eyebrow tattoo system using 3D Face Recognition Scanner
Kumar et al. Performance improvement using an automation system for segmentation of multiple parametric features based on human footprint
KR102348663B1 (en) System and method for measurement pulse and respiration using image and line laser
US20220096004A1 (en) System for visualizing patient stress
CN115568823A (en) Method, system and device for evaluating human body balance ability

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant