CN110897865A - Auricular point guiding device and method - Google Patents

Auricular point guiding device and method

Info

Publication number
CN110897865A
Authority
CN
China
Prior art keywords
ear
auricular
auricular point
module
segmentation
Prior art date
Legal status
Pending
Application number
CN201911353011.7A
Other languages
Chinese (zh)
Inventor
张以涛
张俊
Current Assignee
China Science Pengzhou Intelligent Industry Innovation Center Co Ltd
Original Assignee
China Science Pengzhou Intelligent Industry Innovation Center Co Ltd
Priority date
Filing date
Publication date
Application filed by China Science Pengzhou Intelligent Industry Innovation Center Co Ltd
Priority to CN201911353011.7A
Publication of CN110897865A


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 39/00: Devices for locating or stimulating specific reflex points of the body for physical therapy, e.g. acupuncture
    • A61H 39/02: Devices for locating such points
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/50: Depth or shape recovery
    • G06T 7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds

Abstract

The invention provides an auricular point guiding device and method. The device comprises a structured light module, a projection module, and a processing unit, with the structured light module and the projection module each connected to the processing unit. The structured light module acquires an ear map of the subject to be acquired and sends it to the processing unit. The processing unit calculates the auricular point positions from the received ear map and determines the projection direction angle of the projection module from those positions, so that the projection module projects guide light onto the auricular point positions. The device and method improve the accuracy and visualization of auricular point guidance and facilitate auricular point treatment, diagnosis, and teaching.

Description

Auricular point guiding device and method
Technical Field
The invention belongs to the technical field of auricular point positioning, and particularly relates to an auricular point guiding device and an auricular point guiding method.
Background
The ear is closely related to the zang-fu organs and meridians; each organ has a corresponding reaction zone (auricular point) on the auricle, and stimulating these auricular points can regulate and treat the corresponding organs. Auricular point diagnosis and treatment originated in China, is an important component of traditional Chinese medicine acupuncture and moxibustion, and is a distinctive holographic micro-acupuncture therapy that combines local response with the body as a whole. Correct localization of the auricular points is crucial for diagnosing and treating disease and for teaching. Traditional auricular point localization relies mainly on a practitioner's visual inspection of the ear or comparison against an auricular point chart, which introduces subjectivity and inaccuracy. A device that can visually present the precise positions of the auricular points therefore benefits auricular point treatment, diagnosis, and teaching.
Disclosure of Invention
The present invention aims to solve at least one of the technical problems in the prior art and provides an auricular point guiding device and method.
A first aspect of the present invention provides an auricular point guiding device, comprising a structured light module, a projection module, and a processing unit, wherein the structured light module and the projection module are each connected to the processing unit;
the structured light module is used for acquiring an ear map of the subject to be acquired and sending the ear map to the processing unit;
the processing unit is used for calculating auricular point positions from the received ear map and determining the projection direction angle of the projection module from the auricular point positions, so that the projection module projects guide light onto the auricular point positions.
Optionally, the ear map includes an ear structure map and an ear image, and the structured light module includes a dot matrix projector, an infrared lens, and a CCD camera arranged in sequence, wherein acquiring the ear map of the subject to be acquired includes:
the dot matrix projector is used for projecting light spots onto the ear;
the infrared lens is used for capturing the dot pattern on the ear to obtain the ear structure map;
the CCD camera is used for capturing the ear image.
Optionally, the processing unit includes a segmentation subunit, a matching subunit, a recognition subunit, and a control subunit, wherein:
the segmentation subunit is configured to pass the ear image through a preset region localization network, remove interfering regions around the ear, and obtain an ear segmentation map;
the matching subunit is configured to match the ear segmentation map with the ear structure map to obtain an ear segmentation structure map;
the recognition subunit is configured to input the ear segmentation structure map into a pre-stored auricular point recognition model and compare the feature information of the ear segmentation structure map with the feature information of the recognition model to determine the auricular point positions;
the control subunit is configured to determine the projection direction angle of the projection module from the auricular point positions.
Optionally, the region localization network comprises a convolution structure, a standard box layer, and a regression classification layer, wherein:
the convolution structure is used for network training on ear images;
the standard box layer is used for storing the standard ear regions selected by the convolution structure during network training;
the regression classification layer is used for computing, with a cross-entropy loss, the probability that each candidate box is the target box and for producing the ear segmentation map by selecting the candidate box with the highest probability.
Optionally, the ear image and the ear structure map have the same image size, and in the matching subunit the ear segmentation map is mapped onto the ear structure map, the corresponding region being the ear segmentation structure map.
Optionally, the auricular point recognition model comprises a feature extraction module, a feature module, and a recognition module, wherein:
the feature extraction module is used for extracting feature information from the ear segmentation structure map;
the feature module pre-stores the standard auricular point feature information of the recognition model;
the recognition module uses a softmax classifier to compare the feature information of the ear segmentation structure map with the standard auricular point feature information and determine the auricular point positions.
Optionally, the processing unit further comprises a calculation subunit configured to:
establish a first spatial coordinate system for the structured light module and a second spatial coordinate system for the projection module;
calculate the transformation between the first spatial coordinate system and the second spatial coordinate system;
calculate first position coordinates of an auricular point in the first spatial coordinate system from its position in the ear segmentation map;
convert the first position coordinates into second position coordinates in the second spatial coordinate system using the transformation;
and determine the projection direction angle from the second position coordinates.
A second aspect of the present invention provides an auricular point guiding method, comprising:
acquiring an ear map of the subject to be acquired;
calculating auricular point positions from the received ear map, determining the projection direction angle of the projected light from the auricular point positions, and projecting guide light onto the auricular point positions.
Optionally, acquiring the ear map of the subject to be acquired includes:
projecting light spots onto the ear;
capturing the dot pattern on the ear to obtain an ear structure map;
capturing an ear image.
Optionally, calculating the auricular point positions from the received ear map, determining the projection direction angle of the projected light from the auricular point positions, and projecting the guide light onto the auricular point positions includes:
passing the ear image through a preset region localization network, removing interfering regions around the ear, and obtaining an ear segmentation map;
matching the ear segmentation map with the ear structure map to obtain an ear segmentation structure map;
inputting the ear segmentation structure map into a pre-stored auricular point recognition model and comparing its feature information with the feature information of the recognition model to determine the auricular point positions;
and determining the projection direction angle of the projected light from the auricular point positions.
The auricular point guiding device and method provided by the embodiments of the invention comprise a structured light module, a projection module, and a processing unit, with the structured light module and the projection module each connected to the processing unit. The structured light module acquires an ear map of the subject to be acquired and sends it to the processing unit. The processing unit calculates the auricular point positions from the received ear map and determines the projection direction angle of the projection module from those positions, so that the projection module projects guide light onto the auricular point positions. The device and method improve the accuracy and visualization of auricular point guidance and facilitate auricular point treatment, diagnosis, and teaching.
Drawings
Fig. 1 is a schematic structural diagram of an auricular point guiding device according to the present invention;
FIG. 2 is a schematic diagram of data processing of the auricular point guiding device according to the present invention;
FIG. 3 is a block diagram schematically illustrating the components of the processing unit of FIG. 1;
FIG. 4 is a flow chart of an auricular point guiding method according to the present invention;
fig. 5 is a schematic flow chart of ear map acquisition in the auricular point guiding method according to the present invention;
fig. 6 is a schematic flow chart of auricular point position calculation and guided light projection in the auricular point guiding method according to the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
As shown in fig. 1 and 2, an auricular point guiding device 100 includes a structured light module 110, a projection module 120, and a processing unit 130, wherein the structured light module 110 and the projection module 120 are each connected to the processing unit 130. The structured light module 110 is configured to acquire an ear map of the subject to be acquired and send the ear map to the processing unit 130. The processing unit 130 is configured to calculate auricular point positions from the received ear map and to determine the projection direction angle of the projection module 120 from the auricular point positions, so that the projection module 120 projects guide light onto the auricular point positions.
As shown in fig. 1, the structured light module 110 and the projection module 120 form an acquisition and projection module A. The acquisition and projection module A is connected to the processing unit 130 through a data line 140, which carries data and control signals between the acquisition and projection module A and the processing unit 130.
It should be noted that the ear map may include a structure map of the ear to be recognized, acquired by the structured light module 110, and an ear image recorded by the camera, and may also include other types of ear images, such as a depth image. The projection module 120 of the present embodiment is a laser projection module. The invention acquires the ear map through the structured light module 110, calculates the auricular point positions from the ear map, and presents those positions on the ear with the laser projection module 120, which improves the accuracy and visualization of auricular point guidance and facilitates auricular point treatment, diagnosis, and teaching.
The ear map of the present embodiment includes an ear structure map and an ear image. The ear structure map is obtained by illuminating the ear whose auricular points are to be identified with structured light, and the ear image is the image of the ear recorded by a camera. As shown in fig. 1, the structured light module 110 includes a dot matrix projector 111, an infrared lens 112, and a CCD camera 113 arranged in sequence. Acquiring the ear map of the subject to be acquired includes: the dot matrix projector 111 projects light spots onto the ear; the infrared lens 112 captures the dot pattern on the ear to obtain the ear structure map; and the CCD camera 113 captures the ear image. In this embodiment the ear image captured by the CCD camera is a color image of the ear, which contains richer feature information of the ear and facilitates subsequent processing.
The dot matrix projector 111 projects tens of thousands of light points onto the ear to be identified, from which the depth information of the ear can be reconstructed. Recognizing the ear with depth information improves the accuracy and security of ear recognition. The infrared lens 112 captures the infrared light projected by the dot matrix projector 111 to obtain the dot pattern on the ear. A CCD (charge-coupled device) camera is small and light, is unaffected by magnetic fields, and resists vibration and impact.
The distance between the device and the ear may be 0.2 m to 0.6 m. The accuracy of the structured light projected onto the ear by the structured light module 110 is ±1 mm at 600 mm. The dot matrix projector 111 of this embodiment projects 30,000 light spots onto the ear, and the resolution of the CCD camera 113 is 1280 × 1024.
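The disclosure does not spell out how depth is recovered from the projected dot pattern, so the following is only a minimal sketch of the standard structured-light triangulation such a module typically performs: the shift (disparity) of each detected dot relative to a stored reference pattern is converted to depth. The function name, focal length, and baseline below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def depth_from_dot_disparity(dot_xy, ref_xy, f_px=1400.0, baseline_m=0.025):
    """Recover depth for matched structured-light dots by triangulation.

    dot_xy : (N, 2) pixel coordinates of dots seen by the infrared lens
    ref_xy : (N, 2) pixel coordinates of the same dots in a calibration
             reference pattern (flat target at a known distance)
    f_px   : focal length in pixels (illustrative value)
    baseline_m : projector-to-camera baseline in metres (illustrative value)

    Returns an (N,) array of depths in metres; the device described above
    works at roughly 0.2 m to 0.6 m from the ear.
    """
    disparity = dot_xy[:, 0] - ref_xy[:, 0]             # horizontal shift per dot
    disparity = np.clip(np.abs(disparity), 1e-6, None)  # avoid division by zero
    return f_px * baseline_m / disparity

# Example with synthetic dots: a larger disparity corresponds to a smaller depth.
dots = np.array([[640.0, 512.0], [700.0, 300.0]])
refs = np.array([[520.0, 512.0], [560.0, 300.0]])
print(depth_from_dot_disparity(dots, refs))
```

With the placeholder calibration values above, both synthetic dots fall inside the quoted 0.2 m to 0.6 m working range.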
Specifically, when the auricular point guiding device 100 is started, the dot matrix projector 111 projects light spots onto the ear, the infrared lens 112 reads the dot pattern to capture the ear structure map, and the CCD camera 113 records the ear image. The processing unit 130 processes the captured ear structure map together with the ear image recorded by the CCD camera 113, calculates the auricular point positions, and controls the laser projection module 120 to project laser onto the corresponding auricular points of the ear, achieving accurate localization and guidance of the auricular points.
As shown in fig. 3, the processing unit 130 includes a segmentation subunit 131, a matching subunit 132, a recognition subunit 133, and a control subunit 134. The segmentation subunit 131 is configured to pass the ear image through a preset region localization network, remove interfering regions around the ear, and obtain an ear segmentation map. The matching subunit 132 is configured to match the ear segmentation map with the ear structure map to obtain an ear segmentation structure map. The recognition subunit 133 is configured to input the ear segmentation structure map into a pre-stored auricular point recognition model, compare the feature information of the ear segmentation structure map with the feature information of the recognition model, and determine the auricular point positions. The control subunit 134 is configured to determine the projection direction angle of the projection module 120 from the auricular point positions.
Specifically, the ear image recorded by the CCD camera 113 is first passed through the preset region localization network, which removes interfering regions such as the skin around the ear and outputs an ear segmentation map. The ear segmentation map is then matched with the ear structure map captured by the infrared lens 112 to obtain an ear segmentation structure map, which contains the dot pattern of the ear region. Next, the ear segmentation structure map is input into the pre-stored auricular point recognition model, and its feature information is compared with the pre-stored auricular point features to determine the corresponding auricular point positions. Finally, the projection direction angle of the projection module 120 is adjusted according to the auricular point positions, and the projection module 120 is controlled to project guide light onto the corresponding auricular points, completing the auricular point guidance.
The ear image is thus input into the preset region localization network to remove interference and obtain the ear segmentation map; the ear segmentation map is combined with the ear structure map to obtain the ear segmentation structure map; and the features of the ear segmentation structure map are compared to obtain the positions of the auricular points. Extracting ear feature information by combining the ear structure map with the ear image improves recognition efficiency.
Specifically, the region localization network includes a convolution structure, a standard box layer, and a regression classification layer. The convolution structure comprises an input layer, several convolutional layers, several pooling layers, a fully connected layer, and an output layer, and is used for network training on ear images. The standard box layer stores the standard ear regions selected by the convolution structure during network training. The regression classification layer computes, with a cross-entropy loss, the probability that each candidate box is the target box and produces the ear segmentation map by selecting the candidate box with the highest probability.
Training the region localization network through the convolution structure, the standard box layer, and the regression classification layer improves both the accuracy and the efficiency of region localization.
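As a concrete illustration of how such a region localization network could be wired up, the sketch below assumes a small PyTorch model: a convolutional backbone scores a fixed set of stored candidate boxes (standing in for the standard box layer), cross-entropy is used against the labelled box during training, and the highest-probability box is selected at inference to crop the ear region. Layer sizes, the number of candidate boxes, and all names are illustrative assumptions, not taken from the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EarRegionNet(nn.Module):
    """Toy region localization net: conv backbone plus one score per candidate box."""
    def __init__(self, num_candidate_boxes=9):
        super().__init__()
        self.backbone = nn.Sequential(           # "convolution structure"
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(32 * 4 * 4, num_candidate_boxes)

    def forward(self, image):
        feat = self.backbone(image).flatten(1)
        return self.classifier(feat)             # one logit per candidate box

# "Standard box layer": candidate boxes stored during training, as (x0, y0, x1, y1)
# in normalized image coordinates; the values are illustrative.
standard_boxes = torch.tensor([[0.07 * i, 0.1, 0.07 * i + 0.4, 0.9] for i in range(9)])

net = EarRegionNet()
image = torch.rand(1, 3, 256, 256)               # stand-in for the CCD color image

# "Regression classification layer": cross-entropy against the annotated box index
# during training; softmax plus arg-max at inference picks the ear box.
logits = net(image)
target = torch.tensor([3])                       # index of the true box (training label)
loss = F.cross_entropy(logits, target)
best_box = standard_boxes[logits.softmax(dim=1).argmax(dim=1)]
print(loss.item(), best_box)
```

Cropping the ear image to the selected box would yield the ear segmentation map used in the next step.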
Specifically, the ear image and the ear structure map have the same image size; in the matching subunit 132 the ear segmentation map is mapped onto the ear structure map, and the corresponding region is the ear segmentation structure map. Keeping the ear image and the ear structure map at the same size simplifies the matching computation and improves the efficiency of the matching process.
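Because the two maps share one size, the matching step can be read as a per-pixel masking operation. The sketch below assumes the segmentation output is a binary ear mask; this is one plausible reading of the step rather than the patent's exact procedure, and the array shape follows the 1280 × 1024 CCD resolution mentioned earlier.

```python
import numpy as np

def match_segmentation_to_structure(ear_mask, structure_map):
    """Keep only the dot-pattern pixels that fall inside the segmented ear.

    ear_mask      : (H, W) bool array from the region localization step
    structure_map : (H, W) array of the infrared dot pattern (same size)
    """
    assert ear_mask.shape == structure_map.shape, "maps must share one size"
    return np.where(ear_mask, structure_map, 0)   # ear segmentation structure map

mask = np.zeros((1024, 1280), dtype=bool)
mask[300:700, 500:900] = True                     # hypothetical ear bounding region
structure = np.random.rand(1024, 1280)            # stand-in for the captured dot pattern
seg_structure = match_segmentation_to_structure(mask, structure)
```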
Specifically, the auricular point recognition model comprises a feature extraction module, a feature module, and a recognition module. The feature extraction module is used for extracting feature information from the ear segmentation structure map. The feature module pre-stores the standard auricular point feature information of the recognition model. The recognition module uses a softmax classifier to compare the feature information of the ear segmentation structure map with the standard auricular point feature information and determine the auricular point positions.
The feature extraction module, which extracts the feature information of the ear segmentation structure map, comprises an input layer, several convolutional layers, several pooling layers, a fully connected layer, and an output layer. The feature module consists of pre-stored auricular point feature information extracted from the standard auricular point images of GB/T 13734-2008 (Nomenclature and location of auricular points). The recognition module uses a softmax classifier to compare the feature information of the ear segmentation structure map with the standard auricular point information and obtain the corresponding auricular points.
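A minimal sketch of this recognition step, assuming the extracted features are fixed-length vectors and that the softmax classifier scores each candidate location against the pre-stored standard features; the feature dimension, the cosine-similarity comparison, and the acupoint names used here are illustrative assumptions rather than details given in the patent.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def identify_acupoint(query_feature, standard_features, acupoint_names):
    """Compare one extracted feature vector with pre-stored standard features.

    query_feature     : (D,) feature of a candidate location on the ear
    standard_features : (K, D) one stored feature per standard auricular point
    acupoint_names    : list of K names from the standard acupoint chart
    """
    q = query_feature / np.linalg.norm(query_feature)
    s = standard_features / np.linalg.norm(standard_features, axis=1, keepdims=True)
    scores = s @ q                       # cosine similarity per acupoint
    probs = softmax(scores)              # softmax-classifier style probabilities
    return acupoint_names[int(probs.argmax())], probs

names = ["shenmen", "heart", "lung"]                 # illustrative subset of acupoints
standards = np.random.rand(3, 128)                   # stand-in for pre-stored features
query = standards[1] + 0.01 * np.random.rand(128)    # feature close to "heart"
print(identify_acupoint(query, standards, names))
```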
As shown in fig. 3, the processing unit 130 further includes a calculation subunit 135. The calculation subunit 135 is configured to: establish a first spatial coordinate system for the structured light module 110 and a second spatial coordinate system for the projection module 120; calculate the transformation between the first and second spatial coordinate systems; calculate first position coordinates of an auricular point in the first spatial coordinate system from its position in the ear segmentation map; convert the first position coordinates into second position coordinates in the second spatial coordinate system using the transformation; and determine the projection direction angle from the second position coordinates.
Once spatial coordinate systems are established for the structured light module 110 and the projection module 120, the spatial relationship between them, a translation and a rotation, can be determined. From the position of an auricular point in the ear segmentation map, its position in the coordinate system of the structured light module 110 can therefore be determined and then converted into the coordinate system of the projection module 120, from which the projection direction angle of the projected light is obtained.
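A minimal sketch of this coordinate conversion, assuming the relation between the structured light frame and the projector frame is a calibrated rigid transform (rotation R, translation t) and that the projection direction angle is expressed as yaw and pitch of the projected ray. The calibration values and function name are placeholders, not values from the disclosure.

```python
import numpy as np

def projection_angles(p_structured_light, R, t):
    """Convert an acupoint position to projector coordinates and steering angles.

    p_structured_light : (3,) acupoint position in the structured-light frame (metres)
    R, t               : rotation (3, 3) and translation (3,) from the structured-light
                         frame to the projector frame, obtained by calibration
    Returns (yaw, pitch) in degrees for aiming the guide light, plus the point
    expressed in the projector frame.
    """
    p_proj = R @ p_structured_light + t
    yaw = np.degrees(np.arctan2(p_proj[0], p_proj[2]))    # rotation about the vertical axis
    pitch = np.degrees(np.arctan2(p_proj[1], p_proj[2]))  # elevation of the projected ray
    return yaw, pitch, p_proj

# Placeholder calibration: projector 5 cm to the side of the camera, same orientation.
R = np.eye(3)
t = np.array([-0.05, 0.0, 0.0])
acupoint = np.array([0.02, 0.01, 0.35])   # roughly 0.35 m in front of the structured light module
print(projection_angles(acupoint, R, t))
```

In practice R and t would come from a one-time extrinsic calibration between the structured light module 110 and the projection module 120.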
As shown in fig. 4, a second aspect of the present invention provides an auricular point guiding method S100, which is implemented based on the auricular point guiding device 100 provided in the first aspect of the present invention, and the specific structure of the auricular point guiding device 100 is described above and will not be described herein again. The auricular point guiding method S100 includes:
step S110, acquiring an ear map of the subject to be acquired;
step S120, calculating auricular point positions from the received ear map, determining the projection direction angle of the projected light from the auricular point positions, and projecting guide light onto the auricular point positions.
By acquiring the ear map of the subject, calculating the auricular point positions from the received ear map, determining the projection direction angle of the projected light from those positions, and projecting guide light onto them, the method improves the accuracy and visualization of auricular point guidance and facilitates auricular point treatment, diagnosis, and teaching.
Specifically, as shown in fig. 5, in step S110, acquiring the ear map of the subject to be acquired includes:
step S111, projecting light spots onto the ear;
step S112, capturing the dot pattern on the ear to obtain an ear structure map;
step S113, capturing an ear image.
Specifically, as shown in fig. 6, calculating the auricular point positions from the received ear map, determining the projection direction angle of the projected light from the auricular point positions, and projecting the guide light onto the auricular point positions includes:
step S121, passing the ear image through a preset region localization network, removing interfering regions around the ear, and obtaining an ear segmentation map;
step S122, matching the ear segmentation map with the ear structure map to obtain an ear segmentation structure map;
step S123, inputting the ear segmentation structure map into a pre-stored auricular point recognition model and comparing its feature information with the feature information of the recognition model to determine the auricular point positions;
step S124, determining the projection direction angle of the projected light from the auricular point positions.
Extracting ear feature information by combining the ear structure map with the ear image improves recognition efficiency and enables accurate localization and guidance of the auricular points.
It will be understood that the above embodiments are merely exemplary embodiments taken to illustrate the principles of the present invention, which is not limited thereto. It will be apparent to those skilled in the art that various modifications and improvements can be made without departing from the spirit and substance of the invention, and these modifications and improvements are also considered to be within the scope of the invention.

Claims (10)

1. An auricular point guiding device, comprising a structured light module, a projection module, and a processing unit, wherein the structured light module and the projection module are each connected to the processing unit;
the structured light module is used for acquiring an ear map of the subject to be acquired and sending the ear map to the processing unit;
the processing unit is used for calculating auricular point positions from the received ear map and determining the projection direction angle of the projection module from the auricular point positions, so that the projection module projects guide light onto the auricular point positions.
2. The auricular point guiding device according to claim 1, wherein the ear map comprises an ear structure map and an ear image, and the structured light module comprises a dot matrix projector, an infrared lens, and a CCD camera arranged in sequence, wherein acquiring the ear map of the subject to be acquired comprises:
the dot matrix projector is used for projecting light spots onto the ear;
the infrared lens is used for capturing the dot pattern on the ear to obtain the ear structure map;
the CCD camera is used for capturing the ear image.
3. The auricular point guiding device according to claim 2, wherein the processing unit comprises a segmentation subunit, a matching subunit, a recognition subunit, and a control subunit, wherein:
the segmentation subunit is configured to pass the ear image through a preset region localization network, remove interfering regions around the ear, and obtain an ear segmentation map;
the matching subunit is configured to match the ear segmentation map with the ear structure map to obtain an ear segmentation structure map;
the recognition subunit is configured to input the ear segmentation structure map into a pre-stored auricular point recognition model and compare the feature information of the ear segmentation structure map with the feature information of the recognition model to determine the auricular point positions;
the control subunit is configured to determine the projection direction angle of the projection module from the auricular point positions.
4. The auricular point guiding device according to claim 3, wherein the region localization network comprises a convolution structure, a standard box layer, and a regression classification layer, wherein:
the convolution structure is used for network training on ear images;
the standard box layer is used for storing the standard ear regions selected by the convolution structure during network training;
the regression classification layer is used for computing, with a cross-entropy loss, the probability that each candidate box is the target box and for producing the ear segmentation map by selecting the candidate box with the highest probability.
5. The auricular point guiding device according to claim 3, wherein the ear image and the ear structure map have the same image size, and in the matching subunit the ear segmentation map is mapped onto the ear structure map, the corresponding region being the ear segmentation structure map.
6. The auricular point guiding device according to claim 3, wherein the auricular point recognition model comprises a feature extraction module, a feature module, and a recognition module, wherein:
the feature extraction module is used for extracting feature information from the ear segmentation structure map;
the feature module pre-stores the standard auricular point feature information of the recognition model;
the recognition module uses a softmax classifier to compare the feature information of the ear segmentation structure map with the standard auricular point feature information and determine the auricular point positions.
7. The auricular point guiding device according to claim 3, wherein the processing unit further comprises a calculation subunit configured to:
establish a first spatial coordinate system for the structured light module and a second spatial coordinate system for the projection module;
calculate the transformation between the first spatial coordinate system and the second spatial coordinate system;
calculate first position coordinates of an auricular point in the first spatial coordinate system from its position in the ear segmentation map;
convert the first position coordinates into second position coordinates in the second spatial coordinate system using the transformation;
and determine the projection direction angle from the second position coordinates.
8. An auricular point guiding method, comprising:
acquiring an ear map of the subject to be acquired;
calculating auricular point positions from the received ear map, determining the projection direction angle of the projected light from the auricular point positions, and projecting guide light onto the auricular point positions.
9. The auricular point guiding method according to claim 8, wherein acquiring the ear map of the subject to be acquired comprises:
projecting light spots onto the ear;
capturing the dot pattern on the ear to obtain an ear structure map;
capturing an ear image.
10. The auricular point guiding method according to claim 9, wherein calculating the auricular point positions from the received ear map, determining the projection direction angle of the projected light from the auricular point positions, and projecting the guide light onto the auricular point positions comprises:
passing the ear image through a preset region localization network, removing interfering regions around the ear, and obtaining an ear segmentation map;
matching the ear segmentation map with the ear structure map to obtain an ear segmentation structure map;
inputting the ear segmentation structure map into a pre-stored auricular point recognition model and comparing its feature information with the feature information of the recognition model to determine the auricular point positions;
and determining the projection direction angle of the projected light from the auricular point positions.
CN201911353011.7A 2019-12-25 2019-12-25 Auricular point guiding device and method Pending CN110897865A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911353011.7A CN110897865A (en) 2019-12-25 2019-12-25 Auricular point guiding device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911353011.7A CN110897865A (en) 2019-12-25 2019-12-25 Auricular point guiding device and method

Publications (1)

Publication Number Publication Date
CN110897865A true CN110897865A (en) 2020-03-24

Family

ID=69827528

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911353011.7A Pending CN110897865A (en) 2019-12-25 2019-12-25 Auricular point guiding device and method

Country Status (1)

Country Link
CN (1) CN110897865A (en)



Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102324041A (en) * 2011-09-09 2012-01-18 深圳泰山在线科技有限公司 Pixel classification method, joint body gesture recognition method and mouse instruction generating method
CN102854192A (en) * 2012-08-22 2013-01-02 北京农业智能装备技术研究中心 System and method for detecting apple surface defect
JP2016052517A (en) * 2014-09-03 2016-04-14 和明 戸澤 Ear pressure point detection terminal device, ear pressure point diet management system and ear pressure point diet management method
CN104800068A (en) * 2015-05-22 2015-07-29 京东方科技集团股份有限公司 Device and system used for remotely determining acupuncture point
CN106683134A (en) * 2017-01-25 2017-05-17 触景无限科技(北京)有限公司 Table lamp image calibration method and device
CN107423698A (en) * 2017-07-14 2017-12-01 华中科技大学 A kind of gesture method of estimation based on convolutional neural networks in parallel
CN108938396A (en) * 2017-09-26 2018-12-07 炬大科技有限公司 A kind of ear acupuncture point identification device and its method based on deep learning
CN109886062A (en) * 2017-12-06 2019-06-14 东北林业大学 A kind of camellia oleifera fruit flower identification positioning system
CN108154176A (en) * 2017-12-22 2018-06-12 北京工业大学 A kind of 3D human body attitude algorithm for estimating for single depth image
CN109635783A (en) * 2019-01-02 2019-04-16 上海数迹智能科技有限公司 Video monitoring method, device, terminal and medium
CN110232318A (en) * 2019-05-06 2019-09-13 平安科技(深圳)有限公司 Acupuncture point recognition methods, device, electronic equipment and storage medium
CN110232326A (en) * 2019-05-20 2019-09-13 平安科技(深圳)有限公司 A kind of D object recognition method, device and storage medium
CN110264416A (en) * 2019-05-28 2019-09-20 深圳大学 Sparse point cloud segmentation method and device
CN110464633A (en) * 2019-06-17 2019-11-19 深圳壹账通智能科技有限公司 Acupuncture point recognition methods, device, equipment and storage medium
CN110232383A (en) * 2019-06-18 2019-09-13 湖南省华芯医疗器械有限公司 A kind of lesion image recognition methods and lesion image identifying system based on deep learning model
CN110443205A (en) * 2019-08-07 2019-11-12 北京华捷艾米科技有限公司 A kind of hand images dividing method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨志尧 (Yang Zhiyao): "Research on image semantic segmentation technology based on region proposal networks", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114099322A (en) * 2021-12-06 2022-03-01 贵州中医药大学第一附属医院 Method for conveniently positioning auricular points
CN114099322B (en) * 2021-12-06 2023-05-26 贵州中医药大学第一附属医院 Method for conveniently positioning auricular points


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200324)