CN215181889U - Apparatus for providing real-time visualization service using three-dimensional facial and body scan data

Info

Publication number
CN215181889U
Authority
CN
China
Prior art keywords
data
face
unit
glasses
providing real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202121555828.5U
Other languages
Chinese (zh)
Inventor
黃湧智
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202121555828.5U priority Critical patent/CN215181889U/en
Application granted granted Critical
Publication of CN215181889U publication Critical patent/CN215181889U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The utility model relates to an apparatus for providing a real-time visualization service using three-dimensional facial and body scan data, comprising: a network; at least one user terminal providing a user interface for user operation; a server for providing the real-time visualization service; at least one 3D scanner for acquiring 3D facial or body scan data of a patient; and at least one pair of AR (augmented reality) glasses that captures the wearer's view, displays received visualization data, and can overlay that data on the captured view. The apparatus thereby improves the accuracy of treatment or surgery and offers greater flexibility and convenience in use.

Description

Apparatus for providing real-time visualization service using three-dimensional facial and body scan data
Technical Field
The utility model relates to an apparatus for providing a real-time visualization service, and in particular to an apparatus that provides such a service using three-dimensional facial and body scan data.
Background
Image registration techniques are increasingly used in planning the treatment of various lesions. In short, they express medical images of different modalities in a single coordinate system, and they also serve other purposes such as motion correction, distortion correction, and the creation of multi-functional maps and image arrangements. Matching techniques are likewise used to align coordinate systems between images, and between images and automatic control devices. Technically, these methods have evolved from point-pair approaches that rely on anatomical landmarks or artificial markers toward automatic, image-based optimization methods.
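As an illustrative aside (not part of the claimed apparatus), the point-pair registration mentioned above can be sketched with the classic Kabsch algorithm, which recovers the rigid rotation and translation that best align two corresponding landmark sets:

```python
import numpy as np

def kabsch_align(src, dst):
    """Rigid point-pair registration: find rotation R and translation t
    that best map the source landmarks onto the destination landmarks
    in the least-squares sense."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_c = src - src.mean(axis=0)          # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Given three or more non-collinear corresponding landmarks, `kabsch_align` recovers the exact rigid transform; this is the mathematical core behind the anatomical-pointer methods the background describes.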
However, in the prior art, when an image of an area is captured, a camera or video camera maps the three-dimensional spatial data of that area into a two-dimensional image space to form a corresponding two-dimensional image, and face detection is then performed against the illumination information, viewpoint information, and so on contained in that two-dimensional image.
Because this imaging process maps a three-dimensional physical space into a two-dimensional image space, part of the information about the three-dimensional object is lost. The two-dimensional image therefore represents the three-dimensional object only imperfectly, and it is easily affected by changes in scale, viewpoint, and illumination, all of which reduce detection accuracy.
SUMMARY OF THE UTILITY MODEL
In view of the above shortcomings of the prior art, the utility model provides an apparatus for providing a real-time visualization service using three-dimensional facial and body scan data, comprising: a network, and at least one user terminal, a server, at least one 3D scanner, and at least one pair of AR glasses connected to one another through the network;
the user terminal is provided with a user interface for user operation;
the server is used for providing real-time visual service and comprises a collecting unit, a merging unit, a transmission unit, a tracking unit, an output unit, an extraction unit and a storage unit;
the 3D scanner is used for acquiring 3D face or body scanning data of a patient;
the AR glasses are worn on the head of a human body, provide captured and seen pictures and display received visual data, and can overlap the received visual data on the seen pictures.
Preferably, the network is a wired network or a wireless network.
Preferably, the user terminal is a desktop computer, a notebook computer, a tablet computer or a smart phone.
Preferably, the collection unit is configured to collect 3D face or body data by 3D-scanning the face or body of a patient; the merging unit merges at least one item of visualization data onto the 3D face data by overlay; the transmission unit transmits 3D information corresponding to the at least one item of visualization data to the AR glasses; the tracking unit performs AR face tracking when at least one facial feature point is extracted from the AR glasses; the output unit outputs the at least one item of visualization data to the AR glasses in correspondence with the at least one feature point; the extraction unit obtains the 3D face data collected when the collection unit scans the patient's face; and the storage unit stores images captured by and output to the AR glasses.
A main purpose of the apparatus of the utility model is to use three-dimensional facial or body scan data together with AR glasses so that visualization data is overlaid on the wearer's view without introducing delay, thereby improving the accuracy of treatment or surgery while offering greater flexibility and convenience in use.
Drawings
Fig. 1 is a schematic diagram of the architecture of the present invention.
Fig. 2 is a block diagram of the server according to the present invention.
Fig. 3 is a block flow chart of the present invention.
Fig. 4 is a schematic diagram of a 3D scanner of the present invention scanning a patient.
Fig. 5 is a schematic diagram of another 3D scanner of the present invention scanning a patient.
Fig. 6 is a schematic diagram of the present invention showing a patient being scanned by a 3D scanner.
Fig. 7 is a schematic view of the present invention for inputting information required for surgery.
Fig. 8 is a schematic diagram of another embodiment of the present invention for inputting information required for surgery.
Fig. 9 is a schematic diagram of information required for an overlay operation of the present invention.
Fig. 10 is a schematic diagram of information required for another overlay operation of the present invention.
Fig. 11 is a schematic diagram of information required for another overlay operation of the present invention.
Fig. 12 is a schematic diagram of the use of the visual data transmission to the AR glasses according to the present invention.
Fig. 13 is another schematic usage diagram of the present invention for transmitting visual data to the AR glasses.
Fig. 14 is a schematic diagram of the visual data transmission to the AR glasses according to the present invention.
Fig. 15 is a flowchart of an embodiment of the present invention.
Reference numerals
10 network
20 user terminal
30 server
31 collecting unit
32 merging unit
33 Transmission Unit
34 tracking unit
35 output unit
36 extraction unit
37 storage unit
40 3D scanner
50 AR glasses
S101 3D face scan
S102 3D face data processing
S103 3D data transmission
S104 AR face tracking
S105 AR glasses data visualization
S106 facial treatment and surgery
S107 data storage
S201 collect 3D face data by 3D-scanning the face of the person to be treated
S202 superimpose and merge at least one item of visualization data onto the 3D face data
S203 transmit the 3D data corresponding to the at least one item of visualization data to the AR glasses
S204 perform AR face tracking when at least one facial feature point is extracted from the AR glasses
S205 output the at least one item of visualization data to the AR glasses in correspondence with the at least one feature point
A1 anterior surgical image
A2 simulated postoperative facial image
B1 treatment or surgery information
C1 3D simulation data
C2 3D simulation data
C3 3D simulation data
D1 visualization data
E1 3D simulation data
Detailed Description
The following description is provided as an illustration of a preferred embodiment of the invention in order to enable those skilled in the art to practice it in light of the teachings herein.
First, please refer to fig. 1 and 2, the apparatus for providing real-time visual service using three-dimensional face and body scan data of the present invention includes: a network 10, and at least one user terminal 20, a server 30, at least one 3D scanner 40, and at least one AR glasses 50 connected to each other through the network 10.
The network 10 may be a wired network or a wireless network.
The user terminal 20 is provided with a user interface for user operation, and the user terminal may be a desktop computer, a notebook computer, a tablet computer, a smart phone, and the like.
The server 30 is used for providing real-time visualization services, and the server 30 includes a collecting unit 31, a merging unit 32, a transmitting unit 33, a tracking unit 34, an output unit 35, an extracting unit 36, and a storage unit 37.
The 3D scanner 40 is used to acquire 3D face or body scan data of a patient, and the 3D scanner 40 may be a portable 3D scanner as shown in fig. 4, a handheld 3D scanner as shown in fig. 5, or a stationary 3D scanner as shown in fig. 6.
The AR (augmented reality) glasses 50 are worn on the head, capture the wearer's view, display received visualization data, and can overlay the received visualization data on the captured view.
The collection unit 31 is configured to collect 3D face or body data by 3D scanning a face or body of a patient.
The merging unit 32 merges at least one item of visualization data onto the 3D facial data by overlay. The visualization data may be any one of, or a combination of, imaging data such as X-ray, CT, PET, MRI, and dental data; patient information content; labels marking treatment or surgical sites; simulation data generated by running a 3D simulation procedure; and anatomical data.
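A minimal sketch of how such a merging unit might layer visualization data onto a scan, assuming hypothetical `VisualizationLayer` and `MergedFaceModel` types (these names and fields are illustrative, not from the patent):

```python
from dataclasses import dataclass, field

@dataclass
class VisualizationLayer:
    name: str       # e.g. "CT", "MRI", "surgical-site label"
    payload: bytes  # opaque imaging or annotation data
    opacity: float = 1.0

@dataclass
class MergedFaceModel:
    scan_id: str
    layers: list = field(default_factory=list)

    def overlay(self, layer: VisualizationLayer):
        """Superimpose one visualization layer onto the 3D face data."""
        if not 0.0 <= layer.opacity <= 1.0:
            raise ValueError("opacity must be in [0, 1]")
        self.layers.append(layer)
        return self  # allow chained overlays
```

Chaining `overlay` calls mirrors the merging unit combining any number of imaging, labeling, simulation, and anatomical layers on one scan.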
The transmission unit 33 may transmit 3D information corresponding to at least one visualization data to the AR glasses 50, the 3D information transmission being operated by the user terminal 20.
The tracking unit 34 may perform AR face tracking when extracting at least one face feature point from the AR glasses 50.
The output unit 35 outputs at least one item of visualization data to the AR glasses 50 in correspondence with at least one feature point. When doing so, the output unit 35 combines the visualization data according to the feature points so that it can be displayed on the AR glasses 50.
The extraction unit 36 obtains the 3D face data that the collection unit 31 produces by scanning the patient's face and then extracts at least one feature point from that data. Feature points extracted in this way serve the same function as AR markers, even though the AR glasses 50 described above carry no physical AR marker.
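A toy illustration of markerless feature-point extraction from a 3D face point cloud; the geometric-extrema heuristic and the function name are assumptions for illustration only, not the patent's actual method:

```python
import numpy as np

def extract_feature_points(points):
    """Toy markerless feature extraction from a 3D face point cloud:
    take geometric extrema (e.g. the most protruding point as a
    nose-tip proxy) as stable anchors that can play the role of
    AR markers for subsequent tracking."""
    pts = np.asarray(points, float)
    return {
        "nose_tip_proxy": pts[pts[:, 2].argmax()],  # most protruding point
        "left_extreme":   pts[pts[:, 0].argmin()],  # leftmost point
        "right_extreme":  pts[pts[:, 0].argmax()],  # rightmost point
    }
```

A production system would instead use a trained facial-landmark detector, but the role of the output is the same: a small set of repeatable anchor coordinates.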
The storage unit 37 stores images captured by the AR glasses 50 and images output to them in correspondence with the at least one feature point from the output unit 35. An image captured by the AR glasses 50 may be a procedure or surgery video, and an image output to the AR glasses 50 may be the at least one item of visualization data.
When the utility model provides the real-time visualization service, as shown in figs. 3 to 14, it comprises the following steps: 3D face scanning S101, 3D face data processing S102, 3D data transmission S103, AR face tracking S104, AR glasses data visualization S105, facial treatment and surgery S106, and data storage S107. The 3D face scan data of the person to be treated is captured before the initial operation, feature points are extracted and regions are divided, and the post-operative appearance is then predicted by simulating the manipulation of the control feature points.
The simulation result data is stored as a reference value for comparison against 3D face scan data taken during or after the patient's operation, and is displayed on the AR glasses 50 as 3D data. During face tracking, the AR glasses 50 overlay the prescribed treatment region, degree of treatment, and the like on the face; in this way, internal structures that are not clearly visible on the outer surface can be shown, or the display can be oriented at an angle according to the prescribed plan.
In the conventional approach, when an important nerve or blood vessel runs beneath the surface, the practitioner typically locates and avoids it by palpation. According to the embodiment of the present invention, the corresponding visualization data is displayed directly on the patient's face, so such structures can be avoided without touching them, preventing serious damage that could cause lasting impairment. The practitioner or surgeon can thus operate with great care, and the patient can confirm the agreed items from objective results rather than from verbal description or subjective judgment.
As shown in figs. 4 to 6, the real-time visualization service providing server 30 receives 3D face scan data through at least one 3D scanner 40. It displays a comparison of the pre-operative face image A1 with the simulated post-operative face image A2 as shown in fig. 7, superimposes a face image marked with treatment or operation information B1 as shown in fig. 8, and superimposes the 3D simulation data C1, C2, C3 as shown in figs. 9 to 11.
As shown in fig. 12, the overlaid 3D data, i.e., the visualization data, is transmitted to the AR glasses 50 and AR face tracking begins. The AR glasses 50 receive information on at least one facial feature point from the 3D scanner 40 connected to them; upon extraction, the pose is estimated, and the corresponding visualization data D1 is rotated or transformed according to the viewing angle and then output. At this point, the AR glasses 50 merge the 3D data by matching the visualization data D1 to the facial feature-point data.
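The rotate-by-estimated-angle step can be illustrated as follows, assuming the pose reduces to a head yaw plus a translation (a simplification for illustration; a real system would estimate a full six-degree-of-freedom pose):

```python
import numpy as np

def pose_transform(points, yaw_rad, translation):
    """Rotate visualization geometry about the vertical (y) axis by the
    estimated head yaw, then translate it onto the tracked face."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R = np.array([[  c, 0.0,   s],   # rotation about the y axis
                  [0.0, 1.0, 0.0],
                  [ -s, 0.0,   c]])
    return np.asarray(points, float) @ R.T + np.asarray(translation, float)
```

Applying `pose_transform` to every vertex of the visualization data D1 keeps the overlay aligned with the face as the estimated angle changes from frame to frame.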
As shown in figs. 13 and 14, an operation can be performed while checking the data it requires in real time through the AR glasses 50. In dental treatment, the operation can proceed while viewing 3D dental data through the AR glasses 50; in filler treatment, the 3D simulation data E1 can be viewed in real time; and in botulinum toxin (Botox) treatment, anatomical data can be viewed in real time. The data output to and captured by the AR glasses 50 is stored in the server 30, and the procedure and operation video is recorded and stored.
When the utility model is implemented, as shown in fig. 15, it comprises the following steps: collecting 3D face data by 3D-scanning the face of the person to be treated S201; superimposing and merging at least one item of visualization data onto the 3D face data S202; transmitting the 3D data corresponding to the at least one item of visualization data to the AR glasses S203; performing AR face tracking when at least one facial feature point is extracted from the AR glasses S204; and outputting the at least one item of visualization data to the AR glasses in correspondence with the at least one feature point S205.
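Steps S201 to S205 can be sketched as a single pipeline function over hypothetical duck-typed scanner, server, and glasses components (all names and method signatures are assumed for illustration):

```python
def run_visualization_pipeline(scanner, server, glasses):
    """Sketch of steps S201-S205. scanner/server/glasses are any objects
    providing the illustrative methods used below."""
    face_data = scanner.scan_face()                    # S201: collect 3D face data
    merged = server.merge(face_data, server.layers())  # S202: overlay visualization data
    glasses.receive(merged)                            # S203: transmit 3D data to AR glasses
    anchors = glasses.extract_feature_points()         # S204: AR face tracking on feature points
    return glasses.display(merged, anchors=anchors)    # S205: output visualization at anchors
```

Because each step only depends on the previous step's output, the same function works whether the components are local stubs or networked services.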
From the above specific embodiments, the following advantageous effects can be obtained:
the utility model discloses use three-dimensional face and health scan data to provide real-time visual service's equipment, it uses three-dimensional face or health scan data cooperation AR glasses 50, overlaps visual data in the picture of seeing, can not cause the delay to the data that AR glasses 50 output and shoot can be saved in server 30, and process and operation video can be shot and the storage, thereby improve the accuracy of treatment or operation, and have more use flexibility and convenience.

Claims (4)

1. An apparatus for providing real-time visualization services using three-dimensional facial and body scan data, comprising: the system comprises a network, and at least one user terminal, a server, at least one 3D scanner and at least one AR glasses which are mutually connected through the network;
the user terminal is provided with a user interface for user operation;
the server is used for providing real-time visual service and comprises a collecting unit, a merging unit, a transmission unit, a tracking unit, an output unit, an extraction unit and a storage unit;
the 3D scanner is used for acquiring 3D face or body scanning data of a patient;
the AR glasses are worn on the head of a human body, provide captured and seen pictures and display received visual data, and can overlap the received visual data on the seen pictures.
2. The apparatus for providing real-time visualization services using three-dimensional facial and body scan data as recited in claim 1, wherein the network is a wired network or a wireless network.
3. The apparatus for providing real-time visualization services using three-dimensional facial and body scan data according to claim 1, wherein the user terminal is a desktop computer, a laptop computer, a tablet computer or a smart phone.
4. The apparatus for providing real-time visualization services using three-dimensional facial and body scan data according to claim 1, characterized in that the collection unit is configured to collect 3D face or body data by 3D-scanning the face or body of a patient, the merging unit merges at least one item of visualization data onto the 3D face data by overlay, the transmission unit transmits 3D information corresponding to the at least one item of visualization data to the AR glasses, the tracking unit performs AR face tracking when at least one facial feature point is extracted from the AR glasses, the output unit outputs the at least one item of visualization data to the AR glasses in correspondence with the at least one feature point, the extraction unit obtains the 3D face data collected when the collection unit scans the patient's face, and the storage unit stores images captured by and output from the AR glasses.
CN202121555828.5U 2021-07-09 2021-07-09 Apparatus for providing real-time visualization service using three-dimensional facial and body scan data Active CN215181889U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202121555828.5U CN215181889U (en) 2021-07-09 2021-07-09 Apparatus for providing real-time visualization service using three-dimensional facial and body scan data

Publications (1)

Publication Number Publication Date
CN215181889U true CN215181889U (en) 2021-12-14

Family

ID=79396901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202121555828.5U Active CN215181889U (en) 2021-07-09 2021-07-09 Apparatus for providing real-time visualization service using three-dimensional facial and body scan data

Country Status (1)

Country Link
CN (1) CN215181889U (en)

Legal Events

Date Code Title Description
GR01 Patent grant