CN108074637A - Hospital guidance method and guidance system - Google Patents

Hospital guidance method and guidance system

Info

Publication number
CN108074637A
Authority
CN
China
Prior art keywords
human
model
human body
selection part
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610980293.3A
Other languages
Chinese (zh)
Inventor
杨书宾
张广钦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201610980293.3A
Publication of CN108074637A
Legal status: Pending

Landscapes

  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The present application discloses a hospital guidance method and a guidance system. The guidance method includes: displaying a three-dimensional (3D) human model that contains the individual parts of the human body; receiving a user's selection of a part of the 3D human model; and, if the selected part is a basic body unit, displaying disease information corresponding to the selected part, otherwise displaying information on the sub-parts contained in the selected part. According to the technical solutions provided by the embodiments of the present application, selecting body parts on a 3D human model solves the problem that body parts displayed in a two-dimensional plan view are not intuitive enough.

Description

Hospital guidance method and guidance system
Technical field
The present disclosure relates generally to the field of medical information processing, and more particularly to a hospital guidance method and a guidance system.
Background technology
Existing intelligent guidance systems are two-dimensional: the different parts of the human body are presented in plan views, so internal organs are neither intuitive to view nor easy to locate. In addition, the disease information associated with a selected region is extensive, which is inconvenient for the user; as a result, finding a particular body part takes a long time and is inefficient.
Summary of the invention
In view of the above drawbacks and deficiencies of the prior art, it is desirable to provide a hospital guidance method that is easy to use and efficient. To solve one or more of the above problems, the present application proposes a hospital guidance method and a guidance system.
In a first aspect, a hospital guidance method is provided, comprising:
displaying a 3D human model, the 3D human model containing the individual parts of the human body;
receiving a user's selection of a part of the 3D human model;
if the selected part is a basic body unit, displaying disease information corresponding to the selected part; otherwise, displaying information on the sub-parts contained in the selected part.
In a second aspect, a guidance system is provided, comprising:
a model display unit configured to display a 3D human model, the 3D human model containing the individual parts of the human body;
a receiving unit configured to receive a user's selection of a part of the 3D human model;
an information display unit configured to display, if the selected part is a basic body unit, disease information corresponding to the selected part, and otherwise to display information on the sub-parts contained in the selected part.
In a third aspect, a computer system is provided, comprising a processor and a memory, the memory containing instructions executable by the processor so that the processor performs the hospital guidance method proposed by the present application.
In a fourth aspect, a computer-readable storage medium storing a computer program is provided, wherein the computer program causes a computer to perform the hospital guidance method proposed by the present application.
According to the technical solutions provided by the embodiments of the present application, selecting body parts on a 3D human model solves the problem that body parts displayed in a two-dimensional plan view are not intuitive enough. Further, according to some embodiments of the present application, subdividing body parts also addresses the problem that a coarse body part is associated with too much disease information, yielding simple and effective guidance to body parts and diseases.
Description of the drawings
Other features, objects and advantages of the present application will become more apparent from the detailed description of non-limiting embodiments made with reference to the following drawings:
Fig. 1 shows an exemplary flowchart of a hospital guidance method according to an embodiment of the present application;
Fig. 2 shows a schematic structural diagram of a guidance system according to an embodiment of the present application;
Fig. 3 shows a schematic structural diagram of a computer system for implementing an embodiment of the present application.
Detailed description of embodiments
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the related invention, not to limit it. It should also be noted that, for convenience of description, only the parts relevant to the invention are shown in the drawings.
It should be noted that, provided they do not conflict, the embodiments of the present application and the features of those embodiments may be combined with one another. The present application is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Referring to Fig. 1, an exemplary flowchart of a hospital guidance method according to an embodiment of the present application is shown.
As shown in the figure, in step 101, a 3D human model is displayed; the 3D human model contains the individual parts of the human body.
The present application takes advantage of the fact that a three-dimensional stereoscopic model can present a large amount of information from multiple angles and in multiple dimensions: the human body is modeled as a three-dimensional stereoscopic model that contains each body part, which makes selecting a part more convenient and intuitive for the user.
Then, in step 102, the user's selection of a part of the 3D human model is received.
The user can select the part to be looked up on the 3D human model through human-computer interaction modes such as a mouse, body movement, or voice commands.
Preferably, the machine receives the user's various selection operations and, in response to a user operation on the 3D human model, presents a display effect associated with that operation, the operations including at least one of the following:
zooming, viewpoint movement, rotation, and local magnification.
The above operations are introduced below, taking an ordinary mouse as an example.
Zooming:
The user zooms the human model by rolling the mouse wheel, and the model displays a region according to the zoom ratio. For example, when the view shows the entire model, the head and the eyes, mouth, nose and lips are visible; when the model is zoomed in until only the head fills the view, the head model shows subdivided models of the mouth, nose and lips. Taking the mouth as an example, the organs inside the oral cavity, such as the tongue and the teeth, can be seen.
Viewpoint movement:
By dragging the mouse, the body part shown in the view can be moved. For example, if the current view shows the head, dragging the mouse upward brings the neck and chest into the view; when the heart model is shown, dragging to the left or right displays the lung model.
The viewpoint refers to the position of the user's eyes. By moving the viewpoint, the user can look at different parts from different angles; when the viewpoint enters the interior of the human model, the internal structure of the body becomes visible, producing a "see-through" effect.
When the viewpoint is in front of the head, it can be moved forward into the head model; inside the head model, the user can see finer parts such as brain tissue and the optic nerves.
Rotation:
For example, when the user's viewpoint faces the front of the human model, it is easy to select the eyes and the nose but harder to select an ear. By rotating the human model so that the viewpoint moves to the side of the head, selecting the ear becomes easier.
Local magnification:
After the mouse is moved onto a part, the outline of that part is displayed. The user can click the part with the left mouse button, and the view then shows a locally magnified model of the part. For example, when the user places the mouse on the abdomen, the human model highlights the outer contour of the abdomen in a distinct way; when the left button is clicked, the model is magnified and the view shows the abdomen, and because the model is enlarged, the more finely subdivided organ models of the abdomen become visible.
By flexibly combining operations such as zooming, viewpoint movement, rotation and local magnification, the user can quickly find the desired body part.
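As an informal illustration (not part of the disclosed embodiment), the mapping from mouse input to the display effects described above can be sketched as a small event dispatcher. The event names and view-state fields below are assumptions introduced here for clarity; a real viewer would drive an actual 3D rendering engine.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ViewState:
    """Hypothetical view parameters of the 3D human model (illustrative only)."""
    zoom: float = 1.0                # the zoom ratio decides how much detail is shown
    viewpoint: List[float] = field(default_factory=lambda: [0.0, 0.0, 5.0])
    rotation_deg: float = 0.0        # rotation of the model around its vertical axis

def handle_mouse_event(view: ViewState, event: dict) -> ViewState:
    """Map zooming, viewpoint movement, rotation and local magnification
    onto changes of the view state."""
    kind = event["kind"]
    if kind == "wheel":              # zooming with the mouse wheel
        view.zoom *= 1.1 if event["delta"] > 0 else 0.9
    elif kind == "drag":             # viewpoint movement by dragging
        view.viewpoint[0] += event["dx"]
        view.viewpoint[1] += event["dy"]
    elif kind == "rotate_drag":      # rotation, e.g. to reach the ear from the side
        view.rotation_deg = (view.rotation_deg + event["dx"]) % 360
    elif kind == "left_click":       # local magnification of the hovered part
        view.zoom *= 2.0
    return view

# Example: two wheel ticks zoom in far enough to switch to a finer sub-model.
view = handle_mouse_event(ViewState(), {"kind": "wheel", "delta": 1})
view = handle_mouse_event(view, {"kind": "wheel", "delta": 1})
```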
Then, in step 103, if the selected part is a basic body unit, disease information corresponding to the selected part is displayed; otherwise, information on the sub-parts contained in the selected part is displayed.
A basic body unit is a finely subdivided body part that corresponds directly to diseases. For example, the abdomen is a coarsely divided body part; below it are the intestines, the stomach and so on, and the intestines can in turn be subdivided into the duodenum and so on. The duodenum is therefore a basic body unit, and the diseases it involves include duodenal ulcer, duodenal tumors and the like.
As mentioned for step 102, the 3D human model makes it possible to display ever finer body parts. The amount of disease information associated with each finer part is greatly reduced, which shortens the time the user spends looking up diseases and achieves the goal of efficient guidance.
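As a minimal sketch of this idea, assuming a simple tree representation of body parts (the class and function names are our own illustration, not the patent's terminology), the branch in step 103 and the abdomen, intestines and duodenum example above can be expressed as follows.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class BodyPart:
    name: str
    sub_parts: List["BodyPart"] = field(default_factory=list)
    diseases: List[str] = field(default_factory=list)

    def is_basic_unit(self) -> bool:
        # A basic body unit has no further subdivision and maps directly to diseases.
        return not self.sub_parts

def on_part_selected(part: BodyPart) -> Dict[str, object]:
    """Step 103: show diseases for a basic unit, otherwise show its sub-parts."""
    if part.is_basic_unit():
        return {"show": "diseases", "items": part.diseases}
    return {"show": "sub_parts", "items": [p.name for p in part.sub_parts]}

# The example hierarchy from the description: abdomen, then intestines, then duodenum.
duodenum = BodyPart("duodenum", diseases=["duodenal ulcer", "duodenal tumor"])
intestines = BodyPart("intestines", sub_parts=[duodenum])
abdomen = BodyPart("abdomen", sub_parts=[intestines, BodyPart("stomach")])

assert on_part_selected(abdomen)["items"] == ["intestines", "stomach"]
assert on_part_selected(duodenum)["items"] == ["duodenal ulcer", "duodenal tumor"]
```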
Preferably, displaying information on the sub-parts contained in the selected part includes at least one of the following:
displaying a magnified 3D model of the selected part, the magnified 3D model containing each sub-part of the selected part;
displaying a menu associated with the selected part, the menu containing the name of each sub-part of the selected part.
The two modes above can be combined to display the sub-parts contained in the selected part, or either one can be used on its own.
The following example illustrates a scenario in which the two are combined.
Clicking the right mouse button displays a menu related to the part. The menu shows the organs contained in the part and the diseases of the part. Clicking an organ menu item enters the model of that organ. For example, placing the mouse on the abdomen and clicking the right button displays menu items such as the intestines and the stomach; clicking the intestines menu item switches the view to the intestine model, where the user can select the parts contained in the intestines, such as the duodenum; clicking a disease name in a menu item selects that disease and starts the guidance process. If the organ is already fine enough, the menu no longer shows organ menu items for it; for example, after entering the intestine model, moving the mouse onto the duodenum selects it, and when the right-button menu is opened again, the menu options contain no organ menu items.
Preferably, the above menu contains disease information corresponding to at least some of the sub-parts.
The selected part may contain multiple sub-parts and will accordingly be associated with multiple pieces of disease information. The menu may show part of the disease information or all of it. When only part of it is shown, the displayed diseases may be the important diseases involving the selected part, so that the user can be guided quickly for important diseases.
It should be understood that the sub-part names and the disease information may be shown in separate sections of the menu, in a hierarchical (cascading) menu, or in other ways; the specific implementation is not limited.
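Reusing the hypothetical BodyPart structure from the earlier sketch, the right-button menu described above could be assembled as follows; the "important diseases" filter is represented here by a simple flag, which is an assumption rather than part of the disclosure.

```python
from typing import Dict, List

def build_context_menu(part: "BodyPart", important_only: bool = False) -> Dict[str, List[str]]:
    """Build the right-click menu: organ (sub-part) items plus disease items.
    A basic body unit gets no organ items, only its diseases."""
    organ_items = [] if part.is_basic_unit() else [p.name for p in part.sub_parts]
    diseases = list(part.diseases)
    if important_only:
        diseases = diseases[:3]      # e.g. keep only the most important diseases
    return {"organs": organ_items, "diseases": diseases}

# Right-clicking the abdomen lists the intestines and the stomach; right-clicking
# the duodenum lists only its diseases, matching the behaviour described above.
print(build_context_menu(abdomen))    # {'organs': ['intestines', 'stomach'], 'diseases': []}
print(build_context_menu(duodenum))   # {'organs': [], 'diseases': ['duodenal ulcer', 'duodenal tumor']}
```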
Preferably, the 3D human model of the present application includes models of different types, and a 3D human model of the corresponding type can be displayed based on user input information.
For example, before the 3D human model is displayed, the model type may be selected first.
Preferably, the user input information includes either of the following:
operation information by which the user selects a model type; and
user image information captured by an image sensor.
For example, the user may select the model type by entering an operation with a mouse or a touch screen; alternatively, a camera may capture an image of the user's face and identify the user's attributes so that the model type is selected automatically.
Preferably, the types of the 3D human model include a male type, a female type and a child type, each type being associated with the disease information specific to that type.
For example, when the image sensor recognizes the user as male, the male-type model is displayed; when the user is recognized as a child, the child-type model is displayed.
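A possible way to combine the two kinds of user input when choosing the model type is sketched below; classify_user_from_image is a stand-in for the camera-based recognition step and is not an API defined by the patent.

```python
from typing import Optional

MODEL_TYPES = ("male", "female", "child")

def classify_user_from_image(image_bytes: bytes) -> Optional[str]:
    """Stand-in for recognising the user from a captured image; a real system
    might call a face-attribute classifier here."""
    return None  # unknown in this sketch

def determine_model_type(selected_type: Optional[str] = None,
                         user_image: Optional[bytes] = None,
                         default: str = "male") -> str:
    """Prefer an explicit selection (mouse or touch screen); otherwise try to
    recognise the user from the image; otherwise fall back to a default."""
    if selected_type in MODEL_TYPES:
        return selected_type
    if user_image is not None:
        recognised = classify_user_from_image(user_image)
        if recognised in MODEL_TYPES:
            return recognised
    return default

# An explicit touch-screen choice takes precedence over image recognition.
assert determine_model_type(selected_type="child", user_image=b"...") == "child"
```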
The hospital guidance method of the present application can be applied in many fields, such as appointment and registration systems, disease teaching, disease explanation, and learning about the human body.
In another aspect, the present application also discloses a guidance system. Referring to Fig. 2, a schematic structural diagram of a guidance system according to an embodiment of the present application is shown. As shown in the figure, the guidance system 200 includes:
a model display unit 210 configured to display a 3D human model, the 3D human model containing the individual parts of the human body;
a receiving unit 220 configured to receive a user's selection of a part of the 3D human model;
an information display unit 230 configured to display, if the selected part is a basic body unit, disease information corresponding to the selected part, and otherwise to display information on the sub-parts contained in the selected part.
In practical applications, the above guidance system may be a guidance app, a plug-in provided by an online registration or online guidance service, or an intelligent guidance device with hardware such as a display screen and a processor; its form may vary and is not limited.
Preferably, the information display unit 230 displays the sub-parts contained in the selected part through at least one of the following:
displaying a magnified 3D model of the selected part, the magnified 3D model containing each sub-part of the selected part;
displaying a menu associated with the selected part, the menu containing the name of each sub-part of the selected part.
The two modes above can be combined to display the sub-parts contained in the selected part, or either one can be used on its own.
Preferably, the above menu contains disease information corresponding to at least some of the sub-parts.
The selected part may contain multiple sub-parts and will accordingly be associated with multiple pieces of disease information. The menu may show part of the disease information or all of it. When only part of it is shown, the displayed diseases may be the important diseases involving the selected part.
Preferably, the guidance system 200 further includes:
an operation response unit 240 configured to present, in response to a user operation on the 3D human model, a display effect associated with that operation, the operations including at least one of the following:
zooming, viewpoint movement, rotation, and local magnification.
By flexibly combining the above operations such as zooming, viewpoint movement, rotation and local magnification, the user can quickly find the desired body part.
It should be understood that the above operations are not limited to mouse input; they may also be performed by tapping a touch screen with a finger, or by moving a body part to control the cursor of a wearable device. The form of human-computer interaction is not limited.
Preferably, the 3D human model of the present application includes models of different types, and the guidance system 200 further includes:
a type determining unit 250 configured to determine, based on user input information, the 3D human model of the corresponding type for display.
Preferably, the user input information of the type determining unit 250 includes either of the following:
operation information by which the user selects a model type; and
user image information captured by an image sensor.
The model type may be selected through the user's own operation, or the user need only stand where the image sensor can capture his or her image and the selection is then made automatically.
Preferably, the types of the 3D human model include a male type, a female type and a child type, each type being associated with the disease information specific to that type.
For example, when the image sensor recognizes the user as female, the female-type model is displayed.
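For orientation, the cooperation of the units 210 to 250 can be pictured as a simple composition; the class and method names below are illustrative assumptions, not an API disclosed by the patent.

```python
class GuidanceSystem:
    """Illustrative composition of the units 210-250 described above."""

    def __init__(self, model_display, receiver, info_display,
                 operation_response, type_determiner):
        self.model_display = model_display            # unit 210: shows the 3D human model
        self.receiver = receiver                      # unit 220: receives part selections
        self.info_display = info_display              # unit 230: diseases or sub-parts
        self.operation_response = operation_response  # unit 240: zoom/rotate/magnify effects
        self.type_determiner = type_determiner        # unit 250: male/female/child model

    def run_once(self, user_input=None, user_image=None):
        """One interaction round: pick the model type, show the model,
        take a selection and display the matching information."""
        model_type = self.type_determiner.determine(user_input, user_image)
        self.model_display.show(model_type)
        selected_part = self.receiver.receive_selection()
        self.info_display.show_for(selected_part)
```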
Referring now to Fig. 3, a schematic structural diagram of a computer system 300 suitable for implementing the embodiments of the present application is shown.
As shown in Fig. 3, the computer system 300 includes a central processing unit (CPU) 301, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 302 or a program loaded from a storage portion 308 into a random access memory (RAM) 303. The RAM 303 also stores the various programs and data required for the operation of the system 300. The CPU 301, the ROM 302 and the RAM 303 are connected to one another through a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
The following components are connected to the I/O interface 305: an input portion 306 including a keyboard, a mouse and the like; an output portion 307 including a cathode ray tube (CRT) or liquid crystal display (LCD) and a loudspeaker; a storage portion 308 including a hard disk and the like; and a communication portion 309 including a network interface card such as a LAN card or a modem. The communication portion 309 performs communication processing over a network such as the Internet. A driver 310 is also connected to the I/O interface 305 as needed. A removable medium 311, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the driver 310 as needed, so that a computer program read from it can be installed into the storage portion 308 as needed.
In particular, according to an embodiment of the present disclosure, the method described above with reference to Fig. 1 may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for performing the method of Fig. 1. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 309 and/or installed from the removable medium 311.
The flowcharts and block diagrams in the drawings illustrate the architectures, functions and operations that may be implemented by the systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in a flowchart or block diagram may represent a module, a program segment or a portion of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that shown in the drawings; for example, two blocks shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of such blocks, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The above description is merely a description of the preferred embodiments of the present application and of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the particular combination of the above technical features; it also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example technical solutions in which the above features are replaced by (but not limited to) technical features with similar functions disclosed in the present application.

Claims (14)

1. A hospital guidance method, comprising:
displaying a 3D human model, the 3D human model containing the individual parts of the human body;
receiving a user's selection of a part of the 3D human model;
if the selected part is a basic body unit, displaying disease information corresponding to the selected part; otherwise, displaying information on the sub-parts contained in the selected part.
2. The hospital guidance method according to claim 1, wherein displaying information on the sub-parts contained in the selected part includes at least one of the following:
displaying a magnified 3D model of the selected part, the magnified 3D model containing each sub-part of the selected part;
displaying a menu associated with the selected part, the menu containing the name of each sub-part of the selected part.
3. The hospital guidance method according to claim 2, wherein the menu contains disease information corresponding to at least some of the sub-parts.
4. The hospital guidance method according to claim 1, wherein the method further comprises: in response to a user operation on the 3D human model, presenting a display effect associated with the operation, the operation including at least one of the following:
zooming, viewpoint movement, rotation, and local magnification.
5. The hospital guidance method according to claim 1, wherein the 3D human model includes models of different types, and the method further comprises:
displaying a 3D human model of the corresponding type based on user input information.
6. The hospital guidance method according to claim 5, wherein the user input information includes either of the following:
operation information by which the user selects a model type; and
user image information captured by an image sensor.
7. The hospital guidance method according to claim 5, wherein the types of the 3D human model include a male type, a female type and a child type, each type being associated with the disease information specific to that type.
8. A guidance system, comprising:
a model display unit configured to display a 3D human model, the 3D human model containing the individual parts of the human body;
a receiving unit configured to receive a user's selection of a part of the 3D human model;
an information display unit configured to display, if the selected part is a basic body unit, disease information corresponding to the selected part, and otherwise to display information on the sub-parts contained in the selected part.
9. The guidance system according to claim 8, wherein the information display unit displays the sub-parts contained in the selected part through at least one of the following:
displaying a magnified 3D model of the selected part, the magnified 3D model containing each sub-part of the selected part;
displaying a menu associated with the selected part, the menu containing the name of each sub-part of the selected part.
10. The guidance system according to claim 9, wherein the menu contains disease information corresponding to at least some of the sub-parts.
11. The guidance system according to claim 8, wherein the system further comprises:
an operation response unit configured to present, in response to a user operation on the 3D human model, a display effect associated with the operation, the operation including at least one of the following:
zooming, viewpoint movement, rotation, and local magnification.
12. The guidance system according to claim 8, wherein the 3D human model includes models of different types, and the system further comprises:
a type determining unit configured to determine, based on user input information, a 3D human model of the corresponding type for display.
13. The guidance system according to claim 12, wherein the user input information includes either of the following:
operation information by which the user selects a model type; and
user image information captured by an image sensor.
14. The guidance system according to claim 12, wherein the types of the 3D human model include a male type, a female type and a child type, each type being associated with the disease information specific to that type.
CN201610980293.3A (filed 2016-11-08, priority 2016-11-08) Hospital guidance method and guidance system, status Pending, published as CN108074637A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610980293.3A CN108074637A (en) 2016-11-08 2016-11-08 Hospital guidance method and guidance system

Publications (1)

Publication Number Publication Date
CN108074637A true CN108074637A (en) 2018-05-25

Family

ID=62153901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610980293.3A Pending CN108074637A (en) 2016-11-08 2016-11-08 Hospital guidance method and guidance system

Country Status (1)

Country Link
CN (1) CN108074637A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110214055A1 (en) * 2010-02-26 2011-09-01 General Electric Company Systems and Methods for Using Structured Libraries of Gestures on Multi-Touch Clinical Systems
CN103793611A (en) * 2014-02-18 2014-05-14 中国科学院上海技术物理研究所 Medical information visualization method and device
CN104239735A (en) * 2014-09-24 2014-12-24 上海华博信息服务有限公司 Self-service hospital pre-checking device
CN104881577A (en) * 2015-05-16 2015-09-02 深圳市前海安测信息技术有限公司 Autodiagnosis method and system based on human body parts in network hospital
CN104834824A (en) * 2015-05-20 2015-08-12 上海简医信息科技有限公司 Mobile terminal visual hospital guide system and method based on anthropometric dummy

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111028921A (en) * 2019-12-09 2020-04-17 中国科学院上海技术物理研究所 Visualization method and system for patient information
CN111241322A (en) * 2020-01-17 2020-06-05 上海忠秐信息科技有限公司 Patient body part clinical marking method based on human body diagram, terminal and medium
CN111494187A (en) * 2020-05-18 2020-08-07 四川大学华西医院 Feedback type shock wave treatment system
CN111933254A (en) * 2020-05-21 2020-11-13 淮阴工学院 Visual automatic registration and diagnosis assisting system
CN112734903A (en) * 2020-12-24 2021-04-30 深圳市智连众康科技有限公司 3D partition alopecia model display method and device and computer readable storage medium
CN112734903B (en) * 2020-12-24 2022-07-12 深圳市智连众康科技有限公司 3D partition alopecia model display method and device and computer readable storage medium
CN113257411A (en) * 2021-06-11 2021-08-13 成都安易迅科技有限公司 Self-service medical registration interaction method and device, storage medium and self-service registration machine

Similar Documents

Publication Publication Date Title
CN108074637A (en) Hospital guidance method and guidance system
EP3824370B1 (en) Selectively alerting users of real objects in a virtual environment
TWI754195B (en) Image processing method and device, electronic device and computer-readable storage medium
CN105637564B (en) Generate the Augmented Reality content of unknown object
AU2017254848B2 (en) Image matting using deep learning
US20220378510A1 (en) System and method for multi-client deployment of augmented reality instrument tracking
WO2020010979A1 (en) Method and apparatus for training model for recognizing key points of hand, and method and apparatus for recognizing key points of hand
US9916002B2 (en) Social applications for augmented reality technologies
CN108885794A (en) The virtually trying clothes on the realistic human body model of user
CN111862333B (en) Content processing method and device based on augmented reality, terminal equipment and storage medium
CN110070556A (en) Use the structural modeling of depth transducer
CN101916333A (en) Transesophageal echocardiography visual simulation system and method
US20190045270A1 (en) Intelligent Chatting on Digital Communication Network
US11931166B2 (en) System and method of determining an accurate enhanced Lund and Browder chart and total body surface area burn score
CN109815854A (en) It is a kind of for the method and apparatus of the related information of icon to be presented on a user device
KR20210130953A (en) Method and system for creating virtual image based deep-learning
Amara et al. HOLOTumour: 6DoF Phantom Head pose estimation based deep learning and brain tumour segmentation for AR visualisation and interaction
WO2019098872A1 (en) Method for displaying a three-dimensional face of an object, and device for same
LIU et al. A preliminary study of kinect-based real-time hand gesture interaction systems for touchless visualizations of hepatic structures in surgery
Xie et al. Structure-consistent customized virtual mannequin reconstruction from 3D scans based on optimization
Yang et al. 3D character recognition using binocular camera for medical assist
Li et al. Design of a multi-sensor information acquisition system for mannequin reconstruction and human body size measurement under clothes
CN116580398A (en) Dynamic layout optimization of annotation tags in volume rendering
Ogiela et al. Natural user interfaces for exploring and modeling medical images and defining gesture description technology
CN107369209A (en) A kind of data processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20180525)