CN113961122A - Human body model display method and device - Google Patents

Human body model display method and device

Info

Publication number
CN113961122A
CN113961122A (application CN202111297740.2A)
Authority
CN
China
Prior art keywords
human body
display
submodel
determining
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111297740.2A
Other languages
Chinese (zh)
Other versions
CN113961122B (en)
Inventor
崔燕
陈联忠
胡可云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jiahesen Health Technology Co ltd
Original Assignee
Beijing Jiahesen Health Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jiahesen Health Technology Co., Ltd.
Priority to CN202111297740.2A
Publication of CN113961122A
Application granted
Publication of CN113961122B
Legal status: Active
Anticipated expiration

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1407 General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2004 Aligning objects, relative positioning of parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2016 Rotation, translation, scaling
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a human body model display method and device. The method comprises the following steps: in response to a position selection operation on the human body model, determining the human body submodel at the position corresponding to that operation; in response to an incision depth selection operation on the human body submodel, determining the display position of the submodel; in response to a display type selection operation on the submodel, determining the display type of the submodel, where the display type includes a longitudinal section view and a transverse section view; and displaying the section view of the submodel based on the display position and the display type. In this way, a user can simulate the incision depth of a scalpel in an actual dissection through changes in touch force on a smart mobile device and observe the hierarchical relationships of human tissues, gaining a more systematic understanding of the morphological structure of the human body while the tissue structures within the incision depth and their annotation information are displayed.

Description

Human body model display method and device
Technical Field
The invention relates to the technical field of medical instruments, and in particular to a human body model display method and device.
Background
Anatomy is an important course for medical students. In real-world anatomy courses, human specimens are restricted in availability and, being consumables, give students very few opportunities for hands-on practice. Because of the particularity of the setting, specimens cannot be observed and studied in real time, so university anatomy courses are generally taught with traditional 3D teaching software.
However, this traditional teaching method differs from the actual dissection procedure, and because the human body is a highly complex anisotropic medium, such software does not display the morphological structure of the human body in a sufficiently three-dimensional and systematic way, resulting in a poor user experience.
Disclosure of Invention
In view of the above, the present invention provides a human body model display method and apparatus that display the morphological structure of a human body model more three-dimensionally and systematically, so as to improve the user experience.
In a first aspect, an embodiment of the present invention provides a human body model display method, where the human body model includes a plurality of human body submodels. The method comprises: in response to a position selection operation on the human body model, determining the human body submodel at the position corresponding to the operation; in response to an incision depth selection operation on the human body submodel, determining the display position of the submodel; in response to a display type selection operation on the submodel, determining the display type of the submodel, where the display type includes a longitudinal section view and a transverse section view; and displaying the section view of the submodel based on the display position and the display type.
In a preferred embodiment of the present invention, the position selection operation is a click operation, and determining the human body submodel at the position corresponding to the position selection operation comprises: in response to the click operation on the human body model, determining the position corresponding to the click; and determining the human body submodel within a preset range of that position.
In a preferred embodiment of the present invention, after determining the human body submodel within the preset range of the clicked position, the method further includes displaying that submodel.
In a preferred embodiment of the present invention, the incision depth selection operation is a sliding operation, and determining the display position of the human body submodel comprises: in response to the sliding operation on the submodel, determining the distance of the slide; and determining the display position of the submodel based on that distance.
In a preferred embodiment of the present invention, after determining the display position of the human body submodel based on the sliding distance, the method further includes displaying the section view at that display position with a preset first brightness.
In a preferred embodiment of the present invention, the display type selection operation is a rotation operation, and determining the display type of the human body submodel comprises: in response to the rotation operation on the submodel, determining the direction of the rotation; and determining the display type based on that direction.
In a preferred embodiment of the present invention, the method further includes: in response to a position selection operation on the displayed section view of the human body model, determining a target position on the human body submodel; and displaying the covered area of the submodel at the target position with a preset second brightness, where the second brightness is greater than the first brightness.
In a preferred embodiment of the present invention, after determining the target position on the human body submodel, the method further includes displaying introduction information for that target position.
In a preferred embodiment of the present invention, after displaying the section view of the human body submodel based on the display position and the display type, the method further includes cancelling the section view in response to a restoration operation on the submodel.
In a second aspect, an embodiment of the present invention further provides a human body model display apparatus, where the human body model includes a plurality of human body submodels. The apparatus comprises: a position selection module configured to determine, in response to a position selection operation on the human body model, the human body submodel at the corresponding position; an incision depth selection module configured to determine, in response to an incision depth selection operation on the submodel, the display position of the submodel; a display type selection module configured to determine, in response to a display type selection operation on the submodel, the display type of the submodel, where the display type includes a longitudinal section view and a transverse section view; and a first display module configured to display the section view of the submodel based on the display position and the display type.
The embodiments of the invention have the following beneficial effects:
With the human body model display method and device provided by the embodiments of the invention, a user can determine the display position and the display type of a human body submodel through various operations, and the section view of the submodel can be displayed based on that display position and display type. In this way, the user can simulate the incision depth of a scalpel in an actual dissection through changes in touch force on a smart mobile device and observe the hierarchical relationships of human tissues, gaining a more systematic understanding of the morphological structure of the human body while the tissue structures within the incision depth and their annotation information are displayed.
Additional features and advantages of the disclosure are set forth in the description that follows, are in part apparent from the description, or may be learned by practice of the disclosure.
To make the above objects, features and advantages of the present disclosure more comprehensible, preferred embodiments are described in detail below with reference to the accompanying figures.
Drawings
To more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of a human body model display method according to an embodiment of the present invention;
FIG. 2 is a flowchart of another human body model display method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of rotating a human body submodel according to an embodiment of the present invention;
FIG. 4 is a schematic view of the area covered by the scalpel blade in a section view according to an embodiment of the present invention;
FIG. 5 is a transverse section view through the right arm near the shoulder joint according to an embodiment of the present invention;
FIG. 6 is a schematic overall flowchart of a human body model display method according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a human body model display apparatus according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of another human body model display apparatus according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
At present, traditional 3D teaching software does not display the morphological structure of the human body systematically: tissues, organs and systems are presented in isolation, which makes it difficult for users to master anatomical knowledge comprehensively and systematically. Moreover, traditional 3D teaching software is disconnected from the real dissection procedure. Accordingly, embodiments of the present invention provide a human body model display method and apparatus, and in particular an intelligent human body model system and a virtual-scene method for simulated dissection, in which a user simulates the incision depth of a scalpel in an actual dissection through changes in touch force on a smart mobile device and observes the hierarchical relationships of human tissues, thereby gaining a more systematic understanding of the morphological structure of the human body while the tissue structures within the incision depth and their annotation information are presented.
To facilitate understanding, the human body model display method disclosed in an embodiment of the present invention is described in detail below.
Embodiment one:
An embodiment of the present invention provides a human body model display method. Referring to the flowchart of a human body model display method shown in fig. 1, the method includes the following steps:
Step S102: in response to a position selection operation on the human body model, determine the human body submodel at the position corresponding to the operation.
The human body model in this embodiment may be a 3D (three-dimensional) model. A 3D human body model may be created in advance according to the human anatomical structure, and the whole-body image data may then be structurally segmented according to the created model to obtain multi-angle section views.
The 3D human body model provides submodels for selecting a human body part, such as the torso or the right arm. After selection, a submodel can be rotated independently to a specific angle so as to determine the cutting position during dissection, i.e. the position corresponding to the position selection operation. Specifically, the user selects a human body part through the position selection operation on the human body model.
Step S104: in response to an incision depth selection operation on the human body submodel, determine the display position of the submodel.
Specifically, the user determines the cutting position during dissection, i.e. the display position of the human body submodel, through the incision depth selection operation. For example, the user can slide downward a certain distance, and the display position is determined from that sliding distance.
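The slide-to-depth mapping can be sketched as follows. This is a minimal illustration, not the patent's implementation; the scale factor and depth cap are assumed values.

```python
MAX_DEPTH_CM = 5.0    # assumed deepest simulated incision
PIXELS_PER_CM = 40.0  # assumed screen-to-model scale factor

def display_position_from_slide(slide_px):
    """Map the downward sliding distance (in pixels) to a display position,
    expressed as an incision depth in cm below the skin, capped at MAX_DEPTH_CM."""
    return min(max(slide_px, 0.0) / PIXELS_PER_CM, MAX_DEPTH_CM)
```

Under these assumed constants, a 120 px slide would select a display position 3 cm below the skin.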
Step S106: in response to a display type selection operation on the human body submodel, determine the display type of the submodel, where the display type includes a longitudinal section view and a transverse section view.
Specifically, the user selects the display type by rotating the submodel through a specific angle, which produces either the longitudinal or the transverse section view. For example, it may be preset that a left rotation selects the longitudinal section view and a right rotation selects the transverse section view.
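The preset rotation-direction mapping in the example above can be sketched as a small helper (the function name and sign convention are assumptions for illustration):

```python
def display_type_from_rotation(angle_deg):
    """Map a rotation gesture to a display type, following the preset in the text:
    left (negative) rotation -> longitudinal section view,
    right (positive) rotation -> transverse section view."""
    if angle_deg < 0:
        return "longitudinal"
    if angle_deg > 0:
        return "transverse"
    return None  # no rotation: keep the current display type
```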
In addition, embodiments of the invention provide both longitudinal and transverse incision modes. Using the multi-point touch panel of the electronic device, the command signals of each touch (such as force and trajectory) are captured, the blade-covered area of the corresponding section view is highlighted, and the names and introductions of the tissues and organs touched by the blade tip are displayed.
Step S108: display the section view of the human body submodel based on the display position and the display type.
Specifically, the section view of the human body submodel corresponding to the display position and the display type is shown. For example, if the submodel is the right-hand model, the display position is 3 cm below the skin, and the display type is the longitudinal section view, then the longitudinal section view of the right-hand model at 3 cm below the skin is displayed.
It should be noted that in embodiments of the present invention a threshold may be set for each operation, so that the model does not react to unintentionally small gestures, improving the user experience.
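One way to realize such per-operation thresholds is sketched below. The patent only says "a certain threshold may be set"; the specific values and gesture names here are assumptions.

```python
# Assumed per-gesture thresholds; values are illustrative only.
THRESHOLDS = {
    "click": 0.0,    # clicks always count
    "slide": 8.0,    # minimum slide distance in pixels
    "rotate": 15.0,  # minimum rotation in degrees
}

def is_deliberate(gesture, magnitude):
    """Ignore gestures smaller than their threshold so tiny jitters do not
    trigger a position, depth, or display-type change."""
    return abs(magnitude) >= THRESHOLDS[gesture]
```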
With the human body model display method provided by this embodiment, a user can determine the display position and the display type of a human body submodel through various operations, and the section view of the submodel can be displayed based on that display position and display type. In this way, the user can simulate the incision depth of a scalpel in an actual dissection through changes in touch force on a smart mobile device and observe the hierarchical relationships of human tissues, gaining a more systematic understanding of the morphological structure of the human body while the tissue structures within the incision depth and their annotation information are displayed.
Embodiment two:
This embodiment provides another human body model display method, implemented on the basis of the above embodiment; it focuses on a specific implementation of restoring the human body model. Referring to the flowchart of another human body model display method shown in fig. 2, the method includes the following steps:
Step S202: in response to a position selection operation on the human body model, determine the human body submodel at the corresponding position.
Specifically, the position selection operation in this embodiment may be a click operation, and the human body submodel at the corresponding position is determined through the following steps: in response to the click operation on the human body model, determine the position corresponding to the click; and determine the human body submodel within a preset range of that position.
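One way to sketch the "preset range" lookup is a simple nearest-anchor hit test. All names, coordinates, and the radius below are hypothetical, for illustration only:

```python
import math

# Hypothetical 2D anchor points for each submodel on the rendered view.
SUBMODEL_ANCHORS = {"right_arm": (120.0, 80.0), "torso": (60.0, 100.0), "head": (60.0, 20.0)}
PRESET_RANGE = 30.0  # assumed selection radius in pixels

def submodel_at(click_x, click_y):
    """Return the nearest submodel whose anchor lies within PRESET_RANGE
    of the clicked position, or None when nothing is close enough."""
    best, best_dist = None, PRESET_RANGE
    for name, (x, y) in SUBMODEL_ANCHORS.items():
        d = math.hypot(click_x - x, click_y - y)
        if d <= best_dist:
            best, best_dist = name, d
    return best
```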
Furthermore, after the human body submodel is determined, it may also be displayed, for example: displaying the human body submodel within the preset range of the clicked position.
Taking an arm as an example, the implementation is described in detail below. After the user clicks the arm of the 3D human body model, the 3D model of the arm's skin and its attached hair can be determined as the human body submodel and displayed.
Step S204: in response to an incision depth selection operation on the human body submodel, determine the display position of the submodel.
Specifically, the incision depth selection operation in this embodiment is a sliding operation, and the display position is determined through the following steps: in response to the sliding operation on the submodel, determine the distance of the slide; and determine the display position of the submodel based on that distance.
Continuing with the arm example, when the user slides a finger on the arm submodel, the incision depth of the scalpel in the simulated dissection is obtained from the finger's pressing force, and the incision length from the finger's sliding distance, thereby determining the display position of the submodel.
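The force-to-depth and distance-to-length mapping just described can be sketched as follows. The normalization and scale constants are assumptions, not values from the patent:

```python
MAX_FORCE = 1.0       # normalized touch force reported by the device (assumed)
MAX_DEPTH_CM = 5.0    # assumed deepest simulated cut
PIXELS_PER_CM = 40.0  # assumed screen-to-model scale

def incision_from_touch(force, slide_px):
    """Pressing force controls the scalpel's incision depth; the sliding
    distance controls the incision length, as described above."""
    depth_cm = min(max(force, 0.0), MAX_FORCE) / MAX_FORCE * MAX_DEPTH_CM
    length_cm = slide_px / PIXELS_PER_CM
    return depth_cm, length_cm
```

For example, a half-strength press with an 80 px slide would yield a 2.5 cm deep, 2 cm long simulated incision under these assumed constants.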
In addition, after the display position of the human body submodel is determined, the section view can be displayed, for example: displaying the section view at the display position with a preset first brightness. Continuing with the arm example, the first brightness is generally a lower brightness; the simulated incision direction is determined by the user's initial touch position on the arm submodel, and the system simultaneously displays the section view corresponding to the incision direction, at this point at the lower brightness.
Step S206: in response to a display type selection operation on the human body submodel, determine the display type of the submodel, where the display type includes a longitudinal section view and a transverse section view.
Specifically, the display type selection operation in this embodiment is a rotation operation, and the display type is determined through the following steps: in response to the rotation operation on the submodel, determine the direction of the rotation; and determine the display type based on that direction.
Continuing with the arm example, referring to the schematic diagram of rotating a human body submodel shown in fig. 3, position 01 and position 02 in fig. 3 differ by a 90-degree rotation. The arm submodel can be rotated 90 degrees (to the left or right) by the rotation control module, and the front view of the rotated model is where the current user can perform the simulated dissection.
Step S208: display the section view of the human body submodel based on the display position and the display type.
The user may also perform other operations on the section view, for example: in response to a position selection operation on the displayed section view of the human body model, determining a target position on the human body submodel; and displaying the covered area of the submodel at the target position with a preset second brightness, where the second brightness is greater than the first brightness.
Continuing with the arm example, the area covered by the scalpel blade in the section view can be displayed at the second, highlighted brightness (see the schematic view of the blade-covered area in fig. 4, where section plane 03 is shown in the left view and section plane 03-1 in the right view).
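The two-level brightness scheme (dim section view, brighter blade-covered area) can be sketched minimally; the numeric levels below are assumed values that merely satisfy the stated constraint that the second brightness exceeds the first:

```python
FIRST_BRIGHTNESS = 0.4   # assumed dim level for the newly shown section view
SECOND_BRIGHTNESS = 0.9  # assumed highlight level; must exceed FIRST_BRIGHTNESS

def region_brightness(in_blade_area):
    """The blade-covered area of the section view is rendered at the second
    (higher) brightness; the rest of the view stays at the first brightness."""
    return SECOND_BRIGHTNESS if in_blade_area else FIRST_BRIGHTNESS
```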
In addition, introduction information for the target position on the human body submodel can be shown. For example, referring to fig. 5, a transverse section through the right arm near the shoulder joint, the name and detailed introduction of the tissue or organ at the position of the simulated knife tip in fig. 5 can be shown. When the finger is lifted, the highlighted area of the section view is retained together with a touch point on the knife-tip trajectory, and the user can move this touch point along the trajectory to obtain detailed introduction information about the tissue at that position.
And step S210, responding to the restoration operation aiming at the human body submodel, and canceling the display of the planing surface map of the human body submodel.
The user can cancel the planing surface map of the human body submodel through the restoration operation, and the highlighted area of the sectional view is restored to its initial state. The overall process of the human body model display method provided by the embodiment of the invention can also be seen in the overall process schematic diagram shown in fig. 6: first, the position on the human body 3D model, i.e. the human body submodel at the corresponding position in the above embodiment, is obtained, and the human body 3D model at the corresponding position is displayed.
The user then performs a rotation operation; position information is obtained from the rotation, the initial touch position of the user is determined, and the display type of the human body submodel in the above embodiment is thereby determined, so that the longitudinal planing surface map or the transverse planing surface map is displayed accordingly.
The user can then adjust the touch force, and knife-edge information is displayed on the planing surface map according to the touch force. When the user lifts the finger, a dialog box asking whether to restore is displayed; if the user chooses to restore, the highlighted area of the sectional view is restored to its initial state; if not, the user can move the touch point to display the current incision information. This completes the overall process of the human body model display method.
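The overall flow of fig. 6 — pressing to set the incision depth, lifting to show the restore dialog — can be paraphrased as a small event loop. The state fields and event names below are our own paraphrase of the description, not an implementation from the patent.

```python
def run_session(events):
    """Process a sequence of (kind, value) touch events.

    'press' events carry the touch-force-derived incision depth;
    'lift' events carry True/False for the user's choice in the
    restore dialog."""
    state = {"depth": 0.0, "highlight": False, "restored": False}
    for kind, value in events:
        if kind == "press":               # touch force sets the incision depth
            state["depth"] = value
            state["highlight"] = True     # blade area shown at second brightness
        elif kind == "lift":              # lifting the finger shows the dialog
            state["restored"] = bool(value)
            if state["restored"]:         # restore: clear depth and highlight
                state["depth"] = 0.0
                state["highlight"] = False
    return state
```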
The embodiment of the invention thus provides an intelligent human body model system and an intelligent anatomical-simulation virtual scene method: by varying the touch force on an intelligent mobile device, the user can simulate the incision depth of a scalpel in actual debridement and observe the hierarchical relationship of human tissues, gaining more systematic knowledge of the human body's morphological structure while the human tissue structure at that depth and its labeled information are displayed.
Example three:
Corresponding to the above method embodiment, an embodiment of the invention provides a human body model display device, wherein the human body model comprises a plurality of human body submodels; fig. 7 is a schematic structural view of the human body model display apparatus, which includes:
a position selection module 71, configured to determine, in response to a position selection operation for the human body model, a human body sub-model of a position corresponding to the position selection operation;
an incision depth selection module 72 for determining a display position of the human body submodel in response to an incision depth selection operation for the human body submodel;
a display type selection module 73 for determining the display type of the human body submodel in response to a display type selection operation for the human body submodel; wherein the display type comprises a longitudinal planing graph display and a transverse planing graph display;
and the first display module 74 is used for displaying the planing surface map of the human body submodel based on the display position and the display type.
According to the human body model display device provided by the embodiment of the invention, a user can determine the display position and the display type of the human body submodel through various operations, and the planing surface map of the human body submodel is displayed based on the display position and the display type. In this manner, the user can simulate the incision depth of the scalpel in actual debridement through changes in touch force on an intelligent mobile device and observe the hierarchical relationship of human tissues, so as to gain systematic knowledge of the human body's morphological structure while the human tissue structure at the given depth and its labeled information are displayed.
The position selection operation is a click operation; the position selection module is used for responding to the click operation aiming at the human body model and determining the position corresponding to the click operation; and determining the human body submodel in the preset range of the position corresponding to the clicking operation.
Referring to fig. 8, another human body model display device is shown, which further includes: and the second display module 75 is connected with the position selection module 71, and the second display module 75 is used for displaying the human body submodel in the preset range of the position corresponding to the click operation.
The above-mentioned incision depth selection operation is a sliding operation; the incision depth selection module is used for responding to the sliding operation aiming at the human body submodel and determining the distance of the sliding operation; and determining the display position of the human body submodel based on the distance of the sliding operation.
As shown in fig. 8, the manikin display device further includes: and the third display module 76 is connected with the incision depth selection module 72, and the third display module 76 is used for displaying the planing surface map of the display position of the human body submodel with the preset first brightness.
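A minimal sketch of how the incision depth selection module 72 could map the sliding distance to a display position (incision depth), which the third display module 76 then shows at the first brightness. The pixel-to-millimetre scale and the depth limit are assumed values introduced only for illustration.

```python
def display_position_from_slide(distance_px: float,
                                px_per_mm: float = 10.0,
                                max_depth_mm: float = 50.0) -> float:
    """Map a sliding distance (in pixels) to an incision depth in mm,
    clamped to [0, max_depth_mm]. Scale and limit are assumptions."""
    return min(max(distance_px / px_per_mm, 0.0), max_depth_mm)
```

For instance, a 250-pixel slide would resolve to a 25 mm display position under these assumed parameters.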
The display type selection operation is a rotation operation; the display type selection module is used for responding to the rotation operation aiming at the human body submodel and determining the rotation direction of the rotation operation; and determining the display type of the human body submodel based on the rotation direction of the rotation operation.
As shown in fig. 8, the manikin display device further includes: a fourth display module 77 connected to the display type selection module 73, the fourth display module 77 being configured to determine a target position of the human body submodel in response to a position selection operation for the displayed planing surface map of the human body model; and to display the planing surface map of the human body submodel at the target position with a preset second brightness; wherein the second brightness is greater than the first brightness.
The fourth display module is further configured to display introduction information of the target position of the human body submodel.
As shown in fig. 8, the manikin display device further includes: and the cancellation display module 78 is connected with the first display module 74, and the cancellation display module 78 is used for responding to the restoration operation aiming at the human body submodel and canceling the display of the planing surface map of the human body submodel.
The human body model display device provided by the embodiment of the invention has the same technical characteristics as the human body model display method provided by the embodiment, so the same technical problems can be solved, and the same technical effects can be achieved.
Example four:
the embodiment of the invention also provides electronic equipment for operating the human body model display method; referring to fig. 9, an electronic device is shown, which includes a memory 100 and a processor 101, where the memory 100 is used to store one or more computer instructions, and the one or more computer instructions are executed by the processor 101 to implement the human body model display method.
Further, the electronic device shown in fig. 9 further includes a bus 102 and a communication interface 103, and the processor 101, the communication interface 103, and the memory 100 are connected through the bus 102.
The Memory 100 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile memory, such as at least one disk memory. The communication connection between the network element of this system and at least one other network element is realized through at least one communication interface 103 (wired or wireless), and the Internet, a wide area network, a local area network, a metropolitan area network, or the like may be used. The bus 102 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in fig. 9, but this does not indicate that there is only one bus or one type of bus.
The processor 101 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 101. The Processor 101 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; the device can also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, a discrete Gate or transistor logic device, or a discrete hardware component. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in ram, flash memory, rom, prom, or eprom, registers, etc. storage media as is well known in the art. The storage medium is located in the memory 100, and the processor 101 reads the information in the memory 100, and completes the steps of the method of the foregoing embodiment in combination with the hardware thereof.
The embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium stores computer-executable instructions, and when the computer-executable instructions are called and executed by a processor, the computer-executable instructions cause the processor to implement the human body model display method.
The computer program product of the human body model display method and device provided by the embodiment of the present invention includes a computer readable storage medium storing a program code, and instructions included in the program code may be used to execute the method in the foregoing method embodiment, and specific implementation may refer to the method embodiment, and will not be described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and/or the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical or an electrical connection; and as a direct connection, an indirect connection through an intermediate medium, or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, an electronic device, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above embodiments are merely specific embodiments of the present invention, used to illustrate rather than limit its technical solutions, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, or readily conceive of changes, or make equivalent substitutions of some technical features, within the technical scope disclosed by the present invention; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention, and shall all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A human body model display method is characterized in that the human body model comprises a plurality of human body submodels; the method comprises the following steps:
responding to position selection operation aiming at the human body model, and determining the human body submodel of the position corresponding to the position selection operation;
determining a display position of a human body submodel in response to an incision depth selection operation for the human body submodel;
responding to a display type selection operation aiming at the human body submodel, and determining the display type of the human body submodel; wherein the display type comprises a longitudinal planing graph display and a transverse planing graph display;
and displaying the planing surface map of the human body sub-model based on the display position and the display type.
2. The method of claim 1, wherein the location selection operation is a click operation; the step of determining the human body sub-model of the position corresponding to the position selection operation in response to the position selection operation for the human body model comprises:
responding to the click operation aiming at the human body model, and determining a position corresponding to the click operation;
and determining the human body submodel within the preset range of the position corresponding to the clicking operation.
3. The method of claim 2, wherein after the step of determining the human body submodel within a preset range of the position corresponding to the click operation, the method further comprises:
and displaying the human body submodel within a preset range of the position corresponding to the click operation.
4. The method of claim 1, wherein the cut depth selection operation is a sliding operation; the step of determining a display position of a human body submodel in response to an incision depth selection operation for the human body submodel includes:
responding to a sliding operation aiming at the human body submodel, and determining the distance of the sliding operation;
determining a display position of the human body sub-model based on the distance of the sliding operation.
5. The method of claim 4, wherein after the step of determining the display position of the human body sub-model based on the distance of the sliding operation, the method further comprises:
and displaying the planing surface map of the display position of the human body submodel with preset first brightness.
6. The method according to claim 1, wherein the display type selection operation is a rotation operation; the step of determining the display type of the human body submodel in response to a display type selection operation for the human body submodel includes:
determining a rotation direction of a rotation operation in response to the rotation operation for the human body submodel;
determining a display type of the human body sub-model based on a rotation direction of the rotation operation.
7. The method of claim 5, further comprising:
determining a target position of the human body sub-model in response to a position selection operation for the displayed planing surface map of the human body model;
displaying the planing surface map of the human body submodel at the target position with a preset second brightness; wherein the second brightness is greater than the first brightness.
8. The method of claim 7, wherein after the step of determining the target location of the human body submodel, the method further comprises:
and displaying introduction information of the target position of the human body submodel.
9. The method of claim 1, wherein after the step of displaying the planing surface map of the human body submodel based on the display location and the display type, the method further comprises:
and canceling the display of the planing surface map of the human body submodel in response to the restoration operation aiming at the human body submodel.
10. A mannequin display apparatus, wherein the mannequin comprises a plurality of mannequins; the device comprises:
the position selection module is used for responding to position selection operation aiming at the human body model and determining the human body submodel of the position corresponding to the position selection operation;
the incision depth selection module is used for responding to the incision depth selection operation aiming at the human body submodel and determining the display position of the human body submodel;
the display type selection module is used for responding to the display type selection operation aiming at the human body submodel and determining the display type of the human body submodel; wherein the display type comprises a longitudinal planing graph display and a transverse planing graph display;
and the first display module is used for displaying the planing surface map of the human body sub-model based on the display position and the display type.
CN202111297740.2A 2021-11-04 2021-11-04 Human body model display method and device Active CN113961122B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111297740.2A CN113961122B (en) 2021-11-04 2021-11-04 Human body model display method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111297740.2A CN113961122B (en) 2021-11-04 2021-11-04 Human body model display method and device

Publications (2)

Publication Number Publication Date
CN113961122A true CN113961122A (en) 2022-01-21
CN113961122B CN113961122B (en) 2023-11-17

Family

ID=79469053

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111297740.2A Active CN113961122B (en) 2021-11-04 2021-11-04 Human body model display method and device

Country Status (1)

Country Link
CN (1) CN113961122B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115294845A (en) * 2022-08-09 2022-11-04 苏州旭智设计营造有限公司 Polymorphic interactive display system

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101043843A (en) * 2004-06-30 2007-09-26 詹姆士·V·西茨曼 Medical devices for minimally invasive surgeries and other internal procedures
US20130229499A1 (en) * 2012-03-05 2013-09-05 Microsoft Corporation Generation of depth images based upon light falloff
KR20140035035A (en) * 2012-09-13 2014-03-21 주식회사 인피니트헬스케어 Surgery simulation method using virtual knife and apparatus thereof
CN103870099A (en) * 2012-12-13 2014-06-18 上海联影医疗科技有限公司 Interface display switching method and display interface of medical anatomical surface
CN104821122A (en) * 2015-03-11 2015-08-05 张雁儒 Human anatomy teaching method
CN104837419A (en) * 2012-06-12 2015-08-12 T-医学创新有限公司 Device and method for producing a first skin incision in surgical procedures and marking along the margins of the incision
CN104851345A (en) * 2015-03-11 2015-08-19 张雁儒 Human anatomy teaching system
CN104978872A (en) * 2014-04-04 2015-10-14 上海橘井泉网络科技有限公司 Surgery demonstration method, surgery demonstration device and surgery demonstration system
US20160324580A1 (en) * 2015-03-23 2016-11-10 Justin Esterberg Systems and methods for assisted surgical navigation
CN107315915A (en) * 2017-06-28 2017-11-03 上海联影医疗科技有限公司 A kind of simulated medical surgery method and system
CN206741752U (en) * 2017-04-13 2017-12-12 北京医教科技有限公司 Simulated humanbody laparoscopic surgery simulated training system
CN108140360A (en) * 2015-07-29 2018-06-08 森赛尔股份有限公司 For manipulating the system and method for virtual environment
CN108523995A (en) * 2018-01-23 2018-09-14 上海交通大学医学院附属上海儿童医学中心 A kind of craniosynostosis surgery simulation system and its method
CN108904048A (en) * 2018-06-20 2018-11-30 广东弘基医疗生物技术有限公司 A kind of double-edged eyelid multiple spot deep layer positioning device and its localization method
CN109464194A (en) * 2018-12-29 2019-03-15 上海联影医疗科技有限公司 Display methods, device, medical supply and the computer storage medium of medical image
CN109509555A (en) * 2018-11-26 2019-03-22 刘伟民 A kind of surgical operation preview appraisal procedure and system based on 3-dimensional image
US20190247129A1 (en) * 2016-10-18 2019-08-15 Kamyar ABHARI Methods and systems for providing depth information
CN111045575A (en) * 2018-10-11 2020-04-21 阿里健康信息技术有限公司 Diagnosis and treatment interaction method and diagnosis and treatment terminal equipment
CN111417352A (en) * 2016-10-21 2020-07-14 Gys科技有限责任公司(经营名称为卡丹机器人) Method and system for setting trajectory and target position for image-guided surgery
CN112835480A (en) * 2019-11-25 2021-05-25 京东方科技集团股份有限公司 Human body information display method, device, equipment and computer readable storage medium



Also Published As

Publication number Publication date
CN113961122B (en) 2023-11-17

Similar Documents

Publication Publication Date Title
Pflesser et al. Volume cutting for virtual petrous bone surgery
CN107358595B (en) Method and system for lobe segmentation and preoperative surgical planning
US11547499B2 (en) Dynamic and interactive navigation in a surgical environment
Basdogan et al. VR-based simulators for training in minimally invasive surgery
US20140071165A1 (en) Mixed reality simulation methods and systems
CN109064549B (en) Method for generating mark point detection model and method for detecting mark point
CN105956395A (en) Medical image processing method, device and system
CN106875462A (en) A kind of real-time digital organ cutting method based on first spherical model and hybrid driving method
CN109243614B (en) Operation simulation method, device and system
US9491443B2 (en) Image processing method and image processing apparatus
CN113961122A (en) Human body model display method and device
CN109961514B (en) Cutting deformation simulation method and device, storage medium and terminal equipment
Nienhuys et al. A Delaunay approach to interactive cutting in triangulated surfaces
WO2018156087A1 (en) Finite-element analysis augmented reality system and method
CN115457008A (en) Real-time abdominal puncture virtual simulation training method and device
US8396698B2 (en) Method for the simulation of the haptic of an interaction of a guided object with a virtual three-dimensional object
KR101275938B1 (en) Method for virtual surgery medical simulation and apparatus for thereof
Müller et al. The virtual reality arthroscopy training simulator
CN109308735B (en) Data holing method based on Unity3D volume rendering and storage medium thereof
CN113506365A (en) Image display method and related device, electronic equipment and storage medium
CN114241156A (en) Device for simulating soft tissue deformation and simulation system
CN114333482A (en) Virtual anatomy teaching system based on mixed reality technology
CN110058684A (en) A kind of geography information exchange method, system and storage medium based on VR technology
Huang et al. Virtual reality simulator for training in myringotomy with tube placement
CN113903233B (en) Simulated operation guiding method, device, equipment and storage medium of heart model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant