CN114840110B - Puncture navigation interaction assisting method and device based on mixed reality

Puncture navigation interaction assisting method and device based on mixed reality

Info

Publication number
CN114840110B
Authority
CN
China
Prior art keywords
strategy
navigation
interaction
command
policy
Prior art date
Legal status
Active
Application number
CN202210264158.4A
Other languages
Chinese (zh)
Other versions
CN114840110A (en)
Inventor
禹浪
卢亚超
Current Assignee
Advanced Institute of Information Technology AIIT of Peking University
Hangzhou Weiming Information Technology Co Ltd
Original Assignee
Advanced Institute of Information Technology AIIT of Peking University
Hangzhou Weiming Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Advanced Institute of Information Technology AIIT of Peking University and Hangzhou Weiming Information Technology Co Ltd
Priority to CN202210264158.4A
Publication of CN114840110A
Application granted
Publication of CN114840110B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/34Trocars; Puncturing needles
    • A61B17/3403Needle locating or guiding means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/448Execution paradigms, e.g. implementations of programming paradigms
    • G06F9/4488Object-oriented
    • G06F9/449Object-oriented method invocation or resolution
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • Artificial Intelligence (AREA)
  • Computer Graphics (AREA)
  • Pathology (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a mixed-reality-based puncture navigation interaction assisting method and device. The method comprises the following steps: acquiring the user behavior and system flow of a navigation interaction user; determining the policy elements required for navigation interaction according to the user behavior and the system flow; extracting the policy objects in the policy elements; letting the user interact with the interface included in a policy object to trigger the command the policy object includes; and calling the object corresponding to the command through a command manager and executing the method in the interface of that object. By deeply integrating mixed reality with artificial intelligence, the method is general-purpose and highly extensible, can intelligently prompt user behavior according to different application scenes, and reduces the difficulty of navigation interaction; it supports extensions added by third-party developers themselves, provides unified, abstract, and standardized interfaces for the different rules of different navigation systems, and supports custom addition of policy objects through a universal interface.

Description

Puncture navigation interaction assisting method and device based on mixed reality
Technical Field
The invention relates to the technical field of intelligent interaction, in particular to a puncture navigation interaction assisting method and device based on mixed reality.
Background
Traditional navigation technology faces many problems when applied in orthopedic surgery. For example, orthopedic robots raise questions of ethics and legal responsibility, have a limited range of mechanical-arm motion, leave the operative process outside manual control, and are costly and therefore hard to popularize; 3D-printed guide plates suffer from long manufacturing time, plate sterilization, uncertainty about intraoperative fit, an invisible operative process, metal allergy, and the like. At present, traditional computer navigation also lacks visualization, requires expensive equipment, and offers low precision. Its corresponding navigation schemes require the physician to work under CT guidance and to judge information such as the puncture angle and position from intraoperative CT images, so the patient must undergo multiple CT scans during the operation, harming both patient and physician; the physician must complete the operation from experience under non-real-time CT guidance, which demands high skill, and the long operation time and high risk easily cause secondary injury to the patient.
With the development of science and technology, medical mixed reality has advanced rapidly and many mixed-reality-based puncture navigation techniques have appeared. These navigation schemes display on a screen, in real time and in virtual-real superposition, the position of the surgical instrument relative to the lesion together with the tissue structure of the lesion area in the sagittal, horizontal, and coronal planes, ultimately guiding the clinician to adjust the position of the surgical instrument so the operation is completed more quickly, safely, and accurately, greatly improving the efficiency and success rate of surgery. They nevertheless still suffer from complicated workflows, inconvenient human-computer interaction, and a steep learning curve.
This application therefore provides a mixed-reality-based puncture navigation interaction assisting method and device that deeply integrate the mixed reality and artificial intelligence technologies. The method and device are general-purpose and highly extensible, can intelligently prompt user behavior according to different application scenes, and reduce the difficulty of navigation interaction; they can assist the user in operating the system flow, reducing the complexity of the flow, increasing convenience, and improving experience.
Disclosure of Invention
The embodiment of the application provides a puncture navigation interaction assisting method and device based on mixed reality. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended to neither identify key/critical elements nor delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
In a first aspect, an embodiment of the present application provides a puncture navigation interaction assisting method based on mixed reality, where the method includes:
acquiring the user behavior and system flow of a navigation interaction user;
determining the policy elements required for navigation interaction according to the user behavior and the system flow;
extracting the policy object in the policy element;
triggering a command included in the policy object through interaction between the user and an interface included in the policy object;
and calling the object corresponding to the command through a command manager, and executing the method in the interface of the object.
Optionally, determining the policy elements required for navigation interaction according to the user behavior and the system flow includes:
analyzing the user behaviors in real time through a scene analyzer to obtain user behavior characteristics of navigation interaction;
extracting relevant policy elements of the user behavior characteristics from an instruction set;
storing the relevant policy elements in a buffer.
Optionally, the buffer has a buffer pool function.
Optionally, the method further comprises:
determining scene behavior characteristics of the system flow through the scene analyzer, and acquiring feature codes corresponding to the scene behavior characteristics;
and extracting the policy elements required for navigation interaction from the relevant policy elements through the feature codes.
Optionally, the extracting of the policy object in the policy element includes:
the scene analyzer sends the extracted policy elements to a command processor through a feature queue container;
the command processor sequentially extracts the policy objects in the policy elements from the feature queue.
Optionally, all logic of the scene analyzer runs in an Update function.
Optionally, the policy element is composed of a key-value pair, where key represents the feature code in the policy element and value represents the policy object.
In a second aspect, an embodiment of the present application provides a puncture navigation interaction assisting device based on mixed reality, where the device includes:
the information acquisition module is used for acquiring user behaviors and system flows of the navigation interaction user;
the policy element determination module is used for determining the policy elements required for navigation interaction according to the user behavior and the system flow;
the policy object extraction module is used for extracting the policy objects in the policy elements;
the command triggering module is used for triggering a command included in the policy object through interaction between the user and an interface included in the policy object;
and the execution module is used for calling the object corresponding to the command through the command manager and executing the method in the interface of the object.
In a third aspect, embodiments of the present application provide a computer storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor and to perform the above-described method steps.
In a fourth aspect, embodiments of the present application provide a terminal, which may include: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps described above.
The technical scheme provided by the embodiment of the application can comprise the following beneficial effects:
In the embodiment of the application, the mixed-reality-based puncture navigation interaction assisting method first acquires the user behavior and system flow of the navigation interaction user, then determines the policy elements required for navigation interaction according to the user behavior and system flow, extracts the policy objects in the policy elements, lets the user interact with the interface included in each policy object to trigger the command the policy object includes, and finally calls the object corresponding to the command through a command manager and executes the method in the interface of that object. By deeply integrating mixed reality with artificial intelligence, the method is general-purpose and highly extensible, can intelligently prompt user behavior according to different application scenes, and reduces the difficulty of navigation interaction; it supports extensions added by third-party developers themselves, provides unified, abstract, and standardized interfaces for the different rules of different navigation systems, and supports custom addition of policy objects through a universal interface.
In the embodiment of the application, the mixed-reality-based puncture navigation interaction assisting method acquires the user behavior and system flow of the navigation interaction user; analyzes the user behavior in real time through a scene analyzer, obtains the user behavior characteristics of the navigation interaction, extracts the relevant policy elements of the user behavior characteristics from an instruction set, and stores the relevant policy elements in a buffer; determines the scene behavior characteristics of the system flow through the scene analyzer and acquires the feature codes corresponding to the scene behavior characteristics; and extracts the policy elements required for navigation interaction from the relevant policy elements through the feature codes. By fusing intelligent monitoring and machine learning technologies, the method can analyze user behavior in real time and store the relevant policy elements of that behavior in the buffer, which makes it convenient to prompt the user intelligently within the system flow; it can assist the user in operating the system flow, reducing the complexity of the flow, increasing convenience, and improving experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic flow chart of a puncture navigation interaction assisting method based on mixed reality according to an embodiment of the present application;
fig. 2 is a schematic diagram of a logic architecture of a puncture navigation interaction assisting method based on mixed reality according to an embodiment of the present application;
fig. 3 is a schematic use flow diagram of a third party developer of a puncture navigation interaction assisting method based on mixed reality according to an embodiment of the present application;
fig. 4 is a schematic flow chart of another puncture navigation interaction assisting method based on mixed reality according to an embodiment of the present application;
fig. 5 is a schematic device diagram of a puncture navigation interaction assisting device based on mixed reality according to an embodiment of the present application;
fig. 6 is a schematic diagram of a terminal according to an embodiment of the present application.
Detailed Description
The following description and the drawings sufficiently illustrate specific embodiments of the invention to enable those skilled in the art to practice them.
It should be understood that the described embodiments are merely some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the invention. Rather, they are merely examples of systems and methods that are consistent with aspects of the invention as detailed in the accompanying claims.
In the description of the present invention, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art. Furthermore, in the description of the present invention, unless otherwise indicated, "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
In recent years, mixed reality technology has broken the boundary between the digital virtual world and the physical real world by presenting virtual scene information within real scenes and establishing an interactive feedback loop among the real world, the virtual world, and the user, turning years of accumulated digital technology into a qualitative leap.
Mixed reality technology presents three-dimensional images and therefore has high application value in surgery. With its rapid development, medical mixed reality has brought new clinical diagnosis and treatment modes to medical workers and provides a brand-new, individualized, three-dimensional stereoscopic presentation for the complex and variable injuries of traumatic orthopedics. The technology first performs three-dimensional reconstruction from image data such as CT and MRI, then performs module and parameter setting through rendering, and finally loads the model into dedicated equipment to realize various mixed-reality-based operations. Because the various structures of the skeletal region are rendered in different colors, the whole 3D model can be observed from any angle and adjusted for definition, so orthopedic surgeons can understand the structure of the lesion more intuitively and the reading of image data becomes simpler and clearer.
Since Microsoft released the mixed reality device HoloLens in 2016, HoloLens-based surgical navigation methods have been widely used in the clinic. At present, many HoloLens-based navigation systems are tightly coupled to their own navigation software and are neither portable nor general: the human-computer interaction modules of different navigation software must be repeatedly designed and repeatedly developed, and if the business scenario changes the interfaces must change with it, inevitably causing secondary development and adjustment. This leads to different and complicated operating flows across navigation systems and a high learning cost for users; it is not a generalized solution and is unfavorable to the popularization of navigated surgery. Against this background, the embodiments of the present application aim to solve the problems of rapidly improving efficiency, liberating productivity, simplifying the operating flow of a surgical navigation system, and reducing the human-computer interaction difficulty and user learning cost in a mixed reality environment. The embodiments therefore provide a mixed-reality-based puncture navigation interaction assisting method and device that use a universal human-computer interaction scheme to realize multi-scene intelligent interaction assistance; by combining the mixed reality and artificial intelligence technologies, the method assists doctors in completing complex operations and can reduce medical accidents; it can also enable hierarchical diagnosis and treatment and alleviate problems such as the uneven distribution of medical resources.
The following will describe a puncture navigation interaction assisting method based on mixed reality in detail with reference to fig. 1 to fig. 4.
Referring to fig. 1-3, a flow chart of a puncture navigation interaction assisting method based on mixed reality is provided for an embodiment of the present application. As shown in fig. 1-3, the method of the embodiments of the present application may include the steps of:
the technical scheme of the puncture navigation interaction assisting method based on mixed reality can be used as a general development extension package under a Unity3D platform, the system is based on a modularized design framework, unified interfaces and development specifications are provided, service logic and interaction logic of a navigation interaction system created and developed based on the platform can be decoupled, auxiliary interaction schemes can be provided for different puncture navigation systems without being limited by a single product, different puncture navigation systems have unified development specifications and development interfaces, a third party developer is supported to add and realize interaction strategies by himself, the method has high universality and flexible expansibility, repeated design, repeated development and repeated adjustment of human-computer interaction caused by service logic change are avoided to the greatest extent, and therefore efficiency is improved, and labor cost and research and development cost are reduced.
S110, acquiring the user behavior and system flow of the navigation interaction user.
In the embodiment of the application, the user behavior is captured through a camera or a voice monitor and may include speech, gaze, gestures, and the like. The acquired system flow may be the specific stage the system is currently in, and the like.
S120, determining the policy elements required for navigation interaction according to the user behavior and the system flow.
Specifically, S120 includes:
S121, analyzing the user behavior in real time through a scene analyzer and acquiring the user behavior characteristics of the navigation interaction. In the embodiment of the application, user behaviors correspond to user behavior characteristics, and the scene analyzer can acquire the user behavior characteristics of the navigation interaction by table lookup or similar means; a user behavior characteristic may be a feature code, i.e. each user behavior corresponds to a feature code.
S122, extracting the relevant policy elements of the user behavior characteristics from the instruction set.
For example, when the user behavior characteristic is a feature code, say feature code a, then the policy elements with feature code a, extracted from all policy elements, are the relevant policy elements pre-analyzed in the embodiment of the present application.
In the embodiment of the application, the instruction set is the policy set into which a developer adds, on top of the system's predefined behavior library, the policy elements actually required by the system flow of the puncture navigation system. The embodiment adopts the idea of interface-oriented programming and highly abstracts the basic policy elements that make up the instruction set. Each policy element is an abstract class consisting of a key-value pair: the key represents the feature code of the policy element, which is bound by the developer and is highly abstracted from the navigation system flow; the value represents the policy object of the policy element, an abstract policy interface object. Abstracting the policy object as an interface makes it convenient for a third-party developer to design and assign values against a standard interface after importing the development framework of the embodiment of the application, as the sketch below illustrates.
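For concreteness, the following C# sketch (the document's Unity3D setting implies C#) shows one possible shape of the key-value abstraction; the identifiers IPolicyObject, PolicyElement, and InstructionSet are illustrative assumptions, not names from the patent's actual implementation.

```csharp
using System.Collections.Generic;

// Abstract policy interface that a third-party developer implements
// against the framework's standard interface (hypothetical name).
public interface IPolicyObject
{
    void Execute();   // the behavior bound to this policy object
}

// A policy element: key = feature code, value = policy object.
public abstract class PolicyElement
{
    public string FeatureCode { get; protected set; }          // key: id of an abstracted flow stage
    public IPolicyObject PolicyObject { get; protected set; }  // value: abstract interface object
}

// The instruction set: all policy elements a developer has registered,
// indexed by feature code for lookup.
public class InstructionSet
{
    private readonly Dictionary<string, List<PolicyElement>> _elements =
        new Dictionary<string, List<PolicyElement>>();

    public void Add(PolicyElement element)
    {
        if (!_elements.TryGetValue(element.FeatureCode, out var list))
            _elements[element.FeatureCode] = list = new List<PolicyElement>();
        list.Add(element);
    }

    public IReadOnlyList<PolicyElement> Lookup(string featureCode) =>
        _elements.TryGetValue(featureCode, out var list)
            ? (IReadOnlyList<PolicyElement>)list
            : System.Array.Empty<PolicyElement>();
}
```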
S123, storing the relevant policy elements in a buffer. In this embodiment of the present application, the buffer is an instruction buffer set that serves as a secondary cache for command preprocessing. The idea of a buffer pool is used to dynamically manage the memory the system allocates, so the buffer has the function of a buffer pool: a memory space of 10 units of single-policy-object length is preset in the pool and recycled, and if the remaining space in the pool is insufficient to store another policy object, the space is extended by a further 10 units. Because the buffer works as a buffer pool, memory space is effectively saved. The buffer is the buffer set in fig. 2.
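A minimal sketch of that buffer-pool behavior, growing in 10-slot steps and recycling slots after use, is shown below; the class and method names are assumptions for illustration.

```csharp
using System.Collections.Generic;

public class PolicyBuffer
{
    private const int GrowthStep = 10;   // preset size: 10 single-policy-object slots
    private readonly List<PolicyElement> _pool = new List<PolicyElement>(GrowthStep);

    public void Store(PolicyElement element)
    {
        // If no free slot remains, extend the pool by another 10 units.
        if (_pool.Count >= _pool.Capacity)
            _pool.Capacity += GrowthStep;
        _pool.Add(element);
    }

    // Pull out the cached elements whose feature code matches the
    // current system-flow stage (used by the scene analyzer in S125).
    public List<PolicyElement> Extract(string featureCode) =>
        _pool.FindAll(e => e.FeatureCode == featureCode);

    // Called when a prompt command's life cycle ends: the slot is
    // recycled for reuse rather than released outright.
    public void Recycle(PolicyElement element) => _pool.Remove(element);
}
```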
S124, determining the scene behavior characteristics of the system flow through the scene analyzer, and acquiring the feature codes corresponding to the scene behavior characteristics. In the embodiment of the application, scene behavior characteristics correspond to system flows, and the scene analyzer can acquire the scene behavior characteristics of the navigation interaction by table lookup or similar means; each scene behavior characteristic has a corresponding feature code, so the system flow also corresponds to a feature code.
In the embodiment of the application, the feature code is a feature id: the key, primary key, and unique identifier of a policy element. The embodiment of the application can retrieve a policy element through its feature id.
Feature codes are also called behavior feature codes and represent the different system flows of the abstracted navigation system. A HoloLens-based puncture navigation guiding system generally needs system flows such as receiving model data, rendering the model, coordinate system adaptation, spatial registration, and puncture needle positioning and tracking; navigation systems differ, and so do their implementations of these system flows. The implementation algorithm of a system flow can be abstracted into a specific behavior stage, and each behavior stage is identified by a behavior feature code.
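As an illustration only, the flow stages above might be bound to feature codes such as the following; the concrete string values are assumptions a developer would choose, not codes defined by the patent.

```csharp
// Hypothetical behavior feature codes for the abstracted system flows
// of a HoloLens-based puncture navigation system.
public static class FeatureCodes
{
    public const string ReceiveModelData    = "flow.receive_model_data";
    public const string RenderModel         = "flow.render_model";
    public const string AdaptCoordinates    = "flow.coordinate_adaptation";
    public const string SpatialRegistration = "flow.spatial_registration";
    public const string TrackPunctureNeedle = "flow.needle_tracking";
}
```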
S125, extracting the policy elements required for navigation interaction from the relevant policy elements through the feature codes. The scene analyzer extracts the required policy elements from the relevant policy elements in the buffer by invoking the buffer.
In the embodiment of the present application, the relevant policy elements represent the system flows the current policy element may belong to; the policy elements extracted from the relevant policy elements through the feature code corresponding to the system flow represent the system flow the interaction has been determined to be in.
In other words, S121 and S122 extract the relevant policy elements of the user behavior from all policy elements through the feature codes corresponding to the user behavior characteristics, and S124 and S125 extract the policy elements of the system flow from those relevant policy elements through the feature codes corresponding to the scene behavior characteristics; the application thereby extracts the policy elements it requires from all policy elements via user behavior and system flow.
S130, extracting the policy object in the policy element. Specifically, S130 includes:
S131, the scene analyzer sends the extracted policy elements to a command processor through a feature queue container. The feature queue container is a push feature queue container, through which the scene analyzer sends policy elements to the command processor in first-in-first-out order; the operation of the command processor is that of the command processing block in fig. 2, and the module corresponding to this step is a queue structure.
In the embodiment of the application, the scene analyzer is Update SceneAnalysis. The scene analyzer derives from the Agent concept in machine learning algorithms: a participant that observes the environment and takes actions in it. Based on the TensorFlow algorithm, the scene analyzer fuses the system's API with the TensorFlow API and trains the Agent by executing frames in each of the Agent's Update functions. S110 to S131 above constitute one complete execution frame; all logic of the scene analyzer runs in the Update function, which executes S110 to S131 in a real-time loop and analyzes the dynamic elements of the running software in real time, allowing it to accomplish a number of specified goals. The dynamic elements include the GameObject containers that encapsulate the behavior content, graphic content, physical content, and so on in Unity; the specified goals include environment perception, user behavior prediction, in-system automated testing, interactive intelligent prompting, design decision evaluation, intelligent learning, and the like. The scene analyzer undergoes continuous reinforcement learning in daily interactions, so as users operate the system more, the scene analyzer becomes more intelligent.
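A hedged sketch of how this Update-driven loop could look as a Unity MonoBehaviour follows; the observation helpers are placeholder stubs (in the described design the analysis is backed by a TensorFlow-trained Agent), and all identifiers are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

public class SceneAnalysis : MonoBehaviour
{
    public InstructionSet Instructions;   // all registered policy elements
    public PolicyBuffer Buffer;           // secondary cache (buffer pool)

    // Push feature queue: hands policy elements to the command processor FIFO.
    private readonly Queue<PolicyElement> _featureQueue = new Queue<PolicyElement>();

    void Update()
    {
        // One execution frame (S110 to S131), looped in real time.
        string userFeature = ObserveUserBehavior();           // S121: voice/gaze/gesture -> feature code
        foreach (var element in Instructions.Lookup(userFeature))
            Buffer.Store(element);                            // S122/S123: cache relevant elements

        string flowFeature = ObserveSystemFlow();             // S124: current flow stage -> feature code
        foreach (var element in Buffer.Extract(flowFeature))  // S125: narrow to required elements
            _featureQueue.Enqueue(element);                   // S131: FIFO hand-off
    }

    // Consumed by the command processor in first-in-first-out order.
    public bool TryDequeue(out PolicyElement element)
    {
        if (_featureQueue.Count > 0) { element = _featureQueue.Dequeue(); return true; }
        element = null;
        return false;
    }

    // Placeholder observations; the real analyzer maps behaviors to
    // feature codes by table lookup and learned prediction.
    private string ObserveUserBehavior() => FeatureCodes.TrackPunctureNeedle;
    private string ObserveSystemFlow()   => FeatureCodes.TrackPunctureNeedle;
}
```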
S132, the command processor sequentially extracts the policy objects of the policy elements to be executed from the feature queue.
S140, triggering a command included in the policy object through interaction between the user and an interface included in the policy object. The policy object is a list holding one or more concrete execution objects.
In an embodiment of the present application, the policy object may be composed of a UI block and a Command block. The UI block stores the localization sequence code of the picture resource bound to the command; through this sequence code the unique UI preset corresponding to it in the resource library can be indexed directly and loaded into the running environment. The design of the UI block supports interaction types such as click, drag, and toggle, and the UI elements can be triggered by user voice, gesture, gaze, and so on. The Command block is the carrier of the concrete behavior prompt: it binds the command to be executed once the object is triggered, adopts the command pattern from software design, and decouples the instruction generator from the executor. A third-party developer may autonomously design the form of the behavior prompt associated with the command, such as speech, gestures, and gaze.
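Sketched in code, the two-block structure might look as follows; the InteractionType values mirror the interaction types named above, the dispatch call refers to the hypothetical command-manager sketch given after S150 below, and all names are assumptions.

```csharp
public enum InteractionType { Click, Drag, Toggle }

// Command block: carrier of the concrete behavior prompt; the command
// pattern decouples the instruction generator from the executor.
public interface ICommand
{
    string Id { get; }   // unique identifier known to the command manager
    void Execute();      // the method run through the object's interface
}

// UI block: stores the localization sequence code of the bound picture
// resource, which directly indexes the unique UI preset in the resource library.
public class UiBlock
{
    public string PresetSequenceCode;
    public InteractionType Trigger;   // click, drag, toggle, ...
}

// A concrete policy object pairing the two blocks.
public class PromptPolicyObject : IPolicyObject
{
    public UiBlock Ui;
    public ICommand Command;

    // Invoked when the user triggers the UI element by voice, gesture,
    // or gaze; the bound command is dispatched by its identifier.
    public void Execute() => CommandManager.Instance.Dispatch(Command.Id);
}
```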
In the embodiment of the present application, the UI may be called an interface, a UI element is an interface element and the executed object, and the command may also be referred to as Command.
In the embodiment of the application, all resource sequences the system will use are stored in a resource library, and each resource has a unique fileID corresponding to a unique disk path. If a resource is composite, such as an atlas, the composite resource records the set of fileIDs of all its sub-files. The command executor can extract a complete UI preset resource through its resource ID; after a resource has been used, the resource library is also responsible for reclaiming its memory. Resources may be audio, pictures, UX, presets, and so on, and the operation of the command executor is that of the command execution block in fig. 2.
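A rough sketch of such a fileID-keyed resource library follows; backing it with Unity's Resources API is one plausible choice, and the paths and structure are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

public class ResourceLibrary
{
    // Every resource: unique fileID -> unique disk (Resources) path.
    private readonly Dictionary<string, string> _pathByFileId =
        new Dictionary<string, string>();

    // Composite resources (e.g. an atlas) record all sub-file fileIDs.
    private readonly Dictionary<string, List<string>> _subFileIds =
        new Dictionary<string, List<string>>();

    // The command executor extracts a complete UI preset by resource ID.
    public Object LoadPreset(string fileId) =>
        Resources.Load(_pathByFileId[fileId]);

    // After resources have been used, the library reclaims their memory.
    public void ReclaimUnused() => Resources.UnloadUnusedAssets();
}
```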
S150, after the command is triggered, calling the object corresponding to the command through a command manager and executing the Execute method in the interface of the object. Each command in the command manager has a unique identifier, and the identifier records the message that triggers the command and the concrete execution object corresponding to the command.
In this embodiment of the present application, in the processing stage of the command processor, the system renders all types of prompt information (i.e. UI elements) in the end user's line of sight. The user can select the prompt to trigger according to actual need, the bound command is triggered in the command pattern, and the command manager processes the command after receiving the message of the triggered command.
In the embodiment of the present application, after the Execute method has finished, the life cycle of the prompt command ends; the buffer reclaims the policy elements, which wait for the scene analyzer either to recall them again or to destroy them.
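Pulling S140 to S150 together, a minimal command-manager sketch might look like the following; the singleton shape and member names are assumptions.

```csharp
using System.Collections.Generic;

public class CommandManager
{
    public static CommandManager Instance { get; } = new CommandManager();

    // Unique identifier -> the command's concrete execution object.
    private readonly Dictionary<string, ICommand> _commands =
        new Dictionary<string, ICommand>();

    public PolicyBuffer Buffer;   // reclaims elements once execution ends

    public void Register(ICommand command) => _commands[command.Id] = command;

    // Called after the command manager receives the trigger message.
    public void Dispatch(string commandId)
    {
        if (_commands.TryGetValue(commandId, out var command))
            command.Execute();   // the prompt command's life cycle ends here
    }
}
```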
The method can form a development kit based on the Unity3D platform; in the embodiment of the application, the content concerning artificial intelligence, machine learning, the instruction set, and so on is encapsulated and adapted for the Unity3D platform and satisfies that platform's runtime requirements. An authorized third-party R&D organization can import the development kit into its own navigation project, define and divide the system flows of its own navigation system, autonomously edit the policy element key-value pairs against the interface designed in the kit (that is, set the list of feature codes and policy objects), bind the UI preset data and behavior associated with each policy object, and add all added policy elements to the instruction set, thereby quickly building the interaction assistance logic of its own navigation system without designing and developing it from scratch.
Based on this system framework and its standardized interfaces, a developer can add custom instruction sets and policy objects according to the flow of its own navigation system. The framework supports operations such as marking, matching, adding, sorting, and deleting policy objects and can satisfy the business logic requirements of different puncture navigation systems, so repeated design and development are avoided to the greatest extent, productivity improves, and R&D cost falls. As shown in fig. 3, after a third-party developer imports the development kit into its own puncture navigation system, it first selects the feature code to be added and the policy object corresponding to that feature code; it then judges whether the policy object conforms to the behavior configuration and, if it does, determines the policy element corresponding to the feature code; it then configures the policy element, that is, adds, deletes, or adjusts the element's policy objects; and it finally outputs the configured policy element through the standard interface into the instruction set for storage. For example, when the behavior configuration includes voice, gaze, and gesture, a video policy object does not conform to the behavior configuration, as in the registration sketch below.
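That flow could reduce to a registration helper like this sketch, in which the behavior-configuration check and the element type are assumptions:

```csharp
using System.Collections.Generic;

// A concrete policy element supplied by a third-party developer
// through the standard interface.
public class CustomPolicyElement : PolicyElement
{
    public CustomPolicyElement(string featureCode, IPolicyObject policyObject)
    {
        FeatureCode = featureCode;
        PolicyObject = policyObject;
    }
}

public static class PolicyRegistration
{
    // The behavior configuration: interaction types the system allows.
    private static readonly HashSet<InteractionType> BehaviorConfiguration =
        new HashSet<InteractionType>
        {
            InteractionType.Click, InteractionType.Drag, InteractionType.Toggle
        };

    // Validates a candidate policy object before adding it to the
    // instruction set; an object whose trigger falls outside the
    // behavior configuration (e.g. video) is rejected.
    public static bool TryAdd(InstructionSet set, string featureCode, PromptPolicyObject candidate)
    {
        if (!BehaviorConfiguration.Contains(candidate.Ui.Trigger))
            return false;   // does not conform to the behavior configuration

        set.Add(new CustomPolicyElement(featureCode, candidate));
        return true;
    }
}
```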
According to the mixed-reality-based puncture navigation interaction assisting method, the user behavior and system flow of the navigation interaction user are first acquired; the policy elements required for navigation interaction are then determined according to the user behavior and system flow; the policy objects in the policy elements are extracted; the user interacts with the interface included in a policy object to trigger the command the policy object includes; and finally the object corresponding to the command is called through the command manager and the method in the interface of the object is executed. By deeply integrating the mixed reality and artificial intelligence technologies, the method can intelligently prompt user behavior according to different application scenes and reduces the difficulty of navigation interaction; by adopting a universal modular design concept, it is not limited to a single product and can provide auxiliary interaction schemes for different navigation interaction systems; and by abstracting highly at the level of software architecture and data structure, it greatly reduces the coupling between the whole and its parts, can meet the requirements of different navigation interaction schemes, reduces development and maintenance costs, and has more flexible extensibility and portability.
Fig. 4 is a schematic flow chart of another mixed-reality-based puncture navigation interaction assisting method according to an embodiment of the present application. As shown in fig. 4, the method may include the following steps:
S210, acquiring the user behavior and system flow of a navigation interaction user;
S211, analyzing the user behavior in real time through a scene analyzer to acquire the user behavior characteristics of the navigation interaction;
S212, extracting the relevant policy elements of the user behavior characteristics from an instruction set;
S213, storing the relevant policy elements in a buffer, the buffer having the function of a buffer pool;
S214, determining the scene behavior characteristics of the system flow through the scene analyzer, and acquiring the feature codes corresponding to the scene behavior characteristics;
S215, extracting the policy elements required for navigation interaction from the relevant policy elements through the feature codes;
S216, the scene analyzer sends the extracted policy elements to a command processor through a feature queue container;
S217, the command processor sequentially extracts the policy objects in the policy elements from the feature queue; each policy element consists of a key-value pair, where the key represents the feature code of the policy element and the value represents the policy object;
S218, triggering a command included in the policy object through interaction between the user and an interface included in the policy object;
S219, calling the object corresponding to the command through the command manager, and executing the method in the interface of the object.
In the embodiment of the application, the mixed-reality-based puncture navigation interaction assisting method first acquires the user behavior and system flow of the navigation interaction user, then determines the policy elements required for navigation interaction according to the user behavior and system flow, extracts the policy objects in the policy elements, lets the user interact with the interface included in each policy object to trigger the command the policy object includes, and finally calls the object corresponding to the command through a command manager and executes the method in the interface of that object. By deeply integrating mixed reality with artificial intelligence, the method is general-purpose and highly extensible, can intelligently prompt user behavior according to different application scenes, and reduces the difficulty of navigation interaction; it supports extensions added by third-party developers themselves, provides unified, abstract, and standardized interfaces for the different rules of different navigation systems, and supports custom addition of policy objects through a universal interface.
The following are examples of the apparatus of the present invention that may be used to perform the method embodiments of the present invention. For details not disclosed in the embodiments of the apparatus of the present invention, please refer to the embodiments of the method of the present invention.
Referring to fig. 5, a schematic structural diagram of a puncture navigation interaction assisting device based on mixed reality according to an exemplary embodiment of the present invention is shown. The device 1 comprises: an information acquisition module 10, a policy element determination module 20, a policy object extraction module 30, a command trigger module 40 and an execution module 50.
The information acquisition module 10 is used for acquiring user behaviors and system flows of the navigation interaction user;
a policy element determining module 20, configured to determine policy elements required for navigation interaction according to the user behavior and the system flow;
a policy object extraction module 30, configured to extract a policy object in the policy element;
a command triggering module 40, configured to trigger a command included in the policy object through interaction between a user and an interface included in the policy object;
and the execution module 50 is used for calling the object corresponding to the command through the command manager and executing the method in the interface of the object.
It should be noted that when the mixed-reality-based puncture navigation interaction assisting device provided in the above embodiment executes the mixed-reality-based puncture navigation interaction assisting method, the division into the above functional modules is only used as an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the device provided in the above embodiment and the method belong to the same concept; the detailed implementation process is embodied in the method embodiment and is not repeated here.
The foregoing embodiment numbers of the present application are merely for description and do not represent the relative merits of the embodiments.
In this embodiment of the present application, the puncture navigation interaction assisting device based on mixed reality firstly obtains a user behavior and a system flow of a navigation interaction user, then determines a policy element required by the navigation interaction according to the user behavior and the system flow, then extracts a policy object in the policy element, secondly interacts with an interface included in the policy object by the user, triggers a command included in the policy object, and finally invokes an object corresponding to the command by a command manager to execute a method in the object interface. The device integrates the mixed reality technology and the artificial intelligence technology deeply, has universality and high expansibility, can intelligently prompt user behaviors according to different application scenes, and reduces the difficulty of navigation interaction; the method supports the self-addition of a third-party developer, supports the unification, abstraction and standardization interfaces of different rules under different navigation systems, and supports the custom addition of policy objects in a universal interface.
The invention also provides a computer readable medium, on which program instructions are stored, which when executed by a processor, implement the puncture navigation interaction assisting method based on mixed reality provided by the above method embodiments.
The invention also provides a computer program product containing instructions, which when run on a computer, cause the computer to execute the mixed reality-based puncture navigation interaction assisting method of the above method embodiments.
Referring to fig. 6, a schematic structural diagram of a terminal is provided in an embodiment of the present application. As shown in fig. 6, terminal 1000 can include: at least one processor 1001, at least one network interface 1004, a user interface 1003, a memory 1005, at least one communication bus 1002.
Wherein the communication bus 1002 is used to enable connected communication between these components.
The user interface 1003 may include a Display screen (Display) and a Camera (Camera), and the optional user interface 1003 may further include a standard wired interface and a wireless interface.
The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Wherein the processor 1001 may include one or more processing cores. The processor 1001 connects various parts within the entire electronic device 1000 using various interfaces and lines, and performs various functions of the electronic device 1000 and processes data by executing or executing instructions, programs, code sets, or instruction sets stored in the memory 1005, and invoking data stored in the memory 1005. Alternatively, the processor 1001 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field programmable gate array (Field-Programmable Gate Array, FPGA), programmable logic array (Programmable Logic Array, PLA). The processor 1001 may integrate one or a combination of several of a central processing unit (Central Processing Unit, CPU), an image processor (Graphics Processing Unit, GPU), and a modem, etc. The CPU mainly processes an operating system, a user interface, an application program and the like; the GPU is used for rendering and drawing the content required to be displayed by the display screen; the modem is used to handle wireless communications. It will be appreciated that the modem may not be integrated into the processor 1001 and may be implemented by a single chip.
The Memory 1005 may include a random access Memory (Random Access Memory, RAM) or a Read-Only Memory (Read-Only Memory). Optionally, the memory 1005 includes a non-transitory computer readable medium (non-transitory computer-readable storage medium). The memory 1005 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 1005 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above-described respective method embodiments, etc.; the storage data area may store data or the like referred to in the above respective method embodiments. The memory 1005 may also optionally be at least one storage device located remotely from the processor 1001. As shown in fig. 6, an operating system, a network communication module, a user interface module, and a mixed-reality-based puncture navigation interaction assisting application may be included in the memory 1005 as one type of computer storage medium.
In terminal 1000 shown in fig. 6, user interface 1003 is mainly used for providing an input interface for a user, and acquiring data input by the user; and the processor 1001 may be configured to invoke the mixed reality based puncture navigation interaction assisting application stored in the memory 1005, and specifically perform the following operations:
acquiring user behaviors and system flows of navigation interaction users;
determining the policy elements required for navigation interaction according to the user behavior and the system flow;
extracting the policy object in the policy element;
triggering a command included in the policy object through interaction between the user and an interface included in the policy object;
and calling the object corresponding to the command through a command manager, and executing the method in the interface of the object.
In one embodiment, the processor 1001, when executing the determining the policy elements required for the navigation interaction according to the user behavior and the system flow, specifically performs the following operations:
analyzing the user behaviors in real time through a scene analyzer to obtain user behavior characteristics of navigation interaction;
extracting relevant policy elements of the user behavior characteristics from an instruction set;
storing the relevant policy elements in a buffer; the buffer has the function of a buffer pool;
determining scene behavior characteristics of the system flow through the scene analyzer, and acquiring feature codes corresponding to the scene behavior characteristics;
and extracting the policy elements required for navigation interaction from the relevant policy elements through the feature codes.
In one embodiment, the processor 1001, when executing the extracting the policy object in the policy element, specifically performs the following operations:
the scene analyzer sends the extracted policy elements to a command processor through a feature queue container;
the command processor sequentially extracts the policy objects in the policy elements from the feature queue; each policy element consists of a key-value pair, where the key represents the feature code of the policy element and the value represents the policy object.
In this embodiment of the present application, the puncture navigation interaction assisting device based on mixed reality firstly obtains a user behavior and a system flow of a navigation interaction user, then determines a policy element required by the navigation interaction according to the user behavior and the system flow, then extracts a policy object in the policy element, secondly interacts with an interface included in the policy object by the user, touches a command included in the policy object, and finally invokes an object corresponding to the command by a command manager, and executes a method in the object interface. The device integrates the mixed reality technology and the artificial intelligence technology deeply, has universality and high expansibility, can intelligently prompt user behaviors according to different application scenes, and reduces the difficulty of navigation interaction; the method supports the self-addition of a third-party developer, supports the unification, abstraction and standardization interfaces of different rules under different navigation systems, and supports the custom addition of policy objects in a universal interface.
Those skilled in the art will appreciate that all or part of the flow of the above embodiment methods can be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer readable storage medium, and when executed it may include the flow of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a read-only memory, a random access memory, or the like.
The foregoing disclosure is only illustrative of the preferred embodiments of the present application and is not intended to limit the scope of the claims; equivalent changes made according to the claims of this application still fall within the scope of the invention.

Claims (8)

1. A puncture navigation interaction assisting method based on mixed reality, characterized by comprising the following steps:
acquiring the user behavior and system flow of a navigation interaction user;
determining the policy elements required for navigation interaction according to the user behavior and the system flow;
extracting the policy object in the policy element;
triggering a command included in the policy object through interaction between the user and an interface included in the policy object;
calling the object corresponding to the command through a command manager, and executing the method in the interface of the object;
wherein determining the policy elements required for navigation interaction according to the user behavior and the system flow comprises the following steps:
analyzing the user behavior in real time through a scene analyzer to obtain the user behavior characteristics of the navigation interaction;
extracting the relevant policy elements of the user behavior characteristics from an instruction set;
storing the relevant policy elements in a buffer;
the method further comprises the steps of:
determining the scene behavior characteristics of the system flow through the scene analyzer, and acquiring the feature codes corresponding to the scene behavior characteristics;
and extracting the policy elements required for navigation interaction from the relevant policy elements through the feature codes.
2. The navigation interactive assistance method according to claim 1, wherein the buffer has a function of a buffer pool.
3. The navigation interaction assistance method according to claim 1, wherein the extracting the policy object in the policy element comprises:
the scene analyzer sends the extracted policy elements to a command processor through a feature queue container;
the command processor sequentially extracts the policy objects in the policy elements from the feature queue.
4. A navigation interaction assistance method according to claim 3 wherein all logic of the scene analyzer is run in an Update function.
5. A navigation interaction assistance method according to claim 3 wherein the policy element consists of a key-value pair, wherein key represents the feature code in the policy element and value represents the policy object.
6. A puncture navigation interaction assisting device based on mixed reality, characterized by comprising:
an information acquisition module, configured to acquire user behaviors and a system flow of a navigation interaction user;
a policy element determining module, configured to determine policy elements required by the navigation interaction according to the user behaviors and the system flow; analyze the user behaviors in real time through a scene analyzer to obtain user behavior characteristics of the navigation interaction; extract policy elements relevant to the user behavior characteristics from an instruction set; store the relevant policy elements in a buffer; determine scene behavior characteristics of the system flow through the scene analyzer, and acquire feature codes corresponding to the scene behavior characteristics; and extract the policy elements required by the navigation interaction from the relevant policy elements through the feature codes;
a policy object extraction module, configured to extract policy objects from the policy elements;
a command triggering module, configured to trigger a command included in the policy object through interaction between a user and an interface included in the policy object; and
an execution module, configured to call an object corresponding to the command through a command manager and execute a method in an interface of the object.
7. A computer storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the method steps of any one of claims 1-5.
8. A terminal, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor to perform the method steps of any one of claims 1-5.
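Read together with claims 1-5, the device of claim 6 partitions the same flow into five cooperating modules. The sketch below wires hypothetical stand-ins for those modules in the claimed order; all module, command, and feature-code names are assumptions made for illustration, since the patent publishes no source code.

```python
from typing import Callable, Dict, List, Tuple

class InformationAcquisitionModule:
    """Acquires the user behavior and the current system flow."""

    def acquire(self) -> Tuple[str, str]:
        # A real device would read head/hand tracking and the navigation
        # workflow state; this returns fixed sample data.
        return "air_tap_on_target", "needle_insertion"

class PolicyElementDeterminingModule:
    """Two-stage selection: behavior features first, then the scene feature code."""

    def __init__(self, instruction_set: Dict[str, Dict[str, str]]):
        self.instruction_set = instruction_set
        self.buffer: Dict[str, Dict[str, str]] = {}  # relevant-element cache

    def determine(self, behavior: str, flow: str) -> List[Dict[str, str]]:
        # Stage 1: buffer every element related to the user behavior feature.
        self.buffer = {k: v for k, v in self.instruction_set.items() if behavior in k}
        # Stage 2: the feature code of the scene picks the required elements.
        feature_code = f"{flow}/{behavior}"
        return [v for k, v in self.buffer.items() if k == feature_code]

class PolicyObjectExtractionModule:
    def extract(self, elements: List[Dict[str, str]]) -> List[str]:
        return [e["command"] for e in elements]  # each policy object carries a command

class CommandTriggeringModule:
    def trigger(self, commands: List[str]) -> List[str]:
        # The real device fires these when the user touches the object's interface.
        return commands

class ExecutionModule:
    def __init__(self, handlers: Dict[str, Callable[[], None]]):
        self.handlers = handlers  # command-manager registry

    def run(self, commands: List[str]) -> None:
        for command in commands:
            self.handlers[command]()  # execute the method in the object's interface

# Wire the five modules together in the order the claims describe.
instruction_set = {"needle_insertion/air_tap_on_target": {"command": "confirm_entry_point"}}
handlers = {"confirm_entry_point": lambda: print("entry point confirmed")}

behavior, flow = InformationAcquisitionModule().acquire()
elements = PolicyElementDeterminingModule(instruction_set).determine(behavior, flow)
commands = PolicyObjectExtractionModule().extract(elements)
ExecutionModule(handlers).run(CommandTriggeringModule().trigger(commands))
```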
CN202210264158.4A 2022-03-17 2022-03-17 Puncture navigation interaction assisting method and device based on mixed reality Active CN114840110B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210264158.4A CN114840110B (en) 2022-03-17 2022-03-17 Puncture navigation interaction assisting method and device based on mixed reality

Publications (2)

Publication Number Publication Date
CN114840110A (en) 2022-08-02
CN114840110B (en) 2023-06-20

Family

ID=82562388

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210264158.4A Active CN114840110B (en) 2022-03-17 2022-03-17 Puncture navigation interaction assisting method and device based on mixed reality

Country Status (1)

Country Link
CN (1) CN114840110B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107273164A (en) * 2017-06-16 2017-10-20 郑州云海信息技术有限公司 A kind of method for realizing Auto-matching scene optimization strategy when linux system performance optimizes
CN109074665A (en) * 2016-12-02 2018-12-21 阿文特公司 System and method for navigating to targeted anatomic object in the program based on medical imaging
CN113133813A (en) * 2021-04-01 2021-07-20 上海复拓知达医疗科技有限公司 Dynamic information display system and method based on puncture process

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9852381B2 (en) * 2012-12-20 2017-12-26 Nokia Technologies Oy Method and apparatus for providing behavioral pattern generation for mixed reality objects
US11547499B2 (en) * 2014-04-04 2023-01-10 Surgical Theater, Inc. Dynamic and interactive navigation in a surgical environment
CN110537980A (en) * 2019-09-24 2019-12-06 上海理工大学 puncture surgery navigation method based on motion capture and mixed reality technology
CN113133829B (en) * 2021-04-01 2022-11-01 上海复拓知达医疗科技有限公司 Surgical navigation system, method, electronic device and readable storage medium
CN113986111A (en) * 2021-12-28 2022-01-28 北京亮亮视野科技有限公司 Interaction method, interaction device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114840110A (en) 2022-08-02

Similar Documents

Publication Publication Date Title
KR102014385B1 (en) Method and apparatus for learning surgical image and recognizing surgical action based on learning
KR101864380B1 (en) Surgical image data learning system
US11907848B2 (en) Method and apparatus for training pose recognition model, and method and apparatus for image recognition
KR102298412B1 (en) Surgical image data learning system
CN107296650A (en) Intelligent operation accessory system based on virtual reality and augmented reality
US20190333626A1 (en) System and method for artificial agent based cognitive operating rooms
EP2400464A2 (en) Spatial association between virtual and augmented reality
CN111383347B (en) Emergency simulation method, system, server and storage medium based on three-dimensional simulation
CN110060767A (en) A kind of monitoring method washed one's hands, device, equipment and storage medium
US20220125360A1 (en) Method and computer program for determining psychological state through drawing process of counseling recipient
CN110827953A (en) Cognitive memory training evaluation system and method based on VR and storage medium
CN110660130A (en) Medical image-oriented mobile augmented reality system construction method
CN116869651A (en) Device and method for providing operation simulation based on virtual reality
CN104517016A (en) Surgery simulation system using motion sensing technology and virtual reality technology
CN114582487A (en) Traditional Chinese medicine diagnosis and treatment assisting method and system based on traditional Chinese medicine knowledge graph
CN110796064A (en) Human muscle image establishing method and device, storage medium and electronic equipment
CN114840110B (en) Puncture navigation interaction assisting method and device based on mixed reality
CN110335687A (en) Remote medical consultation with specialists method, system and computer readable storage medium
CN106078743A (en) Intelligent robot, is applied to operating system and the application shop of intelligent robot
CN113397708B (en) Particle puncture surgical robot navigation system
CN113409280B (en) Medical image processing method, labeling method and electronic equipment
CN111967333B (en) Signal generation method, system, storage medium and brain-computer interface spelling device
CN114170177A (en) Operation path analysis method and storage medium
CN117058405B (en) Image-based emotion recognition method, system, storage medium and terminal
US20230334998A1 (en) Surgical teaching auxiliary system using virtual reality and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant