CN116417120A - Intelligent diagnosis guiding method, device, equipment, medium and program product - Google Patents


Info

Publication number
CN116417120A
Authority
CN
China
Prior art keywords
symptom
displaying
selection
control
local area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111658357.5A
Other languages
Chinese (zh)
Inventor
刘泽泓
黄俊凯
孙琦
胡盟皎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111658357.5A
Publication of CN116417120A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H — HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 — ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20 — ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 — Interaction techniques based on GUI, based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G — PHYSICS
    • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H — HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 — ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 — ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A — TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 — Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Primary Health Care (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Pathology (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an intelligent diagnosis guiding method, device, equipment, medium and program product, belonging to the technical field of artificial intelligence. The method comprises the following steps: displaying a dialogue interface in an application program during online diagnosis guiding; in response to a diagnosis guiding operation triggered on the dialogue interface, displaying graphical biological information on a local area of the dialogue interface; in response to a position selection operation for a symptom position on the graphical biological information, displaying a symptom selection control corresponding to the symptom position on the local area; and in response to a symptom selection operation on the symptom selection control, displaying a diagnosis guiding result in a dialogue message area of the dialogue interface. The method keeps the dialogue interface displayed in the application program throughout use of the intelligent diagnosis guiding function, avoids repeated interface jumps during symptom selection, and thus completes the diagnosis guiding service within the dialogue scene.

Description

Intelligent diagnosis guiding method, device, equipment, medium and program product
Technical Field
The embodiments of the present application relate to the technical field of artificial intelligence, and in particular to an intelligent diagnosis guiding method, device, equipment, medium and program product.
Background
Some computer program products provide an intelligent diagnosis guiding function that offers a diagnosis guiding (triage) service to users.
Typically, the user opens an application program on a computer device and enters a dialogue interface corresponding to the intelligent diagnosis guiding function, which contains a diagnosis guiding control. Through this control, the user enters a separate human body display interface, clicks a body part there, and then jumps again to a symptom selection interface listing the symptoms corresponding to that body part. After the user selects the symptom to be queried, the application program returns the department corresponding to that symptom, completing the diagnosis guiding service.
This design requires multiple interface jumps during symptom selection and takes the user out of the dialogue scene while being guided to a diagnosis.
Disclosure of Invention
The embodiments of the present application provide an intelligent diagnosis guiding method, device, equipment, medium and program product that keep the dialogue interface displayed in the application program while the user uses the intelligent diagnosis guiding function, avoiding repeated interface jumps during symptom selection and completing the diagnosis guiding service within the dialogue scene. The technical solution is as follows:
In one aspect, an embodiment of the present application provides an intelligent diagnosis guiding method, where the method includes:
displaying a dialogue interface in an application program during online diagnosis guiding;
in response to a diagnosis guiding operation triggered on the dialogue interface, displaying graphical biological information on a local area of the dialogue interface;
in response to a position selection operation for a symptom position on the graphical biological information, displaying a symptom selection control corresponding to the symptom position on the local area;
and in response to a symptom selection operation on the symptom selection control, displaying a diagnosis guiding result in a dialogue message area of the dialogue interface.
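The four claimed steps form a simple event-driven flow. The sketch below models that flow as a minimal Python state machine; all class, method, and string names are illustrative assumptions, not part of the claimed implementation.

```python
# Hypothetical sketch of the claimed four-step flow; all names are illustrative.
class GuidedDiagnosisSession:
    def __init__(self):
        self.shown = []  # ordered record of what the dialogue interface displays

    def open_dialogue(self):                   # step 1: display the dialogue interface
        self.shown.append("dialogue_interface")

    def on_triage_operation(self):             # step 2: graphical biological info on local area
        self.shown.append("graphical_biological_info@local_area")

    def on_position_selected(self, position):  # step 3: symptom controls for that position
        self.shown.append(f"symptom_controls:{position}@local_area")

    def on_symptom_selected(self, symptom):    # step 4: result in the dialogue message area
        self.shown.append(f"triage_result:{symptom}@message_area")


session = GuidedDiagnosisSession()
session.open_dialogue()
session.on_triage_operation()
session.on_position_selected("chest")
session.on_symptom_selected("shortness of breath")
# Every display targets either the local area or the message area of the one
# dialogue interface, so the flow never jumps to another interface.
```

The point of the sketch is that each operation only appends content to a region of the single dialogue interface, mirroring the claim's no-jump design.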
In another aspect, an embodiment of the present application provides an intelligent diagnosis guiding device, including:
a display module, configured to display a dialogue interface in an application program during online diagnosis guiding;
the display module being further configured to display, in response to a diagnosis guiding operation triggered on the dialogue interface, graphical biological information on a local area of the dialogue interface;
the display module being further configured to display, in response to a position selection operation for a symptom position on the graphical biological information, a symptom selection control corresponding to the symptom position on the local area;
and the display module being further configured to display, in response to a symptom selection operation on the symptom selection control, a diagnosis guiding result in a dialogue message area of the dialogue interface.
In another aspect, an embodiment of the present application provides a computer device comprising a processor and a memory, where at least one program is stored in the memory, and the at least one program is loaded and executed by the processor to implement the intelligent diagnosis guiding method described in the above aspect.
In another aspect, embodiments of the present application provide a computer-readable storage medium having at least one program stored therein, the at least one program being loaded and executed by a processor to implement the intelligent diagnosis guiding method described in the above aspect.
In yet another aspect, embodiments of the present application provide a computer program product (or computer program) comprising computer instructions stored in a computer-readable storage medium. The processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the intelligent diagnosis guiding method described in the above aspect.
The technical solutions provided in the embodiments of the present application yield at least the following beneficial effects:
In the intelligent diagnosis guiding method, after the dialogue interface for online diagnosis guiding is entered in the application program, a diagnosis guiding operation triggered on the dialogue interface causes the application program to display graphical biological information on a local area of the dialogue interface. A position selection operation for a symptom position is then triggered on the graphical biological information, and in response the symptom selection controls corresponding to that symptom position are displayed on the local area, so that the user can select symptoms at a position on the living body without leaving the dialogue scene. The application program then produces a diagnosis guiding result for the selected symptoms and displays it in the dialogue message area of the dialogue interface. The method thus keeps the user in the dialogue scene throughout the diagnosis guiding service, avoids repeated interface jumps during symptom selection, and thereby avoids the broken conversational experience that such jumps cause.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 illustrates an interface schematic diagram of intelligent guided diagnosis provided in an exemplary embodiment of the present application;
FIG. 2 illustrates a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 3 illustrates a flowchart of an intelligent guided diagnosis method provided in an exemplary embodiment of the present application;
FIG. 4 illustrates a flowchart of an intelligent guided diagnosis method provided in another exemplary embodiment of the present application;
FIG. 5 illustrates an interface schematic diagram of intelligent guided diagnosis provided in another exemplary embodiment of the present application;
FIG. 6 illustrates an interface schematic diagram of intelligent guided diagnosis provided in another exemplary embodiment of the present application;
FIG. 7 illustrates an interface schematic diagram of intelligent guided diagnosis provided in another exemplary embodiment of the present application;
FIG. 8 illustrates an interface schematic diagram of intelligent guided diagnosis provided in another exemplary embodiment of the present application;
FIG. 9 illustrates a flowchart of an intelligent guided diagnosis method provided in another exemplary embodiment of the present application;
FIG. 10 illustrates an interface schematic diagram of intelligent guided diagnosis provided in another exemplary embodiment of the present application;
FIG. 11 illustrates a block diagram of an intelligent diagnosis guiding device provided in another exemplary embodiment of the present application;
FIG. 12 illustrates a schematic diagram of a computer device provided in an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms involved in the embodiments of the present application will be described:
Artificial intelligence (AI) is the theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain optimal results. In other words, artificial intelligence is a comprehensive branch of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a way similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of intelligent machines, so that machines can perceive, reason, and make decisions.
Artificial intelligence is a comprehensive discipline involving a wide range of fields, spanning both hardware-level and software-level technologies. Basic artificial intelligence technologies generally include sensors, dedicated AI chips, cloud computing, distributed storage, big data processing, operation/interaction systems, and mechatronics. AI software technologies mainly include computer vision, speech processing, natural language processing, machine learning/deep learning, medical AI, and other directions.
The above-mentioned natural language processing (NLP) is an important direction in computer science and artificial intelligence. It studies theories and methods that enable effective communication between humans and computers in natural language. Natural language processing is a science that integrates linguistics, computer science, and mathematics; research in this field involves natural language, i.e. the language people use daily, and is therefore closely related to linguistics. Natural language processing techniques typically include text processing, semantic understanding, machine translation, question answering, knowledge graph techniques, and the like.
Medical AI generally refers to the application of artificial intelligence technologies, such as sensors, neural network chips, and open-source platforms, to the medical and health field; application scenarios include virtual assistance, medical imaging, drug discovery, nutrition, biotechnology, emergency room/hospital management, health management, mental well-being, wearable devices, risk management, pathology, and the like.
With the research and advancement of artificial intelligence, AI technology has been studied and applied in many fields, such as smart homes, smart wearable devices, virtual assistants, smart speakers, smart marketing, autonomous driving, unmanned aerial vehicles, robots, medical AI, and smart customer service. It is believed that with further development, artificial intelligence will be applied in ever more fields and deliver increasing value.
The intelligent diagnosis guiding (Intelligent Guidance) method provided in the embodiments of the present application relates to medical AI, natural language processing, and related technologies. Intelligent diagnosis guiding covers more than ten medical service capabilities, such as intelligent inquiry and medical consultation, and can be applied to scenarios such as online registration, Internet hospitals, and regional platforms.
To keep the user in the dialogue scene throughout the diagnosis guiding service, the embodiments of the present application provide an intelligent diagnosis guiding method. Taking FIG. 1, in which the graphical biological information is a two-dimensional display diagram of a human body, as an example: the computer device displays a dialogue interface 101 for online diagnosis guiding in an application program, with a diagnosis guiding control 102 shown on the dialogue interface 101. The user triggers the diagnosis guiding control 102, for example by clicking it. In response to the diagnosis guiding operation triggered on the control 102, the computer device displays the two-dimensional diagram of the human body on a local area 103 of the dialogue interface 101, with marker controls 104 corresponding to symptom positions (human body parts, in FIG. 1) displayed around the diagram. The user then triggers one of the marker controls 104. In response to this position selection operation, the computer device displays symptom selection controls 105 for the selected symptom position on the local area 103; in FIG. 1, for example, selecting the chest displays controls for symptoms such as thick sputum, shortness of breath, dry cough, and chest deformity. Through the symptom selection controls 105, the user completes symptom selection for the chosen body part.
In this intelligent diagnosis guiding method, a local area is partitioned from the dialogue interface, and both the marker controls for symptom positions and the symptom selection controls are displayed on that local area. The user can therefore select symptoms on the local area without jumping to other interfaces, and remains in the dialogue scene throughout the diagnosis guiding service.
The intelligent diagnosis guiding method can be applied to a computer device in a computer system. FIG. 2 shows a structural block diagram of the computer system provided by an exemplary embodiment of the present application. The computer system includes a computer device (also referred to as an electronic device or user device) 210 and a server 220.
An application program is installed and runs on the computer device 210. The application program has the intelligent diagnosis guiding function and may be instant messaging software, life service software, medical service software, or the like. Additionally or alternatively, the application program supports running an applet that has the intelligent diagnosis guiding function; the applet may be a life service applet, a medical service applet, an electronic payment applet, or the like. The application program may, for example, be a cloud application.
In some embodiments, the system running on the computer device 210 supports a quick app with the intelligent diagnosis guiding function. The computer device may display a dialogue interface in the system program during online diagnosis guiding; in response to a diagnosis guiding operation triggered on the dialogue interface, display graphical biological information on a local area of the dialogue interface; in response to a position selection operation for a symptom position on the graphical biological information, display a symptom selection control corresponding to the symptom position on the local area; and in response to a symptom selection operation on the symptom selection control, display a diagnosis guiding result in a dialogue message area of the dialogue interface.
By way of example, the computer device 210 may include, but is not limited to, a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, a desktop computer, and the like.
The computer device 210 is directly or indirectly connected to the server 220 via a wireless or wired network. Server 220 is used to provide background services for system programs or applications. The server 220 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or may be a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), basic cloud computing services such as big data and artificial intelligence platforms, and the like.
Illustratively, the server 220 performs the primary computing tasks and the computer device 210 performs the secondary computing tasks; alternatively, the server 220 performs the secondary computing tasks and the computer device 210 performs the primary computing tasks; or a distributed computing architecture is employed between the server 220 and the computer device 210 for collaborative computing.
The wireless or wired networks described above typically use standard communication techniques and/or protocols. The network is usually the Internet, but may be any network, including but not limited to a local area network (LAN), metropolitan area network (MAN), wide area network (WAN), mobile, wired, or wireless network, private network, or virtual private network, or any combination thereof. In some embodiments, data exchanged over the network is represented using techniques and/or formats such as HyperText Markup Language (HTML) and Extensible Markup Language (XML). All or some of the links may also be encrypted using conventional techniques such as Secure Sockets Layer (SSL), Transport Layer Security (TLS), virtual private networks (VPN), and Internet Protocol Security (IPsec). In other embodiments, custom and/or dedicated data communication techniques may be used in place of, or in addition to, the above.
Those skilled in the art will appreciate that the number of computer devices described above may be greater or lesser. For example, the number of the above-mentioned computer devices may be only one, or the number of the above-mentioned computer devices may be several tens or hundreds, or more. The number and type of computer devices are not limited in the embodiments of the present application.
FIG. 3 shows a flowchart of an intelligent diagnosis guiding method according to an exemplary embodiment of the present application. Taking application in the computer device shown in FIG. 2 as an example, the method includes:
Step 310: display a dialogue interface in the application program during online diagnosis guiding.
An application program with the intelligent diagnosis guiding function is installed and runs on the computer device. When the function is started, the application program enters the dialogue interface for online diagnosis guiding; as shown in FIG. 1, the dialogue interface 101 for online diagnosis guiding is displayed in the application program.
Illustratively, the application program supports running an applet that has the intelligent diagnosis guiding function. The computer device displays the applet's dialogue interface in the application program in response to a start operation on the applet; alternatively, it displays the applet's main interface in the application program, on which a function start control for intelligent diagnosis guiding is shown, and displays the applet's dialogue interface in response to a function start operation on that control.
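The two launch paths above (starting the applet directly, or opening its main interface and then triggering the function start control) can be sketched as a single dispatch function. The event and interface names below are assumptions for illustration, not part of the described applet framework.

```python
# Hypothetical dispatch for the two applet launch paths described above.
def next_interface(event: str) -> str:
    """Return which applet interface the host application should display next."""
    if event == "applet_start":          # path 1: start the applet directly
        return "dialogue_interface"
    if event == "main_interface_open":   # path 2: show the main interface first...
        return "main_interface_with_function_start_control"
    if event == "function_start":        # ...then the control opens the dialogue
        return "dialogue_interface"
    return "unchanged"
```

Both paths converge on the same dialogue interface, which is where the rest of the flow takes place.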
Step 320: in response to the diagnosis guiding operation triggered on the dialogue interface, display graphical biological information on a local area of the dialogue interface.
The diagnosis guiding operation triggers entry into the symptom selection procedure. Illustratively, as in FIG. 1, the computer device displays a diagnosis guiding control 102 on the dialogue interface 101 and, in response to a diagnosis guiding operation triggered on the control 102, displays graphical biological information 106 on a local area of the dialogue interface 101. The diagnosis guiding control may, for example, be a hover button control.
Alternatively, the diagnosis guiding operation includes a voice input operation of a diagnosis guiding command: the computer device displays a voice input control on the dialogue interface, extracts the diagnosis guiding command from the input voice in response to a voice input operation on that control, and displays graphical biological information on a local area of the dialogue interface in response to the command.
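One minimal way to extract a diagnosis guiding command from the input voice is keyword matching over the transcribed text. The sketch below assumes the speech has already been transcribed; the trigger phrases are made up for illustration and are not specified by the embodiment.

```python
# Hypothetical command extraction from already-transcribed voice input.
TRIGGER_PHRASES = ("guided diagnosis", "which department", "triage")  # assumed phrases

def extract_triage_command(transcript: str) -> bool:
    """Return True if the transcript should trigger the guided diagnosis flow."""
    text = transcript.lower()
    return any(phrase in text for phrase in TRIGGER_PHRASES)
```

A production system would more likely use an NLP intent classifier, but the interface is the same: voice in, a yes/no diagnosis guiding command out.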
Alternatively, the diagnosis guiding operation includes a designated gesture operation for triggering a diagnosis guiding function, and the computer device displays graphical biological information on a local area of the dialogue interface in response to the designated gesture operation triggered on the dialogue interface.
Optionally, the graphical biological information includes graphical information of at least one of a human body, an animal, and a plant. Optionally, the graphical biological information includes at least one of a two-dimensional image, a three-dimensional model, and an augmented reality (AR) image. For example, the graphical biological information may be a three-dimensional human body model, a two-dimensional animal or plant image, or the like.
Step 330: in response to a position selection operation for a symptom position on the graphical biological information, display a symptom selection control corresponding to the symptom position on the local area.
If the user selects a symptom position on the graphical biological information, the computer device, in response to the position selection operation, determines the symptom position on the living body, queries at least one symptom corresponding to that position, and displays at least one symptom selection control corresponding to the queried symptoms on the local area.
For example, on receiving a position selection operation on the chest of the graphical human body information, the computer device queries a series of symptoms corresponding to the human chest, such as thick phlegm, shortness of breath, dry cough, bloody sputum, chest deformity, cough, chest distress, and dyspnea, and displays the corresponding symptom selection controls on the local area. As shown in FIG. 1, the user clicks the chest 113 on the graphical human body information; in response to the position selection operation on the chest 113, the computer device queries these symptoms and displays the corresponding symptom selection controls 105 on the local area 103.
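The symptom query for a selected position can be as simple as a mapping from body part to symptom list. In the sketch below, the chest entry mirrors the example symptoms in the text, while the table itself and the other entries are illustrative assumptions.

```python
# Hypothetical symptom table keyed by body part; chest entries follow the text.
SYMPTOMS_BY_POSITION = {
    "chest": ["thick phlegm", "shortness of breath", "dry cough", "bloody sputum",
              "chest deformity", "cough", "chest distress", "dyspnea"],
    "throat": ["sore throat", "hoarseness"],  # assumed entry for illustration
}

def symptom_controls_for(position: str) -> list:
    """Return the symptom labels whose selection controls should be displayed."""
    return SYMPTOMS_BY_POSITION.get(position, [])
```

The lookup's output is exactly the label list the local area renders as symptom selection controls; an unknown position yields no controls.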
Step 340: in response to the symptom selection operation on the symptom selection control, display the diagnosis guiding result in the dialogue message area of the dialogue interface.
After the diagnosis guiding operation is triggered, the computer device divides the dialogue interface into the dialogue message area and the local area; after symptom selection is completed, the diagnosis guiding result is displayed in the dialogue message area of the dialogue interface. For example, after the user selects "chest, shortness of breath", the intelligent diagnosis guiding gives a result recommending that the user visit the respiratory medicine department.
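Producing the diagnosis guiding result from the selected position and symptom can likewise be sketched as a lookup. The rule table below is an assumption that matches the "chest, shortness of breath → respiratory medicine" example; real systems would use a medical knowledge base or model.

```python
# Hypothetical mapping from (position, symptom) to a recommended department.
DEPARTMENT_RULES = {
    ("chest", "shortness of breath"): "respiratory medicine",
    ("chest", "chest distress"): "cardiology",  # assumed entry for illustration
}

def triage_result(position: str, symptom: str) -> str:
    """Return the diagnosis guiding result shown in the dialogue message area."""
    department = DEPARTMENT_RULES.get((position, symptom), "general practice")
    return f"Recommended department: {department}"
```

The returned string is what the method would render as a message in the dialogue message area; the fallback department is also an assumption.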
In summary, in the intelligent diagnosis guiding method provided in this embodiment, after the application program enters the dialogue interface for online diagnosis guiding, a diagnosis guiding operation triggered on the dialogue interface causes the application program to display graphical biological information on a local area of the dialogue interface. A position selection operation for a symptom position is then triggered on the graphical biological information, and in response the symptom selection controls corresponding to that position are displayed on the local area, so that the user can select symptoms at a position on the living body within the dialogue scene. The application program then produces a diagnosis guiding result for the selected symptoms and displays it in the dialogue message area of the dialogue interface. The method thus keeps the user in the dialogue scene throughout the diagnosis guiding service, avoids repeated interface jumps during symptom selection, and avoids the broken conversational experience such jumps cause; it resolves the interrupted interaction produced by multiple interface jumps and prevents the user from having to leave the dialogue scene to select symptoms and return afterwards. Moreover, displaying graphical biological information during symptom selection makes it easier for the user to identify the symptom position being selected.
In some embodiments, a marker control for symptom location may also be displayed around the graphical biometric information, and as illustrated in fig. 4, steps 320 through 330 described above may be implemented by steps 322 and 332, as follows:
In response to the diagnosis guiding operation triggered on the dialogue interface, the graphical biological information is displayed on the local area and the marker controls for the symptom locations are displayed around the graphical biological information, step 322.
In response to the diagnosis guiding operation, the computer device displays graphical biological information on the local area and displays marker controls around the graphical biological information in one-to-one correspondence with symptom locations; as shown in the upper right diagram in fig. 1, each part of the graphical human body information is connected to its corresponding marker control by a connecting line.
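The one-to-one correspondence between symptom locations and marker controls can be sketched as a simple mapping. This is a minimal illustration only; the class name, fields, and location list below are assumptions for this sketch, not part of the patented implementation:

```python
from dataclasses import dataclass

@dataclass
class MarkerControl:
    location: str        # the symptom location (body part) this control marks
    highlighted: bool = False

def build_marker_controls(symptom_locations):
    """Create exactly one marker control per symptom location (one-to-one)."""
    return {loc: MarkerControl(location=loc) for loc in symptom_locations}

# Symptom locations of the human body, as in fig. 1
locations = ["ear", "nose", "mouth", "throat", "cranium", "hand", "chest", "abdomen"]
markers = build_marker_controls(locations)
```

Each entry in `markers` would then be rendered around the graphical human body information, connected to its body part by a connecting line.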
In some embodiments, the computer device may divide a localized area on the dialog interface, and display graphical biometric information, as well as a markup control of symptom location, directly on the localized area of the dialog interface.
In other embodiments, the computer device may divide a local area on the dialogue interface, pop up a pop-up window on the local area of the dialogue interface, and display the graphical biometric information, along with the marker controls for symptom locations, in the pop-up window. Illustratively, in response to the diagnosis guiding operation triggered on the dialogue interface, the computer device displays a pop-up window on the local area, displays a symptom location selection page in the pop-up window, and displays the graphical biometric information and the symptom location marker controls on the symptom location selection page.
Illustratively, as in fig. 1, the computer device pops up a pop-up window 107 over a local area 103 in response to a diagnosis guiding operation triggered on a diagnosis guiding control 102, displays a symptom location selection page 108 in the pop-up window 107, displays graphical human body information on the symptom location selection page 108, and displays marker controls 104 for symptom locations such as the ears, nose, mouth, throat, cranium, hands, chest, and abdomen around the graphical human body information. For example, symptom locations may be divided based on the parts of the organism; for example, the symptom locations of the human body may be divided based on human body parts, as shown in fig. 1, including parts such as the nose, mouth, throat, cranium, hands, chest, and abdomen.
Optionally, in the process of displaying the symptom position mark control around the graphical biological information, the computer equipment further acquires the biological information record from the wearable equipment, and the wearable equipment and the application program have a binding relationship; the computer device may highlight the symptom location correspondence marker control around the graphical biometric information in the event that the biometric information record is information describing the symptom location. For example, the application program and the smart watch have binding relation, and the user uses the heart rate monitoring function of the smart watch, so that the computer device can acquire the biological information record from the smart watch, the biological information record comprises heart rate information, the heart rate information is information for describing the heart function, and the mark control corresponding to the heart position is highlighted around the graphical biological information.
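The highlighting rule described above, a wearable-device record that describes a symptom location causes that location's marker control to be highlighted, can be sketched as follows. The record-to-location table and all names are illustrative assumptions:

```python
# Assumed mapping from wearable record types to the symptom location they describe
RECORD_TO_LOCATION = {"heart_rate": "heart", "blood_oxygen": "chest"}

def highlight_from_records(markers, record_types):
    """Highlight the marker control of each location described by a record."""
    for record_type in record_types:
        location = RECORD_TO_LOCATION.get(record_type)
        if location in markers:
            markers[location]["highlighted"] = True
    return markers

# Example: a heart-rate record from a bound smart watch highlights the heart marker
markers = {"heart": {"highlighted": False}, "hand": {"highlighted": False}}
markers = highlight_from_records(markers, ["heart_rate"])
```

Marker controls for locations not described by any record are left un-highlighted, matching the optional behavior described above.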
For example, the computer device may also determine, based on the biometric information record, whether a specified symptom exists at the location of the living being described by the record; if so, it highlights the marker control corresponding to that symptom location (i.e., the living-being location) around the graphical biological information; if not, the marker control is not highlighted.
In response to the position selection operation on the marker control, a symptom selection control corresponding to the symptom location is displayed on the local area, step 332.
The computer device is further capable of querying at least one symptom corresponding to the symptom location, that is, at least one symptom that may be present at the symptom location, in response to a position selection operation on the marker control for the symptom location; for example, a leg may present symptoms such as muscle strain and surface bruising. At least one symptom selection control corresponding to the at least one symptom is then displayed on the local area. Illustratively, the computer device receives a position selection operation on the "chest" marker control and, in response to this position selection operation, displays symptom selection controls 105 for symptoms corresponding to the "chest", such as thick sputum, shortness of breath, dry cough, bloody sputum, cough, and dyspnea, on the local area 103, as shown in the lower right diagram of fig. 1.
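The location-to-symptom query can be sketched as a table lookup that yields one selection control per known symptom. The symptom table below is an illustrative assumption drawn from the examples in the text:

```python
# Assumed symptom table; in practice this would come from a medical knowledge base
SYMPTOMS_BY_LOCATION = {
    "chest": ["thick sputum", "shortness of breath", "dry cough",
              "bloody sputum", "cough", "dyspnea"],
    "leg": ["muscle strain", "surface bruise"],
}

def symptom_controls_for(location):
    """Return one unselected symptom selection control per symptom at the location."""
    return [{"symptom": s, "selected": False}
            for s in SYMPTOMS_BY_LOCATION.get(location, [])]

chest_controls = symptom_controls_for("chest")
```

Selecting the "chest" marker control would thus surface six symptom selection controls on the local area.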
In some embodiments, the computer device may display the symptom selection control directly on a localized area of the dialog interface.
In other embodiments, the computer device may pop up a popup over a localized area of the dialog interface, displaying symptom selection controls in the popup. For example, if a symptom location selection page is displayed in the popup, the computer device switches the symptom location selection page in the popup to a symptom selection page in response to a location selection operation on a symptom location marker control, and displays a symptom selection control corresponding to the symptom location on the symptom selection page.
As shown in fig. 1, the computer device switches the symptom location selection page 108 in the popup window 107 to the symptom selection page 109 in response to the location selection operation on the symptom location marking control 104, and displays the symptom selection control 105 on the symptom selection page 109, for example, switches the symptom location selection page 108 in the popup window 107 to the symptom selection page 109 in response to the location selection operation on the chest marking control, and displays the symptom selection control 105 of symptoms such as dry cough, bloody sputum, chest distress, dyspnea, cough, and the like corresponding to the chest on the symptom selection page 109.
The popup window is independent of the dialogue interface and has a display level higher than that of the dialogue interface; the display level is the priority with which content is displayed on the screen, and an interface with a higher display level is displayed over an interface with a lower display level.
Optionally, local information of the graphical biological information may be displayed on the symptom selection page, where the local information includes the symptom position; other symptom locations besides the symptom locations are also displayed around the local information; after the symptom selection control corresponding to the symptom position is displayed on the symptom selection page, the computer equipment also responds to other position selection operations on other symptom positions to switch the symptom selection control on the symptom selection page into other symptom selection controls corresponding to other symptom positions.
For example, other markup controls with other symptom locations displayed around the local information on the symptom selection page; after the symptom selection control corresponding to the symptom position is displayed on the symptom selection page, the computer equipment also responds to other position selection operations on other mark controls to switch the symptom selection control on the symptom selection page into other symptom selection controls corresponding to other symptom positions.
As shown in fig. 5, the computer device displays a symptom selection control 105 for the chest in a symptom selection page 109; the symptom selection page 109 also displays local information 401 of the human body, and the periphery of the local information 401 of the human body displays a chest mark control 402 and a right arm mark control 403; the marker control 402 in the left diagram is in a selected state, and a plurality of symptom selection controls 105 corresponding to the chest are displayed below the local information 401 of the human body; the computer device switches the selected state of the marker control 402 to the unselected state, the unselected state of the marker control 403 to the selected state, and switches the plurality of symptom selection controls 105 shown under the local information 401 of the human body to the plurality of other symptom selection controls 110 corresponding to the right arm in response to other position selection operations on the right arm in the local information 401 or other position selection operations on the marker control 403 corresponding to the right arm.
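The switch described above, in which selecting another marker control deselects the current one and swaps in the new location's symptom selection controls, can be sketched as page state. Class and field names are illustrative assumptions:

```python
class SymptomSelectionPage:
    def __init__(self, symptoms_by_location, initial_location):
        self.symptoms_by_location = symptoms_by_location
        self.selected_location = initial_location  # marker in the selected state
        self.symptom_controls = symptoms_by_location[initial_location]

    def select_location(self, other_location):
        # The previously selected marker leaves the selected state, the newly
        # chosen marker enters it, and the displayed symptom selection
        # controls are switched to those of the new location.
        self.selected_location = other_location
        self.symptom_controls = self.symptoms_by_location[other_location]

page = SymptomSelectionPage(
    {"chest": ["dry cough", "dyspnea"],
     "right arm": ["numbness", "swelling"]},
    "chest",
)
page.select_location("right arm")  # e.g. tapping marker control 403 in fig. 5
```

After the switch the user can pick among the right-arm symptoms without returning to the symptom location selection page.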
Optionally, a return control of the symptom location selection page is also displayed in the popup window; after the symptom location selection page in the popup is switched to the symptom selection page, the computer device also switches the symptom selection page in the popup back to the symptom location selection page in response to a return operation on the return control of the symptom location selection page.
The popup window comprises a page display area and a switching control display area; the symptom location selection page or the symptom selection page is displayed in the page display area, and a return control of the symptom location selection page and a selection control of the symptom selection page are displayed in the switching control display area. As shown in fig. 6, a return control 501 of the symptom location selection page and a selection control 502 of the symptom selection page are displayed in the switching control display area of the pop-up window; after displaying the symptom selection page 109, the computer device switches the symptom selection page 109 back to the symptom location selection page 108 in the pop-up window in response to a return operation on the return control 501. Illustratively, in the case where a target marker control is in the selected state, the computer device displays, in the popup, the symptom selection page 109 corresponding to the target marker control in response to a determination operation on the page switch control 503 or the selection control 502 of the symptom selection page.
In summary, according to the intelligent diagnosis guiding method provided by this embodiment, graphical biological information is displayed, and marker controls for symptom locations are displayed around it, so that the user can select the intended symptom location more accurately. In the method, controls for switching between the symptom location selection page and the symptom selection page are displayed on the local area, so that the user can switch freely between the two pages; when the user needs to reselect a symptom location, the user can return from the symptom selection page to the symptom location selection page. In the method, marker controls for a plurality of symptom locations can also be displayed around the local information of the organism, so that when another marker control for a symptom location the user wants to reselect is displayed around the local information, it can be selected directly on the symptom selection page without returning to the symptom location selection page, which effectively improves the efficiency of selecting symptoms corresponding to symptom locations.
In other embodiments, the graphical biometric information includes a two-dimensional representation of the living being, with a front selection control and a back selection control of the living being displayed on the localized area; the computer equipment displays graphical organism information on a local area of the dialogue interface, and after the marking control of the symptom position is displayed around the graphical organism information, in the case that the two-dimensional display diagram displays the front surface of the organism, the back surface display diagram of the organism and the back surface marking control corresponding to the back surface display diagram are switched and displayed on the local area in response to the first selection operation on the back surface selection control; in a case where the two-dimensional display shows the back surface of the living body, in response to the second selection operation on the front surface selection control, the front surface display of the living body and the front surface marker control corresponding to the front surface display are switched and displayed on the partial area.
Illustratively, taking an organism as a human body and a symptom position as a human body part as an example, the computer device displays a front selection control and a back selection control of the human body on a local area of the dialogue interface; displaying a two-dimensional display diagram of a human body on a local area, and after displaying a mark control corresponding to a human body part around the two-dimensional display diagram of the human body, switching and displaying the back display diagram of the human body and the back mark control corresponding to the back display diagram on the local area in response to a first selection operation on the back selection control under the condition that the front of the human body is displayed in the two-dimensional display diagram of the human body; in the case that the back of the human body is displayed in the two-dimensional display diagram of the human body, the front display diagram of the human body and the front mark control corresponding to the front display diagram are displayed on the local area in a switching mode in response to the second selection operation on the front selection control.
Illustratively, as in fig. 1, the symptom location selection page displayed on the local area includes a front selection control 111 and a back selection control 112. After the computer device displays the symptom location selection page on the local area, in the case where the two-dimensional display diagram shows the front of the whole human body, the back display diagram of the whole human body is switched and displayed on the symptom location selection page in response to the first selection operation on the back selection control; in the case where the two-dimensional display diagram shows the back of the whole human body, the front display diagram of the whole human body is switched and displayed on the symptom location selection page in response to the second selection operation on the front selection control.
Illustratively, as in FIG. 7, a symptom selection page displayed on a localized area includes a front selection control 601 and a back selection control 602. After the computer device displays the symptom selection page on the local area, in a case where the two-dimensional display diagram 603 of the human body part shows the front face of the human body part, the computer device switches and displays the back face display diagram 604 of the human body part on the symptom selection page in response to the first selection operation on the back face selection control 602.
In other embodiments, the graphical biological information includes a three-dimensional display model of the living being, and an orientation adjustment control is displayed on the local area for adjusting the display orientation of the three-dimensional display model on the local area. The computer device displays the three-dimensional display model in a first orientation on the local area, and displays around the model first marker controls for the first symptom locations observable on the model in the first orientation; then, after the graphical biological information and the symptom location marker controls are displayed, in response to an adjustment operation on the orientation adjustment control, the computer device displays the three-dimensional display model in the adjusted second orientation on the local area, and displays around the model second marker controls for the second symptom locations observable in the second orientation; the first orientation and the second orientation are different display orientations of the three-dimensional display model.
That is, the computer device displays the three-dimensional display model of the organism on the symptom location selection page, or displays the three-dimensional display model of the organism local on the symptom location selection page, and different display surfaces of the three-dimensional display model can be displayed on the local area through display orientation adjustment on the three-dimensional display model, so that the marker controls corresponding to symptom locations existing on the different display surfaces are displayed on the local area, and the display of the symptom locations is more stereoscopic and closer to reality.
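The orientation-dependent marker display can be sketched as follows: each display orientation exposes only the symptom locations observable from that orientation. The orientation table and class names are illustrative assumptions:

```python
# Assumed table of symptom locations observable from each display orientation
OBSERVABLE_LOCATIONS = {
    "front": ["face", "chest", "abdomen"],
    "back": ["spine", "lower back"],
}

class ThreeDModelView:
    def __init__(self, orientation="front"):
        self.orientation = orientation  # the first orientation

    def visible_marker_locations(self):
        # Only locations observable in the current orientation get marker
        # controls displayed around the three-dimensional model.
        return OBSERVABLE_LOCATIONS[self.orientation]

    def adjust_orientation(self, new_orientation):
        # The orientation adjustment control switches the display orientation.
        self.orientation = new_orientation

view = ThreeDModelView("front")
front_markers = view.visible_marker_locations()
view.adjust_orientation("back")  # adjustment operation on the control
back_markers = view.visible_marker_locations()
```

Rotating the model thus replaces the first marker controls with the second marker controls, which is what makes the display "more stereoscopic and closer to reality".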
In other embodiments, the intelligent diagnosis guiding method is applied to a computer device provided with a camera, and the graphical biological information includes first biological information. The computer device starts the camera in response to the diagnosis guiding operation triggered on the dialogue interface, displays the first biological information acquired when the camera is at the first position on the local area, and displays marker controls corresponding to the first biological information around it. That is, the computer device acquires an AR image of the living body (i.e., the biological information) through the camera, and displays the marker controls corresponding to the symptom locations included in the AR image distributed around the AR image of the living body.
Optionally, the graphical biometric information further comprises second biometric information; the computer device displays first biological information acquired when the camera is at the first position on the local area, displays second biological information acquired when the camera is at the second position on the local area along with the movement of the camera from the first position to the second position after the marking control corresponding to the first biological information is displayed around the first biological information, and displays the marking control corresponding to the second biological information around the second biological information.
That is, as the position of the camera changes, the biometric information displayed on the local area of the computer device changes, and when the first biometric information changes to the second biometric information, a mark control corresponding to the symptom position included in the second biometric information is displayed on the local area, and the user can find the symptom position by adjusting the camera position.
Optionally, the computer device further displays a shooting control on the local area; after displaying the first biological information acquired when the camera is at the first position on the local area, the computer device, in response to a shooting operation triggered on the shooting control, photographs the first biological information to obtain a biological image; the biological image is uploaded to a background server as a reference image for medical staff to diagnose the disease. For example, the first biological information is photographed to obtain the biological image before symptom location selection; symptom location selection and symptom selection are then performed in sequence, and after the completion of symptom selection is determined, the computer device uploads the biological image together with the symptom information to the background server. For example, in the case where a binding relationship exists between the user account logged in to the application program and the user's medical account, a doctor can view the biological image uploaded through the user account during the user's medical consultation.
For example, the diagnosis guiding can be realized based on symptom information and symptom images, and for example, after the computer equipment displays the first biological information acquired when the camera is at the first position on the local area, the computer equipment responds to shooting operation triggered on the shooting control to shoot the first biological information to obtain a biological image (namely, the symptom image); displaying the biological image as a reference image of symptoms on the local area; responding to the symptom selection operation on the symptom selection control, and sending symptom information and a biological image corresponding to the symptom selection control to a background server; receiving a diagnosis guiding result fed back by a background server based on symptom information and a biological image; and displaying the diagnosis guiding result in a dialogue message area of the dialogue interface.
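The client-side flow above, sending the symptom information and the captured biological image to the background server and receiving the triage result, can be sketched as follows. The payload fields, the mock server, and its department table are all assumptions for illustration, not the patent's actual protocol:

```python
import json

def build_triage_request(location, symptom, image_bytes=None):
    """Bundle the selected symptom information with the optional biological image."""
    payload = {"symptom_location": location, "symptom": symptom}
    if image_bytes is not None:
        payload["image_size"] = len(image_bytes)  # the image travels with the request
    return json.dumps(payload)

def mock_server_triage(request_json):
    """Stand-in for the background server's diagnosis guiding feedback."""
    req = json.loads(request_json)
    department = {"chest": "Respiratory Medicine"}.get(
        req["symptom_location"], "General Medicine")
    return f"AI recommends that you register with the '{department}' department"

result = mock_server_triage(
    build_triage_request("chest", "shortness of breath"))
```

The returned string corresponds to the diagnosis guiding result that is then displayed as a message bubble in the dialogue message area.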
In summary, in the intelligent diagnosis guiding method provided in the embodiment, the symptom location display is designed according to the display surface of the living body, for example, the front and back surfaces of the living body, so that the user can clearly find the symptom location by comparing the graphical living body information of different surfaces.
In other embodiments, at least two type selection controls corresponding to at least two organism types are displayed on the localized area, the organism types being divided based on at least one attribute of age and gender of the organism; the computer device displays graphical organism information on the local area, and after the mark control of the symptom position is displayed around the graphical organism information, when the organism displayed in the graphical organism information is of the first organism type, the graphical organism information and the mark control corresponding to the second organism type are switched and displayed on the local area in response to the type selection operation on the type selection control of the second organism type.
Illustratively, at least two type selection controls corresponding to at least two human body types are displayed on a local area of the dialogue interface; the human body type is classified based on at least one attribute of the age and sex of the person, for example, the human body type may include boys, girls, young men, young women, middle-aged men, middle-aged women, elderly men, and elderly women. The computer device may display graphical human body information on the local area, and after displaying the mark control corresponding to the symptom position on the human body around the graphical human body information, may also switch and display the graphical human body information corresponding to the second human body type and the mark control on the local area in response to the type selection operation on the type selection control of the second human body type in case that the human body is the first human body type.
Illustratively, as in fig. 8, the symptom location selection page displayed on the local area includes type selection controls, including a male selection control 701 and a female selection control 702. In the left diagram of fig. 8, the male selection control 701 on the symptom location selection page is in the selected state, and correspondingly, a two-dimensional display diagram 703 of the front of the whole male human body is displayed on the symptom location selection page, with a plurality of marker controls 104 corresponding to the male body displayed around the diagram 703; then, in response to a type selection operation on the female selection control 702, a two-dimensional display diagram 704 showing the front of the whole female human body is switched onto the symptom location selection page, with a plurality of marker controls 705 corresponding to the female body displayed around the diagram 704.
For example, as in FIG. 7, a symptom selection page displayed on the local area may also include a type selection control, including a male selection control 605 and a female selection control 606. After the computer device displays the symptom selection page on the partial area, in the case where the front face of the human body part of the male is displayed in the two-dimensional display diagram 603 of the human body part, the two-dimensional display diagram of the front face of the human body part of the female is switched to be displayed on the symptom selection page in response to the type selection operation on the female selection control 606.
In the intelligent diagnosis guiding method provided by the embodiment, a type selection control of multiple types of organisms is also provided, so that a user can select the diagnosis guiding service meeting the needs of the user according to gender and age.
Fig. 9 shows a flowchart of an intelligent diagnosis guiding method according to another exemplary embodiment of the present application, taking application in the computer device shown in fig. 2 as an example, the method includes:
step 810, displaying a dialogue interface in the application program during online consultation.
The computer device responds to the function starting instruction of the intelligent diagnosis guiding in the application program, and a dialogue interface for online diagnosis guiding is displayed in the application program.
In response to the triggering operation on the guided diagnosis control, an inquiry dialogue of the guided diagnosis robot, which is an artificial intelligence program for providing guided diagnosis service, is displayed on the dialogue message area, step 820.
The computer device, in response to the triggering operation on the diagnosis guiding control, divides a local area and a dialogue message area on the dialogue interface, displays the symptom selection control based on the human body part on the local area, and displays the inquiry dialogue of the diagnosis guiding robot on the dialogue message area. The diagnosis guiding robot is an artificial intelligence program for the diagnosis guiding service; as shown in fig. 10, when the computer device receives a triggering operation on the diagnosis guiding control, the diagnosis guiding robot responds as a dialogue party and outputs the inquiry dialogue "Where do you feel uncomfortable?", which is displayed in the dialogue message area in the form of a message bubble.
In step 830, graphical biometric information is displayed on a localized area of the dialog interface in response to the triage operation triggered on the dialog interface.
After performing step 820, the computer device also displays the graphical biometric information on the localized area; alternatively, the computer device may display the graphical biometric information on the localized area while performing step 820.
In step 840, symptom selection controls corresponding to symptom locations are displayed on the local area in response to a location selection operation on the symptom locations on the graphical biometric information.
Please refer to the steps 320 and 330 in the above embodiments for detailed implementation of the steps 830 and 840, which are not described herein.
In step 850, in response to the symptom selection operation on the symptom selection control, symptom information corresponding to the symptom selection control is displayed as a dialog of the user on the dialog message area.
Symptom information includes the symptom and the symptom location. After determining that the user has completed selecting the symptom and symptom location, the computer device correspondingly displays them on the dialogue message area. For example, the symptom information includes a human body part and the symptoms of that part; after the computer device determines that the user has selected the human body part and its symptoms, it correspondingly displays the human body part and the symptoms on the dialogue message area; as shown in fig. 10, the computer device displays symptom information 901 in the form of a message bubble on the dialogue message area 902 in response to the symptom selection operation on the symptom selection control.
Step 860, the guided diagnosis result is displayed as a dialogue of the guided diagnosis robot in the dialogue message area.
For example, the diagnosis guiding result may be a result fed back by the diagnosis guiding robot for the symptom information; or the diagnosis guiding robot obtains the diagnosis guiding result from the background server. For example, before obtaining the diagnosis guiding result, the computer device, in response to the symptom selection operation on the symptom selection control, sends a diagnosis guiding request carrying the symptom information to the background server, and receives the diagnosis guiding result sent by the background server. Thereafter, the computer device displays, in the form of a message bubble, the diagnosis guiding result 903 fed back for the symptom information "chest, shortness of breath": "AI recommends that you register with the 'Respiratory Medicine' department", as shown in fig. 10. The diagnosis guiding result can also provide information such as the location of the department and a list of registered doctors.
In some embodiments, common symptom selection controls are displayed on the dialogue interface during online guided diagnosis, where a common symptom selection control is a selection control corresponding to a symptom whose guided diagnosis count exceeds a count threshold; alternatively, a common symptom selection control refers to a symptom selection control corresponding to a symptom common in daily life. For example, selection controls for symptoms such as cold, stomach ache, and headache can be used as common symptom selection controls.
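One way to derive the common symptom selection controls from the count threshold can be sketched as a simple filter. The counts and threshold below are illustrative assumptions:

```python
def common_symptoms(triage_counts, threshold):
    """Keep symptoms whose guided diagnosis count exceeds the count threshold."""
    return sorted(s for s, n in triage_counts.items() if n > threshold)

# Assumed historical guided-diagnosis counts per symptom
counts = {"diarrhea": 950, "vomiting": 700, "headache": 1200,
          "insomnia": 640, "rare symptom": 12}
controls = common_symptoms(counts, threshold=500)
```

The surviving symptoms would each be rendered as a common symptom selection control (e.g. a message bubble) on the dialogue interface.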
Illustratively, as shown in fig. 1, the computer device displays a dialogue interface 101 during online guided diagnosis in an application program, and common symptom selection controls, such as selection controls for diarrhea, vomiting, traumatic injury, headache, insomnia, and muscle soreness, are displayed on the dialogue interface 101 in the form of message bubbles. After the user clicks the triage control 102, the computer device, in response to the triggering operation on the triage control 102, pops up a popup window at the bottom (i.e., a local area) of the dialogue interface. It first displays a symptom location selection page 108 in the bottom popup window, with a front selection control 111 and a back selection control 112, as well as a male selection control 114 and a female selection control 115, displayed on the symptom location selection page 108; the user can switch between the front and back of the human body through the front selection control 111 and the back selection control 112, and between the male and female human bodies through the male selection control 114 and the female selection control 115. After the user selects a symptom location on the human body, the computer device, in response to the location selection operation on the marker control corresponding to that symptom location, displays the marker control in a selected state; for example, in fig. 1 the marker control corresponding to the chest is shown as white text on a black background, while the other marker controls are shown as black text on a white background. The user then clicks the "next" button control, and the computer device, in response to the clicking operation on that button control, displays the symptom selection page 109 in the bottom popup window, dividing the symptom selection page 109 into two display areas: an upper display area, which shows a two-dimensional display diagram of the human body part corresponding to the selected symptom location, and a lower display area, which shows the symptom selection controls corresponding to that symptom location for the user to choose from. When the user selects a symptom, the computer device, in response to the symptom selection operation on that symptom's selection control, displays the control in a selected state; for example, in fig. 10 the characters on the symptom selection control corresponding to "shortness of breath" are displayed in a larger font size than those of the other symptoms. When the user clicks the "confirm" control, the computer device, in response to the confirmation operation on the "confirm" control, completes the selection of the symptom location and symptom on the human body, displays the symptom location (including the human body position) and the symptom in the form of message bubbles in the dialogue message area of the dialogue interface, as shown in fig. 10, and then displays the diagnosis guiding result.
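The popup flow described above — location selection, then symptom selection, then confirmation — can be sketched as a tiny state machine. Page names, event names, and the message-bubble text format are illustrative assumptions, not taken from the patent's implementation.

```python
# Hedged sketch of the bottom-popup page flow:
# location_selection --next--> symptom_selection --confirm--> message bubble.

class TriagePopup:
    def __init__(self):
        self.page = "location_selection"
        self.location = None
        self.symptom = None

    def select_location(self, location):
        # Selecting a marker control on the body diagram records the location.
        assert self.page == "location_selection"
        self.location = location

    def next(self):
        # The "next" button switches to the symptom selection page
        # only after a location has been chosen.
        if self.page == "location_selection" and self.location:
            self.page = "symptom_selection"

    def select_symptom(self, symptom):
        assert self.page == "symptom_selection"
        self.symptom = symptom

    def confirm(self):
        # Confirmation closes the popup and yields the message-bubble text.
        if self.page == "symptom_selection" and self.symptom:
            return f"{self.location}, {self.symptom}"

popup = TriagePopup()
popup.select_location("chest")
popup.next()
popup.select_symptom("shortness of breath")
print(popup.confirm())  # prints: chest, shortness of breath
```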
In summary, in the intelligent diagnosis guiding method provided in this embodiment, the diagnosis guiding function is implemented in the form of a dialogue between the user and the diagnosis guiding robot, which further improves the realism of the dialogue scene and enhances the user experience.
Fig. 11 illustrates a block diagram of an intelligent diagnosis guiding apparatus provided in an exemplary embodiment of the present application; the apparatus may be implemented as part or all of a computer device by software, hardware, or a combination of both. The apparatus comprises:
a display module 1010, configured to display a dialogue interface during online diagnosis in an application program;
a display module 1010, configured to display graphical biometric information on a local area of the dialog interface in response to a diagnosis guiding operation triggered on the dialog interface;
a display module 1010 for displaying a symptom selection control corresponding to a symptom location on a local area in response to a location selection operation on the symptom location on the graphical biometric information;
and a display module 1010, configured to display the diagnosis guiding result in the dialogue message area of the dialogue interface in response to the symptom selection operation on the symptom selection control.
In some embodiments, the display module 1010 is configured to:
responding to the diagnosis guiding operation triggered on the dialogue interface, displaying graphical biological information on a local area, and displaying a mark control of symptom positions around the graphical biological information;
And displaying a symptom selection control corresponding to the symptom position on the local area in response to the position selection operation on the marker control.
In some embodiments, the display module 1010 is configured to:
displaying a popup window on the local area, displaying a symptom position selection page in the popup window, and displaying graphical biological information and a mark control on the symptom position selection page;
responding to position selection operation on the mark control, and switching a symptom position selection page in the popup window into a symptom selection page;
a symptom selection control is displayed on the symptom selection page.
In some embodiments, the symptom selection page displays local information of the graphical biological information, wherein the local information comprises symptom positions; other symptom locations besides symptom locations are also displayed around the local information;
and the display module 1010 is configured to switch the symptom selection control on the symptom selection page to another symptom selection control corresponding to another symptom position in response to another position selection operation on another symptom position after the symptom selection control is displayed on the symptom selection page.
In some embodiments, a return control of the symptomatic position selection page is displayed in the popup;
And the display module 1010 is configured to, after the symptom location selection page in the popup is switched to the symptom selection page, switch the symptom selection page in the popup back to the symptom location selection page in response to a return operation on the return control.
In some embodiments, the display module 1010 is configured to:
acquiring biological information records from wearable equipment, wherein the wearable equipment and an application program have a binding relationship;
in the case where the biometric information record is information describing the symptom location, a mark control corresponding to the symptom location is highlighted around the graphical biometric information.
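The wearable-device embodiment above can be sketched as a filter that pre-highlights marker controls matching the bound device's records. The record format and field names here are assumptions; the patent does not define them.

```python
# Illustrative sketch: if a biometric record from a bound wearable device
# describes a symptom location, that location's marker control is highlighted
# around the graphical biometric information.

def markers_to_highlight(biometric_records, known_locations):
    # Highlight each marker whose location appears in a device record.
    return {
        rec["location"]
        for rec in biometric_records
        if rec.get("location") in known_locations
    }

records = [
    {"location": "chest", "note": "elevated heart rate episodes"},
    {"metric": "steps", "value": 4200},  # not location-specific; ignored
]
print(sorted(markers_to_highlight(records, {"chest", "head", "abdomen"})))
# prints: ['chest']
```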
In some embodiments, the graphical biometric information includes a two-dimensional representation of the living being, with a front selection control and a back selection control of the living being displayed on the localized area;
a display module 1010, configured to display graphical biological information on a local area, and, after displaying a marker control for a symptom location around the graphical biological information, in a case where the two-dimensional display shows the front side of the living body, switch and display the back side display of the living body and a back side marker control corresponding to the back side display on the local area in response to a first selection operation on the back side selection control; in a case where the two-dimensional display shows the back surface of the living body, in response to the second selection operation on the front surface selection control, the front surface display of the living body and the front surface marker control corresponding to the front surface display are switched and displayed on the partial area.
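The front/back switching behavior above amounts to toggling both the displayed figure and its marker set. The marker lists below are invented example data; only the switching logic reflects the described embodiment.

```python
# Minimal sketch of the two-dimensional front/back toggle: selecting the
# opposite-side control swaps the displayed figure and the marker controls
# shown with it.

MARKERS = {
    "front": ["head", "chest", "abdomen"],
    "back": ["nape", "back", "waist"],
}

class BodyView:
    def __init__(self):
        self.side = "front"

    def select_side(self, side):
        # Switch only when the requested side differs from the one shown.
        if side in MARKERS and side != self.side:
            self.side = side
        return MARKERS[self.side]

view = BodyView()
print(view.select_side("back"))   # prints: ['nape', 'back', 'waist']
print(view.select_side("back"))   # same side again; display unchanged
```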
In some embodiments, the graphical biometric information includes a three-dimensional display model of the living being, the local area having displayed thereon an orientation adjustment control for adjusting a display orientation of the three-dimensional display model on the local area;
a display module 1010 for:
displaying the three-dimensional display model in a first orientation on the local area, and displaying a first marker control around the three-dimensional display model at a first symptom location observable on the three-dimensional display model when in the first orientation;
after the graphical biological information is displayed on the local area and the marker control of the symptom location is displayed around it, in response to an adjustment operation of the display orientation on the orientation adjustment control, displaying the three-dimensional display model in the adjusted second orientation on the local area, and displaying, around the three-dimensional display model, a second marker control for a second symptom location observable on the three-dimensional display model when in the second orientation; wherein the first orientation and the second orientation are different display orientations of the three-dimensional display model.
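The orientation-dependent visibility of markers on the three-dimensional model can be sketched as an angular test. The per-marker angles and the 90-degree visibility cone below are assumptions made for the sketch; the patent does not specify how observability is computed.

```python
# Hedged sketch of the 3D-model embodiment: each marker has an assumed surface
# orientation (degrees), and only markers within an angular tolerance of the
# current viewing direction are displayed.

MARKER_ANGLES = {"chest": 0, "abdomen": 10, "back": 180, "waist": 170}

def visible_markers(view_angle_deg, tolerance_deg=90):
    # A marker is observable when its orientation lies within the tolerance
    # of the current viewing direction (shortest angular distance).
    def angular_diff(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)
    return sorted(
        m for m, a in MARKER_ANGLES.items()
        if angular_diff(a, view_angle_deg) < tolerance_deg
    )

print(visible_markers(0))    # first orientation:  ['abdomen', 'chest']
print(visible_markers(180))  # second orientation: ['back', 'waist']
```

Rotating the model from the first orientation to the second thus swaps the first marker controls for the second marker controls, matching the behavior described above.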
In some embodiments, the device is fitted with a camera; the graphical biometric information includes first biometric information; a display module 1010 for:
Responding to the diagnosis guiding operation triggered on the dialogue interface, and starting the camera;
and displaying the first biological information acquired when the camera is at the first position on the local area, and displaying a mark control corresponding to the first biological information around the first biological information.
In some embodiments, the graphical biometric information includes second biometric information;
the display module 1010 is configured to display, on the local area, graphical biometric information acquired when the camera is at the first position, and display, after the marker control is displayed around the graphical biometric information, second biometric information acquired when the camera is at the second position along with the movement of the camera from the first position to the second position on the local area, and display, around the second biometric information, the marker control corresponding to the second biometric information.
In some embodiments, a capture control is displayed on the local area;
the display module 1010 is configured to, after displaying first biological information acquired when the camera is at the first position on the local area, perform shooting on the first biological information in response to a shooting operation triggered on the shooting control, so as to obtain a biological image;
The biological image is uploaded to a background server, and the biological image is used as a reference image for medical staff to diagnose the disease.
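The capture-and-upload step above can be sketched as encoding a camera frame and handing it to an upload layer. The payload format, field names, and the stubbed upload function are all hypothetical; the patent does not specify the transport.

```python
# Sketch of capturing a biological image and uploading it to a background
# server as a reference image for medical staff. The upload layer is stubbed.

import base64

def capture_and_upload(frame_bytes, upload_fn):
    # Encode the captured frame and pass it to the (stubbed) upload layer.
    payload = {
        "kind": "reference_image",
        "image_b64": base64.b64encode(frame_bytes).decode("ascii"),
    }
    return upload_fn(payload)

uploaded = []
def fake_upload(payload):
    uploaded.append(payload)
    return "ok"

print(capture_and_upload(b"\x89PNG...", fake_upload))  # prints: ok
print(uploaded[0]["kind"])                             # prints: reference_image
```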
In some embodiments, at least two type selection controls corresponding to at least two organism types are displayed on the local area, the organism types being partitioned based on at least one attribute of age and gender of the organism;
the display module 1010 is configured to display graphical biometric information on a local area, and after a marker control for a symptom location is displayed around the graphical biometric information, in response to a type selection operation on a type selection control for a second biometric type, switch and display the graphical biometric information and the marker control corresponding to the second biometric type on the local area when the biometric displayed in the graphical biometric information is the first biometric type.
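The type-selection behavior above partitions figures by age and gender attributes and swaps the display when a different type is chosen. The type table below is invented example data.

```python
# Illustrative sketch of the organism-type selection controls: types are
# keyed by (gender, age group), and choosing a different type switches the
# displayed figure (and, in the full embodiment, its marker controls).

BODY_TYPES = {
    ("male", "adult"): "adult male figure",
    ("female", "adult"): "adult female figure",
    ("male", "child"): "child male figure",
    ("female", "child"): "child female figure",
}

class FigureDisplay:
    def __init__(self, gender="male", age_group="adult"):
        self.type_key = (gender, age_group)

    def select_type(self, gender, age_group):
        # Switch only when the selected type differs from the current one.
        key = (gender, age_group)
        if key in BODY_TYPES and key != self.type_key:
            self.type_key = key
        return BODY_TYPES[self.type_key]

display = FigureDisplay()
print(display.select_type("female", "adult"))  # prints: adult female figure
```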
In some embodiments,
a display module 1010, configured to display an inquiry dialogue of a guided diagnosis robot on a dialogue message area in response to a guided diagnosis operation triggered on a dialogue interface, where the guided diagnosis robot is an artificial intelligence program for providing a guided diagnosis service;
a display module 1010, configured to respond to a symptom selection operation on the symptom selection control, and display symptom information corresponding to the symptom selection control as a dialogue of the user on the dialogue message area; the diagnosis guiding result is displayed in a dialogue message area as a dialogue of the diagnosis guiding robot.
In some embodiments, the display module 1010 is configured to display symptom information in the form of message bubbles on the conversation message area.
In some embodiments, the apparatus further comprises: a transmit module 1020 and a receive module 1030;
a sending module 1020, configured to send, in response to a symptom selection operation on the symptom selection control, a diagnosis guidance request carrying symptom information to a background server before displaying the diagnosis guidance result as a dialogue of the diagnosis guidance robot in the dialogue message area;
and the receiving module 1030 is configured to receive the diagnosis guiding result sent by the background server.
In summary, in the intelligent diagnosis guiding device provided in this embodiment, after the application program enters the dialogue interface for online guided diagnosis, a diagnosis guiding operation triggered on the dialogue interface causes the application program to display graphical biological information on a local area of the dialogue interface. A location selection operation for a symptom location is then triggered on the graphical biological information, and in response, a symptom selection control corresponding to that symptom location is displayed on the local area, so that the user can select a symptom for a certain location on the living body within the dialogue scene. The application program then gives a diagnosis guiding result for the selected symptom and displays it in the dialogue message area of the dialogue interface. In this way, the user remains in the dialogue scene throughout the process of seeking the diagnosis guiding service, multiple interface jumps during symptom selection are avoided, and the disjointed experience caused by such jumps is likewise avoided.
Fig. 12 is a schematic diagram of a computer device according to an exemplary embodiment of the present application. The computer device may be an electronic device that performs the intelligent diagnosis guiding method provided herein. Specifically:
The computer device 1100 includes a central processing unit (CPU) 1101, a system memory 1104 including a random access memory (RAM) 1102 and a read-only memory (ROM) 1103, and a system bus 1105 connecting the system memory 1104 and the central processing unit 1101. The computer device 1100 also includes a basic input/output (I/O) system 1106, which helps to transfer information between the various devices within the computer, and a mass storage device 1107 for storing an operating system 1113, application programs 1114, and other program modules 1115.
The basic input/output system 1106 includes a display 1108 for displaying information and an input device 1109, such as a mouse or keyboard, for the user to input information. Both the display 1108 and the input device 1109 are coupled to the central processing unit 1101 through an input/output controller 1110 coupled to the system bus 1105. The basic input/output system 1106 may also include the input/output controller 1110 for receiving and processing input from a number of other devices, such as a keyboard, mouse, or electronic stylus. Similarly, the input/output controller 1110 also provides output to a display screen, a printer, or another type of output device.
The mass storage device 1107 is connected to the central processing unit 1101 through a mass storage controller (not shown) connected to the system bus 1105. The mass storage device 1107 and its associated computer-readable media provide non-volatile storage for the computer device 1100. That is, the mass storage device 1107 may include a computer-readable medium (not shown) such as a hard disk or a compact disc read-only memory (CD-ROM) drive.
Computer-readable media may include computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes RAM, ROM, erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other solid-state memory technology, CD-ROM, digital versatile discs (DVD) or solid state drives (SSD), other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. The random access memory may include resistive random access memory (ReRAM) and dynamic random access memory (DRAM). Of course, those skilled in the art will recognize that computer storage media are not limited to the above. The system memory 1104 and the mass storage device 1107 described above may be collectively referred to as memory.
According to various embodiments of the present application, the computer device 1100 may also operate through a remote computer connected via a network, such as the Internet. That is, the computer device 1100 may connect to the network 1112 through a network interface unit 1111 connected to the system bus 1105, or the network interface unit 1111 may be used to connect to other types of networks or remote computer systems (not shown).
The memory also includes one or more programs, which are stored in the memory and configured to be executed by the CPU.
In an alternative embodiment, a computer device is provided that includes a processor and a memory having at least one instruction, at least one program, code set, or instruction set stored therein, the at least one instruction, at least one program, code set, or instruction set being loaded and executed by the processor to implement the intelligent diagnostic method as described above.
In an alternative embodiment, a computer readable storage medium having at least one instruction, at least one program, code set, or instruction set stored therein is provided, the at least one instruction, at least one program, code set, or instruction set being loaded and executed by a processor to implement the intelligent guided method as described above.
Alternatively, the computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a solid state drive (SSD), an optical disc, or the like. The random access memory may include resistive random access memory (ReRAM) and dynamic random access memory (DRAM). The foregoing embodiment numbers of the present application are merely for description and do not represent the relative merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The application also provides a computer readable storage medium, in which at least one instruction, at least one program, a code set, or an instruction set is stored, where the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the intelligent diagnosis guiding method provided by the above method embodiments.
The present application also provides a computer program product comprising computer instructions stored on a computer readable storage medium. A processor of a computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the intelligent guided method as described above.
It should be understood that references herein to "a plurality" are to two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
The foregoing description covers only preferred embodiments of the present application and is not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall fall within its protection scope.

Claims (20)

1. An intelligent diagnosis guiding method, which is characterized by comprising the following steps:
displaying a dialogue interface in an application program when conducting online diagnosis;
responding to the diagnosis guiding operation triggered on the dialogue interface, and displaying graphical biological information on a local area of the dialogue interface;
responsive to a position selection operation on the graphical biometric information for a symptom position, displaying a symptom selection control corresponding to the symptom position on the local area;
and responding to the symptom selection operation on the symptom selection control, and displaying a diagnosis guiding result in a dialogue message area of the dialogue interface.
2. The method of claim 1, wherein the displaying graphical biometric information on a localized area of the dialog interface in response to the triage operation triggered on the dialog interface comprises:
responding to the guided diagnosis operation triggered on the dialogue interface, displaying the graphical biological information on the local area, and displaying a mark control of the symptom position around the graphical biological information;
The method for displaying the symptom selection control corresponding to the symptom position on the local area in response to the position selection operation of the symptom position on the graphical biological information comprises the following steps:
and responding to the position selection operation on the marking control, and displaying the symptom selection control corresponding to the symptom position on the local area.
3. The method of claim 2, wherein the displaying the graphical biometric information on the local area and displaying the marker control for the symptom location around the graphical biometric information comprises:
displaying a popup window on the local area, displaying a symptom position selection page in the popup window, wherein the graphical biological information and the mark control are displayed on the symptom position selection page;
the step of displaying the symptom selection control corresponding to the symptom position on the local area in response to the position selection operation on the marking control comprises the following steps:
responding to position selection operation on the mark control, and switching the symptom position selection page in the popup window into a symptom selection page;
and displaying the symptom selection control on the symptom selection page.
4. The method according to claim 3, wherein the symptom selection page has local information of the graphical biometric information displayed thereon, the local information including the symptom location; other symptom locations besides the symptom location are also displayed around the local information;
after the symptom selection control is displayed on the symptom selection page, the method further comprises:
and responding to other position selection operations on the other symptom positions, and switching the symptom selection control on the symptom selection page into other symptom selection controls corresponding to the other symptom positions.
5. The method of claim 3, wherein the popup window has a return control of the symptom location selection page displayed therein;
after the symptom location selection page in the popup window is switched to a symptom selection page, the method includes:
and switching the symptom selection page in the popup back to the symptom location selection page in response to a return operation on the return control.
6. The method of claim 2, wherein the marking control that displays the symptom location around the graphical biometric information comprises:
Acquiring a biological information record from a wearable device, wherein the wearable device and the application program have a binding relationship;
in the case where the biometric information record is information describing the symptom location, the mark control corresponding to the symptom location is highlighted around the graphical biometric information.
7. The method of any of claims 2 to 6, wherein the graphical biometric information comprises a two-dimensional representation of a living being, the localized area having a front selection control and a back selection control of the living being displayed thereon;
the method for displaying the graphical biometric information on the local area, and displaying the marker control of the symptom location around the graphical biometric information, includes:
when the two-dimensional display diagram shows the front surface of the living body, responding to a first selection operation on the back surface selection control, and switching and displaying the back surface display diagram of the living body and a back surface mark control corresponding to the back surface display diagram on the local area;
and under the condition that the two-dimensional display diagram displays the back surface of the living body, responding to a second selection operation on the front surface selection control, and switching and displaying the front surface display diagram of the living body and the front surface mark control corresponding to the front surface display diagram on the local area.
8. The method of any of claims 2 to 6, wherein the graphical biometric information comprises a three-dimensional display model of the living being, the local area having displayed thereon an orientation adjustment control for adjusting a display orientation of the three-dimensional display model on the local area;
the marking control for displaying the graphical biometric information on the local area and displaying the symptom position around the graphical biometric information includes:
displaying the three-dimensional display model in a first orientation on the local area, and displaying a first marker control around the three-dimensional display model at a first symptom location observable on the three-dimensional display model when in the first orientation;
the method for displaying the graphical biometric information on the local area, and displaying the marker control of the symptom location around the graphical biometric information, includes:
responding to the adjustment operation of the display orientation on the orientation adjustment control, displaying the three-dimensional display model in a second orientation after adjustment on the local area, and displaying a second mark control of a second symptom position observable on the three-dimensional display model when the three-dimensional display model is in the second orientation around the three-dimensional display model;
Wherein the first orientation and the second orientation are different display orientations of the three-dimensional display model.
9. The method according to any one of claims 2 to 6, wherein the method is applied in a camera-mounted computer device; the graphical biometric information includes first biometric information;
the method for displaying the graphical biometric information on the local area and the marking control of the symptom position around the graphical biometric information in response to the guided diagnosis operation triggered on the dialogue interface comprises the following steps:
responding to the diagnosis guiding operation triggered on the dialogue interface, and starting the camera;
and displaying the first biological information acquired when the camera is at the first position on the local area, and displaying a mark control corresponding to the first biological information around the first biological information.
10. The method of claim 9, wherein the graphical biometric information comprises second biometric information;
the displaying the first biological information acquired when the camera is at the first position on the local area, and after displaying the mark control corresponding to the first biological information around the first biological information, includes:
And displaying the second biological information acquired when the camera is positioned at the second position on the local area along with the movement of the camera from the first position to the second position, and displaying a mark control corresponding to the second biological information around the second biological information.
11. The method of claim 9, wherein the local area has a capture control displayed thereon;
after the first biological information acquired when the camera is at the first position is displayed on the local area, the method comprises the following steps:
shooting the first biological information to obtain a biological image in response to shooting operation triggered on the shooting control;
the biological image is uploaded to a background server, and the biological image is used as a reference image for medical staff to diagnose the disease.
12. The method of any of claims 2 to 6, wherein at least two type selection controls corresponding to at least two organism types are displayed on the local area, the organism types being divided based on at least one attribute of age and gender of the organism;
The method for displaying the graphical biometric information on the local area, and displaying the marker control of the symptom location around the graphical biometric information, includes:
and when the living body displayed in the graphical living body information is of a first living body type, responding to a type selection operation on a type selection control of a second living body type, and switching and displaying the graphical living body information and a mark control corresponding to the second living body type on the local area.
13. The method according to any one of claims 1 to 6, further comprising:
responding to the diagnosis guiding operation triggered on the dialogue interface, and displaying an inquiry dialogue of a diagnosis guiding robot on the dialogue message area, wherein the diagnosis guiding robot refers to an artificial intelligent program for providing diagnosis guiding service;
the responding to the symptom selection operation on the symptom selection control displays the diagnosis guiding result in a dialogue message area of the dialogue interface, and the method comprises the following steps:
responding to the symptom selection operation on the symptom selection control, and displaying symptom information corresponding to the symptom selection control as a dialogue of a user on the dialogue message area;
And displaying the diagnosis guiding result as a dialogue of the diagnosis guiding robot in the dialogue message area.
14. The method of claim 13, wherein displaying the symptom information corresponding to the symptom selection control as a dialog of a user on the dialog message area comprises:
the symptom information is displayed on the dialogue message area in the form of message bubbles.
15. The method of claim 13, wherein before the displaying the diagnosis guiding result as a dialogue of the diagnosis guiding robot in the dialogue message area, the method comprises:
responding to the symptom selection operation on the symptom selection control, and sending a diagnosis guiding request carrying the symptom information to a background server;
and receiving the diagnosis guiding result sent by the background server.
16. An intelligent diagnostic device, the device comprising:
a display module, configured to display a dialogue interface during an online diagnosis guiding process in an application program;
the display module being further configured to display, in response to a diagnosis guiding operation triggered on the dialogue interface, graphical biological information in a local area of the dialogue interface;
the display module being further configured to display, in response to a position selection operation of a symptom position on the graphical biological information, a symptom selection control corresponding to the symptom position in the local area; and
the display module being further configured to display, in response to a symptom selection operation on the symptom selection control, a diagnosis guiding result in a dialogue message area of the dialogue interface.
17. The device of claim 16, wherein the display module is configured to:
display, in response to the diagnosis guiding operation triggered on the dialogue interface, the graphical biological information in the local area, and display a mark control of the symptom position around the graphical biological information; and
display, in response to the position selection operation on the mark control, the symptom selection control corresponding to the symptom position in the local area.
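The mark-control interaction of claim 17 — tapping a mark around the figure reveals the symptom selection controls for that position — reduces to a lookup from position to controls. The table below is illustrative data assumed for the sketch, not content from the patent:

```python
# Hypothetical sketch; positions and symptoms are illustrative examples.
SYMPTOMS_BY_POSITION = {
    "head": ["headache", "dizziness"],
    "chest": ["chest pain", "palpitations"],
}

def on_mark_selected(position: str) -> list:
    """Return the symptom selection controls to show in the local area."""
    return SYMPTOMS_BY_POSITION.get(position, [])
```

A position with no registered symptoms simply yields no controls, so the local area falls back to showing only the figure and its marks.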
18. A computer device, comprising a processor and a memory, wherein the memory stores at least one program, and the at least one program is loaded and executed by the processor to implement the intelligent diagnosis guiding method of any one of claims 1 to 15.
19. A computer-readable storage medium, wherein the storage medium stores at least one program, and the at least one program is loaded and executed by a processor to implement the intelligent diagnosis guiding method of any one of claims 1 to 15.
20. A computer program product, comprising computer instructions stored in a computer-readable storage medium, wherein a processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the intelligent diagnosis guiding method of any one of claims 1 to 15.
CN202111658357.5A 2021-12-30 2021-12-30 Intelligent diagnosis guiding method, device, equipment, medium and program product Pending CN116417120A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111658357.5A CN116417120A (en) 2021-12-30 2021-12-30 Intelligent diagnosis guiding method, device, equipment, medium and program product


Publications (1)

Publication Number Publication Date
CN116417120A 2023-07-11

Family

ID=87048257

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111658357.5A Pending CN116417120A (en) 2021-12-30 2021-12-30 Intelligent diagnosis guiding method, device, equipment, medium and program product

Country Status (1)

Country Link
CN (1) CN116417120A (en)

Similar Documents

Publication Publication Date Title
Di Nuovo et al. Deep learning systems for estimating visual attention in robot-assisted therapy of children with autism and intellectual disability
JP2022527946A (en) Personalized digital treatment methods and devices
Vinciarelli et al. Open challenges in modelling, analysis and synthesis of human behaviour in human–human and human–machine interactions
Newman et al. HARMONIC: A multimodal dataset of assistive human–robot collaboration
CN108899064A (en) Electronic health record generation method, device, computer equipment and storage medium
US20210057106A1 (en) System and Method for Digital Therapeutics Implementing a Digital Deep Layer Patient Profile
US20170161450A1 (en) Real-time veterinarian communication linkage for animal assessment and diagnosis
WO2020112147A1 (en) Method of an interactive health status assessment and system thereof
US20160234461A1 (en) Terminal, system, display method, and recording medium storing a display program
US20220254514A1 (en) Medical Intelligence System and Method
WO2023160157A1 (en) Three-dimensional medical image recognition method and apparatus, and device, storage medium and product
CN116959733A (en) Medical data analysis method, device, equipment and storage medium
Monteiro et al. An overview of medical Internet of Things, artificial intelligence, and cloud computing employed in health care from a modern panorama
KR102595904B1 (en) Method and system for recommending medical consultant based medical consultation contents
US20190108897A1 (en) Apparatus and method for assisting medical consultation
Sharma Personalized Telemedicine Utilizing Artificial Intelligence, Robotics, and Internet of Medical Things (IOMT)
Palagin et al. Hospital Information Smart-System for Hybrid E-Rehabilitation.
US20160252958A1 (en) Terminal, system, communication method, and recording medium storing a communication program
Sonntag Medical and health systems
CN112740336A (en) Method and electronic device for Artificial Intelligence (AI) -based assisted health sensing in an Internet of things network
CN116417120A (en) Intelligent diagnosis guiding method, device, equipment, medium and program product
WO2023015287A1 (en) Systems and methods for automated medical data capture and caregiver guidance
Spinsante et al. Multimodal interaction in a elderly-friendly smart home: a case study
US20230238151A1 (en) Determining a medical professional having experience relevant to a medical procedure
Crisóstomo et al. Robotics services at home support

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40089834
Country of ref document: HK