CN111128321A - Information display method and system, device, electronic equipment and readable medium - Google Patents


Info

Publication number
CN111128321A
CN111128321A
Authority
CN
China
Prior art keywords
dental
information
tooth
model
target client
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911139085.0A
Other languages
Chinese (zh)
Inventor
白先兵
Current Assignee
Taikang Insurance Group Co Ltd
Original Assignee
Taikang Insurance Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Taikang Insurance Group Co Ltd
Priority to CN201911139085.0A
Publication of CN111128321A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60 - ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G16H30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining for computer-aided diagnosis, e.g. based on medical expert systems

Abstract

The present disclosure provides an information display method, an information display apparatus, an electronic device, and a computer-readable medium, relating to the technical field of the Internet of Things. The method includes: acquiring a dental image and sensor data corresponding to a target client identifier; generating a 3D dental model corresponding to the target client identifier from the dental image; obtaining dental information corresponding to the target client identifier from the 3D dental model and the sensor data; and displaying the 3D dental model according to the dental information. A doctor can use this information display method when diagnosing a user remotely, which improves diagnostic efficiency and the user's medical experience.

Description

Information display method and system, device, electronic equipment and readable medium
Technical Field
The present disclosure relates to the field of internet of things technology, and in particular, to an information display method and apparatus, an electronic device, and a computer-readable medium.
Background
Dental diseases are very common. At present, patients mainly visit a clinic, where a doctor gives a dental examination result by visual inspection; however, this mode of examination involves long appointment waits, so dental problems cannot be examined promptly, which causes considerable inconvenience to users. Alternatively, a doctor can roughly judge a dental problem from the user's verbal description over a remote system and offer voice guidance, but an examination result based only on the user's description carries a large error and cannot serve as the basis for a subsequent treatment plan.
Therefore, a method that remotely provides a doctor with the user's dental information and physical sign information in an intuitive and effective way can markedly improve both the doctor's diagnostic efficiency and the user's medical experience.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
In view of this, the present disclosure provides an information display method, apparatus, electronic device, and computer-readable medium that can remotely provide a doctor with the dental information and physical sign information of a target object in an intuitive and effective way, thereby improving the doctor's diagnostic efficiency and the target object's medical experience.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the embodiments of the present disclosure, there is provided an information display method including: acquiring a dental image and sensor data corresponding to a target client identifier; generating a 3D dental model corresponding to the target client identifier from the dental image; obtaining dental information corresponding to the target client identifier from the 3D dental model and the sensor data; and displaying the 3D dental model according to the dental information.
In an exemplary embodiment, the method further comprises: acquiring medical history information corresponding to the target client identification; the medical history information is used to obtain the dental information in combination with the 3D dental model and the sensor data.
In an exemplary embodiment, the medical history information includes any one or more of past medical history, family history, and present medical history of the target subject.
In an exemplary embodiment, obtaining dental information corresponding to the target client identifier according to the 3D dental model and the sensor data includes: processing the 3D dental model, the sensor data, and the medical history information through a neural network model to obtain the dental information corresponding to the target client identifier.
In an exemplary embodiment, the dental information includes detection information for a designated region of the 3D dental model, and displaying the 3D dental model according to the dental information includes: displaying the designated region of the 3D dental model in a hovering, magnified view.
In some embodiments, the dental information includes a temperature threshold and the sensor data includes a current temperature; displaying the 3D dental model according to the dental information includes: when the current temperature exceeds the temperature threshold, showing an over-temperature warning prompt in the background of the 3D model.
In some embodiments, displaying the 3D dental model according to the dental information includes: displaying an attention cue marker in the background of the 3D dental model if the medical history information includes a history of a specified disease; clicking the attention cue marker displays the risk prompt content.
According to a second aspect of the embodiments of the present disclosure, there is provided an information display apparatus including: the data acquisition module is configured to acquire a tooth image and sensor data corresponding to the target client identifier; a 3D modeling module configured to generate a 3D dental model corresponding to the target client identifier from the dental image; a preprocessing module configured to obtain dental information corresponding to the target client identifier according to the 3D dental model and the sensor data; a display module configured to display the 3D dental model according to the dental information.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the information display method of any one of the above.
According to a fourth aspect of the embodiments of the present disclosure, a computer-readable medium is provided, on which a computer program is stored, wherein the program, when executed by a processor, implements the information display method according to any one of the above.
According to a fifth aspect of the embodiments of the present disclosure, there is provided an information display system including an image acquisition device, a sensor device, a server, and a display device. The image acquisition device is used for acquiring a dental image corresponding to a target client identifier and sending it to the server; the sensor device is used for acquiring sensor data corresponding to the target client identifier and sending it to the server; the server is used for generating a 3D dental model corresponding to the target client identifier from the dental image and obtaining dental information corresponding to the target client identifier from the 3D dental model and the sensor data; the display device is used for displaying the 3D dental model according to the dental information.
In an exemplary embodiment, the information display system further includes a first terminal. The server receives the examination result and reference suggestion and transmits them and/or the 3D dental model to the first terminal, so that the first target object can use them for medical judgment, or the second target object can give a dental detection result and a treatment suggestion.
In an exemplary embodiment, the server is further configured to obtain, according to the identification information of the target client, medical history information corresponding to the identification of the target client, where the medical history information is used to obtain the dental information by combining the 3D dental model and the sensor data.
According to the information display method provided by the exemplary embodiments of the present disclosure, a 3D dental model corresponding to a target client identifier is constructed from the dental image corresponding to that identifier, dental information corresponding to the identifier is obtained from the 3D dental model and the sensor data, and the 3D dental model is then displayed visually according to the dental information, so that a target object (for example, a doctor) can diagnose dental diseases for the teeth corresponding to the target client identifier from the dental information and the 3D dental model. Because the 3D dental model is displayed according to the dental information, a user or doctor can intuitively and effectively obtain the dental information and sensor information (for example, body sign data) corresponding to the target client identifier (that is, the target user), which effectively improves the doctor's diagnostic efficiency and the user's medical experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. The drawings described below are merely some embodiments of the present disclosure, and other drawings may be derived from those drawings by those of ordinary skill in the art without inventive effort.
Fig. 1 shows a schematic diagram of an exemplary system architecture of an information display method or an information display apparatus to which an embodiment of the present disclosure can be applied.
Fig. 2 is a flow chart illustrating a method of displaying information according to an example embodiment.
FIG. 3 is a schematic diagram illustrating an information display according to an example embodiment.
Fig. 4 is a schematic diagram illustrating an information display according to another exemplary embodiment.
Fig. 4A is a schematic diagram illustrating an information display according to yet another exemplary embodiment.
Fig. 4B is a schematic diagram illustrating an information display according to yet another exemplary embodiment.
Fig. 5 is a flowchart illustrating another information display method according to an embodiment of the present disclosure.
Fig. 6 is a flowchart of step S203 in fig. 2 in an exemplary embodiment.
Fig. 7 is an illustration of an information display method in accordance with an embodiment of the disclosure.
FIG. 8 is a schematic diagram illustrating an information display system in accordance with an exemplary embodiment.
Fig. 9 is a block diagram illustrating an information display apparatus according to an exemplary embodiment.
Fig. 10 is a schematic diagram illustrating a configuration of a computer system applied to an information display device according to an exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the disclosure.
The drawings are merely schematic illustrations of the present disclosure, in which the same reference numerals denote the same or similar parts, and thus, a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and steps, nor do they necessarily have to be performed in the order described. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
In this specification, the terms "a", "an", "the", "said" and "at least one" are used to indicate the presence of one or more elements/components/etc.; the terms "comprising," "including," and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first," "second," and "third," etc. are used merely as labels, and are not limiting on the number of their objects.
The following detailed description of exemplary embodiments of the disclosure refers to the accompanying drawings.
Fig. 1 is a schematic diagram showing an exemplary system architecture to which an information display method or an information display apparatus according to an embodiment of the present disclosure can be applied.
As shown in fig. 1, the system architecture 100 may include a smart device 101, an image capture device 102, a server 103, a network 104, a second terminal 105, and a first terminal 106. The network 104 is used to provide a medium of communication links between the smart device 101, the image capturing device 102, the second terminal 105, the first terminal 106 and the server 103. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The image capturing device 102, which may be, without limitation, a wireless pinhole camera, a miniature camera, or the like, may be configured to capture dental images of the target object and send them to the server 103.
The smart device 101 may be configured to collect the sign data of the target object, and send the sign data of the target object to the server 103, where the smart device 101 may be, for example, a wearable smart device such as a smart band, a smart running shoe, or a smart helmet, which can acquire the body sign data of the target object.
The user may use the terminal devices 105, 106 to interact with the server 103 via the network 104 to receive or send messages or the like. Among other things, the terminal devices 105, 106 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 103 may be a server that provides various services, for example a background management server that supports the smart device 101, the image capture device 102, and the terminal devices 105 and 106 operated by users. The background management server can analyze and process received data, such as requests, and feed the processing results back to the terminal.
Server 103 may, for example, obtain dental images, sensor data corresponding to the target client identification; server 103 may generate a 3D dental model corresponding to the target client identification, e.g., from the dental image; server 103 may obtain dental information corresponding to the target client identification, e.g., from the 3D dental model and the sensor data; the server 103 may display the 3D dental model, for example, according to the dental information.
It should be understood that the number of image capturing devices, intelligent devices, terminal devices, networks, and servers in fig. 1 is only illustrative, and the server 103 may be a physical server or may be composed of a plurality of servers, and there may be any number of terminal devices, networks, and servers according to the implementation requirement.
Fig. 2 is a flow chart illustrating a method of displaying information according to an example embodiment. The method provided by the embodiments of the present disclosure may be carried out by any electronic device with computing capability, for example the server 103 and/or the terminal devices 105 and 106 in the embodiment of fig. 1 described above. In the following embodiments, the server 103 is taken as the execution subject by way of example, but the present disclosure is not limited thereto.
Referring to fig. 2, an information display method provided by an embodiment of the present disclosure may include the following steps.
Step S201, a dental image and sensor data corresponding to the target client identifier are obtained.
In some embodiments, the sensor data may include temperature, pulse, blood pressure, electrocardiogram information of a first target subject (e.g., a target patient) acquired in real-time by a wearable smart device that may obtain target subject body sign data, such as a smart bracelet, a smart running shoe, a smart helmet, or the like.
In some embodiments, the smart band may include a microprocessor module, a body temperature sensor, a pulse sensor, a blood pressure sensor, and an electrocardiogram sensor. The body temperature sensor may be configured to acquire the body temperature of the first target object, the pulse sensor its heart rate, the blood pressure sensor its blood pressure information, and the electrocardiogram sensor its electrocardiogram data; the microprocessor module may be configured to process the data collected by these sensors and upload the sensor data to a server.
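As a concrete illustration of the wristband-to-server upload described above, the following sketch shows one possible shape for the sensor payload; the field names, units, and serialization format are assumptions for illustration and are not specified in the patent.

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class SignReading:
    """One batch of body sign data keyed by the target client identifier."""
    client_id: str            # target client identifier
    body_temp_c: float        # body temperature sensor
    pulse_bpm: int            # pulse sensor (heartbeats per minute)
    blood_pressure: str       # blood pressure sensor, "systolic/diastolic"
    ecg_samples: list = field(default_factory=list)  # electrocardiogram sensor

    def to_json(self) -> str:
        # What the microprocessor module would upload to the server.
        return json.dumps(asdict(self))

reading = SignReading("client-001", 36.8, 72, "118/76", [0.12, 0.15, 0.11])
payload = reading.to_json()
```

A real device would batch and timestamp readings; a single reading suffices here to show the mapping from the four sensors to one upload.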
In some embodiments, the dental image of the first target object may be acquired by an image acquisition device. The image acquisition device may refer to any image acquisition device such as a medical camera that can acquire tooth images of the first target object from different angles.
In some embodiments, the second target object (e.g., a physician) may remotely direct the first target object to acquire dental images of the first target object from different angles using the image capture device through a video-linkage technique.
In some embodiments, the image capturing device may upload the tooth image data to a server in real time after capturing the tooth image of the first target object, and the server may convert the tooth image of the first target object into a 3D tooth model of the first target object in real time and send the 3D tooth model to the second target object, so that the second target object may guide the first target object to capture the tooth image in detail and accurately in real time according to the 3D tooth model.
In some embodiments, the second target object (e.g., a doctor) may instruct the first target object (e.g., a patient) to perform the acquisition of the dental image through the above-mentioned video linkage method, so as to help the first target object complete the acquisition of the dental image in all directions. The method can avoid missing tooth details and improve the accuracy of tooth diagnosis.
In some embodiments, the dental image and sensor data (e.g., body temperature, pulse, blood pressure, electrocardiogram information) of the first target object may be remotely acquired by an image acquisition apparatus (e.g., a medical camera), a sensor device (e.g., a smart bracelet, a smart helmet, or a smart point-of-care device, etc.), and stored in a target client corresponding to the target client identification.
In some embodiments, the server may obtain dental images and sensor data of the first target object from the target client identification.
Step S202, generating a 3D tooth model corresponding to the target client identification according to the tooth image.
In some embodiments, the server may perform 3D (three-dimensional) modeling on the dental image obtained according to the target client identifier, using a three-dimensional modeling technique, to generate a 3D dental model corresponding to the target client identifier. It is to be understood that the target client corresponding to the target client identifier stores the dental data of the first target object, so the 3D dental model corresponding to the target client identifier may be a 3D dental model of the first target object.
In some embodiments, three-dimensional modeling may mean building a model from three-dimensional data using modeling software, for example AutoCAD (computer-aided design) or Photoshop (graphic image processing).
In some embodiments, the three-dimensional modeling of the teeth may be implemented by a server, or may be implemented by any terminal that can perform data modeling, such as a client (e.g., a client of the first target object or a client of the second target object). It is to be understood that the present disclosure is not limited to devices for three-dimensional modeling.
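Since the patent leaves the modeling backend open (server or client, AutoCAD/Photoshop-class tooling), step S202 can be pictured as the following orchestration sketch, in which `reconstruct_3d` is a hypothetical stand-in for the actual multi-view reconstruction:

```python
def reconstruct_3d(dental_images):
    """Stand-in for multi-view 3D reconstruction; a real backend would
    return mesh data, while this stub only records the views used."""
    return {"type": "3d_tooth_model", "views_used": len(dental_images)}

def build_model_for_client(client_id, image_store):
    """Step S202: fetch the dental images stored under a target client
    identifier and produce that client's 3D tooth model."""
    images = image_store.get(client_id, [])
    if not images:
        raise ValueError(f"no dental images stored for {client_id}")
    model = reconstruct_3d(images)
    model["client_id"] = client_id
    return model

# Images captured from different angles (step S201) for one client.
store = {"client-001": ["front.jpg", "left.jpg", "right.jpg"]}
model = build_model_for_client("client-001", store)
```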
In step S203, tooth information corresponding to the target client identifier is obtained according to the 3D tooth model and the sensor data.
In some embodiments, the dental information may include the dental symptoms of the first target subject (e.g., tooth loss, tooth eruption, mild bleeding), the dental disease detection results for the first target subject's teeth (e.g., caries, periodontitis, pulp necrosis, acute apical periodontitis), and treatment recommendations for the first target subject's dental diseases (e.g., seek treatment immediately, seek treatment once body temperature is normal, or avoid spicy food).
In some embodiments, dental information corresponding to the target client identification may be obtained from the 3D dental model and the sensor data.
In some embodiments, the 3D dental model and the sensor data may be processed by a neural network model to obtain the dental information corresponding to the target client identifier.
In some embodiments, the 3D dental model may be processed by image processing techniques to determine symptom information for the 3D dental model, and this symptom information may then be combined with the sensor data to determine the dental information for the target client identifier.
In some embodiments, the 3D dental model and the sensor data may be sent to a third target object, who gives dental detection results, recommendations, and the like based on both, thereby generating the dental information for the target client identifier.
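The first option above, processing the 3D model and sensor data through a neural network model, can be sketched as a toy scorer. Everything here is illustrative: the feature extraction, weights, and single sigmoid output are stand-ins, since the patent does not specify an architecture.

```python
import math

def extract_features(model_symptoms, sensor_data, history):
    """Flatten the heterogeneous inputs into one numeric feature vector."""
    return [
        float("bleeding" in model_symptoms),
        float("caries" in model_symptoms),
        sensor_data.get("temperature_c", 36.5) / 40.0,   # normalized temperature
        sensor_data.get("pulse_bpm", 70) / 120.0,        # normalized pulse
        float("hemophilia" in history),                  # from medical history
    ]

def urgency_score(features, weights, bias):
    """One linear layer plus sigmoid: probability that treatment is urgent."""
    z = sum(w * x for w, x in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical trained parameters.
W, B = [1.2, 0.8, 2.0, 0.5, 1.5], -2.5
feats = extract_features({"caries"}, {"temperature_c": 38.0}, [])
urgency = urgency_score(feats, W, B)
```

A production system would learn such parameters from data and emit the richer dental information (symptoms, detection results, treatment recommendations) described earlier, rather than one score.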
In step S204, the 3D dental model is displayed according to the dental information.
In some embodiments, detection information of a specified region (e.g., periodontal, gingival, root, cusp, etc.) in the 3D tooth model may be included in the tooth information. Wherein the designated region may refer to a focal region in the 3D tooth model.
In some embodiments, displaying the 3D dental model according to the dental information may include: displaying the designated region of the 3D dental model in a hovering, magnified view.
According to this embodiment, when the 3D dental model is displayed according to the dental information, the designated region (which may be a diseased region) is shown hovering and magnified, so that the first or second target object can focus on the diseased region and the corresponding dental information directly on the 3D dental model, making dental examination of the first target object convenient and efficient.
As shown in fig. 3 or fig. 4, when the 3D dental model is displayed according to the dental information, the 3D dental model and the dental information may be displayed simultaneously, so that the second target object obtains the dental image data and the dental information together and can comprehensively give a detection result for the first target object's teeth based on both.
In addition, while displaying the 3D model and the dental information, the sensor data may also be displayed simultaneously, so that the second target object may give a treatment plan for the teeth of the first target object according to the 3D dental model, the dental information, and the sensor data.
In some embodiments, lesion location information may be included in the dental information.
In some embodiments, the lesion position information may be displayed in an enlarged manner so that the user can determine the lesion status accurately and in a timely manner to further obtain a dental diagnosis result.
As shown in fig. 4A, the portion of the 3D dental model that contains the lesion area may be displayed in an enlarged manner, so that the user can determine the lesion condition promptly and accurately to further obtain a dental diagnosis result.
As shown in fig. 4B, the lesion area may be specially marked, so that the user can timely and accurately judge the dental lesion condition to further obtain a dental diagnosis result.
In the information display method provided by this embodiment, a 3D dental model corresponding to a target client identifier is constructed from the dental image corresponding to that identifier, dental information corresponding to the identifier is obtained from the 3D dental model and the sensor data, and the 3D dental model is then displayed visually according to the dental information, so that a second target object (e.g., a doctor) can diagnose dental diseases for the teeth corresponding to the target client identifier from the dental information and the 3D dental model. Because the 3D dental model is displayed according to the dental information, a user or doctor can intuitively and effectively obtain the dental information and sensor information (for example, body sign data) corresponding to the target client identifier, which effectively improves the doctor's diagnostic efficiency and the user's medical experience.
In actual treatment, a doctor usually performs dental diagnosis and gives a treatment plan in light of the current actual body temperature of the first target object. For example, for a dental condition requiring tooth extraction, when the current body temperature of the target subject exceeds a certain threshold, the doctor will not advise immediate extraction but may give other treatment advice.
In some embodiments, the dental information may include a temperature threshold (i.e., a body temperature that the target subject must not exceed when the corresponding dental disease is treated), and the sensor information may include the current temperature of the target subject; when the current temperature exceeds the temperature threshold, an over-temperature warning prompt may appear in the background of the 3D model.
According to the technical scheme provided by this embodiment, displaying the over-temperature warning in the background of the 3D model prompts the first target object (e.g., a patient) and/or the second target object (e.g., a doctor) that the current body temperature of the first target object exceeds the limit and that the first target object is therefore unsuitable for a specified treatment (e.g., tooth extraction). The method provided by this embodiment can intuitively and conveniently remind the second target object and/or the first target object of the body temperature information of the first target object, thereby avoiding missed diagnosis or misdiagnosis caused by overlooked information and improving user experience.
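The over-temperature check described above can be sketched as follows. This is a minimal, hypothetical illustration: the threshold value, function name, and warning text are assumptions for illustration, not part of the original disclosure.

```python
from typing import Optional

# Assumed, treatment-specific temperature threshold (illustrative only).
FEVER_THRESHOLD_C = 37.5


def temperature_warning(current_temp_c: float,
                        threshold_c: float = FEVER_THRESHOLD_C) -> Optional[str]:
    """Return a warning string to render in the 3D-model background,
    or None when the body temperature is within the allowed range."""
    if current_temp_c > threshold_c:
        return ("Warning: body temperature %.1f C exceeds the %.1f C limit; "
                "the specified treatment (e.g., extraction) is not advised."
                % (current_temp_c, threshold_c))
    return None
```

In use, the display module would call this check whenever fresh sensor data arrives and overlay the returned string, if any, on the 3D model background.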
In the field of dental disease treatment, patients suffering from hemophilia, thrombocytopenic purpura, leukemia, and the like often experience bleeding that is difficult to stop after tooth extraction, and may even face life-threatening risks; patients with hypertension or heart disease need to be adequately prepared before tooth extraction; and patients with hepatitis, liver cirrhosis, or impaired liver function, in whom blood prothrombin is decreased, are particularly prone to bleeding, so tooth extraction is not suitable for patients with active hepatitis or severe liver damage.
It can be seen that, when giving a dental diagnosis result or treatment recommendation, the doctor takes into account the medical history information of the target object (including past medical history, family history, present medical history, etc.), and considers the treatment scheme carefully when certain specified diseases are present.
In order to promptly alert doctors to the illness information of target objects, when the medical history information of a target object includes a specified disease, an attention prompt identifier is displayed in the background of the 3D dental model so that the doctor notices this information; clicking the attention prompt identifier can further display the attention prompt contents.
According to the technical scheme provided by this embodiment, the attention prompt identifier can alert the first target object and/or the second target object that a medical history of a specified disease exists. This can avoid medical accidents caused by missed diagnosis or misdiagnosis and can improve user experience.
Fig. 5 is a flowchart illustrating another information display method according to an embodiment of the present disclosure.
Referring to fig. 5, an information display method provided by an embodiment of the present disclosure may include the following steps.
Step S205, acquiring medical history information corresponding to the target client identifier.
In some embodiments, the medical history information corresponding to the target client identifier may refer to the medical history information of the first target object.
In some embodiments, the medical history information may include any one or more of past medical history, family history, and present medical history.
In some embodiments, the past history may refer to a case history corresponding to a dental disease of the first target subject, and the present history may refer to diseases (e.g., heart disease, hypertension) from which the first target subject currently suffers.
Step S206, the medical history information is combined with the 3D dental model and the sensor data to obtain the dental information.
In some embodiments, the dental information can be obtained by combining the medical history information, the 3D dental model, and the sensor data.
Fig. 6 is a flowchart of step S203 in fig. 2 in an exemplary embodiment.
Referring to fig. 6, the above-described step S203 may include the following steps.
In step S2031, the 3D dental model, the sensor data, and the medical history information are processed through a neural network model to obtain dental information corresponding to the target client identifier.
In some embodiments, the neural network model may refer to a naive bayes-based disease prediction model.
Naive Bayes belongs to Bayesian probability theory, which introduces prior knowledge and logical reasoning to handle uncertain propositions, and it forms part of Bayesian decision theory. Its characteristics are that it remains effective with little data, can handle multiple classes, is sensitive to how the input data is prepared, and is suited to nominal data. The naive Bayes process is mainly divided into two stages. In the first stage, the training samples are classified, and their probabilities under different conditions are calculated. In the second stage, a test sample is input, its probabilities under those different conditions are calculated, and the probabilities are compared to complete the classification of the test sample.
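The two stages above can be sketched with a minimal naive Bayes classifier over nominal features. The symptom features and disease labels below are invented for illustration only; the disclosure does not specify the feature encoding.

```python
from collections import Counter, defaultdict


def train_nb(samples, labels):
    """Stage 1: count class priors and per-feature conditional counts."""
    priors = Counter(labels)
    cond = defaultdict(Counter)  # (class, feature index) -> value counts
    for feats, label in zip(samples, labels):
        for i, v in enumerate(feats):
            cond[(label, i)][v] += 1
    return priors, cond


def predict_nb(priors, cond, feats):
    """Stage 2: score a test sample under each class and pick the best."""
    total = sum(priors.values())
    best, best_score = None, float("-inf")
    for label, count in priors.items():
        score = count / total  # class prior
        for i, v in enumerate(feats):
            # Add-one (Laplace) smoothing so unseen values do not zero out.
            score *= (cond[(label, i)][v] + 1) / (count + 2)
        if score > best_score:
            best, best_score = label, score
    return best
```

For example, training on three labelled symptom tuples and then scoring a new tuple returns the class whose smoothed posterior is largest.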
In some embodiments, training of the neural network needs to be completed before the neural network is used.
In some embodiments, training of the neural network may be accomplished using training samples in a training set.
In some embodiments, dental images of different objects may be obtained, and a corresponding 3D dental model of each object may be constructed from the dental images of each object, and the 3D dental model may be used as a training sample in a training set.
In some embodiments, the training samples in the training set may further include labels for disease information, symptom information, medical history information, basic features, vital sign data, and the like of each subject.
In some embodiments, the disease information may include caries, gum recession, tooth loosening, periapical disease, tooth position abnormalities, multiple raw teeth, etc.; the symptom information comprises toothache degree, tooth looseness and the like; the medical history information may include past medical histories corresponding to various dental diseases; the basic features include the age, sex, occupation, etc. of the user, and the physical sign data include body temperature, pulse, electrocardiogram, etc.
In some embodiments, training the neural network model may include the following steps: vectorizing the training samples; inputting the training samples into the neural network model to obtain first dental information (including predicted disease information, symptom information, medical history information, basic feature information, and sign information) for each sample; comparing the first dental information with the sample labels to obtain the loss corresponding to each training sample; and updating the parameters of the neural network model according to that loss.
In some embodiments, when the loss corresponding to the training sample is less than a loss threshold, the training of the neural network model may be ended, and the trained neural network model may be used to predict dental information corresponding to the target client identifier.
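The training procedure above (forward pass, loss against labels, parameter update, stop below a loss threshold) can be sketched schematically. A one-parameter linear model stands in here for the unspecified neural network of the embodiment; all names and values are illustrative assumptions.

```python
def train_until_threshold(samples, labels, lr=0.1, loss_threshold=1e-4,
                          max_epochs=10_000):
    """Fit y = w * x by gradient descent on mean squared error, stopping
    once the loss falls below loss_threshold (mirroring the stopping rule
    described in the embodiment)."""
    w = 0.0
    loss = float("inf")
    for _ in range(max_epochs):
        preds = [w * x for x in samples]                      # forward pass
        loss = sum((p - y) ** 2
                   for p, y in zip(preds, labels)) / len(labels)
        if loss < loss_threshold:                             # stopping rule
            break
        grad = 2 * sum((p - y) * x
                       for p, y, x in zip(preds, labels, samples)) / len(labels)
        w -= lr * grad                                        # parameter update
    return w, loss
```

Once the loop exits below the threshold, the fitted parameters would be frozen and the model used for prediction, as in the embodiment.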
In this embodiment, the 3D dental model, the sensor data and the medical history information corresponding to the client identifier (or the first target object) are processed through the neural network model to obtain dental information (e.g., a dental detection result and a dental prevention suggestion) corresponding to the target client identifier.
Fig. 7 is an illustration of an information display method in accordance with an embodiment of the disclosure. Referring to fig. 7, the information display method may include the following steps.
Step S701, a first target object applies for accessing an information display system through a first terminal.
In some embodiments, the information display system may include an image acquisition device, a smart device, a server, a second terminal, and a first terminal. The image acquisition device (e.g., a miniature camera) can capture dental images of the first target object; the smart device can be a wearable device capable of acquiring physical sign data of the first target object, such as a smart bracelet, smart running shoes, or a smart helmet; and the second terminal and the first terminal can be terminal devices capable of interaction, such as desktop computers or handheld computers.
In some embodiments, the first target object may refer to a user in need of diagnosis of a dental disease.
In some embodiments, after successfully accessing the information display system, the first target object may establish a video channel with the doctor.
In step S702, the doctor and the first target object communicate about the tooth condition via video.
Step S703, the intelligent device in the information display system acquires the medical history information and the physical sign data of the first target object through the sensor.
In some embodiments, after the first target subject's application is approved, the smart device may automatically acquire vital sign data of the target subject, where the vital sign data may be one or more of heartbeat, blood pressure, body temperature, and similar information. In addition, the server can retrieve the medical history information of the target subject from the medical history information base according to the binding relationship between the smart device and that base. The medical history information may include one or more of the past medical history, present medical history, and family history of the target subject, which is not limited by the present disclosure.
In some embodiments, based on the internet of things technology, the smart device uploads the acquired medical history information and physical sign data of the target subject to the server.
Step S704, the image capturing device captures a dental image of the first target object, and uploads the dental image to the server.
In some embodiments, a doctor may connect with the first target object by video over the internet and remotely guide the first target object to acquire dental images of the first target object from different angles using the image capturing apparatus. The dental picture of the first target object may be a two-dimensional or three-dimensional picture.
In some embodiments, the image capture device may upload the dental image to a server.
Step S705, the server generates a 3D dental model of the first target object according to the dental image of the first target object.
In some embodiments, the server may perform 3D (three-dimensional) modeling on the dental image obtained according to the target client identifier using a three-dimensional modeling technique, so as to generate a 3D dental model corresponding to the target client identifier. It is to be understood that the target client corresponding to the target client identifier stores the dental data of the first target object, so the 3D dental model corresponding to the target client identifier is the 3D dental model of the first target object.
Step S706, the server generates tooth information aiming at the first target object according to the 3D tooth model, the sign data and the medical history information of the first target object, and transmits the 3D tooth model, the sign data, the medical history information and the tooth information aiming at the first target object to the second terminal.
In some embodiments, the 3D dental model and the sensor data may be processed by a neural network model to obtain the dental information corresponding to the target client identifier.
In some embodiments, the 3D dental model may be processed by image processing techniques to determine symptom information for the 3D dental model, and the symptom information may then be combined with the sensor data to determine the dental information for the target client identifier.
In some embodiments, the 3D dental model and the sensor data may be sent to a third target object, which gives dental detection results, recommendations, and the like in combination with the 3D dental model and the sensor data, so as to generate the dental information for the target client identifier.
And step S707, the second terminal displays the 3D tooth model according to the tooth information.
In some embodiments, the second terminal displays the 3D tooth model according to the tooth information, and simultaneously displays the sign data and the medical history information of the first target object, so that a doctor can give a treatment recommendation by combining the sign data and the medical history information while completing tooth detection.
Step S708, the doctor gives a detection result and a reference suggestion according to the 3D dental model, the physical sign data, and the medical history information that the second terminal displays based on the dental information.
In some embodiments, the doctor can give the detection result and the next reference suggestion according to the 3D tooth model and the sign data of the first target object.
In other embodiments, the doctor can also give the detection result and the next-step reference suggestion according to the 3D dental model, the sign data, and the medical history information of the first target object.
Step S709, the second terminal obtains the detection result and the reference suggestion given by the doctor, and transmits them to the server.
Step S710, the server sends the detection result and the reference suggestion and/or the 3D dental model including the dental information to the first terminal.
Step S711, the first target object determines, according to the detection result and the reference suggestion and/or the 3D dental model including dental information displayed by the first terminal, whether to make a remote appointment for dental diagnosis and treatment.
In the embodiment of the disclosure, a 3D dental model corresponding to the target client identifier is constructed from the dental image corresponding to that identifier, dental information corresponding to the identifier is obtained from the 3D dental model and sensor data, and the 3D dental model is then visually displayed according to the dental information, so that a user (e.g., a doctor) can conveniently diagnose dental diseases of the teeth corresponding to the target client identifier based on the dental information and the 3D dental model. According to the information display method of this embodiment, on the one hand, the first target object can obtain a doctor's preliminary examination and guidance in time when a dental disease occurs, which reduces the first target object's treatment time and improves the accuracy of remote examination; on the other hand, the tooth condition of the first target object (including the 3D dental model, the medical history information, and the dental information) can be visually displayed to the doctor, helping the doctor to diagnose and avoiding misdiagnosis or missed diagnosis.
FIG. 8 is a schematic diagram illustrating an information display system in accordance with an exemplary embodiment. Referring to fig. 8, the system includes an image acquisition device 801, a sensor device 802, a server 803, and a display device 804.
The image capturing device 801 may be configured to capture a dental image corresponding to the target client identifier, and send the dental image to the server. The sensor device 802 may be configured to collect sensor data corresponding to the target client identifier and send the sensor data to the server. The server 803 may be configured to generate a 3D dental model corresponding to the target client identifier according to the dental image, and obtain dental information corresponding to the target client identifier according to the 3D dental model and the sensor data. The display device 804 may be configured to display the 3D dental model according to the dental information.
In some embodiments, the server may be further configured to obtain medical history information corresponding to the target client identifier according to identifier information of a sensor, where the medical history information is used to obtain the dental information by combining the 3D dental model and the sensor data.
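The data flow among the four components of Fig. 8 (image capture, sensor capture, server-side model construction and dental-information derivation, display) can be sketched as follows. This is an assumed wiring for illustration only: the class, field, and function names, the placeholder "model" built from the image, and the 37.5 C check are not specified by the disclosure.

```python
from dataclasses import dataclass


@dataclass
class DentalRecord:
    """What the capture devices 801/802 would send to the server 803."""
    client_id: str
    image: bytes       # raw dental image from the image acquisition device
    sensor_data: dict  # physical sign data from the sensor device


def server_pipeline(record: DentalRecord) -> dict:
    """Stand-in for the server 803: 'build' a 3D model from the image and
    derive dental information from the model plus the sensor data."""
    # Placeholder for real 3D reconstruction from the dental image.
    model_3d = {"client_id": record.client_id,
                "mesh_bytes": len(record.image)}
    dental_info = {
        "client_id": record.client_id,
        # Assumed temperature rule; the real threshold is disease-specific.
        "temperature_ok": record.sensor_data.get("body_temp_c", 36.5) <= 37.5,
    }
    # The display device 804 would render model_3d according to dental_info.
    return {"model": model_3d, "info": dental_info}
```

A display device would then consume the returned dictionary, e.g. rendering a background warning when `temperature_ok` is false.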
Fig. 9 is a block diagram illustrating an information display apparatus according to an exemplary embodiment. Referring to fig. 9, the apparatus 900 includes: a data acquisition module 901, a 3D modeling module 902, a preprocessing module 903, and a display module 904.
The data obtaining module 901 may be configured to obtain a dental image and sensor data corresponding to the target client identifier. The 3D modeling module 902 may be configured to generate a 3D dental model corresponding to the target client identification from the dental image. The preprocessing module 903 may be configured to obtain dental information corresponding to the target client identification from the 3D dental model and the sensor data. The display module 904 may be configured to display the 3D dental model according to the dental information.
In some embodiments, the information display apparatus 900 may further include a medical history information acquisition module and a second dental information acquisition module.
The medical history information acquisition module may be configured to acquire medical history information corresponding to the target client identifier. The second dental information acquisition module may be configured to combine the medical history information with the 3D dental model and the sensor data to obtain the dental information.
In some embodiments, the medical history information includes any one or more of past medical history, family history, and present medical history.
In some embodiments, the preprocessing module may include a neural network processing unit, wherein the neural network processing unit may be configured to process the 3D dental model, the sensor data, and the medical history information through a neural network model to obtain dental information corresponding to the target client identification.
In some embodiments, the preprocessing module may further include an image processing unit, wherein the image processing unit may be configured to process the 3D dental model by an image processing technique to obtain dental information corresponding to the target client identification.
In some embodiments, the tooth information includes detection information of a specified region in the 3D tooth model, and the display module may be further configured to suspend and enlarge the specified region on the 3D tooth model.
In some embodiments, the dental information includes a temperature threshold, the sensor information includes a current temperature; the display module can be further configured to display a temperature-exceeding warning prompt in the 3D model background when the current temperature exceeds a temperature threshold.
In some embodiments, the display module may be further configured to display an attention cue identification in a background of the 3D dental model if the medical history information includes medical history information specifying a disease; and clicking the attention prompt mark to display dangerous prompt contents.
Since each functional module of the information display apparatus 900 according to the exemplary embodiment of the present disclosure corresponds to a step of the exemplary embodiment of the information display method described above, it is not described here again.
Referring now to FIG. 10, shown is a block diagram of a computer system 1000 suitable for use in implementing a terminal device of an embodiment of the present application. The terminal device shown in fig. 10 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 10, the computer system 1000 includes a Central Processing Unit (CPU)1001 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)1002 or a program loaded from a storage section 1008 into a Random Access Memory (RAM) 1003. In the RAM 1003, various programs and data necessary for the operation of the system 1000 are also stored. The CPU 1001, ROM 1002, and RAM 1003 are connected to each other via a bus 1004. An input/output (I/O) interface 1005 is also connected to bus 1004.
The following components are connected to the I/O interface 1005: an input section 1006 including a keyboard, a mouse, and the like; an output section 1007 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage portion 1008 including a hard disk and the like; and a communication section 1009 including a network interface card such as a LAN card, a modem, or the like. The communication section 1009 performs communication processing via a network such as the internet. The driver 1010 is also connected to the I/O interface 1005 as necessary. A removable medium 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1010 as necessary, so that a computer program read out therefrom is mounted into the storage section 1008 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication part 1009 and/or installed from the removable medium 1011. The computer program executes the above-described functions defined in the system of the present application when executed by the Central Processing Unit (CPU) 1001.
It should be noted that the computer readable medium shown in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a transmitting unit, an obtaining unit, a determining unit, and a first processing unit. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments; or may be separate and not incorporated into the device. The computer readable medium carries one or more programs which, when executed by a device, cause the device to perform functions comprising: acquiring a tooth picture and sign data of a target object; and generating a 3D tooth model of the target object according to the tooth picture, wherein the 3D tooth model of the target object and sign data thereof are used for carrying out tooth examination on the target object so as to obtain an examination result and a reference suggestion. Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution of the embodiment of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.), and includes several instructions for enabling a computing device (which may be a personal computer, a server, a mobile terminal, or a smart device, etc.) to execute the method according to the embodiment of the present disclosure, such as one or more of the steps shown in fig. 2.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the disclosure is not limited to the details of construction, the arrangements of the drawings, or the manner of implementation that have been set forth herein, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (10)

1. An information display method, comprising:
acquiring a tooth image and sensor data corresponding to a target client identifier;
generating a 3D tooth model corresponding to the target client identification according to the tooth image;
obtaining tooth information corresponding to the target client identification according to the 3D tooth model and the sensor data;
displaying the 3D tooth model according to the tooth information.
2. The method of claim 1, further comprising:
acquiring medical history information corresponding to the target client identification;
the medical history information is used to obtain the dental information in combination with the 3D dental model and the sensor data.
3. The method of claim 2, wherein obtaining dental information corresponding to the target client identification from the 3D dental model and the sensor data comprises:
and processing the 3D tooth model, the sensor data and the medical history information through a neural network model to obtain tooth information corresponding to the target client identification.
4. The method of claim 1, wherein obtaining dental information corresponding to the target client identification from the 3D dental model and the sensor data comprises:
and processing the 3D tooth model through an image processing technology to obtain tooth information corresponding to the target client identification.
5. The method of claim 2, wherein the dental information includes detection information of a designated region in the 3D dental model, and displaying the 3D dental model according to the dental information comprises:
displaying the designated region of the 3D dental model in a floating, enlarged view.
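The floating enlarged view of claim 5 amounts to scaling the designated region about its own center before overlaying it. A 2D sketch (the function name and vertex layout are illustrative assumptions; a real renderer would work on mesh vertices in 3D):

```python
def enlarge_region(vertices, region_indices, scale=2.0):
    # claim 5 sketch: scale the designated region's vertices about their centroid
    region = [vertices[i] for i in region_indices]
    cx = sum(x for x, _ in region) / len(region)
    cy = sum(y for _, y in region) / len(region)
    return [(cx + (x - cx) * scale, cy + (y - cy) * scale) for x, y in region]

# enlarging a unit square region doubles its extent about its center (1, 1)
enlarged = enlarge_region([(0, 0), (2, 0), (2, 2), (0, 2)], [0, 1, 2, 3])
```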
6. The method of claim 5, wherein the dental information includes a temperature threshold and the sensor data includes a current temperature; and wherein displaying the 3D dental model according to the dental information comprises:
displaying an over-temperature warning prompt in the background of the 3D dental model when the current temperature exceeds the temperature threshold.
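The warning condition of claim 6 is a plain threshold comparison between the two fields named in the claim. A minimal sketch, with hypothetical dictionary keys:

```python
def temperature_warning(dental_info, sensor_data):
    # claim 6 sketch: return a background warning when the threshold is exceeded
    if sensor_data["current_temperature"] > dental_info["temperature_threshold"]:
        return "over-temperature warning"
    return None

warning = temperature_warning({"temperature_threshold": 37.0},
                              {"current_temperature": 37.5})
```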
7. The method of claim 6, wherein displaying the 3D dental model according to the dental information comprises:
displaying an attention marker in the background of the 3D dental model when the medical history information includes a record of a specified disease; and
displaying risk prompt content when the attention marker is clicked.
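Claim 7 splits the behavior into a marker shown up front and risk details revealed on click. One way to model that state (disease names and the returned dictionary shape are illustrative assumptions):

```python
def attention_marker(medical_history, specified_diseases):
    # claim 7 sketch: marker visibility plus the details revealed when clicked
    hits = sorted(set(medical_history) & set(specified_diseases))
    return {"show_marker": bool(hits), "risk_details": hits}

state = attention_marker(["periodontitis", "caries"], {"periodontitis"})
```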
8. An information display device characterized by comprising:
a data acquisition module configured to acquire a tooth image and sensor data corresponding to a target client identifier;
a 3D modeling module configured to generate a 3D dental model corresponding to the target client identifier from the tooth image;
a preprocessing module configured to obtain dental information corresponding to the target client identifier according to the 3D dental model and the sensor data; and
a display module configured to display the 3D dental model according to the dental information.
9. An electronic device, comprising:
one or more processors;
a storage device storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium having stored thereon a computer program, wherein the program, when executed by a processor, implements the method of any one of claims 1-7.
CN201911139085.0A 2019-11-20 2019-11-20 Information display method and system, device, electronic equipment and readable medium Pending CN111128321A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911139085.0A CN111128321A (en) 2019-11-20 2019-11-20 Information display method and system, device, electronic equipment and readable medium


Publications (1)

Publication Number Publication Date
CN111128321A true CN111128321A (en) 2020-05-08

Family

ID=70495930

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911139085.0A Pending CN111128321A (en) 2019-11-20 2019-11-20 Information display method and system, device, electronic equipment and readable medium

Country Status (1)

Country Link
CN (1) CN111128321A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102016854A (en) * 2008-05-23 2011-04-13 矫正技术公司 Smile designer
CN103593585A (en) * 2013-12-02 2014-02-19 深圳中兴网信科技有限公司 Remote diagnosis system
CN105662630A (en) * 2014-10-08 2016-06-15 上海星寰投资有限公司 Intelligent toothbrush device
CN107784338A (en) * 2017-02-08 2018-03-09 平安医疗健康管理股份有限公司 Method for managing medical information and device
CN108320801A (en) * 2018-04-28 2018-07-24 北京预医智联科技有限公司 A kind of intelligence odontopathy medical treatment system
CN108877897A (en) * 2018-05-28 2018-11-23 牙博士医疗控股集团有限公司 Dental diagnostic scheme generation method, device and diagnosis and therapy system
CN109616197A (en) * 2018-12-12 2019-04-12 泰康保险集团股份有限公司 Tooth data processing method, device, electronic equipment and computer-readable medium
CN109729169A (en) * 2019-01-08 2019-05-07 成都贝施美医疗科技股份有限公司 Tooth based on C/S framework beautifies AR intelligence householder method
CN109817346A (en) * 2019-01-08 2019-05-28 深圳洲斯移动物联网技术有限公司 Cloud diagosis method, integrated display terminal and computer storage medium


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112750520A (en) * 2020-12-31 2021-05-04 四川桑瑞思环境技术工程有限公司 Information processing system
CN116313163A (en) * 2023-05-16 2023-06-23 四川省医学科学院·四川省人民医院 Interaction method and system based on leukemia infant treatment
CN116313163B (en) * 2023-05-16 2023-07-21 四川省医学科学院·四川省人民医院 Interaction method and system based on leukemia infant treatment

Similar Documents

Publication Publication Date Title
US11553874B2 (en) Dental image feature detection
US11676701B2 (en) Systems and methods for automated medical image analysis
US10984529B2 (en) Systems and methods for automated medical image annotation
KR102243830B1 (en) System for providing integrated medical diagnostic service and method thereof
US10282835B2 (en) Methods and systems for automatically analyzing clinical images using models developed using machine learning based on graphical reporting
CN111292839B (en) Image processing method, image processing device, computer equipment and storage medium
KR102140402B1 (en) Apparatus for quality managment of medical image interpretation usnig machine learning, and method thereof
CN111915584A (en) Focus follow-up assessment method and system based on CT (computed tomography) image
CN111008957A (en) Medical information processing method and device
KR20220038017A (en) Systems and methods for automating clinical workflow decisions and generating priority read indicators
CN111128321A (en) Information display method and system, device, electronic equipment and readable medium
EP3428925A1 (en) Method and system for clinical decision support with local and remote analytics
US10909676B2 (en) Method and system for clinical decision support with local and remote analytics
CN110648318A (en) Auxiliary analysis method and device for skin diseases, electronic equipment and storage medium
CN113554607A (en) Tooth body detection model, generation method and tooth body segmentation method
US10558783B2 (en) Image data ingestion application of a medical imaging data processing and retrieval system
CN112735579A (en) Rapid registration treatment system for emergency patients
CN109273080B (en) Intelligent diagnosis and treatment method and device, electronic equipment and storage medium
CN115861283A (en) Medical image analysis method, device, equipment and storage medium
US11869654B2 (en) Processing medical images
Hsu et al. DeepOPG: Improving Orthopantomogram Finding Summarization with Weak Supervision
CN113838560A (en) Remote diagnosis system and method based on medical image
KR20210029167A (en) Apparatus for quality managment of medical image interpretation usnig machine learning, and method thereof
CN111192679B (en) Method, device and storage medium for processing image data abnormality
EP4216229A1 (en) Subscription and retrieval of medical imaging data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination