CN112215969A - User data processing method and device based on virtual reality - Google Patents


Info

Publication number
CN112215969A
CN112215969A
Authority
CN
China
Prior art keywords
information
target user
virtual
virtual image
health state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011051172.3A
Other languages
Chinese (zh)
Inventor
张悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai
Priority to CN202011051172.3A
Publication of CN112215969A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H15/00 ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Computer Graphics (AREA)
  • Biomedical Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Pathology (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application relates to a virtual-reality-based user data processing method and device, comprising the following steps: acquiring image characteristic data of a target user; generating a first virtual image corresponding to the target user from the image characteristic data; acquiring first health state information of the target user; and adjusting the first virtual image according to the first health state information to obtain a second virtual image corresponding to the target user. Compared with displaying the health state as text, displaying it through a virtual image gives a more intuitive and concrete presentation, makes it easier for the target user to understand his or her own health state, and makes it easier for a doctor to quickly grasp it. Finally, because the medical history no longer needs to be kept in a paper medical record, the inconvenience caused by a lost record is avoided.

Description

User data processing method and device based on virtual reality
Technical Field
The present application relates to the field of virtual reality technologies, and in particular, to a user data processing method and apparatus based on virtual reality.
Background
With the rapid development of Internet technology and the arrival of 5G networks, smart healthcare has become an important direction and opportunity for the medical industry, and online medical treatment and "Internet + healthcare" bring more convenience and possibility to traditional medical services. Smart healthcare alleviates the shortage of doctors and medical resources, removes hard constraints such as time and place, and makes medical services available to more people. In particular, during an epidemic, when people are required to isolate at home, online medical treatment meets needs such as remote consultation. By combining virtual reality and augmented reality technology, professional medical feedback can be made more intuitive and easier to understand.
On the other hand, although existing hospital systems are gradually adopting Internet technology and the treatment process has become more convenient, the patient experience still suffers from some traditional processes and human factors. In the prior art, a user's medical history is still kept in a traditional paper medical record. Moreover, a patient who is not a medical professional often cannot understand the condition, or how it develops over the course of treatment, because of handwriting, professional terminology, and similar factors. In addition, a patient may lose the paper medical record, which causes trouble in subsequent visits.
In view of the technical problems in the related art, no effective solution is provided at present.
Disclosure of Invention
In order to solve the technical problem or at least partially solve the technical problem, the application provides a user data processing method and device based on virtual reality.
In a first aspect, an embodiment of the present application provides a user data processing method based on virtual reality, including:
acquiring image characteristic data of a target user;
generating a first virtual image corresponding to the target user according to the image characteristic data;
acquiring first health state information of the target user;
and adjusting the first virtual image according to the first health state information to obtain a second virtual image corresponding to the target user.
Optionally, as in the foregoing method:
the acquiring of the image characteristic data of the target user comprises:
acquiring scanning data obtained by three-dimensional scanning of the target user, and/or acquiring two-dimensional image information obtained by image shooting of the target user;
obtaining the image characteristic data according to the scanning data and/or the two-dimensional image information;
generating a first avatar corresponding to the target user according to the avatar characteristic data, including:
and performing three-dimensional modeling through the scanning data and/or the two-dimensional image information to generate a first virtual image corresponding to the target user.
Optionally, as in the foregoing method, the adjusting the first avatar according to the first health status information to obtain a second avatar corresponding to the target user includes:
determining first part information and first disease level information corresponding to the first health state information;
positioning in the first virtual image according to the first part information to obtain a first target virtual part;
adjusting the first target virtual part according to the first disease level information to obtain a second virtual image corresponding to the target user;
associating the first avatar and second avatar with the target user.
Optionally, as in the foregoing method, adjusting the first target virtual part according to the first disease level information includes:
determining a highlight display mode corresponding to the first disease condition grade information according to a preset corresponding relation;
and rendering the first target virtual part according to the highlighting mode to obtain a rendered target virtual part.
Optionally, as in the foregoing method, the method further includes:
acquiring treatment information corresponding to the first health state information; the therapy information includes at least one of: rehabilitation plans, medical advice and drug prescriptions;
generating virtual indication information according to the treatment information;
and adding the virtual indication information into a display interface for displaying the second virtual image.
Optionally, as in the foregoing method, the method further includes:
acquiring second health state information of the target user; the first health state information and the second health state information correspond to the same second part information in the second virtual image, and the acquisition time of the second health state information is later than that of the first health state information;
determining the state of illness recovery information of the target user according to the second health state information and the first health state information;
and updating the second virtual image according to the disease recovery condition information to obtain a third virtual image.
Optionally, as in the foregoing method, the determining the disease recovery information of the target user according to the second health status information and the first health status information includes:
determining second disease level information corresponding to the second health state information;
positioning in the second virtual image according to the second part information to obtain a second target virtual part;
determining a severity relationship between the second condition level information and the first condition level information; the first disease condition level information is disease condition level information corresponding to the first health state information;
determining the disease recovery condition information according to the severity relation;
and determining corresponding virtual image state information according to the disease recovery condition information.
Optionally, as in the foregoing method, the updating the second avatar according to the disease recovery condition information to obtain a third avatar includes:
adjusting the second target virtual part according to the disease recovery condition information, and adjusting the image state of the second virtual image according to the virtual image state information to obtain a third virtual image;
associating the third avatar with the target user.
In a second aspect, an embodiment of the present application provides a virtual reality-based user data processing apparatus, including:
the first acquisition module is used for acquiring image characteristic data of a target user;
the generating module is used for generating a first virtual image corresponding to the target user according to the image characteristic data;
the second acquisition module is used for acquiring the first health state information of the target user;
and the adjusting module is used for adjusting the first virtual image according to the first health state information to obtain a second virtual image corresponding to the target user.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other via the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the processing method according to any one of the preceding claims when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a storage medium including a stored program, where the program is run to perform the method steps of any one of the preceding claims.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
compared with the method for displaying the health state of the user through the characters, the method for displaying the health state of the user through the virtual image can obtain a more vivid and concrete display effect, is more convenient for the target user to know the health state of the target user, and is also more convenient for a doctor to quickly obtain the health state of the target user; finally, the medical history does not need to be registered through paper medical records, and the complicated medical condition caused by the loss of the medical records can be avoided.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in their description are briefly introduced below; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart of a user data processing method based on virtual reality according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a virtual reality-based user data processing method according to another embodiment of the present application;
fig. 3 is a schematic flowchart of a virtual reality-based user data processing method according to another embodiment of the present application;
fig. 4 is a schematic flowchart of a virtual reality-based user data processing method according to another embodiment of the present application;
FIG. 5 is a block diagram of a virtual reality-based user data processing apparatus according to another embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a virtual reality-based user data processing method provided in an embodiment of the present application, including the following steps S1 to S4:
s1, obtaining image characteristic data of a target user.
Specifically, the target user is the user for whom the virtual image is to be established.
The image characteristic data is data representing the appearance of the target user. Depending on the acquisition device, it can be divided into two-dimensional data and three-dimensional data, and the corresponding body data may include, but is not limited to: three-dimensional measurements, height, limb length, facial features, and the like. Since the image characteristic data corresponds to a specific user, it generally differs from user to user.
And S2, generating a first virtual image corresponding to the target user according to the image characteristic data.
Specifically, the first avatar is generated according to the avatar characteristic data, and thus, the first avatar may be a stereoscopically displayed avatar corresponding to the target user.
And S3, acquiring first health state information of the target user.
Specifically, the first health state information is information characterizing the health state of the target user, and may optionally include, but is not limited to: disease information (e.g., a leg fracture, heart disease) and sign information (e.g., heart rate, blood glucose, blood oxygen).
The manner of acquiring the first health state information may include:
1. after being measured by medical equipment, the information is imported into the system implementing the method of this embodiment;
2. after a doctor diagnoses the patient (i.e., the target user), the doctor enters the first health state information into the system implementing the method of this embodiment.
Other acquisition manners may also be used; the above are merely optional examples and are not listed exhaustively here.
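As a minimal sketch (not part of the patent disclosure), the first health state information described above can be modeled as a small record combining part, disease, level, and sign information; all field names and the `from_device` import path are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class HealthState:
    """One health-state record for a target user (hypothetical schema)."""
    part: str                                   # body part, e.g. "left_leg"
    condition: str                              # disease information, e.g. "fracture"
    severity: int                               # disease level, e.g. 0 (normal) .. 3 (severe)
    signs: dict = field(default_factory=dict)   # sign info: heart rate, blood glucose, ...
    timestamp: float = 0.0                      # acquisition time, compared on later visits

def from_device(reading: dict) -> HealthState:
    """Acquisition path 1 above: import a measurement made by medical equipment."""
    return HealthState(part=reading["part"],
                       condition=reading["condition"],
                       severity=reading["severity"],
                       signs=reading.get("signs", {}),
                       timestamp=reading.get("time", 0.0))
```

A doctor-entered record (acquisition path 2) would construct `HealthState` directly instead of going through `from_device`.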
And S4, adjusting the first virtual image according to the first health state information to obtain a second virtual image corresponding to the target user.
Specifically, the first health state information characterizes the health state of the target user. Therefore, after the first virtual image is adjusted according to the first health state information, a second virtual image capable of representing that health state is obtained. Optionally, since the first health state information may concern a specific organ or part, the second virtual image may be obtained by marking, associating, or otherwise adjusting the part or organ corresponding to the first health state information. The adjustment strategy for each piece of information may be predetermined, for example: when a piece of information indicates an abnormality, the corresponding part is displayed in a preset abnormal mode. The second virtual image corresponding to the target user is thereby obtained.
Compared with displaying the user's health state as text, displaying it through the second virtual image gives a more vivid and concrete presentation and makes it easier for the user to understand his or her own health state.
In some embodiments, a method as previously described:
the step S1 obtains the image feature data of the target user, including the following steps S11 and S12:
s11, acquiring scanning data obtained by three-dimensional scanning of the target user, and/or acquiring two-dimensional image information obtained by image shooting of the target user.
Specifically, the three-dimensional scanning manner for the target user may include, but is not limited to: depth cameras, millimeter wave radar, and the like. Moreover, the scanning data can be obtained by scanning the target user at a plurality of angles and acquiring the complete image of the target user.
The target user can be imaged by the camera to obtain two-dimensional image information. Similar to three-dimensional scanning, multi-angle shooting can be performed on a target user to obtain two-dimensional image information at different angles.
And S12, obtaining image characteristic data according to the scanning data and/or the two-dimensional image information.
That is, the image characteristic data is obtained from at least one of the scanning data and the two-dimensional image information.
The step S2 of generating a first avatar corresponding to the target user according to the avatar characteristic data includes:
and performing three-dimensional modeling through the scanning data and/or the two-dimensional image information to generate a first virtual image corresponding to the target user.
Specifically, the three-dimensional modeling may construct a model with three-dimensional data in a virtual three-dimensional space using three-dimensional modeling software. The modeling method may be NURBS modeling, polygon-mesh modeling, or the like.
The purpose of the method in this embodiment is to obtain, from the scanning data and/or the two-dimensional image information, a first virtual image corresponding to the appearance of the target user, with a certain similarity to the target user's actual appearance.
By the method in this embodiment, a virtual image similar to the target user can be established quickly, and because the first virtual image is derived from the target user's own data, the association between the first health state information and the target user is strengthened.
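The S11–S12 flow above can be sketched in a few lines of Python. This is only an illustration of the data flow, not the NURBS or polygon-mesh reconstruction the embodiment names; the feature names and the `avatar_v1` placeholder are hypothetical:

```python
def build_first_avatar(scan_points=None, images=None):
    """Derive image characteristic data from 3-D scan data and/or 2-D photos,
    then return a placeholder avatar model. A real system would run mesh
    reconstruction (e.g. NURBS or polygon modeling) at the final step."""
    if scan_points is None and images is None:
        raise ValueError("need scanning data and/or two-dimensional image information")
    features = {}
    if scan_points:
        # Crude body-height estimate from the vertical extent of the point cloud.
        ys = [p[1] for p in scan_points]
        features["height"] = max(ys) - min(ys)
    if images:
        # Multi-angle photos, as described for two-dimensional acquisition.
        features["views"] = len(images)
    return {"type": "avatar_v1", "features": features, "parts": {}}
```

Either input alone suffices, matching the "and/or" in steps S11 and S12.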
As shown in fig. 2, in some embodiments, the step S4 of adjusting the first avatar according to the first health status information includes the following steps S41 to S44:
and S41, determining first part information and first disease condition grade information corresponding to the first health state information.
Specifically, the first health status information may include: first part information and first disease level information.
Optionally, the target user may correspond to several pieces of first health state information at the same time; therefore, the first part information and the first disease level information may comprise several groups, describing the conditions of several parts of the target user.
The first part information may include, but is not limited to: bones, organs, blood, and other parts of the human body.
The first disease level information may be the specific type and severity of the condition affecting the part indicated by the part information.
Furthermore, the first disease level information may also include information on the outpatient department the user visited.
And S42, positioning in the first virtual image according to the first part information to obtain a first target virtual part.
Specifically, the first target virtual part is a virtual part corresponding to the first part information in the first avatar.
Further, when the first target virtual part is inside the human body, the tissue outside it may be rendered transparent so that the first target virtual part can be observed.
After the first virtual image is established, a part identifier can be assigned to each virtual part (represented, for example, by a number or a name).
For positioning, the first part information must use the same coding as the part identifiers in the first virtual image; the first target virtual part can then be located in the first virtual image by string matching.
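The identifier-matching positioning just described amounts to a lookup over the avatar's part table. A minimal sketch (hypothetical data layout; the patent only specifies that both sides share one coding):

```python
def locate_part(avatar_parts, part_info):
    """Locate the first target virtual part by matching the first part
    information against each virtual part's identifier (number or name).
    Matching is case- and whitespace-insensitive to tolerate entry variants."""
    key = str(part_info).strip().lower()
    for ident, part in avatar_parts.items():
        if str(ident).strip().lower() == key:
            return part
    return None  # no virtual part uses this identifier
```

`avatar_parts` here stands for the `"parts"` mapping of the first virtual image, keyed by part identifier.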
And S43, adjusting the first target virtual part according to the first disease condition grade information to obtain a second virtual image corresponding to the target user.
Specifically, adjusting the target virtual part according to the first disease level information is generally an adjustment of the display effect, so that the adjusted display corresponds to the first disease level information and the condition can be observed intuitively.
Step S44, associating the first avatar and the second avatar with the target user.
Specifically, after the first or second virtual image of the target user is obtained, it can be uploaded to a server storing patient information and associated with the personal information of the patient (i.e., the target user), so that the patient can later view the electronic medical record, and the displayed virtual image, by entering his or her personal information.
Through the method in the embodiment, the illness state can be quickly embodied in the virtual image, so that the user can conveniently and visually check the illness state, and the illness state is more specific, vivid and easy to understand.
In some embodiments, as in the foregoing method, the step S43 of adjusting the first target virtual part according to the first disease level information includes the following steps S431 and S432:
and S431, determining a highlight display mode corresponding to the first disease condition grade information according to the preset corresponding relation.
Specifically, the first part information is part information corresponding to a virtual part in the first avatar.
The highlighting manner is a manner of emphasizing the diseased target virtual part. There may be one or more highlighting manners, and each piece of first disease level information corresponds to one of them according to the preset correspondence.
Further, the highlighting manner may include, but is not limited to, the following. (1) Color rendering of the target virtual part: the preset correspondence stores the mapping between first disease level information and colors, so the severity of the first target virtual part can be marked and distinguished by color (for example, orange-red, yellow and green representing, in order, severe through recovering); in the normal state, the target virtual part is not color-rendered. (2) Replacing the target virtual part with a diseased-state virtual part corresponding to the first disease level information: the preset correspondence is then the mapping between first disease level information and the different diseased-state virtual parts, while the original target virtual part is the display in the normal state.
And S432, rendering the first target virtual part according to the highlight display mode to obtain the rendered target virtual part.
Specifically, on the basis of the previous step, the first target virtual part is rendered according to the highlighting manner: the target virtual part may be colored with the color corresponding to the highlighting manner, yielding a rendered target virtual part of that color; alternatively, the corresponding diseased-state virtual part may be determined according to the highlighting manner and substituted for the target virtual part, yielding the rendered target virtual part.
By the method in this embodiment, the target virtual part corresponding to the diseased part can be processed quickly to obtain a rendered, highlighted target virtual part, so that the diseased part can be located quickly and accurately and easily viewed by the doctor or the patient.
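The color-rendering variant of steps S431–S432 reduces to a severity-to-color table lookup. A minimal sketch using the example colors from the description (the numeric severity scale and dict layout are assumptions):

```python
# Preset correspondence: disease level -> highlight color.
# Per the description, orange-red/yellow/green run from severe to recovering.
SEVERITY_COLORS = {3: "orange-red", 2: "yellow", 1: "green"}

def render_part(part, severity):
    """S431: look up the highlighting manner for the disease level.
    S432: apply it to the target virtual part. Level 0 (normal state)
    leaves the part without color rendering."""
    rendered = dict(part)  # do not mutate the stored virtual part
    if severity in SEVERITY_COLORS:
        rendered["highlight"] = SEVERITY_COLORS[severity]
    else:
        rendered.pop("highlight", None)  # normal state: no color rendering
    return rendered
```

The replacement variant (swapping in a diseased-state virtual part) would use the same lookup, mapping levels to substitute models instead of colors.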
In some embodiments, the method further comprises steps S51 to S53 as follows:
s51, acquiring treatment information corresponding to the first health state information; the therapy information includes at least one of: rehabilitation plans, medical advice and drug prescriptions.
Specifically, the treatment information may be information embodying the treatment plan, prepared by a doctor from imaging results and the operation performed, and includes at least one of the following: a rehabilitation plan, medical advice, and a drug prescription.
The rehabilitation plan may embody the treatment plan as a whole; the medical advice may cover the instruments used in treatment or matters needing attention during recovery; the drug prescription may list the medication to be taken at each stage.
And S52, generating virtual indication information according to the treatment information.
Specifically, the virtual indication information is digitally displayable information generated from the treatment information, so that the user can obtain the treatment plan from it.
And S53, adding the virtual indication information into a display interface for displaying the second virtual image.
Specifically, the virtual indication information and the second virtual image are displayed in the same display interface. One optional implementation: determine the virtual part corresponding to the virtual indication information in the second virtual image, and make the virtual indication information point to that part, so that the user knows the treatment plan for the corresponding diseased part.
By the method in the embodiment, the user can more intuitively acquire the treatment scheme given by the doctor and carry out corresponding rehabilitation treatment.
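Steps S51–S53 can be sketched as attaching an annotation to the interface that displays the second virtual image. The interface and annotation structures below are hypothetical, not taken from the patent:

```python
def add_indication(interface, treatment_info, target_part):
    """S52: generate virtual indication information from the treatment info
    (rehabilitation plan / medical advice / drug prescription).
    S53: add it to the display interface, pointing at the relevant part."""
    indication = {
        "lines": [f"{kind}: {text}" for kind, text in treatment_info.items()],
        "points_to": target_part,  # virtual part the advice concerns
    }
    interface.setdefault("indications", []).append(indication)
    return interface
```

For example, `add_indication(ui, {"drug prescription": "stage-1 medication"}, "left_leg")` attaches a prescription note that points at the left-leg virtual part.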
As shown in fig. 3, in some embodiments, the method further includes steps S6 to S8 as follows:
s6, acquiring second health state information of the target user; the first health status information and the second health status information correspond to the same second part information in the second avatar, and the acquisition time of the second health status information is later than that of the first health status information.
Specifically, the second part information is part information in the second virtual image. Since the first and second health state information correspond to the same second part information, and the second is acquired later, the second health state information may be the health state of the part as diagnosed by medical equipment and/or a doctor when the target user returns to the hospital for a follow-up examination of that part. The first health state information is thus the earlier examination result.
Wherein the second location information may include, but is not limited to: bones, organs, blood, etc. of the human body.
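The relationship between the two pieces of health state information (same part information, later acquisition time) could be modeled as in the sketch below; the `HealthState` fields, the identifier format, and the numeric level scale are assumptions for illustration only.

```python
# Hypothetical model of first/second health state records: each record carries
# part information, a disease level, and an acquisition time, so records for
# the same part can be ordered in time.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class HealthState:
    part_code: str         # second part information, e.g. "bone.tibia.left"
    level: int             # disease level (assumed scale: higher = more severe)
    acquired_at: datetime  # acquisition time

first = HealthState("bone.tibia.left", 3, datetime(2020, 9, 1))
second = HealthState("bone.tibia.left", 1, datetime(2020, 10, 1))

# The two records concern the same part, and the review comes after the initial exam
same_part = first.part_code == second.part_code
is_review = second.acquired_at > first.acquired_at
```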
S7, determining the disease recovery condition information of the target user according to the second health state information and the first health state information.
Specifically, after the second health state information is obtained, it may be compared with the first health state information in terms of, for example, the severity of the disease condition and the degree of wound healing, to obtain the disease recovery condition information.
S8, updating the second avatar according to the disease recovery condition information to obtain a third avatar.
Specifically, the third avatar is the avatar corresponding to the disease recovery condition information, so the third avatar can show the target user's rehabilitation situation at the current stage.
With this method, the user's rehabilitation condition can be displayed through the avatar. Because an updated avatar is obtained at each visit, the patient's rehabilitation changes over the whole course of treatment can be shown step by step, letting the patient understand his or her own recovery.
As shown in fig. 4, in some embodiments, the step S7 of determining the disease recovery condition information of the target user according to the second health state information and the first health state information includes the following steps S71 to S75:
S71, determining second disease level information corresponding to the second health state information.
Specifically, the second health state information may include second part information and second disease level information.
Optionally, the target user may correspond to a plurality of pieces of second health state information at the same time; accordingly, the part information and the second disease level information may include a plurality of sets, so as to determine the disease conditions of a plurality of parts of the target user.
Furthermore, the second disease level information may also include information on the outpatient department the user visited.
The second disease level information may be the specific type of the disease and the severity of the disease corresponding to the part information.
S72, positioning in the second avatar according to the second part information to obtain a second target virtual part.
Specifically, the second target virtual part is the virtual part in the second avatar corresponding to the second part information.
Further, when the second target virtual part lies inside the human body, the tissue outside it may be rendered transparent so that the second target virtual part can be observed.
After the second avatar is established, the part identifier corresponding to each virtual part can be determined (the identifiers may be represented by numbers or names).
To enable positioning, the second part information must use the same coding scheme as the part identifiers in the second avatar; the second target virtual part can then be located in the second avatar by character matching.
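The character-matching positioning in step S72 might be sketched as below, under the assumption that both sides share one identifier coding scheme; the identifier format is hypothetical.

```python
# Hypothetical sketch of step S72: because the part information and the avatar's
# part identifiers use the same coding scheme, positioning reduces to an exact
# string match over the avatar's identifiers.
def locate_virtual_part(avatar_part_ids, part_code):
    """avatar_part_ids: identifiers assigned to each virtual part after the
    avatar is built. Returns the matching identifier, or None if absent."""
    for pid in avatar_part_ids:
        if pid == part_code:  # same coding method on both sides
            return pid
    return None

avatar_ids = ["organ.heart", "organ.liver", "bone.tibia.left"]
target = locate_virtual_part(avatar_ids, "bone.tibia.left")
```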
S73, determining the severity relationship between the second disease level information and the first disease level information; the first disease level information is the disease level information corresponding to the first health state information.
Specifically, the second disease level information and the first disease level information may each be represented by a score or a grade, and both use the same representation.
The severity relationship can therefore be obtained by comparing the two; it characterizes which of the second disease level information and the first disease level information corresponds to the more severe condition.
S74, determining the disease recovery condition information according to the severity relationship.
Specifically, since the severity relationship characterizes which of the two disease levels is more severe, whether the target user's condition has worsened or recovered can be determined from it, and the disease recovery condition information can then be determined.
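Steps S73 and S74 could be sketched as a simple comparison, assuming both disease levels use the same numeric scale with higher values meaning a more severe condition (as the same-representation requirement implies); the status labels are illustrative assumptions.

```python
# Hypothetical sketch of S73-S74: compare the review's disease level against
# the initial level (same numeric scale assumed) and report the recovery status.
def recovery_status(first_level, second_level):
    """Returns the disease recovery condition derived from the severity relation."""
    if second_level < first_level:
        return "improving"   # condition has recovered to some degree
    if second_level > first_level:
        return "worsening"   # condition has become more serious
    return "unchanged"

status = recovery_status(first_level=3, second_level=1)
```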
S75, determining the corresponding avatar state information according to the disease recovery condition information.
Specifically, the avatar state information may be information characterizing the mental state of the avatar, such as a trembling, listless state.
Alternatively, the avatar state information may be information for adjusting the facial emotional state and/or the body state.
Furthermore, the method of this embodiment can adjust the avatar as a whole, making the display more vivid.
In some embodiments of the method, the step S8 of updating the second avatar according to the disease recovery condition information to obtain a third avatar includes the following steps S81 and S82:
S81, adjusting the second target virtual part according to the disease recovery condition information, and adjusting the image state of the second avatar according to the avatar state information to obtain a third avatar.
Specifically, after the disease recovery condition information is obtained, the second target virtual part may be adjusted according to a preset adjustment policy. For example, when evaluating the recovery of a fracture site: if recovery is good and the condition has improved, the second target virtual part is displayed in light red, and corresponding diet and medication suggestions can be given; if recovery is poor, the second target virtual part is displayed in yellow as a warning.
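The preset adjustment policy in the fracture example above (light red for good recovery, yellow as a warning) might look like the following sketch; the colour codes, status names, and fallback rendering are assumptions.

```python
# Hypothetical preset adjustment policy: map the recovery status of the
# target virtual part to a display colour and an accompanying note.
ADJUSTMENT_POLICY = {
    "improving": {"color": "#FFB6B6",  # light red: recovery good, condition improved
                  "note": "recovery good; diet and medication advice follows"},
    "worsening": {"color": "#FFFF00",  # yellow: warning, condition not improving
                  "note": "warning: condition not improving"},
}

def adjust_part(status):
    # fall back to a neutral rendering when the status has no preset entry
    return ADJUSTMENT_POLICY.get(status, {"color": "#CCCCCC", "note": "unchanged"})

rendering = adjust_part("improving")
```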
The image state may include facial expression and posture.
Therefore, the facial expression and posture of the second avatar can be adjusted according to the avatar state information.
Step S82, the third avatar is associated with the target user.
Specifically, after the third avatar is obtained, it can be uploaded to a server that stores patient information and associated with the personal information of the patient (i.e., the target user), so that the patient can later view the electronic medical record, including the avatar display, by entering his or her personal information.
After the second target virtual part and the image state have been adjusted, the third avatar is obtained.
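Step S82's association of avatars with the target user could be sketched as below, with an in-memory dictionary standing in for the server that stores patient information; all names are hypothetical.

```python
# Hypothetical sketch of step S82: store each stage's avatar against the
# patient's personal information, so that entering that information later
# retrieves the whole sequence of avatars (first, second, third, ...).
records = {}  # stands in for the patient-information server

def associate_avatar(user_id, avatar):
    # append this stage's avatar to the user's record
    records.setdefault(user_id, []).append(avatar)

def query_avatars(user_id):
    # entering personal information retrieves every stage's avatar
    return records.get(user_id, [])

associate_avatar("patient-001", "second_avatar")
associate_avatar("patient-001", "third_avatar")
history = query_avatars("patient-001")
```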
Therefore, with the scheme of this embodiment, corresponding avatars can be displayed in the patient's medical record at each stage of the treatment process. Doctors and the patient can review the record at every stage; the treatment situation at different stages is displayed through the avatars, and the changes in the avatars show, step by step, how the patient's rehabilitation progressed over the whole course of treatment, letting the patient understand his or her own recovery.
Because the third avatar is associated with the target user, the electronic medical record and the avatar display can be retrieved from the designated server using only the target user's personal information, preserving continuity even if the attending doctor changes.
In summary, this embodiment combines virtual reality technology to display the results of each treatment stage three-dimensionally and visually, with feedback on the recovery situation and corresponding suggestions, forming a more intuitive and understandable three-dimensional personalized medical record. Patients and doctors gain a better understanding of the medical history, doctor-patient communication becomes more efficient, professional medical records become more accessible to patients, and the patient's experience of seeking medical care is improved.
As shown in fig. 5, according to an embodiment of another aspect of the present application, there is also provided a virtual reality-based user data processing apparatus, including:
the first acquisition module 1 is used for acquiring image characteristic data of a target user;
the generating module 2 is used for generating a first virtual image corresponding to the target user according to the image characteristic data;
the second acquisition module 3 is used for acquiring first health state information of a target user;
and the adjusting module 4 is used for adjusting the first virtual image according to the first health state information to obtain a second virtual image corresponding to the target user.
Specifically, for the process by which each module of the apparatus in this embodiment implements its function, reference may be made to the related description in the method embodiments, which is not repeated here.
According to another embodiment of the present application, there is also provided an electronic device. As shown in fig. 6, the electronic device may include: a processor 1501, a communication interface 1502, a memory 1503 and a communication bus 1504, wherein the processor 1501, the communication interface 1502 and the memory 1503 communicate with each other through the communication bus 1504.
A memory 1503 for storing a computer program;
the processor 1501 is configured to implement the steps of the above-described method embodiments when executing the program stored in the memory 1503.
The bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The embodiment of the present application further provides a storage medium, where the storage medium includes a stored program, and the program executes the method steps of the foregoing method embodiment when running.
It is noted that, in this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (11)

1. A user data processing method based on virtual reality is characterized by comprising the following steps:
acquiring image characteristic data of a target user;
generating a first virtual image corresponding to the target user according to the image characteristic data;
acquiring first health state information of the target user;
and adjusting the first virtual image according to the first health state information to obtain a second virtual image corresponding to the target user.
2. The method of claim 1, wherein:
the acquiring of the image characteristic data of the target user comprises:
acquiring scanning data obtained by three-dimensional scanning of the target user, and/or acquiring two-dimensional image information obtained by image shooting of the target user;
obtaining the image characteristic data according to the scanning data and/or the two-dimensional image information;
generating a first avatar corresponding to the target user according to the avatar characteristic data, including:
and performing three-dimensional modeling through the scanning data and/or the two-dimensional image information to generate a first virtual image corresponding to the target user.
3. The method of claim 1, wherein said adjusting the first avatar according to the first health status information to obtain a second avatar corresponding to the target user comprises:
determining first part information and first disease level information corresponding to the first health state information;
positioning in the first virtual image according to the first part information to obtain a first target virtual part;
adjusting the first target virtual part according to the first disease level information to obtain a second virtual image corresponding to the target user;
associating the first avatar and second avatar with the target user.
4. The method of claim 3, wherein adjusting the first target virtual site based on the first condition level information comprises:
determining a highlight display mode corresponding to the first disease condition grade information according to a preset corresponding relation;
and rendering the first target virtual part according to the highlighting mode to obtain a rendered target virtual part.
5. The method of claim 1, further comprising:
acquiring treatment information corresponding to the first health state information; the therapy information includes at least one of: rehabilitation plans, medical advice and drug prescriptions;
generating virtual indication information according to the treatment information;
and adding the virtual indication information into a display interface for displaying the second virtual image.
6. The method of claim 1, further comprising:
acquiring second health state information of the target user; the first health state information and the second health state information correspond to the same second part information in the second virtual image, and the acquisition time of the second health state information is later than that of the first health state information;
determining the disease recovery condition information of the target user according to the second health state information and the first health state information;
and updating the second virtual image according to the disease recovery condition information to obtain a third virtual image.
7. The method of claim 6, wherein said determining the disease recovery condition information of the target user according to the second health state information and the first health state information comprises:
determining second disease level information corresponding to the second health state information;
positioning in the second virtual image according to the second part information to obtain a second target virtual part;
determining a severity relationship between the second condition level information and the first condition level information; the first disease condition level information is disease condition level information corresponding to the first health state information;
determining the disease recovery condition information according to the severity relation;
and determining corresponding virtual image state information according to the disease recovery condition information.
8. The method of claim 7, wherein said updating said second avatar according to said medical condition recovery information to obtain a third avatar comprises:
adjusting the second target virtual part according to the disease recovery condition information, and adjusting the image state of the second virtual image according to the virtual image state information to obtain a third virtual image;
associating the third avatar with the target user.
9. A virtual reality-based user data processing apparatus, comprising:
the first acquisition module is used for acquiring image characteristic data of a target user;
the generating module is used for generating a first virtual image corresponding to the target user according to the image characteristic data;
the second acquisition module is used for acquiring the first health state information of the target user;
and the adjusting module is used for adjusting the first virtual image according to the first health state information to obtain a second virtual image corresponding to the target user.
10. An electronic device, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory are communicated with each other through the communication bus;
the memory is used for storing a computer program;
the processor, when executing the computer program, implementing the method steps of any of claims 1-8.
11. A storage medium, characterized in that the storage medium comprises a stored program, wherein the program is operative to perform the method steps of any of the preceding claims 1-8.
CN202011051172.3A 2020-09-29 2020-09-29 User data processing method and device based on virtual reality Withdrawn CN112215969A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011051172.3A CN112215969A (en) 2020-09-29 2020-09-29 User data processing method and device based on virtual reality


Publications (1)

Publication Number Publication Date
CN112215969A true CN112215969A (en) 2021-01-12

Family

ID=74051512


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114842985A (en) * 2022-06-30 2022-08-02 北京超数时代科技有限公司 Virtual reality diagnosis and treatment system under meta-universe scene
CN115810073A (en) * 2022-12-19 2023-03-17 支付宝(杭州)信息技术有限公司 Virtual image generation method and device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020005888A (en) * 2000-07-10 2002-01-18 황인택 Method and system for providing remote health care service
KR20130029576A (en) * 2011-09-15 2013-03-25 주식회사 케이티 Method for serving health related information and poi information based on cloud and system therefor
KR20140062659A (en) * 2012-11-14 2014-05-26 경북대학교 산학협력단 U-healthcare system and method for providing u-healthcare information using health-avatar
KR20150124422A (en) * 2015-10-12 2015-11-05 중앙대학교 산학협력단 Virtual experience apparatus for curing disease and method thereof
KR20160119025A (en) * 2016-09-30 2016-10-12 중앙대학교 산학협력단 Virtual experience apparatus for curing disease and method thereof
CN107491661A (en) * 2017-09-30 2017-12-19 深圳前海卓岳科技发展有限公司 A kind of medical record management method, apparatus, equipment and system
CN110970105A (en) * 2019-12-04 2020-04-07 深圳追一科技有限公司 Physical examination report broadcasting method and device, electronic equipment and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210112