CN115983051B - Method and system for interactive simulation of electronic pet by virtual pet - Google Patents

Method and system for interactive simulation of electronic pet by virtual pet

Info

Publication number: CN115983051B (application CN202310266436.4A; other version: CN115983051A)
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: user, information, virtual pet, interaction, pet
Inventors: 宋程, 刘保国, 胡金有, 吴浩, 梁开岩, 郭玮鹏, 李学奇, 范存龙, 李海, 巩京京
Current and original assignee: Xingchong Kingdom Beijing Technology Co ltd (the listed assignee may be inaccurate; Google has not performed a legal analysis)
Application filed by Xingchong Kingdom Beijing Technology Co ltd
Priority to CN202310266436.4A (the priority date is an assumption and is not a legal conclusion)
Publication of CN115983051A, followed by grant and publication of CN115983051B
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A — TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 — Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a method and a system for interactive simulation of an electronic pet by a virtual pet. The method comprises: collecting user characteristic information and establishing a virtual pet model from that information; having the virtual pet model learn the virtual pet's interactive behavior from the user characteristic information; and outputting the learned virtual pet image, then formulating and implementing virtual pet interaction rules. The method can compute an adaptive interaction scheme fitted to the user's actual needs, and the virtual pet learns and trains according to that scheme. Under certain special conditions it can also call for outside help, or provide practical assistance through a third-party platform, so as to meet the user's actual needs and provide a high level of emotional value, solving the problem that traditional virtual pets cannot satisfy the emotional needs and emotional attachment of contemporary people.

Description

Method and system for interactive simulation of electronic pet by virtual pet
Technical Field
The invention relates to the technical field of virtual pet simulation, and in particular to a method and system for interactive simulation of an electronic pet by a virtual pet.
Background
Current simulated electronic virtual pets serve only as simple electronic figures, leaving a large gap between them and real pets, which can provide both emotional value and practical help. Today's virtual pets support only simple interactions with the user, such as feeding and stroking; the emotional value the user gains from such interactions is very limited, and the user cannot obtain deeper, practical help. As a result, user stickiness of simulated electronic virtual pets is low and short-lived, so improving the practical utility of the virtual pet is the key to solving the current problem. Traditional virtual pets are also confined to cartoon figures on a two-dimensional plane and lack the technical capability of real-time updating, which makes it difficult to update the pet's appearance, behavior, sound, size, and so on, and leads to highly repetitive interactive responses. Traditional virtual pets therefore cannot satisfy the emotional needs and emotional attachment of contemporary people.
Disclosure of Invention
This section is intended to outline some aspects of embodiments of the invention and to briefly introduce some preferred embodiments. Some simplifications or omissions may be made in this section, in the abstract, and in the title of the application to avoid obscuring their purpose; these simplifications and omissions should not be used to limit the scope of the invention.
The present invention has been made in view of the above-described problems occurring in the prior art.
Therefore, the invention provides a method and a system for interactive simulation of electronic pets by using virtual pets, which can solve the problem that the traditional virtual pets cannot meet the emotion requirements and emotion consignment of contemporary people.
In order to solve the technical problems, the invention provides a method for interactively simulating an electronic pet by a virtual pet, which comprises the following steps:
collecting user characteristic information, and establishing a virtual pet model according to the information;
the virtual pet model learns the interactive behavior of the virtual pet through the user characteristic information;
outputting the learned virtual pet image, and making a virtual pet interaction rule;
and interacting with the virtual pet according to the virtual pet interaction rules.
As a preferable scheme of the virtual pet interactive simulation electronic pet method of the invention, wherein: the user characteristic information comprises video information of pets the user likes, and static and dynamic photo information of pets the user has saved or liked.
As a preferable scheme of the virtual pet interactive simulation electronic pet method of the invention, wherein: establishing the virtual pet model according to the information comprises analyzing, from the video information of pets the user likes and the static and dynamic photo information of pets the user has saved or liked, the user's preferred sign information (pet breed, coat color, coat length, pet sound, call volume, behavioral habits, and character traits), and associating the habits of the relevant pets to establish the virtual pet model.
As a preferable scheme of the virtual pet interactive simulation electronic pet method of the invention, wherein: the virtual pet interaction behavior comprises daily interaction with a user, real-time retrieval of personal health conditions of the user, prediction of user requirements and timely response to the user requirements.
As a preferable scheme of the virtual pet interactive simulation electronic pet method of the invention, wherein: the daily interaction with the user comprises the virtual pet constructing a motion function from the user information, estimating the user's workload on the current day from the user's amount of movement that day, and deducing the interaction duration and intensity the virtual pet needs to provide that day;
the interaction rule comprises that, after being authorized by the user, the virtual pet records and stores the past daily exercise step counts and daily activity ranges, and constructs a motion function from the recorded content. The motion function (reconstructed from the variable definitions; the original gives it only as an image) is the ratio of the current day's activity to the average over the traceable days:

f = x / ((1/n) · Σᵢ₌₁ⁿ xᵢ)

where n represents the total number of traceable days, x represents the user's step count or range of motion on the current day, i indexes the i-th of the n days, and xᵢ represents the actual step count or range of motion on day i;

when f ≤ 0.3, the virtual pet and its related interaction information are allowed to pop up automatically, the number of active interactions between the virtual pet and the user is set to fluctuate between 50% and 100% of the average daily interaction count, and the frequency of long actions or long voice during active interaction is raised to 112% of the average daily interaction frequency;

when 0.3 < f < 1, the related interaction information of the virtual pet is allowed to pop up automatically, the number of active interactions between the virtual pet and the user is set to fluctuate between 20% and 30% of the average daily interaction count, and the frequency of long actions or long voice during active interaction is kept equal to the previous day;

when f ≥ 1, the pop-up frequency is reduced to 60% of the average daily pop-up frequency, long-action or long-voice interaction between 8:00 and 18:00 on that day is cancelled, and no long-action or long-voice interaction is performed unless the user actively requests it.
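The motion-function rules can be sketched as a short routine. This is a minimal illustration under the reconstruction of the motion function as the ratio of the current day's activity to the historical daily average; the function and key names are hypothetical, not from the patent:

```python
def activity_ratio(today, history):
    """Motion function: today's steps (or range) over the average of the n traceable days."""
    if not history:
        return 1.0
    avg = sum(history) / len(history)  # (1/n) * sum(x_i for i in 1..n)
    return today / avg if avg > 0 else 1.0

def interaction_policy(f):
    """Map the motion-function value f onto the pop-up/interaction rules above."""
    if f <= 0.3:
        # quiet day: more active interaction; long actions/voice at 112% frequency
        return {"auto_popup": True, "active_fluct": (0.50, 1.00), "long_freq": 1.12}
    if f < 1.0:
        # typical day: moderate active interaction; long actions/voice as yesterday
        return {"auto_popup": True, "active_fluct": (0.20, 0.30), "long_freq": 1.00}
    # busy day: pop-ups reduced to 60%, no long actions/voice between 8:00 and 18:00
    return {"auto_popup": False, "popup_scale": 0.60, "quiet_hours": (8, 18)}
```

For example, 2000 steps against a 9000-step historical average gives f ≈ 0.22, which selects the most active policy.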
As a preferable scheme of the virtual pet interactive simulation electronic pet method of the invention, wherein: the real-time retrieval of the user's personal health condition comprises the virtual pet inferring the user's physical state from the acquired sign data, and at the same time providing actual assistance to the user according to that state;
the interaction rule further comprises inferring the user's physical state: after being authorized by the user, the virtual pet obtains the local climate temperature of the day and measures the user's body temperature; if the user's body temperature is below 37.3 °C, the virtual pet normally broadcasts the day's weather conditions and the season's epidemic diseases and their preventive measures;
if the user's body temperature falls within the first elevated range (the bounds are given only as an image in the original), the virtual pet, while acquiring information on the current season's epidemic diseases and their preventive measures, searches the commercial pharmacies and clinics around the user, and after user authorization further searches whether those pharmacies and clinics sell the medicine the user needs;
the search function of the search range consists of three formulas (given only as images in the original), in which W represents the health influencing-factor function, k represents the influence factor, t represents the user's body temperature, α represents the influencing-factor parameter, B represents the user's physical characteristic parameter (from the definitions of y and z this is the body-mass index, B = y / z²), B_min represents the preset minimum of the physical characteristic parameter, set to 18.5, B_max represents the preset maximum of the physical characteristic parameter, set to 23.9, y represents the user's weight, and z represents the user's height;
if the user is male, the male-specific forms of the formulas apply (given only as images in the original); when B is below 18.5 or above 23.9, the two search-range parameters are set to 1250 and 48125 respectively;
if the user is female, the female-specific forms of the formulas apply (given only as images in the original); when B is below 18.5 or above 23.9, the two search-range parameters are set to 750 and 28875 respectively;
when B lies within the range [18.5, 23.9], B is not substituted into the formula; the operation is performed directly, with the corresponding parameter taking the value preset by the user;
if the user's body temperature falls within the second, higher range (the bounds are given only as an image in the original), the commercial pharmacies and clinics around the user are searched, and a deep search is made for whether the surrounding pharmacies and clinics sell the medicine the user needs; after user authorization, the purchase and delivery of the medicine is carried out directly through a third-party platform, and the user is asked whether their physical condition should be synchronized to the reserved emergency contact;
if the user's body temperature falls within the highest range (the bounds are given only as an image in the original), the user is reminded to take a second measurement within 10 minutes. If the two measurement results differ by less than 1 degree, the user's physical condition is sent to the user's reserved emergency contact; if there is no response for more than one hour, the emergency contact is called directly, the virtual pet calls for help through voice dialogue, and the surrounding pharmacies and clinics are searched for the required medicine, with the search results synchronized to the phones of the user and the emergency contact. If the user does not take the second measurement within 10 minutes and the virtual pet's call goes unanswered for half an hour, the virtual pet directly dials the emergency contact and explains the need for help.
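The escalation ladder above can be sketched as follows. The 37.3 °C normal threshold, the bounds 18.5 and 23.9, and B = y/z² come from the description; the upper band edges (38.5 °C and 39.0 °C) and all names are hypothetical assumptions, since the patent gives those ranges only as images:

```python
NORMAL_MAX = 37.3  # deg C, stated in the description
BAND1_MAX = 38.5   # hypothetical upper edge of the first elevated range
BAND2_MAX = 39.0   # hypothetical upper edge of the second range

def bmi(weight_kg, height_m):
    """Physical characteristic parameter B = y / z**2, preset bounds 18.5 and 23.9."""
    return weight_kg / (height_m ** 2)

def respond_to_temperature(t):
    """Pick the response tier for a measured body temperature t (deg C)."""
    if t < NORMAL_MAX:
        return "broadcast weather and seasonal epidemic-prevention information"
    if t < BAND1_MAX:
        return "search nearby pharmacies and clinics for the needed medicine"
    if t < BAND2_MAX:
        return "order medicine via a third-party platform; offer to sync status to emergency contact"
    return "ask for a second measurement within 10 minutes; escalate to the emergency contact"
```

A 70 kg, 1.75 m user has B ≈ 22.9, inside the preset [18.5, 23.9] range, so B would not be substituted into the search formula.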
As a preferable scheme of the virtual pet interactive simulation electronic pet method of the invention, wherein: predicting the user's needs and responding to them in time comprises counting the duration of voice interaction between the user and the virtual pet image and the time pattern of that interaction, giving key weight in the calculation to voice interaction in the period from 17:00 in the evening to 8:00 in the morning, and, after user authorization, actively popping up in that period to send a voice interaction request.
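The demand-prediction statistic amounts to weighting voice interactions that fall in the 17:00 to 08:00 window; a minimal sketch, with function names that are illustrative assumptions:

```python
def in_key_window(hour):
    """True for hours in the heavily weighted 17:00 to (next day) 08:00 window."""
    return hour >= 17 or hour < 8

def key_window_share(interaction_hours):
    """Fraction of recorded voice interactions that fall inside the key window."""
    if not interaction_hours:
        return 0.0
    return sum(1 for h in interaction_hours if in_key_window(h)) / len(interaction_hours)
```

A high share would justify scheduling the active pop-up and voice request inside that window.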
Another object of the present invention is to provide a virtual pet interactive simulation electronic pet system which, by implementing the virtual pet interactive simulation electronic pet method, can solve the problem that existing virtual pets cannot meet the practical needs, emotional needs, and emotional attachment of contemporary people.
As a preferable scheme of the virtual pet interactive simulation electronic pet system of the invention, wherein: the system comprises,
the system control module, responsible for the processing logic and information interaction of each module; it inputs processing results into the virtual pet image, and the virtual pet acts on and gives feedback to the input information;
the information acquisition module, used for acquiring the user characteristic information and the third-party-platform data that the virtual pet needs, and feeding the processed result back to the system control module;
the temperature sensing module, which senses the user's body-surface temperature information and feeds the processed result back to the system control module;
the voice interaction module, used for reading the user's interactive speech, performing speech recognition, feeding the result back to the system control module, and operating the virtual pet;
the background image processing module, responsible for shooting the current surrounding background picture in real time and fusing the virtual pet image with the current environment picture, finally generating a more realistic background.
As a preferable scheme of the virtual pet interactive simulation electronic pet system of the invention, wherein: the system control module comprises a system logic processing unit, an information acquisition unit, a temperature processing unit, a voice interaction processing unit, and a background image processing unit. The system logic processing unit is connected respectively with the information acquisition unit, the temperature processing unit, the voice interaction processing unit, and the background image processing unit; the information acquisition unit is connected with the information acquisition module; the temperature processing unit is connected with the temperature sensing module; the voice interaction processing unit is connected with the voice interaction module; and the background image processing unit is connected with the background image processing module.
As a preferable scheme of the virtual pet interactive simulation electronic pet system of the invention, wherein: the information acquisition module comprises an information processing system and an information transmission system. The information transmission system browses and pre-screens the information the user has authorized it to view; after user authorization, the pre-screened information is transmitted through the information transmission system to the information processing system. The information processing system classifies and integrates the input information data, extracts the key parameters the system requires, and inputs those parameters into the information acquisition module, which transmits them to the system control module for unified processing. The system control module then imports the obtained key parameter information into the system logic processing unit and processes the parameters logically according to the interaction rules held there;
the temperature sensing module comprises a thermometer and the temperature processing unit, connected in sequence. The thermometer obtains the user's body-surface temperature in the current state and inputs it into the temperature processing unit, the information acquisition unit, and the system logic processing unit. The system logic processing unit processes the input temperature information according to the interaction rules and transmits the information acquisition rules to the temperature processing unit and the information acquisition unit through the system control module. The temperature processing unit processes, archives, and uploads the temperature information it receives and transmits the processed information to the virtual pet image; meanwhile, the information acquisition unit performs a second, conditional screening of the authorized information according to the information obtained from the thermometer and the system logic processing unit, and transmits the screened information to the system control module, which uniformly operates on or modifies the virtual pet's interaction parameters;
the voice interaction module comprises a microphone, a voice recognition module, and a voice interaction processing module. The microphone acquires voice input from the user and from third parties; the voice recognition module recognizes the user's voice content and the source of third-party voices; and the voice interaction processing module performs the preprocessing for the operation specified by the voice content of the user or an authorized third party. All information acquired by the voice interaction module is synchronized to the information acquisition unit, which extracts and processes the key information through the system control module. If the user cannot respond to third-party voice content in time, the content is responded to according to the interaction rules of the system logic processing unit. Finally, all information is transmitted to the voice interaction processing unit, which performs the operation corresponding to the voice;
the background image processing module comprises a camera and an image fusion module. The camera shoots a photo of the environment and transmits it to the image fusion module, which obtains the current virtual pet image information from the background image processing unit and the information acquisition unit of the system control module, fuses the virtual pet image with the transmitted environment picture, and obtains and displays the virtual pet image against the current environmental background.
The invention has the beneficial effects that: the method establishes the virtual pet model from user characteristic information that can be updated in real time, and obtains a simulated electronic virtual pet with practical utility based on the motion-function and search-range function models. The two models can be computed over multiple influencing factors to obtain an adaptive interaction scheme fitted to the user's actual needs, and the virtual pet learns and trains according to that scheme; under certain special conditions it can also call for outside help, or provide practical assistance through a third-party platform, so as to meet the user's actual needs and provide a high level of emotional value. Because the virtual pet of the invention can learn, the richness of its behavior, content, and play far exceeds that of prior-art electronic pets, thereby solving the problem that existing virtual pets cannot satisfy the emotional needs and emotional attachment of contemporary people.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art. Wherein:
Fig. 1 is a schematic flow chart of a virtual pet interactive simulation electronic pet method according to an embodiment of the present invention.
Fig. 2 is a schematic flow chart of a virtual pet interactive simulation electronic pet system module according to an embodiment of the present invention.
Detailed Description
So that the manner in which the above recited objects, features and advantages of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways other than those described herein, and persons skilled in the art will readily appreciate that the present invention is not limited to the specific embodiments disclosed below.
Further, reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic can be included in at least one implementation of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
While the embodiments of the present invention have been illustrated and described in detail in the drawings, the cross-sectional view of the device structure is not to scale in the general sense for ease of illustration, and the drawings are merely exemplary and should not be construed as limiting the scope of the invention. In addition, the three-dimensional dimensions of length, width and depth should be included in actual fabrication.
Also in the description of the present invention, it should be noted that the orientation or positional relationship indicated by the terms "upper, lower, inner and outer", etc. are based on the orientation or positional relationship shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the apparatus or elements referred to must have a specific orientation, be constructed and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first, second, or third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The terms "mounted, connected, and coupled" should be construed broadly in this disclosure unless otherwise specifically indicated and defined, such as: can be fixed connection, detachable connection or integral connection; it may also be a mechanical connection, an electrical connection, or a direct connection, or may be indirectly connected through an intermediate medium, or may be a communication between two elements. The specific meaning of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
Example 1
Referring to fig. 1, a first embodiment of the present invention provides a virtual pet interactive simulation electronic pet method, comprising:
s1: and collecting user characteristic information, and establishing a virtual pet model according to the information.
Further, the user characteristic information comprises video information of pets favored by users and static and dynamic photo information of pets stored or favored by users.
It should be noted that establishing the virtual pet model according to the information comprises analyzing, from the video information of pets the user likes and the static and dynamic photo information of pets the user has saved or liked, the user's preferred sign information (pet breed, coat color, coat length, pet sound, call volume, behavioral habits, and character traits), and associating the habits of the relevant pets to establish the virtual pet model.
S2: and the virtual pet model learns the interactive behavior of the virtual pet through the user characteristic information.
Further, the virtual pet interaction behavior comprises daily interaction with the user, real-time retrieval of the personal health condition of the user, prediction of the user requirement and timely response to the user requirement.
It should be noted that the daily interaction with the user includes that the virtual pet builds a motion function according to the user information, estimates the daily workload of the user through the motion quantity of the user on the same day, and deduces the interaction duration and degree that the virtual pet is required to provide on the same day.
It should be noted that the real-time retrieval of the personal health condition of the user includes that the virtual pet presumes the physical state of the user through the obtained physical sign data of the user, and meanwhile provides actual assistance for the user according to the physical state of the user.
It should be noted that predicting the user's needs and responding to them in time comprises counting the duration of voice interaction between the user and the virtual pet image and the time pattern of that interaction, giving key weight in the calculation to voice interaction in the period from 17:00 in the evening to 8:00 in the morning, and, after user authorization, actively popping up in that period to send a voice interaction request.
S3: outputting the learned virtual pet image and making the virtual pet interaction rule.
Further, the interaction rule includes that, after being authorized by the user, the virtual pet records and stores the past daily exercise step counts and daily movement ranges, and constructs a motion function from the recorded content. The motion function (reconstructed from the variable definitions; the original gives it only as an image) is the ratio of the current day's activity to the average over the traceable days:

f = x / ((1/n) · Σᵢ₌₁ⁿ xᵢ)

where n represents the total number of traceable days, x represents the user's step count or range of motion on the current day, i indexes the i-th of the n days, and xᵢ represents the actual step count or range of motion on day i.
It should be noted that when f ≤ 0.3, the virtual pet and its related interaction information are allowed to pop up automatically, the number of active interactions between the virtual pet and the user is set to fluctuate between 50% and 100% of the average daily interaction count, and the frequency of long actions or long voice during active interaction is raised to 112% of the average daily interaction frequency.
Further, when 0.3 < f < 1, the related interaction information of the virtual pet is allowed to pop up automatically, the number of active interactions between the virtual pet and the user is set to fluctuate between 20% and 30% of the average daily interaction count, and the frequency of long actions or long voice during active interaction is kept equal to the previous day.
It should be noted that when f ≥ 1, the pop-up frequency is reduced to 60% of the average daily pop-up frequency, long-action or long-voice interaction between 8:00 and 18:00 on that day is cancelled, and no long-action or long-voice interaction is performed unless the user actively requests it.
Further, the interaction rule also includes inferring the user's physical state: after being authorized by the user, the virtual pet obtains the local climate temperature of the day and measures the user's body temperature; if the user's body temperature is below 37.3 °C, the virtual pet normally broadcasts the day's weather conditions and the season's epidemic diseases and their preventive measures;
it should be noted that if the user's body temperature falls within the first elevated range (the bounds are given only as an image in the original), the virtual pet, while acquiring information on the current season's epidemic diseases and their preventive measures, searches the commercial pharmacies and clinics around the user, and after user authorization further searches whether those pharmacies and clinics sell the medicine the user needs.
The search function of the search range consists of three formulas (given only as images in the original), in which W represents the health influencing-factor function, k represents the influence factor, t represents the user's body temperature, α represents the influencing-factor parameter, B represents the user's physical characteristic parameter (from the definitions of y and z this is the body-mass index, B = y / z²), B_min represents the preset minimum of the physical characteristic parameter, set to 18.5, B_max represents the preset maximum of the physical characteristic parameter, set to 23.9, y represents the user's weight, and z represents the user's height;
further, if the user is male [sex-specific formula images not reproduced in the source]: when B is below 18.5 or above 23.9, λ is set to 1250 and μ is set to 48125;
it should be noted that if the user is female [sex-specific formula images not reproduced in the source]: when B is below 18.5 or above 23.9, λ is set to 750 and μ is set to 28875;
further, when B lies within the range [18.5, 23.9], B is not substituted into H; the operation is performed directly, with λ and μ taking values preset by the user;
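A sketch of the parameter selection above, under the assumption that the physical characteristic parameter B is the body-mass index y/z² (weight in kg, height in m) and that the two sex-specific constants map to the influence factor and the influencing factor parameter in that order; all names here are illustrative, not from the patent:

```python
B_MIN, B_MAX = 18.5, 23.9  # preset bounds of the physical characteristic parameter

def physical_characteristic(weight_kg: float, height_m: float) -> float:
    """B = y / z**2 (body-mass-index form, assumed from the y/z definitions)."""
    return weight_kg / height_m ** 2

def factor_parameters(sex: str, b: float, user_preset=(0, 0)):
    """Return (influence factor, influencing factor parameter).

    Outside [B_MIN, B_MAX] the sex-specific constants from the text apply;
    inside the range, B is not substituted and user-preset values are used.
    """
    if B_MIN <= b <= B_MAX:
        return user_preset
    return (1250, 48125) if sex == "male" else (750, 28875)

b = physical_characteristic(85.0, 1.75)   # about 27.8, above B_MAX
params = factor_parameters("male", b)     # sex-specific constants apply
```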
It should be noted that if the user's body temperature falls within the second elevated range [formula image not reproduced in the source], open pharmacies and clinics around the user are searched, whether those nearby pharmacies and clinics have the medicines the user needs on sale is further searched, and, after user authorization, purchase and delivery of the medicines is arranged directly through a third-party platform; the user is also asked whether his or her physical condition should be synchronized to the reserved emergency contact;
it should also be noted that if the user's body temperature falls within the third (highest) range [formula image not reproduced in the source], the user is reminded to take a second measurement within 10 minutes. If the difference between the two measurement results is less than 1 degree, the user's physical condition is sent to the emergency contact reserved by the user, and if there is no response for more than one hour the emergency contact is called directly; the virtual pet calls for help through voice dialogue and searches pharmacies or clinics in the surrounding environment for the required medicines, the search results being uploaded synchronously to the mobile phones of the user and the emergency contact. If the user does not take the second measurement within 10 minutes and does not respond to the virtual pet's calls for half an hour, the virtual pet directly dials the emergency contact's telephone and explains the need for help.
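The escalation ladder above can be sketched as follows. The two upper temperature thresholds are illegible formula images in the source, so they are passed in as parameters here; the thresholds used in the example are illustrative assumptions, not the patent's actual values:

```python
NORMAL_MAX = 37.3  # deg C, the only threshold legible in the text

def temperature_action(temp_c, band1_max, band2_max):
    """Map a measured body temperature to the response level described above.
    band1_max / band2_max stand in for the two unreadable thresholds."""
    if temp_c < NORMAL_MAX:
        return "broadcast weather and seasonal epidemic info"
    if temp_c < band1_max:
        return "search nearby pharmacies and clinics"
    if temp_c < band2_max:
        return "arrange purchase via third-party platform; offer to notify contact"
    return "request second measurement; escalate to emergency contact if unresponsive"

# illustrative thresholds only
action = temperature_action(37.6, 38.0, 38.5)  # falls in the first elevated band
```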
S4: interaction with the virtual pet is performed according to the virtual pet interaction rules.
Example 2
Referring to figs. 1-2, in one embodiment of the present invention, a virtual pet interactive simulation electronic pet system is provided; its advantageous effects are verified experimentally.
A virtual pet interactive simulation electronic pet system for implementing the method (taking fig. 1 as an example), characterized by comprising:
the system control module, which is responsible for the processing logic and the information interaction and transmission of each module; it inputs processing results into the virtual pet image, and the virtual pet operates on and gives feedback to the input information. The information acquisition module acquires the user characteristic information and the third-party-platform data required by the virtual pet, and feeds the processed result back to the system control module. The temperature sensing module senses the user's body surface temperature and feeds the processed result back to the system control module. The voice interaction module reads the user's interactive speech, performs speech recognition, feeds the result back to the system control module, and operates the virtual pet. The background image processing module captures the current surrounding background in real time and performs image fusion between the virtual pet and the current environment picture, finally generating a more realistic background.
The system control module comprises a system logic processing unit, an information acquisition unit, a temperature processing unit, a voice interaction processing unit and a background image processing unit, wherein the system logic processing unit is respectively connected with the information processing unit, the information acquisition unit, the temperature processing unit, the voice interaction processing unit and the background image processing unit, the information acquisition unit is connected with the information acquisition module, the temperature processing unit is connected with the temperature sensing module, the voice interaction processing unit is connected with the voice interaction module, and the background image processing unit is connected with the background image processing module.
The information acquisition module comprises an information processing system and an information transmission system. The information transmission system browses and pre-screens the information it is authorized to view; after user authorization, the pre-screened information is transmitted to the information processing system through the information transmission system. The information processing system classifies and integrates the input information data, extracts the key parameters required by the system, and inputs those parameters into the information acquisition module, which transmits them to the system control module for unified processing. Meanwhile, the system control module imports the obtained key parameter information into the system logic processing unit and performs logic processing on the parameters according to the interaction rules in that unit.
The temperature sensing module comprises a thermometer and the temperature processing unit, connected in sequence. The thermometer obtains the user's current body surface temperature, which is input into the temperature processing unit, the information acquisition unit and the system logic processing unit. The system logic processing unit processes the input temperature information according to the interaction rules and transmits the information acquisition rules to the temperature processing unit and the information acquisition unit through the system control module. The temperature processing unit processes, archives and uploads the temperature information according to what it receives, and transmits the processed temperature information to the virtual pet image. Meanwhile, the information acquisition unit performs a conditional secondary screening of the authorized information according to the information obtained from the thermometer and the system logic processing unit, and transmits the screened information to the system control module, which uniformly operates on or modifies the interaction parameters of the virtual pet.
The voice interaction module comprises a microphone, a voice recognition module and a voice interaction processing module. The microphone acquires voice input from the user and from third parties; the voice recognition module recognizes the user's voice content and the source of third-party voice; the voice interaction processing module pre-processes the operations specified by the voice content of the user or an authorized third party according to the recognized content. All information acquired by the voice interaction module is synchronized to the information acquisition unit, which extracts and processes the key information through the system control module. When the user cannot respond to third-party voice content in time, the voice content is responded to according to the interaction rules of the system logic processing unit. Finally, all information is transmitted to the voice interaction processing unit, which performs the operation corresponding to the voice.
The background image processing module comprises a camera and an image fusion module. The camera captures a photo of the environment and transmits it to the image fusion module, which obtains the current virtual pet image information from the background image processing unit and the information acquisition unit of the system control module, fuses the virtual pet image with the transmitted environment picture, and obtains and displays the virtual pet image against the current environment background.
The invention relates to a technology for interactive simulation of an electronic pet by a virtual pet, mainly used to provide humanized interactive service and practical, effective life assistance for users. First, user characteristic information is collected and a virtual pet model is established from it; the virtual pet model learns the virtual pet interaction behaviors from the user characteristic information; finally, the learned virtual pet image is output, and the virtual pet interaction rules are formulated and implemented.
It should be noted that the above embodiments are only for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that the technical solution of the present invention may be modified or substituted without departing from the spirit and scope of the technical solution of the present invention, which is intended to be covered in the scope of the claims of the present invention.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein. The solutions in the embodiments of the present application may be implemented in various computer languages, for example, the object-oriented programming language Java and the scripting language JavaScript.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (6)

1. A virtual pet interactive simulation electronic pet method, characterized by comprising:
collecting user characteristic information, and establishing a virtual pet model according to the information;
the virtual pet model learns the interactive behavior of the virtual pet through the user characteristic information;
outputting the learned virtual pet image, and making a virtual pet interaction rule;
interacting with the virtual pet according to the virtual pet interaction rules;
the user characteristic information comprises pet video information loved by a user, and static and dynamic photo information of the pet stored or loved by the user;
the establishing of a virtual pet model according to the information comprises: analyzing the sign information the user prefers, namely the pet breed, coat color, coat length, pet sound, sound pitch, behavioral habits and character features, from the pet video information the user loves and from the static and dynamic photo information of pets the user has stored or loves, and associating the habits of the related pets to establish the virtual pet model;
the virtual pet interaction behavior comprises daily interaction with a user, real-time retrieval of personal health conditions of the user, prediction of user demands and timely response to the user demands;
the daily interaction with the user comprises the steps that the virtual pet builds a motion function according to user information, daily workload of the user is estimated through the daily quantity of motion of the user, and the interaction duration and degree which are provided by the virtual pet and required by the user on the same day are deduced;
the interaction rule comprises: after the virtual pet is authorized by the user, recording and storing the user's past daily step counts and daily activity ranges, and constructing a motion function from the recorded content, the motion function being:

δ = S / ( (1/n) · Σ_{i=1}^{n} s_i )

wherein n represents the total number of traceable days, S represents the user's step count or activity range on the current day, i represents the i-th of the n days counted, and s_i represents the actual step count or activity range on day i;
when δ ≤ 0.3, allowing the virtual pet and its related interaction information to pop up automatically, setting the number of active interactions between the virtual pet and the user to fluctuate between 50% and 100% of the daily average number of interactions, and raising the frequency of long actions or long voices during active interaction to 112% of the daily average frequency;
when 0.3 < δ < 1, allowing the related interaction information of the virtual pet to pop up automatically, setting the number of active interactions between the virtual pet and the user to fluctuate between 20% and 30% of the daily average number of interactions, while keeping the frequency of long actions or long voices of the virtual pet during active interaction equal to that of the previous day;
when δ ≥ 1, reducing the pop-up frequency to 60% of the current day's average pop-up frequency, cancelling long-action or long-voice interaction by the virtual pet between 8:00 and 18:00 on the current day, and performing no long-action or long-voice interaction unless actively requested by the user.
2. The method for interactively simulating an electronic pet for a virtual pet of claim 1, wherein: the real-time retrieval of the personal health condition of the user comprises the steps that the virtual pet presumes the physical state of the user through the acquired physical sign data of the user, and meanwhile, actual assistance is provided for the user according to the physical state of the user;
the interaction rule further comprises that the physical state of the user is presumed, after the virtual pet is authorized by the user, the local climate temperature of the day is obtained, the body temperature of the user is measured, and if the body temperature of the user is below 37.3 ℃, the virtual pet normally broadcasts the climate condition of the day and epidemic diseases and disease preventive measures in the season;
if the user's body temperature falls within the first elevated range [formula image not reproduced in the source], the virtual pet, while obtaining information on the epidemic diseases of the current season and their preventive measures, searches for open pharmacies and clinics around the user and, after user authorization, further searches whether the medicines the user needs are on sale at those nearby pharmacies and clinics;
the search function of the search range is as follows:

[three formula images not reproduced in the source]

wherein H represents the health influencing factor function, λ represents the influence factor, t represents the body temperature of the user, μ represents the influencing factor parameter, and B represents the physical characteristic parameter of the user; B_min represents the preset minimum value of the physical characteristic parameter, set to 18.5, and B_max represents the preset maximum value, set to 23.9; y represents the weight of the user and z represents the height of the user, B being computed from y and z (in the manner of a body-mass index, y/z²);
if the user is male [sex-specific formula images not reproduced in the source]: when B is below 18.5 or above 23.9, λ is set to 1250 and μ is set to 48125;
if the user is female [sex-specific formula images not reproduced in the source]: when B is below 18.5 or above 23.9, λ is set to 750 and μ is set to 28875;
when B lies within the range [18.5, 23.9], B is not substituted into H; the operation is performed directly, with λ and μ taking values preset by the user;
if the user's body temperature falls within the second elevated range [formula image not reproduced in the source], open pharmacies and clinics around the user are searched, whether those nearby pharmacies and clinics have the medicines the user needs on sale is further searched, and, after user authorization, purchase and delivery of the medicines is arranged directly through a third-party platform; the user is also asked whether his or her physical condition should be synchronized to the reserved emergency contact;
if the user's body temperature falls within the third (highest) range [formula image not reproduced in the source], the user is reminded to take a second measurement within 10 minutes; if the difference between the two measurement results is less than 1 degree, the user's physical condition is sent to the emergency contact reserved by the user, and if there is no response for more than one hour the emergency contact is called directly; the virtual pet calls for help through voice dialogue and searches pharmacies or clinics in the surrounding environment for the required medicines, the search results being uploaded synchronously to the mobile phones of the user and the emergency contact; if the user does not take the second measurement within 10 minutes and does not respond to the virtual pet's calls for half an hour, the virtual pet directly dials the emergency contact's telephone and explains the need for help.
3. The method for interactive simulation of an electronic pet by a virtual pet according to claim 2, characterized in that: the predicting of the user demand and timely response thereto comprises counting the times at which the user voice-interacts with the virtual pet image and the temporal pattern of those interactions, performing key measurement and calculation on the voice interactions that occur in the period from 17:00 to 8:00 the next morning, and, after user authorization, actively popping up voice interaction requests during that period.
4. A system employing the method for interactive simulation of an electronic pet by a virtual pet according to any one of claims 1 to 3, characterized by comprising:
the system control module is responsible for processing logic and information interaction transmission of each module, inputting a processing result into the virtual pet image, and operating and feeding back the input information by the virtual pet;
the information acquisition module is used for acquiring user characteristic information and data information provided by a third party platform and required by the virtual pet, and feeding back a result to the system control module after processing;
the temperature sensing module senses the body surface temperature information of the user, and feeds back the processed result to the system control module;
the voice interaction module is used for reading the voice interacted by the user, carrying out voice recognition, feeding back the result to the system control module and operating the virtual pet;
the background image processing module is responsible for shooting a current surrounding background picture in real time, performing image fusion operation on the virtual pet and the current environment picture, and finally generating a background with more reality.
5. A virtual pet interactive simulated electronic pet system as claimed in claim 4, wherein: the system control module comprises a system logic processing unit, an information acquisition unit, a temperature processing unit, a voice interaction processing unit and a background image processing unit, wherein the system logic processing unit is respectively connected with the information processing unit, the information acquisition unit, the temperature processing unit, the voice interaction processing unit and the background image processing unit, the information acquisition unit is connected with the information acquisition module, the temperature processing unit is connected with the temperature sensing module, the voice interaction processing unit is connected with the voice interaction module, and the background image processing unit is connected with the background image processing module.
6. The virtual pet interactive simulated electronic pet system as claimed in claim 5, wherein: the information acquisition module comprises an information processing system and an information transmission system, wherein the information transmission system browses and primarily screens information authorized to be checked, the information which is screened through the primary screening is transmitted to the information processing system through the information transmission system after the user is authorized, the information processing system performs classified integration processing on the input information data, key parameters required by the system are extracted, the parameters are input into the information acquisition module, the information acquisition module transmits the key parameters to a system control module for unified processing, and meanwhile, the system control module guides the obtained key parameter information into a system logic processing unit and performs logic processing on the parameters according to interaction rules in the system logic processing unit;
the temperature sensing module comprises a thermometer and a temperature processing unit which are sequentially connected, wherein the thermometer obtains the body surface temperature of a user in the current state, the body surface temperature is input into the temperature processing unit, an information acquisition unit and a system logic processing unit, the system logic processing unit processes the input temperature information according to interaction rules and transmits the information acquisition rules to the temperature processing unit and the information acquisition unit through a system control module, the temperature processing unit processes, archives and uploads the processed temperature information according to the obtained information, the processed temperature information is transmitted to the virtual pet image, meanwhile, the information acquisition unit performs conditional information secondary screening on information authorized to be checked according to the information obtained from the thermometer and the system logic processing unit, and transmits the screened information to the system control module, and the system control module performs unified operation or modification on interaction parameters of the virtual pet;
the voice interaction module comprises a microphone, a voice recognition module and a voice interaction processing module, wherein the microphone is used for acquiring the input of voice information of a user and a third party, the voice recognition module recognizes the voice content of the user and the source of the third party, the voice interaction processing module performs preprocessing work of the operation appointed by the voice content of the user or the authorized third party according to the voice recognized content, all the information acquired by the voice interaction module is synchronized to the information acquisition unit, the information acquisition unit extracts and processes the key information through the system control module, and under the condition that the user cannot respond to the voice content of the third party in time, the voice content is responded according to the interaction rule of the system logic processing unit, and finally all the information is transmitted to the voice interaction processing unit, and the voice interaction processing unit performs the operation corresponding to the voice;
the background image processing module comprises a camera and an image fusion module, wherein the camera shoots an environmental photo, the environmental photo is transmitted into the image fusion module, the image fusion module obtains the current virtual pet image information from a background image processing unit and an information acquisition unit of the system control module, the virtual pet image is subjected to image fusion with the transmitted environmental picture, and a virtual pet image under the current environmental background is obtained and displayed.