CN115983051A - Method and system for interactively simulating electronic pet by virtual pet - Google Patents


Publication number
CN115983051A
CN115983051A
Authority
CN
China
Prior art keywords
user
information
virtual pet
interaction
voice
Prior art date
Legal status: Granted
Application number
CN202310266436.4A
Other languages
Chinese (zh)
Other versions
CN115983051B
Inventor
宋程
刘保国
胡金有
吴浩
梁开岩
郭玮鹏
李学奇
范存龙
李海
巩京京
Current Assignee
Xingchong Kingdom Beijing Technology Co ltd
Original Assignee
Xingchong Kingdom Beijing Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xingchong Kingdom Beijing Technology Co ltd
Priority claimed from application CN202310266436.4A
Publication of CN115983051A
Application granted; publication of CN115983051B
Legal status: Active

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A — TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 — Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention discloses a method and a system for interactively simulating an electronic pet with a virtual pet. The method comprises: collecting user characteristic information and establishing a virtual pet model from that information; having the virtual pet model learn the virtual pet's interaction behavior from the user characteristic information; and outputting the learned virtual pet image while formulating and enforcing virtual pet interaction rules. The method can compute an adaptive interaction scheme fitted to the user's actual needs, against which the virtual pet learns and trains. In certain special situations it can also seek outside help or provide practical assistance through a third-party platform, so that it meets the user's actual needs and supplies a high level of emotional value, solving the problem that traditional virtual pets cannot satisfy the emotional needs and emotional attachment of contemporary users.

Description

Method and system for interactively simulating electronic pet by virtual pet
Technical Field
The invention relates to the technical field of virtual pet simulation, in particular to a method and a system for interactively simulating an electronic pet by a virtual pet.
Background
Current simulated electronic virtual pets serve only as simple electronic images and fall far short of real pets, which can provide emotional value and practical help. They support only simple interactions with the user such as feeding and stroking, so the emotional value the user gains from such interaction is very limited and no deeper practical help is available; user stickiness is consequently low and the appeal is time-limited. Improving the practical utility of the virtual pet is therefore the key to solving the current problem. Traditional virtual pets are confined to cartoon images on a two-dimensional plane and lack the technical capability for real-time updating, so their appearance, behavior, calls, size and the like are difficult to update and their interactive responses are highly repetitive. As a result, traditional virtual pets cannot satisfy the emotional needs and emotional attachment of contemporary users.
Disclosure of Invention
This section is for the purpose of summarizing some aspects of embodiments of the invention and to briefly introduce some preferred embodiments. In this section, as well as in the abstract and the title of the invention of this application, simplifications or omissions may be made to avoid obscuring the purpose of the section, the abstract and the title, and such simplifications or omissions are not intended to limit the scope of the invention.
The present invention has been made in view of the above-mentioned conventional problems.
Therefore, the invention provides a method and system for interactively simulating an electronic pet with a virtual pet, which can solve the problem that traditional virtual pets cannot satisfy the emotional needs and emotional attachment of contemporary users.
In order to solve the above technical problem, the present invention provides the following technical solution: a method for a virtual pet to interactively simulate an electronic pet, comprising:
collecting user characteristic information, and establishing a virtual pet model according to the information;
the virtual pet model learns the interaction behavior of the virtual pet through the user characteristic information;
outputting the learned virtual pet image and formulating a virtual pet interaction rule;
and interacting with the virtual pet according to the virtual pet interaction rules.
As a preferred scheme of the method for the interactive simulation of the electronic pet by the virtual pet, the method comprises the following steps: the user characteristic information comprises video information of pets favored by users and static and dynamic photo information of pets stored or favored by users.
As a preferred scheme of the method for the interactive simulation of the electronic pet by the virtual pet, the method comprises the following steps: the step of establishing the virtual pet model according to the information comprises the step of analyzing the physical sign information of pet varieties, hair colors, hair lengths, pet sounds, call sound sizes, behavior habits and character features preferred by the user through the pet video information preferred by the user and the static and dynamic photo information of pets stored or preferred by the user, and establishing the virtual pet model by associating the habits of related pets.
As a preferable scheme of the method for interactive simulation of an electronic pet by a virtual pet, the method comprises the following steps: the virtual pet interaction behavior comprises daily interaction with the user, real-time retrieval of personal health conditions of the user, prediction of user requirements and timely response to the user requirements.
As a preferred scheme of the method for the interactive simulation of the electronic pet by the virtual pet, the method comprises the following steps: the daily interaction with the user comprises the virtual pet constructing a motion function from the user information, estimating the user's workload for the day from the user's daily amount of motion, and deducing the interaction duration and intensity the user requires from the virtual pet that day;

the interaction rules comprise, after the virtual pet is authorized by the user, recording and storing the user's previous daily step counts and daily activity ranges and constructing a motion function from the recorded contents, the motion function being:

f = x / ((1/n) · Σ_{i=1}^{n} x_i)

wherein n represents the total number of traceable days, x represents the user's step count or activity range on the current day, i indicates the i-th of the n days used in the calculation, and x_i represents the actual step count or activity range on day i;

when f ≤ 0.3, the relevant interaction information of the virtual pet is allowed to pop up automatically, the number of times the virtual pet actively interacts with the user is raised by 50%–100% of the average daily interaction count of the preceding days, and during active interaction the frequency of long actions or long voice is raised to 112% of the preceding days' average;

when 0.3 < f < 1, the relevant interaction information of the virtual pet is allowed to pop up automatically, the number of active interactions between the virtual pet and the user is raised by 20%–30% of the average daily interaction count of the preceding days, while the frequency of long actions or long voice during active interaction is kept equal to that of the preceding day;

when f ≥ 1, the virtual pet does not pop up actively; its relevant interaction information may pop up only after authorization, at 60% of the average daily pop-up frequency of the preceding days, long-action or long-voice interaction between 8:00 and 18:00 of the current day is cancelled, and no long-action or long-voice interaction is performed unless the user actively requests it.
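The three interaction tiers above can be sketched in Python. The ratio form of the motion function is an assumption inferred from the 0.3 and 1 thresholds (the original formula is supplied only as an image in the patent), and all names are illustrative:

```python
from typing import List

def motion_ratio(today: float, history: List[float]) -> float:
    """Assumed motion function: today's step count (or activity
    range) divided by the average over the n traceable days."""
    if not history:
        raise ValueError("no traceable days recorded")
    return today / (sum(history) / len(history))

def interaction_policy(f: float) -> dict:
    """Map the motion-function value onto the three tiers of the
    interaction rules described above."""
    if f <= 0.3:
        return {"auto_popup": True,
                "active_increase": (0.50, 1.00),  # +50%..100% of daily average
                "long_action_freq": 1.12}         # 112% of prior average
    if f < 1.0:
        return {"auto_popup": True,
                "active_increase": (0.20, 0.30),  # +20%..30% of daily average
                "long_action_freq": 1.00}         # unchanged from prior day
    return {"auto_popup": False,                  # pop up only after authorization
            "popup_freq": 0.60,                   # 60% of prior daily average
            "long_action_freq": 0.0}              # 8:00-18:00 suppressed

# A low-activity day (heavy workload) relative to a 3-day history:
f = motion_ratio(2000, [8000, 9000, 10000])   # roughly 0.22, lowest tier
policy = interaction_policy(f)
```

A low ratio is read as a heavy-workload day, so the pet interacts more; a ratio at or above the historical average suppresses proactive interaction.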
As a preferred scheme of the method for the interactive simulation of the electronic pet by the virtual pet, the method comprises the following steps: the real-time retrieval of the user's personal health condition comprises the virtual pet inferring the user's physical state from the collected user sign data and, at the same time, providing the user with practical help according to that physical state;

the interaction rules further comprise inferring the user's physical state: after being authorized by the user, the virtual pet obtains the current local climate temperature and measures the user's body temperature; if the user's body temperature is below 37.3 °C, the virtual pet normally broadcasts the day's climate and temperature conditions and the current season's epidemic diseases and their preventive measures;

if the user's body temperature falls within the first elevated range (Figure SMS_9), the virtual pet, while obtaining information on the current epidemic diseases and their preventive measures, searches the pharmacies and clinics open for business around the user and, after user authorization, searches in depth whether the medicines the user needs are on sale in those pharmacies and clinics;
the search function of the search range is as follows:
Figure SMS_10
Figure SMS_11
Figure SMS_12
wherein ,
Figure SMS_13
represents a health-affecting factor function, and>
Figure SMS_16
,/>
Figure SMS_19
represents an influencing factor coefficient, <' > is selected>
Figure SMS_15
Indicates the body temperature of the user and is up or down>
Figure SMS_17
Represents an influencing factor parameter, <' > is selected>
Figure SMS_18
Represents a user's body characteristic parameter, <' > or>
Figure SMS_20
Represents a preset minimum value of a physical characteristic parameter set to 18.5>
Figure SMS_14
The maximum value of the preset physical characteristic parameter is represented, the setting is 23.9, y represents the weight of the user, and z represents the height of the user;
if the user is male, then
Figure SMS_21
,/>
Figure SMS_22
When is greater than or equal to>
Figure SMS_23
Setting & is in a range of less than 18.5 or greater than 23.9>
Figure SMS_24
1250>
Figure SMS_25
48125;
if the user is female, then
Figure SMS_26
,/>
Figure SMS_27
When is on>
Figure SMS_28
Setting & is in a range of less than 18.5 or greater than 23.9>
Figure SMS_29
Is 750, based on>
Figure SMS_30
28875;
when in use
Figure SMS_31
Value range of->
Figure SMS_32
When, is greater or less>
Figure SMS_33
Does not substitute into>
Figure SMS_34
Perform an operation directly on
Figure SMS_35
,/>
Figure SMS_36
Taking values according to numerical values preset by a user;
if the user's body temperature falls within the second elevated range (Figure SMS_37), the virtual pet searches the pharmacies and clinics open for business around the user, searches in depth whether the medicines the user needs are on sale there, and, after user authorization, directly arranges purchase and delivery of the medicines through a third-party platform, while asking the user whether the user's physical condition needs to be synchronized to the reserved emergency contact;

if the user's body temperature falls within the highest range (Figure SMS_38), the user is asked to measure again; if the two measurement results differ by less than 1 degree, the user's physical condition is sent to the emergency contact reserved by the user, and after one hour without response the virtual pet directly dials the emergency contact's telephone, seeks help through voice conversation, searches the surrounding environment for pharmacies or clinics with the required medicines, and synchronously uploads the search results to the mobile phones of the user and the emergency contact; if the user does not perform the second measurement within 10 minutes and does not respond to the virtual pet's calls for half an hour, the virtual pet directly dials the emergency contact's telephone to explain the situation and seek help.
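A minimal sketch of the retrieval-range computation, under stated assumptions: the body characteristic parameter is taken to be BMI (weight/height², matching the 18.5/23.9 bounds), and the retrieval radius to be the linear form R = b − a·t, which with the listed coefficients reaches zero at 38.5 °C for both sexes. The formula shape, default presets, and all names are assumptions, not the patent's confirmed method:

```python
def body_characteristic(weight_kg: float, height_m: float) -> float:
    """Assumed body characteristic parameter q = y / z^2 (BMI),
    matching the preset bounds 18.5 and 23.9."""
    return weight_kg / (height_m ** 2)

def search_radius(temp_c: float, sex: str, q: float,
                  preset_a: float = 1000.0, preset_b: float = 38500.0) -> float:
    """Assumed retrieval-range function R = b - a * t.  a and b are
    the influencing-factor coefficient and parameter; when q lies in
    [18.5, 23.9] they fall back to user-preset values."""
    if 18.5 <= q <= 23.9:
        a, b = preset_a, preset_b     # q is not substituted; presets apply
    elif sex == "male":
        a, b = 1250.0, 48125.0
    else:
        a, b = 750.0, 28875.0
    return max(b - a * temp_c, 0.0)   # radius never goes negative

q = body_characteristic(80.0, 1.75)   # about 26.1, outside [18.5, 23.9]
r = search_radius(37.5, "male", q)    # 48125 - 1250 * 37.5 = 1250.0
```

With these coefficients the search radius shrinks as the fever rises, so a sicker user is offered only the nearest open pharmacies and clinics.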
As a preferred scheme of the method for the interactive simulation of the electronic pet by the virtual pet, the method comprises the following steps: predicting user demand and responding to it in time comprises counting the duration of the user's voice interactions with the virtual pet image and the temporal pattern of those voice interactions, weighting most heavily the voice interactions between 17:00 and 8:00 the next morning, and, after user authorization, allowing the virtual pet to pop up actively during that period to initiate voice interaction requests.
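The nightly weighting described above can be sketched as a simple tally over timestamped voice sessions; the window boundaries follow the 17:00–8:00 rule in the text, and the function names are illustrative:

```python
from datetime import datetime

def in_key_window(ts: datetime) -> bool:
    """True when ts falls in the 17:00 - 08:00 window that the rule
    weights most heavily for voice interaction."""
    return ts.hour >= 17 or ts.hour < 8

def key_window_seconds(sessions) -> float:
    """sessions: iterable of (start_time, duration_seconds) pairs.
    Total voice-interaction time inside the key window, from which
    the pet could time its proactive interaction requests."""
    return sum(dur for start, dur in sessions if in_key_window(start))

sessions = [(datetime(2023, 3, 1, 18, 30), 120.0),   # inside window
            (datetime(2023, 3, 1, 12, 0), 300.0),    # outside window
            (datetime(2023, 3, 2, 7, 45), 60.0)]     # inside window
total = key_window_seconds(sessions)                  # 180.0
```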
Another object of the present invention is to provide a virtual pet interactive simulation electronic pet system, which, by implementing the above method, can solve the problem that existing virtual pets cannot satisfy the actual needs, emotional needs and emotional attachment of contemporary users.
As a preferable scheme of the virtual pet interactive simulation electronic pet system of the invention, wherein: the system comprises the following interconnected modules,
the system control module is responsible for processing the logic and information interactive transmission of each module, inputting the processing result into the virtual pet image, and operating and feeding back the input information by the virtual pet;
the information acquisition module is used for acquiring the characteristic information of the user and the data information required by the virtual pet and provided by the third-party platform, and feeding back a result to the system control module after processing;
the temperature sensing module senses the body surface temperature information of the user and feeds back the processed result to the system control module;
the voice interaction module is responsible for reading voice interacted by the user, performing voice recognition, feeding back a result to the system control module and operating the virtual pet;
and the background image processing module, which is responsible for shooting the current surrounding background picture in real time and performing an image-fusion operation of the virtual pet with the current environment picture, finally generating a more realistic background.
As a preferable scheme of the virtual pet interactive simulation electronic pet system of the invention, wherein: the system control module comprises a system logic processing unit, an information acquisition unit, a temperature processing unit, a voice interaction processing unit and a background image processing unit; the system logic processing unit is respectively connected with the information acquisition unit, the temperature processing unit, the voice interaction processing unit and the background image processing unit; the information acquisition unit is connected with the information acquisition module, the temperature processing unit with the temperature sensing module, the voice interaction processing unit with the voice interaction module, and the background image processing unit with the background image processing module.
As a preferable scheme of the virtual pet interactive simulation electronic pet system of the invention, wherein: the information acquisition module comprises an information processing system and an information transmission system. The information transmission system browses and primarily screens authorized information; after user authorization, the primarily screened information is transmitted through the information transmission system to the information processing system. The information processing system classifies and integrates the transmitted information data, extracts the key parameters required by the system, inputs the parameters into the information acquisition module, and transmits them to the system control module for unified processing. At the same time, the system control module passes the obtained key parameter information into the system logic processing unit and processes the parameters logically according to the interaction rules in the system logic processing unit;
the temperature sensing module comprises a thermometer and a temperature processing unit which are sequentially connected, wherein the thermometer obtains the body surface temperature of a user in the current state and inputs the body surface temperature into the temperature processing unit, the information acquisition unit and the system logic processing unit, the system logic processing unit processes input temperature information according to an interaction rule and transmits the information acquisition rule to the temperature processing unit and the information acquisition unit through the system control module, the temperature processing unit processes, files and uploads the processed temperature information to the virtual pet image according to the obtained information, meanwhile, the information acquisition unit performs conditional information secondary screening on authorized information according to the information obtained from the thermometer and the system logic processing unit and transmits the screened information to the system control module, and the system control module performs unified operation or modification on interaction parameters of virtual pets;
the voice interaction module comprises a microphone, a voice recognition module and a voice interaction processing module, wherein the microphone is used for acquiring input of voice information of a user and a third party, the voice recognition module recognizes voice contents of the user and the third party, the voice interaction processing module carries out preprocessing work of operation appointed by the voice of the user or authorized voice contents of the third party according to the voice recognition contents, all information acquired by the voice interaction module is synchronized to an information acquisition unit, the information acquisition unit extracts and processes key information through a system control module, under the condition that the user cannot timely respond to the voice contents of the third party, response to the voice contents is carried out according to interaction rules of a system logic processing unit, all information is finally transmitted to the voice interaction processing unit, and the voice interaction processing unit carries out voice corresponding operation;
the background image processing module comprises a camera and an image fusion module, the camera shoots an environment picture and transmits the environment picture into the image fusion module, the image fusion module acquires current virtual pet image information from a background image processing unit and an information acquisition unit of the system control module, the virtual pet image and the transmitted environment picture are subjected to image fusion, a virtual pet image under the current environment background is acquired, and the virtual pet image is displayed.
The invention has the beneficial effects that: the method establishes a virtual pet model from user characteristic information that can be updated in real time, and obtains a simulated electronic virtual pet with practical utility based on the motion function model and the retrieval range function model. Because the virtual pet in the invention can learn, its behavior, content and richness of play are far greater than those of electronic pets in the traditional technology, thereby solving the problem that traditional virtual pets cannot satisfy the emotional needs and emotional attachment of contemporary users.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise. Wherein:
FIG. 1 is a schematic flow chart of a method for interactively simulating an electronic pet by a virtual pet according to an embodiment of the present invention.
FIG. 2 is a schematic view of a virtual pet interactive simulation electronic pet system module according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, specific embodiments accompanied with figures are described in detail below, and it is apparent that the described embodiments are a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making creative efforts based on the embodiments of the present invention, shall fall within the protection scope of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways than those specifically described and will be readily apparent to those of ordinary skill in the art without departing from the spirit of the present invention, and therefore the present invention is not limited to the specific embodiments disclosed below.
Furthermore, reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
The present invention will be described in detail with reference to the drawings, wherein the cross-sectional views illustrating the structure of the device are not enlarged partially in general scale for convenience of illustration, and the drawings are only exemplary and should not be construed as limiting the scope of the present invention. In addition, the three-dimensional dimensions of length, width and depth should be included in the actual fabrication.
Meanwhile, in the description of the present invention, it should be noted that the terms "upper, lower, inner and outer" and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation and operate, and thus, cannot be construed as limiting the present invention. Furthermore, the terms first, second, or third are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The terms "mounted", "connected" and "coupled" in the present invention are to be understood broadly unless otherwise explicitly specified or limited; for example, a connection may be fixed, detachable or integral; mechanical or electrical; direct, or indirect through an intermediate medium; or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
Example 1
Referring to fig. 1, a first embodiment of the present invention provides a method for a virtual pet to interactively simulate an electronic pet, comprising:
s1: and collecting the characteristic information of the user, and establishing a virtual pet model according to the information.
Furthermore, the user characteristic information comprises video information of pets favored by users, and static and dynamic photo information of pets stored or favored by users.
It should be noted that establishing the virtual pet model according to the information comprises analyzing, from the pet video information preferred by the user and the static and dynamic photo information of pets stored or favored by the user, sign information such as the user's preferred pet breed, coat color, coat length, pet sounds, call volume, behavioral habits and character traits, and establishing the virtual pet model by associating the habits of the relevant pets.
S2: and the virtual pet model learns the interaction behavior of the virtual pet through the characteristic information of the user.
Further, the virtual pet interaction behavior comprises daily interaction with the user, real-time retrieval of personal health conditions of the user, prediction of user requirements and timely response to the user requirements.
It should be noted that performing daily interaction with the user comprises the virtual pet constructing a motion function from the user information, estimating the user's workload for the day from the user's daily amount of motion, and deducing the interaction duration and intensity the user requires from the virtual pet that day.
It should be noted that the real-time search of the personal health condition of the user includes that the virtual pet conjectures the physical state of the user through the acquired physical sign data of the user, and meanwhile, the virtual pet provides actual help for the user according to the physical state of the user.
It should be noted that predicting user demand and responding to it in time comprises counting the duration of the user's voice interactions with the virtual pet image and the temporal pattern of those voice interactions, weighting most heavily the voice interactions between 17:00 and 8:00 the next morning, and, after user authorization, allowing the virtual pet to pop up actively during that period to initiate voice interaction requests.
S3: and outputting the learned virtual pet image and formulating virtual pet interaction rules.
Furthermore, the interaction rules include that, after the virtual pet is authorized by the user, the user's previous daily step counts and daily activity ranges are recorded and stored, and a motion function is constructed from the recorded contents, the motion function being:

f = x / ((1/n) · Σ_{i=1}^{n} x_i)

wherein n represents the total number of traceable days, x represents the user's step count or activity range on the current day, i indicates the i-th of the n days used in the calculation, and x_i represents the actual step count or activity range on day i.

It should be noted that when f ≤ 0.3, the relevant interaction information of the virtual pet is allowed to pop up automatically, the number of times the virtual pet actively interacts with the user is raised by 50%–100% of the average daily interaction count of the preceding days, and during active interaction the frequency of long actions or long voice is raised to 112% of the preceding days' average.

Further, when 0.3 < f < 1, the relevant interaction information of the virtual pet is allowed to pop up automatically, the number of active interactions between the virtual pet and the user is raised by 20%–30% of the average daily interaction count of the preceding days, while the frequency of long actions or long voice during active interaction is kept equal to that of the preceding day.

It should be noted that when f ≥ 1, the virtual pet does not pop up actively; its relevant interaction information may pop up only after authorization, at 60% of the average daily pop-up frequency of the preceding days, long-action or long-voice interaction between 8:00 and 18:00 of the current day is cancelled, and no long-action or long-voice interaction is performed unless the user actively requests it.
Furthermore, the interaction rules also comprise inferring the user's physical state: after being authorized by the user, the virtual pet obtains the current local climate temperature and measures the user's body temperature; if the user's body temperature is below 37.3 °C, the virtual pet normally broadcasts the day's climate and temperature conditions and the current season's epidemic diseases and their preventive measures.

It should be noted that if the user's body temperature falls within the first elevated range (Figure SMS_47), the virtual pet, while obtaining information on the current season's epidemic diseases and their preventive measures, searches the pharmacies and clinics open for business around the user and, after user authorization, searches in depth whether the medicines the user needs are on sale there.
The search function of the search range takes the form F(t) = a·t − b, with the body characteristic parameter k = y/z² (a reconstruction of the original formula images: both coefficient pairs given below satisfy b = 38.5·a, and the 18.5/23.9 bounds match the body-mass index);
wherein F(t) represents the health-affecting factor function, a represents the influencing-factor coefficient, t represents the user's body temperature, b represents the influencing-factor parameter, and k represents the user's body characteristic parameter, whose preset minimum value is set to 18.5 and preset maximum value to 23.9; y represents the weight of the user and z represents the height of the user;
Further, if the user is male and the body characteristic parameter k is less than 18.5 or greater than 23.9, the influencing-factor coefficient a is set to 1250 and the influencing-factor parameter b is set to 48125;
it should be noted that if the user is female and k is less than 18.5 or greater than 23.9, a is set to 750 and b is set to 28875;
further, when k lies within the range [18.5, 23.9], k is not substituted into the health-affecting factor function; the function operates directly with a and b taking the values preset by the user;
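The parameter rules above can be sketched as follows. This is a hypothetical reading, since the original formulas appear only as images: the closed form F(t) = a·t − b is inferred from the fact that both published constant pairs satisfy b = 38.5·a, and the body characteristic parameter is taken to be the body-mass index y/z², matching the 18.5/23.9 bounds. All function and argument names are illustrative.

```python
def body_characteristic(weight_kg, height_m):
    # body characteristic parameter k; assumed here to be BMI = y / z**2,
    # which matches the preset bounds 18.5 and 23.9
    return weight_kg / height_m ** 2

def search_range(temp_c, weight_kg, height_m, is_male, preset_ab=None):
    """Health-affecting factor F(t) = a*t - b, used as the search range.

    Out-of-range body parameter: gender-specific constants from the text.
    In-range body parameter: k is not substituted; (a, b) must come from
    the user's preset values, as the text describes.
    """
    k = body_characteristic(weight_kg, height_m)
    if 18.5 <= k <= 23.9:
        if preset_ab is None:
            raise ValueError("user-preset (a, b) required when k is in range")
        a, b = preset_ab
    elif is_male:
        a, b = 1250.0, 48125.0   # male, k < 18.5 or k > 23.9
    else:
        a, b = 750.0, 28875.0    # female, k < 18.5 or k > 23.9
    # both published pairs satisfy b = 38.5 * a, so F vanishes at t = 38.5
    return a * temp_c - b
```

For example, a male user at 39.0 °C with an out-of-range body parameter yields 1250 × 39 − 48125 = 625 as the search range.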
It should be noted that if the user's body temperature falls within the second preset elevated range, the virtual pet searches the pharmacies and clinics open for business around the user and searches in depth whether those pharmacies and clinics sell the medicine the user needs; after user authorization, the medicine is purchased directly and delivered on the user's behalf, and the user is asked whether his or her physical condition should be synchronized to the reserved emergency contact;
It should also be noted that if the user's body temperature falls within the third preset elevated range, the user is reminded to take a second measurement within 10 minutes; if the two measurements differ by less than 1 degree, the user's physical condition is sent to the emergency contact reserved by the user, and if there is no response for more than one hour the emergency contact's telephone is dialed directly, the virtual pet seeks help through voice dialogue, retrieves the required medicines from pharmacies or clinics in the surrounding environment, and synchronously uploads the retrieval results to the mobile phones of the user and the emergency contact; if the user neither performs the second measurement within 10 minutes nor responds to the virtual pet's calls for half an hour, the virtual pet directly dials the emergency contact's telephone to ask for help.
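The temperature triage described above can be summarized as a dispatch table. This is a sketch only: the band boundaries above 37.3 °C appear solely in the original formula images, so the band index is passed explicitly rather than derived from the temperature, and all action names are illustrative.

```python
def triage(temp_c, band=None):
    """Map a measured body temperature to the virtual pet's action list.

    Below 37.3 deg C the pet simply broadcasts weather and seasonal
    disease information; above that, `band` selects one of the three
    preset elevated ranges described in the text.
    """
    if temp_c < 37.3:
        return ["broadcast_weather", "broadcast_seasonal_disease_info"]
    if band == 1:
        # first range: inform, then search nearby pharmacies/clinics
        return ["fetch_disease_info", "search_nearby_pharmacies",
                "deep_search_medicine_after_authorization"]
    if band == 2:
        # second range: also purchase/deliver and offer contact sync
        return ["search_nearby_pharmacies", "deep_search_medicine",
                "purchase_and_deliver_after_authorization",
                "ask_sync_to_emergency_contact"]
    if band == 3:
        # third range: confirm by re-measurement, then escalate
        return ["remind_second_measurement_10min",
                "notify_emergency_contact_if_confirmed",
                "call_emergency_contact_if_no_response_1h"]
    raise ValueError("band must be 1, 2 or 3 when temp_c >= 37.3")
```

Escalation timing (the 10-minute and one-hour windows) would sit in the caller's scheduler rather than in this table.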
S4: interacting with the virtual pet according to the virtual pet interaction rules.
Example 2
Referring to fig. 1-2, in order to verify the beneficial effects of the present invention, a virtual pet interactive electronic-pet simulation system is provided as a second embodiment of the present invention and demonstrated through experiments.
A virtual pet interactive electronic-pet simulation system for implementing the method of fig. 1, characterized in that it comprises:
the system control module, responsible for the logic processing and interactive information transmission of each module; it feeds processing results into the virtual pet image, and the virtual pet operates on and responds to the input information. The information acquisition module is used for acquiring the user's characteristic information and the data required by the virtual pet from third-party platforms, and feeds the processed results back to the system control module. The temperature sensing module senses the user's body-surface temperature information and feeds the processed results back to the system control module. The voice interaction module is responsible for reading the user's interactive speech, performing voice recognition, feeding the results back to the system control module and operating the virtual pet. The background image processing module is responsible for shooting the current surrounding background picture in real time and fusing the virtual pet image with the current environment picture to generate a more realistic background.
The system control module comprises a system logic processing unit, an information acquisition unit, a temperature processing unit, a voice interaction processing unit and a background image processing unit, wherein the system logic processing unit is respectively connected with the information processing unit, the information acquisition unit, the temperature processing unit, the voice interaction processing unit and the background image processing unit, the information acquisition unit is connected with the information acquisition module, the temperature processing unit is connected with the temperature sensing module, the voice interaction processing unit is connected with the voice interaction module, and the background image processing unit is connected with the background image processing module.
The information acquisition module comprises an information processing system and an information transmission system. The information transmission system browses and preliminarily screens the authorized information and, after user authorization, transmits the preliminarily screened information to the information processing system; the information processing system classifies, integrates and processes the incoming information data, extracts the key parameters required by the system, inputs them into the information acquisition module and transmits them to the system control module for unified processing; at the same time, the system control module feeds the obtained key parameter information into the system logic processing unit, which processes it logically according to the interaction rules it holds.
The temperature sensing module comprises a thermometer and the temperature processing unit connected in sequence. The thermometer obtains the user's body-surface temperature in the current state and inputs it into the temperature processing unit, the information acquisition unit and the system logic processing unit; the system logic processing unit processes the input temperature information according to the interaction rules and transmits the information acquisition rules to the temperature processing unit and the information acquisition unit through the system control module; the temperature processing unit processes, files and uploads the processed temperature information to the virtual pet image according to the obtained information; meanwhile, the information acquisition unit performs a conditional secondary screening of the authorized information according to the information obtained from the thermometer and the system logic processing unit, and transmits the screened information to the system control module, which uniformly operates on or modifies the virtual pet's interaction parameters.
The voice interaction module comprises a microphone, a voice recognition module and a voice interaction processing module. The microphone acquires the voice information input of the user and of third parties; the voice recognition module recognizes the voice content of the user and of third parties; the voice interaction processing module performs, according to the recognized content, the preprocessing for operations designated by the user's voice or by authorized third-party voice content. All information acquired by the voice interaction module is synchronized to the information acquisition unit, which extracts and processes the key information through the system control module; when the user cannot respond to third-party voice content in time, the response is made according to the interaction rules of the system logic processing unit; all information is finally transmitted to the voice interaction processing unit, which performs the corresponding voice operations.
The background image processing module comprises a camera and an image fusion module. The camera shoots the environment picture and transmits it to the image fusion module; the image fusion module acquires the current virtual pet image information from the background image processing unit and the information acquisition unit of the system control module, fuses the virtual pet image with the incoming environment picture, obtains the virtual pet image under the current environmental background, and displays it.
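The wiring between the peripheral modules and their processing units inside the system control module can be sketched as a minimal object graph. All class, method and attribute names here are illustrative, not taken from the original.

```python
class PeripheralModule:
    """A peripheral (thermometer, microphone, camera, ...) producing raw data."""
    def __init__(self, name):
        self.name = name

    def read(self, raw):
        # tag the raw reading with its source module
        return {"source": self.name, "data": raw}


class SystemControlModule:
    """Routes each peripheral's readings through its matching processing unit."""
    def __init__(self):
        self.unit_for = {}   # peripheral name -> processing-unit name
        self.log = []        # processed results, in arrival order

    def connect(self, module, unit_name):
        self.unit_for[module.name] = unit_name

    def handle(self, module, raw):
        result = module.read(raw)
        result["unit"] = self.unit_for[module.name]
        self.log.append(result)
        return result


# mirror the connections described in the text
control = SystemControlModule()
thermometer = PeripheralModule("temperature_sensing")
control.connect(thermometer, "temperature_processing_unit")
microphone = PeripheralModule("voice_interaction")
control.connect(microphone, "voice_interaction_processing_unit")
```

The same pattern would extend to the information acquisition and background image processing modules.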
The invention relates to a virtual pet interactive electronic-pet simulation technique, mainly intended to provide users with humanized interactive services and practical, effective help in daily life. The method first collects user characteristic information and establishes a virtual pet model from it; the virtual pet model then learns the virtual pet's interaction behavior from the user characteristic information; finally, the learned virtual pet image is output and the virtual pet interaction rules are formulated and implemented.
It should be noted that the above-mentioned embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention, which should be covered by the claims of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein. The scheme in the embodiment of the application can be implemented by adopting various computer languages, such as object-oriented programming language Java and transliterated scripting language JavaScript.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the scope of the present application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A method for simulating an electronic pet through virtual pet interaction, characterized by comprising the steps of:
collecting user characteristic information, and establishing a virtual pet model according to the information;
the virtual pet model learns the interaction behavior of the virtual pet through the user characteristic information;
outputting the learned virtual pet image and formulating virtual pet interaction rules;
and interacting with the virtual pet according to the virtual pet interaction rules.
2. The method of claim 1, wherein: the user characteristic information comprises video information of pets favored by the user and static and dynamic photo information of pets the user has stored or favored.
3. The method of claim 2, wherein: establishing the virtual pet model according to the information comprises analyzing, from the video information of pets favored by the user and the static and dynamic photo information of pets the user has stored or favored, the characteristic information of the user's preferred pet breed, coat color, coat length, pet sounds, call volume, behavioral habits and character traits, and establishing the virtual pet model by associating the habits of the related pets.
4. The method of claim 3, wherein the method comprises: the virtual pet interaction behavior comprises daily interaction with the user, real-time retrieval of personal health conditions of the user, prediction of user requirements and timely response to the user requirements.
5. The method of claim 4, wherein: the daily interaction with the user comprises the virtual pet constructing a motion function from user information, estimating the user's daily workload from the user's daily amount of motion, and deducing the interaction duration and degree the virtual pet requires on that day;
the interaction rule comprises, after the virtual pet is authorized by the user, recording and storing the user's previous daily exercise steps and daily activity range, and constructing a motion function from the recorded contents as follows:
f = x_0 / ((1/n) · Σ_{i=1}^{n} x_i)
wherein n represents the total number of traceable days, x_0 represents the user's number of exercise steps or range of motion on the current day, i indexes the i-th of the n calculated days, and x_i represents the actual number of exercise steps or range of motion on day i;
when f is less than or equal to 0.3, the interaction information related to the virtual pet is allowed to pop up automatically, the number of times the virtual pet actively interacts with the user is raised by 50%-100% of the previous day's average interaction count, and the frequency of long actions or long voice during active interaction is increased to 112% of the previous day's average frequency;
when 0.3 < f < 1, the interaction information related to the virtual pet is allowed to pop up automatically, the number of active interactions between the virtual pet and the user is raised by 20%-30% of the previous day's average interaction count, and the frequency of long actions or long voice during active interaction is kept equal to that of the previous day;
when f is greater than or equal to 1, the interaction information related to the virtual pet does not pop up actively; it may pop up only after authorization, at 60% of the previous day's average pop-up frequency; long-action or long-voice interaction is cancelled from 8 o'clock to 18 o'clock on the current day and is not carried out unless the user actively requests it.
6. The method of claim 5, wherein: the real-time retrieval of the user's personal health condition comprises the virtual pet inferring the user's physical state from the acquired physical sign data and providing practical help to the user according to that state;
the interaction rules further comprise, for inferring the user's physical state, the virtual pet obtaining the current local climate temperature and measuring the user's body temperature after being authorized by the user; if the user's body temperature is below 37.3 °C, the virtual pet normally broadcasts the day's temperature and climate conditions, the current season's epidemic diseases and disease prevention measures;
if the user's body temperature falls within the first preset elevated range, the virtual pet, while acquiring information on the current season's epidemic diseases and their preventive measures, searches the pharmacies and clinics open for business around the user and, after user authorization, searches in depth whether those pharmacies and clinics sell the medicine the user needs;
the search function of the search range is F(t) = a·t − b, with the body characteristic parameter k = y/z², wherein F(t) represents the health-affecting factor function, a represents the influencing-factor coefficient, t represents the user's body temperature, b represents the influencing-factor parameter, and k represents the user's body characteristic parameter, whose preset minimum value is set to 18.5 and preset maximum value to 23.9; y represents the weight of the user and z represents the height of the user;
if the user is male and k is less than 18.5 or greater than 23.9, a is set to 1250 and b is set to 48125;
if the user is female and k is less than 18.5 or greater than 23.9, a is set to 750 and b is set to 28875;
when k lies within the range [18.5, 23.9], k is not substituted into the health-affecting factor function; the function operates directly with a and b taking the values preset by the user;
if the user's body temperature falls within the second preset elevated range, the virtual pet searches the pharmacies and clinics open for business around the user and searches in depth whether those pharmacies and clinics sell the medicine the user needs; after user authorization, the medicine is purchased directly through a third-party platform and delivered on the user's behalf, and the user is asked whether his or her physical condition should be synchronized to the reserved emergency contact;
if the user's body temperature falls within the third preset elevated range, the user is reminded to take a second measurement within 10 minutes; if the two measurements differ by less than 1 degree, the user's physical condition is sent to the emergency contact reserved by the user, and if there is no response for more than one hour the emergency contact's telephone is dialed directly, the virtual pet seeks help through voice dialogue, retrieves the required medicines from pharmacies or clinics in the surrounding environment, and synchronously uploads the retrieval results to the mobile phones of the user and the emergency contact; if the user neither performs the second measurement within 10 minutes nor responds to the virtual pet's calls for half an hour, the virtual pet directly dials the emergency contact's telephone to ask for help.
7. The method of claim 6, wherein: predicting the user's demand and responding to it in a timely manner comprises counting the duration of voice interaction between the user and the virtual pet image and the temporal pattern of that interaction, giving particular weight to voice interactions occurring between 17 o'clock in the evening and 8 o'clock in the morning, and, after user authorization, actively popping up within that time interval to send a voice interaction request.
8. A virtual pet interactive electronic-pet simulation system, characterized by comprising:
the system control module is responsible for processing the logic and information interactive transmission of each module, inputting the processing result into the virtual pet image, and operating and feeding back the input information by the virtual pet;
the information acquisition module is used for acquiring the characteristic information of the user and the data information required by the virtual pet and provided by the third-party platform, and feeding back a result to the system control module after processing;
the temperature sensing module senses the body surface temperature information of the user and feeds back the processed result to the system control module;
the voice interaction module is responsible for reading voice interacted by the user, performing voice recognition, feeding back a result to the system control module and operating the virtual pet;
and the background image processing module, responsible for shooting the current surrounding background picture in real time and fusing the virtual pet with the current environment picture to finally generate a more realistic background.
9. The system of claim 8, wherein the virtual pet interactive simulation electronic pet system comprises: the system control module comprises a system logic processing unit, an information acquisition unit, a temperature processing unit, a voice interaction processing unit and a background image processing unit, wherein the system logic processing unit is respectively connected with the information processing unit, the information acquisition unit, the temperature processing unit, the voice interaction processing unit and the background image processing unit, the information acquisition unit is connected with the information acquisition module, the temperature processing unit is connected with the temperature sensing module, the voice interaction processing unit is connected with the voice interaction module, and the background image processing unit is connected with the background image processing module.
10. The system of claim 9, wherein: the information acquisition module comprises an information processing system and an information transmission system; the information transmission system browses and preliminarily screens the authorized information and, after user authorization, transmits the preliminarily screened information to the information processing system; the information processing system classifies, integrates and processes the incoming information data, extracts the key parameters required by the system, inputs them into the information acquisition module and transmits them to the system control module for unified processing; at the same time, the system control module feeds the obtained key parameter information into the system logic processing unit, which processes it logically according to the interaction rules it holds;
the temperature sensing module comprises a thermometer and the temperature processing unit connected in sequence; the thermometer obtains the user's body-surface temperature in the current state and inputs it into the temperature processing unit, the information acquisition unit and the system logic processing unit; the system logic processing unit processes the input temperature information according to the interaction rules and transmits the information acquisition rules to the temperature processing unit and the information acquisition unit through the system control module; the temperature processing unit processes, files and uploads the processed temperature information to the virtual pet image according to the obtained information; meanwhile, the information acquisition unit performs a conditional secondary screening of the authorized information according to the information obtained from the thermometer and the system logic processing unit, and transmits the screened information to the system control module, which uniformly operates on or modifies the virtual pet's interaction parameters;
the voice interaction module comprises a microphone, a voice recognition module and a voice interaction processing module; the microphone acquires the voice information input of the user and of third parties; the voice recognition module recognizes the voice content of the user and of third parties; the voice interaction processing module performs, according to the recognized content, the preprocessing for operations designated by the user's voice or by authorized third-party voice content; all information acquired by the voice interaction module is synchronized to the information acquisition unit, which extracts and processes the key information through the system control module; when the user cannot respond to third-party voice content in time, the response is made according to the interaction rules of the system logic processing unit; all information is finally transmitted to the voice interaction processing unit, which performs the corresponding voice operations;
the background image processing module comprises a camera and an image fusion module; the camera shoots the environment picture and transmits it to the image fusion module; the image fusion module acquires the current virtual pet image information from the background image processing unit and the information acquisition unit of the system control module, fuses the virtual pet image with the incoming environment picture, obtains the virtual pet image under the current environmental background, and displays it.
CN202310266436.4A 2023-03-20 2023-03-20 Method and system for interactive simulation of electronic pet by virtual pet Active CN115983051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310266436.4A CN115983051B (en) 2023-03-20 2023-03-20 Method and system for interactive simulation of electronic pet by virtual pet


Publications (2)

Publication Number Publication Date
CN115983051A true CN115983051A (en) 2023-04-18
CN115983051B CN115983051B (en) 2023-06-06

Family

ID=85968584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310266436.4A Active CN115983051B (en) 2023-03-20 2023-03-20 Method and system for interactive simulation of electronic pet by virtual pet

Country Status (1)

Country Link
CN (1) CN115983051B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010214009A (en) * 2009-03-18 2010-09-30 Fujitsu Ltd Virtual pet raising device, virtual pet raising method, virtual pet raising program, and mobile terminal device
CN114712862A (en) * 2022-03-31 2022-07-08 新瑞鹏宠物医疗集团有限公司 Virtual pet interaction method, electronic device and computer-readable storage medium
CN114887335A (en) * 2022-04-26 2022-08-12 新瑞鹏宠物医疗集团有限公司 Multi-parameter-based virtual pet physical condition setting method and device


Also Published As

Publication number Publication date
CN115983051B (en) 2023-06-06

Similar Documents

Publication Publication Date Title
JP6888096B2 (en) Robot, server and human-machine interaction methods
CN110531860B (en) Animation image driving method and device based on artificial intelligence
US20200089661A1 (en) System and method for providing augmented reality challenges
Tucker The Naked Future: What happens in a world that anticipates your every move?
CN110488975B (en) Data processing method based on artificial intelligence and related device
CN110458360B (en) Method, device, equipment and storage medium for predicting hot resources
CN106462598A (en) Information processing device, information processing method, and program
CN110110203A (en) Resource information method for pushing and server, resource information methods of exhibiting and terminal
CN108140383A (en) Display device, topic selection method, topic option program, image display method and image show program
TWI680400B (en) Device and method of managing user information based on image
CN109074117A (en) Built-in storage and cognition insight are felt with the computer-readable cognition based on personal mood made decision for promoting memory
CN110209774A (en) Handle the method, apparatus and terminal device of session information
AU2013331185A1 (en) Method relating to presence granularity with augmented reality
CN107977928A (en) Expression generation method, apparatus, terminal and storage medium
CN109521927A (en) Robot interactive approach and equipment
CN110390705A (en) A kind of method and device generating virtual image
CN109327737A (en) TV programme suggesting method, terminal, system and storage medium
CN109564579A (en) The case where for Internet of Things integrated platform forecasting mechanism
CN108292322A (en) Use tissue, retrieval, annotation and the presentation of the media data file from the signal for checking environment capture
CN106062806A (en) Utilizing interactivity signals to generate relationships and promote content
CN105893771A (en) Information service method and device and device used for information services
CN108566534A (en) Alarm method, device, terminal based on video monitoring and storage medium
CN110278447A (en) Video pushing method, device and electronic equipment based on continuous feature
JP2002055686A (en) Method and system for selling voice data
US9197592B2 (en) Social network service system, image display method, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant