CN111714108A - Computer storage medium, content display method, and content display apparatus


Info

Publication number
CN111714108A
Authority
CN
China
Prior art keywords
content
child
display
measurement
information
Prior art date
Legal status
Pending
Application number
CN202010187334.XA
Other languages
Chinese (zh)
Inventor
合田文美
Current Assignee
Unicharm Corp
Original Assignee
Unicharm Corp
Priority date
Filing date
Publication date
Application filed by Unicharm Corp filed Critical Unicharm Corp
Publication of CN111714108A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/7435 Displaying user selection data, e.g. icons in a graphical user interface

Abstract

The present invention provides a computer storage medium, a content display method, and a content display apparatus that can help parents and children relax. The program causes the computer to execute: a first acquisition process for acquiring a first measurement result representing a measurement result of a heart rate of a guardian; a second acquisition process for acquiring a second measurement result representing a measurement result of the heart rate of the child of the guardian; a first display process of displaying, on the display screen, a first content selected from a first content group in accordance with the first measurement result; a second display process of displaying, on the display screen, a second content selected from a second content group in accordance with the second measurement result; and a third display process of displaying, on the display screen, a third content selected from a third content group in accordance with the first measurement result and the second measurement result that are acquired together.

Description

Computer storage medium, content display method, and content display apparatus
Technical Field
The invention relates to a computer storage medium, a content display method and a content display apparatus.
Background
Patent document 1 discloses a technique for identifying the health condition of a person to be measured based on a heart rate or the like.
CITATION LIST
Patent document
Patent document 1: Japanese Patent Laid-Open No. 9-187429
Disclosure of Invention
Problems to be solved by the invention
Guardians caring for children are typically subjected to various stresses throughout the years of child rearing. With the technique disclosed in patent document 1, although individual guardians can recognize that they are under stress, they may still have difficulty relaxing themselves and their children.
The present invention has been made in view of conventional problems such as the one described above, and aspects of the present invention provide a computer storage medium, a content display method, and a content display apparatus that can help both guardians and children relax.
Means for solving the problems
A main aspect of the present invention for achieving the above object is a computer storage medium storing a program that, when executed by a computer, causes the computer to execute:
a first acquisition process for acquiring a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second acquisition process for acquiring a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display process for displaying a first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display process of displaying second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display process of displaying a third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
A content display method, comprising:
a first obtaining step of obtaining a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second obtaining step of obtaining a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display step of displaying a first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display step of displaying second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display step of displaying a third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
A content display apparatus comprising:
a first acquisition section configured to acquire a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second acquisition section configured to acquire a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display section configured to display first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display section configured to display second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display section configured to display third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
ADVANTAGEOUS EFFECTS OF INVENTION
According to the invention, parents and children can relax.
Drawings
Fig. 1 illustrates a computer system 10.
Fig. 2 shows details of the functional blocks included in the display section 43 and the receiving section 44.
Fig. 3 shows functional blocks implemented by the server 21.
Fig. 4 is an explanatory diagram of the content 121.
Fig. 5 is an explanatory diagram of information included in the content 121.
Fig. 6 illustrates a home screen 300 displayed on the display screen 33.
Fig. 7 is a flowchart showing the content display processing.
Fig. 8 is a flowchart showing an example of the measurement processing.
Fig. 9 shows a selection screen 301 for selecting a measurement object.
Fig. 10 illustrates a parent measurement screen 302.
Fig. 11 shows a child measurement screen 303.
Fig. 12 is a flowchart showing details of the selection processing.
Fig. 13 is a flowchart of an example of the parent display-acceptance process.
Fig. 14 shows an example of the parent screen 310.
Fig. 15 shows a screen 311 including the relaxation content 640a.
Fig. 16 is an explanatory diagram of the pull-down menu 641.
Fig. 17 is a flowchart showing an example of child display-acceptance processing.
Fig. 18 shows an example of the child screen 312.
Fig. 19 shows a screen 313 including interactive content 660a.
Fig. 20 is an explanatory diagram of the pull-down menu 661.
Fig. 21 is a flowchart showing an example of the parent-child display-acceptance process.
Fig. 22 shows a parent-child screen 320a.
Fig. 23 shows a parent-child screen 320b.
Fig. 24 shows a screen 330 including interactive content 670a.
Fig. 25 is a flowchart of an example of SNS publishing process.
Fig. 26 shows a posting screen 350.
Fig. 27 illustrates an SNS screen 360 for displaying post information.
Fig. 28 is an explanatory diagram of options of the mother's emotion and options of the child's emotion.
Fig. 29 is an explanatory view of the analysis result of the experiment.
List of reference numerals
10 computer system
20 terminal
21 Server
30 storage unit
31 control unit
32 input unit
33 display screen
34 Camera
35 timing unit
36 communication unit
40 measurement section
41 first acquisition processing section
42 second acquisition processing section
43 display section
44 receiving section
60 first display section
61 second display section
62 third display section
63 first input screen image display section
64 second input screen image display section
65 third input screen image display section
66 fourth input screen image display section
67 fifth input screen image display section
68 index display section
110 first state estimating section
111 second state estimating section
112 selection section
113 index calculating section
114 distributed information processing section
Detailed Description
At least the following matters will become clear from the description of the present specification and the accompanying drawings.
A program that causes a computer to execute:
a first acquisition process for acquiring a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second acquisition process for acquiring a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display process for displaying a first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display process of displaying second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display process of displaying a third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
With the above program, content selected according to the heart-rate measurement result of each measurement subject is presented, so the measured person can relax.
In such a program, it is desirable that:
the first display processing displays the first content on the display screen,
the first content being selected, in accordance with the first measurement result, from the first content group, in which the number of selectable contents increases as the number of times the first measurement result is acquired increases.
With the above program, new content is presented according to the number of times the parent's heart rate has been measured. This may increase the user's motivation to continue using the program.
In such a program, it is desirable that:
the second display processing displays the second content on the display screen,
the second content being selected, in accordance with the second measurement result, from the second content group, in which the number of selectable contents increases as the number of times the second measurement result is acquired increases.
With the above program, new content is presented according to the number of times the child's heart rate has been measured. This may increase the user's motivation to continue using the program.
In such a program, it is desirable that:
the third display processing displays the third content on the display screen,
the third content being selected, in accordance with the first measurement result and the second measurement result, from the third content group, in which the number of selectable contents increases as the number of times the first measurement result and the second measurement result are acquired together increases.
With the above program, new content is presented according to the number of times the parent's and the child's heart rates have been measured. This may increase the user's motivation to continue using the program.
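The growth of each content group with the number of measurements can be pictured as a simple unlock rule. The rule below (one additional content every three acquisitions) is a hypothetical illustration; the description does not specify the actual unlocking schedule.

```python
def selectable_contents(group: list, acquisition_count: int,
                        unlock_every: int = 3) -> list:
    """Return the currently selectable portion of a content group.

    Assumed rule: one content is available from the start, and one more
    is unlocked for every `unlock_every` acquisitions of the measurement
    result, up to the size of the group.
    """
    unlocked = 1 + acquisition_count // unlock_every
    return group[:min(unlocked, len(group))]
```

Selection would then proceed over `selectable_contents(...)` instead of the full group, so new content keeps appearing as the user continues to measure.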
In such a program, it is desirable that:
the first display processing displays the first content on the display screen,
the first content being selected from the first content group according to the first measurement result, the time period after the birth of the child, and the time of day.
The above program can present parent content that matches the time of day of the measurement and the child's age in months.
In such a program, it is desirable that:
the second display processing displays the second content on the display screen,
the second content being selected from the second content group according to the second measurement result, the time period after the birth of the child, and the time of day.
The above program can present child content that matches the time of day of the measurement and the child's age in months.
In such a program, it is desirable that:
the third display processing displays the third content on the display screen,
the third content being selected from the third content group according to the first and second measurement results, the time period after the birth of the child, and the time of day.
The above program can present parent and child content that matches the time of day of the measurement and the child's age in months.
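Selection keyed on the child's age in months and the time of day can be pictured as a lookup table. The table entries, age bands, and day/night boundary below are hypothetical examples, not values disclosed in the description.

```python
from datetime import datetime

# Hypothetical lookup table; the actual content groups are not disclosed here.
CONTENT_TABLE = {
    ("0-5m", "day"): "tummy-time play guide",
    ("0-5m", "night"): "heartbeat lullaby",
    ("6-11m", "day"): "peekaboo game video",
    ("6-11m", "night"): "gentle rocking music",
}

def age_band(months_after_birth: int) -> str:
    return "0-5m" if months_after_birth < 6 else "6-11m"

def time_band(now: datetime) -> str:
    return "day" if 6 <= now.hour < 18 else "night"

def select_content(measurement: float, months_after_birth: int,
                   now: datetime) -> str:
    """Select content by age band and time of day; the measurement result
    could further narrow the choice (omitted for brevity)."""
    return CONTENT_TABLE[(age_band(months_after_birth), time_band(now))]
```

For example, a measurement taken at 21:00 for a three-month-old would land in the ("0-5m", "night") cell.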
In such a program, it is desirable that:
the first display processing displays the first content on the display screen,
the first content being selected from the first content group based on the first measurement result, the time period after the birth of the child, the time of day, and past data of the first measurement result.
The above program can present parent content that matches the parent's temperament.
In such a program, it is desirable that:
the second display processing displays the second content on the display screen,
the second content being selected from the second content group based on the second measurement result, the time period after the birth of the child, the time of day, and past data of the first measurement result.
The above program can present child content that matches the parent's temperament.
In such a program, it is desirable that:
the third display processing displays the third content on the display screen,
the third content being selected from the third content group based on the first and second measurement results, the time period after the birth of the child, the time of day, and past data of the first measurement result.
The above program can present parent and child content that matches the parent's temperament.
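Using past data of the first measurement result amounts to personalizing the judgment: the parent's own history supplies a baseline in place of a fixed population threshold, which is one plausible reading of "matching the parent's temperament". The margin below is an assumption for illustration.

```python
def personal_baseline(past_measurements: list) -> float:
    """The parent's usual heart rate, estimated from past first-measurement data."""
    return sum(past_measurements) / len(past_measurements)

def relative_state(current: float, past_measurements: list,
                   margin: float = 1.1) -> str:
    """Judge the current measurement against the personal baseline
    rather than against a fixed threshold (assumed rule)."""
    baseline = personal_baseline(past_measurements)
    return "stressed" if current > baseline * margin else "usual"
```

A parent whose usual rate is around 72 bpm would be judged "stressed" at 90 bpm, while a parent whose history averages 85 bpm would not.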
In such a program, it is desirable that:
the program causes the computer to further execute:
a first input screen image display process for displaying a first input screen image on the display screen,
the first input screen image is used to prompt the user to input first information,
the first information is information related to execution of content displayed on the display screen, the content belonging to a plurality of contents included in the first content group; and
a first acceptance process of accepting the input first information in association with content displayed on the display screen,
the first display processing displays the first content on the display screen,
the first content is selected from the first content group based on the first measurement, a time period after birth of the child, a time of day, and the first information.
The above-described program may present content corresponding to the status of execution of parental content or the like.
In such a program, it is desirable that:
the program causes the computer to further execute:
a second input screen image display process for displaying a second input screen image on the display screen,
the second input screen image is used to prompt the user to input second information,
the second information is information related to execution of content displayed on the display screen, the content belonging to a plurality of contents included in the second content group; and
a second acceptance process of accepting the input second information in association with the content displayed on the display screen,
the second display processing displays the second content on the display screen,
the second content is selected from the second content group according to the second measurement, a time period after birth of the child, a time of day, and the second information.
The above-described program can present content corresponding to the status of execution of child content or the like.
In such a program, it is desirable that:
the program causes the computer to further execute:
a third input screen image display process of displaying a third input screen image on the display screen,
the third input screen image is used to prompt the user to input third information,
the third information is information related to execution of content displayed on the display screen, the content belonging to a plurality of contents included in the third content group; and
a third acceptance process of accepting the input third information in association with the content displayed on the display screen,
the third display processing displays the third content on the display screen,
the third content is selected from the third group of content based on the first and second measurements, a time period after birth of the child, a time of day, and the third information.
The above-described program can present content corresponding to the status of execution of parent and child content, and the like.
In such a program, it is desirable that:
the program causes the computer to further execute:
a fourth input screen image display process of displaying a fourth input screen image on the display screen,
the fourth input screen image is used to prompt the user to input fourth information,
the fourth information is information on a state of the child when executing content displayed on the display screen, the content belonging to a plurality of contents included in the second content group; and
a fourth acceptance process of accepting the input fourth information in association with the content displayed on the display screen,
the second display processing displays the second content on the display screen,
the second content is selected from the second content group according to the second measurement, a time period after birth of the child, a time of day, and the fourth information.
The program may present content corresponding to the state of the child when the child content was executed.
In such a program, it is desirable that:
the program causes the computer to further execute:
a fifth input screen image display process of displaying a fifth input screen image on the display screen,
the fifth input screen image is used to prompt the user to input fifth information,
the fifth information is information on a state of the child when executing content displayed on the display screen, the content belonging to a plurality of contents included in the third content group; and
a fifth acceptance process of accepting the input fifth information in association with the content displayed on the display screen,
the third display processing displays the third content on the display screen,
the third content is selected from the third content group according to the first and second measurements, a time period after birth of the child, a time of day, and the fifth information.
The program may present content corresponding to the status of the child when parent and child content was executed.
In such a program, it is desirable that:
the program causes the computer to further execute:
an image display process of displaying a first image, a second image, and a third image on the display screen,
the first image is used to prompt the user to start measuring the guardian's heart rate,
the second image is for prompting a user to begin measuring the child's heart rate, an
The third image is used to prompt the user to start measuring the guardian's heart rate and the child's heart rate together.
With the above program, the heart rates of both the parent and the child can be measured easily.
In such a program, it is desirable that:
music created based on heart sounds of the guardian and the child be included as content in at least one of the second content group and the third content group.
With the above program, children (especially infants) can relax because music created based on heart sounds can be presented.
In such a program, it is desirable that:
the program causes the computer to further execute:
an index calculation process of calculating an index based on the first input information,
the indicator represents a level of interaction between the guardian and the child,
the first input information is input after the second content or the third content is displayed on the display screen, and the second content or the third content is used for prompting the interaction between the guardian and the child; and
an index display process of displaying the index on the display screen.
With the above program, since the user can objectively confirm the index indicating the level of interaction, it is possible to increase motivation for encouraging the user to continue using the program.
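One simple way to realize the interaction index described above is the fraction of displayed interaction contents (second or third content) that the user reports as actually carried out. This definition is an assumption for illustration; the description does not fix the formula.

```python
def interaction_index(execution_log: list) -> int:
    """Interaction level as the percentage of interaction-prompting
    contents that were actually carried out (assumed definition).

    `execution_log` holds one boolean per displayed content, taken from
    the first input information entered after the content was shown.
    """
    if not execution_log:
        return 0
    done = sum(1 for executed in execution_log if executed)
    return round(100 * done / len(execution_log))
```

Displaying this single number lets the user objectively track how often prompted interactions actually happen.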
In such a program, it is desirable that:
the program causes the computer to further execute:
a first estimation process for estimating a psychological state of the guardian based on the first measurement result;
a second estimation process for estimating a psychological state of the child based on the second measurement result; and
a transmission process of transmitting estimation results of the first estimation process and the second estimation process and second input information to a server,
the server providing a social networking service,
the second input information being input in association with the estimation results.
With the above-described program, a user can easily share information about his or her mental state with many other users.
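The estimation-and-transmission flow can be sketched as below. The state rule, JSON field names, and endpoint URL are placeholders; the patent does not disclose the server API.

```python
import json
from urllib import request  # standard library; no request is actually sent here

def estimate_state(heart_rate: float, baseline: float) -> str:
    """Coarse psychological-state estimate (assumed rule)."""
    return "tense" if heart_rate > baseline * 1.1 else "relaxed"

def build_post(parent_state: str, child_state: str, comment: str,
               url: str = "https://example.invalid/api/posts") -> request.Request:
    """Bundle both estimation results with the user's comment (second
    input information) into a POST request for the SNS server."""
    payload = json.dumps({
        "parent_state": parent_state,
        "child_state": child_state,
        "comment": comment,
    }).encode("utf-8")
    return request.Request(url, data=payload,
                           headers={"Content-Type": "application/json"})
```

The caller would pass the built request to `urllib.request.urlopen` to actually post it.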
A content display method, comprising:
a first obtaining step of obtaining a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second obtaining step of obtaining a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display step of displaying a first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display step of displaying second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display step of displaying a third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
With the above method, since content selected according to the heart-rate measurement result of each measurement subject is presented, the measured person can relax.
A computer, comprising:
a first acquisition section configured to acquire a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second acquisition section configured to acquire a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display section configured to display first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display section configured to display second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display section configured to display third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
With the above computer, since content selected according to the heart-rate measurement result of each measurement subject is presented, the measured person can relax.
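The three acquisition, selection, and display stages summarized above can be sketched in a few lines. Everything below is hypothetical: the content groups, the resting rates, and the +10%-over-resting classification rule are illustrative stand-ins, since the claims do not fix a concrete selection algorithm.

```python
# Illustrative sketch only: content groups, resting rates, and the
# +10%-over-resting rule are assumptions, not the claimed algorithm.

FIRST_GROUP = {"calm": "breathing exercise", "tense": "stretching video"}    # parent content
SECOND_GROUP = {"calm": "picture-book video", "tense": "heartbeat lullaby"}  # child content
THIRD_GROUP = {"calm": "gentle play guide", "tense": "baby-massage video"}   # parent and child

def classify(heart_rate: float, resting: float) -> str:
    """Map a heart-rate measurement to a coarse state (assumed rule)."""
    return "tense" if heart_rate > resting * 1.1 else "calm"

def first_display(x1: float) -> str:
    """First display process: select from the first content group."""
    return FIRST_GROUP[classify(x1, resting=70.0)]

def second_display(x2: float) -> str:
    """Second display process: an infant's resting rate is higher."""
    return SECOND_GROUP[classify(x2, resting=120.0)]

def third_display(x1: float, x2: float) -> str:
    """Third display process: both results acquired together."""
    states = {classify(x1, 70.0), classify(x2, 120.0)}
    return THIRD_GROUP["tense" if "tense" in states else "calm"]
```

For example, a parent measurement of 85 bpm is classified as tense under these assumed thresholds and selects the parent's "tense" content.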
Examples
Architecture for computer system 10
Fig. 1 illustrates an example of the computer system 10. The computer system 10 displays content for relaxing the user on a display screen 33 of the terminal 20 (described below) according to the measurement results of the users' heart rates, i.e., those of the child and the parent (guardian). The computer system 10 includes a terminal 20 and a server 21.
The terminal 20 is, for example, a smartphone or a tablet computer, and various functions are implemented by the terminal 20. The terminal 20 measures the heart rate of the child and the parent and sends the measurement result to the server 21. In addition, the terminal 20 displays the content selected by the server 21.
The server 21 selects content for relieving the user's stress according to, for example, the measurement result of the user's heart rate transmitted from the terminal 20. Further, although described in detail later, the server 21 manages heart-rate measurement results and a predetermined social networking service via which text information and the like are posted.
Details of the terminal 20
The terminal 20 includes a storage unit 30, a control unit 31, an input unit 32, a display screen 33, a camera 34, a timer unit 35, and a communication unit 36.
The storage unit 30 stores a predetermined program to be executed by a processor (not shown) in the terminal 20, and various pieces of information.
The control unit 31 is constituted by functional blocks that control the terminal 20 in a supervisory manner; these functional blocks are realized by a processor in the terminal 20 executing a predetermined program. Details of the functional blocks included in the control unit 31 will be described later.
The input unit 32 is implemented by the display screen 33, which is a touch panel of the terminal 20 that accepts input by tapping. The input unit 32 receives the operation results of the user of the terminal 20. The display screen 33 displays content and various items of information according to commands transmitted from the control unit 31 and the user's operation results.
The camera 34 (measuring means) measures light reflected from blood flow while illuminating a finger, palm, or other body part with light from a light-emitting diode (not shown). In other words, the camera 34 in the present embodiment functions not only as a device for taking ordinary pictures and videos but also as an optical heart rate (pulse) measuring device.
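The camera-based measurement described here is a form of photoplethysmography: blood-volume changes slightly modulate the brightness of light reflected from the skin, so the pulse can be recovered from the mean brightness of successive frames. The peak-counting sketch below is a simplified illustration (assumed sampling rate, no filtering); practical implementations add band-pass filtering and motion rejection.

```python
import math

def estimate_heart_rate(brightness: list, fps: float = 30.0) -> float:
    """Estimate heart rate in bpm from a per-frame mean-brightness signal
    by counting local maxima that rise above the signal mean."""
    mean = sum(brightness) / len(brightness)
    peaks = 0
    for i in range(1, len(brightness) - 1):
        if (brightness[i] > mean
                and brightness[i] > brightness[i - 1]
                and brightness[i] >= brightness[i + 1]):
            peaks += 1
    duration_s = len(brightness) / fps
    return 60.0 * peaks / duration_s

# Ten seconds of a synthetic 1.2 Hz pulse at 30 fps should read near 72 bpm.
signal = [math.sin(2 * math.pi * 1.2 * t / 30.0) for t in range(300)]
```

Counting 12 peaks over 10 seconds yields 72 bpm for the synthetic signal above.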
The timing unit 35 keeps time, and the communication unit 36 transfers information between the terminal 20 and each of the other terminals and the server 21.
Functional blocks implemented by the control unit 31
The control unit 31 includes a measurement section 40, a first acquisition processing section 41, a second acquisition processing section 42, a display section 43, and a receiving section 44.
The measurement section 40 measures the heart rate of the measurement subject based on the reflected light captured by the camera 34. Although described in detail later, the measurement section 40 performs three types of measurement: measurement of the parent's heart rate (hereinafter sometimes referred to as "parent measurement"), measurement of the child's heart rate (hereinafter sometimes referred to as "child measurement"), and measurement of both the parent's and the child's heart rates (hereinafter sometimes referred to as "parent and child measurement"). The measurement method is not limited to a method using a camera; measurement may also be performed using wearable devices, wearable articles (including clothes, diapers, underwear, and the like), or contact/non-contact sensing devices (including image-analysis-based sensing and the like).
The first acquisition processing section 41 acquires a measurement result X1 (first measurement result) of the parent's heart rate, and the second acquisition processing section 42 acquires a measurement result X2 (second measurement result) of the child's heart rate.
The display section 43 displays various screens on the display screen 33 according to the user's operations and the like. For example, the display section 43 displays, on the display screen 33, the content selected by the server 21 based on the heart-rate measurement results, together with various items of information. Although described in detail later, the content displayed on the display screen 33 is, for example, videos (including music, voice, and text) and images for relieving the parent and the child of stress and relaxing them.
The receiving section 44 receives various items of information input by the user. The items of information input by the user are, for example, information on whether or not the content displayed on the display screen 33 was executed, and information on the emotion of the child when the content was executed.
Fig. 2 shows details of the functional blocks included in the display section 43 and the receiving section 44. The display section 43 is a functional block for displaying information such as images and videos on the display screen 33. The display section 43 includes a first display section 60, a second display section 61, a third display section 62, a first input screen image display section 63, a second input screen image display section 64, a third input screen image display section 65, a fourth input screen image display section 66, a fifth input screen image display section 67, and an index display section 68.
The first display unit 60 displays the parental content selected by the server 21 based on the measurement result of the parental heart rate on the display screen 33.
The second display section 61 displays the child content selected by the server 21 based on the measurement result of the child heart rate on the display screen 33.
The third display portion 62 displays the parent and child contents selected by the server 21 based on the measurement results of the parent and child heart rates on the display screen 33.
The first input screen image display section 63 displays an image for prompting the user to input information I1 (first information) relating to execution of the parental content on the display screen 33. The information on the execution of the content is information indicating, for example, whether or not the content is executed. In the following description of the present embodiment, parental content is referred to as "relaxation content" in some cases.
The second input screen image display section 64 displays an image prompting the user to input information I2 (second information) relating to the execution of child content on the display screen 33.
The third input screen image display section 65 displays an image on the display screen 33 prompting the user to input information I3 (third information) relating to the execution of parent and child content.
The fourth input screen image display section 66 displays an image for prompting the user to input information I4 (fourth information) on the state of the child when the child content is executed on the display screen 33. Here, the information related to the state of the child is, for example, information indicating the emotions ("good emotion", "normal", and "bad emotion") of the child.
The fifth input screen image display section 67 displays an image prompting the user to input information I5 (fifth information) on the status of the child when the parent and child contents were executed on the display screen 33.
The index display unit 68 displays, for example, "interaction points (indexes)" calculated by the server 21 based on the number of times the child content or the parent and child content is executed on the display screen 33. Although described in detail later, the level of parent-child interaction can be objectively identified by displaying the "interaction points" on the display screen 33. Thus, a parent may, for example, consciously increase the number of interactions he or she has with the child. In the following description of the present embodiment, the child content and the parent and child content are collectively referred to as "interactive content".
The receiving unit 44 is a functional block that receives various items of information input by the user, and the receiving unit 44 includes a first receiving unit 80, a second receiving unit 81, a third receiving unit 82, a fourth receiving unit 83, and a fifth receiving unit 84.
The first accepting section 80 accepts information I1 relating to execution of parental content input by the user in association with the parental content displayed on the display screen 33.
The second accepting unit 81 accepts information I2 relating to the execution of child content input by the user in association with the child content displayed on the display screen 33.
The third accepting section 82 accepts information I3 input by the user relating to the execution of parent and child content in a manner correlated with the parent and child content displayed on the display screen 33.
The fourth accepting unit 83 accepts information I4 regarding the status of the child when the child content is executed.
The fifth accepting section 84 accepts information I5 regarding the status of the child when parent and child content was executed.
Details of the server 21
Fig. 3 shows an example of functional blocks implemented by the server 21 when a processor (not shown) in the server 21 executes a predetermined program.
The server 21 is constituted by a control unit 100 for controlling the server 21 in a supervised manner, a storage unit 101 for storing various items of information, and a communication unit 102 for making communication of information with respect to the terminal 20 and the like.
Control unit 100
The control unit 100 includes a first state estimating section 110, a second state estimating section 111, a selecting section 112, an index calculating section 113, and a distribution information processing section 114.
The first state estimating unit 110 estimates the psychological state of the parent while quantifying it, based on the measurement result X1 of the parent's heart rate transmitted from the terminal 20. More specifically, the first state estimating section 110 estimates that the parent is in the "active state" with an "active level of ○%" in the case where the measurement result X1 is higher than a predetermined threshold value T1, and estimates that the parent is in the "relaxed state" with a "relaxed level of ○%" in the case where the measurement result X1 is lower than the threshold value T1. Here, the "active state" (or "anxiety state") means a state in which the heart rate is high and the parent is lively or anxious. The "relaxed state" means a state in which the heart rate is low and the parent is comfortable.
The second state estimating unit 111 estimates the psychological state of the child while quantifying it, based on the measurement result X2 of the child's heart rate transmitted from the terminal 20. More specifically, the second state estimating section 111 estimates that the child is in the "active state" with an "active level of □%" in the case where the measurement result X2 is higher than a predetermined threshold value T2, and estimates that the child is in the "relaxed state" with a "relaxed level of □%" in the case where the measurement result X2 is lower than the threshold value T2.
In the present embodiment, the psychological state of the measured person is estimated as a quantitative value by comparing the measurement result of the heart rate with a threshold value; however, a "sympathetic" level and a "parasympathetic" level based on heart rate variability, obtained by frequency analysis of changes in the heart rate and of the time-series change in the interval between successive heartbeats, may be used instead. More specifically, the "sympathetic" level and the "parasympathetic" level may be derived from the heart rate of the measured person, and the psychological state may be estimated to be the "active state" in the case where the "sympathetic" level is higher than the "parasympathetic" level, and the "relaxed state" in the case where the "parasympathetic" level is higher than the "sympathetic" level.
Alternatively, a state in which the "sympathetic" level and the "parasympathetic" level are balanced may be estimated as a "balanced state", and the psychological state may be classified into three states: the "active state", the "relaxed state", and the "balanced state". In the case where the "balanced state" is determined based on the measured heart rate and a threshold value, for example, the psychological state may be estimated to be the "balanced state" when the measured heart rate is substantially equal to the threshold value (e.g., when the heart rate falls within ±5% of the threshold value). In this case, the "active state" is estimated when the heart rate is more than 5% above the threshold value, and the "relaxed state" is estimated when the heart rate is more than 5% below the threshold value.
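The three-state classification described above can be sketched as follows. This is a minimal illustration only; the function name, the default ±5% band, and the state labels as string values are assumptions for the sketch, not identifiers from the embodiment.

```python
def estimate_state(heart_rate: float, threshold: float, band: float = 0.05) -> str:
    """Classify a measured heart rate against a threshold into one of the
    three psychological states described above (illustrative sketch)."""
    if heart_rate > threshold * (1 + band):
        return "active"    # heart rate more than 5% above the threshold
    if heart_rate < threshold * (1 - band):
        return "relaxed"   # heart rate more than 5% below the threshold
    return "balanced"      # within +/-5% of the threshold
```

For example, with a threshold of 70 bpm, a measured rate of 80 would be classified as "active" and a rate of 71 as "balanced".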
The "balanced state" is calmer than the "active state", which is estimated, for example, when excitement increases, and is more active than the "relaxed state", which is estimated, for example, when the user is fatigued. Therefore, the "balanced state" is a psychological state suitable for the measured person. In the present embodiment, although described in detail later, the measured person can recognize how close his or her psychological state is to the "balanced state" from the estimation results (i.e., the active level □% and the relaxed level ○%) displayed on the display screen 33.
The selection unit 112 selects content suitable for the user from the content 121 stored in the storage unit 101 based on each item of information. The "content suitable for the user" differs depending on various factors such as the following: the psychological states of the parent and the child, the age of the child (e.g., whether the child can hold his or her head up, or can play with his or her hands), the time band (morning or night), and the relationship between the child and the guardian (mother, father, grandfather, grandmother, nurse, or the like). Therefore, although described in detail later, the selection section 112 selects the content based on, for example, the number of heart rate measurements of each of the three different types, the psychological states of the parent and the child, the child's age in months, the time band, the parent's past mental-state tendency, the number of executions of the content, and the effect on the child.
The index calculation section 113 calculates "interaction points" from the information transmitted from the terminal 20 after the "interactive content" (i.e., the child content or the parent and child content) is executed. For example, when the index calculation section 113 acquires the information I2 or I3 indicating that "interactive content" has been executed, the index calculation section 113 increases the "interaction points" by "1". In another example, in a case where a parent-child image is transmitted from the terminal 20 to the server 21 after "interactive content" has been displayed on the display screen 33, the index calculation section 113 increases the "interaction points" by "2". Further, the index calculation section 113 may increase the "interaction points" by a predetermined number of points according to information indicating that the terminal 20 has acquired a parent-child image. Here, the information I2 or I3 indicating that "interactive content" has been executed, the parent-child image, and the information indicating that the terminal 20 has acquired the parent-child image each correspond to first input information.
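The point-update rules above can be sketched as a small table-driven function. The event names and function name are illustrative assumptions; the increment values ("1" for executed content, "2" for a transmitted parent-child image) are taken from the description above.

```python
def update_interaction_points(points: int, event: str) -> int:
    """Increase the "interaction points" according to the received event
    (illustrative sketch; event names are assumptions)."""
    increments = {
        "content_executed": 1,  # information I2 or I3 was received
        "image_transmitted": 2, # a parent-child image reached the server
    }
    return points + increments.get(event, 0)
```

An unknown event leaves the point total unchanged, which mirrors the fact that only the listed items of first input information affect the index.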
The distribution information processing section 114 processes distribution information 125 (described later) that is transmitted from the terminal 20 and posted on a predetermined social networking service (hereinafter referred to as an SNS) so that the distribution information 125 can be viewed with a browser of the terminal 20.
Storage unit 101
The storage unit 101 stores user information 120, content 121, history information 122, estimation information 123, captured image 124, and distribution information 125.
The user information 120 is information relating to the child and the parent as the user, and includes, for example, "identification information", "name", "address", "sex", "height", "weight", and "date of birth" of the child and the parent, respectively.
The content 121 is video and images played or reproduced on the terminal 20 to relax parents and children. In the present embodiment, although the content 121 is a video and an image, the content 121 may be only voice and/or music. Details of the content 121 will be described later.
The history information 122 is information related to the measurement results of the heart rates of the parent and the child measured in the past. More specifically, the history information 122 includes the "measurement result X1 of the parent's heart rate", the "measurement result X2 of the child's heart rate", the "number of measurements of the parent's heart rate", the "number of measurements of the child's heart rate", and the "number of measurements of the parent's and child's heart rates". In the present embodiment, the control unit 100 updates the history information 122 each time the control unit 100 receives, from the terminal 20, a transmitted measurement result of the heart rate or information indicating which of the three types of measurement (i.e., "parent measurement", "child measurement", and "parent and child measurement") was performed.
The estimation information 123 is information indicating the psychological state of the parent estimated by the first state estimating unit 110 in the past, that is, an "estimation result" of the past psychological state. The first state estimating part 110 updates the estimation information 123 each time the first state estimating part 110 estimates the psychological state of the parent. Therefore, the tendency of the psychological state of the parent (i.e., which of the "active state" and the "relaxed state" is dominant) can be identified based on the estimation information 123.
The captured image 124 is an image of a parent and a child captured by the user, and the distribution information 125 is information distributed on a predetermined SNS managed by the server 21.
Details of content 121
Fig. 4 is an explanatory diagram of the content 121. The content 121 includes n pieces of parent content A1 through An (a first content group), m pieces of child content B1 through Bm (a second content group), and x pieces of parent and child content C1 through Cx (a third content group).
Here, "parent content" is relaxation content for the parent selected according to "measurement of the parent's heart rate", and "child content" is interactive content selected according to "measurement of the child's heart rate". "Parent and child content" is interactive content selected based on "measurement of the parent's and child's heart rates". Further, the n pieces of parent content A1 to An are defined as "content group A (first content group)", the m pieces of child content B1 to Bm are defined as "content group B (second content group)", and the x pieces of parent and child content C1 to Cx are defined as "content group C (third content group)".
The contents A1 to Cx for relaxing the user include, for example, "a video prompting the parent to take the child for a walk together", "a video prompting the parent to exercise together with the child", and "a video prompting the parent to breathe deeply". However, as described above, it is necessary to select and present content suitable for the user in consideration of various factors such as the user's psychological state, the time band, and the child's age in months. More specifically, for example, in a case where the user's psychological state is the active state (e.g., a state where the heart rate is high and the user is energetic), or at night, it is generally not desirable to present "a video prompting the parent to take the child for a walk together".
As another example, it is also not desirable to present the user with "a video prompting the parent to exercise together with the child" in a case where the child is young and cannot yet hold his or her head up.
In view of the above, in the present embodiment, pieces of information are added to each of the contents A1 to Cx so that the selection section 112 can select content suitable for the user.
Fig. 5 is an explanatory diagram of the items of information 200 to 206 added to each content 121.
In the present embodiment, as the number of measurements of the heart rate increases, the number of contents selectable by the selection section 112 also increases. Each of the contents 121 becomes selectable by the selection section 112 when the number of measurements of the heart rate reaches a certain number, and the information 200 represents that required number of measurements. More specifically, the information 200 of the content A1 is "0", so the content A1 can be selected from the beginning. On the other hand, the information 200 of the content An is "5", and therefore the content An will not be an object of selection unless the number of measurements of the parent's heart rate becomes 5 or more.
The information 200 of the content Bm is "30", and therefore the content Bm will not be an object of selection unless the number of measurements of the child's heart rate becomes 30 or more. Further, the information 200 of the content Cx is "30", and therefore the content Cx will not be an object of selection unless the number of measurements of the parent's and child's heart rates becomes 30 or more.
In the present embodiment, although described in detail later, new content is presented to the user as the number of measurements of the heart rate increases. This increases the user's motivation to continue using the computer system 10.
The information 201 represents the estimated psychological state (i.e., "active state" or "relaxed state"). In the present embodiment, for example, since the information 201 of the content A1 is "active", the content A1 becomes an object of selection when the parent is in the "active state". Further, since the information 201 of each of the contents B1 and C1 is "active", the content B1 becomes an object of selection when the child is in the "active state", and the content C1 becomes an object of selection when both the parent and the child are in the "active state". Thus, in the present embodiment, the selection section 112 can select content suited to the psychological state of the user.
The information 202 represents the child's age in months. For example, since the information 202 of the content A1 is "0 to 12", the content A1 becomes an object of selection when the child is 0 to 12 months old.
The information 203 represents a time band. For example, since the information 203 of the content A1 is "morning", the content A1 becomes an object of selection when the current time band is morning. In the present embodiment, for example, "morning" means the time band of 6:00 to 11:00, "daytime" means the time band of 11:00 to 16:00, "evening" means the time band of 16:00 to 18:00, and "night" means the time band of 18:00 to 6:00.
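The time-band boundaries above map directly to a small classification function. The function name and string labels are illustrative assumptions; the hour ranges are those given in the description.

```python
def time_band(hour: int) -> str:
    """Map an hour of day (0-23) to the time band defined above
    (illustrative sketch)."""
    if 6 <= hour < 11:
        return "morning"   # 6:00 to 11:00
    if 11 <= hour < 16:
        return "daytime"   # 11:00 to 16:00
    if 16 <= hour < 18:
        return "evening"   # 16:00 to 18:00
    return "night"         # 18:00 to 6:00, wrapping past midnight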
The information 204 represents the parent's past mental-state tendency. For example, since the information 204 of the content A1 is "active", the content A1 targets parents for whom the percentage of the "active state" among past mental states is relatively high. Content for which the information 204 is "relaxed" targets parents for whom the percentage of the "relaxed state" among past mental states is relatively high.
The information 205 indicates the number of executions of the content. For example, since the information 205 of the content A1 is "5", the content A1 has been executed five times.
The information 206 represents, as a numerical value, the emotion of the child when the content was executed. The information 206 is a numerical value calculated based on the information input by the user after the content is executed, and is added to the child contents B1 to Bm and the parent and child contents C1 to Cx. After the content is executed, a value of "+1" is assigned when the state of the child is "good emotion", "0" when the state of the child is "normal", and "-1" when the state of the child is "bad emotion".
For example, in a case where the content B1 has been executed twice and "good emotion" was input both times, the information 206 of the content B1 is set to "2" as shown in fig. 5. Thus, content with a higher value of the information 206 is content that the child prefers.
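The accumulation of the information 206 can be sketched as follows. The mapping (+1 / 0 / -1) and the example of two "good emotion" inputs summing to 2 come from the description above; the function and variable names are illustrative assumptions.

```python
# Numerical values assigned to the child's state after content execution.
EMOTION_SCORE = {"good emotion": 1, "normal": 0, "bad emotion": -1}

def emotion_value(inputs):
    """Sum the user's emotion inputs into the numerical value held as
    the information 206 (illustrative sketch)."""
    return sum(EMOTION_SCORE[state] for state in inputs)
```

Two "good emotion" inputs yield 2, matching the content B1 example; one "good emotion" and one "bad emotion" cancel out to 0.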
Here, the pieces of information 200 to 204 are set in advance when the content 121 is stored, for example. On the other hand, regarding the information 205 on the "number of executions", the control unit 100 updates the information 205 of the content 121 according to the above-described items of information I1 to I3, related to the execution of the content, transmitted from the terminal 20. Regarding the information 206 related to "emotion", the control unit 100 updates the information 206 of the content 121 according to the above-described items of information I4 and I5, related to the state of the child, transmitted from the terminal 20. Details of the process by which the control unit 100 updates the pieces of information 205 and 206 will be described later.
Example of the home screen
Fig. 6 shows an example of a home screen 300 displayed on the display screen 33 after the program in the terminal 20 is started. The home screen 300 is, for example, a screen for guiding the user to perform various types of processing such as measurement of the child's heart rate. The home screen 300 includes images 500 and 501, buttons 600 to 602, and icons 800 to 804.
The image 500 is an image containing the characters "interaction points" and the value of the "interaction points", and the image 501 is an image containing the characters "parent-child mental balance" and the number of times of coincidence in mental balance between the parent and the child. The "number of times of coincidence in mental balance between the parent and the child" refers to the number of times the estimated psychological states of the parent and the child agreed with each other when "parent and child measurement" was performed. The value of the "interaction points" and the "number of times of coincidence in mental balance between the parent and the child" are appropriately updated by the display section 43.
The button 600 includes the expression "measuring mood state" and the button 600 is tapped to start heart rate measurement.
The button 601 includes the expression "interaction", and the button 601 is tapped to display "interactive content" for interaction with the child on the display screen 33.
The button 602 includes the expression "relax", and the button 602 is tapped to display "relaxation content" for relaxing the parent on the display screen 33.
The icon 800 includes the expression "home", and the icon 800 is tapped to display a home screen (not shown) on the display screen 33.
The icon 801 includes the expression "history", and the icon 801 is tapped to display the aggregated result of the past mental states on the display screen 33, for example.
The icon 802 includes the expression "forum", and the icon 802 is tapped to display the SNS screen on the display screen 33.
The icon 803 includes the expression "notification", and the icon 803 is tapped to display the notification from the program administrator on the display screen 33.
The icon 804 includes the expression "setting", and the icon 804 is tapped to start processing for setting information related to the user (parent and child), for example. Here, it is assumed that the icon 804 of "setting" has already been tapped and that various items of user information (such as name and date of birth) have been set and stored as the user information 120 in the server 21.
Content display processing (content display method)
Fig. 7 is a flowchart showing the content display processing executed by the computer system 10.
First, when the button 600 indicating "measuring the mood state" in the home screen 300 of fig. 6 is tapped, the measurement process is executed (S10). Fig. 8 is a flowchart showing an example of the measurement processing. First, the display unit 43 displays a selection screen 301 (see fig. 9) for prompting the user to select a measurement target on the display screen 33 (S100). The selection screen 301 includes a button 610 (first image) indicating "perform own measurement", a button 611 (second image) indicating "perform child measurement", and a button 612 (third image) indicating "perform parent and child measurement".
When the button 610 indicating "perform self measurement" is tapped ("parent" in S101), the display unit 43 displays the parent measurement screen 302 shown in fig. 10 on the display screen 33 (S102). The parental measurements screen 302 includes a button 620 representing "measure".
When the button 620 indicating "measure" is tapped, the measurement section 40 starts measurement of the parental heart rate (S103). When the heart rate measurement is finished, the first acquisition processing section 41 acquires the measurement result X1 of the parental heart rate (S104: first acquisition processing (first acquisition step)).
When the button 611 indicating "perform child measurement" is tapped in the processing S101 (child in S101), the display unit 43 displays the child measurement screen 303 shown in fig. 11 on the display screen 33 (S105). The child measurement screen 303 includes a button 621 indicating "measure".
When the button 621 indicating "measure" is tapped, the measuring section 40 starts measurement of the child heart rate (S106). When the heart rate measurement is completed, the second acquisition processing section 42 acquires the measurement result X2 of the child' S heart rate (S107: second acquisition processing (second acquisition step)).
When the button 612 indicating "perform parent and child measurement" is tapped in the processing S101 ("parent-child" in S101), the display section 43 executes the parent measurement processing including the above-described processing S102 to S104 (S108). When the processing S108 is finished, the display section 43 further executes child measurement processing including the above-described processing S105 to S107 (S109).
Thus, in the measurement processing S10, the measurement result X1 of the parent heart rate is acquired when "measurement of the parent heart rate" is performed, the measurement result X2 of the child heart rate is acquired when "measurement of the child heart rate" is performed, and the measurement results X1 and X2 of the parent and child heart rates are acquired when "measurement of the parent and child heart rates" is performed.
After executing the process S10 of fig. 7, the communication unit 36 transmits the one or more measurement results acquired in the process S10 to the server 21 (S11).
Then, the control unit 100 in the server 21 acquires the measurement results and stores them in the storage unit 101 (S12). Thus, the history information 122 including the "measurement result X1", the "measurement result X2", the "number of measurements of the parent's heart rate", the "number of measurements of the child's heart rate", and the "number of measurements of the parent's and child's heart rates" is updated. For example, after the measurement result X1 has been transmitted to the server 21 in the process S11, the control unit 100 updates the "measurement result X1" and the "number of measurements of the parent's heart rate".
After performing the process S12, the control unit 100 performs estimation of the psychological state of the measured person (S13). More specifically, in the case where the parent' S heart rate is measured and "measurement result X1" is transmitted, the first state estimating section 110 estimates the psychological state of the parent based on "measurement result X1" (S13: first estimation processing). In the case where the child heart rate is measured and "measurement result X2" is transmitted, the second state estimating section 111 estimates the psychological state of the child based on "measurement result X2" (S13: second estimation processing). In the case where the parent and child heart rates are measured, and both the "measurement result X1" and the "measurement result X2" are transmitted, the first state estimating section 110 and the second state estimating section 111 estimate the psychological states of the parent and the child, respectively.
The first state estimating section 110 and the second state estimating section 111 store the obtained estimation results in the storage unit 101 (S13). Thus, the estimation information 123 is updated.
Next, the selection unit 112 executes a process for selecting a content based on the measurement result and the like (S14). Fig. 12 is a flowchart showing the details of the selection processing S14.
First, the selection unit 112 specifies a content group from the measurement result acquired in the processing S11, for example (S200). More specifically, the selection section 112 specifies "parent content group a" when the "measurement result X1" is acquired, "child content group B" when the "measurement result X2" is acquired, and "parent and child content group C" when the "measurement result X1" and the "measurement result X2" are acquired.
Then, by referring to the history information 122 including the current "number of measurements" and the information 200 of the content 121 described above with reference to fig. 5 (i.e., the information indicating the required "number of measurements"), the selection part 112 specifies one or more contents satisfying the condition of the "number of measurements" from the specified content group (S201). For example, in the case where the number of measurements of the child's heart rate is "50" or more, all the contents included in the content group B are specified.
Further, by referring to the estimation information 123 including the current estimation result of the psychological state of the measured person and the information 201 of the content 121 (i.e., the information indicating the "estimation result"), the selection part 112 specifies one or more contents satisfying the condition of the "estimation result" from among the contents specified in the processing S201 (S202). For example, in a case where the current psychological state of the child is the "active state", the contents B1 and B2 and the like, to which the information 201 representing "active" is assigned, are specified from the content group B.
Further, by referring to the "age in months" (i.e., the time elapsed since birth) calculated from the current date and the user information 120, and to the information 202 of the content 121 (i.e., the information indicating the "age in months"), the selection part 112 specifies one or more contents satisfying the condition of the "age in months" from among the contents specified in the processing S202 (S203). In the present embodiment, for example, the control unit 100 calculates the child's age in months from the current date and the child's date of birth in the user information 120.
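The age-in-months calculation used in S203 can be sketched as follows; the function name and the "completed months" convention (the month counter only advances once the day-of-month of the birth date has passed) are illustrative assumptions.

```python
from datetime import date

def age_in_months(birth: date, today: date) -> int:
    """Compute a child's age in completed months from the date of birth
    and the current date (illustrative sketch)."""
    months = (today.year - birth.year) * 12 + (today.month - birth.month)
    if today.day < birth.day:
        months -= 1  # the current month is not yet complete
    return months
```

For a child born on 2020-01-15, the age is 6 months on 2020-07-20 but still 5 months on 2020-07-10.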
After executing the processing S203, the selection portion 112 specifies one or more contents satisfying the condition of "time band" from among the contents specified in the processing S203 by referring to the current time of day and the information 203 of the contents 121 (i.e., information indicating "time band") (S204).
By referring to the estimation information 123 and the information 204 of the content 121 (i.e., information indicating "tendency of mental state"), the selection section 112 specifies one or more contents satisfying the condition of "tendency of mental state" from among the contents specified in the processing S204 (S205).
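The successive narrowing steps S201 to S205 amount to filtering a content group on the items of information 200 to 204. The following sketch assumes each content is represented as a dictionary; the key names and function name are illustrative, not identifiers from the embodiment.

```python
def filter_candidates(contents, n_measurements, state, month_age, band, tendency):
    """Apply the conditions of S201-S205 in turn to a content group
    (illustrative sketch; dictionary keys are assumptions)."""
    return [
        c for c in contents
        if n_measurements >= c["min_measurements"]     # S201: information 200
        and c["state"] == state                        # S202: information 201
        and c["age_from"] <= month_age <= c["age_to"]  # S203: information 202
        and c["time_band"] == band                     # S204: information 203
        and c["tendency"] == tendency                  # S205: information 204
    ]
```

Each condition only ever removes candidates, so the result after S205 is the subset of the content group matching every item of added information.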
By referring to the information 205 indicating "the number of executions" of the content and the information 206 indicating "the emotion", the selection part 112 specifies one content presented to the user from the contents specified in the processing S205 (S206). An example of processing when the selection section 112 specifies one content will be described below.
In the case where the contents designated in the process S205 are the contents A10 to A15 of the parent content group A, the selection section 112 designates, for example, the content having the maximum value (number of executions + random number) among the values obtained by adding a generated random number to the value of the "number of executions" of each of the contents A10 to A15.
In the case where the contents designated in the process S205 are the contents B20 to B30 of the child content group B, the selection part 112 designates, for example, the content having the maximum value (number of executions + emotion + random number) among the values obtained by adding a generated random number to the sum of the value of the "number of executions" and the value of the "emotion" of each of the contents B20 to B30.
In the case where the contents specified in the process S205 are the contents C30 to C35 of the parent and child content group C, the selection part 112 specifies, for example, the content having the maximum value (number of executions + emotion + random number) among the values obtained by adding a generated random number to the sum of the value of the "number of executions" and the value of the "emotion" of each of the contents C30 to C35. Although a random number is used in the process S206, the use of a random number is merely an example, and other methods may be used to specify the content. More specifically, the selection part 112 may preferentially specify content that has newly become selectable, and, in the absence of such content, the selection part 112 may randomly select one content from among the contents for which the sum of the values of the "number of executions" and the "emotion" exceeds a predetermined value (for example, "0").
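The scoring rule in S206 can be sketched as follows. The dictionary keys, function name, and the injectable random source (so the behavior can be made deterministic) are illustrative assumptions; the score formula (number of executions, plus emotion for the interactive groups, plus a random number) follows the description above.

```python
import random

def pick_content(candidates, is_parent_group, rng=random.random):
    """Pick one content from the candidates remaining after S205 by taking
    the maximum score (illustrative sketch; keys are assumptions)."""
    def score(c):
        base = c["executions"]
        if not is_parent_group:       # child / parent-and-child groups
            base += c["emotion"]      # also weigh the child's emotion
        return base + rng()           # add a generated random number
    return max(candidates, key=score)
```

Passing a constant in place of `rng` removes the randomness, which makes the tie-breaking behavior easy to verify.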
Then, the selection section 112 selects one content designated in the processing S206 as a content to be presented to the user (S207).
After selecting the content in the process S207, the communication unit 102 in the server 21 transmits the estimation result of the psychological state obtained in the process S13, and the information on the selected content as shown in fig. 7 (S15). The "information related to the selected content" is, for example, information indicating a storage location of the selected content in the server 21. Instead of the above, the selected content itself may be transmitted to the terminal 20.
After the communication unit 36 in the terminal 20 receives the estimation result and the information related to the selected content, the display section 43 displays the received information and the like, and the acceptance section 44 performs processing to receive the information related to the execution of the selected content (S16). Since the details of the process S16 differ for "parent", "child", and "parent and child", these details will be described separately below.
Parent display-acceptance processing (S16a)
Fig. 13 is a flowchart showing an example of the parent display-acceptance process (S16 a). First, the first display unit 60 displays the parent screen 310 shown in fig. 14 on the display screen 33 (S300).
The parent screen 310 includes an image 630 representing the estimation result, an image 631, and buttons 632 to 635.
The image 630 representing the estimation result represents whether the psychological state of the parent (i.e., the measured person) is in the "relaxation mode" or in the "active mode" based on the estimation result. The image 630 also includes information obtained by quantifying the mental states (i.e., "relaxation level" and "activity level"). In the illustrated example, the "activity level" is displayed as "68%".
The image 631 is an input screen image for prompting the parent (i.e., the person being measured) to input his or her current feeling, and the image 631 includes the message "How do you feel now?".
Button 632 includes the expression "take child measurement" and button 632 is tapped to start measurement of child heart rate.
The button 633 includes the expression "view relaxation content", and the button 633 is tapped to display content for relaxing the parent, which is selected in the selection process S14 according to the result of parental measurement or the like.
The button 634 includes the expression "view overall result"; in the case where both the "parent" and "child" measurements have been made, tapping the button 634 displays the "parent and child" psychological states and the like on the display screen 33. The button 634 is selectable only after both the "child" measurement and the "parent" measurement have been taken, i.e., when one of the two measurements is taken and the other is taken thereafter. Thus, on the parent screen 310, the button 634 becomes selectable in the case where a "child" measurement has already been taken.
The button 635 includes the expression "end", and the button 635 is tapped to end the currently displayed screen 310 and display the home screen 300.
When the button 632 indicating "measure child" is pressed while the parent screen 310 is being displayed (in the processing S301 in fig. 13, "measure child"), the child measurement screen in the measurement processing S10 in fig. 8 described above is displayed (in the processing S105 in fig. 8). Thus, the person being measured can take a measurement of the child's heart rate after a parent measurement.
When the button 633 indicating "view relaxation content" is tapped ("relaxation content" in the processing S301 of fig. 13), the first display section 60 displays a screen 311 on the display screen 33, the screen 311 including the relaxation content 640a (first content) shown in fig. 15 (S302: first display processing (first display step)). The relaxation content 640a is a playable video with an expression to the effect of "have a hot drink and take a break". Watching the video and, for example, having a hot drink may allow the parent to relax.
Further, the first input screen image display section 63 displays the input screen image 640b on the display screen 33, and the input screen image 640b (first input screen image) prompts the user to input the information I1 related to the execution of the relaxation content 640a (S303: first input screen image display processing).
The input screen image 640b includes a message "How do you feel after the content is executed?", a pull-down menu 641, and buttons 642 and 643.
The pull-down menu 641 displays an option to allow the user to input (select) information I1 related to the execution of the relaxation content 640 a. As shown in fig. 16, the pull-down menu 641 displays an option "not to execute" and a plurality of options each representing not only execution of the content but also a psychological state of the parent. Options that each represent the psychological state of the parent after the content is executed are, for example, "very relaxed", "slightly relaxed", "neither relaxed nor anxious", "feeling somewhat anxious", and "feeling very anxious". By selecting one of these options, the state of the parent's mood after the content execution is recorded in detail.
The button 642 includes the expression "measure again", and the button 642 is tapped to make a parental measurement again, for example, to confirm the effect after the content is executed.
The button 643 includes the expression "end", and the button 643 is tapped to end the currently displayed screen 311 and display the home screen 300.
In the process S16a of fig. 13, for example, when the user selects one of the options from the pull-down menu 641 and then taps the button 643 indicating "end", the first accepting section 80 accepts information of the selected option in the pull-down menu 641 (S304), which is information related to the execution of the relaxation content 640 a.
Child display-acceptance processing (S16b)
Fig. 17 is a flowchart showing an example of the child display-acceptance process (S16 b). First, the second display unit 61 displays the child screen 312 shown in fig. 18 on the display screen 33 (S310).
The child screen 312 includes an image 650 representing the estimation result, an image 651, and buttons 652 to 655.
The image 650 representing the estimation result represents whether the psychological state of the child (i.e., the measured person) is "relaxed state" or "active state" based on the estimation result. The image 650 includes, in addition to the display of the estimated state ("relaxation"), a display of information of the measured heart rate ("heart rate: 65").
The image 651 is an input screen image for prompting the user to input the state (emotion) of the child (i.e., the person being measured), and the image 651 includes the message "How is the child's mood?". The image 651 provides a pull-down menu from which the child's mood (good mood, normal, or bad mood) before the interactive content is executed can be selected.
Button 652 includes the expression "take own measurements," and button 652 is tapped to begin measurement of the parent's heart rate.
The button 653 includes the expression "view interactive content", and the button 653 is tapped to display the content selected in the selection processing S14 based on the child measurement result or the like.
The button 654 includes the expression "view overall result"; in the case where both the "parent" and "child" measurements have been made, tapping the button 654 displays the "parent and child" psychological states and the like on the display screen 33. The button 654 is selectable only after both the "child" measurement and the "parent" measurement have been taken, i.e., when one of the two measurements is taken and the other is taken thereafter. Thus, on the child screen 312, the button 654 becomes selectable in the case where a "parent" measurement has already been taken.
The button 655 includes the expression "end", and the button 655 is tapped to end the currently displayed screen 312 and display the home screen 300.
If the button 652 indicating "perform self-measurement" is pressed while the child screen 312 is being displayed (i.e., "self-measurement" in the processing S311 of fig. 17), the parent measurement screen in the measurement processing S10 of fig. 8 described above is displayed (S102 of fig. 8). Thus, the person being measured can make a measurement of the heart rate of the parent (himself) after the child measurement.
When the button 653 indicating "view interactive content" is tapped ("interactive content" in the processing S311 of fig. 17), the second display section 61 displays the screen 313 on the display screen 33, the screen 313 including the interactive content 660a (second content) shown in fig. 19 (S312: second display processing (second display step)). The interactive content 660a displayed in the process S312 is "child content" selected from the child content group B. Further, the interactive content 660a is a playable video with the expression "diaper-change song". Watching the video and, for example, changing diapers allows the parent to interact with the child while relaxing.
Further, the second input screen image display section 64 and the fourth input screen image display section 66 display the input screen image 660b (i.e., the second input screen image and the fourth input screen image) on the display screen 33, the input screen image 660b being used to prompt the user to input the information I2 relating to the execution of the interactive content 660a and the information I4 relating to the state of the child (S313: the second input screen image display processing and the fourth input screen image display processing).
The input screen image 660b includes a message "What is the current mood state after the content is executed?", the pull-down menu 661, and buttons 662 to 664.
The pull-down menu 661 displays options to allow the user to input (select) information I2 related to the execution of the interactive content 660a and information I4 related to the state of the child when the interactive content 660a is executed. The options set in the pull-down menu 661 include an option "do not execute", and a plurality of options each indicating the psychological states of the parent and child when the content is executed.
Here, as shown in fig. 20, the options each indicating the psychological states of the parent and the child when the content is executed are, for example, combinations of the state of the parent (relaxed to anxious) and the state of the child (good mood to bad mood), such as "parent relaxed, child in a good mood", "parent relaxed, child normal", "parent relaxed, child in a bad mood", "parent normal, child in a good mood", and the like.
By selecting one of these options, information I2 on whether or not the content is executed and information I4 on the state of the mood of the child after the content is executed (including the state of the mood of the parent) can be recorded in detail.
The button 662 includes the expression "make own measurement", and the button 662 is tapped to make own (parent) measurement again, for example, to confirm the effect after the content execution.
The button 663 is tapped to start "image taking (photo taking)". Upon tapping of the button 663, the camera 34 is activated. By tapping the button 663 after the content is executed, the user can record the expression of the child, or the expressions of both the child and the parent, after the content is executed. The button 663 may include, for example, a message such as "Let's take a photo of parent and child after executing the content!" for prompting the user to take an image of the parent and the child.
Such a message is effective in actively prompting the user to take a photo after the content is executed.
The button 664 includes the expression "end", and the button 664 is tapped to end the currently displayed screen 313 and display the home screen 300.
In the process S16b of fig. 17, for example, when the user selects an option from the pull-down menu 661 and then taps the button 664 indicating "end", the second accepting section 81 and the fourth accepting section 83 accept not only the information I2 and the information I4 corresponding to the option in the pull-down menu 661, respectively, but also the captured image (in the case where an image was captured) (S314: the second accepting process and the fourth accepting process).
Parent-child display-acceptance process (S16c)
Fig. 21 is a flowchart showing an example of the parent-child display-acceptance process (S16 c). In the case where the measurement of the parental and child heart rates is performed and both the parental and child measurement results X1 and X2 are obtained, the parental-child display-acceptance process is performed (S16 c). More specifically, in the case where the "parent and child measurement" is performed, or in the case where the button 634 or 654 indicating "view overall result" is tapped after the measurement of one of the "parent" and "child" is performed and then the measurement of the other is performed, the parent-child display-acceptance process is performed (S16 c).
First, the third display portion 62 displays the parent- child screens 320a and 320b shown in fig. 22 and 23, respectively, on the display screen 33 (S320). For example, by scrolling the display screen 33, the screen 320b is displayed after the screen 320 a.
The parent-child screen 320a includes an image 700 representing the estimation result, and an image 701. The image 700 representing the estimation result represents whether the respective psychological states of the parent and the child are "relaxed state" or "active state" based on the estimation result.
The image 701 indicates which of the four categories ("parent: relaxed, child: active", "parent: active, child: relaxed", "parent and child: both relaxed" and "parent and child: both active") corresponds to a combination of the psychological states of the parent and child.
Screen 320b includes an image 710 and buttons 711 to 714. The image 710 is an example of an image that includes a suggestion for relaxing the parent. The image 710 includes, for example, the suggestion message "The parent is "active" and the child is "relaxed"; why not relax by drinking herbal tea or listening to music?".
The button 711 includes the expression "post", and the button 711 is tapped to post the estimation result or comment on SNS as described later, for example.
The button 712 includes the expression "relax", and the button 712 is tapped to display "relaxed content" depending on the "parent" estimation result when the heart rates of both the parent and the child are measured.
The button 713 includes the expression "interactive", and the button 713 is tapped to display "interactive contents" according to the "parent and child" estimation result when the heart rate of both the parent and child is measured.
The button 714 includes the expression "end", and the button 714 is tapped to end the currently displayed screen 320b and display the home screen 300.
If the button 712 indicating "relax" is tapped while the parent-child screen 320b is being displayed ("relax" in the processing S321 in fig. 21), the same processing steps S322 to S324 as the above-described processing steps S302 to S304 are executed. Therefore, detailed description of the steps of the processes S322 to S324 is omitted here.
On the other hand, when the button 713 showing "interactive" is tapped in the state where the parent-child screen 320b is displayed (the "interactive" in the processing S321 in fig. 21), the steps of the processing S325 to S327 similar to the steps of the processing S312 to S314 described above are executed.
More specifically, first, the third display unit 62 displays the screen 330 on the display screen 33, the screen 330 including the interactive content 670a shown in fig. 24 (S325: third display processing (third display step)).
Here, the interactive content 670a (third content) is a playable video with the expression "baby massage introduction video". The interactive content 670a displayed in the process S325 is "parent and child content" selected from the parent and child content group C. Further, the interactive content 670a "baby massage introduction video" in the present embodiment includes, for example, music generated based on the heart sounds of the parent and the child. The parent watches the video together with the child and, for example, massages the child, which may make the child more relaxed.
Further, the third input screen image display section 65 and the fifth input screen image display section 67 display the input screen images 670b (i.e., the third input screen image and the fifth input screen image) on the display screen 33, the input screen images 670b being used to prompt the user to input the information I3 relating to the execution of the interactive content 670a and the information I5 relating to the state of the child (S326: the third input screen image display processing and the fifth input screen image display processing).
The input screen image 670b includes a message "What is the current mood state after the content is executed?", the pull-down menu 671, and buttons 672 to 674.
The pull-down menu 671 displays options for allowing the user to input (select) information I3 related to the execution of the interactive content 670a and information I5 related to the state of the child when the interactive content 670a is executed. The options set in the pull-down menu 671 include an option "not to execute" and a plurality of options each indicating the psychological states of the parent and child after the execution of the content. The pull-down menu 671 is the same as the pull-down menu 661 shown in FIG. 20, and the buttons 672 to 674 are the same as the buttons 662 to 664, respectively. Therefore, the detailed description of the pull-down menu 671 and the buttons 672 to 674 is omitted here.
In the process S16c of fig. 21, for example, in the case where the user selects an option from the pull-down menu 671 and then taps the button 674 indicating "end", the third accepting section 82 and the fifth accepting section 84 accept not only the information I3 and the information I5 corresponding to the option in the pull-down menu 671, respectively, but also the captured image (in the case where an image was captured) (S327: the third accepting process and the fifth accepting process).
In fig. 7, when the above-described process S16 (i.e., the "parent" process S16a, the "child" process S16b, or the "parent and child" process S16c) is executed, the control unit 31 determines whether the information obtained in the process S16 (the information items I1 to I5 and any captured image; hereinafter referred to as "acceptance information") is present (S17). If there is no acceptance information (no in S17), the control unit 31 ends the series of processing.
On the other hand, in the case where there is acceptance information (yes in S17), the communication unit 36 transmits the acceptance information to the server 21 (S18).
The control unit 100 in the server 21 acquires the acceptance information and stores the acceptance information in the storage unit 101 (S19). More specifically, the control unit 100 updates the information 205 of the content group a (which indicates "number of executions") according to the information I1 in the acceptance information, updates the information 205 of the content group B according to the information I2 in the acceptance information, and updates the information 205 of the content group C according to the information I3 in the acceptance information.
Further, the control unit 100 updates the information 206 of the content group B (which represents "emotion") according to the information I4 in the acceptance information, and updates the information 206 of the content group C according to the information I5 in the acceptance information.
In addition, the control unit 100 stores the captured image in the acceptance information in the storage unit 101, and updates the captured image 124.
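The updates performed in the process S19 can be sketched as follows. The data shapes here are assumptions made only for illustration: content records are keyed by id, the information I1 to I3 is carried as a content id, and the information I4/I5 as a (content id, emotion delta) pair; the patent does not specify these representations.

```python
def apply_acceptance_info(groups, acceptance):
    """Update per-content "number of executions" (information 205) and
    "emotion" (information 206) from the acceptance information, as in
    the process S19.

    groups: {"A": {...}, "B": {...}, "C": {...}}, each mapping a content
    id to a record such as {"executions": int, "emotion": int}.
    acceptance: dict that may contain entries I1..I5 (shapes assumed).
    """
    # I1/I2/I3 indicate which content in group A/B/C was executed.
    for info_key, group_key in (("I1", "A"), ("I2", "B"), ("I3", "C")):
        content_id = acceptance.get(info_key)
        if content_id is not None:
            groups[group_key][content_id]["executions"] += 1
    # I4/I5 carry the child's state after group-B / group-C content.
    for info_key, group_key in (("I4", "B"), ("I5", "C")):
        entry = acceptance.get(info_key)
        if entry is not None:
            content_id, emotion_delta = entry
            groups[group_key][content_id]["emotion"] += emotion_delta
    return groups
```

Because the updated "executions" and "emotion" values feed back into the scoring of the process S206, contents that are executed often and leave the child in a good mood become more likely to be presented again.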
Then, the control unit 100 determines whether the acceptance information includes the information I2 or I3 indicating the execution of "interactive content" in the content group B or C, or a captured image (S20).
In the case where neither such information nor a captured image is included in the acceptance information (no in S20), the series of processes is ended.
On the other hand, in the case where the information I2 or I3 indicating the execution of "interactive content" or a captured image is included in the acceptance information (yes in S20), the index calculation unit 113 calculates the "interaction points" (S21: index calculation processing).
More specifically, as described above, in the case where the information I2 or I3 indicating the execution of "interactive content" is obtained, the index calculation section 113 increases the "interaction points" by "1". In another example, in the case where both the parent and the child are included in a captured image, the index calculation section 113 increases the "interaction points" by "2".
After the index calculation section 113 calculates the "interaction points", the communication unit 102 transmits the calculated "interaction points" to the terminal 20 (S22).
The index display unit 68 displays, on the display screen 33, the home screen 300 including the image 500 showing the received "interaction points" (S23). Thus, whenever any "interactive content" in the content groups B and C is executed, the "interaction points" on the home screen 300 increase. Accordingly, the user's motivation to continue interacting with the child by executing the contents displayed on the display screen 33 can be increased.
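The point calculation of the process S21 can be sketched as follows. Note that the description above gives the "+1" and "+2" increments as separate examples; treating them as additive in one call is an assumption made here for illustration.

```python
def interaction_points_delta(interactive_executed, parent_and_child_in_photo):
    """Increment of the "interaction points" computed in the process S21.

    +1 when information I2 or I3 indicates that "interactive content"
    (content group B or C) was executed; +2 when the captured photo
    includes both parent and child. Whether the two increments stack is
    an assumption, not stated in the embodiment.
    """
    delta = 0
    if interactive_executed:
        delta += 1
    if parent_and_child_in_photo:
        delta += 2
    return delta
```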
SNS publication processing
Fig. 25 is a flowchart showing an example of processing when the user posts the estimation result or the like on the SNS. In the present embodiment, it is assumed that, for example, when a "parent and child" measurement is performed, the user can post the estimation result and various types of information on the SNS.
First, if the user taps the button 711 indicating "post" on the screen 320b shown in fig. 23, which is displayed when the "parent and child" measurement is performed, the display section 43 displays the posting screen 350 on the display screen 33 (S500).
As shown in fig. 26, the posting screen 350 includes an image 700, an image 701, a text input field 750, an image selection field 751, and a button 752. The images 700 and 701 are the same as those displayed in fig. 22.
The text input field 750 is a field in which the user inputs comments after measurement. The user may, for example, input impressions, effects, reactions of children, etc. after executing the "interactive content".
The image selection field 751 is a field in which the user can select an image (photograph) taken along with a child, for example, after measurement.
The button 752 includes the expression "post", and the button 752 is tapped to post the information input to the text input field 750 and the image selected in the image selection field 751, together with the images 700 and 701, to the server 21. The information input to the text input field 750 and the image selected in the image selection field 751 each correspond to second input information.
When the user taps the button 752 indicating "post", the accepting section 44 accepts posting information including the images 700 and 701, the text input to the text input field 750, and the image selected in the image selection field 751 (S501).
Upon accepting the posting information in the process S501, the communication unit 36 transmits the posting information to the server 21 (S502).
Then, the server 21 stores the received posting information in the storage unit 101 (S503). Thus, the posting information 125 in the storage unit 101 is updated.
Fig. 27 illustrates an example of an SNS screen 360 displaying the posted information. The screen 360 includes a posted image 760 of the user and an icon 773.
Image 760 displays information posted by the user and includes textual information 770 entered by the user (i.e., "very relaxed"), an image 771 taken with the child selected by the user, and an image 772 representing the results of the parent and child measurements.
The icon 773 includes the expression "like", and is tapped by a viewer who likes the posted information. Each time the icon 773 is tapped, the number of "like" points (currently "0") increases by "1".
The display section 43 in the present embodiment may display the pieces of posted information in descending order of their "like" points, for example. Therefore, SNS users can easily view posted information that is effective in relaxing children, and can effectively relieve the stress of raising their children.
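The descending-order display can be sketched as follows; a minimal illustration in which the `likes` field name and the list-of-dicts shape are assumptions, not the patent's actual data model.

```python
def order_posts_by_likes(posts):
    """Return the pieces of posted information in descending order of
    their "like" points, as the display section 43 may display them.
    Each post is assumed to be a dict with a "likes" count."""
    return sorted(posts, key=lambda post: post["likes"], reverse=True)
```

`sorted` is stable, so posts with equal "like" points keep their original relative order.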
Further, when the "like" points added to a piece of posted information reach a predetermined level, the posting information processing section 114 in the server 21 transmits a comment-creation notification to the user who posted that information. The comment-creation notification is, for example, "The information you posted is very helpful. Please upload more details to the SNS."
By transmitting the above notification to the user, information more useful for raising children will be posted on the SNS.
Although the icon 773 in this embodiment includes the expression "like", the expression may be replaced with, for example, an icon representing empathy (such as "I understand!") or an icon representing encouragement (such as "Keep it up!").
With the SNS in the present embodiment, there is no field for inputting comments on posted information. Thus, input of comments including defamation and slander related to the posted information can be prevented.
Experiment for confirming correlation between heart rate and emotion (psychological state)
In the present embodiment, the psychological state of the person being measured is estimated based on his or her heart rate. An experiment E demonstrating the correlation between the heart rate variability (HRV) of the person being measured and the "emotion" representing his or her psychological state will be described below.
Object
The subjects of experiment E were 19 mother-child pairs. The average age of the 19 mothers was 30.4 years, with ages ranging from 20 to 38 years. The average age of the 19 children was 5.7 months, with ages ranging from 3 to 8 months after birth.
Heart rate variability
In experiment E, the mothers measured their own and their children's heart rate variability several times a day over two weeks. The heart rate variability is obtained from the following equation:
Nu=LF/(LF+HF)×100
wherein: "Nu" denotes the heart rate variability value in normalized units (e.g., %), "LF" denotes the low-frequency component of the measured heart rate variability (e.g., 0.004-0.15 Hz), and "HF" denotes the high-frequency component (e.g., 0.15-0.4 Hz).
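The equation above is a direct ratio of spectral powers and can be computed as follows, given the LF and HF component powers obtained from frequency analysis of the heart rate signal:

```python
def hrv_normalized(lf_power, hf_power):
    """Nu = LF / (LF + HF) x 100: the normalized low-frequency power of
    heart rate variability used as the index in experiment E."""
    return lf_power / (lf_power + hf_power) * 100.0
```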
Furthermore, in experiment E, each mother recorded her own mood and her child's mood, for example, when measuring heart rate variability. Fig. 28 is an explanatory diagram of the options for the mother's mood and the options for the child's mood. The mood of each of the mother and the child is selected from five options. The options for the mother's mood are "1. very relaxed", "2. relaxed", "3. normal", "4. anxious", and "5. very anxious". The options for the child's mood are "1. very good mood", "2. good mood", "3. normal", "4. bad mood", and "5. very bad mood".
As an example, each time heart rate variability is measured, the mother records her own mood and her child's mood, respectively, by selecting one of the options in the respective groups. The options shown in fig. 28 may be displayed on the display screen 33 of the terminal that mother wants to select, for example, and the selection result may be recorded in the storage unit 30.
Analysis results
FIG. 29 is a diagram illustrating the analysis results of experiment E. A multilevel structural equation model (SEM) was applied to the measured heart rate variability, the mother's emotion, and the child's emotion; fig. 29 shows the results estimated using the multilevel SEM.
The upper area of fig. 29 represents the relationship model at the between-person level (the so-called "Between" level). The lower area of fig. 29 represents the relationship model at the within-person level (the so-called "Within" level).
As seen from the Between-level model, the mother's heart rate variability (Nu%) has a significant effect on the mother's emotion. In other words, the heart rate variability of the person being measured significantly affects the "emotion" representing his or her psychological state. Therefore, as in the present embodiment, the psychological state of the person being measured can be easily estimated by measuring his or her heart rate.
Furthermore, as seen from the Within-level model, the child's emotion has a significant effect on the mother's emotion. Thus, for example, when child content is presented and the child's mood improves, the mother's mood is also likely to improve. Therefore, with the computer system 10 of the present embodiment, which presents content corresponding to the psychological state of the person being measured, both mother and child can feel better.
OTHER EMBODIMENTS
Although the embodiments of the present invention have been described above, the above-described embodiments of the present invention are merely to facilitate the understanding of the present invention and should not be construed as limiting the present invention in any way. The present invention may be variously changed or modified without departing from the gist thereof, and includes equivalents thereof. For example, modifications to be described below are possible.
In the present embodiment, although it is assumed that the guardian is a parent (i.e., the father or mother) of the child, the guardian may be, for example, a grandfather, a grandmother, a brother, a sister, or an unrelated caregiver. In such cases, the information included in the contents shown in fig. 5 may include a "guardian type" (the relationship between the guardian and the child) so that contents corresponding to the guardian are displayed. The content to be displayed may vary according to the guardian type: for example, more active, physical content may be presented if the guardian type is "father", and content that the user can take part in while seated may be presented if the guardian type is "grandfather" or "grandmother".
In the present embodiment, although the anxiety state or the relaxed state is estimated by comparing the heart rate with the threshold value, the present invention is not limited to such a case. The state of the measured person may be judged based on the result of frequency analysis such as "strength of autonomic nerve" and/or heart rate.
In the present embodiment, although the parent measurement screen 302 and the child measurement screen 303 are displayed separately when parent and child measurements are taken, the present invention is not limited to such a case; it suffices to display a single screen (not shown) including a message such as "measuring continuously". Since the parent heart rate is generally higher than the child heart rate, the measurement section 40 may, for example, recognize the higher of two continuously measured heart rates as the parent heart rate and the lower as the child heart rate. Alternatively, the measurement section 40 may be trained in advance using a large amount of parent and child measurement data as teaching data, and the heart rates of the parent and the child may be discriminated using the trained model.
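The magnitude heuristic described above can be sketched as follows; the rule (higher rate is the parent's) is quoted from the embodiment as-is, and, as noted, a model pre-trained on parent/child measurement data could replace it.

```python
def split_parent_child(rate_a, rate_b):
    """Separate two continuously measured heart rates into
    (parent_rate, child_rate): per the rule described above, the higher
    rate is treated as the parent's and the lower as the child's."""
    return (max(rate_a, rate_b), min(rate_a, rate_b))
```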
In another example, the index calculation section 113 may increase the "interaction points" based not on the images of the parent and the child themselves but on information indicating whether both the parent and the child appear in a captured image. For example, the control unit 31 may determine whether two persons are included in the captured image (i.e., whether two heads are present in the captured image), and thereby determine whether the parent and the child were captured together.
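The two-heads check can be sketched by abstracting the detection step (performed by the control unit 31 in the embodiment) into a list of head detections. The bounding-box format is an assumption for illustration:

```python
# Sketch of the two-heads check for deciding whether a captured image
# shows both parent and child. Head detection itself is abstracted away;
# head_boxes is assumed to be a list of (x, y, w, h) bounding boxes
# produced by some detector (an illustrative assumption).
def both_parent_and_child_captured(head_boxes):
    """Return True if at least two heads were detected in the image."""
    return len(head_boxes) >= 2
```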
In still another example, the index calculation section 113 may increase the "interaction points" when the estimated psychological state of the parent and that of the child are consistent at the time of measurement, that is, when both the parent and the child are in the "active state" or both are in the "relaxed state". Generally, a situation in which the psychological states of the parent and the child are consistent is preferable to one in which they differ. By having the index calculation section 113 increase the "interaction points" in view of this, parents' motivation for child-rearing can be increased.
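The matching-state bonus can be sketched as follows. The point values are illustrative assumptions; the embodiment only says the points are increased when the states agree:

```python
# Sketch of the interaction-point bonus for matching psychological states.
# The base points and bonus amount are illustrative assumptions.
MATCH_BONUS = 5
BASE_POINTS = 1

def interaction_points_for_measurement(parent_state, child_state):
    """Award base points, plus a bonus when parent and child states agree."""
    points = BASE_POINTS
    if parent_state == child_state and parent_state in ("active", "relaxed"):
        points += MATCH_BONUS
    return points
```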
In addition to the diaper-changing song, contents suitable for other care scenes, such as bathing, eating, and changing clothes, may be prepared in advance as contents to be output; the time of day at which each of these contents is output may be recorded, and contents suited to the user's care rhythm may be displayed at the next and subsequent outputs. For example, when the changing-clothes content has been output more than a predetermined number of times in the 8:00 to 9:00 morning time slot, the changing-clothes content may be preferentially displayed in the morning time slot. Conversely, when the bathing content has never been output in the morning time slot, the bathing content may be controlled not to be displayed in that time slot.
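The care-rhythm logic above can be sketched by counting past outputs per time slot. The hour-level slot granularity and the threshold count are illustrative assumptions:

```python
# Sketch of care-rhythm-based content prioritization. Output history is
# a list of (content_kind, hour) records; a kind output at least
# min_count times in the current slot is preferred, and a kind never
# output in that slot is suppressed. Slot granularity (one hour) and the
# threshold are illustrative assumptions.
from collections import Counter

def prioritize_for_hour(history, hour, kinds, min_count=3):
    """Return (preferred_kinds, suppressed_kinds) for the given hour slot."""
    counts = Counter(kind for kind, h in history if h == hour)
    preferred = [k for k in kinds if counts[k] >= min_count]
    suppressed = [k for k in kinds if counts[k] == 0]
    return preferred, suppressed
```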
The ranking of the "interaction points" or "number of interactions" for multiple users may be displayed. The ranking may be in descending order of the "interaction points" awarded to each user, in descending order of the "number of interactions", or may be prepared from both (e.g., a score obtained by adding or multiplying the points and the number of times). Further, the ranking may be aggregated using the interaction points and/or the number of interactions per specific time period (e.g., weekly or monthly). The manner of display is not limited to simply listing the ranking; the ranking position of the relevant user may be displayed at all times (e.g., as a pop-up window) while the ranking is shown. By displaying the ranking, each user can confirm the interaction points and number of interactions of other users and can be encouraged to perform heart rate measurements and interactions with the child more often. Here, the display unit 43 displays the ranking on the display screen 33 based on, for example, the "interaction points" or the like.
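One of the combined-score options mentioned above (adding points and counts) can be sketched as follows. The additive combination and equal weighting are illustrative assumptions:

```python
# Sketch of the ranking computation: users ordered best-first by a
# combined score of "interaction points" plus "number of interactions"
# (the additive option mentioned in the text; weights are assumptions).
def rank_users(stats):
    """stats: {user: (points, count)}; return users best-first by points + count."""
    return sorted(stats, key=lambda u: stats[u][0] + stats[u][1], reverse=True)
```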
In the present embodiment, the program causes the computer system 10, which is an example of a computer and includes the terminal 20 and the server 21, to execute various processes such as display of contents, but the computer is not limited to the computer system 10. For example, the computer may be constituted by the terminal alone, by a server alone that includes a display screen, or by the terminal and a plurality of servers. Further, although the program that performs display of contents and the like is described in the present embodiment such that the terminal 20 and the server 21 together realize the functional blocks for selecting and displaying contents, the present invention is not limited to such a case. For example, the program may be stored in only one of the terminal 20 and the server 21, and only the device storing the program may be caused to implement the functional blocks for selecting and displaying contents.
Further, the program may be supplied to the computer using a non-transitory computer-readable medium storing the executable program. Examples of non-transitory computer-readable media include magnetic recording media (such as floppy disks, magnetic tapes, and hard disk drives), CD-ROMs (read-only memories), and the like.

Claims (21)

1. A computer storage medium storing a program that, when executed by a computer, causes the computer to perform:
a first acquisition process for acquiring a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second acquisition process for acquiring a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display process for displaying a first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display process of displaying second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display process of displaying a third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
2. The computer storage medium of claim 1,
the first display processing displays the first content on the display screen,
the first content is selected from the first content group in which the number of selectable contents increases as the number of times the first measurement result is acquired increases, according to the first measurement result.
3. The computer storage medium of claim 1 or 2,
the second display processing displays the second content on the display screen,
the second content is selected from the second content group in which the number of selectable contents increases as the number of times the second measurement result is acquired increases, according to the second measurement result.
4. The computer storage medium of claim 1 or 2,
the third display processing displays the third content on the display screen,
the third content is selected from the third content group in which the number of selectable contents increases as the number of times the first measurement result and the second measurement result are collectively acquired increases, according to the first measurement result and the second measurement result.
5. The computer storage medium of claim 1 or 2,
the first display processing displays the first content on the display screen,
the first content is selected from the first content group according to the first measurement, a time period after birth of the child, and a time of day.
6. The computer storage medium of claim 1 or 2,
the second display processing displays the second content on the display screen,
the second content is selected from the second content group according to the second measurement, a time period after birth of the child, and a time of day.
7. The computer storage medium of claim 1 or 2,
the third display processing displays the third content on the display screen,
the third content is selected from the third content group according to the first and second measurements, a time period after birth of the child, and a time of day.
8. The computer storage medium of claim 5,
the first display processing displays the first content on the display screen,
the first content is selected from the first content group based on the first measurement, a time period after birth of the child, a time of day, and past data of the first measurement.
9. The computer storage medium of claim 5,
the second display processing displays the second content on the display screen,
the second content is selected from the second content group based on the second measurement, a time period after birth of the child, a time of day, and past data of the first measurement.
10. The computer storage medium of claim 5,
the third display processing displays the third content on the display screen,
the third content is selected from the third content group based on the first and second measurements, a time period after birth of the child, a time of day, and past data of the first measurement.
11. The computer storage medium of claim 5,
the program, when executed by the computer, causes the computer to further perform:
a first input screen image display process for displaying a first input screen image on the display screen,
the first input screen image is used to prompt the user to input first information,
the first information is information related to execution of content displayed on the display screen, the content belonging to a plurality of contents included in the first content group; and
a first acceptance process of accepting the input first information in association with content displayed on the display screen,
the first display processing displays the first content on the display screen,
the first content is selected from the first content group based on the first measurement, a time period after birth of the child, a time of day, and the first information.
12. The computer storage medium of claim 5,
the program, when executed by the computer, causes the computer to further perform:
a second input screen image display process for displaying a second input screen image on the display screen,
the second input screen image is used to prompt the user to input second information,
the second information is information related to execution of content displayed on the display screen, the content belonging to a plurality of contents included in the second content group; and
a second acceptance process of accepting the input second information in association with the content displayed on the display screen, the second display process displaying the second content on the display screen,
the second content is selected from the second content group according to the second measurement, a time period after birth of the child, a time of day, and the second information.
13. The computer storage medium of claim 5,
the program, when executed by the computer, causes the computer to further perform:
a third input screen image display process of displaying a third input screen image on the display screen,
the third input screen image is used to prompt the user to input third information,
the third information is information related to execution of content displayed on the display screen, the content belonging to a plurality of contents included in the third content group; and
a third acceptance process of accepting the input third information in association with the content displayed on the display screen,
the third display processing displays the third content on the display screen,
the third content is selected from the third group of content based on the first and second measurements, a time period after birth of the child, a time of day, and the third information.
14. The computer storage medium of claim 5,
the program, when executed by the computer, causes the computer to further perform:
a fourth input screen image display process of displaying a fourth input screen image on the display screen,
the fourth input screen image is used to prompt the user to input fourth information,
the fourth information is information on a state of the child when executing content displayed on the display screen, the content belonging to a plurality of contents included in the second content group; and
a fourth acceptance process of accepting the input fourth information in association with the content displayed on the display screen,
the second display processing displays the second content on the display screen,
the second content is selected from the second content group according to the second measurement, a time period after birth of the child, a time of day, and the fourth information.
15. The computer storage medium of claim 5,
the program, when executed by the computer, causes the computer to further perform:
a fifth input screen image display process of displaying a fifth input screen image on the display screen,
the fifth input screen image is used to prompt the user to input fifth information,
the fifth information is information on a state of the child when executing content displayed on the display screen, the content belonging to a plurality of contents included in the third content group; and
a fifth acceptance process of accepting the input fifth information in association with the content displayed on the display screen,
the third display processing displays the third content on the display screen,
the third content is selected from the third content group according to the first and second measurements, a time period after birth of the child, a time of day, and the fifth information.
16. The computer storage medium of claim 1 or 2,
the program, when executed by the computer, causes the computer to further perform:
an image display process of displaying a first image, a second image, and a third image on the display screen,
the first image is used to prompt the user to start measuring the guardian's heart rate,
the second image is for prompting a user to begin measuring the child's heart rate, an
The third image is used to prompt the user to start measuring the guardian's heart rate and the child's heart rate together.
17. The computer storage medium of claim 1 or 2,
music created based on heart sounds of the guardian and the child is included as content in at least any one of the second content group and the third content group.
18. The computer storage medium of claim 1 or 2,
the program, when executed by the computer, causes the computer to further perform:
an index calculation process of calculating an index based on the first input information,
the indicator represents a level of interaction between the guardian and the child,
the first input information is input after the second content or the third content is displayed on the display screen, and the second content or the third content is used for prompting the interaction between the guardian and the child; and
an index display process of displaying the index on the display screen.
19. The computer storage medium of claim 1 or 2,
the program, when executed by the computer, causes the computer to further perform:
a first estimation process for estimating a psychological state of the guardian based on the first measurement result;
a second estimation process for estimating a psychological state of the child based on the second measurement result; and
a transmission process of transmitting estimation results of the first estimation process and the second estimation process and second input information to a server,
the server providing a social networking service,
the second input information is input in association with the estimation result.
20. A content display method, comprising:
a first obtaining step of obtaining a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second obtaining step of obtaining a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display step of displaying a first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display step of displaying second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display step of displaying a third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
21. A content display apparatus comprising:
a first acquisition section configured to acquire a first measurement result,
the first measurement represents a measurement of a guardian's heart rate;
a second acquisition section configured to acquire a second measurement result,
the second measurement represents a measurement of the heart rate of the guardian's child;
a first display section configured to display first content on a display screen,
the first content is selected from a first group of content according to the first measurement;
a second display section configured to display second content on the display screen,
the second content is selected from a second group of content according to the second measurement; and
a third display section configured to display third content on the display screen,
the third content is selected from a third content group according to the first measurement result and the second measurement result which are obtained together.
CN202010187334.XA 2019-03-19 2020-03-17 Computer storage medium, content display method, and content display apparatus Pending CN111714108A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-051932 2019-03-19
JP2019051932A JP7125908B2 (en) 2019-03-19 2019-03-19 Program, content display method, and computer

Publications (1)

Publication Number Publication Date
CN111714108A true CN111714108A (en) 2020-09-29

Family

ID=72559112

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010187334.XA Pending CN111714108A (en) 2019-03-19 2020-03-17 Computer storage medium, content display method, and content display apparatus

Country Status (2)

Country Link
JP (1) JP7125908B2 (en)
CN (1) CN111714108A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003290155A (en) * 2002-03-29 2003-10-14 Toshiba Corp System and device for life support
JP2014502454A (en) * 2010-11-12 2014-01-30 マイクロソフト コーポレーション Present and customize content based on audience
JP2015054002A (en) * 2013-09-11 2015-03-23 株式会社日立システムズ Examination system for fatigue and stress
CN104739369A (en) * 2013-12-31 2015-07-01 深圳市劲升迪龙科技发展有限公司 Baby health and emotion state monitoring device
WO2016121848A1 (en) * 2015-01-28 2016-08-04 株式会社野村総合研究所 Health care system
CN107106033A (en) * 2014-12-26 2017-08-29 尤妮佳股份有限公司 The program of childcare support, childcare support method, childcare to support system and infant's monitoring device
JP2017198866A (en) * 2016-04-27 2017-11-02 ユニ・チャーム株式会社 Support method, support system, and program
CN107735137A (en) * 2015-06-30 2018-02-23 尤妮佳股份有限公司 For supporting the program of pregnant woman, pregnant woman to support method and pregnant woman to support system
KR20180087898A (en) * 2017-01-25 2018-08-03 한양대학교 에리카산학협력단 Condition transmission system for infant

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009087074A (en) * 2007-09-28 2009-04-23 Panasonic Electric Works Co Ltd Equipment control system
JP2015102851A (en) * 2013-11-28 2015-06-04 パイオニア株式会社 Voice output device, control method for voice output device, and program
JP2016048495A (en) * 2014-08-28 2016-04-07 京セラ株式会社 Portable terminal, recommendation program, recommendation system, and recommendation method
JP6737342B2 (en) * 2016-10-31 2020-08-05 ヤマハ株式会社 Signal processing device and signal processing method
JP2018068962A (en) * 2016-11-04 2018-05-10 国立大学法人大阪大学 Sound sleep device
JP6483180B2 (en) * 2017-03-30 2019-03-13 優介 庄子 Training menu presentation system and training menu presentation program


Also Published As

Publication number Publication date
JP2020154645A (en) 2020-09-24
JP7125908B2 (en) 2022-08-25

Similar Documents

Publication Publication Date Title
US10943407B1 (en) XR health platform, system and method
US10922996B2 (en) Systems and methods for generating a presentation of an energy level based on sleep and daily activity
US10170016B2 (en) Comprehensive management of human health
Coorevits et al. The rise and fall of wearable fitness trackers
Mackintosh et al. Parental perspectives of a wearable activity tracker for children younger than 13 years: acceptability and usability study
US20170055899A1 (en) Visualizing, Scoring, Recording, and Analyzing Sleep Data and Hypnograms
Ejupi et al. A kinect and inertial sensor-based system for the self-assessment of fall risk: A home-based study in older people
CN108446013A (en) Respiration sequence user interface
WO2018117663A1 (en) A method of allowing a user to receive information associated with a goal
US20150170531A1 (en) Method for communicating wellness-related communications to a user
US11189192B2 (en) Digital apparatus and application for treating myopia
US20140279728A1 (en) System and Method for Caring for a Person Afflicted with Dementia
JP5789735B2 (en) Content evaluation apparatus, method, and program thereof
Moreno-Gutierrez et al. ATOPE+: an mHealth system to support personalized therapeutic exercise interventions in patients with cancer
US20220375572A1 (en) Iterative generation of instructions for treating a sleep condition
Harding et al. Evaluation of relations between specific antecedent stimuli and self-injury during functional analysis conditions
US20190117149A1 (en) Assist method, assist system and program
JP2021026556A (en) Lifestyle modification support device, terminal device, computer program, and lifestyle modification support method
CN111714108A (en) Computer storage medium, content display method, and content display apparatus
CN107205646A (en) System and method for monitoring and promoting baby to take exercise
US20230120262A1 (en) Method for Improving the Success of Immediate Wellbeing Interventions to Achieve a Desired Emotional State
Tsang Using wearable sensors for physical activity measurement and promotion in manual wheelchair users
JPWO2019130488A1 (en) Programs used to support parents, parent support methods, and parent support systems
Drijfhout Improving eSports performance: conducting stress measurements during Fifa gameplay
US20230170074A1 (en) Systems and methods for automated behavioral activation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination