CN110427155B - Nursing method and device - Google Patents


Info

Publication number
CN110427155B
CN110427155B (application CN201910804691.3A, filed as CN201910804691A)
Authority
CN
China
Prior art keywords
user
preset
flexible screen
frequency
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910804691.3A
Other languages
Chinese (zh)
Other versions
CN110427155A (en)
Inventor
张楚楚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201910804691.3A
Publication of CN110427155A
Application granted
Publication of CN110427155B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 - Facial expression recognition
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/04 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 - Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 - Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04102 - Flexible digitiser, i.e. constructional details for allowing the whole digitising part of a device to be flexed or rolled like a sheet of paper
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/178 - Human faces, e.g. facial parts, sketches or expressions estimating age from face image; using age information for improving recognition
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M2250/00 - Details of telephonic subscriber devices
    • H04M2250/12 - Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Abstract

The invention provides a nursing method and device. The method is applied to a mobile terminal with a flexible screen and may include: receiving a first input of a user; and, in response to the first input, performing a preset soothing operation on the user, where the soothing operation includes changing the form of the flexible screen. With this scheme, timely and effective soothing can be performed according to the reaction of the care subject, so that the caregiver has more time for other affairs, time and economic cost are saved, and efficiency is improved.

Description

Nursing method and device
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a nursing method and a nursing device.
Background
With the continuous development of modern society, people's work and life are increasingly busy. This becomes even harder when the family includes a newborn or a member who needs care because of old age or illness: people must handle work and household affairs while also taking care of the baby or the member who needs care.
When people are busy with other affairs, a baby that wakes up and finds no parent nearby will cry more than usual, and prolonged crying harms the baby's physical and mental health and disturbs people around it. Likewise, other members who need care may feel low when they do not receive sufficient attention, which can even hinder their recovery. A method for soothing a baby or another member who needs care in a timely manner is therefore urgently needed.
Disclosure of Invention
Embodiments of the present invention provide a nursing method and a nursing device to solve the current problems of high time cost and low efficiency in effectively caring for a baby or another family member who needs care.
In order to solve the above technical problem, in a first aspect, the present invention provides a nursing method, which may include:
a first input is received from a user.
In response to the first input, a preset soothing operation is performed on the user, where the soothing operation includes changing the form of the flexible screen.
In a second aspect, embodiments of the present invention also provide a nursing device, which may include:
the receiving module is used for receiving a first input of a user.
a pacifying module, configured to perform a preset soothing operation on the user in response to the first input, where the soothing operation includes changing the form of the flexible screen.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the aforementioned nursing method.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the aforementioned nursing method.
In the embodiments of the present invention, a nursing method is provided in which a first input of a care subject is received and the care subject is soothed according to the first input, where the soothing operation is realized by changing the form of a flexible screen. Correspondingly effective soothing can thus be performed according to the first input of the care subject, so that the caregiver has more time for other affairs, and the caregiver's time, energy, and economic cost are saved.
Drawings
FIG. 1 is a flow chart of the steps of a method of nursing in accordance with an embodiment of the present invention;
FIG. 2 is a flow chart illustrating the steps of a method of nursing according to an embodiment of the present invention;
FIG. 3 is a flow chart of the steps involved in another method of nursing according to an embodiment of the present invention;
fig. 4 is a block diagram showing a specific structure of a nursing device according to an embodiment of the present invention;
fig. 5 is a block diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flowchart illustrating steps of a nursing method in an embodiment of the present invention is shown, where the method is applied to a mobile terminal having a flexible screen, and as shown in fig. 1, the method may include:
step 101: a first input is received from a user.
In this embodiment, optionally, a preset operation may be defined that triggers the mobile terminal to enter a care mode before the first input is received. After the preset operation is received, the mobile terminal enters the care mode in order to soothe the user. Information about the user may then be collected, and whether the user is a care subject is determined from that information: when the user is identified as the care subject, the first input of the user is received; when the user is identified as not being the care subject, the information is collected again or the care mode is exited.
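The entry-and-identification flow above can be sketched as a small decision function (the function name and the action strings are illustrative, not from the patent):

```python
# Sketch of the care-mode flow: a preset operation switches the terminal
# into care mode, the user is then identified, and a non-care-subject
# triggers re-collection or exit. All identifiers here are illustrative.

def care_mode_step(in_care_mode, preset_operation_received, user_is_care_subject):
    """Return the next action for the terminal.

    user_is_care_subject may be None while identity is still undetermined.
    """
    if not in_care_mode:
        return "enter_care_mode" if preset_operation_received else "idle"
    if user_is_care_subject is None:
        return "collect_user_info"
    if user_is_care_subject:
        return "receive_first_input"
    return "recollect_or_exit"
```
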
In this embodiment, the care subject may be a person who cannot live independently, such as an infant, a toddler, an elderly person, or a patient. The caregiver may store information about the care subject, such as age, reason for care, face, and body type, in the mobile terminal, so that the terminal can confirm whether a received first input comes from the care subject.
In this embodiment, the preset operation may be that the user clicks or presses a preset area of the mobile terminal in a preset manner, or clicks or presses a side volume key, a power key, or the like in a preset manner, to enter the care mode; alternatively, the care mode may be entered when crying in the surrounding environment is detected to reach a preset decibel level or a preset duration. The present invention does not specifically limit this.
In this embodiment, after the user is determined to be a care subject, a specific soothing manner may be determined according to the first input of the user. Optionally, image information of the user may be collected; for example, photos and videos of the user may be captured by a camera of the mobile terminal to obtain the image information.
In this embodiment, after the image information of the care subject is collected, it is input into an emotion recognition model. The emotion recognition model may be trained on behavior and expression data of care subjects under different lighting, scenes, and postures, labeled with the corresponding emotions. Conventional deep learning training may be used, for example building and training a convolutional neural network, or a convolutional neural network combined with a recurrent neural network, so that the resulting model can accurately recognize the emotion of the care subject from the image information.
In this embodiment, after the image information of the care subject is input into the emotion recognition model, the trained model can output an emotion recognition result, such as sleeping, happy, crying, or calm, according to the expression and behavior of the care subject in the image information. The emotion recognition result serves as the first input of the user and reflects the current state of the care subject.
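At run time the terminal only consumes the model's label output; the network itself (a CNN, optionally combined with an RNN) is trained offline. A sketch of just that inference interface, with the label set taken from the description and a placeholder in place of a real network:

```python
# Placeholder for the trained emotion recognition model's inference step.
# A real implementation would run the image through a CNN; here the
# "model" simply picks the highest-scoring label from per-label scores,
# which is the shape of output a real classifier head would produce.

EMOTION_LABELS = ["sleeping", "happy", "crying", "calm"]

def recognize_emotion(label_scores):
    """Map per-label scores to the single recognized emotion label."""
    if set(label_scores) != set(EMOTION_LABELS):
        raise ValueError("scores must cover exactly the known labels")
    return max(label_scores, key=label_scores.get)
```
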
Step 102: in response to the first input, a preset soothing operation is performed on the user, where the soothing operation includes changing the form of the flexible screen.
In this embodiment, after the first input of the user is received, a corresponding preset soothing operation can be executed according to the particular first input. Optionally, when the first input of the care subject is sleeping, relatively relaxing music can be played, and the flexible screen can be curled to simulate a caregiver's patting motion. When the first input of the care subject is happy, crying, or calm, an interactive picture can be displayed on the screen in a preset interactive mode, and while the interactive picture is displayed, the flexible screen can be deformed to match it; for example, the screen regions corresponding to the face or hands of a virtual character change form with the character's movements in the picture. This enhances the fun of the interaction and soothes the care subject's emotions, so that the care subject is soothed in time even when the caregiver cannot attend immediately, saving the caregiver time and energy.
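The branching just described can be sketched as a dispatch from the recognized emotion to device actions (the action strings are invented identifiers, not a real device API):

```python
# Dispatch table for the soothing behaviour described above: a sleeping
# subject gets relaxing music plus a curled screen that mimics patting;
# the other recognized states get an interactive picture with matching
# screen deformation. Action strings are illustrative commands only.

def soothing_actions(emotion):
    if emotion == "sleeping":
        return ["play_relaxing_music", "curl_screen_patting"]
    if emotion in ("happy", "crying", "calm"):
        return ["show_interactive_picture", "deform_screen_with_picture"]
    raise ValueError(f"unknown emotion: {emotion}")
```
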
Referring to fig. 2, a flow chart illustrating specific steps of a nursing method in an embodiment of the present invention is shown, where the specific steps may include:
step 201: a first input is received from a user.
In this embodiment, optionally, the information used to determine whether the user is the care subject may be image information acquired by a camera of the mobile terminal. The image information may include pictures and videos containing the user's outline and contour; it may cover all objects within the camera's shooting range, or only objects whose size or position falls within a predetermined threshold range. This is not specifically limited in this embodiment.
In this embodiment, after the image information is acquired, it can be cropped or filtered to remove parts containing other scene information, such as toys and furniture, so that this information does not interfere with the judgment of whether the user is a care subject, improving the accuracy of the recognition result. Alternatively, the relevant region may be identified and extracted directly without capturing a complete picture, further saving subsequent processing steps.
In this embodiment, after the image information of the user is obtained, it may be input into an age identification model. The age identification model is trained on portrait information of people in each age group, which may include height, contour, hair color, skin glossiness, and the like, and may also be weighted by regional information or by portrait information of the user's relatives such as parents and children, to further ensure the accuracy of the model.
In this embodiment, the age identification model may identify the age range to which the user belongs from the obtained image information. Optionally, the result may be an age range of the user, for example 2-3 years or 60-65 years; alternatively, the model may directly output whether the target object is the care subject. As long as it is clear whether the user is the care subject, the form of the age identification result is not specifically limited by the present invention.
In this embodiment, when the age identification model outputs an age range for the user, whether the user is a care subject can be determined by checking whether that range falls within an age range preset on the mobile terminal for care subjects; when the model directly outputs whether the user is a care subject, the answer can be read from the result directly.
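One way to turn the model's age-range output into the care-subject decision is an overlap test against the preset ranges (the example ranges below are assumptions for illustration):

```python
# Deciding whether a recognized age range marks the user as a care subject.
# The preset ranges are illustrative: infants (0-3) and elderly users (60+).

CARE_AGE_RANGES = [(0, 3), (60, 120)]

def is_care_subject(age_low, age_high):
    """True if the recognized age range overlaps any preset care range."""
    return any(age_low <= hi and age_high >= lo for lo, hi in CARE_AGE_RANGES)
```
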
In this embodiment, besides identifying whether the user is the care subject through image information, the user can also be identified through fingerprint information. Optionally, when a touch screen operation by the user on the mobile terminal is received, fingerprint information can be collected through the screen; the fingerprint information may include the ridge pattern of the fingerprint or the proportion of the screen the fingerprint covers.
In this embodiment, whether the user is the care subject can be judged by comparing the fingerprint information acquired through the display screen with fingerprint information of the care subject entered in advance. Alternatively, finger sizes of people in each age group can be derived from big data and combined with the screen proportion covered by the fingerprint to judge which age group the user's finger belongs to; a person skilled in the art can choose the judgment method according to the actual situation.
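The screen-proportion heuristic could be sketched as follows; the ratio thresholds are invented for illustration and would in practice come from the big-data statistics the description mentions:

```python
# Heuristic age grouping from the fraction of the screen a fingerprint
# covers. The thresholds below are illustrative placeholders, not values
# from the patent or from any real population study.

def age_group_from_fingerprint(screen_ratio):
    """Classify a touch by the fingerprint's on-screen area ratio."""
    if screen_ratio < 0.004:
        return "infant"
    if screen_ratio < 0.010:
        return "child"
    return "adult"
```
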
Optionally, step 201 includes:
step 2011: the method includes receiving a first touch input of a user and/or receiving a first expression input of the user.
In this embodiment, the first input of the user may be a first touch input and/or a first expression input. The first touch input may be a contact operation on the mobile terminal, such as a touch screen operation on the screen or a shake of the terminal; the first expression input may be the recognition result of the user's facial expression obtained from the user's image information.
In this embodiment, if image information of the user was used to judge whether the user is the care subject, the previously collected image information can continue to be used during the first expression input, to simplify the judgment and recognition steps and improve processing efficiency; alternatively, image information of the user can be collected again through the camera to obtain a more accurate, real-time emotion recognition result.
In this embodiment, the first expression input may be expressed as a state such as sleeping, happy, calm, or crying, or in the form of an emotional index. For example, the numbers 1 to 10 may represent the first expression input of the care subject: the closer to 1, the more stable the state, with 1 indicating that the care subject is asleep; the closer to 10, the more unstable the state, with 10 indicating that the care subject is crying. Intermediate values may cover states such as uneasiness and anxiety.
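The 1-to-10 emotional index might be mapped to states like this; only the endpoints (1 = sleeping, 10 = crying) are fixed by the description, and the intermediate cut-offs are assumptions:

```python
# Mapping the description's 1-10 emotional index to named states.
# The endpoints come from the text; the intermediate bands are invented.

def index_to_state(index):
    if not 1 <= index <= 10:
        raise ValueError("emotional index must be in 1..10")
    if index == 1:
        return "sleeping"
    if index <= 4:
        return "calm"
    if index <= 7:
        return "uneasy"
    if index < 10:
        return "anxious"
    return "crying"
```
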
Step 202: in response to the first input, a preset soothing operation is performed on the user, where the soothing operation includes changing the form of the flexible screen.
Optionally, the step 202 includes:
step 2021: and playing a preset interactive picture on the flexible screen, and changing the form of the flexible screen according to the preset interactive picture.
In this embodiment, a preset interactive picture, such as a video or song recorded in advance by the caregiver, can be played on the flexible screen, and to enhance the soothing effect, the form of the flexible screen can be changed according to the interactive picture, for example by curling, rolling, or bending.
Fig. 3 shows a flowchart of the specific steps of another nursing method in an embodiment of the present invention. Optionally, step 201 further includes:
step 2012: and receiving a second touch operation of which the touch duration of the user exceeds a preset duration.
In this embodiment, the first input may be a second touch operation. When a touch screen operation on the display screen of the mobile terminal is detected and the user triggering it is a care subject, the operation is determined to be the second touch operation if the continuous touch duration exceeds a preset duration. The preset duration may be a default of the mobile terminal, or may be set by a parent according to the baby's condition, for example 1 minute.
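The second-touch check reduces to a simple predicate (the 60-second default mirrors the 1-minute example above):

```python
# Detecting the "second touch operation": a continuous touch by a
# confirmed care subject lasting longer than the preset duration.

def is_second_touch(user_is_care_subject, touch_seconds, preset_seconds=60):
    return user_is_care_subject and touch_seconds > preset_seconds
```
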
At this time, step 202 includes:
step 2022: and vibrating the flexible screen according to a first preset frequency.
In this embodiment, after the second touch operation is received, the flexible screen of the mobile terminal can vibrate at the first preset frequency. Optionally, because the care subject remains in contact with the display screen, the vibration at the first preset frequency is felt by the care subject, thereby soothing the care subject.
And/or, step 2023: repeatedly performing a curling operation on the flexible screen at a second preset frequency.
In this embodiment, after the second touch operation is received, the flexible screen of the mobile terminal can curl and deform according to a preset curl angle and curl force, and a curl frequency can also be preset, so as to simulate the touching and patting with which a caregiver soothes a care subject, further soothing the care subject.
Optionally, when the user is confirmed to be an infant care subject, the first preset frequency is the breathing frequency or heartbeat frequency of the infant's mother, and the second preset frequency is the patting frequency of the infant's mother.
In this embodiment, when the second touch operation of the user is received and the user is confirmed to be an infant care subject, the vibration frequency of the flexible screen can be the breathing or heartbeat frequency of the infant's mother, and the second preset frequency can be the frequency with which the mother pats the infant when soothing it, so that a scene familiar and comforting to the infant is simulated and the infant is better soothed.
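Selecting the vibration and curl frequencies for an infant might look like this; the fall-back rates are typical adult resting values used only as illustrative defaults, since the patent uses the mother's recorded rhythms rather than fixed numbers:

```python
# Choosing (vibration_hz, curl_hz) for an infant care subject. Recorded
# per-caregiver values override the illustrative defaults below.

def soothing_frequencies(recorded=None):
    """Return (vibration_hz, curl_hz) in hertz."""
    defaults = {
        "breathing_hz": 16 / 60.0,   # ~16 breaths per minute (assumed)
        "heartbeat_hz": 70 / 60.0,   # ~70 beats per minute (assumed)
        "patting_hz": 1.0,           # ~1 pat per second (assumed)
    }
    values = dict(defaults, **(recorded or {}))
    # Vibration follows the heartbeat, curling follows the patting rhythm.
    return values["heartbeat_hz"], values["patting_hz"]
```
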
In this embodiment, when the continuous touch duration is shorter than the preset duration, the first expression input of the user may be received again; alternatively, the first expression input may be received directly, without first receiving the first touch input. In this case, the method further includes:
a first expression input of a user is received.
In this embodiment, if image information of the user was used to judge whether the user is the care subject, the previously collected image information can continue to be fed into the emotion recognition model during the expression input, to simplify the judgment and recognition steps and improve processing efficiency; alternatively, image information of the user can be collected again through the camera to obtain a more accurate, real-time emotion recognition result.
As above, the first expression input may be expressed as a state such as sleeping, happy, calm, or crying, or as an emotional index from 1 to 10, where values closer to 1 indicate a more stable state (1 being asleep), values closer to 10 a more unstable state (10 being crying), and intermediate values may cover states such as uneasiness and anxiety.
If the first expression input is a sleep state, audio is played at a third preset frequency, where the third preset frequency includes a preset breathing frequency of the caregiver or a preset heartbeat frequency of the caregiver.
In this embodiment, a corresponding soothing operation can be executed according to the first expression input of the care subject. For example, when the first expression input is a sleep state, audio can be played at the third preset frequency; optionally, the third preset frequency can be the caregiver's heartbeat or breathing frequency recorded in advance, creating a safe and comfortable atmosphere for the care subject and saving the caregiver the time of watching over the care subject.
In this embodiment, the audio can be directly recorded sounds such as the caregiver's breathing and heartbeat, so that the sleeping care subject is soothed in a realistic original-sound environment; alternatively, it can be preset white noise such as flowing water or birdsong, where white noise refers to sound with uniform power across the audible range (0-20 kHz).
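A minimal white-noise buffer can be generated with independent uniform random samples (sample count and amplitude are arbitrary; a real implementation would stream this to the audio output at a fixed sample rate):

```python
# Approximating white noise: independent, identically distributed random
# samples have a flat power spectrum on average across the band.

import random

def white_noise(n_samples, amplitude=0.5, seed=None):
    rng = random.Random(seed)
    return [rng.uniform(-amplitude, amplitude) for _ in range(n_samples)]
```
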
If the first expression input is a non-sleep state, a preset interactive picture is displayed on the flexible screen, and the form of the flexible screen is changed according to the changes in the displayed picture.
In this embodiment, when the first expression input of the care subject is a non-sleep state, a preset interactive picture can be displayed to the care subject, for example a cartoon or other television program the care subject likes, or a video recorded in advance by the caregiver. The preset interactive picture can include interactive content such as storytelling, singing, and games, to attract the care subject's attention and soothe the care subject accordingly.
In this embodiment, when the preset interactive picture is displayed on the flexible screen, the form of the screen can optionally change with the displayed image, which strengthens the interactivity of the soothing and provides a better soothing operation for the care subject. Optionally, a favorite cartoon character of the care subject, a 3D model of the caregiver, or the like can be displayed on the flexible screen to interact with the care subject. For example, when the cartoon character or the 3D model of a parent nods or shakes its head in the picture, the corresponding area of the flexible screen bends along with the direction and amplitude of the movement, which enhances the fun, makes the interaction more realistic, and improves the soothing effect.
Optionally, the method may further include:
when the user is an infant, if it is detected that the infant's continuous crying duration reaches a preset crying duration, sending warning information to a preset contact.
In this embodiment, a preset crying duration can be set for an infant care subject. When the infant's continuous crying reaches the preset duration, the infant may be crying not for emotional reasons but because it is thirsty, hungry, or uncomfortable, in which case soothing operations usually have little effect. The crying duration therefore needs to be detected and compared with the preset crying duration to judge the cause of the crying.
In this embodiment, when the infant's continuous crying reaches the preset duration, warning information can be sent to a preset contact. The preset contact can be a caregiver such as the infant's father, mother, or grandmother, and the warning information can be, for example, "The baby has been crying for xx minutes, please check as soon as possible", so that the contact learns the infant's current state.
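The crying alert could be assembled as follows; the wording follows the example message above, and the 10-minute default threshold is an assumption:

```python
# Build the warning sent to preset contacts once the infant's continuous
# crying reaches the preset duration; below the threshold, no alert.

def crying_alert(crying_minutes, preset_minutes=10):
    if crying_minutes < preset_minutes:
        return None
    return (f"The baby has been crying for {crying_minutes} minutes, "
            "please check as soon as possible.")
```
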
In this embodiment, when a dangerous action of the care subject is detected, such as a baby climbing over the edge of the bed, a toy being put into the mouth, or an elderly person falling, warning information can also be sent to the preset contact, further ensuring the safety of the care subject while soothing the care subject's emotions.
According to the nursing method, after the current target object is judged to be an infant in need of nursing, image information of the infant is collected, the infant's current emotional state is identified by a pre-trained emotion recognition model, and a corresponding soothing operation is executed according to that state. Effective soothing can thus be performed in real time according to the infant's emotion, which leaves parents more time for other affairs and saves their time, energy, and economic cost.
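The overall flow summarized above, capture an image, classify the emotional state, dispatch the matching soothing operation, can be sketched as below. The classifier is a mock standing in for the pre-trained emotion recognition model, and the state and action names are assumptions.

```python
def classify_emotion(image):
    # Stand-in for the pre-trained emotion recognition model; a real
    # implementation would run inference on the collected image data.
    return "crying" if image.get("tears") else "calm"

# Assumed mapping from emotional state to soothing operation.
SOOTHING_ACTIONS = {
    "crying": "vibrate_and_curl_screen",
    "calm": "no_action",
}

def nurse_step(image):
    """One iteration of the loop: identify the state, pick the operation."""
    state = classify_emotion(image)
    return SOOTHING_ACTIONS.get(state, "no_action")
```

The design point is the decoupling: the recognition model only emits a state label, and a lookup selects the soothing operation, so either side can be changed independently.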
It should be noted that, for simplicity of explanation, the foregoing method embodiments are described as a series of acts, but those skilled in the art should understand that the present invention is not limited by the order of the acts described, as some steps may be performed in other orders or concurrently in accordance with the invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are preferred embodiments, and that the acts involved are not necessarily required to implement the invention.
Referring to fig. 4, a block diagram of a care device 300 in an embodiment of the invention is shown. The device is applied to a terminal with a flexible screen, and comprises:
a receiving module 301, configured to receive a first input of a user;
a pacifying module 302, configured to perform a preset soothing operation on the user in response to the first input; wherein the soothing operation comprises changing a form of the flexible screen.
Optionally, the receiving module 301 is specifically configured to receive a first touch input of a user, and/or receive a first expression input of the user;
the pacifying module 302 is specifically configured to play a preset interactive picture on the flexible screen, and change a form of the flexible screen according to the preset interactive picture.
Optionally, the receiving module 301 is specifically configured to receive a second touch operation in which the user's touch duration exceeds a preset duration;
the pacifying module 302 is specifically configured to vibrate the flexible screen according to a first preset frequency, and/or to repeatedly perform a curling operation on the flexible screen according to a second preset frequency.
Optionally, the user is an infant, the first preset frequency is the breathing frequency or heartbeat frequency of the infant's mother, and the second preset frequency is the patting frequency of the infant's mother.
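A minimal sketch of this pacifying behavior follows. It is an assumption, not the patent's implementation: the frequency values, the preset touch duration, and the operation names are illustrative placeholders for the mother's measured breathing/heartbeat and patting rates.

```python
BREATHING_HZ = 0.3  # assumed: roughly 18 breaths per minute
PATTING_HZ = 1.5    # assumed gentle patting rate

def plan_soothing(touch_seconds, preset_seconds=3.0):
    """Return the (operation, frequency) pairs to schedule once the user's
    touch exceeds the preset duration, or an empty list otherwise."""
    if touch_seconds <= preset_seconds:
        return []
    # Vibrate at the mother's breathing/heartbeat frequency and curl the
    # screen at her patting frequency, per the optional embodiment above.
    return [("vibrate", BREATHING_HZ), ("curl", PATTING_HZ)]
```

A real device would feed these pairs to the screen's vibration motor and bending actuators; only the frequency selection logic is shown here.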
The above-mentioned apparatus can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 2, and is not described here again to avoid repetition.
Fig. 5 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 5 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 507 is configured to receive a first input of a user; the processor 510 is configured to perform, in response to the first input, a preset soothing operation on the user at least through the display unit 506; wherein the soothing operation comprises changing a form of the flexible screen.
In the embodiment of the invention, a nursing method is provided in which a first input of a nursing object is received and the nursing object is soothed according to the first input, the soothing operation being realized by changing the form of the flexible screen. Correspondingly effective soothing can thus be performed according to the first input of the nursing object, so that a caregiver has more time for other affairs, saving the caregiver's time, energy, and economic cost.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used for receiving and sending signals during a message sending and receiving process or a call process; specifically, it receives downlink data from a base station and forwards it to the processor 510 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 502, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output related to a specific function performed by the mobile terminal 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sounds and process them into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 501.
The mobile terminal 500 also includes at least one sensor 505, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 5061 and/or a backlight when the mobile terminal 500 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The Display unit 506 may include a Display panel 5061, and the Display panel 5061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. Touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 5071 using a finger, stylus, or any suitable object or attachment). The touch panel 5071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061, and when the touch panel 5071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and then the processor 510 provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 5, the touch panel 5071 and the display panel 5061 are two independent components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
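The touch pipeline described above, the touch panel reports the operation, the processor determines the event type and selects a visual output, can be sketched as follows. The event names and the long-press threshold are assumptions for illustration, not values from the patent.

```python
LONG_PRESS_SECONDS = 1.0  # assumed long-press threshold

def classify_touch(duration_s):
    """Processor-side step: determine the type of the touch event."""
    return "long_press" if duration_s >= LONG_PRESS_SECONDS else "tap"

def handle_touch(x, y, duration_s):
    """Return the visual output the processor would request on the
    display panel for a touch reported by the touch panel."""
    event = classify_touch(duration_s)
    if event == "long_press":
        # e.g. a sustained touch could trigger the interactive picture.
        return ("play_interactive_picture", (x, y))
    return ("highlight", (x, y))
```

This mirrors the division of labor in the passage: detection happens on the panel, classification and the choice of visual response happen on the processor.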
The interface unit 508 is an interface through which an external device is connected to the mobile terminal 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 500 or may be used to transmit data between the mobile terminal 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 510 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the mobile terminal. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510. The mobile terminal 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 via a power management system, so that functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the mobile terminal 500 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 510, a memory 509, and a computer program stored in the memory 509 and capable of running on the processor 510, where the computer program, when executed by the processor 510, implements each process of the above-mentioned nursing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned nursing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (6)

1. A nursing method is applied to a terminal with a flexible screen, and is characterized by comprising the following steps:
receiving a first input of a user;
in response to the first input, performing a preset soothing operation on the user; wherein the soothing operation comprises changing a form of the flexible screen;
the receiving of the first input of the user specifically comprises:
receiving a second touch operation in which a touch duration of the user exceeds a preset duration;
the performing of the preset soothing operation on the user specifically comprises:
vibrating the flexible screen according to a first preset frequency, so that the terminal soothes the user in continuous contact with the flexible screen; and/or,
repeatedly performing a curling operation on the flexible screen according to a second preset frequency, so that the flexible screen simulates a caregiver's stroking and patting when soothing the user.
2. The method of claim 1, wherein the user is an infant, the first preset frequency is a breathing frequency or a heartbeat frequency of the infant's mother, and the second preset frequency is a patting frequency of the infant's mother.
3. Nursing device, applied to a terminal with a flexible screen, characterized in that it comprises:
the receiving module is used for receiving a first input of a user;
a pacifying module, configured to perform a preset soothing operation on the user in response to the first input; wherein the soothing operation comprises changing a form of the flexible screen;
the receiving module is specifically configured to receive a second touch operation in which a touch duration of the user exceeds a preset duration;
the pacifying module is specifically configured to vibrate the flexible screen according to a first preset frequency, so that the terminal soothes the user in continuous contact with the flexible screen; and/or to repeatedly perform a curling operation on the flexible screen according to a second preset frequency, so that the flexible screen simulates a caregiver's stroking and patting when soothing the user.
4. The device of claim 3, wherein the user is an infant, the first preset frequency is a breathing frequency or a heartbeat frequency of the infant's mother, and the second preset frequency is a patting frequency of the infant's mother.
5. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, carries out the steps of the care method according to any one of claims 1 to 2.
6. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the care method according to one of claims 1 to 2.
CN201910804691.3A 2019-08-28 2019-08-28 Nursing method and device Active CN110427155B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910804691.3A CN110427155B (en) 2019-08-28 2019-08-28 Nursing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910804691.3A CN110427155B (en) 2019-08-28 2019-08-28 Nursing method and device

Publications (2)

Publication Number Publication Date
CN110427155A CN110427155A (en) 2019-11-08
CN110427155B true CN110427155B (en) 2021-05-18

Family

ID=68416447

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910804691.3A Active CN110427155B (en) 2019-08-28 2019-08-28 Nursing method and device

Country Status (1)

Country Link
CN (1) CN110427155B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201742464U (en) * 2010-08-06 2011-02-09 华为终端有限公司 Mobile terminal with function of nursing baby
US10975603B2 (en) * 2014-06-12 2021-04-13 Microsoft Technology Licensing, Llc Flexible display computing device
CN204500095U (en) * 2015-03-31 2015-07-29 肖月明 A kind of have the infanette of pacifying function
US10152125B2 (en) * 2015-11-20 2018-12-11 Immersion Corporation Haptically enabled flexible devices
CN107007074A (en) * 2017-05-18 2017-08-04 万建国 Simulate the body-sensing shaking table of parent heartbeat environment
CN108994839A (en) * 2018-08-22 2018-12-14 深圳威琳懋生物科技有限公司 The control method and storage medium of intelligent infant nurse robot

Also Published As

Publication number Publication date
CN110427155A (en) 2019-11-08

Similar Documents

Publication Publication Date Title
US9848796B2 (en) Method and apparatus for controlling media play device
CN108711430B (en) Speech recognition method, intelligent device and storage medium
CN109938720B (en) Heart rate-based reminding method, wearable device and computer-readable storage medium
CN106878390B (en) Electronic pet interaction control method and device and wearable equipment
CN109065060B (en) Voice awakening method and terminal
CN109756626B (en) Reminding method and mobile terminal
CN107527466A (en) A kind of fire alarm method, terminal and computer-readable recording medium
CN109215683A (en) A kind of reminding method and terminal
CN109819167A (en) A kind of image processing method, device and mobile terminal
CN109391842B (en) Dubbing method and mobile terminal
CN111387978A (en) Method, device, equipment and medium for detecting action section of surface electromyogram signal
CN110533651A (en) A kind of image processing method and device
CN109660654A (en) A kind of eyeshield based reminding method, flexible screen terminal and computer readable storage medium
CN110517463B (en) Method for reminding user to wear wearable device and mobile terminal
CN109190607A (en) A kind of motion images processing method, device and terminal
CN111415722A (en) Screen control method and electronic equipment
US11169599B2 (en) Information processing apparatus, information processing method, and program
CN109164908B (en) Interface control method and mobile terminal
CN110113487A (en) A kind of information prompting method, mobile terminal and computer readable storage medium
CN110225196A (en) Terminal control method and terminal device
CN108833791A (en) A kind of image pickup method and device
CN110427155B (en) Nursing method and device
JP2000020222A (en) Portable stuffed doll type remote operation device, information processor, communication device, and security device
CN111274419A (en) Control system and control method of community network
WO2021169918A1 (en) Information output method, electronic device, and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant