WO2019157633A1 - Intelligent service terminal and platform system and methods thereof - Google Patents


Info

Publication number
WO2019157633A1
WO2019157633A1 (PCT/CN2018/076659; CN2018076659W)
Authority
WO
WIPO (PCT)
Prior art keywords
intelligence service
user
information
service terminal
platform system
Prior art date
Application number
PCT/CN2018/076659
Other languages
French (fr)
Inventor
Yukkuen WONG
Rajiv Khosla
Original Assignee
Nec Hong Kong Limited
Human Centred Innovations Pty. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nec Hong Kong Limited, Human Centred Innovations Pty. Ltd.
Priority to SG11202007610SA priority Critical patent/SG11202007610SA/en
Priority to CN201880009316.3A priority patent/CN110337698B/en
Priority to PCT/CN2018/076659 priority patent/WO2019157633A1/en
Publication of WO2019157633A1 publication Critical patent/WO2019157633A1/en

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/22 Social work
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/70 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for mining of medical data, e.g. analysing previous cases of other patients
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B2503/08 Elderly
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/021 Measuring pressure in heart or blood vessels
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1112 Global tracking of patients, e.g. by using GPS
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/40 Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076 Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4088 Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4806 Sleep evaluation
    • A61B5/4815 Sleep quality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G06N20/10 Machine learning using kernel methods, e.g. support vector machines [SVM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 Computing arrangements using knowledge-based models
    • G06N5/01 Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N7/00 Computing arrangements based on specific mathematical models
    • G06N7/01 Probabilistic graphical models, e.g. probabilistic networks

Definitions

  • the non-limiting and exemplary embodiments of the present disclosure generally relate to the field of intelligence service techniques, and more particularly relate to an intelligent service terminal, intelligence service platform system, and intelligence service methods at the intelligence service terminal and the intelligence service platform system.
  • Companion mobile robots have been designed and used to communicate with humans, such as dementia/Alzheimer's patients, to help them and give them a companion.
  • Such companion mobile robots could interact with the patients using a set of predefined messages, instructions, songs scripts and stories and were trained in controlled environment.
  • these companions are not designed for long-term use in a home environment. They can only interact with the patients using a set of predefined messages, instructions, songs, scripts and stories. Thus, the existing robot seems more like a cold machine than a friend or a companion, and in turn it is hard to build a close relationship between the patient and the robot and also hard to make the best use of the companion mobile robot to help the patient.
  • the intelligence service platform system may comprise an input module, a deep learning engine and an output module.
  • the input module may be configured to receive user related information collected by an intelligence service terminal.
  • the deep learning engine may be based on a human mind deep learning model, wherein the human mind deep learning model describes the manner in which the brain encodes information, wherein the human mind deep learning model is trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use, and wherein the deep learning engine is further configured to identify the user's intention or emotion from the received user related information and generate a corresponding response or command output.
  • the output module may be configured to provide the corresponding response or command output to the intelligence service terminal.
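The input module, deep learning engine and output module described above can be sketched as a minimal data flow. All class names, method names and the trivial rule inside the engine below are illustrative assumptions for this sketch, not details taken from the disclosure.

```python
class DeepLearningEngine:
    """Placeholder for the human mind deep learning model: it maps user
    related information to an identified intention/emotion, then to a
    response or command output."""

    def identify(self, user_info):
        # A trained model would run here; a trivial threshold rule stands
        # in for it purely for illustration.
        if user_info.get("heart_rate", 70) > 120:
            return "anxious"
        return "calm"

    def respond(self, state):
        # Map the identified state to a command for the terminal.
        return {"anxious": "play_favorite_tune", "calm": "greet"}[state]


class ServicePlatform:
    def __init__(self, engine):
        self.engine = engine

    def handle(self, user_info):
        # Input module: receive user related information from the terminal.
        state = self.engine.identify(user_info)
        # Output module: return the response/command to the terminal.
        return self.engine.respond(state)


platform = ServicePlatform(DeepLearningEngine())
print(platform.handle({"heart_rate": 130}))  # play_favorite_tune
```

The point of the sketch is only the module split: the platform itself never talks to sensors; it receives collected information, lets the engine reason, and hands a command back.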
  • an intelligence service terminal may comprise a user information acquisition module, an image acquisition module, a sound acquisition module, a data transmission module, a response receiving module and a control module.
  • the user information acquisition module may be configured to acquire user related information from a body sensor associated with a user.
  • the image acquisition module may be configured to capture a face image of the user.
  • the sound acquisition module may be configured to capture a voice input from the user.
  • the data transmission module may be configured to transmit at least one of the user related information, the user's face image, or the voice input from the user to an intelligence service platform system.
  • the response receiving module may be configured to receive a response or command from the intelligence service platform system.
  • the control module may be configured to control the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system.
  • a method for providing intelligence services at a service platform may comprise receiving user related information collected by an intelligence service terminal; identifying, by a deep learning engine, the user's intention or emotion from the received information and generating a corresponding response or command output, wherein the deep learning engine is based on a human mind deep learning model, wherein the human mind deep learning model describes the manner in which the brain encodes information, and wherein the human mind deep learning model is trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use; and providing the corresponding response or command output to the intelligence service terminal.
  • a method for providing intelligence services at an intelligence service terminal may comprise acquiring user related information from a body sensor associated with a user; capturing a face image of the user; capturing a voice input from the user; transmitting at least one of the user related information, the face image of the user, or the voice input from the user to an intelligence service platform system; receiving a response or command from the intelligence service platform system; and controlling the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system.
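The terminal-side method enumerates a fixed cycle: acquire, capture, transmit, receive, control. One way to sketch that cycle is as a function over injected callables; the sensor, camera, network and actuator interfaces below are assumptions standing in for real hardware and platform calls.

```python
def terminal_cycle(read_sensor, capture_face, capture_voice, platform, actuate):
    """One pass of the terminal loop described in the method above."""
    payload = {
        "sensor": read_sensor(),   # user related information from a body sensor
        "face": capture_face(),    # face image of the user
        "voice": capture_voice(),  # voice input from the user
    }
    command = platform(payload)    # transmit payload, receive response/command
    return actuate(command)        # control the terminal's response to the user


# Stubs in place of real hardware and network calls, for illustration only:
result = terminal_cycle(
    read_sensor=lambda: {"heart_rate": 72},
    capture_face=lambda: b"<jpeg bytes>",
    capture_voice=lambda: b"<pcm bytes>",
    platform=lambda payload: "greet",
    actuate=lambda cmd: f"executed:{cmd}",
)
print(result)  # executed:greet
```

Injecting the five interfaces keeps the cycle itself independent of any particular sensor, camera or platform protocol, matching how the claim separates the modules.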
  • an intelligence server may comprise a processor and a memory.
  • the memory may be coupled with the processor and have program codes therein, which, when executed on the processor, cause the intelligence server to perform operations of the third aspect.
  • an intelligence service terminal may comprise a processor and a memory.
  • the memory may be coupled with the processor and have program codes therein, which, when executed on the processor, cause the intelligence service terminal to perform operations of the fourth aspect.
  • a computer-readable storage medium with computer program codes embodied thereon, the computer program codes configured to, when executed, cause an apparatus to perform actions in the method according to any embodiment in the third aspect.
  • a computer-readable storage medium with computer program codes embodied thereon, the computer program codes configured to, when executed, cause an apparatus to perform actions in the method according to any embodiment in the fourth aspect.
  • a computer program product comprising a computer-readable storage medium according to the seventh aspect.
  • a computer program product comprising a computer-readable storage medium according to the eighth aspect.
  • the deep learning model could be trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use, and thus the robot can learn from the user day by day and output appropriate commands or responses to respective users accordingly without requiring pre-defined instructions and messages.
  • Fig. 1 schematically illustrates a diagram of an example architecture of an intelligence service system according to an embodiment of the present disclosure
  • Fig. 2 schematically illustrates a diagram of an example intelligence service solution according to an embodiment of the present disclosure
  • Fig. 3 schematically illustrates a diagram of an example intelligence service solution at the user's side according to an embodiment of the present disclosure
  • Fig. 4 further illustrates an example daily activity management of the intelligence service terminal according to an embodiment of the present disclosure
  • Fig. 5 illustrates another example healthy and emotion monitoring functionality of the intelligence service terminal according to an embodiment of the present disclosure
  • Fig. 6 illustrates a block diagram of the intelligence service terminal according to an embodiment of the present disclosure
  • Fig. 7 schematically illustrates a diagram of an example intelligence service solution at the service platform side according to an embodiment of the present disclosure
  • Fig. 8A schematically illustrates a diagram of an example robot service functionality according to an embodiment of the present disclosure
  • Fig. 8B schematically illustrates a diagram of an example emotion engagement functionality according to an embodiment of the present disclosure
  • Fig. 8C schematically illustrates a diagram of an example emotion tracking functionality according to an embodiment of the present disclosure
  • Fig. 8D schematically illustrates a diagram of an example emotion interpretation functionality according to an embodiment of the present disclosure
  • Fig. 9 schematically illustrates a diagram of an example face detection and feature location according to an embodiment of the present disclosure
  • Figs. 10A to 10G schematically illustrate lifestyle analytics examples according to embodiments of the present disclosure
  • Fig. 11 schematically illustrates a block diagram of the intelligence service platform system according to an embodiment of the present disclosure
  • Fig. 12 schematically illustrates a specific implementation of the intelligence service platform system according to an embodiment of the present disclosure
  • Fig. 13 illustrates a method for providing intelligence services at the intelligence service terminal according to an embodiment of the present disclosure
  • Fig. 14 illustrates a method for providing intelligence services at the intelligence service platform system according to an embodiment of the present disclosure.
  • Fig. 15 illustrates a method for providing intelligence services at the intelligence service platform system according to an embodiment of the present disclosure.
  • each block in the flowcharts or block diagrams may represent a module, a program, or a part of code, which contains one or more executable instructions for performing specified logic functions; in the present disclosure, an optional block is illustrated with a dotted line.
  • although these blocks are illustrated in particular sequences for performing the steps of the methods, they may not necessarily be performed strictly according to the illustrated sequence. For example, they might be performed in reverse sequence or simultaneously, depending on the nature of the respective operations.
  • the block diagrams and/or each block in the flowcharts, and combinations thereof, may be implemented by a dedicated hardware-based system for performing specified functions/operations or by a combination of dedicated hardware and computer instructions.
  • these existing robot companions are not directed to be used in a home environment for a long term, they can only interact with the patients using a set of predefined messages, instructions, songs, scripts and stories.
  • the existing robot looks more like a cold machine than a friend or a companion. And in turn, it is hard to build a close relationship between the patient and the robot and also hard to make the best of the companion mobile robot to help the patient.
  • a new intelligence service system to mitigate or at least partially alleviate the problems in the existing solutions.
  • it is proposed to build a centralized service platform for robots and use the human mind deep learning model to serve users.
  • the human mind deep learning model could be trained separately based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use, and thus the robot can learn from the user day by day and output appropriate commands or responses for respective users accordingly without requiring pre-defined instructions and messages.
  • a deep learning engine 130 configured to perform both training 131 and reasoning 132.
  • the deep learning engine 130 could be based on a human mind deep learning model.
  • the basic infrastructure of the human mind deep learning model can be built based on any suitable deep learning technologies, such as deep neural network, K-nearest neighbors algorithm, K-means algorithm, linear regression algorithm, support vector machine, naive Bayes, classification trees, etc.
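Since the disclosure leaves the model family open, any of the listed techniques could back the engine. As one concrete instance, here is a toy K-nearest-neighbors classifier over sensor feature vectors; the feature choice (heart rate, sleep hours) and the training data are invented for illustration.

```python
def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs; query: feature_vector.
    Returns the majority label among the k nearest training points."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: sq_dist(item[0], query))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)  # majority vote


# (heart_rate, sleep_hours) -> emotional state; purely illustrative data
history = [
    ((110, 4), "anxious"), ((115, 5), "anxious"), ((120, 4), "anxious"),
    ((70, 8), "calm"), ((65, 7), "calm"), ((72, 8), "calm"),
]
print(knn_predict(history, (118, 5)))  # anxious
```

K-nearest-neighbors fits the personalization theme well: each user's accumulated daily responses simply extend that user's own training set, with no global retraining step required.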
  • Data collection modules 120A-1, 120A-2, 120A-3 collect data from respective users 110A-1, 110A-2, 110A-3.
  • the human mind deep learning model could be trained separately based on individual related data collected from respective users and further adjusted by individual’s daily responses collected by the intelligence service terminal in use. At the same time, the human mind deep learning model could provide suitable response to collected information as required by respective applications 120B-1, 120B-2 and 120B-3.
  • the applications 120B-1, 120B-2 and 120B-3 can be different from each other.
  • the deep learning engine could learn and be trained for different users of the respective applications.
  • the applications 120B-1, 120B-2 and 120B-3 can also be the same, and individuals could use the same type of applications.
  • health care applications will be taken as an example to describe embodiments of the present disclosure; however, it can be understood that they are only given for illustrative purposes and the present disclosure is not limited thereto.
  • Fig. 2 schematically illustrates a diagram of an example intelligence service solution according to an embodiment of the present disclosure.
  • in the example solution there is a robot 220, a private network 230, the service platform system 240, external assistant service entities 251 to 254, and the user's family 255.
  • the user 210 can be bound with the robot 220 for example by associating his/her identity card number, social security card number, or other identity with the robot.
  • the user 210 could communicate with the robot 220 through a microphone, speaker, cameras, etc., and the robot 220 could provide a response to the user by speaker, lamps and/or movements. Therefore, the response may include a light response, a sound response, a movement response, any other suitable response or any combination thereof.
  • Some responses to the user could be provided by the robot 220 itself. For example, when the robot 220 detects a person nearby, it could turn its head toward the person; when the robot 220 detects a sound from the user, its ear lamp could light up to prompt the user 210 to speak. Some other responses may be obtained from the service platform 240.
  • the robot 220 could also collect data or information from the user.
  • the robot 220 could provide the collected data or information to the service platform system 240 and get command or response from the service platform system 240.
  • the robot could track the user's face images and send them to the service platform; the service platform system 240 could process these images, generate a suitable response or command, and send it back to the robot.
  • the robot 220 can be connected within a private network 230 which may also be connected with other terminal devices like smart phones, personal computers, tablets, notepads, etc.
  • the private network 230 is further connected to a wide area network, through which the robot 220 could be connected to the service platform system 240.
  • the access control module 241 is configured to perform access control on the access of terminals such as robots, or terminal devices.
  • the analytical server 243 is configured to use the deep learning engine to predict the user's intention or emotion based on the input information and provide a corresponding response or command.
  • the service platform system 240 could be further connected to external assistant service entities such as public or private hospitals 251, general practitioner (GP) and care agencies 252, online health providers 253, and community center and residential care facilities 254. From these external assistant service entities, it is possible to obtain information associated with the user, and it may also be possible to obtain professional recommendations or suggestions therefrom and provide them to the user.
  • the service platform system 240 may also be connected with the user's family 255 to collect additional information from his/her family, share user information with them, and send messages or alerts to them in an emergency.
  • Fig. 3 schematically illustrates the intelligence service solution at the user's side according to an embodiment of the present disclosure.
  • the user may be equipped with a body sensor 211, which may be for example a wearable device like an Apple Watch, an intelligent bracelet, or any other sensor.
  • body sensors 211 could sense heart rate, anxiety, emotional profile, sleep quality, blood pressure, etc. They may also track the user's movement, amount of exercise, location, etc.
  • the body sensor 211 could be connected wirelessly to the robot 220, for example through Bluetooth. Thus the robot 220 could collect information from the body sensors and provide it to the service platform system.
  • terminal devices like smart phone 231, notepad 232, tablet 233, personal computer 234, etc., can be connected in the private network 230 as well. These terminal devices can be connected to the robot wirelessly, and the robot 220 could collect information from these terminal devices too.
  • the terminal device may also be connected to the service platform system to view information on the robot and information on the user associated with the robot. For example, the user may log into the service platform website through the internet with the user's account to review the user's information. Alternatively, it is also possible to download an intelligence service application from the website or an application store to view the user information.
  • the robot 220 could provide an alertness service 261, health recommendation service 262, connectivity service 263, health indicator service 264, lifestyle service 265, gamification service 266 or any other service.
  • the alertness service 261 is configured to provide alerts to the user, a nurse and/or a family member when some abnormality is detected. For example, if the robot 220 or the service platform system 240 detects an abnormal heart rate, it may send an alert to the user, the nurse, or the family member. The service platform system 240 may even make a call to an emergency center or a designated person, such as a doctor or a family member, in an emergency.
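A minimal sketch of such an alertness check follows; the heart-rate thresholds, escalation margins and recipient names are assumptions invented for the example, not values from the disclosure.

```python
def alert_targets(heart_rate, low=50, high=120):
    """Decide who should be alerted for a given heart-rate reading.
    Returns an empty list when the reading is within the normal band."""
    if heart_rate < low or heart_rate > high:
        targets = ["user", "nurse", "family_member"]
        # Escalate to an emergency center for readings far outside the band.
        if heart_rate < low - 10 or heart_rate > high + 30:
            targets.append("emergency_center")
        return targets
    return []


print(alert_targets(72))   # []
print(alert_targets(130))  # ['user', 'nurse', 'family_member']
print(alert_targets(160))  # ['user', 'nurse', 'family_member', 'emergency_center']
```

In the described system the same decision could equally come from the deep learning engine rather than fixed thresholds; the sketch only shows the tiered-recipient idea.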
  • the health recommendation service 262 is configured to provide recommendations or suggestions on health. For example, if the robot 220 or the service platform 240 detects that a physical parameter is not in good condition, it may provide some recommendations, advice, or suggestions to improve the current condition. These recommendations, advice, or suggestions may be given based on information stored in the robot or the service platform, or even be given by a professional expert like a doctor online if the user likes to use such a service.
  • the connectivity service 263 is configured to provide functionality for enabling other terminal devices to connect with the robot or the service platform system so that they could review or share the information of the user.
  • Other terminal devices, such as those of family members, could connect to the robot or the service platform system by means of the serial number of the robot, the user device ID or a family member account.
  • the health indicator service 264 is configured to provide a health indicator based on the user's health data.
  • the health indicator can be provided by the service platform, and the health indicator can be shared between the robot and other terminal devices like those of the family members.
  • Lifestyle service 265 is an analysis function, which could provide various lifestyle analyses based on user history data. For example, it could provide daily service usage patterns, duration of interaction, text analytics, frequency of interaction, service comparison among specified days in home-based care, etc. These services will be described with reference to Figs. 10A to 10G and thus will not be elaborated herein.
  • Gamification service 266 is configured to employ game design elements to motivate participation, and improve engagement.
  • the user interface may be designed to use big buttons for ease of use, and each application is simple and accessible with a few buttons.
  • Fig. 4 further illustrates an example daily activity management of the robot according to an embodiment of the present disclosure.
  • the robot 220 could get daily scheduled activity plan 271 for the user from a calendar application.
  • the daily scheduled activity plan 271 can be generated by the service platform system 240 based on user history data and then transmitted to the robot 220.
  • the robot 220 manages the scheduled tasks in the daily scheduled activity plan. For example, it could remind the user 210 of a task when it is time to perform it and track the progress of tasks in the task progress table 272.
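The plan-plus-progress-table arrangement described above can be sketched as follows; the task names, times and status values are illustrative assumptions.

```python
from datetime import time

# Daily scheduled activity plan (e.g. generated by the service platform).
plan = [
    {"task": "take medicine", "at": time(8, 0)},
    {"task": "morning walk", "at": time(9, 30)},
]

# Task progress table: every task starts out pending.
progress = {item["task"]: "pending" for item in plan}


def due_reminders(now):
    """Tasks whose scheduled time has passed but are still pending."""
    return [p["task"] for p in plan
            if p["at"] <= now and progress[p["task"]] == "pending"]


print(due_reminders(time(8, 15)))   # ['take medicine']
progress["take medicine"] = "done"  # the robot records completion
print(due_reminders(time(8, 15)))   # []
```

Separating the static plan from the mutable progress table is what lets the robot both remind (query the plan) and track (update the table), as the figure describes.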
  • the user 210 could send a video to his/her family member, call the doctor to get some advice, or share information with other people with dementia too.
  • Fig. 5 illustrates another example healthy and emotion monitoring functionality of the intelligence service terminal (such as a robot) according to an embodiment of the present disclosure.
  • the user 210 may be equipped with body sensors 211, which may be for example a wearable device like an Apple Watch, an intelligent bracelet, or any other sensor. These body sensors 211 could sense heart rates 211a, anxiety/emotional profile 211b, sleep quality 211c, blood pressure 211d, etc. If the robot 220 or the service platform system 240 determines a health data abnormality (such as a rather high or low blood pressure or heart rate), it is possible to send an alert to nurses or doctors, and send monitored data to a hospital for diagnosis and treatment (221).
  • the service platform 240 may command the robot to sing and/or dance favorite tunes of the user to boost up emotion (222) .
  • the robot 220 or the service platform system 240 could also adapt or change the daily scheduled activity plan to fit the monitored health condition or mood condition (223).
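  • the abnormality detection in this monitoring flow amounts to a range check per metric; a minimal sketch follows, where the thresholds and field names are hypothetical illustration values, not clinical recommendations:

```python
# Hypothetical normal ranges per metric; a real deployment would use
# clinically validated, per-user thresholds.
THRESHOLDS = {
    "heart_rate": (50, 110),          # beats per minute
    "blood_pressure_sys": (90, 140),  # mmHg, systolic
}

def check_abnormality(readings):
    """Return (metric, value) pairs that fall outside their normal range."""
    alerts = []
    for metric, value in readings.items():
        low, high = THRESHOLDS.get(metric, (float("-inf"), float("inf")))
        if value < low or value > high:
            alerts.append((metric, value))
    return alerts

def on_new_readings(readings, notify):
    """Alert nurses/doctors and forward the data when abnormal (221)."""
    alerts = check_abnormality(readings)
    if alerts:
        notify(f"abnormal readings: {alerts}")
    return alerts

sent = []
on_new_readings({"heart_rate": 128, "blood_pressure_sys": 120}, sent.append)
print(sent)  # one alert for the elevated heart rate
```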
  • Fig. 6 illustrates a block diagram of the intelligence service terminal according to an embodiment of the present disclosure, which could implement the above-mentioned functionalities of the intelligence service terminal.
  • the intelligence service terminal 600 includes a user information acquisition module 610, an image acquisition module 620, a sound acquisition module 630, a data transmission module 640, a response receiving module 650, and a control module 660.
  • the user information acquisition module 610 may be configured to acquire user related information from body sensors associated with a user.
  • the user 210 may be equipped with body sensors 211, which could sense heart rates 211a, anxiety/emotional profile 211b, sleep quality 211c, blood pressure 211d, etc. These sensors may also track the movement of the user, the amount of exercise, location tracking, etc.
  • the image acquisition module 620 may be configured to capture a face image of the user by means of a camera.
  • the face image can be used to track the user’s emotion. It can be understood that in different emotions, the user could have different facial expressions, and thus it is possible to track the user’s emotions by capturing the user’s face image.
  • the image acquisition module 620 can be used to capture a Quick Response (QR) Code.
  • the service platform 240 can be used to command the robot to execute a corresponding application. For example, all services of the robot could be QR coded on a smart phone. The user could put a QR code of a desirable application in front of the camera of the robot.
  • the image acquisition module 620 could capture the QR code and send it to the service platform, which could in turn identify the QR code and command the robot to run a corresponding application.
  • the application could be a story, a song, a quiz, etc. In such a way, it could reduce application development and implementation time and increase flexibility. In other words, it could improve emotional well-being and improve social interaction personalized to an end user by automatic physical embodiment of human emotions and sentiments in a robot driven by a QR-encoded application.
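  • the QR-driven application launch can be reduced to a lookup from the decoded payload to a robot command; the payload format and registry below are assumptions for illustration:

```python
# Hypothetical registry mapping decoded QR payloads to robot applications.
APP_REGISTRY = {
    "app:story": "tell a story",
    "app:song": "sing a song",
    "app:quiz": "start a quiz",
}

def dispatch_qr(payload):
    """Map a decoded QR payload to a robot command, or None if unknown."""
    return APP_REGISTRY.get(payload)

print(dispatch_qr("app:song"))   # -> sing a song
print(dispatch_qr("app:other"))  # -> None, i.e. unregistered application
```

Adding a new application then only requires registering a new QR payload, which is the flexibility gain described above.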
  • the sound acquisition module 630 may be configured to capture a voice input from the user.
  • the robot could collect the voice input from the user, understand the meaning of the words and respond in an appropriate way.
  • the data transmission module 640 may be configured to transmit at least one of the user related data, the face image of the user, or the voice input from the user to an intelligence service platform system. These types of information can be further processed at the service platform system, for example to determine a suitable response to the user.
  • the response receiving module 650 may be configured to receive a response or command from the intelligence service platform system.
  • the control module 660 may be configured to control the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system. For example, the control module 660 may control the robot to sing and/or dance a user’s favorite tune when the user is in a blue mood, or show a sad emotion to the user too.
  • the intelligence service terminal 600 may further include a daily activity management module 670.
  • the daily activity management module 670 may be configured to provide a daily scheduled activity plan and track the progress of daily scheduled activities, as described with reference to Fig. 3.
  • the daily activity management module 670 may further provide a daily scheduled activity plan adapted by the service platform in accordance with at least one of the user’s health condition and mood.
  • the intelligence service terminal 600 may also include a connection module configured to enable the intelligence service terminal to connect with other terminal devices, like other user terminals, body sensors, the private network, the mobile communication network, etc. In such a way, the intelligence service terminal can share information with other parties and alert a specified person in case of abnormal conditions.
  • the intelligence service terminal 600 could be an intelligence service robot.
  • the control module 660 may be configured to control the intelligence service robot to sing or dance in the user’s favorite tune to boost up user’s emotion.
  • the service platform system 240 may include access control module 241, data server 242, and an analytical server 243.
  • the access control module 241 may be configured to perform access control on the access of terminals such as robots, or user terminals. For example, the access control module 241 validates the robot by means of its unique serial number and permits its access only if the validation is successful. It may also validate the information provided by the robot.
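  • such a serial-number-based access check could look like the following sketch; the serial format, whitelist, and payload fields are hypothetical, standing in for whatever validation the access control module 241 actually performs:

```python
# Hypothetical whitelist of registered robot serial numbers.
REGISTERED_SERIALS = {"RBT-0001", "RBT-0002"}

def validate_access(serial, payload):
    """Permit access only for a registered serial number and a payload
    carrying the minimally expected fields."""
    if serial not in REGISTERED_SERIALS:
        return False
    return isinstance(payload, dict) and "user_id" in payload

print(validate_access("RBT-0001", {"user_id": 210}))  # -> True
print(validate_access("RBT-9999", {"user_id": 210}))  # -> False
```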
  • the analytical server 243 is configured to use a human mind deep learning engine to predict the user’s intention or emotion based on the input information and provide a corresponding response or command.
  • the human mind deep learning engine may be based on a human mind deep learning model.
  • the human mind deep learning model describes a manner in which the brain encodes information; it can be trained based on individual related history data and further adjusted by the individual’s daily responses collected by the intelligence service terminal in use.
  • the individual related history data for training the human mind deep learning model may include for example one or more of game playing information, program selection information, robot communication information, music playing information, activity history information, health data, medical information, social context personality profile, family configurations, lifestyle information, and personal tags.
  • the human mind deep learning engine may be further configured to identify the user’s intention or emotion from the received information and generate a corresponding response or command output. For example, when the robot 220 sends a sad face image or voice to the human mind deep learning engine, the deep learning engine could predict that the user is in a blue mood and generate a command to the robot to make it sing or dance in a tune that the user likes to boost up the emotion. On the other hand, if the human mind deep learning engine determines that the user is happy, it could generate a command to the robot so that it could provide a corresponding light response together with a swing of the robot head from left to right to share the happiness of the user. Many similar cases can be imagined and they will not be elaborated herein for simplification purposes.
  • the service platform system 240 could be further connected to external assistant service entities such as a public or private hospital 251, GP and care agencies 252, an online health provider 253, and community center and residential care facilities 254. From these external assistant service entities, it is possible to obtain information associated with the user, and it may obtain professional recommendations or suggestions therefrom and provide them to the user.
  • the service platform system 240 may also be connected with the user’s family 255 to collect additional information from his/her family, send messages, or alert them in an emergency.
  • Fig. 8A illustrates a robot service functionality in the intelligence service platform system according to an embodiment of the present disclosure.
  • the robot service module 810 could receive service discovery and like/dislike engagement from the emotion engagement (E.E.) module 820 (Fig. 8B), and receive emotion intelligence parameters from an emotion interpretation (E.I.) module 840 (Fig. 8D).
  • the robot service module 810 could send like/dislike engagement to the robot so that it could perform emotion engagement with the user.
  • the robot service module 810 may adapt the service and provide the service adaptation to the emotion engagement module 820.
  • the robot service module 810 could provide a robot navigation function 811.
  • the robot could collect location information in terms of navigation path, relative configuration of one location with respect to another, environment parameters of each location and so on.
  • These location parameters could be provided to a social context module 812.
  • the social context module 812 could obtain the social context of users on a daily basis in terms of their lifestyles, languages, age groups, disabilities, and locations (home, retirement village, nursing home, workplace, school, etc.).
  • These social parameters could further be provided to a subjective experience module 813.
  • the subjective experience module could collect information in terms of lifestyles, disabilities, psychological behavior profile, psychological needs, preventative care, proactive/reactive care needs, entertainment in groups, one-to-one interaction, and so on to improve the emotional wellbeing of the user.
  • the interaction environment 814 is built for the robot which is a physical embodied machine instead of avatars.
  • the robot may include features like singing and dancing with head and body movement, multilingual voice vocalization and recognition, emotive expressions like blushing, emotional adaptation of expressions and dialog based on the emotional response or facial expressions of the human partner, eye blink detection, mental state estimation, etc. All these features provided by the interaction environment 814 could improve the communication between the robot and the user and facilitate the building of a close relationship between them.
  • Fig. 8B illustrates an emotion engagement functionality according to an embodiment of the present disclosure.
  • the emotion engagement (E.E. ) module 820 may determine suitable emotion engagement such as like/dislike engagement for the robot.
  • Verbal and non-verbal parameters can be input into the human mind deep learning engine to train the classifier; the like/dislike engagement with the user can be predicted by the classifier, and misclassifications can be fed back to the human mind deep learning engine for further training of the classifier.
  • the like/dislike engagement can be provided to the robot service module 810 and at the same time the service discovery could be provided to the robot service application too.
  • the robot can automatically provide a service learnt from the user behavior stored in the deep learning engine. For example, if the user prefers to listen to music when she is happy, the robot will ask the user if she would like to listen to songs when a happy facial expression is detected; and if the user prefers to listen to a story when she is sad, the robot will ask if the user would like to listen to a story when sadness is detected on her face.
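  • the behavior above reduces to a lookup from the detected expression to a learnt preference; in the real system this mapping would come from the deep learning engine, while the table below is a hand-written stand-in:

```python
# Hand-written stand-in for preferences learnt by the deep learning engine.
learnt_preferences = {
    "happy": "listen to songs",
    "sad": "listen to a story",
}

def suggest_service(expression):
    """Offer the service learnt for the detected facial expression."""
    service = learnt_preferences.get(expression)
    return None if service is None else f"Would you like to {service}?"

print(suggest_service("happy"))  # -> Would you like to listen to songs?
print(suggest_service("angry"))  # -> None, i.e. no learnt preference yet
```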
  • the video from the robot is transmitted to the emotion tracking (E.T.) module 830; affect changes from the emotion tracking module 830 can be received, and the mode for emotion engagement can be adapted based on the affect changes.
  • Fig. 8C illustrates an emotion tracking process according to an embodiment of the present disclosure.
  • the emotion tracking module 830 may perform face detection and feature location (FD&FL) .
  • first the face is detected from a frame in the video, and then feature points such as the eyes, mouth, nostrils, etc. can be located.
  • the located feature points can be provided to the classifier to determine affect changes.
  • the misclassification can be fed back to the human mind deep learning engine for further training the classifier.
  • the detected changes can be provided to both the emotion engagement module 820 and the emotion interpretation module 840.
  • Fig. 8D illustrates an emotion interpretation process according to an embodiment of the present disclosure.
  • the emotion interpretation module 840 interprets the affect changes based on pattern correlation, frequency, duration, intensity, and regulation.
  • Pattern correlation means the degree to which the affect change is associated with a predetermined affect pattern serving as a benchmark (such as a happy pattern, sad pattern, angry pattern, etc.), and it is used to identify the basic emotion related to the affect change.
  • Frequency means the frequency of affect change;
  • “duration” means the time duration of an affect change, and “intensity” means the intensity of an affect change, which could reflect the passion of the user. This information can be used to determine the degree of the affect pattern.
  • Regulation means the corresponding action to an emotional experience.
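  • given a stream of detected affect changes, the frequency, duration, and intensity per pattern can be aggregated as sketched below; the event tuple format (pattern, start second, end second, intensity) is an assumption for illustration:

```python
from collections import defaultdict

def interpret(events):
    """Aggregate frequency, total duration and mean intensity per affect
    pattern from (pattern, start, end, intensity) events."""
    summary = defaultdict(lambda: {"frequency": 0, "duration": 0.0, "intensity": 0.0})
    for pattern, start, end, intensity in events:
        s = summary[pattern]
        s["frequency"] += 1
        s["duration"] += end - start
        s["intensity"] += intensity
    for s in summary.values():
        s["intensity"] /= s["frequency"]  # mean intensity of the pattern
    return dict(summary)

events = [("happy", 0, 4, 0.8), ("happy", 10, 12, 0.6), ("sad", 20, 23, 0.4)]
result = interpret(events)
print(result["happy"])  # frequency 2, duration 6.0, mean intensity ~0.7
```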
  • the analytical server 243 may further provide near real-time lifestyle analytics.
  • Reference is now made to Figs. 10A to 10G to describe several examples. However, it shall be understood that these examples are only given for illustrative purposes, and the present disclosure is not limited thereto. In practice, it is possible to provide more analytics or fewer analytics, or provide different kinds of analysis, and all of these modifications fall within the scope of the present disclosure.
  • as illustrated in Fig. 10A, it may use the service usage information of respective users and provide a daily service usage pattern across various users. By means of such a pattern, one can learn the usage information of various users. It may further provide a daily service usage pattern of a specific user, as illustrated in Fig. 10B. Such a usage pattern could reflect the service usage information of a specified user.
  • the service platform system could also perform text analytics and provide analytics results as illustrated in Fig. 10C, which could show the lifestyle of users or a single user in different forms. It is also possible to use the interaction information to provide analytics of interaction duration or frequency as illustrated in Figs. 10D and 10E.
  • the service platform system may further use the non-verbal usage information to provide a report of non-verbal expression feedback as illustrated in Fig. 10F. It may also provide a comparison of service among specified days as illustrated in Fig. 10G. Such information can be reviewed by nurses, doctors, family members or even users themselves to have a general review of the user’s usage conditions.
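  • the per-user and all-user usage patterns of Figs. 10A and 10B boil down to an aggregation over usage records; the record fields below are illustrative assumptions about what the platform might log:

```python
from collections import Counter

# Hypothetical usage records as the platform might log them.
records = [
    {"user": "u1", "service": "music", "minutes": 15},
    {"user": "u1", "service": "story", "minutes": 10},
    {"user": "u2", "service": "music", "minutes": 5},
    {"user": "u1", "service": "music", "minutes": 20},
]

def usage_pattern(records, user=None):
    """Total interaction minutes per service, optionally for one user."""
    totals = Counter()
    for r in records:
        if user is None or r["user"] == user:
            totals[r["service"]] += r["minutes"]
    return dict(totals)

print(usage_pattern(records))             # all users, as in Fig. 10A
print(usage_pattern(records, user="u1"))  # a specific user, as in Fig. 10B
```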
  • Fig. 11 illustrates a block diagram of the intelligence service platform system according to an embodiment of the present disclosure, which could implement the above-mentioned functionalities of the service platform.
  • these are only given for illustrative purposes, and the present disclosure is not limited thereto.
  • the intelligence service platform system 1100 includes an input module 1101, a deep learning engine 1102, and an output module 1103.
  • the input module 1101 may be configured to receive user related information collected by an intelligence service terminal.
  • the user related information may include health data sensed by body sensors, face images captured by the camera of the robot, and the voice input from the user captured by the microphone of the robot.
  • the deep learning engine 1102 may be based on a human mind deep learning model.
  • the human mind deep learning model describes a manner in which the brain encodes information, and it can be trained based on individual related history data and further adjusted by the individual’s daily responses collected by the intelligence service terminal in use.
  • the human mind deep learning engine may be further configured to identify the user’s intention or emotion from the received information and generate a corresponding response or command output, so as to provide suitable feedback to individuals.
  • the output module 1103 may be configured to provide the corresponding response or command output to the intelligence service terminal.
  • the information collected by the intelligence service terminal may include a face image of a user.
  • the deep learning engine 1102 may be configured to interpret emotion of the user based on the face image and generate a corresponding response or command output to the intelligence service terminal.
  • the individual related history data may comprise one or more of: game playing information, program selection information, robot communication information, music playing information, activity history information, health data, medical information, social context personality profile, family configurations, lifestyle information, and personal tags.
  • the deep learning engine 1102 may be further configured to generate a daily scheduled activity plan and adapt the scheduled activity plan based on at least one of the identified health condition or mood condition.
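  • the plan adaptation could be sketched as a rule applied to the generated plan; the activity names and adaptation rules below are illustrative assumptions, standing in for decisions made by the deep learning engine:

```python
def adapt_plan(plan, health_abnormal=False, mood="neutral"):
    """Return a copy of the daily plan adapted to the monitored condition."""
    adapted = list(plan)
    if health_abnormal:
        # drop strenuous activities when health data is abnormal
        adapted = [a for a in adapted if a != "morning exercise"]
    if mood == "sad":
        # append a mood-boosting activity when the user is in a blue mood
        adapted.append("sing favorite tunes")
    return adapted

plan = ["take medicine", "morning exercise", "video call family"]
print(adapt_plan(plan, health_abnormal=True, mood="sad"))
```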
  • the intelligence service platform system 1100 may further comprise an input validation module 1104.
  • the input validation module 1104 may be configured to validate the received information collected by the intelligence service terminal and identify the intelligence service terminal collecting the received information.
  • the intelligence service platform system 1100 may further comprise an information analysis module 1105.
  • the information analysis module 1105 may be configured to perform lifestyle analytics on user history data.
  • the information analysis module 1105 can be configured to provide various analysis functions as described with reference to Figs. 10A to 10G.
  • the information analysis module 1105 may be further configured to acquire, from an external professional assistance service entity, assistive information on the user associated with the intelligence service terminal, and to provide a professional analysis result as at least part of the response or command output to the intelligence service terminal.
  • the external professional assistance service entity may comprise at least one of medical school, public hospital, private hospital, mental health center, elderly health center, and rehabilitation center.
  • the intelligence service platform system 1100 may further comprise a user recording module 1106 configured to record information on individual users.
  • the user recording module 1106 could record the account information of the user, the user service usage information, user health data, and user personal information collected during use, like voice input, captured face images, video, etc.
  • the intelligence service platform system 1100 may further comprise a content streaming module 1107 configured to stream contents between the user associated with the intelligence service terminal and another terminal device. Through the content streaming module 1107, the user could send a video captured by the robot to his/her friend, family member or doctor, to share his/her current health condition, feelings, emotions, or any other information.
  • the intelligence service platform system 1100 may further comprise a video compression module 1108 configured to compress a video content before transmitting.
  • By means of such a video compression module, it is possible to compress a huge video into an acceptable size for streaming.
  • the intelligence service platform system 1100 may further comprise a short message service module 1109 configured to transmit a short message between the user associated with the intelligence service terminal and another terminal device.
  • a family member could send a short message through the SMS module 1109, which further forwards the SMS to the robot.
  • the robot could present the short message to the user by means of its speaker.
  • the intelligence service platform system 1100 may further comprise a call center module 1110.
  • the call center module 1110 is configured to make a call to an emergency center or a specified person in response to detection of emergency regarding the user associated with the intelligence service terminal.
  • the call center module could make a call to an emergency center for asking help, and/or call a specified person like a nurse, a doctor, a family member, and so on to get help.
  • the intelligence service platform system 1100 may further comprise a billing module 1111 configured to bill services provided to the user.
  • the billing module 1111 could record the services provided to the user and generate a bill for the user. The user could also review the bill in real time.
  • the intelligence service platform system 1100 may further comprise a feedback management module 1112 configured to receive and manage feedback from users.
  • the feedback management module 1112 could also provide a public information sharing and exchange platform for users so that they could share and communicate their experiences. User feedback is valuable for platform improvement, and by means of real feedback from users, the platform developer could improve the user experience and provide a higher quality of service.
  • the intelligence service platform system 1100 may further comprise a document management module 1113.
  • the document management module 1113 may be configured to store and manage documents related to the intelligence service platform system.
  • the intelligence service platform system 1100 could be a cloud based system, which is built based on cloud technologies.
  • the cloud infrastructure can be distributed in different areas, regions, and countries as long as it could provide the services as proposed herein.
  • Fig. 12 illustrates an example implementation of the intelligence service platform system according to an embodiment of the present disclosure.
  • the intelligence service terminal 210 could be connected to the intelligence service platform 1200 through the Internet.
  • the intelligence service platform 1200 may include an interface layer 1210, and an online platform 1220.
  • the interface layer 1210 provides an interface of the platform to external devices or entities.
  • the intelligence service terminal 210 accesses the online platform 1220 via the interface layer 1210, and the online platform 1220 may also use the interface layer 1210 to exchange information with external assistant service entities like medical school, public hospital, private hospital, mental health center, elderly health center, and rehabilitation center, as well.
  • on the online platform 1220, there are provided an input validation module, a response output module, a call center module, a human mind deep learning engine, an information analysis module, a user recording system, a content streaming system, an SMS module, a billing system, a feedback management system, and a document management module.
  • on the online platform 1220, there are further provided a failover subsystem 1221, an operation management subsystem 1222, a storage subsystem 1223, a server subsystem 1224, and a network virtualization subsystem 1225.
  • the failover subsystem 1221 may facilitate the building of a higher resilience system. It may employ ExpressClusterX, which is failover-software-based clustering, to build the higher resilience system. It could provide an automatic failover function for servers, and monitor hardware, operating system, application, and database failures.
  • the operation management subsystem 1222 is an integrated operations management software suite for management of the platform system. MasterScope can be used to implement the operation management subsystem 1222 to provide simple unified management. It may manage day-to-day operation information such as incident, problem, change and release information, enable automation of job scheduling, virtualization management, software distribution, and cloud management, and monitor server, network, and application performance, etc.
  • the storage subsystem 1223 could be used for storage and/or backup storage and archiving with high compressed data.
  • the storage subsystem 1223 could be implemented as HYDRAstor due to its high compression and data deduplication, good scale-out performance without operation stops, remote replication feature, and write-once-read-many capability for regulated long-term storage, as well as its encryption function.
  • the server subsystem 1224 can be implemented with the Express 5800 server, which has several lineups.
  • the Express 5800 server is a general server for small to middle performance use; it has high availability and could provide hardware-based failover within a unit. At the same time, it is also available as a high density server for cloud infrastructure, with a 42U rack accommodating 572 servers.
  • the network virtualization subsystem 1225 could be implemented as ProgrammableFlow, a software-defined network solution for providing network virtualization.
  • the network virtualization subsystem 1225 could provide centralized monitoring and easy configuration by separation of the physical switch and its configuration; thus it requires no skilled network engineer thanks to centralized GUI-based configuration, and could provide a high security solution against cyber attacks by automatic virus detection and network separation.
  • Fig. 13 illustrates a method for providing intelligence services at the intelligence service terminal according to an embodiment of the present disclosure.
  • the intelligence service terminal may acquire user related information from body sensors associated with a user (step 1301), acquire a face image of the user (step 1302), and acquire a voice input from the user (step 1303). It can be understood that the operations in steps 1301 to 1303 can be performed in different orders, depending on real application cases.
  • the intelligence service terminal may transmit at least one of the user related data, the face image of the user, or the voice input from the user to an intelligence service platform system.
  • the intelligence service terminal may receive a response or command from the intelligence service platform system. Further in step 1306, it may control the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system.
  • the intelligence service terminal may provide a daily scheduled activity plan and track the progress of daily scheduled activities. If the daily scheduled activity plan is adapted in accordance with at least one of the user’s health condition or mood condition, the intelligence service terminal may provide an adapted daily scheduled activity plan to the user in step 1308. Further in step 1309, it may control the intelligence service robot to sing or dance in the user’s favorite tune to boost up the user’s emotion. In addition, it is also possible to capture a Quick Response (QR) code in step 1310, wherein a corresponding application is executed in response to the captured QR code.
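  • steps 1301 to 1306 can be condensed into a single acquire-transmit-act cycle; the stub below stands in for the service platform, and all names and the sad-mood rule are illustrative assumptions:

```python
def platform_stub(payload):
    """Stand-in for the platform's deep-learning decision."""
    if payload.get("emotion") == "sad":
        return {"command": "sing_favorite_tune"}
    return {"command": "idle"}

def terminal_cycle(sensor_data, face_emotion, voice_text, send=platform_stub):
    """Acquire inputs (1301-1303), transmit (1304), receive (1305), act (1306)."""
    payload = {"sensors": sensor_data, "emotion": face_emotion, "voice": voice_text}
    response = send(payload)
    actions = {"sing_favorite_tune": "robot sings and dances",
               "idle": "robot waits"}
    return actions[response["command"]]

print(terminal_cycle({"heart_rate": 72}, "sad", "I feel down"))
```

In a deployment, `send` would be the network call to the intelligence service platform system rather than a local stub.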
  • Fig. 14 illustrates a method for providing intelligence services at the intelligence service platform system according to an embodiment of the present disclosure.
  • the service platform system receives user related information collected by an intelligence service terminal.
  • the user related information may include health data sensed by body sensors, face images captured by the camera of the robot, and voice input from the user captured by the microphone of the robot.
  • the service platform system may identify, by a deep learning engine, the user’s intention or emotion from the received information and generate a corresponding response or command output.
  • the human mind deep learning engine is based on a human mind deep learning model, which describes a manner in which the brain encodes information.
  • the human mind deep learning model is trained based on individual related history data and further adjusted by the individual’s daily responses collected by the intelligence service terminal in use.
  • the service platform system may provide the corresponding response or command output to the intelligence service terminal.
  • the information collected by the intelligence service terminal may comprise a face image of a user.
  • the method may further comprise interpreting, by the deep learning engine, emotion of the user based on the face image and generating a corresponding response or command output to the intelligence service terminal (step 1404) .
  • the individual related history data may comprise one or more of: game playing information, program selection information, robot communication information, music playing information, activity history information, health data, medical information, social context personality profile, family configurations, lifestyle information, and personal tags.
  • the method 1400 may further comprise generating, by the deep learning engine, a daily scheduled activity plan and adapting the scheduled activity plan based on at least one of the identified health condition and mood condition in step 1405. Further in step 1406, the service platform system may validate the received user related information collected by the intelligence service terminal and identify the intelligence service terminal. In step 1407, the method 1400 may further comprise performing lifestyle analytics on user history data like those illustrated in Figs. 10A to 10G. The method 1400 may also comprise, in step 1408, acquiring, from an external professional assistance service entity, assistive information on the user associated with the intelligence service terminal, and providing a professional analysis result as at least part of the response or command output to the intelligence service terminal.
  • the external professional assistance service entity may comprise at least one of medical school, public hospital, private hospital, mental health center, elderly health center, and rehabilitation center.
  • the service platform system may further perform at least one of:
  • the methods 1300 and 1400 are described in brief with reference to Figs. 13 to 15. It can be noted that the methods 1300 and 1400 may be configured to implement the functionalities as described with reference to Figs. 1 to 12. Therefore, for details about the operations of various steps in these methods, one may refer to the descriptions made with respect to the modules of the intelligence service terminal and platform system with reference to Figs. 1 to 12.
  • components of the intelligence service terminal 600 and the intelligence service platform system 1100 may be embodied in hardware, software, firmware, and/or any combination thereof.
  • the components of the intelligence service terminal 600 and the intelligence service platform system 1100 may be respectively implemented by a processor, a server or any other appropriate device.
  • the intelligence service terminal 600 and the intelligence service platform system 1100 may include at least one processor.
  • the at least one processor suitable for use with embodiments of the present disclosure may include, by way of example, both general and special purpose processors already known or developed in the future.
  • the intelligence service terminal 600 and the intelligence service platform system 1100 may further include at least one memory.
  • the at least one memory may include, for example, semiconductor memory devices, e.g., RAM, ROM, EPROM, EEPROM, and flash memory devices.
• the at least one memory may be used to store a program of computer executable instructions. The program can be written in any high-level and/or low-level compilable or interpretable programming language.
  • the computer executable instructions may be configured, with the at least one processor, to cause the intelligence service terminal 600 and the intelligence service platform system 1100 to at least perform operations according to the method as discussed with reference to Figs. 1 to 5 and 7 to 10 and 12 respectively.
  • the present disclosure may also provide a carrier containing the computer program as mentioned above, wherein the carrier is one of an electronic signal, optical signal, radio signal, or computer readable storage medium.
• the computer readable storage medium can be, for example, an optical compact disk or an electronic memory device like a RAM (random access memory), a ROM (read only memory), Flash memory, magnetic tape, CD-ROM, DVD, Blu-ray disc and the like.
  • an apparatus implementing one or more functions of a corresponding apparatus described with an embodiment comprises not only prior art means, but also means for implementing the one or more functions of the corresponding apparatus described with the embodiment and it may comprise separate means for each separate function, or means that may be configured to perform two or more functions.
  • these techniques may be implemented in hardware (one or more apparatuses) , firmware (one or more apparatuses) , software (one or more modules) , or combinations thereof.
  • firmware or software implementation may be made through modules (e.g., procedures, functions, and so on) that perform the functions described herein.

Abstract

An intelligence service platform comprises an input module, a deep learning engine based on a human mind deep learning model, and an output module. The input module is configured to receive user related information collected by an intelligence service terminal. The human mind deep learning model describes a manner in which the brain encodes information, and is trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use. The deep learning engine is further configured to identify the user's intention or emotion from the received user related information and generate a corresponding response or command output. The output module can be configured to provide the corresponding response or command output to the intelligence service terminal. With embodiments of the present disclosure, the deep learning model could be trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use, and thus the robot can learn from the user day by day and output appropriate commands or responses for respective users accordingly without requiring pre-defined instructions and messages.

Description

INTELLIGENT SERVICE TERMINAL AND PLATFORM SYSTEM AND METHODS THEREOF

FIELD OF THE INVENTION
The non-limiting and exemplary embodiments of the present disclosure generally relate to the field of intelligence service techniques, and more particularly relate to an intelligent service terminal, intelligence service platform system, and intelligence service methods at the intelligence service terminal and the intelligence service platform system.
BACKGROUND OF THE INVENTION
With the coming of an aging society, enormous human power and considerable cost are required to take care of senior citizens, especially those having dementia/Alzheimer's disease. Companion mobile robots have been designed and used to communicate with humans such as dementia/Alzheimer's patients to help them and give them companionship. Such companion mobile robots could interact with the patients using a set of predefined messages, instructions, songs, scripts and stories, and were trained in a controlled environment.
However, these companions are not designed to be used in a home environment for the long term. They can only interact with the patients using a set of predefined messages, instructions, songs, scripts and stories. Thus, the existing robot looks more like a cold machine than a friend or a companion. In turn, it is hard to build a close relationship between the patient and the robot, and also hard to make the best of the companion mobile robot to help the patient.
SUMMARY OF THE INVENTION
To this end, in the present disclosure, there is provided a new solution of an intelligent service robot, to mitigate or at least alleviate at least part of the issues in the prior art.
In a first aspect of the present disclosure, there is provided an intelligence service platform system. The intelligence service platform system may comprise an input module, a deep learning engine and an output module. The input module may be configured to receive user related information collected by an intelligence service terminal. The deep learning engine may be based on a human mind deep learning model, wherein the human mind deep learning model describes a manner in which the brain encodes information, wherein the human mind deep learning model is trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use, and wherein the deep learning engine is further configured to identify the user's intention or emotion from the received user related information and generate a corresponding response or command output. The output module may be configured to provide the corresponding response or command output to the intelligence service terminal.
In a second aspect of the present disclosure, there is provided an intelligence service terminal. The intelligence service terminal may comprise a user information acquisition module, an image acquisition module, a sound acquisition module, a data transmission module, a response receiving module and a control module. The user information acquisition module may be configured to acquire user related information from a body sensor associated with a user. The image acquisition module may be configured to capture a face image of the user. The sound acquisition module may be configured to capture a voice input from the user. The data transmission module may be configured to transmit at least one of the user related information, the user's face image, or the voice input from the user to an intelligence service platform system. The response receiving module may be configured to receive a response or command from the intelligence service platform system. The control module may be configured to control the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system.
In a third aspect of the present disclosure, there is further provided a method for providing intelligence services at a service platform. The method may comprise receiving user related information collected by an intelligence service terminal; identifying, by a deep learning engine, the user's intention or emotion from the received information and generating a corresponding response or command output, wherein the deep learning engine is based on a human mind deep learning model, wherein the human mind deep learning model describes a manner in which the brain encodes information, and wherein the human mind deep learning model is trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use; and providing the corresponding response or command output to the intelligence service terminal.
In a fourth aspect of the present disclosure, there is further provided a method for providing intelligence services at an intelligence service terminal. The method may comprise acquiring user related information from a body sensor associated with a user; capturing a face image of the user; capturing a voice input from the user; transmitting at least one of the user related information, the face image of the user, or the voice input from the user to an intelligence service platform system; receiving a response or command from the intelligence service platform system; and controlling the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system.
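The terminal-side method of the fourth aspect can be sketched, purely for illustration, as one collection-and-response cycle. The function names, payload fields and the stand-in platform below are assumptions; a real terminal would use camera, microphone and Bluetooth drivers together with a network client.

```python
# Illustrative sketch of the terminal-side method of the fourth aspect.
# Transport and device interfaces are assumptions, not disclosed details.

def terminal_cycle(sensor_reading, face_image, voice_input, platform):
    """Bundle one round of collected data, send it to the platform,
    and return the response/command the terminal should act on."""
    payload = {
        "sensor": sensor_reading,   # e.g. {"heart_rate": 72}
        "face_image": face_image,   # raw bytes in a real system
        "voice": voice_input,
    }
    # Transmit the collected information and receive a response/command.
    command = platform(payload)
    # The control module would now drive speakers, lamps or movement.
    return command

# A stand-in for the platform: returns a command based on heart rate.
def fake_platform(payload):
    if payload["sensor"].get("heart_rate", 0) > 120:
        return {"action": "alert"}
    return {"action": "greet"}
```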
According to a fifth aspect of the present disclosure, there is provided an intelligence server. The intelligence server may comprise a processor and a memory. The memory may be coupled with the processor and have program codes therein, which, when executed on the processor, cause the intelligence server to perform operations of the third aspect.
According to a sixth aspect of the present disclosure, there is provided an intelligence service terminal. The intelligence service terminal may comprise a processor and a memory. The memory may be coupled with the processor and have program codes therein, which, when executed on the processor, cause the intelligence service terminal to perform operations of the fourth aspect.
According to a seventh aspect of the present disclosure, there is provided a computer-readable storage media with computer program codes embodied thereon, the computer program codes configured to, when executed, cause an apparatus to perform actions in the method according to any embodiment in the third aspect.
According to an eighth aspect of the present disclosure, there is provided a computer-readable storage media with computer program codes embodied thereon, the computer program codes configured to, when executed, cause an apparatus to perform actions in the method according to any embodiment in the fourth aspect.
According to a ninth aspect of the present disclosure, there is provided a computer program product comprising a computer-readable storage media according to the seventh aspect.
According to a tenth aspect of the present disclosure, there is provided a computer program product comprising a computer-readable storage media according to the eighth aspect.
With embodiments of the present disclosure, the deep learning model could be trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use, and thus the robot can learn from the user day by day and output appropriate commands or responses to respective users accordingly without requiring pre-defined instructions and messages.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other features of the present disclosure will become more apparent through the detailed explanation of the embodiments with reference to the accompanying drawings, throughout which like reference numbers represent same or similar components, and wherein:
Fig. 1 schematically illustrates a diagram of an example architecture of an intelligence service system according to an embodiment of the present disclosure;
Fig. 2 schematically illustrates a diagram of an example intelligence service solution according to an embodiment of the present disclosure;
Fig. 3 schematically illustrates a diagram of an example intelligence service solution at the user's side according to an embodiment of the present disclosure;
Fig. 4 further illustrates an example daily activity management of the intelligence service terminal according to an embodiment of the present disclosure;
Fig. 5 illustrates another example health and emotion monitoring functionality of the intelligence service terminal according to an embodiment of the present disclosure;
Fig. 6 illustrates a block diagram of the intelligence service terminal according to an embodiment of the present disclosure;
Fig. 7 schematically illustrates a diagram of an example intelligence service solution at the service platform side according to an embodiment of the present disclosure;
Fig. 8A schematically illustrates a diagram of an example robot service functionality according to an embodiment of the present disclosure;
Fig. 8B schematically illustrates a diagram of an example emotion engagement functionality according to an embodiment of the present disclosure;
Fig. 8C schematically illustrates a diagram of an example emotion tracking functionality according to an embodiment of the present disclosure;
Fig. 8D schematically illustrates a diagram of an example emotion interpretation functionality according to an embodiment of the present disclosure;
Fig. 9 schematically illustrates a diagram of an example face detection and feature location according to an embodiment of the present disclosure;
Figs. 10A to 10G schematically illustrate lifestyle analytics examples according to embodiments of the present disclosure;
Fig. 11 schematically illustrates a block diagram of the intelligence service platform system according to an embodiment of the present disclosure;
Fig. 12 schematically illustrates a specific implementation of the intelligence service platform system according to an embodiment of the present disclosure;
Fig. 13 illustrates a method for providing intelligence services at the intelligence service terminal according to an embodiment of the present disclosure;
Fig. 14 illustrates a method for providing intelligence services at the intelligence service platform system according to an embodiment of the present disclosure; and
Fig. 15 illustrates a method for providing intelligence services at the intelligence service platform system according to an embodiment of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS
Hereinafter, the solution as provided in the present disclosure will be described in detail through embodiments with reference to the accompanying drawings. It should be appreciated that these embodiments are presented only to enable those skilled in the art to better understand and implement the present disclosure, and are not intended to limit the scope of the present disclosure in any manner.
In the accompanying drawings, various embodiments of the present disclosure are illustrated in block diagrams, flow charts and other diagrams. Each block in the flowcharts or block diagrams may represent a module, a program, or a part of code, which contains one or more executable instructions for performing specified logic functions, and in the present disclosure, a dispensable block is illustrated in a dotted line. Besides, although these blocks are illustrated in particular sequences for performing the steps of the methods, as a matter of fact, they may not necessarily be performed strictly according to the illustrated sequence. For example, they might be performed in reverse sequence or simultaneously, depending on the natures of the respective operations. It should also be noted that the block diagrams and/or each block in the flowcharts, and a combination thereof, may be implemented by a dedicated hardware-based system for performing specified functions/operations or by a combination of dedicated hardware and computer instructions.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to "a/an/the/said [element, device, component, means, step, etc.]" are to be interpreted openly as referring to at least one instance of said element, device, component, means, unit, step, etc., without excluding a plurality of such devices, components, means, units, steps, etc., unless explicitly stated otherwise. Besides, the indefinite article "a/an" as used herein does not exclude a plurality of such steps, units, modules, devices, objects, etc.
As mentioned in the Background, these existing robot companions are not designed to be used in a home environment for the long term; they can only interact with the patients using a set of predefined messages, instructions, songs, scripts and stories. Thus, the existing robot looks more like a cold machine than a friend or a companion. In turn, it is hard to build a close relationship between the patient and the robot, and also hard to make the best of the companion mobile robot to help the patient.
To this end, in the present disclosure, there is provided a new intelligence service system to mitigate or at least partially alleviate the problems in the existing solutions. In the present disclosure, it is proposed to build a centralized service platform for robots and use the human mind deep learning model to serve users. The human mind deep learning model could be trained separately based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use, and thus the robot can learn from the user day by day and output appropriate commands or responses for respective users accordingly without requiring pre-defined instructions and messages.
Hereinafter, reference will be made to Figs. 1 to 15 to describe the solutions as proposed in the present disclosure in details. However, it shall be appreciated that the following embodiments are given only for illustrative purposes and the present disclosure is not limited thereto.
Reference is first made to Fig. 1 to describe an example architecture of an intelligence service system according to an embodiment of the present disclosure. As illustrated in Fig. 1, in the present disclosure, there is provided a deep learning engine 130. The deep learning engine 130 is configured to perform both training 131 and reasoning 132. The deep learning engine 130 could be based on a human mind deep learning model. The basic infrastructure of the human mind deep learning model can be built based on any suitable deep learning technologies, such as a deep neural network, the K-nearest neighbors algorithm, the K-means algorithm, a linear regression algorithm, a support vector machine, naive Bayes, classification trees, etc. Data collection modules 120A-1, 120A-2, 120A-3 collect data from respective users 110A-1, 110A-2, 110A-3. The human mind deep learning model could be trained separately based on individual related data collected from respective users and further adjusted by the individual's daily responses collected by the intelligence service terminal in use. At the same time, the human mind deep learning model could provide a suitable response to collected information as required by respective applications 120B-1, 120B-2 and 120B-3. The applications 120B-1, 120B-2 and 120B-3 can be different from each other. The deep learning engine could learn and be trained for different users of the respective applications. In addition, the applications 120B-1, 120B-2 and 120B-3 can also be the same, and individuals could use the same type of application. Hereinafter, health care applications will be taken as an example to describe embodiments of the present disclosure; however, it can be understood that they are only given for illustrative purposes and the present disclosure is not limited thereto.
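The per-user "train, then adjust daily" idea of Fig. 1 might be sketched as follows. Note the hedge: the "model" here is reduced to a trivial running preference score so the sketch stays self-contained; the disclosure contemplates a deep neural network or another learner, and every name below is an illustrative assumption.

```python
# Sketch of Fig. 1's per-user training: the engine keeps a separately
# trained model per user and adjusts it with each daily response.
# The scoring scheme is a deliberate simplification, not the disclosed
# human mind deep learning model.

class PerUserEngine:
    def __init__(self):
        self.models = {}  # user_id -> {activity: preference score}

    def train(self, user_id, history):
        """Initial training from individual related history data."""
        model = {}
        for activity in history:
            model[activity] = model.get(activity, 0) + 1
        self.models[user_id] = model

    def adjust(self, user_id, activity, liked):
        """Daily adjustment from the user's response in use."""
        model = self.models.setdefault(user_id, {})
        model[activity] = model.get(activity, 0) + (1 if liked else -1)

    def recommend(self, user_id):
        """Reasoning: pick the currently highest-scored activity."""
        model = self.models.get(user_id, {})
        return max(model, key=model.get) if model else None
```

Keeping one model per user is what lets the engine "learn from the user day by day" independently of the other users served by the same platform.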
Fig. 2 schematically illustrates a diagram of an example intelligence service solution according to an embodiment of the present disclosure. As illustrated in Fig. 2, in the example solution there is a robot 220, a private network 230, the service platform system 240, and external assistant service entities 251 to 254 and user’s family 255.
The user 210 can be bound with the robot 220, for example by associating his/her identity card number, social security card number, or other identity with the robot. The user 210 could communicate with the robot 220 through a microphone, speaker, cameras, etc., and the robot 220 could provide a response to the user by speaker, lamps and/or movements. Therefore, the response may include a light response, a sound response, a movement response, any other suitable response or any combination thereof. Some responses to the user could be provided by the robot 220 itself. For example, when the robot 220 detects a person nearby, it could turn its head toward the person; when the robot 220 detects a sound from the user, its ear lamp could light up to prompt the user 210 to speak. Some other responses may be obtained from the service platform 240. The robot 220 could also collect data or information from the user. The robot 220 could provide the collected data or information to the service platform system 240 and get a command or response from the service platform system 240. For example, the robot could track the user's face images and send them to the service platform; the service platform system 240 could process these images, generate a suitable response command and send it back to the robot. The robot 220 can be connected within a private network 230 which may also be connected with other terminal devices like smart phones, personal computers, tablets, notepads, etc. The private network 230 is further connected to a wide area network, through which the robot 220 could be connected to the service platform system 240.
On the service platform system 240, there are provided an access control module 241, a data server 242, and an analytical server 243. The access control module 241 is configured to perform access control on the access of terminals such as robots or terminal devices. The analytical server 243 is configured to use the deep learning engine to predict the user's intention or emotion based on the input information and provide a corresponding response or command.
Additionally, the service platform system 240 could be further connected to external assistant service entities such as a public or private hospital 251, general practitioner (GP) and care agencies 252, an online health provider 253, and community center and residential care facilities 254. From these external assistant service entities, it is possible to obtain information associated with the user; it may also be possible to obtain professional recommendations or suggestions therefrom and provide them to the user.
The service platform system 240 may also be connected with the user's family 255 to collect additional information from his/her family, share user information with them, and send messages or alerts to them in an emergency.
Fig. 3 schematically illustrates the intelligence service solution at the user's side according to an embodiment of the present disclosure. As illustrated in Fig. 3, the user may be equipped with a body sensor 211, which may be for example a wearable device like an Apple Watch, an intelligence bracelet, or any other sensor. These body sensors 211 could sense heart rate, anxiety, emotional profile, sleep quality, blood pressure, etc. They may also track the movement of the user, the amount of exercise, location tracking, etc. The body sensor 211 could be connected wirelessly to the robot 220, for example through Bluetooth. Thus the robot 220 could collect information from the body sensors and provide it to the service platform system.
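Before forwarding body-sensor data to the platform, the robot might aggregate the samples it collects over Bluetooth. The following sketch assumes each sample is a simple dictionary of readings; the field names and the averaging strategy are illustrative assumptions, not disclosed details.

```python
# Illustrative aggregation of body-sensor readings (heart rate, blood
# pressure, etc.) as the robot might collect them before forwarding
# to the service platform. Field names are assumed.

def summarize_readings(readings):
    """Average each sensed quantity over the collected samples,
    tolerating samples that omit some quantities."""
    totals, counts = {}, {}
    for sample in readings:
        for key, value in sample.items():
            totals[key] = totals.get(key, 0) + value
            counts[key] = counts.get(key, 0) + 1
    return {key: totals[key] / counts[key] for key in totals}
```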
Other terminal devices like a smart phone 231, notepad 232, tablet 233, personal computer 234, etc., can be connected in the private network 230 as well. These terminal devices can be connected to the robot wirelessly, and the robot 220 could collect information from these terminal devices too. In addition, a terminal device may also be connected to the service platform system to view information on the robot and information on the user associated with the robot. For example, the user may log into the service platform website through the internet with the user's account to review the user's information. Alternatively, it is also possible to download an intelligence service application from the website or an application store to view the user information.
With the assistance of the service platform system, the robot 220 could provide an alertness service 261, health recommendation service 262, connectivity service 263, health indicator service 264, lifestyle service 265, gamification service 266 or any other service.
The alertness service 261 is configured to provide an alert to the user, a nurse and/or a family member when some abnormality is detected. For example, if the robot 220 or the service platform system 240 detects an abnormal heart rate, it may send an alert to the user, the nurse, or the family member. The service platform system 240 may even make a call to an emergency center or a designated person, like a doctor or a family member, in an emergency.
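A minimal sketch of the alertness rule just described follows: an abnormal heart rate fans out alerts to the user, a nurse and a family member. The thresholds and recipient list are illustrative assumptions only; a deployed system would use clinically validated limits per user.

```python
# Hypothetical alertness rule (service 261). The 50/110 bpm limits and
# the fixed recipient list are assumptions for illustration.

def check_heart_rate(bpm, low=50, high=110):
    """Return the alerts to dispatch for one heart-rate reading."""
    if bpm < low or bpm > high:
        # In a real deployment the platform might also place a call
        # to an emergency center or a designated doctor.
        return [("user", "abnormal heart rate"),
                ("nurse", "abnormal heart rate"),
                ("family", "abnormal heart rate")]
    return []
```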
The health recommendation service 262 is configured to provide recommendations or suggestions on health. For example, if the robot 220 or the service platform 240 detects a physical parameter that is not in a good condition, it may provide some recommendations, advice, or suggestions to improve the current condition. These recommendations, advice, or suggestions may be given based on information stored in the robot or the service platform, or even be given by a professional expert like a doctor online if the user likes to use such a service.
The connectivity service 263 is configured to provide functionality for enabling other terminal devices to connect with the robot or the service platform system so that they could review or share the information of the user. Other terminal devices could connect to the robot or the service platform system, as family members, by means of the serial number of the robot, a user device ID or a family member account.
The health indicator service 264 is configured to provide a health indicator based on the user's health data. The health indicator can be provided by the service platform, and the health indicator can be shared between the robot and other terminal devices like those of the family members.
The lifestyle service 265 is an analysis function, which could provide various lifestyle analyses based on user history data. For example, it could provide daily service usage patterns, duration of interaction, text analytics, frequency of interaction, service comparison among specified days in home-based care, etc. These services will be described with reference to Figs. 10A to 10G and thus will not be elaborated herein.
The gamification service 266 is configured to employ game design elements to motivate participation and improve engagement. For example, the user interface may be designed to use big buttons for ease of use, and each application is simple and accessible with a few buttons.
Fig. 4 further illustrates an example daily activity management of the robot according to an embodiment of the present disclosure. As illustrated in Fig. 4, the robot 220 could get a daily scheduled activity plan 271 for the user from a calendar application. The daily scheduled activity plan 271 can be generated by the service platform system 240 based on user history data and then transmitted to the robot 220. The robot 220 manages the scheduled tasks in the daily scheduled activity plan. For example, it could remind the user 210 of a task when it is time to perform it and track the progress of tasks in the task progress table 272. At the same time, it is also possible to provide an information sharing mechanism by the robot and the service platform system. For example, the user 210 could send a video to his/her family member, call the doctor to get some advice, or share information with other people with dementia too.
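The remind-and-track behavior of Fig. 4 might be sketched as follows. The plan and progress-table representation (a list of task dictionaries) is an assumption made for illustration; the disclosure does not specify a data format.

```python
# Sketch of Fig. 4's daily activity management: remind the user of due
# tasks and record completion in a task progress table. Data shapes
# are illustrative assumptions.

def due_reminders(plan, now_hour):
    """Return the names of tasks whose scheduled hour has arrived
    but which are not yet marked done in the progress table."""
    return [task["name"] for task in plan
            if task["hour"] <= now_hour and not task["done"]]

def mark_done(plan, name):
    """Record a completed task in the progress table."""
    for task in plan:
        if task["name"] == name:
            task["done"] = True
```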
Fig. 5 illustrates another example health and emotion monitoring functionality of the intelligence service terminal (such as a robot) according to an embodiment of the present disclosure. As illustrated in Fig. 5, the user 210 may be equipped with body sensors 211, which may be for example a wearable device like an Apple Watch, an intelligence bracelet, or any other sensor. These body sensors 211 could sense heart rate 211a, anxiety/emotional profile 211b, sleep quality 211c, blood pressure 211d, etc. If the robot 220 or the service platform system 240 determines a health data abnormality (such as a rather high or low blood pressure or heart rate), it is possible to send an alert to nurses or doctors, and send the monitored data to a hospital for diagnosis and treatment (221). If the robot or the service platform 240 detects that the user is not in a good mood, the service platform 240 may command the robot to sing and/or dance favorite tunes of the user to boost up the user's emotion (222). In addition, the robot 220 or the service platform system 240 could also adapt or change the daily scheduled activity plan to fit the monitored health condition or the mood condition (223).
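The three monitoring branches 221 to 223 of Fig. 5 can be sketched together. The vital-sign thresholds, mood encoding and returned command names below are assumptions introduced only to make the branching concrete.

```python
# One way to sketch the monitoring branches 221-223 of Fig. 5.
# Thresholds, the mood string and command names are assumptions.

def monitor(vitals, mood, schedule):
    commands = []
    # (221) Abnormal vitals: alert carers and forward data to hospital.
    if (vitals.get("systolic_bp", 120) > 160
            or vitals.get("heart_rate", 70) > 120):
        commands.append("alert_nurse_and_send_to_hospital")
    # (222) Low mood: sing/dance the user's favorite tunes.
    if mood == "sad":
        commands.append("play_favorite_tune")
        # (223) Adapt the daily plan to the monitored condition.
        schedule.append("extra social activity")
    return commands
```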
Fig. 6 illustrates a block diagram of the intelligence service terminal according to an embodiment of the present disclosure, which could implement the above-mentioned functionalities of the intelligence service terminal. As illustrated in Fig. 6, the intelligence service terminal 600 includes a user information acquisition  module 610, an image acquisition module 620, a sound acquisition module 630, a data transmission module 640, a response receiving module 650, and a control module 660.
The user information acquisition module 610 may be configured to acquire user related information from body sensors associated with a user. As mentioned, the user 210 may be equipped with body sensors 211, which could sense heart rate 211a, anxiety/emotional profile 211b, sleep quality 211c, blood pressure 211d, etc. These sensors may also track the movement of the user, the amount of exercise, location tracking, etc.
The image acquisition module 620 may be configured to capture a face image of the user by means of a camera. The face image can be used to track the user's emotion. It can be understood that in different emotions, the user could have different expressions on the face, and thus it is possible to track the user's emotions by capturing the user's face image. In addition, the image acquisition module 620 can be used to capture a Quick Response (QR) code. In response to the captured QR code, the service platform 240 can command the robot to execute a corresponding application. For example, all services of the robot could be QR coded on a smart phone. The user could put a QR code of a desirable application in front of the camera of the robot. The image acquisition module 620 could capture the QR code and send it to the service platform, which could in turn identify the QR code and command the robot to run the corresponding application. The application could be a story, a song, a quiz, etc. In such a way, it could reduce application development and implementation time and increase flexibility. In other words, it could improve emotional well-being and improve social interaction personalized to an end user by automatic physical embodiment of human emotions and sentiments in a robot embedded in a QR encoded application.
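The platform's side of this QR-coded application launch reduces to a lookup from decoded payload to application command. The payload strings, mapping and fallback behavior below are illustrative assumptions; QR decoding itself would be handled by a library such as ZBar or OpenCV on the captured image.

```python
# Hypothetical sketch of the platform mapping a decoded QR payload to
# a robot application. The payload format and app names are assumed.

QR_APPS = {
    "qr:story": "story_teller",
    "qr:song": "song_player",
    "qr:quiz": "quiz_game",
}

def dispatch_qr(decoded_payload):
    """Return the command the platform sends back to the robot."""
    app = QR_APPS.get(decoded_payload)
    if app is None:
        return {"action": "speak",
                "text": "Sorry, I do not know this code."}
    return {"action": "run_app", "app": app}
```

Adding a new QR-coded service then only requires a new table entry, which is one way to read the claim that this approach "could reduce application development and implementation time".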
The sound acquisition module 630 may be configured to capture a voice input from the user. By means of the sound acquisition module 630, the robot could collect the voice input from the user, understand the meaning of the words and respond in an appropriate way. In addition, it is also possible to identify the user's emotion based on the voice input.
The data transmission module 640 may be configured to transmit at least one of the user related data, the face image of the user, or the voice input from the  user to an intelligence service platform system. These types of information can be further processed at the service platform system, for example to determine a suitable response to the user. The response receiving module 650 may be configured to receive a response or command from the intelligence service platform system.
The control module 660 may be configured to control the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system. For example, the control module 660 may control the robot to sing and/or dance to the user’s favorite tune when the user is in a blue mood, or to show a sad expression to the user in sympathy.
The intelligence service terminal 600 may further include a daily activity management module 670. The daily activity management module 670 may be configured to provide a daily scheduled activity plan and track the progress of daily scheduled activities, as described with reference to Fig. 3. The daily activity management module 670 may further provide a daily scheduled activity plan adapted by the service platform in accordance with at least one of the user’s health condition and mood.
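The plan adaptation described above can be illustrated with a minimal sketch. The activity names, condition labels, and adaptation rules here are invented for illustration; the disclosure only specifies that the plan is adapted to the user’s health condition and mood.

```python
def adapt_plan(plan, health, mood):
    """Return a copy of the daily plan adapted to health condition and mood.

    Hypothetical rules: drop strenuous activities when the user is tired,
    and schedule a mood-lifting activity first when the user is sad.
    """
    adapted = list(plan)
    if health == "tired":
        adapted = [a for a in adapted if a != "exercise"]
    if mood == "sad":
        adapted.insert(0, "favorite music")
    return adapted

base_plan = ["breakfast", "exercise", "quiz", "lunch"]
```

The daily activity management module could then present the adapted plan and track each item’s completion as the day progresses.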
Besides, the intelligence service terminal 600 may also include a connection module configured to enable the intelligence service terminal to connect with other terminal devices, such as other user terminals, body sensors, the private network, the mobile communication network, etc. In such a way, the intelligence service terminal can share information with other parties and alert a specified person in case of abnormal conditions.
The intelligence service terminal 600 could be an intelligence service robot. The control module 660 may be configured to control the intelligence service robot to sing or dance to the user’s favorite tune to boost the user’s mood.
Next, reference will be made to Figs. 7 to 12 to describe the intelligence service solution at the intelligence service platform system according to embodiments of the present disclosure.
Reference is made to Fig. 7 to describe an example intelligence service solution at the intelligence service platform system according to an embodiment of the present disclosure. As illustrated in Fig. 7, the service platform system 240 may include an access control module 241, a data server 242, and an analytical server 243.
The access control module 241 may be configured to perform access control on the access of terminals such as robots, or user terminals. For example, the access control module 241 validates the robot by means of its unique serial number and permits its access only if the validation is successful. It may also validate the information provided by the robot.
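The serial-number validation performed by the access control module 241 can be sketched as follows. The registry contents and serial-number format are hypothetical; the disclosure only states that the robot is validated by its unique serial number and admitted only on success.

```python
# Hypothetical registry of serial numbers known to the platform.
REGISTERED_SERIALS = {"NEC-ROBOT-0001", "NEC-ROBOT-0002"}

def validate_terminal(serial: str) -> bool:
    """Permit platform access only if the terminal's serial number is registered."""
    return serial in REGISTERED_SERIALS
```

In practice this check would be combined with validation of the information the robot submits, as noted above, before any data reaches the data server or analytical server.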
The analytical server 243 is configured to use a deep learning engine to predict the user’s intention or emotion based on the input information and provide a corresponding response or command. The human mind deep learning engine may be based on a human mind deep learning model. The human mind deep learning model describes a manner in which the brain encodes information; it can be trained based on individual related history data and further adjusted by the individual’s daily responses collected by the intelligence service terminal in use. The individual related history data for training the human mind deep learning model may include, for example, one or more of game playing information, program selection information, robot communication information, music playing information, activity history information, health data, medical information, social context personality profile, family configurations, lifestyle information, and personal tags.
Especially, the human mind deep learning engine may be further configured to identify the user’s intention or emotion from the received information and generate a corresponding response or command output. For example, when the robot 220 sends a sad face image or voice to the human mind deep learning engine, the deep learning engine could predict that the user is in a blue mood and generate a command to make the robot sing or dance to a tune that the user likes to boost the user’s mood. On the other hand, if the human mind deep learning engine determines that the user is happy, it could generate a command to the robot so that it could provide a corresponding light response together with a swing of the robot’s head from left to right to share the happiness of the user. Many similar cases can be imagined, and they will not be elaborated herein for simplification purposes.
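The final stage of the pipeline just described, translating a predicted emotion into a robot command, can be sketched as below. The engine itself is a trained model; only the output mapping is shown here, and the command fields and emotion labels are illustrative assumptions, not the actual protocol of the disclosure.

```python
def response_for_emotion(emotion: str) -> dict:
    """Translate a predicted user emotion into a hypothetical robot command."""
    if emotion == "sad":
        # Cheer the user up with a favorite tune, per the example above.
        return {"action": "sing_and_dance", "tune": "user_favorite"}
    if emotion == "happy":
        # Share the happiness with a light response and head swing.
        return {"action": "light_response", "head": "swing_left_right"}
    return {"action": "idle"}
```

Separating prediction from command generation in this way lets the same trained engine drive different robot models, each with its own command vocabulary.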
Additionally, the service platform system 240 could be further connected to external assistant service entities such as a public or private hospital 251, GP and care agencies 252, an online health provider 253, and community center and residential care facilities 254. From these external assistant service entities, it is possible to obtain information associated with the user, as well as professional recommendations or suggestions, and provide them to the user. The service platform system 240 may also be connected with the user’s family 255 to collect additional information from his/her family, send them messages, or alert them in an emergency.
At the intelligence service platform system, further services could be provided. Amongst others, functionalities regarding emotion identification will be described in detail with reference to Figs. 8A to 8D and Fig. 9.
Fig. 8A illustrates a robot service functionality in the intelligence service platform system according to an embodiment of the present disclosure. As illustrated in Fig. 8A, the robot service module 810 could receive service discovery and like/dislike engagement from the emotion engagement (E.E.) module 820 (Fig. 8B), and receive emotion intelligence parameters from an emotion interpretation (E.I.) module 840 (Fig. 8D). The robot service module 810 could send like/dislike engagement to the robot so that it could perform emotion engagement with the user. In response to the emotion intelligence parameters and the service discovery, the robot service module 810 may adapt the service and provide the service adaptation to the emotion engagement module 820.
The robot service module 810 could provide a robot navigation function 811. Thus, the robot could collect location information in terms of the navigation path, the relative configuration of one location with respect to another, environment parameters of each location, and so on. These location parameters could be provided to a social context module 812. The social context module 812 could obtain the social context of users on a daily basis in terms of their lifestyles, languages, age groups, disabilities, and locations (home, retirement village, nursing home, workplace, school, etc.). These social parameters could further be provided to a subjective experience module 813. The subjective experience module could collect information in terms of lifestyles, disabilities, psychological behavior profile, psychological needs, preventative care, proactive and reactive care needs, entertainment in groups, one to one interaction, and so on to improve the emotional wellbeing of the user. Respective needs could be provided to an interaction environment 814. The interaction environment 814 is built for the robot, which is a physically embodied machine instead of an avatar. The robot may include features like singing and dancing with head and body movement, multilingual voice vocalization and recognition, emotive expressions like blushing, emotional adaptation of expressions and dialog based on the emotional response or facial expressions of the human partner, eye blink detection, mental state estimation, etc. All these features provided by the interaction environment 814 could improve the communication between the robot and the user and facilitate building of a close relationship between the robot and the user.
Fig. 8B illustrates an emotion engagement functionality according to an embodiment of the present disclosure. As illustrated in Fig. 8B, in response to the service or service adaptation from the robot service (R.S.) module 810, the emotion engagement (E.E.) module 820 may determine suitable emotion engagement such as like/dislike engagement for the robot. Verbal and non-verbal parameters can be input into the human mind deep learning engine to train the classifier; the like/dislike engagement with the user can be predicted by the classifier, and misclassifications can be fed back to the human mind deep learning engine for further training of the classifier. The like/dislike engagement can be provided to the robot service module 810, and at the same time the service discovery could be provided to the robot service application too. Depending on the like/dislike engagement detected, the robot can automatically provide a service learnt from the user behavior stored in the deep learning engine. For example, if the user prefers to listen to music when she is happy, the robot will ask the user if she would like to listen to songs when a happy facial expression is detected; and if the user prefers to listen to a story when she is sad, the robot will ask if the user would like to listen to a story if sadness is detected on her face. The video from the robot is transmitted to the emotion tracking (E.T.) module 830; affect changes from the emotion tracking module 830 can be received, and the mode for emotion engagement can be adapted based on the affect changes.
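The preference-driven suggestion logic in the example above can be sketched as follows. The preference table stands in for behavior learnt by the deep learning engine; its contents and the phrasing of the prompt are illustrative assumptions.

```python
# Hypothetical learned preferences: detected expression -> preferred service.
PREFERENCES = {"happy": "music", "sad": "story"}

def suggest_service(expression: str):
    """Propose the service the user prefers in the detected mood, if any."""
    service = PREFERENCES.get(expression)
    if service is None:
        return None  # no learned preference for this expression
    return "Would you like to listen to a %s?" % service
```

In the full system, the user’s answer to the prompt would itself become new like/dislike engagement data fed back for further training.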
Fig. 8C illustrates an emotion tracking process according to an embodiment of the present disclosure. As illustrated in Fig. 8C, in response to video from the emotion engagement module 820, the emotion tracking module 830 may perform face detection and feature location (FD&FL). As illustrated in Fig. 9, first the face is detected from a frame in the video, and then feature points such as the eyes, mouth, nostrils, etc., can be located. The located feature points can be provided to the classifier to determine affect changes. Misclassifications can be fed back to the human mind deep learning engine for further training of the classifier. The detected changes can be provided to both the emotion engagement module 820 and the emotion interpretation module 840.
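The Fig. 8C pipeline can be sketched as a chain of stages: frame, face, feature points, affect label. Each stage below is a stub so the data flow is visible; a real system would use a vision library for detection and a trained classifier, and the single threshold rule here is purely illustrative.

```python
def detect_face(frame):
    """Stub detector: return the region labelled 'face' in the frame, if any."""
    return frame.get("face")

def locate_features(face):
    """Stub locator: extract eye/mouth/nostril feature values from the face region."""
    return {k: face[k] for k in ("eyes", "mouth", "nostrils") if k in face}

def classify_affect(features):
    """Stub classifier: a wide mouth opening maps to 'happy' (illustrative rule only)."""
    return "happy" if features.get("mouth", 0) > 0.5 else "neutral"

def track_emotion(frame):
    """Run the frame -> face -> features -> affect pipeline of Fig. 8C."""
    face = detect_face(frame)
    if face is None:
        return None
    return classify_affect(locate_features(face))
```

Misclassified frames would be routed back as training data for the classifier, as described above.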
Fig. 8D illustrates an emotion interpretation process according to an embodiment of the present disclosure. As illustrated in Fig. 8D, in response to the affect changes from the emotion tracking module 830, the emotion interpretation module 840 interprets the affect changes based on pattern correlation, frequency, duration, intensity, and regulation. “Pattern correlation” means the degree to which the affect change is associated with a predetermined affect pattern used as a benchmark (such as a happy pattern, sad pattern, angry pattern, etc.), and it is used to identify the basic emotion related to the affect change. “Frequency” means the frequency of affect changes; “duration” means the time duration of an affect change; and “intensity” means the intensity of an affect change, which could reflect the passion of the user. This information can be used to determine the degree of the affect pattern. “Regulation” means the corresponding action to an emotional experience.
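The frequency, duration, and intensity parameters above can be computed from a timeline of affect-change events, as in this sketch. The event structure (a dictionary per change with a duration and an intensity) is an assumption for illustration; pattern correlation and regulation would additionally require the benchmark patterns and are omitted here.

```python
def interpret_changes(events):
    """Summarize a sequence of affect-change events by the Fig. 8D parameters.

    frequency: number of changes observed;
    duration:  total time spent in changed affect;
    intensity: peak intensity, reflecting the passion of the user.
    """
    if not events:
        return {"frequency": 0, "duration": 0.0, "intensity": 0.0}
    return {
        "frequency": len(events),
        "duration": sum(e["duration"] for e in events),
        "intensity": max(e["intensity"] for e in events),
    }
```

These summary values could then be matched against the benchmark affect patterns to grade the degree of each pattern.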
In addition to functionalities regarding emotion identification, the analytical server 243 may further provide near real-time lifestyle analytics. Hereinafter, reference will be further made to Figs. 10A to 10G to describe several examples. However, it shall be understood that these examples are only given for illustrative purposes, and the present disclosure is not limited thereto. In practice, it is possible to provide more analytics or less analytics, or provide different kinds of analysis, and all of these modifications fall within the scope of the present disclosure.
As illustrated in Fig. 10A, the service platform system may use the service usage information of respective users to provide a daily service usage pattern of various users. By means of such a pattern, the usage information of various users can be learned. It may also provide a daily service usage pattern of a specific user, as illustrated in Fig. 10B. Such a usage pattern could reflect the service usage information of a specified user. The service platform system could also perform text analytics and provide an analytics result as illustrated in Fig. 10C, which could show the lifestyle of users or a single user in different forms. It is also possible to use the interaction information to provide analytics of interaction duration or frequency as illustrated in Figs. 10D and 10E. In addition, the service platform system may further use the non-verbal usage information to provide a report of non-verbal expression feedback as illustrated in Fig. 10F. It may also provide a comparison of services among specified days as illustrated in Fig. 10G. Such information can be reviewed by nurses, doctors, family members or even users themselves to have a general overview of the user’s usage conditions.
Fig. 11 illustrates a block diagram of the intelligence service platform system according to an embodiment of the present disclosure, which could implement the above-mentioned functionalities of the service platform. However, it shall be understood that these are only given for illustrative purposes, and the present disclosure is not limited thereto.
As illustrated in Fig. 11, the intelligence service platform system 1100 includes an input module 1101, a deep learning engine 1102, and an output module 1103. The input module 1101 may be configured to receive user related information collected by an intelligence service terminal. The user related information may include health data sensed by body sensors, face images captured by the camera of the robot, and the voice input from the user captured by the microphone of the robot. The deep learning engine 1102 may be based on a human mind deep learning model. As mentioned above, the human mind deep learning model describes a manner in which the brain encodes information, and it can be trained based on individual related history data and further adjusted by the individual’s daily responses collected by the intelligence service terminal in use. Especially, the human mind deep learning engine may be further configured to identify the user’s intention or emotion from the received information and generate a corresponding response or command output, so as to provide suitable feedback to individuals. The output module 1103 may be configured to provide the corresponding response or command output to the intelligence service terminal.
In an embodiment of the present disclosure, the information collected by the intelligence service terminal may include a face image of a user. The deep learning engine 1102 may be configured to interpret emotion of the user based on the face image and generate a corresponding response or command output to the intelligence service terminal.
In another embodiment of the present disclosure, the individual related history data may comprise one or more of: game playing information, program selection information, robot communication information, music playing information, activity history information, health data, medical information, social context personality profile, family configurations, lifestyle information, and personal tags.
In a further embodiment of the present disclosure, the deep learning engine 1102 may be further configured to generate a daily scheduled activity plan and adapt the scheduled activity plan based on at least one of the identified health condition or mood condition.
In a still further embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise an input validation module 1104. The input validation module 1104 may be configured to validate the received information collected by the intelligence service terminal and identify the intelligence service terminal collecting the received information.
In a yet further embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise an information analysis module 1105. The information analysis module 1105 may be configured to perform lifestyle analytics on user history data. For example, the information analysis module 1105 can be configured to provide various analysis functions as described with reference Figs. 10A to 10G.
In another embodiment of the present disclosure, the information analysis module 1105 may be further configured to acquire, from an external professional assistance service entity, assistive information on the user associated with the intelligence service terminal, and to provide a professional analysis result as at least part of the response or command output to the intelligence service terminal.
In a further embodiment of the present disclosure, the external professional assistance service entity may comprise at least one of medical school, public hospital, private hospital, mental health center, elderly health center, and rehabilitation center.
In a still further embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise a user recording module 1106 configured to record information on individual users. For example, the user recording module 1106 could record the account information of the user, the user service usage information, user health data, and user personal information collected during use such as voice input, captured face images, video, etc.
In a yet further embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise a content streaming module 1107 configured to stream contents between the user associated with the intelligence service terminal and another terminal device. Through the content streaming module 1107, the user could send a video captured by the robot to his/her friend, family member or doctor, to share his/her current health condition, feelings, emotions, or any other information.
In another embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise a video compression module 1108 configured to compress a video content before transmitting. By means of such a video compression module, a huge video could be compressed into an acceptable size for streaming.
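The compress-before-transmit step can be illustrated with a minimal sketch. A real video compression module 1108 would use a proper video codec rather than general-purpose `zlib`, which is used here only to show the size-reduction principle on a byte stream.

```python
import zlib

def compress_for_streaming(data: bytes, level: int = 6) -> bytes:
    """Compress raw content so it fits the streaming channel (illustrative codec)."""
    return zlib.compress(data, level)

def decompress_on_receive(data: bytes) -> bytes:
    """Recover the original content at the receiving terminal."""
    return zlib.decompress(data)
```

The compressed payload would then be handed to the content streaming module 1107 for delivery to the other terminal device.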
In a further embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise a short message service module 1109 configured to transmit a short message between the user associated with the intelligence service terminal and another terminal device. For example, a family member could send a short message through the SMS module 1109, which further forwards the SMS to the robot. The robot could present the short message to the user by means of its speaker.
In a still further embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise a call center module 1110. The call center module 1110 is configured to make a call to an emergency center or a specified person in response to detection of an emergency regarding the user associated with the intelligence service terminal. In case of an abnormal health condition, the call center module could make a call to an emergency center to ask for help, and/or call a specified person such as a nurse, a doctor, a family member, and so on to get help.
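The emergency trigger can be sketched as a threshold check on monitored vitals that produces the list of calls for the call center module 1110 to place. The safe range and contact handling below are invented for illustration; the disclosure does not specify thresholds.

```python
# Hypothetical safe range for heart rate, in beats per minute.
SAFE_HEART_RATE = (50, 120)

def check_and_alert(heart_rate: int, contacts):
    """Return the list of calls to place, or [] if the vitals are normal.

    On an abnormal reading, the emergency center is called first,
    followed by the user's specified contacts (nurse, doctor, family).
    """
    low, high = SAFE_HEART_RATE
    if low <= heart_rate <= high:
        return []
    return ["emergency_center"] + list(contacts)
```

A production system would debounce transient readings before dialing, so that a single spurious sensor value does not trigger an emergency call.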
In a yet further embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise a billing module 1111 configured to bill services provided to the user. The billing module 1111 could record the services provided to the user and generate a bill for the user. The user could also review the bill in real time.
In another embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise a feedback management module 1112 configured to receive and manage feedback from users. In addition, the feedback management module 1112 could also provide a public information sharing and exchange platform for users so that they could share and communicate their experiences. User feedback is valuable for platform improvement, and by means of real feedback from users, the platform developer could improve the user experience and provide a higher quality of service.
In a further embodiment of the present disclosure, the intelligence service platform system 1100 may further comprise a document management module 1113. The document management module 1113 may be configured to store and manage documents related to the intelligence service platform system.
In a still further embodiment of the present disclosure, the intelligence service platform system 1100 could be a cloud based system, which is built based on cloud technologies. The cloud infrastructure can be distributed in different areas, regions, and countries as long as it could provide the services as proposed herein.
For illustrative purposes, Fig. 12 illustrates an example implementation of the intelligence service platform system according to an embodiment of the present disclosure. However, it shall be understood that it is only given for illustrative purposes, and the present disclosure is not limited thereto.
As illustrated in Fig. 12, the intelligence service terminal 210 could be connected to the intelligence service platform 1200 through the Internet. The intelligence service platform 1200 may include an interface layer 1210 and an online platform 1220.
The interface layer 1210 provides interface of the platform to external devices or entities. For example, the intelligence service terminal 210 accesses the online platform 1220 via the interface layer 1210, and the online platform 1220 may also use the interface layer 1210 to exchange information with external assistant service entities like medical school, public hospital, private hospital, mental health center, elderly health center, and rehabilitation center, as well.
On the online platform 1220, there are provided an input validation module, a response output module, a call center module, a human mind deep learning engine, an information analysis module, a user recording system, a content streaming system, an SMS module, a billing system, a feedback management system, and a document management module. These modules or systems on the online platform 1220 are similar to those illustrated in Fig. 11 and thus will not be elaborated herein.
On the online platform 1220, there are further provided a failover subsystem 1221, an operation management subsystem 1222, a storage subsystem 1223, a server subsystem 1224, and a network virtualization subsystem 1225.
The failover subsystem 1221 may facilitate the building of a higher resilience system. It may employ ExpressClusterX, which is failover-software-based clustering, to build a higher resilience system. It could provide an automatic failover function for servers, and monitor hardware, operating system, application, and database failures.
The operation management subsystem 1222 is an integrated operations management software suite for management of the platform system. MasterScope can be used to implement the operation management subsystem 1222 to provide simple unified management. It may manage day-to-day operation information such as incident, problem, change and release information; enable automation of job scheduling, virtualization management, software distribution, and cloud management; and monitor server, network, and application performance, etc.
The storage subsystem 1223 could be used for storage and/or backup storage and archiving with highly compressed data. The storage subsystem 1223 could be implemented as HYDRAstor due to its high compression and data deduplication, good scaling-out performance without operation stops, remote replication feature, and write-once-read-many capability for regulation of long-term storage, as well as its encryption function.
The server subsystem 1224 can be implemented with the Express 5800 server, which has several lineups. The Express 5800 server is a general server for small to middle performance use; it has high availability and could provide hardware based failover within a unit. At the same time, it is also a high density server for cloud infrastructure, with a 42U rack accommodating 572 servers.
The network virtualization subsystem 1225 could be implemented as ProgrammableFlow, which is a software defined network solution for providing network virtualization. The network virtualization subsystem 1225 could provide centralized monitoring and easy configuration by separating the physical switches from their configuration; it thus requires no skilled network engineer thanks to the centralized GUI based configuration, and could provide a high security solution against cyber attacks through automatic virus detection and network separation.
Fig. 13 illustrates a method for providing intelligence services at the intelligence service terminal according to an embodiment of the present disclosure. As illustrated in Fig. 13, in method 1300, the intelligence service terminal may acquire user related information from body sensors associated with a user (step 1301), acquire a face image of the user (step 1302), and acquire a voice input from the user (step 1303). It can be understood that the operations in steps 1301 to 1303 can be performed in different orders, depending on real application cases.
In step 1304, the intelligence service terminal may transmit at least one of the user related data, the face image of the user, or the voice input from the user to an intelligence service platform system. In step 1305, the intelligence service terminal may receive a response or command from the intelligence service platform system. Further in step 1306, it may control the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system.
In step 1307, the intelligence service terminal may provide a daily scheduled activity plan and track the progress of daily scheduled activities. If the daily scheduled activity plan is adapted in accordance with at least one of the user’s health condition or mood condition, the intelligence service terminal may provide an adapted daily scheduled activity plan to the user in step 1308. Further in step 1309, it may control the intelligence service robot to sing or dance to the user’s favorite tune to boost the user’s mood. In addition, it is also possible to capture a Quick Response (QR) code in step 1310, wherein a corresponding application is executed in response to the captured QR code.
Fig. 14 illustrates a method for providing intelligence services at the intelligence service platform system according to an embodiment of the present disclosure.
As illustrated in Fig. 14, in method 1400, first in step 1401, the service platform system receives user related information collected by an intelligence service terminal. The user related information may include health data sensed by body sensors, face images captured by the camera of the robot, and voice input from the user captured by the microphone of the robot.
In step 1402, the service platform system may identify, by a deep learning engine, the user’s intention or emotion from the received information and generate a corresponding response or command output. The human mind deep learning engine is based on a human mind deep learning model which describes a manner in which the brain encodes information. The human mind deep learning model is trained based on individual related history data and further adjusted by the individual’s daily responses collected by the intelligence service terminal in use.
In step 1403, the service platform system may provide the corresponding response or command output to the intelligence service terminal.
In an embodiment of the present disclosure, the information collected by the intelligence service terminal may comprise a face image of a user. The method may further comprise interpreting, by the deep learning engine, emotion of the user based on the face image and generating a corresponding response or command output to the intelligence service terminal (step 1404) .
In an embodiment of the present disclosure, the individual related history data may comprise one or more of: game playing information, program selection information, robot communication information, music playing information, activity history information, health data, medical information, social context personality profile, family configurations, lifestyle information, and personal tags.
As illustrated in Fig. 14, the method 1400 may further comprise generating, by the deep learning engine, a daily scheduled activity plan and adapting the scheduled activity plan based on at least one of the identified health condition and mood in step 1405. Further in step 1406, the service platform system may validate the received user related information collected by the intelligence service terminal and identify the intelligence service terminal. In step 1407, the method 1400 may further comprise performing lifestyle analytics on user history data like those illustrated in Figs. 10A to 10G. The method 1400 may also comprise, in step 1408, acquiring, from an external professional assistance service entity, assistive information on the user associated with the intelligence service terminal, and providing a professional analysis result as at least part of the response or command output to the intelligence service terminal.
In an embodiment of the present disclosure, the external professional assistance service entity may comprise at least one of medical school, public hospital, private hospital, mental health center, elderly health center, and rehabilitation center.
Further as illustrated in Fig. 15, in step 1408, the service platform system may further perform at least one of:
● recording information on individual users;
● streaming contents between the intelligence service terminal and another terminal device;
● compressing a video content before transmitting;
● transmitting a short message between the intelligence service terminal and another terminal device;
● making a call to an emergency center or a family member in response to detection of emergency regarding the user associated with the intelligence service terminal;
● billing services provided to the user;
● receiving and managing feedback from users and providing a public information sharing and exchange platform for users so that they could share and communicate their experiences; and
● managing documents related to the intelligence service platform system.
Hereinbefore, methods 1300 and 1400 are described briefly with reference to Figs. 13 to 15. It can be noted that methods 1300 and 1400 may be configured to implement the functionalities described with reference to Figs. 1 to 12. Therefore, for details about the operations of various steps in these methods, one may refer to the descriptions made with respect to the modules of the intelligence service terminal and platform system with reference to Figs. 1 to 12.
In addition, it is further noted that components of the intelligence service terminal 600 and the intelligence service platform system 1100 may be embodied in hardware, software, firmware, and/or any combination thereof. For  example, the components of the intelligence service terminal 600 and the intelligence service platform system 1100 may be respectively implemented by a processor, a server or any other appropriate device.
Those skilled in the art will appreciate that the aforesaid examples are only for illustration, not limitation, and the present disclosure is not limited thereto; one can readily conceive many variations, additions, deletions and modifications from the teaching provided herein, and all these variations, additions, deletions and modifications fall within the protection scope of the present disclosure.
In addition, in some embodiments of the present disclosure, the intelligence service terminal 600 and the intelligence service platform system 1100 may include at least one processor. The at least one processor suitable for use with embodiments of the present disclosure may include, by way of example, both general-purpose and special-purpose processors already known or developed in the future. The intelligence service terminal 600 and the intelligence service platform system 1100 may further include at least one memory. The at least one memory may include, for example, semiconductor memory devices, e.g., RAM, ROM, EPROM, EEPROM, and flash memory devices. The at least one memory may be used to store a program of computer-executable instructions. The program can be written in any high-level and/or low-level compilable or interpretable programming language. In accordance with embodiments, the computer-executable instructions may be configured, with the at least one processor, to cause the intelligence service terminal 600 and the intelligence service platform system 1100 to perform at least the operations according to the methods discussed with reference to Figs. 1 to 5, 7 to 10, and 12, respectively.
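As one hedged illustration of the processor-and-memory embodiment described above (the payload format, the function names, and the simple threshold rule are assumptions made for the sketch, not the actual protocol or deep learning engine of the disclosure), a terminal-to-platform exchange could be sketched as:

```python
import json

def build_terminal_payload(sensor_data, face_image_id, voice_text):
    """Package terminal-side inputs (body-sensor readings, a reference to a
    captured face image, and transcribed voice input) for transmission to
    the platform system.  The field names are illustrative assumptions."""
    return json.dumps({
        "sensor_data": sensor_data,
        "face_image_id": face_image_id,
        "voice_text": voice_text,
    })

def handle_request(payload):
    """Platform-side stub: decode the payload and return a response or
    command for the terminal.  A real deployment would invoke the deep
    learning engine here to identify the user's intention or emotion;
    the heart-rate threshold below is a placeholder rule."""
    data = json.loads(payload)
    if data["sensor_data"].get("heart_rate", 0) > 120:
        return {"command": "alert", "reason": "elevated heart rate"}
    return {"command": "respond", "text": "How can I help you today?"}
```

In the sketched flow, the terminal would call `build_terminal_payload`, transmit the result, and act on the returned command, mirroring the transmit, receive, and control modules of the intelligence service terminal described earlier.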
In addition, the present disclosure may also provide a carrier containing the computer program as mentioned above, wherein the carrier is one of an electronic signal, an optical signal, a radio signal, or a computer readable storage medium. The computer readable storage medium can be, for example, an optical compact disk or an electronic memory device such as a RAM (random access memory), a ROM (read only memory), a flash memory, a magnetic tape, a CD-ROM, a DVD, a Blu-ray disc, and the like.
The techniques described herein may be implemented by various means, so that an apparatus implementing one or more functions of a corresponding apparatus described with an embodiment comprises not only prior art means, but also means for implementing the one or more functions of the corresponding apparatus described with the embodiment, and it may comprise separate means for each separate function, or means that may be configured to perform two or more functions. For example, these techniques may be implemented in hardware (one or more apparatuses), firmware (one or more apparatuses), software (one or more modules), or combinations thereof. For firmware or software, the implementation may be made through modules (e.g., procedures, functions, and so on) that perform the functions described herein.
Exemplary embodiments herein have been described above with reference to block diagrams and flowchart illustrations of methods and apparatuses. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by various means including computer program instructions. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any implementation or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular implementations. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.
It will be obvious to a person skilled in the art that, as the technology advances, the inventive concept can be implemented in various ways. The above described embodiments are given for describing rather than limiting the disclosure, and  it is to be understood that modifications and variations may be resorted to without departing from the spirit and scope of the disclosure as those skilled in the art readily understand. Such modifications and variations are considered to be within the scope of the disclosure and the appended claims. The protection scope of the disclosure is defined by the accompanying claims.

Claims (37)

  1. An intelligence service platform system, comprising:
    an input module configured to receive user related information collected by an intelligence service terminal;
    a deep learning engine based on a human mind deep learning model,
    wherein the human mind deep learning model describes a manner in which the brain encodes information,
    wherein the human mind deep learning model is trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use, and
    wherein the deep learning engine is further configured to identify the user's intention or emotion from the received user related information and generate a corresponding response or command output thereto; and
    an output module configured to provide the corresponding response or command output to the intelligence service terminal.
  2. The intelligence service platform system of Claim 1, wherein the information collected by the intelligence service terminal comprises a face image of a user, and the deep learning engine is configured to interpret emotion of the user based on the face image and generate a corresponding response or command output to the intelligence service terminal.
  3. The intelligence service platform system of Claim 1 or 2, wherein the individual related history data comprises one or more of: game playing information, program selection information, robot communication information, music playing information, activity history information, health data, medical information, social context personality profile, family configurations, life style information, and personal tags.
  4. The intelligence service platform system of any of Claims 1 to 3, wherein the deep learning engine is further configured to generate a daily scheduled activity plan and to adapt the scheduled activity plan based on at least one of an identified health condition or an identified mood condition.
  5. The intelligence service platform system of any of Claims 1 to 4, further comprising:
    an input validation module configured to validate the received user related information collected by the intelligence service terminal and identify the intelligence service terminal collecting the user related information.
  6. The intelligence service platform system of any of Claims 1 to 5, further comprising:
    an information analysis module configured to perform lifestyle analytics on user history data.
  7. The intelligence service platform system of any of Claims 1 to 6, further comprising:
    an information analysis module configured to acquire, from an external professional assistance service entity, assistive information on the user associated with the intelligence service terminal, and to provide a professional analysis result as at least part of the response or command output to the intelligence service terminal.
  8. The intelligence service platform system of Claim 7, wherein the external professional assistance service entity comprises at least one of a medical school, a public hospital, a private hospital, a mental health center, an elderly health center, and a rehabilitation center.
  9. The intelligence service platform system of any of Claims 1 to 8, further comprising:
    a user recording module configured to record information on individual users.
  10. The intelligence service platform system of any of Claims 1 to 9, further comprising:
    a content streaming module configured to stream contents between the intelligence service terminal and another terminal device.
  11. The intelligence service platform system of any of Claims 1 to 10, further comprising:
    a video compression module configured to compress video content before transmission.
  12. The intelligence service platform system of any of Claims 1 to 11, further comprising:
    a short message service module configured to transmit a short message between the intelligence service terminal and another terminal device.
  13. The intelligence service platform system of any of Claims 1 to 12, further comprising:
    a call center module configured to make a call to an emergency center or a specified person in response to detection of emergency regarding the user.
  14. The intelligence service platform system of any of Claims 1 to 13, further comprising:
    a billing module configured to bill services provided to the user.
  15. The intelligence service platform system of any of Claims 1 to 14, further comprising:
    a feedback management module configured to receive and manage feedback from users and to provide a public information sharing and exchange platform on which users can share and exchange their experiences.
  16. The intelligence service platform system of any of Claims 1 to 15, further comprising:
    a document management module configured to manage documents related to the intelligence service platform system.
  17. The intelligence service platform system of any of Claims 1 to 16, wherein the intelligence service platform system is a cloud based system.
  18. An intelligence service terminal, comprising:
    a user information acquisition module, configured to acquire user related information from a body sensor associated with a user;
    an image acquisition module, configured to capture a face image of the user;
    a sound acquisition module, configured to capture a voice input from the user;
    a data transmission module, configured to transmit at least one of the user related information, the user's face image, or the user's voice input to an intelligence service platform system;
    a response receiving module, configured to receive a response or command from the intelligence service platform system; and
    a control module, configured to control the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system.
  19. The intelligence service terminal of Claim 18, further comprising:
    a daily activity management module configured to provide a daily scheduled activity plan and track the progress of the daily scheduled activity plan.
  20. The intelligence service terminal of Claim 18 or 19, further comprising:
    a daily activity scheduling module configured to provide a daily scheduled activity plan adapted in accordance with at least one of the user's health condition or mood condition.
  21. The intelligence service terminal of any of Claims 18 to 20, wherein the intelligence service terminal is an intelligence service robot, and wherein the control module is configured to control the intelligence service robot to sing or dance to the user's favorite tune to boost the user's mood.
  22. The intelligence service terminal of any of Claims 18 to 21, further comprising:
    a connection module configured to enable the intelligence service terminal to connect with other terminal devices.
  23. The intelligence service terminal of any of Claims 18 to 22, wherein the image acquisition module is further configured to capture a Quick Response (QR) code, and wherein a corresponding application is executed in response to the captured QR code.
  24. A method for providing intelligence services at a service platform, comprising:
    receiving user related information collected by an intelligence service terminal;
    identifying, by a deep learning engine, the user's intention or emotion from the received information and generating a corresponding response or command output, wherein the deep learning engine is based on a human mind deep learning model, wherein the human mind deep learning model describes a manner in which the brain encodes information, and wherein the human mind deep learning model is trained based on individual related history data and further adjusted by the individual's daily responses collected by the intelligence service terminal in use; and
    providing the corresponding response or command output to the intelligence service terminal.
  25. The method of Claim 24, wherein the information collected by the intelligence service terminal comprises a face image of a user, the method further comprising:
    interpreting, by the deep learning engine, emotion of the user based on the face image, and
    generating a corresponding response or command output to the intelligence service terminal.
  26. The method of Claim 24 or 25, wherein the individual related history data comprises one or more of: game playing information, program selection information, robot communication information, music playing information, activity history information, health data, medical information, social context personality profile, family configurations, life style information, and personal tags.
  27. The method of any of Claims 24 to 26, further comprising:
    generating, by the deep learning engine, a daily scheduled activity plan and adapting the scheduled activity plan based on at least one of the identified health condition or mood condition.
  28. The method of any of Claims 24 to 27, further comprising:
    validating the received user related information collected by the intelligence service terminal and identifying the intelligence service terminal.
  29. The method of any of Claims 24 to 28, further comprising:
    performing lifestyle analytics on user history data.
  30. The method of any of Claims 24 to 28, further comprising:
    acquiring, from an external professional assistance service entity, assistive information on the user associated with the intelligence service terminal, and
    providing a professional analysis result as at least part of the response or command output to the intelligence service terminal.
  31. The method of Claim 30, wherein the external professional assistance service entity comprises at least one of a medical school, a public hospital, a private hospital, a mental health center, an elderly health center, and a rehabilitation center.
  32. The method of any of Claims 24 to 31, further comprising at least one of:
    recording information on individual users;
    streaming contents between the intelligence service terminal and another terminal device;
    compressing video content before transmission;
    transmitting a short message between the intelligence service terminal and another terminal device;
    making a call to an emergency center or a specified person in response to detection of emergency regarding the user associated with the intelligence service terminal;
    billing services provided to the user;
    receiving and managing feedback from users and providing a public information sharing and exchange platform on which users can share and exchange their experiences; and
    managing documents related to the intelligence service platform system.
  33. A method for providing intelligence services at an intelligence service terminal, comprising:
    acquiring user related information from a body sensor associated with a user;
    capturing a face image of the user;
    capturing a voice input from the user;
    transmitting at least one of the user related information, the face image of the user, or the voice input from the user to an intelligence service platform system;
    receiving a response or command from the intelligence service platform system; and
    controlling the intelligence service terminal to provide a response to the user based on the response or command from the intelligence service platform system.
  34. The method of claim 33, further comprising:
    providing a daily scheduled activity plan and tracking the progress of daily scheduled activities.
  35. The method of Claim 33 or 34, further comprising:
    providing a daily scheduled activity plan adapted in accordance with at least one of the user's health condition or mood condition.
  36. The method of any of Claims 33 to 35, wherein the intelligence service terminal is an intelligence service robot, and the method further comprising:
    controlling the intelligence service robot to sing or dance to the user's favorite tune to boost the user's mood.
  37. The method of any of Claims 33 to 36, further comprising:
    capturing a Quick Response (QR) Code, wherein a corresponding application is executed in response to the captured QR code.