WO2016168738A1 - System and methods for haptic learning platform - Google Patents

System and methods for haptic learning platform

Info

Publication number
WO2016168738A1
Authority
WO
WIPO (PCT)
Prior art keywords
haptic
user
sensors
learning
server
Prior art date
Application number
PCT/US2016/027943
Other languages
French (fr)
Inventor
Eric HAUTALA
Original Assignee
Declara, Inc.
Priority date
Filing date
Publication date
Application filed by Declara, Inc. filed Critical Declara, Inc.
Publication of WO2016168738A1 publication Critical patent/WO2016168738A1/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B23/30 Anatomical models
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 Electrically-operated teaching apparatus or devices working with questions and answers

Definitions

  • the methods, systems, media, and platforms disclosed herein may comprise a recommendation module.
  • the recommendation module may comprise a haptic learning engine configured to analyze the one or more haptic signals.
  • the haptic learning engine can comprise an artificial intelligence algorithm for analyzing the one or more haptic signals.
  • Analyzing the one or more haptic signals can comprise inferring a purpose of a motion of the user. In some cases, analyzing the one or more haptic signals comprises exploring a plurality of contents stored in the server application. In some implementations, analyzing the one or more haptic signals comprises identifying an interest of the user. In some systems, analyzing the one or more haptic signals can comprise identifying a role of the user. The aforementioned analyses can be arbitrarily combined as well.
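  • As an illustrative sketch only (the disclosure does not fix a particular algorithm), the analysis above can be approximated with simple heuristics over haptic motion samples; the class name, thresholds, and interest labels below are assumptions introduced for this example.

      from dataclasses import dataclass
      from statistics import mean, pstdev
      from typing import List

      @dataclass
      class HapticSample:
          timestamp: float   # seconds
          magnitude: float   # combined acceleration magnitude, in g

      def infer_motion_purpose(samples: List[HapticSample]) -> str:
          """Very coarse inference of the purpose of a motion from haptic samples.
          The thresholds are illustrative assumptions, not values from the disclosure."""
          if len(samples) < 2:
              return "unknown"
          mags = [s.magnitude for s in samples]
          avg, spread = mean(mags), pstdev(mags)
          if avg < 0.05:
              return "resting"            # little movement: the user is likely reading or listening
          if spread < 0.1:
              return "steady motion"      # smooth, repetitive movement such as rocking
          return "rapid motion"           # large, irregular movement

      def identify_interest(purpose: str) -> str:
          """Map an inferred purpose to a candidate learning interest (assumed mapping)."""
          return {
              "resting": "passive content (articles, audio)",
              "steady motion": "guided physical lessons (e.g., infant care)",
              "rapid motion": "active lessons (sports, drills)",
          }.get(purpose, "general content")

      if __name__ == "__main__":
          trace = [HapticSample(t * 0.1, 0.3 + 0.02 * (t % 3)) for t in range(50)]
          purpose = infer_motion_purpose(trace)
          print(purpose, "->", identify_interest(purpose))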
  • the methods, systems, media, and platforms disclosed herein may provide haptic learning with haptic input and output.
  • the haptic output to the user may comprise output configured to provide learning feedback to the user, and may provide an indication to the user as to how to improve performance, for example.
  • the platform may comprise a mobile device coupled to one or more haptic sensors.
  • the sensors may be embedded in the mobile device, or external to the mobile device, or both internal and external.
  • the sensors may detect and record many types of signals, such as optical, sound, voice, electromagnetic, acoustic, and mechanical signals.
  • the signals may represent one or more of the following activities of a user: viewing, reading, watching, moving, pointing, jumping, walking, talking, typing, touching, speaking, singing, crawling, and listening.
  • a person of ordinary skill can readily recognize that any motion of a portion of the human body can be included.
  • Recommending the one or more contents may be based on the analyzing the one or more haptic signals.
  • the recommendation can be made based on a user's profile. Alternatively or in combination, the recommendation can be made based on the user's past activities.
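  • As a hedged sketch of how such a recommendation could be scored, the snippet below ranks catalog entries by overlap with profile tags, past-activity tags, and the interest inferred from haptic signals; the weights, tag names, and catalog layout are assumptions.

      from typing import Dict, List

      def recommend_contents(profile_tags: List[str],
                             past_activity_tags: List[str],
                             inferred_interest: str,
                             catalog: Dict[str, List[str]],
                             top_n: int = 3) -> List[str]:
          scores: Dict[str, float] = {}
          for content_id, tags in catalog.items():
              score = 1.0 * len(set(tags) & set(profile_tags))         # match against the user profile
              score += 0.5 * len(set(tags) & set(past_activity_tags))  # match against past activities
              if inferred_interest in tags:                            # interest inferred from haptic signals
                  score += 2.0
              scores[content_id] = score
          return sorted(scores, key=scores.get, reverse=True)[:top_n]

      if __name__ == "__main__":
          catalog = {
              "lesson/new-mom-101": ["infant care", "guided physical lessons (e.g., infant care)"],
              "article/sleep-tips": ["infant care", "passive content (articles, audio)"],
              "video/cpr-basics": ["first aid"],
          }
          print(recommend_contents(profile_tags=["infant care"],
                                   past_activity_tags=["passive content (articles, audio)"],
                                   inferred_interest="guided physical lessons (e.g., infant care)",
                                   catalog=catalog))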
  • the platform can be used by a user for playing a role in a learning process.
  • the platform may comprise a lesson store.
  • the user may purchase a lesson from among a plurality of lessons.
  • the user can purchase a lesson or a part of a lesson.
  • a new mom may use the platform to learn how to take care of a baby. She can browse the lessons offered on the platform and select one of interest to her.
  • the lesson may comprise passive lectures delivered via computers, for example.
  • Alternatively or in combination, the lesson may ask the student to actively participate in an interactive lecture, where the student may use the sensing technology of the mobile device that she carries to detect motions, emotions, gestures, poses, and expressions.
  • the platform may comprise a recommendation engine 401 in a server which offers various lessons.
  • the platform may comprise a wearable device 402 on an actor or a user.
  • the user plays a role in the lesson; in this example, the role is a new mom 403.
  • Corresponding to another user (or object) is a baby 404, who wears one or more sensors.
  • the new mom can register the wearable device 402 with the platform.
  • she can browse the lesson store and purchase a lesson.
  • the lesson may be downloaded into the wearable device or another user device, for example.
  • the sensors are bound to the platform, the environment is calibrated, and scripting starts. When a lesson script executes, events are posted to the actor, inputs are received from the actor, and inputs are received from the sensors.
  • the sensors on the object, such as the baby, and on the new mom can then be unbound from the platform.
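  • A minimal sketch of this workflow, assuming hypothetical ScriptStep and Sensor types (the platform's real interfaces are not specified in the disclosure):

      from dataclasses import dataclass
      from typing import Callable, List

      @dataclass
      class ScriptStep:
          event: str       # cue posted to the actor, e.g. "hold the baby"
          expected: str    # expected sensor reading for this step (assumed encoding)

      @dataclass
      class Sensor:
          name: str
          bound: bool = False

          def bind(self) -> None:        # bind the sensor to the platform
              self.bound = True

          def calibrate(self) -> None:   # placeholder environment calibration
              pass

          def read(self) -> str:         # stubbed reading for this sketch
              return "ok"

          def unbind(self) -> None:
              self.bound = False

      def run_lesson(steps: List[ScriptStep], sensors: List[Sensor],
                     post_event: Callable[[str], None]) -> None:
          """Hypothetical lesson session: bind and calibrate the sensors, execute the
          lesson script (post events, collect inputs), then unbind the sensors."""
          for s in sensors:
              s.bind()
              s.calibrate()
          try:
              for step in steps:
                  post_event(step.event)                    # event posted to the actor
                  readings = [s.read() for s in sensors]    # inputs collected from the sensors
                  if step.expected not in readings:
                      post_event("adjust: " + step.event)   # simple corrective cue
          finally:
              for s in sensors:
                  s.unbind()                                # unbind when the lesson ends

      if __name__ == "__main__":
          run_lesson([ScriptStep("hold the baby", "ok"), ScriptStep("feed the baby", "ok")],
                     [Sensor("wrist"), Sensor("baby")], post_event=print)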
  • the one or more contents in the platform may comprise one or more of the following: one or more images, one or more video files, one or more audio files, one or more articles, one or more spreadsheets, and one or more RSS feeds.
  • the lessons stored in the platform may comprise a plurality of contents with one or more types.
  • the haptic learning may utilize any stationary or mobile computing devices with sensing capabilities.
  • Non-limiting examples include game consoles, wearable devices, smartphones, computers, laptops, desktops, servers, headphones, etc.
  • some devices may be dedicated to sensing only, and may comprise communication interfaces coupled with another computing device.
  • for example, a Microsoft Kinect connected to a computer, a monitor, or a TV forms a system with sensing capabilities.
  • the methods, systems, media, mobile devices and platforms as disclosed herein may comprise a mobile communication module.
  • the mobile communication module can allow a plurality of users to access the server for learning purposes. Furthermore, the mobile communication module may send push notifications of recommended contents to the users, for example.
  • a mobile device comprises a mobile access module to provide on-the-go access to a plurality of contents in the server, and/or one or more lessons in the server.
  • the mobile access module allows the mobile user to receive one or more push notifications from the server application.
  • the haptic learning platform may track the location of users and contents.
  • the platform may store geolocated information to visualize, on a map, where learning is happening in a way that is similar or accretive to the users. Being able to pose questions based on the geolocation of content and expertise may be helpful to the user.
  • the haptic learning platform's mobile technology can provide the capacity for collecting information related to learning. In many configurations, promoting an understanding of an expert graph is helpful to the success of the users of the platform.
  • FIG. 5 shows a computer system 501 that is suitable for implementing the haptic learning platform described herein.
  • the computer system 501 can execute various modules of the haptic learning platform as described herein.
  • the computer system 501 can be an electronic device of a user or a computer system that is remotely located with respect to the electronic device, and combinations thereof.
  • the electronic device can be a mobile electronic device, for example.
  • the computer system 501 comprises a central processing unit (CPU, also referred to herein interchangeably as “processor” and “computer processor”) 505, which can be a single core or multi core processor, or a plurality of processors for parallel processing.
  • the computer system 501 also comprises memory or memory location 510 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 515 (e.g., hard disk), communication interface 520 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 525, such as cache, other memory, data storage or electronic display adapters.
  • the memory 510, storage unit 515, interface 520 and peripheral devices 525 are in communication with the CPU 505 through a communication bus (solid lines), such as a motherboard.
  • the storage unit 515 can be a data storage unit (or data repository) for storing data.
  • the computer system 501 can be operatively coupled to a computer network ("network") 530 with the aid of the communication interface 520.
  • the network 530 can be the Internet, an internet or extranet, or an intranet and/or extranet that is in communication with the Internet.
  • the network 530 in some cases is a telecommunication and/or data network.
  • the network 530 can comprise one or more computer servers, which can enable distributed computing, such as cloud computing.
  • the network 530, in some cases with the aid of the computer system 501, can implement a peer-to-peer network, which may enable devices coupled to the computer system 501 to behave as a client or a server.
  • the CPU 505 can execute a sequence of machine-readable instructions, which can be embodied in a program or software.
  • the instructions may be stored in a memory location, such as the memory 510.
  • the instructions can be directed to the CPU 505, which can subsequently program or otherwise configure the CPU 505 to implement methods of the present disclosure. Examples of operations performed by the CPU 505 can include fetch, decode, execute, and writeback.
  • the CPU 505 can be part of a circuit, such as an integrated circuit.
  • One or more other components of the system 501 can be included in the circuit.
  • the circuit is an application specific integrated circuit (ASIC).
  • the storage unit 515 can store files, such as drivers, libraries and saved programs.
  • the storage unit 515 can store user data, e.g., user preferences and user programs.
  • the computer system 501 in some cases can include one or more additional data storage units that are external to the computer system 501, such as located on a remote server that is in communication with the computer system 501 through an intranet or the Internet.
  • the computer system 501 can communicate with one or more remote computer systems through the network 530.
  • the computer system 501 can communicate with a remote computer system of a user (e.g., smartphones, laptops, desktops, tablets, wearable computing devices, and other computing devices).
  • remote computer systems include personal computers (e.g., portable PC), slate or tablet PC's (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, Smart phones (e.g., Apple® iPhone, Android- enabled device, Blackberry®), or personal digital assistants.
  • the user can access the computer system 501 via the network 530.
  • Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the computer system 501, such as, for example, on the memory 510 or electronic storage unit 515.
  • the machine executable or machine readable code can be provided in the form of software.
  • the code can be executed by the processor 505.
  • the code can be retrieved from the storage unit 515 and stored on the memory 510 for ready access by the processor 505.
  • the electronic storage unit 515 can be precluded, and machine-executable instructions are stored on memory 510.
  • the code can be pre-compiled and configured for use with a machine having a processor adapted to execute the code, or can be compiled during runtime.
  • the code can be supplied in a programming language that can be selected to enable the code to execute in a pre-compiled or as-compiled fashion.
  • Various aspects of the technology may comprise “products” or “articles of manufacture” typically in the form of machine (or processor) executable code and/or associated data that is carried on or embodied in a type of machine readable medium.
  • Machine-executable code can be stored on an electronic storage unit, such as memory (e.g., read-only memory, random-access memory, flash memory) or a hard disk.
  • Storage type media can comprise any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming. All or portions of the software may at times be communicated through the Internet or various other telecommunication networks. Such communications, for example, may enable loading of the software from one computer or processor into another, for example, from a management server or host computer into the computer platform of an application server.
  • another type of media that may bear the software elements comprises optical, electrical and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links.
  • the physical elements that carry such waves, such as wired or wireless links, optical links or the like, also may be considered as media bearing the software.
  • terms such as computer or machine "readable medium” refer to any medium that participates in providing instructions to a processor for execution.
  • a machine readable medium, such as computer-executable code, may take many forms, including but not limited to a tangible storage medium, a carrier wave medium, or a physical transmission medium.
  • Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer(s) or the like, such as may be used to implement the databases, etc. shown in the drawings.
  • Volatile storage media include dynamic memory, such as main memory of such a computer platform.
  • Tangible transmission media include coaxial cables, copper wire, and fiber optics, including the wires that comprise a bus within a computer system.
  • Carrier-wave transmission media may take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications.
  • Common forms of computer-readable media therefore include, for example: a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a ROM, a PROM and EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer may read programming code and/or data.
  • Many of these forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
  • the computer system 501 can include or be in communication with an electronic display 535 that comprises a user interface (UI) 540 for, for example, monitoring signals, viewing analysis results, and viewing recommended contents.
  • Examples of UI's include, without limitation, a graphical user interface (GUI) and web-based user interface.
  • Methods and systems of the present disclosure can be implemented by way of one or more algorithms.
  • An algorithm can be implemented by way of software upon execution by the central processing unit 505.
  • the algorithm can be, for example, an artificial intelligence algorithm.
  • FIG. 6 shows an example of a haptic learning system.
  • a user 601 uses a haptic device 602 for the purpose of haptic learning.
  • the user 601 registers the haptic device 602 to the system.
  • the system comprises a haptic learning engine 603; once registered, the haptic device 602 is able to communicate with the haptic learning engine 603.
  • the user can use the haptic device to browse lessons recommended by the haptic learning engine 603.
  • the haptic learning engine delivers lessons to the haptic device 602.
  • the user follows the instructions or scripts of the lesson to learn a new skill, where the haptic output cues or audio or visual cues are presented by the haptic device to the user.
  • the haptic device 602 can be configured in many ways, and may comprise haptic input and output.
  • the haptic device 602 can be configured to receive user commands with input, and can be configured to provide user feedback with haptic output.
  • the user input may comprise touch input, for example.
  • the haptic output to the user may comprise one or more of a vibration, a spatially localized sensation such as an edge, a click, a tap, or a temperature, for example.
  • the haptic output can be configured with one or more of temporal or spatially localized patterns, for example spatially localized touch sensations such as Braille, or temporal patterns such as a number of vibrations or a frequency of vibrations, for example.
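  • One way such temporal cues could be represented is sketched below; the cue names, pulse counts, and frequencies are assumptions, and no real actuator API is used.

      from dataclasses import dataclass

      @dataclass(frozen=True)
      class VibrationPattern:
          """A temporal haptic cue: a number of pulses at a given repetition frequency."""
          pulses: int          # how many vibration bursts
          frequency_hz: float  # repetition rate of the bursts
          duration_s: float    # length of each burst

      # Assumed cue vocabulary; the disclosure only says cues can differ in count and frequency.
      CUES = {
          "slow_down": VibrationPattern(pulses=2, frequency_hz=1.0, duration_s=0.3),
          "speed_up":  VibrationPattern(pulses=4, frequency_hz=4.0, duration_s=0.1),
          "correct":   VibrationPattern(pulses=1, frequency_hz=1.0, duration_s=0.5),
      }

      def render(pattern: VibrationPattern) -> list:
          """Expand a pattern into (start_time, duration) pairs for a generic actuator driver."""
          period = 1.0 / pattern.frequency_hz
          return [(i * period, pattern.duration_s) for i in range(pattern.pulses)]

      if __name__ == "__main__":
          print(render(CUES["speed_up"]))   # [(0.0, 0.1), (0.25, 0.1), (0.5, 0.1), (0.75, 0.1)]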
  • the haptic device 602 may comprise one or more of many known prior haptic devices suitable for incorporation in accordance with examples disclosed herein.
  • the haptic device 602 may comprise one or more components of a smart watch, such as the Apple Watch commercially available from Apple.
  • The haptic device 602 may comprise one or more haptic components as described in U.S. App. Ser. No. 14/059693, filed October 22, 2013, entitled “Touch Surface for Simulating Materials”, published as US 20150109215 on April 23, 2015; and U.S. App. Ser. No. 12/504821, filed July 17, 2009, entitled “METHOD AND APPARATUS FOR LOCALIZATION OF HAPTIC FEEDBACK”, published as US 20110012717 on January 20, 2011; the entire disclosures of which are incorporated herein by reference.
  • the user follows the scripts and instructions, and takes some actions.
  • the sensor 611 collects the signals of the user during the actions.
  • the signals are sent to or generated with the haptic device 602.
  • the haptic device may use the signals to issue other scripts to ask the user to perform other actions or correct the previous actions.
  • the haptic device may send the signals to the haptic learning engine 603.
  • the engine can issue new events or lessons for the user with the haptic device 602.
  • a sensor 612 can be coupled to the haptic learning engine 603.
  • the sensor can be local to the user 601 or local to the haptic learning engine 603.
  • the signals from the sensor 612 can allow the haptic learning engine to deliver a new set of scripts or instructions to the haptic device 602 to help the user learn the lesson more effectively.
  • FIG. 7 shows an example of an advanced haptic learning system.
  • a user 701 uses a haptic device 702 for the purpose of haptic learning.
  • the haptic device 702 can be a wearable device.
  • the haptic device 702 comprises one or more sensors 703.
  • the user 701 registers the haptic device 702 to the system.
  • the haptic device 702 allows the user 701 to browse lessons available on the system.
  • the haptic learning engine 704 suggests lessons matching with the user's profile. In some applications, the lessons are recommended based on analyses of the user's past activities.
  • the lesson is downloaded to the haptic device 702.
  • the device is also coupled to one or more external sensors 710, 711 and 713.
  • the haptic device collects sensing signals from the sensors 710, 711 and 713.
  • a sensing signal may comprise one or more of the following: a location, a direction, a rotation angle, a rotation speed, a speed, an acceleration, an angular acceleration, a lifting angle, a pressure, a temperature, a concentration, a force, a torque, a stability, a balance, a cardiovascular signal, a respiration signal, a health signal, a biological signal, and others.
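  • A possible data model for such sensing signals is sketched below; the field names and units are assumptions, and only a subset of the listed signal types is shown.

      from dataclasses import dataclass
      from typing import Optional, Tuple

      @dataclass
      class SensingSignal:
          """One sample from a bound sensor. Only the fields relevant to a given
          sensor are populated; the field set here is an illustrative subset."""
          sensor_id: str
          timestamp: float                                # seconds since lesson start
          location: Optional[Tuple[float, float]] = None  # latitude, longitude
          acceleration: Optional[Tuple[float, float, float]] = None  # m/s^2, per axis
          rotation_speed: Optional[float] = None          # deg/s
          pressure: Optional[float] = None                # kPa
          temperature: Optional[float] = None             # degrees Celsius
          heart_rate: Optional[float] = None              # beats per minute

      if __name__ == "__main__":
          s = SensingSignal("wrist-imu", 12.5, acceleration=(0.0, 0.1, 9.8), heart_rate=72.0)
          print(s)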
  • the haptic device 702 uses the sensing signal(s) to monitor the user's participation in the lesson.
  • the haptic device 702 delivers lessons to the user 701.
  • the haptic device 702 can also send events to the user.
  • the haptic device 702 sends haptic and other cues regarding the lessons to the user.
  • the user takes action based on scripts and instructions of the lessons.
  • the user's responses can be monitored by the sensors 711, 712, and 713. Or, the responses can be input into the haptic device directly. In some cases, the sensing signals are sent by sensors 703 and 713 to the haptic learning engine 704.
  • the haptic learning engine 704 can communicate with the haptic device 702. The communication may allow the user to browse lessons or recommendations made from the engine 704. In some situations, the haptic device 702 sends the responses of the user 701 to the haptic learning engine 704.
  • FIG. 8 shows example actions of a haptic learning system.
  • the system comprises haptic device 802 or 803 wearable by a user 801.
  • the actions may include haptic cues 811.
  • the actions comprise haptic responses 812.
  • the actions comprise visual or audio signals, forming visual/audio cues 813.
  • the actions create standard responses 814.
  • the device 802 may be a specific device designed to sense particular signals.
  • a smartphone 803 with sensing capability can be used.
  • the system comprises haptic learning engine 804.
  • the system may comprise sensors 805.
  • the devices may handle events 821 from the haptic learning engine 804; the events 821 may be presented to the user 801.
  • the haptic device 802 or 803 may receive inputs 822 from the user 801, and deliver the inputs to the haptic learning engine 804.
  • the system comprises a haptic learning engine 804.
  • the engine comprises lesson information 831, such as lesson scripts.
  • the engine comprises input and output interfaces.
  • the input interface receives information delivered from the haptic device 802 or 803.
  • the input interface may receive sensor data 841 from sensors 805.
  • the output comprises an interface delivering the lesson information to the haptic device 802 or 803.
  • the system comprises sensors 805.
  • the sensors may be embedded in the haptic device 802 or 803. In some applications, the sensors are external to the haptic device, as independent devices.
  • the sensors can be coupled to the haptic device 802 or 803, or to the haptic learning engine 804.
  • the sensors 805 collect sensing signals and sensor data 841 and transmit them to the haptic device 802 or 803.
  • the haptic device 802 or 803 organizes the sensor data, and then presents further lesson instructions and scripts to the user 801 or requests more information from the haptic learning engine 804.
  • FIG. 9 shows an example of the system operation flow.
  • the user 901 wants to learn a specific skill or skills as described herein, for example. For instance, a new mom may want to learn skills of taking care of a new baby.
  • the user 901 registers her haptic device 902 or 903 to the haptic learning system.
  • the haptic device 902 or 903 can be any mobile device enabled with haptic functions.
  • the user can browse various lessons from a lesson store.
  • in a lesson store 921, the information of lessons (e.g., author, publishing information, brief description, targeted students) can be shown.
  • the lesson information may further show how to ingest, earn, and buy the lesson.
  • Each lesson can be created as a mobile application, and is downloadable to the haptic device once the user is granted access to the lesson; see step 912.
  • the lesson access can be free, or based on purchase, or based on membership programs, or based on a combination thereof.
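  • A minimal sketch of such an access check, assuming a hypothetical membership catalog keyed by program name (the actual access model is not detailed in the disclosure):

      from dataclasses import dataclass, field
      from typing import Dict, Set

      @dataclass
      class Lesson:
          lesson_id: str
          free: bool = False
          price: float = 0.0

      @dataclass
      class Account:
          purchased: Set[str] = field(default_factory=set)
          memberships: Set[str] = field(default_factory=set)

      def has_access(account: Account, lesson: Lesson,
                     member_catalog: Dict[str, Set[str]]) -> bool:
          """A lesson is accessible if it is free, was purchased, or is covered by
          one of the user's membership programs (an assumed access model)."""
          if lesson.free or lesson.lesson_id in account.purchased:
              return True
          return any(lesson.lesson_id in member_catalog.get(m, set())
                     for m in account.memberships)

      if __name__ == "__main__":
          catalog = {"parenting-plus": {"new-mom-101"}}
          user = Account(memberships={"parenting-plus"})
          print(has_access(user, Lesson("new-mom-101", price=4.99), catalog))   # True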
  • the haptic learning engine 904 may comprise searching capability.
  • the search may be based on artificial intelligence or machine learning, for example.
  • the intelligent search may find lessons relevant and suitable to the user 901.
  • once the user has access to a lesson (e.g., how to be a new mom), she can start the lesson; see step 913.
  • the lesson is downloaded onto the haptic device.
  • in steps 922 and 931, when a lesson starts, the system binds and calibrates sensors 905.
  • scripts of the lesson may then start to execute.
  • the script execution comprises posting events to the haptic device 902/903, and getting input from the haptic device.
  • the sensors collect sensor data and, in step 932, stream the data to the haptic device 902/903 or the learning engine 904.
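  • The calibration and streaming steps could look roughly like the sketch below, with random numbers standing in for real sensor readings; the function names and the baseline-correction approach are assumptions.

      import random
      from typing import Dict, Iterator

      def calibrate(sensor_id: str, samples: int = 20) -> float:
          """Estimate a per-sensor baseline during the calibration phase (assumed approach)."""
          readings = [random.gauss(0.0, 0.05) for _ in range(samples)]   # stand-in for real readings
          return sum(readings) / len(readings)

      def stream(sensor_id: str, baseline: float, n: int = 5) -> Iterator[Dict]:
          """Stream baseline-corrected samples toward the haptic device or the learning engine."""
          for seq in range(n):
              raw = random.gauss(0.2, 0.05)                              # stand-in for a live reading
              yield {"sensor": sensor_id, "seq": seq, "value": raw - baseline}

      if __name__ == "__main__":
          base = calibrate("baby-motion")
          for sample in stream("baby-motion", base):
              print(sample)                  # in the platform this would be routed onward (step 932)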
  • the new mom is learning how to feed a baby with milk.
  • the system comprises sensors 905 on a baby and on the mom.
  • the scripts guide the mom to prepare milk, hold a bottle, hold the baby, and other aspects of the lesson.
  • the guiding can be based on voice, sound, video, audio, images, or texts.
  • the guiding may comprise a sequence of movements the mom should perform with the baby.
  • the mom follows the haptic commands to experience the learning process and learns from the haptic output.
  • the sensors as described herein can be designed based on various technologies (e.g., hologram, actuators, etc.) to detect signals of temperature, touch, pulse, vibrations, micro motions, finger movements, or any kind of sensation, for example.
  • the signals can be 1D, 2D, 3D, or 4D.
  • the sensor data regarding the mom and the baby is collectively sent to the haptic device 902/903 or the learning engine 904, or both.
  • the haptic device can analyze the sensor data and provide feedback to the mom to correct her positions, for example. In some portions of the lessons, the sensor data are sent to the haptic learning engine 904 to download more information or lessons.
  • the haptic learning engine can analyze the sensor data and generate output.
  • the feedback provided by the haptic device or the learning engine may comprise corrections of the mom's actions (e.g., hold the baby in a correct position) or requests to perform following actions (e.g., pacify the baby when the baby stops taking milk).
  • the haptic device 902/903 and the sensors 905 may perform dynamic binding. For example, when some scripts concern pacifying the baby, video sensors can be bound to monitor the actions taken by the mom.
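  • A sketch of such dynamic binding is shown below; the mapping from script steps to sensor types is an assumed example, and the bind/unbind calls are placeholders for the platform's real interfaces.

      from typing import Dict, Set

      # Assumed mapping from script steps to the sensor types each step needs.
      STEP_SENSORS: Dict[str, Set[str]] = {
          "prepare milk": {"temperature"},
          "feed the baby": {"motion", "pressure"},
          "pacify the baby": {"video", "motion"},
      }

      def rebind(active: Set[str], step: str) -> Set[str]:
          """Bind the sensors the current step needs and unbind the ones it does not."""
          needed = STEP_SENSORS.get(step, set())
          for sensor in sorted(needed - active):
              print(f"bind {sensor}")      # placeholder for the platform's bind call
          for sensor in sorted(active - needed):
              print(f"unbind {sensor}")    # placeholder for the platform's unbind call
          return needed

      if __name__ == "__main__":
          active: Set[str] = set()
          for step in ["prepare milk", "feed the baby", "pacify the baby"]:
              active = rebind(active, step)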
  • for CPR training, the haptic output may comprise signals related to the administration of CPR, for example a low rate of vibrations to slow down or a fast rate to speed up, or a first spatial haptic cue to slow down and a second spatial haptic cue to speed up the CPR.
  • CPR includes a cardio component and a pulmonary component with haptic cues to one or more users.
  • Sensors as described herein can be provided on a model or volunteer, and the activity compared to an expert profile as described herein.
  • Haptic cardiac cues can be provided to a first user to slow down, speed up, or maintain the rate of cardiac resuscitation, for example.
  • Haptic pulmonary cues can be provided to a second user to slow down, speed up or maintain the rate of pulmonary resuscitation, for example.
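  • As an illustration of rate-based CPR cues, the sketch below compares a measured compression rate against a target range and picks a cue; the 100-120 per minute range is used here only as an assumed expert profile, and the cue names are assumptions.

      from typing import List

      def cpr_rate_cue(compression_timestamps: List[float],
                       target_low: float = 100.0, target_high: float = 120.0) -> str:
          """Compare the measured compression rate (per minute) against an assumed
          expert range and return the haptic cue to send to the user."""
          if len(compression_timestamps) < 2:
              return "none"
          intervals = [b - a for a, b in zip(compression_timestamps, compression_timestamps[1:])]
          rate = 60.0 / (sum(intervals) / len(intervals))    # compressions per minute
          if rate < target_low:
              return "speed_up"      # e.g., a fast vibration pattern
          if rate > target_high:
              return "slow_down"     # e.g., a slow vibration pattern
          return "maintain"

      if __name__ == "__main__":
          # Compressions roughly every 0.75 s, i.e. about 80 per minute: cue the user to speed up.
          print(cpr_rate_cue([i * 0.75 for i in range(10)]))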
  • sensors coupled to the user can be configured to measure one or more of user breathing or heart rate.
  • the user can be instructed with haptic feedback to lower the heart rate to an amount corresponding to an expert profile.
  • the user can be instructed with haptic feedback to control breathing and other actions such as pulling the trigger in accordance with an expert profile.
  • the lesson may comprise a lesson for one or more of many applications as described herein, for example to control breathing and lower a heart rate to deal with stress.
  • the user 901 follows the scripts and performs actions. The system is highly interactive, and the user is guided in response to specific signals and sensor data.
  • the haptic device 902/903 and the learning engine 904 process and analyze the sensor data. Then the user can receive feedback from the haptic device 902/903 or the sensors 905.
  • the haptic learning system is highly multitasking, so the haptic device 902/903 or the learning engine 904 may comprise multitasking software on processors to handle queues and signals.
  • the haptic learning system comprises communications among the haptic device 902/903, the haptic learning engine 904, and the sensors 905. Therefore, the system comprises a protocol to schedule and organize the data routing and wireless/wired communications.
  • when the haptic device 902/903 coordinates sensors, the device may request the sensors 905 to acquire specific signals. Alternatively or in combination, the device may receive the signals from the sensors 905.
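  • One possible scheduling arrangement uses a shared queue with a routing worker, as sketched below; the message format and destination names are assumptions and do not reflect a protocol defined in the disclosure.

      import queue
      import threading
      from typing import Callable, Dict

      def route(messages: queue.Queue, handlers: Dict[str, Callable[[dict], None]],
                stop: threading.Event) -> None:
          """Drain a shared message queue and dispatch each message to the handler
          registered for its destination ("device", "engine", or a sensor id)."""
          while not stop.is_set() or not messages.empty():
              try:
                  msg = messages.get(timeout=0.1)
              except queue.Empty:
                  continue
              handlers.get(msg["to"], lambda m: None)(msg)   # route by destination
              messages.task_done()

      if __name__ == "__main__":
          q: queue.Queue = queue.Queue()
          stop = threading.Event()
          handlers = {"device": lambda m: print("device <-", m["body"]),
                      "engine": lambda m: print("engine <-", m["body"])}
          worker = threading.Thread(target=route, args=(q, handlers, stop))
          worker.start()
          q.put({"to": "device", "body": "haptic cue: slow_down"})
          q.put({"to": "engine", "body": "sensor data batch"})
          q.join()          # wait until both messages are handled
          stop.set()
          worker.join()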
  • the haptic learning system can be configured in many ways to provide feedback to the user.
  • the downloaded lesson may comprise one or more expert profiles or expert graphs for comparison with user data to determine user performance and provide haptic feedback to the user.
  • the expert profile can be determined in many ways, for example from input sensor and haptic data profiles of experts or combinations of profiles of experts identified based on influence such as crowd sourcing.
  • the input data from the one or more sensors, such as the haptic sensor, can be compared with one or more of the expert profile or the expert graph.
  • Haptic output can be fed back to the user in response to the comparison in order to direct the user to act in accordance with the expert profile or graph, or both.
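  • A hedged sketch of both halves, building an influence-weighted expert profile and comparing a user trace against it sample by sample; the weighting scheme, tolerance, and cue names are assumptions.

      from typing import List, Tuple

      def build_expert_profile(expert_traces: List[Tuple[float, List[float]]]) -> List[float]:
          """Combine expert traces into one profile, weighting each expert by an influence
          score (e.g., derived from crowd sourcing). All traces must share a length."""
          total = sum(weight for weight, _ in expert_traces)
          length = len(expert_traces[0][1])
          return [sum(weight * trace[i] for weight, trace in expert_traces) / total
                  for i in range(length)]

      def haptic_feedback(user_trace: List[float], profile: List[float],
                          tolerance: float = 0.1) -> List[str]:
          """Per-sample comparison of the user trace with the expert profile,
          returning the haptic cue to emit at each point (assumed cue names)."""
          cues = []
          for user_value, expert_value in zip(user_trace, profile):
              if user_value > expert_value + tolerance:
                  cues.append("ease_off")
              elif user_value < expert_value - tolerance:
                  cues.append("increase")
              else:
                  cues.append("maintain")
          return cues

      if __name__ == "__main__":
          experts = [(0.7, [1.0, 1.2, 1.1]), (0.3, [0.8, 1.0, 1.3])]
          print(haptic_feedback([1.2, 0.9, 1.1], build_expert_profile(experts)))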
  • the system described herein can be used in a variety of situations and conditions.
  • Non-limiting examples include schools, universities, training, simulation, active learning, orientations, factories, sale skills learning, maintenance learning, negotiation learning, research, development, labs, and others as described herein, for example.
  • Each of the examples as described herein can be combined with one or more other examples. Further, one or more components of one or more examples can be combined with one or more components of other examples.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • Multimedia (AREA)
  • Medicinal Chemistry (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Algebra (AREA)
  • Mathematical Analysis (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Chemical & Material Sciences (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

A haptic learning based platform is provided to enhance the efficiency and effectiveness of digital collaborations. The platform may comprise a server including a server processor and a server operating system configured to provide a user with a server application. The platform may further comprise a mobile device comprising a mobile processor and one or more haptic sensors configured to provide the user with a mobile application.

Description

SYSTEM AND METHODS FOR HAPTIC LEARNING PLATFORM
CROSS REFERENCE
[0001] The present PCT patent application claims priority to the U.S. Provisional Patent Application Serial No. 62/149,458, filed on April 17, 2015, entitled "SYSTEM AND METHODS FOR HAPTIC LEARNING PLATFORM," attorney docket no. 46429-709.101; the entire disclosure of which is incorporated herein by reference.
BACKGROUND OF THE INVENTION
[0002] People need to learn. In the age without computers, learning was often based on oral communication, for example between teachers and students, or between parents and children. The learning materials often comprised books and audio cassettes. While learning methods have become more diversified through the evolution of technology, many learning methods generally remain passive. To achieve better learning efficiency and effectiveness, active learning based on modern technology would be desirable.
[0003] Prior methods and apparatus for teaching learning with computers can be less than ideal in at least some respects. For example, although computers can be used to provide electronic books and other formats of presenting material, the information presented can be less interactive than would be ideal. Also, the prior methods and apparatus can provide less than ideal feedback to a student user as to how the student can improve. For example, many manuals and online materials may not provide feedback to the user such as biofeedback or other measurements of the student. Although prior devices have been proposed to measure feedback with patients, prior methods and apparatus for measuring patients can be less than ideally suited for teaching learning in at least some respects. For example, prior medical apparatus can be more complex than would be ideal, and may be configured with a closed architecture that is less adaptable to relevant user metrics.
SUMMARY OF THE INVENTION
[0004] Although examples as disclosed herein are directed to a learning platform with sensor inputs such as a haptic device, the technologies disclosed herein can have applications in many fields, such as social networking, search engines, multimedia technologies and crowd sourcing. The systems and methods described herein are suitable for use with one or more individuals of a population, and can be combined in many ways, for example with crowd sourced learning for many users with haptic devices. [0005] The methods and apparatus disclosed herein can be configured to measure one or more sensor signals from a user, in order to provide feedback to the user as to how to improve. The feedback to the user can be provided while the user engages in the learning activity, such that the user can receive input and adjust activities during the lesson to improve the user's learning experience. A sensor can detect various types of signals such as one or more haptic signals, user signals, or environmental signals. The sensor data, which may comprise communication signals, can be received and combined with a processor in order to provide an assessment as to how the user is performing and provide suggestions as to how the user can modify behavior in order to improve performance. The feedback to the user can be provided in response to a comparison with an expert learning profile or graph. The methods, apparatus, media, systems, and platforms can be configured in many ways to utilize haptic input and output functions to improve learning. The user can select a lesson among a plurality of lessons, and the system can be configured with instructions of the selected lesson. One or more other sensors can be combined with the haptic input and output in order to provide improved haptic feedback to the user. The methods and apparatus disclosed herein may comprise an open system, in which the haptic device and optionally one or more sensors are dynamically bound to the processor in order to receive user data and provide feedback to the user. The processor can be a processor of a local device or a processor of a remote system, or both. The instructions can be configured to identify and bind one or more local devices or sensors appropriate for the user-selected lesson. The open system has the advantage of allowing the use of devices readily available to the user, such that the lessons and feedback can be readily provided to many users.
[0006] In a first aspect, a haptic learning platform comprises: (a) a server including a server processor and a server operating system configured to provide a user with a server application comprising: (1) a recommendation module configured to recommend one or more contents to the user, and (2) a database configured to store a plurality of lessons; and (b) a mobile device including a mobile processor and one or more haptic sensors configured to provide the user with a mobile application, the mobile application comprising: (1) a haptic analysis module configured to receive one or more haptic signals of the user from the one or more haptic sensors; and (2) a mobile communication module configured to transmit the one or more haptic signals to the server.
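As a structural sketch only, this first aspect can be pictured with the minimal Python types below; the class names and placeholder behaviors are assumptions introduced for illustration and do not correspond to any implementation described in this disclosure.

      from dataclasses import dataclass, field
      from typing import Callable, Dict, List

      @dataclass
      class RecommendationModule:
          recommend: Callable[[str], List[str]]            # user_id -> recommended content ids

      @dataclass
      class LessonDatabase:
          lessons: Dict[str, dict] = field(default_factory=dict)

      @dataclass
      class ServerApplication:                             # provided by the server processor/OS
          recommendation: RecommendationModule
          database: LessonDatabase

      @dataclass
      class HapticAnalysisModule:
          def receive(self, signals: List[float]) -> List[float]:
              return signals                               # placeholder for on-device analysis

      @dataclass
      class MobileCommunicationModule:
          def transmit(self, signals: List[float], server: ServerApplication) -> None:
              print(f"sending {len(signals)} haptic samples to the server")   # stand-in for a network call

      @dataclass
      class MobileApplication:                             # provided by the mobile processor
          analysis: HapticAnalysisModule
          communication: MobileCommunicationModule

      if __name__ == "__main__":
          server = ServerApplication(RecommendationModule(lambda uid: ["new-mom-101"]),
                                     LessonDatabase({"new-mom-101": {"title": "Caring for a newborn"}}))
          mobile = MobileApplication(HapticAnalysisModule(), MobileCommunicationModule())
          samples = mobile.analysis.receive([0.1, 0.2, 0.3])
          mobile.communication.transmit(samples, server)
          print(server.recommendation.recommend("user-1"))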
[0007] In some cases, the recommendation module comprises a haptic learning engine configured to analyze the one or more haptic signals. The haptic learning engine may comprise an artificial intelligence algorithm for analyzing the one or more haptic signals. Analyzing the one or more haptic signals may comprise inferring a purpose of a motion of the user. Analyzing the one or more haptic signals can comprise exploring a plurality of contents stored in the server application. In some applications, analyzing the one or more haptic signals comprises identifying an interest of the user. Analyzing the one or more haptic signals can comprise identifying a role of the user.
[0008] Recommending the one or more contents may be based on analyzing the one or more haptic signals. The user may play a role in a learning process. The user may purchase a lesson from the plurality of lessons. The one or more contents can comprise one or more of the following: one or more images, one or more video files, one or more audio files, one or more articles, one or more spreadsheets, and one or more RSS feeds. The plurality of lessons may comprise a plurality of contents. The one or more haptic signals may comprise one or more of the following activities: viewing, reading, watching, and listening.
[0009] In another aspect, disclosed herein are non-transitory computer-readable storage media encoded with a computer program including instructions executable by one or more digital signal processors to provide a user with a server application and a mobile application for enhancing haptic learning. The server application comprises: (a) a recommendation module configured to recommend one or more contents to the user; and (b) a database configured to store a plurality of lessons. The mobile application comprises: (a) a haptic analysis module configured to receive one or more haptic signals of the user from one or more haptic sensors; and (b) a mobile communication module configured to transmit the one or more haptic signals to the server application.
[0010] In some cases, the recommendation module comprises a haptic learning engine configured to analyze the one or more haptic signals. The haptic learning engine may comprise an artificial intelligence algorithm for analyzing the one or more haptic signals. Analyzing the one or more haptic signals may comprise inferring a purpose of a motion of the user. Analyzing the one or more haptic signals can comprise exploring a plurality of contents stored in the server application. In some applications, analyzing the one or more haptic signals comprises identifying an interest of the user. Analyzing the one or more haptic signals can comprise identifying a role of the user.
[0011] Recommending the one or more contents may be based on analyzing the one or more haptic signals. The user may play a role in a learning process. The user may purchase a lesson from the plurality of lessons. The one or more contents can comprise one or more of the following: one or more images, one or more video files, one or more audio files, one or more articles, one or more spreadsheets, and one or more RSS feeds. The plurality of lessons may comprise a plurality of contents. The one or more haptic signals may comprise one or more of the following activities: viewing, reading, watching, and listening.
[0012] In another aspect, disclosed is a server comprising a server processor and a server operating system configured to provide a user with a server application, the server application comprising a haptic learning engine for learning purposes.
[0013] In another aspect, disclosed is a mobile device comprising a mobile processor and one or more haptic sensors configured to provide a user with a mobile application, the mobile application comprising a mobile communication module configured to transmit one or more haptic signals to a server.
[0014] In another aspect, disclosed is a mobile device comprising a mobile processor and one or more haptic sensors configured to provide a user with a mobile application, the mobile application comprising a haptic analysis module configured to receive one or more haptic signals of a user from one or more haptic sensors.
[0015] In another aspect, disclosed is a method for performing haptic learning by a server and a mobile device, the method comprising: (a) recommending, by the server, one or more contents to a user; (b) storing, by the server, a database of a plurality of lessons; (c) receiving, by the mobile device, one or more haptic signals of the user from one or more haptic sensors; and (d) transmitting, by the mobile device, the one or more haptic signals to a server.
[0016] In another aspect, disclosed is an apparatus used by a user for haptic learning, the apparatus comprising: (a) a recommendation module configured to recommend one or more contents to the user; (b) a haptic analysis module configured to receive one or more haptic signals of the user from one or more haptic sensors; and (c) a communication module configured to transmit the one or more haptic signals to a server.
[0017] Aspects and advantages of the present disclosure will become readily apparent to those skilled in this art from the following detailed description, wherein only illustrative embodiments of the present disclosure are shown and described. As will be realized, the present disclosure is capable of other and different embodiments, and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
INCORPORATION BY REFERENCE
[0018] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
[0019] The technology described herein incorporates patent applications Nos. 62/088,514, 62/111,063, 62/111,626, 62/112,144, 62/112,631, 62/113,319, 62/111,056, 62/111,523, 62/112,139, 62/112,633, 62/113,323, and 62/149,458.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings (also "FIG." and "FIGs." herein), of which:
[0021] FIG. 1 depicts an example system of haptic learning.
[0022] FIG. 2 depicts an example of a recommendation engine comprising a haptic learning engine.
[0023] FIG. 3 depicts an example lesson based on haptic learning.
[0024] FIG. 4 depicts an example system of haptic learning with various functions.
[0025] FIG. 5 depicts an example computing architecture for haptic learning.
[0026] FIG. 6 depicts an example of a haptic learning system.
[0027] FIG. 7 depicts an example of an advanced haptic learning system.
[0028] FIG. 8 depicts example actions of a haptic learning system.
[0029] FIG. 9 depicts an example operation flow of a haptic learning system.
DETAILED DESCRIPTION OF THE INVENTION
[0030] While various embodiments of the invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions may occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed. It shall be
understood that different aspects of the invention can be appreciated individually, collectively, or in combination with each other.
[0031] The methods and apparatus disclosed herein are well suited for use with many types of learning in many environments. Although specific reference is made to teaching a new mother, the methods and apparatus will find use with many learning applications, such as learning related to relaxing, health, yoga, coping with being blind, sleep apnea, autism, diaries, meditation, sports, martial arts, marksmanship, police, fire and military training, for example. The methods and apparatus disclosed herein are well suited for combination with devices that may already be available to a user, such as smart watches such as the Apple Watch, haptics, smart phones, tablet personal computers, notebook computers, iPods™, iPads™, smart thermostats and in-home sensors, for example.
Haptic learning platform
[0032] The technology described herein comprises a haptic learning platform. The platform may comprise a server including a server processor and a server operating system configured to provide a user with a server application. The server application may comprise: (a) a recommendation module configured to recommend one or more contents to the user; and (b) a database configured to store a plurality of lessons.
[0033] The platform may comprise a mobile device including a mobile processor and one or more haptic sensors configured to provide the user with a mobile application. The mobile application may comprise: (a) a haptic analysis module configured to receive one or more haptic signals of the user from the one or more haptic sensors; and (b) a mobile
communication module configured to transmit the one or more haptic signals to the server.
[0034] FIG. 1 illustrates an example of a haptic learning platform. The platform comprises a recommendation engine comprising a haptic learning engine. The platform also comprises a wearable device; in this example, a smart watch with sensing capability is shown. The users of this platform play the roles of a new mom and a baby. The mom may have little knowledge of raising children, so she can use the platform to learn a child-raising lesson. The baby could be a real baby, a simulated baby rendered on a computer display, or a baby toy. One or more sensors may be attached to the baby. The lesson can teach the new mom various concepts. In addition, the platform allows the mom to learn how to hold the baby, feed the baby, and so on.
Recommendation engine
[0035] The methods, systems, media, and platforms disclosed herein may comprise a recommendation module. Referring to FIG. 2, the recommendation module may comprise a haptic learning engine configured to analyze the one or more haptic signals. The haptic learning engine can comprise an artificial intelligence algorithm for analyzing the one or more haptic signals.
[0036] Analyzing the one or more haptic signals can comprise inferring a purpose of a motion of the user. In some cases, analyzing the one or more haptic signals comprises exploring a plurality of contents stored in the server application. In some implementations, analyzing the one or more haptic signals comprises identifying an interest of the user. In some systems, analyzing the one or more haptic signals can comprise identifying a role of the user. The aforementioned analyses can be arbitrarily combined as well.
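The kind of analysis described above can be illustrated with a small, hypothetical sketch. The snippet below shows one simple way a haptic learning engine might classify a window of haptic samples into an inferred activity and map that inference to recommended contents; the class and function names, the thresholds, and the content index are assumptions introduced for illustration only, not the disclosed implementation.

```python
from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class HapticSample:
    """One reading from a haptic sensor (hypothetical representation)."""
    timestamp: float   # seconds
    magnitude: float   # normalized motion/pressure intensity, 0..1

def infer_activity(samples: List[HapticSample]) -> str:
    """Very rough rule-based inference of the purpose of the user's motion.

    Thresholds are illustrative assumptions, not values from the disclosure.
    """
    if not samples:
        return "idle"
    avg = mean(s.magnitude for s in samples)
    if avg < 0.1:
        return "reading_or_viewing"   # little motion: likely passive consumption
    if avg < 0.5:
        return "gesturing"            # moderate motion: pointing, touching
    return "active_motion"            # large motion: walking, exercising

# A hypothetical mapping from inferred activity to content recommendations.
CONTENT_INDEX = {
    "reading_or_viewing": ["article: infant sleep basics", "video: swaddling"],
    "gesturing": ["interactive lesson: holding a baby"],
    "active_motion": ["lesson: calming a crying baby while walking"],
    "idle": ["push notification: resume your last lesson"],
}

def recommend(samples: List[HapticSample]) -> List[str]:
    """Recommend contents based on the analysis of the haptic signals."""
    return CONTENT_INDEX[infer_activity(samples)]

if __name__ == "__main__":
    window = [HapticSample(t * 0.1, 0.3) for t in range(20)]
    print(infer_activity(window), recommend(window))
```

A real engine could replace the rule-based classifier with a trained model while keeping the same interface between analysis and recommendation.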
Haptic learning
[0037] The methods, systems, media, and platforms disclosed herein may provide haptic learning with haptic input and output. The haptic output to the user may comprise output configured to provide learning feedback to the user, and may provide an indication to the user as to how to improve performance, for example. The platform may comprise a mobile device coupled to one or more haptic sensors. The sensors may be embedded in the mobile device, external to the mobile device, or both internal and external. The sensors may detect and record any type of signal, such as optical, sound, voice, electromagnetic, acoustic, or mechanical signals. The signals may represent one or more of the following activities of a user, for instance: viewing, reading, watching, moving, pointing, jumping, walking, talking, typing, touching, speaking, singing, crawling, and listening. A person of ordinary skill will readily recognize that any motion of a portion of the human body can be included.
[0038] Recommending the one or more contents may be based on the analysis of the one or more haptic signals. In some applications, the recommendation is made based on a user's profile. Alternatively or in combination, the recommendation can be made based on past activities.
[0039] The platform can be used by a user for playing a role in a learning process. The platform may comprise a lesson store. The user may purchase a lesson from among a plurality of lessons.
[0040] Referring to FIG. 3, the user can purchase a lesson or a part of a lesson. In this example, a new mom may use the platform to learn how to take care of a baby. She can browse the lessons offered on the platform and select one of her interest. The lesson may comprise passive lectures delivered via computers, for example. Alternatively or in
combination, the lesson may ask the student to actively participate in an interactive lecture, where the student may use sensing technology possessed by the mobile device that she carries to detect motions, emotions, gestures, poses, and expressions.
[0041] Referring to FIG. 4, the platform may comprise a recommendation engine 401 in a server which offers various lessons. The platform may comprise a wearable device 402 on an actor or a user. The user plays a role in the lesson; in this example, the role is a new mom 403. Corresponding to another user (or object) is a baby 404, who wears one or more sensors. The new mom can register the wearable device 402 with the platform. Then, she can browse the lesson store and purchase a lesson. The lesson may be downloaded onto the wearable device or another user device, for example. To begin a lesson, the sensors are bound to the platform, the environment is calibrated, and the lesson script starts. When a lesson script executes, events are posted to the actor, inputs are posted from the actor, and inputs are posted from the sensors. When the lesson ends, the sensors on the object (such as the baby) and on the new mom can be unbound from the platform.
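The lifecycle just described (bind and calibrate sensors, execute the lesson script, post events and inputs, then unbind) could be sketched as follows. Every class, method, and message name here is a hypothetical placeholder, not an API defined by the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Sensor:
    sensor_id: str
    bound: bool = False
    calibrated: bool = False

@dataclass
class LessonSession:
    """Hypothetical sketch of one run of a purchased lesson."""
    lesson_name: str
    sensors: List[Sensor] = field(default_factory=list)
    log: List[str] = field(default_factory=list)

    def bind_and_calibrate(self) -> None:
        for s in self.sensors:
            s.bound = True          # bind the sensor to the platform
            s.calibrated = True     # calibrate against the current environment
            self.log.append(f"bound+calibrated {s.sensor_id}")

    def run_script(self, script: List[Dict[str, str]]) -> None:
        """Execute lesson script steps: post events to the actor, collect inputs."""
        for step in script:
            self.log.append(f"event -> actor: {step['event']}")
            self.log.append(f"input <- actor: {step.get('expected_input', 'none')}")

    def unbind(self) -> None:
        for s in self.sensors:
            s.bound = False
            self.log.append(f"unbound {s.sensor_id}")

if __name__ == "__main__":
    session = LessonSession(
        lesson_name="Feeding a newborn",
        sensors=[Sensor("wearable-mom"), Sensor("toy-baby")],
    )
    session.bind_and_calibrate()
    session.run_script([
        {"event": "prepare the bottle", "expected_input": "bottle ready"},
        {"event": "hold the baby at 45 degrees", "expected_input": "pose confirmed"},
    ])
    session.unbind()
    print("\n".join(session.log))
```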
[0042] The one or more contents in the platform may comprise one or more of the following: one or more images, one or more video files, one or more audio files, one or more articles, one or more spreadsheets, and one or more RSS feeds. The lessons stored in the platform may comprise a plurality of contents with one or more types.
[0043] The haptic learning may utilize any stationary or mobile computing devices with sensing capabilities. Non-limiting examples include game consoles, wearable devices, smartphones, computers, laptops, desktops, servers, headphones, etc.
[0044] In some applications, the devices are specific to sensing purposes only, and may comprise communication interfaces coupled with another computing device. For instance, a Microsoft Kinect connected to a computer, a monitor, or a TV forms a system with sensing capabilities.
Mobile communication
[0045] The methods, systems, media, mobile devices and platforms as disclosed herein may comprise a mobile communication module. The mobile communication module can allow a plurality of users to access the server for learning purposes. Furthermore, the mobile communication module may send push notifications of recommended contents to the users, for example.
[0046] The methods, systems, media, and platforms disclosed herein may comprise mobile devices, or use of the same. A mobile device may comprise a mobile access module to provide on-the-go access to a plurality of contents in the server, and/or one or more lessons in the server. The mobile access module allows the mobile user to receive one or more push notifications from the server application.
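As one illustration of the mobile communication and mobile access modules, the sketch below shows how a client might batch haptic signals for upload and poll for notifications of recommended contents. The class, the endpoint, and the payload shape are assumptions made for illustration; no specific transport or push service is part of the disclosed platform.

```python
import json
from typing import Dict, List

class MobileCommunicationModule:
    """Hypothetical client-side module: uploads haptic signals, fetches notifications."""

    def __init__(self, server_url: str) -> None:
        self.server_url = server_url      # assumed endpoint, e.g. "https://example.invalid/api"
        self.outbox: List[Dict] = []      # haptic signals queued for upload

    def queue_signal(self, signal: Dict) -> None:
        self.outbox.append(signal)

    def flush(self) -> str:
        """Serialize queued signals into one upload payload (actual transport omitted)."""
        payload = json.dumps({"signals": self.outbox})
        self.outbox.clear()
        return payload                    # a real client would send this to the server

    def poll_notifications(self, server_inbox: List[str]) -> List[str]:
        """Stand-in for receiving push notifications of recommended contents."""
        notifications, server_inbox[:] = list(server_inbox), []
        return notifications

if __name__ == "__main__":
    comm = MobileCommunicationModule("https://example.invalid/api")
    comm.queue_signal({"sensor": "wrist", "magnitude": 0.42, "t": 1.0})
    print(comm.flush())
    print(comm.poll_notifications(["New lesson recommended: infant CPR basics"]))
```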
[0047] The haptic learning platform may track the location of users and contents. The platform may store geolocated information to visualize, on a map, where learning is happening for users who are similar or accretive to the user. Being able to pose questions based on the geolocation of content and expertise may be helpful to the user.
[0048] The haptic learning platform's mobile technology can provide capacity for collecting information related to learning. In many configurations, promoting an understanding of an expert graph is helpful to the success of the users of the platform.
Underlying computer systems
[0049] The present disclosure describes computer control systems that are programmed to implement methods of the disclosure. FIG. 5 shows a computer system 501 that is
programmed or otherwise configured to perform haptic learning. The computer system 501 can execute various modules of the haptic learning platform as described herein. The computer system 501 can be an electronic device of a user or a computer system that is remotely located with respect to the electronic device, and combinations thereof. The electronic device can be a mobile electronic device, for example.
[0050] The computer system 501 comprises a central processing unit (CPU, also referred herein as and used interchangeably with "processor" and "computer processor" herein) 505, which can be a single core or multi core processor, or a plurality of processors for parallel processing. The computer system 501 also comprises memory or memory location 510 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 515 (e.g., hard disk), communication interface 520 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 525, such as cache, other memory, data storage or electronic display adapters. The memory 510, storage unit 515, interface 520 and peripheral devices 525 are in communication with the CPU 505 through a communication bus (solid lines), such as a motherboard. The storage unit 515 can be a data storage unit (or data repository) for storing data. The computer system 501 can be operatively coupled to a computer network ("network") 530 with the aid of the communication interface 520. The network 530 can be the Internet, an internet or extranet, or an intranet and/or extranet that is in communication with the Internet. The network 530 in some cases is a telecommunication and/or data network. The network 530 can comprise one or more computer servers, which can enable distributed computing, such as cloud computing. The network 530, in some cases with the aid of the computer system 501, can implement a peer-to-peer network, which may enable devices coupled to the computer system 501 to behave as a client or a server.
[0051] The CPU 505 can execute a sequence of machine-readable instructions, which can be embodied in a program or software. The instructions may be stored in a memory location, such as the memory 510. The instructions can be directed to the CPU 505, which can subsequently program or otherwise configure the CPU 505 to implement methods of the present disclosure. Examples of operations performed by the CPU 505 can include fetch, decode, execute, and writeback.
[0052] The CPU 505 can be part of a circuit, such as an integrated circuit. One or more other components of the system 501 can be included in the circuit. In some cases, the circuit is an application specific integrated circuit (ASIC).
[0053] The storage unit 515 can store files, such as drivers, libraries and saved programs. The storage unit 515 can store user data, e.g., user preferences and user programs. The computer system 501 in some cases can include one or more additional data storage units that are external to the computer system 501, such as located on a remote server that is in communication with the computer system 501 through an intranet or the Internet.
[0054] The computer system 501 can communicate with one or more remote computer systems through the network 530. For instance, the computer system 501 can communicate with a remote computer system of a user (e.g., smartphones, laptops, desktops, tablets, wearable computing devices, and other computing devices). Examples of remote computer systems include personal computers (e.g., portable PC), slate or tablet PC's (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, Smart phones (e.g., Apple® iPhone, Android- enabled device, Blackberry®), or personal digital assistants. The user can access the computer system 501 via the network 530.
[0055] Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the computer system 501, such as, for example, on the memory 510 or electronic storage unit 515. The machine executable or machine readable code can be provided in the form of software. During use, the code can be executed by the processor 505. In some cases, the code can be retrieved from the storage unit 515 and stored on the memory 510 for ready access by the processor 505. In some situations, the electronic storage unit 515 can be precluded, and machine-executable instructions are stored on memory 510.
[0056] The code can be pre-compiled and configured for use with a machine having a processor adapted to execute the code, or can be compiled during runtime. The code can be supplied in a programming language that can be selected to enable the code to execute in a pre-compiled or as-compiled fashion.
[0057] Aspects of the systems and methods provided herein, such as the computer system 501, can be embodied in programming and with instructions readable by the
processor. Various aspects of the technology may comprise "products" or "articles of manufacture" typically in the form of machine (or processor) executable code and/or associated data that is carried on or embodied in a type of machine readable medium.
Machine-executable code can be stored on an electronic storage unit, such as memory (e.g., read-only memory, random-access memory, flash memory) or a hard disk. "Storage" type media can comprise any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming. All or portions of the software may at times be communicated through the Internet or various other telecommunication networks. Such communications, for example, may enable loading of the software from one computer or processor into another, for example, from a management server or host computer into the computer platform of an application server. Thus, another type of media that may bear the software elements comprises optical, electrical and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links. The physical elements that carry such waves, such as wired or wireless links, optical links or the like, also may be considered as media bearing the software. As used herein, unless restricted to non-transitory, tangible "storage" media, terms such as computer or machine "readable medium" refer to any medium that participates in providing instructions to a processor for execution.
[0058] Hence, a machine readable medium, such as computer-executable code, may take many forms, including but not limited to, a tangible storage medium, a carrier wave medium or physical transmission medium. Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer(s) or the like, such as may be used to implement the databases, etc. shown in the drawings. Volatile storage media include dynamic memory, such as main memory of such a computer platform. Tangible transmission media include coaxial cables; copper wire and fiber optics, including the wires that comprise a bus within a computer system. Carrier-wave transmission media may take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media therefore include, for example: a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a ROM, a PROM and an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer may read programming code and/or data. Many of these forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
[0059] The computer system 501 can include or be in communication with an electronic display 535 that comprises a user interface (UI) 540 for, for example, monitoring signals, viewing analysis results, and viewing recommended contents. Examples of UIs include, without limitation, a graphical user interface (GUI) and a web-based user interface.
[0060] Methods and systems of the present disclosure can be implemented by way of one or more algorithms. An algorithm can be implemented by way of software upon execution by the central processing unit 505. The algorithm can be, for example, an artificial intelligence algorithm.
EXAMPLES
Example 1: Basic Haptic Learning System
[0061] FIG. 6 shows an example of a haptic learning system. A user 601 uses a haptic device 602 for the purpose of haptic learning. To use the learning system, the user 601 registers the haptic device 602 with the system. The system comprises a haptic learning engine 603; once registered, the haptic device 602 is able to communicate with the haptic learning engine 603. The user can use the haptic device to browse lessons recommended by the haptic learning engine 603. When the user takes a lesson, the haptic learning engine delivers the lesson to the haptic device 602. The user follows the instructions or scripts of the lesson to learn a new skill, and the haptic device presents haptic output cues or audio or visual cues to the user.
[0062] The haptic device 602 can be configured in many ways, and may comprise haptic input and output. The haptic device 602 can be configured to receive user commands with input, and can be configured to provide user feedback with haptic output. The user input may comprise touch input, for example. The haptic output to the user may comprise one or more of a vibration, a spatially localized sensation such as an edge, a click, a tap, or a temperature, for example. The haptic output can be configured with one or more of temporal or spatially localized patterns, for example spatially localized touch sensations such as Braille, or temporal patterns such as a number of vibrations or a frequency of vibrations.
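To make the idea of temporally and spatially patterned haptic output concrete, here is a small hypothetical encoding of such cues; the field names and the example patterns are illustrative assumptions rather than values from the disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HapticCue:
    """Hypothetical description of one haptic output pattern."""
    pulses: int            # number of vibrations in the pattern
    frequency_hz: float    # vibration frequency
    location: str          # spatial locus on the device, e.g. "left-edge"
    duration_s: float      # total duration of the pattern

# Example cue vocabulary a lesson might use (illustrative only).
SLOW_DOWN = HapticCue(pulses=2, frequency_hz=80.0, location="left-edge", duration_s=0.6)
SPEED_UP = HapticCue(pulses=5, frequency_hz=200.0, location="right-edge", duration_s=0.6)
GOOD_FORM = HapticCue(pulses=1, frequency_hz=120.0, location="center", duration_s=0.2)

def describe(cue: HapticCue) -> str:
    return (f"{cue.pulses} pulse(s) at {cue.frequency_hz:.0f} Hz on the "
            f"{cue.location} for {cue.duration_s:.1f} s")

if __name__ == "__main__":
    for name, cue in [("slow down", SLOW_DOWN), ("speed up", SPEED_UP)]:
        print(name, "->", describe(cue))
```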
[0063] The haptic device 602 may comprise one or more of many known prior haptic devices suitable for incorporation in accordance with examples disclosed herein. For example, the haptic device 602 may comprise one or more components of a smart watch such as the Apple Watch commercially available from Apple. The haptic device 602 may comprise one or more haptic components as described in U.S. App. Ser. No. 14/059693, filed October 22, 2013, entitled "Touch Surface for Simulating Materials", published as US 20150109215 on April 23, 2015; and U.S. App. Ser. No. 12/504821, filed July 17, 2009, entitled "METHOD AND APPARATUS FOR LOCALIZATION OF HAPTIC FEEDBACK", published as US 20110012717 on January 20, 2011; the entire disclosures of which are incorporated herein by reference.
[0064] The user follows the scripts and instructions, and takes some actions. The sensor 611 collects the signals of the user during the actions. The signals are sent to or generated with the haptic device 602. The haptic device may use the signals to issue other scripts to ask the user to perform other actions or correct the previous actions. Alternatively or in combination, the haptic device may send the signals to the haptic learning engine 603. The engine can issue new events or lessons for the user with the haptic device 602.
[0065] Moreover, there may be a sensor 612 coupled to the haptic learning engine 603. The sensor can be local to the user 601 or local to the haptic learning engine 603. The signals from the sensor 612 can allow the haptic learning engine to deliver a new set of scripts or instructions to the haptic device 602 to help the user learn the lesson more effectively.
Example 2: Advanced Haptic Learning System
[0066] FIG. 7 shows an example of an advanced haptic learning system. A user 701 uses a haptic device 702 for the purpose of haptic learning. The haptic device 702 can be a wearable device. The haptic device 702 comprises one or more sensors 703. To use the learning system, the user 701 registers the haptic device 702 with the system. Upon registration, the haptic device 702 allows the user 701 to browse lessons available on the system. In some embodiments, the haptic learning engine 704 suggests lessons matching the user's profile. In some applications, the lessons are recommended based on analyses of the user's past activities.
[0067] When the user enrolls in a lesson, the lesson is downloaded to the haptic device 702. The device is also coupled to one or more external sensors 710, 711 and 713. The haptic device collects sensing signals from the sensors 710, 711 and 713. A sensing signal may comprise one or more of the following: a location, a direction, a rotation angle, a rotation speed, a speed, an acceleration, an angular acceleration, a lifting angle, a pressure, a temperature, a concentration, a force, a torque, a stability, a balance, a cardiovascular signal, a respiration signal, a health signal, a biological signal, and others. The haptic device 702 uses the sensing signal(s) to monitor the user's participation in the lesson.
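One possible representation of the heterogeneous sensing signals listed above is sketched below; the field names, units, and plausibility bounds are assumptions for illustration, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensingSignal:
    """Hypothetical container for one reading from an external sensor."""
    sensor_id: str
    kind: str                         # e.g. "acceleration", "pressure", "heart_rate"
    value: float                      # scalar reading in the unit below
    unit: str                         # e.g. "m/s^2", "kPa", "bpm"
    timestamp: float                  # seconds since lesson start
    location: Optional[Tuple[float, float, float]] = None  # optional 3D position

def is_plausible(signal: SensingSignal) -> bool:
    """Crude range check before the haptic device uses a reading (illustrative bounds)."""
    bounds = {"heart_rate": (20.0, 250.0), "pressure": (0.0, 500.0)}
    low, high = bounds.get(signal.kind, (float("-inf"), float("inf")))
    return low <= signal.value <= high

if __name__ == "__main__":
    reading = SensingSignal("chest-strap", "heart_rate", 72.0, "bpm", 12.5)
    print(reading, is_plausible(reading))
```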
[0068] The haptic device 702 delivers lessons to the user 701. The haptic device 702 can also send events to the user. In some lessons, the haptic device 702 sends haptic and other cues regarding the lessons to the user. The user takes action based on scripts and instructions of the lessons. The user's responses can be monitored by the sensors 711, 712, and 713. Alternatively, the responses can be input into the haptic device directly. In some cases, the sensing signals are sent by sensors 703 and 713 to the haptic learning engine 704.
[0069] The haptic learning engine 704 can communicate with the haptic device 702. The communication may allow the user to browse lessons or recommendations made from the engine 704. In some situations, the haptic device 702 sends the responses of the user 701 to the haptic learning engine 704.
Example 3: Actions of Basic Haptic Learning System
[0070] FIG. 8 shows example actions of a haptic learning system. The system comprises a haptic device 802 or 803 wearable by a user 801. When the user is using a haptic device, the user takes actions. The actions may include haptic cues 811. In some cases, the actions comprise haptic responses 812. In some applications, the actions comprise visual or audio signals, forming visual/audio cues 813. Alternatively, the actions create standard responses 814.
[0071] The device 802 may be a specific device designed to sense particular signals.
Alternatively or in combination, a smartphone 803 with sensing capability can be used.
Further, the system comprises a haptic learning engine 804. In additional applications, the system may comprise sensors 805. The devices may handle events 821 from the haptic learning engine 804; the events 821 may be presented to the user 801. Furthermore, the haptic device 802 or 803 may receive inputs 822 from the user 801 and deliver the inputs to the haptic learning engine 804.
[0072] The system comprises a haptic learning engine 804. The engine comprises lesson information 831, such as lesson scripts. The engine comprises input/output. The input interface receives information delivered from the haptic device 802 or 803. The input interface may receive sensor data 841 from sensors 805. The output comprises an interface delivering the lesson information to the haptic device 802 or 803.
[0073] The system comprises sensors 805. The sensors may be embedded in the haptic device 802 or 803. In some applications, the sensors are external to the haptic device, as independent devices. The sensors can be coupled to the haptic device 802 or 803, or to the haptic learning engine 804. The sensors 805 collect sensing signals and sensor data 841 and transmit them to the haptic device 802 or 803. The haptic device 802 or 803 organizes the sensor data, and then presents further lesson instructions and scripts to the user 801 or requests more information from the haptic learning engine 804.
Example 4: Operation Flow of Basic Haptic Learning System
[0074] FIG. 9 shows an example of the system operation flow. The user 901 wants to learn a specific skill or skills as described herein, for example. For instance, a new mom may want to learn skills for taking care of a new baby. In step 911, the user 901 registers her haptic device 902 or 903 with the haptic learning system. The haptic device 902 or 903 can be any mobile device enabled with haptic functions.
[0075] On the haptic device, the user can browse various lessons from a lesson store. In the lesson store 921, information about the lessons (e.g., author, publishing information, a brief description, targeted students) can be shown. The lesson information may further show how to ingest, earn, and buy the lesson. Each lesson can be created as a mobile application, and is downloadable to the haptic device once the user is granted access to the lesson; see step 912. The lesson access can be free, based on purchase, based on membership programs, or based on a combination thereof.
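The access model just described (free, purchase-based, membership-based, or a combination) could be captured by a check like the one below; the class names and the specific rules are illustrative assumptions only.

```python
from dataclasses import dataclass, field
from typing import Set

@dataclass
class Lesson:
    title: str
    free: bool = False
    price_usd: float = 0.0
    memberships: Set[str] = field(default_factory=set)  # programs that unlock it

@dataclass
class UserAccount:
    purchased: Set[str] = field(default_factory=set)
    memberships: Set[str] = field(default_factory=set)

def has_access(user: UserAccount, lesson: Lesson) -> bool:
    """A lesson is downloadable if it is free, purchased, or covered by a membership."""
    return (
        lesson.free
        or lesson.title in user.purchased
        or bool(lesson.memberships & user.memberships)
    )

if __name__ == "__main__":
    lesson = Lesson("How to be a new mom", price_usd=4.99, memberships={"parenting-plus"})
    user = UserAccount(memberships={"parenting-plus"})
    print(has_access(user, lesson))   # True via membership
```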
[0076] The haptic learning engine 904 may comprise searching capability. The search may be based on artificial intelligence or machine learning, for example. The intelligent search may find lessons relevant and suitable to the user 901.
[0077] Once the user has access to a lesson (e.g., how to be a new mom), she can start the lesson; see step 913. The lesson is downloaded onto the haptic device. In steps 922 and 931, when a lesson starts, the system binds and calibrates sensors 905. Scripts of the lesson may then start to execute. Referring to step 923, the script execution comprises posting events to the haptic device 902/903 and getting input from the haptic device. Furthermore, the sensors collect sensor data and, in step 932, stream the data to the haptic device 902/903 or the learning engine 904.
[0078] For instance, the new mom is learning how to feed a baby with milk. The system comprises sensors 905 on a baby and on the mom. When the lesson scripts execute on the haptic device 902/903, the scripts guide the mom to prepare milk, hold a bottle, hold the baby, and perform other aspects of the lesson. The guiding can be based on voice, sound, video, audio, images, or text. The guiding may comprise a sequence of movements the mom should do with the baby. The mom follows the haptic commands to experience the learning process and learns from the haptic output.
[0079] The sensors as described herein can be designed based on various technologies (e.g., holograms, actuators, etc.) to detect signals of temperature, touch, pulse, vibrations, micro motions, finger movements, or any kind of sensation, for example. The signals can be 1D, 2D, 3D, or 4D. The sensor data regarding the mom and the baby is collectively sent to the haptic device 902/903 or the learning engine 904, or both. The haptic device can analyze the sensor data and provide feedback to the mom to correct her positions, for example. In some portions of the lessons, the sensor data are sent to the haptic learning engine 904 to download more information or lessons. The haptic learning engine can analyze the sensor data and generate output.
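A toy version of the position-correction feedback described above might look like the following; the target angle, tolerance, and function names are illustrative assumptions rather than values from the disclosure.

```python
from typing import Optional

# Hypothetical target for the bottle-feeding step of the lesson.
TARGET_BOTTLE_ANGLE_DEG = 45.0
TOLERANCE_DEG = 10.0

def feeding_feedback(measured_angle_deg: float) -> Optional[str]:
    """Compare the sensed bottle angle with the lesson's target and suggest a correction.

    Returns None when the measured angle is within tolerance (no cue needed).
    """
    error = measured_angle_deg - TARGET_BOTTLE_ANGLE_DEG
    if abs(error) <= TOLERANCE_DEG:
        return None
    direction = "lower" if error > 0 else "raise"
    return f"{direction} the bottle by about {abs(error):.0f} degrees"

if __name__ == "__main__":
    for angle in (44.0, 70.0, 20.0):
        print(angle, "->", feeding_feedback(angle) or "good position")
```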
[0080] The feedback provided by the haptic device or the learning engine may comprise corrections of the mom's actions (e.g., hold the baby in a correct position) or requests to perform following actions (e.g., pacify the baby when the baby stops taking milk).
[0081] The haptic device 902/903 and the sensors 905 may perform dynamic binding. For example, when some scripts relate to pacifying the baby, video sensors can be bound to monitor the actions taken by the mom.
[0082] With scripts and lessons related to CPR, the sensors for monitoring the user activity of administering CPR are triggered. When the CPR session ends, the sensors unbind; see steps 924 and 933. The CPR haptic output may comprise signals related to the administration of CPR, for example a low rate of vibrations to slow down or a fast rate to speed up, or a first spatial haptic cue to slow down and a second spatial haptic cue to speed up the CPR. The administration of CPR includes a cardiac component and a pulmonary component, with haptic cues to one or more users. Sensors as described herein can be provided on a model or volunteer and the activity compared to an expert profile as described herein. Haptic cardiac cues can be provided to a first user to slow down, speed up, or maintain the rate of cardiac resuscitation, for example. Haptic pulmonary cues can be provided to a second user to slow down, speed up, or maintain the rate of pulmonary resuscitation, for example.
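For the CPR case, the rate comparison could be sketched as below. The target compression-rate window and the cue mapping are assumptions made for illustration (commonly cited guidance is on the order of 100 to 120 compressions per minute, but the disclosure does not fix a value).

```python
from typing import List

# Assumed target window for chest-compression rate (illustrative only).
MIN_RATE_PER_MIN = 100.0
MAX_RATE_PER_MIN = 120.0

def compression_rate(compression_times_s: List[float]) -> float:
    """Estimate compressions per minute from timestamps of detected compressions."""
    if len(compression_times_s) < 2:
        return 0.0
    span = compression_times_s[-1] - compression_times_s[0]
    return 60.0 * (len(compression_times_s) - 1) / span if span > 0 else 0.0

def cpr_cue(rate_per_min: float) -> str:
    """Map the measured rate to a haptic cue name (slow/fast vibration patterns)."""
    if rate_per_min < MIN_RATE_PER_MIN:
        return "speed-up cue (fast vibration pattern)"
    if rate_per_min > MAX_RATE_PER_MIN:
        return "slow-down cue (slow vibration pattern)"
    return "maintain cue (single confirmation tap)"

if __name__ == "__main__":
    times = [i * 0.7 for i in range(15)]   # roughly 86 compressions per minute
    rate = compression_rate(times)
    print(f"{rate:.0f} per minute ->", cpr_cue(rate))
```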
[0083] With sensors related to marksmanship, sensors coupled to the user can be configured to measure one or more of user breathing or heart rate. The user can be instructed with haptic feedback to lower the heart rate to an amount corresponding to an expert profile. The user can be instructed with haptic feedback to control breathing and other actions, such as pulling the trigger, in accordance with an expert profile.
[0084] The lesson may comprise a lesson for one or more of many applications as described herein, for example to control breathing and lower a heart rate to deal with stress.
[0085] The user 901 follows the scripts and performs actions. The system is highly interactive, and the user is guided in response to specific signals and sensor data. The haptic device 902/903 and the learning engine 904 process and analyze the sensor data. Then the user can receive feedback from the haptic device 902/903 or the sensors 905.
[0086] The haptic learning system involves a high degree of multitasking, so the haptic device 902/903 or the learning engine 904 may comprise multitasking software running on their processors to handle queues and signals.
[0087] The haptic learning system comprises communications among the haptic device 902/903, the haptic learning engine 904, and the sensors 905. Therefore, the system comprises a protocol to schedule and organize the data routing and wireless/wired communications. In particular, the haptic device 902/903 coordinates the sensors; the device may request the sensors 905 to acquire specific signals. Alternatively or in combination, the device may receive the signals from the sensors 905.
[0088] The haptic learning system can be configured in many ways to provide feedback to the user. The downloaded lesson may comprise one or more expert profiles or expert graphs for comparison with user data to determine user performance and provide haptic feedback to the user. The expert profile can be determined in many ways, for example from input sensor and haptic data profiles of experts, or from combinations of profiles of experts identified based on influence, such as by crowd sourcing. The input data from the one or more sensors, such as the haptic sensor, can be compared with one or more of the expert profile or graph. Haptic output can be fed back to the user in response to the comparison in order to direct the user to act in accordance with the expert profile or graph, or both.
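In one very simple form, the comparison of user sensor data with an expert profile could look like the sketch below; aligning the two series sample-by-sample and thresholding the average deviation are simplifying assumptions, not the disclosed method.

```python
from typing import List, Tuple

def compare_to_expert(user: List[float], expert: List[float]) -> Tuple[float, str]:
    """Mean absolute deviation between aligned user and expert samples, plus a verdict.

    Assumes both series are already resampled to the same length; a real system
    would need alignment (e.g., resampling or dynamic time warping) and better
    thresholds than the placeholder values used here.
    """
    n = min(len(user), len(expert))
    if n == 0:
        return float("inf"), "no data"
    deviation = sum(abs(u - e) for u, e in zip(user[:n], expert[:n])) / n
    if deviation < 0.1:
        verdict = "matches expert profile: confirmation tap"
    elif deviation < 0.3:
        verdict = "close: gentle corrective vibration"
    else:
        verdict = "far from expert profile: strong corrective cue"
    return deviation, verdict

if __name__ == "__main__":
    expert_profile = [0.2, 0.4, 0.6, 0.8, 1.0]
    user_trace = [0.25, 0.35, 0.7, 0.75, 0.9]
    print(compare_to_expert(user_trace, expert_profile))
```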
[0089] Examples of systems and methods capable of identifying experts and influence suitable for incorporation in accordance with the present disclosure are described in U.S. Prov. App. Ser. No. 62/113,319, filed on February 6, 2015, entitled "Intent Based Digital Collaboration Platform Architecture and Design" [attorney docket no. 46429-701.106]; and U.S. Prov. App. Ser. No. 62/113,323, filed on February 6, 2015, entitled "Insights for Effective Computer Based Collaboration"; the entire disclosures of which have been previously incorporated herein by reference.
[0090] The system described herein can be used in a variety of situations and conditions. Non-limiting examples include schools, universities, training, simulation, active learning, orientations, factories, sales skills learning, maintenance learning, negotiation learning, research, development, labs, and others as described herein, for example.
[0091] Each of the examples as described herein can be combined with one or more other examples. Further, one or more components of one or more examples can be combined with one or more components of other examples.
[0092] Reference is made to the following claims which recite combinations that are part of the present disclosure, including combinations recited by multiple dependent claims dependent upon multiple dependent claims, which combinations will be understood by a person of ordinary skill in the art and are part of the present disclosure.
[0093] While preferred examples of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such examples are provided by way of example only. It is not intended that the invention be limited by the specific examples provided within the specification. While the invention has been described with reference to the aforementioned specification, the descriptions and illustrations of the examples herein are not meant to be construed in a limiting sense. Numerous variations, changes, and
substitutions will now occur to those skilled in the art without departing from the invention. Furthermore, it shall be understood that all aspects of the invention are not limited to the specific depictions, configurations or relative proportions set forth herein which depend upon a variety of conditions and variables. It should be understood that various alternatives to the examples of the invention described herein may be employed in practicing the invention. It is therefore contemplated that the invention shall also cover any such alternatives,
modifications, variations or equivalents. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims

WHAT IS CLAIMED IS:
1. A learning system, comprising: one or more sensors; a haptic device configured to couple to a user; and a processor configured with instructions to communicate with the haptic device and provide haptic output to the user in response to learning activity of the user measured with the one or more sensors.
2. The learning system of claim 1, wherein the processor is configured with the
instructions to provide haptic output to the user in response to a comparison of data from the one or more sensors with an expert profile of a user selected lesson.
3. The learning system of claim 2, wherein the expert profile comprises a plurality of target activities and wherein the haptic output to the user is determined in response to a comparison of the plurality of target activities with the data from the one or more sensors.
4. The learning system of claim 1, wherein the processor is configured with instructions to bind one or more of the one or more sensors or the haptic device to measure the learning activity of the user and provide feedback to the user in response to the user selecting a lesson.
5. The learning system of claim 1, wherein the processor comprises a processor of a local device configured to download a lesson application, the lesson application comprising an expert profile and instructions to compare the learning activity with the expert profile and provide feedback to the user with the haptic output in response to a comparison of the user profile with the expert profile.
6. The learning system of claim 1, wherein the sensors comprise sensors associated with the user and the lesson application comprises instructions to identify and bind the one or more sensors associated with the user in response to a lesson selected by the user with a user input.
7. The learning system of claim 1, wherein the processor comprises one or more of a processor located remotely from the haptic device or a processor of a local
communication device.
8. The learning system of claim 1, wherein the haptic device comprises the one or more sensors.
9. The learning system of claim 1, wherein the one or more sensors comprises one or more of an accelerometer, a temperature sensor, a haptic, a thermostat sensor, a microphone, a blood oxygen sensor, a camera, a video camera, an activity sensor, a touch sensor, a heart rate sensor, a vibration sensor, a micro motion sensor, or finger movement sensor.
10. An apparatus comprising: a processor configured with instructions to receive data from one or more sensors and communicate with a haptic device coupled to a user and provide haptic output to the user in response to learning activity of the user measured with the one or more sensors.
11. A haptic learning platform comprising:
(a) a server comprising a server processor and a server operating system configured to provide a user with a server application, the server application comprising:
(1) a recommendation module configured to recommend one or more contents to the user;
(2) a database configured to store a plurality of lessons;
(b) a mobile device comprising a mobile processor and one or more haptic sensors configured to provide the user with a mobile application, the mobile application comprising:
(1) a haptic analysis module configured to receive one or more haptic signals of the user from the one or more haptic sensors; and
(2) a mobile communication module configured to transmit the one or more haptic signals to the server.
12. The platform of claim 11, wherein the recommendation module comprises a haptic learning engine configured to analyze the one or more haptic signals.
13. The platform of claim 12, wherein the haptic learning engine comprises an artificial intelligence algorithm for analyzing the one or more haptic signals.
14. The platform of claim 12, wherein the analyzing the one or more haptic signals comprises inferring a purpose of a motion of the user.
15. The platform of claim 12, wherein the analyzing the one or more haptic signals comprises exploring a plurality of contents stored in the server application.
16. The platform of claim 12, wherein the analyzing the one or more haptic signals comprises identifying an interest of the user.
17. The platform of claim 12, wherein the analyzing the one or more haptic signals comprises identifying a role of the user.
18. The platform of claim 12, wherein the recommending the one or more contents is based on the analyzing the one or more haptic signals.
19. The platform of claim 11, wherein the user plays a role in a learning process.
20. The platform of claim 11, wherein the user purchases a lesson from the plurality of lessons.
21. The platform of claim 11, wherein the one or more contents comprise one or more of the following: one or more images, one or more video files, one or more audio files, one or more articles, one or more spreadsheets, and one or more RSS feeds.
22. The platform of claim 11, wherein the plurality of lessons comprises a plurality of contents.
23. The platform of claim 11, wherein the one or more haptic signals comprise one or more of the following activities: viewing, reading, watching, or listening.
24. A server comprising a server processor and a server operating system configured to provide a user with a server application, the server application comprising a haptic learning engine for learning.
25. A mobile device comprising a mobile processor and one or more haptic sensors configured to provide a user with a mobile application, the mobile application comprising a mobile communication module configured to transmit one or more haptic signals to a server.
26. A mobile device comprising a mobile processor and one or more haptic sensors configured to provide a user with a mobile application, the mobile application comprising a haptic analysis module configured to receive one or more haptic signals of a user from one or more haptic sensors.
27. A method for performing haptic learning by a server and a mobile device, the method comprising:
(a) recommending, by the server, one or more contents to a user;
(b) storing, by the server, a database of a plurality of lessons;
(c) receiving, by the mobile device, one or more haptic signals of the user from one or more haptic sensors; and
(d) transmitting, by the mobile device, the one or more haptic signals to a server.
28. An apparatus used by a user for haptic learning, the apparatus comprising:
(a) a recommendation module configured to recommend one or more contents to the user;
(b) a haptic analysis module configured to receive one or more haptic signals of the user from one or more haptic sensors; and
(c) a communication module configured to transmit the one or more haptic signals to a server.
29. A method, comprising: providing one or more of a system, an apparatus, a platform, a haptic device, a mobile device, one or more sensors, a processor or a module as in any one of the preceding claims.
PCT/US2016/027943 2015-04-17 2016-04-15 System and methods for haptic learning platform WO2016168738A1 (en)

Applications Claiming Priority (2)
Application Number / Priority Date / Filing Date / Title
US201562149458P / 2015-04-17 / 2015-04-17
US62/149,458 / 2015-04-17

Publications (1)
Publication Number / Publication Date
WO2016168738A1 / 2016-10-20

Family ID: 57127214

Family Applications (1)
Application Number / Title / Priority Date / Filing Date
PCT/US2016/027943 (WO2016168738A1) / System and methods for haptic learning platform / 2015-04-17 / 2016-04-15

Country Status (1)
Country / Link
WO (1)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020144145A1 (en) * 2001-03-27 2002-10-03 Burtner Richard L. Method of activating computer-readable data
US20040214145A1 (en) * 2003-04-23 2004-10-28 Say-Ling Wen Sentence-conversation teaching system with environment and role selections and method of the same
US20060234201A1 (en) * 2005-04-19 2006-10-19 Interactive Alchemy, Inc. System and method for adaptive electronic-based learning programs
US20090263777A1 (en) * 2007-11-19 2009-10-22 Kohn Arthur J Immersive interactive environment for asynchronous learning and entertainment
US20120280988A1 (en) * 2010-04-09 2012-11-08 University Of Florida Research Foundation, Inc. Interactive mixed reality system and uses thereof
US20120194439A1 (en) * 2011-01-27 2012-08-02 Michelle Denise Noris Communication and Academic Achievement Assistive Device, System, and Method
US20130257777A1 (en) * 2011-02-11 2013-10-03 Microsoft Corporation Motion and context sharing for pen-based computing inputs
US20140220520A1 (en) * 2011-09-09 2014-08-07 Articulate Technologies Inc. Intraoral tactile feedback methods, devices, and systems for speech and language training
US20130260886A1 (en) * 2012-03-29 2013-10-03 Adam Smith Multi-sensory Learning Game System
US20140024009A1 (en) * 2012-07-11 2014-01-23 Fishtree Ltd. Systems and methods for providing a personalized educational platform
WO2014015378A1 (en) * 2012-07-24 2014-01-30 Nexel Pty Ltd. A mobile computing device, application server, computer readable storage medium and system for calculating a vitality indicia, detecting an environmental hazard, vision assistance and detecting disease
US20150009168A1 (en) * 2013-07-02 2015-01-08 Immersion Corporation Systems and Methods For Perceptual Normalization of Haptic Effects

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10748437B2 (en) 2017-08-14 2020-08-18 Parker Appleton LYNCH Learning aid apparatus and system

Legal Events

Code 121 (Ep: the epo has been informed by wipo that ep was designated in this application): Ref document number: 16780931; Country of ref document: EP; Kind code of ref document: A1
Code NENP (Non-entry into the national phase): Ref country code: DE
Code 122 (Ep: pct application non-entry in european phase): Ref document number: 16780931; Country of ref document: EP; Kind code of ref document: A1