CN110152160B - Autism rehabilitation intervention system - Google Patents

Autism rehabilitation intervention system

Info

Publication number
CN110152160B
Authority
CN
China
Prior art keywords
user
rehabilitation intervention
processing unit
autism
rehabilitation
Prior art date
Legal status
Active
Application number
CN201910330877.XA
Other languages
Chinese (zh)
Other versions
CN110152160A (en)
Inventor
程建宏
Current Assignee
Beijing Azuaba Technology Co ltd
Original Assignee
Beijing Azuaba Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Azuaba Technology Co ltd filed Critical Beijing Azuaba Technology Co ltd
Priority to CN201910330877.XA
Publication of CN110152160A
Application granted
Publication of CN110152160B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M21/00 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
    • A61M2021/0005 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
    • A61M2021/0027 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the hearing sense
    • A61M2021/0044 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the sight sense
    • A61M2021/005 Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the sight sense images, e.g. video

Landscapes

  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Acoustics & Sound (AREA)
  • Psychology (AREA)
  • Engineering & Computer Science (AREA)
  • Anesthesiology (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Hematology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Rehabilitation Tools (AREA)

Abstract

An autism rehabilitation intervention system is disclosed. The system comprises: a display screen; an invisible light source; a photosensitive sensor that images a user by detecting light emitted by the invisible light source and reflected by the user; and a processing unit that receives the imaging data from the photosensitive sensor, detects a behavioral characteristic of the user based on the imaging data, and causes a corresponding rehabilitation intervention item to be displayed on the display screen based on the behavioral characteristic.

Description

Autism rehabilitation intervention system
Technical Field
The invention relates to the technical field of autism medical equipment, in particular to an autism rehabilitation intervention system.
Background
Childhood autism is a type of neurodevelopmental disorder.
Existing rehabilitation intervention therapy uses training modes such as environmental scenes, games, and action combinations to increase the patient's perception, promote the patient's intellectual development, and enhance language and emotional communication with the patient. These training modes are also known as rehabilitation intervention programs, which may be graded according to the intensity and/or difficulty of the training. Traditional approaches to treating autistic child patients require either producing a large number of cards or preparing a large number of videos in advance. The patient's emotional changes are observed by continuously changing the card or video in order to determine the child patient's interests and preferences; this process is tedious and inefficient. Moreover, the therapeutic effect depends on the doctor's experience: it is difficult to adjust toward an optimal intervention scheme based on the emotional responses of autistic patients, the procedure is hard to standardize, and the effect is limited. The prior art has also proposed a therapy that assists autistic patients in rehabilitation training by constructing a virtual scene with virtual reality technology, the main equipment of which is a helmet. However, because autistic children are generally young and the helmet has a certain weight, children often resist this contact-type helmet, which impairs their attention and the effect of rehabilitation training.
In addition, current research attempts to train the brain's ability to switch attention by effectively inducing the brain to attend to a stimulation signal while continuously directing it to switch between different information channels. However, current therapies lack warmth and emotional care for the patient. Young autistic patients tend to resist such therapies and fail to cooperate with treatment, so the effect is poor.
In addition, the prior art also includes a technical scheme of acquiring an image of the user through a camera.
Disclosure of Invention
It is an object of the present invention to provide a new technical solution for autism rehabilitation intervention.
The invention provides an autism rehabilitation intervention system, which comprises: a display screen; an invisible light source; a photosensitive sensor that images a user by detecting light emitted by the invisible light source and reflected by the user; and a processing unit that receives imaging data from the photosensitive sensor, detects a behavioral characteristic of the user based on the imaging data, and causes a corresponding rehabilitation intervention item to be displayed on the display screen based on the behavioral characteristic.
According to embodiments disclosed herein, the impact of an autistic rehabilitation intervention device on an autistic child during treatment may be reduced, thereby improving treatment efficacy.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic block diagram of an autism rehabilitation intervention system provided in accordance with one embodiment of the present disclosure.
Fig. 2 is a schematic block diagram of an autism rehabilitation intervention system provided in accordance with another embodiment of the present disclosure.
Fig. 3 is a schematic block diagram of an autism rehabilitation intervention system provided in accordance with yet another embodiment of the present disclosure.
Figs. 4-6 show one example of an autism rehabilitation intervention system according to the present disclosure.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Various embodiments and examples according to the present invention are described below with reference to the accompanying drawings.
< autism rehabilitation intervention System >
Fig. 1 is a schematic block diagram of an autism rehabilitation intervention system provided in accordance with one embodiment of the present disclosure.
As shown in fig. 1, the autism rehabilitation intervention system includes: a display screen 22, a source of invisible light 23, a photosensor 24 and a processing unit 21.
The display screen 22 is used at least to display the corresponding rehabilitation intervention item under the control of the processing unit 21. "Rehabilitation intervention item" is a term of art specific to the treatment of autism and refers to images or videos designed to treat autism. Furthermore, as shown in fig. 2, when a speaker 25 is included in the autism rehabilitation intervention system, the rehabilitation intervention item may further include an audio output. In this case, the rehabilitation intervention item may be a combination of images, video, and/or audio. Rehabilitation intervention items can be divided into multiple intensity levels. During rehabilitation training, an item of the appropriate intensity level is selected according to the patient's personal condition and displayed to the user. The division of intensity levels may be based on big-data statistics; this is not the focus of the present disclosure and is therefore not explained in detail herein.
The invisible light source 23 may emit invisible light. The photosensitive sensor 24 images the user by detecting the light emitted by the invisible light source 23 and reflected by the user.
The processing unit 21 receives imaging data from the light sensitive sensor 24, detects a behavioral characteristic of the user based on the imaging data, and causes a corresponding rehabilitation intervention item to be displayed on the display screen 22 based on the behavioral characteristic.
The processing unit 21 may include a local processor and memory. In this case, the received imaging data may be processed locally and a rehabilitation intervention program determined. In addition, the processing unit 21 may further include a processor and a memory in a network, for example, a server located in a cloud. In this case, the acquired image data may be sent to a processor and memory on the network for processing.
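The receive-detect-display loop of the processing unit 21 might be sketched as follows. This is a minimal illustration, not the patent's implementation: the class and function names, the use of gaze time as the detected feature, and the threshold value are all assumptions introduced here.

```python
# Hypothetical sketch of the processing-unit loop: receive imaging data,
# detect a behavioral characteristic, and select an intervention item.
from dataclasses import dataclass

@dataclass
class Behavior:
    gaze_time_s: float  # how long the user's gaze stayed fixed (assumed feature)

def detect_behavior(imaging_data: list) -> Behavior:
    # Placeholder for image analysis on the photosensitive-sensor frames;
    # here we just treat the number of frames as a stand-in gaze time.
    return Behavior(gaze_time_s=float(len(imaging_data)))

def select_item(behavior: Behavior) -> str:
    # Assumed rule: a longer sustained gaze maps to a higher-intensity item.
    return "high-intensity item" if behavior.gaze_time_s >= 3 else "low-intensity item"

frames = [0, 1, 2, 3]  # stand-in for imaging data received from the sensor
item = select_item(detect_behavior(frames))
```

Whether this loop runs on the local processor or on a cloud server, the structure is the same; only the location of `detect_behavior` and `select_item` changes.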
Here, an appropriate rehabilitation intervention item may be determined based on the user's current condition, which allows the therapy to adapt to that condition in real time. Further, because an invisible light source is used to detect the user's condition, the user is unaware of being illuminated even while the light accurately captures his or her state, and imaging the subject with a dedicated light source allows that state to be detected more accurately. Unlike typical adults, autistic children are sensitive to certain external influences; sometimes a visible camera itself provokes a resistant mood. In that case it is difficult to tell whether an emotional change was caused by the imaging or whether a poor therapeutic effect stems from the rehabilitation intervention item itself. The approach of this embodiment therefore reduces the influence of the detection process on the user while preserving detection quality. For an autistic child, this significantly reduces the effect of the treatment device on the child, improving both the treatment effect and the accuracy of measurement.
The invisible light source 23 may be a near-infrared light source. Near-infrared (NIR) light is electromagnetic radiation between visible (VIS) and mid-infrared (MIR) light; as defined by ASTM (the American Society for Testing and Materials), its wavelength lies in the range of 780 to 2526 nm. Because a near-infrared light source is stable and little disturbed by the environment, it enables the photosensitive sensor 24 to generate a more accurate image of the user. Since the behavior of autistic children is uncontrollable and hard to predict, and their changes in expression and movement are sometimes small, using a near-infrared light source improves detection accuracy and provides a more reliable basis for adjusting the rehabilitation intervention item.
The invisible light source 23 may be disposed on an outer surface of the display screen 22, for example at the bottom of the display screen 22.
In one example, the invisible light source 23 may be embedded in the display screen 22 and transparent. In this case, the invisible light source 23 is not easily perceived by the user. This further avoids the invisible light source 23 interfering with the user and causing the user to be out of compliance.
In addition, the light-sensitive sensor 24 may also be embedded in the display screen 22 and be transparent to avoid interference with the user.
Here, the invisible light source 23 and the photosensor 24 may be hidden at the same time. Therefore, in the treatment process, the influence of the detection device on the user can be further shielded, so that the response of the user to the rehabilitation intervention project can be more accurately acquired, and a more accurate basis is provided for the adjustment of the rehabilitation intervention project. Alternatively, for example, the invisible light source 23 and the photosensor 24 may be provided in a bezel portion on the outer surface of the display screen and provided in the same color as the bezel of the display screen, for example, both black, so as to hide them.
The plurality of invisible light sources 23 may be provided on the outer surface of the display screen 22, or the plurality of invisible light sources 23 may be embedded in the display screen 22. The number of invisible light sources 23 is not limited in the embodiment of the present invention.
The photosensitive sensor 24 may be a structured-light depth camera, such as a time-of-flight (TOF) camera. A TOF camera captures richer behavioral information about the user, which enables the processing unit 21 to extract accurate and rich user features from the imaging data, making the adjustment of the rehabilitation intervention item more precise.
Both the photosensitive sensor 24 and the invisible light source 23 are directed toward the user. This pointing can be achieved, for example, in the following four ways.
Mode 1: the orientations of the photosensitive sensor 24 and the invisible light source 23 are fixed and identical. When using the autism rehabilitation intervention system, the user moves into the orientation of the photosensitive sensor 24 and the invisible light source 23.
Mode 2: the orientation of the photosensitive sensor 24 is fixed. When using the system, the user moves into the orientation of the photosensitive sensor 24, and the processing unit 21 adjusts the pointing of the invisible light source 23 to the orientation in which the user is located.
Mode 3: the pointing of the invisible light source 23 is fixed. When using the system, the user moves into the orientation of the invisible light source 23, and the processing unit 21 adjusts the orientation of the photosensitive sensor 24 to the orientation in which the user is located.
Mode 4: when using the system, the processing unit 21 adjusts the orientations of both the invisible light source 23 and the photosensitive sensor 24 to the orientation in which the user is located.
The pointing direction of the invisible light source 23 and/or the photosensitive sensor 24 can be manually adjusted to set the user within the detection range of the invisible light source 23 and/or the photosensitive sensor 24. It is also possible to determine whether the user is within the detection range from the image detected by the light-sensitive sensor 24 and to instruct the user to make a corresponding movement.
Furthermore, the orientation of the user may also be determined using the microphone array to adjust the pointing direction of the invisible light source 23 and/or the light sensitive sensor 24.
For example, the autism rehabilitation intervention system may also include a microphone array 25 disposed near the photosensitive sensor 24. The microphone array 25 senses the orientation of the user and may do so in a variety of ways; for example, it may determine that the user uttered a sound by detecting sound in a specific frequency range. The sound-source localization algorithm based on the microphone array 25 may be a time-delay-estimation (TDE) based algorithm, a steerable beamforming algorithm based on maximum output power, a high-resolution spectral estimation algorithm, or the like. In this case, the microphones in the microphone array 25 need not be microphones for voice detection but may be microphones that detect certain specific frequency bands.
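A time-delay-estimation step for one microphone pair might look like the sketch below. The microphone spacing, sample rate, and brute-force correlation search are illustrative assumptions; a real array would localize over many pairs with a robust estimator such as GCC-PHAT.

```python
# Illustrative two-microphone TDE: find the inter-microphone lag by
# cross-correlation, then convert it to a direction-of-arrival angle.
import math

SPEED_OF_SOUND = 343.0   # m/s (assumed)
MIC_SPACING = 0.1        # m between the two microphones (assumed)
SAMPLE_RATE = 16000      # Hz (assumed)

def estimate_delay(sig_a, sig_b, max_lag):
    """Return the lag (in samples) that maximizes the cross-correlation."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        val = sum(sig_a[i] * sig_b[i - lag]
                  for i in range(max(0, lag), min(len(sig_a), len(sig_b) + lag)))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

def delay_to_angle(lag_samples):
    """Convert an inter-microphone delay to a direction-of-arrival angle (degrees)."""
    delay_s = lag_samples / SAMPLE_RATE
    ratio = max(-1.0, min(1.0, delay_s * SPEED_OF_SOUND / MIC_SPACING))
    return math.degrees(math.asin(ratio))
```

The resulting angle could then drive the pointing adjustment of the invisible light source 23 and/or the photosensitive sensor 24 described in Modes 2 to 4.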
Because the invisible light source 23 and/or the photosensitive sensor 24 are hidden, they may lose some orientation information or detect slowly. Assisting with microphone-array localization can help the invisible light source 23 and/or the photosensitive sensor 24 locate the user faster, for example to determine the user's gaze orientation.
Conversely, the invisible light source 23 and/or the photosensitive sensor 24 may also assist the detection of the user's orientation. For example, the orientation of the user is first sensed with the microphone array 25, and then image data for that orientation is acquired with the invisible light source 23 and/or the photosensitive sensor 24. If the processing unit 21 detects a portrait in the image data, it determines that the sensed orientation is correct; otherwise, the processing unit 21 may instruct the microphone array 25 to re-sense the orientation of the user.
Here, the processing unit 21 may determine the orientation of the user using the microphone array 25 and control the invisible light sources 23 and/or the light sensitive sensors 24 to point to the orientation at which the user is located based on the sensed orientation. This may eliminate the need for the user to adjust his or her position to fit the autism rehabilitation intervention system. Therefore, the user experience can be improved, the influence of the electronic equipment on the autism children in the rehabilitation intervention treatment process can be avoided, and the conflict psychology of the autism children on the system can be avoided.
The behavioral characteristic of the user may be at least one of the user's gaze direction, gaze time, translational change in gaze direction, and limb action characteristics, and/or the user's language interaction feedback characteristics, and the like.
The processing manner of the embodiment of the present disclosure is described below taking the user's gaze direction and the corresponding gaze time as examples. For example, the microphone array 25 senses the orientation of the user, and the processing unit 21 uses that sensed orientation to control the pointing of the invisible light source 23 and/or the photosensitive sensor 24. The photosensitive sensor 24 acquires imaging data of the user. The processing unit 21 determines the user's gaze direction and gaze time from the imaging data, thereby determining the user's current state in order to adjust the rehabilitation intervention item. For example, the processing unit 21 may extract the user's eye information from the image data generated by the photosensitive sensor 24 and measure how long the eye information remains unchanged within a certain range, taking that duration as the gaze time.
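The gaze-time measurement just described can be sketched as follows: the gaze is counted as "held" while successive eye positions stay within a tolerance. The frame rate and pixel tolerance are illustrative assumptions, not values from the patent.

```python
# Minimal gaze-time sketch over a sequence of (x, y) eye positions.
FRAME_INTERVAL_S = 1 / 30     # assumed 30 fps sensor
TOLERANCE = 5.0               # max movement (pixels) still counted as the same gaze

def gaze_time(eye_positions):
    """Return the longest run (in seconds) of nearly unchanged eye positions."""
    longest = run = 1 if eye_positions else 0
    for prev, cur in zip(eye_positions, eye_positions[1:]):
        if abs(cur[0] - prev[0]) <= TOLERANCE and abs(cur[1] - prev[1]) <= TOLERANCE:
            run += 1
        else:
            run = 1               # gaze moved: start a new run
        longest = max(longest, run)
    return longest * FRAME_INTERVAL_S
```

A production system would extract `eye_positions` from the sensor's imaging data and likely smooth them before this step.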
For example, the processing unit 21 may cause the corresponding rehabilitation intervention item to be displayed on the display screen 22 based on the behavioral characteristics in two ways.
Mode 1: the processing unit 21 stores in advance a mapping relationship between behavioral characteristics and rehabilitation intervention items. It inputs the detected user behavioral characteristics into this mapping to obtain the corresponding rehabilitation intervention item and displays it on the display screen 22. The mapping relationship may be a pre-trained deep-learning-based neural network.
Mode 2: the processing unit 21 evaluates the user's symptom level based on the detected behavioral characteristics, searches for the rehabilitation intervention item corresponding to the evaluated symptom level, and displays that item on the display screen.
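The second mode might be sketched as follows. The scoring rule, thresholds, and item table are entirely assumed here for illustration; the patent does not specify how symptom levels are computed.

```python
# Sketch of symptom-level evaluation followed by a pre-stored item lookup.
ITEMS_BY_LEVEL = {            # assumed pre-stored rehabilitation intervention items
    1: "mild-symptom item",
    2: "moderate-symptom item",
    3: "severe-symptom item",
}

def symptom_level(gaze_time_s: float, correct_replies: int) -> int:
    """Toy scoring rule: shorter gaze and fewer correct replies -> higher level."""
    score = gaze_time_s + correct_replies
    if score >= 6:
        return 1
    if score >= 3:
        return 2
    return 3

def choose_item(gaze_time_s, correct_replies):
    return ITEMS_BY_LEVEL[symptom_level(gaze_time_s, correct_replies)]
```

In the first mode, `symptom_level` would be replaced by a learned mapping, for example a pre-trained neural network, with the same lookup at the end.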
It should be noted that the processing unit 21 may be locally located, or may be distributed: part of its processing is performed locally; part of the processing is performed at the server (or cloud).
The autism rehabilitation intervention system provided by the embodiment of the disclosure reduces the interference of the electronic device with the user as much as possible, and can also adjust the appropriate rehabilitation intervention item in real time to the user's condition with minimal disturbance. This is highly beneficial to the rehabilitation training of autistic children: it reduces interference with the child and can improve the treatment effect.
The autism rehabilitation intervention system according to one embodiment may also determine the state of the user using the user's voice signal. For example, the system may include a voice microphone, which may be the microphone array 25 or at least one microphone of the microphone array 25. The voice microphone receives a voice signal from the user, and the processing unit 21 changes the rehabilitation intervention item displayed on the display screen based on the voice signal.
For example, the processing unit 21 may change the rehabilitation intervention item displayed on the display screen 22 based on the voice signal in the following manner.
The first method is as follows: the processing unit 21 recognizes the content of the voice signal, and when the voice signal is recognized as a reply to the current rehabilitation intervention item and is an erroneous reply, the processing unit 21 causes a rehabilitation intervention item of lower intensity than the current rehabilitation intervention item to be displayed on the display screen 22.
Here, the processing unit 21 may store rehabilitation intervention items at different intensity levels. The processing unit 21 recognizes the content of the voice signal, and when the voice signal is recognized as a reply to the current rehabilitation intervention item and is a wrong reply, it indicates that the intensity of the current rehabilitation intervention item exceeds the user's perception, and effective rehabilitation intervention cannot be achieved for the user. At this time, the processing unit 21 may cause a rehabilitation intervention item reduced in intensity by one level compared to the current rehabilitation intervention item to be displayed on the display screen 22.
In order to avoid false identifications, the processing unit 21 may cause a less intense rehabilitation intervention item than the current rehabilitation intervention item to be displayed on the display 22 only when a predetermined number of false answers occur. For example, the predetermined number of times may be two times, three times, or the like.
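This repeated-wrong-reply rule might be implemented along the following lines. The predetermined count of two is one of the examples given in the text; the class name, the consecutive-reply reset, and the level floor are assumptions.

```python
# Sketch of the wrong-reply rule: intensity drops one level only after a
# predetermined number of consecutive wrong replies.
PREDETERMINED_WRONG_REPLIES = 2   # example value from the text

class IntensityController:
    def __init__(self, level: int):
        self.level = level        # current intervention intensity level
        self.wrong_count = 0

    def on_reply(self, correct: bool) -> int:
        """Update state for one recognized reply; return the (possibly new) level."""
        if correct:
            self.wrong_count = 0  # assumption: a correct reply resets the count
        else:
            self.wrong_count += 1
            if self.wrong_count >= PREDETERMINED_WRONG_REPLIES:
                self.level = max(1, self.level - 1)   # drop one level, floor at 1
                self.wrong_count = 0
        return self.level
```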
The second method comprises the following steps: the processing unit 21 may identify the content of the speech signal and when the speech signal is identified as abnormal speech not for the current rehabilitation intervention item, the processing unit 21 causes a different type of rehabilitation intervention item to be displayed on the display screen 22.
Abnormal speech may be, for example, a loud shout. When the voice signal is recognized as abnormal speech not directed at the current rehabilitation intervention item, this indicates that the user resists the current item, or that its intensity exceeds the user's cognition, and so on; the processing unit 21 then displays a different type of rehabilitation intervention item on the display screen 22. The different type of item may be selected based on historical user data, which represents the items historically better suited to the user, or it may be an item intended to soothe the user.
As shown in fig. 3, the autism rehabilitation intervention system may further include: a speaker 26. The speaker 26 issues different voice commands based on the current rehabilitation intervention program. The processing unit 21 detects different user action amplitudes corresponding to different voice instructions based on the imaging data and determines the amount of change in the user action amplitudes. The processing unit 21 causes the changed rehabilitation intervention item to be displayed on the display screen 22 based on the amount of change in the magnitude of the user's motion.
In one example, the speaker 26 may be disposed on the display screen 22 and also on the housing of the autism rehabilitation intervention system.
In one example, the current rehabilitation intervention item instructs the user to raise the right hand. The speaker 26 issues a voice instruction such as "raise your right hand". The invisible light source 23 emits invisible light toward the user, and the photosensitive sensor 24 images the user by detecting the light reflected from the user. The processing unit 21 receives the imaging data from the photosensitive sensor 24 and detects the position of the user's right hand. The foregoing steps are performed at least twice, and the processing unit 21 obtains the amount of change in the position of the right hand from the successively detected positions. In many cases, an autistic child will not raise the right hand fully; however, even a subtle change in the right hand's position is sufficient evidence of progress. The processing unit 21 changes the rehabilitation intervention item based on this amount of change, so that the state of the autistic child can be determined more accurately.
When it determines that the amount of change has increased, the processing unit 21 causes a rehabilitation intervention item of increased intensity to be displayed on the display screen 22. When it determines that the amount of change has decreased, the processing unit 21 causes an item of reduced intensity, or a changed item, to be displayed on the display screen 22.
This example continues with the action of raising the right hand. Rehabilitation intervention items and their corresponding intensity levels may be pre-stored. When the processing unit 21 determines that the amount of change has increased, indicating that the user can adapt to the current item, it displays on the display screen 22 an item of higher intensity than the current one. Otherwise, indicating that the user is not well suited to the current item, it displays an item of reduced intensity compared to the current one, or a changed item; the changed item may be a default, lowest-intensity rehabilitation intervention item.
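The intensity-adjustment rule above can be sketched as follows. The level bounds, the stubbed position input (a list of vertical hand positions rather than real imaging data), and the sign-based threshold are assumptions for illustration.

```python
# Sketch of intensity adjustment from the change in detected hand position:
# any increase raises the intensity one level, any decrease lowers it.
def hand_change(positions):
    """Change between the first and last detected hand positions."""
    return positions[-1] - positions[0]

def next_intensity(current: int, positions, lowest: int = 1, highest: int = 5) -> int:
    change = hand_change(positions)
    if change > 0:                       # user responded: raise intensity
        return min(highest, current + 1)
    if change < 0:                       # user regressed: lower intensity
        return max(lowest, current - 1)
    return current                       # no change detected
```

A real system would extract the hand positions from at least two imaging passes by the photosensitive sensor, as described above.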
The displayed rehabilitation intervention programs can thus be optimized, generating programs better suited to the user's own condition and further improving the treatment effect. The autism rehabilitation intervention system can also evaluate the rehabilitation intervention programs obtained in any of the above embodiments.
In one example of the evaluation, the processing unit 21 determines at least one of the user's gaze direction, gaze time, shift in gaze direction, and limb action features based on the imaging data. Furthermore, the processing unit 21 may also determine the user's language interaction feedback feature based on the speech signal from the microphone. The processing unit 21 determines whether the current rehabilitation intervention program is appropriate for the user based on at least one of the gaze direction, gaze time, shift in gaze direction, limb action features, and language interaction feedback feature.
In this example, the behavioral characteristics of the user include at least one of: gaze direction, gaze time, shift in gaze direction, limb action features, and language interaction feedback features. The language interaction feedback feature may be one of: correct feedback, substantially correct feedback, or incorrect feedback.
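A minimal sketch of how such behavioral features might be combined into a suitability decision is given below; the threshold values and the majority rule are illustrative assumptions, not part of this disclosure.

```python
# Hypothetical sketch: scoring whether the current program suits the
# user from the behavioral features named above. The gaze-time threshold
# and the "majority of cues" rule are illustrative assumptions.

def program_suits_user(gaze_on_target, gaze_time_s, limb_moved, feedback):
    """feedback: 'correct', 'substantially_correct', or 'incorrect'."""
    score = 0
    if gaze_on_target:
        score += 1
    if gaze_time_s >= 1.0:        # sustained attention (assumed threshold)
        score += 1
    if limb_moved:
        score += 1
    if feedback in ("correct", "substantially_correct"):
        score += 1
    return score >= 2              # majority of cues present

print(program_suits_user(True, 1.5, False, "incorrect"))  # prints True
```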
In another example of the evaluation, the processing unit 21 further determines whether the current rehabilitation intervention program is appropriate for the user based on the temporal relationship between at least two of the gaze direction, the shift in gaze direction, the user limb action features, and the language interaction feedback feature.
The processing unit 21 may determine the user's mental response to the rehabilitation intervention program based on the above features and/or the temporal relationships between them. For example, when the user is instructed to look to the right, the user shifts gaze to the right and fixates for a certain time. This indicates that the user's attention is shifting as directed by the rehabilitation intervention program, and thus that the current rehabilitation intervention program is effective.
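As a concrete illustration of such a timing relationship, the sketch below checks that the gaze shift follows the instruction within an acceptable latency and that the fixation is sustained; the latency and hold thresholds are illustrative assumptions.

```python
# Hypothetical sketch of the timing-relationship check: the gaze shift
# should come after the instruction, within an acceptable latency, and
# the fixation should then be held long enough. Times are in seconds;
# the window values are illustrative assumptions.

def follows_instruction(instr_t, gaze_shift_t, gaze_hold_s,
                        max_latency=3.0, min_hold=1.0):
    """True if the user shifted gaze after the instruction, within the
    latency window, and then held the gaze for at least min_hold."""
    latency = gaze_shift_t - instr_t
    return 0.0 <= latency <= max_latency and gaze_hold_s >= min_hold

# Instructed at t=10.0 s; gaze moved right at t=11.2 s and held 1.8 s.
print(follows_instruction(10.0, 11.2, 1.8))  # attention followed -> True
```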
Furthermore, the instruction speech directed to the user may be improved based on the above features. The processing unit 21 may further determine at least one of the user's gaze direction, gaze time, shift in gaze direction, and limb action features based on the imaging data, and/or determine the user's language interaction feedback feature based on the speech signal from the microphone. Based on at least one of the gaze direction, the shift in gaze direction, the limb action features, and the language interaction feedback feature, the processing unit 21 may adjust at least one of the frequency combination and the speech rate of the instruction speech used to send instructions to the user, so as to determine instruction speech suitable for the user.
The user's reaction to the instruction speech can be determined from the above features. For example, the user may exhibit discomfort, such as becoming agitated, or may respond better to certain voices. Speech suitable for the user can be adjusted and selected based on changes in these features.
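One way the frequency combination and speech rate might be adapted from the observed reactions can be sketched as follows; the parameter names, pitch options, and step sizes are illustrative assumptions, not values from this disclosure.

```python
# Hypothetical sketch of adapting the instruction speech: if the user
# shows discomfort, slow the rate and try another frequency (pitch)
# combination; if the user responds well, keep the current settings.
# Parameter names, options, and step sizes are illustrative.

PITCH_OPTIONS = ["low", "medium", "high"]  # assumed frequency combinations

def adjust_speech(rate_wpm, pitch, discomfort, responsive):
    """Return an adjusted (rate, pitch) pair for the instruction speech."""
    if discomfort:
        # slow down and move to the next pitch combination
        new_pitch = PITCH_OPTIONS[(PITCH_OPTIONS.index(pitch) + 1) % len(PITCH_OPTIONS)]
        return max(rate_wpm - 20, 80), new_pitch
    if responsive:
        return rate_wpm, pitch          # settings suit the user; keep them
    return rate_wpm - 10, pitch         # mildly unresponsive: slow slightly

print(adjust_speech(140, "high", discomfort=True, responsive=False))
# prints (120, 'low')
```

Repeating this adjustment over sessions converges on the speech settings the user responds to best, which is the goal stated above.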
Currently, existing rehabilitation intervention devices typically set the speech according to an adult's own understanding; that is, the design takes the perspective of an adult rather than that of an autistic child. In fact, speech often has a certain influence on a child's psychology. With fixed speech settings, it is sometimes difficult to determine whether an unsatisfactory training effect is due to the rehabilitation intervention program itself or to the speech. The speech adjustment described here, on the one hand, reflects care for the user's emotions; on the other hand, it can improve training accuracy to a certain extent.
Furthermore, although not shown, the autism rehabilitation intervention system may also include a tactile generating device such as a vibrator, a nebulizer, a heater, or a cooler. For example, the user may be prompted to switch attention by vibration, or a scent may be emitted by the nebulizer to attract the user's attention. The user's sense of touch can also be stimulated by the heater or the cooler. The nebulizer, heater, or cooler can likewise give the user a pleasant sensation: for example, the user's favorite scent may be sprayed through the nebulizer, and a temperature comfortable for the user can be set by the heater/cooler. The heater warms the user in a cold environment (winter) to provide a feeling of warmth and soothe the user's mood, while the cooler cools in a hot environment (summer) to keep the user comfortable. These approaches are very advantageous for sensitive autistic children and therefore improve the rehabilitation effect of the autism rehabilitation intervention system.
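A simple sketch of choosing a soothing tactile action from the ambient temperature, as with the heater/cooler described above, might look like this; the temperature thresholds are illustrative assumptions.

```python
# Hypothetical sketch: pick a comfort action from ambient temperature,
# as with the heater/cooler described above. Thresholds are illustrative.

def tactile_action(ambient_c):
    """Warm the user in cold weather, cool the user in hot weather."""
    if ambient_c < 15:
        return "heater_on"   # winter: provide a feeling of warmth
    if ambient_c > 28:
        return "cooler_on"   # summer: provide cool comfort
    return "idle"

print(tactile_action(5), tactile_action(33), tactile_action(22))
# prints: heater_on cooler_on idle
```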
< example >
Figs. 4-6 show one example of an autism rehabilitation intervention system according to the present disclosure.
Fig. 4 shows a block diagram of the autism rehabilitation intervention system.
As shown in fig. 4, the autism rehabilitation intervention system 1000 includes a processing unit 1100. The processing unit 1100 may include a processor 1110, an input/output interface unit 1120, a storage unit 1130, and a network unit 1140.
The storage unit 1130 includes, for example, a random access memory (RAM) 1131, a cache memory 1132, and a nonvolatile memory 1133. The storage unit 1130 stores the programs and/or data required for the autism rehabilitation intervention system to operate, for example rehabilitation intervention program data. Instructions may be stored in the storage unit 1130, and the processor 1110 executes a program to perform the above-described processing based on those instructions.
The input/output interface unit 1120 may connect, for example, the display 1150, the invisible light source 1160, the photosensitive sensor 1170, the microphone array 1180, the speaker 1190, and other external devices 1200.
Display 1150, invisible light source 1160, photosensitive sensor 1170, microphone array 1180, and speaker 1190 may be devices as described in the above embodiments and will not be repeated here.
Other external devices 1200 may include, for example, a keyboard, a mouse, and a tactile generating device that may also include a vibrator, a nebulizer, a heater, or a refrigerator.
The network unit 1140 may include, for example, a wired/wireless network adapter, such as a Wi-Fi network adapter, a Bluetooth network adapter, or a 3G/4G/5G network adapter. Through the network unit, the system may be connected to other processors/storage on the network to collectively form a larger processing unit.
Fig. 5 shows an example of connecting terminal devices through a network.
In fig. 5, terminal devices 1000-a, 1000-b structured as described in fig. 4 may be provided at the terminal locations. The users 4000-a, 4000-b use the terminal devices 1000-a, 1000-b for rehabilitation training.
The terminal devices 1000-a, 1000-b are connected to the network 2000 (cloud) via network access points 3000-a, 3000-b. The terminal devices 1000-a, 1000-b are shown in fig. 5 to be connected to the network access points 3000-a, 3000-b by wireless means, but they may also be connected by wired means.
A plurality of server devices 2100 are included in the network 2000. Each server device 2100 includes, for example, a respective processor 2101 and storage unit 2102. These can cooperate with the processor and storage unit in the terminal to realize the rehabilitation training process.
Fig. 6 shows an example of terminal setup.
As shown in fig. 6, in the terminal position, the user 4000 performs rehabilitation training using the terminal device 1000. The terminal device 1000 has a structure shown in fig. 4, for example.
A hidden invisible light source 1160 and a photosensitive sensor 1170 are provided under the screen 1050 of the terminal apparatus 1000.
The examples herein are summarized as follows.
A1, an autism rehabilitation intervention system, comprising:
a display screen;
an invisible light source;
a light sensitive sensor that images a user by detecting reflected light reflected by the user from light emitted by the invisible light source; and
a processing unit that receives imaging data from the light sensitive sensor, detects a behavioral characteristic of the user based on the imaging data, and causes a corresponding rehabilitation intervention item to be displayed on the display screen based on the behavioral characteristic.
A2, the autism rehabilitation intervention system of a1, wherein the invisible light source is a near-infrared light source.
A3, the autism rehabilitation intervention system according to A1 or A2, wherein the invisible light source is embedded in the display screen and is transparent.
A4, the autism rehabilitation intervention system according to any of a1-A3, wherein the light sensitive sensor is embedded in the display screen and is transparent.
A5, the autism rehabilitation intervention system according to any of a1-a4, wherein the light sensitive sensor is a time-of-flight (TOF) camera.
A6, the autism rehabilitation intervention system according to any one of A1-A5, further comprising: a microphone array disposed proximate to the light sensitive sensor, wherein the microphone array senses an orientation of a user, the behavioral characteristics include a gaze direction and gaze time of the user, and the processing unit controls a pointing direction of the invisible light source and/or the light sensitive sensor with the sensed orientation of the user and determines the gaze direction and gaze time of the user based on the imaging data.
A7, the autism rehabilitation intervention system according to any one of A1-A6, further comprising: a microphone array disposed proximate to the light sensitive sensor, wherein the microphone array senses an orientation of a user, the processing unit further controlling a pointing direction of the invisible light source and/or the light sensitive sensor using the sensed orientation.
A8, the autism rehabilitation intervention system according to any one of A1-A7, further comprising: a voice microphone, wherein the voice microphone receives a voice signal from a user, and the processing unit changes a rehabilitation intervention item displayed on the display screen based on the voice signal.
A9, the autism rehabilitation intervention system according to any one of a1-A8, wherein the processing unit recognizes the content of the voice signal, and when the voice signal is recognized as a reply to the current rehabilitation intervention item and is an erroneous reply, the processing unit causes a lower intensity rehabilitation intervention item than the current rehabilitation intervention item to be displayed on the display screen.
A10, the autism rehabilitation intervention system according to any one of a1-a9, wherein the processing unit, upon occurrence of a predetermined number of false answers, causes a less intense rehabilitation intervention item to be displayed on the display screen compared to the current rehabilitation intervention item.
A11, the autism rehabilitation intervention system according to any one of a1-a10, wherein the processing unit recognizes the content of the voice signal, and when the voice signal is recognized as abnormal voice not for a current rehabilitation intervention item, the processing unit causes a different type of rehabilitation intervention item to be displayed on the display screen.
A12, the autism rehabilitation intervention system according to any of A1-A11, wherein different types of rehabilitation intervention items are selected based on historical user data,
wherein the historical user data represents rehabilitation intervention programs that are historically better suited for the user.
A13, the autism rehabilitation intervention system according to any one of A1-A12, further comprising: a speaker,
wherein the speaker issues different voice instructions based on a current rehabilitation intervention program,
wherein the processing unit detects different user action magnitudes corresponding to different voice instructions based on the imaging data and determines an amount of change in the user action magnitudes, and
wherein the processing unit causes the display of the changed rehabilitation intervention item on the display screen based on the amount of change in the magnitude of the user action.
A14, the autism rehabilitation intervention system according to any one of a1-a13, wherein the processing unit causes an increased-intensity rehabilitation intervention item to be displayed on the display screen when it is determined that the amount of change increases, or causes a decreased-intensity rehabilitation intervention item or a changed rehabilitation intervention item to be displayed on the display screen when it is determined that the amount of change decreases.
A15, the autism rehabilitation intervention system according to any of A1-A14, wherein the processing unit determines at least one of a gaze direction, a gaze time, a shift change in gaze direction, and a user limb movement feature of the user based on the imaging data, and/or the processing unit determines a language interaction feedback feature of the user based on a speech signal from a microphone, and
wherein the processing unit determines whether the current rehabilitation intervention program is appropriate for the user based on at least one of the gaze direction, gaze time, translational changes in gaze direction, user limb motion characteristics, and language interaction feedback characteristics.
A16, the autism rehabilitation intervention system according to any one of a1-a15, wherein the processing unit further determines whether the current rehabilitation intervention program is appropriate for the user based on a timing relationship between at least two of the gaze direction, the shift change in gaze direction, the user limb action feature, and the language interaction feedback feature.
A17, the autism rehabilitation intervention system according to any of A1-A16, wherein the processing unit determines at least one of a gaze direction, a gaze time, a shift change in gaze direction, and a user limb movement feature of the user based on the imaging data, and/or the processing unit determines a language interaction feedback feature of the user based on a speech signal from a microphone, and
wherein the processing unit adjusts at least one of a frequency combination and a speech rate of the indicated speech for sending instructions to the user to determine the indicated speech suitable for the user based on at least one of the gaze direction, the translational change in gaze direction, the user limb action characteristics, and the language interaction feedback characteristics.
The processing units herein may be embodied as electronic devices and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to block diagrams of systems and computer program products according to embodiments of the invention. It will be understood that each block of the block diagrams can be implemented by computer readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams, and combinations of blocks in the block diagrams, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (15)

1. An autism rehabilitation intervention system, comprising:
a display screen;
an invisible light source;
a light sensitive sensor that images a user by detecting reflected light reflected by the user from light emitted by the invisible light source; and
a processing unit that receives imaging data from a photosensitive sensor, detects a behavioral characteristic of a user based on the imaging data, and causes a corresponding rehabilitation intervention item to be displayed on the display screen based on the behavioral characteristic;
wherein the autism rehabilitation intervention system further comprises: a speaker,
the speaker issues different voice commands based on the current rehabilitation intervention program,
the processing unit detects different user action magnitudes corresponding to different voice instructions based on the imaging data and determines a variation amount of the user action magnitudes, and
The processing unit causes the display of the changed rehabilitation intervention item on the display screen based on the amount of change in the magnitude of the user action;
wherein the processing unit causes an increased intensity rehabilitation intervention item to be displayed on the display screen when it is determined that the amount of change is increased, or causes a decreased intensity rehabilitation intervention item or a changed rehabilitation intervention item to be displayed on the display screen when it is determined that the amount of change is decreased.
2. The autism rehabilitation intervention system of claim 1, wherein the invisible light source is a near infrared light source.
3. The autism rehabilitation intervention system of claim 1, wherein the invisible light source is embedded in the display screen and is transparent.
4. The autism rehabilitation intervention system of claim 1, wherein the light sensitive sensor is embedded in the display screen and is transparent.
5. The autism rehabilitation intervention system of claim 1, wherein the light sensitive sensor is a time of flight (TOF) camera.
6. The autism rehabilitation intervention system of claim 1, further comprising: a microphone array disposed proximate to the light sensitive sensor, wherein the microphone array senses an orientation of a user, the behavioral characteristics include a gaze direction and gaze time of the user, and the processing unit controls a pointing direction of the invisible light source and/or the light sensitive sensor with the sensed orientation of the user and determines the gaze direction and gaze time of the user based on the imaging data.
7. The autism rehabilitation intervention system of claim 1, further comprising: a microphone array disposed proximate to the light sensitive sensor, wherein the microphone array senses an orientation of a user, the processing unit further controlling a pointing direction of the invisible light source and/or the light sensitive sensor using the sensed orientation.
8. The autism rehabilitation intervention system of claim 1, further comprising: a voice microphone, wherein the voice microphone receives a voice signal from a user, and the processing unit changes a rehabilitation intervention item displayed on the display screen based on the voice signal.
9. The autism rehabilitation intervention system of claim 8, wherein the processing unit recognizes a content of the voice signal, and when the voice signal is recognized as a reply to a current rehabilitation intervention program and is an erroneous reply, the processing unit causes a lower intensity rehabilitation intervention program to be displayed on the display screen compared to the current rehabilitation intervention program.
10. The autism rehabilitation intervention system of claim 9, wherein the processing unit, upon a predetermined number of false answers, causes a less intense rehabilitation intervention program to be displayed on the display screen than a current rehabilitation intervention program.
11. The autism rehabilitation intervention system of claim 8, wherein the processing unit identifies content of the voice signal, and causes different types of rehabilitation intervention items to be displayed on the display screen when the voice signal is identified as abnormal voice other than for a current rehabilitation intervention item.
12. The autism rehabilitation intervention system of claim 11, wherein different types of rehabilitation intervention programs are selected based on historical user data,
wherein the historical user data represents rehabilitation intervention programs that are historically better suited for the user.
13. The autism rehabilitation intervention system of claim 1, wherein the processing unit determines at least one of a gaze direction, a gaze time, a shift change in gaze direction, and a user limb action feature of the user based on the imaging data, and/or the processing unit determines a language interaction feedback feature of the user based on a voice signal from a microphone, and
wherein the processing unit determines whether the current rehabilitation intervention program is appropriate for the user based on at least one of the gaze direction, gaze time, translational changes in gaze direction, user limb motion characteristics, and language interaction feedback characteristics.
14. The autism rehabilitation intervention system of claim 13, wherein the processing unit further determines whether a current rehabilitation intervention program is appropriate for the user based on a timing relationship between at least two of the gaze direction, the translational change in gaze direction, the user limb motion feature, and the language interaction feedback feature.
15. The autism rehabilitation intervention system of claim 13, wherein the processing unit determines at least one of a gaze direction, a gaze time, a shift change in gaze direction, and a user limb action feature of the user based on the imaging data, and/or the processing unit determines a language interaction feedback feature of the user based on a voice signal from a microphone, and
wherein the processing unit adjusts at least one of a frequency combination and a speech rate of the indicated speech for sending instructions to the user to determine the indicated speech suitable for the user based on at least one of the gaze direction, the translational change in gaze direction, the user limb action characteristics, and the language interaction feedback characteristics.
CN201910330877.XA 2019-04-23 2019-04-23 Autism rehabilitation intervention system Active CN110152160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910330877.XA CN110152160B (en) 2019-04-23 2019-04-23 Autism rehabilitation intervention system


Publications (2)

Publication Number Publication Date
CN110152160A (en) 2019-08-23
CN110152160B (en) 2021-08-31


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116440382B (en) * 2023-03-14 2024-01-09 北京阿叟阿巴科技有限公司 Autism intervention system and method based on multilayer reinforcement strategy
CN116665892B (en) * 2023-03-24 2023-11-17 北京大学第六医院 Autism evaluation system, method and device

Citations (4)

Publication number Priority date Publication date Assignee Title
WO2017197411A1 (en) * 2016-05-13 2017-11-16 The General Hospital Corporation Methods and apparatus for treatment of disorders
CN108899081A (en) * 2018-06-14 2018-11-27 北京科技大学 A kind of man-machine interactive system towards self-closing disease recovering aid
CN108938379A (en) * 2018-07-24 2018-12-07 广州狄卡视觉科技有限公司 A kind of self-closing disease rehabilitation education human-computer interaction intensive training system
CN109414164A (en) * 2016-05-09 2019-03-01 奇跃公司 Augmented reality system and method for user health analysis

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR102491130B1 (en) * 2016-06-20 2023-01-19 매직 립, 인코포레이티드 Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions


Also Published As

Publication number Publication date
CN110152160A (en) 2019-08-23

Similar Documents

Publication Publication Date Title
US10366778B2 (en) Method and device for processing content based on bio-signals
JP6939797B2 (en) Information processing equipment, information processing methods, and programs
CN112771454A (en) Method and system for providing control user interface for household appliance
Vetter et al. Context estimation for sensorimotor control
CN110152160B (en) Autism rehabilitation intervention system
US11407106B2 (en) Electronic device capable of moving and operating method thereof
US11937930B2 (en) Cognitive state-based seamless stimuli
KR20190066428A (en) Apparatus and method for machine learning based prediction model and quantitative control of virtual reality contents’ cyber sickness
KR20220036368A (en) brain-computer interface
KR20220100862A (en) smart ring
US20220300143A1 (en) Light Field Display System for Consumer Devices
US20230229246A1 (en) Optimization on an input sensor based on sensor data
WO2016143415A1 (en) Information processing apparatus, information processing method, and program
US20230026513A1 (en) Human interface device
EP4379511A2 (en) Brain-computer interface
US20230022442A1 (en) Human interface system
EP4314998A1 (en) Stress detection
US11457804B1 (en) System for monitoring vision changes
KR102078792B1 (en) Apparatus for treatment of asperger
WO2019190812A1 (en) Intelligent assistant device communicating non-verbal cues
WO2022059784A1 (en) Information provision device, information provision method, and program
TW200416581A (en) Vision-driving control system and control method thereof
CN115129163B (en) Virtual human behavior interaction system
Afergan Using brain-computer interfaces for implicit input
US20230350492A1 (en) Smart Ring

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant