CN112148884B - Systems and methods for autism intervention - Google Patents

Systems and methods for autism intervention

Info

Publication number
CN112148884B
CN112148884B (application CN202010852052.7A)
Authority
CN
China
Prior art keywords
intervention
entity
node
mode
task
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010852052.7A
Other languages
Chinese (zh)
Other versions
CN112148884A
Inventor
程建宏
宋华俐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Azuaba Technology Co ltd
Original Assignee
Beijing Azuaba Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Azuaba Technology Co ltd filed Critical Beijing Azuaba Technology Co ltd
Priority to CN202010852052.7A priority Critical patent/CN112148884B/en
Publication of CN112148884A publication Critical patent/CN112148884A/en
Application granted granted Critical
Publication of CN112148884B publication Critical patent/CN112148884B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F16/367 — Creation of semantic tools; Ontology
    • G06F16/432 — Query formulation for multimedia data
    • G06F16/433 — Query formulation using audio data
    • G06F16/434 — Query formulation using image data, e.g. images, photos, pictures taken by a user
    • G16H20/70 — ICT specially adapted for therapies or health-improving plans, relating to mental therapies, e.g. psychological therapy or autogenous training
    • Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The present disclosure relates to a system and method for autism intervention. The system comprises a processing device and an output device. The processing device includes a processor and a memory storing executable instructions that, when the processing device is running, control the processor to perform the following: acquiring an intervention knowledge graph for autism, wherein the intervention knowledge graph comprises task nodes and mode nodes, the content of a task node representing a task to be performed in the autism intervention and the content of a mode node representing a manner in which the autism intervention is to be performed; determining a first task node; determining a first mode node based on the first task node; and acquiring at least one first intervention entity based on the first mode node. The output device outputs the first intervention entity to the intervention subject.

Description

Systems and methods for autism intervention
Technical Field
The present disclosure relates to the technical field of autism medical devices, and more particularly, to a system for autism intervention and a method for autism intervention.
Background
Autism, also known as autistic disorder, is a type of neurodevelopmental disorder.
Currently, intervention treatment of autism relies mainly on human experience. First, this cannot guarantee a consistent level of treatment: practitioners vary in skill, and different practitioners may apply different treatment methods to the same symptoms in a child, leading to unstable therapeutic effects. Second, it is difficult for one therapist to share experience with other therapists in a timely manner. Third, children often have a resistant attitude toward the clinician, fail to cooperate with the treatment, and the therapeutic effect is therefore poor.
Disclosure of Invention
It is an object of the present disclosure to provide a new system for autism intervention.
According to a first aspect of the present disclosure, there is provided a system for autism intervention, comprising a processing device and an output device. The processing device comprises a processor and a memory, the memory storing executable instructions that, when the processing device is running, control the processor to perform the following: acquiring an intervention knowledge graph for autism, wherein the intervention knowledge graph comprises task nodes and mode nodes, the content of a task node representing a task to be performed in the autism intervention and the content of a mode node representing a manner in which the autism intervention is to be performed; determining a first task node; determining a first mode node based on the first task node; and acquiring at least one first intervention entity based on the first mode node. The output device outputs the first intervention entity to the intervention subject.
According to a second aspect of the present disclosure, there is provided a method for autism intervention, comprising: acquiring an intervention knowledge graph for autism, wherein the intervention knowledge graph comprises task nodes and mode nodes, the content of a task node representing a task to be performed in the autism intervention and the content of a mode node representing a manner in which the autism intervention is to be performed; determining a first task node; determining a first mode node based on the first task node; acquiring at least one first intervention entity based on the first mode node; and outputting the first intervention entity to the intervention subject.
According to embodiments of the present disclosure, an autism intervention is performed using a knowledge-graph, such that an automated autism intervention scheme may be provided.
Other features of the present disclosure and its advantages will become apparent from the following detailed description of exemplary embodiments of the disclosure, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic structural diagram of a system for autism intervention provided by an embodiment of the present disclosure.
Fig. 2 is a schematic structural diagram of an intervention knowledge-graph according to an embodiment of the disclosure.
Fig. 3 is a schematic structural diagram of another system for autism intervention provided by embodiments of the present disclosure.
Fig. 4 is a flow chart of a method for autism intervention provided by an embodiment of the present disclosure.
Fig. 5 is a schematic structural view of a device for autism intervention provided in an embodiment of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
< System >
Fig. 1 is a schematic block diagram of a system for autism intervention provided in accordance with one embodiment of the present disclosure.
As shown in fig. 1, the system for autism intervention includes: a processing device 11 and an output device 12.
The processing device 11 comprises a processor 111 and a memory 112. The processor 111 may include, for example, a CPU, an MPU, or an MCU. The memory 112 may store underlying software, system software, application software, data, and the like, and may include various forms of memory, such as ROM, RAM, and flash. The memory 112 stores executable instructions that, when the processing device 11 is running, control the processor 111 to perform the following processing of S1100-S1400.
S1100, acquiring an intervention knowledge-graph for autism.
The intervention knowledge graph comprises task nodes and mode nodes. The content of a task node represents a task to be performed in the autism intervention, and the content of a mode node represents a manner in which the autism intervention is to be performed.
The intervention knowledge graph may be stored in the memory 112. A knowledge graph is a graph-based data structure consisting of nodes (vertices) and edges. In a knowledge graph, each node represents an "entity" existing in the real world, and each edge represents a "relationship" between entities; the knowledge graph is an effective representation of such relationships. A knowledge graph can be regarded as a network of relationships obtained by linking together different kinds of information. The knowledge-graph concept was first proposed by Google and is mainly used to optimize existing search engines. Typically, a knowledge graph is composed of pieces of knowledge, each of which can be represented as an SPO (Subject-Predicate-Object) triplet. Here, the intervention knowledge graph is used for intervention in autism in a child. The intervention knowledge graph may be acquired in a number of ways; for example, it may be manually entered by a designer.
The intervention knowledge graph comprises at least one task node and at least one mode node. A task node may be connected to at least one other task node and/or mode node, and a mode node may be connected to at least one task node and/or other mode node.
The contents of the task node may include, for example: cognitive intervention tasks, life self-care intervention tasks, and social intervention tasks. These tasks may be in a side-by-side relationship. Furthermore, in the intervention knowledge-graph, a task may also include one or more subtasks. For example, social intervention tasks may include language requirements subtasks, active fetching subtasks, and the like. A task node may be connected by edges to other parallel nodes, and may also be connected to its child nodes.
The mode node may represent how the autism intervention is performed. The content of a mode node may directly represent the manner of intervention; for example, it may include questions that the intervention subject is expected to answer. The content of a mode node may also indirectly represent the manner of intervention; for example, it may include a keyword for things the intervention subject is expected to identify, such as "fruit". The processing device 11 can then search for things related to the keyword, e.g. various "fruits", for presentation to the intervention subject.
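The node-and-edge structure described above can be sketched in code as follows. This is a minimal illustration, not the patented implementation; the class and node names are hypothetical.

```python
# Hypothetical minimal sketch of the intervention knowledge graph: task
# nodes and mode nodes are vertices, and edges carry weighted relationships.
class Node:
    def __init__(self, kind, content):
        self.kind = kind        # "task" or "mode"
        self.content = content  # task description or intervention mode data
        self.edges = []         # list of (neighbor, weight) pairs

    def connect(self, other, weight=1.0):
        self.edges.append((other, weight))

# Fragment from the example: a social task with a subtask, linked to a mode
# node that carries both a prompt and a retrieval keyword.
social = Node("task", "social")
ask_by_language = Node("task", "ask for an item by language")
mode1 = Node("mode", {"audio": "What do you want?", "keyword": "fruit"})

social.connect(ask_by_language)
ask_by_language.connect(mode1)

# A task node may reach mode nodes directly or via its child task nodes.
reachable_modes = [n for n, _ in ask_by_language.edges if n.kind == "mode"]
```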
S1200, determining a first task node.
Herein, "first", "second", and so on are used to distinguish different things and do not indicate the order, priority, or the like of the indicated nodes or other things. The task node currently to be processed in the intervention knowledge graph may be referred to herein as the first task node. The first task node may be determined in a number of ways; for example, it may be determined by the processing device 11 receiving a setting from an operator, or automatically by the processing device 11 based on information of the intervention subject.
S1300, determining a first mode node based on the first task node.
In the intervention knowledge graph, the nodes are connected to each other by edges. Nodes represent "entities", i.e., tasks and/or manners of intervention, and edges represent relationships. Once the first task node is determined, the first mode node may be determined through the edges connected to the first task node. The first mode node may be directly connected to the first task node, or connected to it through another task node.
The first mode node may be randomly determined among nodes connected with the first task node via edges; or the first mode node may be sequentially determined among nodes connected with the first task node via edges; or the first mode node may be determined based on the weights of the respective edges connected to the first task node.
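The three selection strategies above (random, sequential, or by edge weight) can be sketched as follows. The function and class names are illustrative assumptions, not from the patent.

```python
import random

class N:  # minimal stand-in for a knowledge-graph node (illustrative)
    def __init__(self, name, edges=()):
        self.name, self.edges = name, list(edges)

def pick_mode(task, strategy="weight", cursor=0):
    """Pick the first mode node among the nodes connected to a task node.

    Mirrors the text: random choice, sequential (round-robin) choice,
    or choice by highest edge weight.
    """
    if strategy == "random":
        return random.choice(task.edges)[0]
    if strategy == "sequential":
        return task.edges[cursor % len(task.edges)][0]
    return max(task.edges, key=lambda nw: nw[1])[0]  # highest weight wins

m1, m2 = N("mode1"), N("mode2")
task = N("ask for an item by language", [(m1, 0.3), (m2, 0.7)])
first_mode = pick_mode(task, "weight")   # mode2: largest edge weight
```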
S1400, at least one first intervention entity is acquired based on the first mode node.
The intervention entity is the actual content used to intervene on the intervention subject, e.g. audio, video, pictures, vibration, odor, etc. After the at least one first intervention entity is acquired, the output device 12 outputs the first intervention entity to the intervention subject.
Applying the intervention knowledge graph to autism intervention helps eliminate differences in experience between different intervention staff. In addition, once the latest progress on autism intervention is available, the intervention knowledge graph can be updated directly, which can eliminate the need to retrain intervention personnel. On the one hand, this saves costs; on the other hand, it facilitates the rapid dissemination of research results on autism. Moreover, unlike the usual use of knowledge graphs, here the intervention knowledge graph relates nodes of different natures to produce intervention entity outputs for intervention subjects, thereby forming an intervention-subject-oriented knowledge graph.
The first mode node may directly contain the first intervention entity. For example, the content of the first mode node includes the text and/or audio of "What do you want?". This content may be output to the intervention subject as the first intervention entity, awaiting the response of the intervention subject.
Furthermore, the first mode node may contain information for retrieving the first intervention entity. For example, the above-described process S1400 may be implemented by the following process S1410.
S1410, retrieving a first intervention entity from a database based on the keywords in the first schema node.
The database may be stored in the memory 112. Multiple intervention entities may be stored in the database. Intervention entities matching the keywords in the first mode node may be retrieved from the database as the first intervention entity. For example, if the first mode node includes the keyword "fruit", the processing device 11 may retrieve pictures of various fruits, e.g. apples, peaches, and bananas, from the database. The retrieved pictures of the various fruits may be output as intervention entities to the intervention subject.
Furthermore, the above two ways may be combined, i.e., the first mode node may both directly contain the first intervention entity and contain information for acquiring further intervention entities. For example, the content of the first mode node includes the text and/or audio of "What do you want?" and also includes the keyword "fruit". In this case, the words and/or audio of "What do you want?" can be output to the intervention subject while the retrieved pictures of various fruits are displayed, and the response of the intervention subject is then determined.
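The retrieval step S1410 and the combined case can be sketched as follows, using a hypothetical in-memory database keyed by keyword; all names and file names are illustrative assumptions.

```python
# Hypothetical in-memory material database: keyword -> intervention entities.
MATERIALS = {
    "fruit": [
        {"type": "image", "name": "apple.png"},
        {"type": "image", "name": "banana.png"},
        {"type": "image", "name": "peach.png"},
    ],
}

def resolve_entities(mode_content):
    """Sketch of S1400/S1410: collect the entities a mode node yields.

    A mode node may carry entities directly (e.g. prompt audio), carry a
    keyword used to retrieve entities from the database, or both.
    """
    entities = list(mode_content.get("direct", []))
    keyword = mode_content.get("keyword")
    if keyword:
        entities.extend(MATERIALS.get(keyword, []))
    return entities

mode = {"direct": [{"type": "audio", "name": "what_do_you_want.wav"}],
        "keyword": "fruit"}
out = resolve_entities(mode)   # the prompt audio plus retrieved fruit images
```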
For example, intervention entities in the database may be obtained through a web crawler. Furthermore, intervention entities in the database may be entered manually.
The intervention entities may be heterogeneous entities, i.e., the data types of the intervention entities may differ. An existing knowledge graph is a semantic network used to represent the meaning of language: each piece of knowledge can be represented as an SPO triplet composed of text data, so existing knowledge graphs are used primarily to mimic human language. Here, the inventors propose using the intervention knowledge graph to establish links between different intervention tasks, intervention modes, and intervention entities; the semantics of individual nodes are not considered. Thus, heterogeneous intervention entities may be used: the data types of the intervention entities are not limited to text but may be heterogeneous data. Herein, "heterogeneous" means that the intervention entities may be text, audio, video, odor data, temperature data, and so on, without being converted into text data of a unified format as in the prior art.
As previously described, a mode node may contain text and/or audio. Furthermore, the intervention entities in the database may include: audio entities, image entities, odor entities, temperature entities, vibration entities, and spray entities.
In the case where the intervention entities in the database are heterogeneous data, the intervention subjects (typically autistic children) can be provided with interventions in various dimensions, for example visual, auditory, and olfactory. In this way, the effect of the intervention on the intervention subject can be enhanced.
Here, by using heterogeneous intervention entities, the human cognitive process can be simulated in a way that matches the cognition of the intervention subject, thereby enhancing the intervention effect. For example, at the beginning of a newborn infant's cognition, the infant has no semantic understanding; it simply links the things it observes with the sounds, touches, and smells it receives, and thereby gradually builds abstraction and cognition. Here, raw heterogeneous intervention entities are used as intervention material, and their associations are established through the knowledge graph. This simulates the basic process by which humans establish cognition, which helps identify the symptoms of the intervention subject during the intervention and thereby achieve the intervention effect.
An example of acquiring and outputting an intervention entity is described below with reference to Fig. 2. As shown in Fig. 2, the task nodes include a "social" task node, a "make a request" task node, an "ask for an item by language" task node, an "actively pick up" task node, and so on. The "make a request" task node is a child node of the "social" task node, and the "ask for an item by language" task node is a child node of the "make a request" task node. Any of these task nodes may be determined as the first task node.
Fig. 2 shows three mode nodes, denoted modes 1, 2, and 3. The first mode node may be selected from modes 1, 2, and 3.
1. The content of mode 1 includes two aspects. The first aspect is the audio of "What do you want?". The second aspect is information on "items that may be of interest to children".
Based on the information of "items that may be of interest to children", a picture of the following items may be retrieved in the database: kiwi, banana, pomegranate, pear, pineapple, watermelon, apple, peach, cake, orange, litchi, peanut, pistachio, ham, cucumber, tomato, dragon fruit, corn, egg, cantaloupe, automobile, ball, cube, balloon, grape, lollipop, snowflake, clip, collar, sleeve, small train toy, seaweed, melon seed, potato chip, hawthorn, many fish, small steamed bread, pad, bubble, building blocks, pistol, mane building blocks, television remote control, biscuit, cola, shrimp cracker, raisin, saber, small aircraft, ocean ball, jigsaw, bullet, hamburger, robot, mobile phone, bear.
The audio of "What do you want?" may be played while the pictures of the retrieved items are displayed for selection by the intervention subject.
2. The content of mode 2 includes only the audio of "What do you want?".
After the audio of "What do you want?" is played, the intervention subject needs to name the desired item by himself or herself.
3. The content of mode 3 includes only information on "items that may be of interest to children".
Based on the information on "items that may be of interest to children", pictures of the items can be retrieved from the database.
Pictures of the retrieved items may be presented by the output device 12 for the intervention subject to select from.
Based on the intervention knowledge graph, the first mode node is determined from the first task node. The "ask for an item by language" task node is connected to modes 1, 2, and 3, so any of modes 1, 2, and 3 may be determined from it. The "actively pick up" task node is connected only to mode 3, so mode 3 is determined from it.
The intervention entity may be an audio-like entity (e.g. sound), an image-like entity (e.g. a picture or video of fruit), an odor-like entity (e.g. fragrance of flowers), a temperature-like entity (e.g. an increase or decrease in temperature), a vibration-like entity (e.g. vibration stimulus for an intervention subject), and a spray-like entity (e.g. a spray of water mist). In this case, the output device 12 includes at least one of the following components:
-an output interface for connecting the processing device 11 and an external device;
-display means outputting an intervention entity of the image class;
-a speaker outputting an intervention entity of the audio class;
-a vibrating device outputting a vibrating-like intervention entity;
-a scent generating means outputting a scent-like intervention entity;
-temperature adjustment means outputting a intervention entity of the temperature class; and
-a nebulizer outputting a nebulizing-like intervention entity.
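The routing of a heterogeneous intervention entity to the matching output component can be sketched as a simple dispatch table. The component names and entity fields are illustrative assumptions.

```python
# Sketch of dispatching heterogeneous intervention entities to the output
# components listed above; `outputs` maps component names to callables.
def dispatch(entity, outputs):
    handlers = {
        "image": outputs.get("display"),
        "audio": outputs.get("speaker"),
        "vibration": outputs.get("vibrator"),
        "odor": outputs.get("scent_generator"),
        "temperature": outputs.get("temperature_controller"),
        "spray": outputs.get("nebulizer"),
    }
    handler = handlers.get(entity["type"])
    if handler is None:
        raise ValueError(f"no output component for {entity['type']!r}")
    return handler(entity)

log = []
outputs = {"display": lambda e: log.append(("shown", e["name"])),
           "speaker": lambda e: log.append(("played", e["name"]))}
dispatch({"type": "image", "name": "banana.png"}, outputs)
```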
The system for autism intervention disclosed herein can intervene on the intervention subject automatically and intelligently. This reduces the differences that arise when different intervention staff work with the intervention subject, and to some extent avoids the unstable therapeutic effects of intervention based on human experience. Furthermore, it makes it easier for intervention staff to share intervention experience.
Intervening in autism by means of the processing device and the output device can reduce the involvement of intervention staff. In some cases, this reduces the influence of human factors on the intervention subject, so that the effect of an intervention mode can be determined more accurately. Furthermore, this may also improve the intervention effect.
In one embodiment, the above-described process S1200 may include the following processes S1210 and S1211.
S1210, counting the determinations of the task node.
S1211, disabling the task node during the current intervention when the count value is greater than a first predetermined threshold.
The first predetermined threshold may be set according to practical experience, for example, setting the first predetermined threshold to 7, 8 or 9.
During the current intervention, the number of times the same node is determined as the first task node can be counted to obtain a count value. When the count value reaches the first predetermined threshold, that first task node is disabled during the current intervention, so that another task node is determined as the first task node and the corresponding intervention processing is performed. In this way, the intervention process is prevented from entering a closed loop.
Alternatively, the above-described process S1300 may include the following processes S1310 and S1311.
S1310, counting the determinations of the mode node.
S1311, disabling the mode node during the current intervention when the count value is greater than a second predetermined threshold.
The second predetermined threshold may be set based on practical experience, for example, setting the second predetermined threshold to 3, 4 or 5.
During the current intervention, the number of times the same node is determined as the first mode node can be counted to obtain a count value. When the count value reaches the second predetermined threshold, that first mode node is disabled during the current intervention. In this way, the intervention process can be prevented from entering a closed loop.
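Both counting mechanisms (S1210/S1211 for task nodes and S1310/S1311 for mode nodes) follow the same count-and-disable pattern, sketched below. The class name is hypothetical; the default thresholds follow the example values given in the text.

```python
from collections import Counter

class SessionCounters:
    """Count how often a node is chosen in the current intervention
    session and disable it once a threshold is exceeded (loop prevention).
    Default limits follow the example thresholds in the text."""

    def __init__(self, task_limit=8, mode_limit=4):
        self.counts = Counter()
        self.limits = {"task": task_limit, "mode": mode_limit}
        self.disabled = set()

    def record(self, node_id, kind):
        """Record one determination; return False once the node is disabled."""
        self.counts[node_id] += 1
        if self.counts[node_id] > self.limits[kind]:
            self.disabled.add(node_id)   # skip this node for the session
        return node_id not in self.disabled

c = SessionCounters(task_limit=2)
c.record("t1", "task"); c.record("t1", "task")
still_allowed = c.record("t1", "task")   # third pick exceeds the limit
```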
Optionally, the intervention entities in the database have weights. Thus, the above-described process S1410 may include the following processes S1411 and S1412.
S1411, retrieving intervention entities from a database based on the keywords of the first mode node.
S1412, determining at least one intervention entity with a higher weight as the first intervention entity.
The processor 111 may retrieve all intervention entities matching the keywords from the database based on the keywords of the first mode node. The processor 111 may take the intervention entities whose weights are greater than a preset threshold as the first intervention entity; alternatively, the retrieved intervention entities may be arranged in descending order of weight, with those ranked within a preset number of top positions taken as the first intervention entity.
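The two weight-based selection variants of S1411/S1412 can be sketched as follows; the function name and weight values are illustrative assumptions.

```python
def select_top_entities(candidates, k=None, min_weight=None):
    """Sketch of S1411/S1412: among retrieved entities, keep the
    higher-weight ones — either everything above a weight threshold, or
    the top-k after sorting by weight in descending order."""
    ranked = sorted(candidates, key=lambda e: e["weight"], reverse=True)
    if min_weight is not None:
        return [e for e in ranked if e["weight"] > min_weight]
    return ranked[:k]

fruits = [{"name": "apple", "weight": 0.9},
          {"name": "durian", "weight": 0.2},
          {"name": "banana", "weight": 0.7}]
top2 = select_top_entities(fruits, k=2)              # top-k variant
above = select_top_entities(fruits, min_weight=0.5)  # threshold variant
```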
Alternatively, the weights of the intervening entities in the database may be manually set in the initial state of the database. After the database is used, the weights of the intervention entities in the database may be updated based on the intervention effect.
An embodiment of obtaining the intervention result is described below with reference to Fig. 3. As shown in Fig. 3, the system for autism intervention further comprises an input device 13. The input device 13 is used to input first intervention result data representing the result of the intervention on the intervention subject by the first intervention entity.
In one example, the first intervention result data may include data entered manually by an intervention person through an input device, i.e., the input device receives first intervention result data entered manually by the intervention person.
In another embodiment, the input device 13 comprises a capturing device that captures at least one of an image and sound of the intervention subject as the first intervention result data. For example, the first intervention result data characterizes the reaction of the intervention subject to the first intervention entity. Such reactions include the intervention subject's actions, gestures, language, and so on, and data characterizing them can be captured by the capturing device. For example, when an image of a banana is presented and the intervention subject reacts to it, the first intervention result data may be image data of the intervention subject looking at the banana, image data of the intervention subject pointing a finger at the banana, audio of the intervention subject saying "I want the banana", or the like. In this case, the capturing device may include a camera, a microphone, or the like.
In another embodiment, the processor 111 may also perform the following processes S1500 and S1600.
S1500, based on the first intervention result data, first intervention effect information is generated.
Processor 111 may evaluate the intervention effect of the first intervention entity based on the first intervention result data and evaluate the result as first intervention effect information.
In one example, the first intervention effect information may be a score or an effect level for the intervention outcome.
For example, when the intervention staff inputs "good", the first intervention effect information may be set to 9.
In another example, the processor 111 analyzes the first intervention result data to obtain the first intervention effect information. For example, the first intervention result data captured by the capturing device represents a response of the intervention subject to the intervention entity, and based on this response the processing device 11 may determine the intervention effect information. The processing device 11 may also determine the intervention effect information from multiple dimensions. For example, when the video in the first intervention result data indicates that the intervention subject responded to the first intervention entity, the processing device 11 may add 1 point to the first intervention effect information. When the video indicates that the response time of the intervention subject to the first intervention entity was within 1 s, the processing device 11 may add another 1 point. When the video indicates that the intervention subject acted and/or vocalized toward the first intervention entity, the processing device 11 may add 2 points. The processing device 11 may then generate the first intervention effect information based on the first intervention result data, i.e., a score of 4 points in total. Here, the higher the score, the better the intervention effect.
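The multi-dimensional scoring of S1500 described above can be sketched as follows; the field names of the result data are assumptions, and the point values follow the worked example in the text.

```python
def score_effect(result):
    """Sketch of S1500 using the dimensions in the example above:
    responded (+1), response within ~1 s (+1), action and/or
    vocalization (+2). Field names are assumptions."""
    score = 0
    if result.get("responded"):
        score += 1
    if result.get("response_time_s", float("inf")) <= 1.0:
        score += 1
    if result.get("acted_or_vocalized"):
        score += 2
    return score

effect = score_effect({"responded": True, "response_time_s": 1.0,
                       "acted_or_vocalized": True})
```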
S1600 adjusts the weight of the first intervention entity based on the first intervention effect information.
The processing device 11 may increase the weight of the first intervention entity when the intervention effect characterized by the first intervention effect information is good, and decrease the weight of the first intervention entity when that effect is poor.
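A minimal sketch of this weight adjustment, assuming a numeric effect score, a "good effect" threshold, and a fixed step size (all illustrative choices not fixed by the text):

```python
def adjust_weight(weight: float, effect_score: int,
                  good_threshold: int = 3, step: float = 0.1) -> float:
    """Raise the entity's weight on a good effect, lower it on a poor one."""
    if effect_score >= good_threshold:
        return weight + step
    return max(0.0, weight - step)   # keep weights non-negative

new_weight = adjust_weight(0.5, effect_score=4)
print(new_weight > 0.5)  # True: a good effect raises the weight
```

Under this sketch, repeated good results steadily favor an entity in future weight-based retrievals.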
Further, the processing device 11 may also execute the following process S1700.
S1700, the captured first intervention result data is added to the database as a newly added intervention entity.
The database can be expanded by adding the first intervention result data to the database as a newly added intervention entity; alternatively, the first intervention result data of the intervention object itself can be used as part of the database. In this way, the intervention object encounters intervention entities that are more familiar, which may improve the intervention effect.
Relationships between nodes in the intervention knowledge-graph may be defined by a neural network. Thus, the above-described process S1300 may include the following process S1320.
S1320, determining a first mode node from the first task node using the neural network.
Modeling may be performed based on data of the task nodes and mode nodes, with learning and training carried out by a deep neural network, so as to evaluate and classify intervention effects and to optimize the recommendation of task nodes and mode nodes. The data include, for example, intervention success rate, number of interventions, time spent, intervention style, number of visual interactions per unit time, duration of visual interactions, and so on. Here, the classification types are ascending, descending, same-level trial, and successful same-level repetition. On this basis, the intervention mode is determined according to the classification result of the neural network. For example, if the classification result is ascending, the intervention mode corresponding to the ascending path is recommended preferentially.
In existing knowledge graphs, the "edges" between nodes are static and explicit; that is, the relationship or level represented by an edge is already fixed when the knowledge graph is used. Here, unlike the processing of semantic networks, the real-time requirements of autism intervention are not high, so a neural network can be used as an edge of the intervention knowledge graph to determine the relationships between different nodes. Furthermore, unlike a general knowledge graph, autism intervention must consider not only the relationships between different intervention tasks/modes but also the current and previous intervention states. Using a neural network makes it possible to take all of these factors into account. In the intervention knowledge graph, a neural network may therefore be used to determine the relationships between different nodes (including task nodes and mode nodes).
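The role of the neural network as a learned "edge" can be sketched as a small feed-forward scorer. Everything below is an assumption for illustration only: the feature layout (task-node features concatenated with the current and previous intervention state), the layer sizes, and a fixed set of four candidate mode nodes. The point is simply that one forward pass ranks the candidate mode nodes reachable from a task node:

```python
import numpy as np

rng = np.random.default_rng(0)
# Input: task-node features concatenated with the current/previous intervention state.
W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)   # 4 candidate mode nodes (assumed)

def edge_forward(features: np.ndarray) -> int:
    """Score every candidate mode node; return the index of the highest-scoring one."""
    hidden = np.maximum(0.0, features @ W1 + b1)  # ReLU hidden layer
    scores = hidden @ W2 + b2
    return int(np.argmax(scores))

chosen = edge_forward(rng.normal(size=8))
print(0 <= chosen < 4)  # True: the recommendation is always one of the candidates
```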
In another embodiment, the processing apparatus 11 further performs the following process S1330.
S1330, training the neural network by using the first intervention effect information.
In this embodiment, the first intervention effect information can be used as an input to the neural network for training. Training with actual first intervention effect information improves the learning capability of the neural network, so that a better first mode node can be determined from the first task node.
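The disclosure does not fix how the effect information updates the edge; one bandit-style reading, with all names, the baseline, and the update rule assumed for illustration, nudges the stored score of a (task, mode) edge up or down according to how the observed effect compares with a baseline:

```python
def train_edge(edge_scores: dict, task: str, mode: str,
               effect_score: float, baseline: float = 3.0, lr: float = 0.05) -> float:
    """Nudge the stored (task, mode) edge score toward the observed effect."""
    key = (task, mode)
    edge_scores[key] = edge_scores.get(key, 0.0) + lr * (effect_score - baseline)
    return edge_scores[key]

scores = {}
train_edge(scores, "eye_contact", "music", effect_score=4)  # good effect: score rises
train_edge(scores, "eye_contact", "music", effect_score=1)  # poor effect: score falls
print(scores[("eye_contact", "music")] < 0)  # True: the poor result outweighed the good one
```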
Further, the processing device 11 may also execute the following processes S1340 and S1341.
S1340, using the first intervention effect information as input to the neural network to determine a second mode node from the first mode nodes.
S1341, based on the second mode node, obtaining at least one second intervention entity.
The implementation of process S1341 may be similar to process S1400. In this regard, a detailed description thereof is omitted.
The output device 12 outputs the second intervention entity to the intervention subject.
Further, the processing device 11 may also execute the following processing S1350-S1352.
And S1350, taking the first intervention effect information as input of the neural network to determine a second task node from the first mode node.
S1351, determining a second mode node from the second task node by using the neural network.
The implementation of process S1351 may be similar to process S1320. In this regard, a detailed description thereof is omitted.
S1352, based on the second manner node, at least one second intervention entity is acquired. The output device 12 outputs the second intervention entity to the intervention subject. The implementation of process S1352 may be similar to process S1400. In this regard, a detailed description thereof is omitted.
Here, after the current intervention process is completed, a next intervention mode may be determined based on the current intervention effect. In this way, the current intervention state can be taken into account to determine the next intervention scheme or process. This can generate an appropriate intervention scheme for the state of the intervention subject in time, thereby improving the intervention effect.
< method >
Fig. 4 illustrates a method for autism intervention according to an embodiment of the present disclosure.
As shown in fig. 4, in step S4100, an intervention knowledge-graph for autism is acquired. The intervention knowledge graph comprises task nodes and mode nodes, the content of the task nodes represents tasks to be subjected to autism intervention, and the content of the mode nodes represents modes to be subjected to the autism intervention.
In step S4200, a first task node is determined.
In step S4300, a first mode node is determined based on the first task node.
At step S4400, at least one first intervention entity is acquired based on the first mode node.
In step S4500, a first intervention entity is output to an intervention subject.
In one embodiment, obtaining at least one intervention entity based on the first mode node comprises: a first intervention entity is retrieved from the database based on the keywords in the first mode node.
In one embodiment, the intervention entities in the database have weights. Retrieving the first intervention entity from the database based on the keywords in the first mode node comprises: retrieving intervention entities from the database based on the keywords in the first mode node; and determining at least one intervention entity with a higher weight as the first intervention entity. The method further comprises: inputting first intervention result data, the first intervention result data representing the result of the intervention of the first intervention entity on the intervention object; generating first intervention effect information based on the first intervention result data; and adjusting the weight of the first intervention entity based on the first intervention effect information.
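The keyword-then-weight retrieval described above might look like the following sketch; the field names (`keywords`, `weight`), the containment-based matching rule, and the sample entities are assumptions:

```python
def retrieve_first_entities(database: list, query_keywords: list, top_k: int = 1) -> list:
    """Match entities on keywords first, then keep the highest-weight matches."""
    hits = [e for e in database
            if any(k in e["keywords"] for k in query_keywords)]
    return sorted(hits, key=lambda e: e["weight"], reverse=True)[:top_k]

db = [
    {"name": "train_whistle.wav", "keywords": ["train", "sound"], "weight": 0.9},
    {"name": "train_photo.png",   "keywords": ["train", "image"], "weight": 0.6},
    {"name": "dog_bark.wav",      "keywords": ["dog", "sound"],   "weight": 0.8},
]
print([e["name"] for e in retrieve_first_entities(db, ["train"])])  # ['train_whistle.wav']
```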
In one embodiment, at least one of an image and sound of the intervention object is captured by a capture device as first intervention result data.
In one embodiment, the method further comprises: the captured first intervention result data is added to the database as a newly added intervention entity.
In one embodiment, inputting the first intervention result data comprises: first intervention result data manually input by a user is received.
In one embodiment, the relationships between nodes in the intervention knowledge-graph are defined by a neural network. Determining a first mode node based on the first task node includes: a first mode node is determined from the first task node using the neural network.
In one embodiment, the method further comprises: training the neural network with first intervention effect information.
In one embodiment, the method further comprises: taking first intervention effect information as input of the neural network to determine a second mode node from the first mode node; acquiring at least one second intervention entity based on the second mode node; and outputting the second intervention entity to the intervention subject.
In one embodiment, the method further comprises: taking first intervention effect information as input of the neural network to determine a second task node from the first mode node; determining a second mode node from a second task node using the neural network; acquiring at least one second intervention entity based on the second mode node; and outputting the second intervention entity to the intervention subject.
In one embodiment, determining the first task node comprises: counting processing of the task node; and disabling the task node during the current intervention when the count value is greater than a first predetermined threshold. Determining the first mode node based on the first task node comprises: counting processing of the mode node; and disabling the mode node during the current intervention when the count value is greater than a second predetermined threshold.
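The counting-and-disabling logic for both thresholds can be sketched with a small gate; the class name, node identifiers, and threshold value are assumptions:

```python
from collections import Counter

class NodeGate:
    """Count uses of each node; disable a node once its count exceeds the threshold."""
    def __init__(self, threshold: int):
        self.threshold = threshold
        self.counts = Counter()

    def record_use(self, node_id: str) -> None:
        self.counts[node_id] += 1

    def enabled(self, node_id: str) -> bool:
        return self.counts[node_id] <= self.threshold

gate = NodeGate(threshold=2)           # e.g. the "first predetermined threshold"
for _ in range(3):
    gate.record_use("task:greeting")
print(gate.enabled("task:greeting"))   # False: count 3 exceeds the threshold of 2
```

One gate instance per node type (task vs. mode) would give the two independent thresholds the text describes.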
In one embodiment, the first intervention entity is output to the intervention object by at least one of the following components:
-an output interface;
-a display device;
-a speaker;
-a vibrating device;
-an odor generating device;
-temperature adjustment means; and
-a nebulizer.
The implementation and effects of the various processes for autism intervention have been described above in the system embodiments. For brevity, these descriptions are not repeated in the method embodiments.
< device >
As shown in fig. 5, an embodiment of the present disclosure provides a device 50 for autism intervention. The apparatus 50 includes: a first obtaining module 51, configured to obtain an intervention knowledge graph for autism, where the intervention knowledge graph includes task nodes and mode nodes, content of the task nodes represents a task to be subjected to the intervention of autism, and content of the mode nodes represents a mode to be subjected to the intervention of autism; a first determining module 52, configured to determine a first task node; a second determining module 53, configured to determine a first mode node based on the first task node; a second obtaining module 54, configured to obtain at least one first intervention entity based on the first mode node; and an output module 55 for outputting the first intervention entity to the intervention object.
In one embodiment, the second obtaining module 54 retrieves the first intervention entity from the database based on the keywords in the first mode node.
In one embodiment, the intervention entities in the database have weights. The second obtaining module 54 also performs the following processing: retrieving intervention entities from the database based on the keywords in the first mode node; and determining at least one intervention entity with a higher weight as the first intervention entity.
Furthermore, the apparatus 50 for autism intervention may further comprise an input module, a generation module, and an adjustment module. The input module is used for inputting first intervention result data, and the first intervention result data represent the result of intervention of a first intervention entity on the intervention object. And the generation module is used for generating first intervention effect information based on the first intervention result data. And the adjusting module is used for adjusting the weight of the first intervention entity based on the first intervention effect information.
In one embodiment, at least one of an image and sound of the intervention object is captured by a capture device as first intervention result data.
In one embodiment, the means 50 for autism intervention further comprises an adding module for adding the captured first intervention result data as a newly added intervention entity to the database.
In one embodiment, the apparatus 50 for autism intervention further comprises an input module for receiving first intervention result data manually entered by a user.
In one embodiment, the relationships between nodes in the intervention knowledge-graph are defined by a neural network. The second determination module 53 may determine the first mode node from the first task node using a neural network.
In one embodiment, the apparatus 50 for autism intervention further comprises a training module. The training module is used for training the neural network by using the first intervention effect information.
In one embodiment, the second determining module 53 is further configured to take the first intervention effect information as input to the neural network to determine a second mode node from the first mode nodes. The second obtaining module 54 is further configured to obtain at least one second intervention entity based on the second mode node.
The output module 55 is also for: the second intervention entity is output to the intervention subject.
In one embodiment, the first determination module 52 is further configured to take the first intervention effect information as input to the neural network to determine a second task node from the first mode node. The second determining module 53 is further configured to determine a second mode node from the second task nodes using the neural network. The second obtaining module 54 is further configured to obtain at least one second intervention entity based on the second mode node. The output module 55 is further configured to output a second intervention entity to the intervention subject.
In one embodiment, the first determination module 52 includes a first counting unit and a first disabling unit. The first counting unit is used for counting processing of the task node. The first disabling unit is used for disabling the task node during the current intervention when the count value is greater than a first predetermined threshold.
The second determining module 53 may include a second counting unit and a second disabling unit. The second counting unit is used for counting processing of the mode node. The second disabling unit is used for disabling the mode node during the current intervention when the count value is greater than a second predetermined threshold.
< apparatus >
The disclosed embodiments provide an electronic device 60, the electronic device 60 comprising an apparatus 50 for autism intervention provided by the apparatus embodiments described above.
Optionally, in another embodiment, the electronic device 60 comprises a memory 61 and a processor 62. The memory 61 is used to store computer instructions. Processor 62 is operative to invoke computer instructions from memory 61 to perform any of the methods for autism intervention as provided in the method embodiments described above.
< storage Medium >
Embodiments of the present application provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method for autism intervention according to any of the above-provided method embodiments.
< summary of examples >
Embodiment 1, a system for autism intervention, comprising: a processing device and an output device,
wherein the processing device comprises a processor and a memory storing executable instructions that, when the processing device is running, control the processor to perform the following:
acquiring an intervention knowledge graph for autism, wherein the intervention knowledge graph comprises task nodes and mode nodes, the content of the task nodes represents tasks to be subjected to the intervention of the autism, and the content of the mode nodes represents modes to be subjected to the intervention of the autism;
determining a first task node;
determining a first mode node based on the first task node; and
acquiring at least one first intervention entity based on the first mode node;
wherein the output device outputs the first intervention entity to the intervention object.
Embodiment 2, the system according to embodiment 1, wherein the process of obtaining at least one intervention entity based on the first mode node comprises:
a first intervention entity is retrieved from a database based on the keywords in the first mode node.
Embodiment 3, the system of embodiment 2, wherein the intervention entities in the database have weights,
wherein retrieving the first intervention entity from the database based on the keywords in the first mode node comprises:
retrieving intervention entities from the database based on the keywords in the first mode node; and
at least one intervention entity with a higher weight is determined as a first intervention entity,
wherein the system further comprises: an input device for inputting first intervention result data representing a result of an intervention by a first intervention entity for said intervention subject,
wherein the executable instructions control the processor to further perform the following when the processing device is running:
generating first intervention effect information based on the first intervention result data; and
the weight of the first intervention entity is adjusted based on the first intervention effect information.
Embodiment 4, the system of embodiment 3, wherein the input device comprises a capture device that captures at least one of an image and sound of the intervention subject as first intervention result data.
Embodiment 5, the system of embodiment 4, wherein the executable instructions control the processor to perform the following when the processing device is operating:
The captured first intervention result data is added to the database as a newly added intervention entity.
Embodiment 6, the system of embodiment 3, wherein the input device receives first intervention result data entered manually by a user.
Embodiment 7, the system of embodiment 3, wherein the relationships between nodes in the intervention knowledge-graph are defined by a neural network, and
wherein the determining of the first mode node based on the first task node comprises:
a first mode node is determined from the first task node using the neural network.
Embodiment 8, the system of embodiment 7, wherein the executable instructions control the processor to further perform the following when the processing device is running:
training the neural network with first intervention effect information.
Embodiment 9, the system of embodiment 7, wherein the executable instructions control the processor to further perform the following when the processing device is running:
taking first intervention effect information as input of the neural network to determine a second mode node from the first mode node; and
based on the second modality node, at least one second intervention entity is acquired,
Wherein the output device outputs the second intervention entity to the intervention object.
Embodiment 10, the system of embodiment 7, wherein the executable instructions control the processor to further perform the following when the processing device is running:
taking first intervention effect information as input of the neural network to determine a second task node from the first mode node;
determining a second mode node from a second task node using the neural network; and
based on the second modality node, at least one second intervention entity is acquired,
wherein the output device outputs the second intervention entity to the intervention object.
Embodiment 11, the system of embodiment 1, wherein the determining the first task node includes:
counting processing of the task node; and
disabling the task node during the current intervention when the count value is greater than a first predetermined threshold, and
wherein the determining of the first mode node based on the first task node comprises:
counting processing of the mode node; and
disabling the mode node during the current intervention when the count value is greater than a second predetermined threshold.
Embodiment 12, the system of embodiment 1, wherein the output device comprises at least one of the following:
-an output interface;
-a display device;
-a speaker;
-a vibrating device;
-an odor generating device;
-temperature adjustment means; and
-a nebulizer.
Embodiment 13, a method for autism intervention, comprising:
acquiring an intervention knowledge graph for autism, wherein the intervention knowledge graph comprises task nodes and mode nodes, the content of the task nodes represents tasks to be subjected to the intervention of the autism, and the content of the mode nodes represents modes to be subjected to the intervention of the autism;
determining a first task node;
determining a first mode node based on the first task node;
acquiring at least one first intervention entity based on the first mode node; and
the first intervention entity is output to the intervention subject.
Embodiment 14, the method of embodiment 13, wherein obtaining at least one intervention entity based on the first mode node comprises:
a first intervention entity is retrieved from the database based on the keywords in the first mode node.
Embodiment 15, the method of embodiment 14, wherein the intervention entities in the database have weights,
wherein retrieving the first intervention entity from the database based on the keywords in the first mode node comprises:
retrieving intervention entities from the database based on the keywords in the first mode node; and
at least one intervention entity with a higher weight is determined as a first intervention entity,
wherein the method further comprises:
inputting first intervention result data representing a result of an intervention by a first intervention entity for the intervention subject,
generating first intervention effect information based on the first intervention result data; and
the weight of the first intervention entity is adjusted based on the first intervention effect information.
Embodiment 16, the method of embodiment 15, wherein at least one of an image and sound of the intervention subject is captured by a capture device as the first intervention result data.
Embodiment 17, the method of embodiment 16, further comprising:
the captured first intervention result data is added to the database as a newly added intervention entity.
Embodiment 18, the method of embodiment 15, wherein inputting the first intervention result data comprises: first intervention result data manually input by a user is received.
Embodiment 19, the method of embodiment 15, wherein the relationships between nodes in the intervention knowledge-graph are defined by a neural network, and
Wherein determining the first mode node based on the first task node comprises:
a first mode node is determined from the first task node using the neural network.
Embodiment 20, the method of embodiment 19, further comprising:
training the neural network with first intervention effect information.
Embodiment 21, the method of embodiment 19, further comprising:
taking first intervention effect information as input of the neural network to determine a second mode node from the first mode node;
acquiring at least one second intervention entity based on the second mode node; and
the second intervention entity is output to the intervention subject.
Embodiment 22, the method of embodiment 19, further comprising:
taking first intervention effect information as input of the neural network to determine a second task node from the first mode node;
determining a second mode node from a second task node using the neural network;
acquiring at least one second intervention entity based on the second mode node; and
the second intervention entity is output to the intervention subject.
Embodiment 23, the method of embodiment 13, wherein determining the first task node includes:
counting processing of the task node; and
disabling the task node during the current intervention when the count value is greater than a first predetermined threshold, and
wherein determining the first mode node based on the first task node comprises:
counting processing of the mode node; and
disabling the mode node during the current intervention when the count value is greater than a second predetermined threshold.
Embodiment 24, the method of embodiment 13, wherein the first intervention entity is output to the intervention subject by at least one of:
-an output interface;
-a display device;
-a speaker;
-a vibrating device;
-an odor generating device;
-temperature adjustment means; and
-a nebulizer.
The present disclosure may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., a light pulse through a fiber optic cable), or an electrical signal transmitted through a wire.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the internet using an internet service provider). In some embodiments, aspects of the present disclosure are implemented by personalizing electronic circuitry, such as programmable logic circuitry, field programmable gate arrays (FPGA), or programmable logic arrays (PLA), with state information of the computer readable program instructions, and the electronic circuitry can execute the computer readable program instructions.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvement of the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A system for autism intervention, comprising: a processing device and an output device,
wherein the processing device comprises a processor and a memory storing executable instructions that, when the processing device is running, control the processor to perform the following:
acquiring an intervention knowledge graph for autism, wherein the intervention knowledge graph comprises task nodes and mode nodes, content of a task node representing a task for which autism intervention is to be performed, and content of a mode node representing a mode in which autism intervention is to be performed;
determining a first task node;
determining a first mode node based on the first task node; and
acquiring at least one first intervention entity based on the first mode node;
wherein the output device outputs the first intervention entity to an intervention object, an intervention entity being the actual content used to intervene with the intervention object and comprising an audio entity, an image entity, an odor entity, a temperature entity, a vibration entity, and a spray entity.
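The traversal recited in claim 1 (task node, then mode node, then intervention entities) can be illustrated with a minimal sketch. All node names, the entity store, and its contents below are hypothetical examples, not the patented implementation:

```python
# Minimal sketch of the claimed intervention knowledge graph: task nodes
# name intervention goals, mode nodes name delivery modes, and each mode
# node keys a set of intervention entities (audio, image, etc.).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str                      # "task" or "mode"
    children: list = field(default_factory=list)

# Hypothetical entity store: mode-node keyword -> intervention entities.
ENTITIES = {
    "animal-sounds": [("audio", "dog_bark.wav"), ("image", "dog.png")],
}

def first_mode_node(task: Node) -> Node:
    """Determine a first mode node from the first task node (claim 1)."""
    modes = [c for c in task.children if c.kind == "mode"]
    return modes[0]

def acquire_entities(mode: Node):
    """Acquire at least one intervention entity based on the mode node."""
    return ENTITIES.get(mode.name, [])

task = Node("recognize-animals", "task", [Node("animal-sounds", "mode")])
mode = first_mode_node(task)
print(acquire_entities(mode))   # entities the output device would present
```

The output device of claim 1 would then present the acquired entities to the intervention object.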
2. The system of claim 1, wherein acquiring the at least one first intervention entity based on the first mode node comprises:
retrieving the first intervention entity from a database based on keywords in the first mode node.
3. The system of claim 2, wherein the intervention entities in the database have weights,
wherein retrieving the first intervention entity from the database based on the keywords in the first mode node comprises:
retrieving intervention entities from the database based on the keywords in the first mode node; and
determining at least one intervention entity having a higher weight as the first intervention entity,
wherein the system further comprises: an input device for inputting first intervention result data representing a result of an intervention performed on the intervention object by the first intervention entity,
wherein the executable instructions, when the processing device is running, control the processor to further perform the following:
generating first intervention effect information based on the first intervention result data; and
the weight of the first intervention entity is adjusted based on the first intervention effect information.
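The weighted retrieval and feedback loop of claims 2-3 can be sketched as follows. The database rows, keywords, and the proportional update rule are illustrative assumptions; the claims do not specify a particular weight-adjustment formula:

```python
# Sketch of claims 2-3: retrieve intervention entities by keyword,
# prefer higher-weighted ones, then adjust a weight from the observed
# intervention effect. All records and the update rule are hypothetical.
DATABASE = [
    {"keyword": "greeting", "entity": "hello.wav", "weight": 0.6},
    {"keyword": "greeting", "entity": "wave.png",  "weight": 0.9},
    {"keyword": "colors",   "entity": "red.png",   "weight": 0.5},
]

def retrieve(keyword, top_n=1):
    """Return the top-weighted intervention entities for a keyword."""
    hits = [r for r in DATABASE if r["keyword"] == keyword]
    hits.sort(key=lambda r: r["weight"], reverse=True)
    return hits[:top_n]

def adjust_weight(record, effect, rate=0.1):
    """Nudge the weight toward an intervention-effect score in [0, 1]."""
    record["weight"] += rate * (effect - record["weight"])
    return record["weight"]

best = retrieve("greeting")[0]    # highest-weighted match: wave.png
adjust_weight(best, effect=1.0)   # a good result raises the weight
```

A poor effect score (near 0) would lower the weight instead, so less effective entities are gradually retrieved less often.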
4. A system according to claim 3, wherein the input device comprises a capture device that captures at least one of an image and a sound of the intervention object as the first intervention result data.
5. The system of claim 4, wherein the executable instructions control the processor to further perform, when the processing device is running, the following:
the captured first intervention result data is added to the database as a newly added intervention entity.
6. A system according to claim 3, wherein the input device receives first intervention result data entered manually by a user.
7. A system according to claim 3, wherein relationships between nodes in the intervention knowledge graph are defined by a neural network, and
wherein the determining of the first mode node based on the first task node comprises:
a first mode node is determined from the first task node using the neural network.
8. The system of claim 7, wherein the executable instructions control the processor to further perform, when the processing device is running, the following:
training the neural network with first intervention effect information.
9. The system of claim 7, wherein the executable instructions control the processor to further perform, when the processing device is running, the following:
taking the first intervention effect information as input to the neural network to determine a second mode node from the first mode node; and
acquiring at least one second intervention entity based on the second mode node,
wherein the output device outputs the second intervention entity to the intervention object.
10. The system of claim 7, wherein the executable instructions control the processor to further perform, when the processing device is running, the following:
taking the first intervention effect information as input to the neural network to determine a second task node from the first mode node;
determining a second mode node from the second task node using the neural network; and
acquiring at least one second intervention entity based on the second mode node,
wherein the output device outputs the second intervention entity to the intervention object.
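Claims 7-10 recite a neural network that maps the current node plus intervention-effect feedback to the next node. A toy stand-in for that mapping is sketched below; the mode names, the fixed weight table, and the linear scoring are hypothetical placeholders for the claimed (trained) network:

```python
# Sketch of claims 7-10: score candidate mode nodes from the current
# task feature and the intervention-effect feedback, then pick the best.
# Weights are hand-set stand-ins for a trained neural network (claim 8).
MODE_NODES = ["audio-play", "image-show", "vibration"]

# Hypothetical learned weights: (task feature, effect) -> score per mode.
WEIGHTS = {
    "audio-play": [0.8, 0.3],
    "image-show": [0.2, 0.9],
    "vibration":  [0.1, 0.1],
}

def next_mode_node(task_feature: float, effect: float) -> str:
    """Pick the highest-scoring mode node given feedback (claims 9-10)."""
    def score(w):
        return w[0] * task_feature + w[1] * effect
    return max(MODE_NODES, key=lambda m: score(WEIGHTS[m]))

# Different feedback steers the graph traversal toward a different mode.
print(next_mode_node(task_feature=1.0, effect=0.2))  # "audio-play"
print(next_mode_node(task_feature=0.3, effect=1.0))  # "image-show"
```

In the claimed system the same effect information also serves as training signal (claim 8), so the mapping itself improves over successive interventions.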
CN202010852052.7A 2020-08-21 2020-08-21 Systems and methods for autism intervention Active CN112148884B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010852052.7A CN112148884B (en) 2020-08-21 2020-08-21 Systems and methods for autism intervention


Publications (2)

Publication Number Publication Date
CN112148884A CN112148884A (en) 2020-12-29
CN112148884B true CN112148884B (en) 2023-09-22

Family

ID=73889103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010852052.7A Active CN112148884B (en) 2020-08-21 2020-08-21 Systems and methods for autism intervention

Country Status (1)

Country Link
CN (1) CN112148884B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113506624B (en) * 2021-08-16 2023-08-08 北京阿叟阿巴科技有限公司 Autism children cognitive ability evaluation intervention system based on hierarchical generalization push logic

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017185887A1 (en) * 2016-04-29 2017-11-02 Boe Technology Group Co., Ltd. Apparatus and method for analyzing natural language medical text and generating medical knowledge graph representing natural language medical text
CN109145119A (en) * 2018-07-02 2019-01-04 北京妙医佳信息技术有限公司 The knowledge mapping construction device and construction method of health management arts
CN109284396A (en) * 2018-09-27 2019-01-29 北京大学深圳研究生院 Medical knowledge map construction method, apparatus, server and storage medium
CN109284342A (en) * 2018-11-22 2019-01-29 北京百度网讯科技有限公司 Method and apparatus for output information
CN110335676A (en) * 2019-07-09 2019-10-15 泰康保险集团股份有限公司 Data processing method, device, medium and electronic equipment
CN110363129A (en) * 2019-07-05 2019-10-22 昆山杜克大学 Autism early screening system based on smile normal form and audio-video behavioural analysis
CN110377745A (en) * 2018-04-11 2019-10-25 阿里巴巴集团控股有限公司 Information processing method, information retrieval method, device and server
CN110415822A (en) * 2019-07-23 2019-11-05 珠海格力电器股份有限公司 A kind of method and apparatus for predicting cancer
CN110532360A (en) * 2019-07-19 2019-12-03 平安科技(深圳)有限公司 Medical field knowledge mapping question and answer processing method, device, equipment and storage medium
CN111128391A (en) * 2019-12-24 2020-05-08 北京推想科技有限公司 Information processing apparatus, method and storage medium
CN111462841A (en) * 2020-03-12 2020-07-28 华南理工大学 Depression intelligent diagnosis device and system based on knowledge graph
CN111475631A (en) * 2020-04-05 2020-07-31 北京亿阳信通科技有限公司 Disease question-answering method and device based on knowledge graph and deep learning


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Research on an intelligent integrated medical-and-elderly-care service management platform based on active health access technology; Su Mingliang; Wang Shiquan; Li Wei; Medical & Health Equipment (Issue 06); 37-41 *
Topic analysis of autism spectrum disorder research based on knowledge graphs; Zhang Jing; Qi Hao; China Health Industry (Issue 20); 197-201+204 *
Visualized knowledge graph analysis of intervention research on children with autism in China; Wu Yan; Journal of Nanjing Xiaozhuang University (Issue 05); 71-75+128 *

Also Published As

Publication number Publication date
CN112148884A (en) 2020-12-29

Similar Documents

Publication Publication Date Title
US11511436B2 (en) Robot control method and companion robot
US9724824B1 (en) Sensor use and analysis for dynamic update of interaction in a social robot
TWI778477B (en) Interaction methods, apparatuses thereof, electronic devices and computer readable storage media
US20230362457A1 (en) Intelligent commentary generation and playing methods, apparatuses, and devices, and computer storage medium
US10617961B2 (en) Online learning simulator using machine learning
US20230042654A1 (en) Action synchronization for target object
US11908483B2 (en) Inter-channel feature extraction method, audio separation method and apparatus, and computing device
CN109710748B (en) Intelligent robot-oriented picture book reading interaction method and system
US11869524B2 (en) Audio processing method and apparatus, computer device, and storage medium
CN105105771B (en) The cognition index analysis method of latent energy value test
CN111414506B (en) Emotion processing method and device based on artificial intelligence, electronic equipment and storage medium
US10692498B2 (en) Question urgency in QA system with visual representation in three dimensional space
CN110531849A (en) A kind of intelligent tutoring system of the augmented reality based on 5G communication
CN116009748B (en) Picture information interaction method and device in children interaction story
CN109278051A (en) Exchange method and system based on intelligent robot
Suhail et al. Mixture-kernel graph attention network for situation recognition
Suhaimi et al. Modeling the affective space of 360 virtual reality videos based on arousal and valence for wearable EEG-based VR emotion classification
KR20130082701A (en) Emotion recognition avatar service apparatus and method using artificial intelligences
CN112148884B (en) Systems and methods for autism intervention
CN110442867A (en) Image processing method, device, terminal and computer storage medium
WO2020228349A1 (en) Virtual news anchor system based on air imaging and implementation method therefor
CN111931036A (en) Multi-mode fusion interaction system and method, intelligent robot and storage medium
CN111949773A (en) Reading equipment, server and data processing method
CN110349461A (en) Education and entertainment combination method and system based on children special-purpose smart machine
JP2024505503A (en) Methods and systems that enable natural language processing, understanding and generation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant