CN107102740B - Device and method for realizing brain-computer interface aiming at P300 component


Info

Publication number
CN107102740B
CN107102740B CN201710362272.XA CN201710362272A CN107102740B CN 107102740 B CN107102740 B CN 107102740B CN 201710362272 A CN201710362272 A CN 201710362272A CN 107102740 B CN107102740 B CN 107102740B
Authority
CN
China
Prior art date
Legal status
Active
Application number
CN201710362272.XA
Other languages
Chinese (zh)
Other versions
CN107102740A (en)
Inventor
施锦河
Current Assignee
Samsung Semiconductor China R&D Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Samsung Semiconductor China R&D Co Ltd
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Semiconductor China R&D Co Ltd and Samsung Electronics Co Ltd
Priority to CN201710362272.XA
Publication of CN107102740A
Application granted
Publication of CN107102740B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/369Electroencephalography [EEG]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons


Abstract

An apparatus for implementing a brain-computer interface for the P300 component, and a method thereof. The apparatus comprises: a display unit that displays to a user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items; a candidate-division unit that divides the plurality of candidate items into N subsets according to the usage frequency of each candidate item, where N is an integer greater than 1 and the more frequently a candidate item is used, the smaller the subset into which it is divided; and a flashing unit that, at a predetermined frequency, causes N candidate items in the brain-computer interface to light up at a time so that the user perceives each candidate item flickering, the N candidate items being formed by randomly selecting one candidate item from each subset. According to the invention, recognition time can be shortened and recognition speed increased.

Description

Device and method for realizing brain-computer interface aiming at P300 component
Technical Field
The present invention relates to brain-computer interface technology and, more particularly, to an apparatus and method for implementing a brain-computer interface for the P300 component, and to a brain-computer interface device and method for the P300 component.
Background
A brain-computer interface (BCI) converts the brain's mental activity into control signals for peripheral devices. A typical BCI system consists of three parts: data acquisition, signal processing, and device control. The data acquisition part interfaces directly with the brain and records neural activity. The signal processing part analyzes the acquired signals, recognizes the brain's intention, and converts it into control commands; it is the core of the BCI system, and the quality of signal processing directly determines system performance. The device control part operates peripheral devices according to the control commands, enabling applications such as computer input, wheelchair control, and robotic-arm operation, which are the functions a BCI system ultimately delivers.
Among existing BCI systems based on scalp electroencephalography (EEG), the most commonly used EEG signals are visual evoked potentials, slow cortical potentials, motor imagery potentials, and event-related potentials. The P300 component of the event-related potential is an endogenous evoked potential related to human attention; it typically appears about 300 ms after a stimulus, has a characteristic time-domain waveform, is elicited reliably, and requires no subject training, so it is mainly used in BCI research on character-input systems.
Eliciting the P300 component requires the oddball paradigm: two stimuli are applied to the same sensory channel, one occurring with high probability (the standard stimulus) and the other with low probability (the deviant, or target, stimulus). The two stimuli appear in random order, so the deviant stimulus is unexpected for the subject. During the experiment the subject is asked to attend to the deviant stimulus; under these conditions the rare, unexpected deviant stimulus elicits the P300 component. The lower the probability of the deviant stimulus, the larger the amplitude of the single-trial P300 potential, and the P300 waveform becomes clearer after averaging over multiple trials.
Farwell et al. first applied the P300 component to character-input experiments and proposed a stimulation paradigm in which rows and columns flash in random order. In this paradigm, 36 characters are arranged in a 6 × 6 virtual keyboard matrix, and one row or one column of characters flashes at a time in random order. The subject gazes at the target character to be entered; whenever the flashing row (or column) contains the target character, a P300 component appears in the subject's EEG. By detecting whether a P300 component is present, the row and the column the subject is gazing at can be identified, and the target character is determined from their intersection. This paradigm, however, suffers from adjacency interference and the double-flash problem.
To overcome these two drawbacks, various stimulation paradigms have been proposed; for example, on the basis of the traditional row/column paradigm, the virtual keyboard matrix is partitioned regularly into a larger number of sub-matrices. In practice, however, brain-computer interface devices using such sub-matrix stimulation paradigms still suffer from long recognition times and low recognition efficiency.
Disclosure of Invention
An object of exemplary embodiments of the present invention is to provide an apparatus and method for implementing a brain-computer interface for the P300 component that shorten recognition time and increase recognition speed.
An aspect of exemplary embodiments of the present invention provides an apparatus for implementing a brain-computer interface for the P300 component, comprising: a display unit that displays to a user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items; a candidate-division unit that divides the plurality of candidate items into N subsets according to the usage frequency of each candidate item, where N is an integer greater than 1 and the more frequently a candidate item is used, the smaller the subset into which it is divided; and a flashing unit that, at a predetermined frequency, causes N candidate items in the brain-computer interface to light up at a time so that the user perceives each candidate item flickering, the N candidate items being formed by randomly selecting one candidate item from each subset.
Optionally, the candidate item is at least one of a character, an icon, and a thumbnail.
Optionally, the candidate-division unit comprises: a frequency-table acquisition module that obtains a usage frequency table of the plurality of candidate items, in which the candidate items are sorted in descending order of usage frequency; and a division module that divides the candidate items, in the order given by the usage frequency table, into a first subset through an N-th subset, the subsets increasing in size from the first subset to the N-th subset.
Optionally, the candidate item is a character, and the frequency-table acquisition module obtains a usage frequency table corresponding to the input environment of the character.
Optionally, the input environment of the character comprises a Chinese input environment and/or an English input environment.
Optionally, the flashing unit comprises: a candidate-selection module that, for each flash, randomly selects one candidate item from each of the first subset through the N-th subset to form the N candidate items, where the i-th subset comprises M_i candidate items, M_i is an integer greater than 1, i = 1, …, N, and, counting from the first selection of a candidate item from the i-th subset, the candidate items selected in every M_i consecutive selections are all different; and a light-emitting module that, at the predetermined frequency, causes the N candidate items selected by the candidate-selection module to light up in the brain-computer interface so that the user perceives each candidate item flickering.
Another aspect of exemplary embodiments of the present invention provides a method of implementing a brain-computer interface for the P300 component, comprising: a) displaying to a user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items that are divided into N subsets according to their usage frequency, where N is an integer greater than 1 and the more frequently a candidate item is used, the smaller the subset into which it is divided; and b) at a predetermined frequency, causing N candidate items in the brain-computer interface to light up at a time so that the user perceives each candidate item flickering, the N candidate items being formed by randomly selecting one candidate item from each subset.
Optionally, the candidate item is at least one of a character, an icon, and a thumbnail.
Optionally, dividing the plurality of candidate items into N subsets according to their usage frequency comprises: obtaining a usage frequency table of the plurality of candidate items, in which the candidate items are sorted in descending order of usage frequency; and dividing the candidate items, in the order given by the usage frequency table, into a first subset through an N-th subset, the subsets increasing in size from the first subset to the N-th subset.
Optionally, the candidate is a character, and the obtained usage frequency table corresponds to an input environment of the character.
Optionally, the input environment of the character comprises a Chinese input environment and/or an English input environment.
Optionally, step b) comprises: for each flash, randomly selecting one candidate item from each of the first subset through the N-th subset to form the N candidate items, where the i-th subset comprises M_i candidate items, M_i is an integer greater than 1, i = 1, …, N, and, counting from the first selection of a candidate item from the i-th subset, the candidate items selected in every M_i consecutive selections are all different; and, at the predetermined frequency, causing the N selected candidate items in the brain-computer interface to light up so that the user perceives each candidate item flickering.
Another aspect of exemplary embodiments of the present invention provides a brain-computer interface device for the P300 component, comprising: the above apparatus for implementing a brain-computer interface for the P300 component; an acquisition unit that obtains the user's electroencephalogram (EEG) signal recorded for each flash; a recognition unit that identifies the candidate item the user intends to select according to the P300 component in the acquired EEG signal; and a control unit that performs a corresponding control operation according to the identified candidate item.
Optionally, the recognition unit comprises: a P300-component acquisition module that acquires the P300 component in the user's EEG signal recorded for each flash; a superposition module that superimposes the P300 component acquired for each flash onto the P300 component accumulated for each candidate item lit during that flash; and a determination module that determines the candidate item with the largest accumulated P300 component as the candidate item the user intends to select.
Optionally, the device further comprises an updating unit that updates the usage frequency of each candidate item according to the identified candidate items.
Optionally, the control operation comprises at least one of entering content corresponding to the identified candidate item, running an application corresponding to the identified candidate item, and executing processing corresponding to the identified candidate item.
Another aspect of exemplary embodiments of the present invention provides a brain-computer interface method for the P300 component, comprising: displaying to a user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items that are divided into N subsets according to their usage frequency, where N is an integer greater than 1 and the more frequently a candidate item is used, the smaller the subset into which it is divided; at a predetermined frequency, causing N candidate items in the brain-computer interface to light up at a time so that the user perceives each candidate item flickering, the N candidate items being formed by randomly selecting one candidate item from each subset; obtaining the user's EEG signal recorded for each flash; identifying the candidate item the user intends to select according to the P300 component in the acquired EEG signal; and performing a corresponding control operation according to the identified candidate item.
Optionally, identifying the candidate item the user intends to select according to the P300 component in the acquired EEG signal comprises: acquiring the P300 component in the user's EEG signal recorded for each flash; superimposing the P300 component acquired for each flash onto the P300 component accumulated for each candidate item lit during that flash; and determining the candidate item with the largest accumulated P300 component as the candidate item the user intends to select.
Optionally, the method further comprises updating the usage frequency of each candidate item according to the identified candidate items.
Optionally, the control operation comprises at least one of entering content corresponding to the identified candidate item, executing a function corresponding to the identified candidate item, and executing processing corresponding to the identified candidate item.
The apparatus and method for implementing a brain-computer interface for the P300 component according to the present invention shorten recognition time and increase recognition speed.
Additional aspects and/or advantages of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
Drawings
These and/or other aspects and advantages of the present invention will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 illustrates a block diagram of an apparatus for implementing a brain-computer interface for a P300 component according to an exemplary embodiment of the present invention.
Fig. 2 shows a flowchart of a method for implementing a brain-computer interface for a P300 component according to an exemplary embodiment of the present invention.
Fig. 3 illustrates a block diagram of a brain-computer interface device for a P300 component according to an exemplary embodiment of the present invention.
Fig. 4 shows a block diagram of the recognition unit according to an exemplary embodiment of the present invention.
Fig. 5 illustrates a flowchart of a brain-computer interface method for a P300 component according to an exemplary embodiment of the present invention.
Fig. 6 illustrates a flowchart of a method of identifying the candidate item a user intends to select according to the P300 component in the acquired EEG signal, according to an exemplary embodiment of the present invention.
Detailed Description
Reference will now be made in detail to the exemplary embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout.
Fig. 1 illustrates a block diagram of an apparatus for implementing a brain-computer interface for the P300 component according to an exemplary embodiment of the present invention. The apparatus may be, for example, any device capable of presenting such an interface, such as a display (e.g., a mobile-phone display, a computer display, or a television display), a projector, or the like.
As shown in Fig. 1, the apparatus 100 for implementing a brain-computer interface for the P300 component according to the present invention includes a display unit 110, a candidate-division unit 120, and a flashing unit 130.
The display unit 110 displays to the user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items.
The display unit 110 may present the brain-computer interface to the user through a screen or the like. The user selects a candidate item by gazing at it on the brain-computer interface. Candidate items may be characters, icons, thumbnails, and so on; for example, the brain-computer interface may be a virtual keyboard interface containing characters or a user interface containing application icons.
The candidate-division unit 120 divides the candidate items into N subsets according to the usage frequency of each candidate item. N is an integer greater than 1; it may be a preset fixed value or a value determined from the number of candidate items or their usage-frequency distribution.
Specifically, the more frequently a candidate item is used, the smaller the subset into which it is divided. In other words, frequently used candidate items are placed in smaller subsets and rarely used candidate items in larger subsets. The subsets are mutually independent, and for each flash one candidate item is randomly selected from each subset, so frequently used candidate items flash more often, which speeds up recognition.
It should be understood that dividing the candidate items into N subsets according to their usage frequency groups them only logically, not by physical location; in other words, the positions of the candidate items in the brain-computer interface are independent of the subset division. The candidate items can therefore be placed anywhere in the interface and rearranged flexibly, which is convenient for the user and widens the range of applications of brain-computer interface technology. For example, when the P300 brain-computer interface is used for character input, the interface can be displayed as a virtual keyboard layout the user is familiar with, making selection easier. The technique is equally applicable to user-interface operations on electronic terminals and the like.
As an example, the candidate division unit 120 may include: a frequency table acquisition module (not shown) and a partitioning module (not shown).
The frequency-table acquisition module obtains a usage frequency table of the candidate items, in which the candidate items are sorted in descending order of usage frequency.
The usage frequency table may be built by counting how often the user has used each candidate item, or an existing, previously compiled table may be used. For example, if the candidate items are English letters, the table may be an existing letter-frequency table derived from English corpora.
As an example, when the candidate item is a character, the frequency-table acquisition module may obtain a usage frequency table corresponding to the input environment of the character.
The input environment of the character may be a Chinese input environment, an English input environment, and so on. For example, in an English input environment, a letter-frequency table for English input may be obtained; in a Chinese input environment, a pinyin letter-frequency table for Chinese input may be obtained. The context can be refined further to obtain a table for a specific situation: if the character to be entered is the first letter of an English word, a table of the probability distribution of word-initial letters may be obtained; if a particular English letter has just been entered, a table of which letters most often follow it may be obtained.
The division module divides the candidate items, in the order given by the usage frequency table, into a first subset through an N-th subset, the subsets increasing in size from the first subset to the N-th subset.
Specifically, the i-th subset contains M_i candidate items, where M_i is an integer greater than 1 denoting the size of the i-th subset and i = 1, …, N; from the first subset to the N-th subset, M_i grows larger, i.e. the subsets increase in size. Each M_i may be a preset fixed value or a value determined from the number of candidate items, their usage-frequency distribution, the value of N, and so on. Once all M_i have been determined, the division module places the first M_1 candidate items in the frequency table into the first subset, the next M_2 candidate items into the second subset, and so on.
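As an illustration only, the following Python sketch performs such a partition; the 6 × 6 keyboard alphabet, its frequency ordering, and the subset sizes [4, 6, 8, 18] are assumptions chosen for the example, not values prescribed by the patent.

```python
# Minimal sketch (not from the patent): split a frequency-sorted candidate list
# into N subsets whose sizes M_1 <= M_2 <= ... <= M_N are fixed in advance.

def divide_candidates(freq_sorted, subset_sizes):
    """freq_sorted: candidates ordered from most to least frequently used.
    subset_sizes: [M_1, ..., M_N], non-decreasing, summing to len(freq_sorted)."""
    assert sum(subset_sizes) == len(freq_sorted)
    subsets, start = [], 0
    for size in subset_sizes:
        subsets.append(freq_sorted[start:start + size])
        start += size
    return subsets

# Example: 36 characters of a 6 x 6 virtual keyboard, sorted by an assumed usage frequency.
chars = list("ETAOINSHRDLCUMWFGYPBVKJXQZ0123456789")
subsets = divide_candidates(chars, [4, 6, 8, 18])  # N = 4; frequent characters land in small subsets
for i, s in enumerate(subsets, 1):
    print(f"subset {i} (M_{i} = {len(s)}):", "".join(s))
```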
When dividing the candidate items using a usage frequency table, various tables may be used, for example a general table built from a large statistical corpus or a personalized table based on the user's own habits; in either case, partitioning the subsets sensibly according to the frequencies in the table helps identify the candidate item the user intends to select more quickly.
The flashing unit 130 causes N candidate items in the brain-computer interface to light up at a time, at a predetermined frequency, so that the user perceives each candidate item flickering; the N candidate items are formed by randomly selecting one candidate item from each subset.
As an example, the flashing unit 130 may include a candidate-selection module (not shown) and a light-emitting module (not shown).
For each flash, the candidate-selection module randomly selects one candidate item from each of the first subset through the N-th subset to form the N candidate items; counting from the first selection from the i-th subset, the candidate items selected in every M_i consecutive selections from that subset are all different.
At the predetermined frequency, the light-emitting module causes the N candidate items selected by the candidate-selection module to light up in the brain-computer interface so that the user perceives each candidate item flickering.
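One simple way to satisfy this selection rule is to run each subset as a freshly shuffled cycle, so that no candidate repeats within M_i consecutive flashes. The sketch below is illustrative only; the class name and the example subsets are assumptions, not taken from the patent.

```python
import random

class SubsetFlasher:
    """Illustrative sketch (not from the patent): for each flash, pick one candidate
    per subset such that every M_i consecutive picks from subset i are all different."""
    def __init__(self, subsets):
        self.subsets = subsets
        self.queues = [[] for _ in subsets]  # per-subset shuffled cycles

    def next_flash_group(self):
        group = []
        for i, subset in enumerate(self.subsets):
            if not self.queues[i]:  # cycle exhausted after M_i picks: reshuffle
                self.queues[i] = random.sample(subset, len(subset))
            group.append(self.queues[i].pop())
        return group  # N candidates, one per subset, lit together in this flash

# Usage: one group is lit per flash at the predetermined frequency (e.g. every few hundred ms).
example_subsets = [list("ETAO"), list("INSHRD"), list("LCUMWFGY"), list("PBVKJXQZ0123456789")]
flasher = SubsetFlasher(example_subsets)
for flash in range(8):
    print("flash", flash, "->", flasher.next_flash_group())
```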
Fig. 2 shows a flowchart of a method for implementing a brain-computer interface for a P300 component according to an exemplary embodiment of the present invention.
As shown in fig. 2, in step 201, a brain-computer interface for a P300 component is displayed to a user, wherein the brain-computer interface includes a plurality of candidate items.
The brain-computer interface for the P300 component may be displayed to the user through a screen or the like. The user selects a candidate item by gazing at it on the brain-computer interface. Candidate items may be characters, icons, thumbnails, and so on; for example, the brain-computer interface may be a virtual keyboard interface containing characters or a user interface containing application icons.
The brain-computer interface comprises a plurality of candidate items that are divided into N subsets according to their usage frequency, where N is an integer greater than 1 and the more frequently a candidate item is used, the smaller the subset into which it is divided. In other words, frequently used candidate items are placed in smaller subsets and rarely used candidate items in larger subsets. The subsets are mutually independent, and for each flash one candidate item is randomly selected from each subset, so frequently used candidate items flash more often, which speeds up recognition.
It should be understood that dividing the candidate items into N subsets according to their usage frequency groups them only logically, not by physical location; in other words, the positions of the candidate items in the brain-computer interface are independent of the subset division. The candidate items can therefore be placed anywhere in the interface and rearranged flexibly, which is convenient for the user and widens the range of applications of brain-computer interface technology. For example, when the P300 brain-computer interface is used for character input, the interface can be displayed as a virtual keyboard layout the user is familiar with, making selection easier. The technique is equally applicable to user-interface operations on electronic terminals and the like.
As an example, the candidate items may be divided into N subsets according to their usage frequency as follows. First, a usage frequency table of the candidate items is obtained, in which the candidate items are sorted in descending order of usage frequency.
The usage frequency table may be built by counting how often the user has used each candidate item, or an existing, previously compiled table may be used. For example, if the candidate items are English letters, the table may be an existing letter-frequency table derived from English corpora.
As an example, when the candidate item is a character, the obtained usage frequency table may correspond to the input environment of the character. The input environment of the character may be a Chinese input environment, an English input environment, and so on.
For example, in an English input environment, a letter-frequency table for English input may be obtained; in a Chinese input environment, a pinyin letter-frequency table for Chinese input may be obtained. The context can be refined further to obtain a table for a specific situation: if the character to be entered is the first letter of an English word, a table of the probability distribution of word-initial letters may be obtained; if a particular English letter has just been entered, a table of which letters most often follow it may be obtained.
Then, the candidate items are divided, in the order given by the usage frequency table, into a first subset through an N-th subset, the subsets increasing in size from the first subset to the N-th subset.
Specifically, the i-th subset contains M_i candidate items, where M_i is an integer greater than 1 denoting the size of the i-th subset and i = 1, …, N; from the first subset to the N-th subset, M_i grows larger, i.e. the subsets increase in size. Each M_i may be a preset fixed value or a value determined from the number of candidate items, their usage-frequency distribution, the value of N, and so on. Once all M_i have been determined, the first M_1 candidate items in the frequency table are placed into the first subset, the next M_2 candidate items into the second subset, and so on.
When dividing the candidate items using a usage frequency table, various tables may be used, for example a general table built from a large statistical corpus or a personalized table based on the user's own habits; in either case, partitioning the subsets sensibly according to the frequencies in the table helps identify the candidate item the user intends to select more quickly.
In step 202, at a predetermined frequency, N candidate items in the brain-computer interface are caused to light up at a time so that the user perceives each candidate item flickering; the N candidate items are formed by randomly selecting one candidate item from each subset.
As an example, for each flash, one candidate item may be randomly selected from each of the first subset through the N-th subset to form the N candidate items, and, counting from the first selection from the i-th subset, the candidate items selected in every M_i consecutive selections from that subset are all different.
Then, at the predetermined frequency, the N selected candidate items in the brain-computer interface are lit so that the user perceives each candidate item flickering.
Fig. 3 illustrates a block diagram of a brain-computer interface device for the P300 component according to an exemplary embodiment of the present invention. The device may be, for example, a mobile communication terminal, a personal computer, a tablet computer, a game console, a television, or the like.
The brain-computer interface device 300 for the P300 component according to the present invention includes the display unit 110, the candidate-division unit 120, the flashing unit 130, an acquisition unit 310, a recognition unit 320, and a control unit 330. Here, the display unit 110, the candidate-division unit 120, and the flashing unit 130 may be configured as described with reference to Fig. 1.
Specifically, the display unit 110 displays to the user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items.
The candidate-division unit 120 divides the candidate items into N subsets according to the usage frequency of each candidate item, where N is an integer greater than 1 and the more frequently a candidate item is used, the smaller the subset into which it is divided.
The flashing unit 130 causes N candidate items in the brain-computer interface to light up at a time, at a predetermined frequency, so that the user perceives each candidate item flickering; the N candidate items are formed by randomly selecting one candidate item from each subset.
The acquisition unit 310 obtains the user's EEG signal recorded for each flash.
Specifically, the acquisition unit 310 may obtain the user's EEG signal from a device that records it (e.g., an EEG electrode cap). It may obtain the signal for a flash immediately after that flash, or, after a predetermined number of flashes, obtain the EEG recorded over all of those flashes from the recording device and then extract the portion of the signal corresponding to each individual flash.
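As a rough illustration of the second option, the sketch below cuts a continuous multi-channel EEG recording into one epoch per flash; the sampling rate, the 0-600 ms epoch window, and the function name are assumptions, not values specified by the patent.

```python
import numpy as np

def extract_epochs(eeg, flash_onsets, fs=250, tmin=0.0, tmax=0.6):
    """Illustrative sketch (not from the patent): cut a continuous recording into
    one epoch per flash.
    eeg: array of shape (n_channels, n_samples);
    flash_onsets: sample index at which each flash occurred;
    fs, tmin, tmax: assumed sampling rate (Hz) and epoch window (s)."""
    start_off, stop_off = int(tmin * fs), int(tmax * fs)
    epochs = [eeg[:, onset + start_off:onset + stop_off] for onset in flash_onsets]
    return np.stack(epochs)  # shape: (n_flashes, n_channels, n_epoch_samples)
```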
The recognition unit 320 identifies the candidate item the user intends to select according to the P300 component in the acquired EEG signal.
Specifically, the recognition unit 320 identifies the candidate item the user intends to select, i.e., the candidate item on the brain-computer interface the user is gazing at, from the P300 component in the EEG signal for each flash. The recognition unit 320 may use various techniques to do so; an exemplary structure of the recognition unit 320 is described below with reference to Fig. 4.
The control unit 330 performs a corresponding control operation according to the identified candidate item.
The control operation may be entering content corresponding to the identified candidate item, running an application corresponding to it, executing processing corresponding to it, and so on. For example, if the candidate item is a character, the control operation may be entering the recognized character; if the candidate item is an icon, the control operation may be running the application or executing the processing associated with the recognized icon.
In one embodiment, the brain-computer interface device 300 for the P300 component may further include: an update unit (not shown).
The updating unit updates the usage frequency of each candidate item according to the candidate items identified by the recognition unit 320.
Specifically, the updating unit increments the usage count of each identified candidate item and then recomputes the usage frequency of each candidate item.
The updating unit may update the usage frequencies every time the recognition unit 320 identifies a candidate item, or at predetermined time intervals based on the candidate items identified so far. Updating in this way keeps the usage frequencies in line with the user's habits and provides a more accurate basis for dividing the candidate items into subsets.
Which update strategy is used may depend on the performance of the brain-computer interface device 300 for the P300 component, on user settings, and so on. For example, on a high-performance device the frequencies may be updated after every identification, whereas updating at predetermined time intervals places lower demands on device performance.
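A minimal sketch of such an updating unit, assuming the frequency table is simply a per-candidate usage counter that is re-sorted at a fixed interval and handed back to the subset divider (for example, the divide_candidates helper sketched earlier); the class name and interval are illustrative only.

```python
from collections import Counter

class UsageUpdater:
    """Illustrative sketch (not from the patent): count how often each candidate is
    selected and periodically emit a refreshed frequency ordering for re-division."""
    def __init__(self, candidates, update_every=20):
        self.counts = Counter({c: 0 for c in candidates})
        self.update_every = update_every  # assumed interval, in identified selections
        self.since_update = 0

    def on_identified(self, candidate):
        self.counts[candidate] += 1
        self.since_update += 1
        if self.since_update >= self.update_every:
            self.since_update = 0
            # candidates re-sorted from most to least used; feed this to the divider
            return [c for c, _ in self.counts.most_common()]
        return None  # keep the current division for now
```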
Fig. 4 shows a block diagram of the recognition unit according to an exemplary embodiment of the present invention.
As shown in fig. 4, the recognition unit 320 includes: a P300 component acquisition module 410, a superposition module 420, and a determination module 430.
The P300-component acquisition module 410 acquires the P300 component in the user's EEG signal recorded for each flash. It may do so using various techniques.
The superposition module 420 superimposes the P300 component acquired for each flash onto the P300 component accumulated for each candidate item lit during that flash.
Specifically, since the flashing unit 130 lights N candidate items per flash, the superposition module 420 adds the P300 component acquired for that flash to the accumulated P300 component of each of those N candidate items.
The determination module 430 determines the candidate item with the largest accumulated P300 component as the candidate item the user intends to select.
Specifically, when the P300 component accumulated for one candidate item is clearly the largest, i.e., it is the largest and exceeds the accumulated components of the other candidate items by a certain margin, that candidate item is determined to be the one the user intends to select.
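A minimal sketch of this accumulate-and-decide step, assuming each flash is reduced to a scalar P300 score (for example, the output of a classifier applied to the post-flash EEG epoch) and that a fixed lead over the runner-up counts as "clearly the largest"; the scoring scheme and the margin value are assumptions, not specified by the patent.

```python
from collections import defaultdict

class P300Decoder:
    """Illustrative sketch (not from the patent): accumulate a per-candidate P300
    score over flashes and decide once one candidate clearly leads the rest."""
    def __init__(self, margin=3.0):
        self.scores = defaultdict(float)
        self.margin = margin  # required lead over the runner-up (assumed value)

    def add_flash(self, lit_candidates, p300_score):
        # p300_score: scalar evidence that this flash contained the gazed-at target,
        # e.g. a classifier output computed on the post-flash EEG epoch.
        for c in lit_candidates:
            self.scores[c] += p300_score

    def decide(self):
        if len(self.scores) < 2:
            return None
        ranked = sorted(self.scores.items(), key=lambda kv: kv[1], reverse=True)
        (best, best_score), (_, runner_up_score) = ranked[0], ranked[1]
        if best_score - runner_up_score >= self.margin:
            return best  # confident: this is the candidate the user is gazing at
        return None      # otherwise keep flashing and accumulating evidence
```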
Fig. 5 illustrates a flowchart of a brain-computer interface method for a P300 component according to an exemplary embodiment of the present invention. Here, step 201 and step 202 may be implemented in a similar manner as fig. 2.
Specifically, in step 201, a brain-computer interface for the P300 component is displayed to the user. The brain-computer interface comprises a plurality of candidate items that are divided into N subsets according to their usage frequency, where N is an integer greater than 1 and the more frequently a candidate item is used, the smaller the subset into which it is divided.
In step 202, at a predetermined frequency, N candidate items in the brain-computer interface are caused to light up at a time so that the user perceives each candidate item flickering; the N candidate items are formed by randomly selecting one candidate item from each subset.
In step 501, the user's EEG signal recorded for each flash is obtained.
Specifically, the user's EEG signal may be obtained from a device that records it (e.g., an EEG electrode cap). The signal for a flash may be obtained immediately after that flash, or, after a predetermined number of flashes, the EEG recorded over all of those flashes may be obtained from the recording device and the portion of the signal corresponding to each individual flash extracted.
In step 502, the candidate item the user intends to select is identified according to the P300 component in the acquired EEG signal.
Specifically, the candidate item the user intends to select, i.e., the candidate item on the brain-computer interface the user is gazing at, is identified from the P300 component in the EEG signal for each flash.
Various techniques may be used for this identification. Preferably, the method shown in Fig. 6 may be used to identify the candidate item the user intends to select according to the P300 component in the acquired EEG signal.
In step 503, a corresponding control operation is performed according to the identified candidate item. The control operation may be entering content corresponding to the identified candidate item, executing a function corresponding to it, executing processing corresponding to it, and so on. For example, if the candidate item is a character, the control operation may be entering the recognized character; if the candidate item is an icon, the control operation may be running the application or executing the processing associated with the recognized icon.
As an example, the brain-computer interface method for the P300 component may further include updating the usage frequency of each candidate item according to the identified candidate items.
The usage frequencies may be updated each time the candidate item the user intends to select is identified, or at predetermined time intervals based on the candidate items identified so far. Updating in this way keeps the usage frequencies in line with the user's habits and provides a more accurate basis for dividing the candidate items into subsets.
Fig. 6 illustrates a flowchart of a method of identifying an option desired to be selected by a user according to a P300 component in an acquired brain electrical signal according to an exemplary embodiment of the present invention. The method shown in fig. 6 may be performed when step 502 is performed.
As shown in Fig. 6, in step 601, the P300 component in the user's EEG signal recorded for each flash is acquired. Various techniques may be used to acquire it.
In step 602, the P300 component acquired for each flash is superimposed onto the P300 component accumulated for each candidate item lit during that flash.
In step 603, the candidate item with the largest accumulated P300 component is determined to be the candidate item the user intends to select.
Specifically, when the P300 component accumulated for one candidate item is clearly the largest, i.e., it is the largest and exceeds the accumulated components of the other candidate items by a certain margin, that candidate item is determined to be the one the user intends to select.
Furthermore, the methods described above according to exemplary embodiments of the present invention may be implemented as a computer program, so that the methods are carried out when the program is executed. The units of the devices according to exemplary embodiments of the present invention may be implemented as hardware components, for example field-programmable gate arrays (FPGAs) or application-specific integrated circuits (ASICs), depending on the processing each unit performs.
The device and method for implementing a brain-computer interface for the P300 component according to the present invention shorten recognition time, increase recognition speed, and widen the range of applications of brain-computer interface technology.
Although a few exemplary embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.

Claims (10)

1. An apparatus for implementing a brain-computer interface for a P300 component, comprising:
a display unit that displays to a user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items;
a candidate-division unit that divides the plurality of candidate items into N subsets according to the usage frequency of each candidate item, wherein the higher the usage frequency of a candidate item, the smaller the subset into which it is divided; each candidate item is divided into only one subset; and N is an integer greater than 1;
a flashing unit that, at a predetermined frequency, causes N candidate items in the brain-computer interface to light up at a time so that the user perceives each candidate item flickering, wherein the N candidate items are formed by randomly selecting one candidate item from each subset,
wherein the positions of the plurality of candidate items in the brain-computer interface are independent of the division into subsets.
2. The apparatus of claim 1, wherein the candidate item is at least one of a character, an icon, and a thumbnail.
3. The apparatus of claim 1, wherein the candidate-division unit comprises:
a frequency-table acquisition module that obtains a usage frequency table of the plurality of candidate items, in which the candidate items are sorted in descending order of usage frequency;
a division module that divides the candidate items, in the order given by the usage frequency table, into a first subset through an N-th subset, wherein the subsets increase in size from the first subset to the N-th subset.
4. The apparatus of claim 3, wherein the candidate item is a character, and the frequency-table acquisition module obtains the usage frequency table corresponding to an input environment of the character.
5. The apparatus of claim 1, wherein the flashing unit comprises:
a candidate-selection module that, for each flash, randomly selects one candidate item from each of the first subset through the N-th subset to form N candidate items, wherein the i-th subset comprises M_i candidate items, M_i is an integer greater than 1, i = 1, …, N, and, counting from the first selection of a candidate item from the i-th subset, the candidate items selected in every M_i consecutive selections are all different;
a light-emitting module that, at the predetermined frequency, causes the N candidate items selected by the candidate-selection module to light up in the brain-computer interface so that the user perceives each candidate item flickering.
6. A method of implementing a brain-computer interface for a P300 component, comprising:
a) displaying to a user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items that are divided into N subsets according to their usage frequency, wherein the higher the usage frequency of a candidate item, the smaller the subset into which it is divided; each candidate item is divided into only one subset; and N is an integer greater than 1;
b) at a predetermined frequency, causing N candidate items in the brain-computer interface to light up at a time so that the user perceives each candidate item flickering, wherein the N candidate items are formed by randomly selecting one candidate item from each subset,
wherein the positions of the plurality of candidate items in the brain-computer interface are independent of the division into subsets.
7. A brain-computer interface device for the P300 component, comprising:
a device as claimed in any one of claims 1 to 5 that implements a brain-computer interface for the P300 component;
an acquisition unit that obtains the user's electroencephalogram (EEG) signal recorded for each flash;
a recognition unit that identifies the candidate item the user intends to select according to the P300 component in the acquired EEG signal;
a control unit that performs a corresponding control operation according to the identified candidate item.
8. The device of claim 7, wherein the recognition unit comprises:
a P300-component acquisition module that acquires the P300 component in the user's EEG signal recorded for each flash;
a superposition module that superimposes the P300 component acquired for each flash onto the P300 component accumulated for each candidate item lit during that flash;
a determination module that determines the candidate item with the largest accumulated P300 component as the candidate item the user intends to select.
9. The device of claim 7, further comprising:
an updating unit that updates the usage frequency of each candidate item according to the identified candidate items.
10. A brain-computer interface method for a P300 component, comprising:
displaying to a user a brain-computer interface for the P300 component, the brain-computer interface comprising a plurality of candidate items that are divided into N subsets according to their usage frequency, wherein the higher the usage frequency of a candidate item, the smaller the subset into which it is divided; each candidate item is divided into only one subset; and N is an integer greater than 1;
at a predetermined frequency, causing N candidate items in the brain-computer interface to light up at a time so that the user perceives each candidate item flickering, wherein the N candidate items are formed by randomly selecting one candidate item from each subset;
obtaining the user's EEG signal recorded for each flash;
identifying the candidate item the user intends to select according to the P300 component in the acquired EEG signal;
performing a corresponding control operation according to the identified candidate item,
wherein the positions of the plurality of candidate items in the brain-computer interface are independent of the division into subsets.
CN201710362272.XA 2014-04-28 2014-04-28 Device and method for realizing brain-computer interface aiming at P300 component Active CN107102740B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710362272.XA CN107102740B (en) 2014-04-28 2014-04-28 Device and method for realizing brain-computer interface aiming at P300 component

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410175100.8A CN103995583B (en) 2014-04-28 2014-04-28 Realize the device and method thereof at the brain-computer interface interface for P300 compositions
CN201710362272.XA CN107102740B (en) 2014-04-28 2014-04-28 Device and method for realizing brain-computer interface aiming at P300 component

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201410175100.8A Division CN103995583B (en) 2014-04-28 2014-04-28 Realize the device and method thereof at the brain-computer interface interface for P300 compositions

Publications (2)

Publication Number Publication Date
CN107102740A CN107102740A (en) 2017-08-29
CN107102740B true CN107102740B (en) 2020-02-11

Family

ID=51309771

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201710362272.XA Active CN107102740B (en) 2014-04-28 2014-04-28 Device and method for realizing brain-computer interface aiming at P300 component
CN201410175100.8A Expired - Fee Related CN103995583B (en) 2014-04-28 2014-04-28 Realize the device and method thereof at the brain-computer interface interface for P300 compositions

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201410175100.8A Expired - Fee Related CN103995583B (en) 2014-04-28 2014-04-28 Realize the device and method thereof at the brain-computer interface interface for P300 compositions

Country Status (2)

Country Link
KR (1) KR20150124368A (en)
CN (2) CN107102740B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106681494B (en) * 2016-12-07 2020-08-11 华南脑控(广东)智能科技有限公司 Environment control method based on brain-computer interface
CN107272880B (en) * 2017-04-25 2019-11-15 中国农业大学 A kind of positioning of cursor, cursor control method and device
CN107239137B (en) * 2017-04-25 2019-11-15 中国农业大学 A kind of character input method and device based on dummy keyboard
CN107229330B (en) * 2017-04-25 2019-11-15 中国农业大学 A kind of character input method and device based on Steady State Visual Evoked Potential
CN107291240B (en) * 2017-06-29 2018-09-18 华南理工大学 A kind of virtual reality multilevel menu exchange method based on brain-computer interface
US10838496B2 (en) 2017-06-29 2020-11-17 South China University Of Technology Human-machine interaction method based on visual stimulation
CN107272905B (en) * 2017-06-29 2018-10-09 华南理工大学 A kind of exchange method based on EOG and EMG
CN107481359A (en) * 2017-07-14 2017-12-15 昆明理工大学 Intelligent door lock system and its control method based on P300 and Mental imagery
CN107390869B (en) * 2017-07-17 2019-07-02 西安交通大学 Efficient brain control Chinese character input method based on movement vision Evoked ptential
CN110018743A (en) * 2019-04-12 2019-07-16 福州大学 Brain control Chinese pinyin tone input method
CN114327061B (en) * 2021-12-27 2023-09-29 福州大学 Method for realizing calibration-free P300 brain-computer interface
CN118131917B (en) * 2024-05-08 2024-07-12 小舟科技有限公司 Multi-user real-time interaction method based on electroencephalogram signals and computer equipment

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1412662A (en) * 2002-12-13 2003-04-23 李治国 Digital keyboard and Chinese character phonetic input method
WO2008097201A1 (en) * 2007-02-09 2008-08-14 Agency For Science, Technology And Research A system and method for processing brain signals in a bci system
CN101201696B (en) * 2007-11-29 2011-04-27 浙江大学 Chinese input BCI system based on P300 brain electric potential
CN101741945A (en) * 2008-11-14 2010-06-16 朱冠军 Numeric keyboard arrangement method and spelling Chinese character input method thereof of mobilephone
CN101515199B (en) * 2009-03-24 2011-01-05 北京理工大学 Character input device based on eye tracking and P300 electrical potential of the brain electricity
US8319669B2 (en) * 2009-04-22 2012-11-27 Jeffrey C Weller Text entry device with radial keypad layout
US9672335B2 (en) * 2009-12-17 2017-06-06 Laird H Shuart Cognitive-based logon process for computing device
KR20110072730A (en) * 2009-12-23 2011-06-29 한국과학기술원 Adaptive brain-computer interface device
CN101968715B (en) * 2010-10-15 2012-10-31 华南理工大学 Brain computer interface mouse control-based Internet browsing method
CN101976115B (en) * 2010-10-15 2011-12-28 华南理工大学 Motor imagery and P300 electroencephalographic potential-based functional key selection method
KR20130006172A (en) * 2011-07-08 2013-01-16 연세대학교 산학협력단 Display system and method, and input apparatus for communicating with display apparatus
CN102609090B (en) * 2012-01-16 2014-06-04 中国人民解放军国防科学技术大学 Electrocerebral time-frequency component dual positioning normal form quick character input method
CN103150023B (en) * 2013-04-01 2016-02-10 北京理工大学 A kind of cursor control system based on brain-computer interface and method
CN103399639B (en) * 2013-08-14 2016-04-06 天津医科大学 Brain-machine interface method and device is combined based on SSVEP and P300

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102184018A (en) * 2011-05-13 2011-09-14 天津大学 Brain-computer interface system and control method thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
An SSVEP-Based BCI Using High Duty-Cycle Visual Flicker; Po-Lei Lee et al.; IEEE; 20110722; full text *
Motor Imagery EEG Signal Processing and P300 Stimulation Paradigm Research (运动想象脑电信号处理与P300刺激范式研究); 施锦河; China Doctoral Dissertations Full-text Database, Information Science and Technology; 20130515; full text *

Also Published As

Publication number Publication date
CN103995583B (en) 2017-06-16
KR20150124368A (en) 2015-11-05
CN103995583A (en) 2014-08-20
CN107102740A (en) 2017-08-29

Similar Documents

Publication Publication Date Title
CN107102740B (en) Device and method for realizing brain-computer interface aiming at P300 component
EP3396495B1 (en) Neurocomputer system for selecting commands on the basis of recording brain activity
Pasqualotto et al. Toward functioning and usable brain–computer interfaces (BCIs): A literature review
CN100366215C (en) Control method and system and sense organs test method and system based on electrical steady induced response
US7546158B2 (en) Communication methods based on brain computer interfaces
Chen et al. A high-ITR SSVEP-based BCI speller
Halder et al. Brain-controlled applications using dynamic P300 speller matrices
CN103699216B (en) A kind of based on Mental imagery and the E-mail communication system of vision attention mixing brain-computer interface and method
Congedo et al. "Brain Invaders": a prototype of an open-source P300-based video game working with the OpenViBE platform
Colwell et al. Channel selection methods for the P300 Speller
Akram et al. A P300-based brain computer interface system for words typing
Townsend et al. A general P300 brain–computer interface presentation paradigm based on performance guided constraints
Groen et al. The time course of natural scene perception with reduced attention
Jin et al. P300 Chinese input system based on Bayesian LDA
KR101581895B1 (en) Brain-computer interface system for word typing using p300 and word typing methods therefor
CN109961018B (en) Electroencephalogram signal analysis method and system and terminal equipment
Akram et al. A novel P300-based BCI system for words typing
Yu et al. Toward a hybrid BCI: self-paced operation of a P300-based speller by merging a motor imagery-based “brain switch” into a P300 Spelling Approach
Gavett et al. Reducing human error in P300 speller paradigm for brain-computer interface
Yu et al. A P300-based brain–computer interface for Chinese character input
Kick et al. Evaluation of different spelling layouts for SSVEP based BCIs
RU2725782C2 (en) System for communication of users without using muscular movements and speech
Capati et al. Hybrid SSVEP/P300 BCI keyboard
Al-Abdullatif et al. Mind-controlled augmentative and alternative communication for people with severe motor disabilities
Minett et al. An assistive communication brain-computer interface for Chinese text input

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant