
CN108108763B - Electroencephalogram classification model generation method and device and electronic equipment - Google Patents

Info

Publication number
CN108108763B
Authority
CN
China
Prior art keywords
electroencephalogram
sample data
transformation matrix
orthogonal transformation
classification model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711417656.3A
Other languages
Chinese (zh)
Other versions
CN108108763A (en)
Inventor
梁爽
杭文龙
王琼
王平安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS
Priority to CN201711417656.3A
Publication of CN108108763A
Application granted
Publication of CN108108763B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Other Investigation Or Analysis Of Materials By Electrical Means (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The application provides an electroencephalogram classification model generation method and apparatus, an electronic device, and a computer-readable storage medium. The electroencephalogram classification model generation method includes the following steps: obtaining sample data of K subjects, where the sample data includes classified electroencephalogram information and the classification results of the corresponding electroencephalogram information, and K is greater than or equal to 2; calculating, based on the sample data of the K subjects and a preset first objective function, an orthogonal transformation matrix that minimizes the first objective function, where the first objective function is a function of the orthogonal transformation matrix and the electroencephalogram information of the K subjects, and the orthogonal transformation matrix is used to transform the electroencephalogram information of each of the K subjects into correlation information among the K subjects; and generating an electroencephalogram classification model based on the orthogonal transformation matrix. The technical solution generates an electroencephalogram classification model applicable to a plurality of subjects and thereby reduces the maintenance cost of electroencephalogram classification models.

Description

Electroencephalogram classification model generation method and device and electronic equipment
Technical Field
The application belongs to the technical field of brain-computer interaction, and particularly relates to an electroencephalogram classification model generation method, an electroencephalogram classification model generation device, an electronic device and a computer-readable storage medium.
Background
An electroencephalogram (EEG) is the overall reflection, at the cerebral cortex or scalp surface, of the electrophysiological activity of brain neurons.
At present, brain-computer interaction technology based on electroencephalogram signals has become a research focus in the industry. Its key challenge is how to extract electroencephalogram information quickly and effectively and to improve recognition accuracy. Because electroencephalogram signals are highly non-stationary and differ considerably between individuals, electroencephalogram classification models trained on the electroencephalogram information of different subjects differ markedly. Existing brain-computer interaction systems therefore train an independent electroencephalogram classification model for each subject (that is, each model is applicable to only one subject) and use the trained model to classify that subject's electroencephalogram information, thereby improving recognition accuracy. Because each classification model can serve only one subject, the generalization ability of existing brain-computer interaction systems is poor; as the number of subjects grows, the system has to maintain more and more classification models, with a corresponding increase in maintenance cost.
Disclosure of Invention
In view of this, the present application provides an electroencephalogram classification model generation method and apparatus, an electronic device, and a computer-readable storage medium, which are used to generate an electroencephalogram classification model applicable to a plurality of subjects and to reduce the maintenance cost of electroencephalogram classification models.
A first aspect of the embodiments of the present application provides a method for generating an electroencephalogram classification model, including:
obtaining sample data of K subjects, where the sample data includes classified electroencephalogram information and the classification results of the corresponding electroencephalogram information, and K is greater than or equal to 2;
calculating, based on the sample data of the K subjects and a preset first objective function, an orthogonal transformation matrix that minimizes the first objective function, where the first objective function is a function of the orthogonal transformation matrix and the electroencephalogram information of the K subjects, and the orthogonal transformation matrix is used to transform the electroencephalogram information of each of the K subjects into correlation information among the K subjects; and
generating an electroencephalogram classification model based on the orthogonal transformation matrix, so that the electroencephalogram classification model can be used to classify the electroencephalogram information of any one of the K subjects.
Based on the first aspect of the present application, in a first possible implementation manner, the first objective function is:
[first objective function - given as an image in the original and not reproduced here]
In the first objective function, N_k denotes the number of sample data of the k-th subject, N_l denotes the number of sample data of the l-th subject, P denotes the orthogonal transformation matrix, and P^T is the transpose of P; x_{i,k}^T denotes the transpose of x_{i,k}, where x_{i,k} denotes the electroencephalogram information in the i-th sample data of the k-th subject; and x_{j,l}^T denotes the transpose of x_{j,l}, where x_{j,l} denotes the electroencephalogram information in the j-th sample data of the l-th subject.
The calculating, based on the sample data of the K subjects and the preset first objective function, of the orthogonal transformation matrix that minimizes the first objective function is specifically:
under the constraint PP^T = I, calculating the P that minimizes the first objective function based on the sample data of the K subjects, where I is an identity matrix.
Based on the first possible implementation manner of the first aspect of the present application, in a second possible implementation manner, the generating an electroencephalogram classification model based on the orthogonal transformation matrix includes:
determining a first weight vector, a second weight vector and a bias value in a second objective function based on a Lagrangian expression, according to the calculated orthogonal transformation matrix, the first objective function and a preset second objective function; and
generating an electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
where the second objective function is:
[second objective function - given as an image in the original and not reproduced here]
and the second objective function satisfies the constraint:
[constraint - given as an image in the original and not reproduced here]
Here, ρ_{i,k} denotes the degree of contribution of the i-th sample data of the k-th subject, ε_{i,k} is a relaxation variable of x_{i,k}, C_k and λ are values greater than 0, y_{i,k} denotes the classification result of the i-th sample data of the k-th subject, W_k^T is the transpose of W_k, where W_k denotes the first weight vector associated with the k-th subject, V^T is the transpose of V, where V denotes the second weight vector, and b_k denotes the bias value associated with the k-th subject.
Based on the second possible implementation manner of the first aspect of the present application, in a third possible implementation manner, the generating an electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value is specifically:
generating a classification algorithm of the electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
where the classification algorithm is:
[classification algorithm - given as an image in the original and not reproduced here]
Here, when k' = k, Z_{i,k} = P·x_{i,k}; when k' ≠ k, Z_{i,k} = Q·x_{i,k}, where Q is defined by a formula given as an image in the original. In addition, α_{i,k'} and b_k satisfy the equation [K + Z]α = Y, in which K, Z and Y are defined by formulas given as images in the original.
the second aspect of the present application provides an electroencephalogram classification model generation apparatus, including:
an obtaining unit, configured to obtain sample data of K subjects, where the sample data includes classified electroencephalogram information and the classification results of the corresponding electroencephalogram information, and K is greater than or equal to 2;
a calculating unit, configured to calculate, based on the sample data of the K subjects and a preset first objective function, an orthogonal transformation matrix that minimizes the first objective function, where the first objective function is a function of the orthogonal transformation matrix and the electroencephalogram information of the K subjects, and the orthogonal transformation matrix is used to transform the electroencephalogram information of each of the K subjects into correlation information among the K subjects; and
a generating unit, configured to generate an electroencephalogram classification model based on the orthogonal transformation matrix, so that the electroencephalogram classification model can be used to classify the electroencephalogram information of any one of the K subjects.
Based on the second aspect of the present application, in a first possible implementation manner, the first objective function is:
[first objective function - given as an image in the original and not reproduced here]
In the first objective function, N_k denotes the number of sample data of the k-th subject, N_l denotes the number of sample data of the l-th subject, P denotes the orthogonal transformation matrix, and P^T is the transpose of P; x_{i,k}^T denotes the transpose of x_{i,k}, where x_{i,k} denotes the electroencephalogram information in the i-th sample data of the k-th subject; and x_{j,l}^T denotes the transpose of x_{j,l}, where x_{j,l} denotes the electroencephalogram information in the j-th sample data of the l-th subject.
The calculating unit is specifically configured to: under the constraint PP^T = I, calculate the P that minimizes the first objective function based on the sample data of the K subjects, where I is an identity matrix.
Based on the first possible implementation manner of the second aspect of the present application, in a second possible implementation manner, the generating unit includes:
a determining unit, configured to determine a first weight vector, a second weight vector and a bias value in a second objective function based on a Lagrangian expression, according to the orthogonal transformation matrix calculated by the calculating unit, the first objective function and a preset second objective function; and
a sub-generating unit, configured to generate an electroencephalogram classification model based on the orthogonal transformation matrix and the first weight vector, second weight vector and bias value determined by the determining unit;
where the second objective function is:
[second objective function - given as an image in the original and not reproduced here]
and the second objective function satisfies the constraint:
[constraint - given as an image in the original and not reproduced here]
Here, ρ_{i,k} denotes the degree of contribution of the i-th sample data of the k-th subject, ε_{i,k} is a relaxation variable of x_{i,k}, C_k and λ are values greater than 0, y_{i,k} denotes the classification result of the i-th sample data of the k-th subject, W_k^T is the transpose of W_k, where W_k denotes the first weight vector associated with the k-th subject, V^T is the transpose of V, where V denotes the second weight vector, and b_k denotes the bias value associated with the k-th subject.
Based on the second possible implementation manner of the second aspect of the present application, in a third possible implementation manner, the sub-generating unit is specifically configured to:
generate a classification algorithm of the electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
where the classification algorithm is:
[classification algorithm - given as an image in the original and not reproduced here]
Here, when k' = k, Z_{i,k} = P·x_{i,k}; when k' ≠ k, Z_{i,k} = Q·x_{i,k}, where Q is defined by a formula given as an image in the original. In addition, α_{i,k'} and b_k satisfy the equation [K + Z]α = Y, in which K, Z and Y are defined by formulas given as images in the original.
a third aspect of the present application provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor executes the computer program to implement the electroencephalogram classification model generation method mentioned in the first aspect or any possible implementation manner of the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the electroencephalogram classification model generating method mentioned in the first aspect above or any one of the possible implementations of the first aspect above.
A fifth aspect of the present application provides a computer program product comprising a computer program which, when executed by one or more processors, implements the method for generating an electroencephalogram classification model as referred to in the first aspect above or any one of the possible implementations of the first aspect above.
In the present application, sample data of K subjects is obtained, an orthogonal transformation matrix is calculated based on the sample data of the K subjects and the first objective function, and an electroencephalogram classification model is then generated based on the orthogonal transformation matrix. Because the orthogonal transformation matrix can transform the electroencephalogram information of the K subjects into correlation information among the K subjects, the electroencephalogram classification model generated based on the orthogonal transformation matrix can accommodate the differences in electroencephalogram information among the K subjects, so the generated model is applicable to all K subjects. Since a plurality of subjects can share the same electroencephalogram classification model, only one model needs to be maintained for the plurality of subjects, which, compared with the conventional scheme, effectively reduces the maintenance cost of electroencephalogram classification models.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1-a is a schematic flow chart of an embodiment of a method for generating an electroencephalogram classification model provided by the present application;
FIG. 1-b is a schematic diagram of an algorithm framework of an electroencephalogram classification model provided by the present application;
FIG. 2 is a schematic structural diagram of an embodiment of an electroencephalogram classification model generation apparatus provided in the present application;
fig. 3 is a schematic structural diagram of an embodiment of an electronic device provided in the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the sequence numbers of the steps in the method embodiments described below do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation on the implementation process of each embodiment.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Example one
The embodiment of the application provides an electroencephalogram classification model generation method that can be applied in an electroencephalogram classification model generation apparatus. The apparatus may be an independent device, or it may be integrated into an electronic device (such as a smartphone, a tablet computer, a computer, or a wearable device). Optionally, the operating system run by the device or by the electronic device integrating the apparatus may be iOS, Android, Windows, or another operating system, which is not limited herein.
Referring to FIG. 1-a, the electroencephalogram classification model generation method in this embodiment of the present application may include the following steps.
Step 101: obtaining sample data of K subjects.
In this embodiment of the present application, the sample data of a subject includes the classified electroencephalogram information of the subject and the classification results of the corresponding electroencephalogram information. In practical applications, for any one of the K subjects, a technician can manually classify part of the subject's electroencephalogram information to obtain the classification results of the corresponding electroencephalogram information, thereby obtaining the subject's sample data. After the electroencephalogram classification model has been generated (i.e., trained) based on the sample data, the generated model can subsequently be used to classify the subject's electroencephalogram information automatically.
In this embodiment of the present application, the electroencephalogram information refers to feature information extracted from an electroencephalogram signal; the acquisition of the electroencephalogram signal and the extraction of the electroencephalogram information can be implemented with reference to the prior art and are not described here again.
In this embodiment of the present application, K is greater than or equal to 2; that is, sample data of a plurality of subjects is obtained in step 101.
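As a concrete illustration (not part of the patent), the following minimal Python sketch shows one way the sample data of K subjects could be organized in memory; the container name SubjectSamples, the array shapes and the +1/-1 label convention are assumptions made here for illustration only.

    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class SubjectSamples:
        """Sample data of one subject: classified EEG feature vectors and their labels."""
        X: np.ndarray  # shape (N_k, d): N_k samples of d-dimensional EEG feature vectors x_{i,k}
        y: np.ndarray  # shape (N_k,): classification result y_{i,k} of each sample (+1 or -1)

    def load_sample_data():
        """Toy stand-in for step 101: returns sample data for K >= 2 subjects."""
        rng = np.random.default_rng(0)
        K, d = 3, 8  # illustrative values only
        data = []
        for _ in range(K):
            n_k = int(rng.integers(20, 30))
            data.append(SubjectSamples(
                X=rng.normal(size=(n_k, d)),      # placeholder features; real x_{i,k} come from EEG feature extraction
                y=rng.choice([-1, 1], size=n_k),  # placeholder labels; real y_{i,k} come from manual classification
            ))
        return data

In practice, of course, the feature vectors would come from recorded electroencephalogram signals and the labels from the manual classification described above.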
Step 102: calculating, based on the sample data of the K subjects and a preset first objective function, an orthogonal transformation matrix that minimizes the first objective function.
In this embodiment of the present application, each subject can be regarded as one task in multi-task learning. Drawing on hidden-structure learning theory, a shared hidden space is assumed to exist, and the information it contains is the correlation information among the plurality of subjects obtained by transforming their electroencephalogram information through the orthogonal transformation matrix. Using the maximum joint probability distribution criterion, the correlation information of the multiple tasks mapped into the shared hidden space is discovered and then used to improve the learning of each task, thereby realizing multi-task learning.
In step 102, the orthogonal transformation matrix is used to transform the electroencephalogram information of each of the K subjects into the correlation information between the K subjects. The first objective function is a function related to the orthogonal transformation matrix and the electroencephalogram information of the K subjects. Specifically, the first objective function may be derived as follows:
According to the Gaussian distribution function
[Gaussian distribution function - given as an image in the original and not reproduced here]
for any k-th task and l-th task in the hidden space, the Parzen window density estimates can be expressed as:
[Parzen window density estimates - given as images in the original and not reproduced here]
By derivation, minimizing the distribution difference between the k-th task and the l-th task, ∫(P_k(X) − P_l(X))² dX (taken over the pairs k, l), amounts to minimizing the following expressions:
[intermediate expressions - given as images in the original and not reproduced here]
Through a series of mathematical derivations, the first objective function for solving the orthogonal transformation matrix P is obtained as:
[first objective function - given as an image in the original and not reproduced here]
In the first objective function, N_k denotes the number of sample data of the k-th subject, N_l denotes the number of sample data of the l-th subject, P denotes the orthogonal transformation matrix, and P^T is the transpose of P; x_{i,k}^T denotes the transpose of x_{i,k}, where x_{i,k} denotes the electroencephalogram information in the i-th sample data of the k-th subject; and x_{j,l}^T denotes the transpose of x_{j,l}, where x_{j,l} denotes the electroencephalogram information in the j-th sample data of the l-th subject.
Based on the first objective function, step 102 can be expressed specifically as: under the constraint PP^T = I, calculating the P (i.e., the orthogonal transformation matrix) that minimizes the first objective function based on the sample data of the K subjects, where I is an identity matrix. In particular, under the constraint PP^T = I, the P that minimizes the first objective function may be calculated by a gradient descent method based on the sample data of the K subjects.
Optionally, in the process of calculating the orthogonal transformation matrix, the acquired but not yet classified electroencephalogram information may also be used in addition to the classified electroencephalogram information. In that case, the sample data acquired in step 101 may further include acquired but unclassified electroencephalogram information.
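As a concrete illustration of the gradient-descent step just described, the sketch below minimizes an objective over P while keeping PP^T = I by re-orthogonalizing P after every update. Because the patent's first objective function appears only as an image, the objective used here is a stand-in: an MMD-style sum of squared differences between the subjects' mean transformed EEG features, which follows the spirit of the derivation above but is an assumption, not the patent's exact formula; the function names and the numerical gradient are likewise illustrative.

    import numpy as np

    def pairwise_distribution_gap(P, subjects):
        """Stand-in first objective: sum over subject pairs (k, l) of the squared
        distance between the mean transformed EEG features of subjects k and l."""
        means = [P @ X.mean(axis=0) for X in subjects]  # mean of P x_{i,k} for each subject
        return sum(
            float(np.sum((means[k] - means[l]) ** 2))
            for k in range(len(means)) for l in range(len(means))
        )

    def reorthogonalize(P):
        """Project P back onto the constraint set {P : P P^T = I} via an SVD."""
        U, _, Vt = np.linalg.svd(P, full_matrices=False)
        return U @ Vt

    def fit_orthogonal_transform(subjects, r, lr=0.01, n_iter=300, eps=1e-5):
        """Sketch of step 102: gradient descent on P under the constraint P P^T = I.

        subjects: one (N_k, d) array of EEG feature vectors per subject.
        r: dimension of the shared hidden space (r <= d).
        """
        d = subjects[0].shape[1]
        rng = np.random.default_rng(0)
        P = reorthogonalize(rng.normal(size=(r, d)))  # random orthonormal starting point
        for _ in range(n_iter):
            base = pairwise_distribution_gap(P, subjects)
            grad = np.zeros_like(P)
            for a in range(r):  # simple numerical gradient of the stand-in objective
                for b in range(d):
                    P_eps = P.copy()
                    P_eps[a, b] += eps
                    grad[a, b] = (pairwise_distribution_gap(P_eps, subjects) - base) / eps
            P = reorthogonalize(P - lr * grad)  # take a descent step, then restore P P^T = I
        return P

Re-orthogonalizing after each update is one common way to maintain the constraint during gradient descent; the patent requires PP^T = I but does not prescribe a particular constrained-optimization technique.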
Step 103: generating an electroencephalogram classification model based on the orthogonal transformation matrix.
In step 103, an electroencephalogram classification model is generated based on the orthogonal transformation matrix calculated in step 102, so that the electroencephalogram classification model can be used to classify the electroencephalogram information of any one of the K subjects.
The algorithm framework of the electroencephalogram classification model in this embodiment of the application is shown in FIG. 1-b. The electroencephalogram classification model comprises two learning parts: learning based on the original data space and learning based on the shared hidden space. The data in the original data space is the electroencephalogram information of each of the plurality of subjects, and the data in the shared hidden space is the correlation information between different subjects (obtained by transforming the electroencephalogram information through the orthogonal transformation matrix). Finally, the learning result based on the original data space and the learning result based on the shared hidden space are combined. Denoting the dimension of the original data space by d and the dimension of the shared hidden space by r, the learning process of the two parts can be described by the following optimization problem:
[optimization problem - given as an image in the original and not reproduced here]
s.t. PP^T = I_{d×d}
where W_k denotes the first weight vector associated with the k-th subject and V denotes the second weight vector (their spaces are specified by formula images in the original), and P is the orthogonal transformation matrix that projects the electroencephalogram information of each subject from the original data space into the shared hidden space (the mapping is specified by a formula image in the original); the dimension of the shared hidden space is denoted r. φ(·) denotes the correlation information between subjects in the shared hidden space, which can be expressed as:
[expression for φ(·) - given as an image in the original and not reproduced here]
For the K tasks (as before, each subject is regarded as a task), the function g_k for the k-th task can be expressed as:
[expression for g_k - given as an image in the original and not reproduced here]
where b_k is the bias value associated with the k-th subject, and C_k and λ are values greater than 0.
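To make the combination of the two learning parts concrete, the sketch below evaluates a decision value for one EEG sample of subject k as an original-data-space term W_k^T x plus a shared-hidden-space term V^T(P x) plus the bias b_k. The exact form of g_k is given only as an image in the patent, so this additive form, and the sign-based binary decision, are assumptions consistent with the surrounding description rather than the patent's literal formula.

    import numpy as np

    def decision_value(x, k, W, V, P, b):
        """Assumed form of g_k: original-space term + shared-hidden-space term + bias.

        x: d-dimensional EEG feature vector of a sample of subject k.
        W: list of per-subject first weight vectors, each of length d.
        V: second weight vector of length r, shared by all subjects.
        P: (r, d) orthogonal transformation matrix from step 102.
        b: list of per-subject bias values.
        """
        original_space_term = float(W[k] @ x)   # learning based on the original data space
        hidden_space_term = float(V @ (P @ x))  # learning based on the shared hidden space
        return original_space_term + hidden_space_term + b[k]

    def classify_sample(x, k, W, V, P, b):
        """Binary classification of one EEG sample by the sign of its decision value."""
        return 1 if decision_value(x, k, W, V, P, b) >= 0 else -1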
On this basis, step 103 may include:
determining a first weight vector, a second weight vector and a bias value in a second objective function based on a Lagrangian expression, according to the calculated orthogonal transformation matrix, the first objective function and a preset second objective function; and
generating an electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
where the second objective function is:
[second objective function - given as an image in the original and not reproduced here]
and the second objective function satisfies the constraint:
[constraint - given as an image in the original and not reproduced here]
Here, ρ_{i,k} denotes the degree of contribution of the i-th sample data of the k-th subject, ε_{i,k} is a relaxation variable of x_{i,k}, y_{i,k} denotes the classification result of the i-th sample data of the k-th subject, W_k^T is the transpose of W_k, where W_k denotes the first weight vector associated with the k-th subject, and V^T is the transpose of V, where V denotes the second weight vector.
Further, the generating an electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value is specifically:
generating a classification algorithm of the electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
where the classification algorithm is:
[classification algorithm - given as an image in the original and not reproduced here]
Here, when k' = k, Z_{i,k} = P·x_{i,k}; when k' ≠ k, Z_{i,k} = Q·x_{i,k}, where Q is defined by a formula given as an image in the original. In addition, α_{i,k'} and b_k satisfy the equation [K + Z]α = Y, in which K, Z and Y are defined by formulas given as images in the original.
after the classification algorithm is adopted, the electroencephalogram information of the subject can be classified through the classification algorithm.
As can be seen from the above, in this embodiment of the application, sample data of K subjects is obtained, an orthogonal transformation matrix is calculated based on the sample data of the K subjects and the first objective function, and an electroencephalogram classification model is then generated based on the orthogonal transformation matrix. Because the orthogonal transformation matrix can transform the electroencephalogram information of the K subjects into correlation information among the K subjects, the electroencephalogram classification model generated based on the orthogonal transformation matrix can accommodate the differences in electroencephalogram information among the K subjects, so the generated model is applicable to all K subjects. Since a plurality of subjects can share the same electroencephalogram classification model, only one model needs to be maintained for the plurality of subjects, which, compared with the conventional scheme, effectively reduces the maintenance cost of electroencephalogram classification models. Further, in the process of generating the electroencephalogram classification model, training is performed on the sample data of a plurality of subjects, and the training process is corrected using the correlation information among the plurality of subjects; therefore, compared with a conventional electroencephalogram classification model trained on the sample data of a single subject, the electroencephalogram classification model generated in this embodiment can improve the electroencephalogram classification accuracy for each subject to a certain extent.
Example two
An embodiment of the present application provides an electroencephalogram classification model generation apparatus. As shown in FIG. 2, the electroencephalogram classification model generation apparatus 200 in this embodiment includes:
an obtaining unit 201, configured to obtain sample data of K subjects, where the sample data includes classified electroencephalogram information and the classification results of the corresponding electroencephalogram information, and K is greater than or equal to 2;
a calculating unit 202, configured to calculate, based on the sample data of the K subjects and a preset first objective function, an orthogonal transformation matrix that minimizes the first objective function, where the first objective function is a function of the orthogonal transformation matrix and the electroencephalogram information of the K subjects, and the orthogonal transformation matrix is used to transform the electroencephalogram information of each of the K subjects into correlation information among the K subjects; and
a generating unit 203, configured to generate an electroencephalogram classification model based on the orthogonal transformation matrix, so that the electroencephalogram classification model can be used to classify the electroencephalogram information of any one of the K subjects.
Optionally, the first objective function is:
[first objective function - given as an image in the original and not reproduced here]
In the first objective function, N_k denotes the number of sample data of the k-th subject, N_l denotes the number of sample data of the l-th subject, P denotes the orthogonal transformation matrix, and P^T is the transpose of P; x_{i,k}^T denotes the transpose of x_{i,k}, where x_{i,k} denotes the electroencephalogram information in the i-th sample data of the k-th subject; and x_{j,l}^T denotes the transpose of x_{j,l}, where x_{j,l} denotes the electroencephalogram information in the j-th sample data of the l-th subject.
The calculating unit 202 is specifically configured to: under the constraint PP^T = I, calculate the P that minimizes the first objective function based on the sample data of the K subjects, where I is an identity matrix.
Optionally, the generating unit 203 includes:
a determining unit, configured to determine a first weight vector, a second weight vector and a bias value in a second objective function based on a Lagrangian expression, according to the orthogonal transformation matrix calculated by the calculating unit 202, the first objective function and a preset second objective function; and
a sub-generating unit, configured to generate an electroencephalogram classification model based on the orthogonal transformation matrix and the first weight vector, second weight vector and bias value determined by the determining unit;
where the second objective function is:
[second objective function - given as an image in the original and not reproduced here]
and the second objective function satisfies the constraint:
[constraint - given as an image in the original and not reproduced here]
Here, ρ_{i,k} denotes the degree of contribution of the i-th sample data of the k-th subject, ε_{i,k} is a relaxation variable of x_{i,k}, C_k and λ are values greater than 0, y_{i,k} denotes the classification result of the i-th sample data of the k-th subject, W_k^T is the transpose of W_k, where W_k denotes the first weight vector associated with the k-th subject, V^T is the transpose of V, where V denotes the second weight vector, and b_k denotes the bias value associated with the k-th subject.
Optionally, the sub-generating unit is specifically configured to:
generate a classification algorithm of the electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
where the classification algorithm is:
[classification algorithm - given as an image in the original and not reproduced here]
Here, when k' = k, Z_{i,k} = P·x_{i,k}; when k' ≠ k, Z_{i,k} = Q·x_{i,k}, where Q is defined by a formula given as an image in the original. In addition, α_{i,k'} and b_k satisfy the equation [K + Z]α = Y, in which K, Z and Y are defined by formulas given as images in the original.
it should be noted that the electroencephalogram classification model generation apparatus in the embodiment of the present application may be an independent device, or the electroencephalogram classification model generation apparatus may also be integrated in an electronic device (for example, a smartphone, a tablet computer, a wearable device, and the like). Optionally, the operating system carried by the device or the electronic device integrating the electroencephalogram classification model generation apparatus may be an ios system, an android system, a windows system, or another operating system, which is not limited herein.
As can be seen from the above, in this embodiment of the application, sample data of K subjects is obtained, an orthogonal transformation matrix is calculated based on the sample data of the K subjects and the first objective function, and an electroencephalogram classification model is then generated based on the orthogonal transformation matrix. Because the orthogonal transformation matrix can transform the electroencephalogram information of the K subjects into correlation information among the K subjects, the electroencephalogram classification model generated based on the orthogonal transformation matrix can accommodate the differences in electroencephalogram information among the K subjects, so the generated model is applicable to all K subjects. Since a plurality of subjects can share the same electroencephalogram classification model, only one model needs to be maintained for the plurality of subjects, which, compared with the conventional scheme, effectively reduces the maintenance cost of electroencephalogram classification models. Further, in the process of generating the electroencephalogram classification model, training is performed on the sample data of a plurality of subjects, and the training process is corrected using the correlation information among the plurality of subjects; therefore, compared with a conventional electroencephalogram classification model trained on the sample data of a single subject, the electroencephalogram classification model generated in this embodiment can improve the electroencephalogram classification accuracy for each subject to a certain extent.
Example three
An embodiment of the present application provides an electronic device. Referring to FIG. 3, the electronic device in this embodiment includes: a memory 301, one or more processors 302 (only one is shown in FIG. 3), and a computer program stored in the memory 301 and executable on the processors. The memory 301 is used to store software programs and modules, and the processor 302 performs various functional applications and data processing by running the software programs and units stored in the memory 301. Specifically, by running the computer program stored in the memory 301, the processor 302 implements the following steps:
obtaining sample data of K subjects, where the sample data includes classified electroencephalogram information and the classification results of the corresponding electroencephalogram information, and K is greater than or equal to 2;
calculating, based on the sample data of the K subjects and a preset first objective function, an orthogonal transformation matrix that minimizes the first objective function, where the first objective function is a function of the orthogonal transformation matrix and the electroencephalogram information of the K subjects, and the orthogonal transformation matrix is used to transform the electroencephalogram information of each of the K subjects into correlation information among the K subjects; and
generating an electroencephalogram classification model based on the orthogonal transformation matrix, so that the electroencephalogram classification model can be used to classify the electroencephalogram information of any one of the K subjects.
Assuming that the above is a first possible implementation manner, in a second possible implementation manner provided on the basis of the first possible implementation manner, the first objective function is:
[first objective function - given as an image in the original and not reproduced here]
In the first objective function, N_k denotes the number of sample data of the k-th subject, N_l denotes the number of sample data of the l-th subject, P denotes the orthogonal transformation matrix, and P^T is the transpose of P; x_{i,k}^T denotes the transpose of x_{i,k}, where x_{i,k} denotes the electroencephalogram information in the i-th sample data of the k-th subject; and x_{j,l}^T denotes the transpose of x_{j,l}, where x_{j,l} denotes the electroencephalogram information in the j-th sample data of the l-th subject.
The calculating, based on the sample data of the K subjects and the preset first objective function, of the orthogonal transformation matrix that minimizes the first objective function is specifically:
under the constraint PP^T = I, calculating the P that minimizes the first objective function based on the sample data of the K subjects, where I is an identity matrix.
In a third possible implementation manner provided on the basis of the second possible implementation manner, the generating an electroencephalogram classification model based on the orthogonal transformation matrix includes:
determining a first weight vector, a second weight vector and a bias value in a second objective function based on a Lagrangian expression, according to the calculated orthogonal transformation matrix, the first objective function and a preset second objective function; and
generating an electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
where the second objective function is:
[second objective function - given as an image in the original and not reproduced here]
and the second objective function satisfies the constraint:
[constraint - given as an image in the original and not reproduced here]
Here, ρ_{i,k} denotes the degree of contribution of the i-th sample data of the k-th subject, ε_{i,k} is a relaxation variable of x_{i,k}, C_k and λ are values greater than 0, y_{i,k} denotes the classification result of the i-th sample data of the k-th subject, W_k^T is the transpose of W_k, where W_k denotes the first weight vector associated with the k-th subject, V^T is the transpose of V, where V denotes the second weight vector, and b_k denotes the bias value associated with the k-th subject.
In a fourth possible implementation manner provided on the basis of the third possible implementation manner, the generating an electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value is specifically:
generating a classification algorithm of the electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
where the classification algorithm is:
[classification algorithm - given as an image in the original and not reproduced here]
Here, when k' = k, Z_{i,k} = P·x_{i,k}; when k' ≠ k, Z_{i,k} = Q·x_{i,k}, where Q is defined by a formula given as an image in the original. In addition, α_{i,k'} and b_k satisfy the equation [K + Z]α = Y, in which K, Z and Y are defined by formulas given as images in the original.
Optionally, as shown in FIG. 3, the electronic device may further include one or more input devices 303 (only one is shown in FIG. 3) and one or more output devices 304 (only one is shown in FIG. 3). The memory 301, the processor 302, the input device 303, and the output device 304 are connected by a bus 305.
It should be understood that, in the embodiments of the present application, the processor 302 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 303 may include a keyboard, a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 304 may include a display, a speaker, etc.
The memory 301 may include a read-only memory and a random access memory, and provides instructions and data to the processor 302. Part or all of the memory 301 may also include a non-volatile random access memory. For example, the memory 301 may also store device type information.
In this embodiment of the application, sample data of K subjects is obtained, an orthogonal transformation matrix is calculated based on the sample data of the K subjects and the first objective function, and an electroencephalogram classification model is then generated based on the orthogonal transformation matrix. Because the orthogonal transformation matrix can transform the electroencephalogram information of the K subjects into correlation information among the K subjects, the electroencephalogram classification model generated based on the orthogonal transformation matrix can accommodate the differences in electroencephalogram information among the K subjects, so the generated model is applicable to all K subjects. Since a plurality of subjects can share the same electroencephalogram classification model, only one model needs to be maintained for the plurality of subjects, which, compared with the conventional scheme, effectively reduces the maintenance cost of electroencephalogram classification models. Further, in the process of generating the electroencephalogram classification model, training is performed on the sample data of a plurality of subjects, and the training process is corrected using the correlation information among the plurality of subjects; therefore, compared with a conventional electroencephalogram classification model trained on the sample data of a single subject, the electroencephalogram classification model generated in this embodiment can improve the electroencephalogram classification accuracy for each subject to a certain extent.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the above-described modules or units is only one logical functional division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, all or part of the flow in the method of the embodiments described above may be implemented by a computer program, which may be stored in a computer readable storage medium and used by a processor to implement the steps of the embodiments of the methods described above. The computer program includes computer program code, and the computer program code may be in a source code form, an object code form, an executable file or some intermediate form. The computer readable medium may include: any entity or device capable of carrying the above-mentioned computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signal, telecommunication signal, software distribution medium, etc. It should be noted that the computer readable medium described above may be suitably increased or decreased as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media excludes electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (8)

1. A method for generating an electroencephalogram classification model is characterized by comprising the following steps:
obtaining sample data of K subjects, wherein the sample data comprises classified electroencephalogram information and the classification results of the corresponding electroencephalogram information, and K is greater than or equal to 2;
calculating, based on the sample data of the K subjects and a preset first objective function, an orthogonal transformation matrix that minimizes the first objective function, wherein the first objective function is a function of the orthogonal transformation matrix and the electroencephalogram information of the K subjects, and the orthogonal transformation matrix is used to transform the electroencephalogram information of each of the K subjects into correlation information among the K subjects; and
generating an electroencephalogram classification model based on the orthogonal transformation matrix, so that the electroencephalogram classification model can be used to classify the electroencephalogram information of any one of the K subjects;
wherein the first objective function is:
[first objective function - given as an image in the original and not reproduced here]
in the first objective function, N_k denotes the number of sample data of the k-th subject, N_l denotes the number of sample data of the l-th subject, P denotes the orthogonal transformation matrix, and P^T is the transpose of P; x_{i,k}^T denotes the transpose of x_{i,k}, wherein x_{i,k} denotes the electroencephalogram information in the i-th sample data of the k-th subject; and x_{j,l}^T denotes the transpose of x_{j,l}, wherein x_{j,l} denotes the electroencephalogram information in the j-th sample data of the l-th subject; and
the calculating, based on the sample data of the K subjects and the preset first objective function, of the orthogonal transformation matrix that minimizes the first objective function is: under the constraint PP^T = I, calculating the P that minimizes the first objective function based on the sample data of the K subjects, wherein I is an identity matrix.
2. The method of generating an electroencephalogram classification model according to claim 1, wherein said generating an electroencephalogram classification model based on the orthogonal transformation matrix comprises:
determining a first weight vector, a second weight vector and a bias value in a second objective function based on a Lagrangian expression, according to the calculated orthogonal transformation matrix, the first objective function and a preset second objective function; and
generating an electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
wherein the second objective function is:
[second objective function - given as an image in the original and not reproduced here]
and the second objective function satisfies the constraint:
[constraint - given as an image in the original and not reproduced here]
wherein ρ_{i,k} denotes the degree of contribution of the i-th sample data of the k-th subject, ε_{i,k} is a relaxation variable of x_{i,k}, C_k and λ are values greater than 0, y_{i,k} denotes the classification result of the i-th sample data of the k-th subject, W_k^T is the transpose of W_k, W_k denotes the first weight vector associated with the k-th subject, V^T is the transpose of V, V denotes the second weight vector, and b_k denotes the bias value associated with the k-th subject.
3. The method of generating an electroencephalogram classification model according to claim 2, wherein the generating an electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, the second weight vector, and the bias value is:
generating a classification algorithm of the electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
wherein the classification algorithm is:
[classification algorithm - given as an image in the original and not reproduced here]
wherein, when k' = k, Z_{i,k} = P·x_{i,k}; when k' ≠ k, Z_{i,k} = Q·x_{i,k}, wherein Q is defined by a formula given as an image in the original; and α_{i,k'} and b_k satisfy the equation [K + Z]α = Y, in which K, Z and Y are defined by formulas given as images in the original.
4. an electroencephalogram classification model generation apparatus, characterized by comprising:
an obtaining unit, configured to obtain sample data of K test subjects, where the sample data includes: the classified electroencephalogram information and the classification result of the corresponding electroencephalogram information, wherein K is greater than or equal to 2;
a calculating unit, configured to calculate an orthogonal transformation matrix that minimizes the first objective function based on the sample data of the K test subjects and a preset first objective function, where the first objective function is a function related to the orthogonal transformation matrix and the electroencephalogram information of the K test subjects, and the orthogonal transformation matrix is used to transform the electroencephalogram information of each of the K test subjects into correlation information between the K test subjects;
a generation unit, configured to generate an electroencephalogram classification model based on the orthogonal transformation matrix, so as to classify electroencephalogram information of any one of the K test subjects using the electroencephalogram classification model;
the first objective function is:
Figure FDA0003115366790000031
in the first objective function, N_k denotes the number of sample data of the k-th test subject, N_l denotes the number of sample data of the l-th test subject, P denotes the orthogonal transformation matrix, and P^T is the transpose of P;
Figure FDA0003115366790000032
denotes the transpose of x_{i,k}, where x_{i,k} denotes the electroencephalogram information in the i-th sample data of the k-th test subject; and
Figure FDA0003115366790000033
denotes the transpose of x_{j,l}, where x_{j,l} denotes the electroencephalogram information in the j-th sample data of the l-th test subject;
the calculating unit is specifically configured to: calculate, under the constraint that PP^T = I, the P that minimizes the first objective function based on the sample data of the K test subjects, where I is an identity matrix.
5. The electroencephalogram classification model generation apparatus according to claim 4, wherein the generation unit includes:
a determining unit, configured to determine a first weight vector, a second weight vector and a bias value in a second objective function based on a Lagrangian expression, according to the orthogonal transformation matrix calculated by the calculating unit, the first objective function and a preset second objective function;
a sub-generation unit, configured to generate an electroencephalogram classification model based on the orthogonal transformation matrix and the first weight vector, the second weight vector and the bias value determined by the determining unit;
wherein the second objective function is:
Figure FDA0003115366790000041
and the second objective function satisfies:
Figure FDA0003115366790000042
where ρ_{i,k} denotes the contribution degree of the i-th sample data of the k-th test subject, ε_{i,k} is a relaxation variable of x_{i,k}, C_k and λ are values greater than 0, and y_{i,k} denotes the classification result of the i-th sample data of the k-th test subject,
Figure FDA0003115366790000043
is the transpose of W_k, W_k denotes the first weight vector associated with the k-th test subject, V^T is the transpose of V, V denotes the second weight vector, and b_k denotes the bias value associated with the k-th test subject.
6. The electroencephalogram classification model generation apparatus of claim 5, wherein the sub-generation unit is specifically configured to:
generate a classification algorithm of the electroencephalogram classification model based on the orthogonal transformation matrix and the determined first weight vector, second weight vector and bias value;
the classification algorithm is as follows:
Figure FDA0003115366790000044
wherein, when k′ = k, Z_{i,k} = P x_{i,k}; when k′ ≠ k, Z_{i,k} = Q x_{i,k}; and, in addition,
Figure FDA0003115366790000045
and α_{i,k′} and b_k satisfy the equation [K + z]α = Y, where, in the equation,
Figure FDA0003115366790000046
Figure FDA0003115366790000047
Figure FDA0003115366790000048
7. an electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1 to 3 are implemented when the computer program is executed by the processor.
8. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 3.
CN201711417656.3A 2017-12-25 2017-12-25 Electroencephalogram classification model generation method and device and electronic equipment Active CN108108763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711417656.3A CN108108763B (en) 2017-12-25 2017-12-25 Electroencephalogram classification model generation method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN108108763A (en) 2018-06-01
CN108108763B (en) 2021-07-23

Family

ID=62212781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711417656.3A Active CN108108763B (en) 2017-12-25 2017-12-25 Electroencephalogram classification model generation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN108108763B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2879896A1 (en) * 2012-07-24 2014-01-30 Cerephex Corporation Method and apparatus for diagnosing and assessing central pain
EP3865056A1 (en) * 2012-09-14 2021-08-18 InteraXon Inc. Systems and methods for collecting, analyzing, and sharing bio-signal and non-bio-signal data
US11229364B2 (en) * 2013-06-14 2022-01-25 Medtronic, Inc. Patient motion analysis for behavior identification based on video frames with user selecting the head and torso from a frame
AT515038B1 (en) * 2013-10-21 2015-12-15 Guger Christoph Dipl Ing Dr Techn Method for quantifying the perceptibility of a person
WO2016077530A1 (en) * 2014-11-12 2016-05-19 The University Of Memphis Fully reconfigurable modular body-worn sensors

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102573619A (en) * 2008-12-19 2012-07-11 新加坡科技研究局 Device and method for generating a representation of a subject's attention level
CN103116279A (en) * 2013-01-16 2013-05-22 大连理工大学 Vague discrete event shared control method of brain-controlled robotic system
CN103092971A (en) * 2013-01-24 2013-05-08 电子科技大学 Classification method used in brain-computer interfaces
CN103425249A (en) * 2013-09-06 2013-12-04 西安电子科技大学 Electroencephalogram signal classifying and recognizing method based on regularized CSP and regularized SRC and electroencephalogram signal remote control system
CN104536572A (en) * 2014-12-30 2015-04-22 天津大学 Cross-individual universal type brain-computer interface method based on event related potential
CN104771163A (en) * 2015-01-30 2015-07-15 杭州电子科技大学 Electroencephalogram feature extraction method based on CSP and R-CSP algorithms
CN107479685A (en) * 2016-06-07 2017-12-15 林闯 Brain electric control method and brain controller for electric consumption
CN106709469A (en) * 2017-01-03 2017-05-24 中国科学院苏州生物医学工程技术研究所 Automatic sleep staging method based on multiple electroencephalogram and electromyography characteristics
CN106803081A (en) * 2017-01-25 2017-06-06 东南大学 A kind of brain electricity sorting technique based on Multi-classifers integrated
CN107292329A (en) * 2017-04-19 2017-10-24 西北工业大学 Event imagination sorting technique based on CI CSP algorithms
CN107169434A (en) * 2017-05-10 2017-09-15 广东工业大学 One kind possesses the electric personal identification method of exclusive brain

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Shuang Liang et al.; "Classification of Motor Imagery Tasks Using Phase Synchronization Analysis of EEG Based on Multivariate Empirical Mode Decomposition"; Proceedings of the 2014 4th IEEE International Conference on Information Science and Technology; 2014-04-26; pp. 675-678 *
Anonymous; "EEG signal pattern classification based on artificial neural networks"; https://wenku.baidu.com/view/ec0dddefe009581b6bd9eb55.html; 2012-03-20; pp. 1-3 *

Similar Documents

Publication Publication Date Title
US10621971B2 (en) Method and device for extracting speech feature based on artificial intelligence
Wang et al. Rafiki: Machine learning as an analytics service system
KR20170091963A (en) Gesture classification apparatus and method using electromyogram signals
Al-Fahoum et al. Methods of EEG Signal Features Extraction Using Linear Analysis in Frequency and Time‐Frequency Domains
JP7125562B2 (en) Target tracking method, computer program, and electronic device
Yang et al. Removal of electrooculogram artifacts from electroencephalogram using canonical correlation analysis with ensemble empirical mode decomposition
CN109965871B (en) Method, system, medium, and apparatus for analyzing brain-computer interface signal
CN113133769A (en) Equipment control method, device and terminal based on motor imagery electroencephalogram signals
CN111671420A (en) Method for extracting features from resting electroencephalogram data and terminal equipment
Athif et al. WaveCSP: a robust motor imagery classifier for consumer EEG devices
CN113627361B (en) Training method and device for face recognition model and computer program product
CN113143295A (en) Equipment control method and terminal based on motor imagery electroencephalogram signals
Jiao et al. Effective connectivity analysis of fMRI data based on network motifs
Ramakrishnan et al. Epileptic eeg signal classification using multi-class convolutional neural network
Ball et al. PWC‐ICA: a method for stationary ordered blind source separation with application to EEG
Mi et al. A comparative study and improvement of two ICA using reference signal methods
CN108108763B (en) Electroencephalogram classification model generation method and device and electronic equipment
Köster et al. A two-layer model of natural stimuli estimated with score matching
CN115169384A (en) Electroencephalogram classification model training method, intention identification method, equipment and medium
Escola et al. Discrete Wavelet Transform in digital audio signal processing: A case study of programming languages performance analysis
Wu et al. fMRI activations via low-complexity second-order inverse-sparse-transform blind separation
Janardhanan et al. Vision based Human Activity Recognition using Deep Neural Network Framework
Song et al. Probing epileptic disorders with lightweight neural network and EEG's intrinsic geometry
Vahabi et al. Enhancing P300 wave of BCI systems via negentropy in adaptive wavelet denoising
CN111274924A (en) Palm vein detection model modeling method, palm vein detection method and palm vein detection device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant