
CN110334626B - Online learning system based on emotional state - Google Patents

Online learning system based on emotional state

Info

Publication number
CN110334626B
CN110334626B (granted publication of application CN201910559707.9A)
Authority
CN
China
Prior art keywords: learner, learning, online, emotion, head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201910559707.9A
Other languages
Chinese (zh)
Other versions
CN110334626A (en)
Inventor
解仑 (Xie Lun)
谭志凌 (Tan Zhiling)
张秋瑜 (Zhang Qiuyu)
王志良 (Wang Zhiliang)
王先梅 (Wang Xianmei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology Beijing USTB
Original Assignee
University of Science and Technology Beijing USTB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology Beijing (USTB)
Priority to CN201910559707.9A
Publication of CN110334626A
Application granted
Publication of CN110334626B
Legal status: Expired - Fee Related

Links

Images

Classifications

    • G06F18/214 Pattern recognition: generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06Q10/063 Administration; management: operations research, analysis or management
    • G06Q50/205 ICT specially adapted for education: education administration or guidance
    • G06V40/10 Recognition of biometric patterns in image or video data: human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/168 Human faces: feature extraction; face representation
    • G06V40/172 Human faces: classification, e.g. identification
    • G06V40/174 Human faces: facial expression recognition
    • G09B19/00 Educational appliances: teaching not covered by other main groups of this subclass


Abstract

The invention provides an online learning system based on emotional state, comprising: a data acquisition module for acquiring head images of the learner and online interactive behavior data; a learning emotion assessment module for analyzing the learner's head images and online interactive behavior data and obtaining the learner's learning emotional state in real time based on a pre-established learning emotion model; a concentration adjustment module for judging, according to the learner's current learning emotional state, whether to enter an interactive relaxation mode; and a learning content intervention module for dynamically adjusting the learning content in the learning content library according to the learner's current learning emotional state. A learning emotion assessment mechanism integrating facial expression recognition, head posture recognition and interactive behavior analysis is applied to the whole process of online learning, and learning content is adjusted and breaks between lessons are scheduled in time according to the learner's emotional state, effectively reducing ineffective learning and improving online learning efficiency.

Description

Online learning system based on emotional state
Technical Field
The invention relates to the technical field of intelligent service and online education, in particular to an online learning system based on emotional states.
Background
With the continuous progress of technologies such as artificial intelligence, computing and big data, and with ever-growing demand, the number of people receiving online education through networks is increasing rapidly. However, the unsupervised self-study mode that dominates this emerging form of education lacks the emotional interaction of traditional teaching, so learning efficiency is low and content passes before the learner's eyes without entering the mind. To improve online learning efficiency and obtain a more accurately customized knowledge structure, there is an urgent need to add emotional interaction to online learning and achieve more intelligent human-computer interaction.
At present, many colleges and universities use massive open online courses (MOOCs) for auxiliary teaching: teachers move part of the teaching content onto the network, and students learn the corresponding content online. Current online learning systems, however, fall short in two respects. On the one hand, for teachers, it is difficult to analyze students' online learning states, weak knowledge links cannot be identified, and classroom content cannot be dynamically adjusted according to feedback from online learning. On the other hand, for students, the lack of an interactive teaching mode easily produces fatigue; mechanically watching lectures and doing homework merely to satisfy the teacher leaves students without motivation for active learning and without awareness of their own progress and mastery, so optimal learning efficiency is hard to achieve.
Disclosure of Invention
To address the defects of existing online teaching systems, the technical problem solved by the invention is to provide an online learning system based on emotional states. The learner's learning emotional state is comprehensively analyzed from head images and the interaction frequency on input devices, and learning content is dynamically adjusted and breaks are arranged according to the different emotional states. This further improves the learner's efficiency, makes online learning an effective learning medium, lets learners study more actively, and yields the greatest learning benefit in a short time.
In order to solve the above technical problem, the present invention provides an online learning system based on emotional state, including:
the data acquisition module is used for acquiring head image data and online interactive behavior data of the learner;
the learning emotion assessment module is used for respectively analyzing and processing the head image data and the online interactive behavior data of the learner, which are acquired by the data acquisition module, and acquiring the learning emotion state of the learner in real time based on a pre-established learning emotion model through an analysis processing result;
the concentration adjustment module is used for judging whether the learning is required to be suspended at present or not according to the current learning emotional state of the learner and entering an interactive relaxation mode;
the learning content intervention module is used for dynamically adjusting the learning content in a learning content library according to the current learning emotional state of the learner; the learning content library comprises the contents to be learned by the learner.
Further, the data acquisition module comprises a head image data acquisition unit and an online interactive behavior data acquisition unit;
the head image data acquisition unit is used for acquiring head image data comprising facial expression characteristics of a learner during online learning and a head rotation posture of the learner during the online learning;
the online interactive behavior data acquisition unit is used for acquiring the operation time length and frequency of the learner on the input equipment in the process of the timing question-answer test.
Further, the learning emotion assessment module comprises a data processing unit and an emotion assessment unit;
the data processing unit is used for analyzing and processing head image data and online interactive behavior data of the learner to obtain subjective emotional state and objective learning effect of the learner;
the emotion assessment unit is used for fusing the subjective emotion state and the objective learning effect of the learner on the basis of the learning emotion mapping model and acquiring the current emotion state of the learner.
Further, the data processing unit comprises an image data processing subunit and an interactive behavior data processing subunit;
the image data processing subunit is used for carrying out noise reduction, segmentation and normalization pretreatment on head image data of the learner, then carrying out facial expression feature recognition and head rotation posture analysis to obtain facial expression features and head rotation postures of the learner, and further obtaining the subjective emotional state of the learner;
when the facial expression characteristics of the learner are pleasant and calm and the head rotation posture is that the head deflection amplitude is in a preset range, indicating that the subjective emotional state of the learner is a positive learning state; when the facial expression characteristics of the learner are sadness, surprise or the head rotation posture is that the head always deviates from the screen within a preset time length, the subjective emotional state of the learner is a negative learning state;
the interactive behavior data processing subunit is used for extracting online interactive behavior data and question and answer result data of the learner, then calculating the answering time length and answering accuracy of the learner, and analyzing the objective learning effect of the learner based on the answering time length and answering accuracy of the learner.
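A minimal sketch of how such an interactive behavior subunit might derive these quantities follows; the record layout, the 60 s time normalization and the equal weighting of speed and accuracy are illustrative assumptions, not details from the disclosure:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class QuizRecord:
    """One timed question-answer interaction (hypothetical layout)."""
    shown_at: float      # timestamp when the question appeared (s)
    answered_at: float   # timestamp when the answer was submitted (s)
    correct: bool        # whether the submitted answer was correct

def objective_learning_effect(records: list[QuizRecord],
                              max_time_s: float = 60.0) -> dict:
    """Summarize answering duration and accuracy into one effect score.
    Assumes at least one record; the fusion formula is an assumption."""
    durations = [r.answered_at - r.shown_at for r in records]
    accuracy = sum(r.correct for r in records) / len(records)
    avg_time = mean(durations)
    # Faster and more accurate answering -> higher objective learning effect.
    speed = max(0.0, 1.0 - avg_time / max_time_s)
    return {"avg_answer_time_s": avg_time,
            "accuracy": accuracy,
            "effect": 0.5 * accuracy + 0.5 * speed}
```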
Further, the learning emotion mapping model reflects the understanding degree of the learner on the current learning content through the facial expression characteristics of the learner, reflects the concentration degree of the learner in the learning process through the head rotation posture of the learner, and reflects the mastering degree of the learner on the current learning content through the online interactive behavior data of the learner.
Further, the process of the image data processing subunit performing facial expression feature recognition includes:
training an expression classification model on a large number of public expression data sets using a convolutional neural network;
and matching the preprocessed head image of the learner with the expression classification model to obtain the facial expression characteristics of the learner.
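As an illustration of this two-step process (train a classifier, then match preprocessed images against it), a minimal sketch follows; the architecture, the 48x48 grayscale input and the seven emotion classes are assumptions in the style of public expression data sets, since the text specifies none of them:

```python
import torch
import torch.nn as nn

class ExpressionCNN(nn.Module):
    """Small CNN expression classifier; layer sizes are illustrative."""
    def __init__(self, n_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 12 -> 6
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# "Matching" a preprocessed head image against the trained model amounts to
# a forward pass followed by argmax over the class scores:
model = ExpressionCNN()        # in practice, load weights trained on a public set
model.eval()
with torch.no_grad():
    face = torch.rand(1, 1, 48, 48)            # stand-in for a preprocessed image
    expression = model(face).argmax(1).item()  # index into e.g. ["pleasure", "calm", ...]
```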
Further, the process of the image data processing subunit performing head rotation posture analysis includes:
selecting an active shape model to position facial feature points of the human face;
establishing a three-dimensional coordinate system, and extracting the position coordinates of each characteristic point;
and performing geometric operation on the extracted position coordinates and the feature point coordinates of the head image of the learner to obtain the head rotation posture of the learner.
Further, the emotion assessment unit is specifically configured to:
on the basis of the learning emotion mapping model, combining subjective emotion state and objective learning effect of a learner, acquiring the understanding degree of the learner on the current learning content according to facial expression of the learner, acquiring the concentration degree of the learner in the current learning process according to the head rotation posture of the learner, and acquiring the mastering degree of the learner on the current learning content according to online interactive behavior data of the learner;
acquiring the current learning emotional state of the learner according to a preset classification mode; wherein the preset classification mode divides the learning emotional state of the learner into: focused and proficient, focused and unfamiliar, distracted and proficient, and distracted and unfamiliar.
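Since the four states form a 2x2 grid over concentration and mastery, the final classification step can be sketched as simple thresholding; the 0.5 thresholds and the 0..1 score scale are assumptions:

```python
def classify_learning_state(concentration: float, mastery: float,
                            c_thresh: float = 0.5, m_thresh: float = 0.5) -> str:
    """Map fused concentration and mastery scores (0..1) onto the four
    learning emotional states named above. Thresholds are assumptions."""
    focused = concentration >= c_thresh
    proficient = mastery >= m_thresh
    if focused and proficient:
        return "focused and proficient"
    if focused:
        return "focused and unfamiliar"
    if proficient:
        return "distracted and proficient"
    return "distracted and unfamiliar"
```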
Further, the concentration adjustment module is specifically configured to:
when the learner is assessed to be in the 'distracted' state, judging that the learner's attention is declining and that it is inadvisable to continue learning; then suspending learning, entering a voice-interaction relaxation mode, and prompting the learner to take a break of a preset relaxation duration.
Further, the learning content intervention module is specifically configured to:
when the learner is judged to be in the 'proficient' state, placing the corresponding learning content into the mastered content library and removing it from the learning content library, so that it is not studied again;
when the learner is judged to be in the 'unfamiliar' state, leaving the corresponding learning content in the learning content library, to be studied again after the learner finishes the current round of learning.
The technical scheme of the invention has the following beneficial effects:
the online learning system based on the emotional state applies a learning emotion evaluation mechanism which integrates facial expression recognition, head posture recognition and interactive behavior analysis to the whole learning process of online learning; the method comprises the steps that by collecting head image data and online interactive behavior data of a learner, a learning emotion assessment mechanism is utilized to analyze the collected data in real time, and the learning process of the learner is continuously monitored; adjusting learning content and taking a rest between classes in time according to the emotional state of the learner; and once the concentration degree of the learner deviates from the normal learning dimension, activating a voice interaction system to enter a relaxation link, and guiding the learner to improve the attention. When the learner is found not to understand the current learning content, the content in the learning content library is dynamically adjusted, and the repeated learning stage is immediately started after one round of learning is finished until the comprehensive test is started after all the content in the learning content library is finished. Therefore, invalid learning is effectively reduced, and the online learning efficiency is improved.
In addition, the online learning system based on emotional states uses a CAM500A camera (5 megapixels, supporting smooth high-definition video up to 720p@30fps) to collect the learner's head image data and obtain real-time facial expression features and head rotation postures. Like a private tutor, it can attend to the learner's state in real time and focus its analysis on how well the learner is absorbing the learning content, while effectively reducing repeated effort, so that the learner obtains the best learning effect during the learning process.
Drawings
FIG. 1 is a schematic structural diagram of an online learning system based on emotional states according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the working flow of the online learning system based on emotional states according to the embodiment of the present invention;
FIG. 3 is a functional diagram of an online learning system based on emotional states according to an embodiment of the invention;
FIG. 4 is a schematic flow chart of head rotation gesture recognition according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating learning emotion recognition according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
First, the invention is dedicated to combining online learning organically with real-time emotional states: learning states are analyzed from the learner's emotional expression, and learning content is adjusted appropriately for different learning states, realizing more intelligent human-computer interaction and improving learning efficiency.
Various signals can be chosen for studying the learner's emotional state during learning. Psychologists indicate that human emotional expression is composed of 55% facial expression, 38% voice and 7% language. The invention therefore analyzes learning emotion mainly through the learner's facial expression, assists emotion regulation with voice, and uses the rotating posture of the head as a further medium for learning emotion recognition.
The emotional state in the online learning process mainly comprises the learner's understanding of the learning content, mastery of the learning content, and current degree of attention. During learning, the learner's expression, body posture and reaction time are feedback on how well the current content is understood, and this feedback is fairly universal. When the learner has difficulty understanding the content, the expression is usually anxious and confused, the head posture stiff, and the reaction time long. When the learner takes the content in, the expression is usually calmer with slight pleasure, the gaze rests steadily on the screen area, and the reaction time is shorter. When the learner is tired, the expression often becomes dull and distracted, and the head posture freer and restless.
Based on the above theoretical basis, this embodiment proposes an online learning system based on emotional state, as shown in fig. 1, including:
the data acquisition module is used for acquiring head image data and online interactive behavior data of the learner;
the learning emotion assessment module is used for respectively analyzing and processing the head image data and the online interactive behavior data of the learner, which are acquired by the data acquisition module, and acquiring the learning emotion state of the learner in real time based on a pre-established learning emotion model through an analysis processing result;
the concentration adjustment module is used for judging whether the learning is required to be suspended at present or not according to the current learning emotional state of the learner and entering an interactive relaxation mode;
the learning content intervention module is used for dynamically adjusting the learning content in the learning content library according to the current learning emotional state of the learner; the learning content library comprises the contents to be learned by the learner.
Further, the data acquisition module comprises a head image data acquisition unit and an online interactive behavior data acquisition unit. The head image data acquisition unit uses a CAM500A camera (5 megapixels, supporting smooth high-definition video up to 720p@30fps) and mainly acquires head image data including the learner's facial expression features during online learning and the learner's head rotation posture during the learning process. The online interactive behavior data acquisition unit acquires behavior data such as the duration and frequency of the learner's operations on the input device during timed question-answer tests; it is mainly used to analyze the learner's response time when answering questions and the learner's proficiency in the corresponding knowledge points during online learning.
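Treating the CAM500A as an ordinary UVC webcam (an assumption; the text only gives its resolution and frame rate), the head image data acquisition unit could be sketched with OpenCV as follows:

```python
import cv2

def open_head_camera(index: int = 0) -> cv2.VideoCapture:
    """Open the learner-facing camera at 720p@30fps. Device index 0 and
    UVC behavior are assumptions about the CAM500A."""
    cap = cv2.VideoCapture(index)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
    cap.set(cv2.CAP_PROP_FPS, 30)
    return cap

cap = open_head_camera()
ok, frame = cap.read()   # one BGR head-image frame for the assessment module
cap.release()
```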
The learning emotion assessment module comprises a data processing unit and an emotion assessment unit; the data processing unit comprises an image data processing subunit and an interactive behavior data processing subunit. The image data processing subunit trains a facial expression feature classification model on a large number of public expression data sets and a head rotation posture classification model on a head rotation posture data set; the image data acquired by the head image data acquisition unit is preprocessed (noise reduction, segmentation, normalization and the like) and then compared against the two models, outputting a facial expression feature classification result and a head rotation posture classification result, from which the learner's subjective emotional state is obtained;
that is, when the facial expression characteristics of the learner are pleasant and calm and the head rotation posture is that the head deflection amplitude is in a preset range, the subjective emotional state of the learner is an active learning state; when the facial expression characteristics of the learner are sadness, surprise or the head rotation posture is that the head always deviates from the screen within a preset time length, the subjective emotional state of the learner is a negative learning state;
the interactive behavior data processing subunit is used for extracting online interactive behavior data and question and answer result data of the learner, then calculating the answering time length and answering accuracy of the learner, analyzing the objective learning effect of the learner based on the answering time length and answering accuracy of the learner, and obtaining the mastery degree of the learner on the test content.
The emotion assessment unit fuses the learner's subjective emotional state and objective learning effect on the basis of the learning emotion mapping model to obtain the learner's current emotional state. The learning emotion mapping model reflects the learner's understanding of the current learning content through facial expression features, the learner's concentration during learning through head rotation posture, and the learner's mastery of the current learning content through online interactive behavior data; the three are fused to form the learning emotional state.
Specifically, the facial expression feature recognition process of the image data processing subunit includes: training an expression classification model on a large number of public expression data sets using a convolutional neural network, and matching the learner's preprocessed head image against the expression classification model to obtain the learner's facial expression features.
The process of the image data processing subunit for analyzing the head rotation posture comprises the following steps: selecting an active shape model to position facial feature points of the human face; establishing a three-dimensional coordinate system, and extracting the position coordinates of each characteristic point; and performing geometric operation on the extracted position coordinates and the feature point coordinates of the head image of the learner to obtain the head rotation posture of the learner.
The emotion assessment unit is specifically configured to:
on the basis of the learning emotion mapping model, combine the learner's subjective emotional state and objective learning effect: obtain the learner's understanding of the current learning content from facial expressions, the learner's concentration in the current learning process from head rotation posture, and the learner's mastery of the current content from online interactive behavior data; then acquire the learner's current learning emotional state according to a preset classification mode, which divides the learning emotional state into: focused and proficient, focused and unfamiliar, distracted and proficient, and distracted and unfamiliar.
Further, the concentration adjustment module is specifically configured to:
when the learner is assessed to be in a distracted state, i.e., when the learner's head movement exceeds the main screen range or stays at the same deflection angle for a long time, the learner's attention is judged to be declining and continued learning is inadvisable; learning is then suspended and a voice-interaction relaxation mode is entered, prompting the learner to take a 30-60 s break between classes, guiding the learner to refocus, and adjusting the learner's concentration state in time. The module aims to liven the classroom atmosphere, raise the learner's interest in learning, and promptly block inefficient learning behavior by means of a physical interruption. The user may of course retain the right to decline the concentration adjustment.
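A minimal sketch of this relaxation step follows; the speak callback stands in for whatever text-to-speech interface the voice interaction system exposes, and the prompt wording is invented:

```python
import random
import time

def enter_relaxation_mode(speak, min_s: int = 30, max_s: int = 60) -> None:
    """Pause learning and run a short voice-guided break, per the text's
    30-60 s window. `speak` is an assumed TTS callable."""
    pause = random.randint(min_s, max_s)
    speak("Your attention seems to be drifting. Let's take a short break.")
    time.sleep(pause)                    # the relaxation period itself
    speak("Break over. Let's refocus on the lesson.")

# Example: enter_relaxation_mode(print)  # print stands in for real TTS
```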
Further, the learning content intervention module is specifically configured to:
when the learner is judged to be in the 'proficient' state, the corresponding learning content is placed into the mastered content library and removed from the learning content library, so that it is not studied again; when the learner is judged to be in the 'unfamiliar' state, the corresponding learning content remains in the learning content library and must be studied again after the learner finishes the current round of learning. When all the content has been learned, i.e., the learning content library is empty, a comprehensive test is given to check the learner's online learning effect. By dynamically adjusting how often learning content is repeated, the module keeps the user's learning target focused on the unfamiliar part, reducing ineffective learning and wasted time and realizing intelligent interaction between the online learner and the online learning system.
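The intervention logic amounts to moving items between two libraries and repeating rounds until the learning library is empty; a sketch under the assumption that an assess(item) call returns the judged state after an item is studied:

```python
def run_learning_rounds(learning_library: list[str], assess) -> None:
    """Loop rounds of study until every item has been judged 'proficient'.
    `assess(item)` is an assumed hook returning "proficient" or "unfamiliar"
    after the item is studied and the learner's state is evaluated."""
    mastered_library: list[str] = []
    while learning_library:                   # one pass = one learning round
        for item in list(learning_library):
            if assess(item) == "proficient":  # mastered: move it out
                learning_library.remove(item)
                mastered_library.append(item)
            # "unfamiliar" items stay in the library for the next round
    # library empty -> comprehensive test over everything that was learned
    print(f"Comprehensive test over {len(mastered_library)} learned items")
```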
The working process of the online learning system based on emotional states is shown in fig. 2. First, the system collects the learner's learning data, including head image data and online interactive behavior data from the input device. Then, as the learner studies content from the system's learning content library, the system analyzes the collected data in real time through the constructed learning emotion assessment mechanism, which integrates online expression recognition, real-time posture analysis and interactive behavior judgment, and continuously monitors the learning process. Once the user's concentration deviates from the normal learning range, the system activates the voice interaction system to enter a relaxation stage and draw the learner's attention back. When the learner does not understand the current content well, the learning content library is dynamically adjusted and a repeat learning stage begins immediately after each round, until all content in the library has been finished, the comprehensive test is given, and the system run ends.
Fig. 3 is a functional diagram of the online learning system based on emotional states according to the embodiment. The online learning system provides learning resources for users and analyzes, stores and feeds back the users' learning effect in time, so that users can carry out efficient learning activities. The main functions comprise user registration, user login, course query, online course learning, learning effect query, personal information query and the like. After the user finishes learning, the system automatically stores the emotion trend of the whole online learning process, so that the user can query the learning state of any time period, identify weak knowledge links and review them in time.
Fig. 4 is a schematic flow chart of the head pose recognition algorithm adopted by the online learning system based on emotional states according to the embodiment. This part selects the Active Shape Model (ASM) as the main head posture recognition algorithm. The advantage of the algorithm is that it reduces head posture estimation to the extraction of facial feature points: once the position coordinates of the key facial feature points are obtained, the motion posture of the head can be judged by simple geometric operations. The method comprises the following steps:
establishing an ASM model based on the human face: 68 feature points that describe the basic outline of the face well, such as edge points, points of large curvature and T-shaped junction points, are selected to build the model;
step two, model training is carried out on the selected n training samples. Each training sample marked with feature points is regarded as a two-dimensional shape vector a_i composed of the 68 feature points, where i denotes the i-th image. All training samples in the training set are then aligned: one reference image is taken as the standard sample, and the other models are aligned onto it by geometric transformation. A statistical shape model a = ā + P·b is constructed using principal component analysis, repeating the following steps until the aligned average shapes of adjacent iterations converge:
a. calculate the average shape vector: ā = (1/n) Σ_{i=1}^{n} a_i
b. calculate the covariance matrix: S = (1/n) Σ_{i=1}^{n} (a_i − ā)(a_i − ā)^T
c. calculate the eigenvalues of S and sort them in descending order: S·p_i = λ_i·p_i (λ_i ≥ λ_{i+1}, p_i^T·p_i = 1, i = 1, 2, ..., 2n), where p_i denotes the eigenvector of S corresponding to the eigenvalue λ_i.
d. select the first t eigenvalues and their eigenvectors to form the matrix P = (p_1, p_2, ..., p_t)
Step three, searching and matching the model:
a. apply an affine transformation to the average shape to obtain the initial model: X = M(s, θ)·x + X_c, where M(s, θ)·x is the position change applied to the model shape x, s represents scaling, θ represents rotation, and X_c denotes the translation.
b. Calculating the new position of each feature point in the image: x + dX
c. to make the current feature point position X closest to the new position X + dX (i.e., to minimize dX), recalculate the shape and pose parameters by affine alignment: solve for the increments ds, dθ and dX_c such that M(s(1+ds), θ+dθ)·(x + da) + (X_c + dX_c) = X + dX, from which the shape adjustment da can be found; projecting it onto the shape subspace gives db = P^T·da
d. update the parameters: s = s(1+ds), θ = θ(1+dθ), b = b(1+db), t = t(1+dt)
e. calculate the new shape: X = M(s, θ)·x + X_c
f. repeat steps b to e until the feature points in the model are closest to the feature points in the actual image; the final matched shape is the actual human face shape.
Step four, establish a three-dimensional coordinate system for the head model: the upper left corner of the image is set as the coordinate origin (0, 0); horizontally rightward and vertically downward are the positive directions of the X axis and Y axis respectively; and, by the right-hand rule, perpendicular to the screen and pointing into the screen is the positive direction of the Z axis.
Step five, select five facial feature points, namely the left eye center, right eye center, left mouth corner, right mouth corner and nose tip, as the key feature points used in the next step.
Step six, substitute the position coordinates of the feature points into the trigonometric functions for judging head posture, derive the ranges of pitch, yaw and roll, and estimate the range of the user's head posture. Pitch is the angle of rotation of the head about the X axis (the pitch angle); yaw is the angle of rotation about the Y axis (the yaw angle); roll is the angle of rotation about the Z axis (the roll angle).
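A sketch of steps four to six in the image coordinate system defined above; the exact trigonometric decision functions are not given in the text, so these arctangent ratios are only a plain geometric approximation of pitch, yaw and roll from the five key points:

```python
import numpy as np

def head_pose_angles(pts: dict) -> tuple[float, float, float]:
    """Rough pitch/yaw/roll (degrees) from the five key feature points,
    with the origin at the image's top-left, X right, Y down, Z into the
    screen. The specific ratios here are assumptions."""
    le, re = np.array(pts["left_eye"]), np.array(pts["right_eye"])
    lm, rm = np.array(pts["left_mouth"]), np.array(pts["right_mouth"])
    nose = np.array(pts["nose_tip"])

    eye_mid = (le + re) / 2.0
    mouth_mid = (lm + rm) / 2.0
    eye_dist = np.linalg.norm(re - le)
    face_h = np.linalg.norm(mouth_mid - eye_mid)

    # Roll: in-plane tilt of the inter-ocular line (rotation about the Z axis).
    roll = np.degrees(np.arctan2(re[1] - le[1], re[0] - le[0]))
    # Yaw: horizontal drift of the nose tip off the facial midline (about Y).
    yaw = np.degrees(np.arctan2(nose[0] - eye_mid[0], eye_dist))
    # Pitch: vertical nose position between eye and mouth lines (about X).
    pitch = np.degrees(np.arctan2(nose[1] - eye_mid[1] - 0.5 * face_h, face_h))
    return float(pitch), float(yaw), float(roll)

# Example with hypothetical pixel coordinates (a roughly frontal face -> ~0,0,0):
angles = head_pose_angles({"left_eye": (300, 240), "right_eye": (380, 240),
                           "left_mouth": (315, 330), "right_mouth": (365, 330),
                           "nose_tip": (340, 285)})
```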
Fig. 5 is a schematic diagram of a learning emotion recognition flow of the online learning system based on emotional states according to the embodiment. The learning emotion assessment mechanism comprises two parts of image data analysis and input equipment online interaction behavior data analysis, an online learning platform is used for collecting image data and operation behavior data of a learner in an online learning process, and meanwhile an open human expression data set is used for training an expression classification model and a feature matching model for head posture recognition.
Preprocessing such as noise reduction, enhancement and normalization is performed on the acquired image data; face detection is then performed on the processed image, locking the target area onto the face. Key facial feature points are extracted with the active shape model algorithm to obtain their position coordinates, and the head posture movement trend is obtained from changes in those coordinates, from which the learner's concentration is judged. Texture features are extracted from the preprocessed image and compared with the trained expression classification model, outputting emotion classification results such as pleasure, calm, sadness and surprise, from which the learner's understanding of the learning content is judged. The response time is calculated from the acquired interactive behavior data to obtain the learner's thinking time when answering questions, and is combined with answer accuracy to comprehensively analyze the learner's mastery of the learned content. Information from the three channels of expression features, motion trend and behavior analysis is fused to comprehensively evaluate the learner's learning emotional state along the three dimensions of understanding, concentration and mastery.
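Putting the three channels together, the per-frame fusion can be sketched as follows; the threshold values, weights and the choice of pleasure/calm as the positive expression set are assumptions:

```python
def assess_learning_emotion(expression: str, pose_deviation: float,
                            accuracy: float, avg_answer_time_s: float) -> dict:
    """Fuse expression, head-motion trend and answering behavior into the
    three dimensions named above: understanding, concentration, mastery."""
    # Understanding: positive expressions suggest the content is being absorbed.
    understanding = 1.0 if expression in ("pleasure", "calm") else 0.0
    # Concentration: 0 deviation = gaze steady on the screen; larger = distracted.
    concentration = max(0.0, 1.0 - pose_deviation)
    # Mastery: accurate and quick answers in the timed question-answer tests.
    mastery = 0.5 * accuracy + 0.5 * max(0.0, 1.0 - avg_answer_time_s / 60.0)
    return {"understanding": understanding,
            "concentration": concentration,
            "mastery": mastery}
```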
The online learning system based on emotional state applies a learning emotion assessment mechanism integrating facial expression recognition, head posture recognition and interactive behavior analysis to the whole process of online learning. By collecting the learner's head image data and online interactive behavior data, the assessment mechanism analyzes the collected data in real time and continuously monitors the learning process, adjusting learning content and arranging breaks in time according to the learner's emotional state. Once the learner's concentration deviates from the normal learning range, a voice interaction system is activated to enter a relaxation stage and guide the learner's attention back. When the learner is found not to understand the current learning content, the content in the learning content library is dynamically adjusted, and a repeat learning stage begins immediately after each round of learning, until all content in the library has been finished and the comprehensive test begins. Ineffective learning is thus effectively reduced, and online learning efficiency is improved.
In addition, the online learning system based on emotional states uses a CAM500A camera (5 megapixels, supporting smooth high-definition video up to 720p@30fps) to collect the learner's head image data and obtain real-time facial expression features and head rotation postures. Like a private tutor, it can attend to the learner's state in real time and focus its analysis on how well the learner is absorbing the learning content, while effectively reducing repeated effort, so that the learner obtains the best learning effect during the learning process.
Furthermore, it should be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks. These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
It should also be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (3)

1. An online learning system based on emotional states, comprising:
the data acquisition module is used for acquiring head image data and online interactive behavior data of the learner;
the learning emotion assessment module is used for respectively analyzing and processing the head image data and the online interactive behavior data of the learner, which are acquired by the data acquisition module, and acquiring the learning emotion state of the learner in real time based on a pre-established learning emotion model through an analysis processing result;
the concentration adjustment module is used for judging whether the learning is required to be suspended at present or not according to the current learning emotional state of the learner and entering an interactive relaxation mode;
the learning content intervention module is used for dynamically adjusting the learning content in a learning content library according to the current learning emotional state of the learner; the learning content library comprises contents to be learned by a learner;
the data acquisition module comprises a head image data acquisition unit and an online interaction behavior data acquisition unit;
the head image data acquisition unit is used for acquiring head image data comprising facial expression characteristics of a learner during online learning and a head rotation posture of the learner during the online learning;
the online interactive behavior data acquisition unit is used for acquiring the operation time length and frequency of the learner on the input equipment in the process of the timing question-answer test;
the learning emotion assessment module comprises a data processing unit and an emotion assessment unit;
the data processing unit is used for analyzing and processing head image data and online interactive behavior data of the learner to obtain subjective emotional state and objective learning effect of the learner;
the emotion assessment unit is used for fusing the subjective emotion state and the objective learning effect of the learner on the basis of the learning emotion mapping model to acquire the current emotion state of the learner;
the data processing unit comprises an image data processing subunit and an interactive behavior data processing subunit;
the image data processing subunit is used for carrying out noise reduction, segmentation and normalization pretreatment on head image data of the learner, then carrying out facial expression feature recognition and head rotation posture analysis to obtain facial expression features and head rotation postures of the learner, and further obtaining the subjective emotional state of the learner;
when the facial expression characteristics of the learner are pleasant and calm and the head rotation posture is that the head deflection amplitude is in a preset range, indicating that the subjective emotional state of the learner is a positive learning state; when the facial expression characteristics of the learner are sadness, surprise or the head rotation posture is that the head always deviates from the screen within a preset time length, the subjective emotional state of the learner is a negative learning state;
the interactive behavior data processing subunit is used for extracting online interactive behavior data and question and answer result data of the learner, then calculating the answering time length and answering accuracy of the learner, and analyzing the objective learning effect of the learner based on the answering time length and answering accuracy of the learner;
the learning emotion mapping model reflects the understanding degree of the learner on the current learning content through the facial expression characteristics of the learner, reflects the concentration degree of the learner in the learning process through the head rotation posture of the learner, and reflects the mastering degree of the learner on the current learning content through the online interactive behavior data of the learner; the emotion assessment unit is specifically configured to:
on the basis of the learning emotion mapping model, combining subjective emotion state and objective learning effect of a learner, acquiring the understanding degree of the learner on the current learning content according to facial expression of the learner, acquiring the concentration degree of the learner in the current learning process according to the head rotation posture of the learner, and acquiring the mastering degree of the learner on the current learning content according to online interactive behavior data of the learner;
acquiring the current learning emotional state of the learner according to a preset classification mode; wherein the preset classification mode divides the learning emotional state of the learner into: focused and proficient, focused and unfamiliar, distracted and proficient, and distracted and unfamiliar;
the concentration adjustment module is specifically configured to:
when the learner is assessed to be in the 'distracted' state, judging that the learner's attention is declining and that it is inadvisable to continue learning; then suspending learning, entering a voice-interaction relaxation mode, and prompting the learner to take a break of a preset relaxation duration;
the learning content intervention module is specifically configured to:
when the learner is judged to be in the 'proficient' state, the corresponding learning content is placed into the mastered content library and removed from the learning content library, so that it is not studied again;
when the learner is judged to be in the 'unfamiliar' state, the corresponding learning content remains in the learning content library and still needs to be studied after the learner finishes the current round of learning;
after the user finishes learning, the system automatically stores the emotion trend of the whole online learning process, so that the user can query the learning state of any time period, identify weak knowledge links and review them in time.
2. An emotional state-based online learning system as claimed in claim 1, wherein the image data processing sub-unit performs facial expression feature recognition comprising:
training a large number of public expression data sets by adopting a convolutional neural network to obtain an expression classification model;
and matching the preprocessed head image of the learner with the expression classification model to obtain the facial expression characteristics of the learner.
3. An emotional state-based online learning system according to claim 1, wherein the image data processing subunit performs head rotation pose analysis by:
selecting an active shape model to position facial feature points of the human face;
establishing a three-dimensional coordinate system, and extracting the position coordinates of each characteristic point;
and performing geometric operation on the extracted position coordinates and the feature point coordinates of the head image of the learner to obtain the head rotation posture of the learner.
Application CN201910559707.9A, filed 2019-06-26 (priority date 2019-06-26): Online learning system based on emotional state; granted as CN110334626B (en); status: Expired - Fee Related

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201910559707.9A (granted as CN110334626B) | 2019-06-26 | 2019-06-26 | Online learning system based on emotional state

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201910559707.9A (granted as CN110334626B) | 2019-06-26 | 2019-06-26 | Online learning system based on emotional state

Publications (2)

Publication Number | Publication Date
CN110334626A (en) | 2019-10-15
CN110334626B (en) | 2022-03-04

Family

ID=68142739

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN201910559707.9A (CN110334626B, Expired - Fee Related) | Online learning system based on emotional state | 2019-06-26 | 2019-06-26

Country Status (1)

Country | Link
CN | CN110334626B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111027477B (en) * 2019-12-10 2021-05-28 珠海读书郎网络教育有限公司 Online flat learning degree early warning method based on facial recognition
CN111242049B (en) * 2020-01-15 2023-08-04 武汉科技大学 Face recognition-based student online class learning state evaluation method and system
CN111625098B (en) * 2020-06-01 2022-11-18 广州市大湾区虚拟现实研究院 Intelligent virtual avatar interaction method and device based on multi-channel information fusion
CN111709362B (en) * 2020-06-16 2023-08-08 百度在线网络技术(北京)有限公司 Method, device, equipment and storage medium for determining important learning content
CN111687863B (en) * 2020-06-24 2021-10-01 中国银行股份有限公司 Network point robot self-adjusting method and device
CN111858968A (en) * 2020-07-08 2020-10-30 墨子(深圳)人工智能技术有限公司 Artificial intelligence learning device and system with pen-holding sitting posture correction function
CN112306832A (en) * 2020-10-27 2021-02-02 北京字节跳动网络技术有限公司 User state response method and device, electronic equipment and storage medium
CN112541529A (en) * 2020-12-04 2021-03-23 北京科技大学 Expression and posture fusion bimodal teaching evaluation method, device and storage medium
CN112507243B (en) * 2021-02-07 2021-05-18 深圳市阿卡索资讯股份有限公司 Content pushing method and device based on expressions
CN112907406B (en) * 2021-02-07 2022-04-08 北京科技大学 Online learning system based on cloud fusion multi-modal analysis
CN113139439B (en) * 2021-04-06 2022-06-10 广州大学 Online learning concentration evaluation method and device based on face recognition
CN113239794B (en) * 2021-05-11 2023-05-23 西北工业大学 Online learning-oriented learning state automatic identification method
CN113421068B (en) * 2021-08-23 2021-11-30 深圳市创能亿科科技开发有限公司 Study time recommendation method and device and computer readable storage medium
CN115205764B (en) * 2022-09-15 2022-11-25 深圳市企鹅网络科技有限公司 Online learning concentration monitoring method, system and medium based on machine vision
CN116453384A (en) * 2023-06-19 2023-07-18 江西德瑞光电技术有限责任公司 Immersion type intelligent learning system based on TOF technology and control method
CN116797090B (en) * 2023-06-26 2024-03-26 国信蓝桥教育科技股份有限公司 Online assessment method and system for classroom learning state of student

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9110501B2 (en) * 2012-04-17 2015-08-18 Samsung Electronics Co., Ltd. Method and apparatus for detecting talking segments in a video sequence using visual cues
CN102867173B (en) * 2012-08-28 2015-01-28 华南理工大学 Human face recognition method and system thereof
CN103366618B (en) * 2013-07-18 2015-04-01 梁亚楠 Scene device for Chinese learning training based on artificial intelligence and virtual reality
US20160180722A1 (en) * 2014-12-22 2016-06-23 Intel Corporation Systems and methods for self-learning, content-aware affect recognition
CN107291654A (en) * 2016-03-31 2017-10-24 深圳光启合众科技有限公司 The intelligent decision system and method for robot
CN106529409B (en) * 2016-10-10 2019-08-09 中山大学 A kind of eye gaze visual angle measuring method based on head pose
CN106844675B (en) * 2017-01-24 2020-11-17 北京光年无限科技有限公司 Robot multi-mode output method for children and robot
CN107066956B (en) * 2017-03-24 2020-06-19 北京科技大学 Multisource emotion recognition robot based on body area network
CN108664932B (en) * 2017-05-12 2021-07-09 华中师范大学 Learning emotional state identification method based on multi-source information fusion
CN107316520B (en) * 2017-08-17 2020-10-02 广州视源电子科技股份有限公司 Video teaching interaction method, device, equipment and storage medium
CN108182541A (en) * 2018-01-10 2018-06-19 张木华 A kind of blended learning recruitment evaluation and interference method and device
CN108805087B (en) * 2018-06-14 2021-06-15 南京云思创智信息科技有限公司 Time sequence semantic fusion association judgment subsystem based on multi-modal emotion recognition system
CN108876676A (en) * 2018-06-15 2018-11-23 四川文理学院 A kind of robot teaching's method and system based on scene
CN109101883B (en) * 2018-07-09 2021-11-09 山东师范大学 Depression tendency evaluation device and system
CN109255366B (en) * 2018-08-01 2020-07-17 北京科技大学 Emotional state adjusting system for online learning

Also Published As

Publication number | Publication date
CN110334626A (en) | 2019-10-15


Legal Events

PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant
CF01 | Termination of patent right due to non-payment of annual fee

Granted publication date: 20220304