
CN109034279B - Handwriting model training method, handwriting character recognition method, device, equipment and medium - Google Patents

Handwriting model training method, handwriting character recognition method, device, equipment and medium

Info

Publication number
CN109034279B
CN109034279B (application CN201810563480.0A)
Authority
CN
China
Prior art keywords
chinese character
chinese
training
recognition model
handwriting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810563480.0A
Other languages
Chinese (zh)
Other versions
CN109034279A (en)
Inventor
黄春岑
周罡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN201810563480.0A
Priority to PCT/CN2018/094269 (WO2019232859A1)
Publication of CN109034279A
Application granted
Publication of CN109034279B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/24 Character recognition characterised by the processing or recognition method
    • G06V30/242 Division of the character sequences into groups prior to recognition; Selection of dictionaries
    • G06V30/244 Division of the character sequences into groups prior to recognition; Selection of dictionaries using graphical properties, e.g. alphabet type or font
    • G06V30/2455 Discrimination between machine-print, hand-print and cursive writing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/049 Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Character Discrimination (AREA)

Abstract

The invention discloses a handwriting model training method, a handwritten character recognition method, and a corresponding device, equipment, and medium. The handwriting model training method comprises the following steps: acquiring standard Chinese character training samples and dividing them into batches according to a preset batch count; training a bidirectional long short-term memory neural network with the batched standard Chinese character training samples, updating the network parameters with a backpropagation-through-time algorithm, and obtaining a standard Chinese character recognition model; acquiring non-standard Chinese character training samples and training with them to obtain an adjusted Chinese handwriting recognition model; recognizing a Chinese character test sample to obtain an error-character training sample; and, with a backpropagation-through-time algorithm based on batch gradient descent, updating the network parameters of the adjusted Chinese handwriting recognition model using the error-character training sample to obtain the target Chinese handwriting recognition model. With this handwriting model training method, a target Chinese handwriting recognition model with a high recognition rate for handwritten characters can be obtained.

Description

Handwriting model training method, handwriting character recognition method, device, equipment and medium
Technical Field
The present invention relates to the field of Chinese character recognition, and in particular to a handwriting model training method, a handwritten character recognition method, and corresponding devices, equipment, and media.
Background
Traditional handwritten character recognition methods mostly rely on steps such as binarization, character segmentation, feature extraction, and support vector machine classification. When such methods are used to recognize poorly written, non-standard characters (handwritten Chinese characters), the recognition accuracy is low and the recognition effect is unsatisfactory. To a large extent, traditional methods can only recognize standard characters, and their accuracy drops when recognizing the varied handwritten characters found in everyday life.
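As a minimal sketch of the first step of that traditional pipeline, binarization can be written as a fixed-threshold operation. This is illustrative only: the threshold value and the tiny image below are arbitrary, and real systems usually choose the threshold adaptively (e.g. with Otsu's method).

```python
# Minimal fixed-threshold binarization sketch (illustrative only; real
# systems typically pick the threshold adaptively, e.g. Otsu's method).
def binarize(gray_image, threshold=128):
    """Map each grayscale pixel (0-255) to 0 (background) or 1 (ink)."""
    return [[1 if px < threshold else 0 for px in row] for row in gray_image]

# toy 3x3 grayscale image with a dark vertical stroke in the middle
image = [
    [250, 250, 250],
    [250,  10, 250],
    [250,  10, 250],
]
binary = binarize(image)
```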
Disclosure of Invention
The embodiments of the invention provide a handwriting model training method, device, equipment, and medium to address the low accuracy of current handwritten character recognition.
A handwriting model training method, comprising:
acquiring standard Chinese character training samples, and dividing the standard Chinese character training samples into batches according to a preset batch count;
inputting the batched standard Chinese character training samples into a bidirectional long short-term memory neural network for training, obtaining the forward output of the network, updating the network parameters of the network according to that forward output by means of a backpropagation-through-time algorithm, and obtaining a standard Chinese character recognition model;
acquiring non-standard Chinese character training samples, and dividing the non-standard Chinese character training samples into batches according to a preset batch count;
inputting the batched non-standard Chinese character training samples into the standard Chinese character recognition model for training, obtaining the forward output of the standard Chinese character recognition model, updating the network parameters of the standard Chinese character recognition model according to that forward output by means of a backpropagation-through-time algorithm, and obtaining an adjusted Chinese handwriting recognition model;
acquiring a Chinese character test sample, recognizing the Chinese character test sample with the adjusted Chinese handwriting recognition model, obtaining the error characters whose recognition results do not match the real results, and taking all the error characters as an error-character training sample;
and inputting the error-character training sample into the adjusted Chinese handwriting recognition model for training, obtaining the forward output of the adjusted Chinese handwriting recognition model, updating the network parameters of the adjusted Chinese handwriting recognition model according to that forward output by means of a backpropagation-through-time algorithm based on batch gradient descent, and obtaining the target Chinese handwriting recognition model.
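The training stages above can be sketched in outline as follows. `Model`, `train_pipeline`, and the sample format are hypothetical stand-ins, not the patent's implementation; a real model would be a bidirectional LSTM trained with backpropagation through time.

```python
# Hypothetical sketch of the training flow described above. `Model` is a
# stand-in; its predict() simply misreads samples marked "sloppy".
class Model:
    def __init__(self):
        self.stages = []  # records (n_samples, n_batches, full_batch) per stage

    def train(self, samples, num_batches, full_batch=False):
        self.stages.append((len(samples), num_batches, full_batch))
        return self

    def predict(self, sample):
        return "?" if sample.get("sloppy") else sample["label"]

def train_pipeline(standard, nonstandard, test_samples, num_batches=5):
    model = Model().train(standard, num_batches)       # standard-character model
    model.train(nonstandard, num_batches)              # adjusted handwriting model
    errors = [s for s in test_samples                  # collect misrecognized chars
              if model.predict(s) != s["label"]]
    if errors:                                         # full-batch retraining pass
        model.train(errors, 1, full_batch=True)
    return model

standard = [{"label": "永"}] * 10
handwritten = [{"label": "永", "sloppy": i % 2 == 0} for i in range(10)]
model = train_pipeline(standard, handwritten, handwritten)
```

The final stage retrains only on the misrecognized characters with a single full batch, mirroring the batch-gradient-descent step of the method.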
A handwriting model training apparatus comprising:
a standard Chinese character training sample acquisition module, used for acquiring standard Chinese character training samples and dividing them into batches according to a preset batch count;
a standard Chinese character recognition model acquisition module, used for inputting the batched standard Chinese character training samples into a bidirectional long short-term memory neural network for training, obtaining the forward output of the network, updating the network parameters of the network according to that forward output by means of a backpropagation-through-time algorithm, and obtaining a standard Chinese character recognition model;
a non-standard Chinese character training sample acquisition module, used for acquiring non-standard Chinese character training samples and dividing them into batches according to a preset batch count;
an adjusted Chinese handwriting recognition model acquisition module, used for inputting the batched non-standard Chinese character training samples into the standard Chinese character recognition model for training, obtaining the forward output of the standard Chinese character recognition model, updating its network parameters according to that forward output by means of a backpropagation-through-time algorithm, and obtaining the adjusted Chinese handwriting recognition model;
an error-character training sample acquisition module, used for acquiring a Chinese character test sample, recognizing it with the adjusted Chinese handwriting recognition model, obtaining the error characters whose recognition results do not match the real results, and taking all the error characters as an error-character training sample;
and a target Chinese handwriting recognition model acquisition module, used for inputting the error-character training sample into the adjusted Chinese handwriting recognition model for training, obtaining the forward output of the adjusted Chinese handwriting recognition model, updating its network parameters according to that forward output by means of a backpropagation-through-time algorithm based on batch gradient descent, and obtaining the target Chinese handwriting recognition model.
The embodiments of the invention also provide a handwritten character recognition method, device, equipment, and medium to address the low accuracy of current handwritten character recognition.
A method of handwriting recognition, comprising:
acquiring a Chinese character to be recognized, recognizing it with a target Chinese handwriting recognition model, and obtaining the output value of the Chinese character to be recognized in the target Chinese handwriting recognition model, the target Chinese handwriting recognition model being obtained with the handwriting model training method above;
and obtaining a target probability output value according to that output value and a preset Chinese semantic lexicon, and obtaining the recognition result of the Chinese character to be recognized based on the target probability output value.
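One simple way to combine the model's output with a semantic lexicon is sketched below. The bigram frequency table and the fixed linear interpolation are assumptions made for illustration, not details specified by the patent.

```python
# Hypothetical sketch: rescore the model's per-character probabilities
# with a semantic lexicon, here assumed to be a bigram frequency table.
def rescore(model_probs, previous_char, bigram_freq, alpha=0.5):
    """Blend model probability with lexicon bigram frequency and pick
    the character with the highest blended score."""
    scores = {}
    for char, p in model_probs.items():
        lex = bigram_freq.get((previous_char, char), 0.0)
        scores[char] = (1 - alpha) * p + alpha * lex
    return max(scores, key=scores.get)

model_probs = {"好": 0.40, "奷": 0.45}      # model slightly prefers a wrong rare char
bigrams = {("你", "好"): 0.9, ("你", "奷"): 0.0}
best = rescore(model_probs, "你", bigrams)  # the lexicon tips the decision to "好"
```

Here the context character "你" makes "好" far more plausible, so the semantic lexicon overrides the model's marginal preference for the visually similar rare character.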
The embodiment of the invention provides a handwritten character recognition device, which comprises:
an output value acquisition module, used for acquiring a Chinese character to be recognized, recognizing it with a target Chinese handwriting recognition model, and obtaining the output value of the Chinese character to be recognized in the target Chinese handwriting recognition model, the target Chinese handwriting recognition model being obtained with the handwriting model training method above;
and a recognition result acquisition module, used for obtaining a target probability output value according to that output value and a preset Chinese semantic lexicon, and obtaining the recognition result of the Chinese character to be recognized based on the target probability output value.
A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the above-described handwriting model training method when the computer program is executed.
A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the handwriting recognition method described above when the computer program is executed.
An embodiment of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the handwriting model training method described above.
An embodiment of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the handwritten word recognition method described above.
With the handwriting model training method, device, equipment, and medium provided by the embodiments of the invention, standard Chinese character training samples are acquired, divided into batches according to a preset batch count, and used for training, yielding a standard Chinese character recognition model capable of recognizing standard Chinese characters. The standard Chinese character recognition model is then fine-tuned with the batched non-standard Chinese character training samples, so that, while retaining its ability to recognize standard characters, the resulting adjusted Chinese handwriting recognition model learns the deep features of handwritten Chinese characters and recognizes them better. Next, the adjusted Chinese handwriting recognition model is used to recognize a Chinese character test sample; the error characters whose recognition results do not match the real results are collected, all of them are fed back into the adjusted Chinese handwriting recognition model as an error-character training sample, and the network parameters are updated with a backpropagation-through-time algorithm based on batch gradient descent, yielding the target Chinese handwriting recognition model. Training on the error characters further improves recognition accuracy and reduces the over-learning and over-weakening effects that can arise during model training.
Each model is trained on a bidirectional long short-term memory neural network, which exploits the sequential characteristics of Chinese characters and learns their deep features from both the forward and the reverse direction of the sequence, enabling recognition of varied Chinese handwriting. The standard Chinese character recognition model and the adjusted Chinese handwriting recognition model are trained with a backpropagation-through-time algorithm based on mini-batch gradient descent (taking the standard Chinese character recognition model as an example: the standard Chinese character training samples are divided into batches according to a preset batch count, and the model is trained on the batched samples). With a large number of training samples this improves both training efficiency and training effect: relative to a single training sample, the error retains a degree of global character within a certain range, so the minimum of the error function is easier to find. The target Chinese handwriting recognition model is trained with a backpropagation-through-time algorithm based on full-batch gradient descent, so the parameters in the model are updated comprehensively according to all the errors produced, improving the recognition accuracy of the resulting model.
In the handwritten character recognition method, device, equipment, and medium provided by the embodiments of the invention, the Chinese character to be recognized is input into the target Chinese handwriting recognition model, and the recognition result is obtained in combination with a preset Chinese semantic lexicon. When the target Chinese handwriting recognition model is used to recognize Chinese handwriting, an accurate recognition result can be obtained.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments of the present invention will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a diagram of an application environment of a handwriting model training method according to an embodiment of the invention;
FIG. 2 is a flow chart of a handwriting model training method according to an embodiment of the invention;
FIG. 3 is a flowchart showing step S10 in FIG. 2;
FIG. 4 is a flowchart showing step S20 in FIG. 2;
FIG. 5 is a flowchart showing step S50 in FIG. 2;
FIG. 6 is a schematic diagram of a handwriting model training apparatus according to an embodiment of the invention;
FIG. 7 is a flow chart of a method for recognizing handwritten characters in an embodiment of the invention;
FIG. 8 is a schematic diagram of a handwriting recognition device according to an embodiment of the invention;
FIG. 9 is a schematic diagram of a computer device in accordance with an embodiment of the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 1 shows the application environment of the handwriting model training method provided by an embodiment of the present invention. It comprises a server and a client connected through a network. The client is a device capable of human-computer interaction with a user, including but not limited to computers, smartphones, and tablets; the server can be implemented as a standalone server or as a cluster of servers. The handwriting model training method provided by the embodiment is applied on the server.
As shown in fig. 2, fig. 2 shows a flowchart of a handwriting model training method according to an embodiment of the invention, the handwriting model training method includes the following steps:
s10: and acquiring standard Chinese character training samples, and dividing the standard Chinese character training samples into batches according to preset batches.
A standard Chinese character training sample is a training sample obtained from standard characters, i.e. characters in fonts such as regular script, the Song typeface, or clerical script; regular script or the Song typeface is usually chosen.
In this embodiment, standard Chinese character training samples are acquired and divided into batches according to a preset batch count; for example, dividing them into 5 preset batches yields 5 batches of sub-samples for training. The standard Chinese character training samples are obtained from standard characters in fonts such as regular script, the Song typeface, or clerical script; this embodiment takes the Song typeface as an example. It should be understood that "standard characters" here refers to characters in the mainstream fonts of current Chinese typography, such as the Song typeface used by default in computer input methods, or the regular script commonly used for copybooks; characters in fonts rarely used in daily life, such as cursive script or the Youyuan font, are not standard characters in this sense.
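The batching step can be sketched as follows. This is a minimal illustration: the sample strings and the ceiling-division batch sizing are arbitrary choices, not the patent's specification.

```python
# Split training samples into a preset number of batches (here 5), as in
# the example above. Sample values are placeholders.
def make_batches(samples, num_batches=5):
    size = -(-len(samples) // num_batches)   # ceiling division
    return [samples[i:i + size] for i in range(0, len(samples), size)]

samples = [f"sample_{i}" for i in range(23)]
batches = make_batches(samples, 5)           # 5 batches; the last is smaller
```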
S20: inputting the classified standard Chinese character training samples into a bidirectional long-short-time memory neural network for training, obtaining the forward output of the bidirectional long-short-time memory neural network, updating the network parameters of the bidirectional long-short-time memory neural network by adopting a time-dependent back propagation algorithm according to the forward output of the bidirectional long-short-time memory neural network, and obtaining the standard Chinese character recognition model.
A Bi-directional Long Short-Term Memory (BiLSTM) network is a recurrent neural network that trains on data with sequential characteristics from both the forward and the reverse direction of the sequence. Because it can relate each element to both the preceding and the following data, it can learn the deep features of sequential data from the context on both sides. Training a BiLSTM on data with sequential characteristics yields a recognition model for that data. In one embodiment, the batched standard Chinese character training samples are input into the bidirectional long short-term memory neural network for training, and a mini-batch gradient descent method is used when updating the weights and biases of the network with the backpropagation algorithm. Mini-batch Gradient Descent (MBGD) is a processing method that accumulates the errors produced during training over preset batches and performs one parameter update per batch using the accumulated error. The Back Propagation Through Time algorithm (BPTT) is a training method for recurrent neural networks, used to update and adjust the network parameters between nodes. Adjusting the network parameters with BPTT requires finding the minimum of the error function; in this embodiment, that minimum is sought with the mini-batch gradient descent method.
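The mini-batch update scheme can be illustrated on a toy model. This fits a one-parameter linear model rather than a BiLSTM, and the data, learning rate, and batch size are arbitrary; the point is only that the gradient is accumulated over each batch and the parameter is updated once per batch.

```python
# Toy illustration of mini-batch gradient descent (not the patent's
# BiLSTM): fit y = w * x by accumulating the squared-error gradient over
# each batch and updating the weight once per batch.
def mbgd(data, batch_size=4, lr=0.05, epochs=50):
    w = 0.0
    for _ in range(epochs):
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            # gradient of the mean squared error accumulated over the batch
            grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            w -= lr * grad  # one update per batch, not per sample
    return w

data = [(x, 3.0 * x) for x in (0.1, 0.2, 0.5, 1.0, 1.5, 2.0)]
w = mbgd(data)  # converges near the true slope 3.0
```

Averaging the gradient over a batch smooths out the noise of individual samples, which is the "global character of the error" the description refers to.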
In this embodiment, the batched standard Chinese character training samples are input into the bidirectional long short-term memory neural network for training, and the network parameters are updated with the mini-batch-based backpropagation-through-time algorithm to obtain the standard Chinese character recognition model. During training the model learns the deep features of the standard Chinese character training samples, so it can accurately recognize standard Chinese characters. Note that, for the purposes of character recognition, standard characters differ little across fonts such as regular script, the Song typeface, or clerical script, so the standard Chinese character recognition model can accurately recognize the standard characters of any of these fonts and obtain accurate recognition results.
S30: obtaining non-standard Chinese character training samples, and dividing the non-standard Chinese character training samples into batches according to preset batches.
A non-standard Chinese character training sample is a training sample obtained from handwritten Chinese characters, i.e. characters written by hand following the forms of the standard characters of fonts such as regular script, the Song typeface, or clerical script. The non-standard samples differ from the standard samples in that they are derived from handwriting, which, being handwritten, naturally comprises many different letterform variations.
In this embodiment, the server acquires non-standard Chinese character training samples, which carry the features of handwritten Chinese characters. After acquisition, the non-standard Chinese character training samples are divided into batches according to a preset batch count; for example, dividing them into 5 preset batches yields 5 batches of sub-samples for training.
S40: inputting the classified non-standard Chinese character training samples into a standard Chinese character recognition model for training, obtaining the forward output of the standard Chinese character recognition model, updating the network parameters of the standard Chinese character recognition model by adopting a time-dependent back propagation algorithm according to the forward output of the standard Chinese character recognition model, and obtaining the regulated Chinese handwriting character recognition model.
In this embodiment, the batched non-standard Chinese character training samples are input into the standard Chinese character recognition model for fine-tuning, and its network parameters are updated with a backpropagation-through-time algorithm to obtain the adjusted Chinese handwriting recognition model. The standard Chinese character recognition model can recognize standard Chinese characters, but its accuracy on handwritten Chinese characters is not high. Training on the batched non-standard samples therefore adjusts the model's parameters (weights and biases) on top of its existing ability to recognize standard characters. The adjusted Chinese handwriting recognition model learns the deep features of handwritten Chinese characters while retaining those of standard characters, so it can effectively recognize both at once and produce recognition results with higher accuracy.
When a bidirectional long short-term memory neural network recognizes characters, it judges by the pixel distribution of the characters. Real-life handwritten Chinese characters differ from the standard characters, but the difference is much smaller than the difference from any non-corresponding standard character: for example, a handwritten "我" differs far less in pixel distribution from the standard "我" than a handwritten "你" does from the standard "我". Even though a handwritten Chinese character differs somewhat from its corresponding standard character, that difference is much smaller than for non-corresponding standard characters, so the recognition result can be determined by the principle of greatest similarity (i.e. least difference). The adjusted Chinese handwriting recognition model, trained on a bidirectional long short-term memory neural network, combines the deep features of standard characters and handwritten Chinese characters and can effectively recognize handwritten Chinese characters from those deep features.
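The least-difference principle can be illustrated with toy binary pixel grids. The 3x3 patterns below are placeholders, not real character images, and counting differing pixels is only the simplest possible similarity measure.

```python
# Illustration of the "greatest similarity / least difference" idea: a
# handwritten pattern is assigned to the reference character whose pixel
# pattern differs from it the least. Patterns are toy 3x3 binary grids.
def pixel_difference(a, b):
    return sum(pa != pb for pa, pb in zip(a, b))

references = {
    "一": (0, 0, 0,
           1, 1, 1,
           0, 0, 0),
    "丨": (0, 1, 0,
           0, 1, 0,
           0, 1, 0),
}

# a slightly sloppy handwritten horizontal stroke
handwritten = (0, 0, 0,
               1, 1, 0,
               0, 0, 1)

best = min(references, key=lambda c: pixel_difference(references[c], handwritten))
```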
For steps S20 and S30-S40, updating the network parameters by error backpropagation with a backpropagation-through-time algorithm improves the efficiency and effect of network training; relative to a single training sample, the error retains a degree of global character within a certain range, so the minimum of the error function is easier to find and the network can be trained more effectively.
It should be noted that the order of steps S20 and steps S30-S40 in the present embodiment is not exchangeable, and step S20 is executed first and steps S30-S40 are executed second. The standard Chinese training sample is firstly adopted to train the two-way long-short-time memory neural network, so that the acquired standard Chinese character recognition model has better recognition capability, and the standard Chinese character recognition model has accurate recognition result. And the fine adjustment of the steps S30-S40 is performed on the basis of having good recognition capability, so that the training-acquired Chinese handwriting recognition model can effectively recognize the handwriting Chinese characters according to the deep features of the learned handwriting Chinese characters, and the handwriting Chinese characters can be recognized with accurate recognition results. If steps S30-S40 are performed first or steps S30-S40 are performed only, since the handwritten Chinese characters have various forms, the features learned by training the handwritten Chinese characters directly cannot better reflect the features of the handwritten Chinese characters, so that the model is learned "bad" at first, and it is difficult to perform adjustment later so that the handwritten Chinese characters are recognized with accurate recognition results. Although the handwritten Chinese characters of everyone are different, most of them are similar to standard specification characters (such as the handwritten Chinese characters imitate the standard specification characters). 
Therefore, starting model training from the standard characters better matches this objective situation; the training effect is better than training directly on handwritten Chinese characters, and the corresponding adjustment can then be carried out on top of a "good" model, yielding an adjusted Chinese handwriting recognition model with a high recognition rate for handwritten Chinese characters.
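The two-stage ordering argued for above (standard characters first, handwritten characters second) can be sketched as a minimal training driver. The function and parameter names here are illustrative stand-ins, not the patent's own API; `fit_fn` abstracts one training pass over a batch.

```python
# Hypothetical sketch of the two-stage ordering (names are illustrative).
def train_two_stage(model, standard_batches, handwritten_batches, fit_fn):
    # Stage 1 (step S20): learn clean, canonical glyph features first,
    # so the model starts from a "good" state.
    for batch in standard_batches:
        fit_fn(model, batch)
    # Stage 2 (steps S30-S40): fine-tune on handwritten samples so the
    # model adapts to writing-style variation on top of that base.
    for batch in handwritten_batches:
        fit_fn(model, batch)
    return model
```

Reversing the two loops would correspond to the "learned badly at first" failure mode the text describes.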
S50: A Chinese character sample to be tested is obtained, the adjusted Chinese handwriting recognition model is used to recognize it, the error words whose recognition results differ from the real results are collected, and all error words are taken as the error-word training sample.
The Chinese character sample to be tested is a training sample for testing, built from standard specification characters and handwritten Chinese characters. The standard specification characters used in this step are identical to those used for training in step S20 (each character in a font such as regular script or Song typeface is uniquely determined). The handwritten Chinese characters, however, may differ from those used for training in steps S30-S40: handwritten Chinese characters in different hands are not identical, and each character can appear in many written forms. To keep the test set distinct from the non-standard Chinese character training sample used in steps S30-S40 and avoid a situation where the model has merely fitted its training data, this step generally uses handwritten Chinese characters different from those of steps S30-S40.
In this embodiment, the trained adjusted Chinese handwriting recognition model is used to recognize the Chinese character sample to be tested, which comprises standard characters with their preset label values (i.e. real results) and handwritten Chinese characters with their preset label values. Standard characters and handwritten Chinese characters can be fed into the adjusted Chinese handwriting recognition model in mixed order. When the adjusted Chinese handwriting recognition model recognizes the Chinese character sample to be tested, the corresponding recognition results are obtained, and all error words, whose recognition results disagree with the label values (real results), are taken as the error-word training sample. The error-word training sample reflects where the recognition accuracy of the adjusted Chinese handwriting recognition model is still insufficient, so the model can be further updated, optimized and adjusted against it.
The recognition accuracy of the adjusted Chinese handwriting recognition model is in fact shaped jointly by the standard Chinese character training sample and the non-standard Chinese character training sample: the network parameters (weights and biases) are first updated with the standard Chinese character training sample, and then updated again with the non-standard Chinese character training sample, so the model may over-learn the characteristics of the non-standard Chinese character training sample. As a result, the adjusted Chinese handwriting recognition model reaches very high recognition accuracy on the non-standard Chinese character training sample (including its handwritten Chinese characters), but the over-learned characteristics impair its accuracy on handwritten Chinese characters outside that sample. Recognizing the Chinese character sample to be tested with the adjusted Chinese handwriting recognition model exposes the errors produced by this over-learning; these errors are embodied in the error words, so the network parameters of the adjusted Chinese handwriting recognition model can be further updated and optimized according to the error words.
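The error-word collection of step S50 reduces to comparing each recognition result against its label value and keeping the mismatches. A minimal sketch, with `model_predict` standing in for the adjusted Chinese handwriting recognition model (the names are illustrative, not the patent's own):

```python
def collect_error_samples(model_predict, test_samples):
    """Return the samples whose prediction differs from the label.

    test_samples:  iterable of (feature_matrix, true_label) pairs.
    model_predict: callable mapping a feature matrix to a predicted label
                   (a stand-in for the adjusted recognition model).
    """
    errors = []
    for features, label in test_samples:
        if model_predict(features) != label:  # recognition != real result
            errors.append((features, label))
    return errors  # the error-word training sample for step S60
```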
S60: The error-word training sample is input into the adjusted Chinese handwriting recognition model for training, and the network parameters of the adjusted Chinese handwriting recognition model are updated with a time-dependent back propagation algorithm based on batch gradient descent, to obtain the target Chinese handwriting recognition model.
In this embodiment, the error-word training sample is input into the adjusted Chinese handwriting recognition model for training. The error-word training sample reflects the inaccurate recognition that occurs when the adjusted Chinese handwriting recognition model, having over-learned the characteristics of the non-standard Chinese character training sample, recognizes handwritten Chinese characters outside that sample. In addition, because the model is trained first on the standard Chinese character training sample and only then on the non-standard one, the originally learned characteristics of the standard characters can be excessively weakened, affecting the standard-character recognition the model initially established. Training on the error-word training sample addresses both the over-learning and the over-weakening: according to the recognition-accuracy problems it reflects, the adverse effects of over-learning and over-weakening produced in the original training process can be largely eliminated. Specifically, when training on the error-word training sample, a time-dependent back propagation algorithm based on batch gradient descent is used; the network parameters of the adjusted Chinese handwriting recognition model are updated according to this algorithm, yielding the target Chinese handwriting recognition model, the finally trained model capable of recognizing Chinese handwriting.
When the network parameters are updated, the error-word training sample has a small sample size (there are few error words), so all the errors it produces during training of the bidirectional long short-term memory neural network can be propagated back with the time-dependent back propagation algorithm based on batch gradient descent. Every error produced is thus folded into the network update, the bidirectional long short-term memory neural network is trained comprehensively, and the recognition accuracy of the target Chinese handwriting recognition model is improved.
It can be understood that the bidirectional long short-term memory neural network used to train each model can exploit the sequence characteristics of Chinese characters, learning their deep features in both the forward and the reverse direction of the sequence, and thereby recognize different Chinese handwriting.
It should be noted that in this embodiment, steps S20 and S40 use a time-dependent back propagation algorithm based on a small-batch gradient, while step S60 uses a time-dependent back propagation algorithm based on batch gradient descent.
In step S20, the process of updating the network parameters of the bidirectional long-short-term memory neural network by using the time-dependent back propagation algorithm based on the small batch gradient specifically includes the following steps:
The binarized pixel value feature matrix corresponding to each training sample (each character) in the standard Chinese character training sample is obtained, and all binarized pixel value feature matrices are divided into several batches according to a preset batch size. The batches are then input into the bidirectional long short-term memory neural network to obtain the forward output corresponding to each binarized pixel value feature matrix; the errors are accumulated per preset batch to obtain the accumulated error of each batch; and back propagation based on gradient descent is performed with each batch's accumulated error, updating the network parameters of the bidirectional long short-term memory network. The process of computing a batch's accumulated error and updating the network parameters with it is repeated until the error is smaller than the stop-iteration threshold $\epsilon_1$; the loop then ends, the updated network parameters are obtained, and the standard Chinese character recognition model results.
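The mini-batch loop just described can be sketched as a generic skeleton. Here `grad_fn` is a stand-in for one BPTT pass through the bidirectional LSTM (returning the batch's accumulated error and gradient); the names and the toy stopping logic are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def minibatch_train(samples, labels, params, grad_fn, lr=0.1,
                    batch_size=32, eps=1e-3, max_epochs=100):
    """Mini-batch gradient-descent skeleton matching the loop in the text:
    split samples into preset batches, accumulate the error per batch,
    back-propagate once per batch, and stop once every batch's error
    falls below the stop-iteration threshold eps."""
    n = len(samples)
    for _ in range(max_epochs):
        worst = 0.0
        for start in range(0, n, batch_size):
            xb = samples[start:start + batch_size]
            yb = labels[start:start + batch_size]
            err, grad = grad_fn(params, xb, yb)   # accumulated batch error
            params = params - lr * grad           # one update per batch
            worst = max(worst, err)
        if worst < eps:                           # stop-iteration threshold
            break
    return params
```

A one-parameter least-squares problem is enough to exercise the loop; the real use would plug in the BLSTM error and gradient.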
The process of updating the network parameters of the bidirectional long short-term memory neural network with the time-dependent back propagation algorithm based on the small-batch gradient in step S40 is similar to that of step S20 and is not repeated here.
In step S60, the process of updating the network parameters of the bidirectional long-short-term memory neural network by using the time-dependent back propagation algorithm based on batch gradient descent specifically includes the following steps:
The binarized pixel value feature matrix corresponding to one training sample in the error-word training sample is obtained and input into the adjusted Chinese handwriting recognition model (at bottom, a bidirectional long short-term memory neural network) to obtain its forward output, and the error between the forward output and the real result is computed. The binarized pixel value feature matrices of the remaining training samples are then obtained and input in turn, the error between each forward output and its real result is computed, and the errors are accumulated to give the total error of the adjusted Chinese handwriting recognition model over the error-word training sample. One back propagation based on gradient descent is performed with this total error, updating the network parameters. The process of computing the total error and updating the network parameters with it is repeated until the error is smaller than the stop-iteration threshold $\epsilon_2$; the loop then ends, the updated network parameters are obtained, and the target Chinese handwriting recognition model results.
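The full-batch loop of step S60 differs from the mini-batch one only in accumulating the error over every sample before a single update. Again `grad_fn` stands in for one BPTT pass through the adjusted recognition model; the names are illustrative.

```python
import numpy as np

def batch_train(samples, labels, params, grad_fn, lr=0.1, eps=1e-6,
                max_iters=10000):
    """Full-batch variant used for the small error-word set (step S60):
    accumulate the error over EVERY sample, then perform a single
    gradient-descent update per pass, until the total error drops
    below the stop-iteration threshold eps."""
    for _ in range(max_iters):
        total_err = 0.0
        total_grad = np.zeros_like(params)
        for x, y in zip(samples, labels):      # sum over all samples
            err, grad = grad_fn(params, x, y)
            total_err += err
            total_grad += grad
        params = params - lr * total_grad      # one update per full pass
        if total_err < eps:                    # stop-iteration threshold
            break
    return params
```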
It can be appreciated that for steps S20 and S40, because the number of training samples used for model training is relatively large, a time-dependent back propagation algorithm based on batch gradient descent would hurt the efficiency and effect of network training, or even prevent training from proceeding normally, making it difficult to train effectively. Updating errors by back propagation with the time-dependent back propagation algorithm based on the small-batch gradient improves the efficiency and effect of network training, ensures that the error has a global character within a certain range relative to a single training sample, and makes the minimum of the error function easier to find, so the network is trained more effectively.
For step S60, the error-word training sample has a small sample size (few error words), so the time-dependent back propagation algorithm based on batch gradient descent can propagate back all the errors the error-word training sample produces during training of the bidirectional long short-term memory neural network; every error produced is folded into the network update, and the network is trained comprehensively. Compared with the time-dependent back propagation algorithm based on the small-batch gradient, the batch-gradient-descent variant computes the exact gradient and trains the bidirectional long short-term memory neural network comprehensively, whereas the small-batch variant updates the network parameters from the accumulated error of one preset batch at a time and, despite having a global character within a certain range, is still not as accurate in training. Adopting the time-dependent back propagation algorithm based on batch gradient descent therefore improves the accuracy of model training, so that the target Chinese handwriting recognition model obtained by training has accurate recognition ability.
In steps S10-S60, training on the batched standard Chinese character training sample yields the standard Chinese character recognition model, and the batched non-standard Chinese character training sample is used to update it into the adjusted Chinese handwriting recognition model, so that the updated model, already equipped with standard-character recognition ability, learns the deep features of handwritten Chinese characters through training and updating and can recognize handwritten Chinese characters better. Then the adjusted Chinese handwriting recognition model recognizes the Chinese character sample to be tested, the error words whose recognition results do not match the real results are collected, and all error words are input, as the error-word training sample, into the adjusted Chinese handwriting recognition model for training and updating, giving the target Chinese handwriting recognition model. Training on the error-word training sample largely eliminates the adverse effects of the over-learning and over-weakening produced in the original training process and further optimizes the recognition accuracy.
In steps S10-S60, a bidirectional long short-term memory neural network is used to train each model; this network can exploit the sequence characteristics of the glyphs and learn their deep features in both the forward and the reverse direction of the sequence. Training the standard Chinese character recognition model and the adjusted Chinese handwriting recognition model uses a time-dependent back propagation algorithm based on the small-batch gradient, which keeps training efficient and effective when training samples are numerous, ensures the error has a global character within a certain range relative to a single training sample, and makes the minimum of the error function easier to find. Training the target Chinese handwriting recognition model uses a time-dependent back propagation algorithm based on batch gradient descent; batch gradient descent guarantees a full update of the model parameters, propagates back the errors produced during training, and updates the parameters comprehensively according to those errors, improving the recognition accuracy of the resulting model.
In one embodiment, as shown in fig. 3, step S10 of obtaining a standard Chinese character training sample and batching it according to a preset batch size specifically includes the following steps:
S11: The pixel value feature matrix of each Chinese character in the Chinese character training sample to be processed is obtained, and each pixel value in the pixel value feature matrix is normalized to obtain the normalized pixel value feature matrix of each Chinese character. The normalization formula is

$y = \dfrac{x - MinValue}{MaxValue - MinValue}$

where $MaxValue$ is the maximum pixel value in the pixel value feature matrix of each Chinese character, $MinValue$ is the minimum pixel value in that matrix, $x$ is the pixel value before normalization, and $y$ is the pixel value after normalization.
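The min-max normalization of step S11 is a one-liner in numpy; the guard for a constant image is an added assumption not spelled out in the text.

```python
import numpy as np

def normalize_pixels(matrix):
    """Min-max normalization from step S11:
    y = (x - MinValue) / (MaxValue - MinValue)."""
    m = np.asarray(matrix, dtype=float)
    min_v, max_v = m.min(), m.max()
    if max_v == min_v:                 # assumed guard: constant image
        return np.zeros_like(m)
    return (m - min_v) / (max_v - min_v)
```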
The Chinese character training sample to be processed refers to the initially acquired, unprocessed training sample.
In this embodiment, the pixel value feature matrix of each Chinese character in the Chinese character training sample to be processed is obtained; this matrix represents the features of the corresponding character, the pixel values being the feature representation. The computer device can recognize the pixel value feature matrix form and read the values in it. After the server acquires the pixel value feature matrix, it applies the normalization formula to the pixel values of each Chinese character in the feature matrix to obtain the normalized pixel value features of each Chinese character. Compressing the pixel value feature matrices of all Chinese characters into the same range in this way speeds up the computations involving the matrices and improves the training efficiency of the standard Chinese character recognition model.
S12: The pixel values in the normalized pixel value feature matrix of each Chinese character are divided into two classes of pixel values, the binarized pixel value feature matrix of each Chinese character is established from the two classes, the binarized pixel value feature matrices of all the characters are combined into the standard Chinese character training sample, and the standard Chinese character training sample is batched according to the preset batch size.
In this embodiment, the pixel values in the normalized pixel value feature matrix of each Chinese character are divided into two classes, meaning the matrix then contains only pixel value A or pixel value B. Specifically, a pixel value greater than or equal to 0.5 in the normalized pixel value feature matrix can be set to 1 and a pixel value less than 0.5 set to 0, establishing the binarized pixel value feature matrix of each Chinese character, which contains only 0s and 1s. After the binarized pixel value feature matrices are established, the characters corresponding to them are taken as the standard Chinese character training sample, which is batched according to the preset batch size. An image containing a character consists of character pixels and blank pixels; the pixels on the strokes are usually darker, so in the binarized pixel value feature matrix a "1" represents a character pixel and a "0" a blank pixel of the image. It can be understood that establishing the binarized pixel value feature matrix further simplifies the feature representation: each Chinese character can be represented and distinguished by a matrix of 0s and 1s alone, which speeds up the computer's processing of the Chinese character feature matrices and further improves the training efficiency of the standard Chinese character recognition model.
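The two-class division of step S12 is a simple threshold at 0.5, exactly as the text states:

```python
import numpy as np

def binarize(normalized, threshold=0.5):
    """Step S12: values >= 0.5 become 1 (character pixels), values < 0.5
    become 0 (blank pixels), giving the binarized feature matrix."""
    return (np.asarray(normalized) >= threshold).astype(int)
```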
In steps S11-S12, the Chinese character training sample to be processed is normalized and divided into two classes of values, the binarized pixel value feature matrix of each Chinese character is obtained, and the characters corresponding to these matrices are taken as the standard Chinese character training sample, which can markedly shorten the time needed to train the standard Chinese character recognition model.
It will be appreciated that the actual input to the bidirectional long short-term memory neural network for training is the set of distinct binarized pixel value feature matrices, each representing its corresponding Chinese character. Chinese characters carry spatial sequence features, and these are preserved in the binarized pixel value feature matrix, so the bidirectional long short-term memory neural network can train on these matrices and learn the deep features of the Chinese characters from the sequential correlations in both directions.
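Assuming, as the text later states for step S21, that the forward direction reads the feature-matrix columns 1 to N and the reverse direction reads columns N to 1, the two input sequences for the bidirectional network can be formed as follows (a sketch; the column-per-timestep convention is taken from the text):

```python
import numpy as np

def column_sequences(binary_matrix):
    """Treat each column of the binarized matrix as one timestep:
    the forward pass reads columns 1..N, the backward pass N..1."""
    m = np.asarray(binary_matrix)
    forward = [m[:, t] for t in range(m.shape[1])]
    backward = forward[::-1]
    return forward, backward
```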
In one embodiment, as shown in fig. 4, step S20 of inputting the batched standard Chinese character training sample into the bidirectional long short-term memory neural network for training, obtaining the forward output of the network, and updating the network parameters of the network with a time-dependent back propagation algorithm according to that forward output to obtain the standard Chinese character recognition model specifically includes the following steps:
S21: the standard Chinese character training samples after batch division are input into a bidirectional long and short time memory neural network according to the sequence forward direction to obtain forward output F o Reversely inputting the classified standard Chinese character training samples into a bidirectional long-short-time memory neural network according to the sequence to obtain reverse output B o Adding the forward output and the reverse output to obtain a forward output T o Expressed as T o =F o +B o
The bidirectional long short-term memory neural network model comprises an input layer, a hidden layer and an output layer. The hidden layer includes an input gate, a forget gate, an output gate, the neuron state and the hidden-layer output. The forget gate determines the information to be discarded from the neuron state; the input gate determines the information to be added to the neuron; the output gate determines the information the neuron outputs. The neuron state mediates the information discarded, added and output by the gates, expressed concretely as the weights connected to the gates. The hidden-layer output determines the connection weights to the next layer (hidden layer or output layer). The network parameters of the bidirectional long short-term memory neural network model are the weights and biases connecting the neurons; these network parameters (weights and biases) determine the properties of the network, giving it a memory of sequences, and data input into the network passes through the computations of these parameters to produce the corresponding output. The network parameters discussed in this embodiment take the weights as the example; the biases are updated in the same way as the weights during the update and adjustment stage and are not described again.
In this embodiment, the batched standard Chinese character training samples are input into the bidirectional long short-term memory neural network for training; they pass through the responses of the network parameters, and the output values of each layer of the network are computed in turn, including the input gate, forget gate, output gate and neuron state (also called the cell state) of the standard Chinese character training samples in the hidden layer, with the hidden-layer output computed from the neuron and the state of the hidden layer it belongs to. Three activation functions are used in computing the output: f (sigmoid), g (tanh) and h (softmax). The activation functions convert the weighted results into classification results and introduce nonlinear factors into the neural network, letting it solve complex problems better.
The data received and processed by the neurons of the bidirectional long short-term memory neural network include the input standard Chinese character training sample $x$ and the neuron state $S$. In addition, the input to a neuron is denoted $a$ and its output $b$; the subscripts $l$, $\phi$ and $w$ denote the input gate, the forget gate and the output gate respectively; $t$ denotes the time step. The weights connecting the neuron with the input gate, the forget gate and the output gate are denoted $w_{cl}$, $w_{c\phi}$ and $w_{cw}$ respectively, and $S_c$ denotes the neuron state. $I$ is the number of input-layer neurons, $H$ the number of hidden-layer neurons and $C$ the number of neurons corresponding to the neuron state ($i$ indexes the $i$-th input-layer neuron, $h$ the $h$-th hidden-layer neuron and $c$ the neuron corresponding to the $c$-th neuron state).
The input gate receives the current input sample $x_i^t$ (the input standard Chinese character training sample), the previous output $b_h^{t-1}$ and the previous neuron state $S_c^{t-1}$. Using the weight $w_{il}$ connecting the input standard Chinese character training sample to the input gate, the weight $w_{hl}$ connecting the previous output to the input gate, and the weight $w_{cl}$ connecting the neuron with the input gate, the input-gate activation is computed according to the formula

$a_l^t = \sum_{i=1}^{I} w_{il} x_i^t + \sum_{h=1}^{H} w_{hl} b_h^{t-1} + \sum_{c=1}^{C} w_{cl} S_c^{t-1}$

and the activation function $f$ is applied to $a_l^t$ to give the input-gate output

$b_l^t = f(a_l^t)$,

a scalar in the interval $[0, 1]$. This scalar controls the proportion of current information the neuron accepts, based on the combined judgment of the current state and the past state.
The forget gate receives the current input sample $x_i^t$, the previous output $b_h^{t-1}$ and the previous state $S_c^{t-1}$. Using the weight $w_{i\phi}$ connecting the input standard Chinese character training sample to the forget gate, the weight $w_{h\phi}$ connecting the previous output to the forget gate, and the weight $w_{c\phi}$ connecting the neuron with the forget gate, the forget-gate activation is computed according to the formula

$a_\phi^t = \sum_{i=1}^{I} w_{i\phi} x_i^t + \sum_{h=1}^{H} w_{h\phi} b_h^{t-1} + \sum_{c=1}^{C} w_{c\phi} S_c^{t-1}$

and applying the activation function $f$ gives the forget-gate output

$b_\phi^t = f(a_\phi^t)$,

a scalar in the interval $[0, 1]$ that controls the proportion of past information the neuron forgets, based on the combined judgment of the current state and the past state.
The neuron receives the current input sample $x_i^t$, the previous output $b_h^{t-1}$ and the previous state $S_c^{t-1}$. Using the weight $w_{ic}$ connecting the neuron with the input standard Chinese character training sample, the weight $w_{hc}$ connecting the neuron with the previous output, and the output scalars of the input gate and the forget gate, the neuron state at the current time is computed according to the formulas

$a_c^t = \sum_{i=1}^{I} w_{ic} x_i^t + \sum_{h=1}^{H} w_{hc} b_h^{t-1}$

$S_c^t = b_\phi^t S_c^{t-1} + b_l^t \, g(a_c^t)$

where the term $g(a_c^t)$ represents the hidden-layer state, which is needed when updating the network parameters.
The output gate receives the current input sample $x_i^t$, the previous output $b_h^{t-1}$ and the current neuron state $S_c^t$. Using the weight $w_{iw}$ connecting the input standard Chinese character training sample to the output gate, the weight $w_{hw}$ connecting the previous output to the output gate, and the weight $w_{cw}$ connecting the neuron with the output gate, the output-gate activation is computed according to the formula

$a_w^t = \sum_{i=1}^{I} w_{iw} x_i^t + \sum_{h=1}^{H} w_{hw} b_h^{t-1} + \sum_{c=1}^{C} w_{cw} S_c^t$

and applying the activation function $f$ gives the output-gate output

$b_w^t = f(a_w^t)$,

a scalar in the interval $[0, 1]$.
The hidden-layer output $b_c^t$ is obtained from the output-gate value processed with the activation function and the neuron state, expressed as

$b_c^t = b_w^t \, h(S_c^t)$.

By propagating the standard Chinese character training samples through the layers in this way, the output values of every layer of the long short-term memory neural network model can be obtained.
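The gate computations of step S21 can be collected into one numpy sketch of a single timestep. This is a minimal illustration, assuming $f$ is the sigmoid and using tanh for both $g$ and the output squashing $h$ (the text names softmax for $h$ at the output layer; tanh is used here for the cell output as in common peephole-LSTM formulations). The weight names mirror the text's notation and are otherwise illustrative.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x, b_prev, s_prev, W):
    """One timestep of the peephole LSTM cell described in step S21.
    W maps the text's weight names to arrays: w_il/w_hl/w_cl (input
    gate), w_iphi/w_hphi/w_cphi (forget gate), w_iw/w_hw/w_cw (output
    gate), w_ic/w_hc (cell input). Peephole weights act elementwise."""
    b_l = sigmoid(W['w_il'] @ x + W['w_hl'] @ b_prev + W['w_cl'] * s_prev)
    b_phi = sigmoid(W['w_iphi'] @ x + W['w_hphi'] @ b_prev
                    + W['w_cphi'] * s_prev)
    s = b_phi * s_prev + b_l * np.tanh(W['w_ic'] @ x + W['w_hc'] @ b_prev)
    b_w = sigmoid(W['w_iw'] @ x + W['w_hw'] @ b_prev + W['w_cw'] * s)
    b = b_w * np.tanh(s)               # hidden-layer output b_c^t
    return b, s
```

Running the step over the columns of a binarized feature matrix, once forward and once backward, yields the two output streams the bidirectional network adds together.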
According to the above computation, the output of each layer of the bidirectional long short-term memory neural network can be calculated layer by layer until the output value of the final output layer is obtained. Since the neural network is bidirectional, the output comprises a forward and a backward component, denoted $F_o$ and $B_o$ respectively: the standard Chinese character training samples are fed through the network in the forward direction of the sequence to obtain the forward output $F_o$, and in the reverse direction of the sequence to obtain the backward output $B_o$. It can be understood that, assuming the feature matrix has $N$ columns, the forward direction reads from column 1 to column $N$ and the reverse direction from column $N$ to column 1. The output value of the output layer, i.e. the forward output $T_o$, is obtained by adding the forward and backward outputs, expressed by the formula $T_o = F_o + B_o$. The forward output reflects the response of the network parameters to the input standard Chinese character training sample; the error incurred during training can be measured from the forward output and the real result, so that the network parameters can be updated according to that error.
S22: An error function is constructed from the forward output and the real result, expressed as

$E = \dfrac{1}{2} \sum_{i=1}^{N} (x_i - y_i)^2$

where $N$ denotes the total number of standard Chinese character training samples, $x_i$ denotes the forward output of the $i$-th training sample, and $y_i$ denotes the real result of the $i$-th training sample corresponding to $x_i$.
Wherein the true result, i.e. the objective fact (also called the label value), is used to calculate the error with the forward output.
In this embodiment, the forward-propagation output obtained after the bidirectional long short-term memory neural network processes the standard Chinese character training samples deviates from the real result, so a corresponding error function can be constructed from this deviation. Training the network against the error function updates the network parameters so that, after updating, processing an input training sample yields a forward-propagation output that is the same as, or closer to, the real result. An appropriate error function may be constructed according to the actual situation; the squared-error function constructed in this embodiment in step S22 reflects the error between the forward-propagation output and the real result well.
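A minimal sketch of the squared-error computation, assuming the reading E = (1/(2N)) Σ (x_i − y_i)² with N the number of training samples (the exact normalization factor in the source is not recoverable from the text):

```python
import numpy as np

def error_function(x, y):
    """Squared error between forward-propagation outputs x_i and real results y_i."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    return np.sum((x - y) ** 2) / (2 * n)

# Only the last sample deviates from its real result.
e = error_function([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])
```

A perfect prediction gives an error of exactly 0, and the error grows quadratically with the deviation, which is what drives the gradient updates in step S23.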
S23: According to the error function, update the network parameters of the bidirectional long short-term memory neural network by a time-dependent back propagation algorithm to obtain the standard Chinese character recognition model, in which the gradient of the hidden-layer output, the gradient of the neuron state, the gradient of the input gate, the gradient of the forget gate, the gradient of the output gate and the gradient of the hidden-layer state are computed in turn.
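For reference, the textbook back-propagation-through-time gradients for an LSTM cell take the following standard form, assuming input gate $i_t$, forget gate $f_t$, output gate $o_t$, candidate (hidden-layer) state $g_t$, cell (neuron) state $c_t$ and hidden output $h_t$, with sigmoid gates and tanh activations; these are the usual definitions and not necessarily the exact expressions intended here:

```latex
\begin{aligned}
\delta h_t &= \frac{\partial E}{\partial h_t} && \text{(gradient of the hidden-layer output)}\\
\delta c_t &= \delta h_t \odot o_t \odot \bigl(1-\tanh^2(c_t)\bigr) + \delta c_{t+1} \odot f_{t+1} && \text{(gradient of the neuron state)}\\
\delta i_t &= \delta c_t \odot g_t \odot i_t \odot (1-i_t) && \text{(input gate)}\\
\delta f_t &= \delta c_t \odot c_{t-1} \odot f_t \odot (1-f_t) && \text{(forget gate)}\\
\delta o_t &= \delta h_t \odot \tanh(c_t) \odot o_t \odot (1-o_t) && \text{(output gate)}\\
\delta g_t &= \delta c_t \odot i_t \odot (1-g_t^2) && \text{(hidden-layer state)}
\end{aligned}
```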
In this embodiment, after an appropriate error function has been constructed, a time-dependent back propagation algorithm (based on small-batch gradients) is used to update the network parameters, and the updated bidirectional long short-term memory neural network is taken as the standard Chinese character recognition model. First the gradient of the hidden-layer output is defined and the gradient of the neuron state is obtained from it; with these two gradients, the gradient of the input gate, the gradient of the forget gate, the gradient of the output gate and the gradient of the hidden-layer state can be computed accordingly. The meanings of the parameters in these formulas are described in step S21 and are not repeated here. After each gradient has been obtained, each weight is updated by subtracting the product of the learning rate and the corresponding gradient from the original weight.
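The weight-update rule at the end of step S23 (updated weight = original weight minus learning rate times gradient) can be sketched as follows; the parameter names and the learning rate value are illustrative only:

```python
import numpy as np

def update_weights(weights, gradients, learning_rate=0.01):
    """One gradient-descent step: w_new = w - learning_rate * grad(w)."""
    return {name: w - learning_rate * gradients[name]
            for name, w in weights.items()}

weights = {"W_i": np.array([0.5, -0.2]),   # e.g. input-gate weights
           "W_f": np.array([0.1, 0.3])}    # e.g. forget-gate weights
grads = {"W_i": np.array([1.0, 1.0]),
         "W_f": np.array([-2.0, 0.0])}
new_w = update_weights(weights, grads, learning_rate=0.1)
```

With a small-batch variant, the gradients would be averaged over a batch of samples before this step is applied.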
Through steps S21-S23, an error function can be constructed from the forward-propagation output obtained by feeding the standard Chinese character training samples through the bidirectional long short-term memory neural network, and the network parameters can be updated by back-propagating according to this error function, so that a standard Chinese character recognition model is obtained. The model learns the deep features of the standard Chinese character training samples and can accurately recognize standard characters.
In one embodiment, as shown in fig. 5, in step S50 the Chinese character samples to be tested are recognized by the adjusted Chinese handwriting recognition model, error words whose recognition results do not match the real results are obtained, and all error words are taken as error word training samples. This specifically includes the following steps:

S51: Input the Chinese character sample to be tested into the adjusted Chinese handwriting recognition model, and obtain the output value of each character of the Chinese character sample to be tested in the adjusted Chinese handwriting recognition model.
In this embodiment, the adjusted Chinese handwriting recognition model is used to recognize the Chinese character sample to be tested, which comprises a plurality of Chinese characters. A Chinese character library contains roughly three thousand commonly used characters, and the output layer of the adjusted Chinese handwriting recognition model produces, for each character in the library, a probability value expressing its degree of similarity to the input character of the sample to be tested. This probability value is the output value of each character of the Chinese character sample to be tested in the adjusted Chinese handwriting recognition model, and can be realized by a softmax function. Briefly, when the character "I" is input, an output value (expressed as a probability) is obtained for every character in the library in the adjusted Chinese handwriting recognition model; for example, the output value corresponding to "I" in the library may be 99.5% while the output values of the remaining characters add up to 0.5%. With the output value corresponding to each character in the library obtained by recognizing the Chinese character sample to be tested with the adjusted model, a reasonable recognition result can be derived from these output values.
S52: and selecting a maximum output value in the output values corresponding to each word, and acquiring the identification result of each word according to the maximum output value.
In this embodiment, the maximum output value of all the output values corresponding to each word is selected, and the recognition result of the word can be obtained according to the maximum output value. It can be understood that the output value directly reflects the similarity degree of the input word in the Chinese character sample to be tested and each word in the Chinese character library, and the maximum output value indicates that the word sample to be tested is closest to a certain word in the Chinese character library, and the word corresponding to the maximum output value can be the recognition result of the word, if the recognition result finally output by the input word I is I.
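A minimal sketch of the softmax scoring and maximum-value selection described in S51-S52, with a three-character library standing in for the roughly three thousand characters of a real library:

```python
import numpy as np

def recognize(logits, charset):
    """Turn raw output-layer scores into per-character probabilities with
    softmax, then take the character with the maximum output value."""
    z = np.asarray(logits, dtype=float)
    z = z - z.max()                       # shift for numerical stability
    probs = np.exp(z) / np.exp(z).sum()   # probabilities sum to 1
    best = int(np.argmax(probs))          # index of the maximum output value
    return charset[best], probs

char, probs = recognize([2.0, 9.5, 1.0], ["你", "我", "他"])
```

The character returned is the recognition result; comparing it against the real result (label) is what produces the error words collected in S53.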
S53: and obtaining error words which are inconsistent with the real result according to the identification result, and taking all error words as error word training samples.
In this embodiment, the obtained recognition result is compared with the real result (the objective fact), and the error words, those whose recognition result does not match the real result, are used as the error word training samples. It can be understood that the recognition result is only the result of recognizing the Chinese character sample to be tested with the adjusted Chinese handwriting recognition model and may differ from the real result; such differences show that the model still has shortcomings in recognition accuracy, which can be remedied by training with the error word samples so as to achieve a more accurate recognition effect.
Through steps S51-S53, the maximum output value, which reflects the degree of similarity between characters, is selected from the output values of each character of the Chinese character sample to be tested in the adjusted Chinese handwriting recognition model; the recognition result is obtained from this maximum output value, and the error word training samples are derived from the recognition result, providing an important technical premise for further optimizing recognition accuracy with the error word training samples.
In one embodiment, before step S10, that is, before the step of acquiring the standard Chinese character training samples, the handwriting model training method further includes the step of initializing the bidirectional long short-term memory neural network.
In one embodiment, initializing the bidirectional long short-term memory neural network means assigning initial values to its network parameters. If the initialized weights fall in a relatively flat region of the error surface, convergence of the training of the bidirectional long short-term memory neural network model may be abnormally slow. The network parameters may therefore be initialized to be uniformly distributed over a relatively small zero-mean interval, such as [-0.30, +0.30]. Reasonable initialization gives the network flexible adjustment capability in the initial stage, so that it can be adjusted effectively during training and the minimum of the error function can be found quickly and effectively; this benefits the updating and adjustment of the bidirectional long short-term memory neural network and helps a model trained on it achieve an accurate recognition effect in Chinese handwriting recognition.
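A minimal sketch of such an initialization (the parameter names and shapes are illustrative, not the patent's; the interval [-0.30, +0.30] follows the text):

```python
import numpy as np

def init_params(shapes, low=-0.30, high=0.30, seed=0):
    """Initialize each parameter matrix uniformly in a small zero-mean interval."""
    rng = np.random.default_rng(seed)
    return {name: rng.uniform(low, high, size=shape)
            for name, shape in shapes.items()}

params = init_params({"W_xh": (64, 128),   # input-to-hidden weights
                      "W_hh": (128, 128)}) # hidden-to-hidden weights
```

Because the interval is symmetric around 0, the initial weights have zero mean, which avoids starting training in a flat region of the error surface far from any minimum.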
In the handwriting model training method provided in this embodiment, the network parameters of the bidirectional long short-term memory neural network are initialized to be uniformly distributed over a relatively small zero-mean interval, such as [-0.30, +0.30]; this initialization allows the minimum of the error function to be found quickly and effectively and benefits the updating and adjustment of the bidirectional long short-term memory neural network. The Chinese character training samples to be processed are normalized and their pixel values divided into two classes to obtain binarized pixel-value feature matrices, and the characters corresponding to these feature matrices are used as the standard Chinese character training samples, which can significantly shorten the time needed to train the standard Chinese character recognition model. An error function is constructed from the forward-propagation output obtained by feeding the standard Chinese character training samples through the bidirectional long short-term memory neural network, and the network parameters are updated by back-propagating according to this error function, yielding a standard Chinese character recognition model that has learned the deep features of the standard Chinese character training samples and can accurately recognize standard characters. The standard Chinese character recognition model is then adjusted and updated with non-standard Chinese characters, so that the updated adjusted Chinese handwriting recognition model, on the premise of already being able to recognize standard Chinese handwriting, learns through training and updating the deep features of non-standard Chinese characters and can better recognize non-standard Chinese handwriting. Next, according to the output value of each character of the Chinese character sample to be tested in the adjusted Chinese handwriting recognition model, the maximum output value reflecting the degree of similarity between characters is selected, the recognition result is obtained from it, the error word training samples are derived from the recognition result, and all error words are taken as error word training samples and input into the adjusted Chinese handwriting recognition model for training and updating to obtain the target Chinese handwriting recognition model. Training with the error word samples largely eliminates the adverse effects of over-learning and over-weakening produced in the original training process, so that recognition accuracy can be further optimized.
In addition, in the handwriting model training method provided by this embodiment, each model is trained with a bidirectional long short-term memory neural network, which can exploit the sequential characteristics of characters and learn their deep features from both the forward and the reverse direction of the sequence, realizing the function of recognizing different Chinese handwriting. A time-dependent back propagation algorithm based on small-batch gradients is used in training the standard Chinese character recognition model and the adjusted Chinese handwriting recognition model, so that training efficiency and effect remain good even with a large number of training samples. The target Chinese handwriting recognition model is trained with a time-dependent back propagation algorithm based on batch gradient descent, so the parameters in the model can be updated fully and comprehensively according to the produced errors, improving the recognition accuracy of the obtained model.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention.
Fig. 6 shows a schematic block diagram of a handwriting model training apparatus in one-to-one correspondence with the handwriting model training method in the embodiment. As shown in fig. 6, the handwriting model training apparatus includes a standard Chinese character training sample acquiring module 10, a standard Chinese character recognition model acquiring module 20, a non-standard Chinese character training sample acquiring module 30, an adjusted Chinese handwriting recognition model acquiring module 40, an error word training sample acquiring module 50, and a target Chinese handwriting recognition model acquiring module 60. The implementation functions of these modules correspond one-to-one to the steps of the handwriting model training method in the embodiment, so detailed descriptions are omitted here.
The standard Chinese character training sample obtaining module 10 is configured to obtain standard Chinese character training samples, and divide the standard Chinese character training samples into batches according to a preset batch.
The standard Chinese character recognition model obtaining module 20 is configured to input the batched standard Chinese character training samples into a bidirectional long-short-time memory neural network for training, obtain forward output of the bidirectional long-short-time memory neural network, update network parameters of the bidirectional long-short-time memory neural network according to the forward output of the bidirectional long-short-time memory neural network, and obtain the standard Chinese character recognition model by adopting a time-dependent back propagation algorithm.
The non-standard Chinese character training sample obtaining module 30 is configured to obtain non-standard Chinese character training samples, and divide the non-standard Chinese character training samples into batches according to a preset batch.
The adjusted Chinese handwriting recognition model acquiring module 40 is configured to input the batched non-standard Chinese character training samples into the standard Chinese character recognition model for training, obtain the forward-propagation output of the standard Chinese character recognition model, and update the network parameters of the standard Chinese character recognition model according to that output by a time-dependent back propagation algorithm to obtain the adjusted Chinese handwriting recognition model.
The error word training sample obtaining module 50 is configured to obtain a Chinese word sample to be tested, identify the Chinese word sample to be tested by using the adjusted Chinese handwriting recognition model, obtain error words with recognition results inconsistent with the real results, and take all error words as error word training samples.
The target chinese handwriting recognition model obtaining module 60 is configured to input the error word training sample into the adjusted chinese handwriting recognition model for training, obtain a forward output of the adjusted chinese handwriting recognition model, update the network parameters of the adjusted chinese handwriting recognition model according to the forward output of the adjusted chinese handwriting recognition model, and obtain the target chinese handwriting recognition model by using a time-dependent back propagation algorithm based on batch gradient descent.
Preferably, the standard Chinese character training sample acquiring module 10 includes a normalized pixel value feature matrix acquiring unit 11 and a standard Chinese character training sample acquiring unit 12.
The normalized pixel value feature matrix acquiring unit 11 is configured to acquire the pixel-value feature matrix of each Chinese character in the Chinese character training samples to be processed, and to normalize each pixel value in the batched pixel-value feature matrices according to the preset batches to obtain the normalized pixel-value feature matrix of each Chinese character, wherein the normalization formula is

y = \frac{x - MinValue}{MaxValue - MinValue}

where MaxValue is the maximum and MinValue the minimum of the pixel values in the pixel-value feature matrix of each Chinese character, x is a pixel value before normalization, and y is the pixel value after normalization.
The standard Chinese character training sample acquiring unit 12 is configured to divide the pixel values in the normalized pixel-value feature matrix of each Chinese character into two classes, establish the binarized pixel-value feature matrix of each Chinese character based on the two classes of pixel values, combine the binarized pixel-value feature matrices of the Chinese characters as the standard Chinese character training samples, and batch the standard Chinese character training samples according to the preset batches.
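A minimal sketch of the min-max normalization and two-class binarization performed by units 11 and 12; the 0.5 binarization threshold is an assumption for illustration, as the source does not state the threshold:

```python
import numpy as np

def preprocess(pixel_matrix, threshold=0.5):
    """Normalize a character's pixel-value feature matrix to [0, 1] with
    y = (x - MinValue) / (MaxValue - MinValue), then split the values
    into two classes (0 and 1) to form the binarized feature matrix."""
    x = np.asarray(pixel_matrix, dtype=float)
    min_v, max_v = x.min(), x.max()
    normalized = (x - min_v) / (max_v - min_v)
    return (normalized >= threshold).astype(int)

b = preprocess([[0, 128], [255, 64]])   # grayscale values 0..255
```

Binarized matrices are much cheaper for the network to process than raw grayscale values, which is why this step shortens training time.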
Preferably, the standard Chinese character recognition model acquiring module 20 includes a forward output acquiring unit 21, an error function constructing unit 22, and a standard Chinese character recognition model acquiring unit 23.
The forward output acquiring unit 21 is configured to input the standard Chinese character training samples into the bidirectional long short-term memory neural network in the forward direction of the sequence to obtain the forward component F_o, input them in the reverse direction of the sequence to obtain the reverse component B_o, and add the two to obtain the forward-propagation output T_o, expressed by the formula T_o = F_o + B_o.
The error function constructing unit 22 is configured to construct an error function from the forward-propagation output and the real result, the expression of the error function being

E = \frac{1}{2N}\sum_{i=1}^{N}(x_i - y_i)^2

wherein N represents the total number of training samples, x_i represents the forward-propagation output of the i-th training sample, and y_i represents the real result of the i-th training sample corresponding to x_i.
The standard Chinese character recognition model acquiring unit 23 is configured to update the network parameters of the bidirectional long short-term memory neural network by a time-dependent back propagation algorithm according to the error function to obtain the standard Chinese character recognition model, in which the gradient of the hidden-layer output, the gradient of the neuron state, the gradient of the input gate, the gradient of the forget gate, the gradient of the output gate and the gradient of the hidden-layer state are computed in turn.
Preferably, the error word training sample acquisition module 50 includes a model output value acquisition unit 51, a model recognition result acquisition unit 52, and an error word training sample acquisition unit 53.
The model output value obtaining unit 51 is configured to input the Chinese character sample to be tested into the adjusted Chinese handwriting recognition model, and obtain an output value of each word in the Chinese character sample to be tested in the adjusted Chinese handwriting recognition model.
And a model recognition result obtaining unit 52, configured to select a maximum output value from the output values corresponding to each word, and obtain a recognition result of each word according to the maximum output value.
An error word training sample obtaining unit 53, configured to obtain, according to the recognition result, error words that are not consistent with the real result, and take all error words as error word training samples.
Preferably, the handwriting model training apparatus further comprises an initializing module 70 for initializing the bidirectional long-short-term memory neural network.
Fig. 7 shows a flowchart of a handwritten character recognition method in the present embodiment. The handwritten character recognition method can be applied to computer equipment configured by institutions such as banks, investment, insurance and the like, and is used for recognizing handwritten Chinese characters, so that the aim of artificial intelligence is fulfilled. As shown in fig. 7, the handwriting recognition method includes the steps of:
S70: Acquire the Chinese characters to be recognized, recognize them with the target Chinese handwriting recognition model, and obtain the output values of the Chinese characters to be recognized in the target Chinese handwriting recognition model, the target Chinese handwriting recognition model being obtained with the handwriting model training method described above.
The Chinese characters to be recognized are the handwritten Chinese characters that need to be recognized.
In this embodiment, the Chinese characters to be recognized are acquired and input into the target Chinese handwriting recognition model for recognition, and their output values in the target Chinese handwriting recognition model are obtained. Each character to be recognized corresponds to more than three thousand output values (the exact number depends on the Chinese character library), and the recognition result of the character can be determined from these output values. Specifically, a Chinese character to be recognized is represented by a binarized pixel-value feature matrix that the computer can process directly.
S80: and obtaining a target probability output value according to the output value and a preset Chinese semantic word stock, and obtaining a recognition result of the Chinese character to be recognized based on the target probability output value.
The preset Chinese semantic lexicon is a pre-established, word-frequency-based lexicon describing the semantic relations between Chinese words. For example, for two-character words sharing one known character, the lexicon records the probability of each candidate for the other character: one candidate may occur with a probability of 30.5%, another with a probability of 0.5%, and the remaining candidates together account for the other 69%. The target probability output value is a probability value, obtained by combining the model's output values with the preset Chinese semantic lexicon, that is used to derive the recognition result of the Chinese characters to be recognized.
Specifically, the method of obtaining the target probability output value from the output values and the preset Chinese semantic lexicon is as follows. (1) Select the maximum of the output values corresponding to each character of the Chinese characters to be recognized as a first probability value, and obtain a preliminary recognition result of the Chinese characters to be recognized from the first probability values. (2) Obtain the left semantic probability value and the right semantic probability value of each character to be recognized from the preliminary recognition result and the Chinese semantic lexicon. It can be understood that the characters of a text are ordered sequentially; for a character in the middle of a string, there are probability values associated with the two-character combinations it forms with its left neighbor and with its right neighbor, namely the left semantic probability value and the right semantic probability value. (3) Set weights for the output value corresponding to each character of the Chinese characters to be recognized, for the left semantic probability value, and for the right semantic probability value. Specifically, a weight of 0.4 may be given to the output value of each character, 0.3 to the left semantic probability value, and 0.3 to the right semantic probability value. (4) Multiply each set weight by the corresponding probability value and add the weighted probability values to obtain the target probability output values (there are several, the exact number being based on the Chinese character library), and select the character corresponding to the maximum target probability output value as the recognition result of the Chinese character to be recognized. In practice, the 5 largest output values can be selected first; these represent the 5 most likely characters (recognition results), and only these 5 characters need be combined with the Chinese semantic lexicon to compute target probability output values, so there are only 5 such values and recognition efficiency is greatly improved. By combining the output values with a preset Chinese semantic lexicon, an accurate recognition result can be obtained. It can be understood that, for the recognition of a single character (not a text), the corresponding recognition result can be obtained directly from the maximum output value, without adding the recognition based on Chinese semantics.
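A minimal sketch of the weighted combination in steps (3)-(4); the candidate characters and all probability numbers below are illustrative only, while the 0.4/0.3/0.3 weights follow the example in the text:

```python
def combined_score(model_prob, left_prob, right_prob,
                   w_model=0.4, w_left=0.3, w_right=0.3):
    """Target probability output value for one candidate character:
    weighted sum of the model's output value and the left/right
    semantic probability values from the lexicon."""
    return w_model * model_prob + w_left * left_prob + w_right * right_prob

def recognize_char(candidates):
    """candidates maps each candidate character (e.g. the top 5 by model
    output value) to (model_prob, left_prob, right_prob); the character
    with the maximum target probability output value wins."""
    scored = {c: combined_score(*p) for c, p in candidates.items()}
    return max(scored, key=scored.get), scored

best, scored = recognize_char({
    "阳": (0.60, 0.305, 0.40),   # slightly lower model score, strong context
    "阴": (0.62, 0.005, 0.10),   # higher model score, weak context
})
```

The example shows why the semantic lexicon helps: the character with the higher raw model output loses to the one whose left and right contexts are far more plausible.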
Through steps S70-S80, the Chinese characters to be recognized are recognized with the target Chinese handwriting recognition model, and the recognition result is obtained by combining the output values with the preset Chinese semantic lexicon. The target Chinese handwriting recognition model itself has high recognition accuracy, and combining it with the Chinese semantic lexicon further improves the accuracy of Chinese handwriting recognition.
In the handwritten character recognition method provided by the embodiment of the invention, the Chinese character to be recognized is input into the target Chinese handwritten character recognition model for recognition, and the recognition result is obtained by combining with the preset Chinese semantic word stock. When the target Chinese handwriting recognition model is adopted to recognize the Chinese handwriting, an accurate recognition result can be obtained.
Fig. 8 shows a schematic block diagram of a handwritten character recognition device in one-to-one correspondence to a handwritten character recognition method in an embodiment. As shown in fig. 8, the handwriting recognition apparatus includes an output value acquisition module 80 and a recognition result acquisition module 90. The implementation functions of the output value obtaining module 80 and the recognition result obtaining module 90 correspond to the steps corresponding to the handwriting recognition method in the embodiment one by one, and in order to avoid redundancy, the embodiment is not described in detail one by one.
The handwriting recognition device comprises an output value acquisition module 80, which is used for acquiring the Chinese characters to be recognized, recognizing the Chinese characters to be recognized by adopting a target Chinese handwriting recognition model, and acquiring the output value of the Chinese characters to be recognized in the target Chinese handwriting recognition model; the target Chinese handwriting recognition model is obtained by adopting a handwriting model training method.
The recognition result obtaining module 90 is configured to obtain a target probability output value according to the output value and a preset chinese semantic word stock, and obtain a recognition result of the chinese character to be recognized based on the target probability output value.
The present embodiment provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the handwriting model training method in the embodiment, and in order to avoid repetition, a detailed description is omitted here. Alternatively, the computer program when executed by the processor implements the functions of each module/unit of the handwriting model training apparatus in the embodiment, and in order to avoid repetition, a description is omitted here. Alternatively, the computer program may implement the functions of each step in the handwriting recognition method in the embodiment when executed by the processor, and in order to avoid repetition, details are not described herein. Alternatively, the computer program when executed by the processor implements the functions of each module/unit in the handwriting recognition device in the embodiment, and in order to avoid repetition, details are not described herein.
FIG. 9 is a schematic diagram of a computer device according to an embodiment of the present invention. As shown in fig. 9, the computer device 100 of this embodiment includes: the processor 101, the memory 102, and the computer program 103 stored in the memory 102 and capable of running on the processor 101, where the computer program 103 implements the handwriting model training method in the embodiment when executed by the processor 101, and is not described herein in detail to avoid repetition. Alternatively, the computer program, when executed by the processor 101, performs the functions of each model/unit in the handwriting model training apparatus in the embodiment, and is not described herein in detail for avoiding repetition. Alternatively, the computer program when executed by the processor 101 performs the functions of each step in the handwriting recognition method in the embodiment, and in order to avoid repetition, details are not described herein. Alternatively, the computer program, when executed by the processor 101, performs the functions of the modules/units of the handwriting recognition device in the embodiment. In order to avoid repetition, details are not repeated here.
The computer device 100 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or the like. The computer device may include, but is not limited to, the processor 101 and the memory 102. Those skilled in the art will appreciate that FIG. 9 is merely an example of the computer device 100 and does not limit it; the device may include more or fewer components than shown, combine certain components, or use different components. For example, a computer device may also include input/output devices, network access devices, a bus, and so on.
The processor 101 may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 102 may be an internal storage unit of the computer device 100, such as a hard disk or memory of the computer device 100. The memory 102 may also be an external storage device of the computer device 100, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the computer device 100. Further, the memory 102 may include both an internal storage unit and an external storage device of the computer device 100. The memory 102 is used to store the computer program and other programs and data required by the computer device, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated; in practical applications, the functions may be allocated to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above.
The above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A method of training a handwriting model, comprising:
acquiring standard Chinese character training samples, and dividing the standard Chinese character training samples into batches according to a preset batch size;
inputting the batched standard Chinese character training samples into a bidirectional long-short-time memory neural network for training, obtaining the forward output of the bidirectional long-short-time memory neural network, updating the network parameters of the bidirectional long-short-time memory neural network by adopting a time-dependent back propagation algorithm according to the forward output of the bidirectional long-short-time memory neural network, and obtaining a standard Chinese character recognition model;
acquiring non-standard Chinese character training samples, and dividing the non-standard Chinese character training samples into batches according to a preset batch size;
inputting the batched non-standard Chinese character training samples into the standard Chinese character recognition model for training, obtaining the forward output of the standard Chinese character recognition model, updating the network parameters of the standard Chinese character recognition model by adopting a time-dependent back propagation algorithm according to the forward output of the standard Chinese character recognition model, and obtaining an adjusted Chinese handwriting recognition model;
acquiring a Chinese character sample to be tested, identifying the Chinese character sample to be tested by adopting the adjusted Chinese handwriting recognition model, acquiring error words whose recognition results do not match the real results, and taking all the error words as an error-word training sample;
inputting the error-word training sample into the adjusted Chinese handwriting recognition model for training, obtaining the forward output of the adjusted Chinese handwriting recognition model, updating the network parameters of the adjusted Chinese handwriting recognition model by adopting a time-dependent back propagation algorithm based on batch gradient descent according to the forward output of the adjusted Chinese handwriting recognition model, and obtaining a target Chinese handwriting recognition model.
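The staged pipeline of claim 1 (pre-train on standard characters, fine-tune on non-standard handwriting, then retrain on mined error words) can be sketched as follows. This is a minimal illustration, not the patent's implementation: `make_batches`, `staged_training`, and `train_batch_fn` are hypothetical names, and `train_batch_fn` stands in for one forward pass plus a BPTT parameter update.

```python
def make_batches(samples, batch_size):
    """Divide training samples into batches of a preset size; the last batch may be smaller."""
    return [samples[i:i + batch_size] for i in range(0, len(samples), batch_size)]

def staged_training(train_batch_fn, standard_samples, nonstandard_samples,
                    error_samples, batch_size):
    """Run the three training stages of claim 1 in order.

    Each stage continues training the model produced by the previous stage:
    stage 1 yields the standard recognition model, stage 2 the adjusted model,
    and stage 3 the target model.
    """
    for stage_samples in (standard_samples, nonstandard_samples, error_samples):
        for batch in make_batches(stage_samples, batch_size):
            train_batch_fn(batch)  # stand-in for forward pass + BPTT update
```

The same preset batch size is reused for all three stages here; the claim leaves the per-stage batch sizes unspecified.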
2. The handwriting model training method according to claim 1, wherein the acquiring standard Chinese character training samples and dividing the standard Chinese character training samples into batches according to a preset batch size comprises:
acquiring a pixel-value feature matrix of each Chinese character in a Chinese character training sample to be processed, and normalizing each pixel value in the pixel-value feature matrix to obtain a normalized pixel-value feature matrix of each Chinese character, wherein the normalization formula is y = (x - MinValue) / (MaxValue - MinValue), wherein MaxValue is the maximum pixel value in the pixel-value feature matrix of each Chinese character, MinValue is the minimum pixel value in the pixel-value feature matrix of each Chinese character, x is a pixel value before normalization, and y is the pixel value after normalization;
dividing the pixel values in the normalized pixel-value feature matrix of each Chinese character into two classes, establishing a binarized pixel-value feature matrix of each Chinese character based on the two classes of pixel values, combining the binarized pixel-value feature matrices of the Chinese characters to serve as the standard Chinese character training samples, and dividing the standard Chinese character training samples into batches according to a preset batch size.
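The preprocessing in claim 2, min-max normalization followed by two-class binarization, can be sketched as below. The 0.5 binarization threshold is an assumption for illustration; the claim does not state how the two classes are separated.

```python
def normalize(matrix):
    """Min-max normalize a pixel-value feature matrix: y = (x - min) / (max - min)."""
    flat = [x for row in matrix for x in row]
    mn, mx = min(flat), max(flat)
    return [[(x - mn) / (mx - mn) for x in row] for row in matrix]

def binarize(norm_matrix, threshold=0.5):
    """Divide normalized pixel values into two classes (assumed threshold: 0.5)."""
    return [[1 if x >= threshold else 0 for x in row] for row in norm_matrix]
```

For an 8-bit grayscale character image, `normalize` maps the darkest and lightest pixels to 0 and 1, and `binarize` then yields the binarized feature matrix fed to the network.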
3. The handwriting model training method according to claim 1, wherein the inputting the batched standard Chinese character training samples into the bidirectional long-short-time memory neural network for training, obtaining the forward output of the bidirectional long-short-time memory neural network, updating the network parameters of the bidirectional long-short-time memory neural network by adopting a time-dependent back propagation algorithm according to the forward output of the bidirectional long-short-time memory neural network, and obtaining the standard Chinese character recognition model comprises:
inputting the batched standard Chinese character training samples into the bidirectional long-short-time memory neural network in forward sequence order to obtain the forward output F_o, inputting the batched standard Chinese character training samples into the bidirectional long-short-time memory neural network in reverse sequence order to obtain the backward output B_o, and adding the forward output and the backward output to obtain the overall forward output T_o, expressed as T_o = F_o + B_o;
constructing an error function from the forward output and the real results, the expression of the error function being

$$E = \frac{1}{2N}\sum_{i=1}^{N}\left(x_i - y_i\right)^2$$

wherein N represents the total number of training samples, x_i represents the forward output of the i-th training sample, and y_i represents the real result of the i-th training sample corresponding to x_i;
updating the network parameters of the bidirectional long-short-time memory neural network according to the error function by adopting the time-dependent back propagation algorithm to obtain the standard Chinese character recognition model, wherein the gradient of the hidden-layer output is

$$\epsilon_c^t = \sum_{k=1}^{K} w_{ck}\,\delta_k^t + \sum_{h=1}^{H} w_{ch}\,\delta_h^{t+1}$$

the gradient of the neuron state is

$$\epsilon_s^t = b_\omega^t\, h'(s_c^t)\,\epsilon_c^t + b_\phi^{t+1}\,\epsilon_s^{t+1} + w_{cl}\,\delta_l^{t+1} + w_{c\phi}\,\delta_\phi^{t+1} + w_{c\omega}\,\delta_\omega^{t+1}$$

the gradient of the input gate is

$$\delta_l^t = f'(a_l^t)\sum_{c=1}^{C} g(a_c^t)\,\epsilon_s^t$$

the gradient of the forget gate is

$$\delta_\phi^t = f'(a_\phi^t)\sum_{c=1}^{C} s_c^{t-1}\,\epsilon_s^t$$

the gradient of the output gate is

$$\delta_\omega^t = f'(a_\omega^t)\sum_{c=1}^{C} h(s_c^t)\,\epsilon_c^t$$

and the gradient of the hidden-layer (cell) state is

$$\delta_c^t = b_l^t\, g'(a_c^t)\,\epsilon_s^t$$

wherein K represents the number of output-layer neurons, k indexes the k-th output-layer neuron, H represents the number of hidden-layer neurons, h indexes the h-th hidden-layer neuron, C represents the number of neurons corresponding to the cell state, w_ck represents the connection weight between a neuron and the k-th output-layer neuron, w_ch represents the connection weight between a neuron and the h-th hidden-layer neuron, \delta_k^t represents the gradient of an output-layer neuron at the current moment, \delta_h^{t+1} represents the gradient of a hidden-layer neuron at the next moment, b_\omega^t controls the proportion of current information output by the neuron, b_\phi controls the proportion of past information retained by the neuron, s_c^t represents the state of the neuron at the current moment, w_cl represents the weight connecting the neuron to the input gate, w_c\phi represents the weight connecting the neuron to the forget gate, w_c\omega represents the weight connecting the neuron to the output gate, a_c^t represents the input of the cell state of the c-th neuron at the current moment, a_l^t represents the input of the input gate, a_\phi^t represents the input of the forget gate, a_\omega^t represents the input of the output gate, and b_l controls the proportion of current information received by the neuron.
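The forward/backward combination T_o = F_o + B_o and a squared-error objective over the training samples can be illustrated numerically. This is only a sketch: the per-timestep outputs here are plain numbers standing in for the BLSTM recurrences, and the (1/2N) squared-error form is an assumption, since the patent renders its error function only as an image.

```python
def bidirectional_output(forward_outputs, backward_outputs):
    """Combine per-timestep forward and backward outputs: T_o = F_o + B_o."""
    return [f + b for f, b in zip(forward_outputs, backward_outputs)]

def squared_error(outputs, targets):
    """Squared-error over N training samples: E = (1/2N) * sum_i (x_i - y_i)^2."""
    n = len(outputs)
    return sum((x - y) ** 2 for x, y in zip(outputs, targets)) / (2 * n)
```

In a real BLSTM, `forward_outputs` and `backward_outputs` would be vectors per timestep and the sum would be elementwise; the scalar version keeps the combination rule visible.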
4. The handwriting model training method according to claim 1, wherein the identifying the Chinese character sample to be tested by adopting the adjusted Chinese handwriting recognition model, acquiring error words whose recognition results do not match the real results, and taking all the error words as the error-word training sample comprises:
inputting the Chinese character sample to be tested into the adjusted Chinese handwriting recognition model, and obtaining an output value of each character in the Chinese character sample to be tested from the adjusted Chinese handwriting recognition model;
selecting the maximum output value among the output values corresponding to each character, and obtaining the recognition result of each character according to the maximum output value;
obtaining error words whose recognition results do not match the real results according to the recognition results, and taking all the error words as the error-word training sample.
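The error-word mining of claim 4, taking the argmax over per-character output values and collecting mismatches, can be sketched as follows. `score_fn` is a hypothetical stand-in for the adjusted Chinese handwriting recognition model's per-character outputs.

```python
def recognize(output_values):
    """Return the candidate label with the maximum output value (claim 4's argmax step)."""
    return max(output_values, key=output_values.get)

def mine_error_words(samples, score_fn):
    """Collect characters whose recognition result does not match the real label.

    samples: list of (character_image, true_label) pairs.
    score_fn: maps a character image to {candidate_label: output_value}.
    """
    return [(img, label) for img, label in samples
            if recognize(score_fn(img)) != label]
```

The returned pairs form the error-word training sample used in the final training stage.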
5. The handwriting model training method according to claim 1, wherein before the acquiring standard Chinese character training samples, the handwriting model training method further comprises:
initializing a bidirectional long-short-term memory neural network.
6. A method of handwriting recognition, comprising:
acquiring a Chinese character to be identified, identifying the Chinese character to be identified by adopting a target Chinese handwriting recognition model, and acquiring an output value of the Chinese character to be identified in the target Chinese handwriting recognition model; the target Chinese handwriting recognition model is obtained by adopting the handwriting model training method according to any one of claims 1-5;
and acquiring a target probability output value according to the output value and a preset Chinese semantic word stock, and acquiring the recognition result of the Chinese character to be recognized based on the target probability output value.
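One way to realize claim 6's combination of model output values with a preset Chinese semantic word stock is to interpolate the network's output with a lexicon prior; the linear blend and its weight `alpha` are assumptions for illustration, since the claim does not specify the combination rule.

```python
def target_probability(output_values, lexicon_prior, alpha=0.3):
    """Blend per-candidate model outputs with lexicon priors and pick the best candidate.

    output_values: {candidate: model output value (assumed in [0, 1])}
    lexicon_prior: {candidate: prior probability from the semantic word stock}
    alpha: weight given to the lexicon prior (illustrative choice).
    """
    combined = {c: (1 - alpha) * p + alpha * lexicon_prior.get(c, 0.0)
                for c, p in output_values.items()}
    best = max(combined, key=combined.get)
    return best, combined[best]
```

With a strong lexicon prior, a candidate that scores slightly lower under the network can still win, which is the point of folding semantic context into the recognition result.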
7. A handwriting model training device, comprising:
the standard Chinese character training sample acquisition module is used for acquiring standard Chinese character training samples and dividing the standard Chinese character training samples into batches according to a preset batch size;
the standard Chinese character recognition model acquisition module is used for inputting the batched standard Chinese character training samples into a bidirectional long-short-time memory neural network for training, obtaining the forward output of the bidirectional long-short-time memory neural network, updating the network parameters of the bidirectional long-short-time memory neural network by adopting a time-dependent back propagation algorithm according to the forward output, and obtaining a standard Chinese character recognition model;
the non-standard Chinese character training sample acquisition module is used for acquiring non-standard Chinese character training samples and dividing the non-standard Chinese character training samples into batches according to a preset batch size;
the adjusted Chinese handwriting recognition model acquisition module is used for inputting the batched non-standard Chinese character training samples into the standard Chinese character recognition model for training, obtaining the forward output of the standard Chinese character recognition model, updating the network parameters of the standard Chinese character recognition model by adopting a time-dependent back propagation algorithm according to the forward output, and obtaining an adjusted Chinese handwriting recognition model;
the error-word training sample acquisition module is used for acquiring a Chinese character sample to be tested, identifying the Chinese character sample to be tested by adopting the adjusted Chinese handwriting recognition model, acquiring error words whose recognition results do not match the real results, and taking all the error words as an error-word training sample;
and the target Chinese handwriting recognition model acquisition module is used for inputting the error-word training sample into the adjusted Chinese handwriting recognition model for training, obtaining the forward output of the adjusted Chinese handwriting recognition model, updating the network parameters of the adjusted Chinese handwriting recognition model by adopting a time-dependent back propagation algorithm based on batch gradient descent according to the forward output, and obtaining a target Chinese handwriting recognition model.
8. A handwriting recognition device, comprising:
the output value acquisition module is used for acquiring the Chinese characters to be identified, identifying the Chinese characters to be identified by adopting a target Chinese handwriting recognition model, and acquiring the output value of the Chinese characters to be identified in the target Chinese handwriting recognition model; the target Chinese handwriting recognition model is obtained by adopting the handwriting model training method according to any one of claims 1-5;
the recognition result acquisition module is used for acquiring a target probability output value according to the output value and a preset Chinese semantic word stock, and acquiring a recognition result of the Chinese character to be recognized based on the target probability output value.
9. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the handwriting model training method according to any of claims 1 to 5 when the computer program is executed; alternatively, the processor, when executing the computer program, implements the steps of the handwriting recognition method according to claim 6.
10. A computer readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the handwriting model training method according to any of claims 1 to 5; alternatively, the computer program, when executed by a processor, implements the steps of the handwriting recognition method according to claim 6.
CN201810563480.0A 2018-06-04 2018-06-04 Handwriting model training method, handwriting character recognition method, device, equipment and medium Active CN109034279B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810563480.0A CN109034279B (en) 2018-06-04 2018-06-04 Handwriting model training method, handwriting character recognition method, device, equipment and medium
PCT/CN2018/094269 WO2019232859A1 (en) 2018-06-04 2018-07-03 Handwriting model training method and apparatus, handwritten character recognition method and apparatus, device, and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810563480.0A CN109034279B (en) 2018-06-04 2018-06-04 Handwriting model training method, handwriting character recognition method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN109034279A CN109034279A (en) 2018-12-18
CN109034279B true CN109034279B (en) 2023-04-25

Family

ID=64612050

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810563480.0A Active CN109034279B (en) 2018-06-04 2018-06-04 Handwriting model training method, handwriting character recognition method, device, equipment and medium

Country Status (2)

Country Link
CN (1) CN109034279B (en)
WO (1) WO2019232859A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111756602B (en) * 2020-06-29 2022-09-27 上海商汤智能科技有限公司 Communication timeout detection method in neural network model training and related product
CN111738269B (en) * 2020-08-25 2020-11-20 北京易真学思教育科技有限公司 Model training method, image processing device, model training apparatus, and storage medium
CN112200312A (en) * 2020-09-10 2021-01-08 北京达佳互联信息技术有限公司 Method and device for training character recognition model and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101785030A (en) * 2007-08-10 2010-07-21 微软公司 Hidden markov model based handwriting/calligraphy generation
CN107316054A (en) * 2017-05-26 2017-11-03 昆山遥矽微电子科技有限公司 Non-standard character recognition methods based on convolutional neural networks and SVMs

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7369702B2 (en) * 2003-11-07 2008-05-06 Microsoft Corporation Template-based cursive handwriting recognition
US7620245B2 (en) * 2006-05-30 2009-11-17 Microsoft Corporation Cursive handwriting recognition with hierarchical prototype search
US20150317336A1 (en) * 2014-04-30 2015-11-05 Hewlett-Packard Development Company, L.P. Data reconstruction
CN105512692B (en) * 2015-11-30 2019-04-09 华南理工大学 Hand script Chinese input equipment mathematical formulae Symbol Recognition based on BLSTM
CN106570456B (en) * 2016-10-13 2019-08-09 华南理工大学 Handwritten Chinese character text recognition method based on full convolution Recursive Networks
CN107316067B (en) * 2017-05-27 2019-11-15 华南理工大学 A kind of aerial hand-written character recognition method based on inertial sensor
CN107665333A (en) * 2017-08-28 2018-02-06 平安科技(深圳)有限公司 A kind of indecency image identification method, terminal, equipment and computer-readable recording medium based on convolutional neural networks


Also Published As

Publication number Publication date
WO2019232859A1 (en) 2019-12-12
CN109034279A (en) 2018-12-18


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant