NCRF++, a Neural Sequence Labeling Toolkit. Easy to use for any sequence labeling task (e.g. NER, POS tagging, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
Fantasy name generator in TensorFlow
Multi-layer recurrent neural networks for character-level language models, implemented in TensorFlow
Wrapper library for text generation / language models at character and word level with RNNs in TensorFlow
Minimal implementation of Multi-layer Recurrent Neural Networks (LSTM) for character-level language modelling in PyTorch
TensorFlow implementation of multi-layer recurrent neural networks for training and sampling from texts
char-rnn implementation for sentiment analysis on Twitter data
Predict gender from first names using Naïve Bayes and a PyTorch char-RNN
Sequence Tagger implementation
⏱️ char-rnn for time series data
Code for "CharManteau: Character Embedding Models for Portmanteau Creation" (EMNLP 2017), by Varun Gangal*, Harsh Jhamtani*, Graham Neubig, Eduard Hovy, and Eric Nyberg
Simple recurrent neural network for text generation. Based on https://gist.github.com/karpathy/d4dee566867f8291f086
Character Level Language Modelling using PyTorch
A Keras implementation of Andrej Karpathy's character-level RNN model. It generates Nepali poetry, trained on Laxmi Prasad Devkota's poems.
Multi-layer recurrent neural networks (LSTM, RNN) for character-level language models in Python using TensorFlow.
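Most of the repositories listed above implement the same core idea: a recurrent network that reads text one character at a time and is trained to predict the next character, which can then be sampled to generate new text. The following is a minimal sketch of that idea, assuming PyTorch; the toy corpus, hyperparameters, and all names (`CharRNN`, `stoi`, `itos`, etc.) are illustrative and do not come from any particular repository.

```python
# Minimal character-level RNN language model (illustrative sketch, not any repo's actual code).
import torch
import torch.nn as nn
import torch.nn.functional as F

text = "hello world, hello char-rnn. "        # toy corpus; replace with real training text
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}     # char -> index
itos = {i: c for c, i in stoi.items()}         # index -> char
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

class CharRNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        out, state = self.rnn(self.embed(x), state)
        return self.head(out), state           # logits over next characters

model = CharRNN(len(chars))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
seq_len = 16

for step in range(200):                        # tiny training loop for the toy corpus
    i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
    x = data[i:i + seq_len].unsqueeze(0)               # input characters
    y = data[i + 1:i + seq_len + 1].unsqueeze(0)       # next-character targets
    logits, _ = model(x)
    loss = F.cross_entropy(logits.reshape(-1, len(chars)), y.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Sample from the model one character at a time, feeding its output back as input.
idx = torch.tensor([[stoi["h"]]])
state, out_chars = None, ["h"]
with torch.no_grad():
    for _ in range(50):
        logits, state = model(idx, state)
        probs = F.softmax(logits[0, -1], dim=-1)
        idx = torch.multinomial(probs, 1).unsqueeze(0)
        out_chars.append(itos[idx.item()])
print("".join(out_chars))
```

The toolkit-style projects above (e.g. NCRF++) extend this basic recipe by combining character-level encoders with word-level encoders and a softmax or CRF output layer for sequence labeling, rather than sampling free text.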