Rathore Media
- Morgan Hill, CA
- http://therathores.com
Stars
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Debian Chromium Kiosk on Spotify Car Thing (superbird)
Interact with your documents using the power of GPT, 100% privately, no data leaks
Generate embeddings from large-scale graph-structured data.
Terminal-based progress bar for Java / JVM
Platform for designing and evaluating Graph Neural Networks (GNN)
Stable Diffusion web UI
Taming Transformers for High-Resolution Image Synthesis
PyTorch code for BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
[NeurIPS 2022] Towards Robust Blind Face Restoration with Codebook Lookup Transformer
A latent text-to-image diffusion model
this repository accompanies the book "Grokking Deep Learning"
Python library that assists deep learning on graphs
SebastianHurubaru / deepsnap
Forked from snap-stanford/deepsnap: Python library that assists deep learning on graphs
The official code repository for the second edition of the O'Reilly book Generative Deep Learning: Teaching Machines to Paint, Write, Compose and Play.
My programming assignments for the machine learning course
CoreNLP: A Java suite of core NLP tools for tokenization, sentence segmentation, NER, parsing, coreference, sentiment analysis, etc.
https://cs330.stanford.edu/
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
Resources and tools for Indian language Natural Language Processing
A collaborative catalog of NLP resources for Indic languages
The IIT Bombay English-Hindi Parallel Corpus
This includes the original implementation of SELF-RAG: Learning to Retrieve, Generate and Critique through self-reflection by Akari Asai, Zeqiu Wu, Yizhong Wang, Avirup Sil, and Hannaneh Hajishirzi.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Code for the paper "Language Models are Unsupervised Multitask Learners"
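Several of the starred projects above (CLIP, BLIP, latent diffusion) rely on matching images to text by comparing embedding vectors. A minimal sketch of that scoring step, assuming the embeddings have already been computed by some encoder (the 4-d vectors below are made up for illustration):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def rank_texts(image_emb, text_embs, temperature=0.07):
    """Score candidate text embeddings against one image embedding
    by cosine similarity, CLIP-style: normalize, dot-product, softmax."""
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    sims = txt @ img                      # cosine similarities
    return softmax(sims / temperature)    # probabilities over candidates

# Toy 4-d embeddings (hypothetical, not from a real encoder):
image = np.array([1.0, 0.0, 0.0, 0.0])
texts = np.array([
    [0.9, 0.1, 0.0, 0.0],   # nearly aligned with the image embedding
    [0.0, 1.0, 0.0, 0.0],   # orthogonal to it
])
probs = rank_texts(image, texts)
# the first candidate receives nearly all the probability mass
```

The low `temperature` mirrors the learned logit scale in contrastive models: it sharpens the softmax so the best-matching caption dominates.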