- University of Hamburg
- Hamburg, Germany
- https://www.inf.uni-hamburg.de/en/inst/ab/lt/people/alexander-panchenko.html
Stars
[NAACL 2024 Outstanding Paper] Source code for the NAACL 2024 paper entitled "R-Tuning: Instructing Large Language Models to Say 'I Don't Know'"
Emotion recognition from audio and text files (Russian language only)
SemEval2024-task8: Multidomain, Multimodel and Multilingual Machine-Generated Text Detection
[ACL 2024] TaxoLLaMA: WordNet-based Model for Solving Multiple Lexical Semantic Tasks
ai-forever / gigachain
Forked from langchain-ai/langchain. ⚡ A toolkit for building LLM applications in Russian, with GigaChat support ⚡
MERA (Multimodal Evaluation for Russian-language Architectures) is a new open benchmark for evaluating foundation models in Russian.
Conditional Transformer Language Model for Controllable Generation
Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
Gradient-free optimization method for multidimensional arrays and discretized multivariate functions based on the tensor train (TT) format.
Solver in the low-rank tensor train format with a cross-approximation approach for the multidimensional Fokker-Planck equation
Gradient-free optimization method for multivariate functions based on the low-rank tensor train (TT) format and the maximal-volume principle.
A framework based on the tensor train decomposition for working with multivariate functions and multidimensional arrays
GENA-LM is a transformer masked language model trained on human DNA sequences.
POGEMA stands for Partially-Observable Grid Environment for Multiple Agents. This is a grid-based environment that was specifically designed to be flexible, tunable and scalable. It can be tailored…
OmniFusion — a multimodal model to communicate using text and images
A neural network training framework within a task-based parallel programming paradigm
PromptKG Family: a Gallery of Prompt Learning & KG-related research works, toolkits, and paper-list.
https://arxiv.org/abs/2201.06499
CLIP implementation for the Russian language
Fast and customizable framework for automatic ML model creation (AutoML)
nablaDFT: Large-Scale Conformational Energy and Hamiltonian Prediction benchmark and dataset
Kandinsky 2 — multilingual text2image latent diffusion model
This is an open-source toolkit for Heterogeneous Graph Neural Networks (OpenHGNN) based on DGL.
Graph Transformer Networks (Authors' PyTorch implementation for the NeurIPS 19 paper)