Stars
Mirror of https://code.tecosaur.net/tec/About.jl
Package for writing high-level code for parallel high-performance stencil computations that can be deployed on both GPUs and CPUs
Implementation of a robust, performant language-level autograd compiler for Julia
One More Einsum for Julia! With runtime order-specification and high-level adjoints for AD
A projection-based framework for gradient-free and parallel learning
Cross-architecture parallel algorithms for Julia's CPU and GPU backends. Targets multithreaded CPUs, and GPUs via Intel oneAPI, AMD ROCm, Apple Metal, and Nvidia CUDA.
An algebraic modeling and automatic differentiation tool in the Julia language, specialized for SIMD abstraction of nonlinear programs.
An efficient backend for InfiniteOpt that accelerates NLPs on CPU and GPU.
JuMP implementation of Dual Lagrangian Learning for Conic Optimization
Library for Jacobian descent with PyTorch. It enables the optimization of neural networks with multiple losses (e.g. multi-task learning).
Drawings of partially ordered sets from Posets.jl, fully compatible with Graphs.jl.
Julia package for Model Predictive Control (MPC) of linear systems
Partially ordered sets fully compatible with Graphs.jl
PyTorch implementation of adversarial attacks [torchattacks]
Automatic dualization feature for MathOptInterface.jl and JuMP
A collection of utility functions to work with PyTorch sparse tensors
Tools for constructing and analyzing the incidence graph or matrix of variables and constraints in a JuMP model
Compress JSON into URL-friendly strings
A web-based ASCII and Unicode diagram builder written in vanilla JavaScript
FiveSheepCo / cloudflare-apns2
Forked from AndrewBarba/apns2
Cloudflare Workers-friendly client for connecting to Apple's Push Notification Service using the new HTTP/2 protocol with JSON web tokens