GitHub - DanielMoraite/larq: An Open Source Machine Learning Library for Training Binarized Neural Networks

Larq


Larq is an open source machine learning library for training Quantized Neural Networks (QNNs) with extremely low-precision weights and activations (e.g. 1-bit). Existing deep neural networks tend to be large, slow, and power-hungry, which rules out many applications in resource-constrained environments. Larq is designed to provide an easy-to-use, composable way to train QNNs (e.g. Binarized Neural Networks) based on the tf.keras interface.

Getting Started

To build a QNN, Larq introduces the concepts of Quantized Layers and Quantizers. A Quantizer defines how a full-precision input is transformed into a quantized output, as well as the pseudo-gradient method used for the backward pass. Each Quantized Layer accepts an input_quantizer and a kernel_quantizer, which describe how to quantize the activations of the previous layer and the weights, respectively. If both input_quantizer and kernel_quantizer are None, the layer is equivalent to a full-precision layer.
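To illustrate the straight-through estimator idea mentioned above, here is a minimal NumPy sketch (not Larq's implementation): the forward pass binarizes with a sign function, while the pseudo-gradient passes the upstream gradient through unchanged wherever the input lies in [-1, 1] and zeroes it elsewhere.

```python
import numpy as np

def ste_sign_forward(x):
    # Forward pass: binarize to -1 or +1 (sign, with 0 mapped to +1).
    return np.where(x >= 0, 1.0, -1.0)

def ste_sign_backward(x, upstream_grad):
    # Pseudo-gradient (clipped identity): pass the gradient through
    # unchanged where |x| <= 1, and zero it elsewhere.
    return upstream_grad * (np.abs(x) <= 1).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.3, 1.5])
print(ste_sign_forward(x))                     # [-1. -1.  1.  1.]
print(ste_sign_backward(x, np.ones_like(x)))   # [0. 1. 1. 0.]
```

Since sign has zero gradient almost everywhere, this clipped-identity surrogate is what makes gradient-based training of binarized layers possible at all.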

You can define a binarized densely-connected layer using the Straight-Through Estimator as follows:

import larq

larq.layers.QuantDense(
    32,  # number of units
    input_quantizer="ste_sign",
    kernel_quantizer="ste_sign",
    kernel_constraint="weight_clip",
)

This layer can be used inside a Keras model or with a custom training loop.
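Conceptually, with both quantizers set, the layer's forward pass multiplies binarized activations by binarized latent weights. A hedged NumPy sketch of that computation (ignoring the bias term and Larq's internals):

```python
import numpy as np

def binarize(x):
    # sign(x) with 0 mapped to +1, mirroring ste_sign's forward pass
    return np.where(x >= 0, 1.0, -1.0)

def quant_dense_forward(inputs, kernel):
    # Binarize the incoming activations (input_quantizer) and the
    # latent full-precision weights (kernel_quantizer), then apply
    # an ordinary dense matrix multiplication.
    return binarize(inputs) @ binarize(kernel)

rng = np.random.default_rng(0)
inputs = rng.standard_normal((4, 8))   # batch of 4, 8 features
kernel = rng.standard_normal((8, 32))  # latent weights for 32 units
out = quant_dense_forward(inputs, kernel)
print(out.shape)  # (4, 32); each entry is a sum of eight +/-1 products
```

Because every product of binarized values is +/-1, such layers can be implemented with bitwise XNOR and popcount operations at inference time, which is the source of the efficiency gains on resource-constrained hardware.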

Examples

Check out our examples to see how to train a Binarized Neural Network in just a few lines of code.

Requirements

Before installing Larq, please install:

You can also check out one of our prebuilt Docker images.

Installation

You can install Larq with Python's pip package manager:

pip install larq


Languages

  • Python 99.7%
  • HCL 0.3%