Super minimalistic machine-learning framework.
Explore the docs » · View Example · Report Bug · Request Feature
Magnetron is a minimalistic, PyTorch-style machine-learning framework designed for IoT and other resource-limited environments.
The tiny C99 core, wrapped in a modern Python API, gives you dynamic graphs, automatic differentiation and neural-network building blocks without the bloat; a short API sketch follows the feature list below. A CUDA backend is work in progress.
- N-dimensional, flattened tensors
- Automatic differentiation on dynamic computation graphs
- CPU multithreading + SIMD (SSE4, AVX2/AVX512, ARM NEON)
- PyTorch-like Python API
- Broadcasting-aware operators with in-place variants
- High-level neural-net building blocks
- float32 and float16 datatypes
- Modern PRNGs (Mersenne Twister, PCG)
- Clear validation and error messages
- Custom compressed tensor file formats
- No external C or Python dependencies (except CFFI for the Python wrapper)
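As a taste of the API and the dynamic autograd graph, here is a minimal sketch. `Tensor.from_data` and `backward()` appear in the XOR example below; the `requires_grad` flag, the `.grad` attribute and the `sum()` reduction are assumptions borrowed from the PyTorch conventions the API mirrors, so the exact spelling may differ.

```python
import magnetron as mag

# Hypothetical sketch: `requires_grad` and `.grad` are assumed
# PyTorch-style names, not confirmed by this README.
a = mag.Tensor.from_data([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
b = mag.Tensor.from_data([[10.0, 20.0]])  # broadcasts across rows

loss = (a * b).sum()  # ops record a dynamic graph as they run
loss.backward()       # reverse-mode autodiff, as in the XOR example
print(a.grad)         # expected: b broadcast to a's shape
```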
A simple XOR neural network (MLP) trained with Magnetron. Copy and paste the code below into a file called xor.py and run it with Python.
import magnetron as mag
from magnetron import optim, nn
from matplotlib import pyplot as plt
EPOCHS: int = 2000
# Create the model, optimizer, and loss function
model = nn.Sequential(nn.Linear(2, 2), nn.Tanh(), nn.Linear(2, 1), nn.Tanh())
optimizer = optim.SGD(model.parameters(), lr=1e-1)
criterion = nn.MSELoss()
loss_values: list[float] = []
x = mag.Tensor.from_data([[0, 0], [0, 1], [1, 0], [1, 1]])
y = mag.Tensor.from_data([[0], [1], [1], [0]])
# Train the model
for epoch in range(EPOCHS):
    y_hat = model(x)
    loss = criterion(y_hat, y)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    loss_values.append(loss.item())
    if epoch % 100 == 0:
        print(f'Epoch: {epoch}, Loss: {loss.item()}')
# Print the final predictions after the training
print('=== Final Predictions ===')
with mag.no_grad():
    y_hat = model(x)
    for i in range(x.shape[0]):
        print(f'Expected: {y[i]}, Predicted: {y_hat[i]}')
# Plot the loss
plt.figure()
plt.plot(loss_values)
plt.xlabel('Epoch')
plt.ylabel('MSE Loss')
plt.title('Training Loss over Time')
plt.grid(True)
plt.show()
Running the script prints the loss every 100 epochs, then the final predictions, and shows a plot of the MSE loss decreasing over training.
To get a local copy up and running, follow these simple steps.
Magnetron itself has no Python dependencies except CFFI, which is used to call the C library from Python.
Some examples use matplotlib and numpy for plotting and data generation, but neither is required to use the framework.
- Linux, macOS or Windows
- A C99 compiler (gcc, clang, msvc)
- Python 3.6 or higher
A pip-installable package will be provided as soon as all core features are implemented.
1. Clone the repo.
2. `cd magnetron/python`
3. Install the dependencies for the examples (a virtual environment is recommended): `pip install -r requirements.txt`
4. Install the Magnetron wheel locally (a pip-installable package will be provided in the future): `cd magnetron_framework && bash install_wheel_local.sh && cd ../`
5. Run the XOR example: `python examples/simple/xor.py`
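To check that the wheel is importable before running the examples, a smoke test along these lines should suffice; it only uses `Tensor.from_data` and `.shape`, both of which appear in the XOR example above.

```python
# smoke_test.py - minimal install check (only uses API shown above)
import magnetron as mag

t = mag.Tensor.from_data([[1.0, 2.0], [3.0, 4.0]])
print(t.shape)  # a 2x2 shape confirms the C core loaded via CFFI
```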
See the Examples directory, which contains various models and training examples.
For usage in C or C++, see the Unit Tests directory.
The following table lists all available operators and their properties.
| Mnemonic | Description | IN | OUT | Params | Flags | Inplace | Backward | Result | Validation | CPU-Parallel | Type |
|---|---|---|---|---|---|---|---|---|---|---|---|
| NOP | no-op | 0 | 0 | N/A | N/A | NO | NO | N/A | NO | NO | NO-OP |
| CLONE | strided copy | 1 | 1 | N/A | N/A | NO | YES | ISOMORPH | YES | NO | Morph |
| VIEW | memory view | 1 | 1 | N/A | N/A | NO | YES | ISOMORPH | YES | NO | Morph |
| TRANSPOSE | xᵀ | 1 | 1 | N/A | N/A | NO | YES | TRANSPOSED | YES | NO | Morph |
| PERMUTE | swap axes by index | 1 | 1 | U64 [6] | N/A | NO | NO | PERMUTED | YES | NO | Morph |
| MEAN | (∑x) ∕ n | 1 | 1 | N/A | N/A | NO | YES | SCALAR/REDUCED | YES | NO | Reduction |
| MIN | min(x) | 1 | 1 | N/A | N/A | NO | NO | SCALAR/REDUCED | YES | NO | Reduction |
| MAX | max(x) | 1 | 1 | N/A | N/A | NO | NO | SCALAR/REDUCED | YES | NO | Reduction |
| SUM | ∑x | 1 | 1 | N/A | N/A | NO | YES | SCALAR/REDUCED | YES | NO | Reduction |
| ABS | \|x\| | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SGN | sgn(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| NEG | −x | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| LOG | log₁₀(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SQR | x² | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SQRT | √x | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SIN | sin(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| COS | cos(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| STEP | H(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| EXP | eˣ | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| FLOOR | ⌊x⌋ | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| CEIL | ⌈x⌉ | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| ROUND | ⟦x⟧ | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SOFTMAX | eˣⁱ ∕ ∑eˣʲ | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SOFTMAX_DV | d∕dx softmax(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SIGMOID | 1 ∕ (1 + e⁻ˣ) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SIGMOID_DV | d∕dx sigmoid(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| HARD_SIGMOID | max(0, min(1, 0.2×x + 0.5)) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SILU | x ∕ (1 + e⁻ˣ) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| SILU_DV | d∕dx silu(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| TANH | tanh(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| TANH_DV | d∕dx tanh(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| RELU | max(0, x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| RELU_DV | d∕dx relu(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| GELU | 0.5×x×(1 + erf(x ∕ √2)) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| GELU_DV | d∕dx gelu(x) | 1 | 1 | N/A | N/A | YES | YES | ISOMORPH | YES | YES | Unary OP |
| ADD | x + y | 2 | 1 | N/A | N/A | YES | YES | BROADCASTED | YES | YES | Binary OP |
| SUB | x − y | 2 | 1 | N/A | N/A | YES | YES | BROADCASTED | YES | YES | Binary OP |
| MUL | x ⊙ y | 2 | 1 | N/A | N/A | YES | YES | BROADCASTED | YES | YES | Binary OP |
| DIV | x ∕ y | 2 | 1 | N/A | N/A | YES | YES | BROADCASTED | YES | YES | Binary OP |
| MATMUL | xy | 2 | 1 | N/A | N/A | YES | YES | MATRIX | YES | YES | Binary OP |
| REPEAT_BACK | gradient broadcast to repeated shape | 2 | 1 | N/A | N/A | YES | YES | BROADCASTED | YES | NO | Binary OP |
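To make the Description column concrete, here is a plain-Python reference (independent of Magnetron's C kernels) for two of the unary ops: SIGMOID, 1 ∕ (1 + e⁻ˣ), and SOFTMAX, eˣⁱ ∕ ∑eˣʲ.

```python
import math

def sigmoid(xs: list[float]) -> list[float]:
    # SIGMOID row: 1 / (1 + e^-x), applied elementwise (ISOMORPH result)
    return [1.0 / (1.0 + math.exp(-x)) for x in xs]

def softmax(xs: list[float]) -> list[float]:
    # SOFTMAX row: e^x_i / sum(e^x_j); subtracting max(xs) is the usual
    # numerical-stability trick and does not change the result
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(sigmoid([0.0]))                  # [0.5]
print(sum(softmax([1.0, 2.0, 3.0])))   # ~1.0 (softmax outputs sum to one)
```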
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated. If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement".
(c) 2025 Mario "Neo" Sieg. mario.sieg.64@gmail.com
Distributed under the Apache 2 License. See LICENSE.txt for more information.