Quick Start Guide
Get started with LOOM in 5 minutes. Build and train your first neural network.
Installation
Choose your preferred language and install LOOM:
Go (Native)
```bash
go get github.com/openfluke/loom/nn
```
Python
```bash
pip install welvet
```
JavaScript/TypeScript
```bash
npm install @openfluke/welvet
```
C# / .NET
```bash
dotnet add package Welvet
```
Your First Neural Network
Go Example
```go
package main

import (
	"fmt"

	"github.com/openfluke/loom/nn"
)

func main() {
	// Create a simple network: 4 inputs -> 8 hidden -> 2 outputs
	network := nn.NewNetwork(4, 1, 1, 2)

	// Configure layers
	network.SetLayer(0, 0, 0, nn.InitDenseLayer(4, 8, nn.ActivationReLU))
	network.SetLayer(0, 0, 1, nn.InitDenseLayer(8, 2, nn.ActivationSigmoid))

	// Initialize GPU (optional but recommended)
	if err := network.InitGPU(); err != nil {
		fmt.Println("GPU not available, using CPU")
	}
	defer network.ReleaseGPU()

	// Create input data
	input := []float32{0.1, 0.2, 0.3, 0.4}

	// Forward pass
	output, _, err := network.ForwardGPU(input)
	if err != nil {
		panic(err)
	}

	fmt.Printf("Output: %v\n", output)
}
```
Python Example
```python
import welvet

# Create network with GPU acceleration
network = welvet.create_network(
    input_size=4,
    grid_rows=1,
    grid_cols=1,
    layers_per_cell=2,
    use_gpu=True
)

# Configure: 4 -> 8 -> 2
welvet.configure_sequential_network(
    network,
    layer_sizes=[4, 8, 2],
    activations=[welvet.Activation.RELU, welvet.Activation.SIGMOID]
)

# Forward pass
output = welvet.forward(network, [0.1, 0.2, 0.3, 0.4])
print(f"Output: {output}")

# Cleanup
welvet.cleanup_gpu(network)
welvet.free_network(network)
```
JavaScript/TypeScript Example
```javascript
// Load WASM module first
import { NewNetwork, InitDenseLayer } from '@openfluke/welvet';

// Create network
const network = NewNetwork(4, 1, 1, 2);

// Configure layers
const layer0 = InitDenseLayer(4, 8, 0); // ReLU
const layer1 = InitDenseLayer(8, 2, 1); // Sigmoid
network.SetLayer(JSON.stringify([0, 0, 0, JSON.parse(layer0)]));
network.SetLayer(JSON.stringify([0, 0, 1, JSON.parse(layer1)]));

// Forward pass
const input = [0.1, 0.2, 0.3, 0.4];
const resultJSON = network.ForwardCPU(JSON.stringify([input]));
const output = JSON.parse(resultJSON)[0];
console.log("Output:", output);
```
Training Your Network
Go Training Example
```go
// Prepare training data
trainBatches := []nn.Batch{
	{
		Inputs:  [][]float32{{0.1, 0.2, 0.3, 0.4}, {0.5, 0.6, 0.7, 0.8}},
		Targets: [][]float64{{1.0, 0.0}, {0.0, 1.0}},
	},
}

// Configure training
config := &nn.TrainingConfig{
	Epochs:       10,
	LearningRate: 0.01,
	UseGPU:       true,
	GradientClip: 5.0,
	LossType:     "mse",
}

// Train the model
result, err := network.Train(trainBatches, config)
if err != nil {
	panic(err)
}

fmt.Printf("Final Loss: %.4f\n", result.FinalLoss)
```
Python Training Example
```python
import welvet

# Training data
inputs = [[0.1, 0.2, 0.3, 0.4], [0.5, 0.6, 0.7, 0.8]]
targets = [[1.0, 0.0], [0.0, 1.0]]

# Train
for epoch in range(10):
    loss = welvet.train_epoch(network, inputs, targets, learning_rate=0.01)
    print(f"Epoch {epoch+1}: loss = {loss:.4f}")
```
Saving and Loading Models
```go
// Save model
err := network.SaveModel("model.json", "my_model")

// Load model
loadedNet, err := nn.LoadModel("model.json", "my_model")

// Or use strings (great for WASM/APIs)
jsonString, err := network.SaveModelToString("my_model")
loadedNet, err := nn.LoadModelFromString(jsonString, "my_model")
```
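As a sanity check, the string round-trip can be verified end to end. This is only a sketch built from the calls shown above; it assumes the restored network exposes the same `InitGPU`/`ForwardGPU` methods as the original:

```go
// Serialize the trained network to JSON (handy for APIs and WASM).
jsonString, err := network.SaveModelToString("my_model")
if err != nil {
	panic(err)
}

// Rebuild an identical network from the string.
restored, err := nn.LoadModelFromString(jsonString, "my_model")
if err != nil {
	panic(err)
}

// Use the restored network exactly like the original.
if err := restored.InitGPU(); err != nil {
	fmt.Println("GPU not available, using CPU")
}
defer restored.ReleaseGPU()

output, _, err := restored.ForwardGPU([]float32{0.1, 0.2, 0.3, 0.4})
if err != nil {
	panic(err)
}
fmt.Printf("Output from restored model: %v\n", output)
```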
Cross-Platform Compatible
Models saved in one language work in all the others! Train in Go, deploy in Python, and run in the browser with JavaScript/WASM.
Next Steps
Now that you've created your first neural network, explore more features:
- Learn about all 5 layer types (Dense, Conv2D, Attention, RNN, LSTM)
- Explore Grid Softmax and Mixture of Experts
- Deep dive into training and evaluation
- Understand GPU acceleration
- Browse code examples
Full Documentation
Check out the complete LOOM repository for detailed documentation, examples, and benchmarks.