Primitives

Primitives are the fundamental building blocks of NeuroScript. They wrap PyTorch operations and provide the foundation for building complex neural architectures.

Categories

Primitives fall into the following categories:

  • Layers
  • Activations
      • ReLU - Rectified Linear Unit
      • GELU - Gaussian Error Linear Unit
  • Normalization
  • Regularization
  • Structural
Usage

All primitives are automatically available in your NeuroScript programs; simply reference them by name:

neuron MyModel:
  graph:
    in -> Linear(512, 256) -> GELU() -> out
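Since primitives wrap PyTorch operations, a hand-written equivalent of the generated code for the graph above might look as follows. This is a sketch under the assumption that Linear and GELU map directly onto nn.Linear and nn.GELU; the actual generated code may differ.

```python
import torch
from torch import nn

# Hypothetical hand-written equivalent of the code generated for MyModel.
# Assumes Linear and GELU wrap nn.Linear and nn.GELU directly.
model = nn.Sequential(
    nn.Linear(512, 256),  # in -> Linear(512, 256)
    nn.GELU(),            # -> GELU()
)

x = torch.randn(8, 512)   # a batch of 8 vectors matching the 512-dim input
y = model(x)
print(tuple(y.shape))     # (8, 256)
```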

Shape Contracts

Every primitive has a well-defined shape contract that specifies:

  • Input shapes (what tensor dimensions are expected)
  • Output shapes (what dimensions are produced)
  • Parameter constraints (what values are valid)

The NeuroScript compiler validates all shape contracts at compile time, catching dimensional errors before code generation.
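To illustrate what compile-time shape validation can look like, here is a minimal sketch in plain Python. The names `linear_contract`, `elementwise_contract`, and `check_graph` are illustrative only, not NeuroScript compiler API: each primitive exposes a contract that maps an input shape to an output shape or rejects it, and the checker propagates shapes through the graph before any code is generated.

```python
# Hypothetical sketch of shape-contract checking, not the real
# NeuroScript compiler. A contract maps an input shape to an
# output shape, or raises on a dimensional mismatch.

def linear_contract(in_features, out_features):
    def apply(shape):
        if shape[-1] != in_features:
            raise TypeError(
                f"Linear expected last dim {in_features}, got {shape[-1]}"
            )
        # Linear replaces the last dimension with out_features.
        return shape[:-1] + (out_features,)
    return apply

def elementwise_contract(shape):
    # Activations like ReLU and GELU preserve the input shape.
    return shape

def check_graph(input_shape, contracts):
    # Propagate the shape through each node of the graph in order,
    # catching dimensional errors before code generation.
    shape = input_shape
    for contract in contracts:
        shape = contract(shape)
    return shape

# in -> Linear(512, 256) -> GELU() -> out
out_shape = check_graph((8, 512),
                        [linear_contract(512, 256), elementwise_contract])
print(out_shape)  # (8, 256)
```

A mismatched graph, such as feeding a 300-dimensional input into Linear(512, 256), would raise a TypeError at check time rather than failing at runtime.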