PLANNED FOR v0.6.0

This feature is not yet implemented. This page describes the planned execution profile system for AILANG. The --profile flag and formal profile validation are planned for v0.6.0.

Design Document: execution-profiles.md

Current Status: Basic Go codegen works via ailang compile --emit-go, but without formal profile selection.

Execution Profiles — A Unified Architecture

AILANG is not a game scripting language. It's a deterministic state-machine DSL with pluggable effect contexts.

The Key Insight

AILANG's Go codegen work for game simulations (stapledon engine) revealed a deeper truth: the underlying semantics—deterministic world-transition functions + explicit effect contexts—are not game-specific.

They form a general computational model suitable for:

  • Multi-agent environments and games
  • Intelligent agents (LLM-driven or policy-driven)
  • Workflow and state-machine engines
  • Request/response microservices
  • CLI tools and utilities
  • Batch/ETL processing
  • WASM/web simulations

Core Concept: Typed State-Machine DSL

At the semantic level, every AILANG program takes one of three shapes:

Stateful Step Functions (Simulations, Agents)

func step(world: World, input: Input) -> (World, Output) ! {RNG, Debug, AI}

Stateless Handler Functions (Microservices)

pure func handle(req: Request) -> Response ! {AI, Debug}

Entrypoint Functions (CLI Tools)

func main(args: [string]) -> () ! {IO, FS, Env, Debug}

Everything else—ADT pattern matching, recursion, lists, arrays, JSON—is orthogonal and profile-independent.


The Three Execution Profiles

AILANG defines three execution profiles, each with a clear contract and effect budget:

SimProfile — Simulations & Games

Primary use: Games, RL environments, multi-agent sims, workflow engines

Entry Signatures:

func init(seed: int, params: InitParams) -> World ! {Debug}
func step(world: World, input: Input) -> (World, Output) ! {RNG, Debug, AI}

Effect Budget:

Effect   Purpose
RNG      Deterministic PRNG (seeded by host)
Debug    Structured logs and assertions
AI       LLM calls via JSON-in/JSON-out
Time     Virtual time control (future)

Why SimProfile is powerful:

This is exactly the same shape as:

  • RL Gym environments — reset() + step(action) -> (obs, reward)
  • Game loops — init() + update(input) -> state
  • Workflow engines — state machines with explicit transitions
  • Agent swarms — multi-agent simulations
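
Concretely, a host drives a SimProfile module as a plain loop over the generated wrappers. The sketch below is a minimal illustration, assuming the Go backend exposes Init and Step functions mirroring the signatures above; the concrete types and calling convention are not fixed until v0.6.0.

// Hypothetical Go host loop for a SimProfile module.
// Init and Step stand in for the wrappers the backend would generate;
// World, Input, Output, and InitParams mirror the AILANG types.
package main

import "fmt"

type World struct{ Tick int }
type Input struct{ Action string }
type Output struct{ Reward float64 }
type InitParams struct{}

// Stand-ins for the generated wrappers.
func Init(seed int64, params InitParams) World { return World{} }

func Step(w World, in Input) (World, Output) {
    return World{Tick: w.Tick + 1}, Output{}
}

func main() {
    world := Init(42, InitParams{}) // the host supplies the seed, so runs are reproducible
    for i := 0; i < 10; i++ {
        var out Output
        world, out = Step(world, Input{Action: "noop"})
        fmt.Println(world.Tick, out.Reward)
    }
}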

ServiceProfile — Microservices & Tools

Primary use: HTTP handlers, gRPC services, agent tools, request classifiers

Entry Signature:

pure func handle(req: Request) -> Response ! {AI, Debug}

Effect Budget:

Effect   Purpose
AI       LLM or policy model calls
Debug    Logging and assertions
FS       File access (optional, sandboxed)
Env      Environment variables (optional)

Key insight: The AI effect + JSON encoding makes AILANG handlers ideal for cognitive microservices or tools for LLM agents.
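
As a rough picture of the deployment story, the sketch below wraps a generated Handle function in a standard net/http handler. The Request/Response structs, the JSON field names, and the Handle signature are assumptions; only the pure request-to-response shape comes from the profile contract.

// Hypothetical HTTP adapter around a ServiceProfile module.
package main

import (
    "encoding/json"
    "net/http"
)

type Request struct {
    Query string `json:"query"`
}

type Response struct {
    Answer string `json:"answer"`
}

// Stand-in for the generated wrapper; pure, so the same request
// always yields the same response.
func Handle(req Request) Response {
    return Response{Answer: "echo: " + req.Query}
}

func main() {
    http.HandleFunc("/handle", func(w http.ResponseWriter, r *http.Request) {
        var req Request
        if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
            http.Error(w, err.Error(), http.StatusBadRequest)
            return
        }
        json.NewEncoder(w).Encode(Handle(req)) // JSON in, JSON out
    })
    http.ListenAndServe(":8080", nil)
}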


CliProfile — CLI Tools & Utilities

Primary use: Command-line tools, scripts, config transformers

Entry Signature:

func main(args: [string]) -> () ! {IO, FS, Env, Debug}

Effect Budget:

Effect   Purpose
IO       stdout/stderr printing
FS       File reading and writing
Env      CLI args and environment variables
Debug    Assertions and logging
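
A minimal sketch of what the generated entry point might look like for this profile, assuming the backend emits a Main wrapper and host-owned context values. The context types and their wiring are illustrative, not the finalized codegen contract.

// Hypothetical generated main.go for a CliProfile module.
package main

import "os"

// Stand-ins for the host-owned effect contexts.
type IOContext struct{}
type FSContext struct{}
type EnvContext struct{}

// Stand-in for the wrapper the backend would generate around the
// AILANG main(args: [string]) entrypoint.
func Main(args []string, io IOContext, fs FSContext, env EnvContext) {}

func main() {
    // The host wires real stdout/stderr, the real filesystem, and the
    // real environment into the contexts; the compiled code only ever
    // talks to the effect interfaces.
    Main(os.Args[1:], IOContext{}, FSContext{}, EnvContext{})
}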

How Profiles Share One Compiler

All profiles compile through the same pipeline:

Surface → Core → ANF → Effect-Lowered IR → Go

The only difference is what wrapper we generate:

  • SimProfile: Init() + Step() wrappers
  • ServiceProfile: Handle() wrapper
  • CliProfile: Main() wrapper

Everything else (pattern matching, ADTs, arrays, functions) is identical.

This is the key point: Profiles do not fragment the compiler. They only define entry semantics + effect budgets.
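
A rough sketch of what that profile-specific slice could look like inside the backend: one switch over the profile, three wrapper emitters, and everything upstream unchanged. The function names and string-returning emitters below are purely illustrative.

// Hypothetical wrapper-selection step in the Go backend.
package main

import "fmt"

type Module struct{ Name string }

// One emitter per profile; the compiled module itself is shared.
func emitSimWrappers(m Module) string    { return "// Init() + Step() for " + m.Name }
func emitServiceWrapper(m Module) string { return "// Handle() for " + m.Name }
func emitCliWrapper(m Module) string     { return "// Main() for " + m.Name }

func emit(profile string, m Module) string {
    switch profile {
    case "sim":
        return emitSimWrappers(m)
    case "service":
        return emitServiceWrapper(m)
    default:
        return emitCliWrapper(m)
    }
}

func main() {
    fmt.Println(emit("sim", Module{Name: "mymod"}))
}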


Effect Contexts Across Profiles

Effect   SimProfile   ServiceProfile   CliProfile
RNG      ✓
Debug    ✓            ✓                ✓
AI       ✓            ✓                optional
IO       optional     optional         ✓
FS                    optional         ✓
Env                   optional         ✓

Key principle: All effects produce pure trace data (not side effects) which the host interprets.
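
One way to picture this: each effect operation appends a record to a trace, and the host decides afterwards (or concurrently) what each record means. The record shape and the interpret function below are illustrative assumptions, not a defined runtime API.

// Hypothetical trace record produced by effect operations.
package main

import "fmt"

type EffectRecord struct {
    Effect  string // "Debug", "AI", "IO", ...
    Payload string // JSON-encoded arguments
}

// The host interprets the trace: print it, collect it, or drop it.
func interpret(trace []EffectRecord) {
    for _, rec := range trace {
        if rec.Effect == "Debug" {
            fmt.Println("[debug]", rec.Payload)
        }
    }
}

func main() {
    interpret([]EffectRecord{{Effect: "Debug", Payload: `{"msg":"tick"}`}})
}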


Using Profiles

CLI Usage

# Explicit profile selection
ailang compile mymod.ail --profile sim --emit-go
ailang compile tool.ail --profile cli --emit-go
ailang compile service.ail --profile service --emit-go

# Auto-detection from entry function shape
ailang compile mymod.ail --emit-go

Profile Validation

The compiler validates:

  1. Entry function matches profile shape
  2. Only allowed effects are used
  3. Required effects have contexts
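
The effect-budget check (item 2) is essentially set containment: every effect the entry function uses must fall inside the profile's budget. The sketch below illustrates that idea; it is not the compiler's actual implementation, though the budgets mirror the tables earlier on this page.

// Hypothetical effect-budget check.
package main

import "fmt"

var budgets = map[string]map[string]bool{
    "sim":     {"RNG": true, "Debug": true, "AI": true, "Time": true},
    "service": {"AI": true, "Debug": true, "FS": true, "Env": true},
    "cli":     {"IO": true, "FS": true, "Env": true, "Debug": true},
}

func validate(profile string, used []string) error {
    for _, eff := range used {
        if !budgets[profile][eff] {
            return fmt.Errorf("effect %s is outside the %s profile budget", eff, profile)
        }
    }
    return nil
}

func main() {
    fmt.Println(validate("service", []string{"AI", "RNG"})) // RNG is not in the service budget
}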

Go Runtime Structure

The Go backend produces a standard package structure:

<module>/
├── world.go # Types and core structs
├── funcs.go # Compiled AILANG functions
├── effects.go # Effect interface stubs
├── debug.go # Debug effect implementation
├── ai.go # AI effect implementation
├── rng.go # RNG context (SimProfile)
├── step.go # Step/init wrappers (SimProfile)
└── main.go # Entry point (CliProfile only)
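
As a hedged sketch of what effects.go might declare: one small interface per effect, implemented on the host side. The interface names and method signatures below are assumptions chosen to match the effect descriptions above.

// Hypothetical effect interfaces, host-implemented.
package main

import "fmt"

type RNG interface {
    Int(bound int) int // deterministic, derived from the host-supplied seed
}

type Debug interface {
    Log(msg string)
}

type AI interface {
    Complete(requestJSON string) (string, error) // JSON in, JSON out
}

// A trivial host-side Debug context that just prints.
type stdoutDebug struct{}

func (stdoutDebug) Log(msg string) { fmt.Println("[debug]", msg) }

func main() {
    var d Debug = stdoutDebug{}
    d.Log("host-owned context in action")
}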

Future Profiles

AgentProfile (v0.6+)

AILANG as the deterministic "core brain" of tool-using LLM agents:

func decide(state: AgentState, obs: Observation) -> (AgentState, Action) ! {AI, Tools, Debug}

This is the direct evolution of SimProfile into a full agent framework.
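
A sketch of the host-side agent loop this shape implies: Decide proposes an action, the host executes the tool, and the result comes back as the next observation. All Go types, the Decide wrapper, and runTool are illustrative assumptions.

// Hypothetical host loop for an AgentProfile module.
package main

import "fmt"

type AgentState struct{ Memory []string }
type Observation struct{ Text string }
type Action struct{ Tool, Args string }

// Stand-in for the wrapper the backend would generate around decide.
func Decide(s AgentState, obs Observation) (AgentState, Action) {
    s.Memory = append(s.Memory, obs.Text)
    return s, Action{Tool: "noop"}
}

// The host executes tools and feeds results back as observations.
func runTool(a Action) Observation {
    return Observation{Text: "result of " + a.Tool}
}

func main() {
    state := AgentState{}
    obs := Observation{Text: "start"}
    for i := 0; i < 3; i++ {
        var act Action
        state, act = Decide(state, obs)
        obs = runTool(act)
    }
    fmt.Println(state.Memory)
}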

BatchProfile (v0.7+)

For ETL and ML preprocessing jobs:

func run(config: Config, data: [Record]) -> [Result] ! {FS, Debug}

WasmProfile (v0.7+)

SimProfile semantics compiled to WASM for browser simulations.


Strategic Value

Why Go as Host Language?

  • Predictable runtime
  • Static binary linking
  • Excellent embedding story
  • Can call anything (LLMs, FS, network)

Why Effects-as-Contexts?

  • All side effects flow into host-owned contexts
  • AILANG code stays pure and deterministic
  • Reproducibility and safety guaranteed
  • Policy-enforceable boundaries

Why Multiple Profiles?

One language serves:

  • Simulators
  • Game engines
  • Cognitive microservices
  • CLI tooling
  • Agent frameworks

All from one IR and one compiler. No ecosystem fragmentation.


The Analogy

Effects-as-contexts are to AILANG what shared GPU memory is to CUDA.

Before CUDA, GPU threads had no shared memory to coordinate through, which limited what they could do together. CUDA's shared memory gave them a common workspace, and that unlocked a large jump in speed and capability.

AILANG does the same but for cooperating minds.

Effect contexts provide the deterministic, observable substrate for AI agents to coordinate through shared state machines.


Learn More