A Scheme-like Lisp where completions, tool use, and agentic loops are native forms — not string templates bolted onto a scripting language. Implemented in Rust. 400+ builtins. 11 providers.
;; Define a tool the LLM can call
(deftool get-weather
  "Get weather for a city"
  {:city {:type :string}}
  (lambda (city)
    (format "~a: 22°C, sunny" city)))

;; Build an agent with tools
(defagent weather-bot
  {:system "You answer weather questions."
   :tools  [get-weather]
   :model  "claude-haiku-4-5-20251001"})

(agent/run weather-bot
  "What's the weather in Tokyo?")
; => "The weather in Tokyo is 22°C and sunny."

Conversations are persistent values. Prompts compose like any other s-expression. Completions, chat, structured extraction, classification, tool use, and agentic loops: all native forms.
11 providers auto-configured from environment variables. Cost tracking, budgets, and batch processing built in.
;; Simple completion
(llm/complete "Say hello in 5 words"
  {:max-tokens 50})

;; Chat with roles
(llm/chat
  [(message :system "You are helpful.")
   (message :user "What is Lisp?")]
  {:max-tokens 100})

;; Structured extraction
(llm/extract
  {:vendor {:type :string}
   :amount {:type :number}}
  "Coffee $4.50 at Blue Bottle")
; => {:amount 4.5 :vendor "Blue Bottle"}

;; Classification
(llm/classify [:positive :negative :neutral]
  "This product is amazing!")
; => :positive

Define tools with deftool: the LLM sees the schema, calls your Lisp function, and uses the result. Parameters are converted from JSON to Sema values automatically.
defagent combines a system prompt, tools, and a multi-turn loop. The agent calls tools and reasons until it has an answer or hits :max-turns.
;; Define a tool
(deftool lookup-capital
  "Look up the capital of a country"
  {:country {:type :string
             :description "Country name"}}
  (lambda (country)
    (cond
      ((= country "Norway") "Oslo")
      ((= country "France") "Paris")
      (else "Unknown"))))

;; Use tools in chat
(llm/chat
  [(message :user "Capital of Norway?")]
  {:tools [lookup-capital]})

;; Agent with multi-turn loop
(defagent geography-bot
  {:system "You answer geography questions."
   :tools [lookup-capital]
   :max-turns 3})

(agent/run geography-bot "Capital of France?")

A full coding agent in 25 lines. Tools are just lambdas with a schema. The agent loop handles tool dispatch, retries, and conversation management automatically.
Or skip the agent entirely—stream a summary from any file in 5 lines. prompt composes roles as s-expressions, llm/stream prints tokens as they arrive.
;; A coding agent in 25 lines
(deftool read-file
  "Read a file's contents"
  {:path {:type :string}}
  (lambda (path) (file/read path)))

(deftool run-command
  "Run a shell command"
  {:command {:type :string}}
  (lambda (command)
    (define r (shell "sh" "-c" command))
    (string-append (:stdout r) (:stderr r))))

(defagent coder
  {:system "You are a coding assistant.
Read files before editing.
Run tests after changes."
   :tools [read-file run-command]
   :max-turns 10})

(agent/run coder
  "Find all TODO comments in src/")

The simplest possible LLM program: read a file, compose a prompt with roles, stream the response. No boilerplate, no SDK initialization, no async runtime to configure.
prompt is a special form—role symbols like system and user are syntax, not strings. The result is a first-class value you can store, compose, or pass to any LLM function.
;; Summarize any file with streaming
(define text (file/read "article.md"))

(llm/stream
  (prompt
    (system "Summarize concisely.")
    (user text))
  {:max-tokens 500})

Immutable conversation values that accumulate message history. conversation/say sends a message, gets a reply, and returns a new conversation with both appended.
Process collections in parallel with llm/pmap and llm/batch.
;; Persistent conversations
(define c (conversation/new {}))
(define c (conversation/say c
  "Remember: the secret is 7"))
(define c (conversation/say c
  "What is the secret?"))
(conversation/last-reply c)
; => "The secret is 7."

;; Parallel batch processing
(llm/pmap
  (fn (word) (format "Define: ~a" word))
  '("serendipity" "ephemeral")
  {:max-tokens 50})

;; Provider management
(llm/list-providers)   ; => (:anthropic :openai ...)
(llm/set-default :openai)
(llm/set-budget 1.00)  ; $1 spending limit

Generate embeddings with llm/embed and compute cosine similarity with llm/similarity. Supports Jina, Voyage, Cohere, and OpenAI embedding models.
Batch embed multiple texts in a single call. Use with sort-by to build simple semantic search.
;; Generate embeddings
(define v1 (llm/embed "hello world"))
(define v2 (llm/embed "hi there"))
(llm/similarity v1 v2) ; => 0.87

;; Batch embeddings
(llm/embed ["cat" "dog" "fish"])
; => ((...) (...) (...))

;; Simple semantic search
(define query (llm/embed "programming"))
(define docs
  (map (fn (d) (hash-map :text d
                         :vec (llm/embed d)))
       (list "Lisp is great"
             "cooking recipes"
             "Rust programming")))
(sort-by
  (fn (d) (- (llm/similarity query (:vec d))))
  docs)

29 LLM builtins: completion, chat, streaming, tools, agents, embeddings, and more.
Browse LLM Reference →

A Scheme-like core with Clojure-style keywords (:foo), map literals ({:key val}), and vector literals ([1 2 3]). Tail-call optimized via trampoline. Closures, macros, higher-order functions, and a module system, all in a single-threaded evaluator small enough to read in an afternoon.
;; Recursion
(define (factorial n)
  (if (<= n 1) 1 (* n (factorial (- n 1)))))
(factorial 10) ; => 3628800

;; Higher-order functions
(map (lambda (x) (* x x)) (range 1 6))
; => (1 4 9 16 25)
(filter even? (range 1 11))
; => (2 4 6 8 10)
(foldl + 0 (range 1 11))
; => 55

;; Maps: keywords are functions
(define person {:name "Ada" :age 36})
(:name person) ; => "Ada"

;; Closures and composition
(define (compose f g)
  (lambda (x) (f (g x))))
(define inc-then-double
  (compose (lambda (x) (* x 2))
           (lambda (x) (+ x 1))))
(inc-then-double 5) ; => 12

Structured error handling with typed error maps. catch binds an error map with :type, :message, and :stack-trace keys.
defmacro with quasiquote, unquote, and splicing. eval and read for runtime code generation. Inspect expansions with macroexpand.
;; Error handling
(try
  (/ 1 0)
  (catch e
    (println (:message e))
    (:type e))) ; => :eval

(throw {:code 404 :reason "not found"})

;; Macros
(defmacro unless (test . body)
  `(if ,test nil (begin ,@body)))

(unless #f
  (println "this runs!"))

;; Runtime eval
(eval (read "(+ 1 2 3)")) ; => 6

Linked lists, vectors, and ordered maps with a full suite of higher-order operations. Slash-namespaced string functions, file I/O, HTTP client, JSON, regex, shell access, and more.
Keywords in function position act as map accessors. Map bodies auto-serialize as JSON in HTTP requests.
;; Collections
(map + '(1 2 3) '(10 20 30)) ; => (11 22 33)
(filter even? (range 1 11))  ; => (2 4 6 8 10)
(define m {:a 1 :b 2 :c 3})
(assoc m :d 4)               ; => {:a 1 :b 2 :c 3 :d 4}
(map/select-keys m '(:a :c)) ; => {:a 1 :c 3}

;; Strings
(string/split "a,b,c" ",")    ; => ("a" "b" "c")
(string/join '("a" "b") ", ") ; => "a, b"
(string/upper "hello")        ; => "HELLO"

;; Files, HTTP & JSON
(file/write "out.txt" "hello")
(file/read "out.txt") ; => "hello"
(define resp (http/get "https://api.example.com/data"))
(json/decode (:body resp)) ; => {:key "val"}
(shell "ls -la") ; => {:exit-code 0 :stdout "..."}

400+ builtins across 17 modules: math, strings, lists, maps, I/O, HTTP, regex, and more.
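A sketch of the JSON auto-serialization noted above, assuming an http/post builtin that takes an options map and encodes a map-valued :body as JSON (the endpoint and response shape here are illustrative, not confirmed API):

```lisp
;; Assumption: http/post exists alongside http/get and JSON-encodes
;; a Sema map passed as :body, per the auto-serialization note above.
(define resp
  (http/post "https://api.example.com/items"
             {:body {:name "widget" :qty 2}}))
(json/decode (:body resp)) ; response body decoded the same way as http/get
```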
Browse Standard Library Reference →

Trampoline-based evaluator. Deep recursion without stack overflow.
I/O, HTTP, regex, JSON, crypto, CSV, datetime, math, and more.
deftool and defagent as native special forms with multi-turn loops.
Define a schema as a map, get typed data back. llm/extract + llm/classify.
Real-time token streaming with llm/stream. Parallel batch with llm/pmap.
Conversations are immutable values. Fork, extend, inspect message history as data.
File-based modules with import and export. Paths resolve relative to current file.
try / catch / throw with typed error maps and full stack traces.
Per-call and session usage tracking. Budget limits with llm/set-budget.
Anthropic, OpenAI, Gemini, Ollama, Groq, xAI, Mistral, Moonshot, Jina, Voyage, Cohere.
defmacro with quasiquote/unquote/splicing. macroexpand for inspection.
Keywords (:foo), maps ({:k v}), vectors ([1 2]). Keywords as functions.
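The module system can be sketched as two files; import and export are named above, but the exact forms shown here are assumptions:

```lisp
;; math-utils.sema (the export form shown here is an assumption)
(define (square x) (* x x))
(export square)

;; main.sema -- import paths resolve relative to the current file
(import "math-utils.sema")
(square 7)
```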
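Forking falls out of immutability. A sketch using the conversation/say shown earlier: both branches extend the same base history, and neither mutates the other.

```lisp
(define base
  (conversation/say (conversation/new {})
    "We are brainstorming product names."))

;; Two independent forks of the same conversation; base is unchanged.
(define fork-a (conversation/say base "Suggest three playful names."))
(define fork-b (conversation/say base "Suggest three formal names."))

(conversation/last-reply fork-a)
(conversation/last-reply fork-b)
```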
Six Rust crates, one directed dependency graph. No circular dependencies. Single-threaded with Rc, deterministic ordering with BTreeMap.
sema-core: Value types, environment, errors
sema-reader: Lexer and s-expression parser
sema-eval: Trampoline evaluator, special forms, modules
sema-stdlib: Comprehensive standard library builtins
sema-llm: LLM provider trait, multi-provider API clients, streaming
sema: REPL, CLI, file runner
             sema-core
            /    |    \
 sema-reader     |     sema-stdlib
            \    |    /
            sema-eval    sema-llm
                  \        /
                    sema

Get running in one command, or build from source.
$ cargo install \
    --git https://github.com/HelgeSverre/sema \
    sema

$ git clone https://github.com/HelgeSverre/sema
$ cd sema
$ cargo build --release

$ sema                                 # Start the REPL
$ sema script.sema # Run a file
$ sema -e '(+ 1 2)' # Eval expression
$ sema -p '(filter even? (range 10))' # Eval & print
$ sema -l prelude.sema script.sema # Load then run
$ sema --no-llm script.sema # No LLM features