@keeb/ollama

v2026.04.22.3

Ollama LLM integration (generate, batch generate, unload)

Repository

https://github.com/keeb/swamp-ollama

Quality score

How well-documented and verifiable this extension is.

100%

Grade A

  • Has README or module doc: 2/2 earned
  • README has a code example: 1/1 earned
  • README is substantive: 1/1 earned
  • Most symbols documented: 1/1 earned
  • No slow types: 1/1 earned
  • Has description: 1/1 earned
  • At least one platform tag (or universal): 1/1 earned
  • Two or more platform tags (or universal): 1/1 earned
  • License declared: 1/1 earned
  • Verified public repository: 2/2 earned

Install

$ swamp extension pull @keeb/ollama

Security Notice

This extension includes AI agent skills that can modify AI assistant behavior. Review the skill files before installing.

@keeb/ollama v2026.03.28.1 (ollama.ts)

generate
  Send a prompt and input to Ollama, return structured output.
  Arguments:
  • prompt (string): System prompt / instructions
  • input (string): Input to process

unload
  Unload the model from VRAM to free GPU memory.

generate_batch
  Send multiple inputs through the same prompt (factory: one resource per input).
  Arguments:
  • prompt (string): System prompt / instructions
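A minimal sketch of the request shape the generate tool likely sends to Ollama's HTTP API (POST /api/generate). The model name and the mapping of the tool's prompt/input arguments onto Ollama's system/prompt fields are assumptions; the extension's actual wiring is not shown on this page.

```typescript
// Ollama's generate endpoint accepts a JSON body with at least:
// model, prompt, and optionally system and stream.
interface GenerateRequest {
  model: string;
  system: string;  // assumed mapping: the tool's `prompt` argument
  prompt: string;  // assumed mapping: the tool's `input` argument
  stream: boolean; // false to get one JSON response instead of a stream
}

function buildGenerateRequest(prompt: string, input: string): GenerateRequest {
  // "llama3.2" is a placeholder model name, not something this
  // extension is documented to use.
  return { model: "llama3.2", system: prompt, prompt: input, stream: false };
}

// To actually call a local Ollama server on its default port:
// const res = await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildGenerateRequest("Summarize.", "Some text")),
// });
```

For unload, Ollama itself frees a model from VRAM when a generate request is sent with `keep_alive: 0`; whether this extension uses that mechanism is an assumption.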

Resources

result (infinite): LLM generation result
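A sketch of what "factory: one resource per input" implies for generate_batch: one request per input, all sharing the same system prompt, each yielding its own result resource. The request shape and model name are assumptions carried over from Ollama's HTTP API, not taken from this extension's source.

```typescript
// Build one Ollama generate request per input, reusing the shared
// system prompt. Each element would back one "result" resource.
function buildBatch(prompt: string, inputs: string[]) {
  return inputs.map((input) => ({
    model: "llama3.2", // placeholder model name (assumption)
    system: prompt,    // shared instructions for every input
    prompt: input,     // the per-input payload
    stream: false,
  }));
}
```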

ollama: 1 file