@keeb/ollama
v2026.04.22.3
Ollama LLM integration (generate, batch generate, unload)
Repository
https://github.com/keeb/swamp-ollama
Quality score
How well-documented and verifiable this extension is.
Grade A
- Has README or module doc (2/2 earned)
- README has a code example (1/1 earned)
- README is substantive (1/1 earned)
- Most symbols documented (1/1 earned)
- No slow types (1/1 earned)
- Has description (1/1 earned)
- At least one platform tag (or universal) (1/1 earned)
- Two or more platform tags (or universal) (1/1 earned)
- License declared (1/1 earned)
- Verified public repository (2/2 earned)
Install
$ swamp extension pull @keeb/ollama

Security Notice
This extension includes AI agent skills that can modify AI assistant behavior. Review the skill files before installing.
@keeb/ollama v2026.03.28.1 (ollama.ts)
generate: Send a prompt and input to Ollama, return structured output
| Argument | Type | Description |
|---|---|---|
| prompt | string | System prompt / instructions |
| input | string | Input to process |
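The swamp-side call signature is what the table above documents; how the extension talks to Ollama internally is not shown here. As a hedged sketch, a `generate(prompt, input)` call maps naturally onto Ollama's documented REST endpoint `/api/generate`, with the prompt argument as the `system` field and the input as the `prompt` field. The model name and local endpoint below are placeholder assumptions, not values taken from the extension:

```typescript
// Sketch only: how generate(prompt, input) could translate to Ollama's REST
// API. Endpoint and payload fields follow Ollama's documented /api/generate;
// the model name is a placeholder, not the extension's actual configuration.
const OLLAMA_URL = "http://localhost:11434";

interface GenerateRequest {
  model: string;
  system: string; // system prompt / instructions
  prompt: string; // input to process
  stream: boolean;
}

function buildGenerateRequest(prompt: string, input: string): GenerateRequest {
  return { model: "llama3.2", system: prompt, prompt: input, stream: false };
}

async function generate(prompt: string, input: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    body: JSON.stringify(buildGenerateRequest(prompt, input)),
  });
  const data = (await res.json()) as { response: string };
  return data.response; // Ollama returns the completion text in `response`
}
```

With `stream: false`, Ollama returns the whole completion in a single JSON body, which matches the "return structured output" behavior described above.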
unload: Unload the model from VRAM to free GPU memory
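Ollama's REST API unloads a model when a request sets `keep_alive` to `0`, which is the documented way to evict a model from VRAM. A minimal sketch of that mechanism, assuming the default local endpoint and a placeholder model name (the extension's actual implementation may differ):

```typescript
// Sketch only: per Ollama's API, a request with keep_alive: 0 evicts the
// model from memory immediately. Model name is a placeholder.
function buildUnloadRequest(model: string): { model: string; keep_alive: number } {
  return { model, keep_alive: 0 }; // 0 = unload right away, freeing VRAM
}

async function unload(model: string): Promise<void> {
  await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify(buildUnloadRequest(model)),
  });
}
```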
generate_batch: Send multiple inputs through the same prompt (factory: one resource per input)
| Argument | Type | Description |
|---|---|---|
| prompt | string | System prompt / instructions |
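The factory pattern described above, one result resource per input under a shared prompt, can be sketched as a small loop over a `generate`-style helper. The helper signature here mirrors the `generate` method documented earlier; everything else is an illustrative assumption:

```typescript
// Sketch only: generate_batch as a factory that yields one result per input,
// reusing the same system prompt. The generate callback stands in for the
// extension's generate method.
async function generateBatch(
  prompt: string,
  inputs: string[],
  generate: (prompt: string, input: string) => Promise<string>,
): Promise<string[]> {
  // Run sequentially so a single local model is not hit with parallel requests.
  const results: string[] = [];
  for (const input of inputs) {
    results.push(await generate(prompt, input));
  }
  return results;
}
```

Sequential execution is a deliberate choice in this sketch: a local Ollama instance typically serves one model, so queuing inputs avoids contention for the loaded weights.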
Resources
result (infinite): LLM generation result
ollama (1 file)
2026.04.22.2, 6.4 KB, Apr 22, 2026
Ollama LLM integration (generate, batch generate, unload)
Changelog
skills
+ollama
Platforms: linux-x86_64, linux-aarch64, darwin-x86_64, darwin-aarch64
Tags: ollama, llm, ai
2026.04.06.1, 5.1 KB, Apr 7, 2026
Ollama LLM integration (generate, batch generate, unload)
Platforms: linux-x86_64, linux-aarch64, darwin-x86_64, darwin-aarch64
Tags: ollama, llm, ai
2026.04.02.1, 2.8 KB, Apr 2, 2026
Ollama LLM integration (generate, batch generate, unload)
Changelog
Models
~ollama (methods: +unload)
Platforms: linux-x86_64, linux-aarch64, darwin-x86_64, darwin-aarch64
Tags: ollama, llm, ai
2026.03.28.1, 2.4 KB, Mar 29, 2026
Ollama LLM integration (generate, batch generate)
Platforms: linux-x86_64, linux-aarch64, darwin-x86_64, darwin-aarch64
Tags: ollama, llm, ai