Agent Smith

An API to create local-first, human-friendly agents in the browser or Node.js


📚 Read the documentation

Check the 💻 examples

What is an agent?

An agent is an anthropomorphic representation of a bot. It can:

  • Think: use language model servers to perform inference queries
  • Interact: perform interactions with the user and get input and feedback
  • Work: manage long-running jobs with multiple tasks and use custom terminal commands
  • Remember: use transient or semantic memory to store data

Packages

  • @agent-smith/body: the body
  • @agent-smith/brain: the brain
  • @agent-smith/jobs: jobs
  • @agent-smith/tmem: transient memory
  • @agent-smith/tmem-jobs: jobs transient memory
  • @agent-smith/smem: semantic memory
  • @agent-smith/tfm: templates for models
  • @agent-smith/lmtask: Yaml model task
  • @agent-smith/cli: terminal client
  • @agent-smith/feat-git: Git features
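
The packages are published on npm. For the Node.js example below, the main entry point is the brain package (the install command is an assumption based on the published package names):

npm install @agent-smith/brain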

Philosophy

  • Composable: the packages have limited responsibilities and can work together
  • Declarative: focus on the business logic by expressing features simply
  • Explicit: keep it simple and under user control, with no hidden magic

FAQ

  • What local or remote inference servers can I use?

Currently it works with Llama.cpp, Koboldcpp and Ollama.

It also works in the browser, using GPU-only inference and small models.

  • Can I use this with OpenAI or other big apis?

Sorry, no: this library favours local-first or private remote inference servers.

Example

Terminal client

Generate a commit message in a git repository (using the @agent-smith/feat-git plugin):

lm commit .
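
The lm command is provided by the @agent-smith/cli package; assuming a standard global install from npm:

npm install -g @agent-smith/cli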

Node.js

// the composables are assumed to be exported by the brain package
import { useLmBackend, useLmExpert, useAgentBrain } from "@agent-smith/brain";

const backend = useLmBackend({
    name: "koboldcpp",
    localLm: "koboldcpp",
    onToken: (t) => process.stdout.write(t),
});

// templateName and modelName must be set to a prompt template
// and a model available on the inference server
const expert = useLmExpert({
    name: "koboldcpp",
    backend: backend,
    template: templateName,
    model: { name: modelName, ctx: 2048 },
});
const brain = useAgentBrain([expert]);

console.log("Auto discovering brain backend ...");
await brain.init();
brain.ex.checkStatus();
if (brain.ex.state.get().status != "ready") {
    throw new Error("The expert's backend is not ready");
}
// run an inference query
const _prompt = "list the planets of the solar system";
await brain.think(_prompt, {
    temperature: 0.2,
    min_p: 0.05,
});
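
The onToken callback on the backend receives each streamed token; a minimal variation (illustrative only) that buffers the answer in a string instead of writing it to stdout:

let answer = "";
const bufferedBackend = useLmBackend({
    name: "koboldcpp",
    localLm: "koboldcpp",
    // accumulate the streamed tokens instead of printing them
    onToken: (t) => { answer += t },
});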

Libraries

Powered by:

  • Nanostores for state management and reactive variables
  • Locallm for inference API server management
  • Modprompt for prompt template management
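
Because the reactive variables are Nanostores stores, they can be observed directly; a minimal sketch, assuming brain.ex.state behaves like a standard Nanostores store:

// log the expert's status whenever its state changes
brain.ex.state.listen((state) => {
    console.log("expert status:", state.status);
});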
