# `LangchainPrompt.Adapter`
[🔗](https://github.com/exfoundry/langchain_prompt/blob/v0.1.0/lib/langchain_prompt/adapter.ex#L1)

Behaviour for connecting `LangchainPrompt` to a Large Language Model.

Implement this behaviour to add support for any LLM provider not covered by
the built-in adapters.

## Built-in adapters

- `LangchainPrompt.Adapters.Langchain` — delegates to any
  [elixir-langchain](https://hex.pm/packages/langchain) chat model.
- `LangchainPrompt.Adapters.Test` — in-process adapter for ExUnit tests;
  records calls and supports on-demand failure simulation.

## Custom adapter example

    defmodule MyApp.Adapters.OpenAIDirect do
      @behaviour LangchainPrompt.Adapter

      @impl true
      def chat(messages, opts) do
        # Build the provider request from `messages` and `opts`, call the
        # API, and return {:ok, %LangchainPrompt.Message{}} or {:error, reason}.
      end
    end
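The skeleton above can be fleshed out into a runnable sketch. To keep the snippet self-contained, it stubs minimal stand-ins for `LangchainPrompt.Message` and the behaviour itself (both assumptions; in an application you would use the library's own modules), and the `Echo` adapter is a deterministic placeholder rather than a real provider integration:

```elixir
# Self-contained sketch: LangchainPrompt.Message and LangchainPrompt.Adapter
# below are minimal stand-ins so the example compiles outside the library.
# In an application, depend on the library's own modules instead.
defmodule LangchainPrompt.Message do
  @type t :: %__MODULE__{role: atom(), content: String.t()}
  defstruct [:role, :content]
end

defmodule LangchainPrompt.Adapter do
  @type response :: {:ok, LangchainPrompt.Message.t()} | {:error, any()}
  @callback chat([LangchainPrompt.Message.t()], map() | keyword()) :: response
end

defmodule MyApp.Adapters.Echo do
  @behaviour LangchainPrompt.Adapter

  # Deterministic placeholder: replies with the last message's content reversed.
  @impl true
  def chat(messages, _opts) do
    case List.last(messages) do
      %LangchainPrompt.Message{content: content} when is_binary(content) ->
        {:ok, %LangchainPrompt.Message{role: :assistant, content: String.reverse(content)}}

      _ ->
        {:error, :empty_conversation}
    end
  end
end
```

Because the adapter is pure, it can be exercised directly without any network access, which is the same property that makes in-process test adapters convenient.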

# `response`

```elixir
@type response() :: {:ok, LangchainPrompt.Message.t()} | {:error, any()}
```

# `chat`

```elixir
@callback chat(messages :: [LangchainPrompt.Message.t()], opts :: map() | keyword()) ::
  response()
```
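Callers consume the callback's result by matching on `response()`. A minimal sketch, where the anonymous function is a hypothetical stand-in for a module's `chat/2`:

```elixir
# Hypothetical stand-in for an adapter's chat/2 (always fails, for illustration).
chat = fn _messages, _opts -> {:error, :rate_limited} end

result =
  case chat.([], []) do
    {:ok, message} -> {:reply, message}
    {:error, reason} -> {:retry, reason}
  end

# result is {:retry, :rate_limited}
```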

---

*Consult [api-reference.md](api-reference.md) for the complete listing.*
