# `LangchainPrompt.Adapters.Langchain`
[🔗](https://github.com/exfoundry/langchain_prompt/blob/v0.1.0/lib/langchain_prompt/adapters/langchain.ex#L1)

Adapter that delegates to any [elixir-langchain](https://hex.pm/packages/langchain)
chat model.

Requires the `:langchain` dependency (`~> 0.7`).
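If the dependency is not already present, it can be added to `mix.exs` in the usual way (a minimal sketch; only the `:langchain` entry is implied by this adapter):

    defp deps do
      [
        {:langchain, "~> 0.7"}
      ]
    end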

## Profile opts

Pass the following keys in `Profile.opts`:

| Key            | Required | Description                                      |
|----------------|----------|--------------------------------------------------|
| `:chat_module` | yes      | A `LangChain.ChatModels.*` module                |
| `:model`       | yes      | Model name string                                |
| any other      | no       | Forwarded as-is to `chat_module.new/1`           |
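Conceptually, the forwarding described above amounts to popping `:chat_module` off the opts map and handing the remainder to that module's `new/1`. A hypothetical sketch (the variable names and the `Map.pop/2` step are illustrative, not the adapter's actual implementation):

    # Everything except :chat_module is forwarded as-is to chat_module.new/1.
    opts = %{
      chat_module: LangChain.ChatModels.ChatGoogleAI,
      model: "gemini-2.0-flash",
      temperature: 0.1
    }

    {chat_module, model_opts} = Map.pop(opts, :chat_module)
    # chat_module => LangChain.ChatModels.ChatGoogleAI
    # model_opts  => %{model: "gemini-2.0-flash", temperature: 0.1}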

## Examples

    # Google AI
    %Profile{
      adapter: LangchainPrompt.Adapters.Langchain,
      opts: %{
        chat_module: LangChain.ChatModels.ChatGoogleAI,
        model: "gemini-2.0-flash",
        temperature: 0.1
      }
    }

    # OpenAI-compatible (Deepseek, Grok, Mistral, Ollama, …)
    %Profile{
      adapter: LangchainPrompt.Adapters.Langchain,
      opts: %{
        chat_module: LangChain.ChatModels.ChatOpenAI,
        model: "deepseek-chat",
        endpoint: "https://api.deepseek.com/chat/completions",
        api_key: System.get_env("DEEPSEEK_API_KEY")
      }
    }

    # Anthropic
    %Profile{
      adapter: LangchainPrompt.Adapters.Langchain,
      opts: %{
        chat_module: LangChain.ChatModels.ChatAnthropic,
        model: "claude-sonnet-4-6"
      }
    }

---

*Consult [api-reference.md](api-reference.md) for the complete listing.*
