# `mix hf.model_info`
[🔗](https://github.com/huggingface/huggingface_client/blob/v0.1.0/lib/mix/tasks/hf.model_info.ex#L1)

Fetches metadata for a model from the HuggingFace Hub and displays it,
including all available inference providers.

    $ mix hf.model_info meta-llama/Llama-3.1-8B-Instruct

    Model:         meta-llama/Llama-3.1-8B-Instruct
    Task:          text-generation
    Library:       transformers
    Downloads:     1_234_567
    Likes:         8_901
    Gated:         false
    Private:       false

    Available providers (status: live)
    ──────────────────────────────────
    groq             conversational   llama-3.1-8b-instant
    together         conversational   meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo
    nebius           conversational   meta-llama/Meta-Llama-3.1-8B-Instruct
    ...
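
Under the hood this kind of lookup corresponds to the Hub's public model-info
endpoint. A minimal sketch with curl — the endpoint and the
`expand[]=inferenceProviderMapping` query parameter come from the public
Hub API, not from this library, so treat the exact shape as an assumption:

```shell
# Build the Hub API URL for a model; the inferenceProviderMapping expand
# is what surfaces the provider table shown above (assumption: this task
# uses the same endpoint).
model_id="meta-llama/Llama-3.1-8B-Instruct"
url="https://huggingface.co/api/models/${model_id}?expand[]=inferenceProviderMapping"
echo "$url"

# Gated or private models additionally need a bearer token:
#   curl -sH "Authorization: Bearer $HF_TOKEN" "$url"
```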

## Options

    --token TOKEN    HuggingFace access token (or set HF_TOKEN env var)
    --json           Output as JSON
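
The `--json` flag makes the task composable with tools like jq. The field
names below (`pipeline_tag`, `downloads`) are a guess that the JSON output
mirrors the Hub API payload; check your actual output before relying on them:

```shell
# Hypothetical sample of what --json might emit (actual fields may differ):
payload='{"id":"meta-llama/Llama-3.1-8B-Instruct","pipeline_tag":"text-generation","downloads":1234567}'

# In practice you would pipe the task directly:
#   mix hf.model_info meta-llama/Llama-3.1-8B-Instruct --json | jq -r '.pipeline_tag'
echo "$payload" | jq -r '.pipeline_tag'
```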

---

*See [api-reference.md](api-reference.md) for the complete listing.*
