LazyDoc.Providers.GithubAi (LazyDoc v0.6.2)
The module LazyDoc.Providers.GithubAi provides an interface for interacting with the Github AI API, facilitating access to AI-powered code generation and completion functionalities.
Description
This module implements a provider behaviour for the LazyDoc library, tailored for making requests to the Github AI API. It lets users send prompts to the API and retrieve generated responses for a given model and set of parameters. Its responsibilities include constructing requests with the necessary headers and parameters, handling the responses returned by the API, and validating the parameters used in requests. It also maps symbolic model identifiers to concrete model names, making the module a flexible tool for leveraging AI in documentation and development workflows.
Summary
Functions
Returns a boolean indicating whether all provided parameters are valid.
Returns the content of the message from the response.
Returns the corresponding model name based on the provided model symbol.
Returns a request configuration for querying a model with the specified parameters.
Returns the result of a request made with a prompt using a specified model and token.
Functions
Returns a boolean indicating whether all provided parameters are valid.
Parameters
- params - a list of key-value pairs representing parameters to be checked.
Description
Validates the parameters against a predefined list of acceptable keys.
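As a sketch of this kind of validation (the actual list of acceptable keys is internal to LazyDoc.Providers.GithubAi; the keys `:max_tokens`, `:temperature`, and `:top_p` below are assumptions based on the parameters mentioned later in this page):

```elixir
defmodule ParamCheckSketch do
  # Hypothetical allowed-key list; the real provider defines its own.
  @allowed_keys [:max_tokens, :temperature, :top_p]

  # Returns true only when every key in the keyword list is allowed.
  def check_parameters?(params) do
    Enum.all?(params, fn {key, _value} -> key in @allowed_keys end)
  end
end

ParamCheckSketch.check_parameters?(max_tokens: 500, temperature: 0.7)
#=> true
ParamCheckSketch.check_parameters?(seed: 42)
#=> false
```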
@spec get_docs_from_response(Req.Response.t()) :: binary()
Returns the content of the message from the response.
Parameters
- response - a %Req.Response{} struct containing the body of the response, which includes the message.
Description
Extracts the message content from the response body.
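A hedged usage sketch: the body shape below assumes an OpenAI-style chat-completion payload (`"choices"` → `"message"` → `"content"`), which is an assumption about the Github AI response format rather than something stated on this page.

```elixir
# Hypothetical response; the body structure is an assumption.
response = %Req.Response{
  status: 200,
  body: %{"choices" => [%{"message" => %{"content" => "Generated docs"}}]}
}

# Extracts the message content as a binary, per the @spec above.
LazyDoc.Providers.GithubAi.get_docs_from_response(response)
```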
Returns the corresponding model name based on the provided model symbol.
Parameters
- model - symbol representing the model type (:codestral, :gpt_4o, or :gpt_4o_mini).
Description
Returns the name of the model as a string based on the input symbol.
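A minimal sketch of such a mapping; the exact model-name strings are assumptions, not taken from this page:

```elixir
defmodule ModelNameSketch do
  # Hypothetical symbol-to-name mapping; the strings returned by the
  # real model/1 may differ.
  def model(:codestral), do: "Codestral-2501"
  def model(:gpt_4o), do: "gpt-4o"
  def model(:gpt_4o_mini), do: "gpt-4o-mini"
end
```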
Returns a request configuration for querying a model with the specified parameters.
Parameters
- prompt - The input text that will guide the model's response.
- model - The identifier of the model to use for generating the output.
- token - The authentication token for accessing the model's API.
- params - Optional parameters such as temperature, top_p, and max_tokens to fine-tune the model's output.
Description
Constructs a request to the model API with specified settings and configurations.
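A sketch of how such a request configuration could be assembled with `Req`. The endpoint URL, header names, and body shape are assumptions about the Github AI API, not taken from this page:

```elixir
defmodule RequestSketch do
  # Builds a Req request struct for a chat-completion call.
  # URL and payload layout are hypothetical.
  def request(prompt, model, token, params \\ []) do
    Req.new(
      method: :post,
      url: "https://models.inference.ai.azure.com/chat/completions",
      headers: [
        {"authorization", "Bearer #{token}"},
        {"content-type", "application/json"}
      ],
      json:
        Map.merge(
          %{model: model, messages: [%{role: "user", content: prompt}]},
          # Optional tuning parameters such as temperature or max_tokens.
          Map.new(params)
        )
    )
  end
end
```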
@spec request_prompt(binary(), binary(), binary(), keyword()) :: {:ok, Req.Response.t()} | {:error, Exception.t()}
Returns the result of a request made with a prompt using a specified model and token.
Parameters
- prompt - The input text or question to be sent in the request.
- model - The model that will process the input prompt.
- token - The authentication token required to access the API.
- params - Optional parameters for customizing the request.
Description
Performs a request to the API with the provided prompt and model.
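A hypothetical call matching the `@spec` above; the model string, the `:max_tokens` key, and the environment-variable name are placeholders, not values confirmed by this page:

```elixir
# Send a prompt and extract the generated text on success.
case LazyDoc.Providers.GithubAi.request_prompt(
       "Write docs for this function.",
       "gpt-4o-mini",
       System.get_env("GITHUB_TOKEN"),
       max_tokens: 500
     ) do
  {:ok, %Req.Response{} = resp} ->
    LazyDoc.Providers.GithubAi.get_docs_from_response(resp)

  {:error, exception} ->
    raise exception
end
```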