Llama 3.x LLM agent with example
Created by: jrzkaminski
Add Llama31ChatModel and Structured Chat Agent as example implementation
Description
This PR introduces a new module that implements a custom Llama31ChatModel class and sets up a structured chat agent using LangChain. The module integrates with the Llama 3.1 language model, enabling users to create chat agents backed by custom LLMs.
Key Components:
- Custom LLM Class (`Llama31ChatModel`):
  - Inherits from `BaseChatModel`.
  - Interfaces with the Llama 3.1 (3.2 is compatible too) API for generating responses.
  - Handles message formatting and API requests.
  - Includes proper logging with `_logger` to trace requests and responses.
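As a rough sketch of the message-formatting step inside such a class, the snippet below builds an OpenAI-style chat-completions payload. The role names, payload shape, and helper names are assumptions for illustration, not confirmed details of the Llama 3.1 API:

```python
def format_messages(messages):
    """Convert (role, content) pairs into an OpenAI-style message list.

    The roles ("system", "user", "assistant") and the dict shape are
    assumptions about the Llama 3.1 API; adjust if the real API differs.
    """
    return [{"role": role, "content": content} for role, content in messages]


def build_request_payload(model, messages, temperature=0.0):
    """Assemble the JSON body for a chat-completions style request."""
    return {
        "model": model,
        "messages": format_messages(messages),
        "temperature": temperature,
    }


payload = build_request_payload(
    "YOUR_MODEL_NAME",
    [("system", "You are a helpful agent."), ("user", "What is 2 + 3?")],
)
```

In the actual class this payload would be sent to the configured base URL with the API key attached, and the response text wrapped back into a LangChain chat message.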
- Tool Definitions:
  - Implements two tools using the `@tool` decorator:
    - `add_numbers(a: int, b: int) -> int`: Adds two numbers.
    - `multiply_numbers(a: int, b: int) -> int`: Multiplies two numbers.
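Stripped of the LangChain `@tool` decorator, the two tools reduce to plain functions like this sketch (in the module each one is wrapped with `@tool`, so its name, signature, and docstring are exposed to the agent):

```python
def add_numbers(a: int, b: int) -> int:
    """Adds two numbers."""
    return a + b


def multiply_numbers(a: int, b: int) -> int:
    """Multiplies two numbers."""
    return a * b
```

The docstrings matter: LangChain uses them as the tool descriptions the agent reads when deciding which tool to call.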
- Prompt Templates:
  - System Prompt:
    - Provides the agent with instructions and available tools.
    - Guides the agent on how to format actions and responses using JSON blobs.
  - Human Prompt:
    - Accepts user input.
    - Includes `{agent_scratchpad}` to incorporate the agent's intermediate reasoning.
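The JSON-blob convention means the model answers with a small JSON object naming an action and its input. A sketch of the kind of blob the system prompt asks for, parsed with the standard library (the exact field names follow LangChain's structured-chat format):

```python
import json

# Example of the JSON blob the system prompt instructs the model to emit:
# "action" names a tool (or "Final Answer"), "action_input" carries its arguments.
raw_reply = """
{
  "action": "add_numbers",
  "action_input": {"a": 2, "b": 3}
}
"""

blob = json.loads(raw_reply)
```

LangChain's own output parser does this extraction (including stripping the surrounding markdown fence the model usually adds) before dispatching to the named tool.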
- Agent and AgentExecutor Setup:
  - Uses `create_structured_chat_agent` to create an agent with the custom LLM and tools.
  - Configures the `AgentExecutor` to manage the agent's execution flow.
  - Handles prompt variable management and ensures compatibility with LangChain's requirements.
- Error Handling and Fixes:
  - Addressed issues related to the `agent_scratchpad` variable by including it correctly in the prompt templates.
  - Fixed attribute naming inconsistencies (`self.logger` vs. `self._logger`) in the custom LLM class.
  - Updated the code for compatibility with the latest version of LangChain, resolving deprecation warnings and method updates.
Notes
- Users must replace the placeholder API credentials (`"YOUR_API_KEY"`, `"YOUR_BASE_URL"`, `"YOUR_MODEL_NAME"`) with actual values to use the Llama 3.1 API.
- The module assumes that the Llama 3.1 API endpoints and payload structures match those used in the code. Adjustments may be necessary if the API differs.
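One way to avoid hardcoding the placeholders is to read credentials from the environment. A small sketch — the `LLAMA_*` variable names are made up for illustration and are not part of the module:

```python
import os

# Placeholder values from the module; they must be replaced before real use.
PLACEHOLDERS = {
    "api_key": "YOUR_API_KEY",
    "base_url": "YOUR_BASE_URL",
    "model": "YOUR_MODEL_NAME",
}


def load_llama_config(env=None):
    """Pull credentials from environment variables, falling back to placeholders.

    The LLAMA_* variable names are illustrative, not defined by the module.
    """
    env = os.environ if env is None else env
    return {
        "api_key": env.get("LLAMA_API_KEY", PLACEHOLDERS["api_key"]),
        "base_url": env.get("LLAMA_BASE_URL", PLACEHOLDERS["base_url"]),
        "model": env.get("LLAMA_MODEL_NAME", PLACEHOLDERS["model"]),
    }
```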
Conclusion
This PR adds valuable functionality by demonstrating how to integrate a custom LLM with LangChain's structured chat agents. It provides a solid foundation for further development and customization, allowing for more sophisticated agents tailored to specific use cases. Code improvements, further generalization, contributions to the PR, and suggestions are all very welcome.