lightbench.loaders#
The lightbench.loaders package contains modules for loading and interfacing with various model backends, including OpenAI, Mistral, and local Hugging Face models such as LLaMA. It also provides a shared result type (Generation) and a common loader interface (LLMServiceLoader).
Loaders#
- class lightbench.loaders.generation.Generation(response, inference_time=0, ttft=0, peak_memory_usage=0, avg_power_usage=0)#
Bases: object
Container for a single generation result: the model response together with performance metrics (inference time, time to first token, peak memory usage, and average power usage). A short usage sketch follows below.
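The sketch below shows how a Generation result might be consumed after a call to a loader's generate(). The attribute names and the units noted in the comments are assumptions inferred from the constructor parameters, not guaranteed by the API.

```python
# Hypothetical sketch of consuming a Generation result.
# Assumption: constructor arguments are stored as attributes of the same name.
from lightbench.loaders.generation import Generation

gen = Generation(
    response="Hello, world!",
    inference_time=1.8,       # assumed to be seconds
    ttft=0.35,                # time to first token, assumed seconds
    peak_memory_usage=2048,   # assumed MB
    avg_power_usage=120,      # assumed watts
)

print(gen.response)
print(f"latency: {gen.inference_time}s, ttft: {gen.ttft}s")
```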
- class lightbench.loaders.loader.LLMServiceLoader#
Bases: ABC
Abstract base class for LLM loaders. Provides a common interface for generating text and cleaning up resources. A minimal subclass sketch follows the method list below.
- abstractmethod cleanup()#
Perform cleanup of model resources and release memory.
- abstractmethod generate(prompt, max_tokens: int) → str#
Generate text based on the given prompt and max tokens.
- abstractmethod is_local() → bool#
Returns true if using a local model, false if using an API.
- abstractmethod name() → str#
Returns the name of the model.
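The following is a minimal sketch of a custom loader that implements this interface. The EchoLoader class and its echo behavior are purely illustrative and not part of lightbench; the sketch only assumes the four abstract methods documented above.

```python
# Illustrative subclass of LLMServiceLoader (not shipped with lightbench).
from lightbench.loaders.loader import LLMServiceLoader


class EchoLoader(LLMServiceLoader):
    """Toy loader that echoes the prompt back; handy for wiring tests."""

    def generate(self, prompt, max_tokens: int) -> str:
        # A real loader would call a model or API here; truncating by
        # characters stands in for a max_tokens limit in this sketch.
        return prompt[:max_tokens]

    def cleanup(self):
        # Nothing to release for this toy loader.
        pass

    def is_local(self) -> bool:
        return True

    def name(self) -> str:
        return "echo"


loader = EchoLoader()
print(loader.name(), loader.generate("ping", max_tokens=4))
loader.cleanup()
```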
- class lightbench.loaders.huggingface_model_loader.HFModelLoader(model_name: str, quantize: bool = False)#
Bases: LLMServiceLoader
Loader for Hugging Face models run locally, with optional quantization. A usage sketch follows the method list below.
- cleanup()#
Perform cleanup of model resources and release memory.
- generate(prompt, max_tokens: int = 512)#
Generate text based on the given prompt and max tokens.
- is_local()#
Returns true if using a local model, false if using an API.
- is_quantized()#
Returns true if the model was loaded with quantization, false otherwise.
- name()#
Returns the name of the model.
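A usage sketch for HFModelLoader. The model name is only an example and assumes the corresponding weights are available locally or downloadable; the try/finally around cleanup() is a suggested pattern rather than a library requirement.

```python
# Hypothetical usage of HFModelLoader; the model name is just an example.
from lightbench.loaders.huggingface_model_loader import HFModelLoader

loader = HFModelLoader("meta-llama/Llama-3.1-8B-Instruct", quantize=True)
try:
    result = loader.generate("Summarize the plot of Hamlet.", max_tokens=256)
    print(loader.name(), "local:", loader.is_local(), "quantized:", loader.is_quantized())
    print(result)  # the return type is not annotated here; printed as-is
finally:
    loader.cleanup()  # release model weights / GPU memory
```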
- class lightbench.loaders.mistral_loader.MistralLoader(model_name: str)#
Bases: LLMServiceLoader
Loader that generates text through the Mistral API. A usage example follows the method list below.
- cleanup()#
Perform cleanup of model resources and release memory.
- generate(prompt, max_tokens: int = 512, temperature: float = 0.7, safe_prompt: bool = False) → Generation#
Generate text based on the given prompt and max tokens.
- is_local()#
Returns true if using a local model, false if using an API.
- name()#
Returns the name of the model.
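A usage sketch for MistralLoader. It assumes Mistral API credentials are already configured for the underlying client, and that the returned Generation exposes its constructor arguments as attributes.

```python
# Hypothetical usage of MistralLoader; assumes Mistral API credentials are
# configured for the underlying client (e.g. via an environment variable).
from lightbench.loaders.mistral_loader import MistralLoader

loader = MistralLoader("mistral-small-latest")
gen = loader.generate(
    "Write a haiku about benchmarking.",
    max_tokens=128,
    temperature=0.2,
    safe_prompt=True,
)
print(gen.response)         # Generation.response is assumed to hold the text
print(gen.inference_time)   # timing metric recorded by the loader (assumed)
loader.cleanup()
```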
- class lightbench.loaders.openai_loader.OpenAILoader(model_name: str)#
Bases: LLMServiceLoader
Loader that generates text through the OpenAI API. An example follows the method list below.
- cleanup()#
Perform cleanup of model resources and release memory.
- generate(prompt, max_tokens: int = 512) → Generation#
Generate text based on the given prompt and max tokens.
- is_local()#
Returns true if using a local model, false if using an API.
- name()#
Returns the name of the model.
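A usage sketch for OpenAILoader. It assumes an OpenAI API key is available to the underlying client, and that the returned Generation exposes the response text via its response attribute.

```python
# Hypothetical usage of OpenAILoader; assumes an OpenAI API key is available
# to the underlying client (e.g. via an environment variable).
from lightbench.loaders.openai_loader import OpenAILoader

loader = OpenAILoader("gpt-4o-mini")
gen = loader.generate("Explain what a loader abstraction is for.", max_tokens=200)
print(loader.name(), "local:", loader.is_local())  # an API-backed loader
print(gen.response)  # Generation.response is assumed to hold the text
loader.cleanup()
```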