Call all Batch APIs using the OpenAI format [OpenAI, Anthropic, Azure OpenAI, Vertex AI]


LangBatch is a Python library for large scale AI generation using batch APIs from providers like OpenAI, Azure OpenAI, GCP Vertex AI, etc.

Utilize Batch APIs for:

  • Requests that don't require immediate responses
  • Lower cost (usually a 50% discount)
  • Higher rate limits
  • Example use cases: data processing pipelines, evaluations, classifying large datasets, creating embeddings for large text corpora

Key Features

  • Unified API to access Batch APIs from different providers.
  • Standardized OpenAI format for requests and responses
  • Utilities for handling the complete lifecycle of a batch job: creating, starting, monitoring, retrying, and processing completed batches
  • Convert incoming requests into batch jobs

Installation

PyPI:

pip install langbatch

Alternatively, from source:

pip install git+https://github.com/EasyLLM/langbatch

Find the complete Installation guide here.

Quickstart

Here are the three main lines needed to start a batch job:

from langbatch import OpenAIChatCompletionBatch
batch = OpenAIChatCompletionBatch("openai_chat_completion_requests.jsonl")
batch.start()
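
The requests file holds one JSON request per line. Below is a minimal sketch of how such a file might be produced, assuming the OpenAI Batch API line layout (custom_id, method, url, body) and a placeholder model name; check the documentation for the authoritative format.

import json

# Incoming user messages in the standard OpenAI chat format.
messages = [
    {"role": "user", "content": "Summarize the plot of Hamlet in one sentence."},
    {"role": "user", "content": "Translate 'good morning' to French."},
]

# Assumed line layout: OpenAI Batch API style (custom_id / method / url / body).
with open("openai_chat_completion_requests.jsonl", "w") as f:
    for i, message in enumerate(messages):
        line = {
            "custom_id": f"request-{i}",
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {"model": "gpt-4o-mini", "messages": [message]},
        }
        f.write(json.dumps(line) + "\n")

The resulting file path is then passed to OpenAIChatCompletionBatch as shown above.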

Check the status of the batch and get the results:

if batch.get_status() == "completed":
    results, _ = batch.get_results()
    for result in results:
        print(f"Custom ID: {result['custom_id']}")
        print(f"Content: {result['choices'][0]['message']['content']}")

Find the complete Get Started guide here.

🫂 Community

If you want to get more involved with LangBatch, check out our Discord server.

Contributors

+----------------------------------------------------------------------------+
|     +----------------------------------------------------------------+     |
|     | Developers: Those who build with `langbatch`.                  |     |
|     | (You have `import langbatch` somewhere in your project)        |     |
|     |     +----------------------------------------------------+     |     |
|     |     | Contributors: Those who make `langbatch` better.   |     |     |
|     |     | (You make PR to this repo)                         |     |     |
|     |     +----------------------------------------------------+     |     |
|     +----------------------------------------------------------------+     |
+----------------------------------------------------------------------------+

We welcome contributions from the community! Whether it's bug fixes, feature additions, or documentation improvements, your input is valuable.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request
