
chatapi-toolkit 2.0.0


Description:

chatapi-toolkit 2.0.0

For the Chinese documentation, see here.

Chatapi Toolkit

A Python wrapper for ChatAPI Toolkit, supporting multi-turn dialogue, proxy settings, and asynchronous data processing.
Installation
pip install chatapi-toolkit --upgrade

Usage
Set API Key and Base URL
Method 1: set them in your Python code:
import chatapi_toolkit
chatapi_toolkit.api_key = "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
chatapi_toolkit.base_url = "https://api.example.com"

Method 2: set environment variables in ~/.bashrc or ~/.zshrc:
export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_BASE_URL="https://api.example.com"
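
To check that the variables are visible to Python, you can read them back with the standard library. This is only a quick sanity check, not part of the package itself:

import os

# Both calls return None if the variable is not exported in the current shell.
print("API key set:", os.getenv("OPENAI_API_KEY") is not None)
print("Base URL:", os.getenv("OPENAI_BASE_URL"))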

Examples
Example 1: simulate a multi-turn dialogue:
from chatapi_toolkit import Chat

# first chat
chat = Chat("Hello, GPT-3.5!")
resp = chat.getresponse()

# continue the chat
chat.user("How are you?")
next_resp = chat.getresponse()

# add response manually
chat.user("What's your name?")
chat.assistant("My name is GPT-3.5.")

# save the chat history
chat.save("chat.json", mode="w") # default to "a"

# print the chat history
chat.print_log()
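
If you want to inspect the saved history outside the package, the file can be read with the standard library. This assumes save with mode="w" wrote a single JSON document; the exact on-disk layout is the package's own:

import json

# Load the history written by chat.save("chat.json", mode="w") above.
with open("chat.json") as f:
    history = json.load(f)
print(history)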

Example 2: process data in batches, using a checkpoint file:
# write a function to process the data
from chatapi_toolkit import Chat, process_chats

def msg2chat(msg):
    chat = Chat()  # uses the API key configured above
    chat.system("You are a helpful translator for numbers.")
    chat.user(f"Please translate the digit to Roman numerals: {msg}")
    chat.getresponse()
    return chat

checkpoint = "chat.jsonl"
msgs = ["%d" % i for i in range(1, 10)]
# process the data
chats = process_chats(msgs[:5], msg2chat, checkpoint, clearfile=True)
# process the remaining data, reading the cache from the previous run
continue_chats = process_chats(msgs, msg2chat, checkpoint)
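
For intuition, the checkpoint mechanism can be sketched as a plain JSONL loop. This is a simplified stand-in, not the package's actual process_chats implementation, and it assumes the handler returns a JSON-serializable result:

import json, os

def process_with_checkpoint(msgs, handler, checkpoint, clearfile=False):
    # Illustrative only: one JSON record per line; already-processed entries are skipped on rerun.
    if clearfile and os.path.exists(checkpoint):
        os.remove(checkpoint)
    done = 0
    if os.path.exists(checkpoint):
        with open(checkpoint) as f:
            done = sum(1 for _ in f)  # count finished records
    with open(checkpoint, "a") as f:
        for msg in msgs[done:]:
            f.write(json.dumps({"msg": msg, "result": handler(msg)}) + "\n")
            f.flush()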

Example 3: process data in batches asynchronously, printing "hello" in different languages using two coroutines:
from chatapi_toolkit import async_chat_completion, load_chats

langs = ["python", "java", "Julia", "C++"]
chatlogs = ["print hello using %s" % lang for lang in langs]
async_chat_completion(chatlogs, chkpoint="async_chat.jsonl", ncoroutines=2)
chats = load_chats("async_chat.jsonl")
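
Limiting the work to two coroutines follows the standard asyncio pattern of bounding concurrency with a semaphore. A generic sketch of the idea, not the package's internals:

import asyncio

async def run_bounded(prompts, worker, ncoroutines=2):
    # At most `ncoroutines` prompts are in flight at any time.
    sem = asyncio.Semaphore(ncoroutines)

    async def one(prompt):
        async with sem:
            return await worker(prompt)

    return await asyncio.gather(*(one(p) for p in prompts))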

License
This package is licensed under the MIT license. See the LICENSE file for more details.
Update log
The current version, 1.0.0, is a stable release: the redundant function call feature has been removed, and an asynchronous processing tool has been added.
Beta versions

Since version 0.2.0, the Chat type is used to handle data.
Since version 0.3.0, you can use a different API key for each request.
Since version 0.4.0, this package is maintained by cubenlp.
Since version 0.5.0, you can use process_chats to process data with a customized msg2chat function and a checkpoint file.
Since version 0.6.0, the function call feature was added.

License:

For personal and professional use. You cannot resell or redistribute these repositories in their original state.
