
New Features

  • [Request: OAIClient] Added a new request plugin for models whose API format is similar to OpenAI's but comes with extra rules, such as not supporting multiple system messages or requiring a strict user-assistant message order. It is especially useful for local models served through model-serving libraries such as Xinference.

HOW TO USE:

:::python
import Agently

agent_factory = (
    Agently.AgentFactory(is_debug=True)
        .set_settings("current_model", "OAIClient")
        # Mistral for example
        .set_settings("model.OAIClient.url", "https://api.mistral.ai/v1")
        # if you want to use Moonshot Kimi:
        #.set_settings("model.OAIClient.url", "https://api.moonshot.cn/v1")
        # set model name
        # Mistral model list: https://docs.mistral.ai/platform/endpoints/
        .set_settings("model.OAIClient.options", { "model": "open-mistral-7b" })
        # Moonshot model list: https://platform.moonshot.cn/docs/pricing#文本生成模型-moonshot-v1
        # set API-KEY if needed
        .set_settings("model.OAIClient.auth.api_key", "")
        # set proxy if needed
        #.set_proxy("http://127.0.0.1:7890")
        # you can also change message rules
        #.set_settings("model.OAIClient.message_rules", {
        #    "no_multi_system_messages": True, # True by default, will combine multi system messages into one
        #    "strict_orders": True, # True by default, will transform messages' order into "User-Assistant-User-Assitant" strictly
        #    "no_multi_type_messages": True, # True by default, will only allow text messages
        #})
)

agent = agent_factory.create_agent()

(
    agent
        .set_role("You love EMOJI very much and try to use EMOJI in every sentence.")
        .chat_history([
            { "role": "user", "content": "It's a beautiful day, isn't it?" },
            { "role": "assistant", "content": "Right, shine and bright!☀️" }
        ])
        .input("What do you suggest us to do today?")
        # use .start("completions") if your model is a completion model
        .start("chat")
)
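
The message_rules shown in the configuration above rewrite the chat history before it is sent to the model. The snippet below is only a rough sketch of that behavior under the default settings; it is not the plugin's actual code, and normalize_messages is a hypothetical name used for illustration:

:::python
# Rough sketch of the default message rules; NOT the plugin's implementation.

def normalize_messages(messages):
    # "no_multi_system_messages": combine all system messages into one
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    normalized = []
    if system_parts:
        normalized.append({ "role": "system", "content": "\n".join(system_parts) })
    # "strict_orders": enforce a User-Assistant-User-Assistant order by merging
    # consecutive messages that share the same role
    # "no_multi_type_messages": only plain-text content is assumed here
    for message in messages:
        if message["role"] == "system":
            continue
        if normalized and normalized[-1]["role"] == message["role"]:
            normalized[-1]["content"] += "\n" + str(message["content"])
        else:
            normalized.append({ "role": message["role"], "content": str(message["content"]) })
    return normalized

print(normalize_messages([
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "system", "content": "Use EMOJI in every sentence." },   # merged into one system message
    { "role": "user", "content": "It's a beautiful day, isn't it?" },
    { "role": "user", "content": "Any plans?" },                       # merged with the previous user turn
    { "role": "assistant", "content": "Right, shine and bright!☀️" }
]))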

Update

  • [Request plugin: ERNIE] Added direct support for the system parameter in the new API specification. The system prompt passed to ERNIE Bot is now passed directly to the API's system parameter instead of being converted into a user dialogue message (see the sketch below);
  • [Request optimization] Improved the prompting method for lists that can contain multiple items.
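
The ERNIE change above can be pictured as a change in the request payload. The payload shapes below are only a hedged illustration based on this release note; the field layout and the old wrapping format are assumptions, not a capture of what Agently actually sends:

:::python
# Illustration only: shapes are assumed from the release note above.

# Before: the system prompt was folded into the user dialogue messages
old_style_payload = {
    "messages": [
        { "role": "user", "content": "(system) You love EMOJI very much." },
        { "role": "assistant", "content": "OK." },
        { "role": "user", "content": "What do you suggest us to do today?" }
    ]
}

# Now: the system prompt is passed straight to the API's `system` parameter
new_style_payload = {
    "system": "You love EMOJI very much.",
    "messages": [
        { "role": "user", "content": "What do you suggest us to do today?" }
    ]
}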

Bug Fixed

  • [Request commands] Fixed an issue that prevented .general().abstract() from working properly;
  • [Agent ability plugin: Segment] Fixed an issue that prevented handlers from working during streaming output;
  • [Request plugin: ERNIE] Fixed several quotation mark conflict issues.
