Teams Toolkit chatbot with Azure OpenAI and Bot Framework Emulator: bypassing Azure OpenAI for specific messages


I created a sample project with the Teams Toolkit to build a Teams assistant that uses the Azure OpenAI service (gpt-4o model). The default code works fine: I can run the bot locally, send and receive messages through the Bot Framework Emulator, and every message is processed by gpt-4o. However, I want to change the flow so that:

  • For certain specific messages, Azure OpenAI gpt-4o is not called and a direct response is sent back to the user.
  • For any other message, Azure OpenAI gpt-4o processes the message and the reply is sent back to the user.

There are two Python files, app.py and bot.py. I tried to handle this based on the incoming message in app.py, but in some cases I cannot skip Azure OpenAI without breaking the rest of the flow.

app.py
"""
Copyright (c) Microsoft Corporation. All rights reserved.
Licensed under the MIT License.
"""
from http import HTTPStatus

from aiohttp import web
from botbuilder.core.integration import aiohttp_error_middleware
from botbuilder.schema import ChannelAccount

from bot import bot_app
import pandas as pd
from io import BytesIO
import json

from aiohttp import web, ClientRequest
import spacy
from sklearn.metrics.pairwise import cosine_similarity
import numpy as np
from rapidfuzz import fuzz, process
import re
from botbuilder.schema import Activity
import logging

routes = web.RouteTableDef()

@routes.post("/api/messages")
async def on_messages(req: web.Request) -> web.Response:
    print("on_messages ************* ")
    res = await bot_app.process(req)
        
    if res is not None:
        # Return the response from the bot app processing
        return res
    return web.Response(status=HTTPStatus.OK)


# Load the spaCy model with GloVe vectors
nlp = spacy.load("en_core_web_md")

# Added this middleware so that, when a particular message comes in, the prompt sent to
# Azure OpenAI for processing is changed - this is working fine
async def modify_request_middleware(app, handler):
    async def middleware_handler(req):
        print(f"Next handler: {handler.__name__} of type {type(handler)}")
        print("Middleware triggered")
        if req.path == "/api/messages" and req.method == "POST":
            print("Processing POST request to /api/messages")
            data = await req.json()
            user_input = data.get('text', "").lower()

            print("User Input:", user_input)
            if "hey" in user_input:
                 data['text'] = "Hey, give me recipe of cake which is extremely moist without eggs"
        # Create a new payload with the modified data
        new_payload = json.dumps(data).encode('utf-8')
        req._read_bytes = new_payload
        return await handler(req)
    return middleware_handler


app = web.Application(middlewares=[modify_request_middleware])
app.add_routes(routes)

from config import Config

if __name__ == "__main__":
    web.run_app(app, host="localhost", port=Config.PORT)
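
For reference, aiohttp has deprecated the older (app, handler) middleware-factory form used above. Below is a minimal sketch of the same logic with the current @web.middleware decorator; it keeps the same /api/messages path and the same private _read_bytes trick, so it is an equivalent rewrite rather than a different approach.

import json

from aiohttp import web

@web.middleware
async def modify_request_middleware(request, handler):
    # Rewrite the prompt for specific messages before the bot app sees them
    if request.path == "/api/messages" and request.method == "POST":
        data = await request.json()
        user_input = data.get("text", "").lower()
        if "hey" in user_input:
            data["text"] = "Hey, give me recipe of cake which is extremely moist without eggs"
            # Same private-attribute trick as above so downstream handlers re-read the modified body
            request._read_bytes = json.dumps(data).encode("utf-8")
    return await handler(request)

app = web.Application(middlewares=[modify_request_middleware])

Everything else (app.add_routes(routes), web.run_app) stays the same.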

Below is the code for bot.py:

import os
import sys
import traceback

from botbuilder.core import MemoryStorage, TurnContext
from teams import Application, ApplicationOptions, TeamsAdapter
from teams.ai import AIOptions
from teams.ai.models import AzureOpenAIModelOptions, OpenAIModel, OpenAIModelOptions
from teams.ai.planners import ActionPlanner, ActionPlannerOptions
from teams.ai.prompts import PromptManager, PromptManagerOptions
from teams.state import TurnState
from botbuilder.core.teams import TeamsInfo
import aiohttp

from config import Config
import pandas as pd

from botbuilder.schema import Activity, ActivityTypes

config = Config()

# Create AI components
model: OpenAIModel

model = OpenAIModel(
    AzureOpenAIModelOptions(
        api_key=config.AZURE_OPENAI_API_KEY,
        default_model=config.AZURE_OPENAI_MODEL_DEPLOYMENT_NAME,
        endpoint=config.AZURE_OPENAI_ENDPOINT,
    )
)
    
prompts = PromptManager(PromptManagerOptions(prompts_folder=f"{os.getcwd()}/prompts"))

planner = ActionPlanner(
    ActionPlannerOptions(model=model, prompts=prompts, default_prompt="chat")
)

# Define storage and application
storage = MemoryStorage()
bot_app = Application[TurnState](
    ApplicationOptions(
        bot_app_id=config.APP_ID,
        storage=storage,
        adapter=TeamsAdapter(config),
        ai=AIOptions(planner=planner),
    )
)   

@bot_app.conversation_update("membersAdded")
async def on_members_added(context: TurnContext, state: TurnState):
    print("***************in on_members_added**************************")
    print(f"Activity type: {context.activity.type}")
    try:
        await context.send_activity(f"Hello!  How can I assist you today? ") 
    except Exception as e:
        await context.send_activity("Hello! How can I assist you today? There is some error here")
        print(f"Error fetching user info: {e}")


@bot_app.error
async def on_error(context: TurnContext, error: Exception):
    # This check writes out errors to console log .vs. app insights.
    # NOTE: In production environment, you should consider logging this to Azure
    #       application insights.
    print(f"\n [on_turn_error] unhandled error: {error}", file=sys.stderr)
    traceback.print_exc()

    # Send a message to the user
    await context.send_activity("The bot encountered an error or bug.")

I tried adding the function below to bot.py, but it does not work. It checks the input text correctly and responds to the user in the emulator, but it does not go back to the normal process of listening and waiting for the next input. Also, the else branch never calls the Azure OpenAI service.

# Sample test for bot_app.on_turn
@bot_app.before_turn
async def test_on_turn(context: TurnContext, state: TurnState):
    print("in test_on_turn", context.activity.type)
    if context.activity.type == ActivityTypes.message:
        user_message = context.activity.text.strip().lower()
        # Simulate different responses based on the user's message
        if "hey" in user_message:
            await context.send_activity(f"Hey receiveddd   {context.activity.type}")
        else:
            # how to call the handler for normal process?
            pass
    else:
        # Handle other types of activities
        await context.send_activity("Hello! How can I assist you today?")

Tags: azure, botframework, azure-openai, teams-toolkit, gpt-4o-mini
1 Answer

To get the desired behavior (skip Azure OpenAI for specific messages while letting every other message go through), the test_on_turn hook in bot.py needs to cooperate with bot_app's normal message-processing flow.

  • You already have the conditional logic in test_on_turn. For messages that should bypass Azure OpenAI, send the response and return early so the rest of the turn is skipped. For all other messages, let the turn continue so the normal pipeline (the AI planner backed by Azure OpenAI) handles the message. In the teams-ai Python library, the return value of a before_turn handler controls exactly this: a falsy return value (including returning nothing, as in your version) stops the turn, which is why your else branch never reached Azure OpenAI, while returning True lets processing continue.

Updated test_on_turn in bot.py:

@bot_app.before_turn
async def test_on_turn(context: TurnContext, state: TurnState):
    print("in test_on_turn", context.activity.type)
    if context.activity.type == ActivityTypes.message:
        user_message = context.activity.text.strip().lower()

        # Bypass Azure OpenAI processing for specific messages
        if "hey" in user_message:
            await context.send_activity("Hey! How can I help you?")
            return False  # Stop the turn here; the AI planner is not run

    # Continue with the normal flow (i.e., Azure OpenAI processing)
    # Returning True lets the application run the rest of the turn as usual
    return True
  • For a message containing "hey", the bot replies "Hey! How can I help you?" and Azure OpenAI is not called.
  • For any other message, the normal flow resumes and Azure OpenAI generates the reply.

Logs when handling the bypass case and the normal flow:

2024-10-15 14:32:12 [INFO] Starting bot...
2024-10-15 14:32:15 [DEBUG] on_messages *************
2024-10-15 14:32:15 [DEBUG] in test_on_turn message
2024-10-15 14:32:15 [DEBUG] User input detected: hey
2024-10-15 14:32:15 [INFO] Sending custom response: Hey! How can I help you?
2024-10-15 14:32:15 [DEBUG] Early return: Skipping Azure OpenAI

2024-10-15 14:32:20 [DEBUG] on_messages *************
2024-10-15 14:32:20 [DEBUG] in test_on_turn message
2024-10-15 14:32:20 [DEBUG] User input detected: tell me about quantum physics
2024-10-15 14:32:20 [DEBUG] Message doesn't match bypass conditions. Resuming normal flow.
2024-10-15 14:32:20 [INFO] Passing message to Azure OpenAI for processing
2024-10-15 14:32:25 [DEBUG] Azure OpenAI response received: Quantum physics is the study of matter and energy...
2024-10-15 14:32:25 [INFO] Sending response back to user: Quantum physics is the study of matter and energy...
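
If hooking before_turn still feels too invasive, another option is to register a dedicated message route for the bypass keyword. This is a minimal sketch under the assumption that Application.message accepts a string or compiled regex selector, as in the teams-ai samples, and that the AI planner only runs for message activities that no route has handled; the on_hey name and the pattern are illustrative.

import re

from botbuilder.core import TurnContext
from teams.state import TurnState

# Goes in bot.py, below the bot_app definition.
# Assumption: when this route matches, the Application treats the activity as
# handled and does not invoke the ActionPlanner, so Azure OpenAI is skipped.
@bot_app.message(re.compile(r"\bhey\b", re.IGNORECASE))
async def on_hey(context: TurnContext, state: TurnState):
    await context.send_activity("Hey! How can I help you?")
    return True  # signal that the activity was handled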