Here is my code:
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama
from langchain_core.output_parsers import StrOutputParser
llm = ChatOllama(
    model='llama3.2',
    temperature=0
)

chat_template = ChatPromptTemplate.from_messages(
    [
        ('system', "you have to give two line definition of the word given by user"),
        ('human', 'the word is {user_input}')
    ]
)

message = chat_template.format_messages(user_input='backlog')
llm.invoke(message)

chain = chat_template | llm | StrOutputParser()
chain.invoke({'user_input': 'backlog'})
and it raises a connection error:
httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it
How can I fix this? I'm trying to build a basic word-definition chatbot with LangChain.
This means Ollama is not running on your machine. After pulling the model with ollama pull <model name>, you need to start the server with ollama serve.
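
Once the server is running, it can also help to make this failure mode explicit in the script itself, so a stopped server produces a readable message instead of a raw WinError 10061. Below is a minimal sketch of the same chatbot with a connectivity check up front; it assumes Ollama's default address http://localhost:11434, that the httpx package (already pulled in by the traceback above) is importable, and that your version of ChatOllama accepts a base_url parameter:

import httpx
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_ollama import ChatOllama

# Assumption: Ollama listens on its default address; adjust if you changed it.
OLLAMA_URL = "http://localhost:11434"

# Fail fast with a clear message if the Ollama server is not reachable.
try:
    httpx.get(OLLAMA_URL, timeout=2.0)
except httpx.ConnectError:
    raise SystemExit(
        "Cannot reach Ollama at " + OLLAMA_URL + ". "
        "Start the server with 'ollama serve' and make sure the model "
        "is available locally via 'ollama pull llama3.2'."
    )

# Pass base_url explicitly so the client and server agree on the address.
llm = ChatOllama(model='llama3.2', temperature=0, base_url=OLLAMA_URL)

chat_template = ChatPromptTemplate.from_messages(
    [
        ('system', "you have to give two line definition of the word given by user"),
        ('human', 'the word is {user_input}')
    ]
)

chain = chat_template | llm | StrOutputParser()
print(chain.invoke({'user_input': 'backlog'}))

With the check in place, running the script while the server is down prints the hint about ollama serve and ollama pull instead of the httpx.ConnectError traceback.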