Mistral 7B response starts with an extra leading space when streaming with Ollama


When I stream a response from the Mistral 7B LLM with Ollama, the first streamed chunk has an extra space at its left. Here is my code:

import ollama

stream = ollama.chat(
  model='mistral',
  messages=[{'role': 'user', 'content': 'Name an engineer that passes the vibe check'}],
  stream=True
)

for chunk in stream:
  print(chunk['message']['content'], end='', flush=True)

The output looks like this:

$ python3 test.py 
 Elon Musk, the CEO of SpaceX and Tesla, is an engineer who seems to pass the "vibe check." He is known for his innovative ideas in renewable energy, space travel, and transportation. However, it's important to remember that personality and vibes can be subjective, so not everyone may agree with this assessment. Additionally, Musk's public image should not overshadow the contributions of countless other engineers who are equally impressive but less well-known.

Note the leading space before the letter "E". How can I remove it?

python whitespace large-language-model ollama mistral-7b
1 Answer

Use lstrip() to remove the leading whitespace from the first chunk:

import ollama

stream = ollama.chat(
    model='mistral',
    messages=[{'role': 'user', 'content': 'Name an engineer that passes the vibe check'}],
    stream=True
)

first = True
for chunk in stream:
    text = chunk['message']['content']
    if first:
        # Strip the extra whitespace from the first chunk only.
        text = text.lstrip()
        first = False
    # Keep end='' and flush=True so the output still streams on a single line.
    print(text, end='', flush=True)
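If you would rather keep the printing loop untouched, the stripping can also be done by a small wrapper generator around the stream. This is only a sketch built on the same ollama.chat call from the question; the stream_text helper below is not part of the ollama library.

import ollama

def stream_text(chunks):
    # Yield each chunk's text, stripping whitespace only at the very start of the stream.
    at_start = True
    for chunk in chunks:
        text = chunk['message']['content']
        if at_start:
            text = text.lstrip()
            if not text:  # the first chunk may be nothing but whitespace
                continue
            at_start = False
        yield text

stream = ollama.chat(
    model='mistral',
    messages=[{'role': 'user', 'content': 'Name an engineer that passes the vibe check'}],
    stream=True
)

for text in stream_text(stream):
    print(text, end='', flush=True)
print()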