Every time I try to run this code, even after restarting everything, the process exits before the model finishes loading.
transformers version: 4.46.2

from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers import TextIteratorStreamer, TextStreamer, StoppingCriteria
if __name__ == "__main__":
    # Load the model and tokenizer from the local checkpoint directory
    model_id = "./Llama-3.2-3b-instruct"
    model = AutoModelForCausalLM.from_pretrained(model_id)
    tokenizer = AutoTokenizer.from_pretrained(model_id)
Loading checkpoint shards: 0%| | 0/2 [00:00<?, ?it/s]
Process finished with exit code -1073741819 (0xC0000005)
Screenshot for reference: screenshot (link only, since I don't have enough reputation to embed images) :(
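To narrow this down, one thing I could do is log how much system RAM is actually free right before the load, since exit code 0xC0000005 is a Windows access violation and can show up when a native library crashes under memory pressure. A minimal sketch (psutil is an extra dependency, not part of my original script):

import psutil

# Report available system memory just before the model load
avail_gb = psutil.virtual_memory().available / (1024 ** 3)
print(f"Available RAM before loading: {avail_gb:.1f} GiB")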
It turned out to be a problem with the machine's RAM (CPU memory): I restarted the computer and now everything seems fine!
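If the crash comes back, loading the model in a lighter-weight way may keep peak RAM low enough to avoid it. A minimal sketch, assuming half precision is acceptable for this model; torch_dtype and low_cpu_mem_usage are standard from_pretrained arguments (torch is already a transformers dependency):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "./Llama-3.2-3b-instruct"
# Load weights in float16 instead of the default float32, and load the
# checkpoint shard by shard rather than materializing random weights plus
# the full state dict at once, which lowers peak RAM during loading
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)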