Getting last_hidden_state from Longformer


I am trying to follow the example in the Hugging Face documentation here: https://huggingface.co/transformers/model_doc/longformer.html:

import torch
from transformers import LongformerModel, LongformerTokenizer
model = LongformerModel.from_pretrained('allenai/longformer-base-4096')
tokenizer = LongformerTokenizer.from_pretrained('allenai/longformer-base-4096')
SAMPLE_TEXT = ' '.join(['Hello world! '] * 1000)  # long input document
input_ids = torch.tensor(tokenizer.encode(SAMPLE_TEXT)).unsqueeze(0)  # batch of size 1
# Attention mask values -- 0: no attention, 1: local attention, 2: global attention
attention_mask = torch.ones(input_ids.shape, dtype=torch.long, device=input_ids.device) # initialize to local attention
global_attention_mask = torch.zeros(input_ids.shape, dtype=torch.long, device=input_ids.device) # initialize to global attention to be deactivated for all tokens
global_attention_mask[:, [1, 4, 21,]] = 1  # Set global attention to random tokens for the sake of this example
                                    # Usually, set global attention based on the task. For example,
                                    # classification: the <s> token
                                    # QA: question tokens
                                    # LM: potentially on the beginning of sentences and paragraphs
outputs = model(input_ids, attention_mask=attention_mask, global_attention_mask=global_attention_mask, output_hidden_states=True)
sequence_output = outputs[0].last_hidden_state
pooled_output = outputs.pooler_output

I assumed this would return a document embedding for the sample text. However, I get the following error:

AttributeError: 'Tensor' object has no attribute 'last_hidden_state'

Why can't last_hidden_state be accessed here?

python nlp pytorch huggingface-transformers
2 Answers

2 votes

Don't select by index:

sequence_output = outputs.last_hidden_state

outputs is a LongformerBaseModelOutputWithPooling object with the following attributes:

print(outputs.keys())

Output:

odict_keys(['last_hidden_state', 'pooler_output', 'hidden_states'])

Calling outputs[0] and outputs.last_hidden_state both give you the same tensor, but that tensor has no attribute named last_hidden_state.
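The dual access pattern can be illustrated with a small stand-in object; this is only a sketch (a NamedTuple with dummy tensor shapes, not the real LongformerBaseModelOutputWithPooling), but the indexing and attribute semantics behave the same way:

```python
from typing import NamedTuple
import torch

class FakeOutput(NamedTuple):
    # Stand-in for the real model output; shapes are made up for illustration
    last_hidden_state: torch.Tensor
    pooler_output: torch.Tensor

outputs = FakeOutput(torch.zeros(1, 8, 768), torch.zeros(1, 768))

# Index access and attribute access return the very same tensor ...
assert outputs[0] is outputs.last_hidden_state

# ... but that tensor itself has no attribute named last_hidden_state,
# which is exactly why outputs[0].last_hidden_state raises AttributeError
try:
    outputs[0].last_hidden_state
except AttributeError as err:
    print(err)
```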


0 votes

outputs is a transformers.models.longformer.modeling_longformer.LongformerBaseModelOutputWithPooling. You can print its attributes with:

outputs.keys()

To access the last hidden state of the first data point in the batch, use:

outputs.last_hidden_state[0]
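Since the question was after a document embedding, one common way to get a single vector per document is masked mean pooling over last_hidden_state. The following is a minimal sketch with dummy tensors standing in for real model outputs (hidden size 768 matches longformer-base-4096, but the sequence length here is arbitrary):

```python
import torch

# Dummy stand-ins for model outputs (assumed shapes for illustration)
last_hidden_state = torch.randn(1, 10, 768)           # (batch, seq_len, hidden)
attention_mask = torch.ones(1, 10, dtype=torch.long)  # 1 = real token, 0 = padding

# Masked mean pooling: average token vectors, ignoring padding positions
mask = attention_mask.unsqueeze(-1).float()           # (batch, seq_len, 1)
doc_embedding = (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

print(doc_embedding.shape)  # torch.Size([1, 768])
```

Alternatively, outputs.pooler_output (the processed <s> token) can serve as a document representation, depending on the task.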
