RuntimeError: Input, output and indices must be on the current device (fill_mask("random text <mask>"))

Posted 2024-06-16 09:38:28


I get "RuntimeError: Input, output and indices must be on the current device." when I run the line fill_mask("Auto Car <mask>.").

I am running it on Colab. My code:

from pathlib import Path
from tokenizers import ByteLevelBPETokenizer
from transformers import BertTokenizer, BertForMaskedLM


paths = [str(x) for x in Path(".").glob("**/*.txt")]
print(paths)

bert_tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

from transformers import BertModel, BertConfig

configuration = BertConfig()
model = BertModel(configuration)
configuration = model.config
print(configuration)

model = BertForMaskedLM.from_pretrained("bert-base-uncased")

from transformers import LineByLineTextDataset
dataset = LineByLineTextDataset(
    tokenizer=bert_tokenizer,
    file_path="./kant.txt",
    block_size=128,
)

from transformers import DataCollatorForLanguageModeling
data_collator = DataCollatorForLanguageModeling(
    tokenizer=bert_tokenizer, mlm=True, mlm_probability=0.15
)

from transformers import Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="./KantaiBERT",
    overwrite_output_dir=True,
    num_train_epochs=1,
    per_device_train_batch_size=64,
    save_steps=10_000,
    save_total_limit=2,
    )

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=dataset,
)

trainer.train()

from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model=model,
    tokenizer=bert_tokenizer
)

fill_mask("Auto Car <mask>.")

The last line raises the error mentioned above. Please let me know what I am doing wrong, or what I have to do to get rid of this error.


Tags: from, import, data, model, args, train, mask, dataset
1 Answer

Answered by a user on 2024-06-16 09:38:28

The Trainer automatically trains your model on the GPU (the default is no_cuda=False). You can verify this after training by running:

model.device
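If training ran on Colab's GPU runtime, this should print something like device(type='cuda', index=0).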

The pipeline does not do this, which leads to the error you see (i.e., your model is on the GPU, but your example sentence is on the CPU). You can fix it by running the pipeline with GPU support:

fill_mask = pipeline(
    "fill-mask",
    model=model,
    tokenizer=bert_tokenizer,
    device=0,
)
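Here device=0 selects the first CUDA device; the pipeline keeps the model there and moves each tokenized input onto the same device, so model and data end up together.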

Or by moving the model to the CPU before initializing the pipeline:

model.to('cpu')
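
A third option is to skip the pipeline, keep the model on the GPU, and move the tokenized inputs to the model's device yourself. Below is a minimal sketch using the model and bert_tokenizer from the question; note, as an aside, that bert-base-uncased expects [MASK] (available as bert_tokenizer.mask_token) as its mask token rather than <mask>.

import torch

# Build an example sentence with BERT's actual mask token.
sentence = f"Auto Car {bert_tokenizer.mask_token}."

# Tokenize on the CPU, then move the tensors to wherever the model lives.
inputs = bert_tokenizer(sentence, return_tensors="pt").to(model.device)

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and decode the highest-scoring prediction.
mask_pos = (inputs["input_ids"] == bert_tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(bert_tokenizer.decode(predicted_id))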
