I trained my English NER model following this notebook (https://nbviewer.jupyter.org/github/amaiya/ktrain/blob/master/tutorials/tutorial-06-sequence-tagging.ipynb). I was able to save my pre-trained model and run it without any problem.
However, I now need to run it again, but offline, and it does not work. I know I need to download the model files and do something similar to what is described here:
https://github.com/huggingface/transformers/issues/136
However, I cannot figure out where I need to change the settings for training.
This is the call I am making:

ktrain.load_predictor('Functions/my_english_nermodel')

And this is the error I get:
Traceback (most recent call last):
File "Z:\Functions\NER.py", line 155, in load_bert
reloaded_predictor= ktrain.load_predictor('Z:/Functions/my_english_nermodel')
File "C:\Program Files\Python37\lib\site-packages\ktrain\core.py", line 1316, in load_predictor
preproc = pickle.load(f)
File "C:\Program Files\Python37\lib\site-packages\ktrain\text\ner\anago\preprocessing.py", line 76, in __setstate__
if self.te_model is not None: self.activate_transformer(self.te_model, layers=self.te_layers)
File "C:\Program Files\Python37\lib\site-packages\ktrain\text\ner\anago\preprocessing.py", line 100, in activate_transformer
self.te = TransformerEmbedding(model_name, layers=layers)
File "C:\Program Files\Python37\lib\site-packages\ktrain\text\preprocessor.py", line 1095, in __init__
self.tokenizer = self.tokenizer_type.from_pretrained(model_name)
File "C:\Program Files\Python37\lib\site-packages\transformers\tokenization_utils.py", line 903, in from_pretrained
return cls._from_pretrained(*inputs, **kwargs)
File "C:\Program Files\Python37\lib\site-packages\transformers\tokenization_utils.py", line 1008, in _from_pretrained
list(cls.vocab_files_names.values()),
OSError: Model name 'bert-base-uncased' was not found in tokenizers model name list (bert-base-uncased, bert-large-uncased, bert-base-cased, bert-large-cased, bert-base-multilingual-uncased, bert-base-multilingual-cased, bert-base-chinese, bert-base-german-cased, bert-large-uncased-whole-word-masking, bert-large-cased-whole-word-masking, bert-large-uncased-whole-word-masking-finetuned-squad, bert-large-cased-whole-word-masking-finetuned-squad, bert-base-cased-finetuned-mrpc, bert-base-german-dbmdz-cased, bert-base-german-dbmdz-uncased, bert-base-finnish-cased-v1, bert-base-finnish-uncased-v1, bert-base-dutch-cased). We assumed 'bert-base-dutch-cased' was a path, a model identifier, or url to a directory containing vocabulary files named ['vocab.txt'] but couldn't find such vocabulary files at this path or url.
Process finished with exit code 1

Posted on 2020-06-04 13:18:34
More generally, transformers-based pre-trained models are downloaded to <home_directory>/.cache/torch/transformers. On Linux, for example, this would be /home/<user_name>/.cache/torch/transformers.
As the other answer notes, to reload a ktrain predictor on a machine without internet access (for ktrain models that use a model from the transformers library), you need to copy the model files in that folder to the same location on the new machine.
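A minimal sketch of that copy step, assuming the default cache location for this transformers version (~/.cache/torch/transformers); the helper name `copy_transformers_cache` is hypothetical, and in practice you would stage the folder onto removable media and replay the copy on the offline machine:

```python
import os
import shutil

def copy_transformers_cache(src: str, dst: str) -> int:
    """Recursively copy cached model files from src to dst.

    Returns the number of files copied. For ktrain.load_predictor()
    to find them offline, dst must be the same relative location
    (~/.cache/torch/transformers) under the offline machine's home
    directory.
    """
    count = 0
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            # copy2 preserves timestamps, so the cache looks unchanged
            shutil.copy2(os.path.join(root, name),
                         os.path.join(target, name))
            count += 1
    return count
```

On the online machine you would call this with `src = os.path.expanduser("~/.cache/torch/transformers")` and a staging folder as `dst`, then repeat it in the other direction on the offline machine.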
Posted on 2020-06-03 14:01:21
I found a solution. When running with an internet connection, a folder is created at 'C:\Users\lemolina\.cache\torch\transformers'. I needed to copy that same folder to the machine that has no internet access.
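If copying to the exact same path is inconvenient, the transformers library also honors the TRANSFORMERS_CACHE environment variable when resolving pretrained files. A sketch of redirecting the cache (the destination path is an example, and the variable must be set before any transformers/ktrain import):

```python
import os

# Point transformers at a copied cache directory BEFORE importing
# ktrain or transformers; imports read this value at load time.
os.environ["TRANSFORMERS_CACHE"] = os.path.expanduser(
    "~/offline_models/transformers"  # example location of the copied cache
)

# Subsequent imports, e.g. `import ktrain` followed by
# ktrain.load_predictor(...), will then look up 'bert-base-uncased'
# files in that directory instead of the default cache.
```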
https://stackoverflow.com/questions/62150139