
Error loading a sentence-transformers model

Asked by a Stack Overflow user on 2022-05-29 06:57:49
1 answer · 789 views · score 0

I am trying to load a transformer model with SentenceTransformer. Here is the code:

# Now we create a SentenceTransformer model from scratch

word_emb = models.Transformer('paraphrase-mpnet-base-v2')
pooling = models.Pooling(word_emb.get_word_embedding_dimension())
model = SentenceTransformer(modules=[word_emb, pooling])

Here is the error:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_2948\3254427654.py in <module>
      1 # Now we create a SentenceTransformer model from scratch
----> 2 word_emb = models.Transformer('paraphrase-mpnet-base-v2')
      3 pooling = models.Pooling(word_emb.get_word_embedding_dimension())
      4 model = SentenceTransformer(modules=[word_emb, pooling])

~\miniconda3\envs\atoti\lib\site-packages\sentence_transformers\models\Transformer.py in __init__(self, model_name_or_path, max_seq_length, model_args, cache_dir, tokenizer_args, do_lower_case, tokenizer_name_or_path)
     27 
     28         config = AutoConfig.from_pretrained(model_name_or_path, **model_args, cache_dir=cache_dir)
---> 29         self._load_model(model_name_or_path, config, cache_dir)
     30 
     31         self.tokenizer = AutoTokenizer.from_pretrained(tokenizer_name_or_path if tokenizer_name_or_path is not None else model_name_or_path, cache_dir=cache_dir, **tokenizer_args)

~\miniconda3\envs\atoti\lib\site-packages\sentence_transformers\models\Transformer.py in _load_model(self, model_name_or_path, config, cache_dir)
     47             self._load_t5_model(model_name_or_path, config, cache_dir)
     48         else:
---> 49             self.auto_model = AutoModel.from_pretrained(model_name_or_path, config=config, cache_dir=cache_dir)
     50 
     51     def _load_t5_model(self, model_name_or_path, config, cache_dir):

~\miniconda3\envs\atoti\lib\site-packages\transformers\models\auto\auto_factory.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    445         elif type(config) in cls._model_mapping.keys():
    446             model_class = _get_model_class(config, cls._model_mapping)
--> 447             return model_class.from_pretrained(pretrained_model_name_or_path, *model_args, config=config, **kwargs)
    448         raise ValueError(
    449             f"Unrecognized configuration class {config.__class__} for this kind of AutoModel: {cls.__name__}.\n"

~\miniconda3\envs\atoti\lib\site-packages\transformers\modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   1310                 elif os.path.join(pretrained_model_name_or_path, FLAX_WEIGHTS_NAME):
   1311                     raise EnvironmentError(
-> 1312                         f"Error no file named {WEIGHTS_NAME} found in directory {pretrained_model_name_or_path} but "
   1313                         "there is a file for Flax weights. Use `from_flax=True` to load this model from those "
   1314                         "weights."

OSError: Error no file named pytorch_model.bin found in directory paraphrase-mpnet-base-v2 but there is a file for Flax weights. Use `from_flax=True` to load this model from those weights.

I am using the following versions:

transformers==4.16.2
torch==1.11.0+cu113
torchaudio==0.11.0+cu113
torchvision==0.12.0+cu113
sentence-transformers==2.2.0
faiss-cpu==1.7.2
sentencepiece==0.1.96

This had been working for two months; suddenly it started returning this error. I am also using faiss-cpu.


1 Answer

Stack Overflow user, answered 2022-09-30 09:25:47

The error is telling you: "I can't find the weights for the model you are trying to load."

Based on the traceback, I'm guessing you are using the models object from the sentence-transformers library (correct me if I'm wrong). One thing to note is that sentence-transformers offers only the following pretrained paraphrase models:

  • paraphrase-multilingual-mpnet-base-v2
  • paraphrase-albert-small-v2
  • paraphrase-multilingual-MiniLM-L12-v2
  • paraphrase-MiniLM-L3-v2

So the name you are trying to load is not a sentence-transformers pretrained model.

That leads me to think you are trying to load a model from your local machine. I'd suggest building a SentenceTransformer model like this:

from sentence_transformers import SentenceTransformer

model_path_or_name = "path/to/model" # A folder that contains model config files, including pytorch_model.bin
model = SentenceTransformer(model_path_or_name)
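Alternatively, to keep the Transformer + Pooling construction from the question, a common fix is to pass the fully-qualified Hugging Face Hub id (`sentence-transformers/paraphrase-mpnet-base-v2`) rather than the bare name, since `models.Transformer` forwards the string directly to `AutoModel.from_pretrained`, which treats a bare name without the organization prefix as a local directory. A minimal sketch; `hub_model_id` is a hypothetical helper, not part of either library:

```python
def hub_model_id(name, org="sentence-transformers"):
    """Prefix a bare model name with the Hub organization.

    Hypothetical helper: a bare name like 'paraphrase-mpnet-base-v2'
    gets resolved against a local directory of the same name, while
    the fully-qualified id is fetched from the Hub.
    """
    return name if "/" in name else f"{org}/{name}"

# With sentence-transformers installed, the question's snippet becomes:
# from sentence_transformers import SentenceTransformer, models
# word_emb = models.Transformer(hub_model_id("paraphrase-mpnet-base-v2"))
# pooling = models.Pooling(word_emb.get_word_embedding_dimension())
# model = SentenceTransformer(modules=[word_emb, pooling])
```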

It is also possible that the pytorch_model.bin file was downloaded under a different filename, as mentioned in this SO thread.
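To check which weight format a local model directory actually contains (the traceback shows `paraphrase-mpnet-base-v2` being resolved as a directory holding only Flax weights), a quick stdlib sketch; the filename list reflects the per-framework weight files transformers looks for:

```python
import os

# Weight filenames that transformers checks for, per framework.
KNOWN_WEIGHT_FILES = {
    "pytorch_model.bin": "PyTorch",
    "tf_model.h5": "TensorFlow",
    "flax_model.msgpack": "Flax",
}

def weight_formats(model_dir):
    """Return the frameworks whose weight files exist in model_dir."""
    return sorted(
        framework
        for filename, framework in KNOWN_WEIGHT_FILES.items()
        if os.path.exists(os.path.join(model_dir, filename))
    )
```

If the result is only `['Flax']`, either re-download the PyTorch weights or load with `from_flax=True`, as the error message suggests.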

Let me know if this solves your problem. Cheers.

Score: 1
Original content provided by Stack Overflow; translation supported by Tencent Cloud's IT-domain engine.
Original link: https://stackoverflow.com/questions/72421575
