
Conversation

helpmefindaname (Member)

I noticed that transformer embeddings download their config from the Hugging Face Hub every time they are loaded, even when the model has already been saved locally. This PR fixes that.

Example to run on master:

from flair.models import SequenceTagger

tagger = SequenceTagger.load("ner-large")
tagger.save("ner-large.pt")

# now delete the Hugging Face cache and disable internet access before running the code below

reloaded_tagger = SequenceTagger.load("ner-large.pt")
# this will fail on master but not on this branch
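To illustrate the general idea behind the fix without depending on Flair, here is a minimal stdlib-only sketch: bundle the already-downloaded config into the checkpoint itself, so reloading never needs a network call. All names here (`save_checkpoint`, `load_checkpoint`) are hypothetical and not part of the Flair API.

```python
import pickle

# Hypothetical sketch of the serialization pattern: store the config
# alongside the weights, instead of only a hub model name that would
# force a re-download of the config on every load.

def save_checkpoint(path, weights, config):
    # Bundle config and weights into a single file.
    with open(path, "wb") as f:
        pickle.dump({"weights": weights, "config": config}, f)

def load_checkpoint(path):
    # The config is read back from the file, not fetched from the hub,
    # so loading works fully offline.
    with open(path, "rb") as f:
        state = pickle.load(f)
    return state["weights"], state["config"]
```

With this pattern, the round trip works with networking disabled, which is exactly the behavior the repro above checks for.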

@alanakbik (Collaborator)

@helpmefindaname thanks for spotting and fixing this!

alanakbik merged commit 534d479 into master on Mar 31, 2023
alanakbik deleted the fix_config_seralization_of_transformers branch on March 31, 2023 at 11:09
@alanakbik (Collaborator)

@helpmefindaname I believe this introduces a new error. If I run:

from flair.models import MultitaskModel

linker = MultitaskModel.load('zelda')

I now get the following error:

TypeError: __init__() got an unexpected keyword argument 'truncate'
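This kind of `TypeError` occurs when a saved state dict contains a key (here `truncate`) that the target class's `__init__` no longer accepts. As a generic illustration only, not necessarily how Flair resolved it, one common defensive pattern is to filter the state against the constructor's signature before instantiating. The `Embeddings` class and `safe_init` helper below are hypothetical:

```python
import inspect

# Hypothetical stand-in for a class whose constructor signature changed,
# so older checkpoints may carry keys it no longer accepts.
class Embeddings:
    def __init__(self, model_name, fine_tune=True):
        self.model_name = model_name
        self.fine_tune = fine_tune

def safe_init(cls, **kwargs):
    # Keep only kwargs that appear in the constructor's signature,
    # silently dropping stale keys such as 'truncate'.
    accepted = set(inspect.signature(cls.__init__).parameters)
    filtered = {k: v for k, v in kwargs.items() if k in accepted}
    return cls(**filtered)
```

Calling `Embeddings(**state)` with a stale `truncate` key raises the `TypeError` shown above, while `safe_init(Embeddings, **state)` succeeds by discarding the unknown key.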
