Upload an upgraded version with Lightning v1.9.5 backend

#3
by alvations - opened

Is it possible to save the checkpoints with the upgraded backend?

Currently, the conversion runs every time we load the model, i.e.

import os

from comet.models.multitask.unified_metric import UnifiedMetric
from huggingface_hub import snapshot_download

# Download the Unbabel/unite-mup snapshot into the current directory and load its checkpoint.
model_path = snapshot_download(repo_id="Unbabel/unite-mup", cache_dir=os.path.abspath(os.path.dirname('.')))
model_checkpoint_path = f"{model_path}/checkpoints/model.ckpt"

unite = UnifiedMetric.load_from_checkpoint(model_checkpoint_path)

[stderr]:

INFO:pytorch_lightning.utilities.migration.utils:Lightning automatically upgraded your loaded checkpoint from v1.6.0 to v1.9.5. To apply the upgrade to your files permanently, run `python -m pytorch_lightning.utilities.upgrade_checkpoint --file models--Unbabel--unite-mup/snapshots/d2d555cff30f53db362ae2899d66a667d6db165b/checkpoints/model.ckpt`
Some weights of the model checkpoint at xlm-roberta-large were not used when initializing XLMRobertaModel: ['lm_head.bias', 'lm_head.dense.weight', 'lm_head.layer_norm.bias', 'roberta.pooler.dense.bias', 'lm_head.dense.bias', 'lm_head.layer_norm.weight', 'lm_head.decoder.weight', 'roberta.pooler.dense.weight']
- This IS expected if you are initializing XLMRobertaModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing XLMRobertaModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
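
For completeness, the one-off upgrade the log suggests can be scripted like this (a minimal sketch using subprocess; the checkpoint path is resolved with the same snapshot_download call as above, and per the Lightning message the file is rewritten in place, so subsequent loads skip the v1.6.0 to v1.9.5 migration). This only fixes the local cache, which is why having the upgraded checkpoint on the Hub would still help:

import os
import subprocess
import sys

from huggingface_hub import snapshot_download

# Resolve the cached checkpoint path (same snapshot_download call as above).
model_path = snapshot_download(repo_id="Unbabel/unite-mup", cache_dir=os.path.abspath(os.path.dirname('.')))
model_checkpoint_path = f"{model_path}/checkpoints/model.ckpt"

# Run the exact command the Lightning log suggests; it rewrites model.ckpt in place,
# so later load_from_checkpoint calls no longer trigger the migration step.
subprocess.run(
    [sys.executable, "-m", "pytorch_lightning.utilities.upgrade_checkpoint", "--file", model_checkpoint_path],
    check=True,
)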

We had to write our own save_to_checkpoint function, which works, but we couldn't figure out how to save the checkpoint cleanly without hacking the comet codebase itself, and we want to avoid maintaining a fork of the comet library.

It would be great if the upgraded checkpoints were uploaded canonically by the model maintainers (a rough sketch of what that could look like is below). Thank you in advance!
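
For concreteness, re-uploading an upgraded checkpoint could look roughly like this (a sketch assuming maintainer write access to Unbabel/unite-mup; HfApi.upload_file is the standard huggingface_hub upload call, and the in-repo path mirrors the current checkpoints/model.ckpt layout):

import os

from huggingface_hub import HfApi, snapshot_download

# Path of the locally upgraded checkpoint (resolved as in the snippets above).
model_path = snapshot_download(repo_id="Unbabel/unite-mup", cache_dir=os.path.abspath(os.path.dirname('.')))
model_checkpoint_path = f"{model_path}/checkpoints/model.ckpt"

api = HfApi()
api.upload_file(
    path_or_fileobj=model_checkpoint_path,
    path_in_repo="checkpoints/model.ckpt",
    repo_id="Unbabel/unite-mup",
    repo_type="model",
    commit_message="Upgrade checkpoint to Lightning v1.9.5 format",
)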

Unbabel org

We can do that, but loading performance is not affected by the conversion.

Can you open an issue in the COMET repo and give as much context as possible? I'll find some time to do it.
