DistilBERT-TC1000new-10epochs

This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1041
  • Recall: 0.98
  • Precision: 0.9803

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
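With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 2e-05 toward zero over training. A minimal sketch of that schedule, assuming zero warmup and the 560 total steps seen in the results table (neither is stated explicitly in the card):

```python
def linear_lr(step, total_steps=560, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up during warmup, then decay to zero.
    total_steps=560 and warmup_steps=0 are assumptions, not card values."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at a few of the logged checkpoints
for step in (0, 140, 280, 420, 560):
    print(f"step {step:3d}: lr = {linear_lr(step):.2e}")
```

In the Trainer this behavior comes from the scheduler selected by `lr_scheduler_type`; the sketch only reproduces its shape.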

Training results

| Training Loss | Epoch | Step | Validation Loss | Recall | Precision |
|:-------------:|:-----:|:----:|:---------------:|:------:|:---------:|
| 1.0487 | 0.35 | 20 | 0.9344 | 0.65 | 0.7269 |
| 0.8188 | 0.7 | 40 | 0.5972 | 0.94 | 0.9445 |
| 0.4832 | 1.05 | 60 | 0.3348 | 0.92 | 0.9217 |
| 0.2652 | 1.4 | 80 | 0.1883 | 0.95 | 0.9505 |
| 0.1615 | 1.75 | 100 | 0.1381 | 0.94 | 0.9407 |
| 0.128 | 2.11 | 120 | 0.0923 | 0.98 | 0.9810 |
| 0.0625 | 2.46 | 140 | 0.1014 | 0.97 | 0.9700 |
| 0.0318 | 2.81 | 160 | 0.0715 | 0.98 | 0.9803 |
| 0.0187 | 3.16 | 180 | 0.0968 | 0.98 | 0.9803 |
| 0.0106 | 3.51 | 200 | 0.0843 | 0.98 | 0.9803 |
| 0.0268 | 3.86 | 220 | 0.0860 | 0.98 | 0.9803 |
| 0.0074 | 4.21 | 240 | 0.1058 | 0.98 | 0.9803 |
| 0.0194 | 4.56 | 260 | 0.1108 | 0.98 | 0.9803 |
| 0.0056 | 4.91 | 280 | 0.1048 | 0.98 | 0.9803 |
| 0.0044 | 5.26 | 300 | 0.1034 | 0.98 | 0.9803 |
| 0.0063 | 5.61 | 320 | 0.1034 | 0.98 | 0.9803 |
| 0.0032 | 5.96 | 340 | 0.1033 | 0.98 | 0.9803 |
| 0.0031 | 6.32 | 360 | 0.1039 | 0.98 | 0.9803 |
| 0.003 | 6.67 | 380 | 0.1012 | 0.98 | 0.9803 |
| 0.0027 | 7.02 | 400 | 0.1013 | 0.98 | 0.9803 |
| 0.0026 | 7.37 | 420 | 0.1019 | 0.98 | 0.9803 |
| 0.0024 | 7.72 | 440 | 0.1043 | 0.98 | 0.9803 |
| 0.0023 | 8.07 | 460 | 0.1060 | 0.98 | 0.9803 |
| 0.0022 | 8.42 | 480 | 0.1052 | 0.98 | 0.9803 |
| 0.0022 | 8.77 | 500 | 0.1038 | 0.98 | 0.9803 |
| 0.0021 | 9.12 | 520 | 0.1037 | 0.98 | 0.9803 |
| 0.0021 | 9.47 | 540 | 0.1038 | 0.98 | 0.9803 |
| 0.0021 | 9.82 | 560 | 0.1041 | 0.98 | 0.9803 |
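Recall and precision in the table differ slightly (0.98 vs. 0.9803), which is consistent with support-weighted averages over several classes rather than plain accuracy. As an illustration of how such scores are computed, a sketch of weighted averaging in plain Python; the class names, labels, and predictions below are invented for the example, not taken from the card's (unknown) dataset:

```python
from collections import Counter

def weighted_scores(y_true, y_pred):
    """Support-weighted recall and precision, in the style of
    scikit-learn's average='weighted' (an assumption about what
    this card reports, since the metric config is not documented)."""
    labels = set(y_true) | set(y_pred)
    support = Counter(y_true)
    n = len(y_true)
    recall = precision = 0.0
    for c in labels:
        tp = sum(t == p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        r = tp / (tp + fn) if tp + fn else 0.0
        pr = tp / (tp + fp) if tp + fp else 0.0
        # Each class contributes in proportion to its support
        recall += support[c] / n * r
        precision += support[c] / n * pr
    return recall, precision

# Invented toy data for illustration only
y_true = ["pos", "pos", "neg", "neg", "neu"]
y_pred = ["pos", "neg", "neg", "neg", "neu"]
r, p = weighted_scores(y_true, y_pred)
print(f"recall={r:.3f} precision={p:.3f}")
```

With weighted averaging the two numbers coincide only when every class is predicted equally well, which is why the card's recall and precision can diverge in the fourth decimal place.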

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.1
  • Tokenizers 0.13.3

Model tree for youlun77/DistilBERT-TC1000new-10epochs

Fine-tuned from distilbert-base-uncased.