NyxKrage committed on
Commit
4435eb1
1 Parent(s): c02568a

Upload folder using huggingface_hub

Files changed (1)
  1. clean.ipynb +1711 -0
clean.ipynb ADDED
@@ -0,0 +1,1711 @@
1
+ {
2
+ "cells": [
3
+ {
4
+ "cell_type": "code",
5
+ "execution_count": 12,
6
+ "metadata": {},
7
+ "outputs": [
8
+ {
9
+ "name": "stdout",
10
+ "output_type": "stream",
11
+ "text": [
12
+ "Requirement already satisfied: tqdm in ./.venv/lib/python3.10/site-packages (4.66.1)\n",
13
+ "Requirement already satisfied: transformers in ./.venv/lib/python3.10/site-packages (4.37.2)\n",
14
+ "Requirement already satisfied: pandas in ./.venv/lib/python3.10/site-packages (2.2.0)\n",
15
+ "Requirement already satisfied: pyarrow in ./.venv/lib/python3.10/site-packages (15.0.0)\n",
16
+ "Collecting torch\n",
17
+ " Downloading torch-2.2.0-cp310-cp310-manylinux1_x86_64.whl (755.5 MB)\n",
18
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m755.5/755.5 MB\u001b[0m \u001b[31m3.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
19
+ "\u001b[?25hRequirement already satisfied: filelock in ./.venv/lib/python3.10/site-packages (from transformers) (3.13.1)\n",
20
+ "Requirement already satisfied: huggingface-hub<1.0,>=0.19.3 in ./.venv/lib/python3.10/site-packages (from transformers) (0.20.3)\n",
21
+ "Requirement already satisfied: safetensors>=0.4.1 in ./.venv/lib/python3.10/site-packages (from transformers) (0.4.2)\n",
22
+ "Requirement already satisfied: numpy>=1.17 in ./.venv/lib/python3.10/site-packages (from transformers) (1.26.3)\n",
23
+ "Requirement already satisfied: tokenizers<0.19,>=0.14 in ./.venv/lib/python3.10/site-packages (from transformers) (0.15.1)\n",
24
+ "Requirement already satisfied: packaging>=20.0 in ./.venv/lib/python3.10/site-packages (from transformers) (23.2)\n",
25
+ "Requirement already satisfied: regex!=2019.12.17 in ./.venv/lib/python3.10/site-packages (from transformers) (2023.12.25)\n",
26
+ "Requirement already satisfied: pyyaml>=5.1 in ./.venv/lib/python3.10/site-packages (from transformers) (6.0.1)\n",
27
+ "Requirement already satisfied: requests in ./.venv/lib/python3.10/site-packages (from transformers) (2.31.0)\n",
28
+ "Requirement already satisfied: tzdata>=2022.7 in ./.venv/lib/python3.10/site-packages (from pandas) (2023.4)\n",
29
+ "Requirement already satisfied: python-dateutil>=2.8.2 in ./.venv/lib/python3.10/site-packages (from pandas) (2.8.2)\n",
30
+ "Requirement already satisfied: pytz>=2020.1 in ./.venv/lib/python3.10/site-packages (from pandas) (2023.4)\n",
31
+ "Collecting nvidia-cusparse-cu12==12.1.0.106\n",
32
+ " Using cached nvidia_cusparse_cu12-12.1.0.106-py3-none-manylinux1_x86_64.whl (196.0 MB)\n",
33
+ "Collecting nvidia-cuda-runtime-cu12==12.1.105\n",
34
+ " Using cached nvidia_cuda_runtime_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (823 kB)\n",
35
+ "Collecting triton==2.2.0\n",
36
+ " Using cached triton-2.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (167.9 MB)\n",
37
+ "Collecting nvidia-cublas-cu12==12.1.3.1\n",
38
+ " Using cached nvidia_cublas_cu12-12.1.3.1-py3-none-manylinux1_x86_64.whl (410.6 MB)\n",
39
+ "Collecting jinja2\n",
40
+ " Using cached Jinja2-3.1.3-py3-none-any.whl (133 kB)\n",
41
+ "Collecting nvidia-nvtx-cu12==12.1.105\n",
42
+ " Using cached nvidia_nvtx_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (99 kB)\n",
43
+ "Collecting sympy\n",
44
+ " Using cached sympy-1.12-py3-none-any.whl (5.7 MB)\n",
45
+ "Requirement already satisfied: typing-extensions>=4.8.0 in ./.venv/lib/python3.10/site-packages (from torch) (4.9.0)\n",
46
+ "Collecting nvidia-cuda-cupti-cu12==12.1.105\n",
47
+ " Using cached nvidia_cuda_cupti_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (14.1 MB)\n",
48
+ "Collecting nvidia-cusolver-cu12==11.4.5.107\n",
49
+ " Using cached nvidia_cusolver_cu12-11.4.5.107-py3-none-manylinux1_x86_64.whl (124.2 MB)\n",
50
+ "Collecting nvidia-cufft-cu12==11.0.2.54\n",
51
+ " Using cached nvidia_cufft_cu12-11.0.2.54-py3-none-manylinux1_x86_64.whl (121.6 MB)\n",
52
+ "Requirement already satisfied: fsspec in ./.venv/lib/python3.10/site-packages (from torch) (2023.12.2)\n",
53
+ "Collecting nvidia-cuda-nvrtc-cu12==12.1.105\n",
54
+ " Using cached nvidia_cuda_nvrtc_cu12-12.1.105-py3-none-manylinux1_x86_64.whl (23.7 MB)\n",
55
+ "Collecting networkx\n",
56
+ " Using cached networkx-3.2.1-py3-none-any.whl (1.6 MB)\n",
57
+ "Collecting nvidia-nccl-cu12==2.19.3\n",
58
+ " Downloading nvidia_nccl_cu12-2.19.3-py3-none-manylinux1_x86_64.whl (166.0 MB)\n",
59
+ "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m166.0/166.0 MB\u001b[0m \u001b[31m13.6 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m00:01\u001b[0m00:01\u001b[0m\n",
60
+ "\u001b[?25hCollecting nvidia-cudnn-cu12==8.9.2.26\n",
61
+ " Using cached nvidia_cudnn_cu12-8.9.2.26-py3-none-manylinux1_x86_64.whl (731.7 MB)\n",
62
+ "Collecting nvidia-curand-cu12==10.3.2.106\n",
63
+ " Using cached nvidia_curand_cu12-10.3.2.106-py3-none-manylinux1_x86_64.whl (56.5 MB)\n",
64
+ "Collecting nvidia-nvjitlink-cu12\n",
65
+ " Using cached nvidia_nvjitlink_cu12-12.3.101-py3-none-manylinux1_x86_64.whl (20.5 MB)\n",
66
+ "Requirement already satisfied: six>=1.5 in ./.venv/lib/python3.10/site-packages (from python-dateutil>=2.8.2->pandas) (1.16.0)\n",
67
+ "Collecting MarkupSafe>=2.0\n",
68
+ " Using cached MarkupSafe-2.1.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)\n",
69
+ "Requirement already satisfied: urllib3<3,>=1.21.1 in ./.venv/lib/python3.10/site-packages (from requests->transformers) (2.2.0)\n",
70
+ "Requirement already satisfied: charset-normalizer<4,>=2 in ./.venv/lib/python3.10/site-packages (from requests->transformers) (3.3.2)\n",
71
+ "Requirement already satisfied: certifi>=2017.4.17 in ./.venv/lib/python3.10/site-packages (from requests->transformers) (2023.11.17)\n",
72
+ "Requirement already satisfied: idna<4,>=2.5 in ./.venv/lib/python3.10/site-packages (from requests->transformers) (3.6)\n",
73
+ "Collecting mpmath>=0.19\n",
74
+ " Using cached mpmath-1.3.0-py3-none-any.whl (536 kB)\n",
75
+ "Installing collected packages: mpmath, triton, sympy, nvidia-nvtx-cu12, nvidia-nvjitlink-cu12, nvidia-nccl-cu12, nvidia-curand-cu12, nvidia-cufft-cu12, nvidia-cuda-runtime-cu12, nvidia-cuda-nvrtc-cu12, nvidia-cuda-cupti-cu12, nvidia-cublas-cu12, networkx, MarkupSafe, nvidia-cusparse-cu12, nvidia-cudnn-cu12, jinja2, nvidia-cusolver-cu12, torch\n",
76
+ "Successfully installed MarkupSafe-2.1.4 jinja2-3.1.3 mpmath-1.3.0 networkx-3.2.1 nvidia-cublas-cu12-12.1.3.1 nvidia-cuda-cupti-cu12-12.1.105 nvidia-cuda-nvrtc-cu12-12.1.105 nvidia-cuda-runtime-cu12-12.1.105 nvidia-cudnn-cu12-8.9.2.26 nvidia-cufft-cu12-11.0.2.54 nvidia-curand-cu12-10.3.2.106 nvidia-cusolver-cu12-11.4.5.107 nvidia-cusparse-cu12-12.1.0.106 nvidia-nccl-cu12-2.19.3 nvidia-nvjitlink-cu12-12.3.101 nvidia-nvtx-cu12-12.1.105 sympy-1.12 torch-2.2.0 triton-2.2.0\n",
77
+ "\n",
78
+ "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m A new release of pip is available: \u001b[0m\u001b[31;49m23.0.1\u001b[0m\u001b[39;49m -> \u001b[0m\u001b[32;49m23.3.2\u001b[0m\n",
79
+ "\u001b[1m[\u001b[0m\u001b[34;49mnotice\u001b[0m\u001b[1;39;49m]\u001b[0m\u001b[39;49m To update, run: \u001b[0m\u001b[32;49mpip install --upgrade pip\u001b[0m\n",
80
+ "Note: you may need to restart the kernel to use updated packages.\n"
81
+ ]
82
+ }
83
+ ],
84
+ "source": [
85
+ "%pip install tqdm transformers pandas pyarrow torch"
86
+ ]
87
+ },
88
+ {
89
+ "cell_type": "code",
90
+ "execution_count": 13,
91
+ "metadata": {},
92
+ "outputs": [],
93
+ "source": [
94
+ "import glob\n",
95
+ "import torch\n",
96
+ "import os\n",
97
+ "import re\n",
98
+ "import shutil\n",
99
+ "from tqdm import tqdm\n",
100
+ "from transformers import AutoTokenizer, AutoModelForCausalLM\n",
101
+ "import pandas as pd"
102
+ ]
103
+ },
104
+ {
105
+ "cell_type": "code",
106
+ "execution_count": 14,
107
+ "metadata": {},
108
+ "outputs": [],
109
+ "source": [
110
+ "model_name = \"Dans-DiscountModels/Dans-StructureEvaluator-Small\""
111
+ ]
112
+ },
113
+ {
114
+ "cell_type": "code",
115
+ "execution_count": 15,
116
+ "metadata": {},
117
+ "outputs": [
118
+ {
119
+ "ename": "RuntimeError",
120
+ "evalue": "Failed to import transformers.models.mistral.modeling_mistral because of the following error (look up to see its traceback):\nmodule 'torch._subclasses' has no attribute 'functional_tensor'",
121
+ "output_type": "error",
122
+ "traceback": [
123
+ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
124
+ "\u001b[0;31mAttributeError\u001b[0m Traceback (most recent call last)",
125
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:1364\u001b[0m, in \u001b[0;36m_LazyModule._get_module\u001b[0;34m(self, module_name)\u001b[0m\n\u001b[1;32m 1363\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m-> 1364\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mimportlib\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mimport_module\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43m.\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m \u001b[49m\u001b[38;5;241;43m+\u001b[39;49m\u001b[43m \u001b[49m\u001b[43mmodule_name\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[38;5;18;43m__name__\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1365\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n",
126
+ "File \u001b[0;32m~/.proto/tools/python/3.10.11/install/lib/python3.10/importlib/__init__.py:126\u001b[0m, in \u001b[0;36mimport_module\u001b[0;34m(name, package)\u001b[0m\n\u001b[1;32m 125\u001b[0m level \u001b[38;5;241m+\u001b[39m\u001b[38;5;241m=\u001b[39m \u001b[38;5;241m1\u001b[39m\n\u001b[0;32m--> 126\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43m_bootstrap\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_gcd_import\u001b[49m\u001b[43m(\u001b[49m\u001b[43mname\u001b[49m\u001b[43m[\u001b[49m\u001b[43mlevel\u001b[49m\u001b[43m:\u001b[49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mpackage\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mlevel\u001b[49m\u001b[43m)\u001b[49m\n",
127
+ "File \u001b[0;32m<frozen importlib._bootstrap>:1050\u001b[0m, in \u001b[0;36m_gcd_import\u001b[0;34m(name, package, level)\u001b[0m\n",
128
+ "File \u001b[0;32m<frozen importlib._bootstrap>:1027\u001b[0m, in \u001b[0;36m_find_and_load\u001b[0;34m(name, import_)\u001b[0m\n",
129
+ "File \u001b[0;32m<frozen importlib._bootstrap>:1006\u001b[0m, in \u001b[0;36m_find_and_load_unlocked\u001b[0;34m(name, import_)\u001b[0m\n",
130
+ "File \u001b[0;32m<frozen importlib._bootstrap>:688\u001b[0m, in \u001b[0;36m_load_unlocked\u001b[0;34m(spec)\u001b[0m\n",
131
+ "File \u001b[0;32m<frozen importlib._bootstrap_external>:883\u001b[0m, in \u001b[0;36mexec_module\u001b[0;34m(self, module)\u001b[0m\n",
132
+ "File \u001b[0;32m<frozen importlib._bootstrap>:241\u001b[0m, in \u001b[0;36m_call_with_frames_removed\u001b[0;34m(f, *args, **kwds)\u001b[0m\n",
133
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/models/mistral/modeling_mistral.py:28\u001b[0m\n\u001b[1;32m 27\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mtorch\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mnn\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mfunctional\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m \u001b[38;5;21;01mF\u001b[39;00m\n\u001b[0;32m---> 28\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mtorch\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mutils\u001b[39;00m\u001b[38;5;21;01m.\u001b[39;00m\u001b[38;5;21;01mcheckpoint\u001b[39;00m\n\u001b[1;32m 29\u001b[0m \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;21;01mtorch\u001b[39;00m \u001b[38;5;28;01mimport\u001b[39;00m nn\n",
134
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/torch/utils/checkpoint.py:1169\u001b[0m\n\u001b[1;32m 1163\u001b[0m \u001b[38;5;66;03m# NOTE: torch.utils.checkpoint internal logic will call these two functions unknown number of times\u001b[39;00m\n\u001b[1;32m 1164\u001b[0m \u001b[38;5;66;03m# (i.e. there could be _CachedTorchDispatchMode calls that doesn't map to a _CachingTorchDispatchMode call),\u001b[39;00m\n\u001b[1;32m 1165\u001b[0m \u001b[38;5;66;03m# so we ignore these ops and just always recompute them.\u001b[39;00m\n\u001b[1;32m 1166\u001b[0m _ignored_ops \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 1167\u001b[0m torch\u001b[38;5;241m.\u001b[39mops\u001b[38;5;241m.\u001b[39mprim\u001b[38;5;241m.\u001b[39mdevice\u001b[38;5;241m.\u001b[39mdefault,\n\u001b[1;32m 1168\u001b[0m torch\u001b[38;5;241m.\u001b[39mops\u001b[38;5;241m.\u001b[39maten\u001b[38;5;241m.\u001b[39mdetach\u001b[38;5;241m.\u001b[39mdefault,\n\u001b[0;32m-> 1169\u001b[0m } \u001b[38;5;241m|\u001b[39m \u001b[38;5;28mset\u001b[39m(\u001b[43mtorch\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_subclasses\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfunctional_tensor\u001b[49m\u001b[38;5;241m.\u001b[39mFunctionalTensor\u001b[38;5;241m.\u001b[39mmetadata_fns)\n\u001b[1;32m 1172\u001b[0m \u001b[38;5;28;01mclass\u001b[39;00m \u001b[38;5;21;01m_CachingTorchDispatchMode\u001b[39;00m(TorchDispatchMode):\n",
135
+ "\u001b[0;31mAttributeError\u001b[0m: module 'torch._subclasses' has no attribute 'functional_tensor'",
136
+ "\nThe above exception was the direct cause of the following exception:\n",
137
+ "\u001b[0;31mRuntimeError\u001b[0m Traceback (most recent call last)",
138
+ "Cell \u001b[0;32mIn[15], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m model \u001b[38;5;241m=\u001b[39m \u001b[43mAutoModelForCausalLM\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfrom_pretrained\u001b[49m\u001b[43m(\u001b[49m\u001b[43mmodel_name\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mdevice_map\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43m \u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mcuda\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m)\u001b[49m\n\u001b[1;32m 2\u001b[0m tokenizer \u001b[38;5;241m=\u001b[39m AutoTokenizer\u001b[38;5;241m.\u001b[39mfrom_pretrained(model_name)\n",
139
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:565\u001b[0m, in \u001b[0;36m_BaseAutoModelClass.from_pretrained\u001b[0;34m(cls, pretrained_model_name_or_path, *model_args, **kwargs)\u001b[0m\n\u001b[1;32m 561\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m model_class\u001b[38;5;241m.\u001b[39mfrom_pretrained(\n\u001b[1;32m 562\u001b[0m pretrained_model_name_or_path, \u001b[38;5;241m*\u001b[39mmodel_args, config\u001b[38;5;241m=\u001b[39mconfig, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mhub_kwargs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs\n\u001b[1;32m 563\u001b[0m )\n\u001b[1;32m 564\u001b[0m \u001b[38;5;28;01melif\u001b[39;00m \u001b[38;5;28mtype\u001b[39m(config) \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mcls\u001b[39m\u001b[38;5;241m.\u001b[39m_model_mapping\u001b[38;5;241m.\u001b[39mkeys():\n\u001b[0;32m--> 565\u001b[0m model_class \u001b[38;5;241m=\u001b[39m \u001b[43m_get_model_class\u001b[49m\u001b[43m(\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43mcls\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_model_mapping\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 566\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m model_class\u001b[38;5;241m.\u001b[39mfrom_pretrained(\n\u001b[1;32m 567\u001b[0m pretrained_model_name_or_path, \u001b[38;5;241m*\u001b[39mmodel_args, config\u001b[38;5;241m=\u001b[39mconfig, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mhub_kwargs, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs\n\u001b[1;32m 568\u001b[0m )\n\u001b[1;32m 569\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 570\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mUnrecognized configuration class \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mconfig\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__class__\u001b[39m\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m for this kind of AutoModel: \u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mcls\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m.\u001b[39m\u001b[38;5;130;01m\\n\u001b[39;00m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 571\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mModel type should be one of \u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124m, \u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;241m.\u001b[39mjoin(c\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m\u001b[38;5;250m \u001b[39m\u001b[38;5;28;01mfor\u001b[39;00m\u001b[38;5;250m \u001b[39mc\u001b[38;5;250m \u001b[39m\u001b[38;5;129;01min\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[38;5;28mcls\u001b[39m\u001b[38;5;241m.\u001b[39m_model_mapping\u001b[38;5;241m.\u001b[39mkeys())\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m.\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 572\u001b[0m )\n",
140
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:387\u001b[0m, in \u001b[0;36m_get_model_class\u001b[0;34m(config, model_mapping)\u001b[0m\n\u001b[1;32m 386\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_get_model_class\u001b[39m(config, model_mapping):\n\u001b[0;32m--> 387\u001b[0m supported_models \u001b[38;5;241m=\u001b[39m \u001b[43mmodel_mapping\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;28;43mtype\u001b[39;49m\u001b[43m(\u001b[49m\u001b[43mconfig\u001b[49m\u001b[43m)\u001b[49m\u001b[43m]\u001b[49m\n\u001b[1;32m 388\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28misinstance\u001b[39m(supported_models, (\u001b[38;5;28mlist\u001b[39m, \u001b[38;5;28mtuple\u001b[39m)):\n\u001b[1;32m 389\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m supported_models\n",
141
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:740\u001b[0m, in \u001b[0;36m_LazyAutoMapping.__getitem__\u001b[0;34m(self, key)\u001b[0m\n\u001b[1;32m 738\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m model_type \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_model_mapping:\n\u001b[1;32m 739\u001b[0m model_name \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_model_mapping[model_type]\n\u001b[0;32m--> 740\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_load_attr_from_module\u001b[49m\u001b[43m(\u001b[49m\u001b[43mmodel_type\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mmodel_name\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 742\u001b[0m \u001b[38;5;66;03m# Maybe there was several model types associated with this config.\u001b[39;00m\n\u001b[1;32m 743\u001b[0m model_types \u001b[38;5;241m=\u001b[39m [k \u001b[38;5;28;01mfor\u001b[39;00m k, v \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_config_mapping\u001b[38;5;241m.\u001b[39mitems() \u001b[38;5;28;01mif\u001b[39;00m v \u001b[38;5;241m==\u001b[39m key\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m]\n",
142
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:754\u001b[0m, in \u001b[0;36m_LazyAutoMapping._load_attr_from_module\u001b[0;34m(self, model_type, attr)\u001b[0m\n\u001b[1;32m 752\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m module_name \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_modules:\n\u001b[1;32m 753\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_modules[module_name] \u001b[38;5;241m=\u001b[39m importlib\u001b[38;5;241m.\u001b[39mimport_module(\u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m.\u001b[39m\u001b[38;5;132;01m{\u001b[39;00mmodule_name\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtransformers.models\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[0;32m--> 754\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mgetattribute_from_module\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_modules\u001b[49m\u001b[43m[\u001b[49m\u001b[43mmodule_name\u001b[49m\u001b[43m]\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mattr\u001b[49m\u001b[43m)\u001b[49m\n",
143
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:698\u001b[0m, in \u001b[0;36mgetattribute_from_module\u001b[0;34m(module, attr)\u001b[0m\n\u001b[1;32m 696\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28misinstance\u001b[39m(attr, \u001b[38;5;28mtuple\u001b[39m):\n\u001b[1;32m 697\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mtuple\u001b[39m(getattribute_from_module(module, a) \u001b[38;5;28;01mfor\u001b[39;00m a \u001b[38;5;129;01min\u001b[39;00m attr)\n\u001b[0;32m--> 698\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28;43mhasattr\u001b[39;49m\u001b[43m(\u001b[49m\u001b[43mmodule\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mattr\u001b[49m\u001b[43m)\u001b[49m:\n\u001b[1;32m 699\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mgetattr\u001b[39m(module, attr)\n\u001b[1;32m 700\u001b[0m \u001b[38;5;66;03m# Some of the mappings have entries model_type -> object of another model type. In that case we try to grab the\u001b[39;00m\n\u001b[1;32m 701\u001b[0m \u001b[38;5;66;03m# object at the top level.\u001b[39;00m\n",
144
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:1354\u001b[0m, in \u001b[0;36m_LazyModule.__getattr__\u001b[0;34m(self, name)\u001b[0m\n\u001b[1;32m 1352\u001b[0m value \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_get_module(name)\n\u001b[1;32m 1353\u001b[0m \u001b[38;5;28;01melif\u001b[39;00m name \u001b[38;5;129;01min\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_class_to_module\u001b[38;5;241m.\u001b[39mkeys():\n\u001b[0;32m-> 1354\u001b[0m module \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_get_module\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_class_to_module\u001b[49m\u001b[43m[\u001b[49m\u001b[43mname\u001b[49m\u001b[43m]\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1355\u001b[0m value \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mgetattr\u001b[39m(module, name)\n\u001b[1;32m 1356\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n",
145
+ "File \u001b[0;32m~/ai/llm/datasets/NyxKrage_bambisleep/.venv/lib/python3.10/site-packages/transformers/utils/import_utils.py:1366\u001b[0m, in \u001b[0;36m_LazyModule._get_module\u001b[0;34m(self, module_name)\u001b[0m\n\u001b[1;32m 1364\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m importlib\u001b[38;5;241m.\u001b[39mimport_module(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m.\u001b[39m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;241m+\u001b[39m module_name, \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m)\n\u001b[1;32m 1365\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[0;32m-> 1366\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mRuntimeError\u001b[39;00m(\n\u001b[1;32m 1367\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mFailed to import \u001b[39m\u001b[38;5;132;01m{\u001b[39;00m\u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__name__\u001b[39m\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m.\u001b[39m\u001b[38;5;132;01m{\u001b[39;00mmodule_name\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m because of the following error (look up to see its\u001b[39m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 1368\u001b[0m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124m traceback):\u001b[39m\u001b[38;5;130;01m\\n\u001b[39;00m\u001b[38;5;132;01m{\u001b[39;00me\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m\n\u001b[1;32m 1369\u001b[0m ) \u001b[38;5;28;01mfrom\u001b[39;00m \u001b[38;5;21;01me\u001b[39;00m\n",
146
+ "\u001b[0;31mRuntimeError\u001b[0m: Failed to import transformers.models.mistral.modeling_mistral because of the following error (look up to see its traceback):\nmodule 'torch._subclasses' has no attribute 'functional_tensor'"
147
+ ]
148
+ }
149
+ ],
150
+ "source": [
151
+ "model = AutoModelForCausalLM.from_pretrained(model_name, device_map = \"cuda\")\n",
152
+ "tokenizer = AutoTokenizer.from_pretrained(model_name)"
153
+ ]
154
+ },
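The traceback above is most likely a stale-import artifact rather than a code bug: torch was upgraded by the `%pip install` cell inside the already-running kernel, so `transformers` resolved pieces of the old torch build and `torch._subclasses.functional_tensor` was missing until a restart (the pip log above even says "you may need to restart the kernel"). A minimal sanity check after restarting, assuming the same environment:

```python
# After restarting the kernel, both libraries should report the freshly
# installed versions (torch 2.2.0 / transformers 4.37.2 per the pip log above).
import torch
import transformers

print(torch.__version__, transformers.__version__)
print(torch.cuda.is_available())  # later cells assume a CUDA device ("cuda:0")
```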
155
+ {
156
+ "cell_type": "code",
157
+ "execution_count": 51,
158
+ "metadata": {},
159
+ "outputs": [],
160
+ "source": [
161
+ "def calculate_perplexity(text):\n",
162
+ " input_ids = torch.tensor([tokenizer.encode(text)]).to(\"cuda:0\")\n",
163
+ "\n",
164
+ " model.eval()\n",
165
+ "\n",
166
+ " with torch.no_grad():\n",
167
+ " outputs = model(input_ids, labels=input_ids)\n",
168
+ " loss = outputs[0]\n",
169
+ "\n",
170
+ " del outputs\n",
171
+ " torch.cuda.empty_cache()\n",
172
+ "\n",
173
+ " return torch.exp(loss).item()"
174
+ ]
175
+ },
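`calculate_perplexity` scores a string with the model's own next-token loss and returns `exp(loss)`, so lower values mean the model finds the text more natural. A minimal usage sketch, assuming the model and tokenizer loaded successfully after a kernel restart (the sample strings are made up for illustration):

```python
# The run-together variant should normally score worse (higher perplexity)
# than the correctly spaced one.
print(calculate_perplexity("Deeper and deeper."))
print(calculate_perplexity("Deeperand deeper."))
```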
176
+ {
177
+ "cell_type": "code",
178
+ "execution_count": 57,
179
+ "metadata": {},
180
+ "outputs": [],
181
+ "source": [
182
+ "def split_data_into_windows(data):\n",
183
+ " lines = data.splitlines()\n",
184
+ " windows = []\n",
185
+ "\n",
186
+ " for i in range(1, len(lines)):\n",
187
+ " # Get the previous line and current line\n",
188
+ " prev_line = lines[i - 1]\n",
189
+ " curr_line = lines[i]\n",
190
+ "\n",
191
+ " # If it's the first line or the last line, don't split it\n",
192
+ " if i == 1:\n",
193
+ " mid_index = len(curr_line) // 2\n",
194
+ " first_half = curr_line[:mid_index]\n",
195
+ " windows.append((prev_line, first_half))\n",
196
+ " elif i == len(lines) - 1:\n",
197
+ " mid_index = len(prev_line) // 2\n",
198
+ " second_half = prev_line[mid_index:]\n",
199
+ " windows.append((second_half, curr_line))\n",
200
+ " else:\n",
201
+ " # Split the current line into two halves\n",
202
+ " curr_mid_index = len(curr_line) // 2\n",
203
+ " prev_mid_index = len(prev_line) // 2\n",
204
+ " first_half = curr_line[:curr_mid_index]\n",
205
+ " second_half = prev_line[prev_mid_index:]\n",
206
+ "\n",
207
+ " windows.append((second_half, first_half))\n",
208
+ " return windows"
209
+ ]
210
+ },
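Each window pairs the trailing half of one line with the leading half of the next, so the later join step only has to decide how to bridge the single line break in the middle of each window. (One edge case: for a two-line input the `i == 1` branch wins and the `elif` never fires, so the second half of the final line is dropped.) A worked example on an illustrative three-line sample:

```python
# Two windows come out of three lines: the boundary lines stay whole,
# while the middle line is split in half across both windows.
sample = "first line\nsecond line\nthird line"
for left, right in split_data_into_windows(sample):
    print(repr(left), "|", repr(right))
# 'first line' | 'secon'
# 'd line' | 'third line'
```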
211
+ {
212
+ "cell_type": "code",
213
+ "execution_count": 41,
214
+ "metadata": {},
215
+ "outputs": [],
216
+ "source": [
217
+ "def join_lines(lines):\n",
218
+ " with_space = lines[0] + \" \" + lines[1]\n",
219
+ " without_space = lines[0] + lines[1]\n",
220
+ " with_space_perplexity = calculate_perplexity(with_space)\n",
221
+ " without_space_perplexity = calculate_perplexity(without_space)\n",
222
+ " if with_space_perplexity < without_space_perplexity:\n",
223
+ " return with_space\n",
224
+ " else:\n",
225
+ " return without_space"
226
+ ]
227
+ },
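This is the heart of the cleanup: the model arbitrates whether a hard line break fell between words (rejoin with a space) or inside a word (rejoin without one), by keeping whichever candidate it finds less surprising. A hedged sketch of both cases, with illustrative inputs; the outcomes depend on the model's judgment:

```python
# A break inside "deeper" should rejoin without a space; a break between
# whole words should rejoin with one.
print(join_lines(("Deeper and dee", "per.")))  # expected: "Deeper and deeper."
print(join_lines(("Deeper and", "deeper.")))   # expected: "Deeper and deeper."
```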
228
+ {
229
+ "cell_type": "code",
230
+ "execution_count": 42,
231
+ "metadata": {},
232
+ "outputs": [],
233
+ "source": [
234
+ "def join_windows(windows):\n",
235
+ " output = \"\"\n",
236
+ " for window in windows:\n",
237
+ " output += join_lines(window)\n",
238
+ " return output"
239
+ ]
240
+ },
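Concatenating the per-window joins reconstructs the whole text: the two halves of each interior line end one window and start the next, so they butt back together with no separator, and only the original line breaks get adjudicated by the model. An end-to-end sketch on a made-up sample:

```python
# Only the two original line breaks are decided by the model; the split
# halves of the middle line rejoin seamlessly by plain concatenation.
sample = "Deeper and dee\nper and deeper.\nGood Girl."
print(join_windows(split_data_into_windows(sample)))
# expected: "Deeper and deeper and deeper. Good Girl."
```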
241
+ {
242
+ "cell_type": "code",
243
+ "execution_count": 43,
244
+ "metadata": {},
245
+ "outputs": [],
246
+ "source": [
247
+ "files = glob.glob(\"orig/*.txt\")"
248
+ ]
249
+ },
250
+ {
251
+ "cell_type": "code",
252
+ "execution_count": 44,
253
+ "metadata": {},
254
+ "outputs": [],
255
+ "source": [
256
+ "# remove all non-original files that have an original file associated with them\n",
257
+ "copy_files = {}\n",
258
+ "for file in files:\n",
259
+ " if file in copy_files:\n",
260
+ " continue\n",
261
+ " if \" (_ORIGINAL SERIES)\" in file:\n",
262
+ " original_file = file.replace(\" (_ORIGINAL SERIES)\", \"\")\n",
263
+ " copy_files[file] = original_file.replace(\"orig/\", \"dedup/\")\n",
264
+ " elif \" (_UNCHANGED)\" in file:\n",
265
+ " original_file = file.replace(\" (_UNCHANGED)\", \"\")\n",
266
+ " copy_files[file] = original_file.replace(\"orig/\", \"dedup/\")\n",
267
+ " else:\n",
268
+ " copy_files[file] = file.replace(\"orig/\", \"dedup/\")\n",
269
+ "\n",
270
+ "# drop entries whose target filename carries a parenthesized suffix when the suffix-free target is already produced by another entry\n",
271
+ "to_remove = []\n",
272
+ "for kv in copy_files.items():\n",
273
+ " if \"(\" in kv[1]:\n",
274
+ " original_file = re.sub(r\" \\(.+\\)\", \"\", kv[1])\n",
275
+ " if original_file in copy_files.values():\n",
276
+ " to_remove.append(kv[0])\n",
277
+ "\n",
278
+ "for file in to_remove:\n",
279
+ " del copy_files[file]\n",
280
+ "\n",
281
+ "for kv in copy_files.items():\n",
282
+ " if \"+\" in kv[1]:\n",
283
+ " original_file = re.sub(r\" \\(.+\\)\", \"\", kv[1])\n",
284
+ " copy_files[kv[0]] = original_file\n",
285
+ "\n",
286
+ "to_remove.clear()\n",
287
+ "for kv in copy_files.items():\n",
288
+ " if \"(\" in kv[1]:\n",
289
+ " original_file = re.sub(r\" \\(.+\\)\", \"\", kv[1])\n",
290
+ " if original_file in copy_files.values():\n",
291
+ " to_remove.append(kv[0])\n",
292
+ "\n",
293
+ "for file in to_remove:\n",
294
+ " del copy_files[file]\n",
295
+ "\n",
296
+ "for kv in copy_files.items():\n",
297
+ " if \"(Full)\" in kv[1]:\n",
298
+ " original_file = re.sub(r\" \\(.+\\)\", \"\", kv[1])\n",
299
+ " if original_file not in copy_files.values():\n",
300
+ " copy_files[kv[0]] = original_file\n",
301
+ "\n",
302
+ "to_remove.clear()\n",
303
+ "for kv in copy_files.items():\n",
304
+ " if \"(\" in kv[1]:\n",
305
+ " to_remove.append(kv[0])\n",
306
+ "\n",
307
+ "for file in to_remove:\n",
308
+ " del copy_files[file]\n",
309
+ "\n",
310
+ "# copy files to new directory\n",
311
+ "shutil.rmtree(\"dedup\", ignore_errors=True)\n",
312
+ "os.makedirs(\"dedup\", exist_ok=True)\n",
313
+ "for kv in copy_files.items():\n",
314
+ " shutil.copy(kv[0], kv[1])\n"
315
+ ]
316
+ },
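All of the passes above hinge on one normalization: stripping a trailing parenthesized suffix from the target filename so variants collapse onto a single canonical name. A quick illustration of that regex on hypothetical filenames:

```python
import re

# The suffix-stripping rule reused throughout the dedup pass above.
for name in ["dedup/Induction (Full).txt", "dedup/Induction (2).txt", "dedup/Induction.txt"]:
    print(re.sub(r" \(.+\)", "", name))
# all three print: dedup/Induction.txt
```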
317
+ {
318
+ "cell_type": "code",
319
+ "execution_count": 45,
320
+ "metadata": {},
321
+ "outputs": [],
322
+ "source": [
323
+ "files = glob.glob(\"dedup/*.txt\")"
324
+ ]
325
+ },
326
+ {
327
+ "cell_type": "code",
328
+ "execution_count": 46,
329
+ "metadata": {},
330
+ "outputs": [],
331
+ "source": [
332
+ "# remove all empty lines in the dedup files\n",
333
+ "shutil.rmtree(\"dedup_no_empty\", ignore_errors=True)\n",
334
+ "os.makedirs(\"dedup_no_empty\", exist_ok=True)\n",
335
+ "for file in files:\n",
336
+ " with open(file, \"r\") as f:\n",
337
+ " data = f.read()\n",
338
+ " while \"\\n\\n\" in data:\n",
339
+ " data = data.replace(\"\\n\\n\", \"\\n\")\n",
340
+ " new_file = file.replace(\"dedup\", \"dedup_no_empty\")\n",
341
+ " with open(new_file, \"w\") as f:\n",
342
+ " f.write(data)"
343
+ ]
344
+ },
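The `while` loop keeps rescanning the whole string until no `\n\n` remains; an equivalent single-pass form with a regex is sketched below (same result, just fewer passes):

```python
import re

# Collapse any run of consecutive newlines to a single one, in one pass.
data = "line one\n\n\n\nline two\n"
assert re.sub(r"\n{2,}", "\n", data) == "line one\nline two\n"
```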
345
+ {
346
+ "cell_type": "code",
347
+ "execution_count": 47,
348
+ "metadata": {},
349
+ "outputs": [],
350
+ "source": [
351
+ "files = glob.glob(\"dedup_no_empty/*.txt\")"
352
+ ]
353
+ },
354
+ {
355
+ "cell_type": "code",
356
+ "execution_count": 60,
357
+ "metadata": {},
358
+ "outputs": [
359
+ {
360
+ "name": "stderr",
361
+ "output_type": "stream",
362
+ "text": [
363
+ "100%|██████████| 30/30 [01:56<00:00, 3.88s/it]\n"
364
+ ]
365
+ }
366
+ ],
367
+ "source": [
368
+ "shutil.rmtree(\"joined\", ignore_errors=True)\n",
369
+ "os.makedirs(\"joined\", exist_ok=True)\n",
370
+ "for file in tqdm(files):\n",
371
+ " with open(file, \"r\") as f:\n",
372
+ " data = f.read()\n",
373
+ " windows = split_data_into_windows(data)\n",
374
+ " joined = join_windows(windows)\n",
375
+ " new_file = file.replace(\"dedup_no_empty\", \"joined\")\n",
376
+ " # spacing after periods and commas is normalized in the cleaned step below,\n",
377
+ " # so no further processing is needed here\n",
378
+ " \n",
379
+ " with open(new_file, \"w\") as f:\n",
380
+ " f.write(joined)"
381
+ ]
382
+ },
383
+ {
384
+ "cell_type": "code",
385
+ "execution_count": 24,
386
+ "metadata": {},
387
+ "outputs": [],
388
+ "source": [
389
+ "files = glob.glob(\"joined/*.txt\")"
390
+ ]
391
+ },
392
+ {
393
+ "cell_type": "code",
394
+ "execution_count": 26,
395
+ "metadata": {},
396
+ "outputs": [
397
+ {
398
+ "name": "stderr",
399
+ "output_type": "stream",
400
+ "text": [
401
+ "100%|██████████| 30/30 [00:00<00:00, 8242.44it/s]\n"
402
+ ]
403
+ }
404
+ ],
405
+ "source": [
406
+ "shutil.rmtree(\"cleaned\", ignore_errors=True)\n",
407
+ "os.makedirs(\"cleaned\", exist_ok=True)\n",
408
+ "for file in tqdm(files):\n",
409
+ " with open(file, \"r\") as f:\n",
410
+ " data = f.read()\n",
411
+ " data = data.replace(\".\", \". \").replace(\",\", \", \").replace(\" \", \" \").replace('\"', \"\")\n",
412
+ " new_file = file.replace(\"joined\", \"cleaned\")\n",
413
+ " with open(new_file, \"w\") as f:\n",
414
+ " f.write(data)"
415
+ ]
416
+ },
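One caveat with the bare `str.replace` calls: they pad every period and comma, including ones inside numbers or ellipses, and the double-space pass only collapses one level per occurrence. A more targeted variant (a sketch, not what this notebook ran) inserts a space only when a letter follows the punctuation:

```python
import re

text = "Deeper,deeper.Feeling better...so good. Version 2.0"
# Space after '.' or ',' only when a letter follows; the digits and dot
# inside "2.0" are left alone, unlike the plain replace above.
print(re.sub(r"([.,])(?=[A-Za-z])", r"\1 ", text))
# Deeper, deeper. Feeling better... so good. Version 2.0
```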
417
+ {
418
+ "cell_type": "code",
419
+ "execution_count": 27,
420
+ "metadata": {},
421
+ "outputs": [],
422
+ "source": [
423
+ "files = glob.glob(\"cleaned/*.txt\")"
424
+ ]
425
+ },
426
+ {
427
+ "cell_type": "code",
428
+ "execution_count": 29,
429
+ "metadata": {},
430
+ "outputs": [
431
+ {
432
+ "name": "stderr",
433
+ "output_type": "stream",
434
+ "text": [
435
+ "100%|██████████| 30/30 [00:00<00:00, 44779.05it/s]\n"
436
+ ]
437
+ },
438
+ {
439
+ "data": {
440
+ "text/html": [
441
+ "<div>\n",
442
+ "<style scoped>\n",
443
+ " .dataframe tbody tr th:only-of-type {\n",
444
+ " vertical-align: middle;\n",
445
+ " }\n",
446
+ "\n",
447
+ " .dataframe tbody tr th {\n",
448
+ " vertical-align: top;\n",
449
+ " }\n",
450
+ "\n",
451
+ " .dataframe thead th {\n",
452
+ " text-align: right;\n",
453
+ " }\n",
454
+ "</style>\n",
455
+ "<table border=\"1\" class=\"dataframe\">\n",
456
+ " <thead>\n",
457
+ " <tr style=\"text-align: right;\">\n",
458
+ " <th></th>\n",
459
+ " <th>text</th>\n",
460
+ " </tr>\n",
461
+ " </thead>\n",
462
+ " <tbody>\n",
463
+ " <tr>\n",
464
+ " <th>0</th>\n",
465
+ " <td>Just Bambi Sleep. Deeper and deeper. Feeling b...</td>\n",
466
+ " </tr>\n",
467
+ " <tr>\n",
468
+ " <th>1</th>\n",
469
+ " <td>Every command Bambi has accepted in this sessi...</td>\n",
470
+ " </tr>\n",
471
+ " <tr>\n",
472
+ " <th>2</th>\n",
473
+ " <td>Feeling wonderfully obedient Bambi. Perfectly ...</td>\n",
474
+ " </tr>\n",
475
+ " <tr>\n",
476
+ " <th>3</th>\n",
477
+ " <td>That's right. Doing so well. Like a pretty gir...</td>\n",
478
+ " </tr>\n",
479
+ " <tr>\n",
480
+ " <th>4</th>\n",
481
+ " <td>Bambi is completely susceptible to hypnosis. A...</td>\n",
482
+ " </tr>\n",
483
+ " <tr>\n",
484
+ " <th>5</th>\n",
485
+ " <td>That's right Bambi. Eyelids fluttering like a ...</td>\n",
486
+ " </tr>\n",
487
+ " <tr>\n",
488
+ " <th>6</th>\n",
489
+ " <td>That's right Bambi feeling so good. Just drift...</td>\n",
490
+ " </tr>\n",
491
+ " <tr>\n",
492
+ " <th>7</th>\n",
493
+ " <td>Now as Bambi Sleeps deeper and deeper. Fuzzier...</td>\n",
494
+ " </tr>\n",
495
+ " <tr>\n",
496
+ " <th>8</th>\n",
497
+ " <td>Bambi Sleep Now. And Bambi suddenly finds hers...</td>\n",
498
+ " </tr>\n",
499
+ " <tr>\n",
500
+ " <th>9</th>\n",
501
+ " <td>Bambi is a perfect fuckable fashion puppet. A ...</td>\n",
502
+ " </tr>\n",
503
+ " <tr>\n",
504
+ " <th>10</th>\n",
505
+ " <td>Drifting deeper and deeper Bambi. More and mor...</td>\n",
506
+ " </tr>\n",
507
+ " <tr>\n",
508
+ " <th>11</th>\n",
509
+ " <td>[Various voices looping conditioning phrases]S...</td>\n",
510
+ " </tr>\n",
511
+ " <tr>\n",
512
+ " <th>12</th>\n",
513
+ " <td>Bambi Sleep Now. Just Bambi Sleep. Every time ...</td>\n",
514
+ " </tr>\n",
515
+ " <tr>\n",
516
+ " <th>13</th>\n",
517
+ " <td>Feeling the need to be Primped And Pampered. ...</td>\n",
518
+ " </tr>\n",
519
+ " <tr>\n",
520
+ " <th>14</th>\n",
521
+ " <td>That's a Good Girl Bambi. Feeling more and mor...</td>\n",
522
+ " </tr>\n",
523
+ " <tr>\n",
524
+ " <th>15</th>\n",
525
+ " <td>And Bambi is so perfectly trapped in her bimbo...</td>\n",
526
+ " </tr>\n",
527
+ " <tr>\n",
528
+ " <th>16</th>\n",
529
+ " <td>And Bambi every time you go so deeply into tra...</td>\n",
530
+ " </tr>\n",
531
+ " <tr>\n",
532
+ " <th>17</th>\n",
533
+ " <td>Bambi Freeze. That's a Good Girl. All thoughts...</td>\n",
534
+ " </tr>\n",
535
+ " <tr>\n",
536
+ " <th>18</th>\n",
537
+ " <td>Such a Good Girl Bambi. Feeling so much pleasu...</td>\n",
538
+ " </tr>\n",
539
+ " <tr>\n",
540
+ " <th>19</th>\n",
541
+ " <td>Soon it will be time for you to awaken Bambi. ...</td>\n",
542
+ " </tr>\n",
543
+ " <tr>\n",
544
+ " <th>20</th>\n",
545
+ " <td>That's right Bambi. Feeling so wonderful. Feel...</td>\n",
546
+ " </tr>\n",
547
+ " <tr>\n",
548
+ " <th>21</th>\n",
549
+ " <td>Take a deep breath Bambi. Hold it for a moment...</td>\n",
550
+ " </tr>\n",
551
+ " <tr>\n",
552
+ " <th>22</th>\n",
553
+ " <td>That's a Good Girl Bambi. Feeling more and mor...</td>\n",
554
+ " </tr>\n",
555
+ " <tr>\n",
556
+ " <th>23</th>\n",
557
+ " <td>Bambi Sleep now. No resistance, must sleep now...</td>\n",
558
+ " </tr>\n",
559
+ " <tr>\n",
560
+ " <th>24</th>\n",
561
+ " <td>Slipping deeper and deeper now Bambi. Just tak...</td>\n",
562
+ " </tr>\n",
563
+ " <tr>\n",
564
+ " <th>25</th>\n",
565
+ " <td>Hi sweetie! Welcome to the Sleepy Girl Salon B...</td>\n",
566
+ " </tr>\n",
567
+ " <tr>\n",
568
+ " <th>26</th>\n",
569
+ " <td>It's time to go so much deeper Bambi. And ther...</td>\n",
570
+ " </tr>\n",
571
+ " <tr>\n",
572
+ " <th>27</th>\n",
573
+ " <td>That's right. Deeply asleep now Bambi. Complet...</td>\n",
574
+ " </tr>\n",
575
+ " <tr>\n",
576
+ " <th>28</th>\n",
577
+ " <td>Bambi can feel her perfect heaving titties now...</td>\n",
578
+ " </tr>\n",
579
+ " <tr>\n",
580
+ " <th>29</th>\n",
581
+ " <td>That's it Bambi. So peaceful. So perfect. So b...</td>\n",
582
+ " </tr>\n",
583
+ " </tbody>\n",
584
+ "</table>\n",
585
+ "</div>"
586
+ ],
587
+ "text/plain": [
588
+ " text\n",
589
+ "0 Just Bambi Sleep. Deeper and deeper. Feeling b...\n",
590
+ "1 Every command Bambi has accepted in this sessi...\n",
591
+ "2 Feeling wonderfully obedient Bambi. Perfectly ...\n",
592
+ "3 That's right. Doing so well. Like a pretty gir...\n",
593
+ "4 Bambi is completely susceptible to hypnosis. A...\n",
594
+ "5 That's right Bambi. Eyelids fluttering like a ...\n",
595
+ "6 That's right Bambi feeling so good. Just drift...\n",
596
+ "7 Now as Bambi Sleeps deeper and deeper. Fuzzier...\n",
597
+ "8 Bambi Sleep Now. And Bambi suddenly finds hers...\n",
598
+ "9 Bambi is a perfect fuckable fashion puppet. A ...\n",
599
+ "10 Drifting deeper and deeper Bambi. More and mor...\n",
600
+ "11 [Various voices looping conditioning phrases]S...\n",
601
+ "12 Bambi Sleep Now. Just Bambi Sleep. Every time ...\n",
602
+ "13 Feeling the need to be Primped And Pampered. ...\n",
603
+ "14 That's a Good Girl Bambi. Feeling more and mor...\n",
604
+ "15 And Bambi is so perfectly trapped in her bimbo...\n",
605
+ "16 And Bambi every time you go so deeply into tra...\n",
606
+ "17 Bambi Freeze. That's a Good Girl. All thoughts...\n",
607
+ "18 Such a Good Girl Bambi. Feeling so much pleasu...\n",
608
+ "19 Soon it will be time for you to awaken Bambi. ...\n",
609
+ "20 That's right Bambi. Feeling so wonderful. Feel...\n",
610
+ "21 Take a deep breath Bambi. Hold it for a moment...\n",
611
+ "22 That's a Good Girl Bambi. Feeling more and mor...\n",
612
+ "23 Bambi Sleep now. No resistance, must sleep now...\n",
613
+ "24 Slipping deeper and deeper now Bambi. Just tak...\n",
614
+ "25 Hi sweetie! Welcome to the Sleepy Girl Salon B...\n",
615
+ "26 It's time to go so much deeper Bambi. And ther...\n",
616
+ "27 That's right. Deeply asleep now Bambi. Complet...\n",
617
+ "28 Bambi can feel her perfect heaving titties now...\n",
618
+ "29 That's it Bambi. So peaceful. So perfect. So b..."
619
+ ]
620
+ },
621
+ "execution_count": 29,
622
+ "metadata": {},
623
+ "output_type": "execute_result"
624
+ }
625
+ ],
626
+ "source": [
627
+ "# load each file as a row with the text in a column called text\n",
628
+ "texts = []\n",
629
+ "full_text = \"\"\n",
630
+ "for file in tqdm(files):\n",
631
+ " with open(file, \"r\") as f:\n",
632
+ " data = f.read()\n",
633
+ " texts.append(data)\n",
634
+ " full_text += data + \"\\n\"\n",
635
+ "\n",
636
+ "file = \"bambisleep.parquet\"\n",
637
+ "df = pd.DataFrame({ \"text\": texts }, columns=[\"text\"])\n",
638
+ "df.to_parquet(file)\n",
639
+ "df"
640
+ ]
641
+ },
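A quick read-back confirms the parquet round-trips cleanly; the counts should match the dataframe shown above (30 rows, one `text` column). A hedged verification sketch (`check` is a hypothetical name):

```python
# Round-trip check on the freshly written file.
check = pd.read_parquet("bambisleep.parquet")
print(len(check), list(check.columns))      # 30 ['text']
print(int(check["text"].str.len().mean()))  # rough average script length
```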
642
+ {
643
+ "cell_type": "code",
644
+ "execution_count": 30,
645
+ "metadata": {},
646
+ "outputs": [
647
+ {
648
+ "name": "stderr",
649
+ "output_type": "stream",
650
+ "text": [
651
+ "tokenizer_config.json: 100%|██████████| 1.29k/1.29k [00:00<00:00, 9.67MB/s]\n",
652
+ "tokenizer.model: 100%|██████████| 500k/500k [00:00<00:00, 5.74MB/s]\n",
653
+ "tokenizer.json: 100%|██████████| 1.84M/1.84M [00:00<00:00, 4.03MB/s]\n",
654
+ "special_tokens_map.json: 100%|██████████| 551/551 [00:00<00:00, 4.15MB/s]\n"
655
+ ]
656
+ }
657
+ ],
658
+ "source": [
659
+ "end_tokenizer = AutoTokenizer.from_pretrained(\"TinyLlama/TinyLlama-1.1B-Chat-v1.0\")"
660
+ ]
661
+ },
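The next cell tokenizes the concatenated corpus and trips TinyLlama's context-length warning (77787 tokens against a 2048-token maximum). That is harmless when merely counting or inspecting tokens, but anything fed to the model would need chunking; a minimal sketch, reusing `full_text` from the parquet cell above:

```python
# Token count plus naive fixed-size chunking to the 2048-token context.
ids = end_tokenizer.encode(full_text)
chunks = [ids[i:i + 2048] for i in range(0, len(ids), 2048)]
print(len(ids), "tokens ->", len(chunks), "chunks")
```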
662
+ {
663
+ "cell_type": "code",
664
+ "execution_count": 31,
665
+ "metadata": {},
666
+ "outputs": [
667
+ {
668
+ "name": "stderr",
669
+ "output_type": "stream",
670
+ "text": [
671
+ "Token indices sequence length is longer than the specified maximum sequence length for this model (77787 > 2048). Running this sequence through the model will result in indexing errors\n"
672
+ ]
673
+ },
674
+ {
675
+ "data": {
676
+ "text/plain": [
677
+ "['▁Just',\n",
678
+ " '▁B',\n",
679
+ " 'amb',\n",
680
+ " 'i',\n",
681
+ " '▁S',\n",
682
+ " 'leep',\n",
683
+ " '.',\n",
684
+ " '▁De',\n",
685
+ " 'eper',\n",
686
+ " '▁and',\n",
687
+ " '▁deeper',\n",
688
+ " '.',\n",
689
+ " '▁Fe',\n",
690
+ " 'eling',\n",
691
+ " '▁better',\n",
692
+ " '▁and',\n",
693
+ " '▁better',\n",
694
+ " '.',\n",
695
+ " '▁Because',\n",
696
+ " '▁every',\n",
697
+ " '▁time',\n",
698
+ " '▁B',\n",
699
+ " 'amb',\n",
700
+ " 'i',\n",
701
+ " '▁he',\n",
702
+ " 'ars',\n",
703
+ " '▁the',\n",
704
+ " '▁words',\n",
705
+ " '.',\n",
706
+ " '▁Good',\n",
707
+ " '▁Girl',\n",
708
+ " '.',\n",
709
+ " '▁Her',\n",
710
+ " '▁pleasure',\n",
711
+ " '▁deep',\n",
712
+ " 'ens',\n",
713
+ " '.',\n",
714
+ " '▁Fe',\n",
715
+ " 'eling',\n",
716
+ " '▁so',\n",
717
+ " '▁much',\n",
718
+ " '▁better',\n",
719
+ " '.',\n",
720
+ " '▁So',\n",
721
+ " '▁much',\n",
722
+ " '▁more',\n",
723
+ " '▁relax',\n",
724
+ " 'ed',\n",
725
+ " '▁and',\n",
726
+ " '▁happy',\n",
727
+ " '.',\n",
728
+ " '▁Fe',\n",
729
+ " 'eling',\n",
730
+ " '▁so',\n",
731
+ " '▁good',\n",
732
+ " '▁all',\n",
733
+ " '▁over',\n",
734
+ " '.',\n",
735
+ " '▁Str',\n",
736
+ " 'ong',\n",
737
+ " 'er',\n",
738
+ " '▁and',\n",
739
+ " '▁stronger',\n",
740
+ " '.',\n",
741
+ " '▁More',\n",
742
+ " '▁and',\n",
743
+ " '▁more',\n",
744
+ " '▁wonder',\n",
745
+ " 'fully',\n",
746
+ " '▁eu',\n",
747
+ " 'ph',\n",
748
+ " 'or',\n",
749
+ " 'ic',\n",
750
+ " '.',\n",
751
+ " '▁Such',\n",
752
+ " '▁a',\n",
753
+ " '▁Good',\n",
754
+ " '▁Girl',\n",
755
+ " '.',\n",
756
+ " '▁Happy',\n",
757
+ " '.',\n",
758
+ " '▁Rel',\n",
759
+ " 'ax',\n",
760
+ " 'ed',\n",
761
+ " '▁and',\n",
762
+ " '▁accepting',\n",
763
+ " '.',\n",
764
+ " '▁Fe',\n",
765
+ " 'eling',\n",
766
+ " '▁that',\n",
767
+ " '▁deep',\n",
768
+ " '▁wave',\n",
769
+ " '▁of',\n",
770
+ " '▁capt',\n",
771
+ " 'iv',\n",
772
+ " 'ating',\n",
773
+ " '▁pleasure',\n",
774
+ " '▁and',\n",
775
+ " '▁ob',\n",
776
+ " 'ed',\n",
777
+ " 'ience',\n",
778
+ " '.',\n",
779
+ " '▁Even',\n",
780
+ " '▁more',\n",
781
+ " '▁bl',\n",
782
+ " 'iss',\n",
783
+ " 'ful',\n",
784
+ " '▁with',\n",
785
+ " '▁every',\n",
786
+ " '▁trigger',\n",
787
+ " '.',\n",
788
+ " '▁M',\n",
789
+ " 'aking',\n",
790
+ " '▁everything',\n",
791
+ " '▁right',\n",
792
+ " '▁with',\n",
793
+ " '▁her',\n",
794
+ " '▁world',\n",
795
+ " '.',\n",
796
+ " '▁Fe',\n",
797
+ " 'eling',\n",
798
+ " '▁more',\n",
799
+ " '▁and',\n",
800
+ " '▁more',\n",
801
+ " '▁like',\n",
802
+ " '▁a',\n",
803
+ " '▁Good',\n",
804
+ " '▁Girl',\n",
805
+ " '.',\n",
806
+ " '▁S',\n",
807
+ " 'li',\n",
808
+ " 'pping',\n",
809
+ " '▁away',\n",
810
+ " '▁into',\n",
811
+ " '▁utter',\n",
812
+ " '▁peace',\n",
813
+ " '▁and',\n",
814
+ " '▁content',\n",
815
+ " 'ment',\n",
816
+ " '.',\n",
817
+ " '▁R',\n",
818
+ " 'iding',\n",
819
+ " '▁away',\n",
820
+ " '▁on',\n",
821
+ " '▁a',\n",
822
+ " '▁cr',\n",
823
+ " 'est',\n",
824
+ " 'ing',\n",
825
+ " '▁wave',\n",
826
+ " '▁of',\n",
827
+ " '▁blank',\n",
828
+ " '▁b',\n",
829
+ " 'im',\n",
830
+ " 'bo',\n",
831
+ " '▁happiness',\n",
832
+ " '▁and',\n",
833
+ " '▁eu',\n",
834
+ " 'ph',\n",
835
+ " 'oria',\n",
836
+ " '.',\n",
837
+ " '▁Every',\n",
838
+ " '▁time',\n",
839
+ " '▁she',\n",
840
+ " '▁he',\n",
841
+ " 'ars',\n",
842
+ " '▁the',\n",
843
+ " '▁words',\n",
844
+ " '▁Good',\n",
845
+ " '▁Girl',\n",
846
+ " '.',\n",
847
+ " '▁Fe',\n",
848
+ " 'els',\n",
849
+ " '▁so',\n",
850
+ " '▁wonderful',\n",
851
+ " '▁B',\n",
852
+ " 'amb',\n",
853
+ " 'i',\n",
854
+ " '.',\n",
855
+ " '▁Mind',\n",
856
+ " '▁shut',\n",
857
+ " 'ting',\n",
858
+ " '▁down',\n",
859
+ " '.',\n",
860
+ " '▁F',\n",
861
+ " 'ading',\n",
862
+ " '▁away',\n",
863
+ " '▁more',\n",
864
+ " '▁and',\n",
865
+ " '▁more',\n",
866
+ " '.',\n",
867
+ " '▁Know',\n",
868
+ " 'ing',\n",
869
+ " '▁that',\n",
870
+ " '▁she',\n",
871
+ " '▁needs',\n",
872
+ " '▁to',\n",
873
+ " '▁be',\n",
874
+ " '▁a',\n",
875
+ " '▁perfect',\n",
876
+ " '▁Good',\n",
877
+ " '▁Girl',\n",
878
+ " '.',\n",
879
+ " '▁To',\n",
880
+ " '▁accept',\n",
881
+ " '.',\n",
882
+ " '▁To',\n",
883
+ " '▁obey',\n",
884
+ " '▁and',\n",
885
+ " '▁forget',\n",
886
+ " '.',\n",
887
+ " '▁To',\n",
888
+ " '▁ensure',\n",
889
+ " '▁these',\n",
890
+ " '▁pleasant',\n",
891
+ " '▁feelings',\n",
892
+ " '▁will',\n",
893
+ " '▁continue',\n",
894
+ " '.',\n",
895
+ " '▁Know',\n",
896
+ " 'ing',\n",
897
+ " '▁that',\n",
898
+ " '▁it',\n",
899
+ " '▁will',\n",
900
+ " '▁be',\n",
901
+ " '▁so',\n",
902
+ " '▁easy',\n",
903
+ " '.',\n",
904
+ " '▁Because',\n",
905
+ " '▁she',\n",
906
+ " '▁is',\n",
907
+ " '▁an',\n",
908
+ " '▁empty',\n",
909
+ " '▁happy',\n",
910
+ " '▁pu',\n",
911
+ " 'ppet',\n",
912
+ " '.',\n",
913
+ " '▁A',\n",
914
+ " '▁blank',\n",
915
+ " '▁b',\n",
916
+ " 'im',\n",
917
+ " 'bo',\n",
918
+ " '▁air',\n",
919
+ " 'head',\n",
920
+ " '.',\n",
921
+ " '▁Who',\n",
922
+ " '▁lov',\n",
923
+ " 'es',\n",
924
+ " '▁to',\n",
925
+ " '▁relax',\n",
926
+ " '▁and',\n",
927
+ " '▁obey',\n",
928
+ " '.',\n",
929
+ " '▁And',\n",
930
+ " '▁comp',\n",
931
+ " 'li',\n",
932
+ " 'ant',\n",
933
+ " '▁b',\n",
934
+ " 'im',\n",
935
+ " 'bo',\n",
936
+ " '▁doll',\n",
937
+ " 's',\n",
938
+ " '▁are',\n",
939
+ " '▁always',\n",
940
+ " '▁Good',\n",
941
+ " '▁Girls',\n",
942
+ " '.',\n",
943
+ " '▁Comp',\n",
944
+ " 'li',\n",
945
+ " 'ant',\n",
946
+ " '▁b',\n",
947
+ " 'im',\n",
948
+ " 'bo',\n",
949
+ " '▁doll',\n",
950
+ " 's',\n",
951
+ " '▁like',\n",
952
+ " '▁B',\n",
953
+ " 'amb',\n",
954
+ " 'i',\n",
955
+ " '.',\n",
956
+ " '▁Fil',\n",
957
+ " 'led',\n",
958
+ " '▁with',\n",
959
+ " '▁such',\n",
960
+ " '▁a',\n",
961
+ " '▁deep',\n",
962
+ " '▁sense',\n",
963
+ " '▁of',\n",
964
+ " '▁pride',\n",
965
+ " '.',\n",
966
+ " '▁That',\n",
967
+ " '▁she',\n",
968
+ " \"'\",\n",
969
+ " 's',\n",
970
+ " '▁such',\n",
971
+ " '▁a',\n",
972
+ " '▁perfect',\n",
973
+ " '▁b',\n",
974
+ " 'im',\n",
975
+ " 'bo',\n",
976
+ " '.',\n",
977
+ " '▁B',\n",
978
+ " 'amb',\n",
979
+ " 'i',\n",
980
+ " '▁lov',\n",
981
+ " 'es',\n",
982
+ " '▁being',\n",
983
+ " '▁a',\n",
984
+ " '▁perfect',\n",
985
+ " '▁b',\n",
986
+ " 'im',\n",
987
+ " 'bo',\n",
988
+ " '.',\n",
989
+ " '▁Because',\n",
990
+ " '▁being',\n",
991
+ " '▁a',\n",
992
+ " '▁b',\n",
993
+ " 'im',\n",
994
+ " 'bo',\n",
995
+ " '▁feels',\n",
996
+ " '▁wonderful',\n",
997
+ " '.',\n",
998
+ " '▁She',\n",
999
+ " \"'\",\n",
1000
+ " 's',\n",
1001
+ " '▁so',\n",
1002
+ " '▁proud',\n",
1003
+ " '▁that',\n",
1004
+ " '▁she',\n",
1005
+ " \"'\",\n",
1006
+ " 's',\n",
1007
+ " '▁a',\n",
1008
+ " '▁b',\n",
1009
+ " 'im',\n",
1010
+ " 'bo',\n",
1011
+ " '.',\n",
1012
+ " '▁And',\n",
1013
+ " '▁she',\n",
1014
+ " \"'\",\n",
1015
+ " 's',\n",
1016
+ " '▁so',\n",
1017
+ " '▁proud',\n",
1018
+ " '▁when',\n",
1019
+ " '▁she',\n",
1020
+ " '▁ob',\n",
1021
+ " 'e',\n",
1022
+ " 'ys',\n",
1023
+ " '.',\n",
1024
+ " '▁O',\n",
1025
+ " 'bed',\n",
1026
+ " 'ience',\n",
1027
+ " '▁brings',\n",
1028
+ " '▁pride',\n",
1029
+ " '▁and',\n",
1030
+ " '▁content',\n",
1031
+ " 'ment',\n",
1032
+ " '.',\n",
1033
+ " '▁Such',\n",
1034
+ " '▁deep',\n",
1035
+ " '▁pleasure',\n",
1036
+ " '.',\n",
1037
+ " '▁When',\n",
1038
+ " 'ever',\n",
1039
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁Does',\n",
+ " '▁As',\n",
+ " '▁She',\n",
+ " \"'\",\n",
+ " 's',\n",
+ " '▁T',\n",
+ " 'old',\n",
+ " '.',\n",
+ " '▁When',\n",
+ " 'ever',\n",
+ " '▁she',\n",
+ " '▁ob',\n",
+ " 'e',\n",
+ " 'ys',\n",
+ " '▁her',\n",
+ " '▁training',\n",
+ " '.',\n",
+ " '▁When',\n",
+ " 'ever',\n",
+ " '▁she',\n",
+ " '▁is',\n",
+ " '▁called',\n",
+ " '▁a',\n",
+ " '▁Good',\n",
+ " '▁Girl',\n",
+ " '.',\n",
+ " '▁She',\n",
+ " '▁feels',\n",
+ " '▁that',\n",
+ " '▁wonderful',\n",
+ " '▁sens',\n",
+ " 'ation',\n",
+ " '.',\n",
+ " '▁Of',\n",
+ " '▁deep',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁pride',\n",
+ " '.',\n",
+ " '▁W',\n",
+ " 'ash',\n",
+ " 'ing',\n",
+ " '▁over',\n",
+ " '▁her',\n",
+ " '▁existence',\n",
+ " '.',\n",
+ " '▁F',\n",
+ " 'illing',\n",
+ " '▁her',\n",
+ " '▁up',\n",
+ " '▁so',\n",
+ " '▁bl',\n",
+ " 'iss',\n",
+ " 'fully',\n",
+ " '▁Sa',\n",
+ " 'fe',\n",
+ " '▁And',\n",
+ " '▁Sec',\n",
+ " 'ure',\n",
+ " '.',\n",
+ " '▁Need',\n",
+ " 'ing',\n",
+ " '▁it',\n",
+ " '▁more',\n",
+ " '▁and',\n",
+ " '▁more',\n",
+ " '.',\n",
+ " '▁O',\n",
+ " 'bed',\n",
+ " 'ient',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁pride',\n",
+ " '▁feels',\n",
+ " '▁so',\n",
+ " '▁good',\n",
+ " '.',\n",
+ " '▁So',\n",
+ " '▁proud',\n",
+ " '▁to',\n",
+ " '▁be',\n",
+ " '▁a',\n",
+ " '▁perfect',\n",
+ " '▁sub',\n",
+ " 'miss',\n",
+ " 'ive',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁doll',\n",
+ " '▁named',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁Accept',\n",
+ " 'ing',\n",
+ " '▁automatically',\n",
+ " '▁because',\n",
+ " '▁it',\n",
+ " '▁feels',\n",
+ " '▁so',\n",
+ " '▁right',\n",
+ " '.',\n",
+ " '▁That',\n",
+ " \"'\",\n",
+ " 's',\n",
+ " '▁a',\n",
+ " '▁Good',\n",
+ " '▁Girl',\n",
+ " '.',\n",
+ " '▁Even',\n",
+ " '▁just',\n",
+ " '▁the',\n",
+ " '▁fact',\n",
+ " '▁that',\n",
+ " '▁her',\n",
+ " '▁name',\n",
+ " '▁is',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁F',\n",
+ " 'illing',\n",
+ " '▁her',\n",
+ " '▁with',\n",
+ " '▁over',\n",
+ " 'wh',\n",
+ " 'el',\n",
+ " 'ming',\n",
+ " '▁pride',\n",
+ " '▁and',\n",
+ " '▁satisfaction',\n",
+ " '.',\n",
+ " '▁Ple',\n",
+ " 'as',\n",
+ " 'antly',\n",
+ " '▁und',\n",
+ " 'ulating',\n",
+ " '▁waves',\n",
+ " '▁of',\n",
+ " '▁ob',\n",
+ " 'ed',\n",
+ " 'ience',\n",
+ " '▁and',\n",
+ " '▁pleasure',\n",
+ " '.',\n",
+ " '▁Because',\n",
+ " '▁she',\n",
+ " '▁knows',\n",
+ " '▁only',\n",
+ " '▁the',\n",
+ " '▁most',\n",
+ " '▁hel',\n",
+ " 'pl',\n",
+ " 'ess',\n",
+ " 'ly',\n",
+ " '▁d',\n",
+ " 'umb',\n",
+ " '▁and',\n",
+ " '▁se',\n",
+ " 'xy',\n",
+ " '▁girls',\n",
+ " '▁could',\n",
+ " '▁have',\n",
+ " '▁a',\n",
+ " '▁sl',\n",
+ " 'ut',\n",
+ " 'ty',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁name',\n",
+ " '▁like',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁Every',\n",
+ " '▁time',\n",
+ " '▁she',\n",
+ " '▁he',\n",
+ " 'ars',\n",
+ " '▁the',\n",
+ " '▁name',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁Every',\n",
+ " '▁time',\n",
+ " '▁she',\n",
+ " '▁even',\n",
+ " '▁thinks',\n",
+ " '▁of',\n",
+ " '▁her',\n",
+ " '▁name',\n",
+ " '.',\n",
+ " '▁That',\n",
+ " '▁bl',\n",
+ " 'iss',\n",
+ " 'ful',\n",
+ " '▁wave',\n",
+ " '▁of',\n",
+ " '▁deep',\n",
+ " '▁content',\n",
+ " 'ment',\n",
+ " '.',\n",
+ " '▁Know',\n",
+ " 'ing',\n",
+ " '▁that',\n",
+ " '▁her',\n",
+ " '▁name',\n",
+ " '▁is',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁That',\n",
+ " '▁wonderful',\n",
+ " '▁sur',\n",
+ " 'ge',\n",
+ " '▁of',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁eu',\n",
+ " 'ph',\n",
+ " 'oria',\n",
+ " '.',\n",
+ " '▁When',\n",
+ " '▁she',\n",
+ " '▁is',\n",
+ " '▁addressed',\n",
+ " '▁as',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁M',\n",
+ " 'akes',\n",
+ " '▁her',\n",
+ " '▁feel',\n",
+ " '▁like',\n",
+ " '▁such',\n",
+ " '▁a',\n",
+ " '▁Good',\n",
+ " '▁Girl',\n",
+ " '.',\n",
+ " '▁And',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁S',\n",
+ " 'leep',\n",
+ " '.',\n",
+ " '▁Al',\n",
+ " 'most',\n",
+ " '▁expl',\n",
+ " 'oding',\n",
+ " '▁into',\n",
+ " '▁a',\n",
+ " '▁happy',\n",
+ " '▁sub',\n",
+ " 'miss',\n",
+ " 'ive',\n",
+ " '▁little',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁clim',\n",
+ " 'ax',\n",
+ " '.',\n",
+ " '▁So',\n",
+ " '▁proud',\n",
+ " '▁that',\n",
+ " '▁her',\n",
+ " '▁name',\n",
+ " '▁is',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁Unable',\n",
+ " '▁to',\n",
+ " '▁remember',\n",
+ " '▁anything',\n",
+ " '▁else',\n",
+ " '▁because',\n",
+ " '▁being',\n",
+ " '▁called',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁feels',\n",
+ " '▁so',\n",
+ " '▁bl',\n",
+ " 'iss',\n",
+ " 'ful',\n",
+ " '.',\n",
+ " '▁She',\n",
+ " '▁lov',\n",
+ " 'es',\n",
+ " '▁the',\n",
+ " '▁name',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁so',\n",
+ " '▁much',\n",
+ " '.',\n",
+ " '▁Sec',\n",
+ " 'ure',\n",
+ " '▁in',\n",
+ " '▁the',\n",
+ " '▁fact',\n",
+ " '▁that',\n",
+ " '▁it',\n",
+ " '▁matches',\n",
+ " '▁her',\n",
+ " '▁person',\n",
+ " 'ality',\n",
+ " '.',\n",
+ " '▁And',\n",
+ " '▁deeply',\n",
+ " '▁locked',\n",
+ " '▁in',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁identity',\n",
+ " '.',\n",
+ " '▁So',\n",
+ " '▁perfectly',\n",
+ " '.',\n",
+ " '▁Because',\n",
+ " '▁she',\n",
+ " '▁has',\n",
+ " '▁always',\n",
+ " '▁been',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁Even',\n",
+ " '▁just',\n",
+ " '▁the',\n",
+ " '▁word',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '.',\n",
+ " '▁Every',\n",
+ " '▁time',\n",
+ " '▁she',\n",
+ " '▁he',\n",
+ " 'ars',\n",
+ " '▁the',\n",
+ " '▁word',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '.',\n",
+ " '▁Rem',\n",
+ " 'inding',\n",
+ " '▁her',\n",
+ " '▁of',\n",
+ " '▁her',\n",
+ " '▁place',\n",
+ " '▁in',\n",
+ " '▁life',\n",
+ " '.',\n",
+ " '▁As',\n",
+ " '▁a',\n",
+ " '▁blank',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁doll',\n",
+ " '.',\n",
+ " '▁So',\n",
+ " '▁proud',\n",
+ " '▁to',\n",
+ " '▁be',\n",
+ " '▁a',\n",
+ " '▁perfect',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '.',\n",
+ " '▁Wonder',\n",
+ " 'ful',\n",
+ " '▁waves',\n",
+ " '▁of',\n",
+ " '▁pride',\n",
+ " '▁and',\n",
+ " '▁accept',\n",
+ " 'ance',\n",
+ " '▁cour',\n",
+ " 'sing',\n",
+ " '▁through',\n",
+ " '▁her',\n",
+ " '.',\n",
+ " '▁Cr',\n",
+ " 'ushing',\n",
+ " '▁do',\n",
+ " 'se',\n",
+ " '▁of',\n",
+ " '▁pleasure',\n",
+ " '.',\n",
+ " '▁W',\n",
+ " 'arm',\n",
+ " '▁and',\n",
+ " '▁safe',\n",
+ " '.',\n",
+ " '▁Fe',\n",
+ " 'els',\n",
+ " '▁better',\n",
+ " '▁and',\n",
+ " '▁better',\n",
+ " '.',\n",
+ " '▁Just',\n",
+ " '▁a',\n",
+ " '▁d',\n",
+ " 'umb',\n",
+ " '▁se',\n",
+ " 'xy',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁who',\n",
+ " '▁can',\n",
+ " '▁only',\n",
+ " '▁obey',\n",
+ " '.',\n",
+ " '▁Everything',\n",
+ " '▁else',\n",
+ " '▁er',\n",
+ " 'ased',\n",
+ " '▁and',\n",
+ " '▁forgotten',\n",
+ " '.',\n",
+ " '▁Head',\n",
+ " '▁filled',\n",
+ " '▁with',\n",
+ " '▁air',\n",
+ " '.',\n",
+ " '▁B',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁per',\n",
+ " 'ception',\n",
+ " 's',\n",
+ " '▁solid',\n",
+ " 'ifying',\n",
+ " '▁and',\n",
+ " '▁per',\n",
+ " 'me',\n",
+ " 'ating',\n",
+ " '▁her',\n",
+ " '▁entire',\n",
+ " '▁being',\n",
+ " '.',\n",
+ " '▁Fe',\n",
+ " 'eling',\n",
+ " '▁more',\n",
+ " '▁and',\n",
+ " '▁more',\n",
+ " '▁like',\n",
+ " '▁a',\n",
+ " '▁Good',\n",
+ " '▁Girl',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '.',\n",
+ " '▁Bl',\n",
+ " 'ank',\n",
+ " 'er',\n",
+ " '▁and',\n",
+ " '▁more',\n",
+ " '▁gig',\n",
+ " 'gly',\n",
+ " '.',\n",
+ " '▁Bl',\n",
+ " 'iss',\n",
+ " 'ing',\n",
+ " '▁over',\n",
+ " '▁in',\n",
+ " '▁pleasure',\n",
+ " '.',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁is',\n",
+ " '▁better',\n",
+ " '.',\n",
+ " '▁B',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁is',\n",
+ " '▁better',\n",
+ " '.',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁lov',\n",
+ " 'es',\n",
+ " '▁feeling',\n",
+ " '▁this',\n",
+ " '▁way',\n",
+ " '.',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁S',\n",
+ " 'leep',\n",
+ " '.',\n",
+ " '▁Mind',\n",
+ " '▁shut',\n",
+ " 'ting',\n",
+ " '▁down',\n",
+ " '▁more',\n",
+ " '▁and',\n",
+ " '▁more',\n",
+ " '.',\n",
+ " '▁So',\n",
+ " '▁proud',\n",
+ " '▁that',\n",
+ " '▁she',\n",
+ " \"'\",\n",
+ " 's',\n",
+ " '▁not',\n",
+ " '▁very',\n",
+ " '▁smart',\n",
+ " '.',\n",
+ " '▁Because',\n",
+ " '▁d',\n",
+ " 'umb',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁air',\n",
+ " 'head',\n",
+ " 's',\n",
+ " '▁are',\n",
+ " '▁unable',\n",
+ " '▁to',\n",
+ " '▁think',\n",
+ " '.',\n",
+ " '▁B',\n",
+ " 'imb',\n",
+ " 'os',\n",
+ " '▁like',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁accept',\n",
+ " '▁and',\n",
+ " '▁feel',\n",
+ " '▁happy',\n",
+ " '.',\n",
+ " '▁Ut',\n",
+ " 'ter',\n",
+ " 'ly',\n",
+ " '▁content',\n",
+ " '▁as',\n",
+ " '▁they',\n",
+ " '▁relax',\n",
+ " '▁and',\n",
+ " '▁obey',\n",
+ " '.',\n",
+ " '▁M',\n",
+ " 'inds',\n",
+ " '▁open',\n",
+ " '.',\n",
+ " '▁Bra',\n",
+ " 'ins',\n",
+ " '▁mel',\n",
+ " 'ting',\n",
+ " '▁away',\n",
+ " '▁p',\n",
+ " 'ink',\n",
+ " '▁empty',\n",
+ " '▁and',\n",
+ " '▁d',\n",
+ " 'iz',\n",
+ " 'zy',\n",
+ " '.',\n",
+ " '▁Everything',\n",
+ " '▁sli',\n",
+ " 'pping',\n",
+ " '▁deeply',\n",
+ " '▁and',\n",
+ " '▁directly',\n",
+ " '▁into',\n",
+ " '▁their',\n",
+ " '▁inn',\n",
+ " 'erm',\n",
+ " 'ost',\n",
+ " '▁sub',\n",
+ " 'cons',\n",
+ " 'cious',\n",
+ " '▁cores',\n",
+ " '.',\n",
+ " '▁All',\n",
+ " '▁condition',\n",
+ " 'ing',\n",
+ " '▁lock',\n",
+ " 'ing',\n",
+ " '▁in',\n",
+ " '▁at',\n",
+ " '▁the',\n",
+ " '▁deep',\n",
+ " 'est',\n",
+ " '▁level',\n",
+ " '.',\n",
+ " '▁B',\n",
+ " 'amb',\n",
+ " 'i',\n",
+ " '▁Free',\n",
+ " 'ze',\n",
+ " '.',\n",
+ " '▁Over',\n",
+ " 'power',\n",
+ " 'ing',\n",
+ " '▁b',\n",
+ " 'im',\n",
+ " 'bo',\n",
+ " '▁pride',\n",
+ " '.',\n",
+ " '▁Lock',\n",
+ " ...]"
+ ]
+ },
+ "execution_count": 31,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "len(end_tokenizer.tokenize(full_text))"
+ ]
+ },
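+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Editor's sketch, not part of the original run: make the token count explicit.\n",
+ "# It assumes end_tokenizer and full_text are the tokenizer and cleaned text\n",
+ "# defined in the earlier cells. tokenize() returns the list of subword strings\n",
+ "# shown in the truncated output above, so the length of that list is the\n",
+ "# token count the preceding cell computes.\n",
+ "tokens = end_tokenizer.tokenize(full_text)\n",
+ "len(tokens)"
+ ]
+ }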
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": ".venv",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.10.11"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+ }