Updated with commit 914576f7edcba7116b3d22fd0f2b8f727d652c3a. See: https://github.com/huggingface/tokenizers/commit/914576f7edcba7116b3d22fd0f2b8f727d652c3a
- accelerate
- alignment-handbook
- api-inference
- audio-course
- autotrain
- bitsandbytes
- competitions
- computer-vision-course
- cookbook
- course
- datasets-server
- datasets
- deep-rl-course
- diffusers
- diffusion-course
- evaluate
- hub
- huggingface.js
- huggingface_hub
- inference-endpoints
- ml-games-course
- optimum-neuron
- optimum-tpu
- optimum
- peft
- safetensors
- sagemaker
- setfit
- simulate
- text-embeddings-inference
- text-generation-inference
- timm
- tokenizers
- transformers.js
- transformers
- trl