aws-neuron / optimum-neuron-cache
License: apache-2.0
Revision: f9ec8c1
Path: optimum-neuron-cache / neuronxcc-2.13.66.0+6dfecc895 / 0_REGISTRY / 0.0.22 / inference / llama / elyza
8 contributors
History: 6 commits
htokoyo · Synchronizing local compiler cache. · 64c3b22 (verified) · 6 months ago
ELYZA-japanese-Llama-2-7b-instruct · Synchronizing local compiler cache. · 6 months ago
ELYZA-japanese-Llama-2-7b · Synchronizing local compiler cache. · 6 months ago
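
The directories above hold precompiled Neuron artifacts that optimum-neuron can fetch instead of recompiling the ELYZA Llama models locally. As a minimal sketch of how such a cache is typically consumed, the snippet below exports one of the listed models with optimum-neuron; the specific input shapes and compiler arguments are illustrative assumptions, and the exact keyword arguments accepted depend on the optimum-neuron version in use.

from optimum.neuron import NeuronModelForCausalLM

# Illustrative compiler settings; if a matching artifact already exists in the
# cache registry (e.g. under 0_REGISTRY/0.0.22/inference/llama/elyza), the
# precompiled files can be reused rather than recompiled on the instance.
compiler_args = {"num_cores": 2, "auto_cast_type": "fp16"}
input_shapes = {"batch_size": 1, "sequence_length": 2048}

model = NeuronModelForCausalLM.from_pretrained(
    "elyza/ELYZA-japanese-Llama-2-7b-instruct",
    export=True,  # triggers the Neuron export, which consults the compiler cache
    **compiler_args,
    **input_shapes,
)
model.save_pretrained("elyza-7b-instruct-neuron")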