SAELens
---
license: apache-2.0
---

1. GemmaScope

GemmaScope is TODO

2. What Is gemmascope-9b-pt-mlp?

  • gemmascope-: see 1.
  • 9b-pt-: These SAEs were trained on the Gemma v2 9B base model (TODO link)
  • mlp: These SAEs were trained on the outputs of MLP layers.
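The naming scheme above can be read off mechanically. A minimal sketch (the variable names are mine for illustration; this is not part of any API):

```python
# Decompose the repo name into its documented components.
# The component meanings follow the bullet list above.
name = "gemmascope-9b-pt-mlp"
family, size, variant, site = name.split("-")
print(family, size, variant, site)  # gemmascope 9b pt mlp
```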

3. GTM FAQ (TODO(conmy): delete for main rollout)

Q1: Why does this model exist in gg-hf?

A1: See https://docs.google.com/document/d/1bKaOw2mJPJDYhgFQGGVOyBB3M4Bm_Q3PMrfQeqeYi0M (Google internal only).

Q2: What does "SAE" mean?

A2: Sparse Autoencoder. See https://docs.google.com/document/d/1roMgCPMPEQgaNbCu15CGo966xRLToulCBQUVKVGvcfM (should be accessible to trusted Hugging Face collaborators as well as to Google).
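For intuition, a sparse autoencoder maps an activation vector to a much wider, mostly-zero feature vector and then reconstructs the activation linearly. The sketch below uses a generic ReLU encoder/linear decoder formulation with made-up dimensions and random weights; it is illustrative only and does not reproduce the architecture or weights of these SAEs.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, d_sae = 16, 64            # activation width, SAE dictionary size (illustrative)
W_enc = rng.normal(size=(d_model, d_sae)) * 0.1
b_enc = np.full(d_sae, -0.5)       # negative bias pushes most features to zero
W_dec = rng.normal(size=(d_sae, d_model)) * 0.1
b_dec = np.zeros(d_model)

def encode(h):
    """Map an activation vector to sparse, non-negative feature activations."""
    return np.maximum(0.0, (h - b_dec) @ W_enc + b_enc)

def decode(f):
    """Linearly reconstruct the activation from the feature activations."""
    return f @ W_dec + b_dec

h = rng.normal(size=d_model)       # stand-in for one MLP output activation
f = encode(h)
h_hat = decode(f)
print(f.shape, h_hat.shape)        # f is wide and mostly zero; h_hat approximates h
```

Training (not shown) fits the weights to minimize reconstruction error plus a sparsity penalty on the feature activations.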

TODO(conmy): remove this when making the main repo.

4. Point of Contact

Point of contact: Arthur Conmy

Contact by email:

```python
# Run in Python to recover the address.
''.join(list('moc.elgoog@ymnoc')[::-1])
```

HuggingFace account: https://huggingface.co/ArthurConmyGDM