pharaouk committed
Commit 625e23c
1 Parent(s): 30e0bf4

Update README.md

Files changed (1): README.md (+25, -2)
README.md CHANGED

@@ -1,11 +1,34 @@
-*BakLLaVA 1
+---
+datasets:
+- SkunkworksAI/BakLLaVA-1-FT
+language:
+- en
+library_name: transformers
+license: apache-2.0
+---
+
+<p><h1> BakLLaVA-1 </h1></p>
+
+Thank you to our compute sponsors Together Compute (www.together.ai).
+
+
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/64b7e345f92b20f7a38bf47a/V5lpOHWGGYJ2yPpEo_8i1.png)
 
 BakLLaVA 1 is a Mistral 7B base augmented with the LLaVA 1.5 architecture. In this first version, we showcase that a Mistral 7B base outperforms Llama 2 13B on several benchmarks.
+You can run BakLLaVA-1 with the current LLaVA-1.5 inference code (https://llava-vl.github.io/). We are currently updating our repo (https://github.com/SkunkworksAI/BakLLaVA) to make this easier.
+
+
+Note: BakLLaVA-1 is fully open-source, but it was trained on data that includes LLaVA's corpus, which is not commercially permissive. We will fix this in the upcoming release.
+
+
+BakLLaVA 2 is cooking with a significantly larger (commercially viable) dataset and a novel architecture that expands beyond the current LLaVA method. BakLLaVA-2 will do away with the restrictions of BakLLaVA-1.
+
+
+# Evaluations
 
 
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/64b7e345f92b20f7a38bf47a/qdYubrBmF7ztAHgdfkkwG.png)
 
 
-BakLLaVA 2 is cooking with a significantly larger dataset and a novel architecture that expands beyond the current LLaVA method.
 
 
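Since the updated README points readers at the LLaVA-1.5 inference code, a minimal sketch of what querying BakLLaVA-1 through `transformers` could look like may help. Everything here is an assumption, not taken from the model card: the checkpoint id passed to `from_pretrained` is hypothetical, and the `USER: <image> ... ASSISTANT:` prompt template is the LLaVA-1.5 conversation format, which BakLLaVA-1 is described as following — verify both against the linked repos before use.

```python
# Sketch only: querying BakLLaVA-1 via the transformers LLaVA integration.
# ASSUMPTIONS: the model id below is hypothetical, and the USER/ASSISTANT
# prompt template is the LLaVA-1.5 conversation format, not confirmed by
# this model card.

def build_llava_prompt(question: str) -> str:
    """Format a single-image question in the LLaVA-1.5 conversation style."""
    return f"USER: <image>\n{question} ASSISTANT:"

def main() -> None:
    # Network-bound part, shown for shape only (downloads a ~7B checkpoint).
    from PIL import Image
    from transformers import AutoProcessor, LlavaForConditionalGeneration

    model_id = "SkunkworksAI/BakLLaVA-1"  # hypothetical transformers-ready id
    processor = AutoProcessor.from_pretrained(model_id)
    model = LlavaForConditionalGeneration.from_pretrained(model_id)

    image = Image.open("photo.jpg")
    inputs = processor(
        text=build_llava_prompt("What is in this image?"),
        images=image,
        return_tensors="pt",
    )
    output = model.generate(**inputs, max_new_tokens=64)
    print(processor.decode(output[0], skip_special_tokens=True))

if __name__ == "__main__":
    # main()  # uncomment to run; requires downloading the full checkpoint
    pass
```

The prompt builder is the only part exercised locally; the model-loading path mirrors the generic `transformers` vision-language flow and should be checked against the SkunkworksAI/BakLLaVA repo once its updated inference code lands.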