Julian BILCKE committed on
Commit e508915 • 1 Parent(s): a8b725c

Update README.md

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -34,17 +34,17 @@ Currently the AI Comic Factory uses [Llama-2 70b](https://huggingface.co/blog/ll
 
 You have two options:
 
-## Option 1: Fork and modify the code to use another LLM
+### Option 1: Fork and modify the code to use another LLM
 
 If you fork the AI Comic Factory, you will be able to use another API and model, such as a locally-running Llama 7b.
 
 To run the LLM locally, you can use [TGI](https://github.com/huggingface/text-generation-inference) (Please read [this post](https://github.com/huggingface/text-generation-inference/issues/726) for more information about licensing).
 
-## Option 2: Fork and modify the code to use human content instead
+### Option 2: Fork and modify the code to use human content instead
 
 Another option could be to disable the LLM completely and replace it with a human-generated story instead (by returning mock or static data).
 
-## Notes
+### Notes
 
 It is possible that I modify the AI Comic Factory to make it easier in the future (eg. add support for OpenAI or Replicate)
 
@@ -68,6 +68,6 @@ Unfortunately, I haven't had the time to write the documentation for VideoChain
 
 If you fork the project you will be able to modify the code to use the Stable Diffusion technology of your choice (local, open-source, your custom HF Space etc)
 
-## Notes
+### Notes
 
 It is possible that I modify the AI Comic Factory to make it easier in the future (eg. add support for Replicate)
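For Option 1 described in the README above (pointing the app at another LLM), a locally running [TGI](https://github.com/huggingface/text-generation-inference) server exposes a simple REST route. The sketch below is only an illustration of that idea, not code from this repository: the `predictStory` name, the environment variable, and the sampling parameters are assumptions, and it presumes a TGI instance listening on `http://127.0.0.1:8080`.

```typescript
// Hypothetical helper showing how a fork could call a locally running
// TGI server instead of the hosted Llama-2 70b Inference API.
// The URL, env var and parameter values are examples, not project defaults.
const TGI_URL = process.env.LLM_ENGINE_URL || "http://127.0.0.1:8080"

export async function predictStory(prompt: string): Promise<string> {
  const response = await fetch(`${TGI_URL}/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      inputs: prompt,
      parameters: {
        max_new_tokens: 400,
        temperature: 0.7,
      },
    }),
  })

  if (!response.ok) {
    throw new Error(`TGI request failed: ${response.status}`)
  }

  // TGI's /generate route returns { generated_text: string }
  const { generated_text } = await response.json()
  return generated_text
}
```

The same function could be swapped for any other hosted or local inference API, since the rest of the app only needs the generated text back.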
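For Option 2 (replacing the LLM with human-written content), "returning mock or static data" just means the story function returns a hand-written script. A minimal sketch, assuming a made-up `Panel` shape and `getStory` name purely for illustration; the real project may structure its story data differently.

```typescript
// Hypothetical replacement for the LLM call: the story is written by a
// human and returned as static data, so no text-generation backend is needed.
interface Panel {
  caption: string      // text shown under the panel
  imagePrompt: string  // prompt passed to the image model
}

export async function getStory(): Promise<Panel[]> {
  return [
    { caption: "A quiet morning in the lab.", imagePrompt: "scientist drinking coffee, comic style" },
    { caption: "Suddenly, the machine switches itself on.", imagePrompt: "glowing machine, sparks, comic style" },
    { caption: "Our hero steps closer...", imagePrompt: "curious scientist leaning in, dramatic lighting" },
    { caption: "...and finds a tiny robot asking for help.", imagePrompt: "small friendly robot, comic style" },
  ]
}
```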
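On the image side, using "the Stable Diffusion technology of your choice" follows the same substitution pattern: replace the rendering call with a request to whatever server you run. As one possibility (an assumption, not something this repository ships), a local AUTOMATIC1111 web UI started with `--api` exposes a `/sdapi/v1/txt2img` route that returns base64-encoded images.

```typescript
// Hypothetical sketch: render a panel with a self-hosted Stable Diffusion
// server (here, the AUTOMATIC1111 web UI API). URL and parameters are examples.
const SD_URL = process.env.RENDERING_ENGINE_URL || "http://127.0.0.1:7860"

export async function renderPanel(prompt: string): Promise<string> {
  const response = await fetch(`${SD_URL}/sdapi/v1/txt2img`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, steps: 25, width: 512, height: 512 }),
  })

  if (!response.ok) {
    throw new Error(`Rendering request failed: ${response.status}`)
  }

  // The txt2img route returns { images: string[] } with base64-encoded PNGs
  const { images } = await response.json()
  return `data:image/png;base64,${images[0]}`
}
```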