[FEEDBACK] Local apps

#31 opened by kramp (Hugging Face staff)

Please share your feedback about the Local Apps integration in model pages.

On compatible models, you'll be offered the option to launch supported local apps:

In your settings, you can configure the list of apps and their order:

The list of available local apps is defined in https://github.com/huggingface/huggingface.js/blob/main/packages/tasks/src/local-apps.ts
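For context, each app in that file is registered with a label, a docs URL, a compatibility check, and either a deeplink or a command snippet. Below is a minimal sketch of the general shape; the field names are an approximation of the real `LocalApp` type, not a copy of it.

```ts
// Illustrative sketch only: simplified stand-ins for the types in
// packages/tasks/src/local-apps.ts, with approximate field names.
// Check the actual file before opening a PR.
type ModelData = { id: string };

type LocalAppEntry = {
  prettyLabel: string;                                // name shown on the model page
  docsUrl: string;                                    // where to learn about the app
  displayOnModelPage: (model: ModelData) => boolean;  // compatibility check
  snippet: (model: ModelData) => string[];            // commands suggested to the user
};

// Hypothetical entry for a llama.cpp-style app that only supports GGUF repos.
const exampleApp: LocalAppEntry = {
  prettyLabel: "llama.cpp",
  docsUrl: "https://github.com/ggerganov/llama.cpp",
  displayOnModelPage: (model) => model.id.toLowerCase().includes("gguf"),
  // Placeholder command; the real entries build app-specific CLI snippets.
  snippet: (model) => [`# download ${model.id} and run it with your local app`],
};

console.log(exampleApp.snippet({ id: "bartowski/Llama-3.2-1B-Instruct-GGUF" }));
```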

I think the tensor-core FP16 FLOPS should be used for GPUs that support them. I note that the V100 is counted as far less than the theoretical 125 TFLOPS listed, e.g., here: https://images.nvidia.com/content/technologies/volta/pdf/tesla-volta-v100-datasheet-letter-fnl-web.pdf
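For illustration, a rough sketch of the gap being described, using figures from the linked datasheet; the field names are only an assumption about how a hardware entry might store them, not the actual schema.

```ts
// V100 (SXM2) figures: 15.7 FP32 TFLOPS and 125 tensor-core FP16 TFLOPS per
// the linked datasheet; plain FP16 on CUDA cores is roughly 2x FP32.
// Field names are assumptions, not the actual hardware-list schema.
const v100 = {
  fp32Tflops: 15.7,
  fp16Tflops: 31.4,      // ~2x FP32 on CUDA cores
  tensorFp16Tflops: 125, // what this comment suggests listing instead
};

// Tensor-core FP16 is ~4x the plain FP16 figure, hence the large discrepancy.
console.log(v100.tensorFp16Tflops / v100.fp16Tflops); // ≈ 3.98
```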

Hey! Have you guys heard of LangFlow? It is a neat solution for developing AI-powered apps as well!

The GPU list is missing the RTX A4000 (16GB)

PR: https://github.com/huggingface/huggingface.js/pull/817

It would be nice to get Ollama integration.

I suggest adding Ollama as a local app to run LLMs.

I use GPT4All and it is not listed here.

Ollama
Local app to run LLMs
https://github.com/ollama/ollama

transformerlab-app
Open Source Application for Advanced LLM Engineering: interact, train, fine-tune, and evaluate large language models on your own computer.
https://github.com/transformerlab/transformerlab-app

Perplexica
Perplexica is an AI-powered search engine. It is an open-source alternative to Perplexity AI.
https://github.com/ItzCrazyKns/Perplexica

Maybe adding HuggingChat in the future?


HuggingChat macOS is a native chat interface designed specifically for macOS users, leveraging the power of open-source language models. It brings the capabilities of advanced AI conversation right to your desktop, offering a seamless and intuitive experience.

https://github.com/huggingface/chat-macOS

Missing from the Hardware lists:

GPU: NVIDIA RTX 4070 Laptop (8 GB VRAM)
CPU: Intel Core Ultra 7 (14th generation)

Hugging Face org

Hi @tkowalsky, would you like to open a PR? :) Here's another one you can use as an example to get started, if you're up for it: https://github.com/huggingface/huggingface.js/pull/880/files
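For anyone picking this up, here is a hedged sketch of what the two requested entries might look like, modeled loosely on the linked PR; the field names and the TFLOPS placeholders are assumptions to verify against the actual hardware file and the vendors' spec sheets.

```ts
// Sketch only: field names (tflops, memory) and values are assumptions to be
// checked against packages/tasks/src/hardware.ts and the vendors' spec sheets.
const requestedGpu = {
  "RTX 4070 Laptop GPU": {
    tflops: 0,   // TODO: fill in the FP32 figure from NVIDIA's spec sheet
    memory: [8], // GB of VRAM
  },
};

const requestedCpu = {
  "Intel Core Ultra 7": {
    tflops: 0,   // TODO: follow whatever convention the CPU list uses
  },
};

console.log(requestedGpu, requestedCpu);
```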
