---
license: apache-2.0
tags:
- moe
- merge
- mergekit
- lazymergekit
- Felladrin/Minueza-32M-Chat
pipeline_tag: text-generation
widget:
- messages:
  - role: system
    content: You are a career counselor. The user will provide you with an individual looking for guidance in their professional life, and your task is to assist them in determining what careers they are most suited for based on their skills, interests, and experience. You should also conduct research into the various options available, explain the job market trends in different industries, and advise on which qualifications would be beneficial for pursuing particular fields.
  - role: user
    content: Heya!
  - role: assistant
    content: Hi! How may I help you?
  - role: user
    content: I am interested in developing a career in software engineering. What would you recommend me to do?
- messages:
  - role: system
    content: You are a highly knowledgeable assistant. Help the user as much as you can.
  - role: user
    content: How can I become a healthier person?
- messages:
  - role: system
    content: You are a helpful assistant who gives creative responses.
  - role: user
    content: Write the specs of a game about mages in a fantasy world.
- messages:
  - role: system
    content: You are a helpful assistant who answers user's questions with details.
  - role: user
    content: Tell me about the pros and cons of social media.
- messages:
  - role: system
    content: You are a helpful assistant who answers user's questions with details and curiosity.
  - role: user
    content: What are some potential applications for quantum computing?
inference:
  parameters:
    max_new_tokens: 250
    do_sample: true
    temperature: 0.65
    top_p: 0.55
    top_k: 35
    repetition_penalty: 1.176
datasets:
- databricks/databricks-dolly-15k
- Felladrin/ChatML-databricks-dolly-15k
- euclaise/reddit-instruct-curated
- Felladrin/ChatML-reddit-instruct-curated
- THUDM/webglm-qa
- Felladrin/ChatML-WebGLM-QA
- starfishmedical/webGPT_x_dolly
- Felladrin/ChatML-webGPT_x_dolly
- LDJnr/Capybara
- Felladrin/ChatML-Capybara
- Open-Orca/SlimOrca-Dedup
- Felladrin/ChatML-SlimOrca-Dedup
- HuggingFaceH4/ultrachat_200k
- Felladrin/ChatML-ultrachat_200k
- nvidia/HelpSteer
- Felladrin/ChatML-HelpSteer
- sablo/oasst2_curated
- Felladrin/ChatML-oasst2_curated
- CohereForAI/aya_dataset
- Felladrin/ChatML-aya_dataset
- argilla/distilabel-capybara-dpo-7k-binarized
- Felladrin/ChatML-distilabel-capybara-dpo-7k-binarized
- argilla/distilabel-intel-orca-dpo-pairs
- Felladrin/ChatML-distilabel-intel-orca-dpo-pairs
- argilla/ultrafeedback-binarized-preferences
- Felladrin/ChatML-ultrafeedback-binarized-preferences
- sablo/oasst2_dpo_pairs_en
- Felladrin/ChatML-oasst2_dpo_pairs_en
- NeuralNovel/Neural-DPO
- Felladrin/ChatML-Neural-DPO
---

🌟 Buying me a coffee is a direct way to show support for this project.
# Mixnueza-6x32M-MoE

Mixnueza-6x32M-MoE is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):

* 6 x [Felladrin/Minueza-32M-Chat](https://huggingface.co/Felladrin/Minueza-32M-Chat)
* Experts per token: 3

## 💻 Usage

```python
from transformers import pipeline

generate = pipeline("text-generation", "Isotonic/Mixnueza-6x32M-MoE")

messages = [
    {
        "role": "system",
        "content": "You are a helpful assistant who answers the user's questions with details and curiosity.",
    },
    {
        "role": "user",
        "content": "What are some potential applications for quantum computing?",
    },
]

prompt = generate.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

output = generate(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.65,
    top_k=35,
    top_p=0.55,
    repetition_penalty=1.176,
)

print(output[0]["generated_text"])
```
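
The `pipeline` helper above hides the tokenizer and model objects. If you prefer to work with them directly, or want to confirm the MoE layout, the sketch below loads the checkpoint with `AutoModelForCausalLM` and reads the expert counts from its config. It assumes the merge produced a Mixtral-style configuration (fields `num_local_experts` and `num_experts_per_tok`), which is what mergekit's MoE mode typically emits for Mistral-based experts; that is an assumption of this sketch, not something stated on this card.

```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "Isotonic/Mixnueza-6x32M-MoE"

# Assumption: a Mixtral-style config; these fields may be absent if the
# merged checkpoint uses a different layout.
config = AutoConfig.from_pretrained(model_id)
print("experts:", getattr(config, "num_local_experts", "n/a"))              # expected: 6
print("experts per token:", getattr(config, "num_experts_per_tok", "n/a"))  # expected: 3

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [
    {"role": "user", "content": "What are some potential applications for quantum computing?"},
]

# Build the prompt with the model's chat template and generate with the same
# sampling parameters as the pipeline example above.
input_ids = tokenizer.apply_chat_template(
    messages, tokenize=True, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.65,
    top_k=35,
    top_p=0.55,
    repetition_penalty=1.176,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The sampling parameters mirror the ones used in the pipeline example and in this card's inference widget.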