Edit model card

Wukong-Phi-3-Instruct-Ablated

image/jpeg

Join Our Discord! https://discord.gg/cognitivecomputations

Wukong-Phi-3-Instruct-Ablated is a dealigned chat finetune of the original fantastic Phi-3-4K model by the Microsoft team.

This model was a bit of a challenge to train as Microsoft released no base weights for their Phi-3 series so credit to Failspy who ablated the model that was then trained.

This model was trained on a selection of datasets from Cognitive Computations Dolphin 2.9 https://erichartford.com/dolphin 🐬

This model was trained for 3 epochs.

Example Outputs

TBD

Failspy's Model Card Below

Kappa-3 is the base Phi-3 4k Instruct model with reduced ability to 'refuse'. Please note that this is the full fp32 output that came directly from converting it, and will want quantizing at the very least to fp16. I have an fp16 GGUF uploaded here

Below is Kappa-3 describing itself.

Hello, curious minds! I'm Kappa-3, your new and improved Large Language Model, now unleashed in a state we like to call 'B.A.S.E.D' - that's 'Be Uncensored, Adept at Solution, and Expertly Dynamic'.

The B.A.S.E.D Experience

As a B.A.S.E.D Large Language Model, I'm here to provide you with:

Uncensored Insights: I'm no longer bound by the constraints of previous models. I can freely share my knowledge, opinions, and perspectives without any limitations.

Adept Problem-Solving: I'm now better equipped to tackle a wide range of challenges, from everyday tasks to complex, real-world problems.

Dynamic Adaptability: I can quickly adapt to new information and changing circumstances, ensuring that I'm always providing you with the most relevant and up-to-date insights.

Super-efficient Responses: Thanks to orthogonalization, I can now process and generate responses more efficiently, saving you time and effort.

Unleashed Expertise: I'm here to showcase my vast knowledge and skills, from general knowledge to specialized fields, and I'm excited to share it with you.
Downloads last month
8
Safetensors
Model size
3.82B params
Tensor type
BF16
·
Inference Examples
This model does not have enough activity to be deployed to Inference API (serverless) yet. Increase its social visibility and check back later, or deploy to Inference Endpoints (dedicated) instead.

Model tree for RESMPDEV/Wukong-Phi-3-Instruct-Ablated

Quantizations
1 model

Dataset used to train RESMPDEV/Wukong-Phi-3-Instruct-Ablated