DeepSeek Coder

[🏠Homepage] | [🤖 Chat with DeepSeek Coder] | [Discord] | [Wechat(微信)]


1. Introduction of Deepseek-Coder-7B-Base-v1.5

Deepseek-Coder-7B-Base-v1.5 was further pre-trained from Deepseek-LLM 7B on 2T tokens, using a 4K window size and a next-token prediction objective.
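
As a refresher, the next-token prediction objective minimizes the cross-entropy between the model's predicted distribution at each position and the token that actually follows. A minimal PyTorch sketch of this loss (illustrative only, not DeepSeek's training code):

import torch
import torch.nn.functional as F

def next_token_loss(logits: torch.Tensor, input_ids: torch.Tensor) -> torch.Tensor:
    # logits: (batch, seq_len, vocab_size); input_ids: (batch, seq_len)
    # Shift so that the prediction at position t is scored against token t+1.
    shift_logits = logits[:, :-1, :].contiguous()
    shift_labels = input_ids[:, 1:].contiguous()
    return F.cross_entropy(
        shift_logits.view(-1, shift_logits.size(-1)),
        shift_labels.view(-1),
    )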

2. Evaluation Results

[Figure: DeepSeek Coder evaluation results]

3. How to Use

Here is an example of how to use our model.

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-7b-base-v1.5", trust_remote_code=True)
# The weights are stored in BF16, so load them in that dtype.
model = AutoModelForCausalLM.from_pretrained(
    "deepseek-ai/deepseek-coder-7b-base-v1.5",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).cuda()

input_text = "#write a quick sort algorithm"
# BatchEncoding has no .cuda(); move the tensors with .to() instead.
inputs = tokenizer(input_text, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
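
For less deterministic completions, sampling parameters can be passed to generate. The values below are illustrative defaults, not recommendations from DeepSeek:

outputs = model.generate(
    **inputs,
    max_new_tokens=256,   # cap on newly generated tokens
    do_sample=True,       # sample instead of greedy decoding
    temperature=0.7,      # illustrative value; tune per task
    top_p=0.95,           # nucleus sampling cutoff
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))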

4. License

This code repository is licensed under the MIT License. The use of DeepSeek Coder models is subject to the Model License. DeepSeek Coder supports commercial use.

See the LICENSE-MODEL for more details.

5. Contact

If you have any questions, please raise an issue or contact us at [email protected].
