
This model is a fine-tuned version of codet5-large, trained on TypeScript instruct-code pairs.
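Each training example pairs a natural-language instruction with the TypeScript code that fulfils it. The exact dataset schema is not shown here, so the snippet below is only an illustrative sketch with assumed field names:

```python
# Hypothetical illustration of one instruct-code training pair.
# The field names ('instruction', 'code') are assumptions, not the actual dataset schema.
example_pair = {
    'instruction': 'write function for sum of two numbers and return it',
    'code': 'function sum(a: number, b: number): number {\n  return a + b;\n}',
}
```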

To run this model, you can use the following example:

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Use the first GPU if available, otherwise fall back to CPU
device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')

tokenizer = AutoTokenizer.from_pretrained('mishasadhaker/codet5_large_typescript')
model = T5ForConditionalGeneration.from_pretrained('mishasadhaker/codet5_large_typescript').to(device)


def generate_code(task_description):
    # Tokenize the task description and move it to the target device
    input_ids = tokenizer.encode(task_description, return_tensors='pt').to(device)

    # Generate the output with beam search
    # (note: temperature only takes effect if do_sample=True is also passed)
    with torch.no_grad():
        output_ids = model.generate(input_ids, max_length=200, temperature=0.7, num_beams=5)

    # Decode the generated token ids back into text
    output = tokenizer.decode(output_ids[0], skip_special_tokens=True)

    return output


print(generate_code('write function for sum of two numbers and return it'))
```
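Alternatively, the model should also work with the transformers `text2text-generation` pipeline, which wraps tokenization, generation, and decoding in one call. This is a minimal sketch, assuming the default pipeline settings are acceptable for your use case:

```python
from transformers import pipeline

# Optional alternative: let the pipeline handle tokenization, generation, and decoding.
# device=0 selects the first GPU; omit it (or pass device=-1) to run on CPU.
generator = pipeline('text2text-generation',
                     model='mishasadhaker/codet5_large_typescript',
                     device=0)

result = generator('write function for sum of two numbers and return it',
                   max_length=200, num_beams=5)
print(result[0]['generated_text'])
```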
