How to use gkteco/distillbert-finetuned-code-mlm with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="gkteco/distillbert-finetuned-code-mlm")
```

```python
# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("gkteco/distillbert-finetuned-code-mlm")
model = AutoModelForMaskedLM.from_pretrained("gkteco/distillbert-finetuned-code-mlm")
```

This model is a fine-tuned version of distilbert-base-uncased on the code_search_net dataset. It achieves the following results on the evaluation set:

- Loss: 1.1974
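Under the hood, a fill-mask pipeline takes the model's logits at the `[MASK]` position, applies a softmax over the vocabulary, and returns the most likely tokens. The sketch below illustrates that ranking step with a toy vocabulary and made-up logits (the token list and values are illustrative only, not output from this model):

```python
import math

def top_k_predictions(logits, vocab, k=3):
    # Numerically stable softmax over the vocabulary logits
    # at the mask position.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Pair each token with its probability and keep the k most likely.
    ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

# Toy stand-ins for the model's vocabulary and mask-position logits.
vocab = ["+", "-", "return", "def", "*"]
logits = [4.2, 1.1, 0.3, -0.5, 3.8]
for token, prob in top_k_predictions(logits, vocab):
    print(f"{token}: {prob:.3f}")
```

The real pipeline does the same ranking over DistilBERT's full vocabulary and also returns the input sequence with the mask filled in.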
More information needed
Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.2293 | 1.0 | 14667 | 1.1974 |
Base model: distilbert/distilbert-base-uncased