moyix/dolly-replication
Tags: Text Generation · Transformers · PyTorch · TensorBoard · gptj
Branch: main
Repository size: 24.3 GB, 2 contributors, 5 commits
Latest commit: bf9d534 ("add tensorboard logs") by Brendan Dolan-Gavitt, over 2 years ago
File                              Size        Last commit message
runs/                                         add tensorboard logs
.gitattributes                    1.48 kB     initial commit
README.md                         644 Bytes   Barebones model card
added_tokens.json                 4.33 kB     add tokenizer (just the standard GPT-J-6B tokenizer)
config.json                       985 Bytes   Upload GPTJForCausalLM
generation_config.json            141 Bytes   Upload GPTJForCausalLM
merges.txt                        456 kB      add tokenizer (just the standard GPT-J-6B tokenizer)
pytorch_model-00001-of-00003.bin  10 GB       Upload GPTJForCausalLM
pytorch_model-00002-of-00003.bin  9.98 GB     Upload GPTJForCausalLM
pytorch_model-00003-of-00003.bin  4.33 GB     Upload GPTJForCausalLM
pytorch_model.bin.index.json      25.8 kB     Upload GPTJForCausalLM
special_tokens_map.json           470 Bytes   add tokenizer (just the standard GPT-J-6B tokenizer)
tokenizer.json                    2.14 MB     add tokenizer (just the standard GPT-J-6B tokenizer)
tokenizer_config.json             722 Bytes   add tokenizer (just the standard GPT-J-6B tokenizer)
vocab.json                        798 kB      add tokenizer (just the standard GPT-J-6B tokenizer)

All files were last modified over 2 years ago.

Each pytorch_model-*.bin shard is a pickle file. The scanner detected four pickle imports in each shard: torch._utils._rebuild_tensor_v2, torch.FloatStorage, torch.BoolStorage, and collections.OrderedDict.