
jucamohedano/Phi2-openhermes-preferences-metamath-dpo

1.52 kB · 1 contributor · History: 1 commit
jucamohedano: initial commit (b5d96b5, verified, over 1 year ago)

  • .gitattributes · 1.52 kB · initial commit, over 1 year ago