add AIBOM · #55 opened 5 months ago by sabato-nocera
Support for AWS Sagemaker deployment · #54 opened 9 months ago by samqwiet
Phi3.5 model selection · #53 opened 10 months ago by DocDBrown
Switch import mechanism for flash_attn · #51 opened about 1 year ago by nvwilliamz
ModuleNotFoundError: No module named 'transformers_modules.microsoft.Phi-3' · #49 opened about 1 year ago by hsmanju
Model consistently gets into a loop to repeat itself if there is too much in the context window · 4 replies · #48 opened about 1 year ago by mstachow
Resource Requirements to load and save model · #47 opened about 1 year ago by nana123652
KeyError: 'factor' · #45 opened about 1 year ago by surak
How much GPU is needed to load the Phi-3.5-MoE-instruct model · 2 replies · #44 opened about 1 year ago by cyt78
QAT · 3 replies · #42 opened about 1 year ago by rezzie-rich
ModuleNotFoundError: No module named 'triton' · #41 opened about 1 year ago by Maximum2000
Cannot use transformer library to inference the · 8 replies · #40 opened about 1 year ago by manishbaral
Validation loss · #39 opened about 1 year ago by Mani5112
The model 'PhiMoEForCausalLM' is not supported for text-generation. Supported models are ['BartForCausalLM', 'BertLMHeadModel', … · 11 replies · #34 opened about 1 year ago by xxbadarxx
Only CPU is used during inference. · #33 opened about 1 year ago by rockcat-miao
The provided example doesn't work · 5 replies · #32 opened about 1 year ago by kqsong
need gguf · 19 replies · #4 opened about 1 year ago by windkkk