# Model Card

- Source: [https://arxiv.org/abs/2509.02046](https://arxiv.org/abs/2509.02046)
- Optimizer: `soape`
- Model size: `520m`
- Data size: `42B`

## Best configuration

| Hyperparameter | Value |
|---|---|
| beta1 | `0.95` |
| beta2 | `0.99` |
| block_size | `512` |
| epsilon | `1e-10` |
| learning_rate | `0.004` |
| max_grad_norm | `1` |
| min_lr_ratio | `0` |
| partition_grads_into_blocks | `True` |
| precondition_frequency | `10` |
| shampoo_beta | `0.95` |
| train_batch_size | `256` |
| warmup | `1000` |
| weight_decay | `0.1` |
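For reference, the table above can be collected into a plain config mapping. This is a minimal sketch: the key names mirror the table exactly, but the dict container (and the name `best_config`) is an illustrative assumption, not an API from the paper or its codebase.

```python
# Best-found hyperparameters for the `soape` optimizer (520m model, 42B tokens),
# transcribed from the table above. The dict itself is a hypothetical layout.
best_config = {
    "beta1": 0.95,
    "beta2": 0.99,
    "block_size": 512,
    "epsilon": 1e-10,
    "learning_rate": 0.004,
    "max_grad_norm": 1,
    "min_lr_ratio": 0,
    "partition_grads_into_blocks": True,
    "precondition_frequency": 10,  # steps between preconditioner updates
    "shampoo_beta": 0.95,
    "train_batch_size": 256,
    "warmup": 1000,  # warmup steps
    "weight_decay": 0.1,
}
```

A mapping like this can be passed straight to whatever optimizer constructor or training script consumes these settings, keeping the model card and the run configuration in sync.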