Dataset Preview
The full dataset viewer is not available (click to read why). Only showing a preview of the rows.
The dataset generation failed because of a cast error
Error code: DatasetGenerationCastError
Exception: DatasetGenerationCastError
Message: An error occurred while generating the dataset
All the data files must have the same columns, but at some point there are 77 new columns ({'weight_grad_norm/transformer.h.6.ln_1.weight', 'weight_grad_norm/transformer.h.5.ln_1.weight', 'empirical_L0_frac/transformer.h.6.attn.c_proj.bias', 'empirical_L0_frac/transformer.h.4.mlp.c_proj.bias', 'empirical_L0_frac/transformer.h.7.attn.c_attn.weight', 'weight_grad_norm/transformer.h.4.ln_1.weight', 'weight_grad_norm/transformer.h.5.attn.c_proj.bias', 'empirical_L0_frac/transformer.h.5.mlp.c_proj.bias', 'weight_grad_norm/transformer.h.6.mlp.c_fc.weight', 'empirical_L0_frac/transformer.h.6.mlp.c_fc.weight', 'weight_grad_norm/transformer.h.5.attn.c_attn.bias', 'num_alive_neurons/c_fc/layer_5', 'weight_grad_norm/transformer.h.5.attn.c_attn.weight', 'empirical_L0_frac/transformer.h.6.attn.c_proj.weight', 'weight_grad_norm/transformer.h.6.attn.c_proj.weight', 'empirical_L0_frac/transformer.h.4.mlp.c_proj.weight', 'weight_grad_norm/transformer.h.4.mlp.c_proj.weight', 'weight_grad_norm/transformer.h.4.mlp.c_fc.weight', 'weight_grad_norm/transformer.h.4.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.5.mlp.c_proj.bias', 'num_alive_neurons/c_fc/layer_7', 'weight_grad_norm/transformer.h.4.attn.c_proj.weight', 'weight_grad_norm/transformer.h.6.ln_2.weight', 'empirical_L0_frac/transformer.h.6.mlp.c_proj.weight', 'empirical_L0_frac/transformer.h.4.attn.c_proj.bias', 'empirical_L0_frac/transformer.h.6.attn.c_attn.weight', 'weight_grad_norm/transformer.h.5.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.6.mlp.c_proj.bias', 'empirical_L0_frac/transformer.h.7.mlp.c_proj.bias', 'empiric
...
'empirical_L0_frac/transformer.h.7.attn.c_proj.bias', 'weight_grad_norm/transformer.h.5.ln_2.weight', 'empirical_L0_frac/transformer.h.7.mlp.c_proj.weight', 'empirical_L0_frac/transformer.h.5.attn.c_attn.weight', 'empirical_L0_frac/transformer.h.4.attn.c_proj.weight', 'weight_grad_norm/transformer.h.5.mlp.c_fc.weight', 'empirical_L0_frac/transformer.h.7.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.4.attn.c_proj.bias', 'empirical_L0_frac/transformer.h.6.mlp.c_fc.bias', 'empirical_L0_frac/transformer.h.5.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.6.attn.c_attn.bias', 'empirical_L0_frac/transformer.h.7.attn.c_proj.weight', 'empirical_L0_frac/transformer.h.5.mlp.c_proj.weight', 'weight_grad_norm/transformer.h.4.mlp.c_proj.bias', 'weight_grad_norm/bigram_table', 'weight_grad_norm/transformer.h.6.attn.c_attn.weight', 'weight_grad_norm/transformer.h.5.attn.c_proj.weight', 'empirical_L0_frac/transformer.h.5.attn.c_attn.bias', 'weight_grad_norm/transformer.h.7.attn.c_attn.bias', 'empirical_L0_frac/transformer.h.6.mlp.c_proj.bias', 'weight_grad_norm/transformer.h.4.attn.c_attn.weight', 'weight_grad_norm/transformer.h.7.attn.c_attn.weight', 'weight_grad_norm/transformer.h.7.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.7.mlp.c_fc.weight', 'weight_grad_norm/transformer.h.7.attn.c_proj.bias', 'num_alive_neurons/c_fc/layer_4', 'empirical_L0_frac/transformer.h.7.attn.c_attn.bias', 'weight_grad_norm/transformer.h.4.ln_2.weight', 'empirical_L0_frac/transformer.h.7.mlp.c_fc.weight'}) and 6 missing columns ({'dead_qk/layer_1', 'test_xent_bridged', 'test_kl_bridged', 'dead_qk/layer_0', 'dead_qk/layer_2', 'dead_qk/layer_3'}).
This happened while the json dataset builder was generating data using
hf://datasets/michaelwaves/sparse-circuits/train_curves/csp_sweep1_16x_7.4Mnonzero_afrac0.125/progress.json (at revision 15dadd2c2824ca52bcbcb7ae58533d1998bd454e)
Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
Traceback: Traceback (most recent call last):
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1831, in _prepare_split_single
writer.write_table(table)
File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 714, in write_table
pa_table = table_cast(pa_table, self._schema)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2272, in table_cast
return cast_table_to_schema(table, schema)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2218, in cast_table_to_schema
raise CastError(
datasets.table.CastError: Couldn't cast
xent: double
test_kl: double
test_xent: double
grad_norm: int64
weight_sparsity: double
L0_as_frac_of_orig_params: double
L0: double
L0_non_LN: double
L0_non_embed: double
L0_non_embed_as_frac_of_orig_params: double
param_norm: double
aux_invertable: int64
elapsed_tokens: int64
grad_scale: double
lr: double
pfrac: int64
step: int64
did_clip_grad_norm: int64
tokens_per_second: double
num_alive_neurons/c_fc/layer_0: int64
num_alive_neurons/c_fc/layer_1: int64
num_alive_neurons/c_fc/layer_2: int64
num_alive_neurons/c_fc/layer_3: int64
num_alive_neurons/c_fc/layer_4: int64
num_alive_neurons/c_fc/layer_5: int64
num_alive_neurons/c_fc/layer_6: int64
num_alive_neurons/c_fc/layer_7: int64
empirical_L0_frac/transformer.wte.weight: double
empirical_L0_frac/transformer.wpe.weight: double
empirical_L0_frac/transformer.h.0.attn.c_attn.weight: double
empirical_L0_frac/transformer.h.0.attn.c_attn.bias: double
empirical_L0_frac/transformer.h.0.attn.c_proj.weight: double
empirical_L0_frac/transformer.h.0.attn.c_proj.bias: double
empirical_L0_frac/transformer.h.0.mlp.c_fc.weight: double
empirical_L0_frac/transformer.h.0.mlp.c_fc.bias: double
empirical_L0_frac/transformer.h.0.mlp.c_proj.weight: double
empirical_L0_frac/transformer.h.0.mlp.c_proj.bias: double
empirical_L0_frac/transformer.h.1.attn.c_attn.weight: double
empirical_L0_frac/transformer.h.1.attn.c_attn.bias: double
empirical_L0_frac/transformer.h.1.attn.c_proj.weight: double
empirical_L0_frac/transformer.h.1.attn.c_proj.bias: double
...
ad_norm/transformer.h.5.ln_2.weight: double
weight_grad_norm/transformer.h.5.mlp.c_fc.weight: double
weight_grad_norm/transformer.h.5.mlp.c_fc.bias: double
weight_grad_norm/transformer.h.5.mlp.c_proj.weight: double
weight_grad_norm/transformer.h.5.mlp.c_proj.bias: double
weight_grad_norm/transformer.h.6.ln_1.weight: double
weight_grad_norm/transformer.h.6.attn.c_attn.weight: double
weight_grad_norm/transformer.h.6.attn.c_attn.bias: double
weight_grad_norm/transformer.h.6.attn.c_proj.weight: double
weight_grad_norm/transformer.h.6.attn.c_proj.bias: double
weight_grad_norm/transformer.h.6.ln_2.weight: double
weight_grad_norm/transformer.h.6.mlp.c_fc.weight: double
weight_grad_norm/transformer.h.6.mlp.c_fc.bias: double
weight_grad_norm/transformer.h.6.mlp.c_proj.weight: double
weight_grad_norm/transformer.h.6.mlp.c_proj.bias: double
weight_grad_norm/transformer.h.7.ln_1.weight: double
weight_grad_norm/transformer.h.7.attn.c_attn.weight: double
weight_grad_norm/transformer.h.7.attn.c_attn.bias: double
weight_grad_norm/transformer.h.7.attn.c_proj.weight: double
weight_grad_norm/transformer.h.7.attn.c_proj.bias: double
weight_grad_norm/transformer.h.7.ln_2.weight: double
weight_grad_norm/transformer.h.7.mlp.c_fc.weight: double
weight_grad_norm/transformer.h.7.mlp.c_fc.bias: double
weight_grad_norm/transformer.h.7.mlp.c_proj.weight: double
weight_grad_norm/transformer.h.7.mlp.c_proj.bias: double
weight_grad_norm/transformer.ln_f.weight: double
weight_grad_norm/lm_head.weight: double
to
{'xent': Value('float64'), 'test_kl': Value('float64'), 'test_xent': Value('float64'), 'grad_norm': Value('int64'), 'weight_sparsity': Value('float64'), 'L0_as_frac_of_orig_params': Value('float64'), 'L0': Value('float64'), 'L0_non_LN': Value('float64'), 'L0_non_embed': Value('float64'), 'L0_non_embed_as_frac_of_orig_params': Value('float64'), 'param_norm': Value('float64'), 'aux_invertable': Value('float64'), 'elapsed_tokens': Value('int64'), 'grad_scale': Value('float64'), 'lr': Value('float64'), 'pfrac': Value('float64'), 'step': Value('int64'), 'did_clip_grad_norm': Value('int64'), 'tokens_per_second': Value('float64'), 'num_alive_neurons/c_fc/layer_0': Value('int64'), 'num_alive_neurons/c_fc/layer_1': Value('int64'), 'num_alive_neurons/c_fc/layer_2': Value('int64'), 'num_alive_neurons/c_fc/layer_3': Value('int64'), 'test_kl_bridged': Value('float64'), 'test_xent_bridged': Value('float64'), 'dead_qk/layer_0': Value('int64'), 'dead_qk/layer_1': Value('int64'), 'dead_qk/layer_2': Value('int64'), 'dead_qk/layer_3': Value('int64'), 'empirical_L0_frac/transformer.wte.weight': Value('float64'), 'empirical_L0_frac/transformer.wpe.weight': Value('float64'), 'empirical_L0_frac/transformer.h.0.attn.c_attn.weight': Value('float64'), 'empirical_L0_frac/transformer.h.0.attn.c_attn.bias': Value('float64'), 'empirical_L0_frac/transformer.h.0.attn.c_proj.weight': Value('float64'), 'empirical_L0_frac/transformer.h.0.attn.c_proj.bias': Value('float64'), 'empirical_L0_frac/transformer.h.0.m
...
orm/transformer.h.2.ln_1.weight': Value('float64'), 'weight_grad_norm/transformer.h.2.attn.c_attn.weight': Value('float64'), 'weight_grad_norm/transformer.h.2.attn.c_attn.bias': Value('float64'), 'weight_grad_norm/transformer.h.2.attn.c_proj.weight': Value('float64'), 'weight_grad_norm/transformer.h.2.attn.c_proj.bias': Value('float64'), 'weight_grad_norm/transformer.h.2.ln_2.weight': Value('float64'), 'weight_grad_norm/transformer.h.2.mlp.c_fc.weight': Value('float64'), 'weight_grad_norm/transformer.h.2.mlp.c_fc.bias': Value('float64'), 'weight_grad_norm/transformer.h.2.mlp.c_proj.weight': Value('float64'), 'weight_grad_norm/transformer.h.2.mlp.c_proj.bias': Value('float64'), 'weight_grad_norm/transformer.h.3.ln_1.weight': Value('float64'), 'weight_grad_norm/transformer.h.3.attn.c_attn.weight': Value('float64'), 'weight_grad_norm/transformer.h.3.attn.c_attn.bias': Value('float64'), 'weight_grad_norm/transformer.h.3.attn.c_proj.weight': Value('float64'), 'weight_grad_norm/transformer.h.3.attn.c_proj.bias': Value('float64'), 'weight_grad_norm/transformer.h.3.ln_2.weight': Value('float64'), 'weight_grad_norm/transformer.h.3.mlp.c_fc.weight': Value('float64'), 'weight_grad_norm/transformer.h.3.mlp.c_fc.bias': Value('float64'), 'weight_grad_norm/transformer.h.3.mlp.c_proj.weight': Value('float64'), 'weight_grad_norm/transformer.h.3.mlp.c_proj.bias': Value('float64'), 'weight_grad_norm/transformer.ln_f.weight': Value('float64'), 'weight_grad_norm/lm_head.weight': Value('float64')}
because column names don't match
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1450, in compute_config_parquet_and_info_response
parquet_operations, partial, estimated_dataset_info = stream_convert_to_parquet(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 993, in stream_convert_to_parquet
builder._prepare_split(
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1702, in _prepare_split
for job_id, done, content in self._prepare_split_single(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1833, in _prepare_split_single
raise DatasetGenerationCastError.from_cast_error(
datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset
All the data files must have the same columns, but at some point there are 77 new columns ({'weight_grad_norm/transformer.h.6.ln_1.weight', 'weight_grad_norm/transformer.h.5.ln_1.weight', 'empirical_L0_frac/transformer.h.6.attn.c_proj.bias', 'empirical_L0_frac/transformer.h.4.mlp.c_proj.bias', 'empirical_L0_frac/transformer.h.7.attn.c_attn.weight', 'weight_grad_norm/transformer.h.4.ln_1.weight', 'weight_grad_norm/transformer.h.5.attn.c_proj.bias', 'empirical_L0_frac/transformer.h.5.mlp.c_proj.bias', 'weight_grad_norm/transformer.h.6.mlp.c_fc.weight', 'empirical_L0_frac/transformer.h.6.mlp.c_fc.weight', 'weight_grad_norm/transformer.h.5.attn.c_attn.bias', 'num_alive_neurons/c_fc/layer_5', 'weight_grad_norm/transformer.h.5.attn.c_attn.weight', 'empirical_L0_frac/transformer.h.6.attn.c_proj.weight', 'weight_grad_norm/transformer.h.6.attn.c_proj.weight', 'empirical_L0_frac/transformer.h.4.mlp.c_proj.weight', 'weight_grad_norm/transformer.h.4.mlp.c_proj.weight', 'weight_grad_norm/transformer.h.4.mlp.c_fc.weight', 'weight_grad_norm/transformer.h.4.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.5.mlp.c_proj.bias', 'num_alive_neurons/c_fc/layer_7', 'weight_grad_norm/transformer.h.4.attn.c_proj.weight', 'weight_grad_norm/transformer.h.6.ln_2.weight', 'empirical_L0_frac/transformer.h.6.mlp.c_proj.weight', 'empirical_L0_frac/transformer.h.4.attn.c_proj.bias', 'empirical_L0_frac/transformer.h.6.attn.c_attn.weight', 'weight_grad_norm/transformer.h.5.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.6.mlp.c_proj.bias', 'empirical_L0_frac/transformer.h.7.mlp.c_proj.bias', 'empiric
...
'empirical_L0_frac/transformer.h.7.attn.c_proj.bias', 'weight_grad_norm/transformer.h.5.ln_2.weight', 'empirical_L0_frac/transformer.h.7.mlp.c_proj.weight', 'empirical_L0_frac/transformer.h.5.attn.c_attn.weight', 'empirical_L0_frac/transformer.h.4.attn.c_proj.weight', 'weight_grad_norm/transformer.h.5.mlp.c_fc.weight', 'empirical_L0_frac/transformer.h.7.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.4.attn.c_proj.bias', 'empirical_L0_frac/transformer.h.6.mlp.c_fc.bias', 'empirical_L0_frac/transformer.h.5.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.6.attn.c_attn.bias', 'empirical_L0_frac/transformer.h.7.attn.c_proj.weight', 'empirical_L0_frac/transformer.h.5.mlp.c_proj.weight', 'weight_grad_norm/transformer.h.4.mlp.c_proj.bias', 'weight_grad_norm/bigram_table', 'weight_grad_norm/transformer.h.6.attn.c_attn.weight', 'weight_grad_norm/transformer.h.5.attn.c_proj.weight', 'empirical_L0_frac/transformer.h.5.attn.c_attn.bias', 'weight_grad_norm/transformer.h.7.attn.c_attn.bias', 'empirical_L0_frac/transformer.h.6.mlp.c_proj.bias', 'weight_grad_norm/transformer.h.4.attn.c_attn.weight', 'weight_grad_norm/transformer.h.7.attn.c_attn.weight', 'weight_grad_norm/transformer.h.7.mlp.c_fc.bias', 'weight_grad_norm/transformer.h.7.mlp.c_fc.weight', 'weight_grad_norm/transformer.h.7.attn.c_proj.bias', 'num_alive_neurons/c_fc/layer_4', 'empirical_L0_frac/transformer.h.7.attn.c_attn.bias', 'weight_grad_norm/transformer.h.4.ln_2.weight', 'empirical_L0_frac/transformer.h.7.mlp.c_fc.weight'}) and 6 missing columns ({'dead_qk/layer_1', 'test_xent_bridged', 'test_kl_bridged', 'dead_qk/layer_0', 'dead_qk/layer_2', 'dead_qk/layer_3'}).
This happened while the json dataset builder was generating data using
hf://datasets/michaelwaves/sparse-circuits/train_curves/csp_sweep1_16x_7.4Mnonzero_afrac0.125/progress.json (at revision 15dadd2c2824ca52bcbcb7ae58533d1998bd454e)
Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)Need help to make the dataset viewer work? Make sure to review how to configure the dataset viewer, and open a discussion for direct support.
xent
float64 | test_kl
float64 | test_xent
float64 | grad_norm
int64 | weight_sparsity
float64 | L0_as_frac_of_orig_params
float64 | L0
float64 | L0_non_LN
float64 | L0_non_embed
float64 | L0_non_embed_as_frac_of_orig_params
float64 | param_norm
float64 | aux_invertable
float64 | elapsed_tokens
int64 | grad_scale
float64 | lr
float64 | pfrac
float64 | step
int64 | did_clip_grad_norm
int64 | tokens_per_second
float64 | num_alive_neurons/c_fc/layer_0
int64 | num_alive_neurons/c_fc/layer_1
int64 | num_alive_neurons/c_fc/layer_2
int64 | num_alive_neurons/c_fc/layer_3
int64 | test_kl_bridged
float64 | test_xent_bridged
float64 | dead_qk/layer_0
int64 | dead_qk/layer_1
int64 | dead_qk/layer_2
int64 | dead_qk/layer_3
int64 | empirical_L0_frac/transformer.wte.weight
float64 | empirical_L0_frac/transformer.wpe.weight
float64 | empirical_L0_frac/transformer.h.0.attn.c_attn.weight
float64 | empirical_L0_frac/transformer.h.0.attn.c_attn.bias
float64 | empirical_L0_frac/transformer.h.0.attn.c_proj.weight
float64 | empirical_L0_frac/transformer.h.0.attn.c_proj.bias
float64 | empirical_L0_frac/transformer.h.0.mlp.c_fc.weight
float64 | empirical_L0_frac/transformer.h.0.mlp.c_fc.bias
float64 | empirical_L0_frac/transformer.h.0.mlp.c_proj.weight
float64 | empirical_L0_frac/transformer.h.0.mlp.c_proj.bias
float64 | empirical_L0_frac/transformer.h.1.attn.c_attn.weight
float64 | empirical_L0_frac/transformer.h.1.attn.c_attn.bias
float64 | empirical_L0_frac/transformer.h.1.attn.c_proj.weight
float64 | empirical_L0_frac/transformer.h.1.attn.c_proj.bias
float64 | empirical_L0_frac/transformer.h.1.mlp.c_fc.weight
float64 | empirical_L0_frac/transformer.h.1.mlp.c_fc.bias
float64 | empirical_L0_frac/transformer.h.1.mlp.c_proj.weight
float64 | empirical_L0_frac/transformer.h.1.mlp.c_proj.bias
float64 | empirical_L0_frac/transformer.h.2.attn.c_attn.weight
float64 | empirical_L0_frac/transformer.h.2.attn.c_attn.bias
float64 | empirical_L0_frac/transformer.h.2.attn.c_proj.weight
float64 | empirical_L0_frac/transformer.h.2.attn.c_proj.bias
float64 | empirical_L0_frac/transformer.h.2.mlp.c_fc.weight
float64 | empirical_L0_frac/transformer.h.2.mlp.c_fc.bias
float64 | empirical_L0_frac/transformer.h.2.mlp.c_proj.weight
float64 | empirical_L0_frac/transformer.h.2.mlp.c_proj.bias
float64 | empirical_L0_frac/transformer.h.3.attn.c_attn.weight
float64 | empirical_L0_frac/transformer.h.3.attn.c_attn.bias
float64 | empirical_L0_frac/transformer.h.3.attn.c_proj.weight
float64 | empirical_L0_frac/transformer.h.3.attn.c_proj.bias
float64 | empirical_L0_frac/transformer.h.3.mlp.c_fc.weight
float64 | empirical_L0_frac/transformer.h.3.mlp.c_fc.bias
float64 | empirical_L0_frac/transformer.h.3.mlp.c_proj.weight
float64 | empirical_L0_frac/transformer.h.3.mlp.c_proj.bias
float64 | empirical_L0_frac/lm_head.weight
float64 | weight_grad_norm/transformer.wte.weight
float64 | weight_grad_norm/transformer.wpe.weight
float64 | weight_grad_norm/transformer.h.0.ln_1.weight
float64 | weight_grad_norm/transformer.h.0.attn.c_attn.weight
float64 | weight_grad_norm/transformer.h.0.attn.c_attn.bias
float64 | weight_grad_norm/transformer.h.0.attn.c_proj.weight
float64 | weight_grad_norm/transformer.h.0.attn.c_proj.bias
float64 | weight_grad_norm/transformer.h.0.ln_2.weight
float64 | weight_grad_norm/transformer.h.0.mlp.c_fc.weight
float64 | weight_grad_norm/transformer.h.0.mlp.c_fc.bias
float64 | weight_grad_norm/transformer.h.0.mlp.c_proj.weight
float64 | weight_grad_norm/transformer.h.0.mlp.c_proj.bias
float64 | weight_grad_norm/transformer.h.1.ln_1.weight
float64 | weight_grad_norm/transformer.h.1.attn.c_attn.weight
float64 | weight_grad_norm/transformer.h.1.attn.c_attn.bias
float64 | weight_grad_norm/transformer.h.1.attn.c_proj.weight
float64 | weight_grad_norm/transformer.h.1.attn.c_proj.bias
float64 | weight_grad_norm/transformer.h.1.ln_2.weight
float64 | weight_grad_norm/transformer.h.1.mlp.c_fc.weight
float64 | weight_grad_norm/transformer.h.1.mlp.c_fc.bias
float64 | weight_grad_norm/transformer.h.1.mlp.c_proj.weight
float64 | weight_grad_norm/transformer.h.1.mlp.c_proj.bias
float64 | weight_grad_norm/transformer.h.2.ln_1.weight
float64 | weight_grad_norm/transformer.h.2.attn.c_attn.weight
float64 | weight_grad_norm/transformer.h.2.attn.c_attn.bias
float64 | weight_grad_norm/transformer.h.2.attn.c_proj.weight
float64 | weight_grad_norm/transformer.h.2.attn.c_proj.bias
float64 | weight_grad_norm/transformer.h.2.ln_2.weight
float64 | weight_grad_norm/transformer.h.2.mlp.c_fc.weight
float64 | weight_grad_norm/transformer.h.2.mlp.c_fc.bias
float64 | weight_grad_norm/transformer.h.2.mlp.c_proj.weight
float64 | weight_grad_norm/transformer.h.2.mlp.c_proj.bias
float64 | weight_grad_norm/transformer.h.3.ln_1.weight
float64 | weight_grad_norm/transformer.h.3.attn.c_attn.weight
float64 | weight_grad_norm/transformer.h.3.attn.c_attn.bias
float64 | weight_grad_norm/transformer.h.3.attn.c_proj.weight
float64 | weight_grad_norm/transformer.h.3.attn.c_proj.bias
float64 | weight_grad_norm/transformer.h.3.ln_2.weight
float64 | weight_grad_norm/transformer.h.3.mlp.c_fc.weight
float64 | weight_grad_norm/transformer.h.3.mlp.c_fc.bias
float64 | weight_grad_norm/transformer.h.3.mlp.c_proj.weight
float64 | weight_grad_norm/transformer.h.3.mlp.c_proj.bias
float64 | weight_grad_norm/transformer.ln_f.weight
float64 | weight_grad_norm/lm_head.weight
float64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.371694
| 0
| 1.276191
| 0
| 0.989376
| 0.259892
| 2,251,264
| 2,228,224
| 1,835,008
| 0.5
| 610.702698
| 0.594859
| 31,992,053,760
| 8,192
| 0.000006
| 0.5
| 61,020
| 0
| 269,959.871277
| 8,192
| 8,192
| 8,192
| 8,192
| 0.203483
| 1.365613
| 1,546
| 1,711
| 1,610
| 1,299
| 0.0625
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.007813
| 0.0625
| 0.0625
| 1,994.947876
| 1,419.624512
| 42.788998
| 3,418.453125
| 76.63591
| 2,001.557251
| 45.179031
| 42.6329
| 4,028.899658
| 85.289574
| 3,970.382813
| 45.125263
| 42.638042
| 3,463.61499
| 76.2995
| 2,001.994141
| 45.133698
| 41.742344
| 4,005.307373
| 85.306297
| 3,873.67627
| 45.10429
| 41.697556
| 3,443.843506
| 75.443939
| 1,986.846558
| 45.098782
| 38.983082
| 3,926.290039
| 81.894409
| 3,875.323242
| 45.052746
| 42.817844
| 3,390.329102
| 74.335213
| 1,973.016968
| 45.038067
| 41.031883
| 3,912.85376
| 80.599854
| 3,904.460449
| 44.765808
| 43.085915
| 2,006.281128
|
7.599796
| 0
| 7.548134
| 0
| 0.000348
| 24.454414
| 211,830,784
| 211,812,352
| 205,520,896
| 56
| 271.653412
| 25.803831
| 0
| 65,536
| 0
| 64
| 0
| 0
| 0
| 8,192
| 8,192
| 8,192
| 8,192
| 6.313459
| 7.549939
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 0
| 1
| 2,042.441895
| 1,445.860107
| 44.536713
| 3,417.94751
| 78.21595
| 2,043.262329
| 45.25259
| 45.196499
| 4,006.147949
| 90.136925
| 4,088.913574
| 45.245182
| 44.620705
| 3,454.940186
| 77.868614
| 2,045.24939
| 45.247963
| 45.226604
| 4,031.810059
| 90.204514
| 4,088.154297
| 45.239128
| 44.14304
| 3,348.273926
| 77.44931
| 2,041.631104
| 45.24192
| 45.206886
| 3,970.539063
| 90.018944
| 4,080.6521
| 45.227757
| 43.025169
| 3,029.654785
| 76.186371
| 2,029.564209
| 45.224136
| 42.688358
| 3,560.497559
| 88.277634
| 4,063.51123
| 45.201862
| 45.206448
| 2,044.870728
|
6.098747
| 0
| 6.018192
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 271.395508
| 28.006756
| 10,485,760
| 65,536
| 0.000074
| 64
| 20
| 0
| 213,724.714137
| 8,192
| 8,192
| 8,192
| 8,192
| 4.868958
| 6.110586
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 2,024.374146
| 1,436.76123
| 44.634747
| 3,101.237549
| 77.239395
| 2,020.609863
| 45.229256
| 45.111858
| 3,913.091797
| 89.088036
| 4,046.999756
| 45.173038
| 43.59293
| 3,180.97583
| 74.38829
| 2,034.434082
| 45.18993
| 45.19162
| 3,989.652344
| 89.820747
| 4,043.778564
| 45.158539
| 43.672146
| 3,025.878662
| 74.698257
| 2,028.325684
| 45.19117
| 45.166584
| 3,899.591797
| 89.589149
| 4,028.159424
| 45.139484
| 44.003937
| 2,991.673096
| 74.756332
| 2,023.001587
| 45.173389
| 44.834373
| 3,523.896484
| 86.968803
| 4,022.80249
| 45.106232
| 45.147758
| 2,044.234741
|
5.70662
| 0
| 5.536369
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 270.967529
| 20.301104
| 20,971,520
| 65,536
| 0.000148
| 64
| 40
| 0
| 240,548.875229
| 8,192
| 8,192
| 8,192
| 8,192
| 4.218064
| 5.458321
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,994.526611
| 1,411.366699
| 44.119457
| 2,903.021973
| 77.060318
| 2,018.457642
| 45.211472
| 44.953495
| 3,979.353271
| 88.883781
| 4,054.255127
| 45.181942
| 44.423656
| 3,294.275879
| 74.72554
| 2,032.884766
| 45.170803
| 44.953133
| 3,892.654297
| 88.349258
| 3,986.199707
| 45.111355
| 44.110325
| 3,187.261719
| 73.550255
| 2,027.265503
| 45.13969
| 44.741943
| 3,697.235352
| 86.732254
| 3,868.342773
| 45.028236
| 42.864365
| 2,873.369873
| 69.685135
| 2,009.54834
| 45.028992
| 41.814026
| 2,983.696777
| 73.369652
| 3,880.479492
| 44.362804
| 45.007381
| 2,037.780396
|
5.532545
| 0
| 5.424904
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 270.557312
| 17.471241
| 31,457,280
| 65,536
| 0.000223
| 64
| 60
| 0
| 250,908.126209
| 8,192
| 8,192
| 8,192
| 8,192
| 3.81137
| 5.054487
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,990.942017
| 1,401.326416
| 44.705479
| 3,378.902832
| 77.881409
| 2,038.997803
| 45.23761
| 45.198154
| 4,046.885742
| 89.55011
| 4,062.207031
| 45.188034
| 44.344284
| 3,379.884521
| 75.551064
| 2,033.253052
| 45.183643
| 44.552868
| 3,866.267822
| 85.773163
| 3,979.109375
| 45.093315
| 43.767799
| 3,253.471924
| 73.656914
| 2,020.601074
| 45.129738
| 43.751408
| 3,628.754395
| 81.14782
| 3,796.562988
| 45.042522
| 41.091221
| 2,952.963135
| 68.854263
| 2,002.120483
| 45.024139
| 44.347645
| 3,735.443848
| 84.717499
| 3,955.677734
| 44.676514
| 45.066643
| 2,038.4552
|
5.094608
| 0
| 4.943391
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 270.305481
| 14.104076
| 41,943,040
| 65,536
| 0.000297
| 64
| 80
| 0
| 256,611.114932
| 8,192
| 8,192
| 8,192
| 8,192
| 3.355571
| 4.598867
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,965.004883
| 1,401.592285
| 44.713261
| 3,319.762695
| 77.704666
| 2,034.22522
| 45.231434
| 45.136143
| 4,022.622314
| 89.228767
| 4,059.048584
| 45.197479
| 44.223785
| 3,404.084473
| 75.816971
| 2,033.121948
| 45.195683
| 44.421791
| 3,870.101807
| 86.002388
| 4,018.85083
| 45.15007
| 42.723808
| 3,180.932617
| 72.384048
| 2,012.970581
| 45.137669
| 42.832153
| 3,567.246826
| 81.712166
| 3,808.503906
| 45.063221
| 40.785126
| 2,953.167725
| 69.659592
| 1,996.739136
| 45.07056
| 44.598991
| 3,861.677979
| 86.865356
| 4,016.754395
| 44.983814
| 45.154896
| 2,040.440186
|
4.746141
| 0
| 4.516998
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 270.002808
| 14.97901
| 52,428,800
| 65,536
| 0.000371
| 64
| 100
| 0
| 260,174.723843
| 8,192
| 8,192
| 8,192
| 8,192
| 2.809456
| 4.051325
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,953.960327
| 1,397.453491
| 44.352894
| 3,268.207275
| 77.60891
| 2,034.867676
| 45.235916
| 45.124626
| 4,042.440674
| 89.668152
| 4,055.394287
| 45.216038
| 44.257931
| 3,402.859131
| 75.942123
| 2,031.130737
| 45.208008
| 44.300045
| 3,839.019043
| 85.519547
| 4,019.004883
| 45.194153
| 44.06839
| 3,343.174316
| 75.537903
| 2,024.53479
| 45.202026
| 43.814575
| 3,786.081299
| 85.735489
| 3,921.450928
| 45.171505
| 44.501114
| 3,372.421875
| 76.172287
| 2,027.473022
| 45.195721
| 45.111279
| 4,034.821289
| 89.639641
| 4,052.7146
| 45.140812
| 45.154015
| 2,041.812744
|
4.382658
| 0
| 4.197106
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 269.786774
| 11.52207
| 62,914,560
| 65,536
| 0.000445
| 64
| 120
| 0
| 262,405.57835
| 8,192
| 8,192
| 8,192
| 8,192
| 2.432196
| 3.672437
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,943.880737
| 1,391.163086
| 43.353951
| 3,073.774414
| 76.84272
| 2,035.953125
| 45.223454
| 44.890465
| 4,002.887939
| 88.976418
| 4,048.109131
| 45.208481
| 44.357281
| 3,404.838379
| 76.092644
| 2,027.36084
| 45.210346
| 44.245064
| 3,796.461182
| 84.530975
| 4,005.958984
| 45.199272
| 44.093517
| 3,356.951904
| 76.035828
| 2,014.947754
| 45.19677
| 43.302837
| 3,601.930908
| 83.228111
| 3,904.180908
| 45.177204
| 43.992844
| 3,292.799316
| 75.310486
| 2,022.112915
| 45.193531
| 45.036919
| 4,014.365967
| 89.470764
| 4,045.775635
| 45.168247
| 45.157513
| 2,042.953613
|
4.157256
| 0
| 3.968408
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 269.721008
| 10.186149
| 73,400,320
| 65,536
| 0.000519
| 64
| 140
| 0
| 263,528.317851
| 8,192
| 8,192
| 8,192
| 8,192
| 2.105189
| 3.343932
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,949.997681
| 1,390.417969
| 43.413353
| 3,148.731689
| 77.009407
| 2,033.994507
| 45.22113
| 44.890182
| 3,985.474609
| 88.722229
| 4,034.059082
| 45.203938
| 44.531403
| 3,420.216309
| 76.578812
| 2,018.591919
| 45.207005
| 44.47155
| 3,747.083496
| 84.143539
| 3,984.544434
| 45.204979
| 44.055717
| 3,368.928223
| 75.935593
| 2,015.072144
| 45.199032
| 44.12875
| 3,831.817871
| 86.658104
| 3,919.103271
| 45.182602
| 44.317524
| 3,388.469971
| 76.444595
| 2,021.802979
| 45.20002
| 44.966862
| 3,987.887207
| 89.162735
| 4,033.185791
| 45.182808
| 45.150105
| 2,041.941895
|
4.14553
| 0
| 3.855193
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 269.841431
| 10.204352
| 83,886,080
| 65,536
| 0.000594
| 64
| 160
| 0
| 264,656.319895
| 8,192
| 8,192
| 8,192
| 8,192
| 1.847092
| 3.086919
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,982.688843
| 1,404.646484
| 44.369148
| 3,303.03833
| 77.802773
| 2,038.697876
| 45.23914
| 45.062649
| 4,034.096924
| 89.62738
| 4,052.11499
| 45.229252
| 44.626781
| 3,444.42627
| 76.763298
| 2,023.836182
| 45.222057
| 44.700764
| 3,856.671387
| 86.567978
| 4,007.808594
| 45.223038
| 44.086475
| 3,373.118896
| 76.17189
| 2,021.080322
| 45.215492
| 44.168781
| 3,757.440186
| 85.81002
| 3,969.983398
| 45.215351
| 44.401894
| 3,371.402588
| 76.402405
| 2,028.637573
| 45.218845
| 45.008472
| 4,003.691895
| 89.358414
| 4,033.064453
| 45.205132
| 45.168327
| 2,043.710449
|
3.931593
| 0
| 3.700436
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 270.100342
| 8.666575
| 94,371,840
| 65,536
| 0.000668
| 64
| 180
| 0
| 265,817.515167
| 8,192
| 8,192
| 8,192
| 8,192
| 1.679852
| 2.916628
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,977.75708
| 1,396.908813
| 44.365215
| 3,350.705811
| 77.792961
| 2,035.373535
| 45.23587
| 45.026825
| 4,017.502686
| 89.450615
| 4,038.096191
| 45.223675
| 44.533112
| 3,431.784424
| 76.789238
| 2,014.824951
| 45.216816
| 44.571026
| 3,824.945313
| 86.012909
| 3,986.557129
| 45.220295
| 44.331661
| 3,378.228516
| 76.490448
| 2,010.748657
| 45.209576
| 44.172115
| 3,780.362305
| 86.201302
| 3,932.653809
| 45.207973
| 44.936104
| 3,471.90918
| 77.604935
| 2,029.384521
| 45.221043
| 44.987503
| 4,001.912598
| 89.243553
| 4,017.787842
| 45.20047
| 45.158562
| 2,042.523315
|
3.772708
| 0
| 3.574709
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 270.499237
| 7.900064
| 104,857,600
| 65,536
| 0.000742
| 64
| 200
| 0
| 266,621.900123
| 8,192
| 8,192
| 8,192
| 8,192
| 1.578905
| 2.814312
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,976.03894
| 1,395.300049
| 44.313335
| 3,361.035645
| 77.72142
| 2,034.007324
| 45.233257
| 45.062134
| 3,999.512939
| 89.174538
| 4,022.727295
| 45.212288
| 44.182457
| 3,383.47583
| 75.935356
| 2,005.854736
| 45.206139
| 44.241222
| 3,773.335693
| 85.250587
| 3,963.743408
| 45.214199
| 44.975449
| 3,485.955078
| 77.747467
| 2,024.953369
| 45.224972
| 44.672905
| 3,860.925781
| 86.819389
| 3,929.117432
| 45.210255
| 44.421253
| 3,390.026855
| 76.485886
| 2,021.778564
| 45.207867
| 44.966011
| 3,991.077148
| 89.182457
| 4,018.359375
| 45.197422
| 45.152317
| 2,043.224976
|
3.632998
| 0
| 3.447376
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 270.973206
| 6.714726
| 115,343,360
| 65,536
| 0.000816
| 64
| 220
| 0
| 265,777.564476
| 8,192
| 8,192
| 8,192
| 8,192
| 1.499292
| 2.732976
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,973.661987
| 1,398.305176
| 44.357887
| 3,323.031494
| 77.645187
| 2,031.979004
| 45.229954
| 44.909496
| 3,986.919434
| 89.198776
| 4,027.865479
| 45.216019
| 44.52597
| 3,416.169189
| 76.470573
| 2,004.623901
| 45.210857
| 44.508389
| 3,811.934814
| 86.240044
| 3,963.161865
| 45.218002
| 44.310562
| 3,387.773926
| 76.40657
| 2,009.022217
| 45.208157
| 44.386112
| 3,784.291504
| 85.888077
| 3,917.313477
| 45.2085
| 44.901306
| 3,481.619385
| 77.683716
| 2,024.924072
| 45.221668
| 45.014095
| 3,975.718262
| 88.740852
| 4,000.212891
| 45.199039
| 45.151825
| 2,042.557617
|
3.382951
| 0
| 3.282897
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 271.503723
| 5.905764
| 125,829,120
| 65,536
| 0.00089
| 64
| 240
| 0
| 260,235.441335
| 8,192
| 8,192
| 8,192
| 8,192
| 1.434669
| 2.668562
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,984.384521
| 1,403.081177
| 44.523594
| 3,356.570068
| 77.687096
| 2,033.611694
| 45.231682
| 44.889694
| 3,970.364502
| 89.042297
| 4,021.224609
| 45.216702
| 44.528107
| 3,437.839111
| 76.941422
| 2,002.032104
| 45.212711
| 44.333702
| 3,743.568115
| 85.177422
| 3,950.19458
| 45.21302
| 44.513306
| 3,405.756348
| 76.71682
| 2,007.994019
| 45.209362
| 44.312672
| 3,782.957764
| 85.96109
| 3,905.260742
| 45.204365
| 44.732761
| 3,433.446289
| 77.206993
| 2,017.310791
| 45.212322
| 44.895863
| 3,957.097168
| 88.390831
| 3,989.552979
| 45.19363
| 45.1469
| 2,042.918945
|
3.288423
| 0
| 3.15651
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 272.111908
| 5.976633
| 136,314,880
| 65,536
| 0.000964
| 64
| 260
| 0
| 239,102.937429
| 8,192
| 8,192
| 8,192
| 8,192
| 1.400091
| 2.632823
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,999.109985
| 1,407.929932
| 44.719254
| 3,405.048828
| 77.791039
| 2,035.313599
| 45.235191
| 45.038849
| 4,005.349365
| 89.400665
| 4,044.837646
| 45.222744
| 44.426807
| 3,425.602539
| 76.708557
| 2,003.915649
| 45.213387
| 44.859077
| 3,880.963623
| 86.673523
| 3,973.288574
| 45.219536
| 44.560585
| 3,434.814697
| 77.030251
| 2,012.672119
| 45.215385
| 44.845615
| 3,940.984863
| 87.614021
| 3,949.706787
| 45.208912
| 45.001366
| 3,499.11377
| 77.875175
| 2,020.672729
| 45.222729
| 45.044144
| 4,007.729004
| 88.91729
| 4,015.262695
| 45.196301
| 45.159195
| 2,042.896118
|
3.206962
| 0
| 2.986931
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 272.769257
| 5.418612
| 146,800,640
| 65,536
| 0.001039
| 64
| 280
| 0
| 241,312.568348
| 8,192
| 8,192
| 8,192
| 8,192
| 1.320852
| 2.551461
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,990.045776
| 1,403.986206
| 44.67831
| 3,382.977051
| 77.565254
| 2,030.391479
| 45.22744
| 44.958878
| 3,983.324463
| 89.197571
| 4,026.377686
| 45.216522
| 44.598339
| 3,443.709229
| 76.879608
| 1,991.983887
| 45.206886
| 44.445274
| 3,795.392578
| 85.793098
| 3,934.55249
| 45.211533
| 44.487793
| 3,429.194336
| 77.11869
| 2,000.282715
| 45.20702
| 44.411484
| 3,850.766602
| 86.116875
| 3,873.680664
| 45.195965
| 44.702576
| 3,438.47876
| 77.226044
| 2,011.500488
| 45.201466
| 45.082737
| 3,993.719727
| 88.814896
| 3,975.368896
| 45.185181
| 45.118359
| 2,040.852051
|
3.061177
| 0
| 2.887115
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 273.389099
| 4.909681
| 157,286,400
| 65,536
| 0.001113
| 64
| 300
| 0
| 243,187.830758
| 8,192
| 8,192
| 8,192
| 8,192
| 1.300131
| 2.528109
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,996.364868
| 1,405.875488
| 44.87603
| 3,434.700928
| 77.590103
| 2,032.457642
| 45.227634
| 44.926941
| 3,967.894287
| 88.872574
| 4,036.104736
| 45.217491
| 44.620026
| 3,439.8479
| 76.888382
| 1,997.838623
| 45.208427
| 44.568966
| 3,864.932373
| 86.388298
| 3,944.916504
| 45.209484
| 44.51931
| 3,415.895264
| 76.564537
| 2,001.504395
| 45.20153
| 44.408924
| 3,841.245605
| 85.954163
| 3,885.658936
| 45.194798
| 44.383568
| 3,399.176025
| 76.412903
| 2,006.117798
| 45.194004
| 45.01498
| 3,984.093506
| 88.34391
| 3,986.854248
| 45.169418
| 45.130726
| 2,042.196289
|
2.947598
| 0
| 2.799381
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 274.064789
| 4.216142
| 167,772,160
| 65,536
| 0.001187
| 64
| 320
| 0
| 244,674.706155
| 8,192
| 8,192
| 8,192
| 8,192
| 1.292536
| 2.519907
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,991.436035
| 1,406.244751
| 44.867184
| 3,433.229736
| 77.651917
| 2,030.817383
| 45.229538
| 44.99519
| 3,993.063232
| 89.153618
| 4,041.993652
| 45.218189
| 44.540459
| 3,433.367676
| 76.776588
| 1,995.753662
| 45.204346
| 44.524078
| 3,844.953369
| 86.006592
| 3,957.445557
| 45.215839
| 44.680031
| 3,461.9104
| 77.238586
| 2,009.12085
| 45.211205
| 44.704376
| 3,902.932373
| 86.803085
| 3,925.201904
| 45.201496
| 44.805466
| 3,470.05957
| 77.391037
| 2,013.935791
| 45.204884
| 45.06855
| 4,007.065674
| 88.873039
| 4,004.919434
| 45.185398
| 45.150024
| 2,042.994019
|
3.000859
| 0
| 2.709356
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 274.779449
| 4.402226
| 178,257,920
| 65,536
| 0.001261
| 64
| 340
| 0
| 246,089.160294
| 8,192
| 8,192
| 8,192
| 8,192
| 1.213526
| 2.439985
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,990.609985
| 1,407.446533
| 44.935364
| 3,446.432373
| 77.825348
| 2,032.114502
| 45.234863
| 45.12801
| 4,038.263184
| 89.838112
| 4,034.5896
| 45.224792
| 44.697941
| 3,451.755859
| 77.122375
| 2,008.840332
| 45.219913
| 44.992245
| 3,992.849854
| 88.500984
| 3,975.477295
| 45.225067
| 44.905869
| 3,490.225098
| 77.656647
| 2,018.049316
| 45.221043
| 44.94812
| 3,991.211914
| 88.438629
| 3,910.025635
| 45.209122
| 44.913361
| 3,470.744873
| 77.439484
| 2,017.883911
| 45.211132
| 45.094086
| 4,012.364014
| 89.135277
| 3,962.010742
| 45.195602
| 45.163902
| 2,042.292114
|
2.743012
| 0
| 2.662057
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 275.505188
| 4.196299
| 188,743,680
| 65,536
| 0.001335
| 64
| 360
| 0
| 247,416.899297
| 8,192
| 8,192
| 8,192
| 8,192
| 1.196619
| 2.422268
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 2,000.154785
| 1,406.037964
| 44.960979
| 3,455.569824
| 77.622955
| 2,029.617798
| 45.230736
| 45.074715
| 4,013.252197
| 89.325027
| 4,034.20166
| 45.220554
| 44.725937
| 3,465.890381
| 77.306061
| 2,005.086426
| 45.215797
| 44.777691
| 3,908.763672
| 87.083641
| 3,954.101074
| 45.219662
| 44.729385
| 3,452.878174
| 77.263138
| 2,013.474121
| 45.2187
| 44.751541
| 3,909.052979
| 87.079445
| 3,903.493896
| 45.210438
| 44.992855
| 3,489.304443
| 77.743782
| 2,024.337769
| 45.216557
| 45.03204
| 3,993.471436
| 88.73819
| 3,979.350342
| 45.194454
| 45.155262
| 2,043.317017
|
2.935627
| 0
| 2.626852
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 276.309448
| 3.563018
| 199,229,440
| 65,536
| 0.00141
| 64
| 380
| 0
| 248,607.591339
| 8,192
| 8,192
| 8,192
| 8,192
| 1.161115
| 2.386936
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,986.538818
| 1,406.953369
| 44.96854
| 3,448.187988
| 77.775909
| 2,029.629028
| 45.236012
| 44.954514
| 3,991.970947
| 89.449539
| 4,028.50293
| 45.2286
| 44.797054
| 3,469.131348
| 77.370399
| 2,012.535034
| 45.222107
| 44.654076
| 3,873.610107
| 87.331116
| 3,959.132813
| 45.223503
| 44.832676
| 3,475.960449
| 77.567017
| 2,021.059326
| 45.226135
| 44.881199
| 3,910.285645
| 87.38015
| 3,897.246582
| 45.210629
| 45.076336
| 3,504.46167
| 77.919762
| 2,029.934448
| 45.225368
| 45.006542
| 4,004.596924
| 88.91803
| 3,972.26123
| 45.199593
| 45.153633
| 2,042.194824
|
2.660415
| 0
| 2.507989
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 277.142303
| 3.698636
| 209,715,200
| 65,536
| 0.001484
| 64
| 400
| 0
| 249,678.156955
| 8,192
| 8,192
| 8,192
| 8,192
| 1.129611
| 2.354918
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,994.184326
| 1,407.811035
| 44.793747
| 3,429.135742
| 77.594872
| 2,025.869629
| 45.229347
| 45.004276
| 3,975.853516
| 88.748474
| 4,005.952148
| 45.213799
| 44.75119
| 3,460.038086
| 77.304977
| 2,008.464844
| 45.216625
| 44.72929
| 3,872.258545
| 86.711449
| 3,922.81665
| 45.209381
| 44.533772
| 3,431.251953
| 77.064034
| 2,006.918457
| 45.21077
| 44.41222
| 3,802.080078
| 85.401733
| 3,844.48291
| 45.191273
| 44.688282
| 3,436.72583
| 77.0261
| 2,016.598633
| 45.204273
| 44.860172
| 3,952.006348
| 88.000237
| 3,930.860352
| 45.172604
| 45.111214
| 2,041.567749
|
2.769478
| 0
| 2.483879
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 278.117493
| 4.226847
| 220,200,960
| 65,536
| 0.001558
| 64
| 420
| 0
| 250,589.458957
| 8,192
| 8,192
| 8,192
| 8,192
| 1.093285
| 2.317593
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,996.036987
| 1,411.414673
| 44.751827
| 3,406.759277
| 77.537971
| 2,025.771973
| 45.230633
| 45.063877
| 4,001.771729
| 89.418694
| 4,016.380615
| 45.227211
| 44.938072
| 3,493.544189
| 77.628853
| 2,016.816528
| 45.22504
| 44.853024
| 3,885.219727
| 87.118683
| 3,940.37207
| 45.22197
| 44.843254
| 3,478.426514
| 77.562492
| 2,017.56958
| 45.221973
| 44.85918
| 3,885.420166
| 87.03743
| 3,873.047363
| 45.209278
| 44.86779
| 3,474.615234
| 77.439682
| 2,023.983154
| 45.215752
| 44.81337
| 3,955.335693
| 88.238045
| 3,922.816895
| 45.188538
| 45.122726
| 2,041.239258
|
2.612691
| 0
| 2.446553
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 279.209106
| 3.868529
| 230,686,720
| 65,536
| 0.001632
| 64
| 440
| 0
| 251,491.711326
| 8,192
| 8,192
| 8,192
| 8,192
| 1.066457
| 2.288887
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,989.515381
| 1,403.659058
| 44.968441
| 3,457.553955
| 77.774429
| 2,030.028198
| 45.232372
| 45.139275
| 4,034.000488
| 89.571259
| 4,023.698486
| 45.215588
| 44.829494
| 3,472.702148
| 77.521973
| 2,009.822876
| 45.220261
| 44.987411
| 3,981.010498
| 88.3564
| 3,912.218018
| 45.210464
| 44.840492
| 3,473.810547
| 77.5177
| 2,014.749634
| 45.218307
| 44.735374
| 3,931.927734
| 87.353409
| 3,881.867432
| 45.197811
| 44.817787
| 3,481.003906
| 77.591576
| 2,021.778564
| 45.214535
| 45.105049
| 4,012.301025
| 88.959221
| 3,967.248779
| 45.181606
| 45.144428
| 2,042.500122
|
2.630618
| 0
| 2.399622
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 280.261627
| 4.092134
| 241,172,480
| 65,536
| 0.001706
| 64
| 460
| 0
| 252,240.786649
| 8,192
| 8,192
| 8,192
| 8,192
| 1.043359
| 2.26375
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,998.661743
| 1,408.186646
| 44.950905
| 3,457.096191
| 77.540771
| 2,025.762573
| 45.223976
| 44.838924
| 3,967.864502
| 88.622276
| 4,024.515381
| 45.21373
| 44.916451
| 3,472.505615
| 77.477318
| 2,002.992432
| 45.215988
| 44.998093
| 3,957.244385
| 87.850784
| 3,907.601318
| 45.215031
| 44.788517
| 3,473.003906
| 77.463356
| 2,007.58667
| 45.214798
| 44.829086
| 3,899.875977
| 87.076981
| 3,899.912842
| 45.203453
| 44.85915
| 3,493.007813
| 77.721916
| 2,017.508423
| 45.212528
| 44.918774
| 3,986.417725
| 88.454155
| 3,987.551025
| 45.166973
| 45.157871
| 2,044.232544
|
2.538602
| 0
| 2.500082
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 281.551178
| 3.265527
| 251,658,240
| 65,536
| 0.001781
| 64
| 480
| 0
| 252,963.232662
| 8,192
| 8,192
| 8,192
| 8,192
| 1.039091
| 2.260105
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 2,008.236694
| 1,411.05542
| 45.068672
| 3,494.746094
| 78.112801
| 2,036.623413
| 45.244335
| 45.159237
| 4,035.511475
| 89.763466
| 4,041.180908
| 45.235382
| 44.993126
| 3,505.351074
| 77.9095
| 2,027.093628
| 45.236732
| 44.952827
| 3,976.319336
| 88.38176
| 3,958.091797
| 45.230846
| 45.055721
| 3,507.425049
| 77.950645
| 2,029.22583
| 45.235985
| 45.023918
| 3,912.449951
| 87.786926
| 3,952.237549
| 45.228046
| 45.181141
| 3,521.417725
| 78.067688
| 2,036.142944
| 45.235973
| 45.109417
| 4,039.502686
| 89.516487
| 4,003.661865
| 45.219723
| 45.212326
| 2,045.100586
|
2.528949
| 0
| 2.369133
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 282.805298
| 3.430302
| 262,144,000
| 65,536
| 0.001855
| 64
| 500
| 0
| 253,687.972676
| 8,192
| 8,192
| 8,192
| 8,192
| 0.98545
| 2.206796
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 2,002.567505
| 1,408.890381
| 45.005383
| 3,475.081543
| 77.914177
| 2,030.78064
| 45.236877
| 45.104706
| 4,017.535645
| 89.482887
| 4,024.710205
| 45.229645
| 44.994038
| 3,504.776367
| 77.882454
| 2,018.428833
| 45.230621
| 45.058552
| 3,948.174805
| 87.970032
| 3,955.111572
| 45.227169
| 45.097481
| 3,507.92749
| 77.925545
| 2,025.075806
| 45.230198
| 44.786488
| 3,936.344971
| 87.612083
| 3,870.808838
| 45.213509
| 45.061558
| 3,504.305908
| 77.827728
| 2,026.011963
| 45.221901
| 45.050316
| 4,004.692627
| 88.959671
| 3,941.344971
| 45.194191
| 45.157921
| 2,043.500854
|
2.62236
| 0
| 2.319871
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 284.214996
| 3.339241
| 272,629,760
| 65,536
| 0.001929
| 64
| 520
| 0
| 254,303.60887
| 8,192
| 8,192
| 8,192
| 8,192
| 0.94149
| 2.159912
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,992.931763
| 1,407.256714
| 44.922417
| 3,465.544189
| 77.93177
| 2,031.526123
| 45.236763
| 45.174019
| 4,045.464111
| 89.64962
| 4,007.259521
| 45.218109
| 44.917976
| 3,489.055664
| 77.711029
| 1,998.745361
| 45.219406
| 45.057117
| 4,008.825195
| 88.823303
| 3,907.482666
| 45.216953
| 44.929729
| 3,498.848145
| 77.812683
| 2,013.19104
| 45.222637
| 44.860043
| 3,931.248291
| 87.432747
| 3,863.592529
| 45.20657
| 44.991028
| 3,500.609619
| 77.79303
| 2,022.606323
| 45.216003
| 44.935749
| 3,968.21875
| 88.226448
| 3,920.237305
| 45.172741
| 45.173203
| 2,042.387573
|
2.540878
| 0
| 2.33543
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 285.789215
| 3.620048
| 283,115,520
| 65,536
| 0.002003
| 64
| 540
| 0
| 254,829.251205
| 8,192
| 8,192
| 8,192
| 8,192
| 0.924922
| 2.143721
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 2,001.269897
| 1,406.792969
| 44.976364
| 3,472.436523
| 77.859299
| 2,027.019165
| 45.231113
| 45.113335
| 4,026.301514
| 89.471054
| 4,013.638184
| 45.224648
| 45.150425
| 3,512.192139
| 77.989738
| 2,014.485229
| 45.230392
| 45.119308
| 3,985.038818
| 88.475342
| 3,927.46167
| 45.22422
| 45.124947
| 3,503.42749
| 77.903305
| 2,016.920166
| 45.226612
| 44.898445
| 3,969.034424
| 88.072021
| 3,849.730225
| 45.207172
| 45.120396
| 3,504.542236
| 77.839546
| 2,025.021606
| 45.219261
| 44.853077
| 3,972.868408
| 88.257523
| 3,921.938477
| 45.181305
| 45.133377
| 2,042.392334
|
2.622222
| 0
| 2.291544
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 287.494263
| 3.10755
| 293,601,280
| 65,536
| 0.002077
| 64
| 560
| 0
| 255,318.444708
| 8,192
| 8,192
| 8,192
| 8,192
| 0.893954
| 2.1119
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 2,002.685303
| 1,409.379517
| 45.069389
| 3,489.991943
| 77.962219
| 2,028.504395
| 45.231907
| 44.97797
| 4,012.666992
| 89.291664
| 4,028.987305
| 45.220764
| 45.048203
| 3,497.368896
| 77.832199
| 2,005.294312
| 45.223804
| 45.108994
| 3,982.526367
| 88.553352
| 3,905.811035
| 45.222607
| 44.935532
| 3,495.68042
| 77.80246
| 2,003.866211
| 45.218914
| 44.928802
| 3,915.160889
| 87.632179
| 3,845.235596
| 45.20845
| 45.042282
| 3,499.108154
| 77.791595
| 2,020.835815
| 45.214554
| 45.152821
| 4,030.941895
| 89.267456
| 3,962.96582
| 45.177708
| 45.173306
| 2,042.811646
|
2.344009
| 0
| 2.313026
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 289.219635
| 3.333708
| 304,087,040
| 65,536
| 0.002151
| 64
| 580
| 0
| 255,720.945216
| 8,192
| 8,192
| 8,192
| 8,192
| 0.889694
| 2.10679
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1,991.565308
| 1,406.252197
| 44.940853
| 3,458.116211
| 77.788467
| 2,027.431519
| 45.232082
| 45.077019
| 4,006.949951
| 89.414146
| 4,025.932129
| 45.229935
| 45.17989
| 3,514.951904
| 78.028534
| 2,017.02356
| 45.232811
| 45.106033
| 3,974.262451
| 88.322662
| 3,946.166992
| 45.228317
| 45.195457
| 3,521.765381
| 78.103065
| 2,024.209351
| 45.235004
| 45.007675
| 3,982.036377
| 88.405983
| 3,922.212891
| 45.218113
| 45.139309
| 3,525.162354
| 78.105614
| 2,026.944214
| 45.225491
| 45.114616
| 4,023.548096
| 89.205795
| 3,959.779053
| 45.192211
| 45.193558
| 2,043.390381
|
2.536633
| 0
| 2.280575
| 0
| 0
| 24.462925
| 211,904,512
| 211,812,352
| 205,520,896
| 56
| 291.144684
| 3.176618
| 314,572,800
| 65,536
| 0.002226
| 64
| 600
| 0
| 256,243.058821
| 8,192
| 8,192
| 8,192
| 8,192
| 0.864328
| 2.082856
| 0
| 0
| 0
| 0
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 1
| 2,008.65686
| 1,407.285278
| 45.082706
| 3,491.775879
| 78.000854
| 2,028.374634
| 45.234703
| 45.125671
| 4,022.549561
| 89.473076
| 4,006.842041
| 45.220158
| 45.122219
| 3,505.006348
| 77.908951
| 2,006.106689
| 45.223877
| 45.08197
| 3,994.25415
| 88.595604
| 3,905.425049
| 45.219433
| 45.179813
| 3,509.094238
| 77.97551
| 2,017.565308
| 45.229168
| 44.95369
| 3,901.039307
| 86.999878
| 3,834.873779
| 45.204964
| 45.188953
| 3,515.250977
| 77.978111
| 2,023.312378
| 45.221592
| 45.105286
| 4,015.1604
| 89.000938
| 3,878.013428
| 45.170361
| 45.174049
| 2,043.362793
|
End of preview.
YAML Metadata
Warning:
empty or missing yaml metadata in repo card
(https://huggingface.co/docs/hub/datasets-cards)
##About
Publicly available artifacts from OpenAI's paper Weight-sparse transformers have interpretable circuits
https://cdn.openai.com/pdf/41df8f28-d4ef-43e9-aed2-823f9393e470/circuit-sparsity-paper.pdf
- Downloads last month
- 17