Collection: Reactive Transformer PoC Supervised Models (by Reactive AI). Experimental Proof-of-Concept Reactive Transformer models trained on simple synthetic datasets • 5 items • Updated Oct 9 • 1
Collection: Reactive Transformer PoC - RxT-Alpha Supervised Models. Experimental stateful real-time Reactive Transformer (RxT) models after supervised training stages • 5 items • Updated Oct 8 • 2
Article: Reactive Transformer (RxT): Fixing the Memory Problem in Conversational AI • Oct 8 • 6
Paper: TensorBLEU: Vectorized GPU-based BLEU Score Implementation for Per-Sentence In-Training Evaluation • arXiv:2510.05485 • Published Oct 7 • 7
Paper: Reactive Transformer (RxT) - Stateful Real-Time Processing for Event-Driven Reactive Language Models • arXiv:2510.03561 • Published Oct 3 • 24
Paper: Sparse Query Attention (SQA): A Computationally Efficient Attention Mechanism with Query Heads Reduction • arXiv:2510.01817 • Published Oct 2 • 15
Collection: Sparse Query Attention (SQA) Research. Experimental models with Sparse Query Attention layers, reducing training time/cost by ~3-10% compared to GQA & MQA at the same performance level • 10 items • Updated Oct 3 • 1
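As a rough illustration of the query-head-reduction idea behind SQA (a minimal NumPy sketch, not the collection's actual implementation): where GQA/MQA shrink the key/value heads, SQA shrinks the number of query heads, so the QK^T score tensor has fewer head slices and the score/value FLOPs drop proportionally. The GQA-style repetition used to pair KV heads with query heads, and all names and shapes below, are assumptions of this sketch.

```python
import numpy as np

def sqa_attention(x, params, n_q_heads, n_kv_heads):
    """Sketch of Sparse Query Attention: fewer query heads than a full
    multi-head layer, so fewer attention score matrices are computed.
    KV heads are repeated to pair with query heads (an assumption here,
    mirroring GQA-style grouping)."""
    seq, d_model = x.shape
    d_head = params["Wq"].shape[1] // n_q_heads
    q = (x @ params["Wq"]).reshape(seq, n_q_heads, d_head)
    k = (x @ params["Wk"]).reshape(seq, n_kv_heads, d_head)
    v = (x @ params["Wv"]).reshape(seq, n_kv_heads, d_head)
    rep = n_q_heads // n_kv_heads
    k = np.repeat(k, rep, axis=1)  # (seq, n_q_heads, d_head)
    v = np.repeat(v, rep, axis=1)
    # Scaled dot-product scores per (reduced) query head.
    scores = np.einsum("qhd,khd->hqk", q, k) / np.sqrt(d_head)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    out = np.einsum("hqk,khd->qhd", w, v).reshape(seq, n_q_heads * d_head)
    return out @ params["Wo"]  # project the smaller head output back to d_model

# Tiny demo with illustrative sizes: 4 query heads, 4 KV heads, d_model 32.
rng = np.random.default_rng(0)
d_model, n_q, n_kv, d_head, seq = 32, 4, 4, 8, 6
params = {
    "Wq": rng.standard_normal((d_model, n_q * d_head)) * 0.1,
    "Wk": rng.standard_normal((d_model, n_kv * d_head)) * 0.1,
    "Wv": rng.standard_normal((d_model, n_kv * d_head)) * 0.1,
    "Wo": rng.standard_normal((n_q * d_head, d_model)) * 0.1,
}
x = rng.standard_normal((seq, d_model))
y = sqa_attention(x, params, n_q, n_kv)
print(y.shape)  # (6, 32)
```

The saving comes from the first einsum: its leading head dimension is n_q_heads rather than the full head count, which is where the reported ~3-10% training-cost reduction would accrue.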