
  • Resonant Attention: Prime-Indexed Hypercomplex Mechanism


    [R] Resonant Attention: A Prime-Indexed Hypercomplex Attention Mechanism. This approach replaces standard dot-product attention scoring with a geometrically distinct method: tokens are represented as sparse activations over prime-indexed dimensions, each carrying a complex amplitude and a quaternion orientation. Similarity is computed from three components: Jaccard similarity of the active index sets, quaternion alignment, and phase coherence. The mechanism runs in O(nk) time, which drops to O(n log n) when the sparsity k is O(log n), compared with the typical O(n²) or O(nd) of dense attention, though sparse state management adds higher constant factors. It also supports order-sensitive processing without positional encodings and yields interpretable attention weights, making it well suited to applications where sparsity arises naturally. This matters because it offers a potentially more efficient and more interpretable alternative to traditional attention mechanisms in neural networks.
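    The article does not give the exact scoring formula, but the three components it names can be sketched as follows. This is a hypothetical illustration, assuming each token is a sparse map from a prime index to a (complex amplitude, unit quaternion) pair, and that the three similarity terms are simply multiplied; the real mechanism may combine them differently.

    ```python
    import cmath

    def quat_dot(q1, q2):
        """Dot product of two unit quaternions given as (w, x, y, z) tuples."""
        return sum(a * b for a, b in zip(q1, q2))

    def resonant_score(tok_a, tok_b):
        """Hypothetical pairwise score: Jaccard similarity of the sparse
        prime-index supports, times mean quaternion alignment, times phase
        coherence of the complex amplitudes over the shared primes.
        Cost per pair is O(k) in the number of active primes, giving the
        O(nk) total the article describes for n tokens."""
        primes_a, primes_b = set(tok_a), set(tok_b)
        shared = primes_a & primes_b
        if not shared:
            return 0.0
        # Jaccard similarity of the two sparse supports
        jaccard = len(shared) / len(primes_a | primes_b)
        # Quaternion alignment: |q_a . q_b| averaged over shared primes
        align = sum(abs(quat_dot(tok_a[p][1], tok_b[p][1]))
                    for p in shared) / len(shared)
        # Phase coherence: magnitude of the mean phase-difference unit vector
        coherence = abs(sum(
            cmath.exp(1j * (cmath.phase(tok_a[p][0]) - cmath.phase(tok_b[p][0])))
            for p in shared)) / len(shared)
        return jaccard * align * coherence

    # A token perfectly aligned with itself scores 1.0; disjoint supports score 0.
    t = {2: (1 + 1j, (1.0, 0.0, 0.0, 0.0)), 3: (0.5j, (0.0, 1.0, 0.0, 0.0))}
    u = {5: (1.0, (1.0, 0.0, 0.0, 0.0))}
    print(resonant_score(t, t))  # → 1.0
    print(resonant_score(t, u))  # → 0.0
    ```

    Because each token touches only k prime dimensions, an inverted index from prime to tokens lets attention visit only overlapping pairs, which is where the sub-quadratic complexity comes from.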

    Read Full Article: Resonant Attention: Prime-Indexed Hypercomplex Mechanism