Hell Fast
@hellfast.bsky.social
Deep networks = Compression + Association

Multi-Layer = Compression
Self-Attention = Association

Layers = semantic compression (turn raw data into abstract representations)
Attention = contextual association (relate parts to make sense of the whole)
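The two roles named above can be sketched in a few lines of NumPy. This is a minimal illustration, not any particular model: `self_attention` mixes token vectors by pairwise relevance (contextual association), and `mlp_layer` projects into a smaller hidden space (semantic compression). All function names, dimensions, and weights here are made up for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Association: every token attends to every other token,
    # weighting values V by query-key relevance.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

def mlp_layer(X, W, b):
    # Compression: map features into a smaller, more abstract space.
    return np.maximum(0.0, X @ W + b)  # ReLU

rng = np.random.default_rng(0)
d_in, d_hidden = 8, 4                        # hidden < input: compression
X = rng.standard_normal((5, d_in))           # 5 tokens, 8 features each
Wq, Wk, Wv = (rng.standard_normal((d_in, d_in)) for _ in range(3))

associated = self_attention(X, Wq, Wk, Wv)   # (5, 8): tokens related in context
compressed = mlp_layer(associated,
                       rng.standard_normal((d_in, d_hidden)),
                       np.zeros(d_hidden))   # (5, 4): abstract representation
print(associated.shape, compressed.shape)
```

Stacking these two steps repeatedly is, on this reading, what a deep Transformer-style network does: alternate association across tokens with compression within each token.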
Hell Fast · Jul 27, 2025
Four things in total:
1. Learning: compression and association
2. Prediction: decompression and association
3. Compression: abstraction of information
4. Association: computation and reasoning