100x less compute power with GPT-level LLM performance: How a little-known open source project could help solve the GPU power conundrum – RWKV looks promising, but challenges remain

Recurrent neural networks (RNNs) are a class of neural network architecture widely used in deep learning. Unlike traditional feedforward networks, RNNs maintain a memory that records information about what has been computed so far. In other words, they use knowledge from previous inputs to influence the output they produce next.

RNNs are called “recurrent” because they perform the same operation on each element in a sequence, with each output depending on the previous computations. RNNs are still used to power smart technologies such as Apple’s Siri and Google Translate.
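To make that recurrence concrete, here is a minimal NumPy sketch of a vanilla RNN cell. The weight matrices, sizes, and the rnn_step helper are illustrative assumptions for this article, not the internals of Siri, Google Translate, or RWKV.

```python
import numpy as np

# Illustrative vanilla RNN cell (names and dimensions are assumptions).
rng = np.random.default_rng(0)

input_size, hidden_size = 8, 16
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input -> hidden
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden (the "memory")
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One recurrent step: the new state depends on the current input
    and on everything computed so far, carried in h_prev."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# The same operation is applied to every element of the sequence,
# and the state from one step feeds back into the next.
sequence = rng.standard_normal((5, input_size))  # 5 time steps
h = np.zeros(hidden_size)                        # memory starts empty
for x_t in sequence:
    h = rnn_step(x_t, h)

print(h.shape)  # (16,): a fixed-size summary of the whole sequence
```

Because each step only needs the current input and the previous state, the memory footprint stays constant no matter how long the sequence is, which is the property that makes the recurrent approach attractive for efficiency.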