
CEO Dr. @bengoertzel questions the industry's heavy focus on transformer-based models, arguing that current LLMs are all "doing about the same thing" and are limited by their inability to learn continually from new experiences and update their internal parameters in real time.