SingularityNET @SingularityNET
Transformer-based large language models display striking competencies, yet they lack flexible, dynamic long-term memory, explicit goal systems, and the self-reflective loops needed for open-ended adaptation. Symbolic systems, in contrast, offer clarity and self-modification but have struggled to match the perceptual breadth of deep learning. OpenCog Hyperon seeks to bridge this divide by embedding neural, logical, and evolutionary processes inside a single metagraph whose elements can rewrite one another in real time.
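The key idea here — a graph whose elements include rewrite rules that act on the graph itself — can be illustrated with a minimal sketch. This is not the Hyperon API; the `Atom`, `Metagraph`, and `TransitivityRule` names below are hypothetical, and the example shows only one symbolic rule (inheritance transitivity) rewriting the store it lives in:

```python
# Hypothetical sketch of a self-rewriting metagraph (NOT the Hyperon API).
# Atoms are nodes/links in one shared space; rewrite rules are themselves
# atoms stored in that space, and each step lets them transform it.

class Atom:
    def __init__(self, kind, name, targets=()):
        self.kind = kind              # e.g. "Concept", "Inheritance", "Rule"
        self.name = name
        self.targets = list(targets)  # outgoing links to other atoms

class Metagraph:
    def __init__(self):
        self.atoms = []

    def add(self, atom):
        self.atoms.append(atom)
        return atom

    def step(self):
        """Apply every Rule atom in the space to every atom once."""
        rules = [a for a in self.atoms if a.kind == "Rule"]
        for rule in rules:
            for atom in list(self.atoms):
                rule.apply(atom, self)

class TransitivityRule(Atom):
    """If A->B and B->C are Inheritance links, add the A->C link."""
    def __init__(self):
        super().__init__("Rule", "transitivity")

    def apply(self, atom, space):
        if atom.kind != "Inheritance":
            return
        a, b = atom.targets
        for other in list(space.atoms):
            if other.kind == "Inheritance" and other.targets[0] is b:
                c = other.targets[1]
                exists = any(l.kind == "Inheritance" and l.targets == [a, c]
                             for l in space.atoms)
                if not exists:
                    space.add(Atom("Inheritance", f"{a.name}->{c.name}", [a, c]))

g = Metagraph()
cat, mammal, animal = (g.add(Atom("Concept", n)) for n in ("cat", "mammal", "animal"))
g.add(Atom("Inheritance", "cat->mammal", [cat, mammal]))
g.add(Atom("Inheritance", "mammal->animal", [mammal, animal]))
g.add(TransitivityRule())
g.step()
print(any(a.name == "cat->animal" for a in g.atoms))  # True: the rule rewrote the graph
```

In a real Hyperon instance the rules are far richer — neural, probabilistic-logical, and evolutionary processes all operate on the same shared Atomspace — but the structural point is the same: the rules are data in the graph they rewrite.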
SingularityNET @SingularityNET
Hyperon is a very flexible implementation substrate. However, one of the guiding motives behind its development has been the implementation of a specific cognitive architecture known as PRIMUS. PRIMUS structures the Atomspaces of a Hyperon instance into modules for episodic memory, working memory, procedural memory, perception, and action selection, associating specific learning and reasoning methods with each of these aspects, and is carefully designed for “cognitive synergy” among these methods. The dynamics are designed to mirror human cognitive cycles while also supporting robust long-term background thinking and reflective self-improvement.
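The module decomposition described above can be sketched in miniature. This is a toy illustration, not PRIMUS itself: the `CognitiveAgent` class, its rule table, and its perceive → remember → select-action cycle are hypothetical simplifications of the five modules named in the tweet:

```python
# Toy sketch of a PRIMUS-style module layout (hypothetical, not the real
# architecture): separate stores for perception, working memory, episodic
# memory, and procedural memory, run through a simple cognitive cycle.

from collections import deque

class CognitiveAgent:
    def __init__(self):
        self.perception = deque()  # incoming percepts awaiting processing
        self.working = []          # working memory: small and recent
        self.episodic = []         # episodic memory: long-term event log
        self.procedural = {        # procedural memory: condition -> action
            "obstacle": "turn",
            "goal_visible": "approach",
        }

    def cycle(self, percept):
        # Perception: take in the new percept.
        self.perception.append(percept)
        # Working memory: keep only the few most recent items.
        self.working = (self.working + [percept])[-3:]
        # Episodic memory: record everything for later reflection.
        self.episodic.append(percept)
        # Action selection: match the percept against procedural rules.
        return self.procedural.get(percept, "explore")

agent = CognitiveAgent()
print(agent.cycle("obstacle"))      # turn
print(agent.cycle("nothing"))       # explore
print(agent.cycle("goal_visible"))  # approach
```

In PRIMUS each of these modules is an Atomspace with its own associated learning and reasoning methods, and the point of the design is that the methods reinforce one another ("cognitive synergy") rather than merely coexisting, as this flat dictionary-and-list sketch suggests.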
Dariusz Swierk PhD @swierk
@SingularityNET As I understand it, this is a new cognitive architecture that is additional to what a “typical LLM” has today? BTW: absolutely fascinating.