Generative AI, Transformers, and a Roadmap for AGI (Artificial General Intelligence)



We differentiate between “classic” generative AI and the more recent transformer-based generative AI: both rely on (1) the reverse Kullback-Leibler divergence, (2) Bayesian conditional probabilities, and (3) statistical mechanics, while transformers additionally include (4) multi-head attention and (5) positional encoding.

To move from any form of generative AI to AGI (artificial general intelligence), we need to add new components, specifically (1) a symbolic or ontology component, (2) feedback between the signal-level processing layers (transformer-based or otherwise) and the ontologies, ideally mediated by a CORTECON (COntent-Retentive, TEmporally-CONnected neural network), (3) a mediation control system that fine-tunes activation of CORTECON nodes to obtain more precise activation of ontology nodes, (4) a goal-and-reasoning system, and (5) a differentiation between the self and the external world (“Psi”). Given that work on the essential CORTECON mediation layer is underway, realization of AGIs is not far off.
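
For readers who want to see the first shared ingredient concretely, here is a minimal Python sketch of the reverse Kullback-Leibler divergence for discrete distributions. The function name and toy distributions are ours, purely for illustration, and are not drawn from any Themesis code; the point is only the direction of the divergence, which is what makes it “reverse.”

```python
import numpy as np

def reverse_kl(p, q, eps=1e-12):
    """Reverse Kullback-Leibler divergence D_KL(q || p) for discrete distributions.

    In variational settings, q is the model (approximating) distribution and
    p is the target distribution; the reverse direction penalizes q for putting
    mass where p has little, which gives the familiar mode-seeking behavior.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(q * np.log(q / p)))

# Toy example: a two-mode target and a single-mode approximation.
p_target = [0.45, 0.05, 0.45, 0.05]
q_model  = [0.85, 0.05, 0.05, 0.05]

print("reverse KL D(q||p):", reverse_kl(p_target, q_model))
print("forward KL D(p||q):", reverse_kl(q_model, p_target))  # arguments swapped
```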
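And here, under the same caveat, is a small NumPy sketch of the two ingredients that transformers add, multi-head attention and sinusoidal positional encoding. The weights are random rather than trained, and the dimensions are arbitrary; this is a shape-level illustration, not an implementation of any particular transformer.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding in the style of the original Transformer."""
    positions = np.arange(seq_len)[:, None]
    dims = np.arange(d_model)[None, :]
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    """Scaled dot-product attention split across several heads (random weights)."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Reshape to (num_heads, seq_len, d_head) so each head attends independently.
    split = lambda t: t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    scores = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))  # (heads, seq, seq)
    heads = scores @ v                                            # (heads, seq, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

rng = np.random.default_rng(0)
tokens = rng.standard_normal((6, 16))          # 6 token embeddings, d_model = 16
tokens = tokens + positional_encoding(6, 16)   # inject position information
out = multi_head_attention(tokens, num_heads=4, rng=rng)
print(out.shape)  # (6, 16)
```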

Links to the appropriate Themesis RESOURCES will be added shortly.

Here’s the link to the Themesis Academy, which in turn links to the actual course offerings on themesis.thinkific.com:

Academy

Please OPT-IN with Themesis on the About page to get word AS SOON as new YouTubes, blogs, and short courses are released:
Opt-In HERE: www.themesis.com/themesis/

Subscribe to the Themesis YouTube channel easily – click this link: https://www.youtube.com/@themesisinc.4045?sub_confirmation=1
