Walid Saba’s Post

Senior Research Scientist

“Logic says, if everything is learned from sensory input, then I'm allowed to learn it differently from you, because we have different experiences. But that’s not true. Many of the most important things in life we're not allowed to learn differently—for example, that the circumference of a circle is 2πr. Things that we learn are accidental and immaterial. The stuff that matters, the stuff that makes the universe the way it is, are not learned. They are.” ... The top-down approach emblematic of symbolic AI is flawed because there is no foundational axiom (or set of agreed-upon general principles) to work from—at least not when it comes to language and how our minds externalize thoughts as language. “When it comes to language and the mind, we have nothing,” Saba says. “Nobody knows anything about the mind and how the language of thought works. So anybody doing top-down is playing God.” https://lnkd.in/eyPR9p6g

EAI Researcher: Are Language Models Missing Something? - Institute for Experiential AI

https://ai.northeastern.edu
