I think modern AI and machine learning have real potential, but the deeper problem is that people don’t understand sentience in general. Once one understands sentience, one can understand what one is attempting to mimic.
One of the largest problems facing anyone working in this area is the magical belief in ‘mind’: the idea that the brain is a magic organ that somehow projects ‘mind’. While there may be some truth at the upper end of this, it is important to realize that brains work just fine in our current 3D reality, and whenever we play the ‘magic’ card, it’s usually because we fail to understand something well enough.
I’ve taken a run at this myself; I’m a complex-systems scientist and collective intelligence researcher. The first stop on the bus is figuring out how knowledge is structured. Once you have that, it becomes a Natural Language Processing problem: deconstructing text for semantic and conceptual meaning. But you must understand the structure of the knowledge first, before you attempt to extract it.
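To make the idea concrete, here is a minimal, self-contained sketch of what “deconstructing text for conceptual meaning” could look like at its very simplest. This is my own toy illustration, not the author’s method: it treats two content words appearing in the same sentence as conceptually linked, and counts those links. The function name, stopword list, and co-occurrence heuristic are all assumptions made purely for demonstration.

```python
import re
from collections import defaultdict
from itertools import combinations

# Words too generic to carry conceptual weight in this toy example
# (a real system would use a proper stopword list and parser).
STOPWORDS = {"the", "a", "an", "of", "and", "to", "is", "in", "that", "it"}

def concept_pairs(text):
    """Deconstruct text into co-occurring concept pairs, sentence by sentence.

    A crude stand-in for semantic analysis: any two content words that
    share a sentence are counted as a linked concept pair.
    """
    graph = defaultdict(int)
    for sentence in re.split(r"[.!?]+", text):
        words = sorted({w for w in re.findall(r"[a-z]+", sentence.lower())
                        if w not in STOPWORDS})
        for a, b in combinations(words, 2):
            graph[(a, b)] += 1
    return dict(graph)

pairs = concept_pairs("Brains process knowledge. Knowledge has structure.")
```

Even this crude pass surfaces a structure: “knowledge” links to both “brains” and “structure”, hinting at the kind of conceptual graph a serious structural approach would build far more carefully.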
See this post on my blog: https://empathy.guru/2019/04/06/what-is-structural-memetics-and-why-does-it-matter/
Read the original article at https://medium.com/@wiseaftertheevent/i-think-that-modern-ai-and-machine-learning-have-certain-potentials-but-i-think-the-deeper-problem-7fc7a97bc4b