Life is impermanent and conditioned.
Our six senses (sight, hearing, smell, taste, touch, and thought/ideas) are interconnected and together create the "sense of a self" (the "me", the "I"), and that is what so-called AI explores, nothing more, nothing less, because a machine has no capability to explore anything else.
The self is an illusion; we are just developing recursive self-similarity. Each and every time we tap into a new virtual/artificial layer of creation, we leave behind a very important part of the previous layer.
It begs the question: Where are we going?
Can a machine see something, hear something, smell something, taste something, touch something, and produce an output from it?
Yes, to all of them.
IBM "predicted" around 2010 that machines would be able to mimic all human senses.
Heard about Smell-O-Vision? They were installing it in some cinemas in the US a while ago.. not sure if it succeeded.
Intel Labs (the Neuromorphic Computing Lab) was developing a mathematical algorithm on a computer chip that mimics how the brain's neural network actually works when you smell something.
Basically, our receptor cells are stimulated when we smell the molecules of something.. those cells then carry the signals on to the part of the brain responsible for the olfactory system.
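The idea can be sketched as a toy model (this has nothing to do with Intel's actual chip, and every odor name and activation value below is invented for illustration): represent each smell as a pattern of receptor activations, and recognize a new smell by finding the closest known pattern.

```python
# Toy sketch of odor recognition by receptor-activation patterns.
# All odors and activation levels here are made up; real neuromorphic
# hardware uses spiking neurons, not a simple distance comparison.
import math

# Each "odor" is a pattern of activation levels across 4 hypothetical receptor types.
KNOWN_ODORS = {
    "coffee": [0.9, 0.1, 0.4, 0.0],
    "lemon":  [0.1, 0.8, 0.2, 0.3],
    "smoke":  [0.3, 0.0, 0.9, 0.6],
}

def classify(sample):
    """Return the known odor whose receptor pattern is closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(KNOWN_ODORS, key=lambda name: dist(KNOWN_ODORS[name], sample))

print(classify([0.85, 0.15, 0.35, 0.05]))  # a noisy coffee-like pattern -> "coffee"
```

The point of the sketch is only the principle the lab was chasing: a smell is not one signal but a pattern across many receptors, and recognition means matching that pattern.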
What about a PIM (Product Information Management system)? It is a big data dictionary and it obviously contains product data. The PIM receives an image, cross-checks it against the correct product, and outputs the related data, which could drive certain types of vibration in a device that gives, more or less, the sensation of touching something familiar.. like a rough surface (rocks).. and so on..
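That pipeline, image label in, vibration pattern out, can be sketched minimally. The catalogue entries, texture names, and vibration values below are all hypothetical, not any real PIM or haptic API:

```python
# Hypothetical sketch: look up a recognized product in a PIM-style dictionary,
# then map its surface texture to a vibration setting for an imagined haptic device.

PIM = {
    "granite_tile": {"texture": "rough"},
    "silk_scarf":   {"texture": "smooth"},
}

# Texture -> (amplitude 0..1, frequency in Hz); values are invented.
VIBRATION = {
    "rough":  (0.9, 40),    # strong, low-frequency buzz
    "smooth": (0.2, 180),   # faint, high-frequency hum
}

def haptic_pattern(recognized_label):
    """Cross-check the recognized product in the PIM and return a vibration setting."""
    product = PIM[recognized_label]
    return VIBRATION[product["texture"]]

print(haptic_pattern("granite_tile"))  # -> (0.9, 40)
```

The interesting design choice is that the "touch" never comes from the image itself; it comes from the product data the image is matched to.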
Taste - what about 3D printing food?
There is so much development out there, so much data.. but if I may, I will ask again:
Where is the GHOST in the machine?