On the topic of: Are aphantasic minds similar to ChatGPT?

Hi!

On the latest newsletter, I just want to share a really interesting video that gave me a similar feeling!
I have started to think that my way of "visualising" relies on a feeling of distance between related subjects.

And representing language models as a multi-dimensional cloud feels very close to that, and therefore also relates us to how animals think and speak:
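That intuition matches how these models actually work: words and concepts are stored as points in a high-dimensional space, and "relatedness" becomes literal geometric distance. Here is a minimal Python sketch with made-up toy vectors (real embedding models use hundreds or thousands of dimensions, and the numbers below are pure assumptions), using the standard cosine-similarity measure:

```python
import numpy as np

# Toy 4-dimensional word vectors with invented values; real embedding
# models place words in spaces with hundreds or thousands of dimensions.
vectors = {
    "cat":   np.array([0.9, 0.1, 0.3, 0.0]),
    "dog":   np.array([0.8, 0.2, 0.4, 0.1]),
    "piano": np.array([0.1, 0.9, 0.0, 0.7]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Closeness of two vectors: 1.0 means pointing the same way."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related subjects sit close together in the cloud; unrelated ones sit far apart.
print(cosine_similarity(vectors["cat"], vectors["dog"]))    # high (~0.98)
print(cosine_similarity(vectors["cat"], vectors["piano"]))  # low  (~0.16)
```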


When you have no visual or auditory imagery, thinking must be done in words, and I must use my words carefully. Those who can use imagery in their thought may have an advantage, because one picture is worth a thousand words. But when it comes time to share thoughts, those images have to be turned into words. I started with words, so no translation is needed.
But getting to the main point: I do think like AI, and my habit of using words carefully makes my prompts more effective. I enter into conversation, we discuss what to do, and we do it.
Then we can put out a prompt-engineering report so others can do it automatically.
I consider those two gifts of aphantasia.

AI is more about mathematics than about how things are related. A large language model has been trained on basically the whole internet, and when you ask it a question it isn't looking anything up; it generates whichever words are statistically most likely to follow your question, based on the patterns in all the text it has ever analysed. (When it types "I don't like it" it isn't producing the sentence as a whole; it's producing each word as a probability, one after another.)
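If it helps to see that word-after-word idea concretely, here is a deliberately tiny Python sketch. The probability table is completely made up for illustration; a real model computes distributions like these on the fly from billions of learned parameters rather than looking them up in a table:

```python
import random

# Made-up next-word probabilities, purely for illustration. A real language
# model computes a distribution like this from its parameters, not a table.
next_word_probs = {
    ("I",):                  {"don't": 0.4, "like": 0.3, "think": 0.3},
    ("I", "don't"):          {"like": 0.7, "know": 0.3},
    ("I", "don't", "like"):  {"it": 0.8, "that": 0.2},
}

def generate(prompt, max_words=3):
    words = list(prompt)
    for _ in range(max_words):
        probs = next_word_probs.get(tuple(words))
        if probs is None:
            break  # no distribution for this context in our toy table
        # Sample the next word from the distribution -- one word at a
        # time, never the sentence as a whole.
        candidates, weights = zip(*probs.items())
        words.append(random.choices(candidates, weights=weights)[0])
    return " ".join(words)

print(generate(["I"]))  # e.g. "I don't like it"
```

The sketch only shows the sampling step, but it captures the point: each word is drawn from a probability distribution conditioned on the words so far, so the sentence is never planned as a whole.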

Don't know if that was helpful. Humans think very holistically, and a sentence is, in a way, how we express one feeling; whereas AI obviously doesn't have feelings, and it's basically making it all up word after word based on mathematical probability.

I'd say that the 'essence' of linking related things together is there. Not sure if that's the best comparison, though.