It seems that the more we investigate, the more likely it is that we are just a more complex, less precise (analog signals rather than digital), and perhaps less language-specific LLM like ChatGPT. This is hard for most people to grasp because it goes against deeply held gut feelings. I might be wrong, but the notion of "understanding" may simply be a matter of hitting a critical mass of processing power. We are very similar: we take in external stimuli and produce responses. If one day a machine's responses become indistinguishable from a human's, whether the black box "understands" the concept is irrelevant. I don't consider that a bad thing, but I suspect a significant portion of the population will.
Libertarian Free Will is Still an Incoherent Concept
Last modified by Victor Zhang on 03:18, 24/10/2023