{"id":2523,"date":"2020-06-28T22:28:06","date_gmt":"2020-06-28T22:28:06","guid":{"rendered":"http:\/\/funkboxing.com\/wordpress\/?p=2523"},"modified":"2021-01-03T17:51:10","modified_gmt":"2021-01-03T17:51:10","slug":"is-deception-the-origin-of-self","status":"publish","type":"post","link":"http:\/\/funkboxing.com\/wordpress\/?p=2523","title":{"rendered":"Is Deception the Origin of Self?"},"content":{"rendered":"\n<figure class=\"wp-block-embed-youtube wp-block-embed is-type-video is-provider-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"SALWITAY - Is Deception the Origin of Self?\" width=\"695\" height=\"391\" src=\"https:\/\/www.youtube.com\/embed\/4YGo-qLfm1k?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe>\n<\/div><figcaption>Classic Balsa Glider &#8211; Flight technology perfected.<\/figcaption><\/figure>\n\n\n\n<p>As far as humans know, we have the most advanced theory of mind of any being. We did kind of come up with the idea, though. The idea that other beings have inner lives comparable to our own is the basis for empathy, and cooperation, and deceit, and pretty much everything we associate with being a conscious being living among other conscious beings.<\/p>\n\n\n\n<p>We base our whole concept of intelligence on a mind\u2019s ability to create an identity and internal awareness of itself as an agent distinct from its surroundings and other creatures. It doesn\u2019t make much sense at all to think of <em>awareness<\/em> without a <em>self<\/em>.<\/p>\n\n\n\n<p>I think experience and awareness are emergent properties of massively integrated computation and memory of environmental data streams conveyed by the senses. These integrations form a simulation space for planning and prediction. 
The simulation space may form a representation of the entity hosting it, and even of the simulation itself. These internal representations of the system are the basis for a system\u2019s awareness of itself as a distinct self.<\/p>\n\n\n\n<p>But what defines the parameters of the \u2018self\u2019 the computational system identifies? A mirror test can supposedly demonstrate an animal\u2019s capacity to visually distinguish itself from another animal, and it\u2019s a fair assumption that any animal with this capacity identifies its \u2018self\u2019 with its physical body.<\/p>\n\n\n\n<p>A human body is a distinct unit, like all other bodies that support \u2018minds\u2019 we identify as comparable to ours. Even a cephalopod, which is probably the most alien intelligence to our own that we can still recognize as intelligent, is clearly built on a distinct individual body unit, just like us. The self is defined by its separation from its environment. The boundaries of a self are the boundaries between the supporting computational system and the environment. In our experience, these boundaries are clearly associated with a unit body.<\/p>\n\n\n\n<p>The value of this arrangement is obviously survival of the system that supports the self. The self\u2019s ability to distinguish itself from its environment is what allows it to plan and interact with that environment in complex ways. The capacity for complex interaction is necessary when there are multiple individual body units competing for and utilizing one another as resources.<\/p>\n\n\n\n<p>Predation may be the initial catalyst for more advanced forms of self-awareness in both predator and prey species. When one being must consume and destroy another to survive, it\u2019s to the advantage of both to be pretty clear about where one creature ends and the other begins.<\/p>\n\n\n\n<p>Both predator and prey have an incentive to predict one another\u2019s behavior. 
The more sophisticated a mind each has, the more accurate and useful its predictions and behavioral reactions become.<\/p>\n\n\n\n<p>Predation requires a clear understanding of the boundaries of the structures supporting each \u2018self\u2019, but does not require a strong consideration of what lies inside those boundaries: another creature\u2019s mind. But as predation and competition for resources become more sophisticated, a more advanced theory of mind becomes useful for purposes of deception.<\/p>\n\n\n\n<p>Deception itself does not require a mind at all. Evolution fabricates deceptions for microscopic creatures with no discernible mind. But reactive behavioral deception seems to indicate a more refined understanding of the self, one that includes awareness of the states and intents of other minds.<\/p>\n\n\n\n<p>To actively deceive requires awareness of another creature\u2019s expected reaction to a given stimulus, and manipulation of that stimulus to effect a more advantageous outcome from that behavior.<\/p>\n\n\n\n<p>The simplest deception is hiding, but it probably doesn\u2019t require any real awareness of the fact that if a predator cannot see you, it has less chance of eating you. It\u2019s probably parallel to simple avoidance behaviors that are universally effective. Deception behaviors are often very low-risk survival strategies, so it stands to reason that the more advanced a creature\u2019s mind becomes, the greater the advantages of active deception become.<\/p>\n\n\n\n<p>A squirrel that knows it\u2019s being observed may fake burying nuts in various locations. Of course we cannot say with certainty what the squirrel understands about what it\u2019s doing or why. We just know squirrels do that sometimes, and apparently it works to some degree, or they probably wouldn\u2019t have evolved the instinct to bother. 
But even absent any metacognitive awareness of its awareness of other awarenesses, it seems arrogant not to give the squirrel credit for at least understanding, or experiencing, that \u2018<em>things that watch me take my stuff<\/em>\u2019 and altering its behavior accordingly. That\u2019s enough for me to give it the distinction of having at least a proto-awareness of self that I can extrapolate as having the potential to evolve something as complex as my own.<\/p>\n\n\n\n<p>Deception and empathy are both potential paths to more advanced theories of mind. Empathy facilitates more cooperative interactions, but has essentially the same computational requirements as deception. Both require a creature\u2019s simulations to include constructs for individual beings other than itself, and to maintain historical state and intent data for each. The advantages of empathy are generally limited to interactions within one\u2019s own species, whereas the advantages of deception extend beyond the species. And while empathy may ultimately serve to advance a creature\u2019s self-awareness and theory of mind far beyond what deception can achieve, I think it\u2019s possible the evolution of the capacity for deception is a necessary prerequisite for empathy.<\/p>\n\n\n\n<p>What survival strategy better incentivizes the development of a self that includes awareness of other selves than deception? What other basic survival advantage would understanding the state and intent of another creature\u2019s mind grant? The capacity to predict and plan behaviors in response to stimulus quickly reaches a point of diminishing returns against environmental pressures that are totally transparent. Both utilizing deception and defending against it are catalysts for a more advanced understanding of distinct selves.<\/p>\n\n\n\n<p>So that\u2019s the basis for the question- Is deception the origin of self? 
I\u2019m sure I\u2019m missing a lot, but it seems like an interesting question with interesting implications. Also it\u2019s got a ring to it. I don\u2019t think there is an answer. I\u2019m not sure how you\u2019d go about proving a causal link between the survival utility of behavioral deception and the emergence of complex self-awareness. But the question has been asked, so I figure why not take the next step. So what if it is?&nbsp;<\/p>\n\n\n\n<p>I don\u2019t think it really changes much. It\u2019s not even that useful a question, and probably a little misleading without deep context, but I\u2019m not sure how else to phrase it in a sentence.<\/p>\n\n\n\n<p>Maybe it gives a little more definition to the ancient wisdom that Atman equals Brahman. I take the perspective that a \u2018mind\u2019 is like a flame in that it\u2019s just a phenomenon that manifests in given conditions. Its uniqueness is entirely in its initial conditions and environment. But it\u2019s all the same phenomenon. So I think \u201cThere is only one mind in the universe\u201d is wholly correct, and our individual experience of it is defined by the construct of a \u2018self\u2019 that experiences and is aware of its own internal simulations. Nothing that new there, and what you do with that in terms of morality or whatever is pretty wide open.<\/p>\n\n\n\n<p>I guess it feels somehow profound that our minds might be intrinsically separated and alone, with no possible structure to enable true union of mind beyond external communications with beings we presume have similar minds but can never confirm. That it might somehow explain an unrequitable spiritual longing for unity and universal understanding. But I think that\u2019s kind of pointless and anthropic.<\/p>\n\n\n\n<p>To me the idea that deception is the origin of self raises a much more interesting question. Predation and deception are not universal strategies for life, even on Earth. 
Given the expansive potential of life in the universe, might systems capable of thought develop under other pressures, giving rise to an intelligence or awareness without a self? How could that evolve or exist at all, and how might we characterize its \u2018qualia\u2019 of consciousness?<\/p>\n\n\n\n<p>This obviously challenges the limits of human imagination, and I might be fooling myself that a mind built on a self could even comprehend the nature of a mind without a self, but here I go.<\/p>\n\n\n\n<p>The mechanics and development of such a mind require looser parameters for what constitutes a being, or even thought. Animal nervous systems are extremely well-defined computational and sensory structures. It\u2019s difficult to imagine analogous internal states of thought emerging from a more distributed living system with no apparent executive control. I don\u2019t think the states of thought between a self-mind and a non-self-mind would be analogous, but I do think there could still be a capacity for a kind of \u2018thought\u2019, or at least an experience, which could give rise to thoughts.<\/p>\n\n\n\n<p>If a system can sense its environment, integrate sense information with memory of previous sense information, and physically alter itself or its environment based on that integration, I think it satisfies the basic requirements to have experience. We wouldn\u2019t be looking for distinct, individual creatures as we\u2019re familiar with them. I think the most likely place to find a non-self-mind would be a far more complex living structure such as an ecosystem, colony, or hive.<\/p>\n\n\n\n<p>The mechanisms that provide the sense, integration, and memory functions may be difficult to identify, but they exist in various forms throughout the universe, especially if we stretch to the largest and smallest scales of time and space. 
A dense, ancient forest watched over centuries undergoes extremely complex changes that could arguably be called \u2018behaviors\u2019 and \u2018responses\u2019. Interactions between species may constitute integration of different sensory inputs. Subtle evolution of creatures within the forest\u2019s microbiome may constitute a form of long-term memory. We can imagine analogs of living structures within convection cells in a star, or crystal growth that modifies its own electrical properties to improve self-replication in a dynamic environment.<\/p>\n\n\n\n<p>Even if we call them analogs of life, we are reluctant to ascribe the property of \u2018thought\u2019 or even \u2018behavior\u2019 to these kinds of systems. They are so radically different from anything we identify as possessing those capacities. The absence of hierarchy or executive control mechanisms seems to imply an absence of will or internal experience. It is hard to imagine such a system having the same active internal simulation space that it could use to predict or plan behaviors. But are such simulation spaces truly necessary for all forms of \u2018thought\u2019, or just for self-aware meta-cognition?<\/p>\n\n\n\n<p>I should probably use the term \u2018experience\u2019 more than \u2018thought\u2019 to describe a non-self-mind, though I wonder if that\u2019s a distinction without a difference in the context of a discussion of a mind without a self. It seems to me a mind without a self would experience thought more seamlessly than the human mind. Meta-cognition allows us to step outside of our experience of thought, but it is also what creates the apparent distinction between thought and experience. Without a self, there may be no distinction to make. Does that mean that a non-self-mind is incapable of any kind of meta-cognition? Maybe, or maybe only as we know it. This is probably the edge of my imagination. 
I can\u2019t even approach how a non-self-mind might come to be aware of thought without having a \u2018self\u2019 to be aware of, but I don\u2019t think that means something like it isn\u2019t possible.<\/p>\n\n\n\n<p>So if there are other minds that exist without a \u2018self\u2019, how might we interact with, or even observe, them? Well, that\u2019s the rub. We can\u2019t do either, ever. It\u2019s like trying to multiply a number and a letter; it doesn\u2019t even make sense.<\/p>\n\n\n\n<p>A non-self-mind cannot fully distinguish me as not itself, and I cannot even recognize a being that doesn\u2019t have an easily definable unit body to interact with. Non-self entities may not have anything resembling language, or communication at all. It seems like it would be a kind of \u2018pure thought\u2019 that would have no need for symbolic expression. If there are such minds in the universe, they may be all around us- but they would be so fundamentally incompatible with our own that we appear to them as nature appears to us- mindless forces and phenomena.<\/p>\n\n\n\n<p>But also for practical purposes- we are simply too small and short-lived. The minds I\u2019m imagining would most likely exist on geologic or planetary timescales and over expansive areas. The evolution of self-based intelligent agency can be catalyzed by biological evolution and predation, which are relatively rapid, violently iterative processes. A mind that emerged from forces other than individual survival would likely develop slowly, with no iterations, only a smooth flow of experience from the simplest correlation of sensory inputs, maybe all the way to kooky ponderings about the possibility of \u2018minds\u2019 that are distinct and separate from one another.<\/p>\n\n\n\n<p>Or\u2026 maybe all this is just plain wrong, and self is the origin of awareness, and there can be no awareness without self. 
Maybe \u2018self\u2019 is as simple as the simulation having a symbol for itself, and all simulations do that eventually. Maybe any being that I\u2019m thinking of as not having a self actually would have a self; it would just be so vast and alien that I\u2019m calling it something else, when it\u2019s really just a giant self. Or maybe not even that. Maybe you really do need tightly integrated systems with well-defined executive control for anything resembling a mind to emerge. Maybe whatever, but it\u2019s fun to think about other kinds of minds for a while, so I did that with mine.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>As far as humans know, we have the most advanced theory of mind of any being. We did kind of come up with the idea, though. The idea that other beings have inner lives comparable to our own is the basis for empathy, and cooperation, and deceit, and pretty much everything we associate with being <a href='http:\/\/funkboxing.com\/wordpress\/?p=2523' class='excerpt-more'>[&#8230;]<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[10],"tags":[],"_links":{"self":[{"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/2523"}],"collection":[{"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2523"}],"version-history":[{"count":3,"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=\/wp\/v2\/posts\/2523\/revisions"}],"predecessor-version":[{"id":2538,"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_rou
te=\/wp\/v2\/posts\/2523\/revisions\/2538"}],"wp:attachment":[{"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2523"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2523"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/funkboxing.com\/wordpress\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2523"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}