You are given the task of programming an AI machine. When finished, this machine will be a fully conscious, sentient being, able to perceive & relate to the world around it in ways that are at least analogous to our own experience.
One characteristic we will want it to have is the ability to enjoy "aesthetic pleasure" & "beauty". But we wouldn't want the machine to simply be bound to our own past cultural yardsticks - its experience of beauty & aesthetic delight must seem original to its own mind, as we would like to think our finest experiences of beauty are. But what criteria should we feed in to ensure that the machine can distinguish "beauty" from anything else?
With our own experience of beauty, it could be argued that the only non-cultural input is provided by various biochemical functions whose efficiency is enhanced by our experience of sensual pleasure. A simple example: children find sweet foods pleasurable (& their experience of the taste of sugar could legitimately be described as a "perception of beauty") because it is in the genotype's interest for the organism to seek out high-energy food. Outside of this biochemical context, the "nice taste" of sugar is essentially arbitrary. It tastes pleasing only because we are genetically programmed to find the taste pleasing.
To what extent does such "arbitrariness" apply to the whole spectrum of aesthetic experience, when removed from the context of genetic programming derived from biochemical needs? And since our AI machine would have no such needs, could its experience of aesthetic pleasure be anything other than arbitrary? Could it be that when "designing" a synthetic consciousness, we need to imagine a model of "possible sentient characteristics" that's much more "self-creating" than the human example can provide?