Professor Cayley used a phrase during yesterday's lecture, one I've heard him use before, about using computer processing to "generate meaning." It usually goes along with its complement: "to generate meaning and affect." The affect, then, is the emotional impact of a work of art, and the meaning is the conceptual impact.
While generative poetry and prose seem to stake a good claim to being able to generate affect, I find meaning to be a much more troubling issue. Algorithmically chosen words and phrases can fall into a pleasant poetic jumble, calling up unexpected images to establish a mood. Here's an example of some text generated by the Markov Chain algorithm, running over a number of poems:
Rights went wild with sweetened whipped cream around you, Sisters in the other matters entirely local. In Antarctica, during storms, The Aurora Australis. The legend becomes fact. We stand silent.
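I don't know the exact implementation behind that sample, but the general technique can be sketched easily enough. A word-level Markov chain records, for every word (or short run of words) in a corpus, which words follow it, then random-walks that table to produce new text. The function names, the tiny stand-in corpus, and the single-word `order` below are all my own illustrative choices, not the generator actually used:

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each run of `order` words to the list of words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Random-walk the chain to produce a `length`-word sequence."""
    rng = random.Random(seed)
    key = rng.choice(list(chain.keys()))
    out = list(key)
    while len(out) < length:
        followers = chain.get(key)
        if not followers:  # dead end: restart from a random state
            key = rng.choice(list(chain.keys()))
            followers = chain[key]
        nxt = rng.choice(followers)
        out.append(nxt)
        key = key[1:] + (nxt,)  # slide the window forward by one word
    return " ".join(out)

# A toy corpus; feeding in a real body of poems produces jumbles
# like the sample above.
corpus = ("We stand silent . The legend becomes fact . "
          "The Aurora Australis went wild with sweetened whipped cream .")
chain = build_chain(corpus, order=1)
print(generate(chain, length=12))
```

Because every transition was observed somewhere in the corpus, the output is locally plausible phrase by phrase while globally incoherent, which is exactly the "pleasant poetic jumble" effect. Raising `order` makes the output more faithful to the source at the cost of originality.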
Our minds are fairly comfortable assigning an atmosphere even to an incoherent work. But meaning? That's a different story. Generating a coherent thought, much less an argument, much less a narrative, is orders of magnitude more difficult. Does anyone know of a reason, grounded in neuroscience or philosophy (or some discipline that lays claim to cognition), why that should be the case?