Observations

Observations, 2012
Published by Glasgow Print Studio, ISBN 978-0-9569054-1-3
John Calcutt

Observations On

What we can experience, or perceive, or know must of course depend upon what there is to experience, or perceive, or know, but it must also depend upon the apparatus that we have for experiencing, perceiving, and knowing. For us to be able to experience anything at all it has to be such as can be coped with by the apparatus we have. That is not to say that nothing else can exist, but it does mean that nothing else can be experienced, perceived or known by us. (Bryan Magee, Men of Ideas, 1982)

The etymological root of the word “theory” lies in the Greek term theoria: “a looking at, things looked at”. (The word theatre derives from theatron; literally, “a place for viewing”.) Similarly, the word speculation derives from the Latin specere, “to look at, view”. When we read these words - when we look at them - we do not necessarily know what lies behind them, the origins of their meaning. We use them without pause: they are simply given. The same holds true, for most of us and for most of the time, for our understanding of our bodies and the world in which we exist. How many of us understand the lymphatic system, or the composition of polyurethane, or the rudiments of quantum mechanics? How many of us know how to fix our computer when it crashes? In most instances we leave all of this stuff to others: “Could you possibly take a look at this for me?”

Looking, thinking, and knowing are tightly bound together. They operate, however, as a self-regulating system in which none has precedence, either conceptually or temporally. This means that looking does not necessarily come before knowing, nor does looking necessarily limit knowing. By the same token, knowing does not always precede thinking, and thinking may sometimes contradict knowing. The relations between these three terms are constantly shifting. I know that a certain group of phenomena exhibit shared characteristics. This leads me to extrapolate that there may be other phenomena that potentially belong to this category, although this remains pure speculation. I have not actually seen any of these phenomena because they exist beyond the limits of my own perceptual apparatus, and beyond the reach of the instruments available to me (e.g. microscopes and telescopes). I would not know what they look like, and I would have to invent or imagine ways to represent them, find new means to endow them with form and visual presence.

To make matters even more interesting, looking, thinking, and knowing are all tangled up with something akin to desire. We want to know, see and think certain things: certain things are satisfying to know, see and think (whereas others are disappointing and frustrating). We need to know, see and think certain things. In other words, looking, knowing and thinking are not entirely objective. In fact, at this point it may be worth making a distinction between looking and seeing. Two people may look at the same object or event and see very different things. One might see a picturesque landscape, whereas another might see a scene of rural economic neglect. Watching dramatic footage on the evening news bulletin, one might see a struggle for independence; the other might see a lawless riot. Tasked in 1515 with depicting a rhinoceros - a creature he had never actually seen, and knew only from a written description and a sketch - Albrecht Dürer was forced to picture this extraordinary animal in terms of things that he was already familiar with. To modern eyes, Dürer’s rendition of the rhinoceros looks uncannily like an armoured warhorse. Seeing, then, is inseparable from what we know, what we want, and what we may think.

Seeing is also, of course, linked to questions of faith and trust: seeing is believing, according to the old truism. It is far easier to invest our trust in something that we can see. If we cannot see something we might even be tempted to suppose that it does not exist. If, for whatever reason, we are unable to see the thing itself, then we will happily settle for a representation of that thing - a photograph, a painting, a drawing, for example. It is probably even true to say that today we are far more exposed to representations of things than to the things themselves. Most of us have never met Barack Obama in the flesh, but we have seen him often enough on our TV screens and in our newspapers to be pretty certain that he exists. Such is the current pervasiveness of representations that it has led some theorists to claim that representations no longer mirror or reflect reality: they have become reality. According to the French philosopher Jean Baudrillard, for example, the Gulf War did not actually take place; it was simply an effect produced by electronic images on computer terminals. The combatants did not engage with each other in ways that we might understand as conventional warfare: they fought it out in the virtual realm of digital images and abstract representations.

Thus, by virtue of their entanglement with seeing, thinking and knowing are also often subject to irrational and subjective influences. Recent research, in fact, has suggested that fingerprint identification in forensic investigations is not always as objective and reliable as we have been led to believe because it is open to influence (and consequent misidentification) from contextual factors. If fingerprint analysts are fed certain pieces of information before beginning their task, the results will often be skewed, leading to more than 1,000 mistakes per year in the USA according to one source. Nevertheless, sight retains a capacity for objectivity, and is central to our understanding of the world. According to Marshall McLuhan, sight is a “cool” medium: it is relatively detached and assessing, and it is more precise than the other senses. If, for example, we want to know exactly how hot something is, touch alone cannot provide us with accurate information. However, with the invention of the thermometer we can turn heat into something visible, something precisely calibrated for the eyes to read. Human ingenuity has also produced speedometers to translate the experience of velocity into something measurable by the eyes, whilst scales, decibel meters, clocks and rulers perform similar functions for the experiences of weight, sound, time, and distance.

The effect of examples such as those noted above is to elevate sight above the other senses in terms of the ability to present the world to us as something measurable and quantifiable. However, in order to prioritise sight, it must somehow be detached from the other senses. (The thermometer, for example, removes the sense of touch from the assessment of heat.) Rather than interacting with each other in a kind of co-operative system, the senses are increasingly understood to function as specialised tools. There is a distinct and ‘proper’ area of activity for each of the senses, and there should be as little cross-interference between these designated areas as possible. The condition known as synaesthesia, in which sensory impressions from one ‘area’ are registered and experienced in a different ‘area’ - such as when music is experienced visually - is thus understood as an abnormality. Thus it is not entirely surprising to find that during the heyday of Modernism in western art (stretching from roughly 1860 to 1960) an influential strand of art increasingly detached itself from tactile experience in order to position itself as a purely optical phenomenon, addressing itself to the eye alone - a process that culminated in the Op Art movement of the 1960s, whose primary purpose was to produce images that excited and confused the viewer’s optical experience.

This process whereby the human senses become ever more subject to the processes of specialisation may be understood as part of a wider historical development. The ways in which knowledge and experience are organised and compartmentalised today are the result of specific historical conditions. It used to be the case, for example, that the study of geometry was pursued as a branch of philosophy. Thus an Italian Renaissance artist, such as Piero della Francesca, could claim an intellectually elevated status for his work - and thus for his profession - by virtue of his ability to employ the geometric principles of mathematical perspective. More than three centuries later, Constable’s studies of meteorological conditions would have been understood by his early nineteenth-century contemporaries as exercises in natural philosophy, rather than natural science. And the current structure of our universities - their division into various Faculties, such as Arts, Humanities, Science, Technology, and so on - is a legacy of eighteenth-century Enlightenment thinking, when it was proposed that the human mind is ‘naturally’ composed of such separate and independent faculties. Consequently, art became divorced from science, and science itself was split into ‘pure’ and ‘applied’ branches. The conditions under which the exemplary ‘Renaissance man’, such as Leonardo da Vinci, might emerge were lost.

Efficiency became the watchword. At the risk of a reductive oversimplification one might say that the demands of the capitalist economy for maximisation of profit provided the imperative for this drive towards specialisation. The logic that underpinned factory production during the period of industrialisation - a logic fuelled by the demand for efficiency and rationalisation - was applied to human life and experience at all levels. In order for us all to contribute as efficiently as possible to this system it became strategically important for us to know more and more about less and less. Expertise in confined areas was prized more highly than general capability across a range of pursuits. We had entered the age of the expert.

Needless to say, this meant that art and science (for example) became increasingly divorced not only from each other, but also from the general public. Modern art became ‘obscure’ and ‘elitist’, whilst science became ‘incomprehensible’ and possibly ‘threatening’. Each developed its own specialist ‘language’, a ‘language’ impenetrable to the uninitiated layman. Their products could be ‘seen’, but not ‘understood’. They seemed content to address themselves and their own self-defined issues, rather than the world of common knowledge and shared experience.

But this is not to say that a rapprochement between these two areas of specialist activity is not possible. In fact, it may be absolutely necessary. Seeing and knowing, as I have tried to suggest, are inseparable. And although the relation between seeing and knowing may be more complex than is often thought, they have the power to aid each other. In his seminal study of the history of printmaking, Prints and Visual Communication (1953), William Ivins argues that the original value of the technology of printmaking lay not in the development of art and aesthetics, but in the development of science and technology. The “exactly repeatable pictorial statement”, he claimed, enabled knowledge to be transferred in a reliable and efficient form, allowing herbalists, for example, to identify plants and their medicinal properties with confidence. And imagine, says Ivins, trying to learn how to tie complicated knots if one were reliant solely upon written instructions unsupported by visual aids.

The situation confronting artists and scientists today is, of course, more complicated than that addressed by Ivins in his historical study. As knowledge continues to evolve and expand, the means used to convey that knowledge must develop apace. Abstract art, it has been argued, arose not from an internal need within painting, but because the world itself was receding from the ability of painting to adequately represent it. Simply put, the world has become more abstract, and thus demands more abstract means of representation. Such abstraction does not, of course, necessarily need to take the form of non-representational imagery: this abstraction can equally be effected by means of metaphor, analogy, or other equivalent tropes. In other words, something that is unknown or unfamiliar may be represented in terms of - translated into - something that is already known and familiar. Dürer’s rhinoceros offers ample proof of the artist’s continuing struggle to find adequate means to picture that which has never been pictured before.

If - to use the terms employed by Bryan Magee in the introduction to this essay - the apparatus available to us for experiencing, perceiving or knowing becomes inadequate in the face of a reality in the process of transformation, then we are perhaps presented with two options: either we transform that apparatus in such a way as to make it functional, or we use it as an instrument to convert the unknown into the already known. (It should perhaps be pointed out here that such apparatus is not to be understood solely in terms of technological hardware, but also - more importantly, perhaps - as including systems of representation, such as language and images.) This, according to the French philosopher Jean-François Lyotard, is similar to the challenge faced by artists in the face of the sublime. The sublime is understood here as the feeling of profound disquiet that arises in those moments when our ability to conceive of something is found to be incommensurate with our ability to represent it (the example of infinity is often given as a case in point: other examples might include hyperspace, the internet, black holes, the Higgs boson). In such situations artists face two possibilities: they can, so to speak, ‘illustrate’ the sensation of the sublime (or its equivalent) by using the representational resources available to them; or they can attempt to ‘produce’ the sublime (or equivalent) sensation in their work, but in so doing they will necessarily have to exceed the available representational resources, and will thereby produce work that is, according to Lyotard, a form of “incommunicable statement”. We may look at such a work, but we will not know what we are seeing. Eventually, however, our knowing may become transformed as a consequence of what we have seen. In other words, the artist may give form to something that as yet lacks form, and that new form may itself transform our ability to further understand the thing in question.
The form may, in fact, become part of the very identity of that which had previously evaded attempts to define it.